GB2480602A - Orientation surfaces and uses thereof - Google Patents
Orientation surfaces and uses thereof
- Publication number
- GB2480602A (application numbers GB1008528A, GB201008528A)
- Authority
- GB
- United Kingdom
- Prior art keywords
- pattern
- cell
- imaging device
- command
- location
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/0304—Detection arrangements using opto-electronic means
- G06F3/0317—Detection arrangements using opto-electronic means in co-operation with a patterned surface, e.g. absolute position or relative movement detection for an optical mouse or pen positioned with respect to a coded surface
- G06F3/0321—Detection arrangements using opto-electronic means in co-operation with a patterned surface by optically sensing the absolute position with respect to a regularly patterned surface forming a passive digitiser, e.g. pen optically detecting position indicative tags printed on a paper sheet
- G06K—GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
- G06K19/00—Record carriers for use with machines and with at least a part designed to carry digital markings
- G06K19/06—Record carriers for use with machines and with at least a part designed to carry digital markings characterised by the kind of the digital marking, e.g. shape, nature, code
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- User Interface Of Digital Computer (AREA)
- Image Analysis (AREA)
Abstract
A method of operating a computer to execute a command (e.g. entering a character and/or activating a link) includes providing an orientation surface 100 carrying a plurality of patterns, each pattern printed at a location on the surface and having a visual characteristic indicative of that location; providing an image of a pattern; identifying the location in accordance with a visual characteristic of the captured pattern; and operating a computer to execute a command associated with the location. Each pattern may be associated with a focus point, with the visible characteristic a continuous function of the distance between the pattern and the focus point. The patterns may include border regions and cells (Fig. 1A, 105), and the borders may include sections of lines radiating from a focus point. The patterns may be such that they can be identified even when out of focus. A method and system for locating an imaging device is also set out. The pattern may include a series of circles centred around one or more focus points.
Description
AN ORIENTATION SURFACE AND USES THEREOF
FIELD AND BACKGROUND
The invention generally concerns entering data into a cellular telephone or other small personal communication device. Some embodiments utilize inventive ways of camera allocation, that is, of locating the camera.
Small personal communication devices, such as cellular phones, generally have small keypads, which make them very comfortable to hold in the hand or in a pocket, but very inconvenient for text entry.
WO 01/61449, the contents of which is incorporated herein by reference, describes an electronic pen that writes on a page that has a predetermined pattern printed thereon. When a user writes on the pattern, a camera on the pen takes images of the pattern just beneath it, capturing different pattern portions as it moves. The captured pattern portions are translated to a series of locations, and the locations of the moving pen are then deciphered into hand-written letters, to electronically read the hand-written text.
US 2006-098899, the contents of which is incorporated herein by reference, describes a system for identifying electronic counterparts of paper documents by scanning or capturing a portion of the document with a scanner or cell phone camera.
The system allows performing actions (for example, scrolling, copying, or pasting) on the electronic document by making gestures with the capturing device that captures the corresponding paper document.
SUMMARY
According to an exemplary embodiment, there is provided a method of operating a computer to execute a command. Optionally, the command comprises typing a character and/or activating a link. The method comprises: a. providing an orientation surface carrying a plurality of patterns, each pattern printed on a location on the surface and having a visual characteristic indicative of said location; b. associating said command with a specified location on the orientation surface; c. providing an image capturing a pattern printed in the vicinity of said specified location; d. identifying the captured location in accordance with the visual characteristics of the captured pattern; and e. operating the computer to execute the command associated with the identified location.
In some embodiments, each pattern is associated with a focus point, wherein each focus point has a known location, and wherein the visible characteristic is a continuous function of a distance between the pattern and the focus point.
Optionally, each pattern is associated with one or more focus points, and the total number of focus points is four or less.
Optionally, the captured pattern is out of focus.
In exemplary embodiments, the visual characteristics include one or more of the colour, the brightness, and the shape of the pattern.
In some embodiments, each of the patterns comprises bordering regions; and the shape of borders between said bordering regions is indicative of the pattern location on the surface.
Optionally, the borders are sections of lines; and at least some of said lines meet in a focus point.
In some illustrative embodiments, the bordering regions differ in at least one visible characteristic to an extent that allows identifying the borders between the regions even when the regions are out of focus.
Optionally, the method comprises extrapolating the borders between regions in the patterns to identify a point at which the extrapolated borders meet, and identifying a cell associated with said pattern in accordance with the coordinates of said point.
Optionally, the orientation surface comprises cells, each being associated with a pattern having visible characteristics corresponding to the location of the cell in the orientation pattern.
Optionally, the command is associated with one or more of said cells.
In an exemplary embodiment, the method comprises hovering above the orientation surface with an imaging device and capturing portions of the orientation surface from distances shorter than the focus distance of said imaging device.
Optionally, the method comprises finding in a look up table (LUT) which command is associated with an identified location.
In accordance with another exemplary embodiment, there is provided a method of finding a position of an imaging device in respect of a substrate carrying an orientation surface, the orientation surface comprising cells, each cell being associated with a unique pattern having visible characteristics indicative of the location of the cell on the orientation surface in respect of a specified focus point, the method comprising: a. associating each unique pattern with a visibility volume, from which said unique pattern is visible to said imaging device; b. processing an image of a portion of said orientation surface taken with said imaging device positioned at said position to identify a unique pattern captured in the image; and c. identifying the position of said imaging device as being within the visibility volume associated with the captured unique pattern.
Optionally, identifying a visibility volume associated with the unique pattern comprises analysing the shape of the pattern to determine the location of the pattern on the surface.
In some embodiments, each pattern is further associated with a second pattern, shaped to represent said character.
In accordance with yet another exemplary embodiment, there is provided a system for locating an imaging device, the system comprising: a. said imaging device; b. a surface carrying an orientation surface comprising various portions, each being associated with a unique pattern, and with a visibility volume, from which said unique pattern is visible to said imaging device; c. a processor, having i. an input module connected to an output of said imaging device, for receiving in the processor images captured by said imaging device, ii. a pattern identifying module, for identifying in an image received from the imaging device through said input module a unique pattern, and iii. an allocating module, configured to receive from said pattern identifying module a signal indicative of the identified unique pattern and associate with said identified unique pattern a visibility volume, from which the identified unique pattern is visible to said imaging device.
In some such exemplary systems, the allocating module comprises a storage medium, storing an LUT associating patterns to visibility volumes.
Optionally, the allocating module is configured to run a process, which analyses visible characteristics of the identified unique pattern to determine the location of the pattern on said surface.
In an exemplary embodiment, the surface is a substrate carrying an orientation surface, said orientation surface comprising cells, each cell being associated with a unique pattern having visible characteristics indicative of the location of the cell on the orientation surface in respect of a specified focus point.
There is further provided by an exemplary embodiment of the invention a system for controlling a computer to execute a command, the system comprising: a. a surface carrying an orientation surface comprising various portions, each carrying a unique pattern, and each being associated with a visibility volume, from which said unique pattern is visible to said imaging device; b. an imaging device, which takes an image of one of said patterns c. a processor, having iv. an input module connected to an output of said imaging device, for receiving in the processor images captured by said imaging device, v. a pattern identifying module, for identifying a unique pattern in an image received from the imaging device through said input module; and vi. a command identifying module, configured to receive from said pattern identifying module a signal indicative of the identified unique pattern and to associate with said identified unique pattern a command to be executed by said computer d. a communication module, which communicates said command to a computer for carrying out the command.
In an exemplary embodiment, the processor further comprises a fine-positioning module, which determines the position of the centre of the image within the captured cell and wherein the command identifying module identifies the command based on said position.
Optionally, the fine-positioning module measures distances between the centre of said image and two adjacent edges of a cell associated with said unique pattern, to determine the position of the centre of the image in respect of the cell's edges.
Alternatively or additionally, the fine-positioning module measures distances between the centre of said image and specified points within the cells to determine the position of the centre of the image in respect of the specified points.
Unless otherwise defined, all technical and/or scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which the invention pertains. Although methods, substrates, and systems different from those described herein can be used in the practice or testing of embodiments of the invention, exemplary methods, systems, and substrates are described below. The following description, however, is illustrative only and is not intended to be necessarily limiting.
BRIEF DESCRIPTION OF THE DRAWINGS
Some exemplary embodiments are described below for illustrating how the invention may be carried out in practice. Here and in the entire specification, the term "exemplary" is to be understood as serving as an example or an instance, and not necessarily as excellent or deserving imitation.
With specific reference now to the drawings in detail, it is stressed that the particulars shown are by way of example and for purposes of illustrative discussion of embodiments of the invention. In this regard, the description taken with the drawings makes apparent to those skilled in the art how embodiments of the invention may be practiced.
In the drawings:
FIG. 1A is a schematic illustration of an orientation surface according to an exemplary embodiment;
FIG. 1B is an illustration of an orientation surface according to another exemplary embodiment;
FIG. 1C is an illustration of an orientation surface according to another exemplary embodiment;
FIG. 2A is a flow chart of actions to be taken in a method of operating a computer according to an exemplary embodiment;
FIG. 2B is a flow chart of actions taken in an exemplary method of controlling a computer according to an exemplary embodiment;
FIG. 3 is a flowchart of actions to be taken in a method of allocating an imaging device in respect of an orientation surface, according to an exemplary embodiment; and
FIG. 4 is a block diagram of a system for allocating an imaging device and/or controlling a computer to execute a command, according to an exemplary embodiment.
DETAILED DESCRIPTION OF ILLUSTRATIVE EMBODIMENTS
Overview

An aspect of some embodiments of the invention concerns entering text or commands into a computer.
In some embodiments the computer is part of a small communication device, such as a cellular phone or a personal digital assistant (PDA). Alternatively or additionally, the computer can be of any other type, for example, a personal computer (PC).
In an illustrative embodiment, entering a character is by capturing the character to be entered with a camera, for example, a camera of a cell phone. Text is optionally entered by capturing characters in the order they appear in the text to be entered. For example, for entering the text "hi", first an h is captured, and then an i.
This embodiment optionally utilizes a large sheet of paper or other surface, covered with a large print of a set of characters, arranged in space so as to facilitate locating them, for example, alphabetically or as in a QWERTY keypad. The user then captures one letter at a time, and in this way enters the text. Such a large sheet of paper or other surface is referred to below as a character board.
In some embodiments, the character board carries an orientation pattern, comprising a unique pattern in the vicinity of each of the characters to be entered.
Optionally, each of the unique patterns has an easily recognizable feature (for example, an angle or a colour) or combination of easily recognizable features. In some embodiments, the easily recognizable features allow analysing the pattern to identify it without resorting to pattern recognition, OCR, or the like. For example, if the orientation pattern comprises only one pattern that is red and has an angle of 35 degrees, it is sufficient to analyse the colour and the angle in order to identify the pattern, and there is no need to store all the patterns and recognize them by methods of pattern recognition.
Thus, if the characters surrounding this pattern are known, for example, if it is known that to the left of the red pattern having an angle of 35 degrees there is an "h" character, an image having the unique pattern on its right is recognized as an h without resorting to image recognition algorithms, and without even storing the shape of an h. Some embodiments allow the capturing of characters from a character board with a camera which is out of focus, for example, a camera held very close to the character board. For this, the unique patterns are easy to analyse in an image taken by the camera when the camera is out of focus.
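By way of illustration only, this feature-based identification can be sketched as a small table lookup; the feature values, tolerance, and the Python host below are assumptions, not part of the disclosure:

```python
# Illustrative sketch only: identify a pattern from two cheap measurements
# (a colour label and a border angle) via a small feature table, instead of
# full pattern recognition. Table contents and tolerance are hypothetical.

FEATURE_TABLE = {
    # (colour, border angle in degrees) -> character printed beside the pattern
    ("red", 35.0): "h",
    ("red", 50.0): "i",
    ("blue", 35.0): "j",
}

def identify_character(colour, angle_deg, tolerance=2.5):
    """Return the character whose pattern matches the measured features, else None."""
    for (t_colour, t_angle), char in FEATURE_TABLE.items():
        if colour == t_colour and abs(angle_deg - t_angle) <= tolerance:
            return char
    return None

print(identify_character("red", 35.8))  # -> 'h'
```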
Optionally, the shape of each pattern in the orientation pattern is indicative of the location of this pattern on the board, and a character is associated with each location.
Additionally or alternatively to being associated with characters, the patterns and/or locations are associated with computer commands, such that capturing a command-associated pattern is equivalent to entering a command. Some exemplary commands include copy, paste, and activate a link.
A surface that carries unique patterns in various positions, such that a pattern captured by a camera is indicative of the position of the camera, is referred to herein as an orientation surface.
In some embodiments, a camera is moved above an orientation surface, and the user makes a predefined gesture when the camera is above a region of interest, for example, above a character that should be entered. Optionally, the gesture is identified using an acceleration sensor, as is well known in the art. The camera captures the unique pattern in the vicinity of the character, and thus the character to be entered is identified by the system. Optionally, the entered character is displayed, and if the user is not satisfied with the displayed character, the character is deleted, optionally by capturing a delete key, and re-entered. Optionally, the display is the camera's display. Alternatively or additionally, the display is an audio display, for example, the loudspeaker of the cellular phone.
Before explaining at least one embodiment of the invention in detail, it is to be understood that the invention is not necessarily limited in its application to the details of construction and the arrangement of the components and/or methods set forth in the following description and/or illustrated in the drawings and/or the Examples. The invention is capable of other embodiments or of being practiced or carried out in various ways.
Exemplary application

In some embodiments, an orientation surface is used to enrich the experience of "surfing" the internet.
One such embodiment allows a cell-phone user to navigate within a web-page using a printed page and the cell-phone camera, instead of navigating the same web-page using the small display of the cell-phone.
In an exemplary embodiment, the user is equipped with a page carrying an orientation pattern, together with a printout of the web-page. When the user takes an image of a link in the printout, the browser on the user's cell phone activates the link.
This is achieved by first processing the page to create a table associating each link to a unique pattern on the orientation surface. When the link and a nearby unique pattern are captured in an image taken by the user, the image is processed to identify the unique pattern, and the table is consulted to identify which link (if any) is associated with the captured pattern. Then, the identified link is activated.
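A minimal sketch of this flow, under the assumption of a Python host and with invented cell indices and URLs (the text specifies only that a table associates links with unique patterns):

```python
# Hedged sketch of the table-driven link activation; cell indices and URLs
# are invented, and webbrowser.open stands in for the phone's browser.

import webbrowser

# Built when the page is printed: each link is keyed by the unique pattern
# (here, the cell index) printed nearest to it on the orientation surface.
PATTERN_TO_LINK = {
    (3, 7): "https://example.com/news",
    (5, 2): "https://example.com/weather",
}

def activate_link(cell_index):
    """Consult the table for the captured pattern's cell and open its link, if any."""
    url = PATTERN_TO_LINK.get(cell_index)
    if url is None:
        return False  # no link is associated with the captured pattern
    webbrowser.open(url)
    return True

activate_link((3, 7))  # navigates the browser to the associated URL
```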
It is envisaged that some locations, for example, crowded places such as terminals, hospitals, malls, and coffee shops will provide public access to printers, and pages carrying orientation patterns, thus allowing users to create printouts of web pages on orientation surfaces, and navigate web-pages using these printouts, rather than using the small and crowded display of the cell-phone.
Associating the links with unique patterns is optionally carried out by an application residing on the printing computer. Alternatively, where no nearby printer is available but a fax machine is found, the processing of the link can be carried out by a remote special WAP application at the cellular provider, which is capable of sending HTML pages to fax. Optionally, a dedicated software module for printing and processing the page is provided on the computer. Optionally, the table associating links to unique patterns is sent to the cell-phone, for example, by Bluetooth communication. Optionally, the processing of the image is done by an application residing on the cell-phone.
Alternatively or additionally, another computer, for example, a computer provided to the public in the crowded place, processes the image captured by the cell-phone camera so as to identify the unique pattern captured in the image. In an exemplary embodiment, the cell-phone has only the browser, and an input receiving module, for receiving commands to navigate to URL addresses supplied to it through the receiving module. Optionally, the receiving module is adapted for receiving input through Bluetooth communication.
Exemplary orientation surfaces

One problem faced by some embodiments is determining a cell position from an out-of-focus image of the cell, for example, an image taken by a camera from a distance shorter than the focus distance of the camera.
Another problem faced by some embodiments is determining a cell position from an image of the cell, without utilizing complex pattern recognition algorithms, for example, by measuring a small number of parameters or features shown in the image, and applying to these parameters a simple mathematical function.
In some embodiments, an orientation surface of the kind illustrated in Figs. 1A, 1B, or 1C is used for addressing one or both of the above problems.
Each of the orientation surfaces illustrated in Figs. 1A, 1B, and 1C comprises cells (for example, cells 110, 110B, and 110C in Figs. 1A, 1B, and 1C respectively), and each cell is associated with a unique pattern. In Fig. 1A, each unique pattern comprises marks indicative of the distance of the cell associated with the pattern from the upper left cell 105, also referred to herein as the origin.
The markings on the lower edge of each cell indicate the vertical distance of the cell from the origin, and the markings on the right edge indicate the horizontal distance of the cell from the origin. The distances are indicated using a binary method, wherein an empty circle symbolizes 0, a circle with a cross or X inside symbolizes 1, and a circle with an asterisk symbolizes the end of the number.
Thus, each cell has a circle with an asterisk at its lower right corner, allowing orientation of an image that captures such an asterisk in respect of the origin. In addition, all the cells in the first row, for example, have at their bottom edges an empty circle to indicate that they belong to row number 0. Similarly, all the cells in the third column, for example, have at their right edge a crossed circle, an empty circle, and an asterisk circle, which together symbolize the number 2 in binary presentation. Thus, the third cell from the left in the first row is associated with the unique pattern formed by combining these row and column markings. Similarly, each of the cells is associated with a unique pattern indicative of the position of the cell in respect of the origin.
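The decoding of such edge markings might look as follows; this is a sketch only, and the symbol spelling and most-significant-bit-first reading order are assumptions:

```python
# A minimal decoding sketch for one edge of a Fig. 1A cell. The symbol
# spelling ('o', 'x', '*') and the most-significant-bit-first reading
# order are assumptions made for illustration.

EMPTY, CROSS, END = "o", "x", "*"  # empty circle = 0, crossed circle = 1, end of number

def decode_edge(symbols):
    """Decode the circle markings on one cell edge into a row or column index."""
    value = 0
    for s in symbols:
        if s == END:  # the asterisk circle terminates the number
            return value
        value = value * 2 + (1 if s == CROSS else 0)
    raise ValueError("edge markings lack the end-of-number symbol")

print(decode_edge([CROSS, EMPTY, END]))  # binary '10' -> index 2 (third column)
```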
In some embodiments, the position of the cell may be calculated from geometrical parameters of the unique pattern. For example, in Fig. 1B each cell (for example, 110B) is associated with a unique pattern comprising adjacent regions, for example 112 and 112', the shapes of which are indicative of the cell associated with the unique pattern.
To allow identification of the shapes of the regions forming the unique patterns, each two adjacent regions (for example, regions 112 and 112') are different in colour, brightness, or another visible parameter. Optionally, the difference is to an extent allowing identification of the borders between said regions in an out of focus image.
In some exemplary embodiments, each of adjacent regions 112 and 112' is a trapezoid, having the parallel sides aligned along a vertical side of the orientation surface.
Thus, identification of such trapezoid sides, even if in a blurred, out of focus image, allows identification of the vertical direction in the orientation surface.
Optionally, there are also trapezoids with parallel sides aligned along a horizontal side of the orientation surface. Optionally, trapezoids aligned horizontally are of a different colour than those aligned vertically.
In some embodiments, the non-parallel sides of the vertically aligned trapezoids associated with a cell are designed to originate in a single focus; that is, if the non-parallel sides are continued, they all meet in a single point, referred to herein as a focus.
The cell is optionally identified by its position in relation to the focus.
Some embodiments of the invention concern entering data into a cell-phone by a method including processing of images captured by the cell-phone camera.
In some embodiments, each surface (for example, sheet of paper) intended to be used for data or command entry by the cell-phone camera carries an orientation surface.
Fig. 1B is an illustration of an orientation surface (100) according to an exemplary embodiment of the invention.
Orientation surface 100 consists of horizontal lines (102) and vertical lines (104), forming a matrix of rectangular cells, such as cell 110.
Orientation surface 100 comprises regions (for instance, trapezoidal regions 112 and 114), the borders between which are identifiable in an out of focus image of pattern 100. In some embodiments, the borders are identifiable thanks to a difference between each two bordering trapezoids in at least one visible characteristic, for example, colour or brightness. Each cell is associated with a pattern (for example, pattern 120) comprising trapezoids marked on an edge of the cell. Optionally, each cell is also associated with a second pattern (not shown), for instance, with a character printed inside the cell, which is not necessarily visible to the imaging device when the imaging device is out of focus.
As discussed in more detail below, the shapes of the trapezoids, and particularly the direction of the borders between them are indicative of the specific cell with which the trapezoids are associated, and thus with the command associated with the cell.
The position of a cell, or a sequence of such positions, is optionally indicative of a value of a function operated by an application running in the cell-phone. This way, the matrix formed by the cells becomes a basis for interactions between the camera and an application operated in the cell-phone.
Optionally each cell is associated with a command, for example, a command to enter a letter into an edited text. In some embodiments, groups of cells are associated with the same command. For example, in some embodiments, each of a group of 5 horizontally consecutive cells is associated with the space bar.
In an illustrative embodiment, each cell is slightly smaller than the field of view of the camera at the distance at which it is expected to be used for entering commands. In one embodiment, this size is about 2.4 x 3 cm. In other embodiments the cells are larger, and this allows the user to gesture around a broader space for entering a particular letter or command. In some embodiments, the cells are smaller, and this allows the orientation surface to make room for more letters, or, in turn, to reduce the size of the character board, making it more convenient to handle in crowded areas, say, or while travelling.
In order not to force the user to carry several sheets, each with a different orientation pattern, it is advantageous to have a universally standard orientation pattern.
In pattern 100B, each of the cells contains a right edge 106. In some embodiments not illustrated in Fig. 1B, the cells also have bottom edges. The edges are optionally narrow enough to leave room for printing the commands (for example, characters) in the cell, but wide enough to allow analysis of the captured bristles. In an illustrative embodiment, the width of the edges is 3-4 mm.
The edges are marked with sections of beams, such as beam 108. The beams are not necessarily marked on the board in their entirety; in Fig. 1B only the beam portions within the edges are marked. Beam 108 is marked in full for illustration and explanation only.
Sections of the beams that fall within the edges are marked in orientation surface 100B, and are referred to herein as bristles. In pattern 100, the quadrangles defined by the edge borders and the bristles going through the edge, referred to herein as trapezoids, are preferably of sharply distinct colours, brightness, or another visible characteristic, such that the bristle is easily detected by a camera even when the bristle is out of the focus of the camera.
Optionally, all the beams meet at one point, referred to as a focus. Optionally, there are several foci, and some of the beams meet at each focus. In the illustrated embodiment, the meeting point is outside the board, at the left centre, and is accordingly referred to as a left focus. In some embodiments, where bottom edges are also used, there are corresponding vertical beams meeting at an upper (or lower) focus.
Optionally, a focus is inside the pattern, for example, at the geometrical centre of the pattern. Optionally, a focus is outside the pattern or at the circumference thereof, for example, on the left, right, bottom or upper side of the pattern.
In some embodiments, markings on one edge of a cell are indicative of the cell's position along the length of the orientation surface (vertical index), and markings on another edge of the cell are indicative of the cell's position along the width of the orientation surface (horizontal index). For example, the markings appearing on the right edge of each cell are optionally indicative only of the vertical index of the cell, and the markings appearing on the bottom edge of each cell are optionally indicative only of the horizontal index of the cell. Some such arrangements tolerate less accurate distinction between markings during analysis, but require analysis of two patterns for identifying each cell.
In orientation surface 100C, each cell, for example 110C, is marked with one or more curves 150. In Fig. 1C each curve is a portion of a circle (that is, an arc). In other embodiments, the curves are parts of other shapes, for example, parabolas or hyperbolas.
In some embodiments, it is preferred that the curve allow determination of a focus point outside the curve by simple mathematical calculation. For example, an arc allows calculation of the position of the centre of the circle of which the arc is a portion. In another example, a parabolic curve allows calculation of the focus of the parabola. The centre, focus, or other unique points determined by other curves function similarly to the foci in the example of Fig. 1B. Analysis of the curves allows determining the position of the focus in respect of the captured curve portion, and thus positioning the captured cell in respect of the focus. Optionally, curves associated with each focus are characterized by a further characteristic, allowing identification of the focus from which the distance has been calculated. For example, in one embodiment, arcs are used, and arcs associated with different centres have different radii: from a left focus the arcs are portions of circles having radii of 1, 2, and 3 cm, and from an upper focus the arcs are portions of circles having radii of 1.5, 2.5, and 3.5 cm. This way, calculating the radius of curvature of the curve allows identifying the focus of the curve.
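By way of a hedged sketch, the centre and radius of the circle through three points sampled on a captured arc can be computed with the standard circumcentre formula (not spelled out in the disclosure), and the radius bands of the example above then assign the arc to a focus:

```python
# Illustrative sketch: recover a candidate focus (circle centre) and radius
# from three points on a captured arc, then assign the arc to a focus by
# the radius bands given in the text. Tolerance is an assumption.

import math

def circle_from_arc(p1, p2, p3):
    """Return (centre, radius) of the circle through three non-collinear points."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    d = 2.0 * (x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2))
    if abs(d) < 1e-12:
        raise ValueError("points are collinear; not an arc")
    ux = ((x1**2 + y1**2) * (y2 - y3) + (x2**2 + y2**2) * (y3 - y1)
          + (x3**2 + y3**2) * (y1 - y2)) / d
    uy = ((x1**2 + y1**2) * (x3 - x2) + (x2**2 + y2**2) * (x1 - x3)
          + (x3**2 + y3**2) * (x2 - x1)) / d
    return (ux, uy), math.hypot(x1 - ux, y1 - uy)

def which_focus(radius_cm, tol=0.2):
    """Assign an arc to a focus by its radius band (bands from the text)."""
    if any(abs(radius_cm - r) <= tol for r in (1.0, 2.0, 3.0)):
        return "left focus"
    if any(abs(radius_cm - r) <= tol for r in (1.5, 2.5, 3.5)):
        return "upper focus"
    return "unknown"

centre, radius = circle_from_arc((1.0, 0.0), (0.0, 1.0), (-1.0, 0.0))
print(centre, radius, which_focus(radius))  # ~(0, 0), 1.0, 'left focus'
```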
Additionally or alternatively, cell horizontal walls (for example, 104C) are drawn with double lines of different thickness, where the thicker line turns down. This helps in breaking symmetry, in case symmetry exists between patterns associated with two foci. For example, in Fig. 1C the cells containing the digit 0 and the letter B differ in that the curve in the 0 cell touches a thick line, while the curve in the B cell touches a thin line.
Additionally or alternatively, cell vertical walls are drawn with double lines of different thickness, providing further aid in identifying the position of a captured cell on the image.
Additionally or alternatively to drawing double lines of different thickness, the lines may be different in other characteristics that are easily visible, for example, colour.
Exemplary ways to choose between commands associated with the same cell

In some embodiments, a single cell may have two or more assignments, for example, the same Latin letter as capital or as small, or the same German vowel with or without an "umlaut".
Optionally, the user chooses between the various associated commands by making a specified gesture with the cell phone when the camera is above the cell.
Exemplary gestures that may be used for this or other purposes include, for example, lowering and lifting the camera, moving the camera in a circle, circling in a specified direction (e.g. clockwise or counter clockwise), and moving back and forth above the cell.
Alternatively or additionally, one of the commands associated with each cell is a default, but when a cell is captured right after activating a "shift" key (for instance, by capturing a specified cell), a specified non-default command is to be executed.
Alternatively or additionally, one of the cells is defined as a "caps lock key", and after the capture of this cell, all captured cells are associated with specified non-default commands until the caps-lock cell is captured again.
Alternatively or additionally, a cell-phone key is defined to function as a shift key when the phone is in the mode of command entry via the camera. Alternatively or additionally, a cell-phone key is defined to function as a caps lock key when the phone is in the mode of command entry via the camera.
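The shift and caps-lock behaviour described above can be sketched as a small state machine; the rule that a pending shift inverts the caps-lock state is an assumption made for illustration:

```python
# Illustrative state machine for the shift / caps-lock behaviour described
# above. The shift-inverts-caps rule and the two-commands-per-cell layout
# are assumptions for illustration.

class CommandSelector:
    def __init__(self):
        self.shift_pending = False  # one-shot; set by capturing the shift cell
        self.caps_lock = False      # toggled by capturing the caps-lock cell

    def on_cell(self, default_cmd, alternate_cmd, is_shift=False, is_caps=False):
        """Return the command for a captured cell, or None for modifier cells."""
        if is_shift:
            self.shift_pending = True
            return None
        if is_caps:
            self.caps_lock = not self.caps_lock
            return None
        use_alternate = self.shift_pending != self.caps_lock
        self.shift_pending = False  # shift applies to one capture only
        return alternate_cmd if use_alternate else default_cmd

sel = CommandSelector()
sel.on_cell(None, None, is_shift=True)
print(sel.on_cell("a", "A"))  # -> 'A' (shift applied once)
print(sel.on_cell("a", "A"))  # -> 'a' (shift consumed)
```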
Exemplary operation and control methods

Fig. 2A is a flow chart of actions to be taken in a method of operating a computer according to an exemplary embodiment.
At 202A commands are associated with positions in an orientation surface.
Optionally, the orientation surface is of the kind illustrated in Fig. 1A, 1B, or 1C.
Examples of commands to be associated with positions in the orientation surface include text editing commands, and activating a link, for instance a link to a URL address. Text editing commands comprise, for instance, typing a character, copying, pasting, or scrolling a page.
At 204A an image capturing a portion of the orientation surface is analysed to identify a command associated with the captured pattern. In a preferred embodiment, the analysed image is out of focus.
At 206A the computer is operated to execute the identified command, for example, to type a character, activate a link, etc.

Fig. 2B is a flow chart of actions taken in an exemplary method (200) of controlling a computer to execute a command associated with a cell of orientation surface 100, once the associated cell (and a right edge thereof) is captured by the camera. The method assumes that the camera is very close to the orientation surface, such that a single cell fills the entire field of view of the camera.
At 202B markings are identified in the captured image. In the example of Fig. 1B, for instance, bristles are identified thanks to the alternating colours and sharp edges of the markings.
At 204B the directions of at least two of the identified bristles are determined, with respect to the direction of the edges said bristles run through.
At 206B, the position of at least one focus is determined with regard to the captured cell, by extrapolating the identified bristles, and calculating the point at which the extrapolated bristles intersect.
Optionally, the meeting point of two bristles (preferably the two farthest apart) is first calculated, and then verified using a third bristle associated with the same focus.
Optionally, all pairs of bristles are used, each to independently determine the focus coordinates, and the independently determined values are averaged, to provide an improved estimation of the focus coordinates.
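A sketch of this averaging option, representing each extrapolated bristle as an infinite line given by a point and a direction (a parameterization chosen here for illustration, not taken from the disclosure):

```python
# Sketch of the averaging option at 206B: every pair of extrapolated
# bristle lines is intersected and the intersections are averaged into
# one focus estimate.

from itertools import combinations

def intersect(line_a, line_b):
    """Intersect two lines given as ((px, py), (dx, dy)); None if parallel."""
    (p1, d1), (p2, d2) = line_a, line_b
    denom = d1[0] * d2[1] - d1[1] * d2[0]
    if abs(denom) < 1e-12:
        return None
    t = ((p2[0] - p1[0]) * d2[1] - (p2[1] - p1[1]) * d2[0]) / denom
    return (p1[0] + t * d1[0], p1[1] + t * d1[1])

def estimate_focus(bristle_lines):
    """Average the pairwise intersections of the extrapolated bristles."""
    points = [pt for a, b in combinations(bristle_lines, 2)
              if (pt := intersect(a, b)) is not None]
    if not points:
        raise ValueError("all bristles are parallel; no focus found")
    n = len(points)
    return sum(x for x, _ in points) / n, sum(y for _, y in points) / n

# Three bristles aimed at a focus at (-5, 0):
print(estimate_focus([((0, 0), (1, 0)), ((0, 1), (5, 1)), ((0, -1), (5, -1))]))
```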
In some embodiments, only an upper focus is used. In some embodiments, calculations made with two bristles that meet at the left focus are verified by one bristle that originates at an upper focus. Other possibilities are similarly available to a skilled person.
At 208B the imaged cell is indexed with a row-index and a column-index (counting, for example, from the leftmost uppermost cell), based on the horizontal and vertical distance of the captured right edge from the calculated coordinates of the left focus. The terms "horizontal" and "vertical" are used to denote directions parallel to the grid lines, which do not necessarily coincide with the orientation of the captured image. If the pattern contains bristles associated with more than one focus, for example, with a left focus and an upper focus, results obtained with the two foci are optionally compared for verification, or averaged as suggested above.
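A hedged sketch of this indexing step, dividing the offset of the captured edge from the computed focus by a known cell pitch; the 2.4 x 3 cm pitch reuses the illustrative cell size mentioned earlier, and the rounding policy is an assumption:

```python
# Hedged sketch of step 208B: convert the (horizontal, vertical) distance
# of the captured right edge from the focus into cell indices by dividing
# by a known cell pitch. Pitch and rounding are assumptions.

CELL_W_CM, CELL_H_CM = 2.4, 3.0

def index_cell(dx_cm, dy_cm):
    """Map the edge-to-focus distances to a (row, column) pair."""
    return round(dy_cm / CELL_H_CM), round(dx_cm / CELL_W_CM)

print(index_cell(7.2, 6.0))  # -> (2, 3): third row, fourth column from the origin
```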
At 210B, the indices of the cell are communicated to an application, which optionally resides on the cell phone.
At 212B the application searches an LUT to find a command associated with the indexed cell; and at 214B the command is executed.
In some embodiments, an orientation surface is used to identify not only the indices of the imaged cell but also an accurate location within the cell. In some embodiments, an exact location of the camera is estimated based on the coordinates of the centre of the field of view of the camera in respect of the identified focus.
Optionally, the accurate position is measured with respect to the grid lines forming the cell, after the cell is indexed as in method 200. Examples of applications that may benefit from obtaining an exact location of the camera include a drawing tool and an internet browsing aid (when multiple hyper-text links are found in the same cell).
An exemplary allocation method

Fig. 3 is a flowchart of actions to be taken in a method 300 of allocating an imaging device in respect of an orientation surface, according to an exemplary embodiment.
At 302 an orientation surface is provided. Optionally, the orientation surface is of the kind illustrated in Fig. 1B.
The orientation surface comprises various portions, each carrying a unique pattern. For example, one kind of unique pattern appearing on pattern 100 of Fig. 1B is that of cell 110, composed of trapezoids 112' and 112" on the left edge of cell 110, and/or trapezoids 114 on the right edge of cell 110.
In an exemplary embodiment, each portion of the orientation surface, for example, each cell in pattern 100, is associated with a visibility region, from which the portion is visible to the imaging device. In one example, each cell is associated with a region having the cell at the centre of its base. The borders of the region base are optionally the borders of the field of view of the imaging device. The height of the region is optionally a predetermined height, at which the orientation surface is out of focus of the imaging device. Optionally, the predetermined height is that height at which the size of the field of view of the imaging device is about one cell. In this case, the cell is the base of the region forming the visibility volume.
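Under these assumptions, a visibility volume may be modelled as a box whose base is the cell and whose height is the predetermined out-of-focus height; the sketch below is illustrative and all numeric values are placeholders:

```python
# Illustrative model of a visibility volume: a box whose base is the cell
# and whose height is the height at which the field of view spans about
# one cell. Fields, units, and example numbers are assumptions.

from dataclasses import dataclass

@dataclass
class VisibilityVolume:
    x_cm: float           # left edge of the cell on the surface
    y_cm: float           # top edge of the cell on the surface
    w_cm: float           # cell width
    h_cm: float           # cell height
    max_height_cm: float  # height at which the FOV covers one cell

    def contains(self, cam_x, cam_y, cam_z):
        """True if the camera position lies inside this visibility volume."""
        return (self.x_cm <= cam_x <= self.x_cm + self.w_cm
                and self.y_cm <= cam_y <= self.y_cm + self.h_cm
                and 0.0 < cam_z <= self.max_height_cm)

vol = VisibilityVolume(x_cm=4.8, y_cm=6.0, w_cm=2.4, h_cm=3.0, max_height_cm=5.0)
print(vol.contains(5.5, 7.0, 3.0))  # -> True
```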
At 304 an image of a portion of the orientation surface, taken from a position to be identified, is processed to identify a unique pattern captured in the image.
For example, an edge map of the image is prepared, as is known per se in the art of image processing. The edges found in the image are considered edges characteristic of the captured unique pattern, for example, the bristle and edge portions defining a trapezoid in Fig. 1B. The shapes of the edges (for example, the angles between sides of an identified trapezoid) are used to identify the location of the pattern.
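One conventional way to prepare such an edge map (an assumption; the text says only that the step is known per se) is OpenCV's Canny detector, with illustrative thresholds:

```python
# Hedged sketch of the edge-map step using OpenCV's Canny detector;
# the thresholds are illustrative, not taken from the disclosure.

import cv2

def edge_map(image_path):
    """Return a binary edge map of the captured orientation-surface portion."""
    gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    if gray is None:
        raise FileNotFoundError(image_path)
    # Sharp colour/brightness steps between bordering trapezoids survive
    # defocus blur, so moderate thresholds are enough to find the bristles.
    return cv2.Canny(gray, 50, 150)
```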
At 306, the identified unique pattern is associated with a specific portion of the orientation surface.
In some embodiments, there is a look-up-table (LUT), associating each observable pattern with a portion of the orientation surface, for example, each trapezoid combination appearing in orientation surface 100 with a cell.
Alternatively or additionally, the shape of the unique pattern captured in the image is analysed, for example, computationally, to identify the surface portion associated therewith. For example, in the orientation surface of Fig. 1B, the bristles of the captured trapezoids are extrapolated to identify the orientation of the captured bristles in respect of the focus.
In case several lines of markings are captured (that is, portions of several edges), the captured pattern is optionally associated with the cell at the left of the rightmost line of markings, that is, the rightmost cell, the right edge of which was captured.
In another example, the captured pattern is associated with the cells between the rightmost cell, the right edge of which was captured, and the leftmost cell, the right edge of which was captured.
In still another example, the captured pattern is associated with the cell in the middle between the rightmost and leftmost cells, the right edges of which were captured.
In some embodiments, more than one of the above examples is used, and several visibility volumes are suggested for a captured image. Optionally, one of them is chosen according to the results of the analysis of image(s) taken earlier and/or later. For example, when each cell is associated with a character, and some images are associated with several cells, predictive text techniques may be applied for choosing which cell to associate with which image.
At 308, the position of the imaging device at the moment of capturing the analysed image is identified as being within a visibility volume associated with the captured pattern.
In some embodiments, the position of the imaging device is associated with a command to be performed by a computer, for instance, via an LUT. In other embodiments, the position of the imaging device is used for other purposes, for instance for tracking the imaging device.
An exemplary system for locating an imaging device

Fig. 4 is a simplified block diagram of a system 400 for locating an imaging device, in accordance with an exemplary embodiment. System 400 comprises: imaging device 402, having an output 404 for outputting digital image data acquired by device 402; an orientation surface 410, optionally carried on a substrate 412, such as a sheet of paper; and a processor 420.
The processor has an input module 422, connected to output 404 of imaging device 402, for receiving in the processor image data captured by the imaging device.
Processor 420 also has a pattern identifying module 424, for identifying a unique pattern in image data received from imaging device 402 through input module 422.
Processor 420 also has an allocating module 430, configured to receive from pattern identifying module 424 a signal indicative of the identified unique pattern and associate with said identified unique pattern a visibility volume, from which the identified unique pattern is visible to said imaging device.
Output 432 of processor 420 outputs the location of said visibility volume as the identified location of imaging device 402.
Optionally, system 400 also comprises a fine-positioning module (436), which determines the position of the centre of the image within the captured cell and provides this information to module 430, for more accurate allocation of the imaging device. For example, if the centre of the image is in the upper half of the cell, the imaging device is identified to be in an upper portion of the visibility volume (where "upper portion of the visibility volume" means a portion having a base parallel to the upper portion of the cell).
Optionally, fine-positioning module 436 measures distances between the centre of an image taken by imaging device 402 and two adjacent edges of a cell associated with the unique pattern identified by identifying module 424, to determine the position of the centre of the image in respect of the cell's edges.
Alternatively or additionally, fine-positioning module 436 measures distances between the centre of the image and specified points within the cell identified by identifying module 424 to determine the position of the centre of the image in respect of the specified points.
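A minimal sketch of this fine-positioning measurement, expressing the image centre as fractions of the cell width and height; the image-plane coordinate conventions are assumptions:

```python
# Hedged sketch of fine positioning: the image centre's place in the cell
# is measured from two adjacent cell edges. Coordinates are assumed to be
# pixel positions in the captured frame.

def fine_position(centre_x, centre_y, left_edge_x, top_edge_y, cell_w, cell_h):
    """Return (fx, fy) in [0, 1]: the image centre's fractional place in the cell."""
    fx = (centre_x - left_edge_x) / cell_w
    fy = (centre_y - top_edge_y) / cell_h
    clamp = lambda v: max(0.0, min(1.0, v))
    return clamp(fx), clamp(fy)

print(fine_position(160, 120, 40, 30, 240, 180))  # -> (0.5, 0.5): centre of the cell
```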
An exemplary system for operating a computer

A system with a similar simplified block diagram is optionally used for controlling a computer to execute a command. In such a system, allocating module 430 is replaced with a command identifying module, configured to receive from pattern identifying module 424 a signal indicative of the identified unique pattern and to associate with said identified unique pattern a command to be executed by said computer.
Optionally, fine-positioning module 436, which determines the position of the centre of the image within the captured cell, provides this information to module 430, for distinguishing between two commands associated with the same cell.
Optionally, output 432 is connected to an input of a computer 440 that is to execute the command. This connection is optionally via wireless communication.
Alternatively or additionally, this connection is via wired communication.
Optional embodiments of different system parts

Imaging device 402 is optionally a camera of a cell-phone. In some embodiments the imaging device is a camera, a CCD, or other imaging device, the location of which in respect of the orientation surface is to be found.
Processor 420 is optionally in the same cell-phone as imaging device 402, in which case output 404 and input 422 are internal in the cell-phone.
The orientation surface is optionally according to an embodiment of the present invention, for example, an orientation surface as illustrated in Fig. 1A or 1B. The pattern identifying module optionally runs an edge-recognizing procedure, as known per se in the art of image processing, to identify marks or edges between adjacent regions in the orientation surface.
Allocating module 430 optionally comprises a storage medium 434, storing an LUT associating patterns to visibility volumes.
Additionally or alternatively, allocating module 430 runs a process which analyses visible characteristics of the identified unique pattern to determine the location of the pattern on the orientation surface.
Claims (26)
CLAIMS
- 1. A method of operating a computer to execute a command, the method comprising: a. providing an orientation surface carrying a plurality of patterns, each pattern printed on a location on the surface and having a visual characteristic indicative of said location; b. associating said command with a specified location on the orientation surface; c. providing an image capturing a pattern printed in the vicinity of said specified location; d. identifying the captured location in accordance with the visual characteristics of the captured pattern; and e. operating the computer to execute the command associated with the identified location.
- 2. A method according to claim 1, wherein each pattern is associated with a focus point, wherein each focus point has a known location, and wherein the visible characteristic is a continuous function of a distance between the pattern and the focus point.
- 3. A method according to claim 2, wherein each pattern is associated with one or more focus points, and the total number of focus points is four or less.
- 4. A method according to claim 1, wherein the captured pattern is out of focus.
- 5. A method according to any of the preceding claims, wherein said visual characteristics comprise one or more of the colour, the brightness, and the shape of the pattern.
- 6. A method according to any of the preceding claims, wherein each of the patterns comprises bordering regions; and the shape of borders between said bordering regions is indicative of the pattern location on the surface.
- 7. A method according to claim 6, wherein said borders are sections of lines; and at least some of said lines meet in a focus point.
- 8. A method according to claim 6 or 7, wherein said bordering regions differ in at least one visible characteristic to an extent allowing identifying the borders between the regions even when the regions are out of focus.
- 9. A method according to any of claims 6 to 8, comprising extrapolating said borders between regions in said patterns to identify a point at which the extrapolated borders meet, and identifying a cell associated with said pattern in accordance with the coordinates of said point.
- 10. A method according to any of the preceding claims, wherein said orientation surface comprises cells, each being associated with a pattern having visible characteristics corresponding to the location of the cell in the orientation pattern.
- 11. A method according to claim 10, wherein the command is associated with one or more of said cells.
- 12. A method according to any of the preceding claims, comprising hovering above said orientation surface with an imaging device and capturing portions of said orientation surface from distances shorter than the focus distance of said imaging device.
- 13. A method according to any of the preceding claims, comprising finding in a look up table (LUT) which command is associated with an identified location.
- 14. A method of finding a position of an imaging device in respect of a substrate carrying an orientation surface, the orientation surface comprising cells, each cell being associated with a unique pattern having visible characteristics indicative of the location of the cell on the orientation surface in respect of a specified focus point, the method comprising: a. associating each unique pattern with a visibility volume, from which said unique pattern is visible to said imaging device; b. processing an image of a portion of said orientation surface taken with said imaging device positioned at said position to identify a unique pattern captured in the image; and c. identifying the position of said imaging device as being within the visibility volume associated with the captured unique pattern.
- 15. A method according to claim 14, wherein identifying a visibility volume associated with the unique pattern comprises analysing the shape of the pattern to determine the location of the pattern on the surface.
- 16. A method according to any of claims 1 to 13, wherein said command comprises typing a character.
- 17. A method according to claim 16, wherein each pattern is further associated with a second pattern, shaped to represent said character.
- 18. A method according to any of claims 1 to 13, wherein said command comprises activating a link.
- 19. A system for locating an imaging device, the system comprising: a. said imaging device b. a surface carrying an orientation surface comprising various portions, each being associated with a unique pattern, and with a visibility volume, from which said unique pattern is visible to said imaging device; c. a processor, having i. an input module connected to an output of said imaging device, for receiving in the processor images captured by said imaging device, ii. a pattern identifying module, for identifying in an image received from the imaging device through said input module a unique pattern, and iii. an allocating module, configured to receive from said pattern identifying module a signal indicative of the identified unique pattern and associate with said identified unique pattern a visibility volume, from which the identified unique pattern is visible to said imaging device.
- 20. A system according to claim 19, wherein said allocating module comprises a storage medium, storing an LUT associating patterns to visibility volumes.
- 21. A system according to claim 19, wherein said allocating module is configured to run a process, which analyses visible characteristics of the identified unique pattern to determine the location of the pattern on said surface.
- 22. A system according to any one of claims 19 to 21, wherein said surface is a substrate carrying an orientation surface, said orientation surface comprising cells, each cell being associated with a unique pattern having visible characteristics indicative of the location of the cell on the orientation surface in respect of a specified focus point.
- 23. A system for controlling a computer to execute a command, the system comprising: a. a surface carrying an orientation surface comprising various portions, each carrying a unique pattern, and each being associated with a visibility volume, from which said unique pattern is visible to said imaging device; b. an imaging device, which takes an image of one of said patterns c. a processor, having i. an input module connected to an output of said imaging device, for receiving in the processor images captured by said imaging device, ii. a pattern identifying module, for identifying a unique pattern in an image received from the imaging device through said input module; and iii. a command identifying module, configured to receive from said pattern identifying module a signal indicative of the identified unique pattern and to associate with said identified unique pattern a command to be executed by said computer d. a communication module, which communicates said command to a computer for carrying out the command.
- 24. A system according to claim 23, wherein the processor further comprises a fine-positioning module, which determines the position of the centre of the image within the captured cell and wherein the command identifying module identifies the command based on said position.
- 25. A system according to claim 23, wherein said fine-positioning module measures distances between the centre of said image and two adjacent edges of a cell associated with said unique pattern, to determine the position of the centre of the image in respect of the cell's edges.
- 26. A system according to claim 24 or 25, wherein said fine-positioning module measures distances between the centre of said image and specified points within the cells to determine the position of the centre of the image in respect of the specified points.
Priority Applications (1)

| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| GB1008528A GB2480602A (en) | 2010-05-24 | 2010-05-24 | Orientation surfaces and uses thereof |

Applications Claiming Priority (1)

| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| GB1008528A GB2480602A (en) | 2010-05-24 | 2010-05-24 | Orientation surfaces and uses thereof |
Publications (2)

| Publication Number | Publication Date |
|---|---|
| GB201008528D0 (en) | 2010-07-07 |
| GB2480602A (en) | 2011-11-30 |
Family

ID=42341136

Family Applications (1)

| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| GB1008528A GB2480602A (en) (withdrawn) | Orientation surfaces and uses thereof | 2010-05-24 | 2010-05-24 |

Country Status (1)

| Country | Link |
|---|---|
| GB (1) | GB2480602A (en) |
Patent Citations (4)

| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20020044134A1 (en) * | 2000-02-18 | 2002-04-18 | Petter Ericson | Input unit arrangement |
| WO2007003682A1 (en) * | 2005-06-30 | 2007-01-11 | Nokia Corporation | Camera control means to allow operating of a destined location of the information surface of a presentation and information system |
| GB2431269A (en) * | 2005-10-13 | 2007-04-18 | Hewlett Packard Development Co | Detector for generating a model representing a form of markings for a pattern |
| US20090226101A1 (en) * | 2008-03-05 | 2009-09-10 | Sony Ericsson Mobile Communications Ab | System, devices, method, computer program product |
Also Published As

| Publication number | Publication date |
|---|---|
| GB201008528D0 (en) | 2010-07-07 |
Similar Documents

| Publication | Title |
|---|---|
| KR101324107B1 (en) | Information output apparatus |
| JP5084718B2 (en) | Combination detection of position coding pattern and barcode |
| US6992655B2 (en) | Input unit arrangement |
| US6844871B1 (en) | Method and apparatus for computer input using six degrees of freedom |
| JP4203517B2 (en) | Information output device |
| US20160147723A1 (en) | Method and device for amending handwritten characters |
| US20100225664A1 (en) | Content display apparatus |
| KR20040038643A (en) | Universal computing device |
| KR20020033775A (en) | Notepad |
| JP5664303B2 (en) | Computer apparatus, input system, and program |
| JP3879106B1 (en) | Information output device |
| EP2591441B1 (en) | Dot code pattern for absolute position and other information using an optical pen, process of printing the dot code, process of reading the dot code |
| CN113535055A (en) | Method, equipment and storage medium for playing point reading material based on virtual reality |
| WO2006135329A1 (en) | On demand generation of position-coded bases |
| JP4308306B2 (en) | Print output control means |
| WO2001048590A1 (en) | Written command |
| GB2480602A (en) | Orientation surfaces and uses thereof |
| JP6048165B2 (en) | Computer apparatus, electronic pen system, and program |
| JP5810724B2 (en) | Terminal device, electronic pen system, and program |
| JP2009230411A (en) | Character input system |
| JP5663543B2 (en) | Map with dot pattern printed |
| JP5678697B2 (en) | Computer apparatus, input system, and program |
| JP5294060B2 (en) | Print output processing method |
| JP5104904B2 (en) | Information processing system and display processing program |
| JP2012164139A (en) | Computer, input system and program |
Legal Events

| Date | Code | Title | Description |
|---|---|---|---|
| | WAP | Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1) | |