HK1141607B - Note capture device - Google Patents
- Publication number
- HK1141607B (application number HK10107863.7A)
- Authority
- HK
- Hong Kong
- Prior art keywords
- image
- capture device
- recited
- image capture
- camera
Description
Cross Reference to Related Applications
The present application claims the benefit of U.S. provisional patent application No. 60/890,180, entitled "Note Capture Device," filed on Feb. 15, 2007, which is incorporated herein by reference.
Technical Field
Exemplary embodiments disclosed herein pertain to digital electronic devices for capturing information. More particularly, exemplary embodiments disclosed herein pertain to an apparatus for capturing handwriting and related metadata.
Background
As a convenient way to record information, handwritten notes have been used throughout recorded history. While some changes have occurred over the years in the technology used to produce paper and writing instruments such as pens and pencils, the actual method of writing notes by hand has not changed substantially over thousands of years. This longevity testifies to the convenience and effectiveness of this form of recorded information.
With the advent of electronic computers, various attempts have been made to improve handwritten notes. Notably, "personal digital assistants" (PDAs) and "pen computers" have been developed for capturing handwritten content and providing various organizational services.
Handwriting recognition is often provided in PDAs and pen computers to gain the advantage of being able to search for and edit information.
Handwriting recognition is the ability of a computer to receive understandable handwriting input. By optical scanning (optical feature recognition), an image of the written text can be perceived "off-line" from the paper. Alternatively, the movement of the pen tip may be sensed "on-line", for example, by a pen-based computer screen surface.
Problems of recognition accuracy were encountered in early attempts at handwriting recognition. It is generally understood that recognition algorithms lack the contextual understanding of a human reader and therefore have difficulty deciphering handwritten information. Even people sometimes have difficulty understanding handwriting.
A series of PDAs based on the successful Palm operating system improved recognition usability by defining a set of strokes for each character. While memorizing these stroke patterns does extend the user's learning curve, it makes erroneous input less likely.
In recent years, attempts have also been made to produce ink pens that include digital elements so that one can write on paper and digitally store the text produced. The success of these products, to date, remains to be determined.
It has become clear that real-time handwriting recognition places an undue burden on the user as he is making notes (i.e., as he is focusing on the content of his notes): he is required to correct recognition errors or is distracted by having to use an unfamiliar writing form.
The tactile sensation of writing on paper is also lost when using a typical PDA or pen-input computer. Rather than writing on the familiar pad of a notepad, with its familiar drag to stabilize the writing, the user writes on a hard and apparently low-friction surface and is forced to compensate by stabilizing the pen himself. This unfamiliar writing mode also intrudes on the user's thinking process when writing the content of a particular note.
The Post-it note, invented and manufactured by 3M, is a piece of paper with a re-adherable strip of adhesive on the back, designed for temporarily attaching notes to documents, computer displays, and the like. While a wide range of colors, shapes, and sizes is now available, the most common size for note paper is 3 inches square, and the signature color is yellow.
Note papers have become a convenient medium for recording informal notes such as reminders, telephone numbers, and telephone messages. They feature a low-tack adhesive strip that allows each note to be easily attached and removed without leaving marks or residue. The use of note paper is ubiquitous. For making notes, such paper offers a very high standard of convenience: the user may simply write on the notepad, peel the note off, and then place it in a prominent location as a reminder, attach it to a file as a comment, and so on.
The electronic devices of the prior art do not achieve the level of convenience in making notes that is achieved when writing on note paper. What is needed is a device that combines the convenience of note paper with the advantages of electronic capture for transmittal, retrieval and editing. Unfortunately, the prior art is limited in that it fails to provide any such solution.
These and other limitations of the prior art will become apparent to those of ordinary skill in the art upon a reading of the following description and a study of the several drawings.
Disclosure of Invention
Some non-limiting exemplary embodiments provide a note capture device that includes a writing surface, a camera fixed to view the writing surface, and electronic circuitry to capture images of the surface, or various alternative surfaces, including, for example, a business card surface. Various image transformations are understood which can correct for camera angles with respect to the tilt of the surface, for example. Other transformations to enhance the readability of the image, or to add indicia, logos, etc., are also contemplated.
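By way of illustration only, a transformation to enhance readability might be as simple as thresholding the grayscale image so that pen strokes stand out from the paper. The threshold value and the list-of-lists image representation below are assumptions for this sketch, not details from the disclosure:

```python
def binarize(gray, threshold=128):
    """Map a grayscale image (rows of 0-255 intensities) to pure black
    and white, sharpening pen strokes against the paper background."""
    return [[0 if px < threshold else 255 for px in row] for row in gray]

# A tiny 2x3 "image": dark ink in the first column, light paper elsewhere.
note = [[30, 200, 210],
        [25, 190, 220]]
bw = binarize(note)
```

Adding indicia or a logo would amount to overwriting a region of the output image in the same row-by-row fashion.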
Some example embodiments provide for automatic detection of completion of a written note and, when detected, cause capture of the completed note. Some embodiments employ different sensors to detect completion of writing notes. Other embodiments perform this detection by using button taps. Still other embodiments perform this detection by image analysis. Various combinations of these embodiments are understood.
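As a hedged sketch of how the signals from these detection modes might be combined (the sensor names and the simple boolean logic are assumptions for illustration, not the patented method):

```python
def note_complete(copy_button, palm_rest, page_removed, image_stable):
    """Combine completion signals: an explicit button press or a page
    removal always completes the note; otherwise require that the hand
    has left the palm rest and the captured image has stopped changing."""
    return copy_button or page_removed or (not palm_rest and image_stable)
```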
In some exemplary embodiments and combinations, the captured images are stored internally and/or transmitted electronically to an external device such as a personal computer, cell phone, server, or the like. The mode of communication between the note capture device and the external device may include a cable directly or indirectly coupling the devices, or a wireless connection directly or indirectly coupling the devices. It will further be appreciated that this communication may occur through the use of removable media such as magnetic media, flash memory devices, and other non-volatile memory devices and media. Other general methods of transferring data from one device to another are also understood.
In some embodiments, a screen is provided to allow a user to interact with the note capture device and direct different modes of communication for the purpose of sending captured notes to different destinations using different protocols, such protocols including, by way of example and not limitation, SMTP, FTP, SFTP, HTTP, and HTTPS, as well as file sharing protocols such as those provided by different commercially available operating systems including Windows, Mac OS, Linux, CE Mobile, Palm OS, Symbian, Java-based OSes, and the like.
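Of the listed protocols, SMTP can serve as an illustration. The sketch below, using Python's standard email library, packages a captured note image as a message ready to hand to an SMTP client; the addresses and filename are placeholders:

```python
from email.message import EmailMessage

def build_note_email(image_bytes, sender, recipient):
    """Wrap a captured note image in a MIME message with the image
    attached; smtplib.SMTP(...).send_message(msg) would then deliver it."""
    msg = EmailMessage()
    msg["From"] = sender
    msg["To"] = recipient
    msg["Subject"] = "Captured note"
    msg.set_content("A note captured by the note capture device is attached.")
    msg.add_attachment(image_bytes, maintype="image", subtype="png",
                       filename="note.png")
    return msg
```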
Handwriting recognition software may be employed to generate text representing captured notes in whole or in part, in different modes of operation. The text thus identified may be used along with or instead of the image information associated with the note.
The image information associated with the note may be converted in different ways, including converting it to stroke information or compressing, etc.
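As one hedged example of such a conversion, lossless compression of the raw image bytes can be sketched with Python's standard zlib module (stroke extraction, the other conversion mentioned, would require substantially more machinery):

```python
import zlib

def compress_note(image_bytes):
    """Losslessly compress a captured note image for storage or
    transmission; zlib.decompress recovers the original bytes exactly."""
    return zlib.compress(image_bytes)
```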
Some other exemplary embodiments may include a planar image capture device including a rest surface defining a base image plane and a solid state camera device defining an optical axis that is oblique to the base image plane. As used herein, a solid state camera generally refers to a camera made from semiconductor manufacturing techniques and, therefore, does not necessarily include one or more lenses. However, other cameras of different types may also be used in conjunction with the claimed embodiments.
Another exemplary embodiment includes a planar image capture system having a planar image capture device. The planar image capture device has a rest surface defining a base image plane and a solid state camera device defining an optical axis that is oblique to the base image plane, the camera device having a fixed optical focus with a depth of field that extends from the solid state camera device by at least about 2.5 inches but no more than about 7.5 inches. In addition, the system includes a computing device coupled to the planar image capture device, the computing device capable of at least receiving image information from the capture device.
Yet another exemplary embodiment provides a method for providing image information. The method includes aiming a solid-state camera device having an optical axis oblique to a base image plane and having a fixed focal point with a depth of field extending from the solid-state camera device by at least about 2.5 inches but no more than about 7.5 inches. The method also includes capturing image data with the solid-state camera device and storing, at least in part, the image data in a digital memory.
Another exemplary embodiment provides a computer-usable medium having computer-readable instructions stored thereon for execution by a processor to perform a method. The method includes aiming a solid-state camera device having an optical axis oblique to a base image plane and having a fixed focal point with a depth of field extending from the solid-state camera device by at least about 2.5 inches but no more than about 7.5 inches, capturing image data with the solid-state camera device, and at least partially storing the image data in a digital memory.
Yet another exemplary embodiment provides an image capture and transmission system including a network and a planar image capture device. The planar image capture device includes a support surface defining a base image plane, a solid state camera device defining an optical axis that is oblique to the base image plane, the solid state camera device having a fixed optical focus with a depth of field that extends from the solid state camera device by at least about 2.5 inches but no more than about 7.5 inches, and a computing device integrated with the planar image capture device capable of at least receiving image information from the capture device. The planar image capture device is coupled to the network and is operable to transmit the image information to another device over the network.
These and other embodiments as well as advantages and other features disclosed herein will become apparent to those of ordinary skill in the art upon reading the following description and studying the various drawings.
Aspects of the different illustrative embodiments provide software to view, browse, share, search, and edit captured notes, their associated metadata, recognition results, and the like. The different exemplary embodiments include human interface software associated with a user's "desktop" or different application windows.
Drawings
Exemplary embodiments are described with reference to the drawings, wherein like parts are designated with like reference numerals. These exemplary embodiments are intended to illustrate, but not to limit, the invention. The following figures are included:
FIG. 1 is a high-level diagram showing note capture device 2 and various other devices in communication therewith;
FIG. 2 shows note capture device 2 of FIG. 1 in greater detail;
FIGS. 3A-3B still more particularly illustrate note capture device 2 of FIGS. 1 and 2;
FIG. 4 shows camera head 32 of FIGS. 2 and 3 in more detail;
FIG. 5 illustrates note capture device 2 with yet another sensor mechanism for detecting a different mode of operation;
FIG. 6 shows the field of view of camera head 32 in more detail;
FIG. 7 shows copy button 24 in more detail;
FIG. 8 shows note capture device 2 including wireless communication subsystem 12 in an exemplary embodiment;
FIG. 9 illustrates an exemplary embodiment of note capture device 2 in which a specifically designated area of the writing surface has a particular meaning depending on the configuration of the note;
FIG. 10 illustrates an exemplary embodiment of note capture device 2 including an LCD display screen 60, the LCD display screen 60 may be used in conjunction with the capture process to help ensure a clear image is obtained;
FIG. 11 is a block diagram of circuitry implemented on the printed circuit board 38 of FIGS. 3, 5, 6, and 7;
FIG. 12 is a flowchart depicting a capture program running on note capture device 2;
FIGS. 13, 14 and 15 show portions of a time line that more particularly describe some aspects of some embodiments of the operation of FIG. 12;
FIG. 16 is a diagram showing an alternative embodiment for processing data from a camera sensor using a "data extraction" technique to achieve a high level "image timeline";
FIG. 17 is a flowchart depicting the operation of application software used in connection with notes captured by note capture device 2;
FIG. 18 is a flowchart depicting in greater detail the exemplary operational event loop operation 290 of FIG. 17;
FIG. 19 is a flowchart outlining in greater detail the exemplary process events operation 298 of FIG. 18;
FIG. 20 is a diagram illustrating an exemplary embodiment of a Graphical User Interface (GUI) for application software;
FIG. 21 is a diagram illustrating exemplary image and performance characteristics of application software;
FIG. 22 is a diagram depicting an exemplary metadata display/edit GUI; and
FIG. 23 is a diagram depicting an exemplary embodiment of an alternative business card scan.
Detailed Description
FIG. 1 is a high-level diagram showing note capture device 2 and various other devices in communication therewith. For example, in an exemplary embodiment, note capture device 2 may communicate with personal computer 4 via USB cable 6. It is also understood that note capture device 2 may communicate with personal computer 4 through other means. In an exemplary embodiment, note capture device 2 may be powered by battery 8 or other device such as power adapter 10 (not shown) that draws power from, for example, an AC wall outlet running at 110 volts. In various exemplary embodiments, note capture device 2 uses wireless communication subsystem 12 to communicate with various wireless devices such as personal computer 4, wireless internet router 14, laptop computer 16, and cellular telephone 17. These various devices in communication with note capture device 2 are optionally coupled to the Internet 18. The purpose of note capture device 2 is to provide an intuitive method for capturing handwritten information, such as that contained in a note paper. The captured handwritten information is then transmitted to one of the various devices in communication with note capture device 2, including personal computer 4, wireless internet router 14, laptop computer 16, or cellular telephone 17. In some exemplary embodiments, the captured information may be stored on the device or forwarded via the internet 18 to a different electronic device connected to the internet 18, such as a server 20 (not shown) or other personal computer 4, or the like.
FIG. 2 shows note capture device 2 of FIG. 1 in greater detail. Note capture device 2 includes palm rest 22 (also referred to as a rest surface), copy button 24, notepad 26 whose top page is a removable writing surface, function lights 28, camera stand 30, and camera head 32. Palm rest 22 is substantially continuous, and the area where the palm actually contacts palm rest 22 is substantially less than the total surface area of palm rest 22. In some exemplary embodiments, these components serve as a human-machine interface to note capture device 2 in different subsets, supersets, and combinations, although other components exist internally. Various sensors and electronics (not shown) are required to provide the functionality of note capture device 2 and will be discussed in greater detail below. When using note capture device 2, the user simply holds a writing instrument such as a pen or pencil and writes on the notepad. In an exemplary embodiment, completion of writing on notepad 26 is indicated by the user pressing copy button 24 or by another manual or automatic triggering mechanism signaling completion of the writing. The image on the top note of notepad 26 is then sent to an external device via USB cable 6, wireless communication subsystem 12, or by other means. It should be noted that the image sent at this point has been captured prior to the press of copy button 24. In other words, the pressing of copy button 24 indicates that the image should be sent, and may or may not also serve as a prompt indicating that camera head 32 should activate and obtain the image. Camera head 32 may operate in a continuous mode or may be activated on demand in conjunction with a press of copy button 24. In addition, reference numeral 23 generally designates an elongated base assembly.
In some embodiments, notepad 26 may be of the type: wherein each sheet of paper is held to another sheet by a soft, reusable adhesive compound.
In one embodiment, by way of non-limiting example, a planar image capture device includes a rest surface defining a base image plane and a solid-state camera device defining an optical axis oblique to the base image plane. The solid state camera also has a depth of field extending from the solid state camera by at least about 2.5 inches but no more than about 7.5 inches. In general, the depth of field is the range of distances from the camera over which an imaged surface remains substantially in focus. Also generally, the depth of field is calibrated to a fixed range when the planar image capture device is assembled.
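The near and far limits of such a fixed depth of field can be estimated with the standard thin-lens hyperfocal approximation. This is generic optics, not a calculation from the disclosure, and the focal length, f-number, and circle of confusion below are illustrative values only:

```python
def depth_of_field(focal_mm, f_number, coc_mm, focus_mm):
    """Return (near, far) limits of acceptable focus, in millimeters,
    using the hyperfocal-distance approximation (valid while the focus
    distance is shorter than the hyperfocal distance)."""
    hyperfocal = focal_mm ** 2 / (f_number * coc_mm) + focal_mm
    near = hyperfocal * focus_mm / (hyperfocal + focus_mm)
    far = hyperfocal * focus_mm / (hyperfocal - focus_mm)
    return near, far

# Hypothetical small camera module focused at 5 inches (127 mm).
near, far = depth_of_field(focal_mm=4.0, f_number=2.8, coc_mm=0.01,
                           focus_mm=127.0)
```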
FIG. 3 shows note capture device 2 of FIGS. 1 and 2 in still more detail. Notepad 26 rests on notepad rest 34 and provides a writing surface for the user. A copy button 24 is provided to allow the user to indicate that a note has been completed. Copy button microswitch 36, under copy button 24, converts the motion of copy button 24 into an electrical signal to indicate completion of a note. In one embodiment, copy button microswitch 36 is mounted on printed circuit board 38. In an exemplary embodiment, palm rest 22 contacts palm rest microswitch 40 mounted on printed circuit board 38. Processor 42 is mounted on printed circuit board 38 and provides the computing power to implement various functions of note capture device 2. Camera head 32 is mounted on structural support member 30 and is electrically connected to printed circuit board 38 via camera cable 44. Structural support member 30 may be defined as a stand or a display. In one non-limiting exemplary embodiment, a page removal detector 46 is provided to help detect the removal of notes. Such detection may also be accomplished passively, by analyzing the images obtained by camera head 32. The number and type of sensors may vary greatly in different embodiments of note capture device 2. In one exemplary embodiment, page removal detector 46 is spring loaded so that it maintains constant physical contact with the top note of notepad 26. When page removal detector 46 detects an action indicating the removal of a note, an electrical signal is generated and transmitted to printed circuit board 38. This signal, together with signals from the various sensors including palm rest microswitch 40, copy button microswitch 36, and page removal detector 46, and the images obtained by camera head 32, is used to help determine when a note is complete and should be captured.
When the note is captured, a signal including an image of the note is transmitted to an external device such as the personal computer 4, the wireless internet router 14, the laptop computer 16, or the cellular phone 17. The transfer of the image may occur in different ways, including wireless communication via the wireless communication subsystem 12 (not shown) or through a physically connected device, such as a USB cable 6 connected to the printed circuit board 38 via a USB connector 48. The USB cable 6 is only one of many possible physical connections. Alternatives include ethernet cables, etc.
FIG. 2 and other figures also depict a base image plane 27 comprising a substantially flat surface. It should be noted that base image plane 27 may take a variety of shapes other than the generally square shape of notepad 26 and need not be continuous. In addition, as shown in FIG. 3B, a vertical axis 29 is defined by the base image plane 27. As used herein, a "vertical depth of field" is defined as a depth of field substantially parallel to the vertical axis 29 within which an image detected by camera head 32 can be sufficiently resolved for use by the system, e.g., the image has sufficient focus and/or resolution for recognition and/or other purposes of the system.
FIG. 4 shows camera head 32 of FIGS. 2 and 3 in more detail. In one exemplary embodiment, camera head 32 includes a camera printed circuit board 50, a camera assembly 52, and an LED 54. Camera assembly 52 includes a lens and camera sensor mechanism capable of sending a series of images obtained from the camera sensor through camera cable 44 to other components, particularly processor 42 mounted on printed circuit board 38. LED 54 provides light to illuminate note capture device 2, and more particularly the writing surface of notepad 26, under different lighting conditions. The configuration of camera head 32 shown in FIG. 4 is merely an exemplary embodiment and should be construed in a non-limiting manner.
FIG. 5 illustrates note capture device 2 having yet another sensor mechanism for detecting different modes of operation. In particular, in this embodiment, notepad rest 34 is pressure sensitive, so that application of pressure to notepad 26 generates an electrical signal, transmitted via printed circuit board 38 to processor 42, indicating that a writing activity is occurring. Proximity sensor 54 is also included to provide a signal indicating the physical presence of a hand over notepad 26. Proximity sensor 54 similarly transmits electrical signals to processor 42 via printed circuit board 38. As previously discussed, palm rest 22 is pressure sensitive, providing signals to processor 42 that a writing operation is occurring. The outputs of the various sensors described herein are combined with image information obtained from camera head 32, whose field of view covers the writing surface provided by notepad 26. Because camera head 32 is not directly above notepad 26, it should be noted that its field of view is oblique. The height of structural support element 30 need only be great enough to provide an adequate field of view of notepad 26 so that the acquired image data can be corrected into a suitable image of the note.
FIG. 6 shows the field of view of camera head 32 in more detail. As shown in this illustrative, non-limiting embodiment, camera head 32 is aimed at the center of the writing surface of notepad 26. The portion of notepad 26 that is closer to palm rest 22 appears at a greater distance from the camera and is therefore captured at a lower resolution in the final image than the portion at the center of notepad 26. Similarly, the portion of the pad closest to the camera is captured at a higher resolution than the portion associated with the center of notepad 26.
FIG. 7 shows copy button 24 in more detail. In an exemplary embodiment, a recess is provided in the surface of copy button 24 so that a pen, pencil, or other writing instrument can easily press the button without sliding off copy button 24, which is in direct physical contact with copy button microswitch 36 mounted on printed circuit board 38.
FIG. 8 illustrates note capture device 2 including wireless communication subsystem 12 in an exemplary embodiment. Wireless communication subsystem 12 includes, in part, an antenna, which may be internal or external to note capture device 2, in combination with circuitry mounted on printed circuit board 38 (not shown). Different communication protocols may be used with wireless communication subsystem 12. Protocols such as Bluetooth, wireless local area network (LAN) protocols, cellular telephone protocols, and the like are well known to those of ordinary skill in the art. Various alternative embodiments include IR and the like. Additionally, a mechanism for receiving GPS signals is also provided. The GPS mechanism may be used to determine the location of note capture device 2 and thus store the location information with the image information as metadata associated with the image. This may help a user who later attempts to locate a particular note: without effort, users tend to remember instinctively where they were when they wrote a note, along with a rough estimate of the date. It should be noted that the note capture devices disclosed herein may vary greatly in size to accommodate larger or smaller applications of the techniques disclosed herein within the scope of the claims.
FIG. 9 illustrates an exemplary embodiment of note capture device 2 in which specifically designated areas of the writing surface have particular significance depending on the configuration of the note. In this example, specific areas 56 and 58 are provided to perform user-defined functions when the user presses an area using a writing instrument such as a pen, pencil, or the like. It should be noted that detection of the pen down in a particular region 56 or 58 may be performed by image analysis of the images obtained by camera head 32 or by providing various sensors, such as microswitches, under the particular regions 56 and 58. In addition to these embodiments, various other sensing techniques, such as a blocked light beam, may be employed. The arrangement of particular regions 56 and 58 is merely exemplary, and many other configurations are possible. The semantics of particular regions 56 and 58 are user programmable and may include a variety of functions such as sending the captured note by email to a particular, predetermined user, posting the note to an album on a web page provided, for example, on server 20 (not shown), sending a fax to a predetermined fax number, and so on.
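A pen-down point recovered by image analysis could be mapped to one of these regions with a simple hit test; the region names, coordinates, and bound semantics below are illustrative assumptions, not values from the disclosure:

```python
# Hypothetical user-programmed regions: (x, y, width, height) in image pixels.
REGIONS = {
    "email_note": (0, 0, 40, 40),
    "fax_note": (50, 0, 40, 40),
}

def region_hit(x, y, regions=REGIONS):
    """Return the name of the region containing the pen-down point,
    or None if the point lies outside every programmed region."""
    for name, (rx, ry, w, h) in regions.items():
        if rx <= x < rx + w and ry <= y < ry + h:
            return name
    return None
```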
FIG. 10 illustrates an exemplary embodiment of note capture device 2 including an LCD display screen 60, which LCD display screen 60 may be used in conjunction with the capture process to help ensure a clear image is obtained. The LCD display screen 60 is used to provide feedback to the user so that the user can see notes that will be captured when the copy button 24 is pressed. LCD display 60 also serves as a human interface to note capture device 2 and may additionally include touch screen capabilities for interacting with various soft keys using a pen or finger to effect operation of note capture device 2 and distribution of any captured notes. In the exemplary embodiment, structural support element 30 is widened to accommodate LCD display screen 60, and camera head 32 is mounted on top of the widened structural support element 30. In this embodiment, a portion of the wireless communication subsystem 12 is housed in the structural support element 30, as in other embodiments. In some embodiments, the structural support element 30 may also include a light element supported by the element 30 that is capable of projecting an image. In another embodiment, image processing may be performed in the note capture device 2.
Referring to FIGS. 3, 5, 6 and 7, FIG. 11 is a block diagram of circuitry implemented on the printed circuit board 38 of those figures. The various components of printed circuit board 38 are a bus 62, processor 42, volatile storage 64, non-volatile storage 66, read only memory 68, camera input/output (I/O) 70, and other I/O 72. Digital memory may also be included in the configuration of FIG. 11. Other I/O 72 may include communication ports operable to transfer data from digital memory and other memory and storage devices to external devices or networks. The communication ports may be wired or wireless. These various electronic components of printed circuit board 38 serve as a computer embedded in note capture device 2 to provide the various functions of note capture, user interface, and communication. Processor 42 is coupled to bus 62 and provides the principal computing power for the various functions of note capture device 2. Volatile storage 64 is used in conjunction with processor 42 to provide temporary storage for various processes implemented in processor 42. Non-volatile storage 66 may include various storage subsystems such as flash memory or a hard disk. Alternative embodiments include various removable media such as CD-ROMs, flash memory cards, and the like. In other words, it will be appreciated that various non-volatile storage subsystems known to those of ordinary skill in the art may be employed to augment non-volatile storage 66 or to replace it. In some non-limiting exemplary embodiments, one or more notes captured by note capture device 2 may be stored in non-volatile storage 66 and subsequently uploaded to various connected devices. Read only memory 68 is provided to contain OS-level instructions for processor 42 startup and also to provide various library functions for processor 42, including all or part of the user interface, capture, and communication programs running on note capture device 2.
Note that read only memory 68 may be programmable. In one exemplary embodiment, camera I/O 70 may include electronic components associated with a camera, such as those provided in a cellular telephone. Various alternative embodiments may include the ability to control the light output of LED 54, or the ability to turn LED 54 on and off. Other embodiments may include a servo system to change the angle of the camera so that the camera can, for example, photograph a room or other environment to perform security functions, and the like. Thus, camera I/O 70 is used not only to obtain images from camera head 32 but also to control different aspects of the operation of camera head 32. Zoom and focus, as well as image resolution and the like, are all potentially controlled by camera I/O 70. Other I/O 72 provides I/O to the various sensors and microswitches associated with note capture device 2. In addition, other I/O 72 may provide I/O to, for example, a keyboard or pointing device, or to an output device such as LCD display screen 60. Other exemplary embodiments provide I/O associated with LCD display screen 60 in its touch screen embodiments. Other I/O 72 may also be used to control various LEDs, such as function lights 28, which are used to display, for example, the operational status, communication status, and capture status of note capture device 2. Other I/O 72 may be used to interface with a microphone or a GPS mechanism. Similarly, exemplary embodiments of other I/O 72 provide communication via wireless communication subsystem 12, USB connector 48, and other communication modes.
In one embodiment, inverse trapezoidal image processing of the acquired image is performed, facilitated by a memory such as ROM 68.
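A full correction would apply a planar homography, but a greatly simplified sketch conveys the idea: if the camera tilt makes each row of the pad appear narrower the farther it is from the camera, the inverse transform stretches each row back to full width. The linear-width model and nearest-neighbour sampling are simplifying assumptions made for this sketch:

```python
def untrapezoid(src, near_w, far_w):
    """Expand a trapezoid-distorted image back to a rectangle.
    src: rows of pixels, row 0 farthest from the camera, with the
    visible content only far_w pixels wide and centered in each row."""
    h = len(src)
    out = []
    for r, row in enumerate(src):
        # Effective content width grows linearly from far_w to near_w.
        w = far_w + (near_w - far_w) * r / max(h - 1, 1)
        offset = (near_w - w) / 2
        out.append([row[int(offset + c * w / near_w)] for c in range(near_w)])
    return out
```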
FIG. 12 is a flowchart depicting the execution of a capture program on note capture device 2, and more specifically, on various components of printed circuit board 38, particularly processor 42 and associated memory. This is merely an example of one mode of operation and is given by way of illustration and not limitation. The code for executing FIG. 12 may be embodied in read only memory 68 or non-volatile storage 66, or may in some examples be downloaded from an external source such as personal computer 4, laptop computer 16, or various other devices that may communicate with note capture device 2, such as a cellular telephone or a server on the Internet 18, such as server 20. The run begins in operation 74 and proceeds to operation 76, where the program is initialized. Various buffers needed to monitor the images obtained from camera head 32 and the various sensors of note capture device 2 are initialized at this time. Control then passes to operation 78, where various inputs are read. In some embodiments, imaging is obtained from camera head 32 on an ongoing basis; in other embodiments, data is not obtained from camera head 32 until specifically requested by the current program. If the imaging is obtained on an ongoing basis, it may be used to help determine the frame of imaging that should be captured. In one embodiment, imaging is obtained only when recording, and no other sensor input is needed to determine which frames of imaging should be captured. Next, operation 80 determines whether an image should be captured. This is accomplished by, for example, examining various sensor inputs, such as signals obtained from copy button microswitch 36 and the other various sensors, along with the imaging obtained by camera head 32. In a low cost embodiment, no sensor is required other than the camera input, which can be analyzed to determine when an image should be captured.
If it is determined in operation 80 that an image should be captured, control passes to operation 82, where a single image is captured. The image may be obtained directly from camera 32 or from a previously stored image buffer. In one case, the captured image is sent to an external device, such as personal computer 4, laptop computer 16, a cellular telephone, or server 20, by one of the aforementioned means of communication with note capture device 2. Wireless internet router 14 may also be used to contact various devices on the internet 18, such as server 20, in order to capture images. When operation 82 is completed, control passes to operation 84, which determines whether the operation of FIG. 12 is complete. If it is determined in operation 80 that it is not time to capture an image, control similarly passes to operation 84, which determines whether the run of FIG. 12 is complete. If it is determined in operation 84 that the operation of FIG. 12 has been completed, control passes to operation 86, where the run ends. It should be noted that an alternative to performing the operations of FIG. 12 in note capture device 2 is to simply send telemetry information, such as images and sensor inputs, to an external device, such as personal computer 4, and then perform the operations of FIG. 12 in the external device. It is further possible to control the different hardware elements by sending messages back from the external device to note capture device 2. In other words, the different programs described herein may be implemented on different devices. In one exemplary embodiment, the note capture device may be implemented in or in conjunction with a cellular telephone with a built-in or attached camera.
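The flow of operations 74 through 86 can be sketched as a simple polling loop. The following Python sketch is illustrative only; the four callables are hypothetical placeholders standing in for the hardware and sensor interfaces of note capture device 2, and the operation numbers from the flowchart appear as comments.

```python
def run_capture_program(read_inputs, should_capture, capture_image, is_done):
    """Operations 74-86: initialize, then loop reading inputs and capturing."""
    buffers = []                         # operation 76: initialize image/sensor buffers
    while True:
        inputs = read_inputs()           # operation 78: read camera and sensor inputs
        buffers.append(inputs)
        if should_capture(inputs):       # operation 80: decide whether to capture
            capture_image(inputs)        # operation 82: capture (store or export) one image
        if is_done():                    # operation 84: check whether the run is complete
            return buffers               # operation 86: end the run
```

As the description notes, the same loop could equally run on an external device fed with telemetry from the capture device.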
FIGS. 13, 14, and 15 illustrate portions of a timeline that more particularly describes some aspects of some embodiments of the operations of FIG. 12. While in some exemplary embodiments determining whether to capture is as simple as detecting a user pressing copy button 24, further exemplary embodiments include monitoring different sensors as described with reference to FIG. 12. FIG. 13 shows four moments in time, various sensor inputs, and some information derived from those inputs. The first frame of the timeline represents a blank image. At this time, none of the palm rest sensor, the proximity sensor, and the page removal sensor detects anything. The image match is negative because the image is the first image and there is nothing to compare it to. Similarly, the image correspondence is negative. Image blank is positive because it is easy to detect that the image is blank; no ink is present, and the user's hand and pen are not in the field of view. The pen down sensor also indicates that there is currently no writing activity. Frame 2 shows a blank image, just as in frame 1. There is no user activity, and therefore the various sensors such as palm rest, proximity, and page removal are negative. However, since the image is the same as the previous image, both image match and image correspondence are positive. Image blank is positive, indicating that the image of frame 2 is blank, and pen down is negative, indicating that the user is not currently writing on notepad 26. In frame 3, the user has begun writing on notepad 26 using a writing instrument such as a pen or pencil. Both the completed written portion and the writing instrument itself are visible in frame 3. The palm rest sensor associated with palm rest 22 is activated, indicating that palm rest microswitch 40 has detected pressure on palm rest 22. The proximity sensor 54 has detected the proximity of an object, such as a hand or writing instrument, and thus the proximity indication is also positive.
The page removal sensor does not detect the removal of a page. Since the current image of frame 3 does not match the previous image of frame 2, the image match is determined to be negative, and similarly, the image correspondence is negative. Since some writing has occurred, the image is no longer blank, and the image blank indication is negative. The pen down indication is positive, as the pressure of the writing instrument has activated the pen down sensor. Now, in frame 4 of the timeline, more writing appears as the user continues to write. The palm rest sensor associated with palm rest 22 and palm rest microswitch 40 is positive, indicating that the user's palm is touching palm rest 22. Similarly, the proximity sensor 54 detects the presence of the user's hand or writing instrument. The page removal sensor does not detect the removal of a page. The image match and image correspondence are negative, indicating that the image is different from the previous image. Image blank is negative, indicating that the image is not blank, because writing and a portion of the writing instrument are in view. The pen down indication is positive, indicating that pressure is being applied to notepad 26 and thus to notepad stand 34, which in this embodiment includes a pressure sensor.
Turning now to FIG. 14, frame 5 shows the note containing some writing but no part of the writing instrument or the user's hand. The palm rest indication is negative, indicating that no pressure is currently being applied to palm rest 22 and, therefore, palm rest microswitch 40 is quiescent. The proximity sensor shows that although the user's hand is not in contact with the notepad, it is close enough to notepad 26 for the proximity sensor to continue detecting it. Page removal sensor 46 is negative, indicating that the user has not attempted to remove the note from notepad 26. The image match is negative, indicating that the image is not similar to the previous image within a given tolerance. Methods for comparing images within a certain tolerance, disregarding fine variations, and identifying images that are similar enough to indicate a match are well known to those of ordinary skill in the art. The image correspondence indication is negative, indicating that the image does not correspond to any previous image. The image blank indication is negative because the image is not blank and has some writing on it. The pen down indication is negative, indicating that no pressure is currently being applied to notepad 26 and, therefore, no pressure is detected on notepad stand 34, which in this embodiment includes a pressure sensor. Frame 6 shows an image that is substantially identical to the previous image. Note that there are some subtle variations due to changes in lighting conditions and imperfections inherent in collecting photons in the camera sensor, such as that implemented in camera head 32. The proximity indication is positive in frame 6, indicating that the user's hand is approaching note capture device 2, while the palm rest indication is negative, indicating that the hand is not currently touching palm rest 22 and thus palm rest microswitch 40 is not detecting pressure. In this example, the page removal indication is negative.
The image match is positive, indicating that the image is close enough to the previous image to indicate a match. The image correspondence is positive, indicating that all written portions present in the previous image 5 are present in image 6. Even if added ink were visible in image 6, the image correspondence would still be positive. Thus, the difference between image match and image correspondence is that image match is intolerant of ink added in subsequent images, while image correspondence is tolerant of added ink in subsequent images. In this example, image blank is negative, indicating that the image is not blank and contains some writing. The pen down indication is negative, indicating that no pressure is currently being applied to notepad 26 and, therefore, in this exemplary embodiment, no pressure is being applied to notepad stand 34, which includes a pressure sensor.
Frame 7 shows that writing has been added and the writing instrument is again partially visible in the image. The palm rest indication is positive, indicating that the palm of the user is currently applying pressure to palm rest 22 and thus activating palm rest microswitch 40. Proximity sensor 54 detects the presence of the user's hand in proximity to note capture device 2. Page removal sensor 46 is silent, indicating that the user is not attempting to remove a page. The image match indication is negative in this example because writing has been added and the writing instrument is currently visible in the image but was not visible in the previous image. Similarly, the image correspondence is negative because the writing instrument is currently in the image. The image blank is negative in this example because both the writing instrument and some writing are currently in the image. The pen down indication is positive, indicating that the user is currently applying pressure to notepad 26 using the writing instrument and, thus, in this exemplary embodiment, to notepad stand 34, which includes the pressure sensor. Frame 8 shows the added writing after the writing instrument has been removed from the image. The palm rest indication is negative because there is currently no pressure applied to palm rest 22, and thus palm rest microswitch 40 is silent. Similarly, proximity sensor 54 does not detect the presence of the user's hand. Page removal sensor 46 is silent, so the page removal indication is negative. The image match is negative in this example because the writing instrument appeared in the previous image but not in the current image. The image correspondence is positive in this example because the image contains all of the ink present in frames 5 and 6, as well as some added ink. Because the image contains some ink, the image blank indication is negative.
The pen down is negative in this example, indicating that the user is not currently applying pressure to notepad 26.
FIG. 15 depicts frame 9, in which the completed writing of frame 8 is visible and no writing instrument is present. The palm rest indication is negative, indicating that the user's hand is not applying pressure to palm rest 22 and thus palm rest microswitch 40 is silent. The proximity indication is positive, indicating that the user's hand is approaching note capture device 2, as detected by proximity sensor 54. The page removal sensor indicates that the user is not attempting to remove a note from notepad 26, and thus the page removal indication is negative. The image match indication is positive, indicating that the image matches the previous image within a tolerance. The image correspondence is also positive, indicating that the image of frame 9 corresponds to the images of the previous frames 5, 6, and 8. The image blank indication is negative because image 9 does contain some ink. The pen down indication is negative, indicating that the user is not currently writing on notepad 26 and, therefore, is not applying pressure to notepad stand 34. Frame 10 shows the presence of the user's thumb in the image along with the writing. The palm rest indication is positive, indicating that the user's hand is currently applying pressure to palm rest 22 and thus activating palm rest microswitch 40. The proximity indication is positive, indicating that the user's extremity is close enough to proximity sensor 54 for it to sense the user's hand. The page removal indication is negative because the user has not attempted to remove the note from notepad 26. As the user's thumb enters the frame, the image match is negative, and similarly the image correspondence is negative because there is no previous image corresponding to this image. Image blank is negative because the frame contains writing and a portion of the user's thumb. The pen down is negative because there is currently no pressure applied to notepad 26. Frame 11 shows the user tearing the note from notepad 26.
The palm rest indication is negative, indicating that no pressure is currently being applied to palm rest 22. Proximity sensor 54 detects the presence of the user's hand, and thus the proximity indication is positive. The page removal sensor is positive, indicating that the user is currently removing a note from notepad 26. The page removal indication is important because it indicates that the user has completed any operations on the note. In this frame, the image match indication is negative because the user's thumb is visible in the image as it tears away the note. Similarly, the image correspondence indication is negative. Because the user's hand and the edges of the note are visible, the image blank is negative. The pen down is negative, indicating that no pressure is currently being applied to notepad 26. Frame 12 contains a blank image. The palm rest indication is negative, indicating that the user is not currently applying pressure to palm rest 22. The proximity indication is negative, indicating that proximity sensor 54 is not currently detecting the presence of the user's hand. The page removal sensor is negative, indicating that the user is not currently removing a note. The image match indication is negative, indicating that the image does not match any previous image taken since the page was removed. Similarly, the image correspondence is negative because there is no previous image since the last page removal. Image blank is positive, indicating that the image is blank. The pen down is negative, indicating that no pressure is currently being applied to notepad 26.
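The indications tabulated across FIGS. 13 through 15 suggest a simple decision rule, sketched below under stated assumptions: images are modeled as sets of inked pixel coordinates, image match requires identity while image correspondence tolerates added ink, and a positive page-removal indication triggers capture of the most recent stable, non-blank, pen-up frame. All names and the dictionary-based frame format are illustrative, not the device's actual telemetry.

```python
def image_match(current, previous):
    """Intolerant comparison: the two ink sets must be identical."""
    return previous is not None and current == previous

def image_correspondence(current, previous):
    """Tolerant comparison: all previously seen ink must still be present."""
    return previous is not None and previous <= current  # subset test

def select_frame_to_capture(frames):
    """Return the index of the frame to capture, or None if never triggered.

    frames: list of dicts with boolean keys 'page_removal', 'image_match',
    'image_blank', and 'pen_down', one per sampled frame.
    """
    if not any(f['page_removal'] for f in frames):
        return None
    # Scan backwards for the most recent stable, non-blank, pen-up frame.
    for i in range(len(frames) - 1, -1, -1):
        f = frames[i]
        if f['image_match'] and not f['image_blank'] and not f['pen_down']:
            return i
    return None
```

Applied to frames 8 through 12 of the timeline, this rule would select frame 9: the page removal in frame 11 triggers capture, and frame 9 is the latest frame whose image matched its predecessor while containing ink and no writing activity.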
With reference to FIGS. 12, 13, 14, and 15, the operation of FIG. 12 may employ different embodiments, some of which are described herein as examples. These examples should be construed in a non-limiting manner. For example, the different frames 1 through 12 of FIGS. 13, 14, and 15 may be considered examples of points in time at which operation 78 of FIG. 12 samples the various inputs. The various inputs are then buffered for analysis in operation 80 and further processed in operation 82. Operation 80 may determine that an image is to be captured when triggered by, for example, a positive indication from the page removal sensor. It must be emphasized that this is only one way to determine when to capture. Other exemplary embodiments include detecting a press of capture button 24, and, in the alternative, embodiments that require no sensors or buttons other than the imaging itself. Different combinations of sensors, buttons, and imaging may also be employed. When operation 80 of FIG. 12 determines that it is time to capture, control passes to operation 82 of FIG. 12, which exports the captured image to an external device, such as, for example, personal computer 4, laptop computer 16, cellular telephone 17, or server 20. Alternatively, the captured images are cached on a non-volatile storage medium, such as flash memory, for subsequent uploading to an external device or for physical loading into an external device through the use of removable media. It should be understood that operation 82 of FIG. 12 performs the capture of an image to an external device or to non-volatile storage, but the captured image is not necessarily the most recent image obtained by camera head 32; rather, it is the most recent stable image. In some embodiments, the image is required to be stable while various sensors, such as pen down, do not detect the presence of a writing instrument.
This prevents, for example, the accidental capture of images that are stable yet contain a portion of the writing instrument or other foreign objects that have not moved. In some embodiments, the most recent stable image is one that corresponds to the various previous images over a period of time. In an alternative embodiment, operation 82 of FIG. 12 simply takes the most recent image acquired from the camera. Such a capture is appropriate when the user, for example, taps copy button 24. In some embodiments, encryption is employed to protect the note data and associated metadata. In one embodiment, an asymmetric encryption algorithm may be employed to encrypt the data so that it is only usable if it is processed by an authorized service or other recipient holding the corresponding private key. This approach has the advantage of ease of manufacture, as the same public key may be built into each note capture device without having to customize each device. It also facilitates a centrally controlled service for controlling the distribution of content, since the data is not intelligible while in transit until it reaches a server or other device that can decrypt it using the private key. In some embodiments, the public key may be updated on the note capture device on a regular basis. Other encryption techniques may allow each note capture device to use a unique key. It is also understood that a symmetric encryption algorithm may be employed. In some embodiments, encryption is not centrally controlled; rather, each user generates or obtains the keys needed to protect the security of their notes on an individual basis. For example, a user may have multiple keys and manually or automatically select from among them to perform encryption and decryption.
FIG. 16 is a diagram showing an alternative embodiment employing a "data extraction" technique to process data from the camera sensor into a high level "image timeline" that can be used by high level software, in conjunction with the process of FIG. 12, to determine which image(s) to capture. As is well known to those of ordinary skill in the art, data extraction involves generating a transformed "view" of a data set (e.g., a stream of images from a camera). Subsequent transformations may be generated as higher level "views," each employing a lower level "view" to provide higher level functionality.
It will be appreciated that in some embodiments, the image stream is buffered cyclically, with older, unneeded portions of the buffer discarded to make room for newer portions.
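The cyclic buffering just described can be sketched with a fixed-capacity buffer; when the buffer is full, the oldest frames are discarded automatically. This is a minimal illustrative sketch, assuming a fixed capacity chosen in advance; Python's bounded `deque` provides the eviction behaviour directly.

```python
from collections import deque

class PageBuffer:
    """Cyclic buffer of image frames for the current page."""

    def __init__(self, capacity):
        self._frames = deque(maxlen=capacity)  # oldest entries evicted automatically

    def push(self, frame):
        self._frames.append(frame)             # may silently drop the oldest frame

    def frames(self):
        return list(self._frames)              # oldest-to-newest view
```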
Although many combinations will be apparent to one of ordinary skill in the art, an example of such a data extraction technique is shown in FIG. 16. At the lowest level, the image sequence 200 represents the raw images obtained from the sensor. In some embodiments, the image appears oblique to the viewer because the camera angle can vary depending on the juxtaposition of the camera head and the writing surface. The transformed image sequence 220 performs geometric correction so that the image sequence 200 can be viewed as a series of straight-on images. The filtered image sequence 240 may, for example, provide the same straight-on images provided by the transformed image sequence 220, but with any images containing material to be discarded removed. The criteria for classifying images as containing undesirable material vary widely depending on the particular application. In some non-limiting exemplary embodiments, images that contain colors other than the background color and the ink color are classified as containing undesirable imagery. The ink color may be determined from pixels that are stably of a non-background color. Other methods, such as detecting and excluding images containing skin tones, are possible, as are many methods related to identifying moving objects and the like. These examples are given by way of illustration and should not be construed in a limiting sense. In this example, the image timeline 260 provides a view of the filtered image sequence that can be accessed through a time index. This is just one of many possible examples. The image timeline may be used, for example, to select the most recent stable non-blank image when a blank image is found. In this non-limiting exemplary embodiment, that most recent stable non-blank image may then be selected as the image to be captured.
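The layered views 200, 220, 240, and 260 of FIG. 16 can be sketched as a chain of lazy transformations, each level consuming the one below it. In this illustrative sketch, images are arbitrary values paired with timestamps, and the geometric-correction and content-filter callables are hypothetical placeholders for the operations described above.

```python
def transformed_view(raw_images, correct):
    """Level 220: geometrically corrected view of the raw sequence (level 200)."""
    for timestamp, image in raw_images:
        yield timestamp, correct(image)

def filtered_view(images, is_unwanted):
    """Level 240: drop frames containing material to be discarded."""
    for timestamp, image in images:
        if not is_unwanted(image):
            yield timestamp, image

def image_timeline(images):
    """Level 260: a time-indexed view over the filtered sequence."""
    return dict(images)
```

Higher level software could then query the resulting timeline by time index, e.g. to find the most recent stable non-blank image once a blank image appears.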
FIG. 17 is a flowchart describing the operation of application software used in connection with notes captured by note capture device 2. It should be understood that the application software may be implemented in different forms on different devices, including, for example, personal computers, cellular telephones, servers, laptop computers, and note capture device 2 itself.
Execution begins at operation 280 and proceeds to operation 282, where buffers are allocated and initialized, and objects representing the application, user interface elements, the database containing captured notes, and the like are instantiated. The database is then queried for notes to be displayed in operation 284. Operation 286 performs a geometric layout of the notes in preparation for their display. The notes are then displayed to the user in operation 288. Operation 290 then enters an event loop to process events related to user activity and communications with other operations, such as the operation of FIG. 12. Upon completion of event loop 290, the run ends in operation 292.
FIG. 18 is a flowchart describing in more detail an exemplary operation of event loop operation 290 of FIG. 17. Execution begins in operation 294 and continues to operation 296, which determines whether an event is available in the event queue. This determination is shown here as a conditional, but it may also be implemented as an interrupt-driven operation that blocks execution of the current thread until an event is available. Many varied implementations will be apparent to those of ordinary skill in the art. If an event is determined to be available, process events operation 298 processes the event. Once the event has been processed, control passes to decision operation 300. If it is determined in operation 296 that no event is available, control likewise passes to decision operation 300. Operation 300 determines whether event loop operation 290 is complete. For example, it may check whether a power-down operation has been initiated, or whether an application exit has been initiated by a previously processed user event. If it is determined in operation 300 that event loop operation 290 is complete, control passes to operation 302, which ends the run. On the other hand, if decision operation 300 determines that event loop operation 290 is not complete, control passes back to operation 296 and the run continues.
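The polled form of the loop in FIG. 18 can be sketched in a few lines. This is an illustrative sketch of the conditional (non-interrupt-driven) variant; the queue, handler, and exit condition are hypothetical stand-ins for the application's actual state.

```python
def event_loop(queue, process_event, done):
    """Operations 294-302: poll for events until the loop is complete."""
    while not done():                    # operation 300: is the loop complete?
        if queue:                        # operation 296: is an event available?
            process_event(queue.pop(0))  # operation 298: process the event
```

An interrupt-driven implementation would instead block on the queue (for example, a blocking `get()` on a thread-safe queue) rather than polling.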
FIG. 19 is a flowchart illustrating an exemplary process event operation 298 of FIG. 18 in greater detail. Execution begins in operation 304 and continues to operation 306, which retrieves an event from the event queue.
A determination is then made in decision operation 308 as to whether the event is a mouse event. If it is determined that the event is a mouse event, control passes to operation 310, which processes the mouse event by further dispatching it to the object instance currently designated to receive it. This is typically done by determining the coordinates of the mouse event and dispatching it to the GUI object occupying those coordinates. Such a dispatch will be apparent to one of ordinary skill in the art. In processing the event, the object may modify instance variables, buffers, and the like, and signal other objects to perform various tasks related to processing the event. When operation 310 is completed, control passes to operation 324, which ends operation 298. On the other hand, if it is determined in operation 308 that the event is not a mouse event, control passes to operation 312.
Subsequently, in decision operation 312, a determination is made as to whether the event is a communication event. If it is determined that the event is a communication event, control passes to operation 314, which processes the communication event by further dispatching it to the target instance currently designated to receive it. This is typically done by examining the communication event and dispatching it based on the type of communication and the addressing information contained in the event or an associated communication buffer. Such dispatching is well known to those of ordinary skill in the art. One example of a communication event is the receipt of a note that has been captured by a note capture device. To handle such an event, the note is received and assembled in non-volatile storage, and then, in one exemplary embodiment, a database insert operation is performed. Alternatively, as a way to locate the note in a database of many notes, recognition software may be employed to obtain text from the note image for later searching of that text.
In processing a communication event, an object may modify instance variables, buffers, and the like, and signal other objects to perform various subtasks related to processing the event. In addition, the processing of communication events often includes initiating threads of execution, or communicating with existing threads, particularly for time-consuming tasks such as handwriting recognition. When operation 314 is completed, control passes to operation 324, which ends operation 298. On the other hand, if it is determined in operation 312 that the event is not a communication event, control passes to operation 316.
Then, in decision operation 316, a determination is made as to whether the event is a key event. If it is determined that the event is a key event, control passes to operation 318, which processes the key event by further dispatching it to the target instance currently designated to receive it. This is typically done by determining which GUI object currently has "focus," an indication often associated with the most recently "clicked" object. Such dispatching is well known to those of ordinary skill in the art. In processing the event, the object may modify instance variables, buffers, and the like, and signal other objects to perform various tasks related to processing the event. When operation 318 is completed, control passes to operation 324, which ends operation 298. On the other hand, if it is determined in operation 316 that the event is not a key event, control passes to operation 320.
Then, in decision operation 320, a determination is made as to whether the event is some other event. If so, control passes to operation 322, which processes the event by further dispatching it to the target instance currently designated to receive it. One example of such an event is the insertion into a peripheral device of a piece of removable media containing notes to be retrieved. Various combinations will be apparent to those of ordinary skill in the art. In processing the event, the object may modify instance variables, buffers, and the like, and signal other objects to perform various tasks related to processing the event. When operation 322 is completed, control passes to operation 324, which ends operation 298. On the other hand, if it is determined in operation 320 that the event is not a recognized event, control passes directly to operation 324.
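The dispatch chain of FIG. 19 (operations 308 through 322) tests the event's type in turn and forwards it to the handler currently designated for that type. The following sketch is illustrative only; the dictionary-based event objects and handler registry are assumptions, not the application's actual object model.

```python
def process_event(event, handlers):
    """Operations 304-324: dispatch one event to its designated handler.

    event: dict with a 'type' key plus event-specific fields.
    handlers: mapping from event type to the callable designated to receive it.
    """
    for kind in ('mouse', 'communication', 'key', 'other'):  # operations 308-320
        if event.get('type') == kind and kind in handlers:
            handlers[kind](event)        # operations 310 / 314 / 318 / 322
            return True
    return False                         # unrecognized event falls through to 324
```

In an object-oriented implementation each branch would instead forward to the GUI object at the mouse coordinates, the focused widget, or the addressed communication target, as the description above explains.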
Some non-limiting exemplary embodiments provide a note capture device that includes a writing surface, a camera mounted to view the writing surface, and electronic circuitry to capture images of the surface, or of various alternative surfaces including, for example, a business card surface. Various image transformations are understood which can correct, for example, for the camera angle relative to the tilt of the surface. Other transformations to enhance images, or to add labels, logos, and the like, are also understood. Other information may also be associated with the image data, such as a digital signature to verify the data as presented by a particular user, and the like.
FIG. 20 is a diagram illustrating an exemplary GUI for the application software. While many varying embodiments are possible, some exemplary embodiments provide a scrolling list of thumbnail images derived from a database containing notes captured in association with the application software. Moving the mouse over a note or tapping it will, for example, "select" it, causing it to be displayed at a larger size and enabling further interaction, including viewing and editing of the metadata. Different menu functions may provide features such as sending a note to another user or an online album, searching for notes, and sorting the displayed notes based on time, location of capture, and the like.
FIG. 22 is a diagram illustrating exemplary image viewing and editing features of the application software. This view provides detailed interaction with a specific note, such as viewing and editing metadata, including, in some embodiments, handwriting recognition results. It will be appreciated that editing in this manner results in appropriate database interactions, including but not limited to query, insert, delete, and update operations. In an alternative embodiment, a hierarchical file system is used instead of a database.
Some example embodiments provide for automatic detection of the completion of a written note and, upon such detection, cause the completed note to be captured. Some embodiments employ different sensors to detect completion of a written note. Other embodiments perform this detection by means of a button tap. Still other embodiments perform this detection by image analysis. Various combinations of these embodiments are understood.
In some exemplary embodiments and combinations, the captured images are stored internally and/or transmitted electronically to an external device such as a personal computer, cell phone, server, or the like. The mode of communication between the note capture device and the external device may include a cable directly or indirectly coupling the devices, or a wireless connection directly or indirectly coupling the devices. It is further understood that this communication may occur through the use of removable media. Other general methods of transferring data from one device to another are also understood.
In some embodiments, a screen is provided to allow a user to interact with the note capture device and to direct different modes of communication for the purpose of transferring captured notes to different destinations, using different protocols including, by way of example and not limitation, SMTP, FTP, SFTP, HTTP, HTTPS, and file sharing protocols such as those provided by various commercially available operating systems including Windows, Mac OS, Linux, and the like.
Handwriting recognition software may be employed, in different modes of operation, to generate text representing captured notes in whole or in part. The text so recognized may be stored together with, or used in place of, the image information associated with the note.
The image information associated with the note may be converted in different ways, including conversion to stroke information, compression, and the like.
FIG. 23 depicts an alternative exemplary embodiment in which a business card 400 may be captured. When mount 402 is manually flipped up by the user, the appropriate adjustments to the view are made in software. Mount 402 includes an electrical switch coupled to printed circuit board 38, which generates an electrical signal detectable by processor 42, ultimately initiating an optional capture mode in which the card is captured and the associated image is processed according to the modified view.
These and other embodiments as well as advantages and other features disclosed herein will become apparent to those of ordinary skill in the art upon reading the following description and studying the various drawings.
Aspects of the different illustrative embodiments provide software to view, browse, share, search, and edit captured notes, their associated metadata, recognition results, and the like. The different exemplary embodiments include human interface software associated with a user's "desktop" or with different application windows. Further, it is understood that the application software may be implemented on different alternative devices, including a personal computer, server, cell phone, laptop, PDA, or the note capture device itself.
Although various embodiments have been described using specific terms and devices, such description is for illustrative purposes only. The words used are words of description rather than of limitation. It is to be understood that modifications and variations may be resorted to without departing from the spirit and scope of the invention as set forth in the following claims. Moreover, it is to be understood that aspects of the various other embodiments may be interchanged, in whole or in part. It is therefore intended that the claims be interpreted in accordance with the true spirit and scope of the invention, without limitation or estoppel.
Claims (27)
1. A planar image capture device, the device comprising:
an elongated base assembly supporting a writing surface;
a frame having a lower end and an upper end, wherein the lower end of the frame is attached to the base assembly;
a solid state camera supported by said frame proximate said upper end, said solid state camera having a field of view covering the writing surface and defining a fixed optical axis that is oblique to and oriented toward said writing surface;
a trigger mechanism for indicating completion of writing on the writing surface;
a display screen;
a digital processor device coupled to the trigger mechanism, the display screen, and the solid state camera, the digital processor device receiving image information from the solid state camera and configured to perform inverse trapezoidal image processing on the image information; and
at least one of a wired communication interface and a wireless communication interface coupled to the digital processor device.
2. A planar image capture device as recited in claim 1 wherein said solid state camera has a depth of field of between 2.5 inches and 7.5 inches.
3. A planar image capture device as recited in claim 1 wherein a portion of said elongated base assembly forms a rest surface, said writing surface being removably engaged with said rest surface.
4. A planar image capture device as recited in claim 3 wherein said rest surface is substantially flat.
5. A planar image capture device as recited in claim 4 wherein said rest surface is substantially continuous.
6. A planar image capture device as recited in claim 5 wherein said rest surface is pressure sensitive.
7. A planar image capture device as recited in claim 3 wherein said base assembly includes a palm rest proximate a first end and said rest surface proximate a second end.
8. A planar image capture device as recited in claim 7 wherein said base assembly is further provided with a proximity sensor.
9. A planar image capture device as recited in claim 7 wherein said base assembly is further provided with a page removal sensor proximate said second end.
10. A planar image capture device as recited in claim 7 wherein said base assembly is further provided with a capture button.
11. A planar image capture device as recited in claim 1 further comprising a light emitting element supported by said frame proximate said solid state camera.
12. A planar image capture device as recited in claim 7 further comprising at least one palm rest sensor configured to detect pressure on said palm rest.
13. A planar image capture device as recited in claim 3 wherein said removable writing surface is a top page of a pad engaged with said rest surface.
14. A planar image capture device as recited in claim 13 wherein said pad is of a type that includes a soft, reusable adhesive compound.
15. A planar image capture device as recited in claim 1 further comprising a digital memory configured to at least temporarily store at least a portion of image data from said solid state camera.
16. A planar image capture device as recited in claim 1 further comprising a business card holder configured to hold a business card proximate said solid state camera.
17. A method for providing image information, the method comprising:
disposing a solid-state camera on a stationary frame such that the frame is at least 2.5 inches but no more than 7.5 inches from and does not move relative to a base image plane, an optical axis of the solid-state camera being oblique to the base image plane;
capturing image data with the solid-state camera;
storing the image data in a digital memory;
performing inverse trapezoidal image processing on the image data; and
transmitting the inverse trapezoidal image processed data to a personal computer.
18. A method for providing image information as recited in claim 17 wherein said processing occurs in the solid state camera.
19. The method for providing image information as recited in claim 17, wherein the processing includes at least one of converting an image sequence and filtering an image sequence.
20. An apparatus for providing image information, comprising:
means for positioning a solid-state camera on a stationary frame such that the frame is at least 2.5 inches but no more than 7.5 inches from and does not move relative to a base image plane, an optical axis of the solid-state camera being oblique to the base image plane;
means for capturing image data with the solid-state camera;
means for storing the image data in a digital memory;
means for performing inverse trapezoidal image processing on the image data; and
means for transferring the inverse trapezoidal image processed data to a personal computer.
21. The apparatus of claim 20, further comprising logic operable to cause one or more processors to process the image data to create image information.
22. The apparatus of claim 21, wherein the processing the image data comprises at least one of converting an image sequence and filtering an image sequence.
23. An image capture and transmission system, the system comprising:
a network; and
the planar image capture device as recited in claim 1;
wherein the planar image capture device is coupled to the network and, in operation, transmits the image information to other devices over the network.
24. The image capture and transmission system of claim 23 wherein the solid state camera has a fixed focal point with a depth of field that extends at least 2.5 inches but no more than 7.5 inches from the solid state camera.
25. The image capture and transmission system of claim 23, further comprising a computing device coupled to the planar image capture device.
26. The image capture and transmission system of claim 25, wherein the computing device is coupled to the network and, in operation, transmits the image information to a remote device over the network.
27. The image capture and transmission system of claim 23 wherein the network is the Internet.
Applications Claiming Priority (3)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US89018407P | 2007-02-15 | 2007-02-15 | |
| US60/890,184 | 2007-02-15 | ||
| PCT/US2008/054179 WO2008101224A2 (en) | 2007-02-15 | 2008-02-15 | Note capture device |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| HK1141607A1 (en) | 2010-11-12 |
| HK1141607B (en) | 2013-11-15 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| JP5470051B2 (en) | Note capture device | |
| RU2392656C2 (en) | Universal computer device | |
| US9648287B2 (en) | Note capture device | |
| RU2386161C2 (en) | Circuit of optical system for universal computing device | |
| KR101026630B1 (en) | General purpose computing device | |
| US20050024346A1 (en) | Digital pen function control | |
| US6573887B1 (en) | Combined writing instrument and digital documentor | |
| US6686910B2 (en) | Combined writing instrument and digital documentor apparatus and method of use | |
| US11301063B2 (en) | Smart pen device and method of implementing a smart pen device | |
| US9378427B2 (en) | Displaying handwritten strokes on a device according to a determined stroke direction matching the present direction of inclination of the device | |
| US20140152543A1 (en) | System, data providing method and electronic apparatus | |
| JP5925957B2 (en) | Electronic device and handwritten data processing method | |
| JP2010154089A (en) | Conference system | |
| KR20080109340A (en) | Information management system used to manage ubiquitous tasks and schedules for journalists based on digital pen | |
| US20150339538A1 (en) | Electronic controller, control method, and control program | |
| HK1141607B (en) | Note capture device | |
| JP2008181510A (en) | A system for collecting and managing handwritten information using a digital pen | |
| WO2021084761A1 (en) | Image reading device | |
| SE516739C2 (en) | Notepad for information management system, has activation icon that enables a position code detector to initiate a predetermined operation that utilizes the recorded information |