US20150338940A1 - Pen Input Modes for Digital Ink - Google Patents
- Publication number: US20150338940A1 (application Ser. No. 14/665,462)
- Authority: United States (US)
- Prior art keywords: ink, pen, input, digital, input surface
- Prior art date
- Legal status: Abandoned (The legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Classifications
- G06F3/03545—Pens or stylus
- G06F3/013—Eye tracking input arrangements
- G06F3/038—Control and interface arrangements for pointing devices, e.g. drivers or device-embedded control circuitry
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/04842—Selection of displayed objects or displayed text elements
- G06F3/04845—GUI interaction techniques for image manipulation, e.g. dragging, rotation, expansion or change of colour
- G06F3/04883—GUI interaction techniques using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
- G06T11/203—Drawing of straight lines or curves
- G06T11/60—Editing figures and text; Combining figures or text
- H04L65/403—Arrangements for multi-party communication, e.g. for conferences
- G06F2203/0381—Multimodal input, i.e. interface arrangements enabling the user to issue commands by simultaneous use of input devices of different nature, e.g. voice plus gesture on digitizer
- G06F2203/04807—Pen manipulated menu
Description
- a particular device may receive input from a user via a keyboard, a mouse, voice input, touch input (e.g., to a touchscreen), and so forth.
- a touch instrument (e.g., a pen, a stylus, a finger, and so forth) may be used to provide freehand input.
- the freehand input may be converted to a corresponding visual representation on a display, such as for taking notes, for creating and editing an electronic document, and so forth.
- Many current techniques for digital ink provide only limited ink functionality.
- a pen apparatus is described that is switchable between providing digital ink input and non-digital-ink input.
- a pen apparatus is switchable between different ink input modes. For instance, the pen apparatus is switchable between a permanent ink mode in which ink is applied as permanent ink, and a transient ink mode in which ink is applied as transient ink.
- FIG. 1 is an illustration of an environment in an example implementation that is operable to employ techniques discussed herein in accordance with one or more embodiments.
- FIG. 2 depicts an example implementation scenario for a permanent ink mode in accordance with one or more embodiments.
- FIG. 3 depicts an example implementation scenario for a transient ink mode in accordance with one or more embodiments.
- FIG. 4 depicts an example implementation scenario for a transient ink mode in accordance with one or more embodiments.
- FIG. 5 depicts an example implementation scenario for a transient ink mode in accordance with one or more embodiments.
- FIG. 6 depicts an example implementation scenario for a transient ink mode in accordance with one or more embodiments.
- FIG. 7 depicts an example implementation of a pen apparatus in accordance with one or more embodiments.
- FIG. 8 depicts an example implementation of a pen apparatus in accordance with one or more embodiments.
- FIG. 9 depicts an example implementation scenario for receiving digital ink input and non-digital-ink input in accordance with one or more embodiments.
- FIG. 10 is a flow diagram that describes steps in a method for processing ink according to a current ink mode in accordance with one or more embodiments.
- FIG. 11 is a flow diagram that describes steps in a method for a transient ink timer in accordance with one or more embodiments.
- FIG. 12 is a flow diagram that describes steps in a method for processing pen input based on a pen mode in accordance with one or more embodiments.
- FIG. 13 is a flow diagram that describes steps in a method for processing pen input based on an ink mode in accordance with one or more embodiments.
- FIG. 14 is a flow diagram that describes steps in a method for erasing digital ink in accordance with one or more embodiments.
- FIG. 15 illustrates an example system and computing device as described with reference to FIG. 1 , which are configured to implement embodiments of techniques described herein.
- ink refers to freehand input to a pressure-sensing functionality such as a touchscreen and/or digitizer screen, which is interpreted as digital ink.
- digital ink is referred to herein as “ink” and “digital ink.”
- Ink may be provided in various ways, such as using a pen (e.g., an active pen, a passive pen, and so forth), a stylus, a finger, and so forth.
- a pen apparatus that is switchable between providing digital ink input and non-digital-ink input. For instance, in an ink-enabled mode, the pen apparatus causes ink content to be applied to an input surface such as an active display. In an ink-disabled mode, the pen apparatus provides input modes other than ink content, such as touch input for object manipulation and selection, document navigation, and so forth. For instance, when the pen apparatus is in the ink-disabled mode, contact between the pen apparatus and an input surface does not cause ink to be applied to the input surface.
- a switchable pen apparatus enables a user to provide different types of input from a single device.
- a pen apparatus is switchable to apply ink according to different ink modes. For instance, in an ink-enabled mode a pen will apply ink whenever input is provided to an input surface. However, the type of ink that is applied can be switched between different types, such as based on an application context, in response to a user selection of an ink mode, and so forth. For instance, the pen apparatus is switchable between a permanent ink mode in which ink is applied as permanent ink, and a transient ink mode in which ink is applied as transient ink.
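- The following TypeScript sketch is illustrative only and not part of the patent disclosure; it models the pen and ink modes described above under assumed names (PenMode, InkMode, and the helper functions are assumptions).

```typescript
// Illustrative sketch only (not part of the patent disclosure): the enum and
// helper names below are assumptions used to model the described modes.
enum PenMode {
  InkEnabled = "ink-enabled",   // contact with an input surface applies digital ink
  InkDisabled = "ink-disabled", // contact acts as touch input (selection, navigation)
}

enum InkMode {
  Permanent = "permanent", // ink becomes part of the document's primary content
  Transient = "transient", // ink is temporary and kept in a separate layer
}

interface PenState {
  penMode: PenMode;
  inkMode: InkMode;
}

// Toggle invoked, for example, when a pen mode control is actuated.
function togglePenMode(state: PenState): PenState {
  return {
    ...state,
    penMode:
      state.penMode === PenMode.InkEnabled
        ? PenMode.InkDisabled
        : PenMode.InkEnabled,
  };
}

// Switch between permanent and transient ink, e.g., from an application
// default or an explicit user selection.
function setInkMode(state: PenState, inkMode: InkMode): PenState {
  return { ...state, inkMode };
}
```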
- Example Environment is first described that is operable to employ techniques described herein.
- Example Implementation Scenarios and Procedures describes some example implementation scenarios and methods for pen input modes for digital ink in accordance with one or more embodiments.
- Example System and Device describes an example system and device that are operable to employ techniques discussed herein in accordance with one or more embodiments.
- FIG. 1 is an illustration of an environment 100 in an example implementation that is operable to employ techniques for pen input modes for digital ink discussed herein.
- Environment 100 includes a client device 102 which can be embodied as any suitable device such as, by way of example and not limitation, a smartphone, a tablet computer, a portable computer (e.g., a laptop), a desktop computer, a wearable device, and so forth.
- the client device 102 represents a smart appliance, such as an Internet of Things (“IoT”) device.
- the client device 102 may range from a system with significant processing power, to a lightweight device with minimal processing power.
- One of a variety of different examples of the client device 102 is shown and described below in FIG. 15 .
- the client device 102 includes a variety of different functionalities that enable various activities and tasks to be performed.
- the client device 102 includes an operating system 104 , applications 106 , and a communication module 108 .
- the operating system 104 is representative of functionality for abstracting various system components of the client device 102 , such as hardware, kernel-level modules and services, and so forth.
- the operating system 104 can abstract various components (e.g., hardware, software, and firmware) of the client device 102 to the applications 106 to enable interaction between the components and the applications 106 .
- the applications 106 represent functionalities for performing different tasks via the client device 102 .
- Examples of the applications 106 include a word processing application, a spreadsheet application, a web browser, a gaming application, and so forth.
- the applications 106 may be installed locally on the client device 102 to be executed via a local runtime environment, and/or may represent portals to remote functionality, such as cloud-based services, web apps, and so forth.
- the applications 106 may take a variety of forms, such as locally-executed code, portals to remotely hosted services, and so forth.
- the communication module 108 is representative of functionality for enabling the client device 102 to communicate over wired and/or wireless connections.
- the communication module 108 represents hardware and logic for communication via a variety of different wired and/or wireless technologies and protocols.
- the client device 102 further includes a display device 110 , input mechanisms 112 including a digitizer 114 and touch input devices 116 , and an ink module 118 .
- the display device 110 generally represents functionality for visual output for the client device 102 . Additionally, the display device 110 represents functionality for receiving various types of input, such as touch input, pen input, and so forth.
- the input mechanisms 112 generally represent different functionalities for receiving input to the computing device 102 . Examples of the input mechanisms 112 include gesture-sensitive sensors and devices (e.g., such as touch-based sensors and movement-tracking sensors (e.g., camera-based)), a mouse, a keyboard, a stylus, a touch pad, accelerometers, a microphone with accompanying voice recognition software, and so forth.
- the input mechanisms 112 may be separate from or integral with the display device 110 ; integral examples include gesture-sensitive displays with integrated touch-sensitive or motion-sensitive sensors.
- the digitizer 114 represents functionality for converting various types of input to the display device 110 and the touch input devices 116 into digital data that can be used by the computing device 102 in various ways, such as for generating digital ink.
- the ink module 118 represents functionality for performing various aspects of techniques for pen input modes for digital ink discussed herein. Various functionalities of the ink module 118 are discussed below.
- the ink module 118 includes a transient layer application programming interface (API) 120 and a permanent layer API 122 .
- the transient layer API 120 represents functionality for enabling interaction with a transient ink layer
- the permanent layer API 122 represents functionality for enabling ink interaction with a permanent object (e.g., document) layer.
- the transient layer API 120 and the permanent layer API 122 may be utilized (e.g., by the applications 106 ) to access transient ink functionality and permanent ink functionality, respectively.
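- As an illustration only, the transient layer API 120 and the permanent layer API 122 might expose shapes along the lines of the following TypeScript sketch; the method names and types are assumptions, not the patent's actual interfaces.

```typescript
// Hypothetical shapes for the transient layer API 120 and the permanent layer
// API 122; method names and types are assumptions for illustration only.
interface InkStroke {
  points: Array<{ x: number; y: number; pressure?: number }>;
  userId?: string; // optional attribution, e.g., from a pen's identity tag
}

interface TransientLayerApi {
  // Transient ink is kept separate from the document's primary content layer.
  addStroke(documentId: string, stroke: InkStroke): void;
  getLayer(documentId: string, userId?: string): InkStroke[];
  clearLayer(documentId: string, userId?: string): void;
}

interface PermanentLayerApi {
  // Permanent ink becomes part of the document's primary content layer.
  commitStroke(documentId: string, stroke: InkStroke): void;
}
```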
- the environment 100 further includes a pen 124 , which is representative of an input device for providing input to the display device 110 .
- the pen 124 is in a form factor of a traditional pen but includes functionality for interacting with the display device 110 and other functionality of the client device 102 .
- the pen 124 is an active pen that includes electronic components for interacting with the client device 102 .
- the pen 124, for instance, includes a battery that can provide power to internal components of the pen 124 .
- the pen 124 may include a magnet or other functionality that supports hover detection over the display device 110 .
- the pen 124 may be passive, e.g., a stylus without internal electronics.
- the pen 124 is representative of an input device that can provide input that can be differentiated from other types of input by the client device 102 .
- the digitizer 114 is configured to differentiate between input provided via the pen 124 , and input provided by a different input mechanism such as a user's finger, a stylus, and so forth. Further details concerning the pen 124 are provided below.
- ink can be applied in different ink modes including a transient ink mode and a permanent ink mode.
- transient ink refers to ink that is temporary and that can be used for various purposes, such as invoking particular actions, annotating a document, and so forth.
- ink can be used for annotation layers for electronic documents, temporary visual emphasis, text recognition, invoking various commands and functionalities, and so forth.
- Permanent ink generally refers to implementations where ink becomes a part of the underlying object, such as for creating a document, writing on a document (e.g., for annotation and/or editing), applying ink to graphics, and so forth.
- Permanent ink, for example, can be considered as a graphics object, such as for note taking, for creating visual content, and so forth.
- a pen (e.g., the pen 124 ) applies ink whenever the pen is in an ink-enabled mode and is in contact with an input surface, such as the display device 110 and/or other input surface.
- a pen can apply ink across many different applications, platforms, and services.
- an application and/or service can specify how ink is used in relation to an underlying object, such as a word processing document, a spreadsheet, and so forth. For instance, in some scenarios ink is applied as transient ink, and in other scenarios ink is applied as permanent ink. Examples of different implementations and attributes of transient ink and permanent ink are detailed below.
- the implementation scenarios and procedures may be implemented in the environment 100 described above, the system 1500 of FIG. 15 , and/or any other suitable environment.
- the implementation scenarios and procedures describe example operations of the client device 102 and the ink module 118 . While the implementation scenarios and procedures are discussed with reference to a particular application, it is to be appreciated that techniques for pen input modes for digital ink discussed herein are applicable across a variety of different applications, services, and environments. In at least some embodiments, steps described for the various procedures are implemented automatically and independent of user interaction.
- FIG. 2 depicts an example implementation scenario 200 for a permanent ink mode in accordance with one or more implementations.
- the upper portion of the scenario 200 includes a graphical user interface (GUI) 202 displayed on the display device 110 .
- the GUI 202 represents a GUI for a particular functionality, such as an instance of the applications 106 .
- Also shown is a user holding the pen 124 . Displayed within the GUI 202 is a document 204 , e.g., an electronic document generated via one of the applications 106 .
- the user brings the pen 124 in proximity to the surface of the display device 110 and within the GUI 202 .
- the pen 124, for instance, is placed within a particular distance of the display device 110 (e.g., less than 2 centimeters) but not in contact with the display device 110 . This behavior is generally referred to herein as “hovering” the pen 124 .
- a hover target 206 is displayed within the GUI 202 and at a point within the GUI 202 that is directly beneath the tip of the pen 124 .
- the hover target 206 represents a visual affordance that indicates that ink functionality is active such that a user may apply ink to the document 204 .
- the visual appearance (e.g., shape, color, shading, and so forth) of the hover target 206 provides a visual cue indicating a current ink mode that is active.
- the hover target is presented as a solid circle, which indicates that a permanent ink mode is active.
- the ink will become part of the document 204 , e.g., will be added to a primary content layer of the document 204 .
- the text (e.g., primary content) displayed in the document 204 was created via ink input in a permanent ink mode.
- ink applied in a permanent ink mode represents a permanent ink layer that is added to a primary content layer of the document 204 .
- an ink flag 208 is visually presented adjacent to and/or at least partially overlaying a portion of the document 204 .
- the ink flag 208 represents a visual affordance that indicates that ink functionality is active such that a user may apply ink to the document 204 .
- the ink flag 208 may be presented additionally or alternatively to the hover target 206 .
- the ink flag 208 includes a visual cue indicating a current ink mode that is active.
- the ink flag 208 includes a solid circle, which indicates that a permanent ink mode is active.
- the ink flag 208 is selectable to cause an ink menu to be displayed that includes various ink-related functionalities, options, and settings that can be applied.
- FIG. 3 depicts an example implementation scenario 300 for a transient ink mode in accordance with one or more implementations.
- the upper portion of the scenario 300 includes a graphical user interface (GUI) 302 displayed on the display device 110 .
- the GUI 302 represents a GUI for a particular functionality, such as an instance of the applications 106 .
- Displayed within the GUI 302 is a document 304 , e.g., an electronic document generated via one of the applications 106 .
- the document 304 includes primary content 306 , which represents content generated as part of a primary content layer for the document 304 .
- the document 304 is a text-based document, and thus the primary content 306 includes text that is populated to the document.
- Various other types of documents and primary content may be employed, such as for graphics, multimedia, web content, and so forth.
- a user is hovering the pen 124 within a certain proximity of the surface of the display device 110 , such as discussed above with reference to the scenario 200 .
- a hover target 308 is displayed within the document 304 and beneath the tip of the pen.
- the hover target 308 is presented as a hollow circle, thus indicating that a transient ink mode is active. For instance, if the user proceeds to apply ink to the document 304 , the ink will behave according to a transient ink mode. Examples of different transient ink behaviors are detailed elsewhere herein.
- an ink flag 310 is presented.
- the ink flag 310 includes a hollow circle 312 , thus providing a visual cue that a transient ink mode is active.
- the user removes the pen 124 from proximity to the display device 110 .
- the hover target 308 and the ink flag 310 are removed from the display device 110 .
- a hover target and/or an ink flag are presented when the pen 124 is detected as being hovered over the display device 110 , and are removed from the display device 110 when the pen 124 is removed such that the pen 124 is no longer detected as being hovered over the display device 110 .
- an ink flag may be persistently displayed to indicate that inking functionality is active and/or available.
- FIG. 4 depicts an example implementation scenario 400 for a transient ink mode in accordance with one or more implementations.
- the upper portion of the scenario 400 includes the GUI 302 with the document 304 (introduced above) displayed on the display device 110 .
- the scenario 400 represents an extension of the scenario 300 , above.
- a user applies ink content 402 to the document 304 using the pen 124 .
- the ink content 402 corresponds to an annotation of the document 304 .
- a variety of different types of transient ink other than annotations may be employed.
- a hover target is not displayed. For instance, in at least some implementations when the pen 124 transitions from a hover position to contact with the display device 110 , a hover target is removed.
- the ink flag 310 includes a hollow circle 312 , indicating that the ink content 402 is applied according to a transient ink mode.
- an ink timer 406 begins running.
- the ink timer 406 begins counting down from a specific time value, such as 30 seconds, 60 seconds, and so forth.
- the ink timer is representative of functionality to implement a countdown function, such as for tracking time between user interactions with the display device 110 via the pen 124 .
- the ink timer 406 represents a functionality of the ink module 118 .
- the hollow circle 312 begins to unwind, e.g., begins to disappear from the ink flag 310 .
- the hollow circle 312 unwinds at a rate that corresponds to the countdown of the ink timer 406 . For instance, when the ink timer 406 is elapsed by 50%, then 50% of the hollow circle 312 is removed from the ink flag 310 .
- unwinding of the hollow circle 312 provides a visual cue that the ink timer 406 is elapsing, and how much of the ink timer has elapsed and/or remains to be elapsed.
- if the ink timer 406 is elapsing as in the lower portion of the scenario 400 and the user proceeds to place the pen 124 in proximity to the display device 110 (e.g., hovered or in contact with the display device 110 ), the ink timer 406 will reset and will not begin elapsing again until the user removes the pen 124 from the display device 110 such that the pen 124 is not detected. In such implementations, the hollow circle 312 will be restored within the ink flag 310 as in the upper portion of the scenario 400 .
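- As a rough illustration (not part of the disclosure), the visible portion of the hollow circle 312 could be computed from the fraction of the ink timer 406 that remains, as in this TypeScript sketch with assumed function names.

```typescript
// Illustrative only: compute how much of the ink flag's hollow circle should
// remain visible as the ink timer elapses (e.g., 50% elapsed -> 50% removed).
function remainingCircleFraction(elapsedMs: number, durationMs: number): number {
  const remaining = 1 - elapsedMs / durationMs;
  return Math.min(1, Math.max(0, remaining));
}

// Sweep angle (in degrees) of the arc of the hollow circle still displayed.
function visibleArcDegrees(elapsedMs: number, durationMs: number): number {
  return 360 * remainingCircleFraction(elapsedMs, durationMs);
}

// Example: a 30-second timer with 15 seconds elapsed leaves half the circle.
console.log(visibleArcDegrees(15_000, 30_000)); // 180
```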
- FIG. 5 depicts an example implementation scenario 500 for a transient ink mode in accordance with one or more implementations.
- the upper portion of the scenario 500 includes the GUI 302 with the document 304 (introduced above) displayed on the display device 110 .
- the scenario 500 represents an extension of the scenario 400 , above.
- the ink timer 406 has elapsed. For instance, notice that the hollow circle 312 has completely unwound within the ink flag 310 , e.g., is visually removed from the ink flag 310 . According to various implementations, this provides a visual cue that the ink timer 406 has completely elapsed.
- the ink content 402 is removed from the GUI 302 and saved as part of a transient ink layer 504 for the document 304 .
- the ink flag 310 is populated with a user icon 502 .
- the user icon 502 represents a user that is currently logged in to the computing device 102 , and/or a user that interacts with the document 304 to apply the ink content 402 .
- the pen 124 includes user identification data that is detected by the computing device 102 and thus is leveraged to track which user is interacting with the document 304 .
- the pen 124 includes a tagging mechanism (e.g., a radio-frequency identifier (RFID) chip) embedded with a user identity for a particular user.
- the tagging mechanism is detected by the computing device 102 and utilized to attribute ink input and/or other types of input to a particular user.
- the term “user” may be used to refer to an identity for an individual person, and/or an identity for a discrete group of users that are grouped under a single user identity.
- population of the user icon 502 to the ink flag 310 represents a visual indication that the transient ink layer 504 exists for the document 304 , and that the transient ink layer 504 is associated with (e.g., was generated by) a particular user.
- the transient ink layer 504 represents a data layer that is not part of the primary content layer of the document 304 , but that is persisted and can be referenced for various purposes. Further attributes of transient ink layers are described elsewhere herein.
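- Purely as an illustration, a transient ink layer such as the transient ink layer 504 might be modeled along the following lines; the field names are assumptions rather than the patent's data format.

```typescript
// A possible data model (field names are assumptions) for a transient ink
// layer: separate from the primary content layer, attributed to a user, and
// anchored to particular portions of the document.
interface InkAnchor {
  page: number;
  target?: string; // e.g., an identifier for the line or text the ink is bound to
}

interface TransientInkEntry {
  stroke: Array<{ x: number; y: number }>;
  anchor: InkAnchor;
  createdAt: number; // timestamp of the interactivity session that produced it
}

interface TransientInkLayer {
  documentId: string;
  userId: string;               // user the layer is associated with
  entries: TransientInkEntry[]; // cumulative across multiple sessions
}
```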
- FIG. 6 depicts an example implementation scenario 600 for a transient ink mode in accordance with one or more implementations.
- the upper portion of the scenario 600 includes the GUI 302 with the document 304 (introduced above) displayed on the display device 110 .
- the scenario 600 represents an extension of the scenario 500 , above.
- the ink flag 310 is displayed indicating that a transient ink layer (e.g., the transient ink layer 504 ) exists for the document 304 , and that the transient ink layer is linked to a particular user represented by the user icon 502 in the ink flag 310 .
- a user selects the ink flag 310 with the pen 124 , which causes the ink content 402 to be returned to display as part of the document 304 .
- the ink content 402, for example, is bound to the transient ink layer 504 , along with other transient ink content generated for the transient ink layer 504 .
- the transient ink layer 504 is accessible by various techniques, such as by selection of the ink flag 310 .
- transient ink content of the transient ink layer 504 is bound (e.g., anchored) to particular portions (e.g., pages, lines, text, and so forth) of the document 304 .
- the user generated the ink content 402 adjacent to a particular section of text.
- the ink content 402 is displayed adjacent to the particular section of text.
- the transient ink layer 504 is cumulative such that a user may add ink content to and remove ink content from the transient ink layer 504 over a span of time and during multiple different interactivity sessions.
- the transient ink layer 504 generally represents a record of multiple user interactions with the document 304 , such as for annotations, proofreading, commenting, and so forth.
- multiple transient layers may be created for the document 304 , such as when significant changes are made to the primary content 306 , when other users apply transient ink to the document 304 , and so forth.
- the ink timer 406 begins elapsing such as discussed above with reference to the scenarios 400 , 500 . Accordingly, the scenario 600 may return to the scenario 400 .
- FIG. 7 depicts an example implementation of the pen 124 in accordance with one or more implementations.
- the pen 124 includes a shaft 700 , which represents a main body and/or chassis of the pen 124 . For instance, various components of the pen 124 are attached to and/or contained within the shaft 700 .
- the pen 124 further includes a tip 702 that extends through a tip portal 704 in a nose 706 of the shaft 700 .
- the tip 702 represents a portion of the pen 124 that can be leveraged to provide input and/or other types of interactions to an input surface, such as the display device 110 and/or others of the touch input devices 116 . For instance, contact between the tip 702 and an input surface causes digital ink input to be applied to the input surface.
- the tip 702 can be retracted and extended relative to the shaft 700 to provide for different input scenarios.
- the pen 124 further includes a pen mode button 708 , which represents a selectable control (e.g., a switch) for switching the pen 124 between different pen input modes.
- different pen input modes enable input from the pen 124 to be utilized and/or interpreted by the ink module 118 in different ways. For instance, selecting the pen mode button 708 causes the pen 124 to transition between a permanent ink mode and a transient ink mode.
- a tip button 710 is attached to the shaft 700 and as further detailed below is selectable to retract and extend the tip 702 .
- the pen 124 is configured to provide digital ink input with the tip 702 extended as illustrated in FIG. 7 . Further, the pen 124 is configured to provide non-digital-ink input with the tip 702 retracted, as further detailed below.
- An eraser portion 712 is attached to the pen 124 adjacent to the tip button 710 .
- the eraser portion 712 is representative of functionality to enable ink content to be erased, such as permanent and/or transient ink.
- the pen 124 further includes internal components 714 , which are representative of components that enable various functionalities of the pen 124 .
- the internal components 714 include active electronics such as logic and processing components for controlling different operating modes of the pen 124 .
- the internal components 714 are configured to transmit and receive wireless signals using any suitable wireless protocol, such as Bluetooth, radio-frequency identifier (RFID), and so forth.
- the pen 124 can leverage the internal components 714 to transmit a signal that indicates a current ink mode of the pen 124 , such as whether the pen 124 is in a permanent or transient ink mode. Further, the pen 124 can leverage the internal components 714 to transmit a signal that indicates whether ink functionality of the pen 124 is enabled or disabled.
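- As an assumption-laden sketch (the transport and field names are not specified by the patent), a pen status report of the kind described above might look like this in TypeScript.

```typescript
// Hypothetical status report a pen might transmit (the transport and field
// names are assumptions, not specified by the patent).
interface PenStatusMessage {
  penId: string;                      // e.g., derived from an embedded identity tag
  inkEnabled: boolean;                // tip extended (ink) vs. retracted (non-ink)
  inkMode: "permanent" | "transient"; // current ink mode of the pen
}

// The receiving device could update its ink handling when a report arrives.
function onPenStatus(message: PenStatusMessage): void {
  console.log(
    `pen ${message.penId}: ink ${message.inkEnabled ? "enabled" : "disabled"}, ` +
      `mode ${message.inkMode}`
  );
}
```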
- FIG. 8 depicts an example implementation of the pen 124 in accordance with one or more implementations.
- the tip 702 is retracted into the shaft 700 .
- a user selects (e.g., presses) the tip button 710 , which causes the tip 702 to at least partially retract through the tip portal 704 into the nose 706 of the shaft 700 .
- the tip portal 704 represents an opening in the nose 706 through which the tip 702 may extend and retract.
- the pen 124 provides non-digital-ink input to an input surface. For instance, contact between the nose 706 and an input surface provides non-digital-ink input, such as for selection input, object manipulation, and so forth.
- selecting the tip button 710 while the tip 702 is retracted causes the tip 702 to extend from the nose 706 , such as depicted in FIG. 7 .
- FIG. 9 depicts an example implementation scenario 900 for receiving digital ink input and non-digital-ink input in accordance with one or more implementations.
- the upper portion of the scenario 900 includes the GUI 302 with the document 304 (introduced above) displayed on the display device 110 .
- the GUI 302 includes a menu region 902 that includes different selectable controls for performing various actions, such as on the document 304 .
- the scenario 900 represents an extension of one or more of the scenarios described above.
- a user manipulates the pen 124 to apply ink content (e.g., digital-ink) to the document 304 .
- contact between the tip 702 and the display device 110 causes ink content to be added to the document 304 .
- the pen 124 is in an ink-enabled mode such that ink is always applied when the pen 124 is placed in contact with the display device 110 .
- the ink content is added as permanent ink that becomes part of the primary content 306 of the document 304 .
- the ink content is added as transient ink of the document 304 . Examples of different transient ink behaviors are discussed throughout.
- the user selects the tip button 710 , which causes the tip 702 to retract into the nose 706 .
- the pen 124 is used to provide non-digital-ink input, such as touch input to the display device 110 .
- Retracting the tip 702 enables the pen 124 to be used for touch input such as a user would provide via a finger.
- placing the pen 124 (e.g., the nose 706 ) in contact with the display device 110 within the document 304 does not cause ink content to be applied to the document 304 .
- Non-digital-ink input can be leveraged for various purposes, such as for manipulating the document 304 within the GUI 302 , selecting selectable controls from the menu region 902 , selecting primary content within the document 304 , and so forth.
- the user places the nose 706 of the pen 124 in contact with the display device 110 and drags (e.g., scrolls) the document 304 within the GUI 302 . Notice that as the user manipulates the document 304 with the tip 702 retracted, ink content is not applied to the document 304 .
- FIG. 10 is a flow diagram that describes steps in a method in accordance with one or more embodiments.
- the method, for instance, describes an example procedure for processing ink according to a current ink mode in accordance with one or more embodiments.
- Step 1000 detects a pen in proximity to an input surface.
- the touch input device 116 detects that the pen 124 is hovered and/or in contact with the touch input device 116 with the tip 702 extended.
- a hover operation can be associated with a particular threshold proximity to an input surface such that hovering the pen 124 at or within the threshold proximity to the input surface is interpreted as a hover operation, but placing the pen 124 farther than the threshold proximity from the input surface is not interpreted as a hover operation.
- Step 1002 ascertains a current ink mode.
- the ink module 118, for example, ascertains an ink mode that is currently active on the computing device 102 . Examples of different ink modes are detailed elsewhere herein, and include a permanent ink mode and a transient ink mode.
- a current ink mode may be automatically selected by the ink module 118 , such as based on an application and/or document context that is currently in focus. For instance, an application 106 may specify a default ink mode that is to be active for the application. Further, some applications may specify ink mode permissions that indicate allowed and disallowed ink modes. A particular application 106 , for example, may specify that a permanent ink mode is not allowed for documents presented by the application, such as to protect documents from being edited.
- a current ink mode is user-selectable, such as in response to user input selecting an ink mode from the ink menu 802 .
- a user may cause a switch from a default ink mode for an application to a different ink mode.
- the user, for example, can select the pen mode button 708 to cause the pen 124 and/or the ink module 118 to switch between different ink modes.
- ascertaining a current ink mode includes ascertaining whether the pen is in an ink-enabled mode or an ink-disabled mode, such as ascertaining whether the tip 702 of the pen 124 is extended or retracted.
- Step 1004 causes a visual affordance identifying the current ink mode to be displayed.
- Examples of such an affordance include a hover target, a visual included as part of an ink flag, and so forth. Examples of different visual affordances are detailed throughout this description and the accompanying drawings.
- Step 1006 processes ink content applied to the input surface according to the current ink mode.
- the ink content, for instance, is processed as permanent ink, transient ink, and so forth.
- if a permanent ink mode is active, the ink content is saved as permanent ink, such as part of a primary content layer of a document.
- if the transient ink mode is active, the ink content is propagated to a transient ink layer of a document. Examples of different mode-specific ink behaviors and actions are detailed elsewhere herein.
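- A minimal sketch, under assumed names, of how the FIG. 10 flow could resolve the current ink mode from an application default, ink mode permissions, and a user selection, and route ink to the corresponding layer; this is illustrative, not the patent's implementation.

```typescript
// Minimal sketch of the FIG. 10 mode resolution under assumed names: the
// current ink mode may come from an application default, be constrained by
// ink mode permissions, or be overridden by a user selection.
type InkModeName = "permanent" | "transient";

interface InkModePolicy {
  defaultMode: InkModeName;    // e.g., a default specified by the application in focus
  allowed: InkModeName[];      // ink mode permissions (allowed modes)
  userSelection?: InkModeName; // explicit user choice, e.g., via an ink menu
}

function ascertainInkMode(policy: InkModePolicy): InkModeName {
  const requested = policy.userSelection ?? policy.defaultMode;
  return policy.allowed.includes(requested) ? requested : policy.defaultMode;
}

function targetLayerFor(mode: InkModeName): "primary-content" | "transient-layer" {
  // Permanent ink is saved to the primary content layer; transient ink is
  // propagated to a transient ink layer of the document.
  return mode === "permanent" ? "primary-content" : "transient-layer";
}

// Example: an application that disallows permanent ink to protect documents.
const mode = ascertainInkMode({
  defaultMode: "transient",
  allowed: ["transient"],
  userSelection: "permanent", // disallowed, so the default applies
});
console.log(mode, targetLayerFor(mode)); // "transient" "transient-layer"
```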
- FIG. 11 is a flow diagram that describes steps in a method in accordance with one or more embodiments.
- the method, for instance, describes an example procedure for a transient ink timer in accordance with one or more implementations.
- the method represents an extension of the method described above with reference to FIG. 10 .
- Step 1100 receives ink content applied to a document via input from a pen to an input surface while in a transient ink mode.
- the ink module 118 processes ink content received from the pen 124 to the display device 110 as transient ink.
- Step 1102 detects that the pen is removed from proximity to the input surface. For instance, the touch input device 116 detects that the pen 124 is not in contact with and is not hovering over a surface of the touch input device 116 , e.g., the display device 110 .
- Step 1104 initiates a timer.
- the timer, for example, is initiated in response to detecting that the pen is removed from proximity to the input surface.
- a visual representation of the timer is presented.
- the visual representation provides a visual cue that the timer is elapsing, and indicates a relative amount (e.g., percentage) of the timer that has elapsed.
- the visual representation, for example, is animated to visually convey that the timer is elapsing.
- a visual representation of a timer is discussed above with reference to FIGS. 4 and 5 .
- Step 1106 ascertains whether the pen is detected at the input surface before the timer expires. For instance, the ink module 118 ascertains whether the pen 124 is detected in contact with and/or hovering over the touch input device 116 prior to expiry of the timer. If the pen is detected at the input surface prior to expiry of the timer (“Yes”), step 1108 resets the timer and the process returns to step 1100 .
- if the pen is not detected at the input surface prior to expiry of the timer (“No”), step 1110 removes the ink content from the document and propagates the ink content to a transient layer for the document. For instance, in response to expiry of the timer, the transient ink content is removed from display and propagated to a transient data layer for the document that is separate from a primary content layer of the document.
- a new transient ink layer is created for the document, and the transient ink content is propagated to the new transient ink layer.
- the transient ink content is propagated to an existing transient ink layer.
- the transient ink layer may represent an accumulation of transient ink provided by a user over multiple different interactions with the document and over a period of time.
- the transient ink layer may be associated with a particular user, e.g., a user that applies the transient ink content to the document.
- the transient ink is linked to the particular user and may subsequently be accessed by the user.
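- The following TypeScript sketch is an illustrative approximation of the FIG. 11 timer behavior (start on pen removal, reset on re-detection, propagate ink to a transient layer on expiry); the class and method names are assumptions.

```typescript
// Illustrative approximation of the FIG. 11 timer behavior; class and method
// names are assumptions. The timer starts when the pen leaves the input
// surface, resets if the pen is detected again, and invokes a callback on
// expiry (e.g., to move the ink to a transient layer).
class TransientInkTimer {
  private handle: ReturnType<typeof setTimeout> | undefined;

  constructor(
    private readonly durationMs: number,
    private readonly onExpire: () => void
  ) {}

  penRemoved(): void {
    this.reset();
    this.handle = setTimeout(this.onExpire, this.durationMs);
  }

  penDetected(): void {
    // Pen re-detected (hover or contact) before expiry: cancel the countdown.
    this.reset();
  }

  private reset(): void {
    if (this.handle !== undefined) {
      clearTimeout(this.handle);
      this.handle = undefined;
    }
  }
}

// Usage: a 30-second timer that propagates pending ink to the transient layer.
const inkTimer = new TransientInkTimer(30_000, () => {
  // remove the ink content from display and append it to the transient layer
});
inkTimer.penRemoved();  // pen left the surface: countdown starts
inkTimer.penDetected(); // pen came back: countdown resets
```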
- FIG. 12 is a flow diagram that describes steps in a method in accordance with one or more embodiments.
- the method, for instance, describes an example procedure for processing pen input based on a pen mode in accordance with one or more implementations.
- the method represents an extension of one or more of the methods described above.
- Step 1200 detects pen input from a pen to an input surface.
- the ink module 118 detects that the pen 124 is in contact with the display device 110 and provides input to the display device 110 .
- Step 1202 ascertains whether the pen is in an ink-enabled mode or an ink-disabled mode.
- the ink module 118, for example, ascertains whether the pen 124 is in an ink-enabled or ink-disabled mode.
- the pen 124 can transmit a wireless signal that identifies whether the pen 124 is in an ink-enabled or ink-disabled mode.
- the ink module 118 can detect a state of the tip 702 and can ascertain whether the pen 124 is in an ink-enabled or ink-disabled mode based on the state.
- if the ink module 118 detects that the tip 702 is extended, the ink module 118 ascertains that the pen 124 is in an ink-enabled mode. However, if the ink module 118 detects that the tip 702 is retracted, the ink module 118 ascertains that the pen 124 is in an ink-disabled mode.
- the ink module 118 detects whether the tip 702 is extended or retracted, and enables or disables ink functionality of the client device 102 based on whether the tip 702 is extended or retracted.
- if the pen is in an ink-enabled mode, step 1204 processes the pen input as digital ink input. For instance, the pen input is converted to ink content that is displayed on the display device 110 .
- the ink content may be permanent ink or transient ink.
- the pen 124 will apply ink whenever the pen is in an ink-enabled mode.
- if the pen is in an ink-disabled mode, step 1206 processes the pen input as non-digital-ink input.
- non-digital-ink input does not cause ink content to be applied.
- non-digital-ink input is used for other purposes than applying ink content, such as object selection and manipulation, document navigation, and so forth.
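- As an illustration only, the FIG. 12 dispatch between digital ink input and non-digital-ink input might be sketched as follows, assuming the tip state (or a pen-reported mode) is available to the ink module; the event shape and handler signatures are assumptions.

```typescript
// Illustration only: dispatching pen input as digital ink or non-digital-ink
// input based on whether the tip is extended (ink-enabled) or retracted
// (ink-disabled). The event shape and handler signatures are assumptions.
interface PenInputEvent {
  x: number;
  y: number;
  tipExtended: boolean; // a retracted tip implies the ink-disabled mode
}

function handlePenInput(
  event: PenInputEvent,
  applyInk: (x: number, y: number) => void,
  handleTouch: (x: number, y: number) => void
): void {
  if (event.tipExtended) {
    applyInk(event.x, event.y);    // digital ink input
  } else {
    handleTouch(event.x, event.y); // selection, manipulation, navigation, etc.
  }
}
```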
- FIG. 13 is a flow diagram that describes steps in a method in accordance with one or more embodiments.
- the method, for instance, describes an example procedure for processing pen input based on an ink mode in accordance with one or more implementations.
- the method represents an extension of one or more of the methods described above.
- Step 1300 receives ink input to an input surface from a pen.
- the ink module 118 detects that ink input is provided by the pen 124 .
- the ink module 118 detects that the ink input is provided when the pen 124 is in an ink-enabled mode.
- Step 1302 ascertains whether the pen is in a permanent ink mode or a transient ink mode.
- the pen 124 communicates a signal to the client device 102 indicating whether the pen 124 is in a permanent ink mode or a transient ink mode.
- the signal, for instance, is communicated as a wireless signal from the pen 124 .
- the signal is communicated via contact between the pen 124 and an input surface, e.g., the display device 110 .
- the signal can be electrically conducted across the tip 702 to the display device 110 and to internal components of the client device 102 to convey information identifying a current mode of the pen 124 .
- if the pen is in a permanent ink mode, step 1304 processes the ink input as permanent ink. For instance, the ink input is displayed and/or added to a permanent ink layer.
- if the pen is in a transient ink mode, step 1306 processes the ink input as transient ink.
- the ink input, for example, is added to a transient ink layer that may be subsequently accessed. Examples of different permanent and transient ink behaviors are detailed throughout.
- FIG. 14 is a flow diagram that describes steps in a method in accordance with one or more embodiments.
- the method, for instance, describes an example procedure for erasing digital ink in accordance with one or more implementations.
- the method represents an extension of one or more of the methods described above.
- Step 1400 receives an indication of contact between an eraser portion of a pen and digital ink displayed on an input surface.
- the ink module 118 detects that the eraser portion 712 of the pen 124 is in contact with digital ink displayed on the display device 110 .
- Step 1402 causes the digital ink to be removed from the input surface in response to the indication.
- the ink module 118 deletes the digital ink. For instance, the digital ink is removed from display and data describing the digital ink is deleted.
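- Purely illustrative: one way to realize the FIG. 14 erase behavior is to drop any stroke with a point within the eraser's contact radius, as in this TypeScript sketch with assumed types.

```typescript
// Purely illustrative sketch of the FIG. 14 erase behavior with assumed types:
// strokes with any point within the eraser's contact radius are removed, which
// both takes them off the display and deletes their data.
interface Stroke {
  id: string;
  points: Array<{ x: number; y: number }>;
}

function eraseAt(
  strokes: Stroke[],
  contact: { x: number; y: number },
  radius: number
): Stroke[] {
  return strokes.filter(
    (stroke) =>
      !stroke.points.some(
        (p) => Math.hypot(p.x - contact.x, p.y - contact.y) <= radius
      )
  );
}
```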
- FIG. 15 illustrates an example system generally at 1500 that includes an example computing device 1502 that is representative of one or more computing systems and/or devices that may implement various techniques described herein.
- the client device 102 discussed above with reference to FIG. 1 can be embodied as the computing device 1502 .
- the computing device 1502 may be, for example, a server of a service provider, a device associated with the client (e.g., a client device), an on-chip system, and/or any other suitable computing device or computing system.
- the example computing device 1502 as illustrated includes a processing system 1504 , one or more computer-readable media 1506 , and one or more Input/Output (I/O) Interfaces 1508 that are communicatively coupled, one to another.
- the computing device 1502 may further include a system bus or other data and command transfer system that couples the various components, one to another.
- a system bus can include any one or combination of different bus structures, such as a memory bus or memory controller, a peripheral bus, a universal serial bus, and/or a processor or local bus that utilizes any of a variety of bus architectures.
- a variety of other examples are also contemplated, such as control and data lines.
- the processing system 1504 is representative of functionality to perform one or more operations using hardware. Accordingly, the processing system 1504 is illustrated as including hardware element 1510 that may be configured as processors, functional blocks, and so forth. This may include implementation in hardware as an application specific integrated circuit or other logic device formed using one or more semiconductors.
- the hardware elements 1510 are not limited by the materials from which they are formed or the processing mechanisms employed therein.
- processors may be comprised of semiconductor(s) and/or transistors (e.g., electronic integrated circuits (ICs)).
- processor-executable instructions may be electronically-executable instructions.
- the computer-readable media 1506 is illustrated as including memory/storage 1512 .
- the memory/storage 1512 represents memory/storage capacity associated with one or more computer-readable media.
- the memory/storage 1512 may include volatile media (such as random access memory (RAM)) and/or nonvolatile media (such as read only memory (ROM), Flash memory, optical disks, magnetic disks, and so forth).
- the memory/storage 1512 may include fixed media (e.g., RAM, ROM, a fixed hard drive, and so on) as well as removable media (e.g., Flash memory, a removable hard drive, an optical disc, and so forth).
- the computer-readable media 1506 may be configured in a variety of other ways as further described below.
- Input/output interface(s) 1508 are representative of functionality to allow a user to enter commands and information to computing device 1502 , and also allow information to be presented to the user and/or other components or devices using various input/output devices.
- input devices include a keyboard, a cursor control device (e.g., a mouse), a microphone (e.g., for voice recognition and/or spoken input), a scanner, touch functionality (e.g., capacitive or other sensors that are configured to detect physical touch), a camera (e.g., which may employ visible or non-visible wavelengths such as infrared frequencies to detect movement that does not involve touch as gestures), and so forth.
- Examples of output devices include a display device (e.g., a monitor or projector), speakers, a printer, a network card, tactile-response device, and so forth.
- the computing device 1502 may be configured in a variety of ways as further described below to support user interaction.
- modules include routines, programs, objects, elements, components, data structures, and so forth that perform particular tasks or implement particular abstract data types.
- the term “module” generally represents software, firmware, hardware, or a combination thereof.
- the features of the techniques described herein are platform-independent, meaning that the techniques may be implemented on a variety of commercial computing platforms having a variety of processors.
- Computer-readable media may include a variety of media that may be accessed by the computing device 1502 .
- computer-readable media may include “computer-readable storage media” and “computer-readable signal media.”
- Computer-readable storage media may refer to media and/or devices that enable persistent storage of information in contrast to mere signal transmission, carrier waves, or signals per se. Computer-readable storage media do not include signals per se.
- the computer-readable storage media includes hardware such as volatile and non-volatile, removable and non-removable media and/or storage devices implemented in a method or technology suitable for storage of information such as computer readable instructions, data structures, program modules, logic elements/circuits, or other data.
- Examples of computer-readable storage media may include, but are not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, hard disks, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or other storage device, tangible media, or article of manufacture suitable to store the desired information and which may be accessed by a computer.
- Computer-readable signal media may refer to a signal-bearing medium that is configured to transmit instructions to the hardware of the computing device 1502 , such as via a network.
- Signal media typically may embody computer readable instructions, data structures, program modules, or other data in a modulated data signal, such as carrier waves, data signals, or other transport mechanism.
- Signal media also include any information delivery media.
- modulated data signal means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal.
- communication media include wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, radio frequency (RF), infrared, and other wireless media.
- hardware elements 1510 and computer-readable media 1506 are representative of instructions, modules, programmable device logic and/or fixed device logic implemented in a hardware form that may be employed in some embodiments to implement at least some aspects of the techniques described herein.
- Hardware elements may include components of an integrated circuit or on-chip system, an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA), a complex programmable logic device (CPLD), and other implementations in silicon or other hardware devices.
- A hardware element may operate as a processing device that performs program tasks defined by instructions, modules, and/or logic embodied by the hardware element, as well as a hardware device utilized to store instructions for execution, e.g., the computer-readable storage media described previously.
- Software, hardware, or other program modules may be implemented as one or more instructions and/or logic embodied on some form of computer-readable storage media and/or by one or more hardware elements 1510.
- The computing device 1502 may be configured to implement particular instructions and/or functions corresponding to the software and/or hardware modules. Accordingly, implementation of modules that are executable by the computing device 1502 as software may be achieved at least partially in hardware, e.g., through use of computer-readable storage media and/or hardware elements 1510 of the processing system.
- The instructions and/or functions may be executable/operable by one or more articles of manufacture (for example, one or more computing devices 1502 and/or processing systems 1504) to implement techniques, modules, and examples described herein.
- The example system 1500 enables ubiquitous environments for a seamless user experience when running applications on a personal computer (PC), a television device, and/or a mobile device. Services and applications run substantially similarly in all three environments for a common user experience when transitioning from one device to the next while utilizing an application, playing a video game, watching a video, and so on.
- Multiple devices are interconnected through a central computing device.
- The central computing device may be local to the multiple devices or may be located remotely from the multiple devices.
- The central computing device may be a cloud of one or more server computers that are connected to the multiple devices through a network, the Internet, or other data communication link.
- This interconnection architecture enables functionality to be delivered across multiple devices to provide a common and seamless experience to a user of the multiple devices.
- Each of the multiple devices may have different physical requirements and capabilities, and the central computing device uses a platform to enable the delivery of an experience to the device that is both tailored to the device and yet common to all devices.
- A class of target devices is created and experiences are tailored to the generic class of devices.
- A class of devices may be defined by physical features, types of usage, or other common characteristics of the devices.
- The computing device 1502 may assume a variety of different configurations, such as for computer 1514, mobile 1516, and television 1518 uses. Each of these configurations includes devices that may have generally different constructs and capabilities, and thus the computing device 1502 may be configured according to one or more of the different device classes. For instance, the computing device 1502 may be implemented as the computer 1514 class of device, which includes a personal computer, a desktop computer, a multi-screen computer, a laptop computer, a netbook, and so on.
- The computing device 1502 may also be implemented as the mobile 1516 class of device, which includes mobile devices such as a mobile phone, a portable music player, a portable gaming device, a tablet computer, a wearable device, a multi-screen computer, and so on.
- The computing device 1502 may also be implemented as the television 1518 class of device, which includes devices having or connected to generally larger screens in casual viewing environments. These devices include televisions, set-top boxes, gaming consoles, and so on.
- The techniques described herein may be supported by these various configurations of the computing device 1502 and are not limited to the specific examples described herein.
- Functionalities discussed with reference to the client device 102 and/or ink module 118 may be implemented all or in part through use of a distributed system, such as over a "cloud" 1520 via a platform 1522 as described below.
- The cloud 1520 includes and/or is representative of a platform 1522 for resources 1524.
- The platform 1522 abstracts underlying functionality of hardware (e.g., servers) and software resources of the cloud 1520.
- The resources 1524 may include applications and/or data that can be utilized while computer processing is executed on servers that are remote from the computing device 1502.
- Resources 1524 can also include services provided over the Internet and/or through a subscriber network, such as a cellular or Wi-Fi network.
- The platform 1522 may abstract resources and functions to connect the computing device 1502 with other computing devices.
- The platform 1522 may also serve to abstract scaling of resources to provide a corresponding level of scale to encountered demand for the resources 1524 that are implemented via the platform 1522.
- Implementation of functionality described herein may be distributed throughout the system 1500.
- The functionality may be implemented in part on the computing device 1502 as well as via the platform 1522 that abstracts the functionality of the cloud 1520.
- Aspects of the methods described herein may be implemented in hardware, firmware, or software, or a combination thereof.
- The methods are shown as a set of steps that specify operations performed by one or more devices and are not necessarily limited to the orders shown for performing the operations by the respective blocks. Further, an operation shown with respect to a particular method may be combined and/or interchanged with an operation of a different method in accordance with one or more implementations. Aspects of the methods can be implemented via interaction between various entities discussed above with reference to the environment 100.
- In one or more implementations, a pen apparatus includes: a shaft; a tip portion that is retractably engaged with the shaft such that the tip portion is extendable from the shaft and retractable at least partially into the shaft; and a tip button that is selectable to extend and retract the tip portion relative to the shaft, the pen apparatus configured to apply digital ink to an input surface of a computing device in response to interaction of the pen with the input surface with the tip portion extended from the shaft, and configured to apply non-digital-ink input to the input surface of the computing device in response to interaction of the pen with the input surface with the tip portion at least partially retracted into the shaft.
- In one or more implementations, the non-digital-ink input includes touch input to the input surface such that digital ink is not applied to the input surface in response to interaction of the pen with the input surface with the tip portion at least partially retracted into the shaft.
- In one or more implementations, a system for processing digital ink input and non-digital-ink input includes: an input surface; one or more processors; and one or more computer-readable storage media storing computer-executable instructions that, responsive to execution by the one or more processors, cause the system to perform operations including: detecting pen input from a pen to the input surface; processing the pen input as digital ink input in response to ascertaining that the pen is in an ink-enabled mode; and processing the pen input as non-digital-ink input in response to ascertaining that the pen is in an ink-disabled mode.
- In one or more implementations, the instructions further include ascertaining whether the pen is in the ink-enabled mode or the ink-disabled mode by determining whether a tip portion of the pen is in an extended or retracted position.
- In one or more implementations, processing the pen input as non-digital-ink input includes processing the pen input such that digital ink is not applied to the input surface in response to contact between the pen and the input surface.
- In one or more implementations, a computer-implemented method for processing permanent ink input and transient ink input includes: receiving, by a computing system, ink input to an input surface from a pen apparatus; processing, by the computing system, the ink input as transient ink in response to ascertaining that the pen apparatus is in a transient ink mode; and processing, by the computing system, the ink input as permanent ink in response to ascertaining that the pen apparatus is in a permanent ink mode.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Multimedia (AREA)
- Computer Networks & Wireless Communication (AREA)
- Signal Processing (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Techniques for pen input modes for digital ink are described. According to various embodiments, a pen apparatus is described that is switchable between providing digital ink input and non-digital-ink input. According to various embodiments, a pen apparatus is switchable between different ink input modes. For instance, the pen apparatus is switchable between a permanent ink mode in which ink is applied as permanent ink, and a transient ink mode in which ink is applied as transient ink.
Description
- This application claims priority under 35 U.S.C. §119(e) to U.S. Provisional Patent Application No. 62/002,648, Attorney Docket Number 355121.01, filed May 23, 2014 and titled “Ink,” the entire disclosure of which is incorporated by reference in its entirety.
- Devices today (e.g., computing devices) typically support a variety of different input techniques. For instance, a particular device may receive input from a user via a keyboard, a mouse, voice input, touch input (e.g., to a touchscreen), and so forth. One particularly intuitive input technique enables a user to utilize a touch instrument (e.g., a pen, a stylus, a finger, and so forth) to provide freehand input to a touch-sensing functionality such as a touchscreen, which is interpreted as digital ink. The freehand input may be converted to a corresponding visual representation on a display, such as for taking notes, for creating and editing an electronic document, and so forth. Many current techniques for digital ink, however, typically provide limited ink functionality.
- This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
- Techniques for pen input modes for digital ink are described. According to various embodiments, a pen apparatus is described that is switchable between providing digital ink input and non-digital-ink input. According to various embodiments, a pen apparatus is switchable between different ink input modes. For instance, the pen apparatus is switchable between a permanent ink mode in which ink is applied as permanent ink, and a transient ink mode in which ink is applied as transient ink.
- The detailed description is described with reference to the accompanying figures. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The use of the same reference numbers in different instances in the description and the figures may indicate similar or identical items.
- FIG. 1 is an illustration of an environment in an example implementation that is operable to employ techniques discussed herein in accordance with one or more embodiments.
- FIG. 2 depicts an example implementation scenario for a permanent ink mode in accordance with one or more embodiments.
- FIG. 3 depicts an example implementation scenario for a transient ink mode in accordance with one or more embodiments.
- FIG. 4 depicts an example implementation scenario for a transient ink mode in accordance with one or more embodiments.
- FIG. 5 depicts an example implementation scenario for a transient ink mode in accordance with one or more embodiments.
- FIG. 6 depicts an example implementation scenario for a transient ink mode in accordance with one or more embodiments.
- FIG. 7 depicts an example implementation of a pen apparatus in accordance with one or more embodiments.
- FIG. 8 depicts an example implementation of a pen apparatus in accordance with one or more embodiments.
- FIG. 9 depicts an example implementation scenario for receiving digital ink input and non-digital-ink input in accordance with one or more embodiments.
- FIG. 10 is a flow diagram that describes steps in a method for processing ink according to a current ink mode in accordance with one or more embodiments.
- FIG. 11 is a flow diagram that describes steps in a method for a transient ink timer in accordance with one or more embodiments.
- FIG. 12 is a flow diagram that describes steps in a method for processing pen input based on a pen mode in accordance with one or more embodiments.
- FIG. 13 is a flow diagram that describes steps in a method for processing pen input based on an ink mode in accordance with one or more embodiments.
- FIG. 14 is a flow diagram that describes steps in a method for erasing digital ink in accordance with one or more embodiments.
- FIG. 15 illustrates an example system and computing device as described with reference to FIG. 1, which are configured to implement embodiments of techniques described herein.
- Overview
- Techniques for pen input modes for digital ink are described. Generally, ink refers to freehand input to a pressure-sensing functionality such as a touchscreen and/or digitizer screen, which is interpreted as digital ink. Digital ink is referred to herein interchangeably as "ink" and "digital ink." Ink may be provided in various ways, such as using a pen (e.g., an active pen, a passive pen, and so forth), a stylus, a finger, and so forth.
- According to various implementations, a pen apparatus is described that is switchable between providing digital ink input and non-digital-ink input. For instance, in an ink-enabled mode, the pen apparatus causes ink content to be applied to an input surface such as an active display. In an ink-disabled mode, the pen apparatus provides input modes other than ink content, such as touch input for object manipulation and selection, document navigation, and so forth. For instance, when the pen apparatus is in the ink-disabled mode, contact between the pen apparatus and an input surface does not cause ink to be applied to the input surface. Thus, a switchable pen apparatus enables a user to provide different types of input from a single device.
- According to various implementations, a pen apparatus is switchable to apply ink according to different ink modes. For instance, in an ink-enabled mode a pen will apply ink whenever input is provided to an input surface. However, the type of ink that is applied can be switched between different types, such as based on an application context, in response to a user selection of an ink mode, and so forth. For instance, the pen apparatus is switchable between a permanent ink mode in which ink is applied as permanent ink, and a transient ink mode in which ink is applied as transient ink.
- In the following discussion, an example environment is first described that is operable to employ techniques described herein. Next, a section entitled “Example Implementation Scenarios and Procedures” describes some example implementation scenarios and methods for pen input modes for digital ink in accordance with one or more embodiments. Finally, a section entitled “Example System and Device” describes an example system and device that are operable to employ techniques discussed herein in accordance with one or more embodiments.
- Example Environment
- FIG. 1 is an illustration of an environment 100 in an example implementation that is operable to employ techniques for pen input modes for digital ink discussed herein. The environment 100 includes a client device 102, which can be embodied as any suitable device such as, by way of example and not limitation, a smartphone, a tablet computer, a portable computer (e.g., a laptop), a desktop computer, a wearable device, and so forth. In at least some implementations, the client device 102 represents a smart appliance, such as an Internet of Things ("IoT") device. Thus, the client device 102 may range from a system with significant processing power to a lightweight device with minimal processing power. One of a variety of different examples of a client device 102 is shown and described below in FIG. 15.
- The client device 102 includes a variety of different functionalities that enable various activities and tasks to be performed. For instance, the client device 102 includes an operating system 104, applications 106, and a communication module 108. Generally, the operating system 104 is representative of functionality for abstracting various system components of the client device 102, such as hardware, kernel-level modules and services, and so forth. The operating system 104, for instance, can abstract various components (e.g., hardware, software, and firmware) of the client device 102 to the applications 106 to enable interaction between the components and the applications 106.
- The applications 106 represent functionalities for performing different tasks via the client device 102. Examples of the applications 106 include a word processing application, a spreadsheet application, a web browser, a gaming application, and so forth. The applications 106 may be installed locally on the client device 102 to be executed via a local runtime environment, and/or may represent portals to remote functionality, such as cloud-based services, web apps, and so forth. Thus, the applications 106 may take a variety of forms, such as locally-executed code, portals to remotely hosted services, and so forth.
- The communication module 108 is representative of functionality for enabling the client device 102 to communicate over wired and/or wireless connections. For instance, the communication module 108 represents hardware and logic for communication via a variety of different wired and/or wireless technologies and protocols.
- The client device 102 further includes a display device 110, input mechanisms 112 including a digitizer 114 and touch input devices 116, and an ink module 118. The display device 110 generally represents functionality for visual output for the client device 102. Additionally, the display device 110 represents functionality for receiving various types of input, such as touch input, pen input, and so forth. The input mechanisms 112 generally represent different functionalities for receiving input to the client device 102. Examples of the input mechanisms 112 include gesture-sensitive sensors and devices (e.g., touch-based sensors and movement-tracking sensors (e.g., camera-based)), a mouse, a keyboard, a stylus, a touch pad, accelerometers, a microphone with accompanying voice recognition software, and so forth. The input mechanisms 112 may be separate from or integral with the display device 110; integral examples include gesture-sensitive displays with integrated touch-sensitive or motion-sensitive sensors. The digitizer 114 represents functionality for converting various types of input to the display device 110 and the touch input devices 116 into digital data that can be used by the client device 102 in various ways, such as for generating digital ink.
- According to various implementations, the ink module 118 represents functionality for performing various aspects of the techniques for pen input modes for digital ink discussed herein. Various functionalities of the ink module 118 are discussed below. The ink module 118 includes a transient layer application programming interface (API) 120 and a permanent layer API 122. The transient layer API 120 represents functionality for enabling interaction with a transient ink layer, and the permanent layer API 122 represents functionality for enabling ink interaction with a permanent object (e.g., document) layer. In at least some implementations, the transient layer API 120 and the permanent layer API 122 may be utilized (e.g., by the applications 106) to access transient ink functionality and permanent ink functionality, respectively.
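- The following TypeScript sketch illustrates one possible shape for the two API surfaces described above. The interface and member names are hypothetical and are not taken from the described ink module 118; the sketch only shows how an application might target transient versus permanent ink through separate entry points.
    // Hypothetical sketch of the ink module's two API surfaces.
    interface InkStroke {
      points: Array<{ x: number; y: number; pressure: number }>;
      color: string;
    }
    // Transient layer API: ink that lives in a separate annotation layer and can expire.
    interface TransientLayerApi {
      addStroke(documentId: string, stroke: InkStroke): void;
      // Returns the accumulated transient layer for a given user, if any.
      getLayer(documentId: string, userId: string): InkStroke[] | undefined;
      clear(documentId: string, userId: string): void;
    }
    // Permanent layer API: ink that becomes part of the document's primary content.
    interface PermanentLayerApi {
      addStroke(documentId: string, stroke: InkStroke): void;
    }
    // An application chooses which surface to call based on the active ink mode.
    function routeStroke(
      mode: "permanent" | "transient",
      transient: TransientLayerApi,
      permanent: PermanentLayerApi,
      documentId: string,
      stroke: InkStroke
    ): void {
      if (mode === "permanent") {
        permanent.addStroke(documentId, stroke);
      } else {
        transient.addStroke(documentId, stroke);
      }
    }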
- The environment 100 further includes a pen 124, which is representative of an input device for providing input to the display device 110. Generally, the pen 124 is in a form factor of a traditional pen but includes functionality for interacting with the display device 110 and other functionality of the client device 102. In at least some implementations, the pen 124 is an active pen that includes electronic components for interacting with the client device 102. The pen 124, for instance, includes a battery that can provide power to internal components of the pen 124.
- Alternatively or additionally, the pen 124 may include a magnet or other functionality that supports hover detection over the display device 110. This is not intended to be limiting, however, and in at least some implementations the pen 124 may be passive, e.g., a stylus without internal electronics. Generally, the pen 124 is representative of an input device that can provide input that can be differentiated from other types of input by the client device 102. For instance, the digitizer 114 is configured to differentiate between input provided via the pen 124 and input provided by a different input mechanism such as a user's finger, a stylus, and so forth. Further details concerning the pen 124 are provided below.
- Transient Ink and Permanent Ink
- According to various implementations, ink can be applied in different ink modes including a transient ink mode and a permanent ink mode. Generally, transient ink refers to ink that is temporary and that can be used for various purposes, such as invoking particular actions, annotating a document, and so forth. For instance, in transient implementations, ink can be used for annotation layers for electronic documents, temporary visual emphasis, text recognition, invoking various commands and functionalities, and so forth.
- Permanent ink generally refers to implementations where ink becomes a part of the underlying object, such as for creating a document, writing on a document (e.g., for annotation and/or editing), applying ink to graphics, and so forth. Permanent ink, for example, can be considered as a graphics object, such as for note taking, for creating visual content, and so forth.
- In at least some implementations, a pen (e.g., the pen 124) applies ink whenever the pen is in an ink-enabled mode and is in contact with an input surface, such as the
display device 104 and/or other input surface. Further, a pen can apply ink across many different applications, platforms, and services. In one or more implementations, an application and/or service can specify how ink is used in relation to an underlying object, such as a word processing document, a spreadsheet and so forth. For instance, in some scenarios ink is applied as transient ink, and other scenarios ink is applied as permanent ink. Examples of different implementations and attributes of transient ink and permanent ink are detailed below. - Example Implementation Scenarios and Procedures
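- As a minimal sketch of the application-specified behavior described above, the snippet below models how an application or service might declare a default ink mode and the modes it allows. The names and structure are assumptions for illustration only, not part of the described system.
    type InkMode = "permanent" | "transient";
    // Hypothetical per-application ink policy.
    interface InkPolicy {
      defaultMode: InkMode;
      allowedModes: InkMode[];
    }
    // Resolve the mode to use, honoring the application's permissions.
    function resolveInkMode(policy: InkPolicy, requested?: InkMode): InkMode {
      if (requested && policy.allowedModes.includes(requested)) {
        return requested;
      }
      return policy.defaultMode;
    }
    // Example: a read-only viewer that never allows permanent ink.
    const viewerPolicy: InkPolicy = { defaultMode: "transient", allowedModes: ["transient"] };
    console.log(resolveInkMode(viewerPolicy, "permanent")); // "transient"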
- This section describes some example implementation scenarios and example procedures for ink modes in accordance with one or more implementations. The implementation scenarios and procedures may be implemented in the
environment 100 described above, thesystem 1500 ofFIG. 15 , and/or any other suitable environment. The implementation scenarios and procedures, for example, describe example operations of theclient device 102 and theink module 118. While the implementation scenarios and procedures are discussed with reference to a particular application, it is to be appreciated that techniques for pen input modes for digital ink discussed herein are applicable across a variety of different applications, services, and environments. In at least some embodiments, steps described for the various procedures are implemented automatically and independent of user interaction. -
FIG. 2 depicts anexample implementation scenario 200 for a permanent ink mode in accordance with one or more implementations. The upper portion of thescenario 200 includes a graphical user interface (GUI) 202 displayed on thedisplay device 110. Generally, theGUI 202 represents a GUI for a particular functionality, such as an instance of theapplications 106. Also depicted is a user holding thepen 124. Displayed within theGUI 202 is adocument 204, e.g., an electronic document generated via one of theapplications 106. - Proceeding to the lower portion of the
scenario 200, the user brings thepen 124 in proximity to the surface of thedisplay device 110 and within theGUI 202. Thepen 124, for instance, is placed within a particular distance of the display device 110 (e.g., less than 2 centimeters) but not in contact with thedisplay device 110. This behavior is generally referred to herein as “hovering” thepen 124. In response to detecting proximity of thepen 124, a hovertarget 206 is displayed within theGUI 202 and at a point within theGUI 202 that is directly beneath the tip of thepen 124. Generally, the hovertarget 206 represents a visual affordance that indicates that ink functionality is active such that a user may apply ink to thedocument 204. - According to various implementations, the visual appearance (e.g., shape, color, shading, and so forth) of the hover
target 206 provides a visual cue indicating a current ink mode that is active. In thescenario 200, the hover target is presented as a solid circle, which indicates that a permanent ink mode is active. For instance, if the user proceeds to put thepen 124 in contact with thedisplay device 110 to apply ink to thedocument 204 in a permanent ink mode, the ink will become part of thedocument 204, e.g., will be added to a primary content layer of thedocument 204. Consider, for example, that the text (e.g., primary content) displayed in thedocument 204 was created via ink input in a permanent ink mode. Thus, ink applied in a permanent ink mode represents a permanent ink layer that is added to a primary content layer of thedocument 204. - In further response to detecting hovering of the
pen 124, anink flag 208 is visually presented adjacent to and/or at least partially overlaying a portion of thedocument 204. Generally, theink flag 208 represents a visual affordance that indicates that ink functionality is active such that a user may apply ink to thedocument 204. In at least some implementations, theink flag 208 may be presented additionally or alternatively to the hovertarget 206. In this particular example, theink flag 208 includes a visual cue indicating a current ink mode that is active. In thescenario 200, theink flag 208 includes a solid circle, which indicates that a permanent ink mode is active. As further detailed below, theink flag 208 is selectable to cause an ink menu to be displayed that includes various ink-related functionalities, options, and settings that can be applied. -
- FIG. 3 depicts an example implementation scenario 300 for a transient ink mode in accordance with one or more implementations. The upper portion of the scenario 300 includes a graphical user interface (GUI) 302 displayed on the display device 110. Generally, the GUI 302 represents a GUI for a particular functionality, such as an instance of the applications 106. Displayed within the GUI 302 is a document 304, e.g., an electronic document generated via one of the applications 106. The document 304 includes primary content 306, which represents content generated as part of a primary content layer for the document 304. For instance, in this particular example the document 304 is a text-based document, and thus the primary content 306 includes text that is populated to the document. Various other types of documents and primary content may be employed, such as graphics, multimedia, web content, and so forth.
- As further illustrated, a user is hovering the pen 124 within a certain proximity of the surface of the display device 110, such as discussed above with reference to the scenario 200. In response, a hover target 308 is displayed within the document 304 and beneath the tip of the pen. In this particular example, the hover target 308 is presented as a hollow circle, thus indicating that a transient ink mode is active. For instance, if the user proceeds to apply ink to the document 304, the ink will behave according to a transient ink mode. Examples of different transient ink behaviors are detailed elsewhere herein.
- Further in response to the user hovering the pen 124 over the display device 110, an ink flag 310 is presented. In this particular example, the ink flag 310 includes a hollow circle 312, thus providing a visual cue that a transient ink mode is active.
- Proceeding to the lower portion of the scenario 300, the user removes the pen 124 from proximity to the display device 110. In response, the hover target 308 and the ink flag 310 are removed from the display device 110. For instance, in at least some implementations, a hover target and/or an ink flag are presented when the pen 124 is detected as being hovered over the display device 110, and are removed from the display device 110 when the pen 124 is removed such that the pen 124 is no longer detected as being hovered over the display device 110. This is not intended to be limiting, however, and in at least some implementations an ink flag may be persistently displayed to indicate that inking functionality is active and/or available.
- FIG. 4 depicts an example implementation scenario 400 for a transient ink mode in accordance with one or more implementations. The upper portion of the scenario 400 includes the GUI 302 with the document 304 (introduced above) displayed on the display device 110. In at least some implementations, the scenario 400 represents an extension of the scenario 300, above.
- In the upper portion of the scenario 400, a user applies ink content 402 to the document 304 using the pen 124. In this particular scenario, the ink content 402 corresponds to an annotation of the document 304. It is to be appreciated, however, that a variety of different types of transient ink other than annotations may be employed. Notice that as the user is applying the ink content 402, a hover target is not displayed. For instance, in at least some implementations, when the pen 124 transitions from a hover position to contact with the display device 110, a hover target is removed. Notice also that the ink flag 310 includes the hollow circle 312, indicating that the ink content 402 is applied according to a transient ink mode.
- Proceeding to the lower portion of the scenario 400, the user lifts the pen 124 from the display device 110 such that the pen 124 is not detected, e.g., the pen 124 is not in contact with the display device 110 and is not in close enough proximity to the display device 110 to be detected as hovering. In response to the pen 124 no longer being detected in contact with or in proximity to the display device 110, an ink timer 406 begins running. For instance, the ink timer 406 begins counting down from a specific time value, such as 30 seconds, 60 seconds, and so forth. Generally, the ink timer 406 is representative of functionality to implement a countdown function, such as for tracking time between user interactions with the display device 110 via the pen 124. The ink timer 406, for example, represents a functionality of the ink module 118.
- As a visual cue that the ink timer 406 is elapsing, the hollow circle 312 begins to unwind, e.g., begins to disappear from the ink flag 310. In at least some implementations, the hollow circle 312 unwinds at a rate that corresponds to the countdown of the ink timer 406. For instance, when the ink timer 406 is elapsed by 50%, then 50% of the hollow circle 312 is removed from the ink flag 310. Thus, unwinding of the hollow circle 312 provides a visual cue that the ink timer 406 is elapsing, and indicates how much of the ink timer has elapsed and/or remains to be elapsed.
- In at least some implementations, if the ink timer 406 is elapsing as in the lower portion of the scenario 400 and the user proceeds to place the pen 124 in proximity to the display device 110 (e.g., hovered over or in contact with the display device 110), the ink timer 406 will reset and will not begin elapsing again until the user removes the pen 124 from the display device 110 such that the pen 124 is not detected. In such implementations, the hollow circle 312 will be restored within the ink flag 310 as in the upper portion of the scenario 400.
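- The proportional unwinding described above (for example, 50% of the ink timer 406 elapsed removes 50% of the hollow circle 312) reduces to a simple fraction. The sketch below, with assumed names, computes the remaining sweep angle of the circle from the timer state.
    // Hypothetical timer state for the transient ink countdown.
    interface InkTimerState {
      startedAt: number;   // ms timestamp when the pen left the input surface
      durationMs: number;  // e.g., 30 000 or 60 000 ms
    }
    // Returns the arc (in degrees) of the hollow circle that should remain visible.
    function remainingSweepDegrees(timer: InkTimerState, now: number): number {
      const elapsed = Math.min(Math.max(now - timer.startedAt, 0), timer.durationMs);
      const remainingFraction = 1 - elapsed / timer.durationMs;
      return 360 * remainingFraction; // 50% elapsed -> 180 degrees still drawn
    }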
- FIG. 5 depicts an example implementation scenario 500 for a transient ink mode in accordance with one or more implementations. The upper portion of the scenario 500 includes the GUI 302 with the document 304 (introduced above) displayed on the display device 110. In at least some implementations, the scenario 500 represents an extension of the scenario 400, above.
- In the upper portion of the scenario 500, the ink timer 406 has elapsed. For instance, notice that the hollow circle 312 has completely unwound within the ink flag 310, e.g., is visually removed from the ink flag 310. According to various implementations, this provides a visual cue that the ink timer 406 has completely elapsed.
- Proceeding to the lower portion of the scenario 500, and in response to expiry of the ink timer 406, the ink content 402 is removed from the GUI 302 and saved as part of a transient ink layer 504 for the document 304. Further, the ink flag 310 is populated with a user icon 502. The user icon 502, for example, represents a user that is currently logged in to the client device 102, and/or a user that interacts with the document 304 to apply the ink content 402. Alternatively or additionally, the pen 124 includes user identification data that is detected by the client device 102 and thus is leveraged to track which user is interacting with the document 304. For example, the pen 124 includes a tagging mechanism (e.g., a radio-frequency identification (RFID) chip) embedded with a user identity for a particular user. Thus, when the pen 124 is placed in proximity to the display device 110, the tagging mechanism is detected by the client device 102 and utilized to attribute ink input and/or other types of input to a particular user. As used herein, the term "user" may refer to an identity for an individual person, and/or an identity for a discrete group of users that are grouped under a single user identity.
- According to various implementations, population of the user icon 502 to the ink flag 310 represents a visual indication that the transient ink layer 504 exists for the document 304, and that the transient ink layer 504 is associated with (e.g., was generated by) a particular user. Generally, the transient ink layer 504 represents a data layer that is not part of the primary content layer of the document 304, but that is persisted and can be referenced for various purposes. Further attributes of transient ink layers are described elsewhere herein.
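- A transient ink layer such as the transient ink layer 504 can be modeled as a record that is stored alongside, but separate from, the document's primary content and that is keyed to a user identity. The following data-structure sketch is illustrative only; the field names are assumptions rather than part of the described system.
    // Hypothetical persisted record for a transient ink layer.
    interface TransientInkLayer {
      documentId: string;
      userId: string;            // e.g., resolved from login or a pen's RFID tag
      strokes: AnchoredStroke[]; // accumulated across interactivity sessions
      updatedAt: number;
    }
    // Each stroke is anchored to a portion of the document so it can be
    // redisplayed next to the same content when the layer is recalled.
    interface AnchoredStroke {
      anchor: { page: number; line?: number };
      points: Array<{ x: number; y: number }>;
    }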
- FIG. 6 depicts an example implementation scenario 600 for a transient ink mode in accordance with one or more implementations. The upper portion of the scenario 600 includes the GUI 302 with the document 304 (introduced above) displayed on the display device 110. In at least some implementations, the scenario 600 represents an extension of the scenario 500, above.
- In the upper portion of the scenario 600, the ink flag 310 is displayed, indicating that a transient ink layer (e.g., the transient ink layer 504) exists for the document 304 and that the transient ink layer is linked to a particular user represented by the user icon 502 in the ink flag 310.
- Proceeding to the lower portion of the scenario 600, a user selects the ink flag 310 with the pen 124, which causes the ink content 402 to be returned to display as part of the document 304. The ink content 402, for example, is bound to the transient ink layer 504, along with other transient ink content generated for the transient ink layer 504. Thus, in at least some implementations, the transient ink layer 504 is accessible by various techniques, such as by selection of the ink flag 310.
- Additionally or alternatively to selection of the ink flag 310, if the user proceeds to apply further ink content to the document 304 while in the transient ink mode, the transient ink layer 504 is retrieved and transient ink content included as part of the transient ink layer 504 is displayed as part of the document 304. In at least some implementations, transient ink content of the transient ink layer 504 is bound (e.g., anchored) to particular portions (e.g., pages, lines, text, and so forth) of the document 304. For instance, the user generated the ink content 402 adjacent to a particular section of text. Thus, when the transient ink layer 504 is recalled as depicted in the scenario 600, the ink content 402 is displayed adjacent to the particular section of text.
- According to various implementations, the transient ink layer 504 is cumulative such that a user may add ink content to and remove ink content from the transient ink layer 504 over a span of time and during multiple different interactivity sessions. Thus, the transient ink layer 504 generally represents a record of multiple user interactions with the document 304, such as for annotations, proofreading, commenting, and so forth. Alternatively or additionally, multiple transient layers may be created for the document 304, such as when significant changes are made to the primary content 306, when other users apply transient ink to the document 304, and so forth.
- In at least some implementations, when the user pauses interaction with the document 304, the ink timer 406 begins elapsing, such as discussed above with reference to the scenarios 400, 500. Thus, in at least some implementations, the scenario 600 may return to the scenario 400.
- FIG. 7 depicts an example implementation of the pen 124 in accordance with one or more implementations. The pen 124 includes a shaft 700, which represents a main body and/or chassis of the pen 124. For instance, various components of the pen 124 are attached to and/or contained within the shaft 700. The pen 124 further includes a tip 702 that extends through a tip portal 704 in a nose 706 of the shaft 700. The tip 702 represents a portion of the pen 124 that can be leveraged to provide input and/or other types of interactions to an input surface, such as the display device 110 and/or others of the touch input devices 116. For instance, contact between the tip 702 and an input surface causes digital ink input to be applied to the input surface. As further detailed below, the tip 702 can be retracted and extended relative to the shaft 700 to provide for different input scenarios.
- The pen 124 further includes a pen mode button 708, which represents a selectable control (e.g., a switch) for switching the pen 124 between different pen input modes. Generally, different pen input modes enable input from the pen 124 to be utilized and/or interpreted by the ink module 118 in different ways. For instance, selecting the pen mode button 708 causes the pen 124 to transition between a permanent ink mode and a transient ink mode.
- A tip button 710 is attached to the shaft 700 and, as further detailed below, is selectable to retract and extend the tip 702. In at least some implementations, the pen 124 is configured to provide digital ink input with the tip 702 extended as illustrated in FIG. 7. Further, the pen 124 is configured to provide non-digital-ink input with the tip 702 retracted, as further detailed below.
- An eraser portion 712 is attached to the pen 124 adjacent to the tip button 710. The eraser portion 712 is representative of functionality to enable ink content to be erased, such as permanent and/or transient ink.
- The pen 124 further includes internal components 714, which are representative of components that enable various functionalities of the pen 124. For instance, the internal components 714 include active electronics such as logic and processing components for controlling different operating modes of the pen 124. In at least some implementations, the internal components 714 are configured to transmit and receive wireless signals using any suitable wireless protocol, such as Bluetooth, radio-frequency identification (RFID), and so forth. For instance, the pen 124 can leverage the internal components 714 to transmit a signal that indicates a current ink mode of the pen 124, such as whether the pen 124 is in a permanent or transient ink mode. Further, the pen 124 can leverage the internal components 714 to transmit a signal that indicates whether ink functionality of the pen 124 is enabled or disabled.
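- The mode signals described above can be thought of as a small status payload that the pen's internal components report and the ink module interprets. The message layout below is purely illustrative (the described pen does not define a wire format); it simply captures the two pieces of state the text mentions: whether ink is enabled (tip extended) and which ink mode is active.
    // Hypothetical status message reported by the pen's internal components.
    interface PenStatusMessage {
      inkEnabled: boolean;                // tip 702 extended (true) or retracted (false)
      inkMode: "permanent" | "transient"; // selected via the pen mode button 708
    }
    // The ink module's view of the pen, updated whenever a message arrives.
    class PenState {
      private latest: PenStatusMessage = { inkEnabled: true, inkMode: "permanent" };
      onMessage(message: PenStatusMessage): void {
        this.latest = message;
      }
      isInkEnabled(): boolean {
        return this.latest.inkEnabled;
      }
      currentInkMode(): "permanent" | "transient" {
        return this.latest.inkMode;
      }
    }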
- FIG. 8 depicts an example implementation of the pen 124 in accordance with one or more implementations. In FIG. 8, the tip 702 is retracted into the shaft 700. For instance, a user selects (e.g., presses) the tip button 710, which causes the tip 702 to at least partially retract through the tip portal 704 into the nose 706 of the shaft 700. The tip portal 704 represents an opening in the nose 706 through which the tip 702 may extend and retract. According to various implementations, with the tip 702 retracted as in FIG. 8, the pen 124 provides non-digital-ink input to an input surface. For instance, contact between the nose 706 and an input surface provides non-digital-ink input, such as for selection input, object manipulation, and so forth.
- According to various implementations, selecting the tip button 710 while the tip 702 is retracted causes the tip 702 to extend from the nose 706, such as depicted in FIG. 7.
- FIG. 9 depicts an example implementation scenario 900 for receiving digital ink input and non-digital-ink input in accordance with one or more implementations. The upper portion of the scenario 900 includes the GUI 302 with the document 304 (introduced above) displayed on the display device 110. In this particular implementation, the GUI 302 includes a menu region 902 that includes different selectable controls for performing various actions, such as on the document 304. In at least some implementations, the scenario 900 represents an extension of one or more of the scenarios described above.
- In the upper portion of the scenario 900, a user manipulates the pen 124 to apply ink content (e.g., digital ink) to the document 304. For instance, contact between the tip 702 and the display device 110 causes ink content to be added to the document 304. As referenced above, with the tip 702 extended the pen 124 is in an ink-enabled mode such that ink is always applied when the pen 124 is placed in contact with the display device 110. In at least some implementations, the ink content is added as permanent ink that becomes part of the primary content 306 of the document 304. Alternatively, the ink content is added as transient ink of the document 304. Examples of different transient ink behaviors are discussed throughout.
- Proceeding to the lower portion of the scenario 900, the user selects the tip button 710, which causes the tip 702 to retract into the nose 706. With the tip 702 retracted, the pen 124 is used to provide non-digital-ink input, such as touch input to the display device 110. Retracting the tip 702 enables the pen 124 to be used for touch input such as a user would provide via a finger. For instance, with the tip 702 retracted, placing the pen 124 (e.g., the nose 706) in contact with the display device 110 within the document 304 does not cause ink content to be applied to the document 304. Non-digital-ink input can be leveraged for various purposes, such as for manipulating the document 304 within the GUI 302, selecting selectable controls from the menu region 902, selecting primary content within the document 304, and so forth. In this particular example, the user places the nose 706 of the pen 124 in contact with the display device 110 and drags (e.g., scrolls) the document 304 within the GUI 302. Notice that as the user manipulates the document 304 with the tip 702 retracted, ink content is not applied to the document 304.
- FIG. 10 is a flow diagram that describes steps in a method in accordance with one or more embodiments. The method, for instance, describes an example procedure for processing ink according to a current ink mode in accordance with one or more embodiments.
- Step 1000 detects a pen in proximity to an input surface. The touch input device 116, for instance, detects that the pen 124 is hovered over and/or in contact with the touch input device 116 with the tip 702 extended. As referenced above, a hover operation can be associated with a particular threshold proximity to an input surface such that hovering the pen 124 at or within the threshold proximity to the input surface is interpreted as a hover operation, but placing the pen 124 farther than the threshold proximity from the input surface is not interpreted as a hover operation.
- Step 1002 ascertains a current ink mode. The ink module 118, for example, ascertains an ink mode that is currently active on the client device 102. Examples of different ink modes are detailed elsewhere herein, and include a permanent ink mode and a transient ink mode.
- In at least some implementations, a current ink mode may be automatically selected by the ink module 118, such as based on an application and/or document context that is currently in focus. For instance, an application 106 may specify a default ink mode that is to be active for the application. Further, some applications may specify ink mode permissions that indicate allowed and disallowed ink modes. A particular application 106, for example, may specify that a permanent ink mode is not allowed for documents presented by the application, such as to protect documents from being edited.
- Alternatively or additionally, a current ink mode is user-selectable, such as in response to user input selecting an ink mode from the ink menu 802. For instance, a user may cause a switch from a default ink mode for an application to a different ink mode. The user, for example, can select the pen mode button 708 to cause the pen 124 and/or the ink module 118 to switch between different ink modes.
- In at least some implementations, ascertaining a current ink mode includes ascertaining whether the pen is in an ink-enabled mode or an ink-disabled mode, such as ascertaining whether the tip 702 of the pen 124 is extended or retracted.
- Step 1004 causes a visual affordance identifying the current ink mode to be displayed. Examples of such an affordance include a hover target, a visual cue included as part of an ink flag, and so forth. Examples of different visual affordances are detailed throughout this description and the accompanying drawings.
- Step 1006 processes ink content applied to the input surface according to the current ink mode. The ink content, for instance, is processed as permanent ink, transient ink, and so forth. For example, if a permanent ink mode is active, the ink content is saved as permanent ink, such as part of a primary content layer of a document. If the transient ink mode is active, the ink content is propagated to a transient ink layer of a document. Examples of different mode-specific ink behaviors and actions are detailed elsewhere herein.
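- The procedure above can be summarized in a few lines of code. The sketch below uses assumed names (it is not an implementation of the described ink module 118) and shows the hover-threshold check, the affordance choice, and the mode-specific processing.
    type InkMode = "permanent" | "transient";
    const HOVER_THRESHOLD_CM = 2; // e.g., within 2 cm counts as hovering
    function isHovering(distanceCm: number): boolean {
      return distanceCm <= HOVER_THRESHOLD_CM;
    }
    // Visual affordance for the current mode: solid circle for permanent ink,
    // hollow circle for transient ink.
    function hoverAffordance(mode: InkMode): "solid-circle" | "hollow-circle" {
      return mode === "permanent" ? "solid-circle" : "hollow-circle";
    }
    function processInk(
      mode: InkMode,
      stroke: unknown,
      doc: { primary: unknown[]; transient: unknown[] }
    ): void {
      if (mode === "permanent") {
        doc.primary.push(stroke);   // becomes part of the primary content layer
      } else {
        doc.transient.push(stroke); // propagated to a transient ink layer
      }
    }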
- FIG. 11 is a flow diagram that describes steps in a method in accordance with one or more embodiments. The method, for instance, describes an example procedure for a transient ink timer in accordance with one or more implementations. In at least some implementations, the method represents an extension of the method described above with reference to FIG. 10.
- Step 1100 receives ink content applied to a document via input from a pen to an input surface while in a transient ink mode. The ink module 118, for example, processes ink content received from the pen 124 at the display device 110 as transient ink.
- Step 1102 detects that the pen is removed from proximity to the input surface. For instance, the touch input device 116 detects that the pen 124 is not in contact with and is not hovering over a surface of the touch input device 116, e.g., the display device 110.
- Step 1104 initiates a timer. The timer, for example, is initiated in response to detecting that the pen is removed from proximity to the input surface. In at least some implementations, a visual representation of the timer is presented. For instance, the visual representation provides a visual cue that the timer is elapsing, and indicates a relative amount (e.g., percentage) of the timer that has elapsed. The visual representation, for example, is animated to visually convey that the timer is elapsing. One example of a visual representation of a timer is discussed above with reference to FIGS. 4 and 5.
- Step 1106 ascertains whether the pen is detected at the input surface before the timer expires. For instance, the ink module 118 ascertains whether the pen 124 is detected in contact with and/or hovering over the touch input device 116 prior to expiry of the timer. If the pen is detected at the input surface prior to expiry of the timer ("Yes"), step 1108 resets the timer and the process returns to step 1100.
- If the pen is not detected at the input surface prior to expiry of the timer ("No"), step 1110 removes the ink content from the document and propagates the ink content to a transient layer for the document. For instance, in response to expiry of the timer, the transient ink content is removed from display and propagated to a transient data layer for the document that is separate from a primary content layer of the document. In at least some implementations, a new transient ink layer is created for the document, and the transient ink content is propagated to the new transient ink layer. Alternatively or additionally, the transient ink content is propagated to an existing transient ink layer. For example, the transient ink layer may represent an accumulation of transient ink provided by a user over multiple different interactions with the document and over a period of time.
- As discussed above, the transient ink layer may be associated with a particular user, e.g., a user that applies the transient ink content to the document. Thus, the transient ink is linked to the particular user and may subsequently be accessed by the user.
- FIG. 12 is a flow diagram that describes steps in a method in accordance with one or more embodiments. The method, for instance, describes an example procedure for processing pen input based on a pen mode in accordance with one or more implementations. In at least some implementations, the method represents an extension of one or more of the methods described above.
- Step 1200 detects pen input from a pen to an input surface. The ink module 118, for instance, detects that the pen 124 is in contact with the display device 110 and provides input to the display device 110.
- Step 1202 ascertains whether the pen is in an ink-enabled mode or an ink-disabled mode. The ink module 118, for example, ascertains whether the pen 124 is in an ink-enabled or ink-disabled mode. In at least some implementations, the pen 124 can transmit a wireless signal that identifies whether the pen 124 is in an ink-enabled or ink-disabled mode. Alternatively or additionally, the ink module 118 can detect a state of the tip 702 and can ascertain whether the pen 124 is in an ink-enabled or ink-disabled mode based on the state. For instance, if the ink module 118 detects that the tip 702 is extended, the ink module 118 ascertains that the pen 124 is in an ink-enabled mode. However, if the ink module 118 detects that the tip 702 is retracted, the ink module 118 ascertains that the pen 124 is in an ink-disabled mode.
- In at least some implementations, the ink module 118 detects whether the tip 702 is extended or retracted, and enables or disables ink functionality of the client device 102 based on whether the tip 702 is extended or retracted.
- If the pen is in an ink-enabled mode ("Ink-Enabled"), step 1204 processes the pen input as digital ink input. For instance, the pen input is converted to ink content that is displayed on the display device 110. According to various implementations, the ink content may be permanent ink or transient ink. As referenced above, the pen 124 will apply ink whenever the pen is in an ink-enabled mode.
- If the pen is in an ink-disabled mode ("Ink-Disabled"), step 1206 processes the pen input as non-digital-ink input. Generally, non-digital-ink input does not cause ink content to be applied. For instance, non-digital-ink input is used for purposes other than applying ink content, such as object selection and manipulation, document navigation, and so forth.
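- The branch in this procedure amounts to routing the same contact event in two different ways depending on the tip state. A minimal sketch, with assumed names:
    type TipState = "extended" | "retracted";
    interface ContactEvent { x: number; y: number; pressure: number; }
    // Tip extended -> ink-enabled mode -> digital ink input.
    // Tip retracted -> ink-disabled mode -> treat the contact like touch input.
    function routePenContact(
      tip: TipState,
      event: ContactEvent,
      handlers: {
        applyInk: (e: ContactEvent) => void;
        handleTouch: (e: ContactEvent) => void; // selection, navigation, and so forth
      }
    ): void {
      if (tip === "extended") {
        handlers.applyInk(event);
      } else {
        handlers.handleTouch(event);
      }
    }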
- FIG. 13 is a flow diagram that describes steps in a method in accordance with one or more embodiments. The method, for instance, describes an example procedure for processing pen input based on an ink mode in accordance with one or more implementations. In at least some implementations, the method represents an extension of one or more of the methods described above.
- Step 1300 receives ink input to an input surface from a pen. The ink module 118, for instance, detects that ink input is provided by the pen 124. In at least some implementations, the ink module 118 detects that the ink input is provided when the pen 124 is in an ink-enabled mode.
- Step 1302 ascertains whether the pen is in a permanent ink mode or a transient ink mode. For instance, the pen 124 communicates a signal to the client device 102 indicating whether the pen 124 is in a permanent ink mode or a transient ink mode. The signal, for instance, is communicated as a wireless signal from the pen 124. Alternatively or additionally, the signal is communicated via contact between the pen 124 and an input surface, e.g., the display device 110. For instance, the signal can be electrically conducted across the tip 702 to the display device 110 and to internal components of the client device 102 to convey information identifying a current mode of the pen 124.
- If the pen is in a permanent ink mode ("Permanent"), step 1304 processes the ink input as permanent ink. For instance, the ink input is displayed and/or added to a permanent ink layer.
- If the pen is in a transient ink mode ("Transient"), step 1306 processes the ink input as transient ink. The ink input, for example, is added to a transient ink layer that may be subsequently accessed. Examples of different permanent and transient ink behaviors are detailed throughout.
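- For this procedure, the ink module only needs the mode that the pen reports, whether it arrives wirelessly or as a signal conducted through the tip. The decoding sketch below is an assumption for illustration; the signal values and function name are hypothetical.
    type PenInkMode = "permanent" | "transient";
    // Hypothetical decoding of the mode value reported by the pen.
    function decodePenMode(signal: string): PenInkMode {
      switch (signal) {
        case "MODE_PERMANENT":
          return "permanent";
        case "MODE_TRANSIENT":
          return "transient";
        default:
          // Unknown or missing signal: fall back to a default mode.
          return "permanent";
      }
    }
    // The decoded mode then drives the permanent/transient routing shown in the
    // earlier sketches.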
- FIG. 14 is a flow diagram that describes steps in a method in accordance with one or more embodiments. The method, for instance, describes an example procedure for erasing digital ink in accordance with one or more implementations. In at least some implementations, the method represents an extension of one or more of the methods described above.
- Step 1400 receives an indication of contact between an eraser portion of a pen and digital ink displayed on an input surface. The ink module 118, for instance, detects that the eraser portion 712 of the pen 124 is in contact with digital ink displayed on the display device 110.
- Step 1402 causes the digital ink to be removed from the input surface in response to the indication. The ink module 118, for example, deletes the digital ink. For instance, the digital ink is removed from display and data describing the digital ink is deleted.
- Although discussed separately, it is to be appreciated that the implementations, scenarios, and procedures described above can be combined and implemented together in various ways. For instance, the implementations, scenarios, and procedures describe different functionalities of a single integrated inking platform, such as implemented by the ink module 118.
- Example System and Device
-
FIG. 15 illustrates an example system generally at 1500 that includes anexample computing device 1502 that is representative of one or more computing systems and/or devices that may implement various techniques described herein. For example, theclient device 102 discussed above with reference toFIG. 1 can be embodied as thecomputing device 1502. Thecomputing device 1502 may be, for example, a server of a service provider, a device associated with the client (e.g., a client device), an on-chip system, and/or any other suitable computing device or computing system. - The
example computing device 1502 as illustrated includes aprocessing system 1504, one or more computer-readable media 1506, and one or more Input/Output (I/O) Interfaces 1508 that are communicatively coupled, one to another. Although not shown, thecomputing device 1502 may further include a system bus or other data and command transfer system that couples the various components, one to another. A system bus can include any one or combination of different bus structures, such as a memory bus or memory controller, a peripheral bus, a universal serial bus, and/or a processor or local bus that utilizes any of a variety of bus architectures. A variety of other examples are also contemplated, such as control and data lines. - The
processing system 1504 is representative of functionality to perform one or more operations using hardware. Accordingly, the processing system 1504 is illustrated as including hardware elements 1510 that may be configured as processors, functional blocks, and so forth. This may include implementation in hardware as an application specific integrated circuit or other logic device formed using one or more semiconductors. The hardware elements 1510 are not limited by the materials from which they are formed or the processing mechanisms employed therein. For example, processors may be comprised of semiconductor(s) and/or transistors (e.g., electronic integrated circuits (ICs)). In such a context, processor-executable instructions may be electronically-executable instructions. - The computer-
readable media 1506 is illustrated as including memory/storage 1512. The memory/storage 1512 represents memory/storage capacity associated with one or more computer-readable media. The memory/storage 1512 may include volatile media (such as random access memory (RAM)) and/or nonvolatile media (such as read only memory (ROM), Flash memory, optical disks, magnetic disks, and so forth). The memory/storage 1512 may include fixed media (e.g., RAM, ROM, a fixed hard drive, and so on) as well as removable media (e.g., Flash memory, a removable hard drive, an optical disc, and so forth). The computer-readable media 1506 may be configured in a variety of other ways as further described below. - Input/output interface(s) 1508 are representative of functionality to allow a user to enter commands and information to
computing device 1502, and also allow information to be presented to the user and/or other components or devices using various input/output devices. Examples of input devices include a keyboard, a cursor control device (e.g., a mouse), a microphone (e.g., for voice recognition and/or spoken input), a scanner, touch functionality (e.g., capacitive or other sensors that are configured to detect physical touch), a camera (e.g., which may employ visible or non-visible wavelengths such as infrared frequencies to detect movement that does not involve touch as gestures), and so forth. Examples of output devices include a display device (e.g., a monitor or projector), speakers, a printer, a network card, a tactile-response device, and so forth. Thus, the computing device 1502 may be configured in a variety of ways as further described below to support user interaction. - Various techniques may be described herein in the general context of software, hardware elements, or program modules. Generally, such modules include routines, programs, objects, elements, components, data structures, and so forth that perform particular tasks or implement particular abstract data types. The terms “module,” “functionality,” “entity,” and “component” as used herein generally represent software, firmware, hardware, or a combination thereof. The features of the techniques described herein are platform-independent, meaning that the techniques may be implemented on a variety of commercial computing platforms having a variety of processors.
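For instance, on a host platform that exposes standard pointer events (an assumption about the platform, not something the techniques described herein require), pen contact and plain touch on an input surface can be distinguished roughly as follows; the element id and the beginStroke/beginTouchGesture handlers are hypothetical placeholders.

```typescript
// Illustrative only: with DOM pointer events, pen and touch contacts are reported
// with different pointerType values. A pen whose retracted tip registers as touch
// (non-digital-ink input) would fall into the "touch" branch below.

const surface = document.getElementById("input-surface")!; // hypothetical element id

surface.addEventListener("pointerdown", (ev: PointerEvent) => {
  if (ev.pointerType === "pen") {
    // Candidate digital ink input; pressure is reported by most pen digitizers.
    beginStroke(ev.offsetX, ev.offsetY, ev.pressure);
  } else if (ev.pointerType === "touch") {
    // Non-digital-ink input, e.g., touch-style scrolling or selection.
    beginTouchGesture(ev.offsetX, ev.offsetY);
  }
});

// Hypothetical handlers standing in for application-specific behavior.
function beginStroke(x: number, y: number, pressure: number): void { /* ... */ }
function beginTouchGesture(x: number, y: number): void { /* ... */ }
```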
- An implementation of the described modules and techniques may be stored on or transmitted across some form of computer-readable media. The computer-readable media may include a variety of media that may be accessed by the
computing device 1502. By way of example, and not limitation, computer-readable media may include “computer-readable storage media” and “computer-readable signal media.” - “Computer-readable storage media” may refer to media and/or devices that enable persistent storage of information in contrast to mere signal transmission, carrier waves, or signals per se. Computer-readable storage media do not include signals per se. The computer-readable storage media includes hardware such as volatile and non-volatile, removable and non-removable media and/or storage devices implemented in a method or technology suitable for storage of information such as computer readable instructions, data structures, program modules, logic elements/circuits, or other data. Examples of computer-readable storage media may include, but are not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, hard disks, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or other storage device, tangible media, or article of manufacture suitable to store the desired information and which may be accessed by a computer.
- “Computer-readable signal media” may refer to a signal-bearing medium that is configured to transmit instructions to the hardware of the
computing device 1502, such as via a network. Signal media typically may embody computer readable instructions, data structures, program modules, or other data in a modulated data signal, such as carrier waves, data signals, or other transport mechanism. Signal media also include any information delivery media. The term “modulated data signal” means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media include wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, radio frequency (RF), infrared, and other wireless media. - As previously described,
hardware elements 1510 and computer-readable media 1506 are representative of instructions, modules, programmable device logic and/or fixed device logic implemented in a hardware form that may be employed in some embodiments to implement at least some aspects of the techniques described herein. Hardware elements may include components of an integrated circuit or on-chip system, an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA), a complex programmable logic device (CPLD), and other implementations in silicon or other hardware devices. In this context, a hardware element may operate as a processing device that performs program tasks defined by instructions, modules, and/or logic embodied by the hardware element as well as a hardware device utilized to store instructions for execution, e.g., the computer-readable storage media described previously. - Combinations of the foregoing may also be employed to implement various techniques and modules described herein. Accordingly, software, hardware, or program modules and other program modules may be implemented as one or more instructions and/or logic embodied on some form of computer-readable storage media and/or by one or
more hardware elements 1510. The computing device 1502 may be configured to implement particular instructions and/or functions corresponding to the software and/or hardware modules. Accordingly, implementation of modules that are executable by the computing device 1502 as software may be achieved at least partially in hardware, e.g., through use of computer-readable storage media and/or hardware elements 1510 of the processing system. The instructions and/or functions may be executable/operable by one or more articles of manufacture (for example, one or more computing devices 1502 and/or processing systems 1504) to implement techniques, modules, and examples described herein. - As further illustrated in
FIG. 15, the example system 1500 enables ubiquitous environments for a seamless user experience when running applications on a personal computer (PC), a television device, and/or a mobile device. Services and applications run substantially similarly in all three environments for a common user experience when transitioning from one device to the next while utilizing an application, playing a video game, watching a video, and so on. - In the
example system 1500, multiple devices are interconnected through a central computing device. The central computing device may be local to the multiple devices or may be located remotely from the multiple devices. In one embodiment, the central computing device may be a cloud of one or more server computers that are connected to the multiple devices through a network, the Internet, or other data communication link. - In one embodiment, this interconnection architecture enables functionality to be delivered across multiple devices to provide a common and seamless experience to a user of the multiple devices. Each of the multiple devices may have different physical requirements and capabilities, and the central computing device uses a platform to enable the delivery of an experience to the device that is both tailored to the device and yet common to all devices. In one embodiment, a class of target devices is created and experiences are tailored to the generic class of devices. A class of devices may be defined by physical features, types of usage, or other common characteristics of the devices.
- In various implementations, the
computing device 1502 may assume a variety of different configurations, such as for computer 1514, mobile 1516, and television 1518 uses. Each of these configurations includes devices that may have generally different constructs and capabilities, and thus the computing device 1502 may be configured according to one or more of the different device classes. For instance, the computing device 1502 may be implemented as the computer 1514 class of a device that includes a personal computer, desktop computer, a multi-screen computer, laptop computer, netbook, and so on. - The
computing device 1502 may also be implemented as the mobile 1516 class of device that includes mobile devices, such as a mobile phone, portable music player, portable gaming device, a tablet computer, a wearable device, a multi-screen computer, and so on. The computing device 1502 may also be implemented as the television 1518 class of device that includes devices having or connected to generally larger screens in casual viewing environments. These devices include televisions, set-top boxes, gaming consoles, and so on. - The techniques described herein may be supported by these various configurations of the
computing device 1502 and are not limited to the specific examples of the techniques described herein. For example, functionalities discussed with reference to the client device 102 and/or ink module 118 may be implemented all or in part through use of a distributed system, such as over a “cloud” 1520 via a platform 1522 as described below. - The
cloud 1520 includes and/or is representative of a platform 1522 for resources 1524. The platform 1522 abstracts underlying functionality of hardware (e.g., servers) and software resources of the cloud 1520. The resources 1524 may include applications and/or data that can be utilized while computer processing is executed on servers that are remote from the computing device 1502. Resources 1524 can also include services provided over the Internet and/or through a subscriber network, such as a cellular or Wi-Fi network. - The
platform 1522 may abstract resources and functions to connect the computing device 1502 with other computing devices. The platform 1522 may also serve to abstract scaling of resources to provide a corresponding level of scale to encountered demand for the resources 1524 that are implemented via the platform 1522. Accordingly, in an interconnected device embodiment, implementation of functionality described herein may be distributed throughout the system 1500. For example, the functionality may be implemented in part on the computing device 1502 as well as via the platform 1522 that abstracts the functionality of the cloud 1520. - Discussed herein are a number of methods that may be implemented to perform techniques discussed herein. Aspects of the methods may be implemented in hardware, firmware, or software, or a combination thereof. The methods are shown as a set of steps that specify operations performed by one or more devices and are not necessarily limited to the orders shown for performing the operations by the respective blocks. Further, an operation shown with respect to a particular method may be combined and/or interchanged with an operation of a different method in accordance with one or more implementations. Aspects of the methods can be implemented via interaction between various entities discussed above with reference to the
environment 100. - Implementations discussed herein include:
- A pen apparatus including: a shaft; a tip portion that is retractably engaged with the shaft such that the tip portion is extendable from the shaft and retractable at least partially into the shaft; and a tip button that is selectable to extend and retract the tip portion relative to the shaft, the pen apparatus configured to apply digital ink to an input surface of a computing device in response to interaction of the pen with the input surface with the tip portion extended from the shaft, and configured to apply non-digital-ink input to the input surface of the computing device in response to interaction of the pen with the input surface with the tip portion at least partially retracted into the shaft.
- A pen apparatus as described in example 1, further including an active component configured to interact with the input surface to cause digital ink to be applied to the input surface in response to interaction of the pen with the input surface with the tip portion extended from the shaft.
- A pen apparatus as described in one or more of examples 1 or 2, further including an active component configured to interact with the input surface to cause digital ink to be applied to the input surface in response to interaction of the pen with the input surface with the tip portion at least partially extended from the shaft, and wherein the active component is further configured to interact with the input surface to cause non-digital ink input to be applied to the input surface in response to interaction of the pen with the input surface with the tip portion at least partially retracted into the shaft.
- A pen apparatus as described in one or more of examples 1-3, wherein the non-digital ink input includes touch input to the input surface such that digital ink is not applied to the input surface in response to interaction of the pen with the input surface with the tip portion at least partially retracted into the shaft.
- A pen apparatus as described in one or more of examples 1-4, further including a pen mode button positioned on the shaft, the pen mode button being selectable to cause the pen apparatus to apply digital ink to the input surface in different ink modes and in response to interaction of the pen with the input surface with the tip portion at least partially extended from the shaft.
- A pen apparatus as described in one or more of examples 1-5, further including a pen mode button positioned on the shaft, the pen mode button being selectable to cause the pen apparatus to switch between applying digital ink in a permanent ink mode and applying digital ink in a transient ink mode.
- A pen apparatus as described in one or more of examples 1-6, wherein retraction of the tip portion causes the tip portion to be non-detectable to the input surface.
- A pen apparatus as described in one or more of examples 1-7, wherein the pen apparatus is configured such that retraction of the tip portion disables one or more of ink functionality of the pen apparatus or ink functionality of the input surface.
- A pen apparatus as described in one or more of examples 1-8, wherein the shaft includes an eraser portion that is configured to erase digital ink applied to the input surface.
- A system for processing digital ink input and non-digital-ink input, the system including: an input surface; one or more processors; and one or more computer-readable storage media storing computer-executable instructions that, responsive to execution by the one or more processors, cause the system to perform operations including: detecting pen input from a pen to the input surface; processing the pen input as digital ink input in response to ascertaining that the pen is in an ink-enabled mode; and processing the pen input as non-digital-ink input in response to ascertaining that the pen is in an ink-disabled mode.
- A system as described in example 10, wherein the instructions further include ascertaining whether the pen is in the ink-enabled mode or the ink-disabled mode by determining whether a tip portion of the pen is in an extended or retracted position.
- A system as described in one or more of examples 10 or 11, wherein the instructions further include determining that a tip portion of the pen is in a retracted position, and in response processing the pen input to the input surface as non-digital-ink input.
- A system as described in one or more of examples 10-12, wherein said processing the pen input as non-digital-ink input includes processing the pen input such that digital ink is not applied to the input surface in response to contact between the pen and the input surface.
- A system as described in one or more of examples 10-13, wherein the pen input includes pen input to a digital document, and wherein the operations further include: ascertaining that the pen is in a transient ink mode, wherein processing the pen input as digital-ink input includes causing the digital ink input to be propagated to a transient ink layer for the digital document.
- A system as described in one or more of examples 10-14, wherein the operations further include: receiving an indication of contact between an eraser portion of the pen and digital ink displayed on the input surface; and causing the digital ink to be removed from the input surface in response to the indication.
- A computer-implemented method for processing permanent ink input and transient ink input, the method including: receiving by a computing system ink input to an input surface from a pen apparatus; processing by the computing system the ink input as transient ink in response to ascertaining that the pen apparatus is in a transient ink mode; and processing by the computing system the ink input as permanent ink in response to ascertaining that the pen apparatus is in a permanent ink mode.
- A computer-implemented method as described in example 16, further including ascertaining whether the pen apparatus is in the transient ink mode or the permanent ink mode based on a signal from the pen apparatus indicating a current active ink mode for the pen apparatus.
- A computer-implemented method as described in one or more of examples 16 or 17, wherein the ink input includes ink input to a digital document, and wherein said processing the ink input as transient ink includes propagating the ink input to a transient layer for the digital document.
- A computer-implemented method as described in one or more of examples 16-18, wherein the ink input includes ink input to a digital document, wherein said processing the ink input as transient ink includes propagating the ink input to a transient ink layer for the digital document, and wherein the method further includes enabling the transient ink layer to be accessible separately from primary content of the digital document.
- A computer-implemented method as described in one or more of examples 16-19, further including: detecting further pen input to the input surface; ascertaining that the pen apparatus is in an ink-disabled mode; and processing further pen input to the input surface as non-digital-ink input in response to ascertaining that the pen apparatus is in an ink-disabled mode.
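Examples 10-13 and 20 describe routing pen input as digital ink when the pen is ink-enabled (tip extended) and as non-digital-ink input when the pen is ink-disabled (tip retracted). A minimal sketch of that routing follows, reusing the illustrative types and the processInkInput function from the earlier sketches; TipState, PenState, and the callback are likewise assumptions rather than names from the disclosure.

```typescript
// Sketch of examples 10-13 and 20: apply digital ink only when the pen reports an
// ink-enabled state (tip extended); otherwise pass the contact through as
// non-digital-ink (touch-like) input so that no digital ink is applied.

type TipState = "extended" | "retracted";

interface PenState {
  tip: TipState;
  inkMode: PenMode; // "permanent" | "transient", as in the earlier sketch
}

function handlePenContact(
  doc: InkDocument,
  pen: PenState,
  stroke: InkStroke,
  onNonInkInput: (stroke: InkStroke) => void
): void {
  if (pen.tip === "extended") {
    // Ink-enabled: route by ink mode as in steps 1302-1306.
    processInkInput(doc, pen.inkMode, stroke);
  } else {
    // Ink-disabled: the tip is retracted, so the contact is handled as
    // non-digital-ink input and no digital ink is applied to the input surface.
    onNonInkInput(stroke);
  }
}
```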
- Techniques for pen input modes for digital ink are described. Although embodiments are described in language specific to structural features and/or methodological acts, it is to be understood that the embodiments defined in the appended claims are not necessarily limited to the specific features or acts described. Rather, the specific features and acts are disclosed as example forms of implementing the claimed embodiments.
Claims (20)
1. A pen apparatus comprising:
a shaft;
a tip portion that is retractably engaged with the shaft such that the tip portion is extendable from the shaft and retractable at least partially into the shaft; and
a tip button that is selectable to extend and retract the tip portion relative to the shaft,
the pen apparatus configured to apply digital ink to an input surface of a computing device in response to interaction of the pen with the input surface with the tip portion extended from the shaft, and configured to apply non-digital-ink input to the input surface of the computing device in response to interaction of the pen with the input surface with the tip portion at least partially retracted into the shaft.
2. A pen apparatus as recited in claim 1 , further comprising an active component configured to interact with the input surface to cause digital ink to be applied to the input surface in response to interaction of the pen with the input surface with the tip portion extended from the shaft.
3. A pen apparatus as recited in claim 1 , further comprising an active component configured to interact with the input surface to cause digital ink to be applied to the input surface in response to interaction of the pen with the input surface with the tip portion at least partially extended from the shaft, and wherein the active component is further configured to interact with the input surface to cause non-digital ink input to be applied to the input surface in response to interaction of the pen with the input surface with the tip portion at least partially retracted into the shaft.
4. A pen apparatus as recited in claim 1 , wherein the non-digital ink input comprises touch input to the input surface such that digital ink is not applied to the input surface in response to interaction of the pen with the input surface with the tip portion at least partially retracted into the shaft.
5. A pen apparatus as recited in claim 1 , further comprising a pen mode button positioned on the shaft, the pen mode button being selectable to cause the pen apparatus to apply digital ink to the input surface in different ink modes and in response to interaction of the pen with the input surface with the tip portion at least partially extended from the shaft.
6. A pen apparatus as recited in claim 1 , further comprising a pen mode button positioned on the shaft, the pen mode button being selectable to cause the pen apparatus to switch between applying digital ink in a permanent ink mode and applying digital ink in a transient ink mode.
7. A pen apparatus as recited in claim 1 , wherein retraction of the tip portion causes the tip portion to be non-detectable to the input surface.
8. A pen apparatus as recited in claim 1 , wherein the pen apparatus is configured such that retraction of the tip portion disables one or more of ink functionality of the pen apparatus or ink functionality of the input surface.
9. A pen apparatus as recited in claim 1 , wherein the shaft includes an eraser portion that is configured to erase digital ink applied to the input surface.
10. A system comprising:
an input surface;
one or more processors; and
one or more computer-readable storage media storing computer-executable instructions that, responsive to execution by the one or more processors, cause the system to perform operations including:
detecting pen input from a pen to the input surface;
processing the pen input as digital ink input in response to ascertaining that the pen is in an ink-enabled mode; and
processing the pen input as non-digital-ink input in response to ascertaining that the pen is in an ink-disabled mode.
11. The system as described in claim 10 , wherein the instructions further include ascertaining whether the pen is in the ink-enabled mode or the ink-disabled mode by determining whether a tip portion of the pen is in an extended or retracted position.
12. The system as described in claim 10 , wherein the instructions further include determining that a tip portion of the pen is in a retracted position, and in response processing the pen input to the input surface as non-digital-ink input.
13. The system as described in claim 10 , wherein said processing the pen input as non-digital-ink input includes processing the pen input such that digital ink is not applied to the input surface in response to contact between the pen and the input surface.
14. The system as described in claim 10 , wherein the pen input comprises pen input to a digital document, and wherein the operations further include:
ascertaining that the pen is in a transient ink mode,
wherein processing the pen input as digital-ink input comprises causing the digital ink input to be propagated to a transient ink layer for the digital document.
15. The system as described in claim 10 , wherein the operations further include:
receiving an indication of contact between an eraser portion of the pen and digital ink displayed on the input surface; and
causing the digital ink to be removed from the input surface in response to the indication.
16. A computer-implemented method, comprising:
receiving by a computing system ink input to an input surface from a pen apparatus;
processing by the computing system the ink input as transient ink in response to ascertaining that the pen apparatus is in a transient ink mode; and
processing by the computing system the ink input as permanent ink in response to ascertaining that the pen apparatus is in a permanent ink mode.
17. A computer-implemented method as recited in claim 16 , further comprising ascertaining whether the pen apparatus is in the transient ink mode or the permanent ink mode based on a signal from the pen apparatus indicating a current active ink mode for the pen apparatus.
18. A computer-implemented method as recited in claim 16 , wherein the ink input comprises ink input to a digital document, and wherein said processing the ink input as transient ink comprises propagating the ink input to a transient layer for the digital document.
19. A computer-implemented method as recited in claim 16 , wherein the ink input comprises ink input to a digital document, wherein said processing the ink input as transient ink comprises propagating the ink input to a transient ink layer for the digital document, and wherein the method further comprises enabling the transient ink layer to be accessible separately from primary content of the digital document.
20. A computer-implemented method as recited in claim 16 , further comprising:
detecting further pen input to the input surface;
ascertaining that the pen apparatus is in an ink-disabled mode; and
processing further pen input to the input surface as non-digital-ink input in response to ascertaining that the pen apparatus is in an ink-disabled mode.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/665,462 US20150338940A1 (en) | 2014-05-23 | 2015-03-23 | Pen Input Modes for Digital Ink |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201462002648P | 2014-05-23 | 2014-05-23 | |
US14/665,462 US20150338940A1 (en) | 2014-05-23 | 2015-03-23 | Pen Input Modes for Digital Ink |
Publications (1)
Publication Number | Publication Date |
---|---|
US20150338940A1 true US20150338940A1 (en) | 2015-11-26 |
Family
ID=54556059
Family Applications (5)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/665,462 Abandoned US20150338940A1 (en) | 2014-05-23 | 2015-03-23 | Pen Input Modes for Digital Ink |
US14/665,282 Active US9990059B2 (en) | 2014-05-23 | 2015-03-23 | Ink modes |
US14/665,413 Abandoned US20150339050A1 (en) | 2014-05-23 | 2015-03-23 | Ink for Interaction |
US14/665,369 Active 2035-06-12 US10275050B2 (en) | 2014-05-23 | 2015-03-23 | Ink for a shared interactive space |
US14/665,330 Abandoned US20150338939A1 (en) | 2014-05-23 | 2015-03-23 | Ink Modes |
Family Applications After (4)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/665,282 Active US9990059B2 (en) | 2014-05-23 | 2015-03-23 | Ink modes |
US14/665,413 Abandoned US20150339050A1 (en) | 2014-05-23 | 2015-03-23 | Ink for Interaction |
US14/665,369 Active 2035-06-12 US10275050B2 (en) | 2014-05-23 | 2015-03-23 | Ink for a shared interactive space |
US14/665,330 Abandoned US20150338939A1 (en) | 2014-05-23 | 2015-03-23 | Ink Modes |
Country Status (1)
Country | Link |
---|---|
US (5) | US20150338940A1 (en) |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20160370887A1 (en) * | 2015-06-19 | 2016-12-22 | Beijing Lenovo Software Ltd. | Apparatus and control method |
US20170108954A1 (en) * | 2015-10-16 | 2017-04-20 | Waltop International Corporation | Capacitive stylus with eraser |
US9990059B2 (en) | 2014-05-23 | 2018-06-05 | Microsoft Technology Licensing, Llc | Ink modes |
US20180188835A1 (en) * | 2015-06-25 | 2018-07-05 | Lg Electronics Inc. | Electronic device and controlling method therefor |
US20180239444A1 (en) * | 2017-02-17 | 2018-08-23 | Dell Products L.P. | System and method for dynamic mode switching in an active stylus |
CN109844760A (en) * | 2016-10-14 | 2019-06-04 | 微软技术许可有限责任公司 | Time correlation ink |
US11372518B2 (en) * | 2020-06-03 | 2022-06-28 | Capital One Services, Llc | Systems and methods for augmented or mixed reality writing |
Families Citing this family (49)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7793233B1 (en) * | 2003-03-12 | 2010-09-07 | Microsoft Corporation | System and method for customizing note flags |
US9436659B2 (en) * | 2013-06-21 | 2016-09-06 | 3Rb Llc | Transferring annotations between documents displayed side by side |
CN104102349B (en) * | 2014-07-18 | 2018-04-27 | 北京智谷睿拓技术服务有限公司 | Content share method and device |
US20160048318A1 (en) * | 2014-08-15 | 2016-02-18 | Microsoft Technology Licensing, Llc | Detecting selection of digital ink |
KR102411890B1 (en) * | 2014-09-02 | 2022-06-23 | 삼성전자주식회사 | A mehtod for processing contents and an electronic device therefor |
JP6074396B2 (en) * | 2014-09-26 | 2017-02-01 | 富士フイルム株式会社 | Layout creation system, server, client, layout creation method, program, and recording medium |
JP6488653B2 (en) * | 2014-11-07 | 2019-03-27 | セイコーエプソン株式会社 | Display device, display control method, and display system |
CN105988568B (en) * | 2015-02-12 | 2020-07-24 | 北京三星通信技术研究有限公司 | Method and device for acquiring note information |
US9898865B2 (en) * | 2015-06-22 | 2018-02-20 | Microsoft Technology Licensing, Llc | System and method for spawning drawing surfaces |
US10380235B2 (en) * | 2015-09-01 | 2019-08-13 | Branchfire, Inc. | Method and system for annotation and connection of electronic documents |
US10228775B2 (en) * | 2016-01-22 | 2019-03-12 | Microsoft Technology Licensing, Llc | Cross application digital ink repository |
US20170236318A1 (en) * | 2016-02-15 | 2017-08-17 | Microsoft Technology Licensing, Llc | Animated Digital Ink |
US20170277673A1 (en) * | 2016-03-28 | 2017-09-28 | Microsoft Technology Licensing, Llc | Inking inputs for digital maps |
US10296574B2 (en) | 2016-03-28 | 2019-05-21 | Microsoft Technology Licensing, Llc | Contextual ink annotation in a mapping interface |
US10481682B2 (en) * | 2016-03-29 | 2019-11-19 | Google Llc | System and method for generating virtual marks based on gaze tracking |
US10838502B2 (en) * | 2016-03-29 | 2020-11-17 | Microsoft Technology Licensing, Llc | Sharing across environments |
CN108604125B (en) * | 2016-03-29 | 2021-08-27 | 谷歌有限责任公司 | System and method for generating virtual badges based on gaze tracking |
KR102520398B1 (en) | 2016-05-18 | 2023-04-12 | 삼성전자주식회사 | Electronic Device and Method for Saving User Data |
KR102536148B1 (en) * | 2016-07-20 | 2023-05-24 | 삼성전자주식회사 | Method and apparatus for operation of an electronic device |
US10871880B2 (en) * | 2016-11-04 | 2020-12-22 | Microsoft Technology Licensing, Llc | Action-enabled inking tools |
US10930045B2 (en) * | 2017-03-22 | 2021-02-23 | Microsoft Technology Licensing, Llc | Digital ink based visual components |
WO2018186031A1 (en) * | 2017-04-03 | 2018-10-11 | ソニー株式会社 | Information processing device, information processing method, and program |
US20180300302A1 (en) * | 2017-04-15 | 2018-10-18 | Microsoft Technology Licensing, Llc | Real-Time Collaboration Live Ink |
US10469274B2 (en) * | 2017-04-15 | 2019-11-05 | Microsoft Technology Licensing, Llc | Live ink presence for real-time collaboration |
US10558853B2 (en) | 2017-05-07 | 2020-02-11 | Massachusetts Institute Of Technology | Methods and apparatus for sharing of music or other information |
US20180329610A1 (en) * | 2017-05-15 | 2018-11-15 | Microsoft Technology Licensing, Llc | Object Selection Mode |
US20180329597A1 (en) * | 2017-05-15 | 2018-11-15 | Microsoft Technology Licensing, Llc | Ink Anchoring |
US10417310B2 (en) * | 2017-06-09 | 2019-09-17 | Microsoft Technology Licensing, Llc | Content inker |
US10732826B2 (en) | 2017-11-22 | 2020-08-04 | Microsoft Technology Licensing, Llc | Dynamic device interaction adaptation based on user engagement |
KR101886010B1 (en) * | 2017-12-28 | 2018-09-10 | 주식회사 네오랩컨버전스 | Electronic device and Driving method thereof |
US20190325244A1 (en) * | 2018-04-20 | 2019-10-24 | Skipy Interactive Pvt Ltd | System and method to enable creative playing on a computing device |
US10872199B2 (en) | 2018-05-26 | 2020-12-22 | Microsoft Technology Licensing, Llc | Mapping a gesture and/or electronic pen attribute(s) to an advanced productivity action |
WO2021056780A1 (en) * | 2019-09-25 | 2021-04-01 | 掌阅科技股份有限公司 | Information display method, reader, computer storage medium, ink screen reading device and screen projection display system |
JP2021056814A (en) | 2019-09-30 | 2021-04-08 | シャープ株式会社 | Display device |
CN111385683B (en) * | 2020-03-25 | 2022-01-28 | 广东小天才科技有限公司 | Intelligent sound box application control method and intelligent sound box |
US11429203B2 (en) * | 2020-06-19 | 2022-08-30 | Microsoft Technology Licensing, Llc | Tilt-responsive techniques for digital drawing boards |
US11605187B1 (en) * | 2020-08-18 | 2023-03-14 | Corel Corporation | Drawing function identification in graphics applications |
US11132104B1 (en) * | 2020-10-05 | 2021-09-28 | Huawei Technologies Co., Ltd. | Managing user interface items in a visual user interface (VUI) |
US11630946B2 (en) * | 2021-01-25 | 2023-04-18 | Microsoft Technology Licensing, Llc | Documentation augmentation using role-based user annotations |
US20220244898A1 (en) * | 2021-02-02 | 2022-08-04 | Honeywell International Inc. | Methods and systems for propagating user inputs to different displays |
US11361153B1 (en) | 2021-03-16 | 2022-06-14 | Microsoft Technology Licensing, Llc | Linking digital ink instances using connecting lines |
US11435893B1 (en) | 2021-03-16 | 2022-09-06 | Microsoft Technology Licensing, Llc | Submitting questions using digital ink |
US11875543B2 (en) | 2021-03-16 | 2024-01-16 | Microsoft Technology Licensing, Llc | Duplicating and aggregating digital ink instances |
US11526659B2 (en) | 2021-03-16 | 2022-12-13 | Microsoft Technology Licensing, Llc | Converting text to digital ink |
US11372486B1 (en) | 2021-03-16 | 2022-06-28 | Microsoft Technology Licensing, Llc | Setting digital pen input mode using tilt angle |
US20220318335A1 (en) * | 2021-04-02 | 2022-10-06 | Relativity Oda Llc | Methods and systems for opening and incrementally displaying documents |
US20230315271A1 (en) * | 2022-03-18 | 2023-10-05 | Sony Group Corporation | Collaborative whiteboard for meetings |
US20230353611A1 (en) * | 2022-04-29 | 2023-11-02 | Zoom Video Communications, Inc. | Outputs from persistent hybrid collaborative workspaces |
KR20240020926A (en) * | 2022-08-09 | 2024-02-16 | 삼성전자주식회사 | Electronic device for receiving touch input and method for controlling the same |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20020024499A1 (en) * | 1998-03-27 | 2002-02-28 | International Business Machines Corporation | Flexibly interfaceable portable computing device |
US6498601B1 (en) * | 1999-11-29 | 2002-12-24 | Xerox Corporation | Method and apparatus for selecting input modes on a palmtop computer |
US20060215886A1 (en) * | 2000-01-24 | 2006-09-28 | Black Gerald R | Method for identity verification |
US20080165162A1 (en) * | 2007-01-08 | 2008-07-10 | Pegasus Technologies Ltd. | Electronic Pen Device |
US20090000831A1 (en) * | 2007-06-28 | 2009-01-01 | Intel Corporation | Multi-function tablet pen input device |
US20090231275A1 (en) * | 2005-01-30 | 2009-09-17 | Simtrix Limited | Computer mouse peripheral |
US20110184828A1 (en) * | 2005-01-19 | 2011-07-28 | Amazon Technologies, Inc. | Method and system for providing annotations of a digital work |
US20120159351A1 (en) * | 2010-12-21 | 2012-06-21 | International Business Machines Corporation | Multiple reviews of graphical user interfaces |
US9285903B1 (en) * | 2011-09-28 | 2016-03-15 | Amazon Technologies, Inc. | Stylus and electronic display |
Family Cites Families (62)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5559942A (en) * | 1993-05-10 | 1996-09-24 | Apple Computer, Inc. | Method and apparatus for providing a note for an application program |
US5613019A (en) * | 1993-05-20 | 1997-03-18 | Microsoft Corporation | System and methods for spacing, storing and recognizing electronic representations of handwriting, printing and drawings |
JP3855462B2 (en) | 1998-05-29 | 2006-12-13 | 株式会社日立製作所 | Method for editing command sequence with processing time and apparatus using the same |
US6658147B2 (en) | 2001-04-16 | 2003-12-02 | Parascript Llc | Reshaping freehand drawn lines and shapes in an electronic document |
US7286141B2 (en) | 2001-08-31 | 2007-10-23 | Fuji Xerox Co., Ltd. | Systems and methods for generating and controlling temporary digital ink |
US7299424B2 (en) | 2002-05-14 | 2007-11-20 | Microsoft Corporation | Lasso select |
US7096432B2 (en) | 2002-05-14 | 2006-08-22 | Microsoft Corporation | Write anywhere tool |
US6867786B2 (en) | 2002-07-29 | 2005-03-15 | Microsoft Corp. | In-situ digital inking for applications |
US20040257346A1 (en) * | 2003-06-20 | 2004-12-23 | Microsoft Corporation | Content selection and handling |
US7372993B2 (en) * | 2004-07-21 | 2008-05-13 | Hewlett-Packard Development Company, L.P. | Gesture recognition |
US7752561B2 (en) * | 2005-03-15 | 2010-07-06 | Microsoft Corporation | Method and system for creating temporary visual indicia |
US7935075B2 (en) | 2005-04-26 | 2011-05-03 | Cardiac Pacemakers, Inc. | Self-deploying vascular occlusion device |
US20060267967A1 (en) | 2005-05-24 | 2006-11-30 | Microsoft Corporation | Phrasing extensions and multiple modes in one spring-loaded control |
US8081165B2 (en) | 2005-08-30 | 2011-12-20 | Jesterrad, Inc. | Multi-functional navigational device and method |
US7526737B2 (en) | 2005-11-14 | 2009-04-28 | Microsoft Corporation | Free form wiper |
US8181103B2 (en) | 2005-12-29 | 2012-05-15 | Microsoft Corporation | Annotation detection and anchoring on ink notes |
US20070156335A1 (en) | 2006-01-03 | 2007-07-05 | Mcbride Sandra Lynn | Computer-Aided Route Selection |
US8194081B2 (en) | 2007-05-29 | 2012-06-05 | Livescribe, Inc. | Animation of audio ink |
US8004498B1 (en) | 2007-10-22 | 2011-08-23 | Adobe Systems Incorporated | Systems and methods for multipoint temporary anchoring |
US20090327501A1 (en) * | 2008-06-27 | 2009-12-31 | Athellina Athsani | Communication access control system and method |
US8402391B1 (en) * | 2008-09-25 | 2013-03-19 | Apple, Inc. | Collaboration system |
US20130283166A1 (en) * | 2012-04-24 | 2013-10-24 | Social Communications Company | Voice-based virtual area navigation |
US9269102B2 (en) | 2009-05-21 | 2016-02-23 | Nike, Inc. | Collaborative activities in on-line commerce |
US8179417B2 (en) * | 2009-07-22 | 2012-05-15 | Hewlett-Packard Development Company, L.P. | Video collaboration |
US20110143769A1 (en) * | 2009-12-16 | 2011-06-16 | Microsoft Corporation | Dual display mobile communication device |
US20110166777A1 (en) | 2010-01-07 | 2011-07-07 | Anand Kumar Chavakula | Navigation Application |
US8261213B2 (en) * | 2010-01-28 | 2012-09-04 | Microsoft Corporation | Brush, carbon-copy, and fill gestures |
US9185326B2 (en) | 2010-06-11 | 2015-11-10 | Disney Enterprises, Inc. | System and method enabling visual filtering of content |
US9323807B2 (en) | 2010-11-03 | 2016-04-26 | Sap Se | Graphical manipulation of data objects |
US9201520B2 (en) | 2011-02-11 | 2015-12-01 | Microsoft Technology Licensing, Llc | Motion and context sharing for pen-based computing inputs |
WO2013016165A1 (en) | 2011-07-22 | 2013-01-31 | Social Communications Company | Communicating between a virtual area and a physical space |
US9344684B2 (en) * | 2011-08-05 | 2016-05-17 | Honeywell International Inc. | Systems and methods configured to enable content sharing between client terminals of a digital video management system |
US9948988B2 (en) | 2011-10-04 | 2018-04-17 | Ricoh Company, Ltd. | Meeting system that interconnects group and personal devices across a network |
US20130205189A1 (en) | 2012-01-25 | 2013-08-08 | Advanced Digital Systems, Inc. | Apparatus And Method For Interacting With An Electronic Form |
US9557878B2 (en) | 2012-04-25 | 2017-01-31 | International Business Machines Corporation | Permitting participant configurable view selection within a screen sharing session |
US9876988B2 (en) * | 2012-07-13 | 2018-01-23 | Microsoft Technology Licensing, Llc | Video display modification for video conferencing environments |
US20140026076A1 (en) * | 2012-07-17 | 2014-01-23 | Jacquilene Jacob | Real-time interactive collaboration system |
US20140047330A1 (en) * | 2012-08-09 | 2014-02-13 | Sap Ag | Collaborative decision making in contract documents |
KR102129374B1 (en) | 2012-08-27 | 2020-07-02 | 삼성전자주식회사 | Method for providing user interface, machine-readable storage medium and portable terminal |
US20140136985A1 (en) | 2012-11-12 | 2014-05-15 | Moondrop Entertainment, Llc | Method and system for sharing content |
KR20140065764A (en) * | 2012-11-21 | 2014-05-30 | 한국전자통신연구원 | System and method for function expandable collaboration screen system |
US9389717B2 (en) * | 2012-12-14 | 2016-07-12 | Microsoft Technology Licensing, Llc | Reducing latency in ink rendering |
KR101984592B1 (en) * | 2013-01-04 | 2019-05-31 | 엘지전자 주식회사 | Mobile terminal and method for controlling the same |
KR20140111497A (en) * | 2013-03-11 | 2014-09-19 | 삼성전자주식회사 | Method for deleting item on touch screen, machine-readable storage medium and portable terminal |
US9304609B2 (en) * | 2013-03-12 | 2016-04-05 | Lenovo (Singapore) Pte. Ltd. | Suspending tablet computer by stylus detection |
US9690403B2 (en) * | 2013-03-15 | 2017-06-27 | Blackberry Limited | Shared document editing and voting using active stylus based touch-sensitive displays |
US20140282103A1 (en) | 2013-03-16 | 2014-09-18 | Jerry Alan Crandall | Data sharing |
US20140328505A1 (en) | 2013-05-02 | 2014-11-06 | Microsoft Corporation | Sound field adaptation based upon user tracking |
TWI502433B (en) | 2013-07-08 | 2015-10-01 | Acer Inc | Electronic device for interacting with stylus and touch control method thereof |
US20150033140A1 (en) | 2013-07-23 | 2015-01-29 | Salesforce.Com, Inc. | Providing active screen sharing links in an information networking environment |
US20150052430A1 (en) * | 2013-08-13 | 2015-02-19 | Dropbox, Inc. | Gestures for selecting a subset of content items |
KR20150020383A (en) | 2013-08-13 | 2015-02-26 | 삼성전자주식회사 | Electronic Device And Method For Searching And Displaying Of The Same |
US10044979B2 (en) * | 2013-08-19 | 2018-08-07 | Cisco Technology, Inc. | Acquiring regions of remote shared content with high resolution |
JP2015049604A (en) | 2013-08-30 | 2015-03-16 | 株式会社東芝 | Electronic apparatus and method for displaying electronic document |
US9575948B2 (en) | 2013-10-04 | 2017-02-21 | Nook Digital, Llc | Annotation of digital content via selective fixed formatting |
US10120528B2 (en) | 2013-12-24 | 2018-11-06 | Dropbox, Inc. | Systems and methods for forming share bars including collections of content items |
US10101844B2 (en) * | 2014-03-14 | 2018-10-16 | Lg Electronics Inc. | Mobile terminal and method of controlling the same based on type of touch object used to apply touch input |
US9544257B2 (en) * | 2014-04-04 | 2017-01-10 | Blackberry Limited | System and method for conducting private messaging |
US9268928B2 (en) | 2014-04-06 | 2016-02-23 | International Business Machines Corporation | Smart pen system to restrict access to security sensitive devices while continuously authenticating the user |
US20150304376A1 (en) * | 2014-04-17 | 2015-10-22 | Shindig, Inc. | Systems and methods for providing a composite audience view |
US9906614B2 (en) * | 2014-05-05 | 2018-02-27 | Adobe Systems Incorporated | Real-time content sharing between browsers |
US20150338940A1 (en) | 2014-05-23 | 2015-11-26 | Microsoft Technology Licensing, Llc | Pen Input Modes for Digital Ink |
-
2015
- 2015-03-23 US US14/665,462 patent/US20150338940A1/en not_active Abandoned
- 2015-03-23 US US14/665,282 patent/US9990059B2/en active Active
- 2015-03-23 US US14/665,413 patent/US20150339050A1/en not_active Abandoned
- 2015-03-23 US US14/665,369 patent/US10275050B2/en active Active
- 2015-03-23 US US14/665,330 patent/US20150338939A1/en not_active Abandoned
Patent Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20020024499A1 (en) * | 1998-03-27 | 2002-02-28 | International Business Machines Corporation | Flexibly interfaceable portable computing device |
US6498601B1 (en) * | 1999-11-29 | 2002-12-24 | Xerox Corporation | Method and apparatus for selecting input modes on a palmtop computer |
US20060215886A1 (en) * | 2000-01-24 | 2006-09-28 | Black Gerald R | Method for identity verification |
US20110184828A1 (en) * | 2005-01-19 | 2011-07-28 | Amazon Technologies, Inc. | Method and system for providing annotations of a digital work |
US20090231275A1 (en) * | 2005-01-30 | 2009-09-17 | Simtrix Limited | Computer mouse peripheral |
US20080165162A1 (en) * | 2007-01-08 | 2008-07-10 | Pegasus Technologies Ltd. | Electronic Pen Device |
US20090000831A1 (en) * | 2007-06-28 | 2009-01-01 | Intel Corporation | Multi-function tablet pen input device |
US20120159351A1 (en) * | 2010-12-21 | 2012-06-21 | International Business Machines Corporation | Multiple reviews of graphical user interfaces |
US9285903B1 (en) * | 2011-09-28 | 2016-03-15 | Amazon Technologies, Inc. | Stylus and electronic display |
Cited By (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9990059B2 (en) | 2014-05-23 | 2018-06-05 | Microsoft Technology Licensing, Llc | Ink modes |
US10275050B2 (en) | 2014-05-23 | 2019-04-30 | Microsoft Technology Licensing, Llc | Ink for a shared interactive space |
US9946368B2 (en) * | 2015-06-19 | 2018-04-17 | Beijing Lenovo Software Ltd. | Apparatus and control method |
US20160370887A1 (en) * | 2015-06-19 | 2016-12-22 | Beijing Lenovo Software Ltd. | Apparatus and control method |
US10908705B2 (en) * | 2015-06-25 | 2021-02-02 | Lg Electronics Inc. | Electronic device and controlling method therefor |
US20180188835A1 (en) * | 2015-06-25 | 2018-07-05 | Lg Electronics Inc. | Electronic device and controlling method therefor |
US20170108954A1 (en) * | 2015-10-16 | 2017-04-20 | Waltop International Corporation | Capacitive stylus with eraser |
US9639182B1 (en) * | 2015-10-16 | 2017-05-02 | Waltop International Corporation | Capacitive stylus with eraser |
CN109844760A (en) * | 2016-10-14 | 2019-06-04 | 微软技术许可有限责任公司 | Time correlation ink |
US20180239444A1 (en) * | 2017-02-17 | 2018-08-23 | Dell Products L.P. | System and method for dynamic mode switching in an active stylus |
US10620725B2 (en) * | 2017-02-17 | 2020-04-14 | Dell Products L.P. | System and method for dynamic mode switching in an active stylus |
US11372518B2 (en) * | 2020-06-03 | 2022-06-28 | Capital One Services, Llc | Systems and methods for augmented or mixed reality writing |
US11681409B2 (en) | 2023-06-20 | Capital One Services, LLC | Systems and methods for augmented or mixed reality writing
Also Published As
Publication number | Publication date |
---|---|
US9990059B2 (en) | 2018-06-05 |
US20150339050A1 (en) | 2015-11-26 |
US20150341400A1 (en) | 2015-11-26 |
US20150338938A1 (en) | 2015-11-26 |
US10275050B2 (en) | 2019-04-30 |
US20150338939A1 (en) | 2015-11-26 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20150338940A1 (en) | Pen Input Modes for Digital Ink | |
US10452333B2 (en) | User terminal device providing user interaction and method therefor | |
US10367765B2 (en) | User terminal and method of displaying lock screen thereof | |
US8810535B2 (en) | Electronic device and method of controlling same | |
US9898186B2 (en) | Portable terminal using touch pen and handwriting input method using the same | |
CN104424359B (en) | For providing the electronic equipment of content and method according to field attribute | |
KR101971067B1 (en) | Method and apparatus for providing of user interface in portable device | |
EP2565752A2 (en) | Method of providing a user interface in portable terminal and apparatus thereof | |
KR20140111497A (en) | Method for deleting item on touch screen, machine-readable storage medium and portable terminal | |
US20170131865A1 (en) | Electronic device with electromagnetic sensor and method for controlling the same | |
US10928948B2 (en) | User terminal apparatus and control method thereof | |
US20130311922A1 (en) | Mobile device with memo function and method for controlling the device | |
EP2770422A2 (en) | Method for providing a feedback in response to a user input and a terminal implementing the same | |
EP2670132A2 (en) | Method and apparatus for playing video in portable terminal | |
JP2012048623A (en) | Information processing unit, parameter setting method, and program | |
EP2770421A2 (en) | Electronic device having touch-sensitive user interface and related operating method | |
US10168894B2 (en) | Computing device canvas invocation and dismissal | |
JP6439266B2 (en) | Text input method and apparatus in electronic device with touch screen | |
US20140164186A1 (en) | Method for providing application information and mobile terminal thereof | |
EP3032394A1 (en) | Method and apparatus for inputting information by using on-screen keyboard | |
US10691333B2 (en) | Method and apparatus for inputting character | |
US20140223298A1 (en) | Method of editing content and electronic device for implementing the same | |
JP2015207040A (en) | Touch operation input device, touch operation input method and program | |
KR102118091B1 (en) | Mobile apparatus having fuction of pre-action on object and control method thereof | |
KR102073024B1 (en) | Apparatus and method for editing memo in a user terminal |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: MICROSOFT TECHNOLOGY LICENSING, LLC, WASHINGTON Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:VONG, WILLIAM H.;REEL/FRAME:035230/0962 Effective date: 20150320 |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |