US5565894A - Dynamic touchscreen button adjustment mechanism - Google Patents
- Publication number: US5565894A
- Application number: US08/345,266
- Authority: US (United States)
- Legal status: Expired - Lifetime
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0416—Control or interface arrangements specially adapted for digitisers
- G06F3/0418—Control or interface arrangements specially adapted for digitisers for error correction or compensation, e.g. based on parallax, calibration or alignment
Definitions
- Button list 46 defines button sensing region 34 for each button 20. Factor list 47 contains a list of factors that represent the differences between the centers of button sensing regions 34 and the locations that the user has touched. Adjustment factor 48 is the average of the factors in factor list 47. Button list 46, factor list 47, and adjustment factor 48 are explained in more detail in the description of FIGS. 6a-6e.
- Factor list 47 contains the differences between the locations that the user touched on the adjusting buttons and the centers of their button sensing regions 34. In this example, factor list 47 contains two factors because there are two adjusting buttons in button list 46, each of which was touched once by the user. However, an application could have the user touch an adjusting button more than once, in which case there would not be a one-to-one correspondence between the number of factors in the factor list and the number of adjusting buttons in the button list.
- Referring now to FIG. 7, block 601 displays buttons on the touchscreen to the user. The number and configuration of the buttons displayed depend on the particular application that the user is accessing. When the user touches the screen, block 605 determines the location that the user touched. Touchscreens that provide the coordinates of the touched location via sensors on the touchscreen, such as light-emitting diodes and photodetectors, are well known in the art; this invention is independent of the method of touched-location determination. Block 610 determines which button the user touched by calling the Determine Button Touched subroutine, which is described in detail in FIG. 8.
- Referring now to FIG. 8, block 701 gets a button entry from button list 46. Block 705 checks whether this button entry is for an adjusting button by examining type field 49 in the button entry. If the button entry is for an adjusting button and the location that the user touched is within the button sensing region for this button entry, blocks 705 and 710 are answered affirmatively, and a new adjustment factor is computed by calling the Compute Adjustment Factor subroutine, which is described in FIG. 9.
- Referring now to FIG. 9, block 765 computes a factor from the difference between the center of the sensing region and the touched location. If block 770 determines that factor list 47 is full, the oldest factor in factor list 47 is removed in block 775; this is necessary because the memory available to store factors is finite. The newly calculated factor is inserted into factor list 47 in block 780, and adjustment factor 48 is set to the average of all factors in factor list 47 in block 785. Although the preferred embodiment sets the adjustment factor to the average of the factors in the factor list, alternative embodiments could use another method, such as giving recent factors more weight than remote factors or using the median of the factors. Block 790 returns control from the Compute Adjustment Factor subroutine to block 730 in FIG. 8, and block 730 returns an indication of which button was touched to block 615 of FIG. 7.
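- The Compute Adjustment Factor subroutine can be sketched in Python roughly as follows. The function name, the tuple representation of locations, and the list size are illustrative assumptions, not part of the patent; the bounded-list behavior mirrors blocks 770-785.

```python
from collections import deque

# Illustrative capacity for the finite factor memory (blocks 770-775).
FACTOR_LIST_SIZE = 8

# deque(maxlen=...) drops the oldest factor automatically when full,
# which is exactly the behavior blocks 770-775 describe.
factor_list = deque(maxlen=FACTOR_LIST_SIZE)

def compute_adjustment_factor(center, touched):
    """Record the offset between the touched location and the sensing-region
    center (block 765), insert it into the bounded factor list (block 780),
    and return the new adjustment factor as the average of all stored
    factors (block 785)."""
    dx = touched[0] - center[0]
    dy = touched[1] - center[1]
    factor_list.append((dx, dy))
    n = len(factor_list)
    return (sum(f[0] for f in factor_list) / n,
            sum(f[1] for f in factor_list) / n)
```

For example, a touch 2 units right and 3 units above the center of an adjusting button yields a first adjustment factor of (2.0, 3.0); subsequent touches pull the running average toward their own offsets.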
- If block 710 determines that the touched location is not within the sensing region for the button entry, and block 740 determines that there are button entries left in button list 46, then the next button entry in button list 46 is processed in block 701. If block 740 determines that there are no button entries left in button list 46, then the user has touched a location on the screen that is not associated with any button, so an indication that no button was touched is returned in block 735 to block 615 in FIG. 7.
- If block 705 determines that the button entry is for a non-adjusting button, block 715 checks whether remap mode (an alternative embodiment) or calibrate mode (the preferred embodiment) is desired. If block 715 determines that calibrate mode is desired, block 717 sets the sensing region to be the button location modified by adjustment factor 48. If block 715 determines that remap mode is desired, block 719 modifies the touched location by the negative of adjustment factor 48. If block 725 then determines that the touched location is within the sensing region for the button list entry, an indication of which button was touched is returned in block 730 to block 615 of FIG. 7.
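- The two modes tested in block 715 are equivalent for rectangular regions: shifting the sensing region by the adjustment factor (calibrate, block 717) accepts exactly the touches that shifting the touch by the negative of the factor (remap, block 719) does. A sketch, with illustrative function names and the document's lower-left-origin coordinate convention assumed:

```python
def in_rect(p, upper_left, lower_right):
    # (0,0) is the lower-left corner, so y grows upward: the upper-left
    # corner has the smaller x and the larger y.
    return (upper_left[0] <= p[0] <= lower_right[0] and
            lower_right[1] <= p[1] <= upper_left[1])

def hit_calibrate(touch, ul, lr, factor):
    """Calibrate mode (block 717): move the sensing region by the factor."""
    shifted_ul = (ul[0] + factor[0], ul[1] + factor[1])
    shifted_lr = (lr[0] + factor[0], lr[1] + factor[1])
    return in_rect(touch, shifted_ul, shifted_lr)

def hit_remap(touch, ul, lr, factor):
    """Remap mode (block 719): move the touch by the negative of the factor."""
    moved = (touch[0] - factor[0], touch[1] - factor[1])
    return in_rect(moved, ul, lr)
```

With a factor of (2, 3), a touch at (11, 12) high and to the right of a button whose region spans (0, 10) to (10, 0) is accepted by both modes, while a touch at (1, 1) is rejected by both.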
- If block 725 determines that the touched location is not within the button sensing region for the button list entry, and block 740 determines that there are button entries left in the button list, then the next button entry in button list 46 is processed in block 701. If block 740 determines that there are no button entries left in button list 46, then the user has touched a location on the screen that is not associated with any button, so an indication that no button was touched is returned in block 735 to block 615 in FIG. 7.
- Returning to FIG. 7, if block 610 returned an indication that a button was touched, block 615 is answered affirmatively, and the appropriate processing for the touched button is performed by the application that the user is accessing in block 620. If block 610 returned an indication that no button was touched, then the user has touched a location on the touchscreen that is not associated with any button, so block 615 is answered negatively and block 625 displays an error message to the user. In either event, control returns to block 601, which displays a new screen of buttons on the touchscreen.
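- The main loop just described can be condensed into a short Python sketch. The callback parameters and dictionary field names are illustrative assumptions; `read_touch`, `handle_button`, and `show_error` stand in for hardware- and application-specific code.

```python
def in_region(entry, p):
    # (0,0) is the lower-left corner of the touchscreen, so y grows upward.
    (x1, y1), (x2, y2) = entry["upper_left"], entry["lower_right"]
    return x1 <= p[0] <= x2 and y2 <= p[1] <= y1

def main_loop(screens, read_touch, handle_button, show_error):
    for button_list in screens:      # block 601: display a screen of buttons
        touch = read_touch()         # block 605: determine touched location
        # block 610: map the touch to a button entry, or None if it misses
        hit = next((b for b in button_list if in_region(b, touch)), None)
        if hit is not None:          # block 615: was any button touched?
            handle_button(hit)       # block 620: application processing
        else:
            show_error()             # block 625: touch missed every button
```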
Abstract
A computer-based touchscreen includes a mechanism for dynamically adjusting the sensing region of a button displayed on the touchscreen based on the location where a user has touched the touchscreen. Because different operators view the buttons from different angles, they touch the touchscreen in different positions. An initial button or series of buttons of the application that the user is using is configured to sense the location where the user is touching. These initial buttons are called adjusting buttons. The button sensing regions for the remaining buttons (non-adjusting buttons) are calibrated from these initial touched locations. For example, if the user tends to press high and to the right of the adjusting buttons, the orientation of the button sensing regions for the remaining buttons is calibrated high and to the right. Any screen that displays buttons can be used to calibrate or recalibrate the button sensing regions for subsequent non-adjusting buttons, which allows for greater accuracy should users change their view of or angle to the touchscreen as they use the application. Thus, the computer dynamically adjusts the orientation of a touchscreen button's sensing region based on the locations that the user has pressed for previous buttons. This dynamic adjustment allows a high density of buttons to be displayed on the touchscreen.
Description
This application is a continuation of application Ser. No. 08/042,048 filed Apr. 1, 1993, now abandoned.
This invention relates to the data processing field. More particularly, this invention relates to a touchscreen apparatus, contained in a display, that adjusts a sensing region based on the location that the user touches on the touchscreen.
Touchscreens, in which the user communicates a desired action to a computer by touching the touchscreen in a specific location, are well known in the art. The computer system displays graphics on the touchscreen that represent buttons. The user interfaces to the computer system by touching the touchscreen in a displayed button's sensing region, which is the area on the touchscreen that the computer system associates with the button. The computer system detects the user's touch, maps the touched location to a particular button, and performs the appropriate functions based on the button that the user selected via the touch.
Touchscreens are advantageous in that they can eliminate the need for a keyboard or other input device in applications where a separate input device would be cumbersome, expensive, or susceptible to vandalism. Also, with a touchscreen the user is not forced to read instructions in one place and hunt elsewhere, such as on a keyboard, for keys to press. Instead, the instructions that the user reads and the buttons that the user touches are in the same proximity. This close proximity of instructions and buttons is more convenient and faster for the user and reduces the chance of incorrect button selection.
However, touchscreens suffer from the problem that the buttons on the touchscreen are two-dimensional: although the buttons are visually distinct from one another, they are not physically distinct from one another. Thus, while a user of a conventional keyboard can feel when a finger is straying over the edge of the intended key and onto an adjacent, unintended key, the user of a touchscreen cannot feel when a finger is straying over the edge of a button. These problems are exacerbated by the fact that the touchscreen has thickness, so while the button is projected onto the back of the touchscreen, the user touches it from the front. This screen thickness, combined with the parallax effect, causes different users to perceive the same button in potentially different places. For example, tall users standing to the right of the touchscreen will tend to perceive the button as being above and to the right of the actual button position, so they will tend to touch high and to the right of the button sensing region. Analogously, short users standing to the left of the touchscreen will tend to perceive the button as being below and to the left of the actual button position, so they will tend to touch low and to the left of the button sensing region.
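The size of the parallax error can be estimated with simple geometry. The sketch below assumes a straight-ray model (ignoring refraction in the glass); the function name and formula are illustrative, not taken from the patent.

```python
import math

def parallax_offset(thickness, view_angle_deg):
    """Apparent shift, measured on the front surface, of a button drawn on
    the back surface of a touchscreen of the given thickness, for a user
    whose line of sight makes view_angle_deg with the screen normal."""
    return thickness * math.tan(math.radians(view_angle_deg))
```

Under this model, a screen 3 mm thick viewed 45 degrees off-normal shifts the perceived button position by 3 mm, which is comparable to the size of a small button and enough to push a touch into a neighboring sensing region.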
The cumulative effect of these problems causes users to be unsure of whether or not they have touched the intended button, especially when the buttons are small and close together. In contrast, on a conventional keyboard, users can be reasonably confident that they have depressed the intended key because the key is three dimensional, moves downward when pressed, and is tactilely separate from adjacent keys.
Prior art touchscreens attempt to solve these problems by either displaying the button over a larger than desired area or utilizing a larger than desired button sensing region associated with the button. Both of these solutions severely limit the density and orientation of the buttons that can be displayed on the touchscreen.
For the foregoing reasons, there is a need for a touchscreen that allows a high density of displayed buttons with accurate selection of buttons by users who are viewing the touchscreen from a variety of view angles.
It is an object of the present invention to provide an enhanced touchscreen.
It is a further object to allow a high density of displayed buttons, with accurate selection of buttons, on a touchscreen apparatus.
It is a further object to allow accurate selection of buttons on a touchscreen apparatus by users viewing the touchscreen from all view angles.
These and other objects are achieved by the computer system dynamically adjusting the orientation of a touchscreen button's sensing region based on the location that the user has touched for a previous button or buttons.
An initial button or series of buttons of the application that the user is using is configured to sense the location where the user is touching. These initial button(s) are called adjusting buttons. The button sensing regions for the remaining buttons (non-adjusting buttons) are calibrated from these initial touched locations. For example, if the user tends to press high and to the right of the adjusting button(s), the orientation of the button sensing regions for the remaining buttons is calibrated high and to the right. If multiple adjusting buttons are used, the average of the touched positions is used to calibrate the button sensing regions of the non-adjusting buttons.
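The calibration step described above can be sketched as follows: the offsets observed on the adjusting buttons are averaged into a single (dx, dy) adjustment, which is then applied to each non-adjusting sensing region. All names here are illustrative assumptions.

```python
def average_offset(centers, touches):
    """centers[i] is the center of adjusting button i; touches[i] is where
    the user actually touched it. Returns the average (dx, dy) offset."""
    n = len(centers)
    dx = sum(t[0] - c[0] for c, t in zip(centers, touches)) / n
    dy = sum(t[1] - c[1] for c, t in zip(centers, touches)) / n
    return dx, dy

def calibrated_region(upper_left, lower_right, offset):
    """Shift a non-adjusting button's sensing region by the offset."""
    dx, dy = offset
    return ((upper_left[0] + dx, upper_left[1] + dy),
            (lower_right[0] + dx, lower_right[1] + dy))
```

For a user who touched (2, 3) and (2, 1) away from the centers of two adjusting buttons, every non-adjusting sensing region would shift by the average, (2, 2): high and to the right.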
The adjusting buttons are not restricted to the beginning of the application. Any screen that displays buttons can be used to calibrate or recalibrate the button sensing region for subsequent non-adjusting buttons. This allows for greater accuracy should the users change their view or angle of the touchscreen as they use the application.
FIG. 1 shows the parallax effect, present in a touchscreen, which causes different view angles to result in different users perceiving the button at different locations;
FIG. 2 shows a prior art touchscreen button and sensing region;
FIG. 3 shows users viewing the touchscreen of the preferred embodiment, which is shown in this example as a ticket kiosk, from different angles;
FIGS. 4a and 4b show an example of a seat selection screen of the preferred embodiment, which allows a high density of buttons on the screen;
FIG. 5 shows a block diagram of the computer system of the preferred embodiment;
FIGS. 6a, 6b, 6c, 6d, and 6e show the button list, factor list, and adjustment factor of the preferred embodiment and their relationships to the button sensing region; and
FIGS. 7, 8, and 9 show the flowcharts that describe the operation of the preferred embodiment.
FIG. 1 is a side view of display 17 and shows the parallax effect, which causes different user view angles to result in different users perceiving button 20 at different locations. Touchscreen 18 in display 17 has thickness 23, which is the distance between touchscreen back 28 and touchscreen front 22. Although button 20 is projected onto touchscreen back 28, users 24 and 26 will attempt to touch it on touchscreen front 22. Screen thickness 23, combined with the parallax effect, causes different operators 24 and 26 to perceive button 20 in different places. User 24 will perceive and attempt to touch button 20 at button location 29. In contrast, user 26 will perceive and attempt to touch button 20 at button location 21, which in this example is lower on touchscreen 18 than button location 29.
FIG. 2 shows prior art display 67, touchscreen 68, touchscreen button 60, and button sensing region 64. Button sensing region 64 is the area of touchscreen 68 associated with button 60 such that if the user touches within button sensing region 64, the computer system considers button 60 to have been touched. To solve the parallax problem illustrated in FIG. 1, prior art touchscreens utilized button sensing regions 64 that were larger than their corresponding buttons 60. The prior art solution illustrated by FIG. 2 is disadvantageous in that sensing regions 64 are large and thus limit the density and orientation of buttons 60 that may be concurrently displayed on touchscreen 68.
FIG. 3 shows users 24 and 26 viewing button 20 from different angles. Button 20a is displayed on touchscreen 18 in display 17. For purposes of illustration, the application in this example is ticket kiosk 30. Because of the parallax effect and screen thickness 23 described in FIG. 1, tall user 24, standing to the right of ticket kiosk 30, will tend to view and press button 20a above and to the right of its actual position. In contrast, short user 26, standing to the left of ticket kiosk 30, will tend to view and press button 20a below and to the left of its actual position. To accommodate the differences in perception of different operators, this invention dynamically adjusts the button sensing regions, as will be shown in FIGS. 4a, 4b, 6a, 6b, 6c, 6d, and 6e, and explained in more detail in the flowcharts of FIGS. 7, 8, and 9.
FIG. 4a illustrates display 17 and touchscreen 18 of the preferred embodiment displaying an exemplary seat selection screen. In this example, the aggregation of all buttons 20 represents a section of seats in a stadium. Users can touch a button to select the seat for which they wish to purchase a ticket. The buttons with an X through them represent seats that have already been sold. For such a seat selection application, it is critical that the buttons be small with fine enough resolution to allow many seats to be displayed close together. This invention allows a high density of buttons with fine resolution on touchscreen 18 because the button sensing regions are dynamically adjusted to suit individual users, as will be shown in FIG. 4b.
FIG. 4b is a close up view of a few of buttons 20 and their corresponding button sensing regions 34. In this example, each button sensing region 34 has been adjusted up and to the right to accommodate a user who tends to touch high and to the right of button 20. Thus, button sensing region 34c is associated with button 20c. When the user touches within button sensing region 34c, the invention associates that touch with button 20c. Likewise, when the user touches within button sensing region 34d or 34e, the invention associates that touch with buttons 20d or 20e respectively. In this way a high density of buttons 20 can be displayed and selected on touchscreen 18.
FIG. 5 shows a block diagram of computer system 10 of the preferred embodiment of the invention. Computer system 10 has touchscreen 18 in display 17 connected to system unit 11. System unit 11 contains processor 12 connected to memory 13, storage 14, and display adapter 15. Processor 12 is suitably programmed to carry out this invention, as described in more detail in the flowcharts of FIGS. 7, 8, and 9.
In the preferred embodiment, computer system 10 is an IBM PS/2, storage 14 is a magnetic hard disk file, display adapter 15 is an IBM VGA display adapter, and display 17 is an IBM 8516 display. Computer system 10 could also be another type of computer system, whether it be another microcomputer such as an Apple Macintosh, a minicomputer such as an IBM AS/400, or a mainframe computer such as an IBM System/390, and still fall within the spirit and scope of this invention. In addition, computer system 10 could be a microcomputer such as described above but connected to a larger computer system such as an IBM AS/400.
FIGS. 6a, 6b, 6c, 6d, and 6e show factor list 47, adjustment factor 48, and the relationship of button list 46 to button 20 and button sensing region 34 in more detail. Button list 46 is a list of button entries. Each entry in button list 46 defines a button sensing region 34 for a button 20 on touchscreen 18 via fields 49, 50, 51, and 52. In the preferred embodiment, there is a separate button list for every screen of buttons displayed, but there could be just one button list for the entire application without departing from the invention. In the preferred embodiment, buttons are displayed as rectangles on the touchscreen, but the buttons could be any geometric shape including, but not limited to, circles, ovals, or polygons with any number of sides. Field 50 defines the upper left point of button sensing region 34. Field 51 defines the lower right point of button sensing region 34. Field 52 defines the center of button sensing region 34. In this example, locations on touchscreen 18 are defined by an (X,Y) coordinate system, with X representing the horizontal axis, Y representing the vertical axis, and location (0,0) representing the lower left corner of touchscreen 18. However, other methods of defining locations on touchscreen 18 could be used without departing from the invention. Note that in this example, since button sensing region 34 is a rectangle, it would not be necessary to include field 52 in button list 46, since it could be calculated from field 50 and field 51.
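One possible in-memory layout for a button-list entry holding fields 49-52 is sketched below. The class and attribute names are illustrative assumptions; as noted above, for a rectangular region the center (field 52) is redundant and can be derived from the two corners.

```python
from dataclasses import dataclass

@dataclass
class ButtonEntry:
    button_type: str    # field 49: "adjusting" or "non-adjusting"
    upper_left: tuple   # field 50: (x, y), with (0,0) at the lower left
    lower_right: tuple  # field 51: (x, y)

    @property
    def center(self):
        # field 52, computed from fields 50 and 51 instead of being stored
        return ((self.upper_left[0] + self.lower_right[0]) / 2,
                (self.upper_left[1] + self.lower_right[1]) / 2)
```

A button list for one screen would then simply be a list of such entries, one per displayed button.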
In one preferred embodiment, there are two types of button entries in button list 46 associated with the two types of buttons. Type 49 in the button entry indicates whether the button entry is associated with an adjusting button or a non-adjusting button. In one preferred embodiment, the locations that the user touches associated with adjusting buttons are used to calibrate the button sensing regions of subsequent non-adjusting buttons. In an alternative embodiment, the locations that the user touches associated with adjusting buttons are used to remap subsequent touched locations into the button sensing regions of non-adjusting buttons. In this example, buttons 20a and 20b are adjusting buttons, while buttons 20c, 20d, and 20e are non-adjusting buttons. In the preferred embodiment, adjusting buttons have button sensing regions that are larger than their associated buttons, and the button sensing regions for adjusting buttons do not move. In the preferred embodiment, non-adjusting buttons have button sensing regions that are the same size and shape as their associated buttons and the button sensing regions for the non-adjusting buttons are calibrated based on where the user touched on the adjusting buttons. However, the button sensing regions could be other sizes or shapes without departing from the invention.
In this example, button list 46 contains button entries 53, 54, 55, 56, and 57, which correspond to buttons 20a, 20b, 20c, 20d, and 20e, and button sensing regions 34a, 34b, 34c, 34d, and 34e, respectively. In one preferred embodiment, adjusting buttons are displayed to the user first, but adjusting and non-adjusting buttons could be displayed to the user in any order. Adjusting and non-adjusting buttons could share the same screen or be on different screens.
In one preferred embodiment, adjustment factor 48 is set to be the average of all factors in factor list 47. However, other methods of setting the adjustment factor could be used, such as taking the median of the factors or weighting recent factors more heavily than remote factors.
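The weighting alternative mentioned above can be sketched as follows. This is an illustrative assumption, not the patent's method: the decay constant, the function name, and the representation of each factor as a (dx, dy) offset are all hypothetical.

```python
def weighted_adjustment(factors, decay=0.5):
    """Average the factors, weighting recent ones more than remote ones.

    factors is ordered oldest to newest; each factor is a (dx, dy) offset
    between a touched location and a sensing-region center. decay is a
    hypothetical constant: values below 1.0 shrink the older weights.
    """
    n = len(factors)
    wx = wy = wsum = 0.0
    for i, (dx, dy) in enumerate(factors):
        w = decay ** (n - 1 - i)    # the newest factor gets weight 1.0
        wx += w * dx
        wy += w * dy
        wsum += w
    return (wx / wsum, wy / wsum)
```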
The operation of one preferred embodiment, as shown in the flowcharts of FIGS. 7-9, will now be described in more detail. Referring to FIG. 7, block 601 displays buttons on the touchscreen to the user. The number and configuration of the buttons displayed depend on the particular application that the user is accessing. When the user touches the touchscreen, block 605 determines the location that the user touched. Touchscreens that provide the coordinates of the touched location via sensors, such as light emitting diodes and photo detectors, are well known in the art. This invention is independent of the method used to determine the touched location. Block 610 determines which button the user touched by calling the Determine Button Touched subroutine, which is described in detail in FIG. 8.
Referring to FIG. 8, block 701 gets a button entry from button list 46. Block 705 checks to see if this button entry is for an adjusting button. This check is accomplished by examining type field 49 in the button entry. If this button entry is an adjusting button and the location that the user touched is within the button sensing region for this button entry, blocks 705 and 710 are answered affirmatively. A new adjustment factor is computed by calling the Compute Adjustment Factor subroutine, which is described in FIG. 9.
Referring to FIG. 9, block 765 computes a factor from the difference between the center of the sensing region and the touched location. If block 770 determines that factor list 47 is full, the oldest factor is removed from factor list 47 in block 775. This is necessary because the memory available to store factors is finite. The newly calculated factor is inserted in factor list 47 in block 780. Adjustment factor 48 is set to be the average of all factors in factor list 47 in block 785. Although the preferred embodiment sets the adjustment factor to the average of the factors in the factor list, alternative embodiments could use another method, such as giving recent factors more weight than remote factors or using the median of the factors. Block 790 returns control from the Compute Adjustment Factor subroutine to block 730 in FIG. 8. Block 730 returns an indication of which button was touched to block 615 of FIG. 7.
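The Compute Adjustment Factor subroutine of FIG. 9 can be sketched roughly as follows. The factor-list capacity of 8 is an assumed value; the patent only says that the memory available to store factors is finite.

```python
from collections import deque

# The factor list has a fixed capacity; blocks 770-775 discard the oldest
# factor when it is full. The capacity below is an assumption made for
# this illustration only.
FACTOR_LIST_SIZE = 8
factor_list = deque(maxlen=FACTOR_LIST_SIZE)  # drops the oldest when full

def compute_adjustment_factor(center, touched):
    # Block 765: factor = difference between the touched location and
    # the center of the sensing region.
    factor = (touched[0] - center[0], touched[1] - center[1])
    # Blocks 770-780: append the new factor; the bounded deque evicts
    # the oldest entry automatically when the list is full.
    factor_list.append(factor)
    # Block 785: adjustment factor = average of all stored factors.
    n = len(factor_list)
    return (sum(f[0] for f in factor_list) / n,
            sum(f[1] for f in factor_list) / n)
```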
Referring to FIG. 8, if block 710 determines that the touched location is not within the sensing region for the button entry and if block 740 determines that there are button entries left in button list 46, then the next button entry in button list 46 is processed in block 701. If block 740 determines that there are no button entries left in button list 46, then the user has touched a location on the screen that is not associated with any button, so an indication that no button was touched is returned in block 735 to block 615 in FIG. 7.
Referring to FIG. 8, if block 705 determines that the button entry is for a non-adjusting button, then block 715 checks whether remap mode (alternative embodiment) or calibrate mode (preferred embodiment) is desired. If block 715 determines that calibrate mode is desired, block 717 sets the sensing region to be the button location modified by adjustment factor 48. If block 715 determines that remap mode is desired, block 719 modifies the touched location by the negative of adjustment factor 48. If block 725 determines that the touched location is within the sensing region for the button list entry, then an indication of which button was touched is returned in block 730 to block 615 of FIG. 7. If block 725 determines that the touched location is not within the button sensing region for the button list entry and block 740 determines that there are button entries left in button list 46, then the next button entry in button list 46 is processed in block 701. If block 740 determines that there are no button entries left in button list 46, then the user has touched a location on the screen that is not associated with any button, so an indication that no button was touched is returned in block 735 to block 615 in FIG. 7.
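The two modes handled in blocks 717 and 719 can be sketched as follows; the rectangular hit test and the function names are illustrative assumptions, not the patent's code.

```python
def point_in_region(pt, upper_left, lower_right):
    # Y increases upward (origin at the lower left hand corner), so the
    # upper-left corner has the larger y coordinate.
    (x1, y1), (x2, y2) = upper_left, lower_right
    return x1 <= pt[0] <= x2 and y2 <= pt[1] <= y1

def hit_test_non_adjusting(touched, upper_left, lower_right,
                           adjustment, mode="calibrate"):
    ax, ay = adjustment
    if mode == "calibrate":
        # Block 717: shift the sensing region by adjustment factor 48.
        shifted_ul = (upper_left[0] + ax, upper_left[1] + ay)
        shifted_lr = (lower_right[0] + ax, lower_right[1] + ay)
        return point_in_region(touched, shifted_ul, shifted_lr)
    else:
        # Block 719: shift the touched location by the negative of
        # adjustment factor 48, leaving the region where it was drawn.
        shifted = (touched[0] - ax, touched[1] - ay)
        return point_in_region(shifted, upper_left, lower_right)
```

Note that for a single button the two modes are equivalent: a touch at (12, 5) against a region drawn from (0, 10) to (10, 0) with an adjustment of (5, 0) registers as a press in either mode.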
Referring again to FIG. 7, if block 610 returned an indication that a button was touched, then block 615 is answered affirmatively, so the appropriate processing is performed for the touched button by the application that the user is accessing in block 620. If block 610 returned an indication that no button was touched, then the user has touched a location on the touchscreen that is not associated with any button, so block 615 is answered negatively and block 625 displays an error message to the user. In either event, block 601 displays a new screen of buttons on the touchscreen.
While this invention has been described with respect to the preferred and alternative embodiments, it will be understood by those skilled in the art that various changes in detail may be made therein without departing from the spirit, scope, and teaching of the invention. For example, the type of applications that use a touchscreen may change from what is known today. In addition, touchscreen technology may become widely employed in consumer applications such as operator panels for consumer electronics, appliances, and automobiles. Accordingly, the herein disclosed invention is to be limited only as specified in the following claims.
Claims (6)
1. A touchscreen apparatus for receiving instructions from a user, comprising:
means for displaying a first button on the touchscreen, the first button having a first sensing region, wherein the first button is pressed when the user touches within the first sensing region;
means for receiving an indication that the user touched the touchscreen at a first location;
means for determining that the first location is within the first sensing region of the first button;
means for calculating a first factor based on the distance of the first location from the center of the first sensing region;
means for displaying a second button on the touchscreen, the second button having a second sensing region, wherein the second button is pressed when the user touches within the second sensing region;
means for receiving an indication that the user touched the touchscreen at a second location;
means for determining that the second location is within the sensing region of the second button;
means for calculating a second factor based on the distance of the second location from the center of the second sensing region, wherein the second factor is different from the first factor;
means for computing an adjustment factor by taking the average of the first factor and the second factor; and
means for displaying a third button on the touchscreen, the third button having a third sensing region at a location modified by the adjustment factor, wherein the third button is pressed when the user touches within the third sensing region.
2. The touchscreen apparatus of claim 1, further comprising:
means for receiving an indication that the user touched the touchscreen at a third location;
means for determining that the third location is within the sensing region of the third button; and
means for performing processing in response to the user pressing the third button.
3. A touchscreen apparatus for receiving instructions from a user, comprising:
means for displaying a first button on the touchscreen, the first button having a first sensing region, wherein the first button is pressed when the user touches within the first sensing region;
means for receiving an indication that the user touched the touchscreen at a first location;
means for determining that the first location is within the sensing region of the first button;
means for calculating a first factor based on the distance of the first location from the center of the first sensing region;
means for displaying a second button on the touchscreen, the second button having a second sensing region, wherein the second button is pressed when the user touches within the second sensing region;
means for receiving an indication that the user touched the touchscreen at a second location;
means for determining that the second location is within the sensing region of the second button;
means for calculating a second factor based on the distance of the second location from the center of the second sensing region, wherein the second factor is different from the first factor;
means for computing an adjustment factor by taking the average of the first factor and the second factor;
means for displaying a third button having a third sensing region, wherein the third button is pressed when the user touches within the third sensing region;
means for receiving an indication that the user touched the touchscreen at a third location;
means for remapping the third location into a remapped location based on the average of the first factor and the second factor;
means for determining that the remapped location is within the third sensing region; and
means for performing processing in response to the user pressing the third button.
4. A method of dynamically adjusting a sensing region on a touchscreen, comprising the machine executed steps of:
displaying a first button on a touchscreen, the first button having a first sensing region, wherein the first button is pressed when the user touches within the first sensing region;
receiving an indication that a user touched the touchscreen at a first location;
determining that the first location is within the first sensing region of the first button;
calculating a first factor based on the distance of the first location from the center of the first sensing region;
displaying a second button on the touchscreen, the second button having a second sensing region, wherein the second button is pressed when the user touches within the second sensing region;
receiving an indication that the user touched the touchscreen at a second location;
determining that the second location is within the sensing region of the second button;
calculating a second factor based on the distance of the second location from the center of the second sensing region, wherein the second factor is different from the first factor;
computing an adjustment factor by taking the average of the first factor and the second factor; and
displaying a third button on the touchscreen, the third button having a third sensing region at a location modified by the adjustment factor, wherein the third button is pressed when the user touches within the third sensing region.
5. The method of claim 4, further comprising:
receiving an indication that the user touched the touchscreen at a third location;
determining that the third location is within the sensing region of the third button; and
performing processing in response to the user pressing the third button.
6. A method of dynamically adjusting a sensing region on a touchscreen, comprising the machine executed steps of:
displaying a first button on the touchscreen, the first button having a first sensing region, wherein the first button is pressed when the user touches within the first sensing region;
receiving an indication that a user touched the touchscreen at a first location;
determining that the first location is within the sensing region of the first button;
calculating a first factor based on the distance of the first location from the center of the first sensing region;
displaying a second button on the touchscreen, the second button having a second sensing region, wherein the second button is pressed when the user touches within the second sensing region;
receiving an indication that the user touched the touchscreen at a second location;
determining that the second location is within the sensing region of the second button;
calculating a second factor based on the distance of the second location from the center of the second sensing region, wherein the second factor is different from the first factor;
computing an adjustment factor by taking the average of the first factor and the second factor;
displaying a third button on the touchscreen, the third button having a third sensing region, wherein the third button is pressed when the user touches within the third sensing region;
receiving an indication that the user touched the touchscreen at a third location;
remapping the third location into a remapped location based on the adjustment factor;
determining that the remapped location is within the third sensing region; and
performing processing in response to the user pressing the third button.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US08/345,266 US5565894A (en) | 1993-04-01 | 1994-11-25 | Dynamic touchscreen button adjustment mechanism |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US4204893A | 1993-04-01 | 1993-04-01 | |
US08/345,266 US5565894A (en) | 1993-04-01 | 1994-11-25 | Dynamic touchscreen button adjustment mechanism |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US4204893A Continuation | 1993-04-01 | 1993-04-01 |
Publications (1)
Publication Number | Publication Date |
---|---|
US5565894A true US5565894A (en) | 1996-10-15 |
Family
ID=21919787
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US08/345,266 Expired - Lifetime US5565894A (en) | 1993-04-01 | 1994-11-25 | Dynamic touchscreen button adjustment mechanism |
Country Status (8)
Country | Link |
---|---|
US (1) | US5565894A (en) |
EP (1) | EP0618528B1 (en) |
JP (1) | JP3292267B2 (en) |
KR (1) | KR970006397B1 (en) |
CN (1) | CN1054450C (en) |
AT (1) | ATE188302T1 (en) |
DE (1) | DE69422323T2 (en) |
TW (1) | TW249282B (en) |
Cited By (74)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5877751A (en) * | 1994-09-22 | 1999-03-02 | Aisin Aw Co., Ltd. | Touch display type information input system |
US5896126A (en) * | 1996-08-29 | 1999-04-20 | International Business Machines Corporation | Selection device for touchscreen systems |
US6072482A (en) * | 1997-09-05 | 2000-06-06 | Ericsson Inc. | Mouse mode manager and voice activation for navigating and executing computer commands |
US6181328B1 (en) * | 1998-03-02 | 2001-01-30 | International Business Machines Corporation | Method and system for calibrating touch screen sensitivities according to particular physical characteristics associated with a user |
US20010032057A1 (en) * | 1999-12-23 | 2001-10-18 | Smith Randall G. | Initial calibration of a location sensing whiteboard to a projected display |
US6411285B1 (en) * | 1999-03-17 | 2002-06-25 | Sharp Kabushiki Kaisha | Touch-panel input type electronic device |
US6456952B1 (en) | 2000-03-29 | 2002-09-24 | NCR Corporation | System and method for touch screen environmental calibration |
US6563492B1 (en) * | 1999-03-03 | 2003-05-13 | Yazaki Corporation | Multi-function switch unit and function indicating method of the same |
US20030222858A1 (en) * | 2002-05-28 | 2003-12-04 | Pioneer Corporation | Touch panel device |
US6751487B1 (en) | 2000-02-08 | 2004-06-15 | Ericsson, Inc. | Turn around cellular telephone |
US20040178994A1 (en) * | 2003-03-10 | 2004-09-16 | International Business Machines Corporation | Dynamic resizing of clickable areas of touch screen applications |
US20050017959A1 (en) * | 2002-06-28 | 2005-01-27 | Microsoft Corporation | Method and system for detecting multiple touches on a touch-sensitive screen |
US20050052341A1 (en) * | 2003-09-09 | 2005-03-10 | Michael Henriksson | Multi-layered displays providing different focal lengths with optically shiftable viewing formats and terminals incorporating the same |
US20050225538A1 (en) * | 2002-07-04 | 2005-10-13 | Wilhelmus Verhaegh | Automatically adaptable virtual keyboard |
US20050253818A1 (en) * | 2002-06-25 | 2005-11-17 | Esa Nettamo | Method of interpreting control command, and portable electronic device |
US20070152978A1 (en) * | 2006-01-05 | 2007-07-05 | Kenneth Kocienda | Keyboards for Portable Electronic Devices |
US20070152980A1 (en) * | 2006-01-05 | 2007-07-05 | Kenneth Kocienda | Touch Screen Keyboards for Portable Electronic Devices |
US20070220165A1 (en) * | 2006-03-16 | 2007-09-20 | Seale Moorer | Internet protocol based media streaming solution |
US20070217446A1 (en) * | 2006-03-16 | 2007-09-20 | Seale Moorer | Network based digital access point device |
US20070241945A1 (en) * | 2006-03-16 | 2007-10-18 | Seale Moorer | User control interface for convergence and automation system |
US20070260713A1 (en) * | 2006-03-16 | 2007-11-08 | Seale Moorer | Automation control system having a configuration tool |
US20080100586A1 (en) * | 2006-10-26 | 2008-05-01 | Deere & Company | Method and system for calibrating a touch screen |
US20080163119A1 (en) * | 2006-12-28 | 2008-07-03 | Samsung Electronics Co., Ltd. | Method for providing menu and multimedia device using the same |
US20080231604A1 (en) * | 2007-03-22 | 2008-09-25 | Cypress Semiconductor Corp. | Method for extending the life of touch screens |
US20080268948A1 (en) * | 2006-11-27 | 2008-10-30 | Aristocrat Technologies Australia Pty Ltd | Gaming machine with touch screen |
US20090201246A1 (en) * | 2008-02-11 | 2009-08-13 | Apple Inc. | Motion Compensation for Screens |
US20090207148A1 (en) * | 2004-06-03 | 2009-08-20 | Sony Corporation | Portable electronic device, method of controlling input operation, and program for controlling input operation |
US20090301581A1 (en) * | 2008-04-23 | 2009-12-10 | Macneal James R | Pressurized gas containing system |
US20100188342A1 (en) * | 2009-01-26 | 2010-07-29 | Manufacturing Resources International, Inc. | Method and System for Positioning a Graphical User Interface |
US20100271312A1 (en) * | 2009-04-22 | 2010-10-28 | Rachid Alameh | Menu Configuration System and Method for Display on an Electronic Device |
US20100271331A1 (en) * | 2009-04-22 | 2010-10-28 | Rachid Alameh | Touch-Screen and Method for an Electronic Device |
US20110074697A1 (en) * | 2009-09-25 | 2011-03-31 | Peter William Rapp | Device, Method, and Graphical User Interface for Manipulation of User Interface Objects with Activation Regions |
US20110074698A1 (en) * | 2009-09-25 | 2011-03-31 | Peter William Rapp | Device, Method, and Graphical User Interface for Manipulation of User Interface Objects with Activation Regions |
US20110082603A1 (en) * | 2008-06-20 | 2011-04-07 | Bayerische Motoren Werke Aktiengesellschaft | Process for Controlling Functions in a Motor Vehicle Having Neighboring Operating Elements |
US20110115711A1 (en) * | 2009-11-19 | 2011-05-19 | Suwinto Gunawan | Method and Apparatus for Replicating Physical Key Function with Soft Keys in an Electronic Device |
US20110141108A1 (en) * | 2008-08-27 | 2011-06-16 | Fujifilm Corporation | Device and method for setting instructed position during three-dimensional display, as well as program |
US7966083B2 (en) | 2006-03-16 | 2011-06-21 | Exceptional Innovation Llc | Automation control system having device scripting |
US20110157090A1 (en) * | 2009-12-31 | 2011-06-30 | International Business Machines Corporation | Morphing touchscreen keyboard interface |
US20110163973A1 (en) * | 2010-01-06 | 2011-07-07 | Bas Ording | Device, Method, and Graphical User Interface for Accessing Alternative Keys |
US20110167382A1 (en) * | 2010-01-06 | 2011-07-07 | Van Os Marcel | Device, Method, and Graphical User Interface for Manipulating Selectable User Interface Objects |
AU2009202481B2 (en) * | 1999-10-27 | 2011-09-15 | Keyless Systems Ltd. | Integrated keypad system |
US20120027267A1 (en) * | 2010-07-29 | 2012-02-02 | Kim Jonghwan | Mobile terminal and method of controlling operation of the mobile terminal |
US8271881B2 (en) * | 2006-04-20 | 2012-09-18 | Exceptional Innovation, Llc | Touch screen for convergence and automation system |
US8438500B2 (en) | 2009-09-25 | 2013-05-07 | Apple Inc. | Device, method, and graphical user interface for manipulation of user interface objects with activation regions |
US8456445B2 (en) | 2010-04-30 | 2013-06-04 | Honeywell International Inc. | Touch screen and method for adjusting screen objects |
US20130181924A1 (en) * | 2012-01-17 | 2013-07-18 | Samsung Electronics Co., Ltd. | Apparatus and method for adjusting a touch recognition area in a touch interface |
US20140184511A1 (en) * | 2012-12-28 | 2014-07-03 | Ismo Puustinen | Accurate data entry into a mobile computing device |
TWI461975B (en) * | 2011-01-12 | 2014-11-21 | Wistron Corp | Electronic device and method for correcting touch position |
US8970486B2 (en) | 2009-05-22 | 2015-03-03 | Google Technology Holdings LLC | Mobile device with user interaction capability and method of operating same |
US9047002B2 (en) | 2013-03-15 | 2015-06-02 | Elwha Llc | Systems and methods for parallax compensation |
US9086802B2 (en) | 2008-01-09 | 2015-07-21 | Apple Inc. | Method, device, and graphical user interface providing word recommendations for text input |
US9189079B2 (en) | 2007-01-05 | 2015-11-17 | Apple Inc. | Method, system, and graphical user interface for providing word recommendations |
US20150370415A1 (en) * | 2014-06-20 | 2015-12-24 | Funai Electric Co., Ltd. | Image display device |
US9239677B2 (en) | 2004-05-06 | 2016-01-19 | Apple Inc. | Operation of a computer with touch screen interface |
US20160162276A1 (en) * | 2014-12-04 | 2016-06-09 | Google Technology Holdings LLC | System and Methods for Touch Pattern Detection and User Interface Adaptation |
US9389728B2 (en) | 2013-03-15 | 2016-07-12 | Elwha Llc | Systems and methods for parallax compensation |
US9395902B2 (en) | 2013-03-15 | 2016-07-19 | Elwha Llc | Systems and methods for parallax compensation |
US20160220179A1 (en) * | 2013-09-13 | 2016-08-04 | Centre Hospitalier Universitaire De Poitiers | Device and method for evaluating and monitoring physical pain |
US9710150B2 (en) | 2014-01-07 | 2017-07-18 | Qualcomm Incorporated | System and method for context-based touch processing |
US9791959B2 (en) | 2014-01-07 | 2017-10-17 | Qualcomm Incorporated | System and method for host-augmented touch processing |
US10025501B2 (en) | 2008-06-27 | 2018-07-17 | Apple Inc. | Touch screen device, method, and graphical user interface for inserting a character from an alternate keyboard |
US10269156B2 (en) | 2015-06-05 | 2019-04-23 | Manufacturing Resources International, Inc. | System and method for blending order confirmation over menu board background |
US10313037B2 (en) | 2016-05-31 | 2019-06-04 | Manufacturing Resources International, Inc. | Electronic display remote image verification system and method |
US10319271B2 (en) | 2016-03-22 | 2019-06-11 | Manufacturing Resources International, Inc. | Cyclic redundancy check for electronic displays |
US10319408B2 (en) | 2015-03-30 | 2019-06-11 | Manufacturing Resources International, Inc. | Monolithic display with separately controllable sections |
US10318043B2 (en) | 2016-03-24 | 2019-06-11 | Gm Global Technology Operations Llc. | Dynamic adjustment of touch sensitive area in a display assembly |
US10409483B2 (en) | 2015-03-07 | 2019-09-10 | Apple Inc. | Activity based thresholds for providing haptic feedback |
US10510304B2 (en) | 2016-08-10 | 2019-12-17 | Manufacturing Resources International, Inc. | Dynamic dimming LED backlight for LCD array |
CN111324087A (en) * | 2018-12-14 | 2020-06-23 | 发那科株式会社 | Display device, machine tool, and abnormality determination method |
US10860199B2 (en) | 2016-09-23 | 2020-12-08 | Apple Inc. | Dynamically adjusting touch hysteresis based on contextual data |
US10922736B2 (en) | 2015-05-15 | 2021-02-16 | Manufacturing Resources International, Inc. | Smart electronic display for restaurants |
US11016656B1 (en) | 2020-02-14 | 2021-05-25 | International Business Machines Corporation | Fault recognition self-learning graphical user interface |
US11656885B1 (en) | 2022-02-22 | 2023-05-23 | International Business Machines Corporation | Interface interaction system |
US11895362B2 (en) | 2021-10-29 | 2024-02-06 | Manufacturing Resources International, Inc. | Proof of play for images displayed at electronic displays |
Families Citing this family (29)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6256021B1 (en) | 1998-09-15 | 2001-07-03 | Ericsson Inc. | Apparatus and method of configuring target areas within a touchable item of a touchscreen |
AU3181800A (en) * | 1999-03-23 | 2000-10-09 | British Telecommunications Public Limited Company | Computer input device |
US9164654B2 (en) | 2002-12-10 | 2015-10-20 | Neonode Inc. | User interface for mobile computer unit |
US8095879B2 (en) | 2002-12-10 | 2012-01-10 | Neonode Inc. | User interface for mobile handheld computer unit |
US8674966B2 (en) | 2001-11-02 | 2014-03-18 | Neonode Inc. | ASIC controller for light-based touch screen |
SE0103835L (en) | 2001-11-02 | 2003-05-03 | Neonode Ab | Touch screen realized by display unit with light transmitting and light receiving units |
US9778794B2 (en) | 2001-11-02 | 2017-10-03 | Neonode Inc. | Light-based touch screen |
US9052777B2 (en) | 2001-11-02 | 2015-06-09 | Neonode Inc. | Optical elements with alternating reflective lens facets |
US8416217B1 (en) | 2002-11-04 | 2013-04-09 | Neonode Inc. | Light-based finger gesture user interface |
JP2006061310A (en) * | 2004-08-25 | 2006-03-09 | Pentax Corp | Touch panel and processor of endoscope apparatus |
WO2006066435A1 (en) * | 2004-12-20 | 2006-06-29 | Speedscript Ag | Method for the dynamic calibration of touch screens |
KR101315048B1 (en) * | 2005-01-14 | 2013-10-10 | 코닌클리케 필립스 일렉트로닉스 엔.브이. | Moving Object Presented by a Touch Input Display Device |
DE102006043208A1 (en) * | 2006-09-11 | 2008-03-27 | Siemens Ag | Touchpad or screen and control element for touchpad or screen |
TWI450137B (en) * | 2006-12-11 | 2014-08-21 | Elo Touch Solutions Inc | Method and apparatus for calibrating targets on a touchscreen |
JP4450038B2 (en) | 2007-09-12 | 2010-04-14 | 株式会社カシオ日立モバイルコミュニケーションズ | Information display device and program |
EP2101250B1 (en) * | 2008-03-14 | 2014-06-11 | BlackBerry Limited | Character selection on a device using offset contact-zone |
JP5674079B2 (en) * | 2009-01-28 | 2015-02-25 | Necプラットフォームズ株式会社 | Touch panel coordinate automatic correction device and touch panel coordinate automatic correction method used therefor |
US8775023B2 (en) | 2009-02-15 | 2014-07-08 | Neonode Inc. | Light-based touch controls on a steering wheel and dashboard |
EP2437213A1 (en) | 2009-06-16 | 2012-04-04 | Intel Corporation | Camera applications in a handheld device |
CN101847079A (en) * | 2010-04-30 | 2010-09-29 | 中兴通讯股份有限公司 | Method and device for regulating control button layout |
EP2407865A1 (en) * | 2010-07-16 | 2012-01-18 | Gigaset Communications GmbH | Adaptive calibration of sensor monitors for optimising interface quality |
KR101160681B1 (en) | 2011-10-19 | 2012-06-28 | 배경덕 | Method, mobile communication terminal and computer-readable recording medium for operating specific function when activaing of mobile communication terminal |
US12032817B2 (en) | 2012-11-27 | 2024-07-09 | Neonode Inc. | Vehicle user interface |
US9092093B2 (en) | 2012-11-27 | 2015-07-28 | Neonode Inc. | Steering wheel user interface |
US9690417B2 (en) * | 2014-05-21 | 2017-06-27 | Apple Inc. | Glove touch detection |
CN108551340B (en) * | 2018-05-09 | 2021-10-22 | 珠海格力电器股份有限公司 | Anti-interference processing method for touch key and electric appliance |
CN113165515B (en) | 2018-11-28 | 2021-11-02 | 内奥诺德公司 | Driver User Interface Sensors |
US10963095B1 (en) | 2019-09-27 | 2021-03-30 | Apple Inc. | Glove touch detection |
KR20230074269A (en) | 2020-09-30 | 2023-05-26 | 네오노드, 인크. | optical touch sensor |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4710758A (en) * | 1985-04-26 | 1987-12-01 | Westinghouse Electric Corp. | Automatic touch screen calibration method |
US4764885A (en) * | 1986-04-25 | 1988-08-16 | International Business Machines Corporation | Minimum parallax stylus detection subsystem for a display device |
US4821030A (en) * | 1986-12-19 | 1989-04-11 | Tektronix, Inc. | Touchscreen feedback system |
EP0324306A2 (en) * | 1987-11-16 | 1989-07-19 | International Business Machines Corporation | Parallax error avoidance for a touch screen system |
US4903012A (en) * | 1987-01-20 | 1990-02-20 | Alps Electric Co., Ltd. | Coordinate system input device providing registration calibration and a mouse function |
US5003505A (en) * | 1984-05-07 | 1991-03-26 | Hewlett-Packard Company | Touchscreen/keyboard scanner |
US5025411A (en) * | 1986-12-08 | 1991-06-18 | Tektronix, Inc. | Method which provides debounced inputs from a touch screen panel by waiting until each x and y coordinates stop altering |
US5053758A (en) * | 1988-02-01 | 1991-10-01 | Sperry Marine Inc. | Touchscreen control panel with sliding touch control |
US5189732A (en) * | 1987-11-18 | 1993-02-23 | Hitachi, Ltd. | Touch panel input apparatus |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS63197211A (en) * | 1987-02-12 | 1988-08-16 | Fujitsu Ltd | Display device with touch panel |
EP0326751B1 (en) * | 1988-02-01 | 1994-08-31 | Sperry Marine Inc. | Touchscreen control panel with sliding touch control |
JPH04191920A (en) * | 1990-11-27 | 1992-07-10 | Oki Electric Ind Co Ltd | Touch position correcting method in touch panel device |
1994
- 1994-02-22 EP EP94480019A patent/EP0618528B1/en not_active Expired - Lifetime
- 1994-02-22 AT AT94480019T patent/ATE188302T1/en not_active IP Right Cessation
- 1994-02-22 DE DE69422323T patent/DE69422323T2/en not_active Expired - Lifetime
- 1994-03-22 JP JP4964794A patent/JP3292267B2/en not_active Expired - Lifetime
- 1994-03-30 CN CN94103449A patent/CN1054450C/en not_active Expired - Lifetime
- 1994-03-30 KR KR1019940006482A patent/KR970006397B1/en not_active IP Right Cessation
- 1994-04-11 TW TW083103146A patent/TW249282B/zh not_active IP Right Cessation
- 1994-11-25 US US08/345,266 patent/US5565894A/en not_active Expired - Lifetime
Patent Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5003505A (en) * | 1984-05-07 | 1991-03-26 | Hewlett-Packard Company | Touchscreen/keyboard scanner |
US4710758A (en) * | 1985-04-26 | 1987-12-01 | Westinghouse Electric Corp. | Automatic touch screen calibration method |
US4764885A (en) * | 1986-04-25 | 1988-08-16 | International Business Machines Corporation | Minimum parallax stylus detection subsystem for a display device |
US5025411A (en) * | 1986-12-08 | 1991-06-18 | Tektronix, Inc. | Method which provides debounced inputs from a touch screen panel by waiting until each x and y coordinates stop altering |
US4821030A (en) * | 1986-12-19 | 1989-04-11 | Tektronix, Inc. | Touchscreen feedback system |
US4903012A (en) * | 1987-01-20 | 1990-02-20 | Alps Electric Co., Ltd. | Coordinate system input device providing registration calibration and a mouse function |
EP0324306A2 (en) * | 1987-11-16 | 1989-07-19 | International Business Machines Corporation | Parallax error avoidance for a touch screen system |
USH716H (en) * | 1987-11-16 | 1989-12-05 | | Parallax induced pointing error avoidance method and means for systems using touch screen overlays |
US5189732A (en) * | 1987-11-18 | 1993-02-23 | Hitachi, Ltd. | Touch panel input apparatus |
US5053758A (en) * | 1988-02-01 | 1991-10-01 | Sperry Marine Inc. | Touchscreen control panel with sliding touch control |
Non-Patent Citations (2)
Title |
---|
IBM Technical Disclosure Bulletin, "Algorithm for Decreasing the Error Rate of Data Entered on a Touch-Sensitive Terminal", vol. 33, No. 10A, Mar. 1991, pp. 223-227. |
IBM Technical Disclosure Bulletin, "Method to Improve the Usability of Touch-Screen Panels for Computer Applications", vol. 32, No. 9B, Feb. 1990, pp. 250-253. |
Cited By (118)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5877751A (en) * | 1994-09-22 | 1999-03-02 | Aisin Aw Co., Ltd. | Touch display type information input system |
US5896126A (en) * | 1996-08-29 | 1999-04-20 | International Business Machines Corporation | Selection device for touchscreen systems |
US6072482A (en) * | 1997-09-05 | 2000-06-06 | Ericsson Inc. | Mouse mode manager and voice activation for navigating and executing computer commands |
MY119574A (en) * | 1997-09-05 | 2005-06-30 | Ericsson Inc | Mouse mode manager and voice activation for navigating and executing computer commands |
US6181328B1 (en) * | 1998-03-02 | 2001-01-30 | International Business Machines Corporation | Method and system for calibrating touch screen sensitivities according to particular physical characteristics associated with a user |
US6563492B1 (en) * | 1999-03-03 | 2003-05-13 | Yazaki Corporation | Multi-function switch unit and function indicating method of the same |
US6411285B1 (en) * | 1999-03-17 | 2002-06-25 | Sharp Kabushiki Kaisha | Touch-panel input type electronic device |
AU2009202481B2 (en) * | 1999-10-27 | 2011-09-15 | Keyless Systems Ltd. | Integrated keypad system |
US20010032057A1 (en) * | 1999-12-23 | 2001-10-18 | Smith Randall G. | Initial calibration of a location sensing whiteboard to a projected display |
US6751487B1 (en) | 2000-02-08 | 2004-06-15 | Ericsson, Inc. | Turn around cellular telephone |
US6456952B1 (en) | 2000-03-29 | 2002-09-24 | NCR Corporation | System and method for touch screen environmental calibration |
US7154483B2 (en) * | 2002-05-28 | 2006-12-26 | Pioneer Corporation | Touch panel device |
US20030222858A1 (en) * | 2002-05-28 | 2003-12-04 | Pioneer Corporation | Touch panel device |
US20050253818A1 (en) * | 2002-06-25 | 2005-11-17 | Esa Nettamo | Method of interpreting control command, and portable electronic device |
US20050017959A1 (en) * | 2002-06-28 | 2005-01-27 | Microsoft Corporation | Method and system for detecting multiple touches on a touch-sensitive screen |
US20050052432A1 (en) * | 2002-06-28 | 2005-03-10 | Microsoft Corporation | Method and system for detecting multiple touches on a touch-sensitive screen |
US7023427B2 (en) | 2002-06-28 | 2006-04-04 | Microsoft Corporation | Method and system for detecting multiple touches on a touch-sensitive screen |
US7053887B2 (en) | 2002-06-28 | 2006-05-30 | Microsoft Corporation | Method and system for detecting multiple touches on a touch-sensitive screen |
US7295191B2 (en) | 2002-06-28 | 2007-11-13 | Microsoft Corporation | Method and system for detecting multiple touches on a touch-sensitive screen |
US20050225538A1 (en) * | 2002-07-04 | 2005-10-13 | Wilhelmus Verhaegh | Automatically adaptable virtual keyboard |
US8576173B2 (en) * | 2002-07-04 | 2013-11-05 | Koninklijke Philips N.V. | Automatically adaptable virtual keyboard |
US20040178994A1 (en) * | 2003-03-10 | 2004-09-16 | International Business Machines Corporation | Dynamic resizing of clickable areas of touch screen applications |
US7103852B2 (en) * | 2003-03-10 | 2006-09-05 | International Business Machines Corporation | Dynamic resizing of clickable areas of touch screen applications |
US20070134645A1 (en) * | 2003-09-09 | 2007-06-14 | Sony Ericsson Mobile Communications Ab | Multi-layered displays providing different focal lengths with optically shiftable viewing formats and terminals incorporating the same |
US20050052341A1 (en) * | 2003-09-09 | 2005-03-10 | Michael Henriksson | Multi-layered displays providing different focal lengths with optically shiftable viewing formats and terminals incorporating the same |
US7205959B2 (en) | 2003-09-09 | 2007-04-17 | Sony Ericsson Mobile Communications Ab | Multi-layered displays providing different focal lengths with optically shiftable viewing formats and terminals incorporating the same |
US10338789B2 (en) | 2004-05-06 | 2019-07-02 | Apple Inc. | Operation of a computer with touch screen interface |
US9239677B2 (en) | 2004-05-06 | 2016-01-19 | Apple Inc. | Operation of a computer with touch screen interface |
US20090207148A1 (en) * | 2004-06-03 | 2009-08-20 | Sony Corporation | Portable electronic device, method of controlling input operation, and program for controlling input operation |
US10860136B2 (en) * | 2004-06-03 | 2020-12-08 | Sony Corporation | Portable electronic device and method of controlling input operation |
US20070152978A1 (en) * | 2006-01-05 | 2007-07-05 | Kenneth Kocienda | Keyboards for Portable Electronic Devices |
US20070152980A1 (en) * | 2006-01-05 | 2007-07-05 | Kenneth Kocienda | Touch Screen Keyboards for Portable Electronic Devices |
US7694231B2 (en) | 2006-01-05 | 2010-04-06 | Apple Inc. | Keyboards for portable electronic devices |
US20070241945A1 (en) * | 2006-03-16 | 2007-10-18 | Seale Moorer | User control interface for convergence and automation system |
US8725845B2 (en) | 2006-03-16 | 2014-05-13 | Exceptional Innovation Llc | Automation control system having a configuration tool |
US20070260713A1 (en) * | 2006-03-16 | 2007-11-08 | Seale Moorer | Automation control system having a configuration tool |
US20070217446A1 (en) * | 2006-03-16 | 2007-09-20 | Seale Moorer | Network based digital access point device |
US8209398B2 (en) | 2006-03-16 | 2012-06-26 | Exceptional Innovation Llc | Internet protocol based media streaming solution |
US20070220165A1 (en) * | 2006-03-16 | 2007-09-20 | Seale Moorer | Internet protocol based media streaming solution |
US8155142B2 (en) | 2006-03-16 | 2012-04-10 | Exceptional Innovation Llc | Network based digital access point device |
US7966083B2 (en) | 2006-03-16 | 2011-06-21 | Exceptional Innovation Llc | Automation control system having device scripting |
US8001219B2 (en) | 2006-03-16 | 2011-08-16 | Exceptional Innovation, Llc | User control interface for convergence and automation system |
US8271881B2 (en) * | 2006-04-20 | 2012-09-18 | Exceptional Innovation, Llc | Touch screen for convergence and automation system |
US20080100586A1 (en) * | 2006-10-26 | 2008-05-01 | Deere & Company | Method and system for calibrating a touch screen |
US20080268948A1 (en) * | 2006-11-27 | 2008-10-30 | Aristocrat Technologies Australia Pty Ltd | Gaming machine with touch screen |
US20080163119A1 (en) * | 2006-12-28 | 2008-07-03 | Samsung Electronics Co., Ltd. | Method for providing menu and multimedia device using the same |
US11416141B2 (en) | 2007-01-05 | 2022-08-16 | Apple Inc. | Method, system, and graphical user interface for providing word recommendations |
US11112968B2 (en) | 2007-01-05 | 2021-09-07 | Apple Inc. | Method, system, and graphical user interface for providing word recommendations |
US9189079B2 (en) | 2007-01-05 | 2015-11-17 | Apple Inc. | Method, system, and graphical user interface for providing word recommendations |
US9244536B2 (en) | 2007-01-05 | 2016-01-26 | Apple Inc. | Method, system, and graphical user interface for providing word recommendations |
US10592100B2 (en) | 2007-01-05 | 2020-03-17 | Apple Inc. | Method, system, and graphical user interface for providing word recommendations |
WO2008118652A1 (en) * | 2007-03-22 | 2008-10-02 | Cypress Semiconductor Corp | Method for extending the life of touch screens |
US20080231604A1 (en) * | 2007-03-22 | 2008-09-25 | Cypress Semiconductor Corp. | Method for extending the life of touch screens |
US11079933B2 (en) | 2008-01-09 | 2021-08-03 | Apple Inc. | Method, device, and graphical user interface providing word recommendations for text input |
US9086802B2 (en) | 2008-01-09 | 2015-07-21 | Apple Inc. | Method, device, and graphical user interface providing word recommendations for text input |
US11474695B2 (en) | 2008-01-09 | 2022-10-18 | Apple Inc. | Method, device, and graphical user interface providing word recommendations for text input |
US8681093B2 (en) * | 2008-02-11 | 2014-03-25 | Apple Inc. | Motion compensation for screens |
US20090201246A1 (en) * | 2008-02-11 | 2009-08-13 | Apple Inc. | Motion Compensation for Screens |
US20090301581A1 (en) * | 2008-04-23 | 2009-12-10 | Macneal James R | Pressurized gas containing system |
US8788112B2 (en) * | 2008-06-20 | 2014-07-22 | Bayerische Motoren Werke Aktiengesellschaft | Process for controlling functions in a motor vehicle having neighboring operating elements |
US20110082603A1 (en) * | 2008-06-20 | 2011-04-07 | Bayerische Motoren Werke Aktiengesellschaft | Process for Controlling Functions in a Motor Vehicle Having Neighboring Operating Elements |
US10025501B2 (en) | 2008-06-27 | 2018-07-17 | Apple Inc. | Touch screen device, method, and graphical user interface for inserting a character from an alternate keyboard |
US10430078B2 (en) | 2008-06-27 | 2019-10-01 | Apple Inc. | Touch screen device, and graphical user interface for inserting a character from an alternate keyboard |
US8482557B2 (en) | 2008-08-27 | 2013-07-09 | Fujifilm Corporation | Device and method for setting instructed position during three-dimensional display, as well as program |
US20110141108A1 (en) * | 2008-08-27 | 2011-06-16 | Fujifilm Corporation | Device and method for setting instructed position during three-dimensional display, as well as program |
WO2010085784A3 (en) * | 2009-01-26 | 2010-11-18 | Manufacturing Resources International, Inc. | Method and system for positioning a graphical user interface |
WO2010085784A2 (en) * | 2009-01-26 | 2010-07-29 | Manufacturing Resources International, Inc. | Method and system for positioning a graphical user interface |
US20100188342A1 (en) * | 2009-01-26 | 2010-07-29 | Manufacturing Resources International, Inc. | Method and System for Positioning a Graphical User Interface |
US20100271331A1 (en) * | 2009-04-22 | 2010-10-28 | Rachid Alameh | Touch-Screen and Method for an Electronic Device |
US20100271312A1 (en) * | 2009-04-22 | 2010-10-28 | Rachid Alameh | Menu Configuration System and Method for Display on an Electronic Device |
US8970486B2 (en) | 2009-05-22 | 2015-03-03 | Google Technology Holdings LLC | Mobile device with user interaction capability and method of operating same |
US8438500B2 (en) | 2009-09-25 | 2013-05-07 | Apple Inc. | Device, method, and graphical user interface for manipulation of user interface objects with activation regions |
US8416205B2 (en) | 2009-09-25 | 2013-04-09 | Apple Inc. | Device, method, and graphical user interface for manipulation of user interface objects with activation regions |
US20110074698A1 (en) * | 2009-09-25 | 2011-03-31 | Peter William Rapp | Device, Method, and Graphical User Interface for Manipulation of User Interface Objects with Activation Regions |
US20110074697A1 (en) * | 2009-09-25 | 2011-03-31 | Peter William Rapp | Device, Method, and Graphical User Interface for Manipulation of User Interface Objects with Activation Regions |
US8421762B2 (en) | 2009-09-25 | 2013-04-16 | Apple Inc. | Device, method, and graphical user interface for manipulation of user interface objects with activation regions |
US8665227B2 (en) | 2009-11-19 | 2014-03-04 | Motorola Mobility LLC | Method and apparatus for replicating physical key function with soft keys in an electronic device |
US20110115711A1 (en) * | 2009-11-19 | 2011-05-19 | Suwinto Gunawan | Method and Apparatus for Replicating Physical Key Function with Soft Keys in an Electronic Device |
US8493346B2 (en) | 2009-12-31 | 2013-07-23 | International Business Machines Corporation | Morphing touchscreen keyboard interface |
US20110157090A1 (en) * | 2009-12-31 | 2011-06-30 | International Business Machines Corporation | Morphing touchscreen keyboard interface |
US8806362B2 (en) | 2010-01-06 | 2014-08-12 | Apple Inc. | Device, method, and graphical user interface for accessing alternate keys |
US8793611B2 (en) | 2010-01-06 | 2014-07-29 | Apple Inc. | Device, method, and graphical user interface for manipulating selectable user interface objects |
US20110167382A1 (en) * | 2010-01-06 | 2011-07-07 | Van Os Marcel | Device, Method, and Graphical User Interface for Manipulating Selectable User Interface Objects |
US20110163973A1 (en) * | 2010-01-06 | 2011-07-07 | Bas Ording | Device, Method, and Graphical User Interface for Accessing Alternative Keys |
US8456445B2 (en) | 2010-04-30 | 2013-06-04 | Honeywell International Inc. | Touch screen and method for adjusting screen objects |
US20120027267A1 (en) * | 2010-07-29 | 2012-02-02 | Kim Jonghwan | Mobile terminal and method of controlling operation of the mobile terminal |
US8878822B2 (en) * | 2010-07-29 | 2014-11-04 | Lg Electronics Inc. | Mobile terminal and method of controlling operation of the mobile terminal |
TWI461975B (en) * | 2011-01-12 | 2014-11-21 | Wistron Corp | Electronic device and method for correcting touch position |
US20130181924A1 (en) * | 2012-01-17 | 2013-07-18 | Samsung Electronics Co., Ltd. | Apparatus and method for adjusting a touch recognition area in a touch interface |
US9703408B2 (en) * | 2012-01-17 | 2017-07-11 | Samsung Electronics Co., Ltd | Apparatus and method for adjusting a touch recognition area in a touch interface |
US20140184511A1 (en) * | 2012-12-28 | 2014-07-03 | Ismo Puustinen | Accurate data entry into a mobile computing device |
US9389728B2 (en) | 2013-03-15 | 2016-07-12 | Elwha Llc | Systems and methods for parallax compensation |
US9047002B2 (en) | 2013-03-15 | 2015-06-02 | Elwha Llc | Systems and methods for parallax compensation |
US9405402B2 (en) | 2013-03-15 | 2016-08-02 | Elwha Llc | Systems and methods for parallax compensation |
US9395902B2 (en) | 2013-03-15 | 2016-07-19 | Elwha Llc | Systems and methods for parallax compensation |
US9986946B2 (en) * | 2013-09-13 | 2018-06-05 | Centre Hospitalier Universitaire De Poitiers | Device and method for evaluating and monitoring physical pain |
US20160220179A1 (en) * | 2013-09-13 | 2016-08-04 | Centre Hospitalier Universitaire De Poitiers | Device and method for evaluating and monitoring physical pain |
US9791959B2 (en) | 2014-01-07 | 2017-10-17 | Qualcomm Incorporated | System and method for host-augmented touch processing |
US9710150B2 (en) | 2014-01-07 | 2017-07-18 | Qualcomm Incorporated | System and method for context-based touch processing |
US9841844B2 (en) * | 2014-06-20 | 2017-12-12 | Funai Electric Co., Ltd. | Image display device |
US20150370415A1 (en) * | 2014-06-20 | 2015-12-24 | Funai Electric Co., Ltd. | Image display device |
US20160162276A1 (en) * | 2014-12-04 | 2016-06-09 | Google Technology Holdings LLC | System and Methods for Touch Pattern Detection and User Interface Adaptation |
US10235150B2 (en) * | 2014-12-04 | 2019-03-19 | Google Technology Holdings LLC | System and methods for touch pattern detection and user interface adaptation |
US10409483B2 (en) | 2015-03-07 | 2019-09-10 | Apple Inc. | Activity based thresholds for providing haptic feedback |
US10319408B2 (en) | 2015-03-30 | 2019-06-11 | Manufacturing Resources International, Inc. | Monolithic display with separately controllable sections |
US10922736B2 (en) | 2015-05-15 | 2021-02-16 | Manufacturing Resources International, Inc. | Smart electronic display for restaurants |
US10269156B2 (en) | 2015-06-05 | 2019-04-23 | Manufacturing Resources International, Inc. | System and method for blending order confirmation over menu board background |
US10467610B2 (en) | 2015-06-05 | 2019-11-05 | Manufacturing Resources International, Inc. | System and method for a redundant multi-panel electronic display |
US10319271B2 (en) | 2016-03-22 | 2019-06-11 | Manufacturing Resources International, Inc. | Cyclic redundancy check for electronic displays |
US10318043B2 (en) | 2016-03-24 | 2019-06-11 | GM Global Technology Operations LLC | Dynamic adjustment of touch sensitive area in a display assembly |
US10756836B2 (en) | 2016-05-31 | 2020-08-25 | Manufacturing Resources International, Inc. | Electronic display remote image verification system and method |
US10313037B2 (en) | 2016-05-31 | 2019-06-04 | Manufacturing Resources International, Inc. | Electronic display remote image verification system and method |
US10510304B2 (en) | 2016-08-10 | 2019-12-17 | Manufacturing Resources International, Inc. | Dynamic dimming LED backlight for LCD array |
US10860199B2 (en) | 2016-09-23 | 2020-12-08 | Apple Inc. | Dynamically adjusting touch hysteresis based on contextual data |
CN111324087A (en) * | 2018-12-14 | 2020-06-23 | 发那科株式会社 | Display device, machine tool, and abnormality determination method |
US11016656B1 (en) | 2020-02-14 | 2021-05-25 | International Business Machines Corporation | Fault recognition self-learning graphical user interface |
US11895362B2 (en) | 2021-10-29 | 2024-02-06 | Manufacturing Resources International, Inc. | Proof of play for images displayed at electronic displays |
US11656885B1 (en) | 2022-02-22 | 2023-05-23 | International Business Machines Corporation | Interface interaction system |
Also Published As
Publication number | Publication date |
---|---|
DE69422323T2 (en) | 2000-06-21 |
ATE188302T1 (en) | 2000-01-15 |
EP0618528A1 (en) | 1994-10-05 |
CN1095498A (en) | 1994-11-23 |
JPH06309102A (en) | 1994-11-04 |
JP3292267B2 (en) | 2002-06-17 |
TW249282B (en) | 1995-06-11 |
EP0618528B1 (en) | 1999-12-29 |
KR970006397B1 (en) | 1997-04-28 |
DE69422323D1 (en) | 2000-02-03 |
CN1054450C (en) | 2000-07-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US5565894A (en) | Dynamic touchscreen button adjustment mechanism | |
US6411283B1 (en) | Computer touch screen adapted to facilitate selection of features at edge of screen | |
US6727892B1 (en) | Method of facilitating the selection of features at edges of computer touch screens | |
US5189732A (en) | Touch panel input apparatus | |
US6400376B1 (en) | Display control for hand-held data processing device | |
US7742042B2 (en) | Touch-sensitive device for scrolling a document on a display | |
US6597384B1 (en) | Automatic reorienting of screen orientation using touch sensitive system | |
US7158123B2 (en) | Secondary touch contextual sub-menu navigation for touch screen interface | |
EP1466242B1 (en) | Graphic user interface for data processing device | |
US4947156A (en) | Handwritten character input device | |
US6281878B1 (en) | Apparatus and method for inputing data | |
JP2999947B2 (en) | Method and apparatus for operating an object displayed on a display screen | |
EP1847915A2 (en) | Touch screen device and method of displaying and selecting menus thereof | |
EP0422577A2 (en) | Method and apparatus for displaying simulated keyboards on touch-sensitive displays | |
US20060077179A1 (en) | Keyboard having automatic adjusting key intervals and a method thereof | |
KR20070020510A (en) | Two-finger input method on a touch screen |
USH716H (en) | Parallax induced pointing error avoidance method and means for systems using touch screen overlays | |
US7893927B2 (en) | Touch screen device with guiding surface | |
US20060164387A1 (en) | Input apparatus and touch-reading character/symbol input method | |
JP2000293280A (en) | Information input device | |
JPH09292952A (en) | Input device | |
KR20040034915A (en) | Apparatus for implementing dynamic keyboard in pen computing system | |
JPH0314121A (en) | Programmable display device | |
JPH07129307A (en) | Input device | |
JPH08292839A (en) | Indication input system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
FPAY | Fee payment |
Year of fee payment: 4 |
|
FEPP | Fee payment procedure |
Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
FPAY | Fee payment |
Year of fee payment: 8 |
|
FPAY | Fee payment |
Year of fee payment: 12 |
|
AS | Assignment |
Owner name: INTERNATIONAL BUSINESS MACHINES CORPORATION, NEW YORK |

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:BATES, CARY LEE;WATTS, BYRON TIMOTHY;SIGNING DATES FROM 19930331 TO 19930401;REEL/FRAME:029897/0327 |