CN113220138A - Mobile equipment three-dimensional positioning method and equipment based on pressure sense
- Publication number
- CN113220138A CN113220138A CN202110366680.9A CN202110366680A CN113220138A CN 113220138 A CN113220138 A CN 113220138A CN 202110366680 A CN202110366680 A CN 202110366680A CN 113220138 A CN113220138 A CN 113220138A
- Authority
- CN
- China
- Prior art keywords
- pressure
- pressure level
- depth direction
- mobile device
- graphic
- Prior art date
- Legal status: Pending
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0346—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The invention provides a pressure-based three-dimensional positioning method and device for mobile devices, comprising the following steps: detecting the pressure applied to the mobile device at the position corresponding to a generated graphic; the graphic moves in a first direction along the depth axis when the pressure is at a first pressure level, keeps its current depth when the pressure is at a second pressure level, and moves in a second direction along the depth axis when the pressure is at a third pressure level. By sensing the normal touch force applied by the user, the invention supplies the depth information that two-dimensional touch input otherwise lacks for 3DOF manipulation of three-dimensional output, so that the user can complete three-dimensional transformation of an object on a mobile device with one hand.
Description
Technical Field
The invention belongs to the technical field of human-computer interaction, and particularly relates to a pressure-based three-dimensional positioning method and device for mobile devices.
Background
The statements in this section merely provide background information related to the present disclosure and may not necessarily constitute prior art.
3DOF object transformations on mobile touch screens typically require two hands (multiple fingers), which limits what the user can do in situations where only one hand is free. With the development of pressure-sensitive touch screens on mobile devices, the force with which a user presses the screen can be exploited, providing a new interaction channel for one-handed operation.
In human-computer interaction, the asymmetry between three-dimensional visual output and two-dimensional input has always been a major challenge for three-dimensional manipulation. Research shows that most users prefer to drag an object with one finger to perform a three-dimensional transformation. However, because the input lacks depth information, such a single-finger drag (pan) can typically perform only 2DOF transformations in the camera plane of the mobile device, and 2DOF transformations cannot satisfy most common three-dimensional tasks. Although a user can complete a 3DOF transformation with both hands, one hand usually has to hold the mobile device, so such two-handed gestures are difficult to perform on devices such as tablet computers and large-screen mobile phones.
Disclosure of Invention
The present invention aims to solve the above problems and provides a pressure-based three-dimensional positioning method and device for mobile devices. By sensing the normal touch force applied by the user, it supplies the depth information that 3DOF manipulation of three-dimensional output otherwise lacks, so that the user can perform three-dimensional transformation of an object on the mobile device with one hand; compared with previously proposed two-handed three-dimensional transformation operations, this makes one-handed operation possible.
According to some embodiments, the invention adopts the following technical scheme:
a pressure-based three-dimensional positioning method for a mobile device comprises the following steps:
detecting the pressure applied to the mobile device at the position corresponding to a generated graphic;
the graphic moves in a first direction along the depth axis when the pressure is at a first pressure level, keeps its current depth when the pressure is at a second pressure level, and moves in a second direction along the depth axis when the pressure is at a third pressure level.
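The mapping can be illustrated with a short sketch (Python is used here purely for illustration; the function names, the normalized pressure range and the default values of m, n, t and the step size are assumptions, not part of the claimed method):

```python
def classify_pressure(p, m=0.0, n=1.0, t=0.15):
    """Classify a pressure reading p within [m, n] into one of the three levels.

    m: minimum pressure of a sustained touch
    n: maximum pressure the screen can register
    t: margin separating the first and third levels from the middle level
    """
    if p <= m + t:
        return "first"      # light touch
    if p >= n - t:
        return "third"      # heavy touch
    return "second"         # normal touch


def update_depth(z, p, step=0.01):
    """Move the graphic along the depth (Z) axis according to the pressure level."""
    level = classify_pressure(p)
    if level == "first":
        return z - step     # first depth direction (assumed: toward the camera)
    if level == "third":
        return z + step     # second depth direction (assumed: away from the camera)
    return z                # second level: keep the current depth
```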
With this scheme, when a user operates the mobile device with a finger, a stylus or the like, the manipulated object can move left/right and up/down in the screen plane, and the touch pressure can additionally be converted into depth information so that the object moves forward and backward along the z axis; three-dimensional transformation of objects/graphics on the mobile device can therefore be completed with one hand.
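To make this concrete, a sketch combining the two screen-plane components of a drag with the pressure-derived depth component (the argument names are assumptions, and the update_depth helper from the earlier sketch is reused):

```python
def transform_3d(position, drag_dx, drag_dy, pressure, step=0.01):
    """Return the new (x, y, z) of the manipulated object for one frame.

    drag_dx, drag_dy: finger movement in the screen plane (2DOF pan)
    pressure:         normalized touch pressure, converted to depth movement
    """
    x, y, z = position
    x += drag_dx
    y += drag_dy
    z = update_depth(z, pressure, step)   # helper from the earlier sketch
    return (x, y, z)
```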
As an alternative embodiment, the first pressure level is [m, m + t], where m is the minimum pressure applied to the mobile device while the operating object maintains contact, and t is a set value.
As an alternative embodiment, the third pressure level is [n - t, n], where n is the maximum pressure the mobile device can register when the operating object presses hard, and t is the set value.
As an alternative embodiment, the second pressure level is (m + t, n - t), where m is the minimum pressure applied to the mobile device while the operating object maintains contact, n is the maximum pressure the mobile device can register when the operating object presses hard, and t is a set value.
As a further limitation, the value of t is a preconfigured value.
As a further limitation, the value of t is determined by applying deep learning to the user's habits.
As an alternative embodiment, the graphic moves at a constant speed in the depth direction.
As an alternative embodiment, the speed at which the graphic moves in the depth direction is proportional to the pressure value or to the rate of change of the pressure value.
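A sketch of the two speed policies just described, assuming a normalized pressure p in [0, 1]; the exact proportionality used here is an assumed reading, not specified by the embodiment:

```python
def depth_speed(p, level, base=0.01, proportional=False):
    """Per-frame depth speed for a first-level ('first') or third-level ('third') touch.

    proportional=False: the graphic moves by the same fixed amount every frame.
    proportional=True:  the step grows the further the pressure lies toward the
    corresponding end of the range (one way to make speed track the pressure value).
    """
    if not proportional:
        return base
    # third level: harder press -> faster; first level: lighter touch -> faster
    return base * (p if level == "third" else 1.0 - p)
```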
A mobile device, comprising:
a pressure-sensitive touch panel;
a display screen positioned below the pressure-sensitive touch panel;
a controller, in communication with the pressure-sensitive touch panel, configured to generate a graphic, display it on the display screen, and detect the magnitude of the pressure applied to the pressure-sensitive touch panel at the position corresponding to the generated graphic; the graphic is moved in a first direction along the depth axis when the pressure is at a first pressure level, kept at its current depth when the pressure is at a second pressure level, and moved in a second direction along the depth axis when the pressure is at a third pressure level.
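A structural sketch of such a controller (the panel and display interfaces, the attribute names and the reuse of the update_depth helper from the earlier sketch are all assumptions for illustration):

```python
class PressureDepthController:
    """Ties a pressure-sensitive touch panel to a displayed graphic.

    panel   is assumed to expose pressure_at(x, y) -> float in [0, 1]
    display is assumed to expose draw(graphic)
    """

    def __init__(self, panel, display, step=0.01):
        self.panel = panel
        self.display = display
        self.step = step
        self.graphic = {"x": 0.0, "y": 0.0, "z": 0.0}   # generated graphic

    def on_frame(self):
        # Read the pressure at the graphic's position and update its depth.
        p = self.panel.pressure_at(self.graphic["x"], self.graphic["y"])
        self.graphic["z"] = update_depth(self.graphic["z"], p, self.step)
        self.display.draw(self.graphic)
```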
A mobile device comprising a memory, a processor, and computer instructions stored on the memory and executed on the processor, wherein the computer instructions, when executed by the processor, perform the steps of the above method.
Compared with the prior art, the invention has the beneficial effects that:
the invention has three touch types on the mobile equipment, wherein the touch type is represented by Z-axis translation vertical to the plane of the camera, the pressure of the touch of a user is mapped by using the pressure-sensitive screen of the mobile equipment, and the depth information can be represented when three-dimensional output is carried out.
The invention controls the depth/axial movement of a three-dimensional object in the interface through pressure grading, so that a controlled object on the mobile device can be moved in all six directions (up, down, left, right, forward and backward) in 3D space with a single operating object (a finger, a stylus, etc.). This truly frees the second hand, makes operation convenient in scenarios where one hand is occupied (for example while driving a vehicle), and improves the user experience.
In order to make the aforementioned and other objects, features and advantages of the present invention comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, are included to provide a further understanding of the invention; they illustrate exemplary embodiments of the invention and, together with the description, serve to explain the invention without limiting it.
FIG. 1 is a diagram illustrating an example of object transformation with the pressure-based three-dimensional positioning method according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of pressure-based touch type recognition provided by this embodiment, where touch pressure is described on a 0-1 scale: 0 represents zero pressure and 1 represents the maximum pressure the screen can recognize;
FIGS. 3(a)-3(d) are schematic diagrams of the pressure-based three-dimensional transformation provided by this embodiment;
FIGS. 4(a)-4(c) are experimental diagrams of the different pressure-level divisions provided by this embodiment;
FIGS. 5(a)-5(b) are schematic diagrams of the experimental results for completion time and error rate of the three-dimensional positioning method under different numbers of pressure levels provided by this embodiment;
FIGS. 6(a)-6(b) are schematic diagrams of the experimental results for completion time and error rate of the three-dimensional positioning method under different threshold pressures provided by this embodiment;
FIGS. 7(a)-7(b) are schematic graphs of the pressure transitions of the two touch forms (light touch and heavy touch) provided by this embodiment;
FIGS. 8(a)-8(b) are diagrams of example systems based on the pressure-based three-dimensional transformation design provided by this embodiment.
Detailed Description of the Embodiments
the invention is further described with reference to the following figures and examples.
It is to be understood that the following detailed description is exemplary and is intended to provide further explanation of the invention as claimed. Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs.
It is noted that the terminology used herein is for the purpose of describing particular embodiments only and is not intended to limit exemplary embodiments according to the invention. As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise; and the terms "comprises" and/or "comprising", when used in this specification, specify the presence of the stated features, steps, operations, devices, components and/or combinations thereof.
Example one
As shown in FIG. 1 and FIGS. 3(a)-3(d), a pressure-based three-dimensional positioning method for a mobile device includes:
detecting the pressure applied to the mobile device at the position corresponding to a generated graphic;
the graphic moves in a first direction along the depth axis when the pressure is at a first pressure level, keeps its current depth when the pressure is at a second pressure level, and moves in a second direction along the depth axis when the pressure is at a third pressure level.
To address the mismatch between two-dimensional input and three-dimensional output on mobile devices, a pressure-based three-dimensional positioning technique is provided in which the pressure of the user's touch is expressed as the depth information missing from the screen coordinates.
In this embodiment, three touch types are defined according to the sensing range of the normal touch force on the mobile device: light touch, normal touch and heavy touch. Each touch type is expressed as a translation of the object along the Z axis perpendicular to the camera plane.
In this embodiment, a light touch is mapped to movement toward the camera;
a normal touch is mapped to remaining stationary on the Z axis;
a heavy touch is mapped to movement away from the camera.
To make the division of touch types clear, as shown in FIG. 2, a touch with small pressure (below the threshold m + t, where m represents the minimum pressure registered when the user touches the pressure-sensitive screen) is treated as a light touch, a touch with large pressure (above the threshold 1 - t, pressures being normalized so that 1 is the maximum recognizable pressure) is treated as a heavy touch, and a touch in the middle range is treated as a normal touch.
As shown in FIGS. 3(a)-3(d), the user starts with the normal touch shown in FIG. 3(a). The application scene displays a corresponding translucent guide object to indicate the transformed position. For each frame, the system detects the current touch and changes the depth value of the manipulated object according to the detection result (the touch type): a fixed value is added to the depth value during a heavy touch, a fixed value is subtracted during a light touch, and the depth value is unchanged during a normal touch. The results of executing a normal touch, a light touch and a heavy touch are shown in FIGS. 3(b)-3(d), respectively.
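As a usage illustration of this per-frame rule (reusing the update_depth helper sketched earlier; the frame count and pressure value are arbitrary assumptions):

```python
# 60 consecutive frames of a sustained heavy touch (normalized pressure 0.95):
# the fixed step is added every frame, so the object drifts steadily away from
# the camera while the touch is held.
z = 0.0
for _ in range(60):
    z = update_depth(z, p=0.95)
print(z)   # approximately 0.6 (60 frames * 0.01 per frame)
```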
As shown in FIGS. 8(a) and 8(b), in a specific embodiment, when the user needs to move a virtual cart on the screen, the user can slide it left/right and up/down with one hand, or press the virtual cart to change its depth, i.e. its display position along the z axis.
Of course, in some embodiments, parameter optimization may also be performed, including:
as shown in fig. 4, the present embodiment adopts the classification of the user touch pressure into three levels: three-stage, five-stage and infinite stages, and the t value is set to be 0.15, and parameter optimization is carried out. Of course, in this embodiment, it is considered that 1 is the maximum pressure that the mobile device can bear when the operation object is pressed hard, and during data processing, the real pressure may be normalized. So 0.15 is relative to 1 at this time.
Of course, in a specific embodiment, the value of t may be set or optimized according to specific situations.
Three levels: the pressure range is divided into three levels;
of course, in other embodiments, the pressure may be divided into more levels, for example:
Five levels: the pressure range is divided into five levels;
Infinite levels: the pressure range is divided into infinitely many levels (the pressure corresponds linearly to the Z-axis movement speed);
as shown in fig. 5, 4 sets of control experiments (in order to eliminate the effect of speed on the result, a comparative experiment with a speed of 1.5 times is also provided in this example) were performed, and a total of 3200 experiments were completed: 10 participants; 4 groups of control experiments; 2 precisions (low precision target position threshold of 0.7, high precision of 0.35); 4 tests per set of experiments; each testing 10 tasks.
The statistics of task completion time are shown in FIG. 5(a): for the low-precision task there is no significant difference among the pressure-level schemes, although operation based on the three-level division requires more execution time, and increasing the movement speed effectively shortens the completion time; for the high-precision task there is a significant difference among the pressure-level schemes, and the completion time clearly decreases as the number of pressure levels increases.
The statistics of the error rate are shown in FIG. 5(b): for the low-precision task there is a significant difference among the pressure-level schemes, and the error rate clearly decreases as the number of pressure levels increases; for the high-precision task there is likewise a significant difference, and the error rate also clearly decreases as the number of pressure levels increases.
Of course, the value of t may also be set in advance through experiments; as shown in FIGS. 6(a)-6(b) and 7(a)-7(b), it can be optimized experimentally, and in this embodiment experiments were used to determine a pressure partition with smooth transitions and the corresponding value of t.
In some embodiments, the value of t may be determined by applying deep learning to the user's habits.
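As a loose illustration only, a simple statistics-based calibration is sketched below in place of the deep-learning approach mentioned here; the percentiles, the cap of 0.45 and the function name are all assumptions, and an actual implementation would learn t from richer interaction data:

```python
def calibrate_t(light_presses, heavy_presses, default=0.15):
    """Pick t so that most recorded light presses fall below m + t and most
    recorded heavy presses fall above n - t (pressures normalized to [0, 1])."""
    if not light_presses or not heavy_presses:
        return default
    light_hi = sorted(light_presses)[int(0.9 * (len(light_presses) - 1))]   # ~90th percentile
    heavy_lo = sorted(heavy_presses)[int(0.1 * (len(heavy_presses) - 1))]   # ~10th percentile
    # t must cover the light band from below and the heavy band from above,
    # while the cap keeps a non-empty middle band (t < 0.5).
    return min(max(light_hi, 1.0 - heavy_lo, default), 0.45)
```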
In some embodiments, the graphic moves at the same speed in the depth direction as long as the pressure stays within the same level.
Alternatively, even within the same level, for example the third level, the graphic may move faster in the depth direction the greater the pressure or the greater the rate of change of the pressure.
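A sketch of the rate-of-change variant (the frame interval, gains and class name are illustrative assumptions):

```python
class PressureRate:
    """Finite-difference estimate of how quickly the touch pressure is changing."""

    def __init__(self):
        self.prev = None

    def rate(self, p, dt=1.0 / 60.0):
        r = 0.0 if self.prev is None else (p - self.prev) / dt
        self.prev = p
        return r


def speed_within_level(p, rate, base=0.01, k_p=1.0, k_r=0.1):
    """Scale the base depth speed by the pressure value and/or its rate of change."""
    return base * (1.0 + k_p * p + k_r * abs(rate))
```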
Example two
A mobile device, comprising:
a pressure-sensitive touch panel;
a display screen positioned below the pressure-sensitive touch panel;
a controller, in communication with the pressure-sensitive touch panel, configured to generate a graphic, display it on the display screen, and detect the magnitude of the pressure applied to the pressure-sensitive touch panel at the position corresponding to the generated graphic; the graphic is moved in a first direction along the depth axis when the pressure is at a first pressure level, kept at its current depth when the pressure is at a second pressure level, and moved in a second direction along the depth axis when the pressure is at a third pressure level.
A mobile device comprising a memory, a processor, and computer instructions stored on the memory and executed on the processor, wherein the computer instructions, when executed by the processor, perform the steps of the above method.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
The above description is only a preferred embodiment of the present invention and is not intended to limit the present invention, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.
Although the embodiments of the present invention have been described with reference to the accompanying drawings, they do not limit the scope of the present invention, and those skilled in the art should understand that various modifications and variations can be made on the basis of the technical solution of the present invention without inventive effort.
Claims (10)
1. A pressure-based three-dimensional positioning method for a mobile device, characterized by comprising the following steps:
detecting the pressure applied to the mobile device at the position corresponding to a generated graphic;
moving the graphic in a first direction along the depth axis when the pressure is at a first pressure level, keeping the graphic at its current depth when the pressure is at a second pressure level, and moving the graphic in a second direction along the depth axis when the pressure is at a third pressure level.
2. The pressure-based three-dimensional positioning method for a mobile device according to claim 1, wherein the first pressure level is [m, m + t], where m is the minimum pressure applied to the mobile device while the operating object maintains contact, and t is a set value.
3. The pressure-based three-dimensional positioning method for a mobile device according to claim 1, wherein the third pressure level is [n - t, n], where n is the maximum pressure the mobile device can register when the operating object presses hard, and t is a set value.
4. The pressure-based three-dimensional positioning method for a mobile device according to claim 1, wherein the second pressure level is (m + t, n - t), where m is the minimum pressure applied to the mobile device while the operating object maintains contact, n is the maximum pressure the mobile device can register when the operating object presses hard, and t is a set value.
5. The pressure-based three-dimensional positioning method for a mobile device according to any one of claims 2-4, wherein the value of t is a preconfigured value.
6. The pressure-based three-dimensional positioning method for a mobile device according to any one of claims 2-4, wherein the value of t is determined by applying deep learning to the user's habits.
7. The pressure-based three-dimensional positioning method for a mobile device according to claim 1, wherein the graphic moves at a constant speed in the depth direction.
8. The pressure-based three-dimensional positioning method for a mobile device according to claim 1, wherein the speed at which the graphic moves in the depth direction is proportional to the pressure value or to the rate of change of the pressure value.
9. A mobile device, characterized by comprising:
a pressure-sensitive touch panel;
a display screen positioned below the pressure-sensitive touch panel;
a controller, in communication with the pressure-sensitive touch panel, configured to generate a graphic, display it on the display screen, and detect the magnitude of the pressure applied to the pressure-sensitive touch panel at the position corresponding to the generated graphic; the graphic is moved in a first direction along the depth axis when the pressure is at a first pressure level, kept at its current depth when the pressure is at a second pressure level, and moved in a second direction along the depth axis when the pressure is at a third pressure level.
10. A mobile device, characterized by comprising a memory, a processor, and computer instructions stored on the memory and executed on the processor, wherein the computer instructions, when executed by the processor, perform the steps of the method according to any one of claims 1 to 8.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110366680.9A (published as CN113220138A) | 2021-04-06 | 2021-04-06 | Mobile equipment three-dimensional positioning method and equipment based on pressure sense |
Publications (1)
Publication Number | Publication Date |
---|---|
CN113220138A (en) | 2021-08-06 |
Family
ID=77086401
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110366680.9A (published as CN113220138A, pending) | Mobile equipment three-dimensional positioning method and equipment based on pressure sense | 2021-04-06 | 2021-04-06 |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113220138A (en) |
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101495951A | 2006-07-31 | 2009-07-29 | Sony Ericsson Mobile Communications AB | Three-dimensional touch pad input device |
CN103135889A | 2011-12-05 | 2013-06-05 | LG Electronics Inc. | Mobile terminal and 3D image control method thereof |
CN103425332A | 2012-05-22 | 2013-12-04 | Lenovo (Singapore) Pte. Ltd. | User interface navigation utilizing pressure-sensitive touch |
Non-Patent Citations (1)
Title |
---|
Lu Wang, "The design and empirical evaluations of 3D positioning techniques for pressure-based touch control on mobile devices", Personal and Ubiquitous Computing |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| RJ01 | Rejection of invention patent application after publication | Application publication date: 2021-08-06 |