CN105808079B - Method and device for quickly aligning objects by utilizing gestures - Google Patents
- Publication number
- CN105808079B (application CN201410837418.8A)
- Authority
- CN
- China
- Legal status: Active (an assumption, not a legal conclusion; Google has not performed a legal analysis)
Landscapes
- User Interface Of Digital Computer (AREA)
Abstract
The invention discloses a method and a device for quickly aligning objects with gestures, applied to electronic whiteboard software that accepts input from a touch device. The method comprises the following steps: detecting a three- to five-point touch operation continuously input by a user; entering an alignment mode when all the point touch operations fall within a preset time interval and a preset distance interval of one another and all fall on a first editable object; in the alignment mode, upon detecting a drag operation acting on a second editable object, acquiring the movement direction information of the second editable object once the drag ends; and moving the second editable object, according to that direction information, so that it aligns with the corresponding edge of the first editable object. The invention greatly simplifies the touch operation steps needed to align objects; the whole process is simple, fast, and easy to implement, and noticeably improves the user's working efficiency.
Description
Technical Field
The invention relates to the field of electronic whiteboard teaching software, in particular to a method and a device for quickly aligning objects by utilizing gestures.
Background
Touch input devices are used ever more widely in daily life and, paired with suitable software, let users perform many editing operations quickly and conveniently. They are also widely used in teaching: with electronic whiteboard software, teachers can prepare lessons, give classes, and carry out other daily teaching work quickly and efficiently. When teachers make courseware with electronic whiteboard software, the positions of editable objects in the courseware must be adjusted frequently and in large numbers so that the objects are arranged neatly and regularly, the courseware as a whole is reasonable and attractive, and students can view it easily.
Disclosure of Invention
In view of the above, the present invention provides a simple and efficient method and apparatus for quickly aligning an object by using a gesture.
Based on the above purpose, the method for quickly aligning objects using gestures provided by the present invention is applied to electronic whiteboard software that accepts input from a touch device, and comprises the following steps:
detecting three-to-five point touch operation continuously input by a user;
when all the point touch operations are within a preset time interval and a preset distance interval and all the point touch operations fall on the first editable object, entering an alignment mode;
in the alignment mode, after a dragging operation acting on a second editable object is detected, acquiring moving direction information of the second editable object after the dragging operation is finished;
and according to the moving direction information, moving the second editable object to align the second editable object with the edge of the first editable object.
Preferably, the method further comprises the following steps: and exiting the alignment mode when all the point touch operations falling on the first editable object are completely finished.
Preferably, in the alignment mode, all subsequent touch operations falling on the first editable object are ignored.
Preferably, when entering the alignment mode, the method further comprises the steps of: and restoring all touch operation effects triggered in the process of entering the alignment mode.
Preferably, after entering the alignment mode, the method further comprises the steps of: and displaying the first editable object in a highlighting effect mode until the alignment mode is exited.
Optionally, the highlighting effect display includes: flashing, edge highlighting, eye-catching overlay, or popping up an indicator box.
Preferably, the step of acquiring the movement direction information of the second editable object comprises: taking the horizontal and vertical components of the line connecting the start point and end point of the drag operation, and determining the movement direction of the second editable object from the larger of the two components.
Preferably, the method further comprises the following steps: and acquiring the moving direction information of the second editable object in real time, and displaying the corresponding edge of the first editable object in a highlighting effect in real time according to the moving direction information.
The invention also provides a device for quickly aligning objects by using gestures, which comprises:
the first detection module is used for detecting point touch operation of three to five points continuously input by a user;
the starting module is used for entering an alignment mode when all the point touch operations are within a preset time interval and a preset distance interval and all the point touch operations fall on a first editable object;
the second detection module is used for acquiring the moving direction information of a second editable object after the dragging operation is finished after the dragging operation acting on the second editable object is detected in the alignment mode;
and the execution module is used for moving the second editable object to align the second editable object with the edge of the first editable object according to the moving direction information.
Preferably, the apparatus further comprises an ending module, configured to exit the alignment mode when all the point touch operations falling on the first editable object have ended.
Preferably, the first detection module is further configured to ignore all subsequent touch operations falling on the first editable object in the alignment mode.
Preferably, the opening module is further configured to restore, upon entering the alignment mode, all touch operation effects triggered during the process of entering the alignment mode.
Preferably, the apparatus further comprises a first prompting module, configured to display the first editable object with a highlighting effect after entering the alignment mode until exiting the alignment mode.
Optionally, the highlighting effect display includes: flashing, edge highlighting, eye-catching overlay, or popping up an indicator box.
Preferably, the second detection module is further configured to determine the moving direction of the second editable object by a larger value of a horizontal component and a vertical component of a line connecting the start point and the end point of the drag operation.
Preferably, the apparatus further comprises a second prompting module, configured to acquire the movement direction information of the second editable object in real time, and to display a highlighting effect on the corresponding edge of the first editable object in real time according to the movement direction information.
From the above, the method and device for quickly aligning objects with gestures provided by the invention enter the alignment mode upon detecting a three- to five-point touch operation by the user falling on the first editable object; in the alignment mode, a drag operation on the second editable object is detected, and the second editable object is quickly aligned with the corresponding edge of the first editable object according to its movement direction information. Compared with the prior art, this greatly simplifies the operation steps needed to align objects; the whole process is simple, fast, and easy to implement, and noticeably improves the user's working efficiency.
Drawings
To illustrate the embodiments of the present invention or the technical solutions in the prior art more clearly, the drawings used in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present invention; those skilled in the art can derive other drawings from them without creative effort.
FIG. 1 is a flowchart of a method for quickly aligning objects using gestures according to an embodiment of the present invention;
FIG. 2 is a flowchart of a method for quickly aligning objects using gestures according to another embodiment of the present invention;
FIG. 3 is a schematic structural diagram of an apparatus for quickly aligning objects using gestures according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to specific embodiments and the accompanying drawings.
The method and device for quickly aligning objects with gestures are applied to electronic whiteboard software that accepts input from a touch device. The embodiment of the invention is applied to the courseware-making process: a teacher generates electronic courseware by inserting and editing various editable objects on an electronic courseware editing page.
The electronic whiteboard software supports input from touch devices, typically the touch screen of an electronic whiteboard or of a handheld terminal, and supports common gesture input operations such as single-point drag of an object, two-point rotation, object zooming, and sliding-track display.
Referring to fig. 1, a flowchart of a method for quickly aligning an object by using a gesture according to an embodiment of the present invention is shown.
The method for quickly aligning objects by using gestures of the embodiment comprises the following steps:
step 101: detecting three-to-five point touch operation continuously input by a user;
step 102: when all the point touch operations are within a preset time interval and a preset distance interval and all the point touch operations fall on the first editable object, entering an alignment mode;
step 103: in the alignment mode, after a dragging operation acting on a second editable object is detected, acquiring moving direction information of the second editable object after the dragging operation is finished;
step 104: and according to the moving direction information, moving the second editable object to align the second editable object with the edge of the first editable object.
In step 101, the user's point touch operations are detected. The detection condition is that the touch points are input continuously and number three to five; that is, three, four, or five touch points falling on the touch screen, input continuously by the user, are detected. The lower limit of three distinguishes this gesture from common touch operations such as single-point drag, two-point rotation, and object zooming, avoiding detection conflicts and reducing the computation needed for detection. The upper limit of five follows human operating habit: a touch operation on one object is generally completed with one hand, so two-handed operation is avoided and the operation process stays simple.
In step 102, the three- to five-point touch operations detected in step 101 are further examined: whether they all fall on the same editable object on the editing page in the electronic whiteboard software, i.e. the first editable object, and whether the time interval and the distance interval between any two consecutive point touch operations are both within preset ranges. For the time interval, in this embodiment the preset range is 0-20 ms; that is, for any two consecutive point touch operations, the time difference between the two points touching the screen does not exceed 20 ms. Since the gesture is performed by one person with the fingers of a single hand, pressing the same object with several fingers is a fairly coherent action, and all the fingers land on the touch screen within a short time. For the distance interval, in this embodiment the preset range is 0-20 pixels; that is, the distance between the landing points of any two consecutive point touch operations does not exceed 20 pixels, because the fingers of one hand pressing the same object land within a concentrated area. When both judgments are satisfied, i.e. all the point touch operations are within the preset time and distance intervals and all fall on the first editable object, the alignment mode is entered.
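The two checks in steps 101 and 102 can be sketched as follows. This is a minimal illustration, not the patent's implementation; `TouchPoint`, `hit_test`, and the default thresholds (20 ms, 20 px, matching this embodiment) are assumed names and values.

```python
from dataclasses import dataclass

@dataclass
class TouchPoint:
    x: float   # landing position in pixels
    y: float
    t: float   # timestamp in milliseconds

def should_enter_alignment_mode(points, hit_test,
                                max_dt_ms=20.0, max_dist_px=20.0):
    """Return True when 3 to 5 consecutive touches land on the same
    editable object within the preset time and distance intervals."""
    if not 3 <= len(points) <= 5:
        return False
    # All touches must fall on the same (first) editable object.
    targets = {hit_test(p.x, p.y) for p in points}
    if len(targets) != 1 or None in targets:
        return False
    # Any two consecutive touches must be close in both time and space.
    for a, b in zip(points, points[1:]):
        if b.t - a.t > max_dt_ms:
            return False
        if ((b.x - a.x) ** 2 + (b.y - a.y) ** 2) ** 0.5 > max_dist_px:
            return False
    return True
```

Here `hit_test(x, y)` stands in for the whiteboard software's own lookup of which editable object (if any) lies under a point.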
In step 103, in the alignment mode, the first editable object remains selected and fixed at its position without moving. The software then detects whether a drag operation acts on a second editable object, i.e. another editable object on the editing page in the electronic whiteboard software. When such a drag operation is detected, it is monitored continuously, and when it ends the movement direction of the second editable object relative to its initial position is obtained; that is, whether after the drag the object lies above, below, to the left of, or to the right of its initial position.
In step 104, the second editable object is moved according to the movement direction information obtained in step 103, so that an edge of the second editable object aligns with the corresponding edge of the first editable object. For example, if the drag operation in step 103 moved the second editable object to the left, then after the drag ends the second editable object continues to move until its left edge aligns with the left edge of the first editable object. As for object edges: in electronic whiteboard software, an editable object has a regularly shaped operation box, generally a rectangle, and the object's edges are the four sides of that rectangular box.
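The edge-alignment move in step 104 amounts to snapping one coordinate of the dragged object's operation box to the anchor's matching edge. A minimal sketch, assuming rectangles as `(left, top, width, height)` tuples in screen coordinates (an assumed representation, not from the patent):

```python
def align_to_edge(anchor, target, direction):
    """Move `target` so the edge named by `direction` lines up with the
    same edge of `anchor`. Rects are (left, top, width, height)."""
    ax, ay, aw, ah = anchor
    tx, ty, tw, th = target
    if direction == "left":
        tx = ax                    # left edges coincide
    elif direction == "right":
        tx = ax + aw - tw          # right edges coincide
    elif direction == "up":
        ty = ay                    # top edges coincide
    elif direction == "down":
        ty = ay + ah - th          # bottom edges coincide
    return (tx, ty, tw, th)
```

Only the coordinate perpendicular to the chosen edge changes, so a leftward drag leaves the dragged object's vertical position untouched, as in the embodiment's example.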
In the alignment mode, after the second editable object has been aligned, the user's drag operations continue to be detected; when a new drag operation is detected, the second editable object is determined anew and the subsequent steps are repeated, so that multiple editable objects can be aligned in turn.
In this embodiment, taking the three-point touch operation as an example, the whole user operation is as follows: three fingers touch the screen, all three points falling on a first editable object on the editing page in the electronic whiteboard software, and the alignment mode is entered. The user then drags a second editable object leftward with a touch operation; after the drag ends, the second editable object keeps moving, its left edge automatically aligns with the left edge of the first editable object, and the two objects are neatly left-aligned.
It can be seen that with the method of this embodiment an alignment mode is triggered by a point touch operation, which resembles "selecting" a target object; another object is then dragged in one direction, and the corresponding edges of the two objects are aligned according to the drag direction.
Referring to fig. 2, a flowchart of a method for quickly aligning an object by using a gesture according to another embodiment of the present invention is shown.
As another embodiment, the method for quickly aligning objects by using gestures includes the following steps:
step 201: and detecting three to five point touch control operation continuously input by a user.
Step 202: and entering an alignment mode when all the point touch operations are within a preset time interval and a preset distance interval and all the point touch operations fall on the first editable object.
Step 203: and restoring all touch operation effects triggered in the process of entering the alignment mode.
When entering the alignment mode, the point touch operations land on the touch screen one after another, and because a finger that lands may shift slightly in a way the user does not notice, the touches that land first may be recognized as other conventional gesture operations. For example, when the first touch lands and moves slightly, it may be recognized as a sliding-track gesture, leaving a short track on the first editable object; when the second touch lands and moves slightly relative to the first, the pair may be recognized as a zoom gesture, scaling the first editable object. To avoid these problems, in this step all touch operation effects triggered while entering the alignment mode are restored; that is, the moving and zooming effects above are completely undone.
Meanwhile, to further prevent the user from mis-operating the first editable object after entering the alignment mode, in this embodiment all subsequent touch operations falling on the first editable object are ignored while in the alignment mode, ensuring that the first editable object stays fixed at its selected position and cannot move or change.
Step 204: and after entering the alignment mode, displaying the first editable object in a highlight effect until exiting the alignment mode.
In the alignment mode, the first editable object is selected as the alignment target, and its position and state do not change. Therefore, in this step, after the alignment mode is entered the first editable object is displayed with a highlighting effect to prompt the user that the alignment mode is active. Specifically, the highlighting effect may be: flashing the whole first editable object, highlighting its edges, covering it with an eye-catching color, or popping up an indicator box pointing to it, so that the first editable object is clearly distinguished from the other editable objects and the prompt takes effect.
Step 205: in the alignment mode, after a drag operation acting on a second editable object is detected, movement direction information of the second editable object after the drag operation is finished is acquired.
In this step, the movement direction information of the second editable object is acquired as follows: take the horizontal and vertical components of the line connecting the start point and end point of the drag operation acting on the second editable object, and determine the movement direction from the larger of the two. A user generally does not drag the second editable object strictly horizontally or vertically, so the software must judge which direction the drag tends toward: project the drag vector onto the horizontal and vertical axes, take the corresponding components, and choose the larger one. For example, if the drag moves the second editable object up and to the left, but more to the left, the horizontal component is greater than the vertical one, so the result of the drag is judged to be leftward.
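The dominant-component rule can be sketched in a few lines. This assumes screen coordinates (y grows downward) and points as `(x, y)` tuples; the function name and the tie-breaking toward the horizontal axis are illustrative choices, not specified by the patent.

```python
def drag_direction(start, end):
    """Classify a drag by the larger of the horizontal and vertical
    components of the start-to-end vector (screen coordinates)."""
    dx = end[0] - start[0]
    dy = end[1] - start[1]
    if abs(dx) >= abs(dy):                 # horizontal component dominates
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"      # vertical component dominates
```

So a drag up and to the left that is mostly leftward, as in the example above, is classified as "left".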
Step 206: and acquiring the moving direction information of the second editable object in real time, and displaying the corresponding edge of the first editable object in a highlighting effect in real time according to the moving direction information.
In this step, based on the direction determination of step 205, the movement direction information of the second editable object is acquired in real time, and according to the current judgment of the movement direction, the corresponding edge of the first editable object is displayed with a highlighting effect in real time. This prompts the user as to which edge of the first editable object the second editable object will be aligned with if the drag ends in the current direction.
For example, if the current drag moves the second editable object to the left, the left edge of the first editable object is highlighted; if the drag then moves upward, the upper edge is highlighted in real time and the left edge is no longer highlighted.
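The real-time edge prompt reuses the same dominant-component idea, applied to the drag's current position instead of its end point. A self-contained sketch under the same assumed screen-coordinate convention (function name is illustrative):

```python
def edge_to_highlight(start, current):
    """While a drag is in progress, pick which edge of the first
    editable object to highlight, from the dominant drag component."""
    dx = current[0] - start[0]
    dy = current[1] - start[1]
    if dx == 0 and dy == 0:
        return None                        # no movement yet, no prompt
    if abs(dx) >= abs(dy):
        return "left" if dx < 0 else "right"
    return "up" if dy < 0 else "down"
```

Calling this on every move event lets the highlighted edge switch live, e.g. from the left edge to the upper edge as the drag in the example above turns upward.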
In this step, the edge of the first editable object may be highlighted by flashing, brightening, or an indicator line. Note that in this embodiment the first editable object as a whole is also displayed with a highlighting effect to indicate that the alignment mode is active; the two highlighting effects should differ so that each prompt remains distinguishable.
Step 207: and according to the moving direction information, moving the second editable object to align the second editable object with the edge of the first editable object.
Step 208: and exiting the alignment mode when all the point touch operations falling on the first editable object are completely finished.
In this step, because a point touch operation may end accidentally due to a trembling finger or a shaking touch-screen device, the condition for exiting the alignment mode is set as follows to avoid such mis-operations: all the point touch operations that fell on the first editable object in step 201 have ended. Any subsequent point touch operation falling on the first editable object is ignored.
The embodiment of the invention also provides a device for quickly aligning an object by using a gesture, and the device is a schematic structural diagram of the device for quickly aligning the object by using the gesture according to the embodiment of the invention with reference to fig. 3.
The device for quickly aligning the object by using the gesture comprises:
the first detection module 301 is configured to detect a point touch operation of three to five points continuously input by a user;
an opening module 302, configured to enter an alignment mode when all the point touch operations are within a preset time interval and a preset distance interval and all the point touch operations fall on a first editable object;
a second detecting module 303, configured to, after entering the alignment mode, obtain, after a drag operation performed on a second editable object is detected, moving direction information of the second editable object after the drag operation is finished;
an executing module 304, configured to move the second editable object to align with an edge of the first editable object according to the moving direction information.
In this embodiment, the apparatus further includes: an end module 305, configured to exit the alignment mode when all the point touch operations falling on the first editable object are completely ended.
The first detection module 301 is further configured to ignore, in the alignment mode, all subsequent touch operations that fall on the first editable object.
The starting module 302 is further configured to restore all touch operation effects triggered in the process of entering the alignment mode when entering the alignment mode.
Further, the device further comprises: the first prompting module 306 is configured to display the first editable object in a highlight effect after entering the alignment mode until exiting the alignment mode. Specifically, the highlighting effect display includes: flashing, edge highlighting, eye-catching overlay, or popping up an indicator box.
In this embodiment, the second detecting module 303 is further configured to take the horizontal and vertical components of the line connecting the start point and end point of the drag operation, and to determine the movement direction of the second editable object from the larger of the two components.
Further, the apparatus of this embodiment further includes a second prompting module 307, configured to obtain moving direction information of the second editable object in real time, and perform highlighting effect display on the corresponding edge of the first editable object in real time according to the moving direction information.
Those of ordinary skill in the art will understand that the invention is not limited to the specific embodiments described above; all modifications, changes, and equivalents that fall within the spirit and scope of the invention are intended to be covered.
Claims (14)
1. A method for quickly aligning an object by using a gesture is applied to electronic whiteboard software input by a touch device, and is characterized by comprising the following steps:
detecting three-to-five point touch operation continuously input by a user;
when all the point touch operations are within a preset time interval and a preset distance interval and all the point touch operations fall on the first editable object, entering an alignment mode;
in the alignment mode, after a dragging operation acting on a second editable object is detected, acquiring moving direction information of the second editable object after the dragging operation is finished;
according to the moving direction information, moving the second editable object to align the second editable object with the edge of the first editable object;
the step of acquiring the movement direction information of the second editable object comprises: taking the horizontal and vertical components of the line connecting the start point and the end point of the drag operation, and determining the movement direction of the second editable object from the larger of the two components.
2. The method of claim 1, further comprising the step of: and exiting the alignment mode when all the point touch operations falling on the first editable object are completely finished.
3. The method of claim 1, wherein in the aligned mode, all subsequent touch operations that fall on the first editable object are ignored.
4. The method of claim 1, further comprising, upon entering the alignment mode, the steps of: and restoring all touch operation effects triggered in the process of entering the alignment mode.
5. The method of claim 1, further comprising, after entering the alignment mode, the steps of: and displaying the first editable object in a highlighting effect mode until the alignment mode is exited.
6. The method of claim 5, wherein the highlighting effect display comprises: flashing, edge highlighting, eye-catching overlay, or popping up an indicator box.
7. The method of claim 1, further comprising the step of: and acquiring the moving direction information of the second editable object in real time, and displaying the corresponding edge of the first editable object in a highlighting effect in real time according to the moving direction information.
8. An apparatus for rapidly aligning objects using gestures, comprising:
the first detection module is used for detecting point touch operation of three to five points continuously input by a user;
the starting module is used for entering an alignment mode when all the point touch operations are within a preset time interval and a preset distance interval and all the point touch operations fall on a first editable object;
the second detection module is used for acquiring the moving direction information of a second editable object after the dragging operation is finished after the dragging operation acting on the second editable object is detected in the alignment mode;
the execution module is used for moving the second editable object to align the second editable object with the edge of the first editable object according to the moving direction information;
the second detection module is further configured to take the horizontal and vertical components of the line connecting the start point and the end point of the drag operation, and to determine the movement direction of the second editable object from the larger of the two components.
9. The apparatus of claim 8, further comprising an end module configured to exit the aligned mode when all point touch operations that fall on the first editable object end.
10. The apparatus of claim 8, wherein the first detection module is further configured to ignore all subsequent touch operations that fall on the first editable object in the aligned mode.
11. The apparatus of claim 8, wherein the enabling module is further configured to restore all touch operation effects triggered during entering the alignment mode when entering the alignment mode.
12. The apparatus of claim 8, further comprising a first prompting module configured to highlight the first editable object after entering the alignment mode until exiting the alignment mode.
13. The apparatus of claim 12, wherein the highlight effect display comprises: flashing, edge highlighting, eye-catching overlay, or popping up an indicator box.
14. The apparatus according to claim 8, further comprising a second prompting module configured to acquire the moving direction information of the second editable object in real time and, according to that information, highlight the corresponding edge of the first editable object in real time.
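The execution module's edge alignment amounts to snapping the second object to the edge of the first object indicated by the drag direction. A minimal sketch, assuming axis-aligned rectangles given as x/y/w/h dicts (a representation chosen here for illustration):

```python
def align_to_edge(first, second, direction):
    """Return a copy of `second` moved so the edge named by `direction`
    is flush with the corresponding edge of `first`."""
    s = dict(second)
    if direction == "left":
        s["x"] = first["x"]                        # left edges flush
    elif direction == "right":
        s["x"] = first["x"] + first["w"] - s["w"]  # right edges flush
    elif direction == "up":
        s["y"] = first["y"]                        # top edges flush
    else:  # "down"
        s["y"] = first["y"] + first["h"] - s["h"]  # bottom edges flush
    return s
```

Note that only the coordinate along the drag axis changes: a horizontal drag adjusts x and leaves y alone, which matches the claim's decomposition of the drag into a single dominant direction.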
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201410837418.8A CN105808079B (en) | 2014-12-29 | 2014-12-29 | Method and device for quickly aligning objects by utilizing gestures |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201410837418.8A CN105808079B (en) | 2014-12-29 | 2014-12-29 | Method and device for quickly aligning objects by utilizing gestures |
Publications (2)
Publication Number | Publication Date |
---|---|
CN105808079A CN105808079A (en) | 2016-07-27 |
CN105808079B true CN105808079B (en) | 2020-02-14 |
Family
ID=56980686
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201410837418.8A Active CN105808079B (en) | 2014-12-29 | 2014-12-29 | Method and device for quickly aligning objects by utilizing gestures |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN105808079B (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106528736A (en) * | 2016-10-27 | 2017-03-22 | 中企动力科技股份有限公司 | Method and apparatus for displaying alignment line during dragging of page components |
US20180158243A1 (en) * | 2016-12-02 | 2018-06-07 | Google Inc. | Collaborative manipulation of objects in virtual reality |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102110085A (en) * | 2009-12-29 | 2011-06-29 | 北京大学 | Automatic typesetting method and system based on dependency relationship of typesetting objects |
CN102968247A (en) * | 2012-11-29 | 2013-03-13 | 广东欧珀移动通信有限公司 | Method for realizing automatic alignment and sorting of desktop icons by shaking and mobile terminal thereof |
CN103106005A (en) * | 2013-02-17 | 2013-05-15 | 广东欧珀移动通信有限公司 | Method and device for arranging status bar icons of mobile devices |
CN103176687A (en) * | 2011-12-26 | 2013-06-26 | 腾讯科技(深圳)有限公司 | Regionalized management method and regionalized management system of desktop icons |
CN103885696A (en) * | 2014-03-17 | 2014-06-25 | 联想(北京)有限公司 | Information processing method and electronic device |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7093192B2 (en) * | 1999-07-30 | 2006-08-15 | Microsoft Corporation | Establishing and displaying dynamic grids |
CN102065227A (en) * | 2009-11-17 | 2011-05-18 | 新奥特(北京)视频技术有限公司 | Method and device for horizontally and vertically aligning object in graph and image processing |
US10267892B2 (en) * | 2010-10-04 | 2019-04-23 | Qualcomm Incorporated | Locating a device using a reference point to align location information |
CN103376998B (en) * | 2012-04-19 | 2016-06-15 | 中兴通讯股份有限公司 | Handwriting equipment Chinese character type-setting method and device |
US9176940B2 (en) * | 2013-03-15 | 2015-11-03 | Blackberry Limited | System and method for text editor text alignment control |
- 2014-12-29: CN application CN201410837418.8A, patent CN105808079B (status: Active)
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102110085A (en) * | 2009-12-29 | 2011-06-29 | 北京大学 | Automatic typesetting method and system based on dependency relationship of typesetting objects |
CN103176687A (en) * | 2011-12-26 | 2013-06-26 | 腾讯科技(深圳)有限公司 | Regionalized management method and regionalized management system of desktop icons |
CN102968247A (en) * | 2012-11-29 | 2013-03-13 | 广东欧珀移动通信有限公司 | Method for realizing automatic alignment and sorting of desktop icons by shaking and mobile terminal thereof |
CN103106005A (en) * | 2013-02-17 | 2013-05-15 | 广东欧珀移动通信有限公司 | Method and device for arranging status bar icons of mobile devices |
CN103885696A (en) * | 2014-03-17 | 2014-06-25 | 联想(北京)有限公司 | Information processing method and electronic device |
Also Published As
Publication number | Publication date |
---|---|
CN105808079A (en) | 2016-07-27 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN104076986B (en) | Touch control method and device for a multi-point touch terminal | |
CN102436338B (en) | Messaging device and information processing method | |
CN103677721B (en) | Method and terminal device for zooming a terminal display interface | |
US20140028557A1 (en) | Display device, display control method and display control program, and input device, input assistance method and program | |
CN103257811A (en) | Picture display system and method based on touch screen | |
EP2891961A2 (en) | Input apparatus | |
US10373021B2 (en) | Object detection device, object detection method, and recording medium | |
US9323437B2 (en) | Method for displaying scale for enlargement and reduction operation, and device therefor | |
JP5723697B2 (en) | Display device with handwriting input function | |
US20130169563A1 (en) | Storage medium storing information processing program, information processing apparatus, information processing method, and information processing system | |
JP6904249B2 (en) | Object detector, object detection method and program | |
JP2013058149A5 (en) | ||
CN104808936B (en) | Interface operation method and portable electronic device applying same | |
CN105808129B (en) | Method and device for quickly starting software function by using gesture | |
CN103488296A (en) | Somatosensory interaction gesture control method and somatosensory interaction gesture control device | |
JP5374564B2 (en) | Drawing apparatus, drawing control method, and drawing control program | |
CN104007920A (en) | Method for selecting waveforms on electronic test equipment | |
CN103823630A (en) | Virtual mouse | |
CN104049898A (en) | Touch screen operation method and device and touch terminal | |
CN108255300B (en) | Control method and device of electronic equipment | |
CN104182144A (en) | Mobile terminal interface browsing method and system | |
CN105808080B (en) | Method and device for quickly copying object by utilizing gesture | |
CN105808079B (en) | Method and device for quickly aligning objects by utilizing gestures | |
KR20160134822A (en) | Method and Apparatus for Realizaing Human-Machine Interaction | |
JP2014137675A5 (en) |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
CB02 | Change of applicant information | ||
Address after: Room C1104, Building C, No. 9 Third Street, Haidian District, Beijing 100085
Applicant after: Hitevision Polytron Technologies Inc
Address before: Room C1104, Building C, No. 9 Third Street, Haidian District, Beijing 100085
Applicant before: HONGHE TECHNOLOGY CO., LTD.
GR01 | Patent grant | ||