
CN107688385A - Control method and device - Google Patents


Info

Publication number
CN107688385A
CN107688385A (application number CN201610629333.XA)
Authority
CN
China
Prior art keywords
user
control
behavior characteristics
control object
eye behavior
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201610629333.XA
Other languages
Chinese (zh)
Inventor
涂畅
张扬
王砚峰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Sogou Technology Development Co Ltd
Original Assignee
Beijing Sogou Technology Development Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Sogou Technology Development Co Ltd
Priority to CN201610629333.XA
Publication of CN107688385A
Legal status: Pending


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013Eye tracking input arrangements

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

An embodiment of the present invention provides a control method and device. The method includes: collecting eye behavior characteristics of a user; when the eye behavior characteristics meet a preset condition, determining a control object and a corresponding control command according to the eye behavior characteristics; and controlling the control object according to the control command. The embodiment allows a user to control an electronic device without manual operation, which is simple and convenient, frees the user's hands, and improves the user's operating efficiency.

Description

Control method and device
Technical Field
The embodiment of the invention relates to the technical field of electronic equipment, in particular to a control method and a control device.
Background
With the development of touch technology, electronic devices with touch sensing units, such as mobile phones, tablet computers, notebook computers and remote controllers, are widely used. By touching or approaching the touch sensing unit with an operating body such as a finger or a stylus, the user can control the electronic device through the acquired contact position information. However, because the user needs a finger or a stylus to control the electronic device, at least one hand is occupied, which interferes with the user's other activities. When both of the user's hands are occupied by other matters, control of the electronic device cannot be performed conveniently. The electronic device control methods provided by the prior art therefore suffer from inconvenient operation.
Disclosure of Invention
The embodiment of the invention provides a control method and a control device, which can realize control on electronic equipment by detecting eye behavior characteristics of a user, facilitate user operation and improve user experience.
Therefore, the embodiment of the invention provides the following technical scheme:
in a first aspect, an embodiment of the present invention provides a control method, including: collecting eye behavior characteristics of a user; when the eye behavior characteristics of the user accord with preset conditions, determining a control object and a control command corresponding to the eye behavior characteristics according to the eye behavior characteristics of the user; and controlling the control object according to the control command.
In a second aspect, an embodiment of the present invention provides a control apparatus, including: the acquisition module is used for acquiring the eye behavior characteristics of the user; the determining module is used for determining a control object and a control command corresponding to the eye behavior characteristics according to the eye behavior characteristics of the user when the eye behavior characteristics of the user acquired by the acquiring module meet preset conditions; and the control module is used for realizing the control of the control object according to the control command generated by the determination module.
In a third aspect, an embodiment of the present invention provides a control apparatus, including: a memory, and one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by one or more processors, the one or more programs including instructions for: collecting eye behavior characteristics of a user; when the eye behavior characteristics of the user meet preset conditions, determining a control object and a control command corresponding to the eye behavior characteristics according to the eye behavior characteristics of the user; and controlling the control object according to the control command.
The control method and the control device provided by the embodiment of the invention can determine the control object and the control command according to the collected eye behavior characteristics of the user, and realize the control operation on the control object. The method provided by the embodiment of the invention can realize the control of the electronic equipment without manual operation of a user, is simple and convenient, liberates the hands of the user and improves the operation efficiency of the user.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, it is obvious that the drawings in the following description are only some embodiments described in the present invention, and for those skilled in the art, other drawings can be obtained according to the drawings without creative efforts.
FIG. 1 schematically illustrates one scenario in which embodiments of the present invention may be applied;
FIG. 2 is a flowchart of a control method according to an embodiment of the present invention;
FIG. 3 is a flowchart of a control method according to another embodiment of the present invention;
FIG. 4 schematically illustrates another scenario in which embodiments of the present invention may be applied;
FIG. 5 is a schematic diagram of a control device according to an embodiment of the present invention;
FIG. 6 is a schematic diagram of a control device according to an embodiment of the present invention;
FIG. 7 is a schematic structural diagram of a server in an embodiment of the present invention.
Detailed Description
The embodiment of the invention provides a control method and a control device, which can realize control on electronic equipment by detecting eye behavior characteristics of a user, facilitate user operation, improve user operation efficiency and improve user experience.
In order to make those skilled in the art better understand the technical solution of the present invention, the technical solution in the embodiment of the present invention will be clearly and completely described below with reference to the drawings in the embodiment of the present invention, and it is obvious that the described embodiment is only a part of the embodiment of the present invention, and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Referring to fig. 1, an exemplary application scenario of an embodiment of the present invention is shown. The method provided by the embodiment of the present invention may be applied to the electronic device 100 shown in fig. 1. The electronic device 100 may be any existing or future electronic device, including but not limited to desktop computers, laptop computers, and mobile terminals (including smart phones, non-smart phones, and various tablet computers). As shown in fig. 1, the user interface of the electronic device 100 may include various display objects, such as display object 101 and display object 102. When the user 200 operates the electronic device 100, the user's eyeballs may move, as shown in fig. 1, from the position at which line of sight 1 is directed to the position at which line of sight 2 is directed. In a possible application scenario, the method and apparatus provided by the embodiment of the present invention may collect the eye behavior characteristics of the user 200. When the user moves the eyeballs so that the line of sight shifts from the display object 101 (the focus point of line of sight 1) to the display object 102 (the focus point of line of sight 2), the collected characteristics show that the user's eyeballs moved from top to bottom. The object the user is looking at is then taken as the control object, and the display object 102 that the user wants to view is moved to the center of the screen of the electronic device for convenient viewing. In this way, the control object and the control command are determined according to the collected eye behavior characteristics of the user, and control of the control object is realized according to the control command. Of course, the above is only an exemplary illustration; the method and apparatus provided by the embodiment of the present invention may also be applied to other scenarios, which are not limited herein. It should be noted that the above application scenario is presented only to facilitate understanding of the present invention; the embodiments of the present invention are not limited in this respect and may be applied to any applicable scenario.
A control method according to an embodiment of the present invention will be described with reference to fig. 2 to 3.
Referring to fig. 2, a flowchart of a control method according to an embodiment of the present invention is provided. As shown in fig. 2, the method may include:
s201, collecting eye behavior characteristics of a user.
The control device provided by the present invention may include an acquisition module for collecting the user's eye behavior characteristics; the acquisition module may be a camera. Of course, the control device may also obtain the eye behavior characteristics through an external source, for example another electronic device or apparatus communicatively connected to the control device. It should be noted that when the control device obtains the eye behavior characteristics externally, the other electronic device or apparatus may be a stand-alone device with its own processor and an acquisition module for collecting the eye behavior characteristics. Alternatively, the control device may obtain the characteristics through an external acquisition module communicatively connected to it; in that case the external module may have no processor of its own and simply perform the collection function.
The user's eye behavior characteristics may include, but are not limited to, any one or more of: the moving direction of the user's eyeballs, the moving speed of the eyeballs, the moving distance of the eyeballs, the start position and/or end position of the eyeball movement, the size of the pupils, whether the eyes are open or closed, the distance between the upper and lower eyelids, the number of blinks, and the blink frequency. Of course, the above is merely exemplary, and the present invention does not limit the type of the collected eye behavior characteristics. In a specific implementation, the control device may collect multiple frames of eye images and obtain the eye behavior characteristics by comparing and analyzing adjacent frames. For example, eye images may be captured periodically, and changes in the eye state obtained by comparing adjacent frames. Of course, changes in the user's eye state may also be recognized as eye behavior characteristics in other ways.
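To make the frame-comparison idea concrete, the following is a minimal Python sketch; the two dataclasses, their field names and the units are assumptions for illustration, not structures defined by the patent:

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class EyeState:
    """One sampled frame of the user's eye, as an acquisition module might report it (assumed layout)."""
    timestamp_ms: int
    gaze_point: Tuple[float, float]  # focus point in screen coordinates
    pupil_diameter: float            # arbitrary sensor units
    eyelid_gap: float                # distance between upper and lower eyelids
    eye_open: bool

@dataclass
class EyeBehavior:
    """Features derived by comparing two adjacent frames."""
    move_direction: Tuple[float, float]
    move_distance: float
    move_speed: float                # pixels per second
    pupil_delta: float
    eyelid_gap_delta: float

def derive_behavior(prev: EyeState, curr: EyeState) -> EyeBehavior:
    # Compare adjacent frames to obtain the change in eye state.
    dx = curr.gaze_point[0] - prev.gaze_point[0]
    dy = curr.gaze_point[1] - prev.gaze_point[1]
    distance = (dx * dx + dy * dy) ** 0.5
    dt = max(curr.timestamp_ms - prev.timestamp_ms, 1) / 1000.0
    return EyeBehavior(
        move_direction=(dx, dy),
        move_distance=distance,
        move_speed=distance / dt,
        pupil_delta=curr.pupil_diameter - prev.pupil_diameter,
        eyelid_gap_delta=curr.eyelid_gap - prev.eyelid_gap,
    )
```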
S202, when the eye behavior characteristics of the user meet preset conditions, determining a control object and a control command corresponding to the eye behavior characteristics according to the eye behavior characteristics of the user.
In specific implementation, the condition that the eye behavior characteristics of the user meet the preset conditions includes any one or more of the following combinations:
(1) the moving distance and/or the moving speed of the eyeballs of the user are/is larger than the set threshold. For example, when the user's eyes move rapidly, it may be indicated that the user wants to change the viewing object. When the moving distance of the eyeball of the user is larger than the set threshold value, the user can also indicate that the user wants to change the viewing object. At this time, it may be determined that the eye behavior characteristics of the user meet the preset conditions. The size of the set threshold may be set empirically or as desired.
(2) The moving direction of the eyeballs of the user is the same as the preset direction.
(3) The starting position and/or the ending position of the eyeball movement of the user are/is a preset position.
(4) The size of the user's pupil becomes larger or smaller.
(5) The distance between the upper and lower eyelids of the user is increased or decreased.
(6) The blinking times of the user accord with the preset times.
(7) The frequency of blinking of the user is greater than a set threshold.
(8) Other situations.
The above is merely an exemplary illustration, and those skilled in the art may set other conditions as the preset conditions, which are not limited herein.
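As a hedged sketch, the predicate below checks two of the listed conditions, (1) and (7); every threshold is an arbitrary placeholder, since the text only says thresholds are set empirically or as desired:

```python
def meets_preset_condition(move_distance: float,
                           move_speed: float,
                           blink_count: int,
                           window_s: float,
                           distance_threshold: float = 40.0,
                           speed_threshold: float = 300.0,
                           frequency_threshold: float = 1.5) -> bool:
    """Return True when any of the checked preset conditions holds."""
    if move_distance > distance_threshold or move_speed > speed_threshold:
        return True                              # condition (1)
    if window_s > 0 and blink_count / window_s > frequency_threshold:
        return True                              # condition (7): blink frequency above threshold
    return False
```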
In specific implementation, when determining the control object corresponding to the user's eye behavior characteristics, the object the user is viewing can be used as the control object. For example, in some embodiments, the display object corresponding to the user's eye focus point may be acquired and determined as the control object. As shown in fig. 1, the focus point corresponding to the user's line of sight 1 is the display object 101; if the collected eye behavior characteristic is pupil dilation, the display object 101 is used as the control object, and the corresponding control command may be a command to enlarge the display object 101. As another example, in some embodiments, the movement trajectory of the user's eyeball may be acquired, the start position or end position of the trajectory determined, and the display object corresponding to that position determined as the control object. As shown in fig. 1, the display object corresponding to the start position of the trajectory is 101 and the display object corresponding to the end position is 102; in this case, the area where the display object 102 at the end position is located may be used as the control object. In other embodiments, the display object corresponding to the start position of the eye movement trajectory may be determined as the control object: as shown in fig. 4, the display object 103 at the start position of the trajectory is set as the control object and moved to a position overlapping the control 104.
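As a minimal sketch of the focus-point variant, the hit test below takes the display object under the gaze point as the control object; the DisplayObject type and its fields are illustrative assumptions, not names from the patent:

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class DisplayObject:
    name: str
    x: float
    y: float
    width: float
    height: float

    def contains(self, point: Tuple[float, float]) -> bool:
        px, py = point
        return (self.x <= px <= self.x + self.width
                and self.y <= py <= self.y + self.height)

def control_object_at(focus_point: Tuple[float, float],
                      objects: List[DisplayObject]) -> Optional[DisplayObject]:
    """Take the display object under the eye focus point as the control object."""
    for obj in objects:
        if obj.contains(focus_point):
            return obj
    return None

# For the trajectory variant, either the start or end point of the eyeball
# movement trajectory can be hit-tested the same way.
```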
In some embodiments, the method further comprises: responding to a trigger operation of a user, and displaying a first indication object; the first indication object is used for indicating a movement track or a focusing position of eyeballs of a user; the determining the control object according to the user eye behavior feature comprises: when the eyeball of the user focuses on the first indication object, the first indication object is used as a control object; or when the first indication object is coincident with a first control, the first control is taken as a control object.
In a specific implementation, the control command may be a command to move the display object, a command to enlarge or reduce it, a command to click or double-click the control object, and so on; this is not limited herein. When determining the control command corresponding to the eye behavior characteristics, the command may be determined according to a pre-stored correspondence between eye behavior characteristics and control commands.
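The pre-stored correspondence can be as simple as a lookup table; the keys and command names below are invented for illustration only:

```python
from typing import Optional

# Hypothetical correspondence table between recognized eye behaviors and
# control commands; neither the keys nor the values come from the patent.
COMMAND_TABLE = {
    "pupil_enlarged":  "enlarge_object",
    "pupil_narrowed":  "shrink_object",
    "gaze_moved_down": "move_object_up",
    "blink_4_in_2s":   "click_object",
    "blink_6_in_2s":   "double_click_object",
}

def command_for(behavior: str) -> Optional[str]:
    return COMMAND_TABLE.get(behavior)
```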
In the following, how to determine a control object according to the eye behavior characteristics of the user and determine a control command corresponding to the eye behavior characteristics will be described with reference to several possible implementation manners.
In some embodiments, determining the control command corresponding to the eye behavior characteristics may include: determining the moving direction of the user's eyeball, determining the moving direction of the control object according to it, and generating a control command for moving the control object in that direction. For example, as shown in fig. 1, suppose the collected eye behavior characteristics are the moving direction, moving speed and moving distance of the user's eyeball, and the preset condition is met when the moving speed and/or moving distance exceeds a set threshold. When determining the control object, the entire region containing the end position of the eyeball movement trajectory (corresponding to display object 102) is taken as the control object. The moving direction of the eyeball is then determined to be from top to bottom, the moving direction of the control object is determined to be from bottom to top, and the generated control command moves the control object from bottom to top.
In some embodiments, determining the control command corresponding to the eye behavior characteristics may include: acquiring a preset target moving position and generating a control command for moving the control object to that position. Still taking fig. 1 as an example, suppose the entire region containing the end position of the eyeball movement trajectory (corresponding to display object 102) is taken as the control object, and the preset target moving position is the center of the display screen of the electronic device; the generated control command then moves the control object to the center of the screen. In this embodiment, the control object can be moved directly to the preset target position without determining its moving direction.
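The two variants differ only in how the move command is produced; a sketch of both follows, where encoding a command as a small dict is purely an assumption for illustration:

```python
from typing import Tuple

def move_by_gaze_direction(eye_delta: Tuple[float, float]) -> dict:
    """First variant: move the control object opposite to the eyeball movement,
    so the region the user looks toward travels into view."""
    dx, dy = eye_delta
    return {"action": "move_by", "delta": (-dx, -dy)}

def move_to_preset_target(screen_size: Tuple[int, int]) -> dict:
    """Second variant: move the control object straight to a preset target
    position (here the screen center) without computing a direction."""
    w, h = screen_size
    return {"action": "move_to", "target": (w / 2, h / 2)}
```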
In some embodiments, the method may further comprise: when it is detected that the user's eyeball has stopped moving or is focused on a preset position, stopping the operation of moving the control object. For example, when the control object has been moved to the center of the screen and the user's eyeball stops moving and gazes at the screen center, the moving operation may be stopped. As another example, if the user's eyeball focuses on a preset position, the display object at that position is likely the object the user wants to view, so the operation of moving the control object is stopped.
In some embodiments, the method may further comprise: when the user's eyeball stops moving while focused on the edge of the display unit of the electronic device, controlling the control object to keep moving. In some embodiments, if the user's eyeball stops at the edge of the screen, the control object may be controlled to continue moving until the eye focus point leaves the screen edge. Specifically, when the eyeball focuses on the upper edge of the screen, the displayed objects continue to move upward; on the lower edge, downward; on the left edge, leftward; and on the right edge, rightward.
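A sketch of the edge test driving this continuous movement; the pixel margin is an assumed tolerance, not a value from the patent:

```python
from typing import Optional, Tuple

def edge_scroll(focus: Tuple[float, float],
                screen_size: Tuple[int, int],
                margin: int = 10) -> Optional[str]:
    """Map a gaze held at a screen edge to a continuous movement direction.
    Returns None once the focus point leaves the edge region, which is the
    cue to stop the continuous movement."""
    x, y = focus
    w, h = screen_size
    if y <= margin:
        return "move_up"       # focus on upper edge: keep moving objects up
    if y >= h - margin:
        return "move_down"
    if x <= margin:
        return "move_left"
    if x >= w - margin:
        return "move_right"
    return None
```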
In some embodiments, determining the control command corresponding to the eye behavior characteristics may include: generating a command to enlarge the control object when the size of the user's pupil or the distance between the upper and lower eyelids is determined to have increased, and generating a command to shrink the control object when it is determined to have decreased. For example, when the user opens the eyes wide, the pupils dilate; capturing this widening and pupil dilation, the device enlarges the map shown on the display screen. When the user squints, the pupils narrow, and a shrinking operation is performed on the map. Opening the eyes wide manifests as an increased distance between the upper and lower eyelids, that is, a larger opening width between them; likewise, squinting manifests as a decreased distance between the upper and lower eyelids.
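A minimal sketch of this zoom decision, using whichever of the two signals changed more; the dead-zone parameter is an assumption added to ignore sensor noise:

```python
from typing import Optional

def zoom_command(pupil_delta: float,
                 eyelid_gap_delta: float,
                 dead_zone: float = 0.5) -> Optional[str]:
    """Widened eyes or dilated pupils enlarge the object; squinting shrinks it."""
    change = max(pupil_delta, eyelid_gap_delta, key=abs)
    if change > dead_zone:
        return "enlarge_object"
    if change < -dead_zone:
        return "shrink_object"
    return None
```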
In some embodiments, the determining the control command corresponding to the eye behavior feature comprises any one or a combination of:
(1) When the number of times the user blinks is determined to be a first preset number, generating a command to enlarge the control object. In a specific implementation, the command may be generated when the user blinks the first preset number of times consecutively within a preset time period; for example, blinking once within 2 seconds may be set as the command to enlarge the control object.
(2) When the number of times the user blinks is determined to be a second preset number, generating a command to shrink the control object; for example, blinking twice within 2 seconds may be set as the command to shrink the control object.
(3) When the number of times the user blinks is determined to be a third preset number, generating a command to click the control object; for example, blinking 4 times within 2 seconds may be set as the command to click the control object.
(4) When the number of times the user blinks is determined to be a fourth preset number, generating a command to double-click the control object; for example, blinking 6 times within 2 seconds may be set as the command to double-click the control object.
Of course, the above is merely exemplary, and the present invention is not limited thereto.
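A sketch of this blink-count dispatch, using the example counts above; the 2-second window and the command names are just the text's examples rendered as placeholders, not a fixed API:

```python
from typing import Optional

# Blink counts within a 2-second window mapped to commands, per the examples.
BLINK_COMMANDS = {1: "enlarge_object",
                  2: "shrink_object",
                  4: "click_object",
                  6: "double_click_object"}

def blink_command(blinks_in_window: int) -> Optional[str]:
    return BLINK_COMMANDS.get(blinks_in_window)
```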
S203, controlling the control object according to the control command.
In order to facilitate those skilled in the art to more clearly understand the embodiments of the present application in a specific context, the following describes the embodiments of the present application with a specific example. It should be noted that the specific example is only to make the present invention more clearly understood by those skilled in the art, but the embodiments of the present invention are not limited to the specific example.
The method shown in fig. 1 and fig. 2 is explained below with an example. In one possible application scenario, the method and apparatus provided by the present invention can be used while a user interacts with a map application. The user opens the map application for viewing. The control device captures the user's eyeballs in real time through a camera and identifies the user's operation intention: for example, the moving direction of the eyeballs relative to the map view is captured, and the map display interface is moved up, down, left or right accordingly; when the user's eyeballs stop moving and gaze at the map, the map display interface stops moving. Capturing of eye images may be performed on a timed schedule. When the user's eyeballs shift to the left, images are captured in real time through the camera; by analyzing and comparing them with earlier images, the direction of the shift is obtained, so that the display interface of the map application can be moved from left to right in real time. When the area the user wants to view has moved to the middle of the screen, the user's eyeballs stop shifting and gaze at the central area of the map; the camera captures this change in real time, and the map stops moving immediately. Similarly, when the user's eyeballs shift to the right, the map application captures the eye movement through the camera and moves the area the user wants to view to the center of the screen. Likewise, shifting up, down, upper-left, upper-right, lower-left, lower-right, and so on can be performed by tracking the eyes. Further, when the user opens the eyes wide and the pupils dilate, the device captures this information and enlarges the map; when the user squints and the pupils narrow, the map may be shrunk. According to the embodiment of the present invention, control is realized by collecting eye behavior characteristics, which meets various interaction requirements, frees the user from tedious manual operation, and allows basic moving, positioning, and zooming operations on the display objects of an application without a touch screen.
In the above, the whole area where the display object corresponding to the user focus point is located is taken as the control object for explanation, but it can be understood by those skilled in the art that a single element corresponding to the user focus point can also be taken as the control object. Another example is described below with reference to fig. 3 and 4.
S301, responding to the trigger operation of the user, and displaying a first indication object.
Wherein the first indication object is used to indicate the movement trajectory or focus position of the user's eyeball. The first indication object may be, for example, the arrow-type cursor 103 shown in fig. 4. The user's trigger operation may be a preset contact operation, such as touching the screen or long-pressing the screen, or a non-contact operation, such as blinking twice within a preset time period. The type of trigger operation is not limited here; any operation that can be distinguished from other operations may trigger display of the first indication object.
And S302, when the eyeball of the user focuses on the first indication object, taking the first indication object as a control object.
And S303, collecting the moving direction and/or moving track of the eyeballs of the user, and determining a control command.
For example, the moving direction of the user's eyeball may be collected to determine the moving direction of the first indication object, and the first indication object moved accordingly. As shown in fig. 4, when the user's eyeball is determined to move from top to bottom, the first indication object 103 is moved from top to bottom. As another example, the movement trajectory of the user's eyeball may be collected, and the first indication object 103 moved from the position where line of sight 3 is focused to the position where line of sight 4 is focused.
S304, the first indication object is moved according to the control command.
S305, collecting the eye behavior characteristics of the user, and taking the first control as the control object when the first indication object coincides with the first control.
As shown in fig. 4, the display object 104 is a control that can implement a certain function. Through the above operations, the first indication object is moved from the position where the user's line of sight 3 is focused to the position where line of sight 4 is focused; at this point the first indication object 103 coincides or partially coincides with the display object 104, i.e., the first control. For example, when the eye behavior characteristic is that the user focuses on the first indication object and blinks three times, the control coinciding with the first indication object (e.g., 104 in fig. 4) may be taken as the control object and a corresponding control command generated.
And S306, generating a command for clicking the first control according to the corresponding relation between the eye behavior characteristics of the user and the control command.
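Coincidence of the indication object with a control can be checked with a plain bounding-box overlap test; the boxes and coordinate values below are made up purely for illustration:

```python
from dataclasses import dataclass

@dataclass
class Rect:
    x: float
    y: float
    w: float
    h: float

def coincides(a: Rect, b: Rect) -> bool:
    """True when two boxes overlap at least partially."""
    return not (a.x + a.w < b.x or b.x + b.w < a.x or
                a.y + a.h < b.y or b.y + b.h < a.y)

cursor = Rect(118, 74, 16, 16)    # first indication object, e.g. arrow cursor 103
control = Rect(110, 70, 48, 24)   # first control, e.g. control 104
if coincides(cursor, control):
    control_object = control      # the first control becomes the control object
```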
This is explained below with reference to fig. 4, which shows another exemplary application scenario of the embodiment of the present invention. The method may be applied to the electronic device 100 shown in fig. 4, which may be any existing or future electronic device, including but not limited to desktop computers, laptop computers, and mobile terminals (including smart phones, non-smart phones, and various tablet computers). As shown in fig. 4, the user interface of the electronic device 100 may include various display objects, such as display object 103 and display object 104. The display object 103 may be, for example, an arrow cursor, and the display object 104 may be, for example, a control that implements a certain function. When the user 200 operates the electronic device 100, the user's eyeball may move, for example, from the position at which line of sight 3 is directed to the position at which line of sight 4 is directed, as shown in fig. 4. In a possible application scenario, the method and apparatus provided by the embodiment of the present invention may collect the eye behavior characteristics of the user 200, determine the display object corresponding to the focus point of line of sight 3 as the control object 103, and move the control object 103 from the position where line of sight 3 is focused to the position where line of sight 4 is focused according to the moving direction of the user's eyeball, for example to a position coinciding with the control 104; the arrow 103 shown by a dotted line in fig. 4 is the moved control object. Thus the control object and control command are determined according to the collected eye behavior characteristics, and control of the control object is realized according to the control command. Of course, the above is only an exemplary illustration; the method and apparatus may also be applied to other scenarios, which are not limited herein.
The application scenario of fig. 4 is explained below as an example. Still taking the map application as an example for explanation, a cursor (e.g., 103) may be displayed in the map application, and the cursor may be moved by tracking the eyeball of the user. When the cursor is moved to a particular location, for example, to coincide with the first control 104, the operation of clicking on the control 104 may be performed by blinking twice. When the cursor is in the area without the control, the operation of zooming in or zooming out can be respectively carried out by blinking twice or blinking three times.
In some embodiments, the method provided by the embodiment of the present invention may control multiple elements or the whole of multiple elements, for example, the application scenario shown in fig. 1. In some embodiments, the method provided by the embodiment of the present invention may control a single element, for example, the application scenario shown in fig. 4, and may control the cursor 103. In another embodiment, a combination of the two approaches may be implemented, i.e., control of multiple elements or multiple elements in their entirety, as well as control of a single element. Taking a moving element as an example, the method provided by the embodiment of the present invention may collect a moving track for an eyeball, and move a single element, for example, the cursor 103, from a position where the line of sight 3 is focused to a position where the line of sight 4 is focused. When the eyeball of the user moves and the end position of the eyeball movement track is the edge of the display unit of the electronic equipment, the multiple elements or the whole movement of the multiple elements can be controlled. For example, at this time, the plurality of display elements displayed on the display screen may be controlled to move upward, downward, leftward or rightward as a whole. Of course, this is merely an exemplary illustration and is not to be construed as a limitation of the present invention.
The method provided by the embodiment of the invention is explained in detail above, and the control device provided by the embodiment of the invention is explained below.
Fig. 5 is a schematic diagram of a control device according to an embodiment of the present invention.
A control device 500 comprising:
the acquisition module is used for acquiring the eye behavior characteristics of the user;
the determining module is used for determining a control object and a control command corresponding to the eye behavior characteristics according to the eye behavior characteristics of the user when the eye behavior characteristics of the user acquired by the acquiring module meet preset conditions;
and the control module is used for realizing the control of the control object according to the control command generated by the determination module.
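A skeletal sketch of how these three modules might compose; the class and method names are illustrative assumptions, and the bodies are placeholders standing in for the behaviors described in the embodiments:

```python
class ControlDevice:
    """Minimal composition of the acquisition, determining and control modules."""

    def acquire(self):
        """Acquisition module: collect the user's eye behavior characteristics."""
        raise NotImplementedError

    def determine(self, features):
        """Determining module: when the features meet a preset condition, return
        (control_object, control_command); otherwise return None."""
        raise NotImplementedError

    def control(self, control_object, command):
        """Control module: apply the control command to the control object."""
        raise NotImplementedError

    def run_once(self):
        features = self.acquire()
        decision = self.determine(features)
        if decision is not None:
            self.control(*decision)
```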
In some embodiments, the acquisition module is specifically configured to acquire any one or more of a moving direction of the user's eyes, a moving speed of the user's eyes, a moving distance of the user's eyes, a starting position and/or an ending position of the movement of the user's eyes, a size of the user's pupils, an open or closed state of the user's eyes, a distance between upper and lower eyelids of the user, a number of blinks of the user, and a frequency of blinks of the user.
In some embodiments, the determining module specifically includes any one or more of the following sub-modules:
the first determining sub-module is used for determining that the eye behavior characteristics of the user meet the preset conditions when the moving distance and/or the moving speed of the eyeballs of the user are larger than the set threshold;
the second determining sub-module is used for determining that the eye behavior characteristics of the user meet the preset conditions when the moving direction of the eyeballs of the user is the same as the preset direction;
the third determining sub-module is used for determining that the eye behavior characteristics of the user meet the preset conditions when the starting position and/or the ending position of the eyeball movement of the user is the preset position;
the fourth determining sub-module is used for determining that the eye behavior characteristics of the user meet the preset conditions when the size of the pupil of the user is increased or reduced;
the fifth determining submodule is used for determining that the eye behavior characteristics of the user meet the preset conditions when the distance between the upper eyelid and the lower eyelid of the user is increased or decreased;
the sixth determining submodule is used for determining that the eye behavior characteristics of the user meet the preset conditions when the blinking times of the user meet the preset times;
and the seventh determining submodule is used for determining that the eye behavior characteristics of the user meet the preset conditions when the blinking frequency of the user is greater than the set threshold value.
In some embodiments, the determining module is specifically configured to: acquiring a display object corresponding to a user eye focusing point, and determining the display object as a control object; or acquiring a movement track of the eyeball of the user, determining a starting position or an ending position of the movement track of the eyeball of the user, and determining a display object corresponding to the starting position or the ending position of the movement track as a control object.
In some embodiments, the determining module is specifically configured to: determining the moving direction of eyeballs of a user, and determining the moving direction of a control object according to the moving direction of the eyeballs of the user; generating a control command for moving the control object according to the moving direction of the control object; or acquiring a preset target moving position, and generating a control command for moving the control object to the target moving position.
In some embodiments, the control module is further configured to: when it is detected that the eyeball of the user stops moving or the eyeball of the user is focused at a preset position, the operation of moving the control object is stopped.
In some embodiments, the control module is further configured to: and when the eyeball of the user stops moving and is focused on the edge of the display unit of the electronic equipment, controlling the control object to continuously move.
In some embodiments, the determining module is specifically configured to: when the size of the pupil of the user or the distance between the upper eyelid and the lower eyelid is determined to be increased, a command for enlarging the control object is generated; when the size of the pupil of the user or the distance between the upper eyelid and the lower eyelid is determined to be smaller, a command for narrowing the control object is generated.
In some embodiments, the determining module is specifically configured to: when the blink frequency of the user is determined to be the first preset frequency, generating a command for amplifying the control object; and/or generating a command for shrinking the control object when the frequency of blinking of the user is determined to be a second preset frequency; and/or when the frequency of blinking of the user is determined to be a third preset frequency, generating a command for clicking the control object; and/or generating a command for double-clicking the control object when the number of times of blinking of the user is determined to be a fourth preset number of times.
In some embodiments, the apparatus further comprises:
the display module is used for responding to the trigger operation of a user and displaying a first indication object; wherein the first indication object is used for indicating the movement track or the focusing position of the eyeball of the user.
The determination module is further to: when the eyeball of the user focuses on the first indication object, the first indication object is used as a control object; or when the first indication object is coincident with a first control, the first control is taken as a control object.
The arrangement of each unit or module of the device of the present invention can be implemented by referring to the methods shown in fig. 1 to 4, which are not described herein again.
Referring to fig. 6, a block diagram of a control device is shown in accordance with an exemplary embodiment. For example, the apparatus 600 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, an exercise device, a personal digital assistant, and the like.
Referring to fig. 6, apparatus 600 may include one or more of the following components: processing component 602, memory 604, power component 606, multimedia component 608, audio component 610, input/output (I/O) interface 612, sensor component 614, and communication component 616.
The processing component 602 generally controls overall operation of the device 600, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing component 602 may include one or more processors 620 to execute instructions to perform all or a portion of the steps of the methods described above. Further, the processing component 602 can include one or more modules that facilitate interaction between the processing component 602 and other components. For example, the processing component 602 can include a multimedia module to facilitate interaction between the multimedia component 608 and the processing component 602.
The memory 604 is configured to store various types of data to support operation at the device 600. Examples of such data include instructions for any application or method operating on device 600, contact data, phonebook data, messages, pictures, videos, and so forth. The memory 604 may be implemented by any type or combination of volatile or non-volatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
Power supply component 606 provides power to the various components of device 600. The power components 606 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the apparatus 600.
The multimedia component 608 includes a screen that provides an output interface between the device 600 and a user. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive an input signal from a user. The touch panel includes one or more touch sensors to sense touch, slide, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 608 includes a front facing camera and/or a rear facing camera. The front camera and/or the rear camera may receive external multimedia data when the device 600 is in an operating mode, such as a shooting mode or a video mode. Each front camera and rear camera may be a fixed optical lens system or have a focal length and optical zoom capability.
The audio component 610 is configured to output and/or input audio signals. For example, audio component 610 includes a Microphone (MIC) configured to receive external audio signals when apparatus 600 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signal may further be stored in the memory 604 or transmitted via the communication component 616. In some embodiments, audio component 610 further includes a speaker for outputting audio signals.
The I/O interface 612 provides an interface between the processing component 602 and peripheral interface modules, which may be keyboards, click wheels, buttons, etc. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
The sensor component 614 includes one or more sensors for providing status assessments of various aspects of the apparatus 600. For example, the sensor component 614 may detect an open/closed state of the device 600 and the relative positioning of components, such as the display and keypad of the apparatus 600; it may also detect a change in position of the apparatus 600 or one of its components, the presence or absence of user contact with the apparatus 600, the orientation or acceleration/deceleration of the apparatus 600, and a change in its temperature. The sensor component 614 may include a proximity sensor configured to detect the presence of a nearby object without any physical contact, and may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor component 614 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 616 is configured to facilitate communications between the apparatus 600 and other devices in a wired or wireless manner. The apparatus 600 may access a wireless network based on a communication standard, such as WiFi, 2G or 3G, or a combination thereof. In an exemplary embodiment, the communication component 616 receives broadcast signals or broadcast related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 616 further includes a Near Field Communication (NFC) module to facilitate short-range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, infrared data association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the apparatus 600 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, micro-controllers, microprocessors or other electronic components for performing the above-described methods.
Specifically, the embodiment of the present invention provides a control device 600, which includes a memory 604, and one or more programs, wherein the one or more programs are stored in the memory 604, and configured to be executed by one or more processors 620, and the one or more programs include instructions for:
collecting eye behavior characteristics of a user;
when the eye behavior characteristics of the user accord with preset conditions, determining a control object and a control command corresponding to the eye behavior characteristics according to the eye behavior characteristics of the user;
and controlling the control object according to the control command.
In an exemplary embodiment, a non-transitory computer readable storage medium comprising instructions, such as the memory 604 comprising instructions, executable by the processor 620 of the apparatus 600 to perform the above-described method is also provided. For example, the non-transitory computer readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
A non-transitory computer readable storage medium in which instructions, when executed by a processor of an electronic device, enable the electronic device to perform a control method, the method comprising:
collecting eye behavior characteristics of a user;
when the eye behavior characteristics of the user accord with preset conditions, determining a control object and a control command corresponding to the eye behavior characteristics according to the eye behavior characteristics of the user;
and controlling the control object according to the control command.
Fig. 7 is a schematic structural diagram of a server in an embodiment of the present invention. The server 700 may vary significantly depending on configuration or performance, and may include one or more Central Processing Units (CPUs) 722 (e.g., one or more processors) and memory 732, one or more storage media 730 (e.g., one or more mass storage devices) storing applications 742 or data 744. Memory 732 and storage medium 730 may be, among other things, transient storage or persistent storage. The program stored in the storage medium 730 may include one or more modules (not shown), each of which may include a series of instruction operations for the server. Further, the central processor 722 may be configured to communicate with the storage medium 730, and execute a series of instruction operations in the storage medium 730 on the server 700.
The server 700 may also include one or more power supplies 726, one or more wired or wireless network interfaces 750, one or more input/output interfaces 758, one or more keyboards 756, and/or one or more operating systems 741, such as Windows Server™, Mac OS X™, Unix™, Linux™, FreeBSD™, and the like.
Other embodiments of the invention will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. This invention is intended to cover any variations, uses, or adaptations of the invention following, in general, the principles of the invention and including such departures from the present disclosure as come within known or customary practice within the art to which the invention pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the invention being indicated by the following claims.
It will be understood that the invention is not limited to the precise arrangements described above and shown in the drawings, and that various modifications and changes may be made without departing from its scope. The scope of the invention is limited only by the appended claims.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that fall within the spirit and principle of the present invention are intended to be included therein.
It is noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element. The invention may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. The invention may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote computer storage media, including memory storage devices.
The embodiments in the present specification are described in a progressive manner, and the same and similar parts among the embodiments are referred to each other, and each embodiment focuses on the differences from the other embodiments. In particular, for the apparatus embodiment, since it is substantially similar to the method embodiment, it is relatively simple to describe, and reference may be made to some descriptions of the method embodiment for relevant points. The above-described embodiments of the apparatus are merely illustrative, and the units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment. One of ordinary skill in the art can understand and implement it without inventive effort. The foregoing is directed to embodiments of the present invention, and it is understood that various modifications and improvements can be made by those skilled in the art without departing from the spirit of the invention.

Claims (10)

1. A control method, comprising:
collecting a user's eye behavior characteristics;
when the user's eye behavior characteristics meet preset conditions, determining, according to the user's eye behavior characteristics, a control object and a control command corresponding to the eye behavior characteristics;
and controlling the control object according to the control command.
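Read as an algorithm, claim 1 is a gate-then-dispatch loop over sampled gaze data. The following Python sketch is illustrative only and not part of the claims; the feature fields, threshold values, and the mapping from behavior to command are all assumptions.

```python
from dataclasses import dataclass

# Illustrative sketch of the claimed three-step method; field names and
# thresholds are assumptions, not part of the disclosure.
@dataclass
class EyeFeatures:
    move_distance: float   # pixels travelled by the gaze point
    move_speed: float      # pixels per second
    blink_count: int       # blinks observed in the sampling window

def meets_preset_condition(f: EyeFeatures) -> bool:
    # Step-2 gate: any one condition suffices (compare claim 3).
    return f.move_distance > 120.0 or f.move_speed > 400.0 or f.blink_count >= 2

def control_step(f: EyeFeatures) -> str:
    """One pass: collect, gate, derive object and command, then control."""
    if not meets_preset_condition(f):
        return "no-op"
    control_object = "object-under-gaze"                       # stand-in for hit-testing
    command = "move" if f.move_distance > 120.0 else "select"  # behavior-to-command mapping
    return f"{command} -> {control_object}"

if __name__ == "__main__":
    print(control_step(EyeFeatures(move_distance=150.0, move_speed=80.0, blink_count=0)))
```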
2. The method of claim 1, wherein the user's eye behavior characteristics comprise any one or more of: a movement direction of the user's eyeballs, a movement speed of the user's eyeballs, a movement distance of the user's eyeballs, a start position and/or an end position of the eyeball movement, a size of the user's pupils, an open or closed state of the user's eyes, a distance between the user's upper and lower eyelids, a number of times the user blinks, and a blink frequency of the user.
3. The method according to claim 1 or 2, wherein the user's eye behavior characteristics meeting the preset conditions comprise any one or more of the following:
the movement distance and/or movement speed of the user's eyeballs is greater than a set threshold;
the movement direction of the user's eyeballs matches a preset direction;
the start position and/or end position of the eyeball movement is a preset position;
the user's pupil dilates or contracts;
the distance between the user's upper and lower eyelids increases or decreases;
the number of times the user blinks matches a preset number;
the user's blink frequency is greater than a set threshold.
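By way of illustration (not part of the claims), the disjunctive conditions of claim 3 reduce to simple threshold and equality tests over the collected characteristics. All field names and threshold values in this sketch are assumed:

```python
from dataclasses import dataclass

@dataclass
class EyeFeatures:
    move_distance: float        # pixels
    move_speed: float           # pixels per second
    move_direction: str         # e.g. "left", "right", "up", "down"
    pupil_delta: float          # positive = dilation, negative = constriction
    eyelid_gap_delta: float     # change in distance between upper and lower eyelids
    blink_count: int
    blink_frequency: float      # blinks per second

# Assumed thresholds; the disclosure leaves the concrete values open.
DISTANCE_THRESHOLD = 120.0
SPEED_THRESHOLD = 400.0
PRESET_DIRECTION = "right"
PRESET_BLINK_COUNT = 2
BLINK_FREQ_THRESHOLD = 1.5

def meets_preset_condition(f: EyeFeatures) -> bool:
    """Any one of the claim-3 conditions suffices (they combine with 'or')."""
    return (
        f.move_distance > DISTANCE_THRESHOLD
        or f.move_speed > SPEED_THRESHOLD
        or f.move_direction == PRESET_DIRECTION
        or f.pupil_delta != 0.0
        or f.eyelid_gap_delta != 0.0
        or f.blink_count == PRESET_BLINK_COUNT
        or f.blink_frequency > BLINK_FREQ_THRESHOLD
    )

if __name__ == "__main__":
    sample = EyeFeatures(move_distance=150.0, move_speed=80.0, move_direction="left",
                         pupil_delta=0.0, eyelid_gap_delta=0.0, blink_count=0,
                         blink_frequency=0.2)
    print(meets_preset_condition(sample))   # True: distance exceeds the threshold
```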
4. The method of claim 1, wherein the determining a control object according to the user's eye behavior characteristics comprises:
acquiring a display object corresponding to the focus point of the user's eyes, and determining the display object as the control object; or,
acquiring a movement track of the user's eyeballs, determining a start position or an end position of the movement track, and determining a display object corresponding to the start or end position of the movement track as the control object.
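Both branches of claim 4 come down to hit-testing a gaze coordinate against the displayed objects, differing only in which coordinate is tested: the current focus point, or the start/end point of the movement track. A hypothetical sketch using rectangular bounds (all names assumed):

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class DisplayObject:
    name: str
    rect: Tuple[float, float, float, float]  # (x, y, width, height)

    def contains(self, point: Tuple[float, float]) -> bool:
        x, y, w, h = self.rect
        px, py = point
        return x <= px <= x + w and y <= py <= y + h

def object_at(point: Tuple[float, float],
              objects: List[DisplayObject]) -> Optional[DisplayObject]:
    """Hit-test a gaze coordinate against the on-screen objects."""
    for obj in objects:
        if obj.contains(point):
            return obj
    return None

def resolve_control_object(objects: List[DisplayObject],
                           focus: Optional[Tuple[float, float]] = None,
                           track: Optional[List[Tuple[float, float]]] = None,
                           use_start: bool = True) -> Optional[DisplayObject]:
    # Branch 1 of claim 4: the object under the current focus point.
    if focus is not None:
        return object_at(focus, objects)
    # Branch 2: the object under the start (or end) of the movement track.
    if track:
        return object_at(track[0] if use_start else track[-1], objects)
    return None

if __name__ == "__main__":
    icons = [DisplayObject("mail", (0, 0, 100, 100)),
             DisplayObject("camera", (100, 0, 100, 100))]
    print(resolve_control_object(icons, focus=(150.0, 50.0)))                             # camera
    print(resolve_control_object(icons, track=[(10.0, 10.0), (150.0, 50.0)], use_start=False))
```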
5. The method of claim 1, further comprising:
in response to a trigger operation by the user, displaying a first indication object, the first indication object indicating the movement track or focus position of the user's eyeballs;
wherein the determining a control object according to the user's eye behavior characteristics comprises:
when the user's eyeballs focus on the first indication object, taking the first indication object as the control object; or, when the first indication object coincides with a first control, taking the first control as the control object.
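Claim 5's two selection rules can be read as a gaze hit test on the indication object plus an overlap test between the indication object and a control. The sketch below assumes axis-aligned rectangles; every name is illustrative, not the disclosed implementation:

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class Rect:
    x: float
    y: float
    w: float
    h: float

    def contains(self, point: Tuple[float, float]) -> bool:
        px, py = point
        return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h

    def overlaps(self, other: "Rect") -> bool:
        return not (self.x + self.w < other.x or other.x + other.w < self.x
                    or self.y + self.h < other.y or other.y + other.h < self.y)

def pick_control_object(indicator: Rect, control: Rect,
                        gaze: Tuple[float, float]) -> Optional[str]:
    """Claim-5 selection: the indication object itself when gazed at,
    otherwise the first control it currently coincides with."""
    if indicator.contains(gaze):
        return "indication-object"
    if indicator.overlaps(control):
        return "first-control"
    return None

if __name__ == "__main__":
    cursor = Rect(90, 90, 20, 20)       # gaze-driven indication object
    button = Rect(100, 100, 60, 30)     # a first control on screen
    print(pick_control_object(cursor, button, gaze=(95.0, 95.0)))  # indication-object
    print(pick_control_object(cursor, button, gaze=(0.0, 0.0)))    # first-control
```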
6. The method of claim 1, wherein the determining a control command corresponding to the eye behavior characteristics comprises:
determining a movement direction of the user's eyeballs, determining a movement direction of the control object according to the movement direction of the eyeballs, and generating a control command for moving the control object in the determined direction;
or,
acquiring a preset target position, and generating a control command for moving the control object to the target position.
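The two branches of claim 6 yield either a relative move command derived from the eyeball's direction or an absolute move command toward a preset target. A minimal illustrative sketch; the direction vector, step size, and command encoding are assumptions:

```python
from typing import Tuple

def move_command_from_direction(eye_dir: Tuple[float, float],
                                step: float = 10.0) -> Tuple[str, Tuple[float, float]]:
    """Branch 1: move the control object along the eyeball's direction,
    normalized to a fixed step per update."""
    dx, dy = eye_dir
    norm = (dx * dx + dy * dy) ** 0.5 or 1.0   # guard against a zero vector
    return ("move-by", (step * dx / norm, step * dy / norm))

def move_command_to_target(target: Tuple[float, float]) -> Tuple[str, Tuple[float, float]]:
    """Branch 2: move the control object to a preset target position."""
    return ("move-to", target)

if __name__ == "__main__":
    print(move_command_from_direction((1.0, 0.0)))   # ('move-by', (10.0, 0.0))
    print(move_command_to_target((640.0, 360.0)))    # ('move-to', (640.0, 360.0))
```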
7. The method of claim 6, further comprising:
stopping the operation of moving the control object when it is detected that the user's eyeballs have stopped moving or are focused on a preset position.
8. The method of claim 6, further comprising:
when the user's eyeballs stop moving while focused on an edge of a display unit of the electronic device, controlling the control object to continue moving.
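Claims 7 and 8 jointly define when a move operation halts, with one exception at the display edge where movement continues (useful, for example, for dragging an object past the visible area). A compact decision rule, assumed for illustration:

```python
def next_move_state(eye_moving: bool, gaze_at_preset: bool, gaze_at_edge: bool) -> str:
    """Combined stop/continue rule of claims 7 and 8 (illustrative sketch):
    stop when the eye halts or fixates a preset position, but keep moving
    when the eye halts while fixating the display edge."""
    if not eye_moving and gaze_at_edge:
        return "keep-moving"   # claim 8: continuous movement at the screen edge
    if not eye_moving or gaze_at_preset:
        return "stop"          # claim 7: halt the move operation
    return "follow-gaze"

if __name__ == "__main__":
    print(next_move_state(eye_moving=False, gaze_at_preset=False, gaze_at_edge=True))   # keep-moving
    print(next_move_state(eye_moving=False, gaze_at_preset=False, gaze_at_edge=False))  # stop
```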
9. A control device, comprising:
an acquisition module, configured to collect a user's eye behavior characteristics;
a determining module, configured to determine, when the eye behavior characteristics collected by the acquisition module meet preset conditions, a control object and a control command corresponding to the eye behavior characteristics according to the user's eye behavior characteristics; and
a control module, configured to control the control object according to the control command determined by the determining module.
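The apparatus of claim 9 mirrors the method of claim 1 as three cooperating modules. A skeletal sketch of that structure with placeholder behavior; every name here is an assumption, not the disclosed implementation:

```python
from dataclasses import dataclass

@dataclass
class EyeFeatures:
    blink_count: int

class AcquisitionModule:
    def collect(self) -> EyeFeatures:
        return EyeFeatures(blink_count=2)    # placeholder for camera-based capture

class DeterminingModule:
    def determine(self, f: EyeFeatures):
        if f.blink_count >= 2:               # assumed preset condition
            return "object-under-gaze", "select"
        return None, None

class ControlModule:
    def execute(self, obj, cmd):
        if obj and cmd:
            print(f"{cmd} -> {obj}")         # stand-in for issuing the command

if __name__ == "__main__":
    features = AcquisitionModule().collect()
    obj, cmd = DeterminingModule().determine(features)
    ControlModule().execute(obj, cmd)
```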
10. A control apparatus, comprising a memory, one or more processors, and one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by the one or more processors, the one or more programs including instructions for:
collecting a user's eye behavior characteristics;
when the user's eye behavior characteristics meet preset conditions, determining, according to the user's eye behavior characteristics, a control object and a control command corresponding to the eye behavior characteristics;
and controlling the control object according to the control command.
CN201610629333.XA 2016-08-03 2016-08-03 A kind of control method and device Pending CN107688385A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610629333.XA CN107688385A (en) 2016-08-03 2016-08-03 A kind of control method and device


Publications (1)

Publication Number Publication Date
CN107688385A (en) 2018-02-13

Family

ID=61151296

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610629333.XA Pending CN107688385A (en) 2016-08-03 2016-08-03 A kind of control method and device

Country Status (1)

Country Link
CN (1) CN107688385A (en)


Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1740951A (en) * 2004-08-25 2006-03-01 西门子公司 Apparatus for device control using human eyes
CN1889016A (en) * 2006-07-25 2007-01-03 周辰 Eye-to-computer cursor automatic positioning controlling method and system
US20110169730A1 (en) * 2008-06-13 2011-07-14 Pioneer Corporation Sight line input user interface unit, user interface method, user interface program, and recording medium with user interface program recorded
CN103914151A (en) * 2014-04-08 2014-07-09 小米科技有限责任公司 Information display method and device
CN105095849A (en) * 2014-05-23 2015-11-25 财团法人工业技术研究院 Object identification method and device
CN104951070A (en) * 2015-06-02 2015-09-30 无锡天脉聚源传媒科技有限公司 Method and device for manipulating device based on eyes
CN105338192A (en) * 2015-11-25 2016-02-17 努比亚技术有限公司 Mobile terminal and operation processing method thereof

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108563238B (en) * 2018-06-15 2021-08-24 歌尔科技有限公司 Method, device, equipment and system for remotely controlling unmanned aerial vehicle
CN108563238A (en) * 2018-06-15 2018-09-21 歌尔科技有限公司 A kind of method, apparatus of remote controlled drone, equipment and system
CN108613683A (en) * 2018-06-26 2018-10-02 威马智慧出行科技(上海)有限公司 On-vehicle navigation apparatus, method and automobile
CN109298782A (en) * 2018-08-31 2019-02-01 阿里巴巴集团控股有限公司 Eye movement exchange method, device and computer readable storage medium
CN109298782B (en) * 2018-08-31 2022-02-18 创新先进技术有限公司 Eye movement interaction method and device and computer readable storage medium
CN110514219A (en) * 2019-09-20 2019-11-29 广州小鹏汽车科技有限公司 A kind of navigation map display methods, device, vehicle and machine readable media
CN110514219B (en) * 2019-09-20 2022-03-18 广州小鹏汽车科技有限公司 Navigation map display method and device, vehicle and machine readable medium
CN110825228A (en) * 2019-11-01 2020-02-21 腾讯科技(深圳)有限公司 Interaction control method and device, storage medium and electronic device
CN110908513A (en) * 2019-11-18 2020-03-24 维沃移动通信有限公司 Data processing method and electronic equipment
CN110908513B (en) * 2019-11-18 2022-05-06 维沃移动通信有限公司 Data processing method and electronic equipment
CN113138659A (en) * 2020-01-16 2021-07-20 七鑫易维(深圳)科技有限公司 Method, device and equipment for controlling working mode and storage medium
CN112506344A (en) * 2020-12-09 2021-03-16 上海龙旗科技股份有限公司 Visual control method and equipment
CN114327082A (en) * 2022-03-04 2022-04-12 深圳市信润富联数字科技有限公司 Method and system for controlling industrial application screen, terminal device and storage medium

Similar Documents

Publication Publication Date Title
CN107688385A (en) A kind of control method and device
CN112118380B (en) Camera control method, device, equipment and storage medium
EP3293620A1 (en) Multi-screen control method and system for display screen based on eyeball tracing technology
EP2660681A2 (en) Mobile terminal and control method thereof
CN107515669B (en) Display method and device
RU2635904C2 (en) Method and device for target object display
CN104317402B (en) Description information display method and device and electronic equipment
KR20130081117A (en) Mobile terminal and control method therof
CN107102772A (en) Touch control method and device
CN111970566A (en) Video playing method and device, electronic equipment and storage medium
CN111522498A (en) Touch response method and device and storage medium
CN112015277B (en) Information display method and device and electronic equipment
US9148537B1 (en) Facial cues as commands
CN110636383A (en) Video playing method and device, electronic equipment and storage medium
CN111061372B (en) Equipment control method and related equipment
CN106775210B (en) Wallpaper changing method and device
CN112637495B (en) Shooting method, shooting device, electronic equipment and readable storage medium
CN112114653A (en) Terminal device control method, device, equipment and storage medium
CN106990893B (en) Touch screen operation processing method and device
CN107179866B (en) Application control method, device and storage medium
CN110636377A (en) Video processing method, device, storage medium, terminal and server
CN112650437B (en) Cursor control method and device, electronic equipment and storage medium
CN114063876A (en) Virtual keyboard setting method, device and storage medium
CN111381667A (en) Interface layout adjusting method and device and storage medium
US20210333988A1 (en) Method and device for switching interface, touch terminal, and storage medium

Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination
RJ01: Rejection of invention patent application after publication (application publication date: 20180213)