
CN105630364B - A kind of information processing method and electronic equipment - Google Patents


Info

Publication number: CN105630364B
Authority: CN (China)
Prior art keywords: position information, point, information, display unit, operation object
Legal status: Active (assumed; not a legal conclusion)
Application number: CN201410594202.3A
Other languages: Chinese (zh)
Other versions: CN105630364A
Inventor: 刘云辉
Current assignee: Lenovo Beijing Ltd
Original assignee: Lenovo Beijing Ltd
Application filed by Lenovo Beijing Ltd; priority to CN201410594202.3A; published as CN105630364A, then granted and published as CN105630364B

Landscapes

  • Position Input By Displaying (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses an information processing method and an electronic device. The method comprises the steps of: a detection unit detecting an operation object; the detection unit acquiring first position information of an operation point of the operation object, the first position information characterizing the position of the operation point in a touch coordinate system corresponding to a first display unit; determining second position information according to the first position information and a first conversion parameter, the second position information characterizing the position of the operation point in a display coordinate system of a second display unit; and controlling the second display unit to display operation point information corresponding to the operation object at the second position information.

Description

Information processing method and electronic equipment
Technical Field
The present invention relates to intelligent terminal technologies, and in particular, to an information processing method and an electronic device.
Background
At present, in an electronic device with a projection function, in particular a smart phone or a tablet computer, a user first needs to select the information to be projected from the content displayed on the display screen of the electronic device, and then project that information.
Disclosure of Invention
In view of the above, an object of the present invention is to provide an information processing method and an electronic device that can solve at least the above problems in the prior art.
To achieve the above object, the technical solution of the present invention is realized as follows:
the invention provides an information processing method applied to an electronic device, where the electronic device has a first display unit, a second display unit, and a detection unit; when the second display unit is in a working state, the method comprises the following steps:
the detection unit detects an operation object;
the detection unit acquires first position information of an operation point of the operation object, wherein the first position information represents position information of the operation point of the operation object in a touch coordinate system corresponding to the first display unit;
determining second position information according to the first position information and the first conversion parameter, wherein the second position information represents the position information of the operation point of the operation object in a display coordinate system of a second display unit;
and controlling the second display unit to display the operation point information corresponding to the operation object at the second position information.
In the above scheme, the detection unit is a touch screen;
the detection unit acquires first position information of an operation point of the operation object, including: the detection unit takes the touch point of the detected operation object as an operation point and acquires position information of the operation point as first position information.
In the above scheme, the method further comprises:
when the first display unit is in a closed state, the detection unit acquires a first operation of an operation point of an operation object and analyzes position information of the first operation; determining first displacement information of an operation point of the operation object according to the position information of the first operation; converting the first displacement information into second displacement information according to a preset first conversion parameter, and controlling the second display unit to move and display an operation point corresponding to the operation object according to the second displacement information;
when the first display unit is in an open state, the detection unit acquires a second operation of an operation point of an operation object and analyzes position information of the second operation; acquiring position information of a termination point in the position information of the second operation; and converting the position information of the termination point according to a preset first conversion parameter to obtain the converted position information of the termination point, and controlling the second display unit to display the operation point information corresponding to the operation object according to the converted position information of the termination point.
In the foregoing solution, the acquiring, by the detection unit, of first position information of an operation point of the operation object includes:
detecting to obtain spatial position information of an operation point of the operation object, wherein the spatial position information comprises depth information, horizontal position information and vertical position information of the operation point of the operation object in a first coordinate system;
and converting the spatial position information into first position information according to a preset second conversion parameter.
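As a sketch of the conversion just described: the second conversion parameter projects spatial position information (depth, horizontal, vertical) in the first coordinate system onto the touch coordinate system of the first display unit. The per-axis (scale, offset) form used below is an illustrative assumption; the patent does not specify the parameter's internal structure.

```python
def spatial_to_first_position(spatial, second_conversion):
    """Project spatial position information (depth, horizontal,
    vertical) in the first coordinate system onto the touch
    coordinate system of the first display unit. The per-axis
    (scale, offset) parameter form is an assumption for
    illustration, not taken from the patent."""
    _depth, h, v = spatial
    (sx, ox), (sy, oy) = second_conversion
    return h * sx + ox, v * sy + oy
```

The depth component is carried along for gesture detection (see the fifth operation) but does not enter the planar touch coordinates in this sketch.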
In the above scheme, the method further comprises:
when the first display unit is in an open state, the detection unit acquires a third operation of an operation point of an operation object and analyzes position information of the third operation; determining horizontal displacement information of the operation point of the operation object relative to the electronic equipment according to the position information of the third operation; converting the horizontal displacement information into third displacement information according to a preset second conversion parameter, converting the third displacement information into fourth displacement information according to a preset first conversion parameter, and controlling the second display unit to move and display an operation point corresponding to the operation object according to the fourth displacement information;
when the first display unit is in a closed state, the detection unit acquires a fourth operation of an operation point of an operation object and analyzes position information of the fourth operation; acquiring position information of a termination point in the position information of the fourth operation; and converting the position information of the termination point according to a preset second conversion parameter and a first conversion parameter to obtain the converted position information of the termination point, and controlling the second display unit to display the operation point information corresponding to the operation object according to the converted position information of the termination point.
In the above scheme, the method further comprises:
detecting a fifth operation of the operation object,
extracting depth information in the spatial position information of the start point and depth information of the spatial position information of the end point of the fifth operation;
judging whether the difference value between the depth information of the spatial position information of the starting point and the depth information of the spatial position information of the ending point meets a preset threshold value or not, and generating a first operation instruction when the difference value meets the preset threshold value;
and executing corresponding processing according to the first operation instruction, acquiring a processing result, and displaying the processing result through the second display unit.
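The fifth-operation handling above can be sketched as follows; the tuple layout (depth first) and the instruction name are illustrative assumptions, not taken from the patent.

```python
def depth_gesture(start, end, threshold):
    """Check the fifth operation: compare the depth components of
    the start and end spatial positions; when their difference
    satisfies the preset threshold, generate a first operation
    instruction. Tuple layout (depth, horizontal, vertical) and
    the instruction name are illustrative assumptions."""
    depth_start, depth_end = start[0], end[0]
    if abs(depth_start - depth_end) >= threshold:
        return "FIRST_OPERATION_INSTRUCTION"  # e.g. a press/confirm
    return None
```

A push of the finger toward or away from the sensor thus acts as a confirmation gesture, while small depth jitter below the threshold is ignored.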
The present invention also provides an electronic device, including:
a detection unit configured to detect an operation object; acquiring first position information of an operation point of the operation object, wherein the first position information represents position information of the operation point of the operation object in a touch coordinate system corresponding to a first display unit;
the first conversion unit is used for determining second position information according to the first position information and the first conversion parameter, and the second position information represents the position information of the operation point of the operation object in a display coordinate system of the second display unit;
and the processing unit is used for controlling the second display unit to display the operation point information corresponding to the operation object at the second position information.
In the foregoing solution, the detecting unit is a touch screen, and is specifically configured to acquire first position information of an operation point of the operation object, and includes: the detection unit takes the touch point of the detected operation object as an operation point and acquires position information of the operation point as first position information.
In the above scheme, the detection unit is further configured to, when the first display unit is in a closed state, acquire a first operation of an operation point of an operation object, and analyze position information of the first operation; determining first displacement information of an operation point of the operation object according to the position information of the first operation; converting the first displacement information into second displacement information according to a preset first conversion parameter;
correspondingly, the processing unit is further configured to control the second display unit to move and display the operation point corresponding to the operation object according to the second displacement information;
or,
the detection unit is further used for acquiring a second operation of an operation point of an operation object when the first display unit is in an on state, and analyzing position information of the second operation; acquiring position information of a termination point in the position information of the second operation; converting the position information of the termination point according to a preset first conversion parameter to obtain the converted position information of the termination point;
correspondingly, the processing unit is further configured to control the second display unit to display the operation point information corresponding to the operation object according to the converted position information of the termination point.
In the above solution, the electronic device further includes: the second conversion unit is used for converting the spatial position information into first position information according to a preset second conversion parameter;
correspondingly, the detection unit is specifically configured to obtain spatial position information of the operation point of the operation object by detection, where the spatial position information includes depth information, horizontal position information, and vertical position information of the operation point of the operation object in the first coordinate system.
In the above scheme, the detection unit is further configured to, when the first display unit is in an on state, obtain a third operation of an operation point of an operation object, and analyze position information of the third operation; determining horizontal displacement information of the operation point of the operation object relative to the electronic equipment according to the position information of the third operation; converting the horizontal displacement information into third displacement information according to a preset second conversion parameter, and converting the third displacement information into fourth displacement information according to a preset first conversion parameter;
correspondingly, the processing unit is further configured to control the second display unit to move and display the operation point corresponding to the operation object according to the fourth displacement information;
or,
the detection unit is further configured to acquire a fourth operation of an operation point of an operation object when the first display unit is in a closed state, and analyze position information of the fourth operation; acquiring position information of a termination point in the position information of the fourth operation; converting the position information of the termination point according to a preset second conversion parameter and a first conversion parameter to obtain the converted position information of the termination point;
correspondingly, the processing unit is further configured to control the second display unit to display the operation point information corresponding to the operation object according to the converted position information of the termination point.
In the foregoing solution, the detecting unit is further configured to detect a fifth operation of the operation object, and extract depth information in spatial position information of a start point and depth information of spatial position information of an end point of the fifth operation;
the processing unit is further configured to determine whether a difference between the depth information of the spatial position information of the starting point and the depth information of the spatial position information of the ending point satisfies a preset threshold, and generate a first operation instruction when the difference satisfies the preset threshold; and executing corresponding processing according to the first operation instruction, acquiring a processing result, and displaying the processing result through the second display unit.
The information processing method and the electronic device provided by the invention determine the position information of the operation object in the display coordinate system of the second display unit by acquiring its position information in the touch coordinate system of the first display unit. The user can therefore interact directly with the first display unit of the electronic device while the interaction is displayed through the second display unit, giving the user the experience of interacting directly with the content displayed by the second display unit and improving operation efficiency.
Drawings
FIG. 1 is a first flowchart illustrating an information processing method according to an embodiment of the present invention;
FIG. 2 illustrates a first exemplary operational scenario of an embodiment of the present invention;
FIG. 3 is a flowchart illustrating a second information processing method according to an embodiment of the present invention;
FIG. 4 is a diagram illustrating a second exemplary operational scenario of the present invention;
FIG. 5 is a third exemplary operational scenario of the present invention;
FIG. 6 is a third schematic flow chart illustrating an information processing method according to an embodiment of the present invention;
FIG. 7 is a fourth exemplary operational scenario of the present invention;
FIG. 8 is a fifth exemplary operational scenario of the present invention;
FIG. 9 is a sixth operational scenario of an embodiment of the present invention;
FIG. 10 is a seventh operational scenario of the present invention;
FIG. 11 is a flowchart illustrating a fifth exemplary operation process according to the present invention;
FIG. 12 is a diagram illustrating an eighth exemplary operational scenario of an embodiment of the present invention;
FIG. 13 is a first schematic view of an electronic device according to an embodiment of the present invention;
FIG. 14 is a schematic view of a second electronic device according to an embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings and specific embodiments.
The first embodiment
An embodiment of the present invention provides an information processing method applied to an electronic device, where the electronic device includes a first display unit, a second display unit, and a detection unit, and when the second display unit is in a working state, as shown in fig. 1, the method includes:
step 101: the detection unit detects an operation object;
step 102: the detection unit acquires first position information of an operation point of the operation object, wherein the first position information represents position information of the operation point of the operation object in a touch coordinate system corresponding to the first display unit;
step 103: determining second position information according to the first position information and the first conversion parameter, wherein the second position information represents the position information of the operation point of the operation object in a display coordinate system of a second display unit;
step 104: and controlling the second display unit to display the operation point information corresponding to the operation object at the second position information.
Here, the electronic device may be a smart phone, a tablet computer, an all-in-one machine, or the like.
The first display unit may be a display screen of the electronic device. For example, as shown in fig. 2, the first display unit 21 is a display screen of an electronic device.
The second display unit may be a projection unit of the electronic device. The second display unit is located at a first position, and the first position may be set according to actual requirements: it may be arranged at the middle of the top of the electronic device, or at the left or right side of the top. For example, as shown in fig. 2, the second display unit 22 is a projection unit disposed at the middle of the top of the electronic device.
The second display unit being in the working state means that the second display unit has been started and projection is enabled.
The operation object may be a finger, or a stylus used by the user to perform an operation.
That the detection unit detects the operation object means: detecting that a user's finger or stylus is located in the detection area of the electronic device.
The operation point of the operation object may be a point, among those in the operation area detected by the detection unit, that satisfies a preset condition.
The first conversion parameter is a parameter that converts the first position information into position information in the display coordinate system of the second display unit, and may include a conversion matrix.
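As an illustration of such a conversion matrix, the sketch below assumes a 3x3 homogeneous matrix that simply rescales touch coordinates into display coordinates; the screen and projection resolutions are made-up values, not taken from the patent.

```python
# Hypothetical first conversion parameter: a 3x3 homogeneous matrix
# mapping touch coordinates to display coordinates. The resolutions
# below are assumptions for illustration only.
TOUCH_W, TOUCH_H = 1080.0, 1920.0   # assumed touch coordinate range
DISP_W, DISP_H = 1280.0, 720.0      # assumed projection resolution

FIRST_CONVERSION = [
    [DISP_W / TOUCH_W, 0.0, 0.0],
    [0.0, DISP_H / TOUCH_H, 0.0],
    [0.0, 0.0, 1.0],
]

def to_second_display(x, y, m=FIRST_CONVERSION):
    """Map first position information (touch coordinate system of the
    first display unit) to second position information (display
    coordinate system of the second display unit)."""
    px = m[0][0] * x + m[0][1] * y + m[0][2]
    py = m[1][0] * x + m[1][1] * y + m[1][2]
    w = m[2][0] * x + m[2][1] * y + m[2][2]
    return px / w, py / w
```

With these assumed resolutions, the centre of the touch area (540, 960) maps to the centre of the projection (640, 360); a full homography with rotation or keystone correction would fill in the remaining matrix entries.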
The operation point information corresponding to the operation object may be displayed as a preset graphic, for example a circle. As shown in fig. 2, the image 23 displayed by the second display unit 22 on the projection background is the final image to be displayed, and the operation point information corresponding to the operation object is displayed as a circle 231 in the displayed image 23.
By adopting the above scheme, the position information of the operation object in the display coordinate system of the second display unit is determined by acquiring its position information in the touch coordinate system of the first display unit. The user can therefore interact directly with the first display unit of the electronic device while the interaction is displayed through the second display unit, giving the user the experience of interacting directly with the content displayed by the second display unit and improving operation efficiency.
The second embodiment
An embodiment of the present invention provides an information processing method applied to an electronic device, where the electronic device includes a first display unit, a second display unit, and a detection unit, and when the second display unit is in a working state, as shown in fig. 3, the method includes:
step 301: the detection unit detects an operation object; the detection unit is a touch screen;
step 302: the detection unit takes the touch point of the detected operation object as an operation point and acquires the position information of the operation point as first position information; the first position information represents position information of an operation point of the operation object in a touch coordinate system corresponding to the first display unit;
step 303: determining second position information according to the first position information and the first conversion parameter, wherein the second position information represents the position information of the operation point of the operation object in a display coordinate system of a second display unit;
step 304: and controlling the second display unit to display the operation point information corresponding to the operation object at the second position information.
Here, the electronic device may be a smart phone, a tablet computer, an all-in-one machine, or the like.
The first display unit may be a display screen of the electronic device. For example, as shown in fig. 2, the first display unit 21 is a display screen of an electronic device.
The second display unit may be a projection unit of the electronic device. The second display unit is located at a first position, and the first position may be set according to actual requirements: it may be arranged at the middle of the top of the electronic device, or at the left or right side of the top. For example, as shown in fig. 2, the second display unit 22 is a projection unit disposed at the middle of the top of the electronic device.
The second display unit being in the working state means that the second display unit has been started and projection is enabled.
The operation object may be a finger or may be a stylus pen used by a user to perform an operation.
The detection unit in this embodiment is a touch screen. Specifically, the detection unit may physically overlap the first display unit, with the detection unit located closer to the surface of the electronic device than the first display unit; for example, the interface visible in fig. 2 is an image displayed by the first display unit, over which a transparent touch screen is provided.
The touch screen described in this embodiment may be: resistive touch screens, capacitive touch screens, acoustic wave touch screens, and the like.
Correspondingly, the acquiring of the first position information of the operation point of the operation object by the detecting unit includes:
for example, when the touch screen is a resistive touch screen, the acquiring, by the detection unit, of the first position information of the operation point of the operation object may include: when the operation object contacts the resistive touch screen, the two conductive layers come into contact at the position of the touch point and the resistance changes, generating signals in both the horizontal and vertical coordinate directions, from which the horizontal and vertical coordinates of the operation point are obtained as the first position information.
When the touch screen is a capacitive touch screen, the acquiring, by the detection unit, of the first position information of the operation point of the operation object may include: when a finger touches the metal layer of the capacitive touch screen, a coupling capacitance is formed between the operation object and the surface of the touch screen due to the human body's electric field; for high-frequency current, this capacitance is effectively a direct conductor, so the operation object draws a first current from the contact point. This first current flows out through the electrodes at the four corners of the touch screen, and the four sub-currents flowing through the four electrodes are in direct proportion to the distances from the finger to the four corners; the controller accurately calculates the proportions of the four sub-currents to obtain the horizontal and vertical coordinate information of the touch point, and the obtained horizontal and vertical coordinates are used as the first position information.
When the touch screen is an acoustic wave touch screen, the acquiring, by the detection unit, of the first position information of the operation point of the operation object may include: transducers that transmit and receive sound waves in the horizontal and vertical coordinate directions are attached at three corners of the acoustic wave touch screen. The transducers, made of a special ceramic material, are divided into transmitting and receiving transducers: they convert the electrical signal sent over the touch screen cable into surface acoustic wave energy, and convert the surface acoustic wave energy focused by the reflecting stripes back into an electrical signal; reflecting stripes that reflect the surface ultrasonic waves are engraved on the four edges of the screen. When the operation object touches the screen, part of the acoustic wave energy is absorbed, so that the received signal changes, and the controller processes this change to obtain the horizontal and vertical coordinates of the touch.
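The capacitive current-ratio scheme described above can be approximated in a few lines. The formula below is a common textbook simplification, taking the share of current drawn through the right-hand and bottom electrodes as the normalised coordinates; it is not the calibrated computation a real controller performs.

```python
def touch_position(i_tl, i_tr, i_bl, i_br, width, height):
    """Estimate a touch point from the four corner currents of a
    surface-capacitive screen. Simplification: the share of the
    total current drawn through the right-hand (resp. bottom)
    electrodes is taken as the normalised x (resp. y) coordinate;
    real controllers apply calibration on top of a relation like
    this."""
    total = i_tl + i_tr + i_bl + i_br
    x = (i_tr + i_br) / total * width
    y = (i_bl + i_br) / total * height
    return x, y
```

Equal corner currents place the touch at the centre of the screen; as the finger moves toward one corner, that corner's current share grows and the estimate follows.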
The first position information includes: when the operation object performs a click operation, the first position information is abscissa and ordinate information in the touch coordinate system; when the operation object performs a sliding operation, the first position information is N sets of abscissa information, ordinate information, and acquisition time in the touch coordinate system of the electronic device, where N is a positive integer greater than or equal to 1.
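A minimal data representation of the two forms of first position information might look like this; the class and field names are illustrative, not from the patent.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class FirstPositionInfo:
    """Illustrative container: a click carries a single (x, y)
    sample, a slide carries N samples, each with its acquisition
    time."""
    kind: str                                 # "click" or "slide"
    points: List[Tuple[float, float, float]]  # (x, y, t) samples

click = FirstPositionInfo("click", [(540.0, 960.0, 0.0)])
slide = FirstPositionInfo("slide", [(100.0, 200.0, 0.00),
                                    (140.0, 200.0, 0.02),
                                    (180.0, 200.0, 0.04)])
```

The timestamps on slide samples are what later allow displacement (and, if desired, velocity) to be derived from the trace.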
The first conversion parameter is a parameter that converts the first position information into position information in the display coordinate system of the second display unit, and may include a conversion matrix.
The operation point information corresponding to the operation object may be displayed as a preset graphic, for example a circle. As shown in fig. 2, the image 23 displayed by the second display unit 22 on the projection background is the final image to be displayed, and the operation point information corresponding to the operation object is displayed as a circle 231 in the displayed image 23.
Preferably, for the scenario provided by this embodiment, when the user performs an operation, the second display unit displays corresponding operation information, including:
when the first display unit is in a closed state, the detection unit acquires a first operation of an operation point of an operation object and analyzes position information of the first operation; determining first displacement information of an operation point of the operation object according to the position information of the first operation; converting the first displacement information into second displacement information according to a preset first conversion parameter, and controlling the second display unit to move and display an operation point corresponding to the operation object according to the second displacement information;
the scenario for this operation can be as shown in fig. 4: as can be seen from fig. 4, the first display unit 41 is in a closed state, and displays a black screen state, and at this time, the first display unit can be used as a touch operation area; when detecting that the finger 42 of the user performs a first operation of moving from the initial position 421 to the end position 422, determining first displacement information of the finger 42 through the initial position and the end position, that is, acquiring a moving direction, a transverse displacement and a longitudinal position of the finger as the first displacement information;
then converting the first displacement information into second displacement information by the first conversion parameter, and controlling the operation point information in the image displayed by the second display unit, that is, the circular image to move from the first position 431 to the second position 432;
immediately after the above operation is completed, if the user performs the touch operation in the same direction as the above operation again, that is, when it is detected again that the finger 42 of the user performs the first operation of moving from the initial position 421 to the end position 422, the detection unit acquires the first displacement information again; after converting the first displacement information into second displacement information of a second display unit, the second display unit displays the second displacement information in a manner that the second displacement information is superimposed on the basis of the last operation point information display, that is, the display graph of the operation point information displayed in fig. 4 is moved from the second position 432 to the third position 433; and so on.
When the first display unit is in an open state, the detection unit acquires a second operation of an operation point of an operation object and analyzes position information of the second operation; acquiring position information of a termination point in the position information of the second operation; converting the position information of the termination point according to a preset first conversion parameter to obtain the converted position information of the termination point, and controlling the second display unit to display the operation point information corresponding to the operation object according to the converted position information of the termination point;
the scenario for this operation can be as shown in fig. 5: the first display unit 51 is in an open state, that is, displays an operation interface, and at this time, the first display unit and the second display unit can be subjected to absolute projection; the initial state of the user is that the finger is located at the initial position 521, and the position where the operation point information of the second display unit is obtained through the first conversion parameter is located at the position 531; when it is detected that the finger 52 of the user performs the second operation of moving from the initial position 521 to the end position 522, acquiring the position information of the end point in the position information of the second operation; and converting the position information of the termination point according to a preset first conversion parameter to obtain the converted position information of the termination point, controlling the second display unit, and displaying the operation point information 532 corresponding to the operation object at the display interface 53 according to the converted position information of the termination point. In this operation scenario, the user operates the first display unit, i.e., projects the display image directly onto the second display unit.
Preferably, when the user performs the selection operation, the present embodiment may further include:
when the first display unit is in a closed state, acquiring position information of the operation point information in the second display unit, and converting it through the first conversion parameter into position information in the touch coordinate system corresponding to the first display unit; determining target information for the selection operation according to the converted position information, responding to the selection operation according to the target information to obtain a response result, and displaying the final response result through the second display unit;
when the first display unit is in an open state, directly acquiring the position information of the selection operation of the operation object in the touch coordinate system of the first display unit, determining the target information of the selection operation according to the position information, responding to the selection operation according to the target information to obtain a response result, and controlling the first display unit and the second display unit to display the response result. The response result differs depending on the position of the selection operation: for example, if an icon of an application is selected, the response result may be starting the application and displaying the operation interface corresponding to that application; if a function button in the operation interface corresponding to the application is selected, the response result is the result of operating that function button.
By adopting the above scheme, the position information of the operation object in the display coordinate system of the second display unit is determined by acquiring the position information of the operation object in the touch coordinate system of the first display unit; therefore, the user can interact directly with the first display unit of the electronic device while the interactive operation is displayed through the second display unit, giving the user the experience of interacting directly with the content displayed by the second display unit and improving operation efficiency.
Example Three,
An embodiment of the present invention provides an information processing method applied to an electronic device, where the electronic device includes a first display unit, a second display unit, and a detection unit, and when the second display unit is in a working state, as shown in fig. 6, the method includes:
step 601: the detection unit detects an operation object;
step 602: detecting to obtain spatial position information of an operation point of the operation object, wherein the spatial position information comprises depth information, horizontal position information and vertical position information of the operation point of the operation object in a first coordinate system;
step 603: converting the spatial position information into first position information according to a preset second conversion parameter; the first position information represents position information of an operation point of the operation object in a touch coordinate system corresponding to the first display unit;
step 604: determining second position information according to the first position information and the first conversion parameter, wherein the second position information represents the position information of the operation point of the operation object in a display coordinate system of a second display unit;
step 605: and controlling the second display unit to display the operation point information corresponding to the operation object at the second position information.
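The two-stage conversion of steps 602-604 can be sketched as below: the second conversion parameter projects the sensed 3-D point (depth, horizontal, vertical) onto the touch plane of the first display unit, and the first conversion parameter then maps the result into the display coordinate system of the second display unit. The concrete functions and scale factors are illustrative assumptions, not the patent's actual parameters:

```python
# Hypothetical second conversion parameter (step 603): project the sensed
# 3-D point onto the touch plane, discarding the depth component; the
# sensor axes are assumed aligned with the touch axes.
def second_conversion(horizontal, vertical, depth):
    return (horizontal, vertical)

# Hypothetical first conversion parameter (step 604): scale touch
# coordinates into the display coordinate system of the projection unit.
def first_conversion(x, y):
    return (x * 1280 / 1080, y * 720 / 1920)

def spatial_to_display(horizontal, vertical, depth):
    """Steps 602-604: spatial position -> first position -> second position."""
    first_position = second_conversion(horizontal, vertical, depth)  # step 603
    return first_conversion(*first_position)                         # step 604

# Operation point sensed mid-panel, 35 mm above the device:
print(spatial_to_display(540.0, 960.0, 35.0))  # -> (640.0, 360.0)
```

The depth value is discarded here for positioning but retained by the method itself, since Example Three later uses depth changes to detect a selection operation.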
Here, the electronic device may be a smart phone, a tablet computer, an all-in-one machine, or the like.
The first display unit may be a display screen of the electronic device. For example, as shown in fig. 2, the first display unit 21 is a display screen of an electronic device.
The second display unit may be a projection unit of the electronic device; the second display unit is located at a first position, and the first position can be set according to actual conditions: it may be the middle of the top of the electronic device, or the left or right side of the top of the electronic device. For example, as shown in fig. 2, the second display unit 22 is a projection unit disposed at the middle of the top of the electronic device.
The second display unit being in the working state means that its projection function is started.
The operation object may be a finger or may be a stylus pen used by a user to perform an operation.
The detection unit in this embodiment may be a proximity sensor or an emission-type capacitive sensor; specifically, the detection unit may be disposed at a preset position, for example, at the middle of the upper end of the electronic device.
When the detection unit is a proximity sensor, the spatial position information of the operation point of the operation object may be detected as follows: the proximity sensor emits a preset detection signal, such as infrared light, to detect position information of the operation object within a preset range; the position of the operation object, which may include depth information, horizontal position information and vertical position information, is determined by measuring the return time of the emitted detection signal. As shown in fig. 7, assuming the proximity sensor is installed at the middle of the upper end of the electronic device, it transmits the detection signal and records the time at which the signal reflected by the finger is received, thereby determining the depth information, the horizontal position information and the vertical position information.
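The return-time measurement above amounts to a simple time-of-flight calculation. The sketch below assumes an optical (infrared) signal; the timing value in the example is illustrative:

```python
SIGNAL_SPEED_M_PER_S = 299_792_458.0  # infrared light; for ultrasound use ~343 m/s

def depth_from_return_time(round_trip_seconds):
    """Convert the round-trip time of the reflected detection signal into
    the one-way depth (distance from sensor to the operation point)."""
    return SIGNAL_SPEED_M_PER_S * round_trip_seconds / 2.0

# A finger roughly 15 cm from the sensor reflects the signal after ~1 ns:
print(depth_from_return_time(1e-9))  # ~0.15 m
```

In practice, picosecond-scale optical timing is why real proximity sensors use modulated-phase or intensity measurements rather than raw round-trip timing, but the geometric relation is the same.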
When the detection unit is an emission-type capacitive sensor, the spatial position information of the operation point of the operation object may be obtained as follows: the operation object is sensed within a preset spatial range above the capacitive touch screen. For example, as shown in fig. 8, the coverage of the emission-type capacitive sensor viewed from the side may be the range 81, and when the user's finger is sensed within this range, the spatial position information of the finger is acquired.
The first conversion parameter is a parameter that converts the first position information into position information in the display coordinate system of the second display unit, and may include a conversion matrix.
Preferably, for the scenario provided by this embodiment, when the user performs an operation, the second display unit displays corresponding operation information, including:
when the first display unit is in an open state, the detection unit acquires a third operation of an operation point of an operation object and analyzes position information of the third operation; determines horizontal displacement information of the operation point of the operation object relative to the electronic device according to the position information of the third operation; converts the horizontal displacement information into third displacement information according to a preset second conversion parameter and converts the third displacement information into fourth displacement information according to a preset first conversion parameter; and controls the second display unit to move and display the operation point corresponding to the operation object according to the fourth displacement information;
the scenario for this operation can be as shown in fig. 9: when it is detected that the finger 92 of the user performs an operation of moving from the initial position 921 to the end position 922, horizontal displacement information of the operation point of the operation object relative to the electronic device is determined according to the position information of the third operation, that is, the moving direction, the lateral displacement and the longitudinal displacement of the finger in the horizontal plane relative to the electronic device are acquired as the horizontal displacement information; the horizontal displacement information is converted into third displacement information according to a preset second conversion parameter, the third displacement information is converted into fourth displacement information according to a preset first conversion parameter, and the second display unit is controlled to move the operation point corresponding to the operation object according to the fourth displacement information, moving the operation point information from the initial position 931 to the end position 932;
if, immediately after the above operation is completed, the user performs a touch operation in the same direction again, that is, when it is detected again that the user's finger performs the operation of moving from the initial position to the end position, the detection unit acquires the displacement information again; after this displacement information is converted into displacement information for the second display unit, the second display unit displays it superimposed on the basis of the last display of the operation point information; and so on.
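This superimposed (relative) movement can be sketched as an accumulator that adds each converted displacement to the last displayed position of the operation point. The display size, scale factors, and clamping behavior below are illustrative assumptions:

```python
# Hypothetical projected display and per-axis conversion scale factors.
DISPLAY_W, DISPLAY_H = 1280, 720
SCALE_X, SCALE_Y = 1.5, 1.5

class ProjectedPointer:
    """Tracks the operation point on the second display unit across
    successive relative movements."""

    def __init__(self, x=0.0, y=0.0):
        self.x, self.y = x, y  # last displayed operation point position

    def apply_displacement(self, dx_touch, dy_touch):
        """Convert a touch-plane displacement and superimpose it on the
        previous operation point position, clamped to the display bounds."""
        self.x = min(max(self.x + dx_touch * SCALE_X, 0.0), DISPLAY_W)
        self.y = min(max(self.y + dy_touch * SCALE_Y, 0.0), DISPLAY_H)
        return self.x, self.y

pointer = ProjectedPointer(100.0, 100.0)
print(pointer.apply_displacement(40.0, 0.0))  # first swipe  -> (160.0, 100.0)
print(pointer.apply_displacement(40.0, 0.0))  # same swipe again -> (220.0, 100.0)
```

The second call starts from the result of the first, which is the "superimposed on the last display" behavior the text describes.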
When the first display unit is in a closed state, the detection unit acquires a fourth operation of an operation point of an operation object and analyzes position information of the fourth operation; acquires position information of the termination point from the position information of the fourth operation; converts the position information of the termination point according to a preset second conversion parameter and the first conversion parameter to obtain the converted position information of the termination point; and controls the second display unit to display the operation point information corresponding to the operation object according to the converted position information of the termination point. The operation scene may be as shown in fig. 10: the first display unit is in an off state; the position of the user's finger in the detection area of the detection unit is first converted by the second conversion parameter into position information in the touch coordinate system of the first display unit, and that position information is then converted according to the first conversion parameter into position information in the display coordinate system of the second display unit; as the finger moves, the operation point information of the image displayed on the second display unit moves from position 1001 to position 1002.
Preferably, after the above steps are completed, as shown in fig. 11, the method further includes:
step 1101: detecting a fifth operation of the operation object;
step 1102: extracting the depth information in the spatial position information of the start point and the depth information in the spatial position information of the end point of the fifth operation;
step 1103: judging whether the difference value between the depth information of the spatial position information of the starting point and the depth information of the spatial position information of the ending point meets a preset threshold value or not, and generating a first operation instruction when the difference value meets the preset threshold value;
step 1104: and executing corresponding processing according to the first operation instruction, acquiring a processing result, and displaying the processing result through the second display unit.
The fifth operation may be a selection operation, that is, a hover click operation.
The above-described operations of fig. 11 may be described using the scenario shown in fig. 12:
when the operation object performs the fifth operation, which may be a hover click operation, the operation object inevitably moves down and up quickly; therefore, the selection operation at the position of the current operation point information can be determined merely by acquiring the depth information of the start point and the depth information of the end point of the fifth operation, namely the hover click operation;
further, before it is determined that the user performs the selection operation, the collection time corresponding to the depth information of the start point and the collection time corresponding to the depth information of the end point may be acquired, and the operation duration of the fifth operation calculated; when the operation duration is smaller than a preset threshold value, the selection operation is determined.
In fig. 12, the user's finger is the operation object; it moves downward from the solid-line position to the dotted-line position, and two pieces of depth information, or two pieces of depth information together with their corresponding collection times, are obtained;
according to the two pieces of depth information, it is determined that the selection operation is performed at the position of the current operation point information 1201;
then, the operation point information 1201 is converted into position information in the touch coordinate system of the first display unit, a response result for the selection operation is determined according to the target object corresponding to that position information, and the response result is displayed through the second display unit.
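The depth-difference check of steps 1102-1103, combined with the duration check described above, can be sketched as follows; both threshold values are illustrative assumptions, not values from the patent:

```python
DEPTH_DELTA_THRESHOLD = 20.0  # hypothetical: minimum depth change (mm) for a hover click
MAX_CLICK_DURATION = 0.3      # hypothetical: maximum duration (s) of a hover click

def is_hover_click(start_depth, end_depth, start_time, end_time):
    """Return True if the fifth operation qualifies as a selection
    (hover click): a fast, sufficiently deep downward-and-up movement."""
    depth_delta = abs(start_depth - end_depth)  # steps 1102-1103
    duration = end_time - start_time            # duration check
    return depth_delta >= DEPTH_DELTA_THRESHOLD and duration < MAX_CLICK_DURATION

# Finger drops 30 mm in 0.15 s -> treated as a selection at the current point:
print(is_hover_click(80.0, 50.0, 0.00, 0.15))  # True
# Slow drift over a full second -> not a click:
print(is_hover_click(80.0, 50.0, 0.00, 1.00))  # False
```

Combining both thresholds is what distinguishes a deliberate hover click from the slow depth drift that naturally occurs while the finger is merely hovering.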
By adopting the above scheme, the position information of the operation object in the display coordinate system of the second display unit is determined by acquiring the position information of the operation object in the touch coordinate system of the first display unit; therefore, the user can interact directly with the first display unit of the electronic device while the interactive operation is displayed through the second display unit, giving the user the experience of interacting directly with the content displayed by the second display unit and improving operation efficiency.
Example Four,
An embodiment of the present invention provides an electronic device, as shown in fig. 13, the electronic device includes:
a detection unit 1301 configured to detect an operation object; acquiring first position information of an operation point of the operation object, wherein the first position information represents position information of the operation point of the operation object in a touch coordinate system corresponding to a first display unit;
a first conversion unit 1302, configured to determine second position information according to the first position information and the first conversion parameter, where the second position information represents position information of an operation point of the operation object in a display coordinate system of a second display unit;
and the processing unit 1303 is configured to control the second display unit to display the operation point information corresponding to the operation object at the second position information.
Here, the electronic device may be a smart phone, a tablet computer, an all-in-one machine, or the like.
The first display unit may be a display screen of the electronic device. For example, as shown in fig. 2, the first display unit 21 is a display screen of an electronic device.
The second display unit may be a projection unit of the electronic device; the second display unit is located at a first position, and the first position can be set according to actual conditions: it may be the middle of the top of the electronic device, or the left or right side of the top of the electronic device. For example, as shown in fig. 2, the second display unit 22 is a projection unit disposed at the middle of the top of the electronic device.
The second display unit being in the working state means that its projection function is started.
The operation object may be a finger or may be a stylus pen used by a user to perform an operation.
Detecting the operation object by the detection unit may refer to detecting that a user's finger or stylus is located in the detection area of the electronic device.
The operation point of the operation object may be a point that meets a preset condition, selected from the operation area of the operation object detected by the detection unit.
The first conversion parameter is a parameter that converts the first position information into position information in the display coordinate system of the second display unit, and may include a conversion matrix.
The operation point information corresponding to the operation object may be displayed as a preset graphic, for example, a circle.
By adopting the above scheme, the position information of the operation object in the display coordinate system of the second display unit is determined by acquiring the position information of the operation object in the touch coordinate system of the first display unit; therefore, the user can interact directly with the first display unit of the electronic device while the interactive operation is displayed through the second display unit, giving the user the experience of interacting directly with the content displayed by the second display unit and improving operation efficiency.
Example Five,
An embodiment of the present invention provides an electronic device, which includes: a detection unit configured to detect an operation object and acquire first position information of an operation point of the operation object, where the first position information represents position information of the operation point of the operation object in a touch coordinate system corresponding to a first display unit;
the first conversion unit is used for determining second position information according to the first position information and the first conversion parameter, and the second position information represents the position information of the operation point of the operation object in a display coordinate system of the second display unit;
and the processing unit is used for controlling the second display unit to display the operation point information corresponding to the operation object at the second position information.
Here, the electronic device may be a smart phone, a tablet computer, an all-in-one machine, or the like.
The first display unit may be a display screen of the electronic device. For example, as shown in fig. 2, the first display unit 21 is a display screen of an electronic device.
The second display unit may be a projection unit of the electronic device; the second display unit is located at a first position, and the first position can be set according to actual conditions: it may be the middle of the top of the electronic device, or the left or right side of the top of the electronic device. For example, as shown in fig. 2, the second display unit 22 is a projection unit disposed at the middle of the top of the electronic device.
The second display unit being in the working state means that its projection function is started.
The operation object may be a finger or may be a stylus pen used by a user to perform an operation.
The detection unit is a touch screen and is specifically configured to acquire the first position information of the operation point of the operation object, including: the detection unit takes the detected touch point of the operation object as the operation point and acquires the position information of that operation point as the first position information.
The detection unit in this embodiment is a touch screen; specifically, the detection unit may physically overlap the first display unit, with the detection unit located closer to the surface of the electronic device than the first display unit. For example, the interface visible in fig. 2 is an image displayed by the first display unit, over which a transparent touch screen is further provided.
The touch screen described in this embodiment may be: resistive touch screens, capacitive touch screens, acoustic wave touch screens, and the like.
For example, when the touch screen is a resistive touch screen, the detection unit is specifically configured so that, when the operation object contacts the resistive touch screen, the two conductive layers come into contact at the touch point, the resistance changes, signals are generated in the horizontal and vertical coordinate directions, and the first position information of the operation point of the operation object on the horizontal and vertical coordinates is output;
when the touch screen is a capacitive touch screen, the detection unit is specifically configured so that, when a finger touches the metal layer of the capacitive touch screen, a coupling capacitance is formed between the operation object and the touch screen surface due to the human-body electric field; since the capacitance is effectively a direct conductor for high-frequency current, the operation object draws a first current from the contact point. This first current flows out through the electrodes at the four corners of the touch screen, and the four sub-currents flowing through the four electrodes are proportional to the distances from the finger to the four corners; the controller accurately calculates the proportions of the four sub-currents to obtain the horizontal and vertical coordinate position of the touch point, which is used as the first position information.
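The four-corner current calculation for a surface-capacitive screen can be sketched roughly as below. The linear current-to-position model, the corner ordering, and the panel size are simplifying assumptions; real controllers apply calibration and nonlinearity correction on top of this idea:

```python
WIDTH, HEIGHT = 1080.0, 1920.0  # hypothetical touch panel resolution

def position_from_corner_currents(i_tl, i_tr, i_bl, i_br):
    """Estimate the touch point from the four corner sub-currents
    (top-left, top-right, bottom-left, bottom-right). In this simplified
    linear model, each current is proportional to how close the finger
    is to that corner."""
    total = i_tl + i_tr + i_bl + i_br
    x = WIDTH * (i_tr + i_br) / total   # more current on the right -> larger x
    y = HEIGHT * (i_bl + i_br) / total  # more current at the bottom -> larger y
    return x, y

# Equal currents at all four corners place the touch at the panel centre:
print(position_from_corner_currents(1.0, 1.0, 1.0, 1.0))  # (540.0, 960.0)
```

Normalizing by the total current makes the estimate insensitive to the absolute coupling strength, which varies with finger size and pressure.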
When the touch screen is an acoustic-wave touch screen, the detection unit is specifically configured with transducers that transmit and receive acoustic waves in the horizontal and vertical coordinate directions, attached at three corners of the acoustic-wave touch screen. The transducers are made of special ceramic materials and are divided into transmitting and receiving transducers: they convert the electrical signal transmitted through the touch screen cable into surface acoustic wave energy, and convert the surface acoustic wave energy converged by the reflection stripes back into an electrical signal. Reflection stripes that reflect surface ultrasonic waves are engraved on the four sides. When the operation object touches the screen, part of the acoustic wave energy is absorbed, so that the received signal changes, and the horizontal and vertical coordinates of the touch are obtained through processing by the controller.
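The surface-acoustic-wave principle can be sketched as locating the dip that the absorbed energy leaves in the received signal envelope, then converting that dip's delay into a coordinate along the axis. The wave speed, sampling interval, and attenuation model below are illustrative assumptions:

```python
SAW_SPEED_M_PER_S = 3000.0  # hypothetical surface-acoustic-wave speed

def touch_coordinate_from_envelope(envelope, sample_interval_s):
    """Find the sample where the received energy dips (the part of the
    wave absorbed by the touch) and convert its delay into a coordinate
    along that axis (round trip via the reflection stripes)."""
    dip_index = min(range(len(envelope)), key=lambda i: envelope[i])
    delay = dip_index * sample_interval_s
    return SAW_SPEED_M_PER_S * delay / 2.0

# Synthetic envelope: uniform energy except an absorption dip at sample 40.
envelope = [1.0] * 100
envelope[40] = 0.2
print(touch_coordinate_from_envelope(envelope, 1e-6))  # ~0.06 m along the axis
```

One such measurement per axis (horizontal and vertical transducer pairs) yields the full touch coordinate, mirroring the two-direction transducer layout described above.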
The first conversion parameter is a parameter that converts the first position information into position information in the display coordinate system of the second display unit, and may include a conversion matrix.
The operation point information corresponding to the operation object may be displayed as a preset graphic, for example, a circle.
Preferably, for the scenario provided by this embodiment, when the user performs an operation, the second display unit displays corresponding operation information, including:
the detection unit is further configured to, when the first display unit is in a closed state, acquire a first operation of an operation point of an operation object and analyze position information of the first operation; determine first displacement information of the operation point of the operation object according to the position information of the first operation; and convert the first displacement information into second displacement information according to a preset first conversion parameter; correspondingly, the processing unit is further configured to control the second display unit to move and display the operation point corresponding to the operation object according to the second displacement information;
the scenario for this operation can be as shown in fig. 4: as can be seen from fig. 4, the first display unit 41 is in a closed state and displays a black screen, and at this time the first display unit can be used as a touch operation area; when it is detected that the finger 42 of the user performs a first operation of moving from the initial position 421 to the end position 422, the detection unit acquires first position information of the operation of the finger 42 and determines first displacement information of the finger 42 from the start position and end position in the first position information, that is, acquires the moving direction, the lateral displacement and the longitudinal displacement of the finger as the first displacement information;
then the first displacement information is converted into second displacement information by the first conversion parameter, and the operation point information in the image displayed by the second display unit, that is, the circular graphic, is controlled to move from the first position 431 to the second position 432;
if, immediately after the above operation is completed, the user performs a touch operation in the same direction again, that is, when it is detected again that the finger 42 of the user performs the first operation of moving from the initial position 421 to the end position 422, the detection unit acquires the first displacement information again; after the first displacement information is converted into the second displacement information of the second display unit, the second display unit displays it superimposed on the basis of the last display of the operation point information, that is, the displayed graphic of the operation point information in fig. 4 moves from the second position 432 to the third position 433; and so on.
The detection unit is further configured to, when the first display unit is in an open state, acquire a second operation of an operation point of an operation object and analyze position information of the second operation; acquire position information of the termination point from the position information of the second operation; and convert the position information of the termination point according to a preset first conversion parameter to obtain the converted position information of the termination point; correspondingly, the processing unit is further configured to control the second display unit to display the operation point information corresponding to the operation object according to the converted position information of the termination point;
the scenario for this operation can be as shown in fig. 5: the first display unit 51 is in an open state, that is, it displays an operation interface, and at this time an absolute (one-to-one) projection mapping can be established between the first display unit and the second display unit; in the user's initial state, the finger is located at the initial position 521, and the position of the operation point information on the second display unit, obtained through the first conversion parameter, is the position 531; when it is detected that the finger 52 of the user performs the second operation of moving from the initial position 521 to the end position 522, the position information of the end point in the position information of the second operation is acquired; the position information of the termination point is converted according to the preset first conversion parameter to obtain the converted position information of the termination point, and the second display unit is controlled to display the operation point information 532 corresponding to the operation object on the display interface 53 according to the converted position information of the termination point. In this operation scenario, the user operates directly on the first display unit, and the displayed image is projected synchronously onto the second display unit.
Preferably, when the user performs the selection operation, the present embodiment may further include:
when the first display unit is in a closed state, the detection unit acquires the position information of the operation point information in the second display unit and converts it through the first conversion parameter into position information in the touch coordinate system corresponding to the first display unit; target information for the selection operation is determined according to the converted position information, the selection operation is responded to according to the target information to obtain a response result, and the final response result is displayed through the second display unit;
when the first display unit is in an open state, the detection unit directly acquires the position information of the selection operation of the operation object in the touch coordinate system of the first display unit, determines the target information for the selection operation according to the position information, responds to the selection operation according to the target information to obtain a response result, and controls the first display unit and the second display unit to display the response result.
By adopting the above scheme, the position information of the operation object in the display coordinate system of the second display unit is determined by acquiring the position information of the operation object in the touch coordinate system of the first display unit; therefore, the user can interact directly with the first display unit of the electronic device while the interactive operation is displayed through the second display unit, giving the user the experience of interacting directly with the content displayed by the second display unit and improving operation efficiency.
Example Six,
An embodiment of the present invention provides an electronic device, as shown in fig. 14, the electronic device includes:
a detection unit 1401 for detecting an operation object; acquiring first position information of an operation point of the operation object, wherein the first position information represents position information of the operation point of the operation object in a touch coordinate system corresponding to a first display unit;
a first conversion unit 1402, configured to determine second position information according to the first position information and the first conversion parameter, where the second position information represents position information of an operation point of the operation object in a display coordinate system of a second display unit;
a processing unit 1403, configured to control the second display unit to display the operation point information corresponding to the operation object at the second position information.
Here, the electronic device may be a smart phone, a tablet computer, an all-in-one machine, or the like.
The first display unit may be a display screen of the electronic device. For example, as shown in fig. 2, the first display unit 21 is a display screen of an electronic device.
The second display unit may be a projection unit of the electronic device; the second display unit is located at a first position, and the first position can be set according to actual conditions: it may be the middle of the top of the electronic device, or the left or right side of the top of the electronic device. For example, as shown in fig. 2, the second display unit 22 is a projection unit disposed at the middle of the top of the electronic device.
The second display unit being in the working state means that its projection function is started.
The operation object may be a finger, or a stylus used by the user to perform an operation.
The electronic device further includes: a second conversion unit 1404, configured to convert the spatial location information into first location information according to a preset second conversion parameter; accordingly, the detecting unit 1401 is specifically configured to detect and obtain spatial position information of the operation point of the operation object, where the spatial position information includes depth information, horizontal position information, and vertical position information of the operation point of the operation object in the first coordinate system.
The detection unit in this embodiment may be a proximity sensor or an emission-type capacitive sensor; specifically, the detection unit may be disposed at a preset position, for example, in the middle of the upper end of the electronic device.
When the detection unit is a proximity sensor, the proximity sensor emits a preset detection signal, such as infrared, to detect an operation object within a preset range; the position of the operation object, which may include depth information, horizontal position information and vertical position information, is determined by measuring the return time of the emitted detection signal. As shown in fig. 7, assuming the proximity sensor is installed in the middle of the upper end of the electronic device, it transmits the detection signal and records the time at which the signal reflected by the finger is received, thereby determining the depth information, the horizontal position information and the vertical position information.
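As a rough illustration of the time-of-flight reasoning above (this sketch is not part of the patent; the propagation speed of the infrared signal is assumed to be the speed of light):

```python
def round_trip_distance(round_trip_time_s, propagation_speed_m_s=3.0e8):
    """Depth of the operation object inferred from the round-trip time of a
    reflected detection signal: the signal travels out and back, so the
    one-way distance is half the total path length."""
    return propagation_speed_m_s * round_trip_time_s / 2.0

# A 2-nanosecond round trip of an infrared signal corresponds to ~0.3 m depth.
depth_m = round_trip_distance(2.0e-9)
```

The horizontal and vertical components of the position would be recovered analogously from the direction of the returned signal; only the depth computation is shown here.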
When the detection unit is an emission-type capacitive sensor, the detection unit is specifically configured to sense spatial position information of an operation object within a preset spatial range. For example, as shown in fig. 8, the coverage of the capacitive sensor seen from the side may be a range 81; when a finger of the user is sensed within this range, the spatial position information of the finger is acquired.
The first conversion parameter is a parameter that converts the first position information into position information in the display coordinate system of the second display unit, and may include a conversion matrix.
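As a hedged sketch of how such a conversion matrix could map a point from the touch coordinate system of the first display unit into the display coordinate system of the second display unit, the example below uses a 3x3 homogeneous affine matrix; the screen and projection resolutions are invented for illustration and are not taken from the patent:

```python
def make_conversion_matrix(sx, sy, tx, ty):
    """Build a 3x3 homogeneous affine matrix: scale by (sx, sy), translate by (tx, ty)."""
    return [[sx, 0.0, tx],
            [0.0, sy, ty],
            [0.0, 0.0, 1.0]]

def convert_point(matrix, x, y):
    """Apply the conversion matrix to a 2D point expressed in homogeneous coordinates."""
    h = (x, y, 1.0)
    out = [sum(matrix[r][c] * h[c] for c in range(3)) for r in range(3)]
    return out[0] / out[2], out[1] / out[2]

# Example: a 1080x1920 touch coordinate system mapped onto a 1280x720 projection.
first_conversion = make_conversion_matrix(1280 / 1080, 720 / 1920, 0.0, 0.0)
x2, y2 = convert_point(first_conversion, 540, 960)  # centre of the touch screen
# (x2, y2) lands approximately at the centre of the projected image, (640.0, 360.0)
```

A second conversion parameter of the same form could map spatial sensor coordinates into the touch coordinate system, so the two stages compose by matrix multiplication.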
Preferably, for the scenario provided by this embodiment, when the user performs an operation, the second display unit displays corresponding operation information, including:
the detection unit is further configured to acquire a third operation of an operation point of an operation object when the first display unit is in an on state, and analyze position information of the third operation; determining horizontal displacement information of the operation point of the operation object relative to the electronic equipment according to the position information of the third operation; converting the horizontal displacement information into third displacement information according to a preset second conversion parameter, and converting the third displacement information into fourth displacement information according to a preset first conversion parameter; correspondingly, the processing unit is further configured to control the second display unit to move and display the operation point corresponding to the operation object according to the fourth displacement information;
the scenario for this operation may be as shown in fig. 9: when it is detected that the finger 92 of the user moves from the initial position 921 to the end position 922, the horizontal displacement information of the operation point of the operation object relative to the electronic device is determined according to the position information of the third operation; that is, the moving direction, the lateral displacement and the longitudinal displacement of the finger in the horizontal direction relative to the electronic device are acquired as the horizontal displacement information. The horizontal displacement information is converted into third displacement information according to the preset second conversion parameter, the third displacement information is converted into fourth displacement information according to the preset first conversion parameter, and the second display unit is controlled to move the operation point corresponding to the operation object according to the fourth displacement information, moving the operation point information from the initial position 931 to the end position 932;
if, after the above operation is completed, the user immediately performs a touch operation in the same direction as before, that is, when it is detected again that the finger of the user moves from an initial position to an end position, the detection unit acquires the displacement information again; after this displacement information is converted into displacement information of the second display unit, the second display unit displays the movement superimposed on the operation point information displayed last time; and so on.
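The two-stage displacement conversion and the superimposed ("and so on") display described above can be sketched as follows; this is a minimal illustration in which both conversion parameters are plain scale factors chosen for the example, not values defined by the patent:

```python
class ProjectedPointer:
    """Keeps the displayed operation point position and superimposes each new
    converted displacement on the position shown last time."""

    def __init__(self, x, y, second_param=1.0, first_param=0.5):
        self.x, self.y = x, y
        self.second_param = second_param  # spatial -> touch coordinate scale
        self.first_param = first_param    # touch -> display coordinate scale

    def apply_horizontal_displacement(self, dx, dy):
        # horizontal displacement -> third displacement (second conversion parameter)
        third_dx, third_dy = dx * self.second_param, dy * self.second_param
        # third displacement -> fourth displacement (first conversion parameter)
        fourth_dx, fourth_dy = third_dx * self.first_param, third_dy * self.first_param
        # move the displayed operation point by the fourth displacement
        self.x += fourth_dx
        self.y += fourth_dy
        return self.x, self.y

pointer = ProjectedPointer(100.0, 100.0)
first_pos = pointer.apply_horizontal_displacement(40.0, 0.0)   # -> (120.0, 100.0)
second_pos = pointer.apply_horizontal_displacement(40.0, 0.0)  # superimposed -> (140.0, 100.0)
```

Repeating the same swipe moves the projected operation point further in the same direction, matching the superimposed display behaviour described above.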
The detection unit is further configured to acquire a fourth operation of an operation point of an operation object when the first display unit is in a closed state, and analyze position information of the fourth operation; acquire position information of the termination point in the position information of the fourth operation; and convert the position information of the termination point according to the preset second conversion parameter and the first conversion parameter to obtain the converted position information of the termination point. Correspondingly, the processing unit is further configured to control the second display unit to display the operation point information corresponding to the operation object according to the converted position information of the termination point. The operation scenario may be as shown in fig. 10: the first display unit is in the off state; the position of the user's finger in the detection area of the detection unit is first converted into position information in the touch coordinate system of the first display unit by the second conversion parameter, and that position information is then converted into position information in the display coordinate system of the second display unit according to the first conversion parameter; the operation point information of the image displayed on the second display unit moves from position 1001 to position 1002 as the finger moves.
Preferably, the detection unit is further configured to detect a fifth operation of the operation object, and extract the depth information in the spatial position information of the start point and the depth information in the spatial position information of the end point of the fifth operation;
the processing unit is further configured to determine whether a difference between the depth information of the spatial position information of the starting point and the depth information of the spatial position information of the ending point satisfies a preset threshold, and generate a first operation instruction when the difference satisfies the preset threshold; and executing corresponding processing according to the first operation instruction, acquiring a processing result, and displaying the processing result through the second display unit.
The fifth operation may be a selection operation, that is, a hovering click operation.
The description is made using the scenario shown in fig. 12:
when the operation object performs the fifth operation, which may be a hovering click operation, the operation object inevitably moves up and down quickly; therefore, the selection operation at the position of the current operation point information can be determined simply by acquiring the depth information of the start point and the depth information of the end point of the fifth operation, i.e. the hovering click operation;
further, before it is determined that the user has performed the selection operation, the acquisition time corresponding to the depth information of the start point and the acquisition time corresponding to the depth information of the end point are acquired, and the operation duration of the fifth operation is calculated; when the operation duration is less than a preset threshold, the selection operation is determined.
In fig. 12, the finger of the user is the operation object; it moves downward from the position of the solid line to the position of the dotted line, yielding two pieces of depth information, or two pieces of depth information together with the corresponding acquisition times. According to the two pieces of depth information, the selection operation performed at the position of the current operation point information 1201 is determined. The operation point information 1201 is then converted into position information in the touch coordinate system of the first display unit, a response result for the selection operation is determined according to the target object corresponding to that position information, and the response result is displayed through the second display unit.
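A hedged sketch of the hovering-click decision just described: a selection is recognised when the depth difference between the start point and the end point of the fifth operation meets a threshold and, per the further refinement above, the operation duration stays below a time limit. The threshold values here are invented for illustration:

```python
def is_hover_click(start, end, depth_threshold_cm=2.0, max_duration_s=0.3):
    """start and end are (depth_cm, acquisition_time_s) samples of the fifth
    operation; the quick up-and-down motion of a hovering click appears as a
    large depth change within a short duration."""
    depth_change = abs(end[0] - start[0])
    duration = end[1] - start[1]
    return depth_change >= depth_threshold_cm and duration < max_duration_s

# Finger drops 3 cm in 0.15 s: recognised as a selection (hovering click).
clicked = is_hover_click((10.0, 0.00), (7.0, 0.15))
# A slow 3 cm drift over a full second is not treated as a click.
drifted = is_hover_click((10.0, 0.0), (7.0, 1.0))
```

When the click is recognised, the current operation point position would be converted into the touch coordinate system to find the target object, as the passage above describes.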
By adopting this scheme, the position information of the operation object in the display coordinate system of the second display unit is determined from the position information of the operation object in the touch coordinate system of the first display unit. The user can therefore interact directly with the first display unit of the electronic device while the interaction is displayed through the second display unit; this gives the user the experience of interacting directly with the content displayed by the second display unit and improves operation efficiency.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. The device embodiments described above are merely illustrative; for example, the division into units is only a logical functional division, and other divisions are possible in actual implementation, such as: multiple units or components may be combined, or integrated into another system, or some features may be omitted or not implemented. In addition, the coupling, direct coupling or communication connection between the components shown or discussed may be implemented through some interfaces, and the indirect coupling or communication connection between devices or units may be electrical, mechanical or in other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, that is, may be located in one place, or may be distributed on a plurality of network units; some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, all the functional units in the embodiments of the present invention may be integrated into one processing module, or each unit may be separately used as one unit, or two or more units may be integrated into one unit; the integrated unit can be realized in a form of hardware, or in a form of hardware plus a software functional unit.
Those of ordinary skill in the art will understand that all or part of the steps for implementing the method embodiments may be completed by program instructions executed on related hardware; the program may be stored in a computer-readable storage medium and, when executed, performs the steps of the method embodiments. The aforementioned storage medium includes various media capable of storing program code, such as a removable storage device, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
The above description is only for the specific embodiments of the present invention, but the scope of the present invention is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present invention, and all the changes or substitutions should be covered within the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the appended claims.

Claims (12)

1. An information processing method, applied to an electronic device provided with a first display unit, a second display unit and a detection unit, wherein, when the second display unit is in a working state, the method comprises:
the detection unit detects an operation object;
the detection unit acquires first position information of an operation point of the operation object, wherein the first position information represents position information of the operation point of the operation object in a touch coordinate system corresponding to the first display unit;
determining second position information according to the first position information and the first conversion parameter, wherein the second position information represents the position information of the operation point of the operation object in a display coordinate system of a second display unit;
controlling the second display unit to display operation point information corresponding to the operation object at the second position information; the second display unit is a projection unit of the electronic device.
2. The method of claim 1, wherein the detection unit is a touch screen;
the detection unit acquires first position information of an operation point of the operation object, including: the detection unit takes the touch point of the detected operation object as an operation point and acquires position information of the touch point as first position information.
3. The method of claim 2, further comprising:
when the first display unit is in a closed state, the detection unit acquires a first operation of an operation point of an operation object and analyzes position information of the first operation; determining first displacement information of an operation point of the operation object according to the position information of the first operation; converting the first displacement information into second displacement information according to a preset first conversion parameter, and controlling the second display unit to move and display an operation point corresponding to the operation object according to the second displacement information;
when the first display unit is in an open state, the detection unit acquires a second operation of an operation point of an operation object and analyzes position information of the second operation; acquiring position information of a termination point in the position information of the second operation; and converting the position information of the termination point according to a preset first conversion parameter to obtain the converted position information of the termination point, and controlling the second display unit to display the operation point information corresponding to the operation object according to the converted position information of the termination point.
4. The method according to claim 1, wherein the detecting unit acquires first position information of an operation point of the operation object, including:
detecting to obtain spatial position information of an operation point of the operation object, wherein the spatial position information comprises depth information, horizontal position information and vertical position information of the operation point of the operation object in a first coordinate system;
and converting the spatial position information into first position information according to a preset second conversion parameter.
5. The method of claim 4, further comprising:
when the first display unit is in an open state, the detection unit acquires a third operation of an operation point of an operation object and analyzes position information of the third operation; determining horizontal displacement information of the operation point of the operation object relative to the electronic equipment according to the position information of the third operation; converting the horizontal displacement information into third displacement information according to a preset second conversion parameter, converting the third displacement information into fourth displacement information according to a preset first conversion parameter, and controlling the second display unit to move and display an operation point corresponding to the operation object according to the fourth displacement information;
when the first display unit is in a closed state, the detection unit acquires a fourth operation of an operation point of an operation object and analyzes position information of the fourth operation; acquiring position information of a termination point in the position information of the fourth operation; and converting the position information of the termination point according to a preset second conversion parameter and a first conversion parameter to obtain the converted position information of the termination point, and controlling the second display unit to display the operation point information corresponding to the operation object according to the converted position information of the termination point.
6. The method according to claim 4 or 5, characterized in that the method further comprises:
detecting a fifth operation of the operation object,
extracting depth information in the spatial position information of the start point and depth information of the spatial position information of the end point of the fifth operation;
judging whether the difference value between the depth information of the spatial position information of the starting point and the depth information of the spatial position information of the ending point meets a preset threshold value or not, and generating a first operation instruction when the difference value meets the preset threshold value;
and executing corresponding processing according to the first operation instruction, acquiring a processing result, and displaying the processing result through the second display unit.
7. An electronic device, the electronic device comprising:
a detection unit configured to detect an operation object; acquiring first position information of an operation point of the operation object, wherein the first position information represents position information of the operation point of the operation object in a touch coordinate system corresponding to a first display unit;
the first conversion unit is used for determining second position information according to the first position information and the first conversion parameter, and the second position information represents the position information of the operation point of the operation object in a display coordinate system of the second display unit;
the processing unit is used for controlling the second display unit to display the operation point information corresponding to the operation object at the second position information; the second display unit is a projection unit of the electronic device.
8. The electronic device of claim 7,
the detection unit is a touch screen, and is specifically configured to acquire the first position information of the operation point of the operation object by: taking the detected touch point of the operation object as the operation point and acquiring the position information of the touch point as the first position information.
9. The electronic device of claim 8,
the detection unit is further configured to acquire a first operation of an operation point of an operation object when the first display unit is in a closed state, and analyze position information of the first operation; determining first displacement information of an operation point of the operation object according to the position information of the first operation; converting the first displacement information into second displacement information according to a preset first conversion parameter;
correspondingly, the processing unit is further configured to control the second display unit to move and display the operation point corresponding to the operation object according to the second displacement information;
or,
the detection unit is further used for acquiring a second operation of an operation point of an operation object when the first display unit is in an on state, and analyzing position information of the second operation; acquiring position information of a termination point in the position information of the second operation; converting the position information of the termination point according to a preset first conversion parameter to obtain the converted position information of the termination point;
correspondingly, the processing unit is further configured to control the second display unit to display the operation point information corresponding to the operation object according to the converted position information of the termination point.
10. The electronic device of claim 7, further comprising:
the second conversion unit is used for converting the spatial position information into first position information according to a preset second conversion parameter;
correspondingly, the detection unit is specifically configured to obtain spatial position information of the operation point of the operation object by detection, where the spatial position information includes depth information, horizontal position information, and vertical position information of the operation point of the operation object in the first coordinate system.
11. The electronic device of claim 10,
the detection unit is further configured to acquire a third operation of an operation point of an operation object when the first display unit is in an on state, and analyze position information of the third operation; determining horizontal displacement information of the operation point of the operation object relative to the electronic equipment according to the position information of the third operation; converting the horizontal displacement information into third displacement information according to a preset second conversion parameter, and converting the third displacement information into fourth displacement information according to a preset first conversion parameter;
correspondingly, the processing unit is further configured to control the second display unit to move and display the operation point corresponding to the operation object according to the fourth displacement information;
or,
the detection unit is further configured to acquire a fourth operation of an operation point of an operation object when the first display unit is in a closed state, and analyze position information of the fourth operation; acquiring position information of a termination point in the position information of the fourth operation; converting the position information of the termination point according to a preset second conversion parameter and a first conversion parameter to obtain the converted position information of the termination point;
correspondingly, the processing unit is further configured to control the second display unit to display the operation point information corresponding to the operation object according to the converted position information of the termination point.
12. The electronic device of claim 11,
the detection unit is further configured to detect a fifth operation of the operation object, and extract depth information in spatial position information of a start point and depth information of spatial position information of an end point of the fifth operation;
the processing unit is further configured to determine whether a difference between the depth information of the spatial position information of the starting point and the depth information of the spatial position information of the ending point satisfies a preset threshold, and generate a first operation instruction when the difference satisfies the preset threshold; and executing corresponding processing according to the first operation instruction, acquiring a processing result, and displaying the processing result through the second display unit.
CN201410594202.3A 2014-10-29 2014-10-29 A kind of information processing method and electronic equipment Active CN105630364B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410594202.3A CN105630364B (en) 2014-10-29 2014-10-29 A kind of information processing method and electronic equipment

Publications (2)

Publication Number Publication Date
CN105630364A CN105630364A (en) 2016-06-01
CN105630364B true CN105630364B (en) 2019-11-26

Family

ID=56045367

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410594202.3A Active CN105630364B (en) 2014-10-29 2014-10-29 A kind of information processing method and electronic equipment

Country Status (1)

Country Link
CN (1) CN105630364B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106843669A (en) * 2016-12-06 2017-06-13 北京小度信息科技有限公司 Application interface operating method and device
CN112328164B (en) * 2020-11-11 2022-08-02 维沃移动通信有限公司 Control method and electronic equipment

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102722335A (en) * 2012-06-08 2012-10-10 深圳Tcl新技术有限公司 Touch equipment-based remote slide input method and device

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101526995B1 (en) * 2008-10-15 2015-06-11 엘지전자 주식회사 Mobile terminal and method for controlling display thereof
CN102164204A (en) * 2011-02-15 2011-08-24 深圳桑菲消费通信有限公司 Mobile phone with interactive function and interaction method thereof
JP5853394B2 (en) * 2011-04-07 2016-02-09 セイコーエプソン株式会社 Cursor display system, cursor display method, and projector
CN102508604A (en) * 2011-11-08 2012-06-20 中兴通讯股份有限公司 Control method of terminal display interface, and terminal
JP5906984B2 (en) * 2012-07-30 2016-04-20 カシオ計算機株式会社 Display terminal device and program

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102722335A (en) * 2012-06-08 2012-10-10 深圳Tcl新技术有限公司 Touch equipment-based remote slide input method and device

Also Published As

Publication number Publication date
CN105630364A (en) 2016-06-01

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant