CN104182128A - Information processing method and device and electronic equipment
- Publication number: CN104182128A
- Application number: CN201410455276.9A
- Authority: CN (China)
- Legal status: Granted (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Landscapes
- Controls And Circuits For Display Device (AREA)
- Position Input By Displaying (AREA)
Abstract
The embodiment of the invention discloses an information processing method applicable to an electronic device. The electronic device is provided with at least a first panel, and the first panel is provided with a display module and a projection module located on the same side of the first panel. According to the information processing method, when information is input to the electronic device via a target object, the projection module projects the input information, or information related to the input information, into the area near the target object. The user can then see the related information without diverting attention to the electronic device, which improves writing convenience. The embodiment of the invention further provides an information processing apparatus and an electronic device.
Description
Technical Field
The present application relates to the field of electronic information technologies, and in particular, to an information processing method and apparatus, and an electronic device.
Background
With the rapid development of science and technology, more and more handwriting input systems apply ultrasonic technology. At present, through a handwriting input system based on ultrasonic technology, a user can input handwritten information into an electronic device by writing with an ultrasonic pen on a plane outside the electronic device.
However, when a user writes with an ultrasonic pen, writing-related information (such as spelling prompts, text corrections, and the like) appears on the electronic device side, so the user must look away from the writing plane toward the electronic device to see it. Writing is therefore inconvenient for the user.
Summary of the Application
The application aims to provide an information processing method, an information processing apparatus, and an electronic device, so as to improve the user's writing convenience.
In order to achieve this purpose, the present application provides the following technical solutions:
an information processing method is applied to electronic equipment, wherein the electronic equipment at least comprises a first panel, and a display module and a projection module are arranged on the first panel; the display module and the projection module are positioned on the same side of the first panel; the method comprises the following steps:
detecting whether a target object exists on a plane on which the electronic equipment is placed, wherein the target object is used for inputting information to the electronic equipment;
when the target object is detected, determining the distance between the projection module and a straight line where the target object is located, wherein the straight line where the target object is located is a straight line parallel to the lower edge of the first panel;
determining a target focal length according to the distance between the projection module and the straight line where the target object is located, the size of a liquid crystal sheet of the projection module and the size of a projection picture;
determining a first included angle between a perpendicular line segment from the projection module to a straight line where the target object is located and the first panel;
adjusting the angle between the lens of the projection module and the first panel to the first included angle, and adjusting the focal length of the projection module to the target focal length;
and projecting the preset information displayed by the display module through the projection module.
In the above method, preferably, the determining a distance between the projection module and a straight line where the target object is located includes:
determining a distance between the target object and the electronic device;
determining a second included angle between the first panel and the horizontal plane;
and determining the distance between the projection module and the straight line where the target object is located according to the distance between the upper edge and the lower edge of the first panel, the distance between the target object and the electronic equipment, and the second included angle.
In the above method, preferably, the target object is an ultrasonic pen; the determining the distance between the target object and the electronic device comprises:
sensing ultrasonic waves emitted by the target object through two sound pickups which are arranged on the first panel and are separated from each other by a first preset distance;
determining the distance between the target object and the two sound pickups respectively through the time length required by the two sound pickups to sense the ultrasonic waves emitted by the target object;
and determining the distance between the target object and the electronic equipment according to the distances between the target object and the two sound pickups and the distance between the two sound pickups.
Preferably, the determining the distance between the target object and the electronic device includes:
acquiring images of the target object through two image acquisition devices which are arranged on the first panel and are separated by a second preset distance;
acquiring a depth image of the target object through the two acquired target object images;
determining a distance between the target object and the electronic device from the acquired depth image.
In the above method, preferably, the determining a first included angle between a perpendicular line segment from the projection module to the straight line of the target object and the first panel includes:
and determining a first included angle between a perpendicular line segment from the projection module to the straight line where the target object is located and the first panel according to the distance between the projection module and the straight line where the target object is located, the distance between the upper edge and the lower edge of the first panel, and the distance between the target object and the electronic equipment.
In the above method, preferably, the preset information is: first information input by the target object to the electronic device, and/or second information associated with the first information.
An information processing device is applied to electronic equipment, wherein the electronic equipment at least comprises a first panel, and a display module and a projection module are arranged on the first panel; the display module and the projection module are positioned on the same side of the first panel; the device comprises:
the detection module is used for detecting whether a target object exists on a plane on which the electronic equipment is placed, and the target object is used for inputting information to the electronic equipment;
the first determining module is used for determining the distance between the projection module and a straight line where the target object is located when the detection module detects the target object, wherein the straight line where the target object is located is a straight line parallel to the lower edge of the first panel;
the second determining module is used for determining a target focal length according to the distance between the projection module and the straight line where the target object is located, the size of a liquid crystal sheet of the projection module and the size of a projection picture;
the third determining module is used for determining a first included angle between a perpendicular line segment from the projection module to a straight line where the target object is located and the first panel;
the adjusting module is used for adjusting the angle between the lens of the projection module and the first panel to the first included angle and adjusting the focal length of the projection module to the target focal length;
and the projection control module is used for projecting the preset information displayed by the display module through the projection module.
The above apparatus, preferably, the first determining module includes:
a first determination unit configured to determine a distance between the target object and the electronic device;
the second determining unit is used for determining a second included angle between the first panel and the horizontal plane;
and the third determining unit is used for determining the distance between the projection module and the straight line where the target object is located according to the distance between the upper edge and the lower edge of the first panel, the distance between the target object and the electronic equipment, and the second included angle.
In the above apparatus, preferably, the target object is an ultrasonic pen, and the first determining unit includes:
the two sound pickups are arranged on the first panel and are separated by a first preset distance;
the first determining subunit is used for determining the distances between the target object and the two sound pickups respectively, based on the time required by each of the two sound pickups to sense the ultrasonic waves emitted by the target object;
and the second determining subunit is used for determining the distance between the target object and the electronic equipment according to the distances between the target object and the two sound pickups and the distance between the two sound pickups.
The above apparatus, preferably, the first determining unit includes:
the two image acquisition devices, which are arranged on the first panel and are separated by a second preset distance;
the third determining subunit is used for acquiring the depth image of the target object through two target object images acquired by the two image acquisition devices simultaneously;
a fourth determining subunit, configured to determine, from the acquired depth image, the distance between the target object and the electronic device.
In the above apparatus, preferably, the third determining module is specifically configured to determine, according to a distance between the projection module and a straight line where the target object is located, a distance between an upper edge and a lower edge of the first panel, and a distance between the target object and the electronic device, a first included angle between a perpendicular line segment from the projection module to the straight line where the target object is located and the first panel.
In the above apparatus, preferably, the projection control module is specifically configured to project, through the projection module, first information that is input by the target object to the electronic device and displayed by the display module, and/or second information associated with the first information.
An electronic device is provided with at least a first panel, and a display module and a projection module are arranged on the first panel; the display module and the projection module are positioned on the same side of the first panel; the electronic device further comprises an information processing apparatus as described in any of the above.
According to the scheme, the information processing method and the information processing device are applied to the electronic equipment, the electronic equipment is at least provided with the first panel, and the first panel is provided with the display module and the projection module; the display module and the projection module are positioned on the same side of the first panel; the information processing method comprises the following steps: detecting whether a target object exists on a plane on which the electronic equipment is placed; when the target object is detected, determining the distance between the projection module and a straight line where the target object is located, wherein the straight line where the target object is located is a straight line parallel to the lower edge of the first panel; determining a target focal length according to the distance between the projection module and the straight line where the target object is located, the size of a liquid crystal sheet of the projection module and the size of a projection picture; determining a first included angle between a perpendicular line segment from the projection module to a straight line where the target object is located and the first panel; adjusting the angle between the lens of the projection module and the first panel to the first included angle, and adjusting the focal length of the projection module to the target focal length; the preset information displayed by the display module is projected by the projection module, so that the information displayed by the display module can be projected onto a plane on which the electronic equipment is placed by the projection module, and a projection picture is near a target object.
Therefore, with the information processing method and the information processing apparatus provided by the embodiment of the application, the target object can be an ultrasonic pen: when a user writes on a writing plane with the ultrasonic pen, information displayed on the electronic device side can be projected onto the writing plane, so that the user can see the related information without shifting his or her gaze to the electronic device, which improves the user's writing convenience.
Drawings
In order to illustrate the embodiments of the present application or the technical solutions in the prior art more clearly, the drawings used in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present application, and those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a schematic diagram of several structures of an electronic device provided in an embodiment of the present application;
FIG. 2 is a flowchart of an implementation of an information processing method according to an embodiment of the present disclosure;
FIG. 3 is a flowchart of an implementation of determining a distance between a projection module and a straight line of a target according to an embodiment of the present disclosure;
FIG. 4 is a schematic diagram illustrating a calculation of a distance between a projection module and a straight line of a target according to an embodiment of the present disclosure;
FIG. 5 is a flow chart of an implementation of determining a distance between a target object and an electronic device according to an embodiment of the present disclosure;
fig. 6 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
FIG. 7 is a flowchart of another implementation of determining a distance between a target object and an electronic device according to an embodiment of the present disclosure;
fig. 8 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
fig. 9 is a schematic diagram of calculating a distance between a target object and the electronic device according to an embodiment of the present application;
fig. 10 is a schematic structural diagram of an information processing apparatus according to an embodiment of the present application;
fig. 11 is a schematic structural diagram of a first determining module according to an embodiment of the present disclosure;
fig. 12 is a schematic structural diagram of a first determining unit provided in an embodiment of the present application;
fig. 13 is another schematic structural diagram of the first determining unit provided in the embodiment of the present application.
The terms "first," "second," "third," "fourth," and the like in the description and in the claims, as well as in the drawings described above, if any, are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It should be understood that the data so used may be interchanged under appropriate circumstances such that embodiments of the application described herein may be practiced otherwise than as specifically illustrated.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The information processing method and the information processing apparatus are applied to an electronic device. In the embodiment of the application, the electronic device is provided with at least a first panel, and a display module and a projection module are arranged on the first panel; the display module and the projection module are located on the same side of the first panel, so that when the display module faces a user, the projection of the projection module falls between the electronic device and the user. For the positional relationship between the display module and the projection module, reference may be made to fig. 1, which is a schematic view of several structures of an electronic device provided in an embodiment of the present application. When the electronic device is placed on a plane with the first panel 11 tilted, the projection module 13 may be located above the display module 12, beside the display module 12, or below the display module 12.
Of course, the structure of the electronic device in the embodiment of the present application is not limited to the several structures shown in fig. 1, and other structures may also be used as long as the electronic device at least has a first panel, and a display module and a projection module are arranged on the first panel; the lenses of the display module and the projection module are located on the same side of the first panel, and detailed description is omitted here.
The electronic device may be a notebook computer, a tablet computer, or other electronic devices, such as a mobile communication terminal.
Referring to fig. 2, fig. 2 is a flowchart of an implementation of an information processing method according to an embodiment of the present application, where the implementation of the method includes:
step S21: detecting whether a target object exists on a plane on which the electronic equipment is placed, wherein the target object is used for inputting information to the electronic equipment;
the target object can be a keyboard, a physical keyboard (such as a glass keyboard), a virtual keyboard (such as an infrared projection keyboard) and the like. The object may also be an ultrasonic generating device, such as an ultrasonic pen.
Whether a target object exists may be judged through image detection. If the target object is an ultrasonic emitting device, whether a target object exists may also be judged by detecting, through a sound pickup, whether ultrasonic waves are sensed.
Step S22: when the target object is detected, determining the distance between the projection module and a straight line where the target object is located, wherein the straight line where the target object is located is a straight line parallel to the lower edge of the first panel;
the distance between the projection module and the straight line where the target object is located is the length of the perpendicular line segment from the projection module to the straight line where the target object is located.
Step S23: determining a target focal length according to the distance between the projection module and the straight line where the target object is located, the size of the liquid crystal sheet of the projection module, and the size of the projection picture;
The size of the liquid crystal sheet may be the area of the liquid crystal sheet; the size of the projection picture may be the area of the projection picture.
The target focal length may be determined by:
determining the ratio of the size of a liquid crystal sheet of the projection module to the size of a projection picture;
and the product of the distance between the projection module and the straight line where the target object is located and the ratio is the target focal length.
The target focal length can be expressed by the formula:
f = D × (c1 / c2)
wherein f is the target focal length; c1 is the size of the liquid crystal sheet of the projection module; c2 is the size of the projection picture; and D is the distance between the projection module and the straight line where the target object is located.
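For illustration only, the focal-length computation described above can be sketched in Python; the function and variable names below are assumptions, not part of the patent:

```python
def target_focal_length(distance_d, lcd_size_c1, picture_size_c2):
    """Sketch of the rule above: the target focal length is the distance D
    from the projection module to the target line, multiplied by the ratio of
    the liquid crystal sheet size c1 to the projection picture size c2."""
    return distance_d * (lcd_size_c1 / picture_size_c2)
```

For example, with D = 1.0 m, c1 = 0.01 m, and c2 = 0.5 m, the target focal length would be 0.02 m.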
Step S24: determining a first included angle between a perpendicular line segment from the projection module to a straight line where the target object is located and the first panel;
step S25: adjusting the angle between the lens of the projection module and the first panel to the first included angle, and adjusting the focal length of the projection module to the target focal length;
step S26: and projecting the preset information displayed by the display module through the projection module.
After the angle between the lens of the projection module and the first panel is adjusted to the first included angle and the focal length of the projection module is adjusted to the target focal length, the projection module projects the picture near the target object, specifically at the intersection point of the perpendicular segment from the projection module with the straight line where the target object is located.
The application provides an information processing method, which is applied to electronic equipment, wherein the electronic equipment is at least provided with a first panel, and a display module and a projection module are arranged on the first panel; the display module and the projection module are positioned on the same side of the first panel; the information processing method detects whether a target object exists on a plane on which the electronic equipment is placed; when the target object is detected, determining the distance between the projection module and a straight line where the target object is located, wherein the straight line where the target object is located is a straight line parallel to the lower edge of the first panel; determining a target focal length according to the distance between the projection module and the straight line where the target object is located, the size of a liquid crystal sheet of the projection module and the size of a projection picture; determining a first included angle between a perpendicular line segment from the projection module to a straight line where the target object is located and the first panel; adjusting the angle between the lens of the projection module and the first panel to the first included angle, and adjusting the focal length of the projection module to the target focal length; the preset information displayed by the display module is projected by the projection module, so that the information displayed by the display module can be projected onto a plane on which the electronic equipment is placed by the projection module, and a projection picture is near a target object.
Therefore, with the information processing method provided by the embodiment of the application, the target object can be an ultrasonic pen: when a user writes on a writing plane with the ultrasonic pen, information displayed on the electronic device side can be projected onto the writing plane, so that the user can see the related information without shifting his or her gaze to the electronic device, which improves the user's writing convenience.
When the target object is a glass keyboard or an infrared projection keyboard, the keys provide poor tactile feedback, so the user has to switch his or her gaze frequently between the keyboard and the electronic device, which is inconvenient. With the information processing method provided by the embodiment of the application, the user does not need to switch frequently between the keyboard and the electronic device, which improves the convenience of inputting information through a keyboard with poor tactile feedback, such as a glass keyboard or an infrared projection keyboard.
In the foregoing embodiment, preferably, an implementation flowchart of the determining the distance between the projection module and the straight line where the target object is located is shown in fig. 3, and may include:
step S31: determining a distance between the target object and the electronic device;
the distance between the object and the electronic device may be a length of a perpendicular segment from the object to the lower edge of the first panel.
In this embodiment of the application, when the area of the target object is relatively large (for example, a keyboard), the distance between the target object and the electronic device may be a distance determined by a point on the target object closest to the electronic device.
Step S32: determining a second included angle between the first panel and the horizontal plane;
The second included angle between the first panel and the horizontal plane may be obtained based on a gravity sensor; it may also be obtained based on an acceleration sensor or, of course, a tilt sensor.
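As an illustrative sketch (an assumption, not a procedure stated in the patent), the second included angle could be derived from a two-axis accelerometer reading, taking the components of gravity measured along the panel's in-plane vertical axis and along its surface normal:

```python
import math

def panel_tilt_angle(g_along_panel, g_along_normal):
    """Second included angle, in degrees, between the first panel and the
    horizontal plane. A panel lying flat measures gravity only along its
    normal (angle 0 deg); a vertical panel measures gravity only along its
    in-plane vertical axis (angle 90 deg)."""
    return math.degrees(math.atan2(abs(g_along_panel), abs(g_along_normal)))
```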
The execution order of steps S31 and S32 is not limited: step S31 may be executed before step S32, step S32 may be executed before step S31, or the two steps may be executed simultaneously.
Step S33: determining the distance between the projection module and the straight line where the target object is located according to the distance between the upper edge and the lower edge of the first panel, the distance between the target object and the electronic equipment, and the second included angle.
Fig. 4 is a schematic diagram of calculating the distance between the projection module and the straight line where the target object is located, according to the embodiment of the present application.
In the embodiment of the present application, the distance between the projection module and the straight line where the target object is located may be expressed as:
d = √(h^2 + y^2 + 2hy cos α)
wherein d is the distance between the projection module and the straight line where the target object is located; h is the distance between the upper edge and the lower edge of the first panel; α is the second included angle between the first panel and the horizontal plane; and y is the distance between the target object and the electronic device.
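One plausible reading of the Fig. 4 geometry, assuming the projection module sits at the upper edge of the first panel, applies the law of cosines to the triangle formed by the panel (length h), the segment of length y on the placement plane, and the sought distance d. The sketch below is an assumption, not a formula stated in the patent:

```python
import math

def projector_to_target_line_distance(h, alpha_deg, y):
    """d from the law of cosines: the panel (length h) and the segment of
    length y on the placement plane meet at an angle of 180 deg - alpha, so
    d^2 = h^2 + y^2 - 2*h*y*cos(180 deg - alpha)
        = h^2 + y^2 + 2*h*y*cos(alpha)."""
    alpha = math.radians(alpha_deg)
    return math.sqrt(h * h + y * y + 2.0 * h * y * math.cos(alpha))
```

When the panel is vertical (α = 90°), this reduces to the Pythagorean case d = √(h^2 + y^2).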
In the foregoing embodiment, preferably, the flowchart of one implementation of determining the distance between the target object and the electronic device is shown in fig. 5, and may include:
step S51: acquiring images of the target object through two image acquisition devices which are arranged on the first panel and have a second preset distance;
on the basis of the schematic diagram of the electronic device shown in fig. 1, positions of the two image capturing devices on the first panel are shown in fig. 6, and fig. 6 is another schematic structural diagram of the electronic device provided in the embodiment of the present application.
When the first panel is tilted on a plane, the first image capturing device 61 and the second image capturing device 62 of the two image capturing devices are below the display module.
Step S52: acquiring a depth image of the target object through the two acquired target object images;
step S53: determining a distance between the target object and the electronic device from the acquired depth image.
Specifically, the distances between the target object and each of the two image acquisition devices can be determined from the acquired depth image;
the distance between the target object and the electronic device can then be determined from the distances between the target object and the two image acquisition devices and the distance between the two image acquisition devices, specifically expressed by the formulas:
x = (a^2 - b^2 + c^2) / (2a)
y = √(c^2 - x^2)
wherein a is the distance between the two image acquisition devices; b is the distance between the target object and the first image acquisition device; c is the distance between the target object and the second image acquisition device; x is the distance from the foot of the perpendicular segment from the target object to the straight line through the two image acquisition devices, measured to the second image acquisition device; and y is the distance between the target object and the electronic device.
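The triangulation above can be sketched as follows (the names are illustrative; x is measured along the baseline from the device whose distance to the target is c):

```python
import math

def triangulate(a, b, c):
    """Given the baseline a between two acquisition devices, the distance b
    from the target to one device and c to the other, return (x, y): x is the
    foot of the target's perpendicular, measured from the device at distance
    c, and y is the perpendicular distance, i.e. the distance between the
    target object and the electronic device."""
    x = (a * a - b * b + c * c) / (2.0 * a)
    y = math.sqrt(max(c * c - x * x, 0.0))
    return x, y
```

For example, with devices 5 units apart and a target 5 units from one device and √20 units from the other, the target lies 3 units along the baseline and 4 units away from it.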
In the foregoing embodiment, preferably, when the target object is an ultrasonic pen, another implementation flowchart of determining the distance between the target object and the electronic device is shown in fig. 7, and may include:
step S71: sensing ultrasonic waves emitted by the target object through two sound pickups which are arranged on the first panel and are separated from each other by a first preset distance;
On the basis of the schematic diagram of the electronic device shown in fig. 1, the positions of the two sound pickups on the first panel are shown in fig. 8; fig. 8 is a schematic diagram of another structure of the electronic device according to the embodiment of the present application.
When the first panel is inclined on a plane, a first sound pickup 81 and a second sound pickup 82 of the two sound pickups are below the display module.
Step S72: determining the distance between the target object and the two sound pickups respectively through the time length required by the two sound pickups to sense the ultrasonic waves emitted by the target object;
the ultrasonic pen can be provided with an infrared emission source, infrared light is emitted through the infrared emission source at the side of the ultrasonic pen, the time when an infrared receiving device at the side of the electronic equipment receives the infrared light is determined as the starting time when the ultrasonic pen emits the ultrasonic wave, and the difference between the time when a first sound pick-up in the two sound pick-up senses the ultrasonic wave and the time when the infrared receiving device receives the infrared light is the time required by the first sound pick-up to sense the overtime wave emitted by the target object; the difference between the time when the second sound pick-up in the two sound pick-ups senses the ultrasonic waves and the time when the infrared receiving device receives the infrared light is the time required for the second sound pick-up to sense the ultrasonic waves emitted by the target object.
Step S73: and determining the distance between the target object and the electronic equipment according to the distances between the target object and the two sound pickups and the distance between the two sound pickups.
Fig. 9 is a schematic diagram illustrating a distance between a target object and the electronic device according to an embodiment of the disclosure;
In fig. 9, R(l) represents the first sound pickup; R(r) represents the second sound pickup; TX represents the target object; x represents the distance from the foot of the perpendicular, dropped from the target object onto the straight line through the two sound pickups, to the first sound pickup; and y represents the length of that perpendicular segment, that is, the distance between the target object and the electronic equipment.
The distance between the target object and the electronic device can be specifically expressed by the formulas:

y = √(c² - x²)

x = (a² - b² + c²) / (2a)

c = T₁ × 340

b = T₂ × 340

where y represents the distance between the target object and the electronic equipment; a is the distance between the two sound pickups; b is the distance between the target object and the second sound pickup; c is the distance between the target object and the first sound pickup; T₁ is the time required for the first sound pickup to sense the ultrasonic wave emitted by the target object; T₂ is the time required for the second sound pickup to sense the ultrasonic wave emitted by the target object; and 340 m/s is the speed of sound.
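As an illustrative sketch (the function name and the use of SI units are our own, not part of the patent), the formulas above can be evaluated as follows:

```python
import math

# Speed of sound used in the formulas above (m/s).
SPEED_OF_SOUND = 340.0

def target_distance(t1, t2, a):
    """Distance y between the target object and the electronic equipment.

    t1, t2 -- times (s) for the first/second sound pickup to sense the
              ultrasonic wave emitted by the target object
    a      -- first preset distance (m) between the two sound pickups
    """
    c = t1 * SPEED_OF_SOUND  # target object -> first sound pickup
    b = t2 * SPEED_OF_SOUND  # target object -> second sound pickup
    x = (a**2 - b**2 + c**2) / (2 * a)  # foot of perpendicular, from first pickup
    return math.sqrt(c**2 - x**2)       # perpendicular distance y
```

For a target object 0.1 m along the pickup baseline and 0.2 m away from it, with a 0.3 m baseline, the function recovers the 0.2 m distance.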
In the foregoing embodiment, preferably, the determining a first included angle between a perpendicular segment between the projection module and the straight line of the target object and the first panel may include:
and determining a first included angle between a perpendicular line segment from the projection module to the straight line where the target object is located and the first panel according to the distance between the projection module and the straight line where the target object is located, the distance between the upper edge and the lower edge of the first panel, and the distance between the target object and the electronic equipment.
Referring to fig. 4, the method for determining the first included angle can be expressed by the following formula (the law of cosines on the triangle with sides d, h and y):

cos β = (d² + h² - y²) / (2dh)

where β represents the first included angle between the perpendicular line segment from the projection module to the straight line where the target object is located and the first panel; h is the distance between the upper edge and the lower edge of the first panel; d is the distance between the projection module and the straight line where the target object is located; and y is the distance between the target object and the electronic equipment.
After the cosine value of β is obtained, the specific value of β can be derived.
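A minimal sketch of this computation, assuming the law-of-cosines reading of fig. 4 (the helper name is hypothetical, not from the patent):

```python
import math

def first_included_angle(d, h, y):
    """First included angle beta (radians) between the perpendicular
    segment of length d (projection module -> target line) and the first
    panel of edge-to-edge distance h, given the target-to-device
    distance y, via the law of cosines on the triangle with sides d, h, y."""
    return math.acos((d**2 + h**2 - y**2) / (2 * d * h))
```

For an equilateral configuration (d = h = y) the angle is 60 degrees, and when y² = d² + h² it is a right angle, as expected.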
In the foregoing embodiment, preferably, the preset information may be: first information input by the object to an electronic device, and/or second information associated with the first information.
Wherein the second information associated with the first information may include, but is not limited to, the following: spelling prompt information, character correction information, a selection interface, and the like. Of course, the entire display interface may also be projected.
Corresponding to the method embodiment, an embodiment of the present application further provides an information processing apparatus, and a schematic structural diagram of the information processing apparatus provided in the embodiment of the present application is shown in fig. 10, and the information processing apparatus may include:
a detection module 101, a first determination module 102, a second determination module 103, a third determination module 104, an adjustment module 105 and a projection control module 106; wherein,
the detection module 101 is configured to detect whether a target object is located on a plane on which the electronic device is placed, where the target object is used to input information to the electronic device;
the target object can be a keyboard, a physical keyboard (such as a glass keyboard), a virtual keyboard (such as an infrared projection keyboard) and the like. The object may also be an ultrasonic generating device, such as an ultrasonic pen.
The detection module 101 can determine whether there is a target object through image detection; if the target object is an ultrasonic-wave emitting device, whether the target object exists can also be judged by detecting, through a sound pickup, whether an ultrasonic wave is sensed.
The first determining module 102 is configured to determine, when the detecting module 101 detects a target object, a distance between the projection module and a straight line where the target object is located, where the straight line where the target object is located is a straight line parallel to a lower edge of the first panel;
the distance between the projection module and the straight line where the target object is located is the length of the perpendicular line segment from the projection module to the straight line where the target object is located.
The second determining module 103 is configured to determine a target focal length according to a distance between the projection module and a straight line where the target is located, a size of a liquid crystal sheet of the projection module, and a size of a projection picture;
the size of the liquid crystal screen can be the area of the liquid crystal screen; the size of the projected picture may be the area of the projected picture.
The target focal length may be determined by:
determining the ratio of the size of a liquid crystal sheet of the projection module to the size of a projection picture;
and the product of the distance between the projection module and the straight line where the target object is located and the ratio is the target focal length.
The target focal length can be formulated as:

f = d × (c₁ / c₂)

where f is the target focal length; c₁ is the size of the liquid crystal sheet of the projection module; c₂ is the size of the projection picture; and d is the distance between the projection module and the straight line where the target object is located.
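The ratio-and-product rule above can be sketched as follows (the helper name is our own, not from the patent):

```python
def target_focal_length(d, c1, c2):
    """Target focal length f = d * (c1 / c2): the product of the distance d
    (projection module -> line where the target object is located) and the
    ratio of the liquid crystal sheet size c1 to the projection picture
    size c2, as stated in the embodiment."""
    return d * (c1 / c2)
```

For example, projecting onto a picture 25 times the size of the liquid crystal sheet from 0.5 m away gives a target focal length of 0.02 m.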
The third determining module 104 is configured to determine a first included angle between a perpendicular line segment from the projection module to the straight line where the target object is located and the first panel;
the adjusting module 105 is configured to adjust an angle between the lens of the projection module and the first panel to the first included angle, and adjust a focal length of the projection module to the target focal length;
the projection control module 106 is configured to project the preset information displayed by the display module through the projection module.
After the angle between the lens of the projection module and the first panel is adjusted to the first included angle and the focal length of the projection module is adjusted to the target focal length, the projection module projects the projection picture near the target object, specifically at the foot of the perpendicular from the projection module to the straight line where the target object is located.
The application provides an information processing device, which is applied to electronic equipment, wherein the electronic equipment is at least provided with a first panel, and a display module and a projection module are arranged on the first panel; the display module and the projection module are positioned on the same side of the first panel; the information processing device detects whether a target object exists on a plane on which the electronic equipment is placed; when the target object is detected, determining the distance between the projection module and a straight line where the target object is located, wherein the straight line where the target object is located is a straight line parallel to the lower edge of the first panel; determining a target focal length according to the distance between the projection module and the straight line where the target object is located, the size of a liquid crystal sheet of the projection module and the size of a projection picture; determining a first included angle between a perpendicular line segment from the projection module to a straight line where the target object is located and the first panel; adjusting the angle between the lens of the projection module and the first panel to the first included angle, and adjusting the focal length of the projection module to the target focal length; the preset information displayed by the display module is projected by the projection module, so that the information displayed by the display module can be projected onto a plane on which the electronic equipment is placed by the projection module, and a projection picture is near a target object.
Therefore, through the information processing device provided by the embodiment of the application, the target object can be an ultrasonic pen, and when a user writes on a writing plane through the ultrasonic pen, information displayed on the electronic equipment side can be projected onto the writing plane through the information processing device provided by the embodiment of the application, so that the user can see the related information without transferring the sight line to the electronic equipment, and the writing convenience of the user is improved.
When the target object is a glass keyboard or an infrared projection keyboard, the user's tactile feedback from the keys is poor, so the user needs to frequently switch his or her line of sight between the keyboard and the electronic equipment, which is inconvenient. With the information processing apparatus provided by the embodiment of the present application, the user no longer needs to frequently switch the line of sight between the keyboard and the electronic equipment, which improves the convenience of inputting information to the electronic equipment through keyboards with poor tactile feedback, such as glass keyboards or infrared projection keyboards.
In the foregoing embodiment, preferably, a schematic structural diagram of the first determining module 102 is shown in fig. 11, and may include:
a first determination unit 111, a second determination unit 112, and a third determination unit 113; wherein,
the first determining unit 111 is used for determining the distance between the target object and the electronic device;
the distance between the object and the electronic device is the length of a perpendicular line segment from the object to the lower edge of the first panel.
In this embodiment of the application, when the area of the target object is relatively large (for example, a keyboard), the distance between the target object and the electronic equipment may be the distance determined by the point on the target object closest to the electronic equipment.
The second determining unit 112 is configured to determine a second included angle between the first panel and the horizontal plane;
the second determination unit 112 may be obtained based on a gravity sensor; it can also be obtained based on an acceleration sensor, and of course, it can also be obtained based on a tilt sensor.
The third determining unit 113 is configured to determine a distance between the projection module and a straight line where the target is located according to a distance between an upper edge and a lower edge of the first panel, a distance between the target and the electronic device, and the second included angle.
Referring to fig. 4, in the embodiment of the present application, the distance between the projection module and the straight line where the target object is located may be expressed (by the law of cosines, assuming the panel leans away from the target object so that the interior angle on the target side is 180° - α) as:

d = √(h² + y² + 2hy·cos α)

where d is the distance between the projection module and the straight line where the target object is located; h is the distance between the upper edge and the lower edge of the first panel; α is the second included angle between the first panel and the horizontal plane; and y is the distance between the target object and the electronic equipment.
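The patent's own equation image is not reproduced in this text; one consistent geometric reconstruction (an assumption: law of cosines with the panel leaning away from the target object) can be sketched as:

```python
import math

def module_to_target_line_distance(h, y, alpha):
    """Distance d from the projection module (at the upper edge of a first
    panel of edge-to-edge distance h) to the straight line where the target
    object lies (y from the lower edge), with the panel tilted at the
    second included angle alpha (radians) to the horizontal plane.
    Assumes the panel leans away from the target object, so the interior
    angle on the target side is pi - alpha."""
    return math.sqrt(h**2 + y**2 + 2 * h * y * math.cos(alpha))
```

As a sanity check, with the panel vertical (alpha = 90°) the expression reduces to the hypotenuse √(h² + y²).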
In the foregoing embodiment, preferably, a schematic structural diagram of the first determining unit 111 is shown in fig. 12, and may include:
two image capturing devices, a first image capturing device 121 and a second image capturing device 122 in fig. 12, disposed on the first panel and spaced apart by a second predetermined distance; further comprising:
a third determination subunit 123 and a fourth determination subunit 124; wherein,
the third determining subunit 123 is configured to obtain a depth image of the target object through two target object images acquired by the two image acquiring devices at the same time;
the fourth determining subunit 124 is configured to determine a distance between the target object and the electronic device through the acquired depth map.
Specifically, the distances between the target object and the two image acquisition devices can be determined through the acquired depth image;
the distance between the target object and the electronic equipment can be determined according to the distance between the target object and the two image acquisition devices and the distance between the two image acquisition devices.
The positions of the two image capturing devices on the first panel are shown in fig. 6, and when the first panel is tilted on a plane, the two image capturing devices are below the display module.
In the above embodiment, preferably, when the object is an ultrasonic pen, another schematic structural diagram of the first determining unit 111 is shown in fig. 13, and may include:
two microphones, a first microphone 131 and a second microphone 132 in fig. 13, disposed on the first panel at a first predetermined distance; further comprising:
a first determination subunit 133 and a second determination subunit 134; wherein,
the first determining subunit 133 is configured to determine, through the two microphones, the distances between the object and the two microphones, respectively, by the time length required for the two microphones to sense the ultrasonic waves emitted by the object;
The ultrasonic pen may be provided with an infrared emission source, and infrared light is emitted from this source on the ultrasonic pen side. The time at which an infrared receiving device on the electronic equipment side receives the infrared light is taken as the starting time at which the ultrasonic pen emits the ultrasonic wave. The difference between the time at which the first of the two sound pickups senses the ultrasonic wave and the time at which the infrared receiving device receives the infrared light is the time required for the first sound pickup to sense the ultrasonic wave emitted by the target object; likewise, the difference between the time at which the second sound pickup senses the ultrasonic wave and the time at which the infrared receiving device receives the infrared light is the time required for the second sound pickup to sense the ultrasonic wave emitted by the target object.
The second determining subunit 134 is configured to determine the distance between the target object and the electronic equipment according to the distances between the target object and the two sound pickups and the distance between the two sound pickups.
For the specific method of calculating the distance between the target object and the electronic device, reference may be made to the embodiment shown in fig. 9, which is not described herein again.
The positions of the two sound pickups on the first panel are shown in fig. 8; when the first panel stands tilted on a plane, the two sound pickups are located below the display module.
In the foregoing embodiment, preferably, the third determining module 104 is specifically configured to determine, according to a distance between the projection module and the straight line where the target object is located, a distance between an upper edge and a lower edge of the first panel, and a distance between the target object and the electronic device, a first included angle between a perpendicular line segment between the projection module and the straight line where the target object is located and the first panel.
Referring to fig. 4, the method for determining the first included angle can be expressed by the following formula (the law of cosines on the triangle with sides d, h and y):

cos β = (d² + h² - y²) / (2dh)

where β represents the first included angle between the perpendicular line segment from the projection module to the straight line where the target object is located and the first panel; h is the distance between the upper edge and the lower edge of the first panel; d is the distance between the projection module and the straight line where the target object is located; and y is the distance between the target object and the electronic equipment.
After the cosine value of β is obtained, the specific value of β can be derived.
In the above embodiment, preferably, the projection control module 106 is specifically configured to project, through the projection module, first information input to the electronic device by the object and displayed by the display module, and/or second information associated with the first information.
Wherein the second information associated with the first information may include, but is not limited to, the following: spelling prompt information, character correction information, a selection interface, and the like. Of course, the entire display interface may also be projected.
An embodiment of the present application further provides an electronic device, where the electronic device has the information processing apparatus according to any one of the above apparatus embodiments.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present application. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the application. Thus, the present application is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
Claims (13)
1. An information processing method is applied to electronic equipment, and is characterized in that the electronic equipment is at least provided with a first panel, and a display module and a projection module are arranged on the first panel; the display module and the projection module are positioned on the same side of the first panel; the method comprises the following steps:
detecting whether a target object exists on a plane on which the electronic equipment is placed, wherein the target object is used for inputting information to the electronic equipment;
when the target object is detected, determining the distance between the projection module and a straight line where the target object is located, wherein the straight line where the target object is located is a straight line parallel to the lower edge of the first panel;
determining a target focal length according to the distance between the projection module and the straight line where the target object is located, the size of a liquid crystal sheet of the projection module and the size of a projection picture;
determining a first included angle between a perpendicular line segment from the projection module to a straight line where the target object is located and the first panel;
adjusting the angle between the lens of the projection module and the first panel to the first included angle, and adjusting the focal length of the projection module to the target focal length;
and projecting the preset information displayed by the display module through the projection module.
2. The method of claim 1, wherein determining the distance between the projection module and the line of the target object comprises:
determining a distance between the target object and the electronic device;
determining a second included angle between the first panel and the horizontal plane;
and determining the distance between the projection module and the straight line where the target is located according to the distance between the upper edge and the lower edge of the first panel, the distance between the target and the electronic equipment and the second included angle.
3. The method of claim 2, wherein the target is an ultrasonic pen; the determining the distance between the target object and the electronic device comprises:
sensing ultrasonic waves emitted by the target object through two sound pickups which are arranged on the first panel and are separated from each other by a first preset distance;
determining the distance between the target object and the two sound pickups respectively through the time length required by the two sound pickups to sense the ultrasonic waves emitted by the target object;
and determining the distance between the target object and the electronic equipment according to the distances between the target object and the two sound pickups and the distance between the two sound pickups.
4. The method of claim 2, wherein the determining the distance between the target object and the electronic device comprises:
acquiring images of the target object through two image acquisition devices which are arranged on the first panel and have a second preset distance;
acquiring a depth image of the target object through the two acquired target object images;
determining a distance between the target object and the electronic device from the acquired depth image.
5. The method of claim 1, wherein determining a first angle between a segment of a perpendicular between the projection module and the line of the target object and the first panel comprises:
and determining a first included angle between a perpendicular line segment from the projection module to the straight line where the target object is located and the first panel according to the distance between the projection module and the straight line where the target object is located, the distance between the upper edge and the lower edge of the first panel, and the distance between the target object and the electronic equipment.
6. The method of claim 1, wherein the preset information is: first information input by the object to an electronic device, and/or second information associated with the first information.
7. An information processing device is applied to electronic equipment, and is characterized in that the electronic equipment is at least provided with a first panel, and a display module and a projection module are arranged on the first panel; the display module and the projection module are positioned on the same side of the first panel; the device comprises:
the detection module is used for detecting whether a target object exists on a plane on which the electronic equipment is placed, and the target object is used for inputting information to the electronic equipment;
the first determining module is used for determining the distance between the projection module and a straight line where the target object is located when the detection module detects the target object, wherein the straight line where the target object is located is a straight line parallel to the lower edge of the first panel;
the second determining module is used for determining a target focal length according to the distance between the projection module and the straight line where the target object is located, the size of a liquid crystal sheet of the projection module and the size of a projection picture;
the third determining module is used for determining a first included angle between a perpendicular line segment from the projection module to a straight line where the target object is located and the first panel;
the adjusting module is used for adjusting the angle between the lens of the projection module and the first panel to the first included angle and adjusting the focal length of the projection module to the target focal length;
and the projection control module is used for projecting the preset information displayed by the display module through the projection module.
8. The apparatus of claim 7, wherein the first determining module comprises:
a first determination unit configured to determine a distance between the target object and the electronic device;
the second determining unit is used for determining a second included angle between the first panel and the horizontal plane;
and the third determining unit is used for determining the distance between the projection module and the straight line where the target is located according to the distance between the upper edge and the lower edge of the first panel, the distance between the target and the electronic equipment and the second included angle.
9. The apparatus according to claim 8, wherein the object is an ultrasonic pen, and the first determination unit includes:
the two sound pickups are arranged on the first panel and are separated by a first preset distance;
the first determining subunit is used for determining the distances between the target object and the two sound pickups respectively through the time length required by the two sound pickups to sense the ultrasonic waves emitted by the target object;
and the second determining subunit is used for determining the distance between the target object and the electronic equipment according to the distances between the target object and the two sound pickups and the distance between the two sound pickups.
10. The apparatus according to claim 8, wherein the first determining unit comprises:
the two image acquisition devices are arranged on the first panel and have a second preset distance;
the third determining subunit is used for acquiring the depth image of the target object through two target object images acquired by the two image acquisition devices simultaneously;
a fourth determining subunit, configured to determine, from the acquired depth image, a distance between the target object and the electronic device.
11. The apparatus according to claim 7, wherein the third determining module is specifically configured to determine a first included angle between a perpendicular segment between the projection module and the straight line of the target object and the first panel according to a distance between the projection module and the straight line of the target object, a distance between an upper edge and a lower edge of the first panel, and a distance between the target object and the electronic device.
12. The apparatus of claim 7, wherein the projection control module is specifically configured to project, through the projection module, first information input to the electronic device by the object displayed by the display module and/or second information associated with the first information.
13. An electronic device is characterized by at least comprising a first panel, wherein a display module and a projection module are arranged on the first panel; the display module and the projection module are positioned on the same side of the first panel; the electronic device further comprises an information processing apparatus according to any one of claims 7 to 12.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201410455276.9A CN104182128B (en) | 2014-09-09 | 2014-09-09 | Information processing method, device and electronic equipment |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| CN104182128A true CN104182128A (en) | 2014-12-03 |
| CN104182128B CN104182128B (en) | 2017-08-29 |
Family
ID=51963228
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN201410455276.9A Active CN104182128B (en) | 2014-09-09 | 2014-09-09 | Information processing method, device and electronic equipment |
Country Status (1)
| Country | Link |
|---|---|
| CN (1) | CN104182128B (en) |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN105786337A (en) * | 2014-12-22 | 2016-07-20 | 联想(北京)有限公司 | Information processing method and electronic device |
Citations (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2006033245A1 (en) * | 2004-09-21 | 2006-03-30 | Nikon Corporation | Electronic device |
| US8228315B1 (en) * | 2011-07-12 | 2012-07-24 | Google Inc. | Methods and systems for a virtual input device |
| US20130335379A1 (en) * | 2012-03-31 | 2013-12-19 | Sameer Sharma | Computing device, apparatus and system for display and integrated projection |
| CN103517019A (en) * | 2013-09-16 | 2014-01-15 | Wingtech Communication Co., Ltd. | Imaging system of electronic device and cell phone with imaging system |
| CN103727951A (en) * | 2014-01-21 | 2014-04-16 | Guangdong Institute of Automation | Novel non-screen navigation system |
| CN103793061A (en) * | 2014-03-03 | 2014-05-14 | Lenovo (Beijing) Co., Ltd. | Control method and electronic equipment |
Cited By (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN105786337A (en) * | 2014-12-22 | 2016-07-20 | Lenovo (Beijing) Co., Ltd. | Information processing method and electronic device |
| CN105786337B (en) * | 2014-12-22 | 2019-02-05 | Lenovo (Beijing) Co., Ltd. | Information processing method and electronic device |
Legal Events
| Code | Title |
|---|---|
| C06 | Publication |
| PB01 | Publication |
| C10 | Entry into substantive examination |
| SE01 | Entry into force of request for substantive examination |
| GR01 | Patent grant |