
CN105208181B - Information processing method and electronic equipment


Info

Publication number: CN105208181B
Authority: CN (China)
Prior art keywords: electronic device, image, information, display unit, area
Legal status: Active (granted)
Application number: CN201410258450.0A
Other languages: Chinese (zh)
Other versions: CN105208181A
Inventor: 杨杰
Current Assignee: Lenovo Beijing Ltd
Original Assignee: Lenovo Beijing Ltd
Application filed by Lenovo Beijing Ltd
Priority to CN201410258450.0A
Publication of CN105208181A
Application granted
Publication of CN105208181B

Landscapes

  • User Interface Of Digital Computer (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

The invention discloses an information processing method and an electronic device. The method includes: controlling a display unit of a second electronic device to display a first image; receiving, through a communication channel, a second image acquired by a first electronic device; parsing the second image to obtain second image information corresponding to the second image; comparing the second image information with first image information corresponding to the first image to obtain posture information of the first electronic device relative to the second electronic device; and determining third information according to the posture information and sending the third information to the first electronic device for display by the first electronic device. By means of the invention, devices can switch seamlessly between different device-collaboration scenarios; the operation is simple, conforms to user habits, and improves the user experience.

Description

Information processing method and electronic equipment
Technical Field
The present invention relates to information processing technologies, and in particular, to an information processing method and an electronic device.
Background
Scenarios in which electronic devices are used cooperatively are becoming increasingly common; for example, information such as files or program data can be exchanged between a mobile phone and a tablet computer. However, the related art does not yet provide an effective solution for switching smoothly between different device-cooperation scenarios, for example, switching from cooperation in an office scenario to cooperation in a game scenario in a way that simplifies user operations and saves operation time.
Disclosure of Invention
Embodiments of the invention provide an information processing method and an electronic device that enable smooth switching between different scenarios in device cooperation, are quick and convenient to operate, and deliver a good user experience.
The technical scheme of the embodiment of the invention is realized as follows:
the embodiment of the invention provides an information processing method, which is applied to second electronic equipment with a display unit, wherein the second electronic equipment supports the establishment of a communication channel with first electronic equipment;
the method comprises the following steps:
when the second electronic device determines that the first electronic device is placed on the display unit of the second electronic device, controlling the display unit of the second electronic device to display a first image;
receiving, through the communication channel, a second image acquired by the first electronic device, wherein the second image is obtained by the first electronic device capturing the first image;
analyzing the second image to obtain second image information corresponding to the second image;
comparing the second image information with first image information corresponding to the first image to obtain attitude information of the first electronic equipment relative to the second electronic equipment;
and determining third information according to the attitude information and sending the third information to the first electronic equipment for displaying by the first electronic equipment.
The embodiment of the invention provides second electronic equipment, which comprises a communication unit, a display unit, a detection unit, a control unit and an analysis unit; wherein,
the communication unit is used for supporting the establishment of a communication channel with the first electronic equipment;
the detection unit is used for detecting whether the first electronic equipment is arranged on the display unit or not;
the control unit is used for controlling the display unit to display a first image when the detection unit detects that the first electronic equipment is placed on the display unit;
the communication unit is used for receiving a second image acquired by the first electronic equipment through the communication channel, wherein the second image is obtained by acquiring the first image by the first electronic equipment;
the analysis unit is used for analyzing the second image to obtain second image information corresponding to the second image;
comparing the second image information with first image information corresponding to the first image to obtain attitude information of the first electronic equipment relative to the second electronic equipment;
the control unit is further configured to determine third information according to the posture information and trigger the communication unit to send the third information to the first electronic device for display by the first electronic device.
In the embodiment of the invention, the second electronic device determines the posture of the first electronic device relative to the second electronic device by comparing the first image information with the second image information, and determines the information to be displayed by the first electronic device based on that posture. Therefore, when different postures correspond to different collaboration scenarios, changing the posture of the first electronic device relative to the second electronic device causes the information corresponding to the new posture to be determined and displayed by the first electronic device, realizing smooth switching of scenarios in device collaboration; the operation is convenient and fast, conforms to user habits, and improves the user experience.
Drawings
Fig. 1a is a schematic flow chart illustrating an implementation of an information processing method according to an embodiment of the present invention;
fig. 1b is a schematic view of a scenario of device cooperation according to a first embodiment of the present invention;
fig. 2a is a schematic flow chart illustrating an implementation of an information processing method according to a second embodiment of the present invention;
fig. 2b is a schematic view of a scenario of device cooperation in the second embodiment of the present invention;
fig. 3a is a schematic flow chart illustrating an implementation of an information processing method according to a third embodiment of the present invention;
fig. 3b to fig. 3g are schematic views of a scenario of device cooperation in a third embodiment of the present invention;
fig. 4a is a schematic flow chart illustrating an implementation of an information processing method according to a fourth embodiment of the present invention;
fig. 4b to fig. 4g are schematic views of a scenario of device cooperation in the fourth embodiment of the present invention;
fig. 5a is a schematic flow chart illustrating an implementation of the information processing method according to the fifth embodiment of the present invention;
fig. 5b to fig. 5d are schematic views of scenes of device cooperation in the fifth embodiment of the present invention;
fig. 6 is a schematic structural diagram of a second electronic device according to a sixth embodiment of the invention;
fig. 7 is a schematic structural diagram of a second electronic device in a seventh embodiment of the invention.
Detailed Description
The invention is described in further detail below with reference to the figures and specific examples.
Example one
This embodiment describes an information processing method, applied to a second electronic device with a display unit, where the second electronic device supports establishing a communication channel with a first electronic device, and the type of the communication channel includes: a communication channel for data transmission based on invisible light (e.g., infrared light); a communication channel for data transmission based on an Internet Protocol (IP); a communication channel for data transmission based on Wireless compatibility authentication (WiFi) or Wireless compatibility authentication Direct connection (WiFi-Direct) protocol; and the communication channel is used for data transmission based on Bluetooth (Bluetooth) and Zigbee (Zigbee) protocols.
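The channel type is an implementation choice. As an illustration only, the following minimal Python sketch opens an IP-based channel of the kind listed above (a plain TCP connection, for example over WiFi or WiFi-Direct) and sends a length-prefixed JSON message; the address, port, and message framing are assumptions of this sketch and are not specified by the patent.

```python
import json
import socket

TABLET_ADDR = ("192.168.49.1", 9000)  # hypothetical address of the second electronic device

def open_channel():
    """Open an IP-based communication channel to the second electronic device."""
    return socket.create_connection(TABLET_ADDR, timeout=5.0)

def send_message(sock, payload: dict):
    """Send one length-prefixed JSON message over the channel (framing is an assumption)."""
    data = json.dumps(payload).encode("utf-8")
    sock.sendall(len(data).to_bytes(4, "big") + data)
```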
As shown in fig. 1a, the information processing method according to the present embodiment includes the steps of:
step 101, when the second electronic device determines that the first electronic device is placed on the display unit of the second electronic device, the second electronic device is controlled to display a first image on the display unit of the second electronic device.
When a user of the first electronic device places the first electronic device on the display unit of the second electronic device, the user may send a control instruction to the second electronic device indicating that the first electronic device has been placed on the display unit of the second electronic device; alternatively, the user may trigger the first electronic device to send such a control instruction to the second electronic device.
Step 102, receiving a second image acquired by the first electronic equipment through the communication channel.
The second image is obtained by the first electronic device controlling an image acquisition unit (such as a camera) to acquire the first image.
Step 103, analyzing the second image to obtain second image information corresponding to the second image.
Step 104, comparing the second image information with first image information corresponding to the first image to obtain the posture information of the first electronic equipment relative to the second electronic equipment.
The second image information includes information of the pixel points of the second image, for example, Red, Green, and Blue (RGB) information of each pixel point; the first image information includes information of the pixel points of the first image, for example, RGB information of each pixel point. The first image displayed by the second electronic device may be an image with locally unique identification features; that is, however the first image is divided, each divided part has unique identification features, where the identification features include contour features and RGB features. Therefore, when the first electronic device is placed on the second electronic device, comparing the second image information corresponding to the acquired second image with the first image information of the first image determines the position on the display unit of the second electronic device at which the first electronic device captured the image, and therefore the posture of the first electronic device relative to the second electronic device.
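As an illustration of the comparison in step 104, the sketch below locates an unrotated, unscaled capture inside the first image by a brute-force sum-of-squared-differences search over candidate positions; handling of rotation is described in the third embodiment. The NumPy array representation and the function name are assumptions of this sketch.

```python
import numpy as np

def locate_capture(first_img: np.ndarray, second_img: np.ndarray):
    """Return the (row, col) offset in first_img where second_img matches best.

    Both arguments are H x W x 3 RGB arrays; second_img is assumed to be an
    unrotated, unscaled crop of first_img (the simplest case of step 104).
    """
    H, W = first_img.shape[:2]
    h, w = second_img.shape[:2]
    best, best_pos = None, (0, 0)
    for r in range(H - h + 1):
        for c in range(W - w + 1):
            window = first_img[r:r + h, c:c + w].astype(float)
            ssd = np.sum((window - second_img.astype(float)) ** 2)
            if best is None or ssd < best:
                best, best_pos = ssd, (r, c)
    return best_pos  # position of the first electronic device on the display unit
```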
Step 105, determining third information according to the posture information.
Step 106, sending the third information to the first electronic device through the communication channel.
The third information is used for display by the first electronic device.
In this embodiment, different posture information corresponds to different third information; that is, when the posture of the first electronic device relative to the second electronic device changes, the first electronic device receives and displays different third information. In practical applications, when the second electronic device has a large display unit, the display unit is often divided into different functional areas, such as an office area and a game area. The functional area in which the first electronic device is located can therefore be determined from the posture of the first electronic device relative to the second electronic device, and the third information corresponding to that functional area is determined and sent to the first electronic device through the communication channel, so that different functional areas can interact based on different third information.
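As a concrete illustration of mapping the determined position to a functional area and picking the corresponding third information, here is a minimal Python sketch; the two-area layout, the split at half the display width, and the payload fields are hypothetical and only mirror the office/game division of fig. 1b.

```python
# Hypothetical layout matching fig. 1b: left half is the office area, right half the game area.
FUNCTIONAL_AREAS = {
    "office": (0.0, 0.5),  # start and end as fractions of the display width
    "game": (0.5, 1.0),
}

def functional_area(x_fraction: float) -> str:
    """Map the horizontal position of the first device (fraction of the second
    device's display width) to the functional area it lies in."""
    for name, (lo, hi) in FUNCTIONAL_AREAS.items():
        if lo <= x_fraction < hi:
            return name
    return "office"

def third_information(area: str) -> dict:
    """Choose the information to push to the first device; payload contents are illustrative."""
    if area == "office":
        return {"type": "document", "name": "currently_displayed_document"}
    return {"type": "game_state", "session": "current"}
```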
The information processing method described in the present embodiment will be described below with reference to specific usage scenarios.
Effects of realization
The first electronic device is a mobile phone and the second electronic device is a tablet computer. As shown in fig. 1b, the display unit of the tablet computer is divided into two functional areas: a game area and an office area. When the mobile phone is placed in one of these areas, that functional area is activated, and information corresponding to the activated functional area (corresponding to the third information) is sent to the mobile phone.
Implementation procedure
1) When the mobile phone and the tablet computer are within effective Bluetooth communication range, a Bluetooth connection is established.
Of course, other types of communication channels may also be established between the mobile phone and the tablet computer, and the types of the communication channels are as described above, and are not described herein again.
2) When the tablet computer confirms that the mobile phone has been placed, it displays a first image with locally unique identification features in place of the functional areas shown in fig. 1b, so that the mobile phone can capture the first image with its camera to obtain a second image; the tablet computer may redisplay the functional areas shown in fig. 1b after the first image has been displayed for a predetermined time (long enough for the mobile phone to acquire the second image).
3) The tablet computer receives the second image through the Bluetooth connection and analyzes it to obtain second image information, which includes the RGB information of the pixel points of the second image and/or the contour feature information of the second image.
4) The tablet computer compares the second image information with the first image information corresponding to the first image to determine the posture of the mobile phone relative to the tablet computer, that is, in which functional area of the tablet computer display unit the mobile phone is placed.
5) The tablet computer determines the information corresponding to the functional area in which the mobile phone is located. Taking the mobile phone placed in the office area shown in fig. 1b as an example, the tablet computer can send the document currently displayed in the office area to the mobile phone through the Bluetooth connection, so that the user of the mobile phone can conveniently process the document.
In this embodiment, the second electronic device determines the posture of the first electronic device relative to the second electronic device by comparing the first image information with the second image information, and determines the information to be displayed by the first electronic device based on that posture. Therefore, when different postures correspond to different collaboration scenarios, changing the posture of the first electronic device relative to the second electronic device causes the information corresponding to the new posture to be determined and displayed by the first electronic device, realizing smooth switching of scenarios in device collaboration; the operation is convenient and fast, conforms to user habits, and improves the user experience.
Example two
This embodiment describes an information processing method, applied to a second electronic device with a display unit, where the second electronic device supports establishing a communication channel with a first electronic device, and the type of the communication channel includes: a communication channel for data transmission based on invisible light (e.g., infrared light); a communication channel for data transmission based on IP; a communication channel for data transmission based on WiFi or WiFi-Direct protocol; and the communication channel is used for carrying out data transmission based on Bluetooth and Zigbee protocols.
As shown in fig. 2a, the information processing method according to the present embodiment includes the steps of:
step 201, when the second electronic device determines that the first electronic device is placed on the display unit of the second electronic device, the second electronic device is controlled to display the first image on the display unit of the second electronic device.
When a user of the first electronic device places the first electronic device on the display unit of the second electronic device, the user may send a control instruction to the second electronic device indicating that the first electronic device has been placed on the display unit of the second electronic device; alternatively, the user may trigger the first electronic device to send such a control instruction to the second electronic device.
In this embodiment, the second electronic device may determine whether the first electronic device is placed on its display unit by means of a conductive unit provided in the contact area between the first electronic device and the second electronic device; the conductive unit may be a conductive body such as metal. The second electronic device periodically detects the contact points sensed on its display unit and judges whether the positions of the detected contact points correspond to the positions of the conductive unit; when they correspond, it is determined that the first electronic device is placed on the display unit of the second electronic device. For example, when the conductive unit is disposed in a zigzag pattern on the back of the first electronic device, and the second electronic device simultaneously detects contact points arranged in that zigzag pattern on its own display unit, it determines that the first electronic device has been placed on its display unit.
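The sketch below shows one way the second electronic device could check whether the sensed contact points match the layout of the conductive unit on the back of the first electronic device. The tolerance value, the assumption that contact points are reported in a known order, and the omission of a rotation search are simplifications of this sketch, not requirements of the method.

```python
import math

def matches_conductive_layout(touch_points, expected_layout, tol_mm=5.0):
    """Compare sensed contact points against the conductive unit's known layout.

    touch_points and expected_layout are lists of (x, y) coordinates in the same
    units; the comparison uses offsets relative to the first point, so the
    absolute placement on the display does not matter.
    """
    if len(touch_points) != len(expected_layout):
        return False
    tx0, ty0 = touch_points[0]
    ex0, ey0 = expected_layout[0]
    for (tx, ty), (ex, ey) in zip(touch_points, expected_layout):
        dx = (tx - tx0) - (ex - ex0)
        dy = (ty - ty0) - (ey - ey0)
        if math.hypot(dx, dy) > tol_mm:
            return False
    return True  # the first electronic device is taken to be placed on the display unit
```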
Step 202, receiving a second image acquired by the first electronic device through the communication channel.
The second image is obtained by acquiring the first image by the first electronic equipment.
Step 203, analyzing the second image to obtain second image information corresponding to the second image.
Step 204, comparing the second image information with the first image information corresponding to the first image to obtain the posture information of the first electronic device relative to the second electronic device.
The second image information includes information of the pixel points of the second image, for example, RGB information of each pixel point; the first image information includes information of the pixel points of the first image, for example, RGB information of each pixel point. The first image displayed by the second electronic device may be an image with locally unique identification features; that is, however the first image is divided, each divided part has unique identification features, where the identification features include contour features and RGB features. Therefore, when the first electronic device is placed on the second electronic device, comparing the second image information corresponding to the acquired second image with the first image information of the first image determines the position on the display unit of the second electronic device at which the first electronic device captured the image, and therefore the posture of the first electronic device relative to the second electronic device.
Step 205, determining a first area according to the posture information of the first electronic device relative to the second electronic device and the display area information of the display unit of the second electronic device.
The first area is an area covered by the first electronic device in the display area of the display unit of the second electronic device.
Step 206, display area information of the display unit of the first electronic device is acquired.
The display area information includes length and width information of a display area of the display unit of the first electronic device.
Step 207, obtaining the third information according to the display area information, the first area, and the third image.
The third information is used for being displayed by the first electronic device, the third information includes a fourth image, and the fourth image is an image corresponding to an area covered by the first electronic device in the third image when the third image is displayed by the second electronic device display unit; in practical applications, the third information may further include information associated with the fourth image, for example, when the fourth image is an image of a text, the third information may further include the electronic document corresponding to the text.
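As an illustration of steps 205 to 207, the sketch below cuts the fourth image (the part of the third image covered by the first electronic device) out of the third image and bundles it with any associated data; the array representation and the field names are assumptions of this sketch.

```python
import numpy as np

def fourth_image(third_img: np.ndarray, first_area):
    """Extract the region of the third image covered by the first electronic device.

    third_img: H x W x 3 RGB array currently shown on the second device.
    first_area: (top, left, height, width) in display pixels, as determined in step 205.
    """
    top, left, h, w = first_area
    return third_img[top:top + h, left:left + w].copy()

def build_third_information(third_img, first_area, associated=None):
    """Bundle the fourth image with associated information (for example, the
    electronic document behind a displayed text region)."""
    return {"image": fourth_image(third_img, first_area), "associated": associated}
```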
Step 208, sending the third information to the first electronic device.
The third information is used for displaying by the first electronic device.
In this embodiment, the third information is the information displayed on the display unit of the second electronic device at the position where the first electronic device is placed. In the course of implementing the invention, the inventor found that the scenario shown in fig. 2b arises in device cooperation: when the second electronic device displays the third image, its content is divided into two areas, an office area and a game area. When a user needs to process a document displayed in the office area, it is more convenient, and better matches user habits, for the user to place the first electronic device on the region of the second electronic device display unit corresponding to the information to be processed (corresponding to the first area). The second electronic device then determines the third information and sends it to the first electronic device for the first electronic device to display, so that the user can process it conveniently; the third information includes the image corresponding to the first area in the third image (that is, the image of the document to be processed) and also includes the electronic document corresponding to that image.
The information processing method described in the present embodiment will be described below with reference to specific usage scenarios.
Effects of realization
The first electronic device is a mobile phone and the second electronic device is a tablet computer. As shown in fig. 2b, the display unit of the tablet computer is divided into two functional areas: a game area and an office area. When the mobile phone is placed in a functional area, that area of the tablet computer is activated, and the image corresponding to the area covered by the mobile phone (together with information associated with the covered image) is sent to the mobile phone.
Implementation procedure
1) When the mobile phone and the tablet computer are within effective Bluetooth communication range, a Bluetooth connection is established.
Of course, other types of communication channels may also be established between the mobile phone and the tablet computer, and the types of the communication channels are as described above, and are not described herein again.
2) When the tablet computer confirms that the mobile phone has been placed, it displays a first image with locally unique identification features in place of the functional areas shown in fig. 2b (corresponding to the third image), so that the mobile phone can capture the first image with its camera to obtain a second image; the tablet computer may redisplay the functional areas shown in fig. 2b after the first image has been displayed for a predetermined time (long enough for the mobile phone to acquire the second image).
3) The tablet computer receives the second image through the Bluetooth connection and analyzes it to obtain second image information, which includes the RGB information of the pixel points of the second image and/or the contour feature information of the second image.
4) The tablet computer compares the second image information with the first image information corresponding to the first image to determine the posture of the mobile phone relative to the tablet computer.
5) The tablet computer determines the area of its display unit covered by the mobile phone according to the display area information of its display unit and the posture information of the mobile phone relative to the tablet computer.
6) The tablet computer acquires the display area information of the mobile phone display unit and determines the third information according to that information, the display area information of its own display unit, and the image shown in fig. 2b; the third information includes the image corresponding to the area covered by the mobile phone in fig. 2b.
In practical applications, the display area of the mobile phone is often smaller than the area it covers on the display unit of the tablet computer; therefore, the image covered by the mobile phone in fig. 2b needs to be adapted according to the display area information of the mobile phone display unit, so that the image corresponding to the covered area in fig. 2b can be displayed completely on the mobile phone display unit.
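A minimal sketch of this adaptation step, assuming the covered image and the phone display size are known in pixels; the nearest-neighbour downscaling only keeps the example dependency-free, and a real implementation would use a proper resampling routine.

```python
import numpy as np

def fit_to_display(covered_img: np.ndarray, phone_h: int, phone_w: int) -> np.ndarray:
    """Downscale the covered image so it fits entirely on the phone display,
    preserving the aspect ratio."""
    h, w = covered_img.shape[:2]
    scale = min(phone_h / h, phone_w / w, 1.0)  # never upscale
    out_h, out_w = max(1, int(h * scale)), max(1, int(w * scale))
    rows = (np.arange(out_h) / scale).astype(int)
    cols = (np.arange(out_w) / scale).astype(int)
    return covered_img[rows[:, None], cols[None, :]]
```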
In this embodiment, the second electronic device determines the posture of the first electronic device relative to the second electronic device by comparing the first image information with the second image information, so that when different postures correspond to different collaboration scenarios, the scenarios can be switched smoothly by changing the posture of the first electronic device relative to the second electronic device. Moreover, the image covered by the first electronic device in each scenario, together with its related information, is sent to the first electronic device, which makes it convenient for the user to operate through the first electronic device; the operation is convenient and fast, conforms to user habits, and improves the user experience.
EXAMPLE III
This embodiment describes an information processing method, applied to a second electronic device with a display unit, where the second electronic device supports establishing a communication channel with a first electronic device, and the type of the communication channel includes: a communication channel for data transmission based on invisible light (e.g., infrared light); a communication channel for data transmission based on IP; a communication channel for data transmission based on WiFi or WiFi-Direct protocol; and the communication channel is used for carrying out data transmission based on Bluetooth and Zigbee protocols.
As shown in fig. 3a, the information processing method according to the present embodiment includes the steps of:
step 301, when the second electronic device determines that the first electronic device is placed on the display unit of the second electronic device, controlling the display unit of the second electronic device to display a first image.
When a user of the first electronic device places the first electronic device on the display unit of the second electronic device, the user may send a control instruction to the second electronic device indicating that the first electronic device has been placed on the display unit of the second electronic device; alternatively, the user may trigger the first electronic device to send such a control instruction to the second electronic device.
In this embodiment, the second electronic device may determine whether the first electronic device is placed on its display unit by means of a conductive unit provided in the contact area between the first electronic device and the second electronic device; the conductive unit may be a conductive body such as metal. The second electronic device periodically detects the contact points sensed on its display unit and judges whether the positions of the detected contact points correspond to the positions of the conductive unit; when they correspond, it is determined that the first electronic device is placed on the display unit of the second electronic device. For example, when the conductive unit is disposed in a zigzag pattern on the back of the first electronic device, and the second electronic device simultaneously detects contact points arranged in that zigzag pattern on its own display unit, it determines that the first electronic device has been placed on its display unit.
Step 302, receiving a second image acquired by the first electronic device through the communication channel.
The second image is obtained by the first electronic equipment controlling the image acquisition unit to acquire the first image.
Step 303, analyzing the second image to obtain second image information corresponding to the second image.
Step 304, matching the pixel point sequence in the first image information with the pixel point sequence in the second image information, and executing step 305 when the pixel point sequences are matched; otherwise, the process is stopped.
Step 305, determining an angle and a direction by which the sequence of pixel points in the first image information is rotated.
Step 306, determining the posture of the first electronic device relative to the second electronic device based on the rotated angle and direction.
The second image information includes information of the pixel points of the second image, for example, RGB information of each pixel point; the first image information includes information of the pixel points of the first image, for example, RGB information of each pixel point. The first image displayed by the second electronic device may be an image with locally unique identification features; that is, however the first image is divided, each divided part has unique identification features, where the identification features include contour features and RGB features. Therefore, when the first electronic device is placed on the second electronic device, comparing the second image information corresponding to the acquired second image with the first image information of the first image determines both the position on the display unit of the second electronic device at which the first electronic device captured the image and the direction in which the first electronic device is placed relative to the second electronic device.
In one example, as shown in fig. 3b, the first image displayed by the display unit of the second electronic device includes at least a plurality of pixel point sequences (identified by solid line segments and dotted line segments, respectively); each pixel point sequence has unique RGB information, and the pixel point sequences are geometrically parallel to one another. The pixel point sequences make the first image an image with locally unique identification features; that is, however the first image is divided, each divided part has unique contour features and RGB features. It should be noted that fig. 3b is only schematic; in practical applications, the pixel point sequences displayed by the second electronic device may completely fill the display area of its display unit, so as to ensure that the second images acquired by the first electronic device in any area are different.
When the first electronic device is placed on the display unit of the second electronic device as shown in fig. 3b, the second image collected by the first electronic device corresponds to the middle pixel point sequences of the first image displayed by the second electronic device, as shown in fig. 3c; that is, the pixel point sequences in the second image shown in fig. 3c can be matched with the pixel point sequences in the first image shown in fig. 3b without rotation. When the second image acquired by the first electronic device is as shown in fig. 3d, then, according to the first image shown in fig. 3b, the second image of fig. 3d needs to be rotated counterclockwise by an angle (denoted θ) so that its pixel point sequences correspond to the pixel point sequences in the first image of fig. 3b, where correspondence means matching both the positions of the pixel points and their RGB information. Accordingly, it can be determined that, when the first electronic device captured the second image shown in fig. 3d, the first electronic device was in the posture shown in fig. 3e with respect to the second electronic device.
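The following sketch illustrates the matching with rotation described above; to stay short it only searches rotations in multiples of 90 degrees, whereas the method recovers an arbitrary angle θ (a finer angular search, for example with an image library's rotate routine, would be needed for that). The array representation and brute-force search are assumptions of this sketch.

```python
import numpy as np

def estimate_pose(first_img: np.ndarray, second_img: np.ndarray):
    """Return (angle_ccw_degrees, (row, col)) describing how the captured second
    image matches the first image; angles are restricted to multiples of 90 degrees."""
    H, W = first_img.shape[:2]
    best = None
    for k in range(4):  # 0, 90, 180, 270 degrees counterclockwise
        rotated = np.rot90(second_img, k)
        h, w = rotated.shape[:2]
        if h > H or w > W:
            continue
        for r in range(H - h + 1):
            for c in range(W - w + 1):
                window = first_img[r:r + h, c:c + w].astype(float)
                ssd = np.sum((window - rotated.astype(float)) ** 2)
                if best is None or ssd < best[0]:
                    best = (ssd, 90 * k, (r, c))
    if best is None:
        raise ValueError("second image does not fit inside the first image")
    return best[1], best[2]
```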
Step 307, determining a first area according to the posture information of the first electronic device relative to the second electronic device and the display area information of the display unit of the second electronic device.
The first area is an area covered by the first electronic device in the display area of the display unit of the second electronic device.
Step 308, obtaining display area information of the display unit of the first electronic device.
The display area information includes length and width information of a display area of the display unit of the first electronic device.
Step 309, obtaining the third information according to the display area information, the first area, and the third image.
The third information is used for being displayed by the first electronic device, the third information includes a fourth image, the fourth image is an image corresponding to an area covered by the first electronic device in the third image when the display unit of the second electronic device displays the third image, the third image is any image displayed in the operation process of the second electronic device, and for example, the third image may correspond to a desktop icon of the second electronic device; in practical applications, the third information may further include information associated with the fourth image, for example, when the fourth image is an image of a text, the third information may further include the electronic document corresponding to the text.
Step 310, sending the third information to the first electronic device.
The third information is used for displaying by the first electronic device.
In one example, when the tablet computer displays the desktop icon as shown in fig. 3f, and the mobile phone is placed at the position of the dotted line shown in fig. 3f relative to the tablet computer, the tablet computer determines the posture information of the mobile phone relative to the tablet computer through steps 301 to 306, including the position and the placement direction of the mobile phone on the display unit of the tablet computer; through steps 307 to 309, determining information of an area of the tablet pc display unit covered by the mobile phone, and adapting the image of the tablet pc display unit covered by the mobile phone in combination with the information of the mobile phone display area and the posture information of the mobile phone, so that the mobile phone display unit can completely display the covered image, and the display direction of the covered image is consistent with the display direction of the covered image on the tablet pc, and a schematic diagram of the mobile phone displaying the covered image (corresponding to the third information) is shown in fig. 3 g.
In this embodiment, the second electronic device determines the posture of the first electronic device relative to the second electronic device, including the position and the direction in which the first electronic device is placed, by comparing the pixel sequences in the first image information and the second image information; therefore, the first electronic equipment can display the covered image, and the display direction of the first electronic equipment is consistent with the display direction of the covered image on the second electronic equipment, so that the habit of a user is met, and the user experience is improved.
Example four
This embodiment describes an information processing method, applied to a second electronic device with a display unit, where the second electronic device supports establishing a communication channel with a first electronic device, and the type of the communication channel includes: a communication channel for data transmission based on invisible light (e.g., infrared light); a communication channel for data transmission based on IP; a communication channel for data transmission based on WiFi or WiFi-Direct protocol; and the communication channel is used for carrying out data transmission based on Bluetooth and Zigbee protocols.
As shown in fig. 4a, the information processing method according to the present embodiment includes the steps of:
step 401, when the second electronic device determines that the first electronic device is placed on the display unit of the second electronic device, controlling the display unit of the second electronic device to display a first image.
Step 402, receiving a second image acquired by the first electronic device through the communication channel.
The second image is obtained by the first electronic equipment controlling the image acquisition unit to acquire the first image.
Step 403, analyzing the second image to obtain second image information corresponding to the second image.
Step 404, matching the pixel point sequence in the first image information with the pixel point sequence in the second image information, and executing step 405 when the pixel point sequences are matched; otherwise, the process is stopped.
Step 405, determining an angle and a direction by which the sequence of pixel points in the first image information is rotated.
Step 406, determining the posture of the first electronic device relative to the second electronic device based on the rotated angle and direction.
Step 407, determining a first area according to the posture information of the first electronic device relative to the second electronic device and the display area information of the display unit of the second electronic device.
The first area is an area covered by the first electronic device in the display area of the display unit of the second electronic device.
Step 408, obtaining display area information of the display unit of the first electronic device.
The display area information includes length and width information of a display area of the display unit of the first electronic device.
Step 409, obtaining the third information according to the display area information, the first area and the third image.
Step 410, sending the third information to the first electronic device.
The third information is used for displaying by the first electronic device.
In one example, when the tablet computer displays the desktop icon as shown in fig. 4b, and the mobile phone is placed at the position of the dotted line shown in fig. 4b relative to the tablet computer, the tablet computer determines the posture information of the mobile phone relative to itself through steps 401 to 406, including the position of the mobile phone on the display unit of the tablet computer and the placement direction of the mobile phone; through steps 407 to 409, information of an area of the tablet computer display unit covered by the mobile phone is determined, and the image of the tablet computer display unit covered by the mobile phone is adapted according to the information of the mobile phone display area and the posture information of the mobile phone, so that the mobile phone display unit can completely display the covered image, the display direction of the covered image is consistent with the display direction of the covered image on the tablet computer, and a schematic diagram of the mobile phone displaying the covered image is shown in fig. 4 c.
Step 411, receiving fourth information sent by the first electronic device through the communication channel.
The fourth information is information of the first touch event received by the first electronic device display unit.
Step 412, obtaining a second touch event according to the posture information of the first electronic device relative to the second electronic device and the first touch event.
The operating point of the second touch event is located in the first area, and the position of the operating point of the second touch event in the first area corresponds to the position of the operating point of the first touch event in the display unit of the first electronic device.
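As an illustration of step 412, the sketch below maps the operating point of the first touch event from the phone's display coordinates into the tablet's display coordinates using the previously determined posture (position, rotation, and scale); the pose field names and units are assumptions of this sketch.

```python
import math

def map_touch_to_tablet(touch_xy, pose):
    """Convert a touch point on the first device into the corresponding operating
    point of the second touch event on the second device's display unit.

    pose: {'origin': (x0, y0) of the phone's top-left corner on the tablet display,
           'angle': counterclockwise rotation in degrees,
           'scale': tablet pixels per phone pixel}
    """
    x, y = touch_xy
    a = math.radians(pose["angle"])
    s = pose["scale"]
    # rotate about the phone's top-left corner, then scale and translate
    rx = math.cos(a) * x - math.sin(a) * y
    ry = math.sin(a) * x + math.cos(a) * y
    x0, y0 = pose["origin"]
    return (x0 + s * rx, y0 + s * ry)
```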
Step 413, controlling the second electronic device display unit to respond to the second touch event.
As shown in fig. 4d, when the operation point of the first touch event is located in the icon 4 of the display unit of the mobile phone, the tablet computer obtains a second touch event according to the touch event, and the operation point of the second touch event is as shown in fig. 4e, so as to control the display unit of the tablet computer to respond to the second touch event and change the currently displayed image of the tablet computer.
Step 414, determining new third information after responding to the second touch event, and sending the new third information to the first electronic device.
The new third information is used for being displayed by the first electronic equipment; if the image to be displayed after the tablet computer responds to the second touch event is shown in fig. 4f, determining new third information for the mobile phone to display according to the currently displayed image, information of the display area of the display unit of the tablet computer and information of the display area of the display unit of the mobile phone, wherein a schematic diagram of the mobile phone displaying the new third information is shown in fig. 4 g;
it should be noted that fig. 4f is only schematic, and in practical applications, the area of the tablet computer display unit covered by the mobile phone may not display an image, so as to reduce power consumption.
In this embodiment, when the first electronic device displays an image covering the second electronic device, the second electronic device can control the display unit of the first electronic device to respond to the touch event according to the touch event received by the display unit of the first electronic device, and synchronize the content displayed by the first electronic device with the content updated and displayed in the first area of the display unit of the second electronic device, thereby improving user experience in a device cooperation scene.
EXAMPLE five
This embodiment describes an information processing method, applied to a second electronic device with a display unit, where the second electronic device supports establishing a communication channel with a first electronic device, and the type of the communication channel includes: a communication channel for data transmission based on invisible light (e.g., infrared light); a communication channel for data transmission based on IP; a communication channel for data transmission based on WiFi or WiFi-Direct protocol; and the communication channel is used for carrying out data transmission based on Bluetooth and Zigbee protocols.
As shown in fig. 5a, the information processing method according to the present embodiment includes the steps of:
step 501, when the second electronic device determines that the first electronic device is placed on the display unit of the second electronic device, the second electronic device is controlled to display a first image on the display unit of the second electronic device.
Step 502, receiving a second image acquired by the first electronic device through the communication channel.
The second image is obtained by the first electronic equipment controlling the image acquisition unit to acquire the first image.
Step 503, analyzing the second image to obtain second image information corresponding to the second image.
Step 504, matching the pixel point sequence in the first image information with the pixel point sequence in the second image information, and executing step 505 when the pixel point sequences are matched; otherwise, the process is stopped.
Step 505, determining the angle and direction of rotation of the sequence of pixel points in the first image information.
Step 506, determining the posture of the first electronic device relative to the second electronic device based on the rotated angle and direction.
The process of determining the pose in step 506 is similar to the previous embodiment and is not described here again.
Step 507, determining a first area according to the posture information of the first electronic device relative to the second electronic device and the display area information of the display unit of the second electronic device.
The first area is an area covered by the first electronic device in the display area of the display unit of the second electronic device.
Step 508, obtaining display area information of the display unit of the first electronic device.
The display area information includes length and width information of a display area of the display unit of the first electronic device.
Step 509, obtaining the third information according to the display area information, the first area, and the third image.
The third information is used for being displayed by the first electronic device, the third information includes a fourth image, the fourth image is an image corresponding to an area covered by the first electronic device in the third image when the display unit of the second electronic device displays the third image, the third image is any image displayed in the operation process of the second electronic device, and for example, the third image may correspond to a desktop icon of the second electronic device; in practical applications, the third information may further include information associated with the fourth image, for example, when the fourth image is an image of a text, the third information may further include the electronic document corresponding to the text.
Step 510, sending the third information to the first electronic device.
The third information is used for displaying by the first electronic device.
In step 511, a third touch event is received.
The operation point of the third touch event is located in a second area of the display unit of the second electronic device, where the second area is the part of the display area of the display unit of the second electronic device other than the first area. An example of the third touch event is shown in fig. 5b: when the user performs an operation, the operation point of the triggered third touch event corresponds to icon 9; the dotted box represents the area of the tablet computer display unit covered by the mobile phone, and the dashed icons are displayed by the tablet computer but covered by the mobile phone.
Step 512, controlling the second electronic device display unit to respond to the third touch event to display a fifth image.
Step 513, sending the sixth image to the first electronic device through the communication channel.
The sixth image is used for displaying by the first electronic device; and the sixth image is an image corresponding to the first area in the fifth image.
If the third touch event is an event for switching the desktop icons of the tablet computer, an example of the fifth image that the tablet computer needs to display is shown in fig. 5c, in which the tablet computer displays icons different from those in fig. 5b. It should be noted that fig. 5c is only schematic; the area of the tablet computer display unit covered by the mobile phone may display no image in order to save power. The image corresponding to the area covered by the mobile phone (that is, the first area) in the image to be displayed, namely the sixth image, is sent to the mobile phone for display; an example is shown in fig. 5d, where the icons displayed by the mobile phone correspond to the new icons displayed by the tablet computer display unit in response to the third touch event.
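The update flow of steps 512 and 513 can be sketched as follows: after responding to the third touch event, the tablet crops the part of the fifth image that falls inside the first area (the sixth image) and pushes it to the phone so that both displays stay in sync. The channel object, its send method, and the message fields are assumptions of this sketch.

```python
def push_sixth_image(fifth_image, first_area, channel):
    """Send the phone the portion of the newly displayed fifth image that it covers.

    fifth_image: H x W x 3 array the tablet displays after responding to the event.
    first_area: (top, left, height, width) of the area covered by the phone.
    channel: communication channel object assumed to expose a send(payload) method.
    """
    top, left, h, w = first_area
    sixth_image = fifth_image[top:top + h, left:left + w]
    channel.send({"type": "sixth_image", "image": sixth_image})
```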
In this embodiment, when the first electronic device displays an image covering the image displayed by the second electronic device, the second electronic device can control the display unit of the second electronic device to respond to the touch event according to the touch event received by the display unit of the second electronic device to display a new image, and enable the first electronic device to display a portion of the new image covered by the first electronic device, so that synchronization of operations of the first electronic device and the second electronic device is achieved, and experience of device cooperation is improved.
Here, it should be noted that: the following description of the embodiments of the electronic device is similar to the description of the method, and the description of the advantageous effects of the method is omitted for brevity. For technical details not disclosed in the embodiments of the electronic device of the present invention, refer to the description of the embodiments of the method of the present invention.
EXAMPLE six
This embodiment describes a second electronic device, as shown in fig. 6, the second electronic device includes:
a communication unit 61, a display unit 62, a detection unit 63, a control unit 64, and an analysis unit 65; wherein,
the communication unit 61 is configured to support establishment of a communication channel with a first electronic device;
the detecting unit 63 is configured to detect whether a first electronic device is placed on the display unit 62;
the control unit 64 is configured to control the display unit 62 to display a first image when the detection unit 63 detects that the first electronic device is placed on the display unit 62;
the communication unit 61 is configured to receive, through the communication channel, a second image acquired by the first electronic device, where the second image is obtained by acquiring, by the first electronic device, the first image;
the analyzing unit 65 is configured to analyze the second image to obtain second image information corresponding to the second image;
comparing the second image information with first image information corresponding to the first image to obtain attitude information of the first electronic equipment relative to the second electronic equipment;
the control unit 64 is further configured to determine third information according to the posture information and trigger the communication unit 61 to send the third information to the first electronic device for display by the first electronic device.
In practical applications, the communication unit 61 may be implemented by a module in the second electronic device that supports a corresponding communication protocol, such as the Bluetooth protocol; the display unit 62 may be implemented by a touch display screen and a display driving circuit in the second electronic device; and the detection unit 63, the control unit 64, and the analysis unit 65 may be implemented by a CPU or a Field Programmable Gate Array (FPGA) in the second electronic device.
EXAMPLE seven
This embodiment describes a second electronic device, as shown in fig. 7, the second electronic device includes:
a communication unit 71, a display unit 72, a detection unit 73, a control unit 74, and an analysis unit 75; wherein,
the communication unit 71 is configured to support establishment of a communication channel with a first electronic device;
the detecting unit 73 is used for detecting whether the first electronic device is placed on the display unit 72;
the control unit 74 is configured to control the display unit 72 to display a first image when the detection unit 73 detects that the first electronic device is placed on the display unit 72;
the communication unit 71 is configured to receive, through the communication channel, a second image acquired by the first electronic device, where the second image is obtained by acquiring, by the first electronic device, the first image;
the analyzing unit 75 is configured to analyze the second image to obtain second image information corresponding to the second image;
comparing the second image information with first image information corresponding to the first image to obtain attitude information of the first electronic equipment relative to the second electronic equipment;
the control unit 74 is further configured to determine third information according to the posture information and trigger the communication unit 71 to send the third information to the first electronic device, so that the third information is displayed by the first electronic device.
The control unit 74 is further configured to determine a first area according to the posture information of the first electronic device relative to the second electronic device and the display area information of the display unit 72, where the first area is an area covered by the first electronic device in the display area of the display unit 72;
acquiring display area information of a display unit of the first electronic device;
and obtaining third information according to the display area information, the first area, and a third image, where the third information includes a fourth image, and the fourth image is an image corresponding to an area covered by the first electronic device in the third image when the second electronic device display unit 72 displays the third image.
The communication unit 71 is further configured to receive fourth information sent by the first electronic device through the communication channel, where the fourth information is information of the first touch event received by the display unit of the first electronic device;
the control unit 74 is further configured to obtain a second touch event according to the posture information of the first electronic device relative to the second electronic device and the first touch event; wherein,
the operating point of the second touch event is located in the first area, and the position of the operating point of the second touch event in the first area corresponds to the position of the operating point of the first touch event in the display unit of the first electronic device;
the control unit 74 is further configured to control the display unit 72 to respond to the second touch event;
determining new third information after responding to the second touch event, and triggering the communication unit 71 to send the new third information to the first electronic device for display by the first electronic device.
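The coordinate mapping between the two touch events can be illustrated with a short sketch, again as an assumption-laden example rather than the claimed method: the posture information is taken to consist of the first area's origin plus a right-angle rotation, and one pixel on the first device's display is taken to map to one pixel of the first area:

    # Sketch: map the operating point of a first touch event (in the first
    # device's display coordinates) to the operating point of a second touch
    # event inside the first area of the second device's display.
    def map_touch(point, first_size, area_origin, rotation_deg):
        x, y = point
        w, h = first_size
        if rotation_deg == 0:
            rx, ry = x, y
        elif rotation_deg == 90:        # first device rotated 90 deg clockwise
            rx, ry = h - 1 - y, x
        elif rotation_deg == 180:
            rx, ry = w - 1 - x, h - 1 - y
        elif rotation_deg == 270:
            rx, ry = y, w - 1 - x
        else:
            raise ValueError("only right-angle rotations handled in this sketch")
        ax, ay = area_origin
        return ax + rx, ay + ry         # operating point of the second touch event

    # Usage: a tap at (10, 20) on a 100x200 first-device screen, device rotated
    # 90 degrees, first area starting at (300, 120) on the second device:
    print(map_touch((10, 20), (100, 200), (300, 120), 90))   # (479, 130)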
The communication unit 71 is further configured to receive a third touch event, where an operation point of the third touch event is located in a second area of the display unit 72, and the second area is an area of the display unit 72 except for the first area;
the control unit 74 is further configured to control the display unit 72 to respond to the third touch event to display a fifth image, and trigger the communication unit 71 to send a sixth image to the first electronic device through the communication channel, so that the first electronic device displays the sixth image; wherein,
the sixth image is an image corresponding to the first region in the fifth image.
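The branch for a third touch event can likewise be sketched; the helper below only shows the routing and the crop that yields the sixth image, with the actual redraw of the fifth image passed in as a callable (all names are placeholders introduced for this illustration):

    # Sketch: a touch outside the first area is handled by the second device
    # itself, and the part of the refreshed display lying under the first
    # device (the sixth image) is returned for sending over the channel.
    def handle_third_touch(point, area, respond):
        """area is (x, y, w, h); respond(point) returns the redrawn fifth image."""
        ax, ay, aw, ah = area
        x, y = point
        if ax <= x < ax + aw and ay <= y < ay + ah:
            return None                               # falls inside the first area
        fifth = respond(point)                        # second device redraws itself
        return [row[ax:ax + aw] for row in fifth[ay:ay + ah]]   # sixth image

    # Usage with a stub renderer that "draws" a constant 8x6 image:
    stub = lambda p: [[1] * 8 for _ in range(6)]
    print(handle_third_touch((7, 5), (2, 1, 3, 2), stub))   # [[1, 1, 1], [1, 1, 1]]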
Wherein the first image information comprises at least two sequences of pixel points;
the parsing unit 75 is further configured to match a pixel point sequence in the first image information with a pixel point sequence in the second image information;
when a match is found, determining the angle and direction by which the matched pixel point sequence in the first image information has been rotated;
determining the posture of the first electronic device relative to the second electronic device based on the angle and direction of rotation.
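One simple way to recover such an angle and direction, given purely as an illustration of the idea (the disclosure does not prescribe this particular computation), is to compare the direction of a matched pixel point sequence in the first image with the direction of the corresponding sequence found in the second image; note that the sign convention depends on whether the y axis grows downwards, as it usually does in screen coordinates:

    # Sketch: rotation between a reference pixel point sequence and the matched
    # sequence observed in the captured second image, using two points each.
    import math

    def rotation_between(seq_first, seq_second):
        """seq_first / seq_second: matched (x, y) points from the first image and
        from the captured second image. Returns a signed angle in degrees."""
        (x0, y0), (x1, y1) = seq_first[0], seq_first[1]
        (u0, v0), (u1, v1) = seq_second[0], seq_second[1]
        a = math.atan2(y1 - y0, x1 - x0)        # direction of the reference sequence
        b = math.atan2(v1 - v0, u1 - u0)        # direction of the observed sequence
        deg = math.degrees(b - a)
        return (deg + 180) % 360 - 180          # normalise to [-180, 180)

    # Usage: a sequence observed rotated by 90 degrees.
    print(rotation_between([(0, 0), (10, 0)], [(5, 5), (5, 15)]))   # 90.0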
The detection unit 73 is further configured to detect a contact point sensed on the display unit 72;
determining whether the position of the contact point corresponds to the position of a conductive unit arranged on the first electronic device;
when it is determined that the position of the contact point corresponds to the position of the conductive unit, it is determined that the first electronic device is placed over the second electronic device display unit 72.
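A placement check of this kind can be sketched by comparing the pairwise distances of the sensed contact points with the known layout of the conductive units, which makes the test independent of where and at what angle the first device is put down; the layout coordinates and tolerance below are made-up values for illustration only:

    # Sketch: do the sensed contact points match the conductive-unit layout on
    # the back of the first device? Compared via sorted pairwise distances.
    from itertools import combinations
    from math import dist

    CONDUCTIVE_LAYOUT = [(0, 0), (60, 0), (0, 110)]   # millimetres, hypothetical

    def matches_layout(contacts, layout=CONDUCTIVE_LAYOUT, tol=2.0):
        if len(contacts) != len(layout):
            return False
        d_seen = sorted(dist(a, b) for a, b in combinations(contacts, 2))
        d_ref = sorted(dist(a, b) for a, b in combinations(layout, 2))
        return all(abs(s - r) <= tol for s, r in zip(d_seen, d_ref))

    # Usage: three contacts sensed somewhere on the second device's touch screen.
    print(matches_layout([(200, 300), (260, 300), (200, 410)]))   # True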
In practical applications, the communication unit 71 may be implemented by a module in the second electronic device that supports a corresponding communication protocol, such as the Bluetooth protocol; the display unit 72 may be implemented by a touch display screen and a display driving circuit in the second electronic device; the detection unit 73, the control unit 74, and the analysis unit 75 may be implemented by a CPU or an FPGA in the second electronic device.
In the embodiments provided in the present invention, it should be understood that the disclosed apparatus and method may be implemented in other ways. The above-described device embodiments are merely illustrative, for example, the division of the unit is only a logical functional division, and there may be other division ways in actual implementation, such as: multiple units or components may be combined, or may be integrated into another system, or some features may be omitted, or not implemented. In addition, the coupling, direct coupling or communication connection between the components shown or discussed may be through some interfaces, and the indirect coupling or communication connection between the devices or units may be electrical, mechanical or other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, that is, may be located in one place, or may be distributed on a plurality of network units; some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, all the functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may be separately regarded as one unit, or two or more units may be integrated into one unit; the integrated unit can be realized in a form of hardware, or in a form of hardware plus a software functional unit.
Those of ordinary skill in the art will understand that: all or part of the steps for implementing the method embodiments may be implemented by hardware related to program instructions, and the program may be stored in a computer readable storage medium, and when executed, the program performs the steps including the method embodiments; and the aforementioned storage medium includes: a mobile storage device, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
Alternatively, the integrated unit of the present invention may be stored in a computer-readable storage medium if it is implemented in the form of a software functional module and sold or used as a separate product. With such an understanding, the technical solutions of the embodiments of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to perform all or part of the methods described in the embodiments of the present invention. And the aforementioned storage medium includes: a removable storage device, a ROM, a RAM, a magnetic or optical disk, or various other media that can store program code.
The above description is only for the specific embodiments of the present invention, but the scope of the present invention is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present invention, and all the changes or substitutions should be covered within the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the appended claims.

Claims (12)

1. An information processing method applied to a second electronic device having a display unit, characterized in that the second electronic device supports establishment of a communication channel with a first electronic device; the method comprises the following steps:
when the second electronic device determines that the first electronic device is placed on the second electronic device display unit,
controlling the second electronic device display unit to display a first image, so as to
receive, through the communication channel, a second image acquired by the first electronic device, wherein the second image is obtained by the first electronic device capturing the first image;
analyzing the second image to obtain second image information corresponding to the second image;
comparing the second image information with first image information corresponding to the first image to obtain posture information of the first electronic device relative to the second electronic device;
determining third information according to the posture information and sending the third information to the first electronic device for display by the first electronic device;
the third information comprises a fourth image, and the fourth image is an image corresponding to an area covered by the first electronic device in the third image when the second electronic device display unit displays the third image;
the first image is an image whose local regions have unique identification features.
2. The method of claim 1, wherein determining third information from the pose information comprises:
determining a first area according to the posture information of the first electronic device relative to the second electronic device and the display area information of a display unit of the second electronic device, wherein the first area is an area covered by the first electronic device in the display area of the display unit of the second electronic device;
acquiring display area information of a display unit of the first electronic device;
and obtaining the third information according to the display area information, the first area and the third image.
3. The method of claim 2, further comprising:
receiving fourth information sent by the first electronic device through the communication channel, wherein the fourth information comprises information of a first touch event received by a display unit of the first electronic device;
obtaining a second touch event according to the attitude information of the first electronic device relative to the second electronic device and the first touch event; wherein,
the operating point of the second touch event is located in the first area, and the position of the operating point of the second touch event in the first area corresponds to the position of the operating point of the first touch event in the display unit of the first electronic device;
controlling the display unit of the second electronic equipment to respond to the second touch event;
and determining new third information after responding to the second touch event, and sending the new third information to the first electronic equipment for displaying by the first electronic equipment.
4. The method of claim 2, further comprising:
receiving a third touch event, wherein an operation point of the third touch event is located in a second area of the display unit of the second electronic equipment, and the second area is an area except the first area in the display area of the display unit of the second electronic equipment;
controlling the second electronic device display unit to respond to the third touch event so as to display a fifth image, and sending a sixth image to the first electronic device through the communication channel for display by the first electronic device; wherein,
the sixth image is an image corresponding to the first area in the fifth image;
the fifth image is an image displayed when the second electronic device responds to the third touch event.
5. The method according to claim 2, characterized in that the first image information comprises at least two sequences of pixel points;
correspondingly, the comparing the second image information with the first image information corresponding to the first image to obtain the posture information of the first electronic device relative to the second electronic device includes:
matching the pixel point sequence in the first image information with the pixel point sequence in the second image information;
when a match is found, determining the angle and direction by which the matched pixel point sequence in the first image information has been rotated;
determining the posture of the first electronic device relative to the second electronic device based on the angle and direction of rotation.
6. The method according to any one of claims 1 to 5, wherein a region where the first electronic device is in contact with the display unit of the second electronic device is provided with a conductive unit;
accordingly, the determining that the first electronic device is disposed over the second electronic device display unit includes:
detecting a contact point sensed on the display unit of the second electronic device;
determining whether the position of the contact point corresponds to the position of the conductive unit;
when it is determined that the position of the contact point corresponds to the position of the conductive unit, it is determined that the first electronic device is placed on the second electronic device display unit.
7. A second electronic device, characterized in that the second electronic device comprises a communication unit, a display unit, a detection unit, a control unit and an analysis unit; wherein,
the communication unit is used for supporting the establishment of a communication channel with the first electronic equipment;
the detection unit is used for detecting whether the first electronic equipment is arranged on the display unit or not;
the control unit is used for controlling the display unit to display a first image when the detection unit detects that the first electronic equipment is placed on the display unit;
the communication unit is used for receiving, through the communication channel, a second image acquired by the first electronic device, wherein the second image is obtained by the first electronic device capturing the first image;
the analysis unit is used for analyzing the second image to obtain second image information corresponding to the second image;
comparing the second image information with first image information corresponding to the first image to obtain posture information of the first electronic device relative to the second electronic device;
the control unit is further configured to determine third information according to the posture information and trigger the communication unit to send the third information to the first electronic device for display by the first electronic device;
the third information comprises a fourth image, and the fourth image is an image corresponding to an area covered by the first electronic device in the third image when the second electronic device display unit displays the third image;
the first image is an image whose local regions have unique identification features.
8. The second electronic device of claim 7,
the control unit is further configured to determine a first area according to the posture information of the first electronic device relative to the second electronic device and the display area information of the display unit of the second electronic device, where the first area is an area covered by the first electronic device in the display area of the display unit of the second electronic device;
acquiring display area information of a display unit of the first electronic device;
and obtaining the third information according to the display area information, the first area and the third image.
9. The second electronic device of claim 8,
the communication unit is further configured to receive fourth information sent by the first electronic device through the communication channel, where the fourth information is information of a first touch event received by the display unit of the first electronic device;
the control unit is further configured to obtain a second touch event according to the posture information of the first electronic device relative to the second electronic device and the first touch event; wherein,
the operating point of the second touch event is located in the first area, and the position of the operating point of the second touch event in the first area corresponds to the position of the operating point of the first touch event in the display unit of the first electronic device;
the control unit is further used for controlling the display unit to respond to the second touch event;
and determining new third information after responding to the second touch event, and triggering the communication unit to send the new third information to the first electronic device for display by the first electronic device.
10. The second electronic device of claim 8,
the communication unit is further configured to receive a third touch event, where an operation point of the third touch event is located in a second area of the display unit, and the second area is an area of the display unit except for the first area;
the control unit is further configured to control the display unit to respond to the third touch event to display a fifth image, and trigger the communication unit to send a sixth image to the first electronic device through the communication channel, so that the first electronic device displays the sixth image; wherein,
the sixth image is an image corresponding to the first area in the fifth image;
the fifth image is an image displayed when the second electronic device responds to the third touch event.
11. The second electronic device of claim 8, wherein the first image information comprises at least two sequences of pixel points;
the analysis unit is further configured to match a pixel point sequence in the first image information with a pixel point sequence in the second image information;
when a match is found, determining the angle and direction by which the matched pixel point sequence in the first image information has been rotated;
determining the posture of the first electronic device relative to the second electronic device based on the angle and direction of rotation.
12. The second electronic device of any of claims 7-11,
the detection unit is further configured to detect a contact point sensed on the display unit;
determining whether the position of the contact point corresponds to the position of a conductive unit arranged on the first electronic device;
when it is determined that the position of the contact point corresponds to the position of the conductive unit, it is determined that the first electronic device is placed on the second electronic device display unit.
CN201410258450.0A 2014-06-11 2014-06-11 Information processing method and electronic equipment Active CN105208181B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410258450.0A CN105208181B (en) 2014-06-11 2014-06-11 Information processing method and electronic equipment

Publications (2)

Publication Number Publication Date
CN105208181A CN105208181A (en) 2015-12-30
CN105208181B (en) 2018-01-23

Family

ID=54955600

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410258450.0A Active CN105208181B (en) 2014-06-11 2014-06-11 Information processing method and electronic equipment

Country Status (1)

Country Link
CN (1) CN105208181B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110536479A (en) * 2019-08-28 2019-12-03 维沃移动通信有限公司 Object transmission method and electronic equipment
CN114327324B (en) * 2020-09-29 2025-03-04 华为技术有限公司 A distributed display method of interface, electronic equipment and communication system

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1875337A (en) * 2003-11-07 2006-12-06 三菱电机株式会社 Method for determining location on display surface and interactive display system
CN102419680A (en) * 2010-09-27 2012-04-18 联想(北京)有限公司 Electronic equipment and display method thereof
CN102799373A (en) * 2012-06-29 2012-11-28 联想(北京)有限公司 Electronic device, method for generating input areas, and terminal device

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9081080B2 (en) * 2011-03-04 2015-07-14 Qualcomm Incorporated RSSI-based indoor positioning in the presence of dynamic transmission power control access points
US9774989B2 (en) * 2011-09-27 2017-09-26 Sony Interactive Entertainment Inc. Position and rotation of a portable device relative to a television screen

Similar Documents

Publication Publication Date Title
US10979617B2 (en) Mobile device and control method
JP6558527B2 (en) Electronic device, electronic device control method, program, and wireless communication system
JP5826408B2 (en) Method for gesture control, gesture server device, and sensor input device
CN111897507B (en) Screen projection method and device, second terminal and storage medium
WO2014188797A1 (en) Display control device, display control method, and recording medium
CN106873928A (en) Long-range control method and terminal
CN105487641B (en) Control method and device of terminal equipment
CN108701365A (en) Light spot recognition method, device and system
CN106293563B (en) Control method and electronic equipment
US20180220066A1 (en) Electronic apparatus, operating method of electronic apparatus, and non-transitory computer-readable recording medium
US20150319792A1 (en) Communication control device, communication control method, program, and communication control system
CN105208181B (en) Information processing method and electronic equipment
EP2902884B1 (en) Method, device, and system for recognizing gesture based on multi-terminal collaboration
US20160321373A1 (en) File transmission method, file transmission apparatus and file transmission system
US9600088B2 (en) Method and apparatus for displaying a pointer on an external display
CN104007928A (en) Information processing method and electronic device
CN104461438B (en) The method, device and mobile terminal of display control
WO2013136702A1 (en) Wireless communication apparatus, wireless communication method, and wireless communication control program
CN109196860B (en) Control method of multi-view image and related device
CN110032353B (en) Display method, display terminal, lamp device and display system
CN105446574B (en) A kind of information processing method and electronic equipment
EP3007146A1 (en) System for controlling an electronic device and head mounted unit for such a system
WO2016123890A1 (en) Handheld electronic device and control method, apparatus, and computer storage medium thereof
CN110837764B (en) Image processing method, device, electronic device and visual interaction system
CN106020659B (en) Information processing method and electronic equipment

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant