
CN115840506A - Data interaction method, device, equipment and medium based on wearable equipment - Google Patents


Info

Publication number
CN115840506A
CN115840506A (application CN202211600446.9A)
Authority
CN
China
Prior art keywords
wearable device
terminal device
palm
virtual
protocol address
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211600446.9A
Other languages
Chinese (zh)
Inventor
高兴世
韩伟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Xiaomi Mobile Software Co Ltd
Original Assignee
Beijing Xiaomi Mobile Software Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Xiaomi Mobile Software Co Ltd filed Critical Beijing Xiaomi Mobile Software Co Ltd
Priority to CN202211600446.9A priority Critical patent/CN115840506A/en
Publication of CN115840506A publication Critical patent/CN115840506A/en
Pending legal-status Critical Current

Classifications

    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D 30/00: Reducing energy consumption in communication networks
    • Y02D 30/70: Reducing energy consumption in wireless communication networks

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

The present disclosure relates to a wearable-device-based data interaction method, apparatus, device, and medium. The method includes: identifying a terminal device within a set spatial range based on the wearable device, and generating a virtual wireframe and a network protocol address for the terminal device; in response to detecting a user's palm within the set spatial range, recognizing the palm so as to generate a corresponding virtual hand in the wearable device; when the distance between the virtual hand and the virtual wireframe is less than a distance threshold and the target gesture of the virtual hand matches a preset interaction gesture, determining a data interaction flow between the wearable device and the terminal device; and performing data interaction between the wearable device and the terminal device according to the data interaction flow and the network protocol address. The wearable device thereby enables device-to-device data interaction through gestures, making the interaction more immersive and more convenient to operate.

Figure 202211600446

Description

Wearable-device-based data interaction method, apparatus, device, and medium

Technical Field

The present disclosure relates to the technical field of mixed reality, and in particular to a wearable-device-based data interaction method, apparatus, device, and medium.

Background

In the related art, data interaction between mobile devices mainly relies on data transmission over a wired or wireless connection established between the devices. This is limited by the connection link between the devices, so transmission is easily blocked; moreover, because a communication connection must first be established, the interaction process is cumbersome and interaction efficiency suffers. For example, during an indoor meeting, a user who wants to display a file on a projection device must first establish a wired or wireless communication connection between the mobile device and the projection device, and then transfer the data file from the mobile device to the projection device. If the connection between the two devices is blocked, or interrupted during transmission, normal data interaction between the mobile device and the projection device becomes impossible.

Summary of the Invention

To overcome the problems in the related art, the present disclosure provides a wearable-device-based data interaction method, apparatus, device, and medium.

According to a first aspect of embodiments of the present disclosure, a wearable-device-based data interaction method is provided, including:

identifying a terminal device within a set spatial range based on the wearable device, and generating a virtual wireframe and a network protocol address for the terminal device;

in response to detecting a user's palm within the set spatial range, recognizing the user's palm so as to generate a corresponding virtual hand in the wearable device;

when the distance between the virtual hand and the virtual wireframe is less than a distance threshold and the target gesture of the virtual hand matches a preset interaction gesture, determining a data interaction flow between the wearable device and the terminal device;

performing data interaction between the wearable device and the terminal device according to the data interaction flow and the network protocol address.
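The gating conditions of the method above can be sketched in a few lines of Python. This is a minimal illustration only; every name here (`Terminal`, `choose_flow`, the `"pinch"` gesture, the threshold value) is a hypothetical stand-in, not part of the disclosure.

```python
from dataclasses import dataclass
from math import dist

# Hypothetical model of one recognized terminal device: its network protocol
# (IP) address and the centre of its virtual wireframe in scene coordinates.
@dataclass
class Terminal:
    ip: str
    wireframe_center: tuple

def choose_flow(hand_pos, target_gesture, terminal,
                distance_threshold=0.3, preset_gesture="pinch"):
    """Gate the interaction on the two conditions from the method: the virtual
    hand must be closer than the distance threshold to the virtual wireframe,
    and the target gesture must match the preset interaction gesture."""
    if dist(hand_pos, terminal.wireframe_center) >= distance_threshold:
        return None
    if target_gesture != preset_gesture:
        return None
    # The concrete flow (wearable-display vs terminal-display) is refined
    # later; here we just return the pair used for the data exchange.
    return ("data-interaction", terminal.ip)
```

On a positive match, the wearable device would then carry out the chosen flow against the returned protocol address.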

Optionally, determining the data interaction flow between the wearable device and the terminal device includes:

monitoring the dynamic trajectory of the virtual hand to generate a gesture action of the virtual hand;

when the gesture action is a set extraction action, determining the current extraction target of the virtual hand in the wearable device;

if the current extraction target is not within the display range of the virtual wireframe, determining that the data interaction flow is a terminal-device display flow;

if the current extraction target is within the display range of the virtual wireframe, determining that the data interaction flow is a wearable-device display flow.
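A minimal sketch of the branch above, assuming the wireframe's display range is an axis-aligned box; all function names here are invented for illustration.

```python
def in_display_range(point, wireframe):
    """wireframe: (x, y, width, height) box; point: position of the current
    extraction target in the same scene coordinates."""
    x, y, w, h = wireframe
    return x <= point[0] <= x + w and y <= point[1] <= y + h

def determine_flow(extraction_target, wireframe):
    # Target grabbed inside the terminal's wireframe: pull its data onto the
    # wearable device; grabbed outside: push the wearable's data to the terminal.
    if in_display_range(extraction_target, wireframe):
        return "wearable-display"
    return "terminal-display"
```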

Optionally, performing data interaction between the wearable device and the terminal device according to the data interaction flow and the network protocol address includes:

when the data interaction flow is the wearable-device display flow, sending the network protocol address to the network end, the network protocol address instructing the network end to request, from the terminal device at that address, the first target data corresponding to the current extraction target;

in response to receiving the first target data fed back by the network end, displaying the first target data at a set position of the wearable device;

when the data interaction flow is the terminal-device display flow, sending the second target data corresponding to the current extraction target and the network protocol address to the network end, the network protocol address instructing the network end to send the second target data to the terminal device at that address, so that the second target data is displayed on the terminal device.
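The two flows can be mocked with an in-memory stand-in for the network end. The method names `request_from` and `send_to` are invented; the disclosure only says the network end forwards data to and from the terminal device at the given address.

```python
# In-memory stand-in for the network end (router or Internet relay).
class NetworkEnd:
    def __init__(self):
        self.terminals = {}            # ip -> data currently shown on that device

    def request_from(self, ip):
        # wearable-display flow: fetch the first target data by the terminal's IP
        return self.terminals.get(ip)

    def send_to(self, ip, data):
        # terminal-display flow: deliver the second target data to the terminal
        self.terminals[ip] = data
        return True

def run_flow(flow, net, ip, payload=None):
    if flow == "wearable-display":
        return net.request_from(ip)    # shown at a set position on the glasses
    if flow == "terminal-display":
        return net.send_to(ip, payload)
```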

Optionally, recognizing the user's palm so as to generate the corresponding virtual hand in the wearable device includes:

performing key-point recognition on the user's palm to generate a plurality of palm key points corresponding to the user's palm;

based on the user's palm displayed in the wearable device, connecting the plurality of palm key points to generate the virtual hand.
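Connecting palm key points into a virtual hand might look like the following. The 21-point layout (wrist plus four joints per finger) is a common hand-tracking convention assumed here; the disclosure only says that key points are detected and then connected.

```python
# Assumed 21-point hand layout: index 0 is the wrist, then 4 joints per finger.
FINGER_CHAINS = [
    (0, 1, 2, 3, 4),       # thumb
    (0, 5, 6, 7, 8),       # index finger
    (0, 9, 10, 11, 12),    # middle finger
    (0, 13, 14, 15, 16),   # ring finger
    (0, 17, 18, 19, 20),   # little finger
]

def build_virtual_hand(keypoints):
    """keypoints: 21 (x, y, z) positions detected on the user's palm.
    Returns the bone segments connecting adjacent key points."""
    bones = []
    for chain in FINGER_CHAINS:
        for a, b in zip(chain, chain[1:]):
            bones.append((keypoints[a], keypoints[b]))
    return bones
```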

Optionally, the method includes:

monitoring the movement trajectories of the plurality of palm key points of the virtual hand to generate the target gesture of the virtual hand;

when the similarity between the target gesture and the preset interaction gesture reaches a similarity threshold, determining that the target gesture matches the preset interaction gesture.
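One possible way to score a trajectory against the preset gesture and the similarity threshold; the mean-distance metric below is an assumption, since the disclosure fixes no particular similarity formula.

```python
from math import dist

def gesture_similarity(target_track, preset_track):
    """Map the mean point-wise distance between two equal-length key-point
    trajectories to a (0, 1] score, where 1.0 means identical tracks."""
    mean_d = sum(dist(p, q) for p, q in zip(target_track, preset_track))
    mean_d /= len(target_track)
    return 1.0 / (1.0 + mean_d)

def matches(target_track, preset_track, similarity_threshold=0.8):
    # The gesture matches when the similarity reaches the threshold.
    return gesture_similarity(target_track, preset_track) >= similarity_threshold
```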

Optionally, identifying the terminal device within the set spatial range based on the wearable device and generating the virtual wireframe and the network protocol address of the terminal device includes:

identifying the terminal device within the set spatial range based on the wearable device, so as to generate appearance features of the terminal device;

generating the virtual wireframe according to the appearance features;

sending the virtual wireframe to the network end, the virtual wireframe instructing the network end to determine the network protocol address of the terminal device according to the virtual wireframe;

receiving the network protocol address fed back by the network end.
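The network-side lookup from a wireframe (derived from appearance features) to a protocol address can be sketched as a dictionary. The signature fields and addresses below are illustrative only.

```python
# Network-side mapping, pre-configured per the text: wireframe signature
# (derived from appearance features) -> network protocol address.
WIREFRAME_TO_IP = {
    ("tv", "55-inch", "black"): "192.168.1.20",
    ("laptop", "14-inch", "silver"): "192.168.1.31",
}

def resolve_ip_by_wireframe(appearance_features):
    """Look up the protocol address for a wireframe built from the given
    appearance features (e.g. device type, size, colour)."""
    return WIREFRAME_TO_IP.get(tuple(appearance_features))
```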

Optionally, identifying the terminal device within the set spatial range based on the wearable device and generating the network protocol address of the terminal device includes:

identifying the terminal device within the set spatial range based on the wearable device, so as to determine the actual spatial distance between the wearable device and the terminal device;

reporting the position information of the wearable device and the actual spatial distance to the network end, the position information and the actual spatial distance instructing the network end to determine the network protocol address of the terminal device according to them;

receiving the network protocol address fed back by the network end.
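A sketch of how the network end might pick out the terminal device from the reported position and spatial distance; the registry, coordinates, and tolerance are illustrative assumptions.

```python
from math import dist

# Network-side registry of terminal positions keyed by protocol address.
REGISTERED_TERMINALS = {
    "192.168.1.20": (3.0, 0.0, 1.2),
    "192.168.1.31": (0.0, 4.0, 0.8),
}

def resolve_ip_by_distance(wearable_pos, reported_distance, tolerance=0.5):
    """Pick the registered terminal whose distance from the wearable's
    reported position best matches the spatial distance it measured."""
    best_ip, best_err = None, tolerance
    for ip, pos in REGISTERED_TERMINALS.items():
        err = abs(dist(wearable_pos, pos) - reported_distance)
        if err < best_err:
            best_ip, best_err = ip, err
    return best_ip
```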

According to a second aspect of embodiments of the present disclosure, a wearable-device-based data interaction apparatus is provided, including:

a first generation module configured to identify a terminal device within a set spatial range based on the wearable device, and to generate a virtual wireframe and a network protocol address for the terminal device;

a second generation module configured to, in response to detecting a user's palm within the set spatial range, recognize the user's palm so as to generate a corresponding virtual hand in the wearable device;

a determination module configured to determine a data interaction flow between the wearable device and the terminal device when the distance between the virtual hand and the virtual wireframe is less than a distance threshold and the target gesture of the virtual hand matches a preset interaction gesture;

an execution module configured to perform data interaction between the wearable device and the terminal device according to the data interaction flow and the network protocol address.

According to a third aspect of embodiments of the present disclosure, a wearable device is provided, including:

a processor;

a memory for storing processor-executable instructions;

wherein the processor is configured to implement the steps of any one of the methods of the first aspect of the present disclosure when executing the executable instructions.

According to a fourth aspect of embodiments of the present disclosure, a computer-readable storage medium is provided, on which computer program instructions are stored; when the program instructions are executed by a processor, the steps of the wearable-device-based data interaction method provided in the first aspect of the present disclosure are implemented.

The technical solutions provided by the embodiments of the present disclosure may offer the following beneficial effects:

In the manner described above, a terminal device within a set spatial range is identified based on the wearable device, and a virtual wireframe and a network protocol address are generated for the terminal device; in response to detecting a user's palm within the set spatial range, the palm is recognized so as to generate a corresponding virtual hand in the wearable device; when the distance between the virtual hand and the virtual wireframe is less than a distance threshold and the target gesture of the virtual hand matches a preset interaction gesture, a data interaction flow between the wearable device and the terminal device is determined; and data interaction is performed between the wearable device and the terminal device according to the data interaction flow and the network protocol address. The wearable device thus enables device-to-device data interaction through gestures, making the interaction more immersive and more convenient to operate.

It should be understood that the foregoing general description and the following detailed description are exemplary and explanatory only, and do not limit the present disclosure.

Brief Description of the Drawings

The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and, together with the description, serve to explain its principles.

Fig. 1 is a flowchart of a wearable-device-based data interaction method according to an exemplary embodiment.

Fig. 2 is an example diagram of a data interaction method for mixed reality glasses according to an exemplary embodiment.

Fig. 3 is a flowchart of a method for generating a mixed reality scene according to an exemplary embodiment.

Fig. 4 is a flowchart of a method for determining a gesture event according to an exemplary embodiment.

Fig. 5 is a flowchart of a wearable-device-based data interaction method according to an exemplary embodiment.

Fig. 6 is a schematic diagram of a virtual hand according to an exemplary embodiment.

Fig. 7 is a flowchart of a target gesture recognition method according to an exemplary embodiment.

Fig. 8 is a schematic diagram of a wearable-device display flow according to an exemplary embodiment.

Fig. 9 is a schematic diagram of a terminal-device display flow according to an exemplary embodiment.

Fig. 10 is a flowchart of a data interaction method based on mixed reality glasses according to an exemplary embodiment.

Fig. 11 is a block diagram of a wearable-device-based data interaction apparatus according to an exemplary embodiment.

Fig. 12 is a block diagram of a wearable device according to an exemplary embodiment.

Detailed Description

Exemplary embodiments will now be described in detail, with examples illustrated in the accompanying drawings. Where the following description refers to the drawings, the same numerals in different drawings denote the same or similar elements unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the present disclosure; rather, they are merely examples of apparatuses and methods consistent with some aspects of the present disclosure as recited in the appended claims.

It should be noted that all actions of acquiring signals, information, or data in this application are carried out in compliance with the applicable data protection laws and policies of the country concerned, and with authorization from the owner of the corresponding device.

Before describing the data interaction method of this application, the wearable device used in it needs to be introduced. The wearable device here realizes mixed reality technology. A mixed reality (MR) scene is a form between a virtual scene and a real scene; it encompasses augmented reality scenes and augmented virtuality scenes, and refers to a new visual environment produced by merging the real and the virtual. In this new visual environment, physical objects and virtual objects coexist, and a variety of interactions can occur between them. In this application, the information flow between multiple devices centers on a wearable device equipped with mixed reality technology; by building services into the devices, indirect device-to-device communication is achieved.

In the related art, virtual objects have been given more real-world attributes in various ways, for example: fixing virtual objects in place through spatial positioning, establishing a clear depth hierarchy between real and virtual objects through virtual-real occlusion, and synchronizing a virtual object's lighting with the real scene through lighting computation. The immersive interaction of mixed reality brings virtual objects into every corner of life and shifts the transmission medium from the phone screen to thinner, lighter, more portable wearable devices. Through image recognition and data transmission, the wearable device can obtain the specific information displayed on other electronic devices, and through corresponding control gestures it can realize richer data interaction in its mixed reality space.

Fig. 1 is a flowchart of a wearable-device-based data interaction method according to an exemplary embodiment. As shown in Fig. 1, the method is used in a wearable device and includes the following steps.

In step S101, a terminal device within a set spatial range is identified based on the wearable device, and a virtual wireframe and a network protocol address are generated for the terminal device.

It is worth mentioning that this embodiment applies to a wearable device equipped with mixed reality technology; the wearable device may be mixed reality glasses, a mixed reality helmet, or the like, and in use it displays a mixed reality scene on its display apparatus. When the user wears the device, the mixed reality scene is presented through the display apparatus, and the spatial extent shown to the user in that scene is exactly the set spatial range the wearable device can capture. To facilitate mixed reality interaction, once the user puts on the device, the wearable device scans the set spatial range to determine the terminal devices within it. As an example, terminal devices usually have fairly fixed appearance features: televisions, mobile phones, personal computers, and so on all have a quadrilateral outer frame and a display. The terminal devices within the set spatial range can be identified from these appearance features, and a virtual wireframe for each terminal device can be generated, based on its outer frame, at the corresponding position in the display apparatus of the wearable device.

It should be noted that the principle of data interaction between the wearable device and a terminal device within the set spatial range in this embodiment is as follows: the data currently shown on the terminal device's screen is shared over a local area network or the Internet and sent to the wearable device for display, or the data currently shown on the wearable device is shared over the local area network or the Internet and sent to the terminal device for display. The data may be image data, video data, file data, and so on, which this embodiment does not limit. Accordingly, the terminal device is a smart device that can access the network, which may include: local-area-network devices that reach the Internet through an intermediate router, such as televisions, laptops, and tablets; and Internet devices that can connect directly through a network card, such as mobile phones and desktop computers on a wired connection. As an example, Fig. 2 is an example diagram of a data interaction method for mixed reality glasses according to an exemplary embodiment. As shown in Fig. 2, the wearable device in this embodiment is a pair of mixed reality glasses. When the terminal device is a local-area-network device (a television or personal computer), the glasses and the device connect to the Internet through the same router; the router then acts as a data relay, enabling indirect data interaction between the glasses and the local-area-network device. When the terminal device is an Internet device (a smartphone), the glasses have their own networking capability and can access the Internet directly; the Internet then serves as the data relay between the glasses and the Internet device, enabling indirect data interaction between them.

The network protocol address is the unique identity of a terminal device during network data interaction; different terminal devices have different network protocol addresses, and identifying the address determines the device's identity in the exchange and hence the data file it is currently browsing. As an example, in this embodiment, for local-area-network devices, the router relay of the local network is pre-configured with a mapping between each terminal device's appearance features and its network protocol address. After the user puts on the wearable device, the device identifies the terminal devices within the set spatial range and extracts their appearance features, which may include size, colour, brand, and so on, and sends them to the router relay. The router relay determines the terminal device's network protocol address from the appearance features and the mapping, then feeds the address back to the wearable device, which displays it at the corresponding position on the terminal device's virtual wireframe. As an example, Fig. 3 is a flowchart of a method for generating a mixed reality scene according to an exemplary embodiment. As shown in Fig. 3, the method includes the following steps.

(1) Scan the scene within the set range with the wearable device to generate map information.

(2) Identify the terminal devices in the scene from the map information.

(3) If a terminal device in the scene has already been identified, retrieve its virtual wireframe from the database by its device name and place the wireframe according to the device's display position in the wearable device; if a terminal device in the scene has not yet been recognized by the wearable device, generate a corresponding virtual wireframe based on its position in the scene.

(4) Identify the terminal device and record its network IP address in the wearable device, so as to generate the map file for the scene.
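Steps (1) through (4) above can be sketched as a cached scene scan; the in-memory dictionary stands in for the database, and all names are hypothetical.

```python
# In-memory stand-in for the wireframe database from step (3).
known_wireframes = {}      # device name -> previously generated wireframe

def scan_scene(detected_devices):
    """detected_devices: (name, position, ip) triples from the scan, steps
    (1)-(2). Returns the map file of step (4): name -> (wireframe, ip)."""
    map_file = {}
    for name, position, ip in detected_devices:
        if name in known_wireframes:                # step (3): reuse cached frame
            wireframe = known_wireframes[name]
        else:                                       # step (3): generate anew
            wireframe = f"wireframe@{position}"
            known_wireframes[name] = wireframe
        map_file[name] = (wireframe, ip)            # step (4): record the IP
    return map_file
```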

Optionally, the above step of generating the network protocol address of the terminal device may include:

identifying the terminal device within the set spatial range based on the wearable device, so as to determine the actual spatial distance between the wearable device and the terminal device;

reporting the position information of the wearable device and the actual spatial distance to the network end, which uses them to determine the network protocol address of the terminal device;

receiving the network protocol address fed back by the network end.

It is worth mentioning that in this embodiment the wearable device and the terminal device are connected to the same network. For example, if the terminal device is a local-area-network device, both connect to the Internet through the same router relay; if it is an Internet device, both connect through the same base station. In this embodiment the wearable device has precise positioning and distance detection capabilities: from the terminal device's display scale and display distance in the wearable device's display apparatus, it determines the actual spatial distance between the two devices, where that distance may be a relative coordinate in a three-dimensional coordinate system. The wearable device sends the actual spatial distance and its own position information to the network end. The network end locates the wearable device from the position information, uses precise positioning to determine the other terminal devices within a preset range of the wearable device, and then, from the actual spatial distance, picks out from those candidates the terminal device shown in the wearable device's display apparatus. From that device's data interaction records with the network end, it determines the corresponding network protocol address and feeds it back to the wearable device.
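Estimating the actual spatial distance from the display scale can be illustrated with a simple pinhole-camera model; the focal length and per-device real widths are illustrative assumptions, not values from the disclosure.

```python
# Pinhole-camera estimate of a device's distance from its apparent size on
# the glasses' display. All constants below are illustrative.
FOCAL_LENGTH_PX = 1400.0
REAL_WIDTH_M = {"tv": 1.2, "phone": 0.07}

def estimate_distance(device_type, width_in_image_px):
    """distance ~= focal_length * real_width / apparent_width_in_pixels."""
    return FOCAL_LENGTH_PX * REAL_WIDTH_M[device_type] / width_in_image_px
```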

需要说明的是，本实施例中可穿戴设备在设定空间范围内扫描确定的终端设备可以包括多个，可以通过上述方式确定多个终端设备对应的虚拟线框和网络协议地址，从而增加可穿戴设备的实用性，使可穿戴设备能够更好的实现与设定空间范围内任一终端设备之间的数据交互。It should be noted that in this embodiment the wearable device may scan and identify multiple terminal devices within the set spatial range; the virtual wireframes and network protocol addresses corresponding to these terminal devices can be determined in the manner described above, thereby increasing the practicability of the wearable device and enabling it to better perform data interaction with any terminal device within the set spatial range.

可选地,在另一种实施方式中,上述生成终端设备的网络协议地址的步骤,还可以包括:Optionally, in another implementation manner, the above step of generating the network protocol address of the terminal device may also include:

基于可穿戴设备对设定空间范围内的终端设备进行识别,以生成终端设备的外观特征;Identify terminal devices within a set space based on wearable devices to generate appearance features of terminal devices;

根据外观特征,生成虚拟线框;Generate a virtual wireframe based on appearance features;

将虚拟线框发送至网络端，虚拟线框用于指示网络端根据虚拟线框确定终端设备的网络协议地址；Send the virtual wireframe to the network side; the virtual wireframe is used to instruct the network side to determine the network protocol address of the terminal device according to the virtual wireframe;

接收网络端反馈的网络协议地址。Receive the network protocol address fed back by the network side.

值得一提的是，本实施例中在设定空间范围内的各个终端设备的虚拟线框与终端设备之间为唯一对应关系，在网络端设定有虚拟线框与终端设备对应网络协议地址之间的映射关系，通过查阅该映射关系可以确定虚拟线框对应的网络协议地址。示例的，本实施例中可穿戴设备对设定范围内的终端设备进行识别，生成终端设备的外观特征，并根据该外观特征生成虚拟线框。将该虚拟线框发送至网络端，网络端根据上述映射关系确定虚拟线框对应的网络协议地址，将终端设备的网络协议地址反馈至可穿戴设备中。It is worth mentioning that in this embodiment each terminal device within the set spatial range corresponds uniquely to its virtual wireframe, and the network side stores a mapping between virtual wireframes and the terminal devices' network protocol addresses; the network protocol address corresponding to a virtual wireframe can be determined by consulting this mapping. As an example, the wearable device identifies the terminal devices within the set range, generates their appearance features, and generates virtual wireframes from those features. A virtual wireframe is sent to the network side, which determines the corresponding network protocol address according to the above mapping and feeds the terminal device's network protocol address back to the wearable device.
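作为示意（映射表的键与地址均为本示例假设），下面用 Python 勾勒网络端通过查阅虚拟线框与网络协议地址之间的映射关系完成查询的过程。As a minimal sketch of the mapping lookup described above (the table keys and addresses are invented for illustration), in Python:

```python
# Hypothetical network-side mapping: a wireframe feature key -> device IP address.
WIREFRAME_IP_MAP = {
    "tv_55in_black_rect": "192.168.1.23",
    "phone_6in_rect": "192.168.1.45",
}

def resolve_protocol_address(wireframe_key):
    """Return the network protocol (IP) address mapped to a virtual wireframe,
    or None when the wireframe is unknown to the network side."""
    return WIREFRAME_IP_MAP.get(wireframe_key)
```

由于线框与终端设备一一对应，查表结果即为应反馈给可穿戴设备的网络协议地址。Because each wireframe corresponds uniquely to one terminal device, the lookup result is the network protocol address to feed back to the wearable device.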

在步骤S102中,响应于在设定空间范围内检测到用户手掌,对用户手掌进行识别,以在可穿戴设备中生成用户手掌对应的虚拟手。In step S102, in response to detecting the user's palm within the set space range, the user's palm is recognized, so as to generate a virtual hand corresponding to the user's palm in the wearable device.

值得一提的是，本实施例中可穿戴设备中的数据交互与数据显示主要通过用户的手势进行控制，通过检测可穿戴设备的显示装置中用户手掌对应的手势动作，来控制可穿戴设备中的视频播放、图像展示、文件显示，以及可穿戴设备与终端设备之间的数据交互。示例的，可以通过固定手势，控制在可穿戴设备的显示装置上显示相应的图像数据，因为可穿戴设备中装载有混合现实技术，因此该图像数据显示在当前可穿戴设备捕捉到的现实场景上，在设定空间范围的现实场景上进行图像数据的显示，此时，用户还可以根据相应的手势来控制图像数据在该现实场景中的显示位置。因此，为识别对应用户的用户意图，通过可穿戴设备对设定空间范围内的用户手掌进行检测，需要说明的是，为防止设定空间范围内用户手掌的误识别，通常情况下，可以以用户为中心，检测设定空间范围的预设阈值范围内的手掌图像，当在该预设阈值范围内检测到手掌图像时，则确定该手掌图像为用户手掌，并对该用户手掌进行识别，以确定用户手掌对应的手势动作。It is worth mentioning that in this embodiment data interaction and data display in the wearable device are mainly controlled by the user's gestures: by detecting the gesture actions of the user's palm in the wearable device's display apparatus, video playback, image display and file display in the wearable device, as well as data interaction between the wearable device and the terminal device, are controlled. As an example, a fixed gesture can trigger the display of corresponding image data on the wearable device's display apparatus; because the wearable device employs mixed reality technology, that image data is overlaid on the real scene currently captured by the wearable device, i.e. displayed on the real scene within the set spatial range, and the user can further control the display position of the image data in that scene with corresponding gestures. Therefore, to identify the corresponding user's intention, the wearable device detects the user's palm within the set spatial range. It should be noted that, to avoid misrecognizing palms within the set spatial range, palm images are usually detected within a preset threshold range of the set spatial range centered on the user; when a palm image is detected within that preset threshold range, it is determined to be the user's palm and is recognized to determine the corresponding gesture action.

示例的，本实施例中可以对用户手掌的手指尺寸，手掌大小、指头位置等重要特征进行检测，并根据相应的特征在可穿戴设备中生成可以用以表征该用户手掌的虚拟手。值得一提的是，用户手掌在可穿戴设备中的变化包括位置变化和手势变化，当需要拖动可穿戴设备中所显示的虚拟数据时，则需要用户手掌在可穿戴设备中拖动虚拟数据，从而发生位置变化；当需要操作可穿戴设备中所显示的虚拟数据时，则需要通过用户手掌对应的手指对可穿戴设备下达操作指令，从而发生手势变化。因此，该虚拟手中包括用户对应的手掌和手指，以表征用户手掌的位置变化特征和手势变化特征。其中，该虚拟手在可穿戴设备中的显示位置与用户手掌在可穿戴设备中的显示位置相对应，当用户手掌在可穿戴设备中产生变化时，虚拟手会跟随用户手掌在可穿戴设备的显示装置中进行变换或移动。As an example, in this embodiment important features of the user's palm such as finger size, palm size and fingertip position can be detected, and a virtual hand representing the user's palm is generated in the wearable device from those features. It is worth mentioning that changes of the user's palm in the wearable device include position changes and gesture changes: dragging the virtual data displayed in the wearable device requires the user's palm to drag that data, producing a position change, while operating the displayed virtual data requires the fingers of the user's palm to issue operation instructions to the wearable device, producing a gesture change. Therefore the virtual hand includes the palm and fingers corresponding to the user, so as to represent both the position-change and the gesture-change characteristics of the user's palm. The display position of the virtual hand in the wearable device corresponds to the display position of the user's palm; when the user's palm changes in the wearable device, the virtual hand transforms or moves accordingly in the wearable device's display apparatus.

在步骤S103中,在虚拟手与虚拟线框之间的距离小于距离阈值,且虚拟手的目标手势与预设交互手势匹配的情况下,确定可穿戴设备与终端设备之间的数据交互流程。In step S103, when the distance between the virtual hand and the virtual wireframe is less than the distance threshold and the target gesture of the virtual hand matches the preset interaction gesture, determine the data interaction process between the wearable device and the terminal device.

示例的，通过上述步骤生成虚拟手后对虚拟手与虚拟线框之间的距离进行检测，以确定用户手掌与终端设备之间是否进行了相关数据交互。当用户手掌在可穿戴设备的显示装置中触控到终端设备，对应虚拟手与虚拟线框相交时，则确定用户具有与终端设备之间进行交互的意图，再对用户虚拟手对应的目标手势进行检测，当该目标手势与预设交互手势匹配时，通过该目标手势确定可穿戴设备与终端设备之间的数据交互流程。As an example, after the virtual hand is generated through the above steps, the distance between the virtual hand and the virtual wireframe is detected to determine whether data interaction is taking place between the user's palm and the terminal device. When the user's palm touches the terminal device in the wearable device's display apparatus, i.e. the virtual hand intersects the virtual wireframe, it is determined that the user intends to interact with the terminal device; the target gesture of the user's virtual hand is then detected, and when the target gesture matches the preset interaction gesture, the data interaction process between the wearable device and the terminal device is determined from the target gesture.

示例的，可穿戴设备与终端设备之间的数据交互流程包括(1)通过中转站从终端设备中将当前展示的数据提取至可穿戴设备中进行数据展示；(2)通过中转站将可穿戴设备中当前展示的数据提取至终端设备中进行展示。因此，本实施例中可穿戴设备与终端设备之间的预设交互手势包括：数据抓取手势和数据释放手势，可以通过检测虚拟手对应手势的产生位置，确定可穿戴设备与终端设备之间的数据交互流程。图4是根据一示例性实施例示出的一种手势事件的判定方法的流程图，如图4所示，该判定方法包括以下步骤。As an example, the data interaction process between the wearable device and the terminal device includes (1) extracting the currently displayed data from the terminal device to the wearable device through the relay for display, and (2) extracting the currently displayed data from the wearable device to the terminal device through the relay for display. Therefore, the preset interaction gestures between the wearable device and the terminal device in this embodiment include a data-grab gesture and a data-release gesture, and the data interaction process can be determined by detecting where the gesture corresponding to the virtual hand is produced. Fig. 4 is a flowchart of a method for determining a gesture event according to an exemplary embodiment; as shown in Fig. 4, the determination method includes the following steps.

(1)对虚拟手的位置和手势类型进行识别;(1) Identify the position and gesture type of the virtual hand;

（2）在虚拟手的正前方打出相应的感应射线；(2) Cast a corresponding sensing ray directly in front of the virtual hand;

（3）当该感应射线击中终端设备对应的虚拟线框时，确定虚拟手和虚拟线框之间的距离；(3) When the sensing ray hits the virtual wireframe corresponding to the terminal device, determine the distance between the virtual hand and the virtual wireframe;

(4)当该距离小于阈值时,确定当前虚拟手对应的手势;(4) When the distance is less than the threshold, determine the gesture corresponding to the current virtual hand;

（5）若该手势与预设交互手势相符合，则判定手势控制流程通过，用户正基于该手势事件对可穿戴设备下达控制指令。(5) If the gesture matches the preset interaction gesture, it is determined that the gesture control process has passed and that the user is issuing a control instruction to the wearable device based on this gesture event.
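上述步骤(3)-(5)的判定逻辑可以用 Python 简要示意如下（函数名、参数名均为本示例假设）。The decision logic of steps (3)-(5) above can be sketched in Python as follows (the function and parameter names are assumptions of this example):

```python
def gesture_event_passes(ray_hits_wireframe, distance, distance_threshold,
                         gesture, preset_gestures):
    """The gesture event passes only when the sensing ray hits the wireframe,
    the hand-to-wireframe distance is below the threshold, and the current
    gesture matches one of the preset interaction gestures."""
    if not ray_hits_wireframe:
        return False            # step (3): the ray missed the wireframe
    if distance >= distance_threshold:
        return False            # step (4): the virtual hand is too far away
    return gesture in preset_gestures  # step (5): match against preset gestures
```

只有三项条件全部满足，才判定用户正在对可穿戴设备下达控制指令。Only when all three conditions hold is the user judged to be issuing a control instruction to the wearable device.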

在步骤S104中,根据数据交互流程和网络协议地址,在可穿戴设备与终端设备之间进行数据交互。In step S104, data interaction is performed between the wearable device and the terminal device according to the data interaction process and the network protocol address.

示例的，本实施例中通过上述步骤确定可穿戴设备与终端设备之间的数据交互流程后，根据终端设备的网络协议地址，通过数据中转的网络端，使可穿戴设备与终端设备之间进行数据交互。示例的，当数据交互流程为可穿戴设备提取终端设备中当前显示的数据信息时，通过可穿戴设备将网络协议地址和数据请求指令发送至网络端，网络端根据该网络协议地址确定对应的终端设备后，将该数据请求指令发送至终端设备中，终端设备基于该数据请求指令将当前显示屏中显示的数据发送至网络端，由网络端将该数据发送至可穿戴设备中进行数据显示；当数据交互流程为可穿戴设备传送数据至终端设备中进行数据显示时，通过可穿戴设备将当前显示的数据和网络协议地址发送至网络端，其中当前显示的数据为可穿戴设备中虚拟手选定的需要分享的数据，网络端根据该网络协议地址将该数据发送至终端设备中，使终端设备接收到该数据后在当前屏幕中显示该数据。As an example, after the data interaction process between the wearable device and the terminal device has been determined through the above steps, data interaction between them is performed via the network side acting as a data relay, according to the terminal device's network protocol address. For example, when the data interaction process is the wearable device extracting the data currently displayed on the terminal device, the wearable device sends the network protocol address and a data request instruction to the network side; after determining the corresponding terminal device from the network protocol address, the network side forwards the data request instruction to the terminal device, which sends the data currently shown on its display screen to the network side, and the network side sends that data to the wearable device for display. When the data interaction process is the wearable device transmitting data to the terminal device for display, the wearable device sends the currently displayed data, i.e. the data selected for sharing by the virtual hand in the wearable device, together with the network protocol address to the network side; the network side sends the data to the terminal device according to the network protocol address, and the terminal device displays the data on its current screen upon receipt.
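下面用 Python 给出网络端中转两种数据交互流程的一个极简内存模型（registry、IP 地址与数据内容均为本示例假设，并非实施例的实际实现）。The two relay flows can be modeled minimally in Python as follows (the registry, IP address and data contents are assumptions of this example, not the embodiment's actual implementation):

```python
def relay(direction, ip, registry, payload=None):
    """Minimal in-memory model of the network-side relay.

    direction 'pull': the wearable requests the data currently displayed on
    the terminal at `ip`, and the relay returns it to the wearable.
    direction 'push': the wearable's selected data `payload` is forwarded to
    the terminal at `ip`, which then displays it.
    `registry` maps an IP address to a dict with a 'displayed' field.
    """
    terminal = registry[ip]
    if direction == "pull":
        return terminal["displayed"]
    terminal["displayed"] = payload
    return payload

registry = {"192.168.1.23": {"displayed": "photo.png"}}
pulled = relay("pull", "192.168.1.23", registry)            # wearable display flow
relay("push", "192.168.1.23", registry, payload="doc.pdf")  # terminal display flow
```

该模型仅说明网络端如何按网络协议地址路由请求与数据，真实系统中还需网络传输与鉴权等环节。The model only illustrates routing by network protocol address; a real system would add transport and authentication.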

通过上述方式，基于可穿戴设备对设定空间范围内的终端设备进行识别，生成终端设备的虚拟线框和网络协议地址，响应于在设定空间范围内检测到用户手掌，对用户手掌进行识别，以在可穿戴设备中生成用户手掌对应的虚拟手，在虚拟手与虚拟线框之间的距离小于距离阈值，且虚拟手的目标手势与预设交互手势匹配的情况下，确定可穿戴设备与终端设备之间的数据交互流程，根据数据交互流程和网络协议地址，在可穿戴设备与终端设备之间进行数据交互。从而应用可穿戴设备通过手势交互，来实现设备与设备之间的数据交互，使交互的现实体验感更强，交互操作更加便捷。In the above manner, the wearable device identifies the terminal device within the set spatial range and generates the terminal device's virtual wireframe and network protocol address; in response to detecting the user's palm within the set spatial range, the user's palm is recognized so as to generate a corresponding virtual hand in the wearable device; when the distance between the virtual hand and the virtual wireframe is less than the distance threshold and the target gesture of the virtual hand matches the preset interaction gesture, the data interaction process between the wearable device and the terminal device is determined; and data interaction is then performed between the wearable device and the terminal device according to the data interaction process and the network protocol address. The wearable device thus realizes device-to-device data interaction through gesture interaction, giving the interaction a stronger sense of reality and making the interactive operation more convenient.

图5是根据一示例性实施例示出的一种基于可穿戴设备的数据交互方法的流程图,如图5所示,该方法应用于可穿戴设备中,包括以下步骤。Fig. 5 is a flow chart of a data interaction method based on a wearable device according to an exemplary embodiment. As shown in Fig. 5, the method is applied to a wearable device and includes the following steps.

在步骤S201中,基于可穿戴设备对设定空间范围内的终端设备进行识别,生成终端设备的虚拟线框和网络协议地址。In step S201, the wearable device identifies the terminal device within the set space range, and generates a virtual wireframe and a network protocol address of the terminal device.

示例的,本实施例中生成虚拟线框和网络协议地址的方式与上述步骤S101中相同,可以参照上述步骤S101,不再赘述。As an example, the method of generating the virtual wireframe and the network protocol address in this embodiment is the same as that in the above step S101, and reference may be made to the above step S101, and details will not be repeated here.

在步骤S202中,响应于在设定空间范围内检测到用户手掌,对用户手掌进行识别,以在可穿戴设备中生成用户手掌对应的虚拟手。In step S202, in response to detecting the user's palm within the set space range, the user's palm is recognized, so as to generate a virtual hand corresponding to the user's palm in the wearable device.

示例的,本实施例中生成虚拟手的方式与上述步骤S102中相同,可以参照上述步骤S102,不再赘述。As an example, the method of generating the virtual hand in this embodiment is the same as that in the above step S102, and reference may be made to the above step S102, which will not be repeated here.

可选地,在一种实施方式中上述步骤S202,包括:Optionally, in an implementation manner, the above step S202 includes:

对用户手掌进行关键点识别,以生成用户手掌对应的多个手掌关键点。Perform key point recognition on the user's palm to generate multiple palm key points corresponding to the user's palm.

基于可穿戴设备中展示的用户手掌,连接多个手掌关键点以生成虚拟手。Based on the user's palm displayed in the wearable device, multiple palm key points are connected to generate a virtual hand.

示例的，本实施例中根据用户手掌的活动点位，对用户手掌进行关键点识别，例如，以用户手掌中的食指为例，基于食指的活动特征，对食指的活动点位进行识别，在食指不同活动直接的位置上生成该食指对应的手掌关键点。基于同一识别逻辑对用户手掌进行识别，生成用户手掌对应的多个手掌关键点。需要说明的是，本实施例中手掌关键点在可穿戴设备中的生成位置与用户手掌相贴合，在用户视角上可以观察到手掌关键点附着在用户手掌上，因此可以根据用户手掌在可穿戴设备中的显示位置和显示轮廓，连接多个手掌关键点，生成用户手掌对应的虚拟手。图6是根据一示例性实施例示出的一种虚拟手的示意图，如图6所示，通过对用户手掌的虚拟点位进行识别，并根据用户手掌在可穿戴设备中的轮廓，将虚拟点位连接后生成如图所示的虚拟手。值得一提的是，本实施例中会根据虚拟点位的排列顺序对虚拟点位进行编号，方便后续对虚拟手对应的目标手势进行识别，从而分析用户的控制意图。示例的，图7是根据一示例性实施例示出的一种目标手势识别方法的流程图，如图7所示，该方法应用于混合现实眼镜中，该方法包括以下步骤。As an example, in this embodiment key points of the user's palm are identified according to the palm's articulation points. Taking the index finger as an example, the articulation points of the index finger are identified based on its motion characteristics, and the palm key points corresponding to the index finger are generated at the positions between its different articulation points. The user's palm is recognized with the same recognition logic to generate multiple palm key points. It should be noted that in this embodiment the palm key points are generated in the wearable device so as to fit the user's palm; from the user's perspective the key points appear attached to the palm, so multiple palm key points can be connected according to the display position and outline of the user's palm in the wearable device to generate the corresponding virtual hand. Fig. 6 is a schematic diagram of a virtual hand according to an exemplary embodiment; as shown in Fig. 6, the virtual points of the user's palm are identified and, according to the palm's outline in the wearable device, connected to generate the virtual hand shown in the figure. It is worth mentioning that in this embodiment the virtual points are numbered according to their arrangement order, which facilitates subsequent recognition of the target gesture of the virtual hand and analysis of the user's control intention. As an example, Fig. 7 is a flowchart of a target gesture recognition method according to an exemplary embodiment; as shown in Fig. 7, the method is applied to mixed reality glasses and includes the following steps.

(1)响应于混合现实眼镜中场景的启动,开启混合现实眼镜的摄像机;(1) In response to the start of the scene in the mixed reality glasses, turn on the camera of the mixed reality glasses;

（2）通过混合现实眼镜的摄像机对摄像机捕捉到的图像进行分析，确定图像中是否存在用户手掌；(2) Analyze the image captured by the camera of the mixed reality glasses to determine whether the user's palm is present in the image;

(3)当存在用户手掌时,在混合现实眼镜的空间中生成用户手掌对应的关键点;(3) When there is a user's palm, generate key points corresponding to the user's palm in the space of the mixed reality glasses;

(4)根据该关键点的位置确定用户手掌对应的手势。(4) Determine the gesture corresponding to the user's palm according to the position of the key point.
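步骤(3)(4)中"根据关键点位置确定手势"可以用一个简化分类器示意（关键点名称、坐标单位与阈值均为本示例假设；真实手势集合会更丰富）。The "determine the gesture from key-point positions" of steps (3)-(4) can be sketched with a toy classifier (the key-point names, units and threshold are assumptions of this example; a real gesture set would be richer):

```python
import math

def classify_palm_gesture(keypoints, pinch_threshold=0.03):
    """Classify a gesture from labeled 3D palm key points: a pinch is declared
    when the thumb tip and index-finger tip are closer than the threshold."""
    d = math.dist(keypoints["thumb_tip"], keypoints["index_tip"])
    return "pinch" if d < pinch_threshold else "open"
```

这里仅用两个指尖关键点的相对位置作判据；实际实现通常综合全部编号关键点的相对关系。Only two fingertip key points are used as the criterion here; a real implementation would typically combine the relative relationships of all numbered key points.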

可选地,在另一种实施方式中,该交互方法还包括:Optionally, in another implementation manner, the interaction method further includes:

对虚拟手中多个手掌关键点的移动轨迹进行监控,以生成虚拟手的目标手势。The movement trajectory of multiple palm key points in the virtual hand is monitored to generate the target gesture of the virtual hand.

在目标手势与预设交互手势的相似度达到相似度阈值的情况下,确定目标手势与预设交互手势匹配。When the similarity between the target gesture and the preset interactive gesture reaches the similarity threshold, it is determined that the target gesture matches the preset interactive gesture.

值得一提的是，可穿戴设备中虚拟手的位置与用户手掌贴合，虚拟手会跟随用户手掌移动，因此，本实施例中对虚拟手对应的多个手掌关键点的移动轨迹进行监控，根据各个手掌关键点之间的相对关系，确定用户手掌对应的目标手势，并将该目标手势与预设交互手势进行对比，确定两者之间的相似度，当相似度达到相似度阈值时，则确定目标手势与预设交互手势匹配。It is worth mentioning that the position of the virtual hand in the wearable device fits the user's palm and the virtual hand follows the palm's movement. Therefore, in this embodiment the movement trajectories of the multiple palm key points of the virtual hand are monitored; the target gesture of the user's palm is determined from the relative relationships between the palm key points and compared with the preset interaction gesture to determine their similarity. When the similarity reaches the similarity threshold, the target gesture is determined to match the preset interaction gesture.
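下面用 Python 勾勒"轨迹相似度达到阈值即匹配"的判定（此处的相似度度量仅为示意性假设，并非实施例规定的算法）。The "match when trajectory similarity reaches the threshold" decision can be sketched in Python as follows (the similarity metric here is an illustrative assumption, not an algorithm prescribed by the embodiment):

```python
import math

def gesture_similarity(traj_a, traj_b):
    """Map the mean pointwise distance between two equal-length key-point
    trajectories into a similarity score in (0, 1]: identical trajectories
    score 1.0, and the score decays as they diverge."""
    mean_d = sum(math.dist(a, b) for a, b in zip(traj_a, traj_b)) / len(traj_a)
    return 1.0 / (1.0 + mean_d)

def gestures_match(target, preset, similarity_threshold=0.8):
    """The target gesture matches the preset interaction gesture when the
    similarity reaches the threshold."""
    return gesture_similarity(target, preset) >= similarity_threshold
```

实际系统中可换用 DTW 等对时间伸缩更鲁棒的轨迹比较方法。A real system could substitute a trajectory comparison that is more robust to time warping, such as DTW.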

在步骤S203中,监控虚拟手的动态轨迹,以生成虚拟手的手势动作。In step S203, the dynamic trajectory of the virtual hand is monitored to generate gestures of the virtual hand.

在步骤S204中,在手势动作为设定提取动作的情况下,确定虚拟手在可穿戴设备中的当前提取目标。In step S204, if the gesture action is a set extraction action, determine the current extraction target of the virtual hand in the wearable device.

值得一提的是，可穿戴设备与终端设备之间进行数据交互时，需要通过用户手掌向可穿戴设备下达数据交互指令，其中不同的数据交互流程中均包括数据提取动作，且可以根据数据提取动作的产生位置，确定数据交互流程。示例的，本实施例中预设交互手势包括设定提取动作和设定释放动作，通过可穿戴设备对虚拟手中各个手指的动态轨迹进行监控，生成虚拟手的手势动作。将该手势动作与预设交互手势进行对比，当手势动作为设定提取动作时，确定用户需要在可穿戴设备的显示装置中选定的当前提取目标。It is worth mentioning that, for data interaction between the wearable device and the terminal device, data interaction instructions need to be issued to the wearable device through the user's palm; each of the different data interaction processes includes a data extraction action, and the data interaction process can be determined from where the extraction action is produced. As an example, the preset interaction gestures in this embodiment include a set extraction action and a set release action. The wearable device monitors the dynamic trajectory of each finger of the virtual hand to generate the gesture action of the virtual hand, and compares the gesture action with the preset interaction gestures; when the gesture action is the set extraction action, the current extraction target that the user selects in the wearable device's display apparatus is determined.

在步骤S205中,若当前提取目标不在虚拟线框的展示范围内,则确定数据交互流程为终端设备展示流程。In step S205, if the current extraction target is not within the display range of the virtual wireframe, it is determined that the data interaction process is a terminal device display process.

在步骤S206中,若当前提取目标在虚拟线框的展示范围内,则确定数据交互流程为可穿戴设备展示流程。In step S206, if the current extraction target is within the display range of the virtual wireframe, it is determined that the data interaction process is a wearable device display process.

值得一提的是，本实施例中上述步骤中生成的虚拟线框可以是终端设备的边界线，该终端设备中均设置有显示屏，通常情况下终端设备的边界线均是围绕着显示屏的，因此虚拟线框的展示范围为终端设备的显示屏范围。因此，本实施例中通过确定设定提取动作对应的提取目标是否处于虚拟线框的展示范围内，从而确定可穿戴设备与终端设备之间的交互流程。示例的，若当前提取目标在终端设备对应的虚拟线框的展示范围内，则确定用户意图为从终端设备中提取数据在可穿戴设备中进行展示，对应的数据交互流程为可穿戴设备展示流程。若当前提取目标不在终端设备对应的虚拟线框的展示范围内，则确定用户意图为从可穿戴设备中提取数据至终端设备中进行数据展示，对应的数据交互流程为终端设备展示流程。It is worth mentioning that the virtual wireframe generated in the above steps of this embodiment may be the boundary line of the terminal device. Each such terminal device is provided with a display screen, and the boundary line of a terminal device usually surrounds its display screen, so the display range of the virtual wireframe is the range of the terminal device's display screen. Therefore, in this embodiment the interaction process between the wearable device and the terminal device is determined by checking whether the extraction target corresponding to the set extraction action lies within the display range of the virtual wireframe. For example, if the current extraction target is within the display range of the virtual wireframe corresponding to the terminal device, the user's intention is determined to be extracting data from the terminal device for display on the wearable device, and the corresponding data interaction process is the wearable device display process; if the current extraction target is not within that display range, the user's intention is determined to be extracting data from the wearable device to the terminal device for display, and the corresponding data interaction process is the terminal device display process.
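上述"提取目标是否在线框展示范围内"的判定可以用 Python 简要示意（此处将线框简化为轴对齐矩形，坐标均为本示例假设）。The "is the extraction target inside the wireframe's display range" decision above can be sketched in Python (the wireframe is simplified here to an axis-aligned rectangle; coordinates are assumptions of this example):

```python
def determine_flow(target_xy, wireframe_rect):
    """wireframe_rect = (x_min, y_min, x_max, y_max), the screen area enclosed
    by the terminal device's virtual wireframe.

    Inside the wireframe  -> wearable display flow (pull data from the terminal).
    Outside the wireframe -> terminal display flow (push data to the terminal).
    """
    x, y = target_xy
    x_min, y_min, x_max, y_max = wireframe_rect
    inside = x_min <= x <= x_max and y_min <= y <= y_max
    return "wearable_display" if inside else "terminal_display"
```

实际实现中线框可能是任意多边形，需要相应的点包含测试。In practice the wireframe may be an arbitrary polygon, requiring a corresponding point-in-polygon test.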

在步骤S207中,根据数据交互流程和网络协议地址,在可穿戴设备与终端设备之间进行数据交互。In step S207, data interaction is performed between the wearable device and the terminal device according to the data interaction process and the network protocol address.

示例的,本实施例中数据交互方式与上述步骤S104中相同,可以参照上述步骤S104,不再赘述。As an example, the data interaction method in this embodiment is the same as that in the above step S104, and reference may be made to the above step S104, and details are not repeated here.

可选地,在一种实施方式中,上述步骤S207,包括:Optionally, in an implementation manner, the above step S207 includes:

在数据交互流程为可穿戴设备展示流程的情况下,向网络端发送网络协议地址,网络协议地址用于指示网络端根据网络协议地址向终端设备请求当前提取目标对应的第一目标数据。When the data interaction process is a wearable device display process, the network protocol address is sent to the network end, and the network protocol address is used to instruct the network end to request the terminal device for the first target data corresponding to the current extraction target according to the network protocol address.

响应于接收到网络端反馈的第一目标数据,在可穿戴设备的设定位置展示第一目标数据。In response to receiving the first target data fed back by the network terminal, the first target data is displayed at the set position of the wearable device.

在数据交互流程为终端设备展示流程的情况下，向网络端发送当前提取目标对应的第二目标数据和网络协议地址，网络协议地址用于指示网络端根据网络协议地址向终端设备发送第二目标数据，以在终端设备中展示第二目标数据。In the case that the data interaction process is the terminal device display process, the second target data corresponding to the current extraction target and the network protocol address are sent to the network side; the network protocol address is used to instruct the network side to send the second target data to the terminal device according to the network protocol address, so that the second target data is displayed on the terminal device.

示例的，图8是根据一示例性实施例示出的一种可穿戴设备展示流程的示意图，如图8所示，本实施例中通过上述步骤确定数据交互流程为可穿戴设备展示流程时，可穿戴设备向网络端发送网络协议地址，网络端根据该网络协议地址向对应的终端设备发送数据请求，终端设备接收到该数据请求后，将当前在显示屏中展示的第一目标数据发送至网络端，通过网络端中转将该第一目标数据反馈至可穿戴设备中，并在可穿戴设备显示装置的设定位置上展示该第一目标数据，实现第一目标数据在可穿戴设备和终端设备中的同步展示。图9是根据一示例性实施例示出的一种终端设备展示流程的示意图，如图9所示，当确定数据交互流程为终端设备展示流程时，可穿戴设备将当前提取目标对应的第二目标数据和网络协议地址发送至网络端，网络端根据网络协议地址确定对应的终端设备后，将第二目标数据发送至终端设备中，终端设备接收到该第二目标数据后，在当前显示屏中展示该第二目标数据。As an example, Fig. 8 is a schematic diagram of a wearable device display process according to an exemplary embodiment. As shown in Fig. 8, when the above steps determine that the data interaction process is the wearable device display process, the wearable device sends the network protocol address to the network side, and the network side sends a data request to the corresponding terminal device according to that address; upon receiving the request, the terminal device sends the first target data currently shown on its display screen to the network side, which relays the first target data back to the wearable device, where it is displayed at a set position of the wearable device's display apparatus, realizing synchronized display of the first target data on the wearable device and the terminal device. Fig. 9 is a schematic diagram of a terminal device display process according to an exemplary embodiment. As shown in Fig. 9, when the data interaction process is determined to be the terminal device display process, the wearable device sends the second target data corresponding to the current extraction target and the network protocol address to the network side; after determining the corresponding terminal device from the network protocol address, the network side sends the second target data to the terminal device, which displays the second target data on its current screen upon receipt.

通过上述方式，用户启动可穿戴设备后通过在可穿戴设备中比划设定交互手势，实现可穿戴设备与终端设备之间的数据传输，并完成可穿戴设备和终端设备之间的同步展示，使交互的现实体验感更强，交互操作更加便捷。In the above manner, after starting the wearable device the user performs the set interaction gestures in the wearable device to realize data transmission between the wearable device and the terminal device and complete synchronized display between them, giving the interaction a stronger sense of reality and making the interactive operation more convenient.

图10是根据一示例性实施例示出的一种基于混合现实眼镜的数据交互方法的流程图,如图10所示,该方法应用于混合现实眼镜,该方法包括以下步骤。Fig. 10 is a flow chart of a data interaction method based on mixed reality glasses according to an exemplary embodiment. As shown in Fig. 10, the method is applied to mixed reality glasses, and the method includes the following steps.

(1)通过混合现实眼镜对用户手掌对应的目标手势进行交互判定;(1) Interactively determine the target gesture corresponding to the user's palm through the mixed reality glasses;

(2)当确定存在手势交互事件时,获取混合现实眼镜中终端设备对应的设备IP地址;(2) When it is determined that there is a gesture interaction event, obtain the device IP address corresponding to the terminal device in the mixed reality glasses;

（3）根据目标手势在混合现实眼镜中的交互轨迹，确定终端设备与混合现实眼镜之间的交互流程，示例的，当用户手掌将混合现实眼镜空间中的文件拖向终端设备代表的虚拟线框时，确定交互流程为混合现实眼镜向终端设备传递数据进行显示；当用户手掌将终端设备APP中显示的文件拖向混合现实眼镜的任意空间时，确定交互流程为混合现实眼镜向终端设备请求数据，并将该数据显示在混合现实眼镜的对应位置；(3) Determine the interaction process between the terminal device and the mixed reality glasses according to the interaction trajectory of the target gesture in the mixed reality glasses. For example, when the user's palm drags a file in the space of the mixed reality glasses toward the virtual wireframe representing the terminal device, the interaction process is determined to be the mixed reality glasses transmitting data to the terminal device for display; when the user's palm drags a file displayed in the terminal device's APP toward any space of the mixed reality glasses, the interaction process is determined to be the mixed reality glasses requesting data from the terminal device and displaying that data at the corresponding position in the mixed reality glasses;

(4)混合现实眼镜根据交互流程通过服务器向终端设备传输信息;(4) The mixed reality glasses transmit information to the terminal device through the server according to the interaction process;

（5）当交互流程为混合现实眼镜向终端设备传递数据时，混合现实眼镜向服务器发送信息，该信息中包括用户手掌当前选中的目标数据以及终端设备对应的设备IP地址；服务器根据该设备IP地址向终端设备发送目标数据；终端设备后台APP接收目标数据后，在显示屏中显示该目标数据；(5) When the interaction process is the mixed reality glasses transmitting data to the terminal device, the mixed reality glasses send information to the server that includes the target data currently selected by the user's palm and the device IP address of the terminal device; the server sends the target data to the terminal device according to the device IP address; after the terminal device's background APP receives the target data, it displays the target data on the display screen;

（6）当交互流程为混合现实眼镜向终端设备请求数据时，混合现实眼镜发送数据请求和设备IP地址至服务器中，服务器根据该设备IP地址将数据请求发送至对应的终端设备中，终端设备根据该数据请求发送对应的目标数据至服务器，服务器向混合现实眼镜传输该目标数据，混合现实眼镜接收该目标数据后，在相应的显示位置上显示该目标数据。(6) When the interaction process is the mixed reality glasses requesting data from the terminal device, the mixed reality glasses send the data request and the device IP address to the server; the server forwards the data request to the corresponding terminal device according to the device IP address; the terminal device sends the corresponding target data to the server according to the data request; the server transmits the target data to the mixed reality glasses, and after receiving it the mixed reality glasses display the target data at the corresponding display position.

通过上述方式，用户启动可穿戴设备后通过在可穿戴设备中比划设定交互手势，实现可穿戴设备与终端设备之间的数据传输，并完成可穿戴设备和终端设备之间的同步展示，使交互的现实体验感更强，交互操作更加便捷。In the above manner, after starting the wearable device the user performs the set interaction gestures in the wearable device to realize data transmission between the wearable device and the terminal device and complete synchronized display between them, giving the interaction a stronger sense of reality and making the interactive operation more convenient.

图11是根据一示例性实施例示出的一种基于可穿戴设备的数据交互装置的框图。参照图11,该装置100包括:第一生成模块110,第二生成模块120、确定模块130和执行模块140。Fig. 11 is a block diagram of a data interaction device based on a wearable device according to an exemplary embodiment. Referring to FIG. 11 , the device 100 includes: a first generation module 110 , a second generation module 120 , a determination module 130 and an execution module 140 .

第一生成模块110,被配置为基于可穿戴设备对设定空间范围内的终端设备进行识别,生成终端设备的虚拟线框和网络协议地址;The first generating module 110 is configured to identify terminal devices within a set space range based on the wearable device, and generate a virtual wireframe and a network protocol address of the terminal device;

第二生成模块120,被配置为响应于在设定空间范围内检测到用户手掌,对用户手掌进行识别,以在可穿戴设备中生成用户手掌对应的虚拟手;The second generation module 120 is configured to recognize the user's palm in response to detecting the user's palm within the set space range, so as to generate a virtual hand corresponding to the user's palm in the wearable device;

确定模块130，被配置为在虚拟手与虚拟线框之间的距离小于距离阈值，且虚拟手的目标手势与预设交互手势匹配的情况下，确定可穿戴设备与终端设备之间的数据交互流程；The determination module 130 is configured to determine the data interaction process between the wearable device and the terminal device when the distance between the virtual hand and the virtual wireframe is less than the distance threshold and the target gesture of the virtual hand matches the preset interaction gesture;

执行模块140,被配置为根据数据交互流程和网络协议地址,在可穿戴设备与终端设备之间进行数据交互。The execution module 140 is configured to perform data interaction between the wearable device and the terminal device according to the data interaction process and the network protocol address.

Optionally, the determination module 130 is configured to:

monitor the dynamic trajectory of the virtual hand to generate a gesture action of the virtual hand;

when the gesture action is a set extraction action, determine a current extraction target of the virtual hand in the wearable device;

if the current extraction target is not within the display range of the virtual wireframe, determine that the data interaction flow is a terminal-device display flow;

if the current extraction target is within the display range of the virtual wireframe, determine that the data interaction flow is a wearable-device display flow.
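As an illustrative sketch only (not part of the disclosed embodiments), the branch logic above can be expressed as follows; the function names, the string flow labels, and the rectangular wireframe representation are assumptions introduced for illustration:

```python
# Illustrative sketch of the determination module's branch logic.
# Assumption: the extraction target is a 2D point in the wearable
# device's display coordinates, and the virtual wireframe is an
# axis-aligned rectangle (x_min, y_min, x_max, y_max).

def wireframe_contains(wireframe, target):
    """Return True if the extraction target lies inside the wireframe."""
    x_min, y_min, x_max, y_max = wireframe
    x, y = target
    return x_min <= x <= x_max and y_min <= y <= y_max

def decide_interaction_flow(gesture_action, extraction_target, wireframe,
                            set_extraction_action="pinch"):
    """Map the recognized gesture to a data interaction flow.

    Returns "terminal_display" when the extraction target is outside
    the wireframe, "wearable_display" when it is inside, and None
    when the gesture is not the set extraction action.
    """
    if gesture_action != set_extraction_action:
        return None
    if wireframe_contains(wireframe, extraction_target):
        return "wearable_display"
    return "terminal_display"
```

For example, a pinch on a target inside the wireframe (i.e., on content shown by the terminal device) would select the wearable-device display flow, while a pinch outside it would select the terminal-device display flow.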

Optionally, the execution module 140 is configured to:

when the data interaction flow is the wearable-device display flow, send the network protocol address to a network side, the network protocol address being used to instruct the network side to request, from the terminal device according to the network protocol address, first target data corresponding to the current extraction target;

in response to receiving the first target data fed back by the network side, display the first target data at a set position of the wearable device;

when the data interaction flow is the terminal-device display flow, send second target data corresponding to the current extraction target and the network protocol address to the network side, the network protocol address being used to instruct the network side to send the second target data to the terminal device according to the network protocol address, so that the second target data is displayed on the terminal device.
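A minimal sketch of the execution module's two branches, with the network-side transport abstracted as an injected callable; the message fields (`op`, `ip`, `target`, `data`) are illustrative assumptions, not the disclosed protocol:

```python
# Illustrative sketch of dispatching one data-interaction round.
# Assumption: send_to_network is any callable that delivers a message
# dict to the network side and returns its reply.

def execute_interaction(flow, ip_address, extraction_target,
                        send_to_network, local_data=None):
    """Dispatch according to the data interaction flow.

    - "wearable_display": ask the network side to fetch the first
      target data from the terminal device at ip_address.
    - "terminal_display": push the second target data (local_data)
      together with the address so the network side forwards it to
      the terminal device for display.
    """
    if flow == "wearable_display":
        return send_to_network({"op": "fetch", "ip": ip_address,
                                "target": extraction_target})
    if flow == "terminal_display":
        return send_to_network({"op": "push", "ip": ip_address,
                                "data": local_data})
    raise ValueError(f"unknown flow: {flow}")
```

Injecting the transport keeps the dispatch logic testable without a real network side.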

Optionally, the second generation module 120 is configured to:

perform key-point recognition on the user's palm to generate a plurality of palm key points corresponding to the user's palm;

based on the user's palm displayed in the wearable device, connect the plurality of palm key points to generate the virtual hand.
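To illustrate the key-point connection step, the sketch below turns key-point coordinates into the line segments of a virtual hand. The 21-point topology follows the common MediaPipe-style hand layout and is an assumption; the disclosure does not fix a key-point count or connection scheme:

```python
# Illustrative sketch: connect palm key points into a virtual hand.
# Assumption: 21 key points indexed in the MediaPipe-style order
# (0 = wrist, 1-4 = thumb, 5-8 = index, 9-12 = middle,
#  13-16 = ring, 17-20 = little finger).

# (start_index, end_index) pairs forming the hand skeleton
HAND_EDGES = [
    (0, 1), (1, 2), (2, 3), (3, 4),         # thumb
    (0, 5), (5, 6), (6, 7), (7, 8),         # index finger
    (5, 9), (9, 10), (10, 11), (11, 12),    # middle finger
    (9, 13), (13, 14), (14, 15), (15, 16),  # ring finger
    (13, 17), (17, 18), (18, 19), (19, 20), # little finger
    (0, 17),                                # palm base
]

def virtual_hand_segments(keypoints):
    """Turn key-point coordinates into the drawable line segments
    of a virtual hand (one segment per skeleton edge)."""
    return [(keypoints[a], keypoints[b]) for a, b in HAND_EDGES]
```

The resulting segments can then be rendered in the wearable device's display at the position where the user's palm appears.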

Optionally, the apparatus 100 includes a judgment module configured to:

monitor movement trajectories of the plurality of palm key points of the virtual hand to generate the target gesture of the virtual hand;

when the similarity between the target gesture and the preset interaction gesture reaches a similarity threshold, determine that the target gesture matches the preset interaction gesture.
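One way to realize threshold-based gesture matching is sketched below using cosine similarity between equally sampled trajectories; the descriptor and the 0.9 default threshold are illustrative assumptions, since the disclosure only requires some similarity measure with a threshold:

```python
import math

# Illustrative sketch: match a tracked trajectory against a preset
# gesture template via cosine similarity. Assumption: both
# trajectories contain the same number of (x, y) sample points.

def _flatten(trajectory):
    return [coord for point in trajectory for coord in point]

def gesture_similarity(target, template):
    """Cosine similarity between two equally sampled trajectories."""
    a, b = _flatten(target), _flatten(template)
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def matches(target, template, threshold=0.9):
    """True when the target gesture reaches the similarity threshold."""
    return gesture_similarity(target, template) >= threshold
```

In practice the trajectories would first be resampled and normalized for position and scale so that the same gesture performed at different sizes still reaches the threshold.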

Optionally, the first generation module 110 is configured to:

identify, based on the wearable device, the terminal device within the set spatial range to generate appearance features of the terminal device;

generate the virtual wireframe according to the appearance features;

send the virtual wireframe to a network side, the virtual wireframe being used to instruct the network side to determine the network protocol address of the terminal device according to the virtual wireframe;

receive the network protocol address fed back by the network side.

Optionally, the first generation module 110 is configured to:

identify, based on the wearable device, the terminal device within the set spatial range to determine an actual spatial distance between the wearable device and the terminal device;

report position information of the wearable device and the actual spatial distance to a network side, the position information and the actual spatial distance being used to instruct the network side to determine the network protocol address of the terminal device according to the position information and the actual spatial distance;

receive the network protocol address fed back by the network side.
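As an illustrative sketch of how a network side could resolve the terminal device's address from the reported position and distance, the function below scans a registry of known device positions; the registry structure, the distance-error matching rule, and the tolerance value are all assumptions, not the disclosed mechanism:

```python
import math

# Illustrative sketch of network-side address resolution.
# Assumption: the network side maintains a registry mapping each
# candidate terminal device's IP address to its known 2D position.

def resolve_terminal_ip(wearable_pos, reported_distance, registry,
                        tolerance=0.5):
    """Return the IP of the registered device whose distance from the
    wearable device best matches the reported distance (within a
    tolerance), or None if no candidate qualifies."""
    best_ip, best_err = None, tolerance
    for ip, pos in registry.items():
        err = abs(math.dist(wearable_pos, pos) - reported_distance)
        if err <= best_err:
            best_ip, best_err = ip, err
    return best_ip
```

The resolved address is then fed back to the wearable device for the subsequent data interaction.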

With respect to the apparatus in the above embodiments, the specific manner in which each module performs its operations has been described in detail in the embodiments of the method and will not be elaborated here.

The present disclosure also provides a computer-readable storage medium on which computer program instructions are stored; when executed by a processor, the program instructions implement the steps of the wearable-device-based data interaction method provided by the present disclosure.

Fig. 12 is a block diagram of a wearable device according to an exemplary embodiment. For example, the device 1200 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, a fitness device, a personal digital assistant, or the like.

Referring to Fig. 12, the device 1200 may include one or more of the following components: a processing component 1202, a memory 1204, a power supply component 1206, a multimedia component 1208, an audio component 1210, an input/output interface 1212, a sensor component 1214, and a communication component 1216.

The processing component 1202 generally controls the overall operation of the device 1200, such as operations associated with display, telephone calls, data communication, camera operation, and recording. The processing component 1202 may include one or more processors 1220 to execute instructions so as to complete all or part of the steps of the above wearable-device-based data interaction method. In addition, the processing component 1202 may include one or more modules that facilitate interaction between the processing component 1202 and other components; for example, it may include a multimedia module to facilitate interaction between the multimedia component 1208 and the processing component 1202.

The memory 1204 is configured to store various types of data to support operation of the device 1200. Examples of such data include instructions for any application or method operating on the device 1200, contact data, phonebook data, messages, pictures, videos, and the like. The memory 1204 may be implemented by any type of volatile or non-volatile storage device, or a combination thereof, such as static random-access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, a magnetic disk, or an optical disk.

The power supply component 1206 provides power to the various components of the device 1200. The power supply component 1206 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the device 1200.

The multimedia component 1208 includes a screen providing an output interface between the device 1200 and the user. In some embodiments, the screen may include a liquid crystal display (LCD) and a touch panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from the user. The touch panel includes one or more touch sensors to sense touches, swipes, and gestures on the touch panel. The touch sensors may not only sense the boundary of a touch or swipe action but also detect the duration and pressure associated with the touch or swipe operation. In some embodiments, the multimedia component 1208 includes a front camera and/or a rear camera. When the device 1200 is in an operation mode, such as a shooting mode or a video mode, the front camera and/or the rear camera can receive external multimedia data. Each front or rear camera may be a fixed optical lens system or may have focal length and optical zoom capability.

The audio component 1210 is configured to output and/or input audio signals. For example, the audio component 1210 includes a microphone (MIC) configured to receive external audio signals when the device 1200 is in an operation mode, such as a call mode, a recording mode, or a voice recognition mode. The received audio signals may be further stored in the memory 1204 or sent via the communication component 1216. In some embodiments, the audio component 1210 also includes a speaker for outputting audio signals.

The input/output interface 1212 provides an interface between the processing component 1202 and peripheral interface modules, which may be a keyboard, a click wheel, buttons, and the like. These buttons may include, but are not limited to, a home button, volume buttons, a start button, and a lock button.

The sensor component 1214 includes one or more sensors for providing status assessments of various aspects of the device 1200. For example, the sensor component 1214 can detect the open/closed state of the device 1200 and the relative positioning of components, such as the display and keypad of the device 1200; it can also detect a change in the position of the device 1200 or of one of its components, the presence or absence of user contact with the device 1200, the orientation or acceleration/deceleration of the device 1200, and temperature changes of the device 1200. The sensor component 1214 may include a proximity sensor configured to detect the presence of nearby objects without any physical contact. It may also include an optical sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor component 1214 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.

The communication component 1216 is configured to facilitate wired or wireless communication between the device 1200 and other devices. The device 1200 can access a wireless network based on a communication standard, such as WiFi, 2G, or 3G, or a combination thereof. In an exemplary embodiment, the communication component 1216 receives broadcast signals or broadcast-related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 1216 also includes a near-field communication (NFC) module to facilitate short-range communication. For example, the NFC module may be implemented based on radio-frequency identification (RFID) technology, Infrared Data Association (IrDA) technology, ultra-wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.

In an exemplary embodiment, the device 1200 may be implemented by one or more application-specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field-programmable gate arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic components, for performing the above wearable-device-based data interaction method.

In an exemplary embodiment, there is also provided a non-transitory computer-readable storage medium including instructions, such as the memory 1204 including instructions, which are executable by the processor 1220 of the device 1200 to complete the above wearable-device-based data interaction method. For example, the non-transitory computer-readable storage medium may be a ROM, a random-access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, or the like.

In addition to being an independent electronic device, the above device may also be part of an independent electronic device. For example, in one embodiment, the device may be an integrated circuit (IC) or a chip, where the integrated circuit may be a single IC or a collection of multiple ICs, and the chip may include, but is not limited to, the following kinds: a GPU (Graphics Processing Unit), a CPU (Central Processing Unit), an FPGA (Field Programmable Gate Array), a DSP (Digital Signal Processor), an ASIC (Application Specific Integrated Circuit), an SoC (System on Chip), and the like. The above integrated circuit or chip may execute executable instructions (or code) to implement the above wearable-device-based data interaction method. The executable instructions may be stored in the integrated circuit or chip, or may be obtained from another apparatus or device; for example, the integrated circuit or chip may include a processor, a memory, and an interface for communicating with other apparatuses. The executable instructions may be stored in the memory and, when executed by the processor, implement the above wearable-device-based data interaction method; alternatively, the integrated circuit or chip may receive the executable instructions through the interface and transmit them to the processor for execution, so as to implement the above wearable-device-based data interaction method.

In another exemplary embodiment, a computer program product is also provided. The computer program product contains a computer program executable by a programmable apparatus, and the computer program has code portions that, when executed by the programmable apparatus, perform the above wearable-device-based data interaction method.

Other embodiments of the present disclosure will readily occur to those skilled in the art upon consideration of the specification and practice of the disclosure. This application is intended to cover any variations, uses, or adaptations of the present disclosure that follow its general principles and include common knowledge or customary technical means in the art not disclosed herein. The specification and embodiments are to be regarded as exemplary only, with the true scope and spirit of the disclosure being indicated by the following claims.

It should be understood that the present disclosure is not limited to the precise constructions described above and shown in the accompanying drawings, and various modifications and changes may be made without departing from its scope. The scope of the present disclosure is limited only by the appended claims.

Claims (10)

1. A wearable-device-based data interaction method, characterized by comprising:
identifying, based on the wearable device, a terminal device within a set spatial range, and generating a virtual wireframe and a network protocol address of the terminal device;
in response to detecting a user's palm within the set spatial range, recognizing the user's palm so as to generate, in the wearable device, a virtual hand corresponding to the user's palm;
when the distance between the virtual hand and the virtual wireframe is less than a distance threshold and a target gesture of the virtual hand matches a preset interaction gesture, determining a data interaction flow between the wearable device and the terminal device;
performing data interaction between the wearable device and the terminal device according to the data interaction flow and the network protocol address.

2. The interaction method according to claim 1, characterized in that the determining the data interaction flow between the wearable device and the terminal device comprises:
monitoring the dynamic trajectory of the virtual hand to generate a gesture action of the virtual hand;
when the gesture action is a set extraction action, determining a current extraction target of the virtual hand in the wearable device;
if the current extraction target is not within the display range of the virtual wireframe, determining that the data interaction flow is a terminal-device display flow;
if the current extraction target is within the display range of the virtual wireframe, determining that the data interaction flow is a wearable-device display flow.

3. The interaction method according to claim 2, characterized in that the performing data interaction between the wearable device and the terminal device according to the data interaction flow and the network protocol address comprises:
when the data interaction flow is the wearable-device display flow, sending the network protocol address to a network side, the network protocol address being used to instruct the network side to request, from the terminal device according to the network protocol address, first target data corresponding to the current extraction target;
in response to receiving the first target data fed back by the network side, displaying the first target data at a set position of the wearable device;
when the data interaction flow is the terminal-device display flow, sending second target data corresponding to the current extraction target and the network protocol address to the network side, the network protocol address being used to instruct the network side to send the second target data to the terminal device according to the network protocol address, so that the second target data is displayed on the terminal device.

4. The interaction method according to claim 1, characterized in that the recognizing the user's palm so as to generate, in the wearable device, the virtual hand corresponding to the user's palm comprises:
performing key-point recognition on the user's palm to generate a plurality of palm key points corresponding to the user's palm;
based on the user's palm displayed in the wearable device, connecting the plurality of palm key points to generate the virtual hand.

5. The interaction method according to claim 4, characterized in that the method comprises:
monitoring movement trajectories of the plurality of palm key points of the virtual hand to generate the target gesture of the virtual hand;
when the similarity between the target gesture and the preset interaction gesture reaches a similarity threshold, determining that the target gesture matches the preset interaction gesture.

6. The interaction method according to claim 1, characterized in that the identifying, based on the wearable device, the terminal device within the set spatial range and generating the virtual wireframe and the network protocol address of the terminal device comprises:
identifying, based on the wearable device, the terminal device within the set spatial range to generate appearance features of the terminal device;
generating the virtual wireframe according to the appearance features;
sending the virtual wireframe to a network side, the virtual wireframe being used to instruct the network side to determine the network protocol address of the terminal device according to the virtual wireframe;
receiving the network protocol address fed back by the network side.

7. The interaction method according to claim 1, characterized in that the identifying, based on the wearable device, the terminal device within the set spatial range and generating the network protocol address of the terminal device comprises:
identifying, based on the wearable device, the terminal device within the set spatial range to determine an actual spatial distance between the wearable device and the terminal device;
reporting position information of the wearable device and the actual spatial distance to a network side, the position information and the actual spatial distance being used to instruct the network side to determine the network protocol address of the terminal device according to the position information and the actual spatial distance;
receiving the network protocol address fed back by the network side.

8. A wearable-device-based data interaction apparatus, characterized by comprising:
a first generation module configured to identify, based on the wearable device, a terminal device within a set spatial range, and to generate a virtual wireframe and a network protocol address of the terminal device;
a second generation module configured to, in response to detecting a user's palm within the set spatial range, recognize the user's palm so as to generate, in the wearable device, a virtual hand corresponding to the user's palm;
a determination module configured to determine a data interaction flow between the wearable device and the terminal device when the distance between the virtual hand and the virtual wireframe is less than a distance threshold and a target gesture of the virtual hand matches a preset interaction gesture;
an execution module configured to perform data interaction between the wearable device and the terminal device according to the data interaction flow and the network protocol address.

9. A wearable device, characterized by comprising:
a processor; and
a memory for storing processor-executable instructions;
wherein the processor is configured to implement the steps of the method according to any one of claims 1 to 7 when executing the executable instructions.

10. A computer-readable storage medium on which computer program instructions are stored, characterized in that the program instructions, when executed by a processor, implement the steps of the method according to any one of claims 1 to 7.
CN202211600446.9A 2022-12-13 2022-12-13 Data interaction method, device, equipment and medium based on wearable equipment Pending CN115840506A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211600446.9A CN115840506A (en) 2022-12-13 2022-12-13 Data interaction method, device, equipment and medium based on wearable equipment


Publications (1)

Publication Number Publication Date
CN115840506A true CN115840506A (en) 2023-03-24

Family

ID=85578532




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination