
CN109934930B - A method and system for augmented reality based on precise user location - Google Patents

A method and system for augmented reality based on precise user location Download PDF

Info

Publication number
CN109934930B
CN109934930B
Authority
CN
China
Prior art keywords
user
location
target object
video
terminal device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201711370401.6A
Other languages
Chinese (zh)
Other versions
CN109934930A (en)
Inventor
姜鹏飞 (Jiang Pengfei)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to CN201711370401.6A priority Critical patent/CN109934930B/en
Publication of CN109934930A publication Critical patent/CN109934930A/en
Application granted granted Critical
Publication of CN109934930B publication Critical patent/CN109934930B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Processing Or Creating Images (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

An embodiment of the present invention provides an augmented reality method and system based on the precise location of the user. The precise location of the user is calculated, the AR target object at that precise location is compared with the position of the AR target object in the video picture captured by the user's smart terminal device, and an image consistent with the video picture provided by the user is matched; the user's smart terminal then overlays and renders the virtual information with the video picture it captured to obtain an augmented reality video picture for the user to interact with. Because the comparison and identification are driven by the precise location of the user, such a method and system greatly reduce the workload of comparison and identification, improve the matching success rate, and output a stable augmented reality picture.

Description

Augmented reality method and system based on the precise location of a user
Technical Field
The invention relates to the technical field of image recognition, and in particular to an augmented reality method and system based on the precise location of a user.
Background
Augmented Reality (AR) applies virtual information to the real world through computer technology: the real environment and virtual objects are superimposed on the same picture or in the same space in real time, so that virtual objects and real objects appear to coexist. Augmented reality can also present information that ordinary human perception cannot provide. The display therefore shows real-world information and virtual information at the same time, and the two kinds of information complement and overlay each other.
AR is a new form of human-computer interaction: virtual information is applied to the real world through a smart terminal device and visualization technology, and the virtual information and the real world are superimposed on the same picture or in the same space and presented to the user at the same time. The general workflow of an AR application is as follows: the terminal captures video frames through its camera, recognizes the video frame data and determines the AR target object, tracks the AR target object in the video frames and determines its position, obtains the AR virtual information associated with the target object, renders the video frame with the AR virtual information superimposed on the target object, and displays the AR target object and the AR virtual content together on the terminal screen for the user to interact with. Existing AR application technology suffers from drawbacks such as a large amount of computation, an unstable video enhancement effect, and low detection accuracy.
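As a rough point of reference (not part of the claimed method), the conventional per-frame workflow described above can be sketched as follows. The sketch is in Python, frames and targets are plain dictionaries, and detect_target/render_overlay are hypothetical stand-ins for a real detection and rendering stack.

```python
# Minimal sketch of the conventional AR workflow: detect the target in a frame,
# look up its virtual information, and overlay that information on the frame.
# A real system would use a vision library for detection/tracking and a
# graphics renderer for the overlay.

def detect_target(frame, known_targets):
    # Stand-in for full-frame recognition: report a known target if the frame
    # contains it (the expensive step in a real system).
    for target_id in known_targets:
        if target_id in frame["visible"]:
            return {"id": target_id, "position": frame["visible"][target_id]}
    return None

def render_overlay(frame, target, virtual_info):
    # Stand-in for rendering: attach the virtual content at the target position.
    return {**frame, "overlay": {"at": target["position"], "content": virtual_info}}

def run_ar(frames, known_targets, virtual_info_db):
    augmented = []
    for frame in frames:
        target = detect_target(frame, known_targets)      # recognize and locate the AR target
        if target:
            info = virtual_info_db.get(target["id"], "")
            frame = render_overlay(frame, target, info)   # superimpose the virtual information
        augmented.append(frame)
    return augmented

# Example: two frames, only the first of which contains the target "shop_sign".
frames = [{"visible": {"shop_sign": (120, 80)}}, {"visible": {}}]
print(run_ar(frames, ["shop_sign"], {"shop_sign": "Today's offers..."}))
```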
Existing GPS navigation technology can be combined with the user's position in video frames captured by other nearby electronic terminals to obtain an accurate position of the user; the precise position (including orientation) of the user can then be determined from the video frames captured by the user's own electronic terminal.
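One simple way to realize this idea is to treat the positions at which nearby devices observe the user as additional measurements and fuse them with the terminal's own coarse GPS fix. The weighted-average rule and the field names below are assumptions made only for illustration; the text does not prescribe a particular fusion method.

```python
from dataclasses import dataclass

@dataclass
class Observation:
    """Position of the user as estimated from one nearby device's video frame,
    already projected into world coordinates, with a reliability weight."""
    lat: float
    lon: float
    weight: float = 1.0

def refine_position(gps_lat, gps_lon, observations, gps_weight=0.5):
    # Weighted average of the coarse GPS fix and the positions observed by
    # nearby devices; a larger gps_weight trusts the terminal's own fix more.
    lat_sum, lon_sum, w_sum = gps_weight * gps_lat, gps_weight * gps_lon, gps_weight
    for obs in observations:
        lat_sum += obs.weight * obs.lat
        lon_sum += obs.weight * obs.lon
        w_sum += obs.weight
    return lat_sum / w_sum, lon_sum / w_sum

# A coarse GPS fix refined by two nearby devices' observations of the same user.
print(refine_position(39.91510, 116.41000,
                      [Observation(39.91516, 116.41007),
                       Observation(39.91514, 116.41005)]))
```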
Disclosure of Invention
In order to solve the above problems, the present invention provides an augmented reality method and system based on the precise location of a user.
To achieve the above object, the invention discloses an augmented reality method based on the precise location of a user, comprising the following steps:
1. The user's smart terminal device collects location information and video picture data;
2. According to the location information of the user's smart terminal device, the location information and video picture data of other electronic devices near that location are collected;
3. The accurate location of the user is obtained from the user's position in the video pictures of the other electronic devices;
4. The video picture captured by the user's smart terminal device is compared with pre-captured pictures of that accurate location to obtain the precise location (including orientation) of the user;
5. The AR target object at the precise location is compared with the position of the AR target object in the video picture captured by the user's smart terminal device, and an image consistent with the video picture provided by the user is matched;
6. The user's smart terminal overlays and renders the virtual information corresponding to the matched image with the video picture it captured, obtaining an augmented reality video picture for the user to interact with.
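A minimal sketch of the comparison in step 4 above follows, assuming that pictures of the accurate location have been captured in advance from several known headings. The grayscale-histogram similarity used here is only a placeholder for whatever image-matching method an implementation would actually use, and the reference data are hypothetical.

```python
def histogram(pixels, bins=16):
    # Coarse grayscale histogram (pixel values 0-255) used as a cheap image signature.
    counts = [0] * bins
    for p in pixels:
        counts[min(p * bins // 256, bins - 1)] += 1
    total = float(len(pixels)) or 1.0
    return [c / total for c in counts]

def similarity(hist_a, hist_b):
    # Histogram intersection: 1.0 for identical histograms, 0.0 for disjoint ones.
    return sum(min(a, b) for a, b in zip(hist_a, hist_b))

def estimate_orientation(user_pixels, reference_views):
    """Pick the pre-captured view of the location whose signature best matches
    the terminal's frame; its heading is taken as the user's orientation."""
    user_hist = histogram(user_pixels)
    best_heading, best_score = None, -1.0
    for heading_degrees, ref_pixels in reference_views.items():
        score = similarity(user_hist, histogram(ref_pixels))
        if score > best_score:
            best_heading, best_score = heading_degrees, score
    return best_heading

# Reference views of the location captured in advance, keyed by compass heading.
references = {0: [10, 20, 200, 210], 90: [100, 110, 120, 130], 180: [5, 5, 250, 250]}
print(estimate_orientation([12, 22, 198, 205], references))   # -> 0
```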
Preferably, the user is a person, an animal, a vehicle, or the like.
Preferably, the AR target object at the precise location is a moving object, and the virtual information corresponding to the AR target object moves with the displacement of the AR target object.
Preferably, the AR target object at the precise location is another user, and that user presets their own virtual information for interaction with other users.
Preferably, the virtual information is an image, text, sound, or the like.
Preferably, in the augmented reality method based on the precise location of the user, the other electronic devices near the accurate location are one electronic device or a plurality of electronic devices.
To achieve the above object, the invention further discloses an augmented reality system based on the precise location of a user, which comprises a cloud server, a smart terminal device, and other electronic devices at nearby locations.
The cloud server is used to collect the location information and video picture data of the user's smart terminal device; to collect the location information and video picture data of other electronic devices at nearby locations; to calculate the precise location of the user's smart device; to compare the AR target object at the precise location with the position of the AR target object in the video picture provided by the user and match an image consistent with that video picture; and to send the virtual information corresponding to the matched image to the user's smart terminal.
The smart terminal device is used to collect the user's location information and video picture data, and to overlay and render the virtual information corresponding to the matched image with the video picture it provides, obtaining an augmented reality video picture.
The other electronic devices at nearby locations are used to collect their own location information and video picture data.
Preferably, the smart terminal device is configured to collect the user's facial data, and the user sets corresponding virtual information for their facial data for interaction with other users.
Preferably, the other electronic devices at nearby locations in the augmented reality system based on the precise location of the user are one electronic device or a plurality of electronic devices.
Preferably, the smart terminal device includes a speaker for playing the sound signal in the virtual information.
In the augmented reality method and system based on the precise location of the user described above, the precise location of the user is calculated, the AR target object at that precise location is compared with the position of the AR target object in the video picture captured by the user's smart terminal device, an image consistent with the video picture provided by the user is matched, and the user's smart terminal overlays and renders the virtual information with the video picture it captured to obtain an augmented reality video picture for the user to interact with. Because the comparison and identification are guided by the precise location of the user, the workload of comparison and identification is greatly reduced, the matching success rate is improved, and the augmented reality picture is output stably.
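The reduction in workload comes from restricting the candidate AR targets to those registered near the computed precise position before any image comparison takes place. The sketch below illustrates that filtering step; the target records, the 50 m radius, and the flat-earth distance approximation are illustrative assumptions only.

```python
import math

# Registered AR targets, each tagged with the location at which it was captured.
AR_TARGETS = [
    {"id": "shop_sign",   "lat": 39.91512, "lon": 116.41003},
    {"id": "statue",      "lat": 39.91520, "lon": 116.41030},
    {"id": "museum_door", "lat": 39.92500, "lon": 116.43000},
]

def distance_m(lat1, lon1, lat2, lon2):
    # Small-area approximation: treat 1 degree of latitude as ~111 km and scale
    # longitude by cos(latitude); adequate for a radius of tens of metres.
    dlat = (lat2 - lat1) * 111_000.0
    dlon = (lon2 - lon1) * 111_000.0 * math.cos(math.radians(lat1))
    return math.hypot(dlat, dlon)

def candidates_near(precise_lat, precise_lon, radius_m=50.0):
    """Only targets within radius_m of the precise position are passed on to the
    (expensive) image comparison step, instead of the whole database."""
    return [t for t in AR_TARGETS
            if distance_m(precise_lat, precise_lon, t["lat"], t["lon"]) <= radius_m]

print([t["id"] for t in candidates_near(39.91514, 116.41005)])  # -> ['shop_sign', 'statue']
```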
Drawings
Fig. 1 is a schematic flow chart of an augmented reality method based on the precise location of a user according to an embodiment of the present invention;
Fig. 2 is a schematic structural diagram of an augmented reality system based on the precise location of a user according to an embodiment of the present invention.
Detailed description of the preferred embodiments
The present invention will be described in further detail with reference to the drawings and examples, in order to make the objects, technical solutions and advantages of the present invention more apparent. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the invention.
It should be noted that the following embodiments use two other electronic devices near the precise location. This is merely for simplicity of explanation and does not limit the invention: there may be one or more other electronic devices near the location, and for brevity these variations are not illustrated one by one.
Referring to Fig. 1 and Fig. 2, Fig. 1 is a schematic flow chart of an augmented reality method based on the precise location of a user according to an embodiment of the invention, and Fig. 2 is a schematic structural diagram of an augmented reality system based on the precise location of a user according to an embodiment of the invention. As shown in Fig. 1 and Fig. 2, the method comprises the following steps:
S1. Cloud server A collects the location information provided by GPS module B1 of the user's smart terminal device B, and at the same time collects the video picture data provided by camera module B2 of device B;
S2. Cloud server A collects the location information and video picture data of other electronic devices C and D near that location, and obtains the accurate location of device B from the user's position in the video frames of devices C and D;
S3. Cloud server A compares the acquired video picture of device B with the pictures of that accurate location captured in advance by the system to obtain the precise location (and orientation) of device B;
S4. Cloud server A compares the AR target object at the precise location with the position of the AR target object in the video picture provided by the user, and matches an image consistent with the video picture provided by device B;
S5. Cloud server A sends the virtual information corresponding to the matched image to device B;
S6. Rendering module B3 of the user's smart terminal B overlays and renders the virtual information with the video picture captured by the terminal, obtaining an augmented reality video picture for the user to interact with.
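The data exchanged between terminal B and cloud server A in steps S1 to S6 can be pictured as two simple messages, sketched below. The field names and the stub handler are assumptions for illustration only; the server-side processing of steps S2 to S4 is elided.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class TerminalUpload:
    """S1: what smart terminal B sends to cloud server A."""
    device_id: str
    gps_fix: Tuple[float, float]        # coarse fix from GPS module B1
    frame: bytes                        # encoded frame from camera module B2

@dataclass
class VirtualInfoResponse:
    """S5: what cloud server A returns once an image is matched."""
    target_id: str
    overlay_position: Tuple[int, int]   # where in the frame to anchor the overlay
    content: List[str] = field(default_factory=list)  # text/image/sound references

def handle_upload(upload: TerminalUpload) -> VirtualInfoResponse:
    # S2-S4 (collecting nearby devices' data, refining the position, matching the
    # AR target) are elided; this stub only shows the request/response shape.
    return VirtualInfoResponse(target_id="shop_sign",
                               overlay_position=(120, 80),
                               content=["Today's offers..."])

response = handle_upload(TerminalUpload("terminal-B", (39.91514, 116.41005), b"...jpeg..."))
print(response.target_id, response.overlay_position)
```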
The apparatus embodiments described above are merely illustrative, wherein the modules described as separate components may or may not be physically separate, and some or all of the modules may be selected according to actual needs to achieve the objectives of the embodiment.
The above embodiments are intended only to illustrate the technical solution of the present invention, not to limit it. The technical features of the above embodiments may be combined in any order, and the steps may be implemented in any order; many other variations of the different aspects of the invention exist and are not described in detail for brevity. Although the present invention has been described in detail with reference to the above embodiments, those skilled in the art will understand that they may still modify the technical solutions described in the above embodiments or substitute equivalents for some of their technical features, and such modifications or substitutions do not cause the essence of the corresponding technical solutions to depart from the scope of the technical solutions of the embodiments of the present invention.

Claims (10)

1. An augmented reality method based on a user's precise location, comprising: the user's smart terminal device collects location information and video picture data; according to the location information of the user's smart terminal device, location information and video picture data of other electronic devices near that location are collected; the accurate location of the user is obtained from the user's position in the video pictures of the other electronic devices; the video picture captured by the user's smart terminal device is compared with pre-captured pictures of that accurate location to obtain the precise location and orientation of the user; the AR target object at the precise location and orientation is compared with the position of the AR target object in the video picture captured by the user's smart terminal device, and an image consistent with the video picture provided by the user is matched; the user's smart terminal overlays and renders the virtual information corresponding to the matched image with the video picture it captured to obtain an augmented reality video picture for the user to interact with.

2. The method according to claim 1, wherein the user is a person, an animal, a vehicle, or the like.

3. The method according to claim 1, wherein the AR target object at the precise location is a moving object, and the virtual information corresponding to the AR target object moves with the displacement of the AR target object.

4. The method according to claim 1, wherein the AR target object at the precise location is another user, and that user presets their own virtual information for interaction with other users.

5. The method according to claim 1, wherein the virtual information is an image, text, sound, or the like.

6. The method according to claim 1, wherein the other electronic devices near the location are one electronic device or a plurality of electronic devices.

7. An augmented reality system based on a user's precise location, comprising a cloud server, a smart terminal device, and other electronic devices at nearby locations; the cloud server is configured to collect the location information and video picture data of the user's smart terminal device; to collect the location information and video picture data of the other electronic devices at nearby locations and obtain the accurate location of the user from the user's position in the video pictures of those devices; to calculate the precise location and orientation of the user's smart device by comparing the video picture captured by the user's smart terminal device with pre-captured pictures of that accurate location; to compare the AR target object at the precise location and orientation with the position of the AR target object in the video picture captured by the user's smart terminal device and match an image consistent with the video picture provided by the user; and to send the virtual information corresponding to the matched image to the user's smart terminal; the user's smart terminal device is configured to collect the user's location information and video picture data, and to overlay and render the virtual information with the video picture it captured to obtain an augmented reality video picture; the other electronic devices at nearby locations are configured to collect their own location information and video picture data.

8. The system according to claim 7, wherein the smart terminal device is configured to collect the user's facial data, and the user sets corresponding virtual information for their facial data to interact with other users.

9. The system according to claim 7, wherein the other electronic devices at nearby locations are one electronic device or a plurality of electronic devices.

10. The system according to claim 7, wherein the smart terminal device comprises a speaker for playing the sound signal in the virtual information.
CN201711370401.6A 2017-12-18 2017-12-18 A method and system for augmented reality based on precise user location Active CN109934930B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711370401.6A CN109934930B (en) 2017-12-18 2017-12-18 A method and system for augmented reality based on precise user location

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201711370401.6A CN109934930B (en) 2017-12-18 2017-12-18 A method and system for augmented reality based on precise user location

Publications (2)

Publication Number Publication Date
CN109934930A CN109934930A (en) 2019-06-25
CN109934930B true CN109934930B (en) 2025-01-03

Family

ID=66983059

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711370401.6A Active CN109934930B (en) 2017-12-18 2017-12-18 A method and system for augmented reality based on precise user location

Country Status (1)

Country Link
CN (1) CN109934930B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110335351B (en) * 2019-07-02 2023-03-24 北京百度网讯科技有限公司 Multi-modal AR processing method, device, system, equipment and readable storage medium

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104376118A (en) * 2014-12-03 2015-02-25 北京理工大学 Panorama-based outdoor movement augmented reality method for accurately marking POI
CN104571532A (en) * 2015-02-04 2015-04-29 网易有道信息技术(北京)有限公司 Method and device for realizing augmented reality or virtual reality

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017192467A1 (en) * 2016-05-02 2017-11-09 Warner Bros. Entertainment Inc. Geometry matching in virtual reality and augmented reality
CN106354258B (en) * 2016-08-30 2019-04-05 上海乐相科技有限公司 A kind of picture display process and device of virtual reality device

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104376118A (en) * 2014-12-03 2015-02-25 北京理工大学 Panorama-based outdoor movement augmented reality method for accurately marking POI
CN104571532A (en) * 2015-02-04 2015-04-29 网易有道信息技术(北京)有限公司 Method and device for realizing augmented reality or virtual reality

Also Published As

Publication number Publication date
CN109934930A (en) 2019-06-25

Similar Documents

Publication Publication Date Title
US11393173B2 (en) Mobile augmented reality system
US10089794B2 (en) System and method for defining an augmented reality view in a specific location
CN110866977B (en) Augmented reality processing method and device, system, storage medium and electronic equipment
CN107222468A (en) Augmented reality processing method, terminal, cloud server and edge server
US8917908B2 (en) Distributed object tracking for augmented reality application
US9392248B2 (en) Dynamic POV composite 3D video system
CN110296686B (en) Vision-based positioning method, device and equipment
CN109582122B (en) Augmented reality information providing method and device and electronic equipment
CN104331929A (en) Crime scene reduction method based on video map and augmented reality
CN112348968B (en) Display method and device in augmented reality scene, electronic equipment and storage medium
WO2018080848A1 (en) Curated photogrammetry
CN102903144A (en) Cloud computing based interactive augmented reality system implementation method
CN107084740B (en) Navigation method and device
CN109035415B (en) Virtual model processing method, device, equipment and computer readable storage medium
CN112882576B (en) AR interaction method and device, electronic equipment and storage medium
JP2022507502A (en) Augmented Reality (AR) Imprint Method and System
CN110555876B (en) Method and apparatus for determining position
CN114092670A (en) Virtual reality display method, equipment and storage medium
WO2022166173A1 (en) Video resource processing method and apparatus, and computer device, storage medium and program
CN112905014A (en) Interaction method and device in AR scene, electronic equipment and storage medium
CN112183431A (en) Real-time pedestrian number statistical method and device, camera and server
CN109445598B (en) Augmented reality system device based on vision
CN112637665B (en) Display method and device in augmented reality scene, electronic equipment and storage medium
CN113178017A (en) AR data display method and device, electronic equipment and storage medium
CN103327251B (en) A kind of multimedia photographing process method, device and terminal equipment

Legal Events

Date Code Title Description
PB01 Publication
CB02 Change of applicant information

Address after: No.5, Caichang Hutong, Wangfujing Street, Dongcheng District, Beijing

Applicant after: Jiang Pengfei

Address before: 610041 Unit 2, 3rd Floor, No. 83 Xinnan Road, Wuhou District, Chengdu City, Sichuan Province

Applicant before: Jiang Pengfei

SE01 Entry into force of request for substantive examination
GR01 Patent grant