CN113155117B - Navigation system, method and device - Google Patents
- Publication number
- CN113155117B (application CN202010077083.XA)
- Authority
- CN
- China
- Prior art keywords
- navigation
- guidance
- small aircraft
- guided object
- visual guidance
- Prior art date
- Legal status
- Active
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64C—AEROPLANES; HELICOPTERS
- B64C39/00—Aircraft not otherwise provided for
- B64C39/02—Aircraft not otherwise provided for characterised by special use
- B64C39/024—Aircraft not otherwise provided for characterised by special use of the remote controlled vehicle type, i.e. RPV
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U2101/00—UAVs specially adapted for particular uses or applications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U2201/00—UAVs characterised by their flight controls
- B64U2201/20—Remote controls
Abstract
A navigation system, method and apparatus are disclosed. The system comprises: a navigation device for providing mobile navigation; a visual guidance device disposed on the navigation device for providing visual guidance; and a control device for controlling the mobile navigation of the navigation device and the guidance content of the visual guidance device. By combining a navigation device such as a small aircraft with a visual guidance device arranged on it, the invention can provide intuitive visual guidance to users in a physical space.
Description
Technical Field
The present disclosure relates to the field of navigation, and in particular, to a navigation system, method, and apparatus.
Background
In complex environments such as malls, hospitals, and exhibition venues, people often expend considerable effort to reach an intended location. Fig. 1 shows an example of existing navigation. For example, a user who needs to buy trousers of a certain brand must look up the location of that brand's store, and of the trouser sales area within it, for example on a terminal running the store's App or a navigation App with a detailed map of the store. The App then generates a route based on the user's current location and the location of the trouser sales area (shown by the marked point in store 1750 in the figure), and the user must reach the target area by continually consulting the route display on the terminal screen and following the direction of the stereoscopic arrow.
This approach requires the user to perform queries on a terminal; holding the terminal and repeatedly checking its screen for navigation is neither convenient nor direct, and errors are easy to make when the path is complex.
In view of this, a more convenient and intuitive navigation solution is needed.
Disclosure of Invention
One technical problem to be solved by the present disclosure is to provide an improved navigation solution which, by combining a navigation device with a visual guidance means provided on it, can give a user intuitive visual guidance in a physical space. The navigation device may be a mobile navigation device such as an aircraft, which can adjust its flight trajectory and state based on the user's current movement state, while the visual guidance means on it gives appropriate guidance in real time, thereby providing the user with convenient and intuitive navigation, in particular indoor navigation.
According to a first aspect of the present disclosure, there is provided a navigation system comprising: a navigation device for providing mobile navigation; a visual guidance device disposed on the navigation device for providing visual guidance; and a control device for controlling the mobile navigation of the navigation device and the guidance content of the visual guidance device.
Optionally, the navigation device is a device that provides navigation through its own movement, for example a small aircraft. The control device is configured to: generate a guidance instruction; and transmit the guidance instruction to the small aircraft. The small aircraft is configured to fly based on the guidance instruction so that the visual guidance device can visually guide the guided object.
Optionally, the visual guiding device is configured to: providing corresponding visual guidance content based on the guidance instructions, the current flight status of the small aircraft, and/or the current location information of the guided object.
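As an illustrative sketch (not part of the claimed subject matter), deriving the visual guidance content from the guidance instruction and the guided object's current position might look as follows; all names, coordinates, and thresholds here are assumptions:

```python
import math
from dataclasses import dataclass

@dataclass
class GuidanceInstruction:
    destination_name: str    # e.g. "Zone D"
    destination_xy: tuple    # destination coordinates in metres

def guidance_content(instr, guided_xy, arrive_radius=3.0):
    """Return the text the visual guidance device should display.

    The distance threshold and message strings are illustrative
    assumptions, not taken from the patent text.
    """
    dx = instr.destination_xy[0] - guided_xy[0]
    dy = instr.destination_xy[1] - guided_xy[1]
    dist = math.hypot(dx, dy)
    if dist <= arrive_radius:
        return f"Arrived: {instr.destination_name}"
    bearing = math.degrees(math.atan2(dy, dx))
    return f"{instr.destination_name}: {dist:.0f} m, bearing {bearing:.0f} deg"
```

In this sketch the displayed text changes as the guided object moves, mirroring the behaviour described above where content depends on both the instruction and the current position.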
According to a second aspect of the present disclosure, there is provided a navigation method comprising: generating a guide instruction; the navigation device performs mobile navigation based on the guiding instruction, so that a visual guiding device arranged on the navigation device performs visual guiding on the guided object.
According to a third aspect of the present disclosure, there is provided a navigation method comprising: a navigation terminal receives a guidance instruction input by a guided object and generates a call command; an idle small aircraft flies to a preset guidance starting position based on the call command and locates the guided object; a flight path and current state of the small aircraft are determined based on the guidance instruction and the current position of the guided object; and a visual guidance device provided on the small aircraft visually guides the guided object based on the current positions of the small aircraft and the guided object.
According to a fourth aspect of the present disclosure, there is provided a navigation apparatus comprising: an input unit for receiving a guidance instruction input by a guided object; a processing unit for generating a call command based on the guidance instruction; and a sending unit for sending the call command to an idle mobile navigation device, so that the idle mobile navigation device moves to a preset guidance starting position and locates the guided object based on the call command, and so that the mobile navigation device and the visual guidance device arranged on it visually guide the guided object according to the guidance instruction and the current positions of the mobile navigation device and the guided object.
According to a fifth aspect of the present disclosure, there is provided a computing device comprising: a processor; and a memory having executable code stored thereon which, when executed by the processor, causes the processor to perform the method as described in the second aspect above.
According to a sixth aspect of the present disclosure there is provided a non-transitory machine-readable storage medium having stored thereon executable code which, when executed by a processor of an electronic device, causes the processor to perform the method as described in the second aspect above.
Therefore, by combining a navigation device with visual guidance, the navigation scheme of the invention can provide intuitive and convenient visual guidance to the guided object. The scheme can also be combined with the intelligent control center of a building or other venue to provide users with timely and effective guidance.
Drawings
The foregoing and other objects, features and advantages of the disclosure will be apparent from the following more particular descriptions of exemplary embodiments of the disclosure as illustrated in the accompanying drawings wherein like reference numbers generally represent like parts throughout exemplary embodiments of the disclosure.
Fig. 1 shows an example of existing navigation.
Fig. 2 shows a schematic composition of a navigation system according to an embodiment of the invention.
Fig. 3 shows an example of navigation based on the present invention.
Fig. 4 shows an example of a navigation system according to the invention.
Fig. 5 shows a flow diagram of a navigation method according to an embodiment of the invention.
FIG. 6 illustrates a schematic diagram of a computing device that may be used to implement the navigation method described above, according to one embodiment of the invention.
Detailed Description
Preferred embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While the preferred embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
The invention discloses an improved navigation scheme which, by combining a navigation device with a visual guidance apparatus arranged on it, can provide intuitive visual guidance to a user in a physical space. The navigation device may be a mobile navigation device such as a small aircraft, which can adjust its flight track and state based on the user's current movement state, while the visual guidance apparatus on it gives appropriate guidance in real time, thereby providing the user with convenient and intuitive navigation, in particular indoor navigation.
Fig. 2 shows a schematic composition of a navigation system according to an embodiment of the invention. As shown, the system may include a navigation device (e.g., a small aircraft as shown in the figures) 210, a visual guidance device 220, and a control device 230.
The figure shows a small aircraft as one example of the navigation device of the present invention. It should be understood that the navigation system of the invention may also use other devices as navigation devices. In the present invention, the navigation device is a device for providing mobile navigation. Such mobile navigation may be navigation in which the movement of the device itself carries the visual guidance device along, or navigation in which the device itself does not move and only the visual guidance output moves. In various embodiments, the navigation device may be a mobile navigation device, i.e., a device that visually navigates the user through its own physical movement — the small aircraft in this example, but also a ground- or wall-moving device. Such a ground- or wall-moving device may likewise carry a visual guidance device and, based on its own determination or under the control of the control device, perform route planning and guidance.
In other embodiments, the navigation device may also be a device that does not itself move but can emit a moving visual guide. In one embodiment, the navigation device may be a device that provides navigation through the movement of its projection. For example, a wall-mounted projection device may, under control of the control device, adjust its projection onto a wall or floor to provide moving visual guidance. Such a device may also be combined with a camera, likewise arranged on the wall for example, to confirm the user's current position and give visual guidance at the corresponding projection position with the corresponding projection content.
The small aircraft 210 may be a quadrotor miniature unmanned aerial vehicle (drone) as shown, capable of flying or hovering. The visual guidance device 220 is provided on the small aircraft to provide visual guidance. The control device 230 may then be used to control the flight of the small aircraft 210 and the guidance content of the visual guidance device 220. In various embodiments, the small aircraft 210 and the visual guidance device 220 may be controlled jointly or separately by one or more control devices 230. In a physical space, the navigation system can thus provide intuitive and convenient guidance to guided objects (e.g., attendees at a convention center) through the drone and the visual guidance device mounted on it.
In particular, the control device 230 may be configured to generate and send guidance instructions to the small aircraft. Accordingly, the small aircraft can fly based on the guiding instruction so that the visual guiding device visually guides the guided object.
In one embodiment, the control device 230 may include a remote controller of the small aircraft (e.g., a drone): a worker or the guided object may input a destination into the remote controller, so that the small aircraft generates a flight path by itself according to the destination and guides the guided object. In other embodiments, the control device 230 may include a control center and/or a control terminal connected to the control center. Similarly, a worker or the guided object may enter a destination into the control center or terminal to trigger the corresponding guidance by the small aircraft.
More specifically, the control device 230 may receive a query from the guided object and generate a guidance instruction containing the location of the guidance destination based on the query result. Typically, the guided object knows the name of the destination it needs to reach but not its specific location. The worker or the guided object may input the destination name to the control center or terminal (e.g., via voice interaction with an intelligent voice device); the control center or terminal then queries the specific location corresponding to that name in the background and sends a guidance instruction containing that location to the small aircraft. For example, a user may enter a certain menswear brand at a control terminal in a mall; the background queries the brand's location, and a drone equipped in the mall, together with the visual guidance device on it, guides the user to that location.
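The background query described above — resolving a destination name to a specific location and wrapping it in a guidance instruction — can be sketched as follows; the directory contents and record format are invented for illustration:

```python
# Hypothetical facility directory held by the control center's backend;
# names and coordinates are assumptions, not from the patent.
FACILITY_DIRECTORY = {
    "Brand X Menswear": (120.0, 45.0),
    "Zone D": (80.0, 10.0),
}

def make_guidance_instruction(destination_name):
    """Resolve a destination name to a guidance instruction, as the
    control center might do after receiving the guided object's query."""
    location = FACILITY_DIRECTORY.get(destination_name)
    if location is None:
        raise KeyError(f"unknown destination: {destination_name}")
    return {"destination": destination_name, "location": location}
```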
When the user's query matches multiple results, the results may be presented to the user, and guidance proceeds based on the user's selection. In other cases, the nearest or least busy destination may be selected automatically for the user. For example, when a user needs to find a restroom in a large exhibition center, the control center can combine the user's current position to find a relatively close and least busy restroom, and guide the user there using the drone and the visual guidance device mounted on it.
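The automatic choice of a nearest, least busy destination can be sketched as a simple scoring rule; the weighting and record format below are assumptions, not specified by the patent:

```python
import math

def pick_destination(candidates, user_xy, busy_weight=50.0):
    """Choose among multiple matching destinations (e.g. restrooms) by
    combining distance with an occupancy penalty.

    Each candidate: {"name": str, "xy": (x, y), "occupancy": 0..1}.
    The busy_weight value is an illustrative assumption.
    """
    def score(c):
        d = math.hypot(c["xy"][0] - user_xy[0], c["xy"][1] - user_xy[1])
        return d + busy_weight * c["occupancy"]
    return min(candidates, key=score)
```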
In different scenarios, the visual guide device 220 may be provided on the small aircraft 210 in different modalities and integration levels. In one embodiment, the visual guide device 220 may be a relatively standalone device, for example, may be mounted or removably loaded on the small aircraft 210. At this time, the visual guidance device 220 may have a relatively independent display and control means, which may be in wired or wireless communication with the small aircraft 210 or the control device 230 to receive the visual guidance content command. In other embodiments, the visual guide device 220 may be part of the small aircraft 210, for example, implemented as an on-board projector of the small aircraft 210. At this time, the visual guide device 220 may share display and control means within the fuselage with the small aircraft 210.
In some embodiments, the visual guidance device 220 is a device that guides with optical content. In the simplest implementation, the visual guidance device 220 may indicate the positional relationship between the guided object and the guidance destination using, for example, light of different colors or flashing frequencies: green light when the guided object is on the correct path, flashing red light on a wrong path, blue light when approaching or reaching the destination, and so on. In other embodiments, the visual guidance device 220 may guide using a display screen, such as a flexible plastic panel with embedded LEDs or a rollable OLED screen. The visual guidance device 220 may also guide using projection, e.g., a projector. In still other embodiments, the visual guidance device 220 may use mechanical means for visual cues, such as a physical arrow pointing toward the location; the present invention is not limited in this respect.
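The color scheme just described maps directly onto a small state function; the distance threshold here is an assumed value:

```python
def light_state(on_path, dist_to_destination, near_radius=5.0):
    """Map the guided object's situation to a (color, mode) pair,
    following the color scheme described above. The near_radius
    threshold is an assumption for illustration.
    """
    if dist_to_destination <= near_radius:
        return ("blue", "steady")    # approaching or at destination
    if on_path:
        return ("green", "steady")   # on the correct path
    return ("red", "flashing")       # strayed onto a wrong path
```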
While the visual guidance device may visually guide by simply displaying an arrow, in a preferred embodiment, the visual guidance device 220 may provide corresponding visual guidance content based on the guidance instructions and the current flight status of the small aircraft. Still further, the visual guidance device 220 may also adjust its visual guidance content based on the current location information of the guided object (where a positioning mechanism for the guided object is required, as will be described in more detail below).
For example, when the destination the user queried is in zone D, the visual guidance device 220 may display "Zone D" together with the current direction arrow toward zone D at the start of, and throughout, the guidance. During the flight of the small aircraft 210, a corresponding indication may be generated based on the relative position of the small aircraft 210 and the destination. Further, the visual guidance device 220 may adjust its displayed content based on the user's position relative to the small aircraft 210. For example, the drone 210 may carry a projection device 220 that projects an image onto the ground to direct the navigated person; when a right turn is required, a right-turn icon is displayed on the ground.
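Choosing which icon to project (e.g., the right-turn icon) from the user's heading and the direction toward the next waypoint can be sketched as follows; the compass convention (clockwise degrees) and the 30-degree threshold are assumptions:

```python
def turn_icon(user_heading_deg, bearing_to_next_deg, threshold=30.0):
    """Choose which arrow to project on the ground, from the user's
    heading and the bearing to the next waypoint (both in compass
    degrees, clockwise from north -- an assumed convention)."""
    # normalise the relative angle into [-180, 180)
    rel = (bearing_to_next_deg - user_heading_deg + 180.0) % 360.0 - 180.0
    if rel > threshold:
        return "turn_right"
    if rel < -threshold:
        return "turn_left"
    return "straight"
```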
In order to accurately guide the guided object, the small aircraft 210 also typically needs to know the current position of the guided object (e.g., the user) during the guiding process. To this end, the small aircraft 210 may comprise positioning means for positioning the guided object and adjust the current flight status based on the current position information of the guided object.
In different embodiments of the invention, the small aircraft 210 may obtain the guided object's position information through various mechanisms. In one embodiment, the positioning device may be a target tracking device that actively tracks the guided object, e.g., a camera mounted on the drone. The camera may, for example, have a depth-sensing function and obtain the guided object's current position by active shooting, so that the drone always flies a certain distance ahead of the guided object, where the navigation image is easy to see. In another embodiment, the positioning device may be a positioning-information receiving device that passively acquires the guided object's current position. For example, a smartphone carried by the guided object may be paired with the drone, so that the positioning device can determine the current position from signals emitted by the smartphone. Alternatively, a camera connected to the control center (e.g., a venue monitoring camera) may acquire the guided object's current position and send it to the positioning device in real time. Thus, the small aircraft 210 can adjust its current flight state based on the guided object's current position so that its visual guidance content always remains at an easily visible position in the user's direction of travel.
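Keeping the drone a fixed distance ahead of the guided object, where its guidance is easy to see, reduces to computing a target hover position from the user's position and heading; the lead distance and altitude below are assumed values:

```python
import math

def target_hover_position(user_xy, user_heading_deg, lead=2.5, altitude=2.2):
    """Compute where the drone should fly so that it stays a fixed
    distance ahead of the guided object. Lead distance and altitude are
    illustrative assumptions; heading is in degrees, measured
    counter-clockwise from the +x axis.
    """
    rad = math.radians(user_heading_deg)
    return (user_xy[0] + lead * math.cos(rad),
            user_xy[1] + lead * math.sin(rad),
            altitude)
```

Recomputing this target each time a new user position arrives gives the "always a certain distance in front" behaviour described above.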
In a specific implementation, the navigation system is equipped for a specific venue — for example an outdoor music festival or exhibition, an indoor mall or exhibition center, or a shopping complex with both indoor and outdoor spaces. The control device 230 may therefore obtain the specific location of each facility in the venue in advance to facilitate user queries. The small aircraft 210 may then be used to: plan a flight route according to the spatial layout of the venue; and modify the flight route based on the venue's current conditions.
In addition, the system may include devices that supply power to the small aircraft 210 and/or the visual guidance device, such as a dedicated charging dock placed at a particular location (e.g., on an indoor ceiling). The small aircraft 210 may be preconfigured with the dock's location so that it can return to the dock for charging when idle. Depending on its independence, the visual guidance device 220 mounted on it may be charged separately (e.g., via an independent charging interface) or together with the aircraft.
As previously mentioned, the navigation system of the invention may be of different scale in different implementations. In a minimal embodiment, the control device may comprise a control unit built into the small aircraft and/or the visual guidance device, or be a remote controller. One navigation system of the invention may thus be, for example, a single drone equipped with a projection device. Fig. 3 shows an example of navigation based on the present invention. As shown on the left side of the figure, the navigation system is deployed at the entrance of a venue (e.g., an exhibition hall) and is shown as a drone fitted with an LED touch display screen. The display screen may prompt the user to enter where they want to go. The user may enter the destination, for example by selecting the display of zone A 152 on the touch screen. The drone can then guide the guided object based on that address. For example, as shown on the right side of Fig. 3, an arrow indicates that the guided object should turn right at a fork.
In a more general embodiment, the navigation system of the invention may be of larger scale, in which case the system may include a plurality of small aircraft. The control device may be a separate control device in wireless communication with the small aircraft and/or the visual guidance devices — e.g., the control center of a large indoor venue, possibly implemented in the cloud. The control device may then be configured to plan each small aircraft's hovering and/or flight route based on the guidance space information and on messages concerning the other small aircraft.
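Dispatching one of several aircraft to a new guidance request — as such a control center might do — can be sketched as picking the nearest idle aircraft; the fleet record format is invented for illustration:

```python
import math

def call_idle_aircraft(fleet, start_xy):
    """Pick the nearest idle aircraft to serve a new guidance request.

    Each aircraft record (an assumed format):
    {"id": str, "state": "idle"|"guiding"|"charging", "xy": (x, y)}
    """
    idle = [a for a in fleet if a["state"] == "idle"]
    if not idle:
        return None  # no aircraft free; the request must wait or be queued
    return min(idle, key=lambda a: math.hypot(a["xy"][0] - start_xy[0],
                                              a["xy"][1] - start_xy[1]))
```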
Fig. 4 shows an example of a navigation system according to the invention. As shown on the left side of the figure, the navigation system may be equipped for a certain building (e.g., an exhibition hall) visited by large numbers of people. In order to guide a plurality of guided objects at the same time, the navigation system is equipped with a plurality of drones 410, as shown on the right side of the figure, each loaded with a projector for visual guidance (i.e., serving as the visual guidance device of Fig. 2). In order to manage the plurality of drones 410, the control device 420 is implemented as a control center 421 and a plurality of control terminals 422. A control terminal 422 may receive user input and obtain the corresponding destination location via the control center 421 or a local query. The control center 421 may be part of the building's data center and may in some embodiments be implemented in the cloud. The data center stores the position information of each facility in the building; the control center 421 queries it according to the input and generates a corresponding guidance instruction.
After entering the building (e.g., indoors), people may interact with a control terminal 422 of the navigation system, e.g., by voice, as shown on the right side of the figure. In one embodiment, the control terminal 422 may be implemented as a smart speaker, or another form with intelligent voice conversation capability, such as a smart robot. The user may say to a control terminal 422, "I want to see the exhibits of the XX age"; the control terminal 422 may analyze the semantics of the input locally or via the control center 421, and the control center 421 may query the data center based on the semantics to determine the guidance route and call a currently idle drone 410 to visually guide the user (hereinafter, the "guided object").
The drone 410 may receive the guidance starting position (the guided object's current location) and the destination position from the control center 421 and generate a guidance route by itself, or receive a route generated by the control center 421. The drone 410 may then fly to the guidance starting position to provide guidance service for the guided object. The drone 410 may have a camera that can lock onto the guided object, e.g., by face recognition, and, as shown in the lower part of the figure, continuously determine its current speed and altitude based on the guided object's current position (or relative distance) acquired by the camera (or by another relative-position mechanism, e.g., obtained by the control center 421 from the building monitoring system). Accordingly, the projector arranged on the drone can change its guidance content — e.g., an arrow pattern projected onto the ground — according to the current positions of the drone and the guided object, giving the guided object the most intuitive guidance. During flight, the drone 410 can handle emergencies automatically, for example avoiding obstacles in the flight path based on camera images or a built-in radar, or giving a visual warning when it determines that the guided object has taken a wrong path. The drone 410 can keep communicating with the control center 421 at guidance start and during guidance, and the control center 421 can monitor the current conditions of sites in the building in real time and update the guidance destination or route based on those conditions (e.g., in an emergency).
In addition to visual guidance, the navigation scheme of the invention may be combined with voice interaction. For example, the control device and/or the small aircraft may interact with the guided object by voice during guidance. The guidance process may include a guidance-start stage, a guidance stage, and a guidance-end stage. In the guidance-start stage, the guided object may interact with the control device to determine its destination, as shown on the right side of Fig. 4; the control device (e.g., control terminal 422 implemented as an intelligent voice interaction device) may interact with the guided object directly by voice, with the control center 421 responsible for recognizing the user's intent and returning the corresponding speech. During guidance, the small aircraft or the visual guidance device may use a voice function to guide the user, or even converse with the user. In a simple implementation, the small aircraft and/or the visual guidance device may play voice guidance content directly to the user, requiring only a speaker (or, e.g., a Bluetooth headset worn by the guided object). In a more complex implementation, the small aircraft and/or the visual guidance device also has voice interaction capability, so that the user can input a new destination instruction by voice while being guided. In other embodiments, the guided object may wear an interactive terminal connected, for example, to the control center 421, which can then coordinate the small aircraft, the visual guidance device mounted on it, and the interactive terminal simultaneously. The guided object may receive voice prompts (e.g., turn left, turn right, 50 meters ahead) or give new voice instructions at any time through the interactive terminal (e.g., speaking into its microphone, listening through its earphone). At the end of guidance, the small aircraft or the visual guidance device may inform the guided object that the guidance process has ended, after which the small aircraft may fly away.
In addition, the navigation solution of the present invention may further comprise a mechanism that allows the guided object to interact with the control device or the small aircraft during navigation. For example, the guided object may no longer need guidance if it has already identified the location of its intended destination before being guided all the way there. In that case, the guided object may operate the small aircraft itself, a control device (e.g., a control terminal or a wearable interactive terminal), or the visual guidance device to interrupt subsequent navigation. For example, the guided object may make a negative gesture that is easy to recognize visually (e.g., crossing the forearms into an X shape in front of the chest), or perform a voice or key (or touch screen) operation.
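The interrupt mechanism amounts to checking observed events from the three channels (gesture, voice, key/touch) against a set of recognized cancel signals. A minimal sketch, with all event identifiers assumed for illustration:

```python
# Illustrative cancel signals for the three interrupt channels
# (gesture, voice, key/touch); the identifiers are assumptions.
CANCEL_SIGNALS = {"gesture:x_arms", "voice:stop", "key:cancel"}

def should_interrupt(observed_events):
    """Return True if any observed event is a recognized cancel signal."""
    return any(event in CANCEL_SIGNALS for event in observed_events)
```

A guidance loop would call this on each perception cycle and end navigation as soon as it returns True.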
The navigation scheme of the invention can also be implemented as a navigation method. Fig. 5 shows a flow diagram of a navigation method according to an embodiment of the invention. In different application scenarios, the method may be implemented by a small aircraft integrated with the visual guidance device, by an independent control device, by a control center, or by a control terminal.
In step S510, a guidance instruction is generated. In step S520, the navigation apparatus performs mobile navigation based on the guidance instruction, so that the visual guidance apparatus provided on the navigation apparatus visually guides the guided object.
As previously described, the navigation device may be a device that provides navigation through its own movement, and includes at least one of: a small aircraft for flying or hovering; and a ground moving device for moving or remaining stationary on the ground. The navigation device may also be a device that provides navigation through the movement of its projection.
In a scenario where a small aircraft is used as the navigation device, step S520 may include: the small aircraft flies based on the guiding instruction, so that a visual guiding device provided on the small aircraft visually guides the guided object.
In one embodiment, the method may further comprise: based on the guidance instructions and/or the current flight status of the small aircraft, the visual guidance device provides corresponding visual guidance content.
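As a sketch of how visual guidance content might be derived from the current flight state, the function below maps a heading change and remaining distance to display text. The ±30° turn threshold and all names are illustrative assumptions, not values from the patent.

```python
def guidance_content(heading_change_deg, distance_m):
    """Derive display content from the aircraft's current maneuver.

    The +/-30 degree turn threshold is an assumed value for illustration.
    """
    if heading_change_deg < -30:
        direction = "turn left"
    elif heading_change_deg > 30:
        direction = "turn right"
    else:
        direction = "straight ahead"
    return f"{direction}, {distance_m} m"
```

The returned string would then be rendered on the display screen or projected, per the embodiment below.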
In one embodiment, the method may further comprise: locating the position of the guided object; and adjusting the current flight state of the small aircraft based on the current position information of the guided object.
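One plausible way to adjust the flight state from the guided object's position is to pick a speed from the aircraft-to-person gap: hover when the person falls far behind, slow down when the gap grows, otherwise cruise. This is a sketch under assumed gap and speed values, not the patent's control law:

```python
import math

def adjust_speed(aircraft_pos, person_pos, cruise_mps=1.5, follow_gap_m=5.0):
    """Pick a flight speed from the gap to the guided person (assumed values)."""
    gap = math.dist(aircraft_pos, person_pos)  # Euclidean distance
    if gap > 2 * follow_gap_m:
        return 0.0                # hover and wait
    if gap > follow_gap_m:
        return cruise_mps / 2     # let the person catch up
    return cruise_mps             # normal guidance speed
```

A real controller would also smooth speed changes and account for obstacles; this only illustrates the feedback from localization to flight state.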
In an embodiment implemented in an indoor scene, the method may further include: planning a flight route of the small aircraft according to the indoor space distribution; and modifying the flight route based on current indoor conditions. In the standby state, the small aircraft can automatically dock at a dedicated charging dock arranged on the ceiling.
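Route planning over a known indoor layout can be sketched as a shortest-path search on an occupancy grid; modifying the route for current conditions is then just a re-run with the changed cells marked blocked. This breadth-first sketch is illustrative, not the planner the patent describes:

```python
from collections import deque

def plan_route(grid, start, goal):
    """Breadth-first route over an indoor occupancy grid (0 free, 1 blocked).

    Returns a list of (row, col) cells from start to goal, or None.
    Re-planning around a new obstacle is a re-run with updated cells.
    """
    queue, seen = deque([(start, [start])]), {start}
    while queue:
        (r, c), path = queue.popleft()
        if (r, c) == goal:
            return path
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < len(grid) and 0 <= nc < len(grid[0])
                    and grid[nr][nc] == 0 and (nr, nc) not in seen):
                seen.add((nr, nc))
                queue.append(((nr, nc), path + [(nr, nc)]))
    return None  # no route available
```

A production planner would likely use A* with a distance heuristic and plan in 3D, but the replan-on-change pattern is the same.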
In one embodiment, the method may further comprise: the visual guidance device performs visual guidance by using a display screen and/or projection.
During the guidance of the small aircraft, voice may also be used to interact with the guided object. In various implementations, the control device, the visual guidance device, and/or the small aircraft may all serve as an interactive terminal for voice or other types of interaction (e.g., gesture interaction) with the user.
In one specific scenario, the navigation method of the present invention may include: the navigation terminal receives a guidance instruction input by the guided object and generates a calling command; an idle small aircraft flies to a preset guidance start position based on the calling command and locates the guided object; a flight route and current state of the small aircraft are determined based on the guidance instruction and the current position of the guided object; and a visual guidance device provided on the small aircraft visually guides the guided object based on the current positions of the small aircraft and the guided object.
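The dispatch step of that scenario, sending the calling command to an idle aircraft, can be sketched as follows. All class, state, and method names are assumptions for illustration:

```python
class SmallAircraft:
    """Minimal stand-in for an aircraft in the call-and-guide flow above."""
    def __init__(self):
        self.state = "idle"
        self.position = None

    def handle_call(self, guidance_start_position):
        # fly to the preset guidance start position, then begin guiding
        self.position = guidance_start_position
        self.state = "guiding"

def dispatch(fleet, guidance_start_position):
    """Send the calling command to the first idle aircraft, if any."""
    for craft in fleet:
        if craft.state == "idle":
            craft.handle_call(guidance_start_position)
            return craft
    return None  # no idle aircraft available
```

A fuller scheduler might pick the nearest idle aircraft rather than the first, but the idle-check-then-call shape is the core of the flow.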
In one specific scenario, a navigation device of the present invention comprises: an input unit for receiving a guidance instruction input by the guided object; a processing unit for generating a calling command based on the guidance instruction; and a sending unit for sending the calling command to an idle small aircraft, so that the idle small aircraft flies to a preset guidance start position based on the calling command and locates the guided object, and the small aircraft and a visual guidance device provided on it visually guide the guided object according to the guidance instruction and the current positions of the small aircraft and the guided object. The apparatus may be implemented, for example, as the control terminal 422 in Fig. 4.
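The three units (input, processing, sending) map naturally onto a small class. This sketch assumes a dictionary wire format for the calling command, which is not specified by the patent:

```python
class ControlTerminal:
    """Sketch of the input/processing/sending units (wire format assumed)."""
    def __init__(self, send):
        self.send = send  # sending unit: any callable that transmits a command

    def receive_instruction(self, destination):
        # input unit: guidance instruction from the guided object
        command = {"type": "call", "destination": destination}  # processing unit
        self.send(command)  # sending unit: forward to an idle aircraft
        return command
```

Injecting the transport as a callable keeps the terminal testable and independent of whether commands travel over Wi-Fi, Bluetooth, or an in-process bus.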
Fig. 6 illustrates a schematic diagram of a computing device that may be used to implement the navigation method described above, according to an embodiment of the invention.
Referring to Fig. 6, a computing device 600 includes a memory 610 and a processor 620.
Processor 620 may be a multi-core processor or may include multiple processors. In some embodiments, processor 620 may include a general-purpose host processor and one or more special coprocessors, such as a graphics processing unit (GPU), a digital signal processor (DSP), and the like. In some embodiments, processor 620 may be implemented using custom circuitry, for example an application-specific integrated circuit (ASIC) or a field-programmable gate array (FPGA).
Memory 610 may include various types of storage units, such as system memory, read-only memory (ROM), and persistent storage. The ROM may store static data or instructions required by processor 620 or other modules of the computer. The persistent storage may be a readable and writable storage device, i.e., a non-volatile device that does not lose stored instructions and data even after the computer is powered down. In some embodiments, a mass storage device (e.g., a magnetic or optical disk, or flash memory) is employed as the persistent storage. In other embodiments, the persistent storage may be a removable storage device (e.g., a diskette or an optical drive). The system memory may be a read-write memory device or a volatile read-write memory device, such as dynamic random access memory. The system memory may store instructions and data needed by some or all of the processors at runtime. Furthermore, memory 610 may include any combination of computer-readable storage media, including various types of semiconductor memory chips (DRAM, SRAM, SDRAM, flash memory, programmable read-only memory), magnetic disks, and/or optical disks. In some implementations, memory 610 may include readable and/or writable removable storage devices, such as compact discs (CDs), digital versatile discs (e.g., DVD-ROMs, dual-layer DVD-ROMs), read-only Blu-ray discs, ultra-density discs, flash memory cards (e.g., SD cards, mini SD cards, micro-SD cards, etc.), magnetic floppy disks, and the like. The computer-readable storage medium does not contain carrier waves or transient electronic signals transmitted wirelessly or over wires.
The memory 610 has stored thereon executable code that, when processed by the processor 620, causes the processor 620 to perform the navigation method described above.
The navigation system, method, and apparatus according to the present invention have been described in detail above with reference to the accompanying drawings. By combining an unmanned aerial vehicle with visual guidance, the navigation scheme can provide intuitive and convenient visual guidance for the guided object. The scheme can also be combined with the functions of the intelligent control center of a building or other venue to provide timely and effective guidance for users.
Furthermore, the method according to the invention may also be implemented as a computer program or computer program product comprising computer program code instructions for performing the steps defined in the above-mentioned method of the invention.
Alternatively, the invention may also be embodied as a non-transitory machine-readable storage medium (or computer-readable storage medium, or machine-readable storage medium) having stored thereon executable code (or a computer program, or computer instruction code) that, when executed by a processor of an electronic device (or computing device, server, etc.), causes the processor to perform the steps of the above-described method according to the invention.
Those of skill would further appreciate that the various illustrative logical blocks, modules, circuits, and algorithm steps described in connection with the disclosure herein may be implemented as electronic hardware, computer software, or combinations of both.
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems and methods according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The foregoing description of embodiments of the invention has been presented for purposes of illustration and description, and is not intended to be exhaustive or limited to the embodiments disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the various embodiments described. The terminology used herein was chosen in order to best explain the principles of the embodiments, the practical application, or the improvement of technology in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein.
Claims (31)
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202010077083.XA CN113155117B (en) | 2020-01-23 | 2020-01-23 | Navigation system, method and device |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| CN113155117A CN113155117A (en) | 2021-07-23 |
| CN113155117B true CN113155117B (en) | 2024-11-26 |
Family
ID=76882110
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN202010077083.XA Active CN113155117B (en) | 2020-01-23 | 2020-01-23 | Navigation system, method and device |
Country Status (1)
| Country | Link |
|---|---|
| CN (1) | CN113155117B (en) |
Families Citing this family (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN113962760B (en) * | 2021-09-23 | 2022-04-01 | 深圳市朗格邦定电子有限公司 | Intelligent shopping method and system based on indicator light and readable storage medium |
| JP7534065B2 (en) * | 2022-09-12 | 2024-08-14 | 三菱ロジスネクスト株式会社 | Guidance System |
Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN104035111A (en) * | 2014-07-04 | 2014-09-10 | 重庆大学 | Indoor offline path guide method and system based on GPS |
| WO2018230539A1 (en) * | 2017-06-16 | 2018-12-20 | 本田技研工業株式会社 | Guide system |
| CN109891195A (en) * | 2016-10-26 | 2019-06-14 | 谷歌有限责任公司 | System and method for using visual landmarks in initial navigation |
Family Cites Families (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN101458083B (en) * | 2007-12-14 | 2011-06-29 | 财团法人工业技术研究院 | Structured light vision navigation system and method |
| WO2009122356A1 (en) * | 2008-04-03 | 2009-10-08 | Philips Intellectual Property & Standards Gmbh | Method of guiding a user from an initial position to a destination in a public area |
| DE102014219567A1 (en) * | 2013-09-30 | 2015-04-02 | Honda Motor Co., Ltd. | THREE-DIMENSIONAL (3-D) NAVIGATION |
| WO2016065623A1 (en) * | 2014-10-31 | 2016-05-06 | SZ DJI Technology Co., Ltd. | Systems and methods for surveillance with visual marker |
| US9992465B1 (en) * | 2017-02-08 | 2018-06-05 | Hyundai Motor Company | Vehicular navigation system utilizing a projection device |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US12188777B2 (en) | Spatial audio navigation | |
| US11644832B2 (en) | User interaction paradigms for a flying digital assistant | |
| CN108521788B (en) | Method for generating simulated flight path, method and equipment for simulating flight and storage medium | |
| CN115951598B (en) | Virtual-real combination simulation method, device and system for multiple unmanned aerial vehicles | |
| US10900790B2 (en) | Interior building navigation system | |
| US20180073889A1 (en) | Methods and systems for electronic device interactions | |
| CN107436608A (en) | Control device for unmanned plane and the system for guide | |
| KR20190106866A (en) | Robot and method of providing guidance service by the robot | |
| US10642263B2 (en) | UAV, control system and intermediary device for UAV | |
| CN113155117B (en) | Navigation system, method and device | |
| JP6607624B2 (en) | MOBILE PROJECTION SYSTEM AND MOBILE PROJECTOR DEVICE | |
| JP7091880B2 (en) | Information processing equipment, mobiles, remote control systems, information processing methods and programs | |
| WO2018065857A1 (en) | Systems and methods for determining predicted risk for a flight path of an unmanned aerial vehicle | |
| WO2023159591A1 (en) | Intelligent explanation system and method for exhibition scene | |
| JP6456770B2 (en) | Mobile projection system and mobile projection method | |
| JP2016208255A (en) | Movable projection device | |
| US20180376075A1 (en) | Method and system for recording video data using at least one remote-controllable camera system which can be aligned with objects | |
| JP2019161652A (en) | Mobile projection apparatus and projection system | |
| TWI750821B (en) | Navigation method, system, equipment and medium based on optical communication device | |
| US20240118703A1 (en) | Display apparatus, communication system, display control method, and recording medium | |
| TWI695966B (en) | Indoor positioning and navigation system for mobile communication device | |
| KR102852109B1 (en) | Method and apparatus for generating customized region within the space with which service element is associated using tool for generating customized region | |
| Jeny et al. | EINS_AR: Enhanced Indoor Navigation System using Augmented Reality with Dynamic Path Adjustement | |
| WO2021019879A1 (en) | Information processing device, information processing method, program, and information processing system | |
| US12299213B2 (en) | Activating a handheld device with universal pointing and interacting device |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| PB01 | Publication | | |
| SE01 | Entry into force of request for substantive examination | | |
| GR01 | Patent grant | | |