Search Results (9)

Search Parameters:
Keywords = MAVLink

16 pages, 2745 KiB  
Article
Latency Reduction and Packet Synchronization in Low-Resource Devices Connected by DDS Networks in Autonomous UAVs
by Joao Leonardo Silva Cotta, Daniel Agar, Ivan R. Bertaska, John P. Inness and Hector Gutierrez
Sensors 2023, 23(22), 9269; https://doi.org/10.3390/s23229269 - 18 Nov 2023
Cited by 1 | Viewed by 1739
Abstract
Real-time flight controllers are becoming dependent on general-purpose operating systems, as the modularity and complexity of guidance, navigation, and control systems and algorithms increase. The non-deterministic nature of operating systems creates a critical weakness in the development of motion control systems for robotic platforms due to the random delays introduced by operating systems and communication networks. The high-speed operation and sensitive dynamics of UAVs demand fast and near-deterministic communication between the sensors, companion computer, and flight control unit (FCU) in order to achieve the required performance. In this paper, we present a method to assess communications latency between a companion computer and an RTOS open-source flight controller, which is based on an XRCE-DDS bridge between clients hosted in the low-resource environment and the DDS network used by ROS 2. A comparison based on the measured statistics of latency illustrates the advantages of XRCE-DDS compared to the standard communication method based on MAVROS-MAVLink. More importantly, an algorithm to estimate latency offset and clock skew based on an exponential moving average filter is presented, providing a tool for latency estimation and correction that can be used by developers to improve synchronization of processes that rely on timely communication between the FCU and companion computer, such as synchronization of lower-level sensor data at the higher-level layer. This addresses the challenges introduced in GNC applications by the non-deterministic nature of general-purpose operating systems and the inherent limitations of standard flight controller hardware.
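The latency-correction idea described in the abstract, smoothing instantaneous clock-offset and drift estimates with an exponential moving average, can be illustrated with a short sketch. The code below is a minimal, generic version and not the authors' exact filter; the smoothing constants, update rate, and variable names are assumptions.

```python
# Minimal sketch (not the authors' exact filter): estimating clock offset and
# skew between an FCU and a companion computer from message timestamp pairs,
# smoothed with exponential moving averages.

class ClockSyncEstimator:
    """Tracks offset (s) and skew (s/s) between two clocks with EMA smoothing."""

    def __init__(self, alpha_offset=0.05, alpha_skew=0.01):
        self.alpha_offset = alpha_offset
        self.alpha_skew = alpha_skew
        self.offset = None           # estimate of companion_time - fcu_time
        self.skew = 0.0              # estimate of the drift rate
        self._last_fcu = None
        self._last_raw_offset = None

    def update(self, t_fcu, t_companion):
        raw_offset = t_companion - t_fcu
        if self.offset is None:
            self.offset = raw_offset
        else:
            # EMA over the instantaneous offset samples
            self.offset += self.alpha_offset * (raw_offset - self.offset)
            dt = t_fcu - self._last_fcu
            if dt > 0:
                raw_skew = (raw_offset - self._last_raw_offset) / dt
                self.skew += self.alpha_skew * (raw_skew - self.skew)
        self._last_fcu = t_fcu
        self._last_raw_offset = raw_offset
        return self.offset, self.skew

    def to_companion_time(self, t_fcu):
        """Map an FCU timestamp into the companion computer's time base."""
        return t_fcu + (self.offset or 0.0)


if __name__ == "__main__":
    est = ClockSyncEstimator()
    # Synthetic data: companion clock is 0.8 ms ahead and drifts by 2 ppm
    for i in range(1000):
        t_fcu = i * 0.01
        t_comp = t_fcu + 0.0008 + 2e-6 * t_fcu
        offset, skew = est.update(t_fcu, t_comp)
    print(f"offset ~= {offset * 1e3:.3f} ms, skew ~= {skew * 1e6:.2f} ppm")
```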
Figures

Figure 1: MicroXRCE-DDS architecture [15].
Figure 2: PX4 EKF2 architecture [18].
Figure 3: ROS and FCU communications for latency assessment using MAVROS/MAVLink bridge.
Figure 4: ROS 2 and FCU communications for latency assessment using the MicroXRCE-DDS bridge.
Figure 5: Hardware-in-the-loop setup for end-to-end latency measurements.
Figure 6: Visual odometry message: latency comparison, FCU.
Figure 7: Visual odometry message: comparison of latency probability distribution, FCU.
Figure 8: Visual odometry message: latency when using MicroXRCE-DDS and ROS 2.
Figure 9: IMU message: latency comparison, companion computer.
Figure 10: IMU message: comparison of latency probability distribution, companion computer.
Figure 11: Pixhawk 4 FMU-V5: CPU and RAM load comparison during latency testing.
20 pages, 2092 KiB  
Article
Low-Cost Computer-Vision-Based Embedded Systems for UAVs
by Luis D. Ortega, Erick S. Loyaga, Patricio J. Cruz, Henry P. Lema, Jackeline Abad and Esteban A. Valencia
Robotics 2023, 12(6), 145; https://doi.org/10.3390/robotics12060145 - 27 Oct 2023
Cited by 6 | Viewed by 3147
Abstract
Unmanned Aerial Vehicles (UAVs) are versatile, adapting hardware and software for research. They are vital for remote monitoring, especially in challenging settings such as volcano observation with limited access. In response, economical computer vision systems provide a remedy by processing data, boosting UAV autonomy, and assisting in maneuvering. Through the application of these technologies, researchers can effectively monitor remote areas, thus improving surveillance capabilities. Moreover, flight controllers employ onboard tools to gather data, further enhancing UAV navigation during surveillance tasks. For energy efficiency and comprehensive coverage, this paper introduces a budget-friendly prototype aiding UAV navigation, minimizing effects on endurance. The prototype prioritizes improved maneuvering via the integrated landing and obstacle avoidance system (LOAS). Employing open-source software and MAVLink communication, these systems underwent testing on a Pixhawk-equipped quadcopter. Programmed on a Raspberry Pi onboard computer, the prototype includes a distance sensor and basic camera to meet low computational and weight demands. Tests occurred in controlled environments, with systems performing well in 90% of cases. The Pixhawk and Raspberry Pi documented quad actions during evasive and landing maneuvers. Results prove the prototype's efficacy in refining UAV navigation. Integrating this cost-effective, energy-efficient model holds promise for long-term mission enhancement: cutting costs, expanding terrain coverage, and boosting surveillance capabilities.
(This article belongs to the Special Issue UAV Systems and Swarm Robotics)
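As a rough illustration of the colour-and-edge obstacle detection stage referenced in the figure list below (HSV thresholding, Canny edges, dilation), here is a hedged OpenCV sketch. It is not the authors' implementation; the HSV range, minimum blob area, and camera source are placeholders, and OpenCV 4.x is assumed.

```python
# Illustrative obstacle-detection sketch: threshold in HSV, extract Canny
# edges, dilate them, and keep large contours as obstacle candidates.
# Assumes OpenCV >= 4 (two-value findContours return).

import cv2
import numpy as np

def detect_obstacles(frame_bgr, hsv_lo=(0, 120, 70), hsv_hi=(10, 255, 255),
                     min_area=500):
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array(hsv_lo), np.array(hsv_hi))
    edges = cv2.Canny(mask, 50, 150)
    edges = cv2.dilate(edges, np.ones((5, 5), np.uint8), iterations=1)
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    # Return bounding boxes of sufficiently large blobs
    return [cv2.boundingRect(c) for c in contours
            if cv2.contourArea(c) >= min_area]

if __name__ == "__main__":
    cap = cv2.VideoCapture(0)          # any camera or video file
    ok, frame = cap.read()
    if ok:
        for (x, y, w, h) in detect_obstacles(frame):
            print(f"obstacle candidate at x={x}, y={y}, w={w}, h={h}")
    cap.release()
```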
Figures

Figure 1: Proportional guidance geometry.
Figure 2: Pursuit guidance geometry.
Figure 3: Evasion maneuvers by changing the flight mode.
Figure 4: Connection diagram.
Figure 5: Guided landing algorithm.
Figure 6: Structure of the object detection stage.
Figure 7: (a) Image in the HSV system. (b) Result of the color detector. (c) Result of the Canny edge detector. (d) Edge dilatation. (e) Final result of the obstacle detector.
Figure 8: Flow diagram of tracker operation.
Figure 9: Obstacle avoidance system diagram.
Figure 10: Altitude record without helipad detected (left) and with helipad detected (right).
Figure 11: X vs. Y trajectory without helipad detected (left) and with helipad detected (right).
Figure 12: Vz during last stage of landing.
Figure 13: Obstacle detector result.
Figure 14: (a) Position vs. Time. (b) Position X vs. Position Y.
Figure 15: Flight path obtained from the data in the CSV file.
17 pages, 978 KiB  
Article
LightMAN: A Lightweight Microchained Fabric for Assurance- and Resilience-Oriented Urban Air Mobility Networks
by Ronghua Xu, Sixiao Wei, Yu Chen, Genshe Chen and Khanh Pham
Drones 2022, 6(12), 421; https://doi.org/10.3390/drones6120421 - 16 Dec 2022
Cited by 11 | Viewed by 2236
Abstract
Rapid advancements in the fifth generation (5G) communication technology and mobile edge computing (MEC) paradigm have led to the proliferation of unmanned aerial vehicles (UAV) in urban air mobility (UAM) networks, which provide intelligent services for diversified smart city scenarios. Meanwhile, the widely deployed Internet of drones (IoD) in smart cities has also brought up new concerns regarding performance, security, and privacy. The centralized framework adopted by conventional UAM networks is not adequate to handle high mobility and dynamicity. Moreover, it is necessary to ensure device authentication, data integrity, and privacy preservation in UAM networks. Thanks to its characteristics of decentralization, traceability, and unalterability, blockchain is recognized as a promising technology to enhance security and privacy for UAM networks. In this paper, we introduce LightMAN, a lightweight microchained fabric for data assurance and resilience-oriented UAM networks. LightMAN is tailored for small-scale permissioned UAV networks, in which a microchain acts as a lightweight distributed ledger for security guarantees. Thus, participants are enabled to authenticate drones and verify the genuineness of data that are sent to/from drones without relying on a third-party agency. In addition, a hybrid on-chain and off-chain storage strategy is adopted that not only improves performance (e.g., latency and throughput) but also ensures privacy preservation for sensitive information in UAM networks. A proof-of-concept prototype is implemented and tested on a micro air vehicle link (MAVLink) simulator. The experimental evaluation validates the feasibility and effectiveness of the proposed LightMAN solution.
(This article belongs to the Special Issue Urban Air Mobility (UAM))
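The hybrid on-chain and off-chain storage strategy can be pictured with a purely conceptual sketch: bulky telemetry stays in an off-chain store while only a digest and an authentication tag are anchored on the ledger, so data genuineness can be checked later. This is not the LightMAN/microchain implementation; the in-memory stores, the HMAC-based tag, and the key handling are stand-ins for illustration only.

```python
# Conceptual sketch of hybrid on-chain/off-chain storage with integrity checks.
# All stores, keys, and names here are illustrative placeholders.

import hashlib
import hmac
import json

OFF_CHAIN_STORE = {}      # stand-in for a local/edge data store
LEDGER = []               # stand-in for the distributed ledger

def record_telemetry(drone_id: str, payload: dict, drone_key: bytes) -> str:
    blob = json.dumps(payload, sort_keys=True).encode()
    digest = hashlib.sha256(blob).hexdigest()
    OFF_CHAIN_STORE[digest] = blob                      # raw data stays off-chain
    LEDGER.append({                                     # only the anchor goes on-chain
        "drone_id": drone_id,
        "digest": digest,
        "tag": hmac.new(drone_key, digest.encode(), "sha256").hexdigest(),
    })
    return digest

def verify_telemetry(digest: str, drone_key: bytes) -> bool:
    entry = next((e for e in LEDGER if e["digest"] == digest), None)
    blob = OFF_CHAIN_STORE.get(digest)
    if entry is None or blob is None:
        return False
    untampered = hashlib.sha256(blob).hexdigest() == digest
    authentic = hmac.compare_digest(
        entry["tag"], hmac.new(drone_key, digest.encode(), "sha256").hexdigest())
    return untampered and authentic

if __name__ == "__main__":
    key = b"per-drone-shared-secret"                    # placeholder credential
    d = record_telemetry("uav-01", {"lat": 42.1, "lon": -76.8, "alt": 120.0}, key)
    print("verified:", verify_telemetry(d, key))
```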
Figures

Figure 1: System Architecture of LightMAN.
Figure 2: ML/DL Learning Process for UAM Monitoring.
Figure 3: Software-In-The-Loop Simulation for Data Acquisition.
Figure 4: End-to-end latency of committing CapAC tokens on Microchain: committee size vs. tps.
Figure 5: Processing time of querying CapAC tokens and validating access rights.
Figure 6: Throughput of querying CapAC tokens and validating access rights.
Figure 7: Processing time of data operations: accessing DDS and symmetric encryption.
25 pages, 8946 KiB  
Review
A Tutorial and Review on Flight Control Co-Simulation Using Matlab/Simulink and Flight Simulators
by Nadjim Horri and Mikolaj Pietraszko
Automation 2022, 3(3), 486-510; https://doi.org/10.3390/automation3030025 - 3 Sep 2022
Cited by 17 | Viewed by 13517
Abstract
Flight testing in a realistic three-dimensional virtual environment is increasingly being considered a safe and cost-effective way of evaluating aircraft models and their control systems. The paper starts by reviewing and comparing the most popular personal computer-based flight simulators that have been successfully interfaced to date with the MathWorks software. This co-simulation approach allows combining the strengths of Matlab toolboxes for functions including navigation, control, and sensor modeling with the advanced simulation and scene rendering capabilities of dedicated flight simulation software. This approach can then be used to validate aircraft models, control algorithms, and flight handling characteristics, or to perform model identification from flight data. There is, however, a lack of sufficiently detailed step-by-step flight co-simulation tutorials, and there have also been few attempts to evaluate more than one flight co-simulation approach at a time. We, therefore, demonstrate our own step-by-step co-simulation implementations using Simulink with three different flight simulators: Xplane, FlightGear, and Alphalink's virtual flight test environment (VFTE). All three co-simulations employ a real-time user datagram protocol (UDP) for data communication, and each approach has advantages depending on the aircraft type. In the case of a Cessna-172 general aviation aircraft, a Simulink co-simulation with Xplane demonstrates successful virtual flight tests with accurate simultaneous tracking of altitude and speed reference changes while maintaining roll stability under arbitrary wind conditions that present challenges in the single propeller Cessna. For a medium endurance Rascal-110 unmanned aerial vehicle (UAV), Simulink is interfaced with FlightGear and with QGroundControl using the MAVLink protocol, which allows the lateral UAV path to be followed accurately on a map, and this setup is used to evaluate the validity of Matlab-based six degrees of freedom UAV models. For a smaller ZOHD Nano Talon miniature aerial vehicle (MAV), Simulink is interfaced with the VFTE, which was specifically designed for this MAV, and with QGroundControl for the testing of advanced H-infinity observer-based autopilots using a software-in-the-loop (SIL) simulation to achieve robust low altitude flight under windy conditions. This is then finally extended to hardware-in-the-loop (HIL) implementation on the Nano Talon MAV using a controller area network (CAN) databus and a Pixhawk-4 mini autopilot with simulated sensor models.
(This article belongs to the Special Issue Anniversary Feature Papers-2022)
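The co-simulations in this tutorial exchange data over UDP from Simulink send/receive blocks; the same pattern can be prototyped in a few lines, shown here in Python purely for illustration. The ports and packet layouts below are placeholders, not Xplane's or FlightGear's actual wire formats.

```python
# Minimal UDP exchange sketch mirroring the co-simulation data flow:
# send one control frame to the simulator, wait briefly for a state frame.
# Addresses, ports and the packed layout are illustrative assumptions.

import socket
import struct

SIM_ADDR = ("127.0.0.1", 49000)   # where the flight simulator listens (assumed)
RX_PORT = 49005                   # where we receive simulator state (assumed)

def run_once(throttle: float, elevator: float, aileron: float, rudder: float):
    tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    rx.bind(("0.0.0.0", RX_PORT))
    rx.settimeout(0.1)

    # Send one control frame: four little-endian floats (illustrative layout)
    tx.sendto(struct.pack("<4f", throttle, elevator, aileron, rudder), SIM_ADDR)

    try:
        data, _ = rx.recvfrom(1024)
        if len(data) >= 8:
            # Assume the simulator streams altitude and airspeed as two floats
            alt, tas = struct.unpack_from("<2f", data)
            print(f"altitude={alt:.1f} m, airspeed={tas:.1f} m/s")
    except socket.timeout:
        print("no state packet received (is the simulator streaming?)")
    finally:
        tx.close()
        rx.close()

if __name__ == "__main__":
    run_once(throttle=0.6, elevator=0.0, aileron=0.0, rudder=0.0)
```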
Figures

Figure 1: MAVLink libraries from the Matlab UAV control toolbox.
Figure 2: Simulink to QGroundControl communication via the MAVLink protocol.
Figure 3: UDP send block in Matlab/Simulink.
Figure 4: Xplane inputs and outputs selection for UDP communication.
Figure 5: Simulink speed and altitude autopilots with Xplane interfacing via UDP.
Figure 6: Xplane view selection.
Figure 7: Selection of the initial altitude and speed on Xplane.
Figure 8: Flight co-simulation using Matlab/Simulink and Xplane.
Figure 9: FlightGear command window.
Figure 10: FlightGear interface in Simulink.
Figure 11: Parameter choice for the FlightGear script generator.
Figure 12: Batch file running in the Windows console.
Figure 13: Simulink model for 6DoF FlightGear and QGroundControl co-simulation.
Figure 14: FlightGear and QGroundControl windows during the Simulink-FlightGear-QGroundControl simulation of the Rascal 110 UAV.
Figure 15: A simple UAV support package for PX4 example for initial sensor testing.
Figure 16: Key sensing, actuation and communication libraries of the UAV support package for PX4 from The MathWorks, Inc., Natick, MA, USA.
Figure 17: PID autopilots in the X-configured quadcopter SIL position tracking example by The MathWorks, Inc. [28].
Figure 18: Control channels allocation/mixing for the X-configured quadcopter SIL position tracking example by The MathWorks, Inc. [28].
Figure 19: HIL simulation architecture using a Pixhawk 4 with a Simulink-Unreal Engine co-simulation [29].
Figure 20: Example Nano Talon flight control model for Simulink/VFTE lateral flight co-simulation (developed from Alphalink Engineering GmbH (Berlin, Germany) training resources).
Figure 21: Detail of the observer-based Nano Talon lateral flight controller block of Figure 20.
Figure 22: VFTE window view during the SIL co-simulation using the Simulink lateral observer/controller loop.
Figure 23: HIL architecture using Pixhawk (courtesy of Alphalink Engineering GmbH (Berlin, Germany)).
Figure 24: HIL co-simulation using RC commands to the Simulink/VFTE/QGroundControl environment.
30 pages, 20321 KiB  
Article
Wireless Local Area Network Technologies as Communication Solutions for Unmanned Surface Vehicles
by Andrzej Stateczny, Krzysztof Gierlowski and Michal Hoeft
Sensors 2022, 22(2), 655; https://doi.org/10.3390/s22020655 - 15 Jan 2022
Cited by 11 | Viewed by 4127
Abstract
As the number of research activities and practical deployments of unmanned vehicles has grown rapidly, topics related to their communication with the operator and external infrastructure have become highly important. As a result, a trend of employing IP communication for this purpose is emerging and can be expected to bring significant advantages. However, such communication is likely to be most effective over broadband technologies such as Wireless Local Area Networks (WLANs). To verify the effectiveness of this approach in the specific case of unmanned surface vehicles, the paper includes an overview of the advantages and requirements of IP-based MAVLink communication, followed by a laboratory and field-experiment study of selected WLAN technologies, compared with popular narrowband communication solutions. The conclusions confirm the general applicability of IP/WLAN communication for unmanned surface vehicles, providing an overview of its advantages and pointing out deployment requirements.
(This article belongs to the Section Navigation and Positioning)
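The MAVLink-over-IP arrangement discussed in the paper (compare the protocol stacks in Figure 2 of the list below) boils down to forwarding the autopilot's serial MAVLink byte stream over an IP transport. A hedged sketch of such a serial-to-UDP bridge follows; the device name, ports, and addresses are placeholders, pyserial is assumed, and the MAVLink framing is passed through untouched.

```python
# Sketch of a MAVLink serial<->UDP bridge: bytes from the serial-connected
# autopilot are forwarded to the ground station over UDP, and datagrams coming
# back are written to the serial link. Endpoints below are assumptions.

import socket
import serial   # pyserial

SERIAL_DEV = "/dev/ttyUSB0"         # autopilot telemetry port (assumed)
GCS_ADDR = ("192.168.1.10", 14550)  # ground control station endpoint (assumed)
LOCAL_PORT = 14551

def bridge():
    ser = serial.Serial(SERIAL_DEV, 57600, timeout=0.01)
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", LOCAL_PORT))
    sock.settimeout(0.01)

    while True:
        # Serial -> UDP: anything the autopilot emits goes to the GCS
        data = ser.read(512)
        if data:
            sock.sendto(data, GCS_ADDR)
        # UDP -> serial: GCS commands go back down the serial link
        try:
            pkt, _ = sock.recvfrom(512)
            ser.write(pkt)
        except socket.timeout:
            pass

if __name__ == "__main__":
    bridge()
```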
Figures

Figure 1: General architecture of UV control system.
Figure 2: MAVLink-over-serial and MAVLink-over-IP protocol stacks.
Figure 3: Employment of multi-technology (heterogeneous) access network for UV communication.
Figure 4: A self-organizing, multihop network utilizing UV-based nodes.
Figure 5: Use of Internet for remote UV monitoring, control, data acquisition and maintenance.
Figure 6: HydroDron during measurement tasks in the Port of Gdynia.
Figure 7: HydroDron model and its components.
Figure 8: Architecture of the test system. Connections: orange, USB; green, UART (serial); blue, 1 Gbps Ethernet; dotted, wireless.
Figure 9: Number of MAVLink messages received with a specific RTT value for different TDMA Period Size settings.
Figure 10: Ground station location (marked by red X), its 3 dB antenna sector boundary and HydroDron movement speed (km/h) during the test deployment.
Figure 11: Received signal strength for Wi-Fi, NV2 and RFD devices during HydroDron test deployment.
Figure 12: MAVLink message loss ratio measurements for Wi-Fi, NV2 and RFD devices.
Figure 13: ECDF plot of message loss ratio results.
Figure 14: Histogram of message loss ratio results.
Figure 15: Scatter plot showing correlation between MLR and received signal strength level.
Figure 16: Round trip time measurements for MAVLink message exchange over Wi-Fi, NV2 and RFD links.
Figure 17: ECDF plot of message round trip time results.
Figure 18: Histogram of message round trip time results.
Figure 19: Scatter plot showing correlation between RTT and received signal strength level.
Figure 20: Assessment of the free communication bandwidth available with different transmission technologies.
20 pages, 4447 KiB  
Article
Sensor Information Sharing Using a Producer-Consumer Algorithm on Small Vehicles
by Rodrigo Vazquez-Lopez, Juan Carlos Herrera-Lozada, Jacobo Sandoval-Gutierrez, Philipp von Bülow and Daniel Librado Martinez-Vazquez
Sensors 2021, 21(9), 3022; https://doi.org/10.3390/s21093022 - 25 Apr 2021
Cited by 4 | Viewed by 3462
Abstract
There are several tools, frameworks, and algorithms to solve information sharing among multiple tasks and robots. Applications such as ROS, Kafka, and MAVLink cover most problems when a full operating system is available. However, they cannot be used for particular problems that demand optimization of resources. Therefore, the objective was to design a solution that fits the resources of small vehicles. The methodology consisted of defining the group of vehicles that have low performance or are not compatible with well-known high-level applications; designing a reduced, modular, and compatible architecture; designing a producer-consumer algorithm suited to the simultaneous localization and communication of multiple vehicles with UWB sensors; and validating the operation with an interception task. The results showed that the architecture is feasible for embedded systems and compatible with other applications, and that managing information through the proposed algorithm allowed the interception task between two vehicles to be completed. The system's efficiency was also determined by scaling the memory size and comparing performance. The work's contributions show areas of opportunity to develop architectures focused on optimizing robot resources and to complement existing ones.
(This article belongs to the Special Issue Indoor Positioning and Navigation)
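The producer-consumer mechanism at the core of the proposed architecture can be sketched with the Python standard library alone: a producer publishes position fixes into a bounded buffer and a consumer drains it, blocking when the buffer is full or empty. This is a generic illustration, not the authors' embedded implementation; the buffer size, rates, and synthetic data are assumptions.

```python
# Generic bounded-buffer producer-consumer sketch (standard library only).
# A producer publishes synthetic UWB-style position fixes; a consumer,
# standing in for the guidance/interception loop, drains them.

import queue
import random
import threading
import time

BUFFER_SIZE = 8                      # the paper studies buffer sizes from 0 to 14
buffer = queue.Queue(maxsize=BUFFER_SIZE)

def producer(n_samples: int):
    for _ in range(n_samples):
        fix = {"t": time.monotonic(),
               "x": random.uniform(0, 2.0), "y": random.uniform(0, 2.0)}
        buffer.put(fix)              # blocks when the buffer is full
        time.sleep(0.02)             # ~50 Hz position source
    buffer.put(None)                 # sentinel: no more data

def consumer():
    while True:
        fix = buffer.get()           # blocks when the buffer is empty
        if fix is None:
            break
        # Stand-in for the guidance/interception update
        print(f"consumed fix x={fix['x']:.2f} y={fix['y']:.2f}")

if __name__ == "__main__":
    p = threading.Thread(target=producer, args=(50,))
    c = threading.Thread(target=consumer)
    p.start()
    c.start()
    p.join()
    c.join()
```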
Figures

Figure 1: Comparison of software-oriented architecture (left side) and proposed architecture (right side).
Figure 2: Ground vehicle model.
Figure 3: Nano-quadcopter. (a) Body frame. (b) Inertial frame.
Figure 4: Different configurations of the absolute positioning system mounted within a workspace: (a) only one tag exists: a location algorithm based on "two way ranging" (TWR) is used. (b) Two or more tags are present: a location algorithm based on "time difference of arrival" (TDoA) is used.
Figure 5: Flowchart of the producer-consumer problem.
Figure 6: Diagram of the characterization experiment. (a) In 2D. (b) In 3D.
Figure 7: Flowchart of process A (producer), implemented for the mobile robot EV3, and process B (consumer), implemented for the Crazyflie quadcopter.
Figure 8: Block diagram illustrating the interaction between the components during the execution of the algorithm.
Figure 9: Performance of anchors in rectangular (green) and square (pink) configuration concerning a reference path.
Figure 10: Results of the characterization run for a 2 × 2 m workspace. (a) Trajectory performed by the robot. (b) x-axis component of the movement. (c) y-axis component of the movement. (d) Quadratic error on x and y between the performed path and the given ideal trajectory.
Figure 11: The paths taken by the EV3 (blue line) and the Crazyflie quadcopter robot (red line) compared to the ideal path (black dashed line). (a) The trajectory performed in 3D perspective. (b) Displacement in x. (c) Displacement in y. (d) Displacement in z.
Figure 12: RMS positioning error between the EV3 robot and the Crazyflie quadcopter. (a) x-axis error. (b) y-axis error.
Figure 13: Buffer size vs. (a) consumption data rate and (b) kinetic energy and execution time comparison as a system's performance evaluation.
Figure 14: Different tests made using buffer sizes from 0 to 14.
20 pages, 4551 KiB  
Article
3D Trajectory Planning Method for UAVs Swarm in Building Emergencies
by Ángel Madridano, Abdulla Al-Kaff, David Martín and Arturo de la Escalera
Sensors 2020, 20(3), 642; https://doi.org/10.3390/s20030642 - 23 Jan 2020
Cited by 43 | Viewed by 5196
Abstract
The development in Multi-Robot Systems (MRS) has become one of the most exploited fields of research in robotics in recent years. This is due to the robustness and versatility they present to effectively undertake a set of tasks autonomously. One of the essential elements for several vehicles, in this case, Unmanned Aerial Vehicles (UAVs), to perform tasks autonomously and cooperatively is trajectory planning, which is necessary to guarantee the safe and collision-free movement of the different vehicles. This document includes the planning of multiple trajectories for a swarm of UAVs based on 3D Probabilistic Roadmaps (PRM). This swarm is capable of reaching different locations of interest in different cases (labeled and unlabeled), supporting an Emergency Response Team (ERT) in emergencies in urban environments. In addition, an architecture based on Robot Operating System (ROS) is presented to allow the simulation and integration of the methods developed in a UAV swarm. This architecture allows communication via the MAVLink protocol and control via the Pixhawk autopilot, for a quick and easy implementation in real UAVs. The proposed method was validated by experiments simulating building emergencies. Finally, the obtained results show that methods based on probabilistic roadmaps provide effective solutions in terms of calculation time for scalable systems in different situations, along with their integration into a versatile framework such as ROS.
(This article belongs to the Section Intelligent Sensors)
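For the labeled case, once path costs from every UAV to every goal are available (from the 3D PRM in the paper), the Hungarian method yields the minimum-cost assignment. The sketch below uses SciPy's linear_sum_assignment with straight-line distances standing in for PRM path costs, which is a simplification for illustration; positions are synthetic.

```python
# Task-assignment sketch: Hungarian method over a UAV-to-goal cost matrix.
# Straight-line distance stands in for the PRM path cost used in the paper.

import numpy as np
from scipy.optimize import linear_sum_assignment

uav_positions = np.array([[0.0, 0.0, 2.0],
                          [5.0, 1.0, 2.0],
                          [1.0, 6.0, 2.0]])
goal_positions = np.array([[10.0, 10.0, 15.0],
                           [2.0, 9.0, 20.0],
                           [8.0, 3.0, 12.0]])

# cost[i, j] = (stand-in) path cost for UAV i to reach goal j
cost = np.linalg.norm(uav_positions[:, None, :] - goal_positions[None, :, :],
                      axis=-1)

rows, cols = linear_sum_assignment(cost)
for i, j in zip(rows, cols):
    print(f"UAV {i} -> goal {j} (cost {cost[i, j]:.1f} m)")
print(f"total cost: {cost[rows, cols].sum():.1f} m")
```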
Figures

Graphical abstract
Figure 1: 3D model of the building for simulation.
Figure 2: Obtaining 3D occupancy diagram.
Figure 3: 3D Probabilistic Roadmaps.
Figure 4: 3D PRM output.
Figure 5: Solution for multiple tasks and multiple vehicles.
Figure 6: Combination of trajectories to be optimized with the Hungarian Method.
Figure 7: Control architecture diagram.
Figure 8: Simulation environment.
Figure 9: Results of the computation time according to the number of nodes.
Figure 10: Percentage of times a solution is found as opposed to the number of nodes employed.
Figure 11: Computation time depending on the number of tasks and agents involved in the mission.
Figure 12: Total travelled distance vs. number of tasks.
Figure 13: Total travelled distance vs. number of nodes.
Figure 14: Set of paths for a swarm of 10 vehicles.
Figure 15: Visualization of the complete mission in RVIZ.
2 pages, 341 KiB  
Proceeding Paper
UAV Trajectory Management: Ardupilot Based Trajectory Management System
by Javier Losada Pita and Félix Orjales Saavedra
Proceedings 2019, 21(1), 8; https://doi.org/10.3390/proceedings2019021008 - 23 Jul 2019
Cited by 1 | Viewed by 2302
Abstract
In this paper we explain the structure and development of a trajectory management system on board a UAV that is capable of achieving complex trajectories and versatile enough to adapt to disturbances during flight. The system is built in Python and runs on a companion computer on board the UAV while maintaining communication with a ground station over a radio link.
(This article belongs to the Proceedings of The 2nd XoveTIC Conference (XoveTIC 2019))
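A companion-computer script of the kind described here typically streams setpoints to the autopilot over MAVLink. The following pymavlink sketch sends position targets along a sinusoidal path; it is not the authors' system, and the connection string, coordinate frame, and the assumption of an already-armed GUIDED-mode vehicle (e.g. ArduPilot SITL) are placeholders.

```python
# Hedged sketch: stream sinusoidal position setpoints to an ArduPilot-style
# flight controller over MAVLink. Endpoint and vehicle state are assumptions.

import math
import time
from pymavlink import mavutil

master = mavutil.mavlink_connection("udpin:0.0.0.0:14551")  # assumed endpoint
master.wait_heartbeat()
print(f"connected to system {master.target_system}")

POSITION_ONLY = 0b110111111000   # type_mask: use x/y/z, ignore vel/accel/yaw

t0 = time.time()
while time.time() - t0 < 60.0:
    t = time.time() - t0
    x = 2.0 * t                   # advance north at 2 m/s
    y = 10.0 * math.sin(0.2 * t)  # sinusoidal east offset
    z = -20.0                     # NED frame: 20 m above home
    master.mav.set_position_target_local_ned_send(
        int(t * 1000), master.target_system, master.target_component,
        mavutil.mavlink.MAV_FRAME_LOCAL_NED, POSITION_ONLY,
        x, y, z, 0, 0, 0, 0, 0, 0, 0, 0)
    time.sleep(0.1)               # 10 Hz setpoint stream
```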
Figures

Figure 1: System configuration.
Figure 2: Sinusoidal trajectory performed by the simulated UAV.
12 pages, 2082 KiB  
Article
A Practical Deployment of a Communication Infrastructure to Support the Employment of Multiple Surveillance Drones Systems
by Maik Basso, Iulisloi Zacarias, Carlos Eduardo Tussi Leite, Haijun Wang and Edison Pignaton de Freitas
Drones 2018, 2(3), 26; https://doi.org/10.3390/drones2030026 - 13 Aug 2018
Cited by 15 | Viewed by 6253
Abstract
In many incidents involving amateur drones (ADr), the big challenge is to quickly deploy a surveillance system that counters the threat and keeps track of the intruders. Depending on the area of concern, launching a single surveillance drone (SDr) to hunt the intruder is not efficient, but employing multiple ones can cope with the problem. However, to make this approach feasible, an easy-to-use mission setup and control station for multiple SDrs is required, which, in turn, requires a communication infrastructure able to handle connections among multiple SDrs and between them and their ground control and payload visualization station. Concerning this issue, this paper presents a proposal for a network infrastructure to support the operation of multiple SDrs, along with its practical deployment. This infrastructure extends the existing Micro Air Vehicle Link (MAVLink) protocol to support multiple connections among the SDrs and between them and a ground control station. Encouraging results are obtained, showing the viability of the proposed protocol extension.
(This article belongs to the Special Issue Advances in Drone Communications, State-of-the-Art and Architectures)
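Independent of the specific protocol extension proposed in the paper, handling several SDrs on one ground station ultimately means demultiplexing MAVLink traffic by source system ID. The sketch below shows that pattern with pymavlink on a shared UDP downlink; the connection string and message selection are assumptions, not the paper's implementation.

```python
# Demultiplex MAVLink messages from several drones sharing one link by their
# source system ID, keeping per-drone statistics and the last known position.

from collections import defaultdict
from pymavlink import mavutil

conn = mavutil.mavlink_connection("udpin:0.0.0.0:14550")  # shared downlink (assumed)

last_position = {}                 # per-drone latest position
msg_counts = defaultdict(int)      # per-drone message statistics

while True:
    msg = conn.recv_match(blocking=True, timeout=5)
    if msg is None:
        break                      # nothing heard for 5 s
    sysid = msg.get_srcSystem()    # which SDr sent this message
    msg_counts[sysid] += 1
    if msg.get_type() == "GLOBAL_POSITION_INT":
        last_position[sysid] = (msg.lat / 1e7, msg.lon / 1e7, msg.alt / 1e3)

for sysid, count in sorted(msg_counts.items()):
    print(f"drone {sysid}: {count} messages, last position {last_position.get(sysid)}")
```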
Figures

Figure 1: Ground control station architecture: (a) overall MVC design; (b) detailed interaction of the architectural components; (c) control and network interfaces.
Figure 2: Practical network implementation: (a) protocol stack; (b) network topology.
Figure 3: Experimental setup: the ground control station running on the Android tablet; two instances of drones running in SITL on the notebook; and the XBee modules used as communication devices.
Figure 4: Schematic representation of the performed tests.
Figure 5: Screenshots of the ground control station: (a) controlling one drone, highlighting the recognition options menu on the right-hand side; (b) controlling two drones, highlighting the waypoint selection menus; (c) controlling three drones, highlighting the waypoint sequences at the bottom of each window; (d) controlling four drones, highlighting the telemetric information on the left-hand side of each window and the control menus at the bottom.