Search Results (476)

Search Parameters:
Keywords = aerial robotics

18 pages, 14420 KiB  
Article
Semantic Segmentation-Driven Integration of Point Clouds from Mobile Scanning Platforms in Urban Environments
by Joanna Koszyk, Aleksandra Jasińska, Karolina Pargieła, Anna Malczewska, Kornelia Grzelka, Agnieszka Bieda and Łukasz Ambroziński
Remote Sens. 2024, 16(18), 3434; https://doi.org/10.3390/rs16183434 - 16 Sep 2024
Viewed by 302
Abstract
Precise and complete 3D representations of architectural structures or industrial sites are essential for various applications, including structural monitoring or cadastre. However, acquiring these datasets can be time-consuming, particularly for large objects. Mobile scanning systems offer a solution for such cases. In the case of complex scenes, multiple scanning systems are required to obtain point clouds that can be merged into a comprehensive representation of the object. Merging individual point clouds obtained from different sensors or at different times can be difficult due to discrepancies caused by moving objects or changes in the scene over time, such as seasonal variations in vegetation. In this study, we present the integration of point clouds obtained from two mobile scanning platforms within a built-up area. We utilized a combination of a quadruped robot and an unmanned aerial vehicle (UAV). The PointNet++ network was employed to conduct a semantic segmentation task, enabling the detection of non-ground objects. The experimental tests used the Toronto 3D dataset and DALES for network training. Based on its performance, the model trained on DALES was chosen for further research. The proposed integration algorithm involved semantic segmentation of both point clouds, dividing them into square subregions, and selecting subregions by checking whether each was empty or both contained points. Parameters such as local density, centroids, coverage, and Euclidean distance were evaluated. Point cloud merging and augmentation, enhanced with semantic segmentation and clustering, excluded points associated with movable objects from the point clouds. A comparative analysis of the method against simple merging was performed based on file size, number of points, mean roughness, and noise estimation. The proposed method provided adequate results, improving the point cloud quality indicators.
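The subregion-selection step lends itself to a compact illustration. Below is a minimal sketch in Python, assuming both clouds are already co-registered numpy arrays; the grid cell size and the keep-the-denser-subregion rule are illustrative stand-ins for the paper's density, centroid, coverage, and Euclidean-distance tests.

```python
import numpy as np

def split_into_subregions(points, cell=2.0):
    """Bucket a co-registered point cloud (N x 3) into square XY subregions."""
    keys = np.floor(points[:, :2] / cell).astype(int)
    buckets = {}
    for key, pt in zip(map(tuple, keys), points):
        buckets.setdefault(key, []).append(pt)
    return {k: np.asarray(v) for k, v in buckets.items()}

def merge_clouds(cloud_a, cloud_b, cell=2.0):
    """For each subregion: take whichever cloud has points; if both do,
    keep the denser one (a stand-in for the paper's fuller criteria)."""
    a, b = split_into_subregions(cloud_a, cell), split_into_subregions(cloud_b, cell)
    merged = []
    for key in set(a) | set(b):
        pa, pb = a.get(key), b.get(key)
        if pa is None:
            merged.append(pb)
        elif pb is None:
            merged.append(pa)
        else:
            merged.append(pa if len(pa) >= len(pb) else pb)
    return np.vstack(merged)
```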
Figures:
Figure 1. Area of investigation (red box). Coordinates refer to WGS84 (EPSG:4326). Background image: Google Earth, earth.google.com/web/.
Figure 2. Leica BLK ARC laser scanner (a); Boston Dynamics Spot equipped with the Leica BLK ARC (b).
Figure 3. DJI Matrice 350 RTK equipped with the DJI Zenmuse L1.
Figure 4. Comparison of PointNet++ performance. UAV data classified with models trained on (a) DALES and (b) Toronto 3D; mobile robot data classified with models trained on (c) DALES and (d) Toronto 3D. Different colors represent labels assigned to points.
Figure 5. Semantic segmentation: (a) UAV point cloud; (b) mobile platform point cloud. Different colors represent labels assigned to points.
Figure 6. Ground classification after binarization: (a) UAV point cloud; (b) mobile platform point cloud. Blue represents the ground label and orange the non-ground label.
Figure 7. Diagram of the research workflow.
Figure 8. Integrated point cloud.
Figure 9. Comparison between scans obtained from different devices and the point cloud created with the proposed algorithm. Ceilings: (a) UAV, (b) quadruped robot, and (c) integrated point cloud. Building fronts: (d) UAV, (e) quadruped robot, and (f) integrated point cloud.
Figure 10. Comparison between scans obtained from different devices and the point cloud created with the proposed algorithm. Cars: (a) UAV, (b) quadruped robot, and (c) integrated point cloud.
Figure 11. Comparison between scans obtained from different devices and the point cloud created with the proposed algorithm. Cars: (a) UAV, (b) quadruped robot, and (c) integrated point cloud. Trees: (d) UAV, (e) quadruped robot, and (f) integrated point cloud.
Figure 12. Semantic segmentation of the integrated point cloud (a) with 8 classes and (b) binarized.
Figure 13. Point cloud without points carrying the ground label.
Figure 14. Point cloud with ground removed after clustering with DBSCAN. Each cluster is indicated with a different color; small elements such as small trees are grouped into separate clusters.
Figure 15. Final point cloud (a) before outlier removal and (b) after outlier removal.
18 pages, 16152 KiB  
Article
Characterization of Wing Kinematics by Decoupling Joint Movement in the Pigeon
by Yishi Shen, Shi Zhang, Weimin Huang, Chengrui Shang, Tao Sun and Qing Shi
Biomimetics 2024, 9(9), 555; https://doi.org/10.3390/biomimetics9090555 - 15 Sep 2024
Viewed by 255
Abstract
Birds have remarkable flight capabilities due to their adaptive wing morphology. However, studying live birds is time-consuming and laborious, and obtaining information about the complete wingbeat cycle is difficult. To address this issue and provide a complete dataset, we recorded comprehensive motion capture wing trajectory data from five free-flying pigeons (Columba livia). Five key motion parameters are used to quantitatively characterize wing kinematics: flapping, sweeping, twisting, folding and bending. In addition, the forelimb skeleton is mapped using an open-chain three-bar mechanism model. By systematically evaluating the relationship of joint degrees of freedom (DOFs), we configured the model as a 3-DOF shoulder, 1-DOF elbow and 2-DOF wrist. Based on the correlation analysis between wingbeat kinematics and joint movement, we found that the strongly correlated shoulder and wrist roll within the stroke plane cause wing flap and bending. There is also a strong correlation between shoulder, elbow and wrist yaw out of the stroke plane, which causes wing sweep and fold. By simplifying the wing morphing, we developed three flapping wing robots, each with different DOFs inside and outside the stroke plane. This study provides insight into the design of flapping wing robots capable of mimicking the 3D wing motion of pigeons.
(This article belongs to the Special Issue Biologically Inspired Design and Control of Robots: Second Edition)
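The correlation analysis at the heart of the study reduces to computing Pearson's ρ between pairs of joint-angle traces. A minimal sketch, using synthetic sinusoidal traces as stand-ins for the pigeons' motion-capture data:

```python
import numpy as np

# Synthetic stand-ins for one wingbeat of joint-angle traces
# (the paper uses real motion-capture data from five pigeons).
t = np.linspace(0.0, 1.0, 200)                 # normalized wingbeat cycle
shoulder_roll = 40.0 * np.sin(2 * np.pi * t)   # degrees
wrist_roll = 35.0 * np.sin(2 * np.pi * t) + np.random.normal(0, 1.0, t.size)

# Pearson correlation rho between two in-stroke-plane joint angles.
rho = np.corrcoef(shoulder_roll, wrist_roll)[0, 1]
print(f"shoulder-wrist roll correlation: rho = {rho:.3f}")
```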
Figures:
Figure 1. Schematic view of the flight arena. (a) Overview of the measurement arena: 16 m × 5 m × 3 m, with 30 motion capture cameras evenly distributed on the roof and three GoPro cameras placed around the area to assist capture. (b) The four flight modes of pigeons during the flight experiments; only data for the continuous flapping phase are analyzed in this paper. (c) Locations and names of the markers on the pigeons.
Figure 2. μCT results of forelimb skeleton 3D reconstruction for five pigeons. (a) Overall view of μCT results for pigeon IDs 2096, 2205, 5018, and 2417, indicating the humerus, radius, ulna, and carpometacarpus. (b) μCT result for pigeon ID 4036, with markers pasted on the elbow, wrist, and carpometacarpus.
Figure 3. Definitions of the coordinate systems during flight. (a) Three Euler angles describe the orientation of the pigeon's body in the world coordinate system: elevation (Θ), heading (Ψ), and bank angle (Φ). The horizontal plane is shown in grey. (b) Recorded anatomical points on the wing (see Figure 1c) were used to define multiple planes. (c) Representation of the five angles in the arm-wing and hand-wing coordinate systems.
Figure 4. Definitions of the wing kinematics during continuous flapping. (a) The stroke plane corresponds to a linear regression plane of the x and z of the wrist joint relative to the shoulder. (b) The flap angle lies between the wing plane and the x_s–y_s plane; the sweep angle lies between the leading edge and the stroke plane. (c) The twist angle is the rotation of the wing chord about the transverse y_s axis. (d) The fold angle is the hand-wing plane rotation about the z_h axis; the bend angle is the hand-wing plane rotation about the x_h axis. (e) Schematic definition of the wing angle of attack.
Figure 5. Schematic diagram of the mapping process using the proposed hierarchical global optimization algorithm for computing joint angles. The framework consists of two layers: the upper layer (red box) builds an open-chain three-bar mechanism characterizing the pigeon forelimb skeleton, and the lower layer (blue box) concerns flight data acquisition and forward kinematics iteration. (a) The DOF of each joint angle is determined. (b) The OKC model in world coordinates. (c) The offsets of the marker points on each joint with respect to the OKC model. (d) The optimization process fits the corrected OKC model pose to the captured pose and outputs the joint angles. (e) Capture data visualization and pre-processing in a motion capture system. (f) Marker placement on the pigeon.
Figure 6. Averaged wing kinematics of pigeon ID 4036 in a normalized wingbeat cycle during continuous flapping. The solid line represents the mean traces, the shaded area indicates ±1 s.d. (n = 24), and the dashed line is the curve fitted to the Fourier series. Red marks the wrist and blue the ninth primary; white and grey backgrounds represent upstroke and downstroke, respectively. (a–e) Flap angle (φ), sweep angle (ψ), twist angle (θ), in-plane bend angle (Δφ), and out-of-plane fold angle (Δψ) in a normalized wingbeat cycle, respectively. (f) Pigeon body velocities: the solid black line shows the total velocity, the dashed blue line the x-direction, and the dashed red line the z-direction. (g) Angle of attack for the arm wing and hand wing.
Figure 7. Joint movements and joint error of pigeon ID 4036 during continuous flapping. (a) Joint angles within one flapping cycle for the 3-1-2 joint DOF configuration: shoulder yaw, shoulder roll, shoulder pitch, elbow yaw, wrist yaw, and wrist roll. Color bands show each angle's maximum and minimum values, and colored solid lines indicate the averages. (b) Schematic representations of the magnitude and direction of the change in each joint angle. (c) Optimized errors for the shoulder, wrist, and carpometacarpus relative to the collected data.
Figure 8. Correlation analysis between wing kinematics and joint movements during continuous flapping (sample size N = 5). (a) Correlation between each joint movement and wing kinematics; red indicates a highly positive correlation and blue a highly negative correlation. (b) The specific ρ between the two joint movements and two wing kinematics within the stroke plane. (c) The specific ρ between the three joint movements and two wing kinematics out of the stroke plane. (d) Correlation between wrist roll and shoulder roll, with arrows indicating the trend from the beginning of the downstroke to the end of the upstroke; in the upstroke, ρ(shoulder–wrist) = 0.989. For yaw, the shoulder–elbow correlation is 0.999 in the downstroke and 0.988 in the upstroke; the shoulder–wrist correlation is 0.984 in the downstroke and 0.992 in the upstroke; and the elbow–wrist correlation is 0.995 in the downstroke and 0.949 in the upstroke.
Figure 9. Pigeon-inspired robots with four different motions. (a) Flapping-motion robot with only one wing DOF. (b) Bending-motion robot: the inner and outer wings follow different trajectories, both within the stroke plane; the bend joint changes depending on the state of motion. (c) Folding-motion robot: the folding of the outer wings is driven by a servo at the tail. (d) Twisting-motion robot: twisting out of the stroke plane is achieved by an additional spatial 4-bar linkage that changes the AOA of the wing.
20 pages, 6616 KiB  
Article
Comprehensive Task Optimization Architecture for Urban UAV-Based Intelligent Transportation System
by Marco Rinaldi and Stefano Primatesta
Drones 2024, 8(9), 473; https://doi.org/10.3390/drones8090473 - 10 Sep 2024
Viewed by 439
Abstract
This paper tackles the problem of resource sharing and dynamic task assignment in a task scheduling architecture designed to enable a persistent, safe, and energy-efficient Intelligent Transportation System (ITS) based on multi-rotor Unmanned Aerial Vehicles (UAVs). The addressed task allocation problem consists of heterogeneous pick-up and delivery tasks with time deadline constraints to be allocated to a heterogeneous fleet of UAVs in an urban operational area. The proposed architecture is distributed among the UAVs and inspired by market-based allocation algorithms. By exploiting a multi-auctioneer behavior for allocating both delivery tasks and re-charge tasks, the fleet of UAVs is able to (i) self-balance the utilization of each drone, (ii) assign dynamic tasks with high priority within each round of the allocation process, (iii) minimize the estimated energy consumption related to the completion of the task set, and (iv) minimize the impact of re-charge tasks on the delivery process. A risk-aware path planner sampling a 2D risk map of the operational area is included in the allocation architecture to demonstrate the feasibility of deployment in urban environments. Thanks to the message exchange redundancy, the proposed multi-auctioneer architecture features improved robustness with respect to lossy communication scenarios. Simulation results based on Monte Carlo campaigns corroborate the validity of the approach.
(This article belongs to the Special Issue Unmanned Traffic Management Systems)
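The market-based allocation idea can be sketched compactly. The round structure below is a simplified, centralized single-auction version with a toy distance-based energy bid; the paper's architecture is distributed, multi-auctioneer, and additionally handles re-charge tasks and message redundancy.

```python
def auction_round(tasks, uavs, bid_fn):
    """One synchronous allocation round: every unassigned UAV bids its
    estimated cost for every open task, and each task (high priority
    first) goes to the cheapest bidder."""
    assignment = {}
    for task in sorted(tasks, key=lambda t: t["priority"], reverse=True):
        bids = {u["id"]: bid_fn(u, task)
                for u in uavs if u["id"] not in assignment.values()}
        if bids:
            assignment[task["id"]] = min(bids, key=bids.get)
    return assignment

def energy_bid(uav, task):
    """Toy bid: Euclidean distance scaled by a per-meter energy factor."""
    (x0, y0), (x1, y1) = uav["pos"], task["pos"]
    return ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5 * uav["energy_per_m"]

uavs = [{"id": "uav1", "pos": (0, 0), "energy_per_m": 1.0},
        {"id": "uav2", "pos": (5, 5), "energy_per_m": 1.2}]
tasks = [{"id": "delivery1", "pos": (4, 4), "priority": 2},
         {"id": "delivery2", "pos": (1, 0), "priority": 1}]
print(auction_round(tasks, uavs, energy_bid))
# {'delivery1': 'uav2', 'delivery2': 'uav1'}
```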
Figures:
Figure 1. Schematic representation of the standard four phases of an auction-based allocation round in a centralized multi-robot system.
Figure 2. Schematic representation of the addressed drone delivery problem with three UAVs, three tasks, and one charge hub in the urban operational area defined in the city of Turin, Italy.
Figure 3. Conceptualization of the architecture with nested dynamic task allocation: example of the dynamic role of each UAV in the auction-based allocation architecture (shown for the bidding phase of UAV 1), where each UAV can serve as both an auctioneer and a bidder throughout the task allocation process.
Figure 4. Risk map computed for a UAV of type C without any additional payload. The black line is the minimum-risk path computed with the risk-aware path planning. The scenario refers to the simplified example of Figure 2.
Figure 5. Risk map computed for a UAV of type C carrying a payload of 2.5 kg. The red line is the minimum-risk path, while the dashed green line is the line-of-sight path. The scenario refers to the simplified example of Figure 2.
Figure 6. Distribution of risk values of the paths in Figures 4 and 5.
Figure 7. (a) E_DT(V_DT) optimization for the computation of the bid of UAV type A for delivery task 2: on the left, the evolution of the constrained problem's objective function f(x) at each iteration k of Algorithm 4; on the right, the evolution of the penalty function P(x, λ, ρ) at each iteration k. (b) Evolution of the penalty function P(x, λ, ρ) between k = 1 and k = 2 with respect to (a).
36 pages, 6817 KiB  
Article
Optimizing Autonomous UAV Navigation with D* Algorithm for Sustainable Development
by Pannee Suanpang and Pitchaya Jamjuntr
Sustainability 2024, 16(17), 7867; https://doi.org/10.3390/su16177867 - 9 Sep 2024
Viewed by 607
Abstract
Autonomous navigation for Unmanned Aerial Vehicles (UAVs) has emerged as a critical enabler in various industries, from agriculture, delivery services, and surveillance to search and rescue operations. However, navigating UAVs in dynamic and unknown environments remains a formidable challenge. This paper explores the application of the D* algorithm, a prominent path-planning method rooted in artificial intelligence and widely used in robotics, alongside comparisons with other algorithms such as A* and RRT*, to augment autonomous navigation capabilities in UAVs, with implications for sustainable development. The core problem addressed herein revolves around enhancing UAV navigation efficiency, safety, and adaptability in dynamic environments. The research methodology involves the integration of the D* algorithm into the UAV navigation system, enabling real-time adjustments and path planning that account for dynamic obstacles and evolving terrain conditions. The experimentation phase unfolds in simulated environments designed to mimic real-world scenarios and challenges. Comprehensive data collection, rigorous analysis, and performance evaluations paint a vivid picture of the D* algorithm's efficacy in comparison to other navigation methods, such as A* and RRT*. Key findings indicate that the D* algorithm offers a compelling solution, providing UAVs with efficient, safe, and adaptable navigation capabilities. The results demonstrate a path planning efficiency improvement of 92%, a 5% reduction in collision rates, and an increase in safety margins of 2.3 m. This article demonstrates the practical effectiveness of the D* algorithm, alongside comparisons with A* and RRT*, in enhancing autonomous UAV navigation and advancing aerial systems. Specifically, this study provides insights into the strengths and limitations of each algorithm, offering valuable guidance for researchers and practitioners in selecting the most suitable path-planning approach for their UAV applications. The implications of this research extend far and wide, with potential applications in industries such as agriculture, surveillance, disaster response, and other sustainability-oriented domains.
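For context, the sketch below shows a plain A* grid planner and a naive replan-from-scratch when an obstacle appears; the D* family's contribution is to repair only the invalidated parts of the search tree instead of replanning from scratch. The grid and costs here are made up for illustration.

```python
import heapq

def astar(grid, start, goal):
    """Plain A* on a 4-connected occupancy grid (1 = obstacle).
    D* finds the same paths but updates incrementally when edge
    costs change, rather than restarting the search."""
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])
    open_set, came, g = [(h(start), start)], {}, {start: 0}
    while open_set:
        _, cur = heapq.heappop(open_set)
        if cur == goal:
            path = [cur]
            while cur in came:
                cur = came[cur]
                path.append(cur)
            return path[::-1]
        x, y = cur
        for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if (0 <= nxt[0] < len(grid) and 0 <= nxt[1] < len(grid[0])
                    and not grid[nxt[0]][nxt[1]]):
                ng = g[cur] + 1
                if ng < g.get(nxt, float("inf")):
                    g[nxt], came[nxt] = ng, cur
                    heapq.heappush(open_set, (ng + h(nxt), nxt))
    return None

grid = [[0, 0, 0, 0],
        [0, 1, 1, 0],
        [0, 0, 0, 0]]
print(astar(grid, (0, 0), (2, 3)))   # initial plan
grid[2][1] = 1                       # a dynamic obstacle appears
print(astar(grid, (0, 0), (2, 3)))   # naive replan; D* would reuse work
```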
Figures:
Figure 1. UAV training using a deep reinforcement agent.
Figure 2. The algorithm's ability to find a path in a complex environment.
Figure 3. Research framework.
Figure 4. D* algorithm implementation flowchart.
Figure 5. Experimental methodology.
Figure 6. Results of the simulations.
Figure 7. Obstacle avoidance performance of the D*, A*, and RRT* algorithms.
Figure 8. Dynamic obstacle interaction of the D*, A*, and RRT* algorithms.
Figure 9. Terrain adaptation of the D*, A*, and RRT* algorithms.
Figure 10. Weather resilience of the D*, A*, and RRT* algorithms.
Figure 11. Localization challenges for the D*, A*, and RRT* algorithms.
Figure 12. Path efficiency comparison.
Figure 13. Comparison of path planning algorithms.
17 pages, 8500 KiB  
Article
Stiffness Analysis of Cable-Driven Parallel Robot for UAV Aerial Recovery System
by Jun Wu, Honghao Yue, Xueting Pan, Yanbing Wang, Yong Zhao and Fei Yang
Actuators 2024, 13(9), 343; https://doi.org/10.3390/act13090343 - 6 Sep 2024
Viewed by 321
Abstract
Unmanned Aerial Vehicle (UAV) aerial recovery is a challenging task due to the limited maneuverability of both the transport aircraft and the UAV, making it difficult to establish an effective capture connection in the airflow field. In previous studies, we proposed using a Cable-Driven Parallel Robot (CDPR) for active interception and recovery of UAVs. However, during the aerial recovery process, the CDPR is continuously subjected to aerodynamic loads, which significantly affect the stiffness characteristics of the CDPR. This paper conducts a stiffness analysis of a single cable and a CDPR in a flow field environment. Firstly, we derive the stiffness matrix of a single cable based on a model that considers aerodynamic loads. The CDPR is then divided into elements using the finite element method (FEM), and the stiffness matrix for each element is obtained. These element stiffness matrices are assembled to form the stiffness matrix of the CDPR system. Secondly, we analyze the stiffness distribution of a single cable at various equilibrium positions within a flow field environment. Aerodynamic loads were observed to alter the equilibrium position of the cable, thereby impacting its stiffness. The more the cable bends, the greater the reduction in its stiffness. We examine the stiffness distribution characteristics of the CDPR's end-effector within its workspace and analyze the impact of varying flow velocities and different cable materials on the system's stiffness. This research offers a methodology for analyzing the stiffness of CDPR systems operating in a flow field environment.
(This article belongs to the Special Issue Soft Robotics: Actuation, Control, and Application)
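The finite-element assembly step described in the abstract follows the standard pattern: scatter each element stiffness matrix into the global matrix by its nodes' degrees of freedom. A minimal sketch, with an assumed toy spring-like element matrix in place of the paper's aerodynamic-load-aware cable element:

```python
import numpy as np

def assemble_global_stiffness(n_nodes, elements, dof_per_node=3):
    """Assemble element stiffness matrices into the global stiffness
    matrix. `elements` is a list of (node_i, node_j, k_e) where k_e is
    a 6x6 element matrix in global coordinates."""
    n = n_nodes * dof_per_node
    K = np.zeros((n, n))
    for i, j, k_e in elements:
        dofs = (list(range(i * dof_per_node, (i + 1) * dof_per_node)) +
                list(range(j * dof_per_node, (j + 1) * dof_per_node)))
        for a, ga in enumerate(dofs):
            for b, gb in enumerate(dofs):
                K[ga, gb] += k_e[a, b]
    return K

# Two-element toy chain: nodes 0-1-2 with identical element matrices.
k_axial = 1.0e5                       # N/m, assumed stiffness value
B = np.hstack([np.eye(3), -np.eye(3)])
k_e = k_axial * B.T @ B               # 6x6 spring-like element matrix
K = assemble_global_stiffness(3, [(0, 1, k_e), (1, 2, k_e)])
print(K.shape)                        # (9, 9)
```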
Figures:
Figure 1. UAV aerial recovery system based on CDPR.
Figure 2. Static equilibrium force analysis of the cable.
Figure 3. The CDPR configuration for the UAV recovery system.
Figure 4. The elements and nodes of the CDPR.
Figure 5. The local coordinate system of element 5.
Figure 6. Distribution of the spatial equilibrium positions of point B under different conditions: (a) three-dimensional view; (b) x-direction view; (c) y-direction view; (d) z-direction view.
Figure 7. Distribution of the cable bending degree λ: (a) considering mass and elasticity; (b) considering mass, elasticity, and aerodynamic loads.
Figure 8. Maximum stiffness distribution of the cable: (a) considering mass and elasticity; (b) considering mass, elasticity, and aerodynamic loads.
Figure 9. CDPR stiffness in three directions: (a) X-direction; (b) Y-direction; (c) Z-direction.
Figure 10. CDPR minimum and maximum stiffness distribution: (a) minimum stiffness; (b) maximum stiffness.
Figure 11. Stiffness distribution under different flow velocities: (a) 0 m/s; (b) 40 m/s; (c) 60 m/s; (d) 80 m/s; (e) 100 m/s; (f) 120 m/s.
Figure 12. Stiffness variation curves of typical points with changing flow velocity.
Figure 13. Stiffness distribution corresponding to different cable materials: (a) E = 75 GPa; (b) E = 20.2 GPa; (c) E = 1.6 GPa.
Figure 14. The a_i values corresponding to the different cable materials: (a) X-direction; (b) Y-direction; (c) Z-direction.
Figure 15. The ideal interception spatial position for different K_low values: (a) K_low = 100,000 N/m; (b) K_low = 200,000 N/m; (c) K_low = 300,000 N/m.
20 pages, 5589 KiB  
Article
Advanced Control Strategies for Securing UAV Systems: A Cyber-Physical Approach
by Mohammad Sadeq Ale Isaac, Pablo Flores Peña, Daniela Gîfu and Ahmed Refaat Ragab
Appl. Syst. Innov. 2024, 7(5), 83; https://doi.org/10.3390/asi7050083 - 6 Sep 2024
Viewed by 432
Abstract
This paper explores the application of sliding mode control (SMC) as a robust security enhancement strategy for unmanned aerial vehicle (UAV) systems. The study proposes integrating advanced SMC techniques with security protocols to develop a dual-purpose system that improves UAV control and fortifies against adversarial actions. The strategy includes dynamic reconfiguration capabilities within the SMC framework, allowing adaptive responses to threats by adjusting control laws and operational parameters. This is complemented by anomaly detection algorithms that monitor deviations in control signals and system states, providing early warnings of potential cyber-intrusions or physical tampering. Additionally, fault-tolerant SMC mechanisms are designed to maintain control and system stability even when parts of the UAV are compromised. The methodology involves simulation and real-world testing to validate the effectiveness of the SMC-based security enhancements. Simulations assess how the UAV handles attack scenarios, such as GPS spoofing and control signal jamming, with SMC adapting in real-time to mitigate these threats. Field tests further confirm the system's capability to operate under varied conditions, proving the feasibility of SMC for enhancing UAV security. This integration of sliding mode control into UAV security protocols leverages control theory for security purposes, offering a significant advancement in the robust, adaptive control of UAVs in hostile environments.
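The control law and monitoring idea can be sketched in a few lines. Below, a boundary-layer sliding-mode step and a crude residual monitor in the spirit of the paper's anomaly detection; the gains, surface definition, and threshold are illustrative assumptions, not the authors' tuned design.

```python
import numpy as np

def smc_step(error, error_rate, c=2.0, k=5.0, eps=0.1):
    """One sliding-mode control step: sliding surface s = c*e + de/dt,
    with a saturated switching term to limit chattering."""
    s = c * error + error_rate
    u = -k * np.clip(s / eps, -1.0, 1.0)   # boundary-layer sign(s)
    return u, s

def anomaly_flag(s_history, window=20, threshold=3.0):
    """Flag when |s| drifts far outside its recent spread, which may
    indicate spoofing or tampering (a toy residual monitor)."""
    recent = np.asarray(s_history[-window:])
    if recent.size < window:
        return False
    return abs(recent[-1]) > threshold * (np.std(recent[:-1]) + 1e-6)
```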
Figures:
Figure 1. The inertial frame vs. the body frame of the helicopter. Different colors are used to distinguish the components of each frame.
Figure 2. Performance of the helicopter's control system in the presence of the secured algorithm, highlighting key parameters and their behavior over a 45-min flight. The subplots present altitude, throttle, flight trajectory, roll, pitch, yaw, and the corresponding servo flap positions.
Figure 3. Performance data of the Nuntius helicopter controlled by a secured SMC during a 45-min flight with random noise applied to the yaw angle between minutes 10 and 20. The subplots display critical flight parameters and the corresponding control inputs.
Figure 4. Performance data of the Nuntius helicopter controlled by a secured SMC during a 35-min flight. The pilot began sending assisted-manual commands at around minute 20, starting with a throttle reduction. The data highlight the controller's compensation for these inputs, maintaining stability despite the manual interventions.
Figure 5. Performance data of the Nuntius helicopter controlled by a secured SMC during a 45-min flight. At around minute 30, a direction change was applied to the roll loop to test the attitude control performance. The figure shows the helicopter's response to this input across various flight parameters.
Figure 6. Performance data of the Nuntius helicopter controlled by a secured SMC during a 50-min flight. The flight includes transitions between manual and automatic control modes, with significant altitude changes at around minutes 12 and 35. The figure evaluates the controller's performance in maintaining stability and handling these transitions.
31 pages, 49489 KiB  
Review
Runway-Free Recovery Methods for Fixed-Wing UAVs: A Comprehensive Review
by Yunxiao Liu, Yiming Wang, Han Li and Jianliang Ai
Drones 2024, 8(9), 463; https://doi.org/10.3390/drones8090463 - 5 Sep 2024
Viewed by 335
Abstract
Fixed-wing unmanned aerial vehicles (UAVs) have the advantages of long endurance and fast flight speed and are widely used in surveying, mapping, monitoring, and defense fields. However, their conventional take-off and landing methods require runway support. Achieving runway-free recovery is necessary for expanding the application of fixed-wing UAVs. This research comprehensively reviews the various techniques and scenarios of runway-free recovery of fixed-wing UAVs and summarizes the key technologies. The methods covered include parachute recovery, net recovery, rope recovery, SideArm recovery, deep stall recovery, towed drogue docking recovery, and robotic arm recovery. Finally, this research discusses future research directions for runway-free recovery.
Figures:
Figure 1. Schematic diagram of runway landing and runway-free recovery of fixed-wing UAVs. (a) Runway landing [19]. (b) Runway-free recovery [20].
Figure 2. Parachute recovery process and system components. (a) Design of an automatic parachute-opening structure using an acceleration sensor [26]. (b) Parachute system components [27].
Figure 3. Several examples of natural inflation opening technology. (a) Design of an automatic parachute-opening structure using an acceleration sensor [26]. (b) Schematic diagram of the coordinate system for 6-DOF dynamic modelling of the UAV-parachute combination [30]. (c) Schematic diagram of elastic parachute rope dynamics analysis [25].
Figure 4. Schematic diagram of the spring-ejection parachute-opening device [23].
Figure 5. Rocket ejection device and simulation results. (a) Rocket-ejection opening device [32]. (b) Simulation of the opening of parachutes with round and cross-shaped canopies [33].
Figure 6. Schematic diagrams of net recovery system applications. (a) Ground-based net recovery [40]. (b) Ship-based net recovery [41]. (c) Sea-platform-based net recovery [39]. (d) A recovery net suspended by two multirotor UAVs [37,38].
Figure 7. The geometry of the recovery plan illustrated with a net on moving platforms [39].
Figure 8. Schematic diagram of rope recovery. (a) Horizontal rope recovery [42]. (b) Vertical rope recovery [43].
Figure 9. Workflow of the automatic horizontal-rope shipborne recovery system [42].
Figure 10. Schematic diagrams of horizontal rope recovery systems. (a) Arresting-rope beam and UAV-based arresting-hook assembly [45]. (b) An active flexible arresting-hook recovery system [46]. (c) A rope-hook recovery system [47]. (d) Recovery of a fixed-wing UAV using a rope suspended between two multicopter UAVs [48].
Figure 11. Images of vertical rope recovery systems. (a) ScanEagle UAV captured by SkyHook on land [49]. (b) SkyHook shipboard recovery [50]. (c) VTOL-based FLARES recovery system [52].
Figure 12. Images of the FLARES system [53]. (a) Simulation diagram. (b) Shipborne recovery test.
Figure 13. SideArm recovery system [55]. (a) Sea-based. (b) Ground-based.
Figure 14. Schematic diagram of shipborne SideArm recovery [58].
Figure 15. NPS rating of shipborne UAV recovery technologies, with SideArm receiving the highest performance value [55].
Figure 16. Schematic diagrams of deep stall recovery. (a) Top view of the track [62]. (b) Three-dimensional view of aircraft attitude [63].
Figure 17. Wind estimation combining a pitot-static tube and GPS in an extended Kalman filter [71]. (a) Diagram for deep stall trim analysis. (b) Deep stall path planning.
Figure 18. Diagrams of towed-drogue docking recovery system frameworks. (a) Aerial recovery system [72]. (b) C-130 recovery system [73]. (c) Autonomous midair docking system [74].
Figure 19. X-61 Gremlins docking and recovery [73].
Figure 20. Design of the active stability augmentation control drogue [72].
Figure 21. Two states of the probe-and-drogue docking structure [83].
Figure 22. Recovery (perching) completed by the UAV's own robotic arm [86].
Figure 23. Design of the cable-driven parallel robotic arm platform [87].
Figure 24. Bird-leg-like outrigger landing gear [88].
Figure 25. Diagram of the grasping structure [86].
Figure 26. The process of the perching action [91].
Figure 27. Examples of commonly used recovery sensors. (a) C-RTK GNSS receiver [94]. (b) Visible-light camera [95]. (c) IR camera [96]. (d) Depth camera [97].
Figure 28. Example diagram of an RTK GNSS module. (a) RTK GNSS rover module [99]. (b) RTK GNSS base module [99].
Figure 29. Drogue recognition image from the airborne visible-light camera [73].
Figure 30. Examples of IR camera and infrared beacon applications. (a) Aerial docking based on an IR camera [101]. (b) Shipboard recovery simulation based on an IR camera [45].
Figure 31. Aerial docking and recovery simulation test based on a depth camera [83].
34 pages, 9346 KiB  
Article
An Adaptive Spiral Strategy Dung Beetle Optimization Algorithm: Research and Applications
by Xiong Wang, Yi Zhang, Changbo Zheng, Shuwan Feng, Hui Yu, Bin Hu and Zihan Xie
Biomimetics 2024, 9(9), 519; https://doi.org/10.3390/biomimetics9090519 - 29 Aug 2024
Viewed by 582
Abstract
The Dung Beetle Optimization (DBO) algorithm, a well-established swarm intelligence technique, has shown considerable promise in solving complex engineering design challenges. However, it is hampered by limitations such as suboptimal population initialization, sluggish search speeds, and restricted global exploration capabilities. To overcome these shortcomings, we propose an enhanced version termed Adaptive Spiral Strategy Dung Beetle Optimization (ADBO). Key enhancements include the application of the Gaussian Chaos strategy for a more effective population initialization, the integration of the Whale Spiral Search Strategy inspired by the Whale Optimization Algorithm, and the introduction of an adaptive weight factor to improve search efficiency and enhance global exploration capabilities. These improvements collectively elevate the performance of the DBO algorithm, significantly enhancing its ability to address intricate real-world problems. We evaluate the ADBO algorithm against a suite of benchmark algorithms using the CEC2017 test functions, demonstrating its superiority. Furthermore, we validate its effectiveness through applications in diverse engineering domains such as robot manipulator design, triangular linkage problems, and unmanned aerial vehicle (UAV) path planning, highlighting its impact on improving UAV safety and energy efficiency.
(This article belongs to the Special Issue Computer-Aided Biomimetics: 2nd Edition)
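The whale-inspired spiral move that ADBO borrows has a standard closed form. A sketch, with an assumed linear decay for the adaptive weight factor (the paper's exact schedule and bound handling are not reproduced here):

```python
import numpy as np

def spiral_update(x, x_best, t, t_max, b=1.0, rng=np.random.default_rng()):
    """Whale-style logarithmic spiral move toward the current best:

        x_new = x_best + w(t) * |x_best - x| * exp(b*l) * cos(2*pi*l)

    where w(t) is an adaptive weight that decays over iterations
    (assumed linear here) and l is a random spiral shape parameter."""
    w = 1.0 - t / t_max
    l = rng.uniform(-1.0, 1.0)
    return x_best + w * np.abs(x_best - x) * np.exp(b * l) * np.cos(2 * np.pi * l)

x = np.array([0.5, -1.2])
x_best = np.array([0.0, 0.0])
print(spiral_update(x, x_best, t=10, t_max=100))
```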
Figures:
Figure 1. The Gaussian chaotic distribution plot.
Figure 2. The dung beetle's search trajectory.
Figure 3. The dung beetle's search trajectory.
Figure 4. Nonlinear weight values.
Figure 5. The ADBO algorithm.
Figure 6. CEC2017 test curve chart (Dim = 30).
Figure 7. CEC2017 test curve chart (Dim = 100).
Figure 8. Multiple improved DBO vs. ADBO (Dim = 30).
Figure 9. Multiple improved DBO vs. ADBO (Dim = 100).
Figure 10. Mechanical arm image.
Figure 11. Mechanical arm convergence plot.
Figure 12. Triangle truss design.
Figure 13. Three-bar truss convergence curve diagram.
Figure 14. Threat cost.
Figure 15. Elevation cost.
Figure 16. Turn angle and climb angle description.
Figure 17. UAV scenarios.
Figure 18. Paths in UAV scenarios.
Figure 19. Overhead perspective of the UAV path.
Figure 20. Iteration graph of the UAV path.
23 pages, 15418 KiB  
Article
Efficient UAV Exploration for Large-Scale 3D Environments Using Low-Memory Map
by Junlong Huang, Zhengping Fan, Zhewen Yan, Peiming Duan, Ruidong Mei and Hui Cheng
Drones 2024, 8(9), 443; https://doi.org/10.3390/drones8090443 - 29 Aug 2024
Viewed by 583
Abstract
Autonomous exploration of unknown environments is a challenging problem in robotic applications, especially in large-scale environments. As the size of the environment increases, the limited onboard resources of the robot can hardly satisfy the memory overhead and computational requirements. As a result, it is challenging to respond quickly to the received sensor data, resulting in inefficient exploration planning. It is also difficult to comprehensively utilize the gathered environmental information for planning, leading to low-quality exploration paths. In this paper, a systematic framework tailored for unmanned aerial vehicles is proposed to autonomously explore large-scale unknown environments. To reduce memory consumption, a novel low-memory environmental representation is introduced that only maintains the information necessary for exploration. Moreover, a hierarchical exploration approach based on the proposed environmental representation is developed to allow for fast planning and efficient exploration. Extensive simulation tests demonstrate the superiority of the proposed method over current state-of-the-art methods in terms of memory consumption, computation time, and exploration efficiency. Furthermore, two real-world experiments conducted in different large-scale environments also validate the feasibility of our autonomous exploration system.
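The low-memory representation can be illustrated with a hash map keyed by spatial block, storing only frontier and occupied cells and never representing free or unknown space explicitly. A minimal sketch under assumed block and resolution parameters; the paper's FAOmap additionally stores per-cell neighbor indicators and occupancy probabilities.

```python
import numpy as np

BLOCK = 8  # grid cells per block side (assumed)

class FrontierOccupiedMap:
    """Sketch of the low-memory idea: a hash map from block index to
    the frontier/occupied cells inside that block."""
    def __init__(self, resolution=0.1):
        self.res = resolution
        self.blocks = {}  # block key -> {"frontier": set, "occupied": set}

    def _key(self, cell):
        return tuple(c // BLOCK for c in cell)

    def insert(self, point, kind):
        cell = tuple(int(np.floor(c / self.res)) for c in point)
        blk = self.blocks.setdefault(
            self._key(cell), {"frontier": set(), "occupied": set()})
        blk[kind].add(cell)
        if kind == "occupied":  # an occupied cell stops being a frontier
            blk["frontier"].discard(cell)

m = FrontierOccupiedMap()
m.insert((1.23, 4.56, 0.7), "frontier")
m.insert((1.23, 4.56, 0.7), "occupied")
print(len(m.blocks))  # only touched blocks consume memory
```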
Show Figures

Figure 1

Figure 1
<p>An overview of the proposed framework for autonomous exploration: The local 3D grid map is updated by the SLAM results, and then it is utilized to update FAOmap and the road map. Global planning, local refinement, and trajectory optimization are then conducted sequentially. Finally, the UAV executes the trajectory to explore unknown environments.</p>
Full article ">Figure 2
Figure 2. FAOmap representation (presented in 2D view). (a) The environmental representation during exploration. (b) The space is split into occupancy grids, and FAOmap stores only the frontier and occupied grids. (c) The relationships between grid states. (d) Cases corresponding to the indicator values associated with frontier and occupied grids in FAOmap. (e) An example of determining the indicator value of grid v.
Figure 3. The data structure of FAOmap, organized as a hash map. (a) The whole space is divided into multiple blocks. (b) The hash map stores only the blocks that include frontier or occupied grids. (c) An element of the hash map contains the frontier and occupied arrays together with their corresponding length values. (d) An element of the frontier or occupied arrays consists of the center point of the grid, an indicator encoding the states of the grid's neighbors, and the grid's occupancy probability.
Figure 4. The updating of FAOmap (presented in 2D view), which relies on a local 3D grid map G. (a) The states of the space before the update: the green line represents the frontiers, and the black line represents the obstacles' surfaces (the area inside an obstacle is unobservable and is regarded as unknown). (b) The states of the grids in G after retrieving frontier and occupied grids from FAOmap; the white grids are yet to be determined. (c–e) An example of determining grid states by querying the indicators of grid v. (f) The true states of the grids in G. (g) The states of the grids in G after being updated with newly received data. (h) The states of the space after the update.
Figure 5. Fast incremental frontier clustering. (a) The old clusters and some new frontiers to be clustered when the local grid map is updated as the UAV moves. (b) The new clusters (red and yellow) formed after updating; the green cluster in (a) is erased, and the blue cluster is modified.
Figure 6. Hierarchical exploration planning. (a) Global planning: global frontiers are grouped into multiple clusters quickly and incrementally, and a global exploration route is found to visit these clusters in order. (b) Local refinement: on reaching the vicinity of the first cluster to be observed, a local path that efficiently observes all frontiers in the cluster is determined from sampled candidate viewpoints. (c) Generation of the candidate viewpoint v_f from frontier f and its observation vector a_f.
Figure 7. Three simulation environments: (a) Indoor: 105 m × 60 m × 20 m; (b) Mountain: 100 m × 100 m × 30 m; (c) Village: 150 m × 100 m × 30 m.
Figure 8. Memory usage (MB) vs. known-space volume (m³) for the five maps at a resolution of 0.1 m in the three simulation scenes: (a) Indoor, (b) Mountain, and (c) Village. Ordinate values beyond 2000 are scaled by a factor of three.
Figure 9. Exploration progress of the three methods in the three environments: (a) Indoor, (b) Mountain, and (c) Village. The charts show explored volume (m³) vs. time (s).
Figure 10. Executed trajectories of the three autonomous exploration methods in the three environments; exploration is complete when the known space reaches 95% of the total observable space, per the termination condition. (a) Indoor, (b) Mountain, (c) Village.
Figure 11. Real-world exploration in an underground garage. (a,b) Two views of the online-built point cloud map and the UAV's trajectory, with images of the environment. The red rectangle marks the bounding box of the space to be explored; several areas inside it are untraversable due to structural restrictions, and some areas outside it are observed because of the long LiDAR sensor range.
Figure 12. Real-world exploration in a cluttered forest. (a,b) Two views of the online-built point cloud map and the UAV's trajectory, with images of the environment. The red rectangle marks the bounding box of the space to be explored; areas outside it are also observed because of the long LiDAR sensor range.
Figure 13. The flying platform used in the real-world experiments: a quadrotor equipped with an Intel NUC and a Livox Mid-360 LiDAR sensor.
Figure 14. Exploration progress for the two real-world experiments in large-scale environments: (a) garage; (b) forest.
22 pages, 6038 KiB  
Article
An Enhanced SL-YOLOv8-Based Lightweight Remote Sensing Detection Algorithm for Identifying Broken Strands in Transmission Lines
by Xiang Zhang, Jianwei Zhang and Xiaoqiang Jia
Appl. Sci. 2024, 14(17), 7469; https://doi.org/10.3390/app14177469 - 23 Aug 2024
Viewed by 364
Abstract
Power transmission lines frequently face threats from lightning strikes, severe storms, and chemical corrosion, which can lead to damage in steel–aluminum-stranded wires, thereby seriously affecting the stability of the power system. Currently, manual inspections are relatively inefficient and high risk, while drone inspections [...] Read more.
Power transmission lines frequently face threats from lightning strikes, severe storms, and chemical corrosion, which can lead to damage in steel–aluminum-stranded wires, thereby seriously affecting the stability of the power system. Currently, manual inspections are relatively inefficient and high risk, while drone inspections are often limited by complex environments and obstacles, and existing detection algorithms still face difficulties in identifying broken strands. To address these issues, this paper proposes SL-YOLOv8, a broken-strand detection method for online intelligent inspection robots built on an improved You Only Look Once version 8 (YOLOv8). By incorporating the Squeeze-and-Excitation Network version 2 (SENet_v2) into the feature fusion network, the method effectively enhances adaptive feature representation by focusing on and amplifying key information, thereby improving the network's capability to detect small objects. Additionally, the introduction of the LSKblockAttention module, which combines Large Selective Kernels (LSKs) with an attention mechanism, allows the model to dynamically select and enhance critical features, significantly improving detection accuracy and robustness while maintaining model precision. Compared with the original YOLOv8 algorithm, SL-YOLOv8 improves recognition on the Break-ID-1632 and cable damage datasets: precision increases by 3.9% and 2.7%, and recall by 12.2% and 2.3%, respectively. The mean average precision (mAP) at an Intersection over Union (IoU) threshold of 0.5 also increases by 4.9% and 1.2%, demonstrating SL-YOLOv8's effectiveness in accurately identifying small objects in complex scenes. Full article
(This article belongs to the Special Issue Advanced Pattern Recognition & Computer Vision)
Show Figures
Figure 1. Architecture diagram of YOLOv8.
Figure 2. Architecture diagram of SENet_v2.
Figure 3. Architecture diagram of LSK; the part framed by the dotted line represents the large convolution kernel.
Figure 4. Architecture diagram of SL-YOLOv8.
Figure 5. The Break-ID-1632 image dataset. (a) Outdoor, normal; (b) snow, normal; (c) grassland, normal; (d) outdoor, broken strand; (e) snow, broken strand; (f) grassland, broken strand.
Figure 6. The cable damage image dataset. (a) Broken strand; (b) burning wires.
Figure 7. Visual comparison on the Break-ID-1632 dataset.
Figure 8. Visual comparison on the cable damage dataset.
Figure 9. Comparison of mAP@0.5 before and after the improvement.
Figure 10. Detection visualization results. (a) YOLOv8 on the Break-ID-1632 dataset; (b) SL-YOLOv8 on the Break-ID-1632 dataset; (c) YOLOv8 on the cable damage dataset.
Figure 11. Bar plots of mAP@0.5 corresponding to Table 4.
Figure 12. Bar plots of mAP@0.5 corresponding to Table 5.
Figure 13. Evaluation parameters for the Break-ID-1632 dataset.
Figure 14. Evaluation parameters for the cable damage dataset.
26 pages, 16010 KiB  
Article
Conversion of a Coaxial Rotorcraft to a UAV—Lessons Learned
by Barzin Hosseini, Julian Rhein, Florian Holzapfel, Benedikt Grebing and Juergen Rauleder
Aerospace 2024, 11(8), 681; https://doi.org/10.3390/aerospace11080681 - 19 Aug 2024
Viewed by 486
Abstract
A coaxial helicopter with a maximum take-off weight of 600 kg was converted to an unmanned aerial vehicle. A minimally invasive robotic actuator system was developed, which can be retrofitted onto the copilot seat of the rotorcraft in a short period of time [...] Read more.
A coaxial helicopter with a maximum take-off weight of 600 kg was converted to an unmanned aerial vehicle. A minimally invasive robotic actuator system was developed, which can be retrofitted onto the copilot seat of the rotorcraft in a short period of time to enable automatic flight. The automatic flight control robot includes electromechanical actuators, which are connected to the cockpit inceptors and control the helicopter. Most of the sensors and avionic components were integrated into the modular robotic system for faster integration into the rotorcraft. The mechanical design of the control system, the development of the robot control software, and the control system architecture are described in this paper. Furthermore, the multi-body simulation of the robotic system and the estimation of the linear low-order actuator models from hover-frame flight test data are discussed. The developed technologies in this study are not specific to a coaxial helicopter and can be applied to the conversion of any crewed flight vehicle with mechanical controls to unmanned or fly-by-wire. This agile development of a full-size flying test-bed can accelerate the testing of advanced flight control laws, as well as advanced air mobility-related functions. Full article
Show Figures
Figure 1. The actuators controlling the stick.
Figure 2. The actuators controlling the collective lever and the pedals.
Figure 3. Coupling of a servo model with a revolute joint.
Figure 4. Structure of the simulation framework for the robotic control system.
Figure 5. Act_CycC (Actuator 1) inverse-kinematics grid.
Figure 6. Act_CycS (Actuator 2) inverse-kinematics grid.
Figure 7. Collective-lever actuator inverse-kinematics grid.
Figure 8. Pedal actuator inverse-kinematics grid.
Figure 9. APCU software structure.
Figure 10. Flight control system schematic overview.
Figure 11. Onboard flight control system electronic architecture.
Figure 12. Remote crew interfaces.
Figure 13. System setup for the HIL tests.
Figure 14. HIL test block diagram.
Figure 15. Flight tests in a hover frame.
Figure 16. Flight tests at Magdeburg–Cochstedt Airport (EDBC).
Figure 17. The UAS operation volumes; each of the overlapping green fields represents one volume of operation for VLOS flights.
Figure 18. CoAX 600 UAS flight tests.
Figure 19. Rotor cyclic control reference positions and swashplate instrumentation.
Figure 20. Actuator dynamics poles.
Figure 21. Actuator model fit in the frequency domain.
Figure 22. Comparison of the responses of the linear low-order models of the rotorcraft in the hover frame (black) with the recorded data (red).
Figure 23. Poles (x) and zeros (o) of the low-order equivalent systems for the roll (blue), pitch (red), and yaw (magenta) transfer functions.
23 pages, 3682 KiB  
Article
Adaptive Incremental Nonlinear Dynamic Inversion Control for Aerial Manipulators
by Chanhong Park, Alex Ramirez-Serrano and Mahdis Bisheban
Aerospace 2024, 11(8), 671; https://doi.org/10.3390/aerospace11080671 - 15 Aug 2024
Viewed by 489
Abstract
This paper proposes an adaptive incremental nonlinear dynamic inversion (INDI) controller for unmanned aerial manipulators (UAMs). A novel adaptive law is employed to enable aerial manipulators to manage the inertia parameter changes that occur when the manipulator moves or picks up unknown objects [...] Read more.
This paper proposes an adaptive incremental nonlinear dynamic inversion (INDI) controller for unmanned aerial manipulators (UAMs). A novel adaptive law is employed to enable aerial manipulators to manage the inertia parameter changes that occur when the manipulator moves or picks up unknown objects during any phase of the UAM’s flight maneuver. The adaptive law utilizes a Kalman filter to estimate a set of weighting factors employed to adjust the control gain matrix of a previously developed INDI control law formulated for the corresponding UAV (no manipulator included). The proposed adaptive control scheme uses acceleration and actuator input measurements of the UAV without necessitating any knowledge about the manipulator, its movements, or the objects being grasped, thus enabling the use of previously developed INDI UAV controllers for UAMs. The algorithm is validated through simulations demonstrating that the adaptive control gain matrix used in the UAV’s INDI controller is promptly updated based on the UAM maneuvers, resulting in effective UAV and robot arm control. Full article
(This article belongs to the Special Issue Challenges and Innovations in Aircraft Flight Control)
Show Figures
Figure 1. The Navig8-UAV and the hypothetical Navig8-UAM: (a) the Navig8-UAV; (b) the hypothetical Navig8-UAM.
Figure 2. Schematic diagram of the Navig8-UAM.
Figure 3. Block diagram of the proposed adaptive INDI controller for UAMs.
Figure 4. Manipulator poses during the simulation.
Figure 5. Joint angles of the manipulator during the simulation.
Figure 6. Position and attitude control of the UAV during the simulation: (a–c) position control in the east, north, and upward directions; (d–f) roll, pitch, and yaw angle control.
Figure 7. Acceleration control of the UAV during the simulation: (a–c) angular acceleration control about the x, y, and z axes of the UAV frame; (d–f) linear acceleration control along the x, y, and z axes of the UAV frame.
Figure 8. Components of the inverse control effectiveness matrix G⁻¹ during the simulation: (a,c,e,g,i) the 1st to 5th columns of the adapted G⁻¹; (b,d,f,h,j) the corresponding columns of the true G⁻¹.
Figure 9. Side view of the helical trajectory achieved by the UAM.
Figure 10. Top view of the helical trajectory achieved by the UAM.
Figure 11. Attitude control errors during the helical trajectory tracking simulation: (a) roll angle; (b) pitch angle; (c) yaw angle.
18 pages, 16140 KiB  
Article
Development and Validation of a New Type of Displacement-Based Miniatured Laser Vibrometers
by Ke Yuan, Zhonghua Zhu, Wei Chen and Weidong Zhu
Sensors 2024, 24(16), 5230; https://doi.org/10.3390/s24165230 - 13 Aug 2024
Viewed by 572
Abstract
Developing miniaturized laser vibrometers has become important in many engineering areas, such as experimental and operational modal analysis, model validation, and structural health monitoring. Owing to its compact size and light weight, a miniaturized laser vibrometer can be mounted on mobile platforms, [...] Read more.
Developing miniaturized laser vibrometers has become important in many engineering areas, such as experimental and operational modal analysis, model validation, and structural health monitoring. Owing to its compact size and light weight, a miniaturized laser vibrometer can be mounted on mobile platforms with limited payload capacity, such as unmanned aerial vehicles and robotic arms, to achieve flexible vibration measurement. However, integrating the optics into a miniaturized laser vibrometer presents several challenges: signal interference from ghost reflections generated by the sub-components of the integrated photonics, polarization effects caused by the waveguide structures, wavelength drift of the semiconductor laser, and the poorer noise characteristics of an integrated laser chip compared with a non-integrated circuit. This work proposes a novel chip-based high-precision laser vibrometer that incorporates two or more sets of quadrature demodulation networks into its design. An additional set of quadrature demodulation networks with a distinct reference-arm delay-line length can provide real-time compensation to mitigate linear interference caused by temperature and environmental variations. A series of vibration measurements at frequencies ranging from 0.1 Hz to 1 MHz was conducted with the proposed laser vibrometer to show its repeatability and accuracy in vibration and ultrasonic vibration measurements, and its robustness to test surface conditions. Unlike a conventional laser Doppler vibrometer, which integrates the velocity response to yield the measured displacement, the proposed laser vibrometer measures the displacement response of a vibrating structure directly. Full article
(This article belongs to the Section Optical Sensors)
Show Figures
Figure 1. Simplified schematic diagram of a laser interferometer: the homodyne configuration (without the heterodyne implementation marked by the dashed box) and the heterodyne configuration.
Figure 2. Schematic of quadrature demodulation.
Figure 3. Schematic of the proposed chip-based high-precision laser vibrometer.
Figure 4. Actual size of the laser vibrometer chip proposed in this work: 7 mm × 5 mm.
Figure 5. Two laser vibrometers developed using integrated optical chips.
Figure 6. (a) Vibration of a medical scalpel tip; (b) measurement of ultrasonic waves induced by a high-energy laser pulse.
Figure 7. Experimental setups for validating the measurement repeatability of the proposed laser vibrometer: (a) a speaker at excitation frequencies of 0.1 Hz–1000 Hz; (b) another speaker at excitation frequencies of 5000 Hz and 20,000 Hz; (c) an ultrasonic vibration source at an excitation frequency of 1,000,000 Hz.
Figure 8. Vibrations of speaker #1 under sinusoidal excitation at 0.1 Hz: responses from three independent datasets of the same length in the (a) time domain and (b) frequency domain.
Figure 9. Vibrations of speaker #2 under sinusoidal excitation at 20,000 Hz: responses from three independent datasets of the same length in the (a) time domain and (b) frequency domain.
Figure 10. Vibrations of the ultrasonic vibration source under sinusoidal excitation at 1,000,000 Hz: responses from three independent datasets of the same length in the (a) time domain and (b) frequency domain.
Figure 11. Experimental setup for validating the measurement accuracy of the proposed laser vibrometer, with a Polytec LDV used as the reference.
Figure 12. Vibrations of speaker #1 under sinusoidal excitation at 0.1 Hz: responses from the two independent measurement systems in the (a) time domain and (b) frequency domain.
Figure 13. Vibrations of speaker #2 under sinusoidal excitation at 5000 Hz: responses from the two independent measurement systems in the (a) time domain and (b) frequency domain.
Figure 14. Experimental setup for assessing the robustness of the proposed laser vibrometer to test surface conditions using speaker #1 with (a) a surface enhanced by reflective tape and (b) a natural surface.
Figure 15. SNRs of the measured responses of speaker #1 with enhanced and natural surfaces under sinusoidal excitation at 10 Hz using (a) the Polytec LDV and (b) the proposed laser vibrometer.
Figure 16. SNRs of the measured responses of speaker #1 with enhanced and natural surfaces under sinusoidal excitation at 1000 Hz using (a) the Polytec LDV and (b) the proposed laser vibrometer.
19 pages, 7931 KiB  
Article
Improving Aerial Targeting Precision: A Study on Point Cloud Semantic Segmentation with Advanced Deep Learning Algorithms
by Salih Bozkurt, Muhammed Enes Atik and Zaide Duran
Drones 2024, 8(8), 376; https://doi.org/10.3390/drones8080376 - 6 Aug 2024
Viewed by 776
Abstract
The integration of technological advancements has significantly impacted artificial intelligence (AI), enhancing the reliability of AI model outputs. This progress has led to the widespread utilization of AI across various sectors, including automotive, robotics, healthcare, space exploration, and defense. Today, air defense operations [...] Read more.
The integration of technological advancements has significantly impacted artificial intelligence (AI), enhancing the reliability of AI model outputs. This progress has led to the widespread utilization of AI across various sectors, including automotive, robotics, healthcare, space exploration, and defense. Today, air defense operations predominantly rely on laser designation. This process is entirely dependent on the capability and experience of human operators. Considering that UAV systems can have flight durations exceeding 24 h, this process is highly prone to errors due to the human factor. Therefore, the aim of this study is to automate the laser designation process using advanced deep learning algorithms on 3D point clouds obtained from different sources, thereby eliminating operator-related errors. As different data sources, dense 3D point clouds produced with photogrammetric methods containing color information, and point clouds produced with LiDAR systems were identified. The photogrammetric point cloud data were generated from images captured by the Akinci UAV’s multi-axis gimbal camera system within the scope of this study. For the point cloud data obtained from the LiDAR system, the DublinCity LiDAR dataset was used for testing purposes. The segmentation of point cloud data utilized the PointNet++ and RandLA-Net algorithms. Distinct differences were observed between the evaluated algorithms. The RandLA-Net algorithm, relying solely on geometric features, achieved an approximate accuracy of 94%, while integrating color features significantly improved its performance, raising its accuracy to nearly 97%. Similarly, the PointNet++ algorithm, relying solely on geometric features, achieved an accuracy of approximately 94%. Notably, the model developed as a unique contribution in this study involved enriching the PointNet++ algorithm by incorporating color attributes, leading to significant improvements with an approximate accuracy of 96%. The obtained results demonstrate a notable improvement in the PointNet++ algorithm with the proposed approach. Furthermore, it was demonstrated that the methodology proposed in this study can be effectively applied directly to data generated from different sources in aerial scanning systems. Full article
Show Figures
Figure 1. The study area in the district of Çorlu, Tekirdağ province, Türkiye. (a) Map of Türkiye's provinces; (b) map of Tekirdağ province's districts; (c) image of the study area in Çorlu.
Figure 2. Hierarchical structure of the DublinCity LiDAR data.
Figure 3. Sample of the DublinCity LiDAR data.
Figure 4. Illustration of the SfM algorithm.
Figure 5. Illustration of the conversion between the aircraft and gimbal axes.
Figure 6. The generated 3D point cloud before and after the axis transformation. (a,b) Before the transformation; (c) after the transformation.
Figure 7. Example of the CloudCompare labeling phase. (a) Label layers; (b) regions in the labeling stage.
Figure 8. Illustration of the PointNet++ architecture for a single-scale point group.
Figure 9. Illustration of the RandLA-Net architecture.
Figure 10. Sample of the DublinCity LiDAR data with 6 classes. (a) PointNet++ training data; (b) PointNet++ predictions.
Figure 11. Sample of the DublinCity LiDAR data with 4 classes. (a) PointNet++ training data; (b) PointNet++ predictions.
Figure 12. Sample of the DublinCity LiDAR data with 3 classes. (a) PointNet++ training data; (b) PointNet++ predictions.
Figure 13. RandLA-Net prediction results on the DublinCity LiDAR data: (a) 6 classes; (b) 4 classes; (c) 3 classes.
Figure 14. Sample of the generated point cloud with 3 classes. (a) PointNet++ training data; (b) PointNet++ predictions (minimum batch size 16, 50 epochs).
Figure 15. PointNet++ results on the generated point cloud with 3 classes, (a) with a minimum batch size of 32 and 100 epochs and (b) with a minimum batch size of 32 and 1000 epochs.
Figure 16. RandLA-Net with geometric features only: a building used as test data, shown as (a) the original view, (b) manual labels, and (c) predicted labels.
Figure 17. RandLA-Net with color and geometric features: a building used as test data, shown as (a) the original view, (b) manual labels, and (c) predicted labels.
17 pages, 8581 KiB  
Article
Oil Spill Mitigation with a Team of Heterogeneous Autonomous Vehicles
by André Dias, Ana Mucha, Tiago Santos, Alexandre Oliveira, Guilherme Amaral, Hugo Ferreira, Alfredo Martins, José Almeida and Eduardo Silva
J. Mar. Sci. Eng. 2024, 12(8), 1281; https://doi.org/10.3390/jmse12081281 - 30 Jul 2024
Viewed by 511
Abstract
This paper presents the implementation of an innovative solution based on heterogeneous autonomous vehicles to tackle maritime pollution (in particular, oil spills). This solution is based on native microbial consortia with bioremediation capacity, and the adaptation of air and surface autonomous vehicles for [...] Read more.
This paper presents the implementation of an innovative solution based on heterogeneous autonomous vehicles to tackle maritime pollution (in particular, oil spills). This solution is based on native microbial consortia with bioremediation capacity, and the adaptation of air and surface autonomous vehicles for in situ release of autochthonous microorganisms (bioaugmentation) and nutrients (biostimulation). These systems can thus serve as a first line of response to pollution incidents of various origins, whether inside ports, around industrial and extraction facilities, or in the open sea during transport activities, in a fast, efficient, and low-cost way. The paper describes the development of a team of autonomous vehicles able to carry, as payload, native organisms that naturally degrade oil spills (avoiding the introduction of additional chemical or biological additives), and the development of a multi-robot framework for efficient oil spill mitigation. Field tests were performed with a simulated oil spill in harbors in Portugal and Spain, and the coordinated oil spill mitigation task between the autonomous surface vehicle (ASV) ROAZ and the unmanned aerial vehicle (UAV) STORK was validated. Full article
(This article belongs to the Section Marine Pollution)
Show Figures
Figure 1. Oil spill incident detected in Copernicus Sentinel-1 satellite radar images in the Mediterranean Sea [4].
Figure 2. Conceptual approach for multi-robot oil spill mitigation with a team of heterogeneous autonomous vehicles, in particular an ASV and a UAV.
Figure 3. Multi-robot framework for oil spill mitigation.
Figure 4. (Left) Repulsive forces applied to the ASV for oil spill avoidance. (Right) Resultant force from the repulsive and attractive forces.
Figure 5. New point based on three consecutive contour points.
Figure 6. Coverage area of the powder-spreader nozzle.
Figure 7. UAV path planning with the proposed algorithm.
Figure 8. ASV ROAZ II adapted for oil spill mitigation.
Figure 9. UAVs STORK I (left) and GRIFO-X (right) prepared for oil spill mitigation missions.
Figure 10. Conceptual architecture of the release systems for both autonomous vehicles. (Left) UAV release system for lyophilized powder spreading. (Right) ASV release system, able to mix the lyophilized powder with native water.
Figure 11. Release systems developed for both UAVs. (Left) Release system for UAV STORK I, with a capacity of 1 kg. (Right) Release system for UAV GRIFO-X, with a capacity of 7 kg per reservoir (14 kg overall).
Figure 12. ASV release system: water pump and control system box.
Figure 13. Oil spill simulation environment developed in Gazebo to provide the oil spill to both vehicles during the field tests.
Figure 14. Field tests during the robotics exercise (REX).
Figure 15. Field tests in the ports of Medas, Portugal, and A Coruña, Spain.
Figure 16. Trajectories of both vehicles while mitigating the oil spill during the field test in the Port of Leixões, Portugal.
Figure 17. (Left) Simulated oil spill in the Port of A Coruña. (Right) Ground station 3D graphical user interface for monitoring the mission, showing the positions of both vehicles and the generated path planning.
Figure 18. Trajectories of both vehicles while mitigating the oil spill during the field test in the Port of A Coruña, Spain. (Left) UAV trajectory over the oil spill. (Right) ASV trajectory along the oil spill contour.