
AIAA SciTech Forum, 6-10 January 2020, Orlando, FL
AIAA SciTech 2020 Forum
DOI: 10.2514/6.2020-1092

Guidance Using Multiple Sequential Line-of-Sight Information

Shalini Suresh ∗ and Ashwini Ratnoo †


Indian Institute of Science, Bengaluru-560012, India

This paper considers the problem of UAV guidance using the relative bearing information of
beacons placed along a path. Using proportional navigation guidance, the method utilizes, in
sequence, the line-of-sight information obtained from the beacons. Analysis of the proposed
guidance deduces the asymptotic behaviour of the UAV heading and the minimum number of flight
segments required to achieve it. The paper also presents an algorithm for obtaining the UAV
initial orientation corresponding to a desired final path.
Downloaded by UNIVERSITY OF TEXAS AT AUSTIN on May 31, 2020 | http://arc.aiaa.org | DOI: 10.2514/6.2020-1092

I. Introduction
Autonomous flight of unmanned aerial vehicles in indoor environments, where GPS cannot be relied on, finds varied
applications ranging from office premises to underground mines. Corridor or passage traversal is a ubiquitous problem
in such indoor applications. Quadrotors and helicopters, which can hover, are normally used for such applications.
There have been numerous works towards attaining autonomous navigation in corridors and passages. These can be
broadly classified as optic flow based guidance, RGB-D sensor based guidance and visual servoing.
Biologically inspired optical flow based UAV guidance for flight through indoor corridors is discussed in [1-3]. In [1],
optical flow obtained using a fisheye camera is used to create a depth map around the vehicle. The error with respect to the
center of the corridor is then calculated and used to steer the vehicle through the corridor. In [2, 3], the UAV turn rate is
governed by the difference between the optical flow on the two sides of the image. This causes the UAV to traverse the
corridor by keeping away from the walls. Optic flow based approaches for corridor traversal work well only for similar
image patterns on the two halves of the image. Further, at very low speeds, the optic flow itself can be negligible, which
can compromise safe navigation.
RGB-D sensors, which work in association with an RGB camera to give depth information on a
per-pixel basis, are widely employed for vision based UAV navigation. Reference [4] discusses a multi-point visual
sensing method which guides the vehicle through a corridor by comparing the average distances to a set of points
on each of the two walls. The required motion commands are generated using fuzzy inference systems. A mobile robot
corridor navigation method based on wall-floor boundary line detection from a single image is discussed in [5]. The wall-floor
boundary lines are extracted by applying the Hough transform on the 3D point clouds obtained from an on-board RGB-D
sensor. The vehicle is then made to travel along a reference tracking line specified at a certain distance from one wall
of the corridor. RGB-D based guidance demands complex on-board processing, which can be a limitation.
Visual servoing based autonomous corridor following works on the principle of real-time extraction of wall-floor
boundary lines from the on-board camera image. In [6-8], the method uses two visual features: the position of the
vanishing point and the orientation of the corridor median line projected in the image plane; in [9, 10], the features are
the positions of both the vanishing point and the median line. Visual servoing is sensitive to the camera position and to the
vehicle's initial lateral and angular position in the corridor.

∗ Project Associate, Department of Aerospace Engineering, shalinipanicker9@gmail.com.
† Associate Professor, Department of Aerospace Engineering, ratnoo@iisc.ac.in, Senior Member AIAA.

Copyright © 2020 by Shalini Suresh, Ashwini Ratnoo. Published by the American Institute of Aeronautics and Astronautics, Inc., with permission.
This work considers a wall following UAV guidance scenario using passive beacon information. Passive bearing
information based UAV traversal has the advantage of lower on-board processing and minimal sensor requirements
compared to other image processing based guidance. Moreover, the beacons are easy to identify in a wide range of
lighting conditions. Based on the bearing angle of the UAV with respect to stationary beacons, the vehicle can be
guided to the desired location using suitable guidance commands, as discussed in [11-13]. Guidance laws based on
line-of-sight bearing information have been widely studied in the context of interceptors [14]. Being simple and
flexible, proportional navigation based guidance laws have been studied for impact-angle-controlled interception of targets
[15-19] with field-of-view constraints [18, 19], interception of weaving targets [20], and anti-ballistic interception [21].
As the main contributions of this note, a guidance method is presented for a UAV equipped with a field-of-view limited
sensor to sequentially use the line-of-sight information of beacons placed along a path. The guidance logic generates the
desired path as a concatenation of path segments generated using individual beacon information. Analytic properties of
the guided trajectories show smooth heading convergence and control over the lateral clearance. Simulation studies are
carried out in support of the analytic results.
The remainder of this paper is arranged as follows: Section II presents the problem statement, followed by the proposed
guidance method in Section III. Section IV discusses the characteristics of the guided trajectories. Simulation results
are presented in Section V. The concluding remarks are given in Section VI.

II. Problem statement


Consider a quadrotor UAV which needs to navigate through a passage or follow a wall using the passive bearing
information of a series of beacons placed along the wall. Fig. 1 shows the scenario wherein the beacons Bk, k = 1
to n, are positioned along a straight line with a constant separation l, and the UAV U flies with speed v. The
UAV has an instantaneous heading α, and θk measures the bearing orientation of the kth beacon in a fixed frame. The
objective here is to develop a guidance method which uses the beacon bearing information and steers the vehicle to a
path following the wall with a clearance d, as shown in Fig. 1. It is desired that the guidance law governing the UAV
heading is computationally feasible and generates a bounded turning rate command. It is assumed that the vehicle has
an on-board sensor that can measure the bearing angle of the beacons and that the field-of-view of the sensor is limited.

III. Proposed guidance method


The proposed guidance logic is based on utilizing the line-of-sight information from the beacons in a sequential
manner. The vehicle switches to the subsequent beacon when the previous beacon line-of-sight violates the maximum
field-of-view limit of the onboard sensor.
As shown in Fig. 2, consider that the UAV uses the line-of-sight information of beacon B1 to start with as it moves
forward. Here, αki, αkf, Rki, Rkf, θki, θkf denote the initial and final values of the UAV heading, the range
to the kth beacon, and the line-of-sight angle with respect to the kth beacon in the kth segment, respectively. Initially, beacon B1 is assumed to

[Figure] Fig. 1 Engagement Scenario (schematic: beacons B1 ... Bn spaced l apart along the X-axis; UAV U with speed v, heading α, and clearance d from the wall)

[Figure] Fig. 2 Sequential segments of the flight (trajectory segments guided by B1 and B2, showing the headings α1i, α1f, the ranges R1i, R1f, R2i, R2f, the line-of-sight angles θ1i, θ1f, θ2i, θ2f, and the look-angle limit σmax at the switching instants t1 and t2)

be inside the field-of-view and, therefore, the initial look angle with respect to B1 satisfies

α1i − θ1i < σmax (1)

where σmax is the maximum seeker field-of-view limit. At the instant t1, the look angle with respect to B1 reaches
σmax and the vehicle switches to the line-of-sight information of the next beacon B2. The instant t = t2, as marked in Fig.
2, denotes the time when the line-of-sight with respect to B2 reaches the maximum seeker field-of-view. In a similar
fashion, the vehicle uses subsequent beacon line-of-sight information.
Consider the vehicle to be using the line-of-sight information θk with respect to Bk along the respective segments
of the guided trajectory. We consider proportional navigation as the building block for developing the overall guidance
method. Accordingly, in any segment, the vehicle turn rate is governed by

α̇k = Nk θ̇k (2)

where Nk is the navigation gain used in the kth segment. The governing equations of motion of the UAV can be expressed
as

Ṙk = −v cos(αk − θk) (3)

and

Rk θ̇k = −v sin(αk − θk) (4)

where Rk is the distance of the UAV from the kth beacon. Integrating the system of Eqs. (2)-(4), the trajectory in this
segment can be derived as

Rk = Rki [sin(αk − θk) / sin(αki − θki)]^(1/(Nk − 1)) (5)
where Rk, θk, αk are the instantaneous range, line-of-sight angle, and heading, respectively, during the kth segment. At the
time of switching over to Bk+1, (αk − θk) equals σmax, which upon substitution in Eq. (5) leads to

Rkf = Rki [sin(σmax) / sin(αki − θki)]^(1/(Nk − 1)) (6)

As seen in Fig. 2, switching to the subsequent segment satisfies

αk+1i − θk+1i = σmax − (θk+1i − θkf) (7)

and

Rk+1i = √[(Rkf sin(−θkf))² + (Rkf cos(−θkf) + l)²] (8)

Since the UAV moves on the same side of the line joining the beacons, θki, θkf < 0. The choice of navigation gain used
in the segments is discussed subsequently.
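The switching geometry of Eqs. (7)-(8) can be sketched as a small helper. The end-of-segment values below are assumed illustrative numbers consistent with the Case 1 conditions of Section V, not data read from the paper's figures.

```python
import math

def switch_update(R_kf, theta_kf, sigma_max, l):
    """Initial geometry of segment k+1 from the end of segment k, Eqs. (7)-(8)."""
    dy = R_kf * math.sin(-theta_kf)          # lateral offset from the beacon line
    dx = R_kf * math.cos(-theta_kf) + l      # longitudinal distance to beacon k+1
    R_next = math.hypot(dy, dx)              # Eq. (8)
    theta_next = -math.atan2(dy, dx)         # LOS angle to beacon k+1
    look_next = sigma_max - (theta_next - theta_kf)  # Eq. (7): initial look angle
    return R_next, theta_next, look_next

# Assumed end-of-segment-1 values consistent with Case 1 (Section V).
R_2i, theta_2i, look_2i = switch_update(
    R_kf=2.84, theta_kf=math.radians(-63.5),
    sigma_max=math.radians(70.0), l=5.0)
```

The initial look angle of the new segment drops well below σmax, so the UAV can track the next beacon for a finite arc before the next switch.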

A. Navigation Gain Nk

 
[Figure] Fig. 3 Navigation gain selection logic for the kth segment (solid: guided trajectory up to the look-angle limit σmax; dashed: virtual trajectory reaching αk = 0 at θk = −π/2; gain Nk = −αki/(−π/2 − θki))

During the flight engaging a particular beacon, the guidance objective is to shape the UAV trajectory such that it
tends towards a heading parallel to the wall. This is shown schematically in Fig. 3. Specifically, the objective is to

achieve a desired terminal heading, αk = 0 at θk = −π/2, starting from the initial condition αki and θki. Accordingly,
using Eq. (2), the navigation gain for a particular segment can be deduced as

Nk = (0 − αki) / (−π/2 − θki) (9)
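Eq. (9) is a one-line computation. The numerical values below are the Case 1 initial conditions stated in Section V, used here only for illustration.

```python
import math

def navigation_gain(alpha_ki, theta_ki):
    """Gain driving the virtual trajectory to alpha_k = 0 at theta_k = -pi/2, Eq. (9)."""
    return (0.0 - alpha_ki) / (-math.pi / 2 - theta_ki)

# Case 1 initial conditions from Section V: alpha_1i = 20 deg, theta_1i = -8.13 deg.
N_1 = navigation_gain(math.radians(20.0), math.radians(-8.13))
```

Note that with αki > 0 and θki ∈ (−π/2, 0) the gain is positive and less than unity, unlike the N > 2 values typical of interceptor applications.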

However, it is noteworthy that the look angle (αk − θk) reaches the bound σmax before attaining αk = 0, and at that
instant the UAV switches to the next beacon. Accordingly, during each segment,

−π/2 < θk < 0 (10)

The solid curve in Fig. 3 represents the trajectory guided using beacon Bk, and the dashed trajectory is the virtual part
that connects to the desired final orientation in every segment. The same logic is used in every subsequent segment of
the flight.
Note that the shaping idea here is similar to the one used in Ref. [15]. However, in the considered multiple
line-of-sight application, this represents a virtual trajectory which is followed only partially, incorporating the seeker
field-of-view limit and facilitating guidance in the subsequent segments of the flight.

IV. Characteristics of the guided trajectories


Of particular interest are the resulting heading angle variation and lateral displacement of the UAV with respect to
the wall.

A. UAV heading convergence


At the end of the kth segment, the look angle with respect to the kth beacon becomes equal to the field-of-view limit.
Accordingly,

αkf − θkf = σmax (11)

The maximum look angle limit is assumed to satisfy

σmax < π/2 (12)

Integrating Eq. (2),

αkf − αki = Nk (θkf − θki) (13)

Using Eq. (11) in Eq. (13),

αkf − αki = Nk (αkf − σmax − θki) (14)

Rearranging,

αkf = [Nk σmax − (αki − Nk θki)] / (Nk − 1) (15)

[Figure] Fig. 4 Proposed guidance logic (segments guided by B1 ... B4 with gains N1 ... N4 and separation l; heading αnf → 0 with clearance dmax)

Using Eq. (15), the heading angle at the end of segment 1 can be expressed as

α1f = [N1 σmax − (α1i − N1 θ1i)] / (N1 − 1) (16)

Using Eq. (9),

N1 = −α1i / (−π/2 − θ1i) (17)

Using Eqs. (16) and (17),

α1f = α1i (π/2 − σmax) / (π/2 − (α1i − θ1i)) (18)

Since the end of segment 1 marks the beginning of segment 2,

α2i = α1f (19)

Using Eqs. (9), (15) and (19), the UAV heading at the end of segment 2 can be deduced as

α2f = α1i · [(π/2 − σmax) / (π/2 − (α1i − θ1i))] · [(π/2 − σmax) / (π/2 − (α2i − θ2i))] (20)

In a similar fashion, the heading angle at the end of the nth segment can be expressed as a function of the initial heading
angle as

αnf = α1i · [(π/2 − σmax) / (π/2 − (α1i − θ1i))] · [(π/2 − σmax) / (π/2 − (α2i − θ2i))] · · · [(π/2 − σmax) / (π/2 − (αni − θni))] (21)
Proposition IV.1. Subject to the kinematics (2)-(4) and the choice of navigation gain as governed by (9), the UAV
heading asymptotically approaches zero as n → ∞.

Proof. Using Eq. (21), the heading angle at the end of the nth segment can be expressed in a consolidated form as

αnf = α1i ∏_{k=1}^{n} βk (22)

where

βk = (π/2 − σmax) / (π/2 − (αki − θki)) (23)

Using Eqs. (1), (12) and (23), it can be deduced that

β1 < 1 (24)

Using the geometry shown in Fig. 2, it can be readily deduced that

θki = −tan⁻¹[ Rk−1f sin(−θk−1f) / (Rk−1f cos(−θk−1f) + l) ] (25)

The separation between the beacons, l, is positive and hence Eqs. (10) and (25) imply

θki > θk−1f,   θk−1f, θki < 0 (26)


Using Eq. (26) in (7) leads to

αki − θki < σmax (27)

which, upon substitution in (23), yields

βk < 1 for k ∈ [2, n] (28)

From Eqs. (24) and (28),

βk < 1 for k ∈ [1, n] (29)

which implies

lim_{n→∞} αnf = lim_{n→∞} α1i ∏_{k=1}^{n} βk → 0 (30)
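The contraction of the heading through Eqs. (22)-(23) can be checked numerically. The segment-2 initial values below are assumed ones, hand-propagated from the Case 1 conditions of Section V; they are illustrative rather than values read off the paper's plots.

```python
import math

def beta(alpha_ki, theta_ki, sigma_max):
    """Per-segment heading contraction factor, Eq. (23)."""
    return (math.pi / 2 - sigma_max) / (math.pi / 2 - (alpha_ki - theta_ki))

sigma_max = math.radians(70.0)
alpha_1i, theta_1i = math.radians(20.0), math.radians(-8.13)   # Case 1, Section V
alpha_2i, theta_2i = math.radians(6.47), math.radians(-22.08)  # assumed segment-2 values
b1 = beta(alpha_1i, theta_1i, sigma_max)
b2 = beta(alpha_2i, theta_2i, sigma_max)
alpha_2f = alpha_1i * b1 * b2        # Eq. (22) with n = 2
```

Both factors are below unity, so the heading at the end of segment 2 is a small fraction of α1i, consistent with Proposition IV.1.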

From Proposition 1, it can be observed that it takes an infinite number of beacons to achieve the desired final heading.
However, a practical limit can be obtained by considering a negligible final error, as shown in Proposition 2.

Proposition IV.2. A conservative minimum number of segments, nmin, required for the UAV heading to be less than one
percent of its initial value is given by nmin = ⌈1 + ln(0.01/β1) / ln β2⌉.

Proof. Using Eq. (7) and Eq. (23), βk can be expressed as

βk = (π/2 − σmax) / (π/2 − (σmax + θk−1f − θki)) (31)

With respect to two consecutive beacons, a generalized form of Eq. (25) relates the line-of-sight angles as

θk = −tan⁻¹[ Rk−1 sin(−θk−1) / (Rk−1 cos(−θk−1) + l) ] (32)

Taking the partial derivative of Eq. (32) with respect to θk−1,

∂θk/∂θk−1 = −[ 1 / (1 + (Rk−1 sin(−θk−1) / (Rk−1 cos(−θk−1) + l))²) ] · [ −(Rk−1 cos(−θk−1) + l)(Rk−1 cos(−θk−1)) − (Rk−1 sin(−θk−1))(Rk−1 sin(−θk−1)) ] / (Rk−1 cos(−θk−1) + l)² (33)

⇒ ∂θk/∂θk−1 = [ (Rk−1)² + Rk−1 l cos(−θk−1) ] / [ (Rk−1)² + 2Rk−1 l cos(−θk−1) + l² ] (34)

Since Rk−1 > 0 and l > 0, from Eq. (34) it can be deduced that

0 < ∂θk/∂θk−1 < 1 (35)
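The bound in Eq. (35) can be verified by comparing the closed form of Eq. (34) against a finite-difference derivative of Eq. (32); the range, spacing and angle below are arbitrary test values satisfying Rk−1 > 0, l > 0 and θk−1 ∈ (−π/2, 0).

```python
import math

def theta_k(theta_prev, R_prev, l):
    """LOS angle recursion between consecutive beacons, Eq. (32)."""
    return -math.atan2(R_prev * math.sin(-theta_prev),
                       R_prev * math.cos(-theta_prev) + l)

def dtheta_dtheta_prev(theta_prev, R_prev, l):
    """Closed-form partial derivative, Eq. (34)."""
    c = math.cos(-theta_prev)
    return (R_prev**2 + R_prev * l * c) / (R_prev**2 + 2 * R_prev * l * c + l**2)

R_prev, l, th = 3.0, 5.0, -1.0     # arbitrary positive R, l; theta in (-pi/2, 0)
analytic = dtheta_dtheta_prev(th, R_prev, l)
h = 1e-6
numeric = (theta_k(th + h, R_prev, l) - theta_k(th - h, R_prev, l)) / (2 * h)
```

The central difference matches the closed form, and the value lies strictly inside (0, 1) as claimed.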

From Eqs. (22) and (29),

αkf < αk−1f (36)

Using Eqs. (11) and (36),

θkf < θk−1f (37)

Combining Eqs. (31), (35) and (37),

βk+1 < βk for k ∈ [2, n − 1] (38)

Hence, using Eqs. (22) and (38), a conservative approximation of the heading at the end of the kth segment can be
obtained as

αkf ≈ α1i β1 β2^(k−1) (39)

⇒ αkf / (α1i β1) ≈ β2^(k−1) (40)

which upon imposing αkf = 0.01 α1i results in

k = 1 + ln(0.01/β1) / ln β2 (41)

Since the number of segments can only be a positive integer, the minimum number of segments required for α < 0.01 α1i
can be obtained from Eq. (41) as

nmin = ⌈1 + ln(0.01/β1) / ln β2⌉ (42)

where ⌈·⌉ represents the ceiling function that computes the smallest integer greater than or equal to the given value. ∎
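Eq. (42) is straightforward to evaluate. The β values used below are assumed ones of the magnitude seen in the Case 1 conditions of Section V; they are illustrative inputs, not values quoted in the paper.

```python
import math

def n_min(beta1, beta2):
    """Conservative minimum number of segments for a 1% residual heading, Eq. (42)."""
    return math.ceil(1 + math.log(0.01 / beta1) / math.log(beta2))
```

For β1 ≈ 0.323 and β2 ≈ 0.325 the bound gives five segments, which agrees with the segment count reported for Case 1 in Section V.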

B. Cross Track Clearance


The variation of the cross track clearance of the vehicle from the wall, dk, is governed by

ḋk = v sin αk (43)

For αk ∈ (0, π/2), ḋk > 0 and the lateral displacement from the wall increases throughout. As governed by Proposition
1 and Eq. (43),

lim_{k→∞} ḋk → 0 (44)

Hence, the UAV converges to a path parallel to the wall. From Fig. 2, the cross track clearance at the end of the kth segment
can be obtained as

dk = Rkf sin(−θkf) (45)

From Eq. (23), it can be seen that a greater α1i yields a greater β1. Hence, Eq. (39) implies that the greater the α1i, the
greater is the approximate heading at the end of all successive segments. With a higher average heading in the segments, a higher α1i
eventually leads to a higher ḋk, as governed by Eq. (43). Accordingly, a higher clearance can be achieved with a higher
initial heading angle of the UAV. Consider the UAV initial position as (xu0, yu0) = (0, 1) m and the beacon positions as
(xbk, ybk) = (5 + (k − 1)5, 0) m, k = 1, 2, ..., nmin, where nmin is computed using Eq. (42). The corresponding variation
in dnmin with respect to α1i is plotted in Fig. 5.
The initial heading is varied from 0 to the maximum possible value considering the field-of-view limit,
α1i = σmax + θ1i. Using this relationship as well as the UAV initial position, which gives θ1i = −11.31°, and considering
σmax = ±70°, the limiting value is obtained as α1i = 58.69°. The minimum clearance corresponds to zero initial UAV
heading and is given as dmin = yu0 = 1 m, whereas the maximum clearance, obtained using α1i = 58.69°, is
dmax = 13.23 m.

[Figure] Fig. 5 Variation of dnmin with α1i (clearance grows from dmin = 1 m at α1i = 0 toward its maximum near the field-of-view limit)

In practical applications, it is desirable to have a specific final clearance. The value of α1i required for obtaining the
desired cross track clearance ddes can be found using a backward one-dimensional search method as given by Algorithm
1. The Algorithm is based on recursively computing the clearance at the end of the kth segment as a function of the UAV
initial heading, and incorporates an error tolerance of ε.

V. Simulation Results
Sample simulations are carried out to validate the proposed guidance method. The beacons are assumed to be
placed along a wall with a constant separation l = 5 m, with the first beacon at (7, 0) m. The initial location of the UAV is
(xu0, yu0) = (0, 1) m, which corresponds to an initial line-of-sight angle with respect to the first beacon of θ1i = −8.13 deg.
Unless specified otherwise, the UAV's maximum seeker field-of-view limit is considered as σmax = ±70 deg.

Algorithm 1 One-dimensional numerical search for α1i which satisfies ddes
1: Input R1i, θ1i, l, ε, σmax, ddes ∈ (dmin, dmax), Δα1i
2: Set α1i = 0.01
3: Compute α2i using Eq. (15) and θ2i using Eqs. (11) and (25).
4: Compute the minimum number of beacons, nmin, using Eq. (42)
5: for k = 1 to nmin do
6: Compute Nk using Eq. (9)
7: Compute αkf and θkf using Eqs. (15) and (11), respectively
8: Compute Rkf using Eq. (6)
9: Compute θk+1i and Rk+1i using Eqs. (25) and (8), respectively.
10: Set αk+1i = αkf
11: end for
12: Compute the wall clearance dk using Eq. (45).
13: if |dk − ddes| < ε then output α1i
14: else update α1i = α1i + Δα1i
15: Repeat steps 3 to 13.
16: end if
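As a concrete sketch, Algorithm 1 can be implemented directly from the closed-form segment relations. The code below is illustrative only: it treats the tolerance ε as a bound on the clearance error in meters and assumes a search step of 0.001 rad, neither of which is specified in the paper.

```python
import math

def segment(alpha_i, theta_i, R_i, sigma_max, l):
    """One closed-form flight segment: Eqs. (9), (15), (11), (6), (25), (8)."""
    N = alpha_i / (math.pi / 2 + theta_i)                          # Eq. (9)
    alpha_f = (N * sigma_max - (alpha_i - N * theta_i)) / (N - 1)  # Eq. (15)
    theta_f = alpha_f - sigma_max                                  # Eq. (11)
    R_f = R_i * (math.sin(sigma_max)
                 / math.sin(alpha_i - theta_i)) ** (1.0 / (N - 1))  # Eq. (6)
    dy = R_f * math.sin(-theta_f)
    dx = R_f * math.cos(-theta_f) + l
    return alpha_f, theta_f, R_f, -math.atan2(dy, dx), math.hypot(dy, dx)  # (25), (8)

def final_clearance(alpha_1i, theta_1i, R_1i, sigma_max, l):
    """Clearance at the end of n_min segments, Eqs. (42) and (45)."""
    half_pi = math.pi / 2
    beta1 = (half_pi - sigma_max) / (half_pi - (alpha_1i - theta_1i))   # Eq. (23)
    alpha_2i, _, _, theta_2i, _ = segment(alpha_1i, theta_1i, R_1i, sigma_max, l)
    beta2 = (half_pi - sigma_max) / (half_pi - (alpha_2i - theta_2i))
    n_min = math.ceil(1 + math.log(0.01 / beta1) / math.log(beta2))     # Eq. (42)
    a, th, R = alpha_1i, theta_1i, R_1i
    for _ in range(n_min):
        a, th_f, R_f, th, R = segment(a, th, R, sigma_max, l)
    return R_f * math.sin(-th_f)                                        # Eq. (45)

def search_alpha_1i(R_1i, theta_1i, l, sigma_max, d_des, eps=0.05, step=0.001):
    """Algorithm 1: march alpha_1i forward until |d - d_des| < eps."""
    alpha_1i = 0.01                          # step 2 of Algorithm 1, radians
    while alpha_1i < sigma_max + theta_1i:   # feasibility limit from the field of view
        d = final_clearance(alpha_1i, theta_1i, R_1i, sigma_max, l)
        if abs(d - d_des) < eps:             # step 13
            return alpha_1i
        alpha_1i += step                     # step 14
    return None
```

With the Section V geometry (first beacon at (7, 0) m, l = 5 m, σmax = 70 deg) and ddes = 4 m, this search returns an initial heading close to the 27.52 deg reported for Case 2.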

A. Case 1: Sample scenario


This case considers the UAV initial heading as α1i = 20 deg. Each segment is generated successively, using the
bearing information of the beacons, one at a time. The resulting trajectory is shown in Fig. 6a, wherein the UAV settles
to a path parallel to the wall. Figure 6b shows the UAV heading variation during the engagement. It can be seen that the
heading reduces monotonically from the given initial value of 20 deg and approaches zero asymptotically, which is in
accordance with Proposition 1. By Proposition 2, the number of segments after which the heading decreases to below
1% of its initial value is found to be 5. As shown in the zoomed-in part of Fig. 6b, the heading reduces to 1% of the
initial value in accordance with the theoretical result. Figure 6c shows the variation of the cross track clearance of the
vehicle, d, with respect to the wall. The clearance increases from the initial value of 1 m and smoothly converges to a
final value of 3.2 m.
Figure 6d shows the variation of the UAV-beacon line-of-sight angle in each of the segments. During a segment, θ̇k < 0,
as obtained using Eq. (4) with (αk − θk) ∈ (0, π/2) for any k ∈ [1, n]. Further, the increase in line-of-sight angle (θkf to
θk+1i) at switching is in accordance with Eq. (25). The corresponding segment-wise look-angle variation is plotted in
Fig. 6e. The bearing information of a particular beacon is used until the look angle of the beacon with respect to the
vehicle reaches the maximum seeker field-of-view limit, σmax = 70 deg. The variation of βk is shown in Fig. 6f. It can
be seen that βk < 1 for all segments, as given by Eq. (29), and satisfies the relation given by Eq. (38), as can be seen
in the zoomed-in part. Moreover, the nearly negligible differences between the βk in different segments, as seen from
the plot, justify the approximation made in Eq. (39).
Figures 7a and 7b show the variations of the navigation gain and the UAV turn rate during the engagement. The navigation gain
as well as the turn rate converge to zero as the UAV heading angle approaches zero.
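The closed-form analysis can also be cross-checked by integrating the kinematics of Eqs. (2)-(4) directly. In the sketch below, the UAV speed v = 2 m/s and the Euler step are assumptions (the paper does not state the speed used); the beacon layout follows Case 1.

```python
import math

def simulate(alpha_1i_deg, beacons, sigma_max_deg=70.0, v=2.0, dt=1e-3):
    """Euler integration of Eqs. (2)-(4) with sequential beacon switching."""
    sigma_max = math.radians(sigma_max_deg)
    x, y, alpha = 0.0, 1.0, math.radians(alpha_1i_deg)   # UAV starts at (0, 1) m
    k = 0
    xb, yb = beacons[k]
    N = alpha / (math.pi / 2 + math.atan2(yb - y, xb - x))   # gain, Eq. (9)
    while True:
        theta = math.atan2(yb - y, xb - x)       # LOS angle to the current beacon
        R = math.hypot(xb - x, yb - y)
        if alpha - theta >= sigma_max:           # look-angle limit: switch beacon
            k += 1
            if k == len(beacons):
                break
            xb, yb = beacons[k]
            theta = math.atan2(yb - y, xb - x)
            N = alpha / (math.pi / 2 + theta)    # re-evaluate the gain, Eq. (9)
        theta_dot = -v * math.sin(alpha - theta) / R   # Eq. (4)
        alpha += N * theta_dot * dt              # Eq. (2)
        x += v * math.cos(alpha) * dt
        y += v * math.sin(alpha) * dt
    return math.degrees(alpha), y

# Case 1 geometry: five beacons from (7, 0) m with l = 5 m, alpha_1i = 20 deg.
alpha_end_deg, clearance = simulate(20.0, [(7.0 + 5.0 * i, 0.0) for i in range(5)])
```

For these values the heading decays below 1% of 20 deg by the fifth segment and the clearance settles near 3.2 m, consistent with the trends described for Fig. 6.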

[Figure] Fig. 6 Results for Case 1: α1i = 20 deg. Panels: (a) UAV trajectory; (b) UAV heading; (c) cross track distance of the UAV from the wall; (d) UAV-beacon line-of-sight angle; (e) look angle variation; (f) variation in βk

[Figure] Fig. 7 Navigation gain and turn rate for Case 1: α1i = 20 deg. Panels: (a) navigation gain; (b) UAV turn rate

B. Case 2: Achieving different clearances


It may often be desirable to have the UAV settle to a specific clearance from the wall. This case considers three
different desired final cross track clearances: ddes = 4 m, 5 m and 7 m. Using Algorithm 1 with these desired final
clearances, the UAV initial headings are deduced as α1i = 27.52 deg, 35.27 deg and 46.55 deg, respectively.
The UAV trajectories for the three scenarios are plotted in Fig. 8a. The corresponding cross track clearance
variations are shown in Fig. 8b. The clearances increase from the initial value of 1 m and converge smoothly to the
desired final values corresponding to the three different α1i . Figure 8c shows the UAV heading variation. It can be seen
that higher initial heading leads to higher heading at the end of every successive segment as governed by Eqs. (22) and
(23). This leads the UAV to settle to a path with a higher clearance from the wall.

C. Case 3: Beacon failure


This subsection studies the effectiveness of the guidance algorithm in the event of a beacon failure. Consider the UAV
initial heading as α1i = 20 deg and that B3 is malfunctioning. The trajectory of the UAV is shown in Fig. 9a. The
vehicle heading variation is shown in Fig. 9b. Figure 9c shows the look-angle variation. The lower initial look angle for
the fourth segment is due to the failure of B3, owing to which the UAV has to switch from the second beacon directly to the
fourth one. The navigation gain variation during the engagement is plotted in Fig. 9d. The UAV turn rate required is
shown in Fig. 9e. Figure 9f shows the variation of βk. The results highlight the robustness of the proposed sequential
guidance method to intermediate beacon failure.

[Figure] Fig. 8 Results for Case 2: ddes = 4 m, 5 m and 7 m. Panels: (a) trajectories; (b) cross track distance variation; (c) heading angle variation

[Figure] Fig. 9 Results for Case 3: Beacon B3 failure, α1i = 20 deg. Panels: (a) trajectory; (b) heading variation; (c) look-angle variation; (d) navigation gain variation; (e) UAV turn rate; (f) variation in βk

[Figure] Fig. 10 Results for Case 4: Various seeker field-of-view limits, α1i = 20 deg. Panels: (a) UAV trajectories; (b) heading variation; (c) cross track clearance variation; (d) maximum cross track distance vs σmax

D. Case 4: Various seeker field-of-view limits


In this case, the effect of the seeker field-of-view limit on the proposed guidance logic is studied. Simulations are
carried out for various seeker field-of-view limits, considering the UAV initial heading as α1i = 20 deg. The resulting UAV
trajectories are shown in Fig. 10a. The UAV heading varies as shown in Fig. 10b. From Eq. (18), it can be seen that, for
a given α1i and θ1i, α1f is lower for a higher σmax. This leads to a lower average α in a segment and a correspondingly
lower cross track distance variation. The variation of the UAV cross track distance is shown in Fig. 10c. Figure 10d shows
the variation of the maximum cross track clearance with σmax.

VI. Conclusion
This paper considers the problem of UAV navigation using the relative bearing information of beacons placed along a
path. Using proportional navigation guidance, the method utilizes, in sequence, the line-of-sight information obtained
from the beacons. Analysis of the guided trajectories deduces the asymptotic behaviour of the UAV heading and the minimum
number of flight segments required to achieve it. The paper also presents an algorithm for obtaining the UAV initial
orientation corresponding to a desired final path. Simulation studies validate the analytical findings and also present
robustness studies for the case of beacon failure.
Passive information based navigation finds a variety of applications in indoor UAV flight, and this paper
provides a simple and easily implementable guidance method addressing that. Potential future work includes developing
guidance methods for various beacon spatial arrangements.

References
[1] Zingg, S., Scaramuzza, D., Weiss, S., and Siegwart, R., “MAV navigation through indoor corridors using optical flow,” In 2010 IEEE International Conference on Robotics and Automation, Anchorage, AK, USA, May 2010, pp. 3361-3368. doi: 10.1109/ROBOT.2010.5509777
[2] Agrawal, P., Ratnoo, A., and Ghose, D., “Inverse optical flow based guidance for UAV navigation through urban canyons,” Aerospace Science and Technology, Vol. 68, 2017, pp. 163-178. doi: 10.1016/j.ast.2017.05.012
[3] Agrawal, P., Ratnoo, A., and Ghose, D., “A composite guidance strategy for optical flow based UAV navigation,” IFAC Proceedings Volumes, Vol. 47, No. 1, 2014, pp. 1099-1103. doi: 10.3182/20140313-3-IN-3024.00151
[4] Purwanto, D., Rivai, M., and Soebhakti, H., “Vision-based multi-point sensing for corridor navigation of Autonomous Indoor Vehicle,” In 2017 International Conference on Electrical Engineering and Computer Science (ICECOS), Palembang, Indonesia, Aug. 2017, pp. 67-70. doi: 10.1109/ICECOS.2017.8167168
[5] Qian, K., Chen, Z., Ma, X., and Zhou, B., “Mobile robot navigation in unknown corridors using line and dense features of point clouds,” In IECON 2015 - 41st Annual Conference of the IEEE Industrial Electronics Society, Yokohama, Japan, Nov. 2015, pp. 1831-1836. doi: 10.1109/IECON.2015.7392367
[6] Ben-Said, H., Stéphant, J., and Labbani-Igbida, O., “Sensor localization for visual servoing stability, application to corridor following,” IFAC-PapersOnLine, Vol. 50, No. 1, 2017, pp. 2223-2228. doi: 10.1016/j.ifacol.2017.08.929
[7] Pasteau, F., Babel, M., and Sekkal, R., “Corridor following wheelchair by visual servoing,” IEEE/RSJ International Conference on Intelligent Robots and Systems, Tokyo, Japan, Nov. 2013, pp. 590-595. doi: 10.1109/IROS.2013.6696411
[8] Pasteau, F., Narayanan, V. K., Babel, M., and Chaumette, F., “A visual servoing approach for autonomous corridor following and doorway passing in a wheelchair,” Robotics and Autonomous Systems, Vol. 75, Part A, 2016, pp. 28-40. doi: 10.1016/j.robot.2014.10.017
[9] Toibero, J. M., Soria, C. M., Roberti, F., Carelli, R., and Fiorini, P., “Switching visual servoing approach for stable corridor navigation,” In 2009 International Conference on Advanced Robotics, Munich, Germany, June 2009, pp. 1-6.
[10] Faragasso, A., Oriolo, G., Paolillo, A., and Vendittelli, M., “Vision-based corridor navigation for humanoid robots,” In 2013 IEEE International Conference on Robotics and Automation, Karlsruhe, Germany, May 2013, pp. 3190-3195. doi: 10.1109/ICRA.2013.6631021
[11] Trinh, M. H., Ko, G. H., Pham, V. H., Oh, K. K., and Ahn, H. S., “Guidance using bearing-only measurements with three beacons in the plane,” Control Engineering Practice, Vol. 51, 2016, pp. 81-91. doi: 10.1016/j.conengprac.2016.03.013
[12] Bekris, K. E., Argyros, A. A., and Kavraki, L. E., “Angle-based methods for mobile robot navigation: Reaching the entire plane,” In IEEE International Conference on Robotics and Automation, New Orleans, LA, USA, May 2004, pp. 2373-2378. doi: 10.1109/ROBOT.2004.1307416
[13] Trinh, M. H., Pham, V. H., Hoang, P. H., Son, J. H., and Ahn, H. S., “A new bearing-only navigation law,” 17th International Conference on Control, Automation and Systems (ICCAS), Jeju, South Korea, Oct. 2017, pp. 117-122. doi: 10.23919/ICCAS.2017.8204428
[14] Zarchan, P., Tactical and Strategic Missile Guidance, Vol. 239, 6th ed., AIAA, Reston, VA, 2012, pp. 14-18. doi: 10.2514/4.868948
[15] Ratnoo, A., and Ghose, D., “Impact Angle Constrained Interception of Stationary Targets,” Journal of Guidance, Control, and Dynamics, Vol. 31, No. 6, 2008, pp. 1817-1822. doi: 10.2514/1.37864
[16] Lu, P., “Adaptive Terminal Guidance for Hypervelocity Impact in Specified Direction,” Journal of Guidance, Control, and Dynamics, Vol. 29, No. 2, 2006, pp. 269-278. doi: 10.2514/1.14367
[17] Erer, K. S., and Merttopcuoglu, O., “Indirect Impact-Angle-Control Against Stationary Targets Using Biased Pure Proportional Navigation,” Journal of Guidance, Control, and Dynamics, Vol. 35, No. 2, 2012, pp. 700-704. doi: 10.2514/1.52105
[18] Ratnoo, A., “Analysis of Two-Stage Proportional Navigation with Heading Constraints,” Journal of Guidance, Control, and Dynamics, Vol. 39, No. 1, 2016, pp. 156-164. doi: 10.2514/1.G001262
[19] Erer, K. S., and Tekin, R., “Look Angle Constrained Impact Angle Control Based on Proportional Navigation,” AIAA Guidance, Navigation, and Control Conference, Kissimmee, FL, USA, Jan. 2015, AIAA Paper 2015-0091. doi: 10.2514/6.2015-0091
[20] Zarchan, P., “Proportional Navigation and Weaving Targets,” Journal of Guidance, Control, and Dynamics, Vol. 18, No. 5, 1995, pp. 969-974. doi: 10.2514/3.21492
[21] Zarchan, P., “Kill Vehicle Guidance and Control Sizing for Boost-Phase Intercept,” Journal of Guidance, Control, and Dynamics, Vol. 34, No. 2, 2011, pp. 513-521. doi: 10.2514/1.50927
