
2020 FORTEI-International Conference on Electrical Engineering (FORTEI-ICEE)

Vision-Based Line Following Robot in Webots


Alfian Ma’arif
Department of Electrical Engineering
Universitas Ahmad Dahlan
Yogyakarta, Indonesia
alfianmaarif@ee.uad.ac.id

Aninditya Anggari Nuryono*
Department of Electrical Engineering
Universitas Gadjah Mada
Yogyakarta, Indonesia
anindityanuryono@gmail.com
*corresponding author

Iswanto
Department of Electrical Engineering
Universitas Muhammadiyah Yogyakarta
Yogyakarta, Indonesia
iswanto_te@umy.ac.id

Abstract—The line following robot is one of the popular robots commonly used for educational purposes. The most widely used sensors for these robots are photoelectric sensors. However, such sensors have become less relevant with the development of autonomous vehicles and robotic vision. Robotic vision refers to a robot that can obtain information through image processing of the camera feed. The camera installed on the line following robot detects image-based lines and navigates the robot to follow the path. This paper proposes an image preprocessing method together with the corresponding robot actions for line following robots. The method includes image preprocessing steps such as dilation, erosion, Gaussian filtering, contour search, and centerline definition to detect path lines and to determine the proper robot action. The implementation of the robot is simulated in the Webots simulator. OpenCV and Python are used to design the line detection system and the robot movements. The simulation results show that the method works properly, and the robot can follow different types of path lines such as zigzag, dotted, and curved lines. The resolution of the cropped image frame is the fundamental parameter in detecting path lines.

Keywords—Webots, line-following, robot, OpenCV, vision

I. INTRODUCTION

Rapid advances in robotics enable robots to perform specific tasks such as monitoring [1], delivering goods [2], playing football [3][4][5], sorting fruit by type or color, or automatically watering plants [6]. The line following robot is another popular robot commonly used for educational purposes [7]. Line following robots are often used to introduce students to programming a robot to move in a specific direction or at a controlled speed.

Line following robots are robots that can follow or track particular lines autonomously. The most commonly used sensors for these robots are photoelectric sensors such as photodiodes [8] or light-dependent resistors (LDR) [9].

However, photoelectric sensors are no longer adequate for advanced line following robots, given the development of autonomous vehicles. Photoelectric sensors have some tricky limitations: the basic photoelectric line-detection design is strongly affected by external or additional light, struggles to recognize varying colors, and has a minimal sensing range.

Some modified designs of photoelectric sensors for line following have been proposed, but they are only suitable for specific conditions [10][11]. With the era of robotic vision, optical cameras are now considered an alternative sensor for line following robots [12].

This paper proposes an image processing method and its implementation on a line following robot. The robot has a camera as the main component enabling autonomous movement. Visual-based robot movement has become a research trend in robotics. A visual representation of the environment that is close to reality is an essential part of image-based control [13]. It is processed as feedback to move the robot from the starting point to the destination point. In this paper, the camera is used as a sensor to detect the path or line that the robot will traverse. The robot follows the relative position of the black line in the black-and-white processed image.

Webots is a virtual-environment simulator that can be used as a robotic vision simulator [14][15]. This paper uses the Webots virtual environment with the KheperaIV as the line following robot. OpenCV is used for processing the images from the robot's camera.

The control system of a camera-equipped robot plays an important role in image-based autonomous robot motion systems [16]. Image-based systems can provide detailed information about the environment and navigation obtained through image processing.

There has been some research related to line following robots, such as the work by Li-Hong Juang et al. [17]. They aimed at visual navigation for humanoid robots. Their proposed method was a square search to extract lines from line images. The method not only improves the real-time navigation performance of NAO robots but also increases the accuracy of line detection. The robot can walk along different line shapes with low deviation using PID control.

Vision-based line detection was also studied by He et al. [18] using a multiple-line detection algorithm based on pixel histograms. Lines are viewed from a bird's-eye perspective to obtain bend pixels and ignore horizontal lines that intersect each other. Image-based robot control was also carried out by Hsu et al. [19], who proposed an IP camera and the MATLAB toolbox for image processing to detect black lines. A cascade intelligent control system is also used to stabilize the robot and to achieve the desired tracking performance.

Line following robots have also served as assistant robots, such as the one built by Kader et al. [20] to deliver particular objects; that robot uses a PID controller and infrared sensors to detect paths [21][22]. A line following robot using artificial neural networks with infrared sensors was proposed by Roy et al. [23]; it can follow curved paths smoothly. Besides Arduino [24][25], line following robots can also use the ATMega32A [26][27] and ATMega328 Pro [28] microcontrollers.

The first section of the paper is the introduction. The second section is about modeling a mobile robot. The third section discusses the proposed method. The fourth section is

about results and discussion. The last section covers conclusions and future work.

978-1-7281-9434-9/20/$31.00 ©2020 IEEE

II. PROPOSED METHOD

The research used Python 3.7 and OpenCV 4 to build the visual simulation of the KheperaIV as a line following robot. The proposed method consists of three main steps: initialization, image processing, and robot action.

Initialization is done based on the parameters of the robot and the environment, such as the camera selection and the generated path line. In other words, initialization defines each parameter used in the simulation, including the robot's parameters and the virtual environment's parameters.

In image preprocessing, the image of the path line is captured by the robot's camera. The image is used as an input and cropped to a specific width and length of the frame. After that, the image, which is still in RGB color, is converted into a grayscale image.

The RGB image is converted into grayscale using (1), where Red is the red value, Green is the green value, and Blue is the blue value of each pixel. The result is shown in Figure 1.

Gray = 0.3·Red + 0.59·Green + 0.11·Blue    (1)

The next step is to reduce the noise found in the image. Gaussian blur filtering is used for blurring in image preprocessing. It creates an autofocus-like effect by reducing detail and adding a slight fog.

The 2-dimensional Gaussian filter can be written as (2), where σ is the standard deviation. The result of this preprocessing step is shown in Figure 2.

G = (1 / (2πσ²)) · e^(−(x² + y²) / (2σ²))    (2)

Then, the grayscale image is converted into binary form. A binary threshold is used for segmentation so that the robot's path line can be obtained from a black-and-white image. The result is shown in Figure 3.

The next step is the dilation and erosion process. Dilation is the process of adding pixels to the edges of objects in the image, while erosion is the process of removing pixels. The erosion process is shown as (3), where B_z is the translation of B by the vector z, E is the Euclidean space, and A is a binary image.

A ⊖ B = {z ∈ E | B_z ⊆ A}    (3)

Meanwhile, the dilation of a binary image can be written as (4), where A_b is the translation of A with respect to b, A is a binary image, and B is the structuring element.

A ⊕ B = ⋃_{b∈B} A_b    (4)

The result of the dilation and erosion process is the result of image preprocessing. These data then enter the image processing steps. After the dilation and erosion process, the contour values are determined. In other words, this is the process of finding and detecting the edge contours of any shapes found in the image. A contour is a curve that connects all continuous points having the same color and light intensity.

These data are processed in further image processing steps to obtain the centerline point, which determines the robot action or movement. The equations used to find the central or centroid point of the line, with a 375×752 resolution, can be written as (5) and (6), where x̄ and ȳ are the centers along the x-axis and y-axis respectively, and m_ji is the central moment.

m_ji = Σ_{x,y} ( array(x, y) · (x − x̄)^j · (y − ȳ)^i )    (5)

x̄ = m10 / m00,  ȳ = m01 / m00    (6)

The result of the process is shown in Figure 4.

The simulation in Webots displays the KheperaIV as the robot, the black line as the path to be followed, and the camera view as the robot's vision of the environment. The robot vision and the line detection algorithm are shown in Figure 5 and Figure 6.

Based on Figure 6, the robot action is the robot movement, such as move forward, turn left, turn right, and turn around, as the response to the detected line based on the result of the image processing; the action depends on the detected line center value.

Fig. 1. Grey image

Fig. 2. Gaussian blur image

Fig. 3. Binary threshold result

Fig. 4. Centerline detection with 375×752 resolution: (a) center of x and y axis, (b) center of x-axis: move forward, (c) center of x-axis: turn left, (d) center of x-axis: turn right, (e) center of y-axis: turn around

TABLE I. ROBOT TRAJECTORY AND TRACK LINE
(Each row pairs the robot trajectory with the corresponding track line: Curved, Zigzag, Dotted, Line path 1, and Line path 2.)

Fig. 5. Line path and KheperaIV robot vision

Fig. 6. Line following detection system diagram: Start → Image Preprocessing → Find Contour of frame → if Contours == 1, find CenterX and CenterY: CenterX > Max_Threshold → Turn Right; Min_Threshold < CenterX < Max_Threshold → Forward; CenterX < Mid_Threshold → Turn Left. If no single contour is found: CenterY > Min_Threshold → Turn Around; otherwise No Line Detected

When the center value on the x-axis of the line is greater than the maximum threshold, the robot turns right. The robot moves forward when the x-axis center value of the line is between the maximum and minimum threshold values. It turns around when the robot meets a dead end or does not detect any line and the center value of the y-axis is greater than the minimum threshold value. It turns left when the x-axis center value is smaller than the minimum threshold.
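The if-else rules above can be collected into one decision function. The threshold values below are illustrative assumptions that would be tuned to the cropped-frame resolution; they are not values from the paper.

```python
def decide_action(cx, cy, line_found,
                  max_t=260, min_t=120, mid_t=190, y_min=50):
    """Action selection following the flow of Fig. 6. Threshold values
    are illustrative assumptions, tuned per cropped-frame resolution."""
    if line_found:
        if cx > max_t:
            return "turn right"       # line center far right of the frame
        if min_t < cx < max_t:
            return "forward"          # line roughly centered: keep moving
        if cx < mid_t:
            return "turn left"        # line center to the left
    # No single line contour: a dead end leaves the remaining line low in
    # the frame (large CenterY), so turn around; otherwise report no line.
    if cy > y_min:
        return "turn around"
    return "no line detected"
```

In the simulation loop this function would consume the centroid produced by the image processing step each frame and drive the corresponding wheel commands.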
III. RESULT AND DISCUSSION

The study has several scenarios for the evaluation of the proposed method. The evaluation is conducted by testing the proposed method on different types of path lines. The result of the evaluation is shown in Table I. The track lines are the path lines of the virtual environment, which the robot shall pass. Meanwhile, the robot trajectory is the path

of robot movement from one point to another by following a predetermined path. In other words, the trajectory is the result of the robot's actions based on the proposed image processing method applied to the track lines as the robot's environment. In the Webots simulator, only the x-axis and y-axis are used, since there is no change of the robot's position along the Cartesian z-axis.

Based on the figures in Table I, the robot is able to move along the line paths in general. The first track is the curved line. The robot can follow the line, with some arcs differing slightly from the original curved track. The second track evaluates how the robot moves on a path with corners. The result shows that the robot does not move accurately when meeting corners or sharp turns. Meanwhile, the third, fourth, and fifth tracks test the robot on elbow-shaped tracks. The result shows that the movement does not form the elbows well; however, the robot can still follow the path.

Based on the simulation results, the robot can generally follow the various track lines. However, the robot is not able to move along the path when using a cropped-frame resolution higher than 475×725 pixels, as shown in Figure 7. This happens because the camera detects lines in the outside area.

IV. CONCLUSION AND FUTURE WORK

The paper presented a visual navigation system for the KheperaIV robot as a line following robot, simulated in the Webots simulator. The KheperaIV robot used a camera as the primary sensor for visualization. Several image preprocessing steps, such as image cropping, grayscaling, Gaussian blur filtering, binary conversion, erosion, and dilation, are carried out for the path detection process. The proposed method, applied to the KheperaIV robot, makes the robot move along and follow different path lines properly. The types of path lines used for evaluation are zigzag, dotted, and curved lines. The robot is able to perform the appropriate action for each path line based on the camera view. However, path detection can fail at a higher resolution of the cropped image frame. Therefore, the frame resolution of the cropped image is a significant parameter for detecting path lines.

Some future research can be done based on this paper. Here, the robot only used a simple controller, an if-else rule, so the speed of the robot cannot be controlled well. This opens future work on speed control, which can be done by applying a Proportional Integral Derivative (PID) controller. Moreover, a hardware implementation is needed to explore the behavior and characteristics of the proposed method. In a hardware implementation, image noise from the camera becomes another problem in the visualization; it may come from shocks that occur while the robot is moving. A Kalman filter may be added to overcome this issue.

Fig. 7. Failed line detection with a higher-resolution cropped image (475×725): panels (a)–(d)

REFERENCES
[1] W. Rahmaniar and A. Wicaksono, "Design and Implementation of a Mobile Robot for Carbon Monoxide Monitoring," Journal of Robotics and Control (JRC), vol. 2, no. 1, 2020.
[2] L. K. Amifia, M. I. Riansyah, and P. D. Putra, "Design of Logistic Transporter Robot System," Jurnal Ilmiah Teknik Elektro Komputer dan Informatika, vol. 6, no. 1, p. 19, 2020.
[3] R. T. Yunardi, D. Arifianto, F. Bachtiar, J. I. Prananingrum, and U. Airlangga, "Holonomic Implementation of Three Wheels Omnidirectional Mobile Robot using DC Motors," Journal of Robotics and Control (JRC), vol. 2, no. 2, pp. 65–71, 2021.
[4] N. S. Widodo and A. Pamungkas, "Machine Vision-based Obstacle Avoidance for Mobile Robot," Jurnal Ilmiah Teknik Elektro Komputer dan Informatika, vol. 5, no. 2, p. 77, 2020.
[5] N. Rinanto, I. Marzuqi, A. Khumaidi, and S. T. Sarena, "Obstacle Avoidance using Fuzzy Logic Controller on Wheeled Soccer Robot," Jurnal Ilmiah Teknik Elektro Komputer dan Informatika, vol. 5, no. 1, pp. 26–35, 2019.
[6] A. Hassan et al., "A Wirelessly Controlled Robot-based Smart Irrigation System by Exploiting Arduino," Journal of Robotics and Control (JRC), vol. 2, no. 1, pp. 29–34, 2020.
[7] P. V. S. G. De Lima et al., "Improving Early Robotics Education Using a Line-Following Robot Simulator," Proceedings - 15th Latin American Robotics Symposium, 6th Brazilian Robotics Symposium and 9th Workshop on Robotics in Education, pp. 554–561, 2018.
[8] D. Nikolov, G. Zafirov, I. Stefanov, K. Nikov, and S. Stefanova, "Autonomous navigation and speed control for line following robot," 2018 IEEE 27th International Scientific Conference Electronics, ET 2018 - Proceedings, pp. 1–4, 2018.
[9] K. M. Hasan, Abdullah-Al-Nahid, and A. Al Mamun, "Implementation of autonomous line follower robot," in 2012 International Conference on Informatics, Electronics and Vision, ICIEV 2012, 2012, pp. 865–869.
[10] M. Shah, V. Rawal, and J. Dalwadi, "Design Implementation of High-Performance Line Following Robot," International Conference on Transforming Engineering Education, 2018.
[11] G. Sonal, P. Raninga, and H. Patel, "Design and implementation of RGB color line following robot," in Proceedings of the International Conference on Computing Methodologies and Communication, ICCMC 2017, 2018, vol. 2018-January, pp. 442–446.
[12] J. Sarwade, S. Shetty, A. Bhavsar, M. Mergu, and A. Talekar, "Line following robot using image processing," in Proceedings of the 3rd International Conference on Computing Methodologies and Communication, ICCMC 2019, 2019, pp. 1174–1179.
[13] H. Omrane, M. S. Masmoudi, and M. Mesmoudi, "Neural controller of autonomous driving mobile robot by an embedded camera," 2018 4th International Conference on Advanced Technologies for Signal and Image Processing, ATSIP 2018, pp. 1–5, 2018.
[14] O. Aviles, O. G. Rubiano, M. Mauledoux, A. Valencia, and R. Jiménez, "Simulation of a Mobile Manipulator on Webots," Int. J. Online Eng., vol. 14, pp. 90–102, 2018.
[15] A. C. Stan and M. Oprea, "A Case Study of Multi-Robot Systems Coordination using PSO simulated in Webots," Proceedings of the 11th International Conference on Electronics, Computers and Artificial Intelligence, ECAI 2019, 2019.
[16] X. Du, K. K. Tan, and K. K. K. Htet, "Vision-based lane line detection

for autonomous vehicle navigation and guidance," 2015 10th Asian Control Conference: Emerging Control Techniques for a Sustainable World, 2015.
[17] L. H. Juang and J. Sen Zhang, "Robust visual line-following navigation system for humanoid robots," Artificial Intelligence Review, vol. 53, no. 1, pp. 653–670, 2020.
[18] J. He, S. Sun, D. Zhang, G. Wang, and C. Zhang, "Lane detection for track-following based on histogram statistics," 2019 IEEE International Conference on Electron Devices and Solid-State Circuits, pp. 1–2, 2019.
[19] C. F. Hsu, C. T. Su, W. F. Kao, and B. K. Lee, "Vision-Based Line-Following Control of a Two-Wheel Self-Balancing Robot," in Proceedings - International Conference on Machine Learning and Cybernetics, 2018, vol. 1, pp. 319–324.
[20] M. A. Kader, M. Z. Islam, J. Al Rafi, M. R. Islam, and F. S. Hossain, "Line Following Autonomous Office Assistant Robot with PID Algorithm," 2018 International Conference on Innovations in Science, Engineering and Technology, pp. 109–114, 2018.
[21] N. Hasanah, A. H. Alasiry, and B. Sumantri, "Two Wheels Line Following Balancing Robot Control using Fuzzy Logic and PID on Sloping Surface," 2018 International Electronics Symposium on Engineering Technology and Applications, pp. 210–215, 2019.
[22] Z. U. Abideen, M. B. Anwar, and H. Tariq, "Dual purpose cartesian infrared sensor array based PID controlled line follower robot for medical applications," 2018 International Conference on Electrical Engineering, pp. 1–7, 2018.
[23] A. Roy and M. M. Noel, "Design of a high-speed line following robot that smoothly follows tight curves," Computers and Electrical Engineering, vol. 56, pp. 732–747, Nov. 2016.
[24] M. S. Bin Mostafa, A. K. M. Masum, M. S. Uddin, M. K. A. Chy, and S. M. T. Reza, "Amphibious Line following Robot for Product Delivery in Context of Bangladesh," 2nd International Conference on Electrical, Computer and Communication Engineering, pp. 7–9, 2019.
[25] R. I. Putra, S. Sunardi, and R. D. Puriyanto, "Monitoring Tegangan Baterai Lithium Polymer pada Robot Line Follower Secara Nirkabel," Buletin Ilmiah Sarjana Teknik Elektro, vol. 1, no. 2, p. 73, 2019.
[26] A. Maarif, S. Iskandar, and I. Iswanto, "New Design of Line Maze Solving Robot with Speed Controller and Short Path Finder Algorithm," International Review of Automatic Control (IREACO), vol. 12, no. 3, p. 154, 2019.
[27] A. Latif, H. A. Widodo, R. Rahim, and K. Kunal, "Implementation of Line Follower Robot based Microcontroller ATMega32A," Journal of Robotics and Control (JRC), vol. 1, no. 2, pp. 70–74, 2020.
[28] Y. Tian and G. Du, "Infrared Line Following and Ultrasonic Navigating Robot with ATMEGA328 Pro," Proceedings of 2019 IEEE 3rd Advanced Information Management, Communicates, Electronic and Automation Control Conference, pp. 856–860, 2019.
