Three-Dimensional Path-Following with Articulated 6DoF Robot and ToF Sensors
Figure 1. Control principles of a typical line-following, two-wheeled robot, showing situations that result in (a) turn-left, (b) move-straight, and (c) turn-right commands.
Figure 2. Robot holding the path detection tool during path-following.
Figure 3. Path detection tool schema showing the placement of the ToF sensors (1–6), the sensor separation a, and the locations of the detection points (purple circles); e.g., point A is the detection point at the intersection of the lines of sight of sensors 2 and 4.
Figure 4. The vector w and its component vectors, and the location of the point Pn (next tool position) in the global coordinate system; ŵ is the unit vector parallel to w. Other symbols: t, tool displacement vector; O, detection tool centre in the current position; O’, detection tool centre in the previous position; S, detected path position.
Figure 5. Three consecutive positions of the detection tool during path-following. Symbols: O, detection tool centre in the current position; O’, detection tool centre in the previous position; S, detected path position for the current detection tool position; Pn, calculated next tool centre position; 1–6, ToF sensors.
Figure 6. The vector wa with its components wa,x, wa,y, wa,z, and the two rotation angles that can be determined on the basis of this vector: βn and γn.
Figure 7. Example of combining the next position and tool rotation; detection tool seen from the top. Symbols: O, detection tool centre in the current position; O’, detection tool centre in the previous position; S, detected path position for the current detection tool position; Pn, calculated next tool centre position; the vectors wa, w, t, n, and m and the constants a and b are described in the text.
Figure 8. Path detection tool used during the experimental verification: (a) view of the real tool in the robot gripper placed over the path to follow; (b) main tool diameters.
Figure 9. General software architecture of the presented system.
Figure 10. Path-following algorithm schema.
Figure 11. Path 1, run 1: path (green line) detected by the detection tool; (a) isometric view, (b) front view (YZ plane), (c) side view (XZ plane), and (d) top view (YX plane).
Figure 12. Path 1, run 1: (a–c) three selected examples of robot positions obtained during path-following; the full video is available as Supplementary Material [S2].
Figure 13. Path 1, run 2: path (green line) detected by the detection tool; (a) isometric view, (b) front view (YZ plane), (c) side view (XZ plane), and (d) top view (YX plane).
Figure 14. Path 1, run 2: (a–c) three selected examples of robot positions obtained during path-following; the full video is available as Supplementary Material [S2].
Figure 15. Path 1, runs 1 and 2 overlaid; the robot body is removed for a clearer view.
Figure 16. Path 2, run 1: path (green line) detected by the detection tool; (a) isometric view, (b) front view (YZ plane), (c) side view (XZ plane), and (d) top view (YX plane).
Figure 17. Path 2, run 1: (a–c) three selected examples of robot positions obtained during path-following; the full video is available as Supplementary Material [S2].
Figure 18. Path 3, run 1: path (green line) detected by the detection tool; (a) isometric view, (b) front view (YZ plane), (c) side view (XZ plane), and (d) top view (YX plane).
Figure 19. Path 3, run 1: (a–c) three selected examples of robot positions obtained during path-following; the full video is available as Supplementary Material [S2].
Figure 20. Path 4, run 1: path (green line) detected by the detection tool; (a) isometric view, (b) front view (YZ plane), (c) side view (XZ plane), and (d) top view (YX plane).
Figure 21. Path 4, run 1: (a–c) three selected examples of robot positions obtained during path-following; the full video is available as Supplementary Material [S2].
Abstract
1. Introduction
2. Materials and Methods
2.1. General Description of the Proposed Solution
2.2. Detected Path Position Calculation
2.3. Calculation of the Next Tool Position
2.4. Calculation of the Tool Rotation
2.5. Determining the Position of the Detection Tool Based on the Position of the Gripper
3. Results
3.1. Experimental Setup
3.2. Algorithm Implementation, Constraints, and Supporting Solutions
- Read sensor data and interpret them: The measurement field of the detection tool is represented as a 3 × 3 array (the sensor array). Path detection is possible at nine points located at the intersections of the sensors’ lines of sight, and each of these points is represented by one element of the sensor array. The array rows correspond to the sensors located in the left arm of the path detection tool (sensors 1–3) and the columns to the sensors in the right arm (sensors 4–6). The path is marked as detected at a given point when its presence is signalled by both sensors whose lines of sight intersect at that point. Additionally, if the path is detected too close to or too far from a sensor, the readout is marked as a detection error to prevent ambiguous data from entering the path-following algorithm. A detection error is not treated as a wrong readout; in such a case, the algorithm continues as if the path had not been detected in this step and calculates the next step based on the last successful readout (the O’ point of the last step in which the path was detected is used). A sketch of this interpretation step is given below the list.
- Read the current position of the gripper from the robot.
- Calculate the effector’s (i.e., the detection tool’s) current position and orientation based on the position of the gripper, using Equations (23)–(25); a simplified sketch of this pose composition is given below the list.
- Perform a path-following algorithm step: Calculate the effector’s new position based on its current and previous positions and the detected path position. First, the detected path position is calculated using Equations (1)–(8). Then, the next tool position and rotation are calculated using Equations (9)–(14) and (15)–(22). Note that, depending on the detection condition, a “long” or a “short” step is performed (according to the value of the variable d in Equations (13) and (14)). A “short” step is performed when the path is detected only at the central point of the detection field or when it is not detected by any sensor; in all other cases, a “long” movement is performed. This prevents an insufficient correction of the detection tool position when the path trajectory bends, as well as losing the path when moving straight ahead. Moreover, in some cases, due to the algorithm’s response to detecting the path at a position other than the centre of the detection field, the calculated tool rotation may have a large angle value (see Section 2.4). This has a negative impact on the smoothness of the motion along the path and may also cause the loss of path detection. Rotation reduction is accomplished by reducing the length of the vector used to determine the rotation: it is multiplied by a constant parameter (named b in Equation (17)), whose value should be greater than 0 and less than 1. This reduction may, however, cause the tool to rotate by a smaller angle than is needed. In such a case, the path-following algorithm recognises in the next iteration that the path is still positioned at the edge of the detection field and that the tool must be rotated again in order to follow the path; the algorithm thus corrects the tool orientation over a few subsequent movement steps. A sketch of the step-length selection and rotation reduction is given below the list.
- Calculate inverse kinematics (IK): The IK function calculates the articulated (joint) variables of the robot from the detection tool position and orientation determined by the path-following algorithm. To calculate the IK solution, the Matlab generalised IK object [40] is used, which internally uses the Broyden–Fletcher–Goldfarb–Shanno (BFGS) [41] gradient projection solver. The previous robot position is used as the solver’s starting point when searching for the solution for the next position. Additional solver constraints are applied that keep the robot body elements inside the safety cell, make the solver prefer a downward tool orientation, and limit the maximum tool rotation around its axes to less than 90° to prevent tool reversal, which would cause a collision with the path. A sketch of this IK step is given below the list.
- Send new joint positions to the robot controller.
- Execute the robot move: Robot moves are executed at an arbitrarily chosen constant speed. The current robot position is read in a loop until the robot reaches the assigned position.
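The sensor-array interpretation from the first step above can be illustrated with the following Matlab sketch. It is a minimal example under assumed conventions: the validity window (nearLimit, farLimit), the NaN convention for "no target seen", and the function name are illustrative and are not taken from the paper's implementation.

```matlab
% buildSensorArray.m - minimal sketch of the sensor-array interpretation step.
% leftRange(1:3)  - distances reported by sensors 1-3 (left arm of the tool)
% rightRange(1:3) - distances reported by sensors 4-6 (right arm of the tool)
% Assumed convention: a sensor that sees no target returns NaN; a finite reading
% outside [nearLimit, farLimit] is treated as a detection error.
function [detected, detectionError] = buildSensorArray(leftRange, rightRange, nearLimit, farLimit)
    leftValid  = leftRange  >= nearLimit & leftRange  <= farLimit;   % usable detections, left arm
    rightValid = rightRange >= nearLimit & rightRange <= farLimit;   % usable detections, right arm

    % Detection at point (i,j): signalled by left-arm sensor i AND right-arm sensor j,
    % whose lines of sight intersect at that point (rows = left arm, columns = right arm).
    detected = false(3, 3);
    for i = 1:3
        for j = 1:3
            detected(i, j) = leftValid(i) && rightValid(j);
        end
    end

    % Readings that are present but too close or too far are flagged as detection
    % errors; the calling algorithm then behaves as if the path was not detected.
    allRanges      = [leftRange(:); rightRange(:)];
    allValid       = [leftValid(:); rightValid(:)];
    detectionError = any(isfinite(allRanges) & ~allValid);
end
```

For example, buildSensorArray([60 NaN NaN], [NaN 62 NaN], 20, 100) marks the path as detected only at the intersection of the lines of sight of sensors 1 and 5.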
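The computation of the detection-tool pose from the gripper pose is governed by Equations (23)–(25), which are not reproduced here. The sketch below only illustrates the general idea of composing the gripper pose with a fixed gripper-to-tool transform; the offset and angle values are placeholders, not the ones used in the experiments.

```matlab
% Sketch only: tool pose obtained by composing the gripper pose with an assumed,
% fixed gripper-to-tool offset (the paper's exact relations are Equations (23)-(25)).
gripperPosition = [0.40 0.10 0.30];                 % example gripper position [m]
gripperEuler    = [0 pi 0];                         % example gripper ZYX Euler angles [rad]
toolOffset      = [0 0 0.12];                       % assumed fixed offset of the tool centre [m]

gripperPose = trvec2tform(gripperPosition) * eul2tform(gripperEuler, 'ZYX');  % 4x4 gripper pose
toolPose    = gripperPose * trvec2tform(toolOffset);                          % 4x4 tool pose

toolPosition = tform2trvec(toolPose);               % detection tool centre O in the global frame
toolEuler    = tform2eul(toolPose, 'ZYX');          % detection tool orientation angles
```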
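The "long"/"short" step selection and the rotation reduction described in the path-following step can be summarised as follows. The numeric values of dLong, dShort, and b are placeholders rather than the constants used in the experiments, and the scaled vector is only a named stand-in for the vector defined by Equation (17).

```matlab
% Sketch of the step-length selection and rotation reduction (placeholder values).
dLong  = 0.010;                                  % assumed "long" step length [m]
dShort = 0.004;                                  % assumed "short" step length [m]
b      = 0.5;                                    % rotation-reduction factor, 0 < b < 1 (assumed)

% 'detected' is the 3x3 logical array produced by the sensor interpretation step;
% element (2,2) is the central point of the detection field.
detected = false(3, 3);  detected(1, 2) = true;  % example: path detected away from the centre

onlyCentre  = detected(2, 2) && nnz(detected) == 1;
noDetection = ~any(detected(:));
if onlyCentre || noDetection
    d = dShort;                                  % "short" step (cf. Equations (13) and (14))
else
    d = dLong;                                   % "long" step, giving a stronger correction
end

% Rotation reduction (cf. Equation (17)): the vector from which the tool rotation
% is computed is shortened by the factor b before the rotation angles are determined.
rotationVector = [0.01 0.02 0.00];               % stand-in for the vector in Equation (17)
rotationVector = b * rotationVector;
```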
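A minimal sketch of the IK step is given below. It uses the Matlab generalizedInverseKinematics object [40], whose default solver is the BFGS gradient projection method [41]. The robot model file, body name, bounds, weights, and target pose are assumptions made for illustration; they do not reproduce the exact constraint set (e.g., the 90° rotation limits) used in the experiments.

```matlab
% Sketch of the IK step with Matlab's generalized IK solver (assumed names and values).
robot = importrobot('manipulator_6dof.urdf');       % assumed URDF of the 6DoF robot
robot.DataFormat = 'row';
qPrev = homeConfiguration(robot);                   % stand-in for the previous configuration (initial guess)

gik = generalizedInverseKinematics( ...
    'RigidBodyTree', robot, ...
    'ConstraintInputs', {'pose', 'cartesian', 'orientation'});
% The default SolverAlgorithm is the BFGS gradient projection method.

% Target pose of the detection tool, as computed by the path-following algorithm.
poseTarget = constraintPoseTarget('tool_link');     % assumed tool body name
poseTarget.TargetTransform = trvec2tform([0.45 0.05 0.20]);

% Keep the tool inside the safety cell (assumed bounds, in metres).
cellBounds = constraintCartesianBounds('tool_link');
cellBounds.Bounds = [0.10 0.80; -0.50 0.50; 0.05 0.90];

% Soft preference for a downward tool orientation.
downward = constraintOrientationTarget('tool_link');
downward.TargetOrientation = eul2quat([0 pi 0], 'ZYX');
downward.Weights = 0.2;

[qNext, solInfo] = gik(qPrev, poseTarget, cellBounds, downward);  % joint values for the next step
```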
3.3. Experimental Verification
3.3.1. Path 1, Run 1
3.3.2. Path 1, Run 2
3.3.3. Path 2, Run 1
3.3.4. Path 3, Run 1
3.3.5. Path 4, Run 1
4. Discussion
5. Conclusions
Supplementary Materials
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- Marashian, A.; Razminia, A. Mobile robot’s path-planning and path-tracking in static and dynamic environments: Dynamic programming approach. Robot. Auton. Syst. 2024, 172, 104592.
- Namgung, H. Local Route Planning for Collision Avoidance of Maritime Autonomous Surface Ships in Compliance with COLREGs Rules. Sustainability 2022, 14, 198.
- Debnath, D.; Vanegas, F.; Sandino, J.; Hawary, A.F.; Gonzalez, F. A Review of UAV Path-Planning Algorithms and Obstacle Avoidance Methods for Remote Sensing Applications. Remote Sens. 2024, 16, 4019.
- Ou, X.; You, Z.; He, X. Local Path Planner for Mobile Robot Considering Future Positions of Obstacles. Processes 2024, 12, 984.
- Liu, L.; Wang, X.; Yang, X.; Liu, H.; Li, J.; Wang, P. Path planning techniques for mobile robots: Review and prospect. Expert Syst. Appl. 2023, 227, 120254.
- Tan, C.S.; Mohd-Mokhtar, R.; Arshad, M.R. A Comprehensive Review of Coverage Path Planning in Robotics Using Classical and Heuristic Algorithms. IEEE Access 2021, 9, 119310–119342.
- Thompson, N.; Greenewald, K.; Lee, K.; Manso, G.F. The Computational Limits of Deep Learning. In Proceedings of the Ninth Computing within Limits 2023, Virtual, 14–15 June 2023.
- Youn, W.; Ko, H.; Choi, H.; Choi, I.; Baek, J.-H.; Myung, H. Collision-free Autonomous Navigation of A Small UAV Using Low-cost Sensors in GPS-denied Environments. Int. J. Control Autom. Syst. 2021, 19, 953–968.
- Mekathlon—International Line Follower Robot Competition. Available online: https://www.mekathlon.com/fastest-line-follower (accessed on 18 December 2024).
- Robotex International. Available online: https://robotex.international/line-following/ (accessed on 18 December 2024).
- Minaya, C.; Rosero, R.; Zambrano, M.; Catota, P. Application of Multilayer Neural Networks for Controlling a Line-Following Robot in Robotic Competitions. J. Autom. Mob. Robot. Intell. Syst. 2024, 18, 35–42.
- Magnum Automation Inc. AGV/AGC Assembly Line. Available online: https://www.magnum-inc.com/systems/agvs-agcs/agc-assembly-line/ (accessed on 18 December 2024).
- Mohammed, M.S.; Abduljabar, A.M.; Faisal, M.M.; Mahmmod, B.M.; Abdulhussain, S.H.; Khan, W.; Liatsis, P.; Hussain, A. Low-cost autonomous car level 2: Design and implementation for conventional vehicles. Results Eng. 2023, 17, 100969.
- Zakaria, N.J.; Shapiai, M.I.; Ghani, R.A.; Yassin, M.N.M.; Ibrahim, M.Z.; Wahid, N. Lane Detection in Autonomous Vehicles: A Systematic Review. IEEE Access 2023, 11, 3729–3765.
- Anand, M.; Kalaisevi, P.; Arun Kumar, S.; Nithyavathy, N. Design and Development of Automated Guided Vehicle with Line Follower Concept using IR. In Proceedings of the 2023 Fifth International Conference on Electrical, Computer and Communication Technologies (ICECCT), Erode, India, 22–24 February 2023; pp. 1–11.
- Jang, J.-Y.; Yoon, S.-J.; Lin, C.-H. Automated Guided Vehicle (AGV) Driving System Using Vision Sensor and Color Code. Electronics 2023, 12, 1415.
- Bach, S.; Yi, S. An Efficient Approach for Line-Following Automated Guided Vehicles Based on Fuzzy Inference Mechanism. J. Robot. Control 2022, 3, 395–401.
- Engin, M.; Engin, D. Path Planning of Line Follower Robot. In Proceedings of the 5th European DSP Education and Research Conference (EDERC), Amsterdam, The Netherlands, 13–14 September 2012; pp. 1–5.
- Mahaleh, M.; Mirroshandel, S. Real-time application of swarm and evolutionary algorithms for line follower automated guided vehicles: A comprehensive study. Evol. Intell. 2022, 15, 119–140.
- Moshayedi, A.J.; Zanjani, S.M.; Xu, D.; Chen, X.; Wang, G.; Yang, S. Fusion based AGV Robot Navigation Solution Comparative Analysis and Vrep Simulation. In Proceedings of the 8th Iranian Conference on Signal Processing and Intelligent Systems (ICSPIS), Behshahr, Iran, 28–29 December 2022; pp. 1–11.
- Liu, G.; Sun, W.; Xie, W.; Xu, Y. Learning visual path–following skills for industrial robot using deep reinforcement learning. Int. J. Adv. Manuf. Technol. 2022, 122, 1099–1111.
- Manorathna, R.P.; Phairatt, P.; Ogun, P.; Widjanarko, T.; Chamberlain, M.; Justham, L.; Marimuthu, S.; Jackson, M.R. Feature extraction and tracking of a weld joint for adaptive robotic welding. In Proceedings of the 13th International Conference on Control Automation Robotics & Vision (ICARCV), Singapore, 10–12 December 2014; pp. 1368–1372.
- MathWorks Minidrone Competition. Available online: https://www.mathworks.com/academia/students/competitions/minidrones.html (accessed on 18 December 2024).
- Basso, M.; Pignaton de Freitas, E. A UAV Guidance System Using Crop Row Detection and Line Follower Algorithms. J. Intell. Robot. Syst. 2020, 97, 605–621.
- da Silva, Y.M.R.; Andrade, F.A.A.; Sousa, L.; de Castro, G.G.R.; Dias, J.T.; Berger, G.; Lima, J.; Pinto, M.F. Computer Vision Based Path Following for Autonomous Unmanned Aerial Systems in Unburied Pipeline Onshore Inspection. Drones 2022, 6, 410.
- Schofield, O.B.; Iversen, N.; Ebeid, E. Autonomous power line detection and tracking system using UAVs. Microprocess. Microsyst. 2022, 94, 104609.
- Pussente, G.A.N.; de Aguiar, E.P.; Marcato, A.L.M.; Pinto, M.F. UAV Power Line Tracking Control Based on a Type-2 Fuzzy-PID Approach. Robotics 2023, 12, 60.
- Xiang, X.; Yu, C.; Niu, Z.; Zhang, Q. Subsea Cable Tracking by Autonomous Underwater Vehicle with Magnetic Sensing Guidance. Sensors 2016, 16, 1335.
- Gerigk, M.K.; Gerigk, M. Application of unmanned USV surface and AUV underwater maritime platforms for the monitoring of offshore structures at sea. Sci. J. Marit. Univ. Szczec. 2023, 76, 89–100.
- Ziebinski, A.; Mrozek, D.; Cupek, R.; Grzechca, D.; Fojcik, M.; Drewniak, M.; Kyrkjebø, E.; Lin, J.C.-W.; Øvsthus, K.; Biernacki, P. Challenges Associated with Sensors and Data Fusion for AGV-Driven Smart Manufacturing. In Computational Science—ICCS 2021; Paszynski, M., Kranzlmüller, D., Krzhizhanovskaya, V.V., Dongarra, J.J., Sloot, P.M., Eds.; Lecture Notes in Computer Science; Springer: Cham, Switzerland, 2021; Volume 12745.
- Ghambari, S.; Golabi, M.; Jourdan, L.; Lepagnot, J.; Idoumghar, L. UAV path planning techniques: A survey. RAIRO-Oper. Res. 2024, 58, 2951–2989.
- ul Husnain, A.; Mokhtar, N.; Mohamed Shah, N.; Dahari, M.; Iwahashi, M. A Systematic Literature Review (SLR) on Autonomous Path Planning of Unmanned Aerial Vehicles. Drones 2023, 7, 118.
- Meyes, R.; Tercan, H.; Roggendorf, S.; Thiele, T.; Büscher, C.; Obdenbusch, M.; Brecher, C.; Jeschke, S.; Meisen, T. Motion Planning for Industrial Robots using Reinforcement Learning. Procedia CIRP 2017, 63, 107–112.
- Guo, Q.; Yang, Z.; Xu, J.; Jiang, Y.; Wang, W.; Liu, Z.; Zhao, W.; Sun, Y. Progress, challenges and trends on vision sensing technologies in automatic/intelligent robotic welding: State-of-the-art review. Robot. Comput.-Integr. Manuf. 2024, 89, 102767.
- Maldonado-Ramirez, A.; Rios-Cabrera, R.; Lopez-Juarez, I. A visual path-following learning approach for industrial robots using DRL. Robot. Comput.-Integr. Manuf. 2021, 71, 102130.
- Slabaugh, G.G. Computing Euler Angles from a Rotation Matrix; Technical Report; University of London: London, UK, 1999; pp. 39–63.
- Nachi Fujikoshi Corp. Manipulator Instruction Manual, MMZEN-288-013; Nachi Fujikoshi Corp.: Toyama City, Japan, 2021.
- STMicroelectronics. VL6180X Proximity and Ambient Light Sensing (ALS) Module Datasheet, 2016. Available online: https://www.st.com/resource/en/datasheet/vl6180x.pdf (accessed on 13 February 2025).
- Nachi Fujikoshi Corp. CFDS Controller Instruction Manual User Task, CFDs-EN-123-001A; Nachi Fujikoshi Corp.: Toyama City, Japan, 2021.
- MathWorks. GeneralizedInverseKinematics Object Documentation. Available online: https://www.mathworks.com/help/robotics/ref/generalizedinversekinematics-system-object.html (accessed on 7 January 2025).
- Press, W.H.; Teukolsky, S.A.; Vetterling, W.T.; Flannery, B.P. Numerical Recipes in C: The Art of Scientific Computing, 2nd ed.; Cambridge University Press: Cambridge, UK, 1992; ISBN 0-521-43108-5.
- Liwiński, K.; Budzisz, D. Testing the Capabilities of the Nachi MZ04 Robot in Terms of Performing Tasks with External Control. B.Sc. (Eng.) Thesis, Gdańsk University of Technology, Gdańsk, Poland, 2024. (In Polish)
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Wawrzyniak, T.F.; Orłowski, I.D.; Galewski, M.A. Three-Dimensional Path-Following with Articulated 6DoF Robot and ToF Sensors. Appl. Sci. 2025, 15, 2917. https://doi.org/10.3390/app15062917