SENSOR AND VISION SYSTEMS IN ROBOTICS
SUBMITTED BY:
SHREYA SANKHWAR
SAGAR SINGH BISHT
SUMEDHA VASHISHTHA
Introduction - Importance of Sensors and Vision in Robotics
• Sensors and vision enable robots to sense, interpret, and interact with their
surroundings.
• These systems are crucial for autonomous robots, self-driving cars, industrial
automation, and AI-driven machines.
• The goal is to mimic human perception so that robots can make
intelligent decisions.
Types of Sensors in Robotics
1. Object Detection Sensors
• Proximity Sensor: Detects objects without physical contact.
• Infrared Transceivers: Emit and receive IR signals to detect obstacles.
• Ultrasonic Sensor: Uses high-frequency sound waves to measure
distance (a minimal sketch follows this list).
• Photoresistor: Changes resistance based on light intensity; used for
ambient light and object detection.
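The ultrasonic principle above can be made concrete with a small Python sketch. It converts a round-trip echo time into a distance using distance = (speed of sound × time) / 2; the echo time below is a hypothetical value rather than a real sensor reading.

SPEED_OF_SOUND = 343.0  # metres per second in air at roughly 20 °C

def echo_time_to_distance(echo_time_s: float) -> float:
    # The pulse travels to the obstacle and back, so halve the round trip.
    return SPEED_OF_SOUND * echo_time_s / 2.0

sample_echo_time = 0.0058  # hypothetical reading: 5.8 ms round trip
print(f"Obstacle at roughly {echo_time_to_distance(sample_echo_time):.2f} m")  # ~0.99 m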
2. Interaction and Distance Sensors
• Touch Sensor: Senses physical contact; helps in stopping or triggering
actions upon touch.
• Range Sensor: Measures the distance between the sensor and an object
using various methods (e.g., sonar or vision).
• Tactile Sensor: Provides detailed information about pressure and
texture during contact.
• Sound Sensor: Converts sound into electrical signals; used for voice
commands and environmental awareness.
3. Environmental and Control Sensors
• Temperature Sensor: Monitors thermal conditions for system safety
and control.
• Contact Sensor: Uses mechanical switches or capacitive methods to
detect physical contact.
• Voltage Sensor: Monitors voltage levels to ensure proper power
delivery and stable system performance.
Vision Systems in Robotics
• Computer Vision Basics – How robots process and interpret images.
• Image Processing Techniques – Filtering, Edge Detection, Object Segmentation (a minimal sketch follows this list).
• Object Detection & Recognition – Using AI models like YOLO and R-CNN.
• Depth Perception & Stereo Vision – Understanding distance using two cameras
(like human eyes).
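As an illustration of the image-processing techniques listed above, here is a minimal sketch that applies Gaussian filtering followed by Canny edge detection. It assumes the opencv-python package is installed; robot_camera.png is a placeholder for a frame captured by the robot's camera.

import cv2

frame = cv2.imread("robot_camera.png", cv2.IMREAD_GRAYSCALE)
if frame is None:
    raise FileNotFoundError("robot_camera.png not found")

# Filtering: Gaussian blur suppresses camera noise before edge detection.
blurred = cv2.GaussianBlur(frame, (5, 5), 1.5)

# Edge detection: Canny keeps pixels whose intensity gradient falls
# between the two thresholds, producing a binary edge map.
edges = cv2.Canny(blurred, 50, 150)

cv2.imwrite("edges.png", edges)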
Sensor Fusion
• Combining multiple sensor data for better accuracy.
• Kalman Filter & Particle Filter – Techniques to merge sensor inputs (a simple Kalman-filter sketch follows this list).
• Example: Self-driving cars use LIDAR, GPS, and cameras together
for navigation.
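A simple way to see how a Kalman filter merges noisy inputs is the one-dimensional sketch below. The measurement list and noise values are invented for illustration; a real robot would fuse readings from several sensors, such as LIDAR and GPS, in the same predict-update loop.

def kalman_1d(measurements, process_var=1e-3, meas_var=0.04):
    # Returns smoothed estimates for a sequence of noisy 1-D readings.
    estimate, error = measurements[0], 1.0   # initial state and uncertainty
    history = []
    for z in measurements:
        error += process_var                 # predict: uncertainty grows
        gain = error / (error + meas_var)    # update: compute Kalman gain
        estimate += gain * (z - estimate)    # blend prediction and measurement
        error *= (1.0 - gain)
        history.append(estimate)
    return history

noisy_range = [1.02, 0.97, 1.05, 0.99, 1.01, 0.96]  # hypothetical readings (m)
print(kalman_1d(noisy_range))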
AI in Robotics Vision
• Deep Learning for Vision – AI models like CNNs, YOLO, and Faster R-CNN
for object recognition (a hedged detection sketch follows this list).
• Motion Tracking & Gesture Recognition – How robots detect and follow
movement.
• SLAM (Simultaneous Localization and Mapping) – Lets robots build a map of
an unknown environment while tracking their own position within it.
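Object recognition with a pretrained CNN detector can be sketched in a few lines. The example below assumes the ultralytics package and its pretrained YOLO weights are available; scene.jpg is a placeholder image, and exact method names may vary between library versions.

from ultralytics import YOLO

model = YOLO("yolov8n.pt")     # small pretrained detection model
results = model("scene.jpg")   # run inference on one image

for result in results:
    for box in result.boxes:
        label = model.names[int(box.cls)]   # class index -> class name
        confidence = float(box.conf)
        print(f"Detected {label} with confidence {confidence:.2f}")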
Applications of Sensor & Vision in Robotics
• Industrial Automation – Robots for manufacturing & quality inspection.
• Autonomous Vehicles – Self-driving cars & delivery drones.
• Healthcare Robotics – Surgical robots & assistive devices for people
with disabilities.
Challenges & Future Scope
Challenges
• High data processing requirements – Vision systems generate
huge volumes of data.
• Real-time decision-making – Robots must respond instantly.
• AI model limitations – Can struggle in unpredictable environments.
Future Scope
• Advancements in AI-driven robotic perception.
• More accurate and reliable sensors.
• More intelligent and human-like robots.
Conclusion
• Sensors & vision systems help robots perceive and interact
with the world.
• AI-powered vision enables robots to make better decisions.
• Future advancements will make robots even smarter and
more autonomous.