- Overview
- Features
- Technology Stack
- Installation
- Usage
- Development Process
- Challenges and Solutions
- Future Work
- Team
- Demo and Pitch
## Features

- Environmental mapping and 3D modeling of surroundings
- AI-powered object detection for common items (e.g., chairs, tables)
- Distance approximation using ray casting
- Haptic feedback with variable frequency (15-30 Hz) based on object proximity (see the sketch after this list)
- Complementary audio cues for enhanced awareness
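The ray casting and the 15-30 Hz band combine into the core interaction loop. Below is a minimal Unity C# sketch of that mapping; the `IHapticGlove` interface and the `maxRange` cutoff are assumptions for illustration, since the actual HaptX SDK calls are not documented in this README.

```csharp
using UnityEngine;

// Hypothetical glove abstraction; the real HaptX SDK API is not shown here.
public interface IHapticGlove
{
    void SetVibrationFrequency(float hz); // drive actuators at a given frequency
    void StopVibration();
}

public class ProximityHaptics : MonoBehaviour
{
    public Transform hand;        // tracked hand transform
    public float maxRange = 3f;   // assumed cutoff distance, metres

    const float MinHz = 15f;
    const float MaxHz = 30f;

    IHapticGlove glove;           // provided elsewhere, e.g. a wrapper around the HaptX SDK

    void Update()
    {
        // Cast a ray forward from the hand to approximate the distance
        // to the nearest surface of the reconstructed environment mesh.
        if (Physics.Raycast(hand.position, hand.forward, out RaycastHit hit, maxRange))
        {
            // Closer objects map to higher frequencies:
            // 15 Hz at maxRange, rising to 30 Hz at contact.
            float closeness = 1f - Mathf.Clamp01(hit.distance / maxRange);
            glove?.SetVibrationFrequency(Mathf.Lerp(MinHz, MaxHz, closeness));
        }
        else
        {
            glove?.StopVibration(); // nothing in range
        }
    }
}
```

Varying frequency at constant amplitude matches the approach noted under Challenges and Solutions below.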
## Technology Stack

Hardware:
- VR Headset: HTC Vive Pro Eye
- Haptic Feedback: HaptX gloves
- Spatial Tracking: Lighthouse system

Software:
- Development Platform: Unity 2019.4.31f1
- SDKs:
  - HaptX 2.0.0 beta 8
  - SRWorks
- Operating System: Windows 11
## Installation

- Ensure you have Unity 2019.4.31f1 installed on your Windows 11 system.
- Clone this repository to your local machine.
- Open the project in Unity.
- Install the required SDKs:
  - HaptX 2.0.0 beta 8
  - SRWorks
- Connect the HTC Vive Pro Eye headset and HaptX gloves to your system.
- Set up the Lighthouse tracking system according to the manufacturer's instructions.
- Build the project in Unity for your VR platform.
## Usage

- Put on the HTC Vive Pro Eye headset and HaptX gloves.
- Launch the Haptic Vision application.
- The system will automatically start mapping your environment.
- Move your hands to detect objects in your surroundings:
  - As your hands approach objects, you'll feel vibrations through the HaptX gloves.
  - The intensity of the vibrations increases as you get closer to objects.
- Audio cues will provide additional information about detected objects (see the sketch after this list).
- Use the haptic and audio feedback to navigate your environment safely.
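As an illustration of one way the audio cues could be layered on top of the haptics, the sketch below repeats a short sound faster as a detected object gets closer. The `beep` clip, the interval range, and the `lastHitDistance` field (which a raycasting component would update each frame) are assumed placeholders, not details taken from the project.

```csharp
using System.Collections;
using UnityEngine;

[RequireComponent(typeof(AudioSource))]
public class ProximityAudioCue : MonoBehaviour
{
    public AudioClip beep;            // placeholder cue sound
    public float maxRange = 3f;       // assumed detection cutoff, metres
    public float minInterval = 0.1f;  // gap between beeps at contact
    public float maxInterval = 1.0f;  // gap between beeps at maxRange

    // A raycasting component (e.g. ProximityHaptics above) would update this.
    [HideInInspector] public float lastHitDistance = float.PositiveInfinity;

    AudioSource source;

    void Start()
    {
        source = GetComponent<AudioSource>();
        StartCoroutine(BeepLoop());
    }

    IEnumerator BeepLoop()
    {
        while (true)
        {
            if (lastHitDistance <= maxRange)
            {
                source.PlayOneShot(beep);
                // Shorter gaps between beeps as the object gets closer.
                float t = Mathf.Clamp01(lastHitDistance / maxRange);
                yield return new WaitForSeconds(Mathf.Lerp(minInterval, maxInterval, t));
            }
            else
            {
                yield return null; // nothing in range; check again next frame
            }
        }
    }
}
```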
## Development Process

- Problem Definition: Identified specific challenges faced by visually impaired individuals in navigation.
- Solution Ideation: Collaborated on potential approaches, focusing on haptic feedback as the primary modality.
- Prototype Development: Integrated VR and haptic technologies using Unity, implementing object detection and distance approximation algorithms.
- Testing and Refinement: Conducted user testing with visually impaired individuals to gather feedback and improve the system.
## Challenges and Solutions

| Challenge | Solution |
| --- | --- |
| SRWorks compatibility | Used Unity 2019.1 as recommended by VIVE mentors |
| Hand tracking with SRWorks | Opted for a fixed orientation and adjusted hand position for mesh interaction |
| Distance approximation accuracy | Cast hands a few units in front of their actual position for smoother interaction (sketched below) |
| Limited testing devices | Focused on frequency adjustment with constant amplitude for haptic feedback |
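The distance-approximation row above notes that rays are cast from a point slightly in front of the hands rather than from the hands themselves. Here is a sketch of that offset; the 0.5 m `forwardOffset` is illustrative, since the text only says "a few units".

```csharp
using UnityEngine;

public class OffsetHandRay : MonoBehaviour
{
    public Transform hand;
    public float forwardOffset = 0.5f; // illustrative value; the README says "a few units"
    public float maxRange = 3f;

    public bool TryGetHit(out RaycastHit hit)
    {
        // Start the cast ahead of the physical hand position so mesh
        // interaction feels smoother near surfaces.
        Vector3 origin = hand.position + hand.forward * forwardOffset;
        return Physics.Raycast(origin, hand.forward, out hit, maxRange);
    }
}
```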
## Future Work

- Enabling audio feedback
- Implementing active scanning with an on/off feature
- Developing a smaller, more discreet headset
- Improving vision through XR glasses recalibration
- Exploring waypoint guidance functionality
## Team

This project was made possible by the dedicated efforts of the Haptic Vision team: Christine, Leon, Winny, Malcolm, and Kyle.
## License

This project is licensed under the MIT License. See the LICENSE file for details.
## Acknowledgments

We would like to thank:
- The VIVE team for their support and guidance
- Chris MacNally, our visually impaired tester, for their valuable feedback
- The Unity and HaptX communities for their resources and support