5. 3D Interaction Techniques
3D manipulation in AR allows users to interact with virtual objects overlaid onto the real world. The goal
is to make this interaction intuitive, efficient, and natural. Below is a comprehensive breakdown of 3D
manipulation tasks and techniques used in AR environments.
3D Manipulation Tasks in AR
These are the fundamental actions users perform to interact with virtual objects:
a) Selection
● Choosing a specific virtual object for manipulation.
● Techniques: Touch-based selection, raycasting, gaze-based selection.
b) Translation
● Moving a selected object to a new position within the scene.
c) Rotation
● Changing an object's orientation about one or more axes.
d) Scaling
● Resizing an object to make it larger or smaller.
e) Placement
● Positioning an object within the AR environment while considering real-world constraints.
● Example: Placing a virtual chair on a floor surface in an AR furniture app.
3D Manipulation Techniques in AR
These techniques define how users interact with virtual objects:
a) Touch-Based Interaction
● Used in handheld AR (smartphones, tablets).
● Challenges: Mapping 2D touch gestures to 3D interactions.
● Examples:
○ Dragging to move objects.
○ Pinching to scale.
○ Rotating fingers to adjust orientation.
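To make the 2D-to-3D mapping challenge concrete, here is a minimal Python sketch: it unprojects a touch point into a world-space ray and intersects it with the floor plane, which is how a drag gesture can move an object across a detected surface. The simplified pinhole camera model and all names are illustrative assumptions, not taken from any specific AR SDK.

```python
import numpy as np

def touch_to_ground_point(touch_px, screen_wh, cam_pos, cam_rot, fov_y=1.0):
    """Unproject a 2D touch (pixels) onto the y = 0 floor plane.

    cam_rot: 3x3 world-from-camera rotation; fov_y: vertical field of
    view in radians. Simplified pinhole camera, illustrative only.
    """
    w, h = screen_wh
    # Touch position in normalized device coordinates, y pointing up
    ndx = 2.0 * touch_px[0] / w - 1.0
    ndy = 1.0 - 2.0 * touch_px[1] / h
    # Build the ray direction in camera space (camera looks down -Z)
    tan_half = np.tan(fov_y / 2.0)
    d = cam_rot @ np.array([ndx * tan_half * w / h, ndy * tan_half, -1.0])
    d /= np.linalg.norm(d)
    if abs(d[1]) < 1e-6:
        return None                     # Ray parallel to the floor
    t = -cam_pos[1] / d[1]              # Solve cam_pos.y + t*d.y = 0
    return cam_pos + t * d if t > 0 else None

# Dragging: every frame, move the object to the point under the finger.
# pos = touch_to_ground_point((540, 960), (1080, 1920), cam_pos, cam_rot)
```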
b) Device-Based Interaction
● Uses the device’s movement and orientation to manipulate objects.
● Examples:
○ Tilting the device to rotate an object.
○ Moving the device forward/backward to scale objects.
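A rough sketch of how those sensor readings can drive manipulation follows; the yaw-rate input, the distance-to-scale mapping, and the clamp range are all illustrative assumptions rather than a fixed convention.

```python
import numpy as np

def update_object(obj, yaw_rate, cam_obj_dist, ref_dist, dt):
    """Map device motion to object manipulation (illustrative mapping).

    yaw_rate: gyroscope yaw rate in rad/s.
    cam_obj_dist / ref_dist: current vs. grab-time camera-to-object
    distance, estimated from the device's tracked pose.
    """
    obj["yaw"] += yaw_rate * dt                        # Tilt -> rotate
    obj["scale"] = float(np.clip(ref_dist / cam_obj_dist, 0.25, 4.0))
    return obj                                         # Closer -> larger

# chair = update_object({"yaw": 0.0, "scale": 1.0}, 0.3, 0.8, 1.2, 1/60)
```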
c) Raycasting
● Projects a virtual ray from the user’s device or hand into the AR environment to select and
manipulate distant objects.
● Common in AR headsets (HoloLens, Magic Leap).
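A minimal sketch of raycasting selection, assuming each object is approximated by a bounding sphere (a common simplification; real engines test against meshes or colliders):

```python
import numpy as np

def pick_object(ray_origin, ray_dir, objects):
    """Return the nearest object hit by the ray.

    objects maps a name to (center, radius); ray_dir must be normalized.
    Standard ray-sphere intersection; all names are illustrative.
    """
    best, best_t = None, float("inf")
    for name, (center, radius) in objects.items():
        oc = ray_origin - center
        b = np.dot(oc, ray_dir)
        c = np.dot(oc, oc) - radius * radius
        disc = b * b - c
        if disc < 0:
            continue                    # Ray misses this sphere
        t = -b - np.sqrt(disc)          # Distance to nearest hit point
        if 0 < t < best_t:
            best, best_t = name, t
    return best
```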
d) Virtual Hand
● Displays a virtual hand within the AR scene to interact with objects naturally.
● Mimics real-world hand movements for an immersive experience.
e) Physics-Based Manipulation
● Objects respond to real-world physics (gravity, collision, friction).
● Example: Dropping a virtual ball that rolls on a table realistically.
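The ball-on-a-table example can be sketched in a few lines of Euler integration; the restitution value is an illustrative choice, and production AR apps would delegate this to a physics engine.

```python
GRAVITY = -9.81      # m/s^2
RESTITUTION = 0.6    # Fraction of speed kept after each bounce

def step_ball(pos_y, vel_y, table_y, dt):
    """Advance the ball one frame with simple Euler integration."""
    vel_y += GRAVITY * dt
    pos_y += vel_y * dt
    if pos_y <= table_y:              # Collision with the table surface
        pos_y = table_y
        vel_y = -vel_y * RESTITUTION  # Bounce, losing some energy
    return pos_y, vel_y
```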
Real-World Applications
● AR Interior Design (IKEA Place, Houzz AR) – Users place and manipulate furniture in their real
space.
● Medical Training (HoloLens Surgery Simulation) – Surgeons practice 3D anatomical
manipulations.
● AR Gaming (Pokémon GO, Snapchat Lenses) – Players interact with virtual objects in the real
world.
● Manufacturing & Prototyping – Engineers manipulate virtual models before production.
A. Touch-Based Interaction
💡 Common in mobile AR applications (smartphones, tablets).
● Users interact with virtual objects using on-screen touch gestures.
● Challenge: Mapping 2D touch inputs to 3D interactions.
● Examples:
○ Dragging → Moving objects across the AR scene.
○ Pinching → Scaling objects up or down.
○ Rotating two fingers → Adjusting orientation.
B. Device-Based Interaction
💡 Uses the movement and orientation of the AR device to manipulate objects.
● Common in handheld AR (mobile AR) and headset-based AR.
● Uses sensors like the accelerometer and gyroscope to detect motion.
● Examples:
○ Tilting Device → Rotating objects.
○ Moving Device Closer/Farther → Scaling objects.
Real-World Applications
● AR Interior Design (IKEA Place, Houzz AR) – Uses touch and mid-air gestures for object
placement. 🏠
● AR Gaming (Pokémon GO, Snapchat Lenses) – Uses touch-based and device-movement interaction. 🎮
● Medical AR Training (HoloLens Surgery Simulations) – Uses virtual hand interaction for
precision tasks. 👨⚕️
● Industrial & Engineering AR (Prototyping, Manufacturing) – Uses raycasting and indirect
touch for precise control. 🏗
The effectiveness of 3D manipulation in AR depends on selecting the right interaction technique based
on hardware capabilities and user needs. From touch-based gestures in mobile AR to virtual hands
and raycasting in AR headsets, these techniques define how users engage with augmented reality
environments.
A. Direct Manipulation
💡 Users interact directly with virtual objects as if they were physically touching them.
● Virtual Hand Interaction → A virtual hand mimics real-world hand movements to grab, move,
and rotate objects.
● Touchscreen Interaction → Users apply multi-touch gestures on mobile devices to manipulate objects.
● Example: Pinching to scale or dragging to move a virtual object in an AR app.
B. Indirect Manipulation
💡 Uses tools or interfaces instead of direct physical interaction.
● Ray Casting ("Laser Pointer" Method) → Projects a ray to select and manipulate distant
objects.
● Widgets & UI Controls → Uses on-screen buttons/sliders to move or rotate objects.
● Teleoperation → Remote manipulation via joysticks or controllers (e.g., robotic arm control in
AR).
● Example: Using an AR headset controller to point at and rotate objects.
C. Gesture-Based Manipulation
💡 Users manipulate objects using hand or body gestures detected by sensors.
● Hand Gestures → Grabbing, swiping, or pinching gestures to move, scale, and rotate objects.
● Body Tracking → Moving the head or body to interact with objects (common in AR gaming).
● Example: Pinching in mid-air to select and move a holographic object in HoloLens AR.
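As an illustration, mid-air pinch detection often reduces to a distance test between tracked fingertips. The threshold value and the idea of following the fingertip midpoint are assumptions for this sketch, not HoloLens SDK calls.

```python
import numpy as np

PINCH_THRESHOLD = 0.02   # Metres between fingertips; illustrative value

def is_pinching(thumb_tip, index_tip):
    """Detect a pinch from tracked 3D fingertip positions (in metres)."""
    gap = np.linalg.norm(np.asarray(thumb_tip) - np.asarray(index_tip))
    return gap < PINCH_THRESHOLD

# While the pinch is held, the grabbed object can follow the midpoint
# of the two fingertips; releasing the pinch drops the object.
```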
D. Device-Based Manipulation
💡 The device's motion and orientation determine object manipulation.
● Tilting the device → Rotates objects in AR.
● Moving the device closer/farther → Scales objects dynamically.
● Example: Using a smartphone's gyroscope and accelerometer to control 3D models in AR
apps.
Real-World Applications
● Gaming (Pokémon GO, AR Filters) – Uses device-based and touch-based manipulation for in-game object control. 🎮
● Industrial AR (Prototyping & Manufacturing) – Uses ray casting and indirect manipulation for design adjustments. 🏗
Manipulation tasks in AR/VR involve selecting, moving, rotating, and scaling virtual objects. These
tasks can be performed through direct, indirect, gesture-based, or device-based interactions,
depending on hardware capabilities and user needs. The goal is to create intuitive, precise, and
efficient interactions for a seamless AR/VR experience.
1. Hand Tracking Devices (e.g., Leap Motion, HoloLens, Meta Quest Hand
Tracking)
💡 Uses cameras and sensors to track hand movements for direct interaction.
🔹 Functionality:
● Uses infrared cameras or depth sensors to capture hand and finger movements.
● Detects position, gestures, and orientation in real time.
● Translates hand actions into 3D interactions without requiring a physical controller.
🔹 Application in 3D Manipulation:
● Natural interaction → Users can grab, move, rotate, and scale virtual objects using real-world
gestures.
● Gesture-based controls → Pinching, swiping, or opening/closing hands can trigger various
commands.
● No need for controllers → Fully hands-free operation enhances immersion.
🔹 Advantages:
● Highly intuitive and immersive interaction.
🔹 Disadvantages:
● Requires high processing power for accurate tracking.
● Lighting conditions and occlusions can impact accuracy.
● Can be less precise than controller-based manipulation.
2. 6DOF (Six Degrees of Freedom) Controllers (e.g., Oculus Touch, HTC Vive,
PlayStation VR2 Controllers)
💡 Provides highly accurate tracking of position and orientation for precise manipulation.
🔹 Functionality:
● Tracks movement in six degrees of freedom (6DOF):
🔸 3 for position (X, Y, Z) → Moving forward/backward, left/right, up/down.
🔸 3 for rotation (pitch, yaw, roll) → Tilting, turning, and rolling.
● Includes buttons, joysticks, and triggers for additional input.
● Uses sensors and tracking cameras to detect real-time motion.
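A sketch of how 6DOF tracking data can drive grabbing: the frame-to-frame change in the controller's pose (translation plus rotation) is applied to the held object. This uses SciPy's Rotation class for the quaternion math; the pose layout is an assumption for illustration.

```python
import numpy as np
from scipy.spatial.transform import Rotation

def follow_controller(obj_pos, obj_rot, prev_pose, curr_pose):
    """Apply a 6DOF controller's frame-to-frame motion to a held object.

    Each pose is a (position 3-vector, scipy Rotation) pair, so both
    translation (X, Y, Z) and rotation (pitch, yaw, roll) carry over.
    """
    delta = curr_pose[1] * prev_pose[1].inv()   # Rotation change
    offset = obj_pos - prev_pose[0]             # Object relative to grip
    new_pos = curr_pose[0] + delta.apply(offset)
    return new_pos, delta * obj_rot
```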
🔹 Application in 3D Manipulation:
● Precise translation & rotation → Objects can be moved, rotated, and placed with great
accuracy.
● Selection and pointing → Controllers allow users to aim and manipulate distant objects.
● Multi-functional interaction → Buttons and triggers enable grabbing, scaling, and tool
switching.
🔹 Advantages:
● High precision and accuracy in 3D manipulation.
● More control options (buttons, triggers, joysticks).
● Reliable tracking in different environments.
🔹 Disadvantages:
● Requires physical controllers, which may limit natural movement.
● Learning curve → Users need to get familiar with the button layout.
● Can feel less immersive than direct hand tracking.
Best Use Case:
● Hand Tracking → Simple, intuitive tasks (e.g., object picking).
● 6DOF Controllers → Complex manipulation (e.g., VR modeling, gaming).
Both hand tracking devices and 6DOF controllers offer powerful interaction methods for 3D
manipulation in AR/VR. Hand tracking provides an intuitive, controller-free experience, while 6DOF
controllers deliver precision and versatility for more complex tasks.
Hybrid Techniques
💡 Combine multiple interaction methods (touch, gestures, device motion, controllers) within a single application.
● Example: The IKEA Place app lets users position furniture in AR using touch gestures.
● Example: HoloLens 2 lets users interact with virtual objects using pinching and grabbing gestures.
● Example: Google ARCore apps let users reposition AR objects by moving the phone.
Key Benefits
● Enhanced Precision & Control: Combining precise controllers with natural hand interactions.
● Contextual Adaptability: Adapts to different tasks by switching between interaction modes.
● Improved User Experience: Supports diverse user preferences and skill levels.
Hybrid techniques in 3D manipulation offer a powerful way to enhance usability, efficiency, and
adaptability in interactive systems. By strategically combining different input methods, users can benefit
from both precision and natural interaction, leading to a more fluid and effective manipulation experience.
PYQ: What is 3D computer graphics & also discuss the rendering process
3D computer graphics involve generating three-dimensional images from digital data, creating virtual scenes that can be displayed on a 2D screen. This technology is widely used across industries such as film, gaming, and product design. The rendering process transforms a 3D scene into a final 2D image through the following stages:
1. Modeling
● The creation of 3D objects using software like Blender, Maya, or 3ds Max.
● Defines object geometry, shape, and size.
● Includes polygonal modeling, NURBS modeling, and sculpting techniques.
2. Texturing & Shading
● Applies surface detail such as colors, textures, and materials to the modeled geometry.
3. Lighting
● Placement of virtual light sources (e.g., point lights, spotlights, directional lights).
● Determines shadows, reflections, and highlights.
● Advanced techniques like global illumination and ambient occlusion improve realism.
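The role of light placement can be seen in the simplest shading model, Lambertian diffuse: brightness depends on the cosine of the angle between the surface normal and the direction to the light. A small sketch with illustrative names, assuming the normal is already unit length:

```python
import numpy as np

def lambert_diffuse(normal, point, light_pos, light_color, albedo):
    """Basic Lambertian diffuse shading for a single point light."""
    to_light = light_pos - point
    to_light /= np.linalg.norm(to_light)            # Direction to light
    n_dot_l = max(np.dot(normal, to_light), 0.0)    # Clamp back-facing to 0
    return albedo * light_color * n_dot_l           # Per-channel color
```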
4. Rendering Techniques
This is the step where the 3D scene is converted into a 2D image. Two main techniques are used:
● Ray Tracing:
○ Simulates the real behavior of light by tracing rays as they interact with surfaces.
○ Produces highly realistic shadows, reflections, and refractions.
○ Used in movies, high-end graphics, and modern real-time rendering (RTX in
gaming).
● Rasterization:
○ Converts 3D objects into pixels using a process called scan conversion.
○ Faster than ray tracing, making it ideal for real-time applications like video games.
○ Used in game engines (Unity, Unreal Engine) and GPU-based rendering (DirectX,
OpenGL).
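To make scan conversion concrete, here is a minimal sketch of triangle rasterization using edge functions. It only fills a binary mask; a real rasterizer also interpolates depth and shading attributes and runs on the GPU.

```python
import numpy as np

def edge(a, b, p):
    """Signed area test: which side of edge a->b is point p on?"""
    return (p[0] - a[0]) * (b[1] - a[1]) - (p[1] - a[1]) * (b[0] - a[0])

def rasterize_triangle(v0, v1, v2, width, height):
    """Fill one 2D triangle into an image (scan conversion).

    Vertices are already projected to pixel coordinates.
    """
    img = np.zeros((height, width), dtype=np.uint8)
    for y in range(height):
        for x in range(width):
            p = (x + 0.5, y + 0.5)      # Sample at the pixel centre
            w0, w1, w2 = edge(v1, v2, p), edge(v2, v0, p), edge(v0, v1, p)
            # Inside if all edge tests agree in sign (either winding)
            if (w0 >= 0 and w1 >= 0 and w2 >= 0) or \
               (w0 <= 0 and w1 <= 0 and w2 <= 0):
                img[y, x] = 255
    return img
```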
5. Post-Processing
● Final touch-ups to enhance image quality.
● Common effects include:
○ Anti-aliasing: Smooths jagged edges.
○ Motion Blur: Adds realism to fast-moving objects.
○ Depth of Field: Simulates camera focus effects.
○ Color Grading & Bloom: Adjusts tone, contrast, and glow effects.
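Anti-aliasing by supersampling is easy to sketch: render at a higher resolution, then average blocks of pixels down to the target size. The render callback and the 4x factor here are illustrative assumptions.

```python
import numpy as np

def supersample_aa(render, width, height, factor=4):
    """Anti-aliasing by supersampling.

    render(w, h) is any function returning an (h, w) image array.
    """
    hi = render(width * factor, height * factor).astype(np.float32)
    # Average each factor x factor block into one output pixel
    return hi.reshape(height, factor, width, factor).mean(axis=(1, 3))
```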
3D computer graphics play a crucial role in various industries, creating lifelike and engaging visuals.
The rendering process is at the heart of 3D graphics, using complex algorithms to transform raw 3D
models into high-quality images. With advancements in AI-driven rendering, real-time ray tracing, and
GPU acceleration, 3D graphics continue to evolve, pushing the boundaries of realism and performance.
Prepared by: Divya Kaurani, Augmented and Virtual Reality