
Learning Outcomes

  • Explain and apply multimodal feedback principles in VR. Before class, review how coordinated audio–haptic cues (consistency, proportional intensity, tight synchronization, localization, and comfort controls) improve realism, situational awareness, and safety in engineering scenarios like assembly, teleoperation, and diagnostics.
  • Implement controller and glove haptics with Unity XRI. In preparation, configure HapticImpulsePlayer and SimpleHapticFeedback for hover/select/attach events, and verify your XR Rig interactors. Skim the per-object patterns (CustomHoverHaptics, CustomSelectHaptics) and, if available, note the setup for hand tracking and bHaptics gloves (colliders + SimpleFingerHaptics) to deliver object-specific fingertip feedback.
  • Configure spatial, event-driven, and proximity-triggered audio. Ahead of the session, attach and tune an AudioSource (set Spatial Blend, Min/Max Distance, loop/3D), prepare clips for event-based playback (e.g., surface-tag switcher), and review optional Audio Reverb Zone and proximity scripts so sounds activate and attenuate naturally as users move through the environment.

Multimodal VR Experiences

In VR, creating a truly immersive experience goes beyond realistic visuals. The senses of hearing and touch play pivotal roles in bridging the gap between the virtual and the physical world. In engineering applications—such as industrial simulations, training environments, and control systems—accurate auditory cues and haptic feedback are critical for:

  • Enhanced Situational Awareness: Operators in control rooms or simulation environments can quickly gauge system status and spatial relationships without relying solely on visual displays.
  • Improved Safety and Efficiency: Timely and clear sensory feedback helps users confirm actions (such as engaging emergency controls) and reduces the risk of errors.
  • Realistic Simulation of Real-World Systems: Sound cues and tactile sensations, such as the click of a button or the vibration of a control panel, closely mimic physical interactions, making simulations more believable.
  • Increased User Engagement: A rich, multimodal experience keeps users immersed in the virtual environment, essential for effective training and remote operations.


Engineering Use Cases

  • Industrial Training Simulators: Teach operators to handle machinery, control panels, or robotic arms by providing immediate sensory feedback. In the assembly station of XFactory, users can be trained to work alongside a robotic arm to assemble a complex part like the V8 engine. As they correctly (or incorrectly) assemble each component, the system gives a confirmation beep and a short vibration pulse to the controller.

  • Remote Monitoring and Teleoperation: Allow operators to “feel” and “hear” system statuses and anomalies when controlling equipment remotely. In the logistics station of XFactory, the drone is operated remotely. If the path becomes cluttered or a system error is detected (e.g., battery low or sensor failure), a warning tone is played, and a longer vibration indicates urgency.

  • Design and Prototyping: Enable engineers to evaluate new control interfaces or equipment designs by simulating the sensory experience of interacting with physical components. In the exhibit station of XFactory, users can import and evaluate a prototype, touch-sensitive control panel. Each button provides a different audio-haptic profile to simulate physical differences (e.g., soft rubbery button vs. hard metallic switch).

  • Maintenance and Troubleshooting: Help diagnose issues by associating specific haptic and audio cues with malfunctioning systems, such as abnormal vibrations or unexpected sounds. In the welding station of XFactory, a robot’s welding head alignment is faulty. When diagnostics are run, the VR system plays a distorted buzzing sound and sends a repetitive low-pulse vibration to indicate a hardware malfunction.


Design Principles

Effective haptic design enhances realism, reinforces user actions, and maintains comfort across devices. The following principles ensure feedback feels intentional, synchronized, and adaptable in any VR experience.

  • Consistency and Predictability: Each user action should always produce the same, recognizable response. A light tap might yield a quick, soft pulse and click sound, while a firm press delivers a stronger, deeper vibration. Predictable feedback builds user trust and supports muscle memory.

  • Proportional and Context-Aware Feedback: Scale haptic strength and duration to match the significance of an interaction. Routine actions (like opening drawers or pressing buttons) should feel subtle and brief, while critical events (such as warnings or heavy impacts) should deliver stronger, layered feedback patterns that capture attention.

  • Multimodal Synchronization: Align haptic, audio, and visual feedback to occur simultaneously—ideally within a few milliseconds. When a button is pressed, the user should feel a pulse, hear a click, and see a corresponding animation at the same moment. Tight timing reinforces realism and immersion.

  • Frequency and Texture Pairing: Use different vibration frequencies to convey material qualities or intensity. High-frequency vibrations can simulate texture or fine detail, while low-frequency rumbles suggest weight or impact. Pair these with complementary audio tones so that both sensations enhance, rather than compete with, each other.

  • Localization and Device Awareness: Deliver haptic impulses only to the controller or device involved in the interaction. Pressing a control lever with the left hand should cause only the left controller to vibrate. Because vibration strength and frequency vary between devices, calibrate haptics per hardware profile and degrade gracefully when certain effects aren’t supported.

  • Comfort and Adaptability: Avoid excessive or continuous vibrations that cause fatigue. Provide user settings to adjust intensity or switch between “audio-only,” “haptic-only,” or combined feedback modes. Flexible control ensures accessibility and comfort during extended sessions.
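
To make the comfort principle concrete, the sketch below shows one way to centralize these user preferences. It is an illustrative pattern only (FeedbackSettings, FeedbackMode, and the field names are assumptions, not XR Interaction Toolkit APIs): every haptic or audio request is routed through a single settings object that scales amplitude and can mute either channel.

     using UnityEngine;

     // Illustrative pattern: route all feedback requests through user-adjustable settings.
     public enum FeedbackMode { Combined, AudioOnly, HapticOnly }

     public class FeedbackSettings : MonoBehaviour
     {
         [Range(0f, 1f)]
         public float hapticIntensity = 1f;                // global user-set multiplier
         public FeedbackMode mode = FeedbackMode.Combined;

         // Returns the amplitude to actually send, or 0 if haptics are muted.
         public float ScaleAmplitude(float requestedAmplitude)
         {
             if (mode == FeedbackMode.AudioOnly) return 0f;
             return Mathf.Clamp01(requestedAmplitude * hapticIntensity);
         }

         // Whether an audio cue should accompany the current event.
         public bool AudioEnabled => mode != FeedbackMode.HapticOnly;
     }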


Controller Haptics

In VR, haptics refers to the use of technology that simulates the sense of touch, allowing users to feel physical sensations in virtual environments. This is achieved through devices like gloves, vests, or controllers that provide feedback such as vibrations, pressure, or force, enhancing immersion and realism in virtual experiences. In XFactory, haptics bring machinery and tools to life. From operating a robot in the assembly station to pushing a button on a forklift dashboard in the logistics station, carefully designed haptic responses reinforce the user’s sense of presence and interaction. Note that good haptic design should balance consistency, context, and comfort, synchronizing multiple sensory cues while adapting gracefully to different users and hardware.


Haptic Impulse Player

The HapticImpulsePlayer component attached to each controller is the core output driver in Unity’s XR Interaction Toolkit haptic system. It is responsible for sending vibration impulses to the controller’s haptic motor. Rather than deciding when vibrations should play, the HapticImpulsePlayer defines how they are delivered once triggered. It provides a unified interface for sending impulses from other components, such as SimpleHapticFeedback, directly to a specific XR controller. Typically, you attach one HapticImpulsePlayer to each controller in your XR Rig (for example, the left and right hands). Other components then reference it to trigger haptics during interactions. Key properties include:

  • Haptic Output: Specifies the controller or haptic control that impulses are sent to.
  • Amplitude Multiplier: A global scaling factor that amplifies or dampens all outgoing haptic impulses from this player.

In essence, the HapticImpulsePlayer functions like an amplifier—it doesn’t initiate haptics on its own but ensures that all incoming impulses are properly routed and adjusted before reaching the hardware device.
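
As a minimal sketch, the script below sends an impulse through a HapticImpulsePlayer referenced in the Inspector (the WarningPulse class name, the 0.5 amplitude, and the 0.1 s duration are placeholder assumptions); the player's Amplitude Multiplier is applied before the impulse reaches the device.

     using UnityEngine;
     using UnityEngine.XR.Interaction.Toolkit.Inputs.Haptics;

     // Minimal sketch: trigger a vibration through a HapticImpulsePlayer on a controller.
     public class WarningPulse : MonoBehaviour
     {
         [Tooltip("HapticImpulsePlayer on the controller that should vibrate.")]
         public HapticImpulsePlayer hapticPlayer;

         // Call this from your own logic (e.g., a drone low-battery warning).
         public void Pulse()
         {
             if (hapticPlayer != null)
             {
                 // Amplitude 0.5 for 0.1 s; the player's Amplitude Multiplier scales this.
                 hapticPlayer.SendHapticImpulse(0.5f, 0.1f);
             }
         }
     }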


Simple Haptic Feedback

The SimpleHapticFeedback component serves as the event-driven controller for haptics, determining when and what haptic sensations to play for each interaction. It listens to interaction events from an interactor (e.g., hovering over or selecting an interactable) and uses a linked HapticImpulsePlayer to send the corresponding vibration to the hardware. Attach SimpleHapticFeedback to interaction components like Near-Far Interactor, Poke Interactor, or Teleport Interactor. This setup allows the player’s controller to automatically vibrate when specific events occur, such as when an object is grabbed, released, or approached. Key properties include:

  • Interactor Source: Defines which interactor this component listens to for hover and select events (e.g., Near-Far Interactor).
  • Haptic Impulse Player: References the HapticImpulsePlayer that actually sends the vibration signal. If left empty, Unity will automatically assign a suitable HapticImpulsePlayer from the controller’s components at runtime.
  • Toggleable event options such as Play Select Entered, Play Hover Entered, or Play Select Exited determine which interactions trigger haptics.
  • Each event can define its own parameters, customizing the feel of the feedback. For instance, a short, sharp vibration with high amplitude might simulate a click or button press, while a longer, softer pulse could represent a gentle touch or hover cue:
    • Amplitude: How strong the vibration is (range 0.0 – 1.0).
    • Duration: How long the vibration lasts (in seconds).
    • Frequency: The pitch or “buzziness” of the vibration (in Hz, optional and device-dependent).
  • Allow Hover Haptics While Selecting: Enables hover-related vibrations even while an object is being held or interacted with.

Developers can also call controller-level methods like SendHapticImpulse(float amplitude, float duration) to manually trigger vibrations, but using SimpleHapticFeedback provides a more structured, event-based workflow that aligns with the XR Interaction Toolkit’s interaction system.


Configure Hover Feedback

The Simple Haptic Feedback component in the XR Interaction Toolkit provides a flexible way to configure vibration feedback during common interaction events like hovering, selecting, or canceling. Let’s use it to create a light pulse when the user’s controller hovers near the tire in the assembly station.

  1. Add the Simple Haptic Feedback Component:
    • In the Hierarchy, select the Left Controller > Near-Far Interactor (repeat for the Right Controller if needed).
    • In the Inspector, click Add Component.
    • Search for and add Simple Haptic Feedback.
  2. Enable Hover Enter Feedback:
    • In the Simple Haptic Feedback component, scroll to the Hover section.
    • Check the box next to Play Hover Entered.
  3. Set Hover Feedback Parameters:
    • Set Amplitude to 0.25 for light vibration.
    • Set Duration to 0.1 to create a short pulse.
    • Set Frequency to 0.0 (use default frequency or leave at 0 if unsure).
    • These values create a gentle, noticeable cue as users move close to the tire, ideal for subtle prompts.

  4. Test Hover Feedback in Scene:
    • Press Play (if using the Meta Link app) or deploy the app to your headset.
    • Move your controller near the tire positioned on the table.
    • You should feel a soft pulse as the ray interactor enters hover range. This hover interaction feels intuitive and non-intrusive, perfect for signaling interactivity across the factory — like near tools, robot controls, or digital screens.
  5. Add Object-Specific Hover Haptics:
    • Optionally, write a script to enable more granular control (e.g., different objects trigger different vibrations) and define feedback per object, rather than globally on the controller.
     using UnityEngine;
     using UnityEngine.XR.Interaction.Toolkit;
     using UnityEngine.XR.Interaction.Toolkit.Interactables;
     using UnityEngine.XR.Interaction.Toolkit.Interactors;
    
     // Ensures the GameObject has an XRBaseInteractable component
     [RequireComponent(typeof(XRBaseInteractable))]
     public class CustomHoverHaptics : MonoBehaviour
     {
         [Tooltip("Strength of the haptic impulse (0.0 – 1.0).")]
         public float amplitude = 0.2f; // vibration intensity
    
         [Tooltip("Duration of the haptic impulse in seconds.")]
         public float duration = 0.1f;  // how long it vibrates
    
         private void OnEnable()
         {
             // Get the interactable and subscribe to hoverEntered event
             var interactable = GetComponent<XRBaseInteractable>();
             interactable.hoverEntered.AddListener(OnHoverEntered);
         }
    
         private void OnDisable()
         {
             // Unsubscribe when disabled to avoid memory leaks
             var interactable = GetComponent<XRBaseInteractable>();
             interactable.hoverEntered.RemoveListener(OnHoverEntered);
         }
    
         private void OnHoverEntered(HoverEnterEventArgs args)
         {
             // If the hovering interactor is controller-based, send a haptic pulse
             if (args.interactorObject is XRBaseInputInteractor controllerInteractor)
             {
                 controllerInteractor.SendHapticImpulse(amplitude, duration);
             }
         }
     }
    

    XRBaseInteractable is the base class for all interactable objects in Unity’s XR Interaction Toolkit, providing the core events and logic for interactions such as hovering, selecting, and activating. Classes like XRGrabInteractable and XRSimpleInteractable inherit from it to implement specific interaction behaviors.

  6. Configure the Script:
    • Select the Tire GameObject in the Hierarchy.
    • Click Add Component, then add CustomHoverHaptics.cs.
    • Tweak Amplitude and Duration in the Inspector for a sensation unique to this object.

Use the Simple Haptic Feedback component for controller-based interaction logic. Use the custom script when you want different objects (e.g., torque wrench vs. tire) to feel distinct when hovered.


Configure Select Feedback

Now we will configure stronger haptic feedback to reinforce intentional actions, like grabbing a tire, pressing a button, or toggling a switch. In this case, the pulse confirms the user has successfully selected (grabbed) the tire in the assembly station.

  1. Add the Simple Haptic Feedback Component (if not already added):
    • In the Hierarchy, select the Left Controller > Near-Far Interactor (repeat for the Right Controller, if needed).
    • In the Inspector, click Add Component.
    • Search for and add Simple Haptic Feedback.
  2. Enable Select Enter Feedback:
    • Locate the Simple Haptic Feedback component.
    • Scroll to the Select section.
    • Check the box for Play Select Entered.
  3. Set Select Feedback Parameters:
    • Set Amplitude to 0.5 to make it more pronounced than hover.
    • Set Duration to 0.25 to vibrate longer to signify a “confirmed” action.
    • Set Frequency to 0.0 unless using a custom frequency pattern.
    • This pulse acts like a tactile confirmation — perfect for grabbing the tire, flipping a safety switch, or initiating a machine cycle.

  4. Test Select Feedback in Scene:
    • Press Play (if using the Meta Link app) or deploy the app to your headset.
    • Aim at the tire, and grab it using the interactor (direct or ray).
    • You should feel a firm pulse when the Play Select Entered event fires, confirming successful selection.
  5. Add Custom Select Feedback Per Object (Optional):
    • Similar to the custom hover haptics example, you can configure different objects to give unique haptic responses on selection. Here is a custom script you can attach directly to the object:
     using UnityEngine;
     using UnityEngine.XR.Interaction.Toolkit.Interactables;
     using UnityEngine.XR.Interaction.Toolkit.Interactors;
     using UnityEngine.XR.Interaction.Toolkit;
    
     [RequireComponent(typeof(XRBaseInteractable))]
     public class CustomSelectHaptics : MonoBehaviour
     {
         [Tooltip("Strength of the haptic impulse (0.0 – 1.0).")]
         public float amplitude = 0.5f;
    
         [Tooltip("Duration of the haptic impulse in seconds.")]
         public float duration = 0.25f;
    
         private void OnEnable()
         {
             // Subscribe to the select entered event when the object becomes active
             var interactable = GetComponent<XRBaseInteractable>();
             interactable.selectEntered.AddListener(OnSelectEntered);
         }
    
         private void OnDisable()
         {
             // Unsubscribe from the event when the object is disabled
             var interactable = GetComponent<XRBaseInteractable>();
             interactable.selectEntered.RemoveListener(OnSelectEntered);
         }
    
         private void OnSelectEntered(SelectEnterEventArgs args)
         {
             // Check if the interactor is a controller-based interactor (e.g., hand controller)
             if (args.interactorObject is XRBaseInputInteractor controllerInteractor)
             {
                 // Trigger haptic feedback on the controller using amplitude and duration
                 controllerInteractor.SendHapticImpulse(amplitude, duration);
             }
         }
     }
    
  6. Configure the Script:
    • Select the Tire GameObject in the Hierarchy.
    • Click Add Component, and choose CustomSelectHaptics.cs.
    • Modify the Amplitude and Duration values in the Inspector for the desired feedback.


Configure Attach Feedback

When the user successfully attaches the tire onto its platform (i.e., socket) at the tire stand in the assembly station, we want to simulate a clear tactile “click” — like snapping something firmly into place. When the socket detects the tire has been correctly placed, the controller delivers a firm, long vibration. This feedback adds a strong sense of completion to the interaction. Here is how to configure this behavior:

  1. Add the Simple Haptic Feedback Component (if not already added):
    • In the Hierarchy, select the Left Controller > Near-Far Interactor (and the Right Controller, if applicable).
    • In the Inspector, click Add Component.
    • Add Simple Haptic Feedback.
  2. Set Attach Feedback Parameters:
    • Scroll to the Select section.
    • Check the box labeled Play Select Exited.
    • Set Amplitude to 0.7 – strong and satisfying.
    • Set Duration to 0.35 – longer to simulate the weight and resistance of snapping the tire in.
    • Set Frequency to 0.0 – leave at 0 unless using a specific pulse frequency.

    Why this event? In socket-based interactions, the grabbable is often released (select exited) when it snaps into the socket. This makes Select Exited a good trigger for “attachment complete” feedback. This feedback makes it feel like the tire is firmly and correctly mounted on the stand — just like snapping a car wheel into place.

  3. Test Attach Feedback in Scene:
    • Enter Play mode (with Link app) or deploy the app.
    • Grab the tire and place it into the socket on the tire stand.
    • Once inserted and released, the Play Select Exited event fires — triggering a deep, confident pulse on the controller.
    • This sensation pairs well with visual alignment cues and a click sound for multisensory feedback.
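
As a hedged sketch of that click sound, the script below can sit on the tire stand’s socket alongside its XR Socket Interactor and an Audio Source (the SocketClickSound name and the clip field are assumptions about your scene); it plays a one-shot clip whenever the socket captures an object.

     using UnityEngine;
     using UnityEngine.XR.Interaction.Toolkit;
     using UnityEngine.XR.Interaction.Toolkit.Interactors;

     // Sketch: play a "click" when the tire snaps into the socket, pairing an audio cue
     // with the Play Select Exited haptic pulse configured above.
     [RequireComponent(typeof(XRSocketInteractor))]
     [RequireComponent(typeof(AudioSource))]
     public class SocketClickSound : MonoBehaviour
     {
         [Tooltip("Short click/clunk clip played when an object is socketed.")]
         public AudioClip clickClip;

         private XRSocketInteractor socket;
         private AudioSource audioSource;

         private void OnEnable()
         {
             socket = GetComponent<XRSocketInteractor>();
             audioSource = GetComponent<AudioSource>();
             socket.selectEntered.AddListener(OnSnapped);
         }

         private void OnDisable()
         {
             socket.selectEntered.RemoveListener(OnSnapped);
         }

         private void OnSnapped(SelectEnterEventArgs args)
         {
             if (clickClip != null)
             {
                 audioSource.PlayOneShot(clickClip);
             }
         }
     }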

Hand Tracking & Haptic Gloves

Hand tracking and haptic gloves in VR bring natural, intuitive interaction to virtual environments by letting users see and feel their own hands. Instead of relying solely on controllers, hand tracking captures finger and wrist motion directly, enabling gestures like grabbing, pressing, and pointing. When paired with haptic gloves such as the bHaptics TactGlove DK2, these interactions gain a tactile layer — vibrations in the fingertips and wrists simulate the sensation of touching, pressing, or holding virtual objects. Together, they transform simple gestures into convincing physical experiences, enhancing realism and user immersion in VR. This section walks through setting up hand tracking using Unity’s XR Interaction Toolkit and integrating bHaptics TactGlove DK2 for fingertip haptic feedback.


Set Up Hand Tracking

  1. Install Required Packages for Hand Tracking:
    • Click Window > Package Management > Package Manager.
    • Select XR Interaction Toolkit (v3.x or newer). Install if not installed already.
    • Click the Samples tab. Import Hand Interaction Demo.
    • Ensure Input System is installed and active (Project Settings > Player > Active Input Handling > Both).

  2. Set Up Hand Tracking in Scene:
    • Open the demo scene from the imported Hand Interaction Demo sample.
    • In your own XR Setup, copy these objects from the demo rig:
    • Left Hand
    • Right Hand
    • Hand Visualizer
    • Hand Menu With Button Activation
    • Paste them under your XR Setup in your scene.
    • These prefabs already include tracked-hand logic and colliders for interaction.

  3. Enable OpenXR Features:
    • Go to Edit > Project Settings > XR Plug-in Management > OpenXR.
    • Select your target platform (e.g., Android for Quest).
    • Under Features, enable:
    • Hand Tracking Subsystem
    • Meta Quest Support (if building for Quest)
    • Hand Interaction Profile


Configure bHaptics SDK

  1. Import the bHaptics Haptic Plugin:
    • Import the bHaptics Haptic Plugin from Unity Asset Store.
    • Go to Window > Package Management > Package Manager > My Assets.
    • Import the bHaptics Unity SDK (v2.3.0 or newer).
    • You should now have: Assets/Bhaptics/SDK2.
  2. Set up the Gloves Haptic Settings:
    • In the Project window, open Assets/Bhaptics/SDK2/Prefabs/.
    • Drag the following prefabs into your scene:
    • [bhaptics]
    • [bHapticsGlove]
    • Select [bHapticsGlove] in the Hierarchy.
    • Click Create New GloveHapticSettings to auto-generate settings.
    • Keep the default parameters for now; they work well for most setups.

Review this bHaptics documentation to learn more about the bHaptics Physics Glove SDK.


Add Physics to Fingertips

  1. Locate Fingertip Transforms:
    • Inside your hand hierarchy (i.e., RightHand Interaction Visual and LeftHand Interaction Visual), find:
    • R_ThumbTip
    • R_IndexTip
    • R_MiddleTip
    • R_RingTip
    • R_LittleTip
    • L_ThumbTip
    • L_IndexTip
    • L_MiddleTip
    • L_RingTip
    • L_LittleTip
  2. Add Collider and Rigidbody to Each Fingertip:
    • Select the fingertip GameObjects.
    • Click Add Component > Sphere Collider.
    • Set Radius to 0.005 – 0.01.
    • Check Is Trigger (the SimpleFingerHaptics script below is trigger-based and requires trigger colliders).
    • Click Add Component > Rigidbody.
    • Uncheck Use Gravity.
    • Check Is Kinematic.
    • Set Collision Detection to Continuous.

  3. Add the Haptic Script:
    • Create a new script named SimpleFingerHaptics.cs in your project.
    • Paste this code:
     // SimpleFingerHaptics.cs (Trigger-based)
     using UnityEngine;
     using Bhaptics.SDK2;        // for PositionType
     using Bhaptics.SDK2.Glove; // for BhapticsPhysicsGlove
    
     /// <summary>
     /// Trigger-based fingertip haptics for bHaptics TactGlove.
     /// Set fingertip Sphere Colliders to IsTrigger = true.
     /// At least one collider in the pair must have a Rigidbody.
     /// </summary>
     public class SimpleFingerHaptics : MonoBehaviour
     {
         [Tooltip("0=Thumb, 1=Index, 2=Middle, 3=Ring, 4=Little, 5=Wrist")]
         public int fingerIndex;
    
         [Tooltip("Enable for left glove; disable for right.")]
         public bool isLeft;
    
         [Header("Master/Slave Transforms")]
         [Tooltip("Tracked fingertip (visual).")]
         public Transform masterTransform;
    
         [Tooltip("Physics fingertip (trigger collider).")]
         public Transform slaveTransform;
    
         [Header("Optional")]
         [Tooltip("Default impact vector for enter haptic. "
               + "Increase for stronger tap.")]
         public Vector3 defaultEnterVector = new Vector3(0f, 0f, 1f);
    
         // Determines left/right glove side for SDK calls.
         // => is a C# expression-bodied property.
         private PositionType Side =>
             isLeft ? PositionType.GloveL : PositionType.GloveR;
    
         private void OnTriggerEnter(Collider other)
         {
             if (BhapticsPhysicsGlove.Instance == null) return;
    
             // Triggers lack velocity; use default vector for intensity.
             BhapticsPhysicsGlove.Instance.SendEnterHaptic(
                 Side, fingerIndex, defaultEnterVector);
         }
    
         private void OnTriggerStay(Collider other)
         {
             if (BhapticsPhysicsGlove.Instance == null) return;
             if (!masterTransform || !slaveTransform) return;
    
             // While overlapping, sustain haptic based on offset.
             BhapticsPhysicsGlove.Instance.SendStayHaptic(
                 Side, fingerIndex, slaveTransform, masterTransform);
         }
    
         private void OnTriggerExit(Collider other)
         {
             if (BhapticsPhysicsGlove.Instance == null) return;
    
             // End of contact → exit haptic.
             BhapticsPhysicsGlove.Instance.SendExitHaptic(
                 Side, fingerIndex);
         }
     }
    
  4. Configure Each Fingertip in the Inspector:
    • fingerIndex: Assign 0 = Thumb, 1 = Index, 2 = Middle, 3 = Ring, 4 = Little, 5 = Wrist (optional).
    • isLeft: Check for left-hand fingertips. Uncheck for right-hand fingertips.
    • masterTransform: Drag in the tracked fingertip transform (the one following hand tracking). Usually the same tip under your Hand Interaction Visual.
    • slaveTransform: Drag in the fingertip GameObject that has this script and collider attached (often itself).
    • In most XRI Hand Interaction setups, you can assign the same GameObject for both masterTransform and slaveTransform if you’re not using a physics proxy hand.
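    • Optionally, see the helper sketch after this list for one way to assign these fields across all ten fingertips at once.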

  5. Deploy and Test:
    • Build and deploy your project to the headset.
    • On the headset, make sure your TactGlove DK2 devices are paired and connected through the bHaptics Player app or system Bluetooth settings.
    • Once the gloves are paired, launch your Unity app to test fingertip collisions and haptic feedback in VR.
  6. Fine-Tuning Haptic Sensation:
    • Open your GloveHapticSettings asset (attached to [bHapticsGlove]).
    • Adjust parameters as needed.
    • motorIntensityMin / Max: Controls vibration strength.
    • velocityChangeMin / Max: Tunes how sensitive it is to impact speed.
    • decayRate and decayDelay: Control how long sustained contact persists.
    • Resize fingertip colliders (e.g., radius 0.006–0.01) to optimize detection.
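
To cut down the manual setup in step 4, here is a hedged helper sketch (FingerHapticsSetup is a hypothetical name, not part of the bHaptics SDK) that adds SimpleFingerHaptics to each fingertip and fills in its fields, assuming you are not using a separate physics proxy hand.

     using UnityEngine;

     // Hypothetical helper: attach to a hand root to add and configure
     // SimpleFingerHaptics on each fingertip in one pass.
     public class FingerHapticsSetup : MonoBehaviour
     {
         [Tooltip("Fingertip transforms in order: Thumb, Index, Middle, Ring, Little.")]
         public Transform[] fingertips = new Transform[5];

         [Tooltip("Enable for the left hand; disable for the right.")]
         public bool isLeft;

         private void Awake()
         {
             for (int i = 0; i < fingertips.Length; i++)
             {
                 if (fingertips[i] == null) continue;

                 var haptics = fingertips[i].gameObject.AddComponent<SimpleFingerHaptics>();
                 haptics.fingerIndex = i;                  // 0 = Thumb ... 4 = Little
                 haptics.isLeft = isLeft;
                 haptics.masterTransform = fingertips[i];  // same transform when no
                 haptics.slaveTransform = fingertips[i];   // physics proxy hand is used
             }
         }
     }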

Audio in VR

Audio in VR plays a crucial role in creating immersive and realistic experiences by simulating 3D spatial sound that corresponds to the user’s environment and physical movements. It enhances presence, emotional engagement, and user understanding by allowing users to perceive direction, distance, and material characteristics of sound sources, just as they would in a real-world engineering environment. Key concepts include:

  • Audio Feedback: Audio feedback uses sound cues to indicate interactions, state changes, or successful task completions. These auditory signals can be subtle (e.g., soft beeps for UI interaction) or mechanical (e.g., metallic clunks when machinery operates). In XFactory, placing a box into a rack at the logistics station can trigger a satisfying “clunk” sound, reinforcing task completion.

  • Spatial Audio: This technique adjusts sound characteristics—such as stereo balance, volume, and filtering—based on the user’s position and orientation. It provides directional awareness, helping users interpret where sounds are coming from and how far away they are. In the manufacturing station of XFactory, the CNC machines should sound louder as users walk toward them and quieter as they move away.

  • Audio Events: Audio events are specific triggers that play sound effects during interactions. These include object grabs, button presses, or environmental triggers (like a door opening). In the assembly station, interacting with the large display screen should trigger a crisp UI confirmation tone.


Principles

  • Immersion & Presence: Craft audio that aligns seamlessly with the visual and interactive world. Every machine, tool or environment element should emit appropriate sounds—whether it’s the steady, low-pitched hum of a CNC lathe or the distant echo of footsteps in a factory hall. Even subtle layers of ambient noise (airflow, distant machinery, electrical buzz) reinforce spatial context and keep users “in” the scene.

  • Spatial & Directional Cues: Use accurate 3D audio positioning so users can pinpoint sound sources at a glance. In a multi-station workshop, they should hear a welding arc crackle to their left or a conveyor belt whirring behind them. Consistent volume attenuation and HRTF-based panning help operators quickly locate equipment or hazards without breaking visual focus.

  • Clarity & Consistency: Define a clear “audio vocabulary” for key actions and system states. Reuse the same short beep for every successful control-panel selection or the same alert tone for safety warnings across all modules. Predictable, unambiguous sounds build muscle memory and reduce cognitive load, so users instantly recognize outcomes and system responses.

  • Layered Soundscapes: Build depth by stacking multiple audio layers. By blending these levels thoughtfully (mixing quieter ambient tracks under louder interactive cues), you create a rich, believable soundscape that enhances situational awareness without masking critical feedback:

    1. Ambient Base: Continuous environmental loops (air handling units, distant machinery).
    2. Functional Mid-Layer: Operational sounds tied to active components (motors, hydraulics).
    3. Interactive Top-Layer: Event-driven cues (button clicks, error tones).
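
As a rough illustration of this layering (the clip fields and volume levels below are placeholder assumptions, not XFactory assets), the sketch stacks an ambient loop, a functional loop, and an event-driven cue on separate AudioSources:

     using UnityEngine;

     // Sketch: a three-layer soundscape (ambient / functional / interactive).
     public class LayeredSoundscape : MonoBehaviour
     {
         public AudioClip ambientLoop;      // e.g., air handling units
         public AudioClip functionalLoop;   // e.g., motor hum of an active machine
         public AudioClip interactiveCue;   // e.g., button click or error tone

         private AudioSource ambientSource;
         private AudioSource functionalSource;
         private AudioSource interactiveSource;

         void Start()
         {
             // Quieter ambient base under a louder functional mid-layer.
             ambientSource = CreateLayer(ambientLoop, 0.2f, true);
             functionalSource = CreateLayer(functionalLoop, 0.5f, true);
             interactiveSource = CreateLayer(null, 1.0f, false);
         }

         // Play an event-driven cue on the top layer without interrupting the loops.
         public void PlayCue()
         {
             if (interactiveCue != null)
             {
                 interactiveSource.PlayOneShot(interactiveCue);
             }
         }

         private AudioSource CreateLayer(AudioClip clip, float volume, bool loop)
         {
             var source = gameObject.AddComponent<AudioSource>();
             source.clip = clip;
             source.volume = volume;
             source.loop = loop;
             source.spatialBlend = 1.0f;   // positioned in 3D space
             if (clip != null && loop)
             {
                 source.Play();
             }
             return source;
         }
     }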

Event-Driven Audio

In VR, event-driven audio feedback is triggered in response to specific game or application events, such as footsteps, machinery actions, UI interaction, or object manipulation. It creates a dynamic, interactive sound experience that responds to user behavior in real time, enhancing immersion. Let’s demonstrate how to trigger different sounds when a large box is dropped onto various surfaces inside the logistics station of the XFactory. We will use the XR Interaction Toolkit for grabbing and dropping, an AudioSource for playing sounds, and physics-based collision detection for triggering different clips.

  1. Prepare Your Environment:
    • Inside the logistics station, create or place a grabbable box (e.g., Box_Large_01a_Prefab_01).
    • Make sure the box has a Rigidbody, a Collider, and an XR Grab Interactable.
    • Position the box next to the scale (Industrial_Scale_01a) and a wooden pallet (e.g., Pallet_02_Prefab_02).
    • Confirm the scale, pallet, and concrete floor have appropriate colliders.

  2. Create and Configure an Audio Trigger Script:
    • Make sure your box prefab (Box_Large_01a_Prefab_01) has an AudioSource component. If not, the script will automatically add one at runtime, but it’s good practice to add and configure it in advance if you want to control volume, spatial blend, or other settings.
    • Create and attach the following script to your box prefab:
     using UnityEngine;
    
     public class SurfaceAudioTrigger : MonoBehaviour
     {
         public AudioClip metalClip;
         public AudioClip woodClip;
         public AudioClip concreteClip;
    
         private AudioSource audioSource;
    
         void Start()
         {
             audioSource = GetComponent<AudioSource>();
             if (audioSource == null)
             {
                 audioSource = gameObject.AddComponent<AudioSource>();
             }
         }
    
         private void OnCollisionEnter(Collision collision)
         {
             string tag = collision.gameObject.tag;
    
             switch (tag)
             {
                 case "Metal":
                     PlaySound(metalClip);
                     break;
                 case "Wood":
                     PlaySound(woodClip);
                     break;
                 case "Concrete":
                     PlaySound(concreteClip);
                     break;
             }
         }
    
         void PlaySound(AudioClip clip)
         {
             if (clip != null)
             {
                 audioSource.PlayOneShot(clip);
             }
         }
     }
    
  3. Configure the Script:
    • Select the box GameObject in the Hierarchy.
    • In the Inspector, find the Surface Audio Trigger component.
    • Drag appropriate audio clips into the three fields: Metal Clip, Wood Clip, and Concrete Clip.

  4. Tag the Surfaces:
    • Go to Tags & Layers > Add Tag and create the tags Metal, Wood, and Concrete.
    • Tag the scale Metal.
    • Tag the pallet Wood.
    • Tag the floor Concrete.

  5. Test the Behavior:
    • Play the scene (using the Link app) or deploy it to your VR headset.
    • Pick up the box using the controller.
    • Drop it on each surface.
    • Listen for context-aware impact audio feedback.

This setup showcases dynamic feedback based on user action and environment context, tightly integrating VR interaction and audio design. You can expand this system to other factory objects—like different drop sounds for metal parts in the production station, or warning buzzers if items are dropped in restricted zones.
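
For example, the restricted-zone idea could be sketched as follows (the RestrictedZoneBuzzer name and the zone object are assumptions for illustration): a trigger volume that plays an alert tone whenever a physical item enters it.

     using UnityEngine;

     // Sketch: a warning buzzer for a restricted floor zone. Attach to a GameObject
     // whose collider covers the zone; assumes dropped items carry Rigidbodies.
     [RequireComponent(typeof(Collider))]
     [RequireComponent(typeof(AudioSource))]
     public class RestrictedZoneBuzzer : MonoBehaviour
     {
         [Tooltip("Alert tone played when an object lands in the zone.")]
         public AudioClip buzzerClip;

         private AudioSource audioSource;

         private void Start()
         {
             audioSource = GetComponent<AudioSource>();
             GetComponent<Collider>().isTrigger = true;   // make the zone a trigger volume
         }

         private void OnTriggerEnter(Collider other)
         {
             // React only to physics objects (e.g., dropped boxes), not static geometry.
             if (other.attachedRigidbody != null && buzzerClip != null)
             {
                 audioSource.PlayOneShot(buzzerClip);
             }
         }
     }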


3D Spatial Audio

3D spatial audio simulates sound sources in a three-dimensional space, allowing users to perceive direction, distance, and environmental effects. This is especially useful in immersive VR environments like your XFactory, where spatial context enhances realism and user awareness. Let’s enable ambient machine sound for the CNC_Mill_Set GameObject in the manufacturing station using Unity’s 3D audio features.

  1. Prepare the CNC Machine:
    • In the Hierarchy, locate CNC_Mill_Set
    • Confirm it already has an Audio Source component (or add one if not).
  2. Configure the Audio Source:
    • Go to the Audio Source component.
    • Audio Resource: Assign a looping ambient sound (e.g., machine.ogg) that represents the machine hum.
    • Play On Awake: Enable this so the sound starts automatically when the scene loads.
    • Loop: Enable this to keep the sound playing continuously.
    • Spatial Blend: Set to 1.0 to make the sound fully 3D and spatially aware.
    • Min Distance: Set to 3 meters. The sound will be at full volume within this range.
    • Max Distance: Set to 20 meters. Beyond this distance, the sound fades out gradually.
    • Optionally, adjust Doppler Level to simulate pitch shift when the user or object moves, or Spread to control stereo width for large machines.

  3. Automate Playback and 3D Settings (Optional):
    • Attach the script below to the CNC_Mill_Set GameObject.
    • Assign an ambient clip such as machine.ogg in the Inspector.
     using UnityEngine;
    
     // This script plays a looping ambient sound (e.g., CNC machine hum) in the environment
     [RequireComponent(typeof(AudioSource))]
     public class AmbientMachineSound : MonoBehaviour
     {
         [Tooltip("The looping ambient sound clip to play (e.g., CNC hum)")]
         public AudioClip ambientClip;
    
         private AudioSource audioSource;
    
         void Start()
         {
             // Get or add the AudioSource component
             audioSource = GetComponent<AudioSource>();
    
             // Configure AudioSource for spatial audio playback
             audioSource.clip = ambientClip;
             audioSource.loop = true;
             audioSource.playOnAwake = true;
             audioSource.spatialBlend = 1.0f;     // Fully 3D
             audioSource.minDistance = 3f;        // Full volume within 3 meters
             audioSource.maxDistance = 20f;       // Sound fades beyond 20 meters
    
             // Play the ambient sound
             audioSource.Play();
         }
     }
    

    The script configures an AudioSource to loop a 3D ambient clip, sets near and far fade distances, and starts playback automatically. The sound naturally fades with distance, making it ideal for prefabs or multiple ambient machines—ensuring consistent, realistic audio with minimal setup.

  4. Add Audio Reverb Zone for Factory Acoustics (Optional):
    • In the Hierarchy, right-click and choose Audio > Audio Reverb Zone.
    • Position the zone to fully encompass the desired area (e.g., manufacturing station).
    • Go to the Inspector.
    • Set Min Distance to 5 meters. Inside this range, full reverb is applied.
    • Set Max Distance to 15 meters. Outside this, reverb fades out completely.
    • Set Reverb Preset to Hangar, Stoneroom, or Room to simulate realistic industrial acoustics.

Reverb zones only affect 3D audio sources within their area. As the user moves in or out of the zone, sounds dynamically gain or lose reverb — enhancing the spatial feel of different factory zones.
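
If you prefer to set up the zone from code rather than the Inspector, a minimal sketch (using the same distances suggested above and the Hangar preset) might look like this:

     using UnityEngine;

     // Sketch: configure an Audio Reverb Zone from code with the values used above.
     [RequireComponent(typeof(AudioReverbZone))]
     public class FactoryReverbSetup : MonoBehaviour
     {
         private void Start()
         {
             var zone = GetComponent<AudioReverbZone>();
             zone.minDistance = 5f;                          // full reverb inside 5 meters
             zone.maxDistance = 15f;                         // reverb fades out by 15 meters
             zone.reverbPreset = AudioReverbPreset.Hangar;   // large industrial acoustics
         }
     }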


Proximity-Triggered Audio

A Proximity-Triggered Audio setup plays sound only when the user is within a defined range of a sound-emitting object. This improves realism and performance by avoiding unnecessary audio playback in distant or inactive areas. Instead of always playing machine audio, let’s make the CNC_Mill_Set emit sound only when the user approaches, mimicking a real-world scenario where machine noise becomes prominent only when nearby.

  1. Add and Configure the Audio Source:
    • Select the CNC_Mill_Set GameObject in the manufacturing station.
    • Add an Audio Source component if it doesn’t already exist.
    • Go to the Inspector.
    • Audio Clip: Assign a mechanical loop (e.g., machine.ogg).
    • Loop: Enabled.
    • Play On Awake: Disabled (sound will be triggered via script).
    • Spatial Blend: 1 (Fully 3D).
    • Min Distance: 3 (full volume within 3m).
    • Max Distance: 20 (fades out gradually).
  2. Add and Configure the Proximity Trigger Script:
    • Prepare a script that checks the distance between the user’s headset (camera) and the CNC machine. If within a specified radius, the sound plays. If the user walks away, the sound stops.
     using UnityEngine;
    
     /// Plays machine sound when user is near the CNC_Mill_Set object.
     public class ProximityAudioTrigger : MonoBehaviour
     {
         public AudioClip machineClip;
         public Transform userHead;
         public float playDistance = 6f;
    
         private AudioSource audioSource;
    
         void Start()
         {
             audioSource = GetComponent<AudioSource>();
             audioSource.clip = machineClip;
             audioSource.loop = true;
             audioSource.spatialBlend = 1f;
             // Do not play on awake; controlled by proximity
         }
    
         void Update()
         {
             if (userHead == null) return;
    
             float distance = Vector3.Distance(userHead.position, transform.position);
    
             if (distance < playDistance && !audioSource.isPlaying)
             {
                 audioSource.Play();
             }
             else if (distance >= playDistance && audioSource.isPlaying)
             {
                 audioSource.Stop();
             }
         }
     }
    
  3. Configure the Script:
    • Attach the ProximityAudioTrigger script to the same CNC_Mill_Set GameObject.
    • Go to the Inspector.
    • Drag your audio clip (e.g., machine.ogg) into the Machine Clip field.
    • Drag the Main Camera (XR Rig > Camera Offset > Main Camera) into the User Head field.
    • Adjust the Play Distance value to control when the sound activates (e.g., 6 meters for a moderate range).

  4. Play and Test:
    • Enter Play Mode in Unity with your VR headset connected.
    • Walk toward the CNC machine and listen for the audio to fade in naturally as you approach.
    • Move away and notice the audio fading out until it stops completely.
    • Try different approach angles to confirm the sound spatialization feels realistic.
    • Adjust Play Distance, Min Distance, and Max Distance in the Inspector if the audio triggers too soon, too late, or feels unnatural.

Proximity-triggered audio limits active audio sources in large scenes, reducing processing load. It also mimics how object sounds become audible only when the user is nearby and enhances spatial awareness by tying audio to physical proximity.
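
If the hard Play/Stop transition feels abrupt at the threshold, a hedged variation is to fade the volume instead and pause only once fully silent; the ProximityAudioFade name and the fade speed below are arbitrary starting assumptions.

     using UnityEngine;

     // Sketch: proximity-based volume fade instead of a hard Play/Stop at the threshold.
     [RequireComponent(typeof(AudioSource))]
     public class ProximityAudioFade : MonoBehaviour
     {
         public Transform userHead;          // XR Rig main camera
         public float playDistance = 6f;     // fade in when the user is inside this range
         public float fadeSpeed = 1.5f;      // volume change per second

         private AudioSource audioSource;

         private void Start()
         {
             audioSource = GetComponent<AudioSource>();
             audioSource.loop = true;
             audioSource.spatialBlend = 1f;
             audioSource.volume = 0f;
             audioSource.Play();
         }

         private void Update()
         {
             if (userHead == null) return;

             float distance = Vector3.Distance(userHead.position, transform.position);
             float targetVolume = distance < playDistance ? 1f : 0f;

             audioSource.volume = Mathf.MoveTowards(
                 audioSource.volume, targetVolume, fadeSpeed * Time.deltaTime);

             // Pause when fully faded out so distant machines cost nothing to mix.
             if (audioSource.volume <= 0f && audioSource.isPlaying)
             {
                 audioSource.Pause();
             }
             else if (audioSource.volume > 0f && !audioSource.isPlaying)
             {
                 audioSource.UnPause();
             }
         }
     }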


Key Takeaways

Integrating touch and sound in VR transforms interaction from a visual process into a multisensory experience that strengthens realism, awareness, and precision. In engineering simulations like XFactory, synchronized haptic and audio cues—ranging from fingertip vibrations and controller pulses to spatial and proximity-based machine sounds—provide intuitive, context-rich feedback that enhances safety and engagement. Effective design balances consistency, proportional intensity, and timing across senses while allowing user comfort and adaptability. By layering event-driven haptics, spatial audio, and tactile gloves within responsive, context-aware systems, developers can create immersive VR environments that feel authentic, support efficient task performance, and mirror the dynamics of real-world engineering operations.