Supported Devices

Theoretically all WebXR-capable devices and browsers should be supported. That being said, we've tested the following configurations:

| Tested VR Device | Browser | Notes |
| --- | --- | --- |
| Apple Vision Pro | ✔️ Safari | hand tracking, support for transient pointer |
| Meta Quest 1 | ✔️ Meta Browser | hand tracking, support for sessiongranted¹ |
| Meta Quest 2 | ✔️ Meta Browser | hand tracking, support for sessiongranted¹, passthrough (black and white) |
| Meta Quest 3 | ✔️ Meta Browser | hand tracking, support for sessiongranted¹, passthrough, depth sensing, mesh tracking |
| Meta Quest Pro | ✔️ Meta Browser | hand tracking, support for sessiongranted¹, passthrough |
| Pico Neo 3 | ✔️ Pico Browser | no hand tracking, inverted controller thumbsticks |
| Pico Neo 4 | ✔️ Pico Browser | passthrough, hand tracking² |
| Oculus Rift 1/2 | ✔️ Chrome | |
| Hololens 2 | ✔️ Edge | hand tracking, support for AR and VR (in VR mode, the background is rendered as well) |
| Looking Glass Portrait | ✔️ Chrome | requires a shim, see samples |

| Tested AR Device | Browser | Notes |
| --- | --- | --- |
| Android 10+ | ✔️ Chrome | |
| Android 10+ | ✔️ Firefox | |
| iOS 15+ | ✔️ WebXR Viewer | does not fully implement standards, but supported |
| iOS 15+ | (✔️)³ Safari | no full code support, but Needle Everywhere Actions are supported for creating dynamic, interactive USDZ files |
| Hololens 2 | ✔️ Edge | |
| Hololens 1 | ❌ | no WebXR support |
| Magic Leap 2 | ✔️ | |

| Not Tested but Should Work™️ | Browser | Notes |
| --- | --- | --- |
| Magic Leap 1 | | please let us know if you tried! |

¹ Requires enabling a browser flag: chrome://flags/#webxr-navigation-permission
² Requires enabling a toggle in the Developer settings
³ Uses Everywhere Actions or other approaches

Examples

Visit our Needle Engine XR Samples to try many interactive examples right now!

Adding VR and AR capabilities to a scene

AR, VR, and networking capabilities in Needle Engine are designed to be modular. You can choose to not support any of them, or add only specific features.

Basic capabilities

  • Enable AR and VR
    Add a WebXR component.
    Optional: you can set a custom avatar by referencing an Avatar Prefab.
    By default a very basic DefaultAvatar is assigned.

  • Enable Teleportation
    Add a TeleportTarget component to object hierarchies that can be teleported on.
    To exclude specific objects, set their layer to IgnoreRaycasting.

Multiplayer

  • Enable Networking
    Add a SyncedRoom component.

  • Enable Desktop Viewer Sync
    Add a SyncedCamera component.

  • Enable Voice Chat
    Add a VoIP component.

Note: these components can be anywhere inside your GltfObject hierarchy. They can also all be on the same GameObject.

Castle Builder uses all of the above for a cross-platform multiplayer sandbox experience.
— #madebyneedle 💚

Special AR Components

  • Define the AR Session Root and Scale
    Add a WebARSessionRoot component to your root object.
    Here you can define the user scale to shrink (< 1) or enlarge (> 1) the user in relation to the scene when entering AR.
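
As a quick sanity check for that inverse relationship, here is a hypothetical helper (not a Needle Engine API) that computes how large the scene appears for a given user scale:

```typescript
// Hypothetical helper, not part of Needle Engine: with a user scale below 1
// the user shrinks, so the scene appears proportionally larger, and vice versa.
function apparentSceneScale(userScale: number): number {
  if (userScale <= 0) throw new Error("user scale must be positive");
  return 1 / userScale;
}

// A user scale of 0.5 makes the world appear twice as large to the user;
// a user scale of 2 makes it appear half as large.
```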

Controlling object display for XR

  • Define whether an object is visible in Browser, AR, VR, First Person, Third Person
    Add an XR Flag component to the object you want to control. Change options in the dropdown as needed.

    Common use cases are:

    • hiding floors when entering AR
    • hiding Avatar parts in First or Third Person views (e.g. first-person head shouldn't be visible).

Travelling between VR worlds

Needle Engine supports the sessiongranted state. This allows users to seamlessly traverse between WebXR applications without leaving an immersive session – they stay in VR or AR.

Currently, this is only supported on Oculus Quest 1 and 2 in the Oculus Browser. On other platforms, users will be kicked out of their current immersive session and have to enter VR again on the new page.
Requires enabling a browser flag: chrome://flags/#webxr-navigation-permission
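
In script code, reacting to the grant amounts to listening for an event on the XR system. The sketch below follows the draft navigation proposal's "sessiongranted" event name; the XR system is injected as a parameter (it would be navigator.xr in a real page) so the logic stays self-contained, and startSession is a placeholder for whatever your page does to enter immersive mode:

```typescript
// Sketch based on the WebXR navigation proposal: the browser fires a
// "sessiongranted" event when the user arrives from another immersive session,
// allowing the page to re-enter XR without a new user gesture.
type XRSystemLike = {
  addEventListener(type: string, listener: () => void): void;
};

function enterXROnGrant(xr: XRSystemLike, startSession: () => void): void {
  // Re-enter immersive mode immediately instead of waiting for a click.
  xr.addEventListener("sessiongranted", startSession);
}
```

In a real page you would call `enterXROnGrant(navigator.xr, ...)` once at startup, guarded by a check that `navigator.xr` exists.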

  • Click on objects to open links
    Add the OpenURL component, which makes it very easy to build connected worlds.

Scripting

Read more about scripting for XR in the XR scripting documentation.

Avatars

While we don't currently provide an out-of-the-box integration with external avatar systems, you can create application-specific avatars or custom systems.

  • Create a custom Avatar
    • Create an empty GameObject as avatar root
    • Add an object named Head and add an XRFlag that's set to Third Person
    • Add objects named HandLeft and HandRight
    • Add your graphics below these objects.
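
The steps above can be sketched as plain data; the Head/HandLeft/HandRight names are the ones Needle looks up, while the root name and the lookup helper below are hypothetical illustrations:

```typescript
// Minimal sketch of the avatar hierarchy described above. Only the names are
// significant; in the editor these would be GameObjects with graphics below them.
type AvatarNode = { name: string; children?: AvatarNode[] };

const avatarRoot: AvatarNode = {
  name: "MyAvatar", // empty GameObject as the avatar root (name is illustrative)
  children: [
    { name: "Head" },     // carries an XRFlag set to Third Person
    { name: "HandLeft" },
    { name: "HandRight" },
  ],
};

// Hypothetical helper: find a node by name, the way an avatar system might
// resolve the head and hand targets.
function findByName(node: AvatarNode, name: string): AvatarNode | undefined {
  if (node.name === name) return node;
  for (const child of node.children ?? []) {
    const hit = findByName(child, name);
    if (hit) return hit;
  }
  return undefined;
}
```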

Experimental Avatar Components

There's a number of experimental components for building more expressive Avatars. At this point we recommend duplicating them to make your own variants, since they might be changed or removed at a later point.

Example Avatar Rig with basic neck model and limb constraints

  • Random Player Colors
    As an example for avatar customization, you can add a PlayerColor component to your renderers.
    This randomized color is synchronized between players.

  • Eye Rotation
    AvatarEyeLook_Rotation rotates GameObjects (eyes) to follow other avatars and a random target. This component is synchronized between players.

  • Eye Blinking
    AvatarBlink_Simple randomly hides GameObjects (eyes) every few seconds, emulating a blink.

    Example Avatar Prefab hierarchy

  • Offset Constraint
    OffsetConstraint allows shifting an object in relation to another one in Avatar space. This makes it possible, for example, to have a Body follow the Head while keeping its rotation level. It can also be used to construct simple neck models.

  • Limb Constraint
    BasicIKConstraint is a very minimalistic constraint that takes two transforms and a hint. This is useful to construct simple arm or leg chains. As rotation is currently not properly implemented, arms and legs may need to be rotationally symmetric to "look right". It's called "Basic" for a reason!
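
To illustrate the kind of math a two-transforms-plus-a-hint constraint performs, here is a hedged sketch of a standard two-bone IK solve using the law of cosines. This is not the BasicIKConstraint source; all names and the vector type are illustrative:

```typescript
// Tiny vector helpers so the sketch stays self-contained.
type Vec3 = { x: number; y: number; z: number };

const sub = (a: Vec3, b: Vec3): Vec3 => ({ x: a.x - b.x, y: a.y - b.y, z: a.z - b.z });
const add = (a: Vec3, b: Vec3): Vec3 => ({ x: a.x + b.x, y: a.y + b.y, z: a.z + b.z });
const scale = (a: Vec3, s: number): Vec3 => ({ x: a.x * s, y: a.y * s, z: a.z * s });
const len = (a: Vec3): number => Math.hypot(a.x, a.y, a.z);
const norm = (a: Vec3): Vec3 => scale(a, 1 / (len(a) || 1));
const cross = (a: Vec3, b: Vec3): Vec3 => ({
  x: a.y * b.z - a.z * b.y,
  y: a.z * b.x - a.x * b.z,
  z: a.x * b.y - a.y * b.x,
});

// Returns the mid-joint (elbow/knee) position for a chain root→target with
// bone lengths l1 and l2; the hint pulls the joint toward its side of the chain.
function solveTwoBone(root: Vec3, target: Vec3, hint: Vec3, l1: number, l2: number): Vec3 {
  const toTarget = sub(target, root);
  // Clamp so an unreachable target yields a (nearly) straight chain.
  const d = Math.min(len(toTarget), l1 + l2 - 1e-6);
  const dir = norm(toTarget);
  // Law of cosines: distance of the joint along the root→target axis...
  const along = (d * d + l1 * l1 - l2 * l2) / (2 * d);
  // ...and its perpendicular offset from that axis.
  const lift = Math.sqrt(Math.max(l1 * l1 - along * along, 0));
  // Bend direction: component of (hint - root) perpendicular to the chain axis.
  const side = sub(hint, root);
  const bend = norm(cross(cross(dir, side), dir));
  return add(root, add(scale(dir, along), scale(bend, lift)));
}
```

For example, with the root at the origin, the target one unit along x, both bones of length 1, and a hint above the chain, the joint lands at roughly (0.5, 0.87, 0), i.e. the arm bends upward toward the hint.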

HTML Content Overlays in AR

If you want to display different HTML content depending on whether the client is in a regular browser or in AR or VR, you can control this via HTML element classes. For example, to make content appear on desktop and in AR, add a <div class="desktop ar"> ... </div> inside the <needle-engine> tag:

<needle-engine>
    <div class="desktop ar" style="pointer-events:none;">
        <div class="positioning-container">
          <p>your content for AR and desktop goes here</p>
          <p class="only-in-ar">This will only be visible in AR</p>
        </div>
    </div>
</needle-engine>

Content Overlays are implemented using the optional dom-overlay feature which is usually supported on screen-based AR devices (phones, tablets).

Use the .ar-session-active class to show/hide specific content while in AR. The :xr-overlay pseudo class shouldn't be used at this point because using it breaks Mozilla's WebXR Viewer.

.only-in-ar {
  display: none;
}

.ar-session-active .only-in-ar {
  display:initial;
}

It's worth noting that the overlay element will always be displayed fullscreen while in XR, independent of the styling that has been applied. If you want to align items differently, create a container inside the class="ar" element.

Image Tracking

WebXR ImageTracking is still in the "draft" phase (see the Marker Tracking Explainer), but you can still use it with Needle Engine today:

  • Enable WebXR Incubations in Chrome
  • Add the WebXRImageTracking component
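
The session request that the draft spec describes looks roughly like the sketch below. The "image-tracking" feature descriptor and the trackedImages/widthInMeters fields follow the Marker Tracking explainer and may change; the XR system is injected as a parameter (it would be navigator.xr in a real page) so the call shape stays self-contained:

```typescript
// Sketch of requesting an AR session with (draft) image tracking enabled.
type SessionInit = {
  optionalFeatures?: string[];
  trackedImages?: { image: unknown; widthInMeters: number }[];
};
type XRLike = {
  requestSession(mode: string, init?: SessionInit): Promise<unknown>;
};

function requestImageTrackingSession(xr: XRLike, markerBitmap: unknown): Promise<unknown> {
  return xr.requestSession("immersive-ar", {
    optionalFeatures: ["image-tracking"],
    // widthInMeters tells the runtime the physical size of the printed marker.
    trackedImages: [{ image: markerBitmap, widthInMeters: 0.1 }],
  });
}
```

With the WebXRImageTracking component, Needle Engine issues an equivalent request for you; the sketch only shows what happens under the hood per the draft spec.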

You can find additional documentation in the Everywhere Actions section.

Without that spec, one can still request camera image access and run custom algorithms to determine device pose.
Libraries to add image tracking:

Augmented Reality and WebXR on iOS

Augmented Reality experiences on iOS are somewhat limited, due to Apple currently not supporting WebXR on iOS devices.

Needle Engine's Everywhere Actions are designed to fill that gap, bringing automatic interactive capabilities to iOS devices for scenes composed of specific components. They support a subset of the functionality that's available in WebXR, for example spatial audio, image tracking, animations, and more. See the docs for more information.

Musical Instrument โ€“ WebXR and QuickLook support

Here's an example for a musical instrument that uses Everywhere Actions and thus works in browsers and in AR on iOS devices. It uses spatial audio, animation, and tap interactions.

Everywhere Actions and other options for iOS AR

There are also other options for guiding iOS users to even more capable interactive AR experiences:

  1. Exporting content on-the-fly as USDZ files.
    These files can be displayed on iOS devices in AR. When exported from scenes with Everywhere Actions, the interactivity is the same – more than sufficient for product configurators, narrative experiences, and similar. An example is Castle Builder, where creations (not the live session) can be viewed in AR.

    Encryption in Space uses this approach. Players can collaboratively place text into the scene on their screens and then view the results in AR on iOS. On Android, they can also interact right in WebXR.
    — #madewithneedle by Katja Rempel 💚

  2. Guiding users towards WebXR-compatible browsers on iOS.
    Depending on your target audience, you can guide users on iOS towards, for example, Mozilla's WebXR Viewer to experience AR on iOS.

  3. Using camera access and custom algorithms on iOS devices.
    One can request camera image access and run custom algorithms to determine the device pose.
    While we currently don't provide built-in components for this, here are a few references to libraries and frameworks that we want to try in the future:

References

WebXR Device API
caniuse: WebXR
Apple's Preliminary USD Behaviours