Here is an overview of some of the components that we provide. Some of them map directly to Unity components, while others are core components from Needle Engine.

For a complete list, please have a look at the components inside the folders node_modules/@needle-tools/engine/engine-components and engine-components-experimental.

You can always add your own components or add wrappers for Unity components we haven't provided yet.

Learn more in the Scripting section of our docs.
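For example, a custom component is a TypeScript class that extends the engine's Behaviour base class. The following is a minimal sketch: Behaviour, the @serializable decorator and context.time come from the @needle-tools/engine package, while the Rotator name and its speed field are made up for illustration.

```ts
import { Behaviour, serializable } from "@needle-tools/engine";

// Hypothetical example component: rotates its object every frame.
export class Rotator extends Behaviour {
    // Exposed field (serialized and exported by the editor integration)
    @serializable()
    speed: number = 1;

    update() {
        // deltaTime comes from the engine's time context
        this.gameObject.rotateY(this.speed * this.context.time.deltaTime);
    }
}
```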

Audio

| Name | Description |
| --- | --- |
| AudioListener | |
| AudioSource | Use to play audio |

Animation

| Name | Description |
| --- | --- |
| Animator with AnimatorController | Export with animation state machine, conditions and transitions |
| Animation | Most basic animation component. Only the first clip is exported |
| PlayableDirector with TimelineAsset | Export powerful sequences to control animation, audio, state and more |

Rendering

| Name | Description |
| --- | --- |
| Camera | |
| Light | DirectionalLight, PointLight, SpotLight. Note that you can also use it to bake light (e.g. rectangular light shapes) |
| XRFlag | Control when objects will be visible, e.g. only enable an object when in AR |
| DeviceFlag | Control on which devices objects will be visible |
| LODGroup | |
| ParticleSystem | Experimental and currently not fully supported |
| VideoPlayer | Plays back videos from a URL or a referenced video file (copied to the output on export). The VideoPlayer also supports streaming from MediaStream objects or M3U8 livestream URLs |
| MeshRenderer | Used to handle rendering of objects, including lightmapping and instancing |
| SkinnedMeshRenderer | See MeshRenderer |
| SpriteRenderer | Used to render sprites and sprite animations |
| Volume with PostProcessing asset | See the Postprocessing table below |

Postprocessing

Postprocessing effects use the pmndrs postprocessing library under the hood. This means you can also easily add your own custom effects and get an automatically optimized postprocessing pass.

  • Unity only: Postprocessing effect export is only supported with URP.

  • Unity only: "extra Unity component" means the effect has to be added as its own component next to the Volume component. For example, for Antialiasing, add a Volume component and an Antialiasing component to the same GameObject.

| Effect Name | Notes |
| --- | --- |
| Antialiasing | extra Unity component |
| Bloom | via Volume asset |
| Chromatic Aberration | via Volume asset |
| Color Adjustments / Color Correction | via Volume asset |
| Depth Of Field | via Volume asset |
| Pixelation | |
| Screenspace Ambient Occlusion (N8) | |
| Screenspace Ambient Occlusion | |
| Tilt Shift Effect | |
| Vignette | via Volume asset |
| Your custom effect | Create a new class that extends Needle Engine's PostProcessingEffect class, then call registerCustomEffectType with your effect name and class type (see the sketch below) |
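As a rough sketch of the custom-effect workflow described in the last row above: you wrap an effect (for example one from the pmndrs postprocessing package) in a class extending PostProcessingEffect and register it. PostProcessingEffect and registerCustomEffectType are named in the table; the members used inside the class (typeName, onCreateEffect) and the wrapped VignetteEffect are assumptions for illustration, so check the Scripting docs for the exact API.

```ts
import { PostProcessingEffect, registerCustomEffectType } from "@needle-tools/engine";
// Wrapping an effect from the pmndrs "postprocessing" package (assumption for this sketch)
import { VignetteEffect } from "postprocessing";

export class MyVignette extends PostProcessingEffect {
    // Assumed member: unique name identifying this effect type
    get typeName() { return "MyVignette"; }

    // Assumed member: create and return the underlying pmndrs effect instance
    onCreateEffect() {
        return new VignetteEffect({ darkness: 0.6 });
    }
}

// Register the custom effect under its name, as described in the table above
registerCustomEffectType("MyVignette", MyVignette);
```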

Networking

| Name | Description |
| --- | --- |
| SyncedRoom | Main networking component. Put it in your scene to enable networking |
| Networking | Used to set up the backend server for networking |
| SyncedTransform | Automatically networks an object's transformation |
| SyncedCamera | Automatically networks the camera position and view to other users in the room. You can define how the camera is rendered by referencing an object |
| WebXRSync | Networks WebXR avatars (AR and VR) |
| Voip | Enables voice chat |
| Screensharing | Enables screen-sharing capabilities |

Interaction

| Name | Description |
| --- | --- |
| EventSystem | Handles raising pointer events and UI events on objects in the scene (see the example below this table) |
| ObjectRaycaster | Required for DragControls and Duplicatable |
| GraphicsRaycaster | Same as ObjectRaycaster but for UI elements |
| DragControls | Allows objects to be dragged in the scene. Requires a raycaster in the parent hierarchy, e.g. ObjectRaycaster |
| Duplicatable | Duplicates assigned objects when they are dragged. Requires DragControls |
| Interactable | Basic component to mark an object as interactable |
| OrbitControls | Add to a camera for camera orbit controls |
| SmoothFollow | Smoothly interpolates towards another object's transform |
| DeleteBox | Destroys objects that have a Deletable component when they enter the box |
| Deletable | The GameObject this component is attached to will be deleted when it enters or intersects a DeleteBox |
| DropListener | Add to receive file drop events for uploading |
| SpatialTrigger | Use to raise an event when an object enters a specific space or area. You can also use physics events |
| SpatialTriggerReceiver | Use to receive events from a SpatialTrigger |
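To illustrate how the EventSystem delivers pointer events to objects, the sketch below hides an object when it is clicked. It assumes an EventSystem in the scene and an ObjectRaycaster in the parent hierarchy (as required for DragControls above); the component name is made up, and onPointerClick is assumed to be the callback the EventSystem invokes, so verify the exact event method names in the Scripting docs.

```ts
import { Behaviour } from "@needle-tools/engine";

// Hypothetical example: hides this object when it is clicked.
// Requires an EventSystem in the scene and an ObjectRaycaster in the parent hierarchy.
export class HideOnClick extends Behaviour {
    // Assumed callback invoked by the EventSystem for clicks on this object
    onPointerClick() {
        this.gameObject.visible = false;
    }
}
```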

Physics

Physics is implemented using Rapier.

| Name | Description |
| --- | --- |
| Rigidbody | Add to make an object react to gravity (or be kinematic or static) |
| BoxCollider | A box collider shape that objects can collide with, or that raises trigger events when set to be a trigger |
| SphereCollider | See BoxCollider |
| CapsuleCollider | See BoxCollider |
| MeshCollider | See BoxCollider |
| Physics Materials | Physics materials can be used to define e.g. the bounciness of a collider |

XR / WebXR

Read the XR docs

| Name | Description |
| --- | --- |
| WebXR | Add to the scene for VR, AR and passthrough support as well as rendering avatar models |
| USDZExporter | Add to enable USDZ and QuickLook support |
| XRFlag | Control when objects are visible, e.g. only in VR or AR or only in ThirdPerson |
| WebARSessionRoot | Handles placement and scale of your scene in AR mode |
| WebARCameraBackground | Add to access the AR camera image and apply effects or use it for rendering |
| WebXRImageTracking | Assign images to be tracked and optionally instantiate an object at the image position |
| WebXRPlaneTracking | Creates plane meshes or colliders for tracked planes |
| XRControllerModel | Can be added to render device controllers or hand models (created by default when enabled on the WebXR component) |
| XRControllerMovement | Can be added to provide default movement and teleport controls |
| XRControllerFollow | Can be added to any object in the scene and configured to follow either the left or right hand or controller |

Debugging

| Name | Description |
| --- | --- |
| GridHelper | Draws a grid |
| BoxGizmo | Draws a box |
| AxesHelper | Draws XYZ axes |

Note: When you're writing custom code you can use the static Gizmos methods for drawing debug lines and shapes, as shown below.
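For instance, here is a minimal sketch of drawing a debug line from a component, assuming a static Gizmos.DrawLine method with roughly the parameters shown (the exact signatures may differ, so check the API reference):

```ts
import { Behaviour, Gizmos } from "@needle-tools/engine";
import { Vector3 } from "three";

// Hypothetical debug component: draws a line from this object to the world origin every frame.
export class DebugLineToOrigin extends Behaviour {
    private readonly _worldPos = new Vector3();

    update() {
        this.gameObject.getWorldPosition(this._worldPos);
        // Assumed signature: DrawLine(from, to, color, duration)
        Gizmos.DrawLine(this._worldPos, new Vector3(0, 0, 0), 0xff0000, 0);
    }
}
```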

Runtime File Input/Output

| Name | Description |
| --- | --- |
| GltfExport | Experimental! Use to export glTF from the web runtime |
| DropListener | Receives file drop events for uploading and networking |

UI

Spatial UI components are mapped from Unity UI (Canvas, not UI Toolkit) to three-mesh-ui. UI can be animated.

| Name | Description |
| --- | --- |
| Canvas | Unity's UI system. Currently needs to be in World Space mode |
| Text | Renders text using Unity's UI. Custom fonts are supported; a font atlas is automatically generated on export. Use the font settings to control which characters are included in the atlas |
| Button | Receives click events. Use the onClick event to react to it. It can be added to 3D scene objects as well |
| Image | Renders a sprite image |
| RawImage | Renders a texture |

Note: For cross-platform projects that support VR, AR and screens, a mix of spatial and 2D UI often makes sense. Typically, you'd build the 2D parts with HTML for best accessibility and the 3D parts with geometric UIs that also support depth offsets (e.g. button hover states and the like).

Other

| Name | Description |
| --- | --- |
| SceneSwitcher | Handles loading and unloading of other scenes or prefabs / glTF files. Has features to preload and to change scenes via swiping, keyboard events or URL navigation |

Editor Only

| Name | Description |
| --- | --- |
| ExportInfo | Main component for managing the web project(s), e.g. to install or start the web app |
| EditorSync | Add to network material or component value changes from the Unity Editor directly to the running three.js app without having to reload |