# Component Reference 🧩

Here is an overview of some of the components that we provide. Some of them map directly to Unity components, while others are core components from Needle Engine.
For a complete list, please have a look at the components inside the folders `node_modules/@needle-tools/engine/engine-components` and `engine-components-experimental`.

You can always add your own components or add wrappers for Unity components we haven't provided yet.
Read more in the Scripting section of our docs.
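As a minimal sketch of what a custom component can look like (the `Rotate` class name and `speed` field are illustrative, based on Needle Engine's `Behaviour` base class and `@serializable` decorator):

```typescript
import { Behaviour, serializable } from "@needle-tools/engine";

// Illustrative custom component: rotates its object every frame.
export class Rotate extends Behaviour {
    // @serializable() exposes the field for serialization / editor assignment
    @serializable()
    speed: number = 1;

    update() {
        // this.gameObject is a three.js Object3D; deltaTime is in seconds
        this.gameObject.rotateY(this.context.time.deltaTime * this.speed);
    }
}
```

See the Scripting docs for the full component lifecycle and serialization details.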

## Audio

| Name | Description |
| --- | --- |
| `AudioListener` | |
| `AudioSource` | |

## Animation

| Name | Description |
| --- | --- |
| `Animator` + `AnimatorController` | Export with animation state machine, conditions and transitions |
| `Animation` | Most basic animation component; only the first clip is exported |
| `PlayableDirector` + `Timeline` | Export powerful sequences to control animation, audio, state and more |

## Rendering

| Name | Description |
| --- | --- |
| `Camera` | |
| `LODGroup` | |
| `Light` | |
| `ParticleSystem` | Experimental and currently not fully supported |
| `XRFlag` | Control when objects are visible, e.g. only enable an object in AR |
| `VideoPlayer` | Play back videos from a URL or a referenced video file (copied to the output on export) |

## Networking

| Name | Description |
| --- | --- |
| `SyncedRoom` | Main networking component; add it to your scene to enable networking |
| `Networking` | Used to set up the backend server for networking |
| `SyncedTransform` | Automatically networks an object's transform |
| `SyncedCamera` | Automatically networks the camera to other users in the room |
| `WebXRSync` | Networks WebXR avatars (AR and VR) |
| `Voip` | Enables voice chat |

## Interaction

| Name | Description |
| --- | --- |
| `EventSystem` | |
| `ObjectRaycaster` | Required for `DragControls` and `Duplicatable` |
| `DragControls` | Requires a raycaster in the parent hierarchy, e.g. `ObjectRaycaster` |
| `Duplicatable` | Requires `DragControls` |
| `Interactable` | Basic component to mark an object as interactable |
| `OrbitControls` | Add to a camera to enable camera orbit controls |
| `SmoothFollow` | Smoothly interpolates toward another object's transform |
| `DeleteBox` | |
| `DropListener` | Add to receive file-drop events for uploading |
| `SpatialTrigger` | Raises an event when an object enters a specific space or area |
| `SpatialTriggerReceiver` | Receives events from a `SpatialTrigger` |

## Physics

Physics is implemented using Rapier.

| Name | Description |
| --- | --- |
| `Rigidbody` | |
| `BoxCollider` | |
| `SphereCollider` | |
| `CapsuleCollider` | |
| `MeshCollider` | |
| Physics Materials | Define physical properties such as the bounciness of a collider |

## XR / WebXR

Read the XR docs for more details.

| Name | Description |
| --- | --- |
| `WebXR` | Add to the scene for AR and VR avatars |
| `WebXRSync` | Responsible for networking avatars |
| `SpectatorCamera` | Mirrors the VR view to the screen, e.g. when connected via Oculus Link |
| `XRFlag` | Control when objects are visible, e.g. only in VR, only in AR, or only in third person |
| `WebARSessionRoot` | Put your AR content inside a `WebARSessionRoot` to control placement and scale |

## Debugging

| Name | Description |
| --- | --- |
| `GridHelper` | Draws a grid |
| `BoxGizmo` | Draws a box |
| `AxesHelper` | Draws axes |

## Runtime File Input/Output

| Name | Description |
| --- | --- |
| `GltfExport` | Experimental! Use to export glTF from the web runtime |
| `DropListener` | Receives file-drop events for uploading and networking |

## UI

Spatial UI components are mapped from Unity UI (Canvas, not UI Toolkit) to three-mesh-ui. UI can be animated.

| Name | Description |
| --- | --- |
| `Canvas` | Unity's UI system; currently needs to be in World Space mode |
| `Text` | Renders text using Unity's UI. Custom fonts are supported; a font atlas is generated automatically on export. Use the font settings to control which characters are included in the atlas |
| `Button` | |
| `Image` | |
| `RawImage` | |

Note: A mix of spatial and 2D UI often makes sense for cross-platform projects where VR, AR, and screens are all supported. Typically, you'd build the 2D parts with HTML for best accessibility, and the 3D parts with geometric UIs that also support depth offsets (e.g. button hover states and the like).