AnimationCurve is a representation of a curve that can be used to animate values over time.
Simple Inverse Kinematics Constraint
Play animations
Animation controller and playback component
Controls the playback of animations using a state machine architecture.
Controls and plays TimelineAssets
Marker with a name, used for scroll-driven timelines. It is used together with elements in your HTML to define what time in the timeline should be active when the element is in the scroll view.
Component to handle drag and drop of files into the scene
Export selected 3D objects to glTF format
Loads and instantiates a nested glTF file
Dynamically loads and switches between multiple scenes
Rendering scenes from a specific viewpoint
Camera controller using three.js OrbitControls
Automatically fits a box area into the camera view
Look At Constraint for OrbitControls
Character Movement Controller
Maintains positional and rotational offset relative to another object
Bloom Post-Processing Effect
Chromatic Aberration Post-Processing Effect
Color Adjustments Post-Processing Effect
Depth of Field Post-Processing Effect
Pixelation Post-Processing Effect
PostProcessingEffect is a base class for post processing effects that can be applied to the scene.
To create a custom post processing effect, extend this class, override the onCreateEffect method, and call registerCustomEffectType to make it available in the editor.
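The pattern described above can be sketched as follows. This is a minimal, self-contained sketch: the real PostProcessingEffect base class and registerCustomEffectType function come from Needle Engine, so the stand-in declarations below only illustrate the shape, and the InvertColorsEffect name and its return value are hypothetical.

```typescript
// --- stand-ins for the engine API (illustrative only) ---
abstract class PostProcessingEffect {
    // called by the engine to create the underlying effect instance
    abstract onCreateEffect(): unknown;
}
const customEffectTypes = new Map<string, new () => PostProcessingEffect>();
function registerCustomEffectType(name: string, type: new () => PostProcessingEffect) {
    customEffectTypes.set(name, type);
}

// --- a custom effect following the documented pattern ---
class InvertColorsEffect extends PostProcessingEffect {
    onCreateEffect(): unknown {
        // a real implementation would construct and return the
        // underlying postprocessing effect object here
        return { name: "invert-colors" };
    }
}
registerCustomEffectType("InvertColorsEffect", InvertColorsEffect);
```

With the type registered, the editor (or your own code) can look it up by name and instantiate it.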
Screenspace Ambient Occlusion post-processing effect.
We recommend using ScreenSpaceAmbientOcclusionN8 instead.
Screen Space Ambient Occlusion (SSAO) Post-Processing Effect
Sharpening Post-Processing Effect
Tilt Shift Post-Processing Effect
Tonemapping Post-Processing Effect
Vignette Post-Processing Effect
Manage Post-Processing Effects
The EventList class holds a list of event listeners that can be invoked together.
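A minimal sketch of the EventList idea, using stand-in code rather than the engine's actual class (the addEventListener/removeEventListener/invoke names mirror the engine's API but are reimplemented here for illustration):

```typescript
// Stand-in for the engine's EventList: a typed list of listeners
// that can all be invoked with one call.
class EventList<T = void> {
    private listeners: Array<(arg: T) => void> = [];
    addEventListener(fn: (arg: T) => void) { this.listeners.push(fn); }
    removeEventListener(fn: (arg: T) => void) {
        this.listeners = this.listeners.filter(l => l !== fn);
    }
    invoke(arg: T) { for (const l of this.listeners) l(arg); }
}

// Usage: subscribe a listener, then invoke the whole list.
const onClicked = new EventList<string>();
let received = "";
onClicked.addEventListener(name => { received = name; });
onClicked.invoke("button-1");
```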
Changes the material of objects when clicked
Moves an object to a target transform upon click
Emphasizes the target object when clicked
Hides the object on scene start
Plays an animation when clicked
Plays an audio clip when clicked
Sets the active state of an object when clicked
Triggers an action when the object is tapped/clicked
Hides or shows the object when clicked
Visualizes object axes in the scene
Display a box around the object
Bounding box helper with intersection tests
Bounding box helper with intersection tests
GridHelper is a component that displays a grid in the scene.
Object manipulation gizmo for translating, rotating, and scaling
A Raycaster that performs raycasting against its own GameObject.
Objects with this component can be destroyed by the DeleteBox component.
Box area that deletes objects entering it
Enables dragging of objects in 2D or 3D space
Duplicates a GameObject on pointer events
Triggers events on pointer interactions
OpenURL behaviour opens a URL in a new tab or window when the object (or any of its children) is clicked.
Smoothly follows a target object
A spatial trigger component that detects objects within a box-shaped area. Used to trigger events when objects enter, stay in, or exit the defined area
Receives spatial trigger events
Makes the object look at a target object or the camera
Audio listener for 3D audio capture
3D audio source with spatial positioning and playback controls
Plays video clips from URLs or streams
WebXR Avatar component for head and hands synchronization
Networking configuration
Assigns a unique color to the player object
Share screen, camera or microphone in a networked room
Spectator camera for following other users
Syncs camera position and rotation of users in a networked room
Joins a networked room based on URL parameters or a random room
Synchronizes object transform over the network with ownership management
Voice over IP for networked audio communication
BoxCollider represents a box-shaped collision volume. Ideal for rectangular objects or objects that need a simple cuboid collision boundary.
CapsuleCollider represents a capsule-shaped collision volume (cylinder with hemispherical ends). Ideal for character controllers and objects that need a rounded collision shape.
Physics collider
MeshCollider creates a collision shape from a mesh geometry. Allows for complex collision shapes that match the exact geometry of an object.
Rigidbody for physical interactions
SphereCollider represents a sphere-shaped collision volume. Useful for objects that are roughly spherical in shape or need a simple collision boundary.
Display contact shadows on the ground
Projects the environment map onto the ground
Light component for various light types and shadow settings
Level of Detail Group for optimizing rendering
Handles the motion and rendering of many individual particles
Provides reflection data to materials
Sets the skybox or environment texture of a scene
This component is automatically added by the Renderer component if the object has lightmap UVs and a lightmap is available.
Makes objects fade out when obscuring a reference point from the camera
Creates a shadow mask or a light occluder
2D image renderer
Manage Post-Processing Effects
Receives signals and invokes reactions
Holds spline data and generates a spline curve.
Moves an object along a spline
Used by the SpriteRenderer to hold the sprite sheet and the currently active sprite index.
The sprite renderer renders a sprite on a GameObject using an assigned spritesheet (SpriteData).
A Raycaster that performs raycasting against UI elements (objects with a CanvasRenderer component).
Derive from this class if you want to implement your own UI components.
It provides utility methods and simplifies managing the underlying three-mesh-ui hierarchy.
UI Button that can be clicked to perform actions
Root component for UI elements, managing layout and rendering settings
Group UI elements to control transparency and interactivity
Manages and dispatches input events to UI components
Display a 2D image in the UI
Text field for user input
Configuration component for the Needle Menu
Add an outline effect to UI elements
Derive from this class if you want to implement your own UI components.
It provides utility methods and simplifies managing the underlying three-mesh-ui hierarchy.
Render HTML elements as 3D objects in the scene
Display text in the UI
Derive from this class if you want to implement your own UI components.
It provides utility methods and simplifies managing the underlying three-mesh-ui hierarchy.
Show or hide GameObject based on device type
Use the XRFlag component to show or hide objects based on the current XR state or session.
This means you can show or hide objects based on if the user is in VR, AR, using first person view or third person view.
Aligns and scales the object between two target GameObjects
Allows pointer events to "click through" the 3D canvas to HTML elements behind it.
Makes the object follow the cursor position on screen
Hover Animation on Pointer Enter/Exit
Links scroll position to target objects
WebXR Avatar component for head and hands synchronization
A Raycaster that performs sphere overlap raycasting for spatial grab interactions in XR.
This component is just used as a marker on objects for WebXR teleportation
The XRControllerMovement component can be configured to check if the TeleportTarget component is present on an object to allow teleporting to that object.
Export 3D objects as USDZ files for QuickLook
Displays the camera feed as background in WebAR sessions
Root object for WebAR sessions, managing scene placement and user scaling in AR.
WebXR Component for VR and AR support
Add this component to an object to enable image tracking in WebXR sessions.
Use this component to track planes and meshes in the real world when in immersive-ar (e.g. on Oculus Quest).
Makes the object follow a specific XR controller or hand
Displays controller or hand models in XR
Move the XR rig using controller input
Use the XRFlag component to show or hide objects based on the current XR state or session.
This means you can show or hide objects based on if the user is in VR, AR, using first person view or third person view.
A user in XR (VR or AR) is parented to an XR rig during the session.
When moving through the scene the rig is moved instead of the user.
The ClearFlags enum is used to determine how the camera clears the background
The DragMode determines how an object is dragged around in the scene.
A TrackHandler is responsible for evaluating a specific type of timeline track.
A timeline track can be an animation track, audio track, signal track, control track etc and is controlled by a PlayableDirector.
Used to attract Rigidbodies towards the position of this component.
Add Rigidbodies to the targets array to have them be attracted.
You can use negative strength values to create a repulsion effect.
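The attraction/repulsion behaviour described above can be sketched as a simple force computation. This is an assumption about how such an attractor works, not the engine's actual code: the force points from each target toward the attractor, scaled by strength, so a negative strength flips it into repulsion.

```typescript
type Vec3 = { x: number; y: number; z: number };

// Hypothetical force model: normalized direction from the target toward
// the attractor, scaled by strength. Negative strength repels.
function attractionForce(attractor: Vec3, target: Vec3, strength: number): Vec3 {
    const dx = attractor.x - target.x;
    const dy = attractor.y - target.y;
    const dz = attractor.z - target.z;
    const len = Math.hypot(dx, dy, dz) || 1; // avoid division by zero
    return {
        x: (dx / len) * strength,
        y: (dy / len) * strength,
        z: (dz / len) * strength,
    };
}

const pull = attractionForce({ x: 10, y: 0, z: 0 }, { x: 0, y: 0, z: 0 }, 2);  // toward +x
const push = attractionForce({ x: 10, y: 0, z: 0 }, { x: 0, y: 0, z: 0 }, -2); // away from attractor
```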
A TrackHandler is responsible for evaluating a specific type of timeline track.
A timeline track can be an animation track, audio track, signal track, control track etc and is controlled by a PlayableDirector.
Handles loading and instantiating avatar models from various sources. Provides functionality to find and extract important parts of an avatar (head, hands).
This is used to mark an object being controlled / owned by a player
This system might be refactored and moved to a more centralized place in a future version
Represents an avatar model with head and hands references. Used for representing characters in 3D space.
internal USDZ behaviours extension
Needle Engine components are the main building blocks of Needle Engine.
Derive from Behaviour to implement your own using the provided lifecycle methods.
Components can be added to any Object3D using addComponent or GameObject.addComponent.
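A self-contained sketch of that pattern: derive from Behaviour and attach the component with addComponent. The stand-in classes below replace the engine and three.js imports so the example runs in isolation; the start/update lifecycle names match the engine's documented lifecycle, but the Spin component, its fields, and the fixed 60 fps tick are illustrative assumptions.

```typescript
// --- stand-ins for three.js / Needle Engine (illustrative only) ---
class Object3D {}
class Behaviour {
    gameObject!: Object3D;
    start(): void {}
    update(): void {}
}
function addComponent<T extends Behaviour>(obj: Object3D, type: new () => T): T {
    const component = new type();
    component.gameObject = obj;
    component.start(); // the real engine schedules lifecycle calls itself
    return component;
}

// --- a custom component using the lifecycle methods ---
class Spin extends Behaviour {
    degreesPerSecond = 45;
    rotationY = 0;
    update() {
        // assume a fixed 60 fps tick for this sketch
        this.rotationY += this.degreesPerSecond * (1 / 60);
    }
}

const cube = new Object3D();
const spin = addComponent(cube, Spin);
spin.update(); // normally driven by the engine's update loop
```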
A TrackHandler is responsible for evaluating a specific type of timeline track.
A timeline track can be an animation track, audio track, signal track, control track etc and is controlled by a PlayableDirector.
Custom branding for the QuickLook overlay, used by USDZExporter.
Base class for objects in Needle Engine. Extends Object3D from three.js. GameObjects can have components attached to them, which can be used to add functionality to the object. They manage their components and provide methods to add, remove and get components.
The instance handle is used to interface with the mesh that is rendered using instancing.
Handles instancing for Needle Engine.
Keyframe is a representation of a keyframe in an AnimationCurve.
A TrackHandler is responsible for evaluating a specific type of timeline track.
A timeline track can be an animation track, audio track, signal track, control track etc and is controlled by a PlayableDirector.
This pointer event data object is passed to all event receivers that are currently active
It contains hit information if an object was hovered or clicked
If the event is received in onPointerDown or onPointerMove, you can call setPointerCapture to keep receiving onPointerMove events even after the pointer has left the object, until you call releasePointerCapture or the pointerUp event happens.
You can get additional information about the event or event source via the event property (of type NEPointerEvent)
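The capture flow described above can be sketched like this. The PointerEventData stand-in below only models setPointerCapture/releasePointerCapture; the DragReceiver class is a hypothetical event receiver, not an engine type.

```typescript
// Stand-in for the engine's pointer event data (capture state only).
class PointerEventData {
    captured = false;
    setPointerCapture() { this.captured = true; }
    releasePointerCapture() { this.captured = false; }
}

// A receiver that starts a drag on pointer-down and captures the pointer,
// so onPointerMove keeps firing even when the pointer leaves the object.
class DragReceiver {
    dragging = false;
    onPointerDown(args: PointerEventData) {
        args.setPointerCapture();
        this.dragging = true;
    }
    onPointerUp(args: PointerEventData) {
        args.releasePointerCapture();
        this.dragging = false;
    }
}

const receiver = new DragReceiver();
const evt = new PointerEventData();
receiver.onPointerDown(evt); // pointer is now captured
```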
PostProcessingHandler is responsible for applying post processing effects to the scene. It is internally used by the Volume component
Needle Engine components are the main building blocks of Needle Engine.
Derive from Behaviour to implement your own using the provided lifecycle methods.
Components can be added to any Object3D using addComponent or GameObject.addComponent.
Needle Engine components are the main building blocks of Needle Engine.
Derive from Behaviour to implement your own using the provided lifecycle methods.
Components can be added to any Object3D using addComponent or GameObject.addComponent.
Used to reference a signal asset in a SignalReceiver. This is internally used by the SignalReceiverEvent.
An event that links a signal to a reaction. Used internally by SignalReceiver.
A TrackHandler is responsible for evaluating a specific type of timeline track.
A timeline track can be an animation track, audio track, signal track, control track etc and is controlled by a PlayableDirector.
A TrackHandler is responsible for evaluating a specific type of timeline track.
A timeline track can be an animation track, audio track, signal track, control track etc and is controlled by a PlayableDirector.
Marks an object as currently being interacted with. For example, DragControls set this on the dragged object to prevent DeleteBox from deleting it.
WebXRImageTracking allows you to track images in the real world and place objects on top of them.
This component is only available in WebXR sessions.
The WebXRImageTrackingModel contains the image to track, the object to place on top of the image, and the size of the image as well as settings for the tracking.
Used by the WebXRImageTracking component
Implement on your component to receive input events via the EventSystem component
The ISceneEventListener is called by the SceneSwitcher when a scene is loaded or unloaded.
It must be added to the root object of your scene (the scene being loaded) or to the same object as the SceneSwitcher.
It can be used to e.g. smooth the transition between scenes or to load additional content when a scene is loaded.
Experimental interface for receiving timeline animation callbacks. Register at the PlayableDirector
Network event arguments passed between clients when using the DropListener with networking
Arguments provided to handlers when an object is dropped or added to the scene
Used by WebXRPlaneTracking to track planes in the real world.
Used by WebXRPlaneTracking to track planes in the real world.
Default order for post-processing effects. This can be used to sort effects by their rendering order when creating custom effects.
E.g. in your custom effect, you can set order: PostProcessingEffectOrder.Bloom + 1; to ensure it gets rendered after the bloom effect.
OR order: PostProcessingEffectOrder.Bloom - 1; to ensure it gets rendered before the bloom effect.
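The ordering idiom above can be sketched as follows. The numeric constants here are placeholders, not the engine's real PostProcessingEffectOrder values, and the ascending sort before rendering is an assumption about how the engine applies the order field.

```typescript
// Hypothetical order constants (the real values live in the engine).
const PostProcessingEffectOrder = { Bloom: 100 } as const;

interface EffectEntry { name: string; order: number; }

const effects: EffectEntry[] = [
    { name: "Vignette", order: PostProcessingEffectOrder.Bloom + 50 },
    // Per the pattern above: Bloom + 1 renders right after the bloom effect.
    { name: "MyCustomEffect", order: PostProcessingEffectOrder.Bloom + 1 },
    { name: "Bloom", order: PostProcessingEffectOrder.Bloom },
];

// Assumed: effects are sorted ascending by order before rendering.
effects.sort((a, b) => a.order - b.order);
```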
Reads back a texture from the GPU (can be compressed, a render texture, or anything), optionally applies RGBA colorScale to it, and returns CPU data for further usage. Note that there are WebGL / WebGPU rules preventing some use of data between WebGL contexts.
This method uses a '2d' canvas context for pixel manipulation, and can apply a color scale or Y flip to the given image. Unfortunately, canvas always uses premultiplied data, and thus images with low alpha values (or multiplying by a=0) will result in black pixels.
Contains Needle Engine Core Components.
This includes:
- Interactivity components: DragControls, SmoothFollow, Duplicatable, SpatialTrigger
- Everywhere Actions: SetActiveOnClick, PlayAnimationOnClick, PlayAudioOnClick, ChangeMaterialOnClick
- Camera and user controls: OrbitControls, CharacterController
- Rendering components: Light, Renderer, ParticleSystem, Volume (post processing), ReflectionProbe, GroundProjectedEnv, ShadowCatcher
- Media components: AudioSource, VideoPlayer
- Helpers: AxesHelper, GridHelper, TransformGizmo
- Asset Management components: DropListener, SceneSwitcher, GltfExport
- XR components: WebXR, USDZExporter, XRRig
- Networking components: SyncedRoom, SyncedTransform, SyncedCamera, Voip, ScreenCapture
- Animation components: Animator, Animation, PlayableDirector
- Physics components: Rigidbody, BoxCollider, SphereCollider, MeshCollider, PhysicsMaterial
- Utilities: NeedleMenu
- and more.
All these components are available wherever Needle Engine is used.