AnimationCurve is a representation of a curve that can be used to animate values over time.
Plays animations from AnimationClips
Plays and manages animations on a GameObject based on an AnimatorController
Controls the playback of animations using a state machine architecture.
Two-bone inverse kinematics constraint
Controls and plays TimelineAssets
Receives signals and invokes reactions
Marker with a name, used for scroll-driven timelines. Use it together with elements in your HTML to define which time in the timeline should be active when the element is scrolled into view.
Drag-and-drop file loading for 3D assets
Export selected 3D objects to glTF format
Loads and instantiates a nested glTF file
Dynamically loads and switches between multiple scenes
Rendering scenes from a specific viewpoint
Look At Constraint for camera targeting
Camera controller using three.js OrbitControls
Automatically fits a box area into the camera view
Character Movement Controller
User Input for Character Controller
Aligns and scales object between two targets
Maintains positional/rotational offset relative to target
Antialiasing provides an SMAA (Subpixel Morphological Antialiasing) post-processing effect that smooths edges in the rendered scene.
Bloom Post-Processing Effect
Chromatic Aberration Post-Processing Effect
Color Adjustments Post-Processing Effect
Depth of Field Post-Processing Effect
EffectWrapper wraps a custom post-processing effect to integrate it with the Needle Engine post-processing pipeline.
Pixelation Post-Processing Effect
PostProcessingEffect is a base class for post-processing effects that can be applied to the scene.
To create a custom post-processing effect, extend this class, override the onCreateEffect method, and call registerCustomEffectType to make it available in the editor.
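For illustration, a minimal sketch of such a custom effect, assuming the import path and registerCustomEffectType arguments shown here (the description above only guarantees that onCreateEffect and registerCustomEffectType exist):

```typescript
// Hedged sketch of a custom post-processing effect.
// Import path and registerCustomEffectType signature are assumptions.
import { PostProcessingEffect, registerCustomEffectType } from "@needle-tools/engine";
import { VignetteEffect } from "postprocessing";

export class MyVignetteEffect extends PostProcessingEffect {
    // identifier used by the engine/editor (assumed to be exposed like this)
    get typeName() { return "MyVignetteEffect"; }

    // create and return the underlying effect instance that the pipeline will run
    onCreateEffect() {
        return new VignetteEffect({ darkness: 0.6 });
    }
}

// register the effect so it can be discovered and used in the editor
registerCustomEffectType("MyVignetteEffect", MyVignetteEffect);
```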
ScreenSpaceAmbientOcclusion is a screenspace ambient occlusion post-processing effect.
We recommend using ScreenSpaceAmbientOcclusionN8 instead.
Screen Space Ambient Occlusion (SSAO) Post-Processing Effect
Sharpening Post-Processing Effect
Tilt Shift Post-Processing Effect
Tonemapping Post-Processing Effect
Vignette Post-Processing Effect
Manage Post-Processing Effects
EventList manages a list of callbacks that can be invoked together.
Used for Unity-style events that can be configured in the editor (Unity or Blender).
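A minimal sketch of exposing and invoking an EventList from code; the @serializable decorator usage and the exact EventList methods are assumptions based on the description above:

```typescript
// Hedged sketch: a Unity-style event on a custom component.
import { Behaviour, EventList, serializable } from "@needle-tools/engine";

export class Door extends Behaviour {
    // shows up as a configurable event list in the editor (Unity or Blender)
    @serializable(EventList)
    onOpened: EventList = new EventList();

    open() {
        // invokes all callbacks configured in the editor or added from code
        this.onOpened.invoke();
    }
}
```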
Changes the material of objects when clicked
Moves an object to a target transform upon click
Emphasizes the target object when clicked
Hides the object on scene start
Makes the object look at a target object or the camera
Plays an animation when clicked
Plays an audio clip when clicked
Sets the active state of an object when clicked
Triggers an action when the object is tapped/clicked
Hides or shows the object when clicked
Visualizes object axes (X=red, Y=green, Z=blue)
Display a box around the object
Bounding box helper with intersection tests
Bounding box helper with intersection tests
The GridHelper displays a flat grid in the scene for visual reference. Useful for debugging, level design, or providing spatial context.
Object manipulation gizmo for translate/rotate/scale
Makes objects follow the cursor/touch position in 3D space
Marks object as destroyable by DeleteBox
Box area that deletes objects entering it
Enables dragging of objects in 2D or 3D space
Duplicates a GameObject on pointer events
Triggers events on pointer interactions
Plays animations on pointer hover enter/exit events
Makes the object look at a target object or the camera
ObjectRaycaster enables pointer interactions with 3D objects.
Add this component to any object that needs click/hover detection.
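As a sketch (relying on the addComponent API mentioned in the Behaviour description further below), enabling pointer events on an object at runtime could look like this:

```typescript
// Hedged sketch: make an Object3D respond to click/hover events at runtime.
import { GameObject, ObjectRaycaster } from "@needle-tools/engine";
import type { Object3D } from "three";

function makeInteractable(obj: Object3D) {
    // without an ObjectRaycaster in the hierarchy, onPointer* events are not raised
    GameObject.addComponent(obj, ObjectRaycaster);
}
```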
OpenURL behaviour opens a URL in a new tab or window when the object (or any of its children) is clicked.
Smoothly follows a target object's position and/or rotation
Define a trigger zone that detects entering objects
Receives spatial trigger events
Marks object as currently being interacted with
Receives audio in the scene and outputs it to speakers
Plays audio clips from files or media streams
Share screen, camera or microphone in a networked room
Plays video clips from URLs or streams
WebXR Avatar component for head and hands synchronization
Internal marker for player-controlled objects in networked sessions
Networking configuration
Assigns a unique color to the player object
Share screen, camera or microphone in a networked room
Spectator camera for following other users
Syncs camera position and rotation of users in a networked room
Joins a networked room based on URL parameters or a random room
Synchronizes object transform over the network with ownership management
Voice over IP for networked audio communication
Box-shaped physics collider
CapsuleCollider represents a capsule-shaped collision volume (cylinder with hemispherical ends).
Ideal for character controllers and objects that need a rounded collision shape.
Physics collider base class
Lock two Rigidbodies together rigidly
Connect two Rigidbodies with a rotating hinge
MeshCollider creates a collision shape from a mesh geometry.
Allows for complex collision shapes that match the exact geometry of an object.
Enables physics simulation with forces, gravity, and collisions
Sphere-shaped physics collider
Display contact shadows on the ground
Adds fog effect to the scene
Projects the environment map onto the ground
Light component for various light types and shadow settings
Level of Detail Group for optimizing rendering
The Renderer component controls rendering properties of meshes including materials,
lightmaps, reflection probes, and GPU instancing.
Handles the motion and rendering of many individual particles
Provides reflection data to materials
Sets the skybox or environment texture of a scene
The Renderer component controls rendering properties of meshes including materials,
lightmaps, reflection probes, and GPU instancing.
This component is automatically added by the Renderer component if the object has lightmap UVs and a lightmap is present.
Fades objects when they obscure the camera's view of a reference point
Renders real-time shadows from lights onto surfaces
Renderer for deformable meshes
2D image renderer
Renders 2D images from a sprite sheet
Manage Post-Processing Effects
Manages smooth spline curves defined by control point knots
Moves an object along a spline
Used by the SpriteRenderer to hold the sprite sheet and the currently active sprite index.
Derive from this class if you want to implement your own UI components.
It provides utility methods and simplifies managing the underlying three-mesh-ui hierarchy.
UI Button that can be clicked to perform actions
Root component for UI elements, managing layout and rendering settings
Group UI elements to control transparency and interactivity
Manages and dispatches input events to UI components
Graphic provides basic rendering for UI elements with color, opacity, and texture support.
Raycaster for UI elements
GridLayoutGroup arranges child UI elements in a grid pattern.
HorizontalLayoutGroup arranges child UI elements horizontally with spacing, padding, and alignment options.
Display a 2D image in the UI
Text field for user input
Configuration component for the Needle Menu overlay
Add an outline effect to UI elements
UI Rectangle Transform
Render HTML elements as 3D objects in the scene
Display text in the UI
Derive from this class if you want to implement your own UI components.
It provides utility methods and simplifies managing the underlying three-mesh-ui hierarchy.
VerticalLayoutGroup arranges child UI elements vertically with spacing, padding, and alignment options.
Show or hide GameObject based on device type
XRFlag shows or hides GameObjects based on the current XR state or session.
Use for XR-responsive content that should only appear in specific modes.
Enables pointer events to pass through canvas to HTML elements behind it
Links scroll position to target objects
WebXR Avatar component for head and hands synchronization
Internal marker for player-controlled objects in networked sessions
SpatialGrabRaycaster enables direct grab interactions in VR/AR. Uses sphere overlap detection around the controller/hand position to allow grabbing objects by reaching into them.
Marker component for valid VR teleportation destinations
Export 3D objects as USDZ files for Apple QuickLook AR
Displays the camera feed as background in WebAR sessions
Root object for WebAR sessions, managing scene placement and user scaling in AR.
WebXR Component for VR and AR support
The easiest way to create cross-platform AR image tracking experiences
Configuration for a single trackable image marker
WebXRPlaneTracking tracks planes and meshes in the real world when in immersive-ar (e.g. on Oculus Quest).
Runtime data for a detected marker image in WebXR
Makes the object follow a specific XR controller or hand
Displays controller or hand models in XR
Move the XR rig using controller input
XRFlag shows or hides GameObjects based on the current XR state or session.
Use for XR-responsive content that should only appear in specific modes.
A user in XR (VR or AR) is parented to an XR rig during the session.
When moving through the scene the rig is moved instead of the user.
The ClearFlags enum is used to determine how the camera clears the background
The DragMode determines how an object is dragged around in the scene.
A TrackHandler is responsible for evaluating a specific type of timeline track.
A timeline track can be an animation track, audio track, signal track, control track etc and is controlled by a PlayableDirector.
Used to attract Rigidbodies towards the position of this component.
Add Rigidbodies to the targets array to have them be attracted.
You can use negative strength values to create a repulsion effect.
A TrackHandler is responsible for evaluating a specific type of timeline track.
A timeline track can be an animation track, audio track, signal track, control track etc and is controlled by a PlayableDirector.
Handles loading and instantiating avatar models from various sources. Provides functionality to find and extract important parts of an avatar (head, hands).
Represents an avatar model with head and hands references. Used for representing characters in 3D space.
internal USDZ behaviours extension
Needle Engine components are the main building blocks of the Needle Engine.
Derive from Behaviour to implement your own using the provided lifecycle methods.
Components can be added to any Object3D using addComponent or GameObject.addComponent.
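A minimal sketch of a custom component using the lifecycle methods mentioned above; treat the exact Context/time API as an assumption:

```typescript
// Hedged sketch of a custom Needle Engine component.
import { Behaviour, GameObject } from "@needle-tools/engine";

export class Spin extends Behaviour {
    speed: number = 45; // degrees per second

    update() {
        // this.gameObject is the Object3D this component is attached to;
        // this.context.time.deltaTime is assumed to be the frame delta in seconds
        const radians = (this.speed * Math.PI / 180) * this.context.time.deltaTime;
        this.gameObject.rotateY(radians);
    }
}

// attach to any Object3D in the scene:
// GameObject.addComponent(someObject3D, Spin);
```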
CallInfo represents a single callback method that can be invoked by the EventList.
A TrackHandler is responsible for evaluating a specific type of timeline track.
A timeline track can be an animation track, audio track, signal track, control track etc and is controlled by a PlayableDirector.
Custom branding for the QuickLook overlay, used by USDZExporter.
Base class for objects in Needle Engine. Extends Object3D from three.js. GameObjects can have components attached to them, which can be used to add functionality to the object. They manage their components and provide methods to add, remove and get components.
The instance handle is used to interface with the mesh that is rendered using instancing.
Handles instancing for Needle Engine.
Keyframe is a representation of a keyframe in an AnimationCurve.
Defines a single LOD level with its transition distance and associated renderers. Used by LODGroup to configure level of detail switching.
A TrackHandler is responsible for evaluating a specific type of timeline track.
A timeline track can be an animation track, audio track, signal track, control track etc and is controlled by a PlayableDirector.
Base class for custom particle behaviors. Extend this to create custom particle logic.
PointerEventData is the pointer event data object that is passed to all currently active event receivers.
It contains hit information if an object was hovered or clicked.
If the event is received in onPointerDown or onPointerMove, you can call setPointerCapture to keep receiving onPointerMove events even after the pointer has left the object, until you call releasePointerCapture or the pointerUp event happens.
You can get additional information about the event or the event source via the event property (of type NEPointerEvent).
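A hedged sketch of a drag-style interaction using the capture calls described above; whether setPointerCapture and releasePointerCapture live directly on the PointerEventData object is an assumption:

```typescript
// Hedged sketch: keep receiving move events while a pointer is held down.
import { Behaviour, PointerEventData } from "@needle-tools/engine";

export class DragLogger extends Behaviour {
    onPointerDown(args: PointerEventData) {
        // keep receiving onPointerMove even when the pointer leaves this object
        args.setPointerCapture();
    }

    onPointerMove(args: PointerEventData) {
        console.log("pointer moved while captured", args);
    }

    onPointerUp(args: PointerEventData) {
        // optional: capture also ends automatically when the pointer is released
        args.releasePointerCapture();
    }
}
```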
PostProcessingHandler is responsible for applying post-processing effects to the scene. It is used internally by the Volume component.
Needle Engine components are the main building blocks of the Needle Engine.
Derive from Behaviour to implement your own using the provided lifecycle methods.
Components can be added to any Object3D using addComponent or GameObject.addComponent.
Needle Engine components are the main building blocks of the Needle Engine.
Derive from Behaviour to implement your own using the provided lifecycle methods.
Components can be added to any Object3D using addComponent or GameObject.addComponent.
Used to reference a signal asset in a SignalReceiver. This is internally used by the SignalReceiverEvent.
An event that links a signal to a reaction. Used internally by SignalReceiver.
A TrackHandler is responsible for evaluating a specific type of timeline track.
A timeline track can be an animation track, audio track, signal track, control track etc and is controlled by a PlayableDirector.
Represents a single knot (control point) in a spline curve.
A TrackHandler is responsible for evaluating a specific type of timeline track.
A timeline track can be an animation track, audio track, signal track, control track etc and is controlled by a PlayableDirector.
Implement on your component to receive input events via the EventSystem component
The ISceneEventListener is called by the SceneSwitcher when a scene is loaded or unloaded.
It must be added to the root object of the scene being loaded, or to the same object as the SceneSwitcher.
It can be used, for example, to smooth the transition between scenes or to load additional content when a scene is loaded.
Interface for receiving callbacks during timeline animation evaluation.
Allows modification of position/rotation values before they are applied.
Network event arguments passed between clients when using the DropListener with networking
Arguments provided to handlers when an object is dropped or added to the scene
Defines how the ViewBox component applies camera framing adjustments.
Used by WebXRPlaneTracking to track planes in the real world.
Used by WebXRPlaneTracking to track planes in the real world.
Default order for post-processing effects. This can be used to sort effects by their rendering order when creating custom effects.
For example, in your custom effect you can set order: PostProcessingEffectOrder.Bloom + 1 to ensure it is rendered after the bloom effect, or order: PostProcessingEffectOrder.Bloom - 1 to ensure it is rendered before the bloom effect.
Reads back a texture from the GPU (the source can be compressed, a render texture, or any other texture), optionally applies an RGBA colorScale to it, and returns CPU-side data for further use. Note that WebGL / WebGPU rules prevent some uses of data between WebGL contexts.
This method uses a '2d' canvas context for pixel manipulation and can apply a color scale or Y flip to the given image. Unfortunately, canvas always uses premultiplied alpha, so images with low alpha values (or multiplying by a = 0) will result in black pixels.
Contains Needle Engine Core Components.
This includes:
Interactivity components: DragControls, SmoothFollow, Duplicatable, SpatialTrigger
Everywhere Actions: SetActiveOnClick, PlayAnimationOnClick, PlayAudioOnClick, ChangeMaterialOnClick
Camera and user controls: OrbitControls, CharacterController
Rendering components: Light, Renderer, ParticleSystem, Volume (post processing), ReflectionProbe, GroundProjectedEnv, ShadowCatcher
Media components: AudioSource, VideoPlayer
Helpers: AxesHelper, GridHelper, TransformGizmo
Asset Management components: DropListener, SceneSwitcher, GltfExport
XR components: WebXR, USDZExporter, XRRig
Networking components: SyncedRoom, SyncedTransform, SyncedCamera, Voip, ScreenCapture
Animation components: Animator, Animation, PlayableDirector
Physics components: Rigidbody, BoxCollider, SphereCollider, MeshCollider, PhysicsMaterial
Utilities: NeedleMenu
and more.
All these components are available wherever Needle Engine is used.