
Three.js Rendering: Building a 3D Starfield for 200,000 Systems

When we started building EF-Map, the most fundamental question was: how do we render EVE Frontier's massive star map in a browser? The game has 200,000+ star systems positioned in 3D space, and users need to orbit, pan, and zoom smoothly at 60 FPS. This ruled out simple 2D approaches—we needed a real 3D rendering engine.

After evaluating several options (raw WebGL, Babylon.js, Unity WebGL), we chose Three.js—a mature, well-documented 3D library that balances performance with developer productivity. Here's how we built a scalable 3D star map that handles hundreds of thousands of objects without breaking a sweat.

The Challenge: Rendering Scale

Traditional 3D scenes might have hundreds or thousands of objects. We needed to render 200,000+ glowing star points, plus 500+ gate connections (lines between systems), plus region labels and selection highlights. Naive Three.js usage would grind to a halt.

The core problem: in a naive implementation, each star is a separate THREE.Mesh object. Creating 200,000 meshes means:

- 200,000 draw calls per frame, one per mesh
- 200,000 scene-graph nodes for Three.js to traverse and matrix-update every frame
- significant per-object memory overhead

At 60 FPS, we have 16 milliseconds per frame. Traditional rendering would take 500+ milliseconds. We needed to think differently.
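For illustration, the naive version we're ruling out looks roughly like this (StarSystem, getStarColor, and SCALE_FACTOR are the same types and helpers used in the snippets below):

// Anti-pattern: one Mesh, one material, and one draw call per star
const createStarfieldNaive = (systems: StarSystem[], scene: THREE.Scene) => {
    for (const system of systems) {
        const geometry = new THREE.SphereGeometry(0.05, 8, 8);
        const material = new THREE.MeshBasicMaterial({
            color: getStarColor(system.star_type)
        });
        const mesh = new THREE.Mesh(geometry, material);
        mesh.position.set(
            system.x / SCALE_FACTOR,
            system.y / SCALE_FACTOR,
            system.z / SCALE_FACTOR
        );
        scene.add(mesh); // 200,000 scene-graph nodes and draw calls
    }
};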

Solution: Instanced Rendering

Three.js provides InstancedMesh, a way to render thousands of copies of the same geometry with a single draw call. Instead of creating a mesh per star, we create one shared geometry and material, then define 200,000 per-instance attributes (positions, colors, scales).

Here's the core implementation:

const createStarfield = (systems: StarSystem[]) => {
    // Single sphere geometry (shared by all stars)
    const geometry = new THREE.SphereGeometry(0.05, 8, 8);
    
    // Material with per-instance colors
    const material = new THREE.MeshBasicMaterial({
        vertexColors: true
    });
    
    // Instanced mesh for 200,000 stars
    const starfield = new THREE.InstancedMesh(
        geometry,
        material,
        systems.length
    );
    
    // Set position and color for each instance
    const matrix = new THREE.Matrix4();
    const color = new THREE.Color();
    
    systems.forEach((system, i) => {
        // Position in 3D space
        matrix.setPosition(
            system.x / SCALE_FACTOR,
            system.y / SCALE_FACTOR,
            system.z / SCALE_FACTOR
        );
        starfield.setMatrixAt(i, matrix);
        
        // Color based on star type
        color.setHex(getStarColor(system.star_type));
        starfield.setColorAt(i, color);
    });
    
    starfield.instanceMatrix.needsUpdate = true;
    starfield.instanceColor!.needsUpdate = true; // instanceColor is created by the first setColorAt call
    
    return starfield;
};

This reduced our render time from 500ms to 4ms—a 125x performance improvement. A single draw call handles all 200,000 stars.

Camera Controls: Smooth Navigation

For navigation, we use Three.js's OrbitControls with custom constraints:

const controls = new OrbitControls(camera, renderer.domElement);

// Limit zoom to prevent clipping or infinite distance
controls.minDistance = 10;
controls.maxDistance = 5000;

// Smooth damping for cinematic feel
controls.enableDamping = true;
controls.dampingFactor = 0.05;

// Allow the full vertical orbit range; OrbitControls clamps at the poles,
// so the camera never flips
controls.maxPolarAngle = Math.PI;
controls.minPolarAngle = 0;

// Disable panning (orbit only) for focused experience
controls.enablePan = false;

The damping creates a physics-like momentum: when you spin the camera, it gradually decelerates rather than stopping abruptly. This makes exploration feel more immersive.
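One gotcha: damping only applies while controls.update() runs every frame, so the render loop has to call it. A minimal loop looks like this (later snippets swap renderer.render() for composer.render() once post-processing is added):

const animate = () => {
    requestAnimationFrame(animate);
    controls.update();              // applies the damped momentum each frame
    renderer.render(scene, camera);
};
animate();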

Field of View Tricks

We use a narrow FOV (30 degrees) instead of the typical 75 degrees:

const camera = new THREE.PerspectiveCamera(
    30,  // FOV (narrower = more "zoomed in" feel)
    window.innerWidth / window.innerHeight,
    0.1,  // Near plane
    10000 // Far plane
);

This creates a telephoto lens effect that compresses depth perception. Distant stars appear closer, making the universe feel denser and more interconnected. It's a subtle psychological trick borrowed from cinematography.
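If we ever expose FOV as a user setting, changing it at runtime only needs a projection-matrix refresh. A tiny sketch (setFov is a hypothetical helper, not part of our current UI):

const setFov = (camera: THREE.PerspectiveCamera, fovDegrees: number) => {
    camera.fov = fovDegrees;           // e.g. 30 for the telephoto look, 75 for wide-angle
    camera.updateProjectionMatrix();   // required after changing fov, aspect, near, or far
};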

Selection Highlighting: Raycasting an InstancedMesh

When users hover over a star, we need to identify which system they're pointing at. The classic tool is raycasting: project a ray from the cursor into 3D space and test it for intersections. Raycasting against 200,000 separate meshes would be far too slow, but because every star lives in a single InstancedMesh, Three.js can test them all efficiently.

Here's the hover handler:

const raycaster = new THREE.Raycaster();
const mouse = new THREE.Vector2();

const onMouseMove = (event: MouseEvent) => {
    // Normalize mouse coordinates to -1 to +1
    mouse.x = (event.clientX / window.innerWidth) * 2 - 1;
    mouse.y = -(event.clientY / window.innerHeight) * 2 + 1;
    
    // Update raycaster
    raycaster.setFromCamera(mouse, camera);
    
    // Test intersection with instanced mesh
    const intersects = raycaster.intersectObject(starfield);
    
    if (intersects.length > 0 && intersects[0].instanceId !== undefined) {
        // Instance ID = index into the systems array passed to createStarfield
        const system = systems[intersects[0].instanceId];
        showTooltip(system);
    }
};

Three.js's raycaster knows how to test an InstancedMesh efficiently: it first checks the bounding volume of the whole mesh, then runs a cheap bounding-sphere test per instance, and only performs full triangle intersection on the handful of stars the ray actually passes near. That keeps hover detection responsive even with 200,000 instances.
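With the instanceId in hand, a hover highlight is just another per-instance color write. Here's a sketch of the idea (the highlight color and the previousColor bookkeeping are illustrative, not our production code):

const HIGHLIGHT_COLOR = new THREE.Color(0xffff00);
const previousColor = new THREE.Color();
let highlightedId: number | null = null;

const highlightStar = (instanceId: number) => {
    // Restore the previously highlighted star's original color
    if (highlightedId !== null) {
        starfield.setColorAt(highlightedId, previousColor);
    }
    
    // Remember the hovered star's color, then tint it
    starfield.getColorAt(instanceId, previousColor);
    starfield.setColorAt(instanceId, HIGHLIGHT_COLOR);
    starfield.instanceColor!.needsUpdate = true;
    highlightedId = instanceId;
};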

Visual Polish: Glow Effects

Raw, unbloomed star points look clinical. We added a bloom effect using Three.js's post-processing pipeline:

import { EffectComposer } from 'three/examples/jsm/postprocessing/EffectComposer';
import { RenderPass } from 'three/examples/jsm/postprocessing/RenderPass';
import { UnrealBloomPass } from 'three/examples/jsm/postprocessing/UnrealBloomPass';

const composer = new EffectComposer(renderer);
composer.addPass(new RenderPass(scene, camera));

const bloomPass = new UnrealBloomPass(
    new THREE.Vector2(window.innerWidth, window.innerHeight),
    0.5,  // Bloom strength
    0.4,  // Bloom radius
    0.85  // Bloom threshold
);
composer.addPass(bloomPass);

// Render with bloom
const animate = () => {
    requestAnimationFrame(animate);
    controls.update();
    composer.render(); // Instead of renderer.render()
};

The bloom pass creates halos around bright stars, making them feel more luminous. It's computationally expensive (adds 3-5ms per frame) but dramatically improves visual quality.

Gate Connections: Line Rendering

Stargates are rendered as lines connecting systems:

const createGateLines = (gates: Gate[]) => {
    const geometry = new THREE.BufferGeometry();
    const positions = new Float32Array(gates.length * 6); // 2 vertices × 3 coords
    
    gates.forEach((gate, i) => {
        const fromSys = systemsById[gate.from_system];
        const toSys = systemsById[gate.to_system];
        
        positions[i * 6 + 0] = fromSys.x / SCALE_FACTOR;
        positions[i * 6 + 1] = fromSys.y / SCALE_FACTOR;
        positions[i * 6 + 2] = fromSys.z / SCALE_FACTOR;
        
        positions[i * 6 + 3] = toSys.x / SCALE_FACTOR;
        positions[i * 6 + 4] = toSys.y / SCALE_FACTOR;
        positions[i * 6 + 5] = toSys.z / SCALE_FACTOR;
    });
    
    geometry.setAttribute('position', new THREE.BufferAttribute(positions, 3));
    
    const material = new THREE.LineBasicMaterial({
        color: 0x4080ff,
        opacity: 0.3,
        transparent: true
    });
    
    return new THREE.LineSegments(geometry, material);
};

Using LineSegments instead of individual Line objects means one draw call for all gates, similar to our instanced star approach.
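Wiring everything into a scene then takes just two add calls, one renderable per batch (a minimal sketch, assuming the systems and gates arrays are already loaded):

const scene = new THREE.Scene();

// One draw call for all 200,000 stars, one for all gate lines
const starfield = createStarfield(systems);
const gateLines = createGateLines(gates);
scene.add(starfield);
scene.add(gateLines);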

Performance Monitoring: FPS Tracking

We track frame times to detect performance issues:

let lastSampleTime = performance.now();
let frameCount = 0;
let fps = 60;

const animate = () => {
    requestAnimationFrame(animate);
    
    frameCount++;
    if (frameCount % 60 === 0) {
        const now = performance.now();
        // Average over the last 60 frames; a single frame's delta is too noisy
        fps = Math.round(60000 / (now - lastSampleTime));
        lastSampleTime = now;
        console.log(`FPS: ${fps}`);
    }
    
    controls.update();
    composer.render();
};

If FPS drops below 30, we automatically reduce visual quality (disable bloom, reduce star detail) to maintain smooth interaction. Responsiveness > visual fidelity.
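The "reduce quality" step can be as simple as toggling the bloom pass, since every post-processing pass has an enabled flag. A sketch reusing the fps counter and bloomPass from above (the exact thresholds are illustrative, not our production values):

const LOW_FPS = 30;
const RECOVERED_FPS = 50;

const adjustQuality = (currentFps: number) => {
    if (currentFps < LOW_FPS && bloomPass.enabled) {
        // Bloom is the most expensive optional effect, so drop it first
        bloomPass.enabled = false;
    } else if (currentFps > RECOVERED_FPS && !bloomPass.enabled) {
        // Frame budget recovered: turn bloom back on
        bloomPass.enabled = true;
    }
};

// Call adjustQuality(fps) from animate() each time fps is recalculated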

Lessons for Large-Scale 3D Rendering

Building this taught us several key principles:

1. Batch aggressively. Instanced meshes and buffer geometries reduce draw calls from thousands to single digits.

2. LOD is essential. Render distant stars as simple points and nearby stars as spheres; users can't tell the difference, but the GPU can (see the sketch after this list).

3. Post-processing is expensive. Bloom looks great but adds 20-30% overhead. Make it optional.

4. Camera matters. Narrow FOV, constrained orbit, and smooth damping make navigation feel intentional and cinematic.

5. Profile constantly. Use Chrome DevTools Performance tab to identify bottlenecks. The slowdown is rarely where you expect.
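To make point 2 concrete, here's a minimal sketch of a distance-based LOD swap: a cheap THREE.Points cloud for far views, the instanced spheres up close (the switch distance is illustrative, and we assume the map is roughly centered on the origin):

// Cheap far-away representation: one screen-space point per star
const createPointCloud = (positions: Float32Array) => {
    const geometry = new THREE.BufferGeometry();
    geometry.setAttribute('position', new THREE.BufferAttribute(positions, 3));
    const material = new THREE.PointsMaterial({ size: 2, sizeAttenuation: false });
    return new THREE.Points(geometry, material);
};

const LOD_SWITCH_DISTANCE = 1500;

// Call once per frame: show whichever representation suits the camera distance
const updateStarLod = (
    camera: THREE.PerspectiveCamera,
    pointCloud: THREE.Points,
    starfield: THREE.InstancedMesh
) => {
    const far = camera.position.length() > LOD_SWITCH_DISTANCE;
    pointCloud.visible = far;
    starfield.visible = !far;
};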

Future Enhancements

We have several rendering improvements planned for future releases.

Three.js gave us a solid foundation to build on. The library handles the low-level WebGL complexity, letting us focus on user experience and performance optimization. For anyone building large-scale 3D web visualizations, it's an excellent choice—mature, well-documented, and performant when used correctly.

Ready to explore 200,000 stars in 3D? Launch the map and experience the universe rendered in real-time in your browser.


Tags: threejs, webgl, 3d rendering, performance optimization, instanced rendering