
Procedural Cosmic Backdrop: From Textures to Noise via Iterative Refinement

For months, EF-Map's universe view displayed white dots on a plain black background. Functional, yes. But it lacked the atmospheric depth that makes a star map feel like you're peering into an actual cosmos. This is the story of how we tried textures, discovered their limitations, built user controls for rapid iteration, and ultimately found that procedural noise with those same controls delivered the perfect solution.

The Journey Matters

This feature took multiple pivots before landing on the final approach. The "failed" attempts weren't wasted—they produced the slider system that made the final solution possible. Sometimes the tooling you build for one approach becomes essential for a completely different one.

The Starting Point: White Dots on Black

EF-Map renders 24,000+ star systems positioned in 3D space. The depth effects we added previously—glow, parallax, brightness falloff—gave the stars themselves dimensionality. But the background remained stark black, creating a somewhat clinical feel.

The goal was simple: add a subtle cosmic backdrop that suggests nebulae, dust clouds, and stellar nurseries without overwhelming the navigation function of the map. The stars and routes must remain the visual priority.

Attempt 1: Procedural Point Clouds

The Approach

Our first attempt used procedural generation—thousands of faint point sprites scattered in layers around the camera. The concept was sound: multiple layers at different distances would create parallax depth as the camera moved.
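For reference, a minimal sketch of that first approach, assuming a standard Three.js scene object already exists; the layer counts, radii, and opacities below are illustrative placeholders, not the values we actually used:

import * as THREE from 'three';

// Scatter faint points on a few spherical shells around the origin;
// shells at different radii drift at different rates as the camera moves.
function createPointCloudBackdrop(scene, layerCount = 3, pointsPerLayer = 4000) {
  for (let layer = 0; layer < layerCount; layer++) {
    const radius = 5000 + layer * 3000;                    // hypothetical shell radii
    const positions = new Float32Array(pointsPerLayer * 3);
    for (let i = 0; i < pointsPerLayer; i++) {
      const p = new THREE.Vector3().randomDirection().multiplyScalar(radius);
      positions.set([p.x, p.y, p.z], i * 3);
    }
    const geometry = new THREE.BufferGeometry();
    geometry.setAttribute('position', new THREE.BufferAttribute(positions, 3));
    scene.add(new THREE.Points(geometry, new THREE.PointsMaterial({
      size: 2,
      color: 0x8899bb,
      transparent: true,
      opacity: 0.15 - layer * 0.04,                        // fainter with distance
      depthWrite: false,
    })));
  }
}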

The Problem

The pattern was immediately recognizable as repetitive. The human eye is remarkably good at detecting regularity, and the point cloud had an obvious tiling quality. It looked like a texture on repeat rather than a boundless universe.

The feedback at the time: "It looked too repetitive. Very easy to spot a pattern, which kind of shattered the illusion."

We needed something with more visual complexity—something that felt organic rather than algorithmic.

Attempt 2: Real Nebula Textures

The Pivot

The obvious solution: use real astronomical imagery. We sourced high-resolution nebula photographs and mapped them onto a sphere surrounding the entire scene. This approach had worked beautifully in countless space games.

The Implementation

We created a massive inverted sphere (camera inside), applied the nebula texture using equirectangular projection, and positioned it far beyond all the star systems. Initial results were promising—the texture had the organic complexity we wanted.
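In Three.js terms, the setup looked roughly like the sketch below, assuming a scene object already exists; the texture path and sphere radius are placeholders:

import * as THREE from 'three';

// Skysphere sketch: a huge sphere rendered from the inside.
// SphereGeometry's default UVs give the equirectangular wrap;
// BackSide flips the faces so a camera inside the sphere sees the texture.
const nebulaTexture = new THREE.TextureLoader().load('nebula_equirect.jpg'); // placeholder path
const skySphere = new THREE.Mesh(
  new THREE.SphereGeometry(50000, 64, 32),   // placeholder radius, far beyond every star system
  new THREE.MeshBasicMaterial({
    map: nebulaTexture,
    side: THREE.BackSide,
    depthWrite: false,                       // never occlude stars or routes
  })
);
scene.add(skySphere);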

The Problems

Spherical texture mapping has two fundamental issues:

UV Mapping Challenges

  • The Seam: Where the two edges of the wrapped texture meet, there's a visible line. Various blending techniques can reduce the artifact but never eliminate it.
  • Pole Pinching: At the top and bottom of the sphere, the texture compresses into a point, creating distorted, stretched artifacts.

We tried multiple approaches to mitigate these issues, and each one traded one problem for another. Tri-planar mapping, for example, eliminated the single seam but replaced it with a grid of panel boundaries. The result looked like "lots of little panels stitched together", hardly better than the original problem.

The Turning Point: Sliders for Rapid Iteration

Building the Controls

While fighting the texture problems, we built a comprehensive UI for tuning the backdrop appearance: sliders for brightness, contrast, scale, color tint, and tint strength.

These sliders dramatically accelerated iteration. Instead of waiting for LLM-generated code changes and rebuilds, the human operator could tweak values in real time and see immediate results.
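The wiring is simple; the sketch below uses hypothetical element and uniform names. The point is that a slider writes straight into a live shader uniform, so there is no rebuild step between tweak and result:

// Hypothetical slider-to-uniform wiring; element and uniform names will differ in the real app.
const brightnessSlider = document.querySelector('#backdrop-brightness');
brightnessSlider.addEventListener('input', () => {
  // The running material reads the new value on the next rendered frame - no rebuild needed.
  backdropMaterial.uniforms.uBrightness.value = parseFloat(brightnessSlider.value);
});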

The Slider Advantage

Building adjustable controls wasn't just convenient—it was essential for discovery. The ability to quickly test dozens of configurations revealed patterns that would have taken days to find through code iteration alone.

The Realization

Despite extensive tinkering with the sliders, the texture-based approach never felt right. The seams remained visible. The image felt static. But the sliders themselves were working beautifully.

The question became: could we keep these controls but switch the underlying rendering to something without UV mapping problems?

The Solution: Procedural Noise with User Controls

Why Noise Works

Procedural noise (specifically 3D simplex noise with Fractional Brownian Motion) has a key advantage: it samples directly from 3D world coordinates. There's no UV mapping, which means no seam where the texture wraps, no pinching at the poles, and no repeating tiles.

The Implementation

We rewrote the backdrop shader to use multi-octave noise instead of texture sampling:

// Fractional Brownian Motion (FBM) - layered noise for organic patterns.
// snoise() is a 3D simplex noise function defined elsewhere in the shader.
float fbm(vec3 p, int octaves) {
  float value = 0.0;
  float amplitude = 0.5;
  for (int i = 0; i < octaves; i++) {
    value += amplitude * snoise(p);
    p *= 2.0;        // Increase frequency
    amplitude *= 0.5; // Decrease amplitude
  }
  return value;
}

// Sample three noise layers at different frequencies
float n1 = fbm(scaledPos * 0.5, 5);
float n2 = fbm(scaledPos * 1.2 + vec3(100.0), 4);
float n3 = fbm(scaledPos * 2.5 + vec3(200.0), 3);

// Combine for varied nebula structure
float nebula = n1 * 0.5 + n2 * 0.3 + n3 * 0.2;
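On the JavaScript side, a material along these lines ties the noise shader to the backdrop sphere. The uniform names mirror the sliders and are illustrative rather than EF-Map's actual identifiers; backdropVertexShader and backdropFragmentShader are assumed to hold the GLSL source:

import * as THREE from 'three';

// Backdrop material sketch: the fragment shader string contains snoise(), fbm(),
// and the nebula mix shown above; the uniforms are driven by the tuning sliders.
const backdropMaterial = new THREE.ShaderMaterial({
  side: THREE.BackSide,
  transparent: true,
  depthWrite: false,
  uniforms: {
    uScale:        { value: 0.7 },
    uContrast:     { value: 2.1 },
    uBrightness:   { value: 0.25 },
    uTint:         { value: new THREE.Color(0.1, 0.9, 1.3) },
    uTintStrength: { value: 0.45 },
  },
  vertexShader: backdropVertexShader,     // GLSL string that forwards world position
  fragmentShader: backdropFragmentShader, // GLSL string containing the noise code above
});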

Keeping the Controls

Critically, we preserved all the sliders built during the texture phase. They now controlled the procedural generation:

Slider        Texture Mode       Procedural Mode
Scale         UV zoom            Noise frequency (higher = larger features)
Contrast      Color curve        Density power curve
Color Tint    Multiply blend     Gradient color shift
Brightness    Output multiply    Output multiply
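As a rough model of what the fragment shader does with those controls per sample, expressed in JavaScript for readability; the exact formulas in the real shader may differ:

// Illustrative per-sample math for the procedural mode: contrast is a power curve
// on noise density, brightness a final multiply, and the tint pulls the result
// toward a chosen color. The real shader's formulas may differ.
function shadeSample(noise, { contrast, brightness, tint, tintStrength }) {
  const density = Math.pow(Math.max(noise, 0.0), contrast);
  const base = density * brightness;
  return {
    r: base * ((1 - tintStrength) + tintStrength * tint.r),
    g: base * ((1 - tintStrength) + tintStrength * tint.g),
    b: base * ((1 - tintStrength) + tintStrength * tint.b),
  };
}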

Dialing In the Defaults

With procedural noise and functional sliders, finding the perfect look took minutes rather than hours:

Final Default Values

  • Brightness: 25% — subtle, not overwhelming
  • Contrast: 2.1 — enough structure to see nebula shapes
  • Scale: 0.7 — large enough features to feel organic
  • Color Tint: R 0.1, G 0.9, B 1.3 — cool blue-cyan shift
  • Tint Strength: 45% — noticeable but not garish

The effect adds depth and atmosphere while remaining subtle enough that users notice the map "feels better" without being able to pinpoint exactly why.

Parallax Layer Refinement

The Parallax Problem

We render the backdrop as three concentric spheres at different distances, each moving at a different rate relative to the camera. This creates depth through parallax, but our initial values were too conservative: with all three layers moving at nearly the same speed, the effect was imperceptible.

Spreading the per-layer rates much further apart fixed it, creating noticeable but not distracting relative motion between layers as the camera pans. A sketch of the idea follows.
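One way to express the per-layer parallax, assuming each shell is recentred on the camera every frame with a layer-specific lag; the factors below are placeholders, not our shipped values:

// Each backdrop shell is recentred on the camera every frame, lagging by a
// per-layer factor: a factor of 0 pins the shell to the camera (no parallax),
// larger factors let it drift more as the camera moves.
const parallaxFactors = [0.05, 0.15, 0.35]; // placeholder values
function updateBackdropParallax(camera, backdropLayers) {
  backdropLayers.forEach((layer, i) => {
    layer.position.copy(camera.position).multiplyScalar(1 - parallaxFactors[i]);
  });
}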

Protecting the Illusion: Zoom Limits

The Edge Case

The backdrop spheres, while large, aren't infinite. If a user zoomed out far enough, they could see "outside" the effect—revealing the spherical geometry and breaking the illusion entirely.

The Fix

We clamped the maximum zoom-out distance to 10,000 units (matching our embed guide's recommended limits). At this distance, all stars remain visible while the backdrop effect stays seamlessly intact. The universe feels bounded by natural limits rather than arbitrary technical constraints.

// In OrbitControls setup
controls.maxDistance = ZOOM_MAX_DISTANCE; // 10,000 units

// Previously was Infinity - allowed seeing outside the backdrop

User Empowerment: Keeping the Sliders

A key decision: we kept all the tuning sliders exposed to users. The defaults work well, but different users have different preferences for how prominent the backdrop should be.

By exposing the controls, users can customize the experience to their taste without needing to ask for features or wait for updates.

The LLM Workflow Advantage

This feature exemplifies the "vibe coding" development style. The human provided direction ("make it look like space"), evaluated results ("too repetitive", "I can see the seam"), and tuned via sliders. The LLM handled shader math, UV coordinate systems, noise algorithms, and performance optimization. Neither could have built this alone—but together, the iteration happened in hours rather than weeks.

Lessons Learned

1. Build Controls Early

The sliders built for the "failed" texture approach became essential for the successful procedural approach. Investment in iteration tools is never wasted.

2. Know When to Pivot

We spent significant time on texture solutions before accepting their fundamental limitations. Sometimes the constraints are architectural, not tunable. Recognizing this earlier saves time.

3. Procedural Beats Textured for Seamless Wrapping

Any effect that needs to wrap continuously in 3D space will struggle with texture mapping. Procedural generation, while more complex to implement, eliminates entire categories of visual artifacts.

4. Subtle Is Better

The default effect at 25% brightness is barely noticeable on first glance. That's intentional. The map now "feels like space" without users being distracted by the background. The best visual effects are often the ones you don't consciously notice.

The Result

EF-Map's universe view now has depth, atmosphere, and visual richness—but remains a tool first. The stars and routes stand out clearly against a backdrop that suggests infinite cosmic space without competing for attention. And if users disagree with any aesthetic choice, they can tune every parameter themselves.

From white dots on black, to repeating patterns, to seamed textures, to seamless procedural noise—the journey took multiple attempts, but each "failure" contributed something to the final solution. That's iterative development in action.

procedural generation simplex noise fbm three.js webgl shader nebula uv mapping iterative development vibe coding eve frontier