Personal Project

2025

HUE 9000

A cinematic control surface built to test the limits of diegetic interface design and motion logic.

I designed and built a high-craft control surface whose UI complexity exceeds that of most production apps. This project is a record of building a complex diegetic engine in the pre-agentic era of 2025. I orchestrated a wide tech stack, from XState to OKLCH color math, to solve the hardest problems in motion choreography. The result is a high-fidelity interactive experience that looks and feels real. It turns out (at the time of writing) that building the truly difficult things still requires disciplined manual focus.

Poster frame of the HUE 9000 interface with the illuminated central lens

Proof points

Outcomes and impact

Shipped artifact

A live cinematic control surface with breathing buttons, backlit simulation, parametric lens gradient, radial dials, and a full ambient audio layer.

Working method

Codified a five-stage AI-assisted development workflow (plan, build, troubleshoot, refine, simplify) that kept visual judgment in human hands.

Range extension

Built fluency in GSAP motion, Howler.js audio, XState behavior logic, and advanced Chrome DevTools debugging on a codebase whose UI complexity exceeds that of most production apps.

Origins in Failure

This project began as an on/off status light for an entirely different project. While I was working on that other project, the AI revolution was gaining momentum around me, and I thought it would be fun to explore a HAL 9000-style power light with an adjustable color. I built the component in isolation, where it looked great. Then I dropped it into the project, and it looked absolutely terrible.

The lighting and the vibe were a complete misfit for the project; in that context, it was a total failure. Still, the button looked pretty cool in isolation. Rather than throwing it away, I decided to build a completely new project where this lens could be the hero.

The Lens Bezel

Creating the HAL 9000-inspired lens began with the bezel. I created a high-fidelity angular gradient for the inner and outer rings, using a reference photo of the original hardware as a general guide while making intentional changes: I reduced the number of colors (which would become CSS variables so I could animate their intensity) and introduced more left-right symmetry.

Once the bezel looked right, I converted the shades of gray into parametric CSS variables. This allowed me to adjust the global lighting intensity dynamically.
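A minimal sketch of what that parametric shading can look like, assuming illustrative variable names and lightness values (not the project's actual ones): each gray is stored as a base lightness and scaled by a single global intensity before being written out as a CSS custom property.

```javascript
// Hypothetical bezel grays, as base lightness values (0..1).
const BEZEL_STOPS = [0.18, 0.32, 0.55, 0.72];

function bezelShades(intensity) {
  // intensity: 0 (fully dimmed) .. 1 (full lighting)
  return BEZEL_STOPS.map((lightness, i) => {
    const lit = Math.min(1, lightness * intensity);
    // Achromatic OKLCH: only lightness varies, chroma and hue stay 0.
    return [`--bezel-shade-${i}`, `oklch(${(lit * 100).toFixed(1)}% 0 0)`];
  });
}

// In the browser, each pair would be applied with
// document.documentElement.style.setProperty(name, value).
```

Driving every gray through one intensity scalar is what makes a single slider (or animation timeline) able to relight the whole bezel at once.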

Defining Highlights

To sell the illusion of a physical glass surface, I created SVG highlights and overlaid them with ::after pseudo-elements. I found a high-quality image of what I believed was the original HAL 9000 and carefully traced its sharp specular highlights.

After the work was finished, I rewatched the actual movie and realized my reference image was not a production photo. It was a fan-made 3D recreation. Fortunately, the traced highlights looked great, and their sharpness was worth keeping.

Gradient Reconstruction and Hue Shifting

The internal lens illumination follows the physics of the original optic. I carefully mapped a photo of the lens into a reconstructed CSS radial gradient, paying extra attention to the boundaries between regions. I then converted all the color stops into CSS variables in the OKLCH color space, which allowed me to vary the hue dynamically across the different stops.

One key challenge was the inner glow, which has a noticeable hue shift at its brightest point. I used a CSS calc() variable to create an offset hue: if a user changes the overall lens color from red to blue, the inner region mathematically maintains that shifted, brighter center.
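The idea can be sketched in a few lines, with the offset amount and the lightness/chroma values chosen purely for illustration: one base hue drives every stop, and the bright center derives its hue from the base plus a fixed offset, so recoloring the lens preserves the shifted core.

```javascript
// Hypothetical hue offset for the bright center, in degrees.
const INNER_HUE_OFFSET = 25;

function lensHues(baseHue) {
  return {
    // Outer body of the lens uses the base hue directly.
    outer: `oklch(55% 0.18 ${baseHue})`,
    // Bright center is always offset from the base, wrapped to 0..359.
    inner: `oklch(85% 0.12 ${(baseHue + INNER_HUE_OFFSET) % 360})`,
  };
}
```

The same relationship can live entirely in CSS, e.g. --lens-hue-inner: calc(var(--lens-hue) + 25), which is closer to how a variable-driven gradient would consume it.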

Lens Interpolation and Camera Bloom

I had to model how the light behaves at different intensities to make the interface feel alive. I kept the exact same gradient color stops but adjusted their positions based on power level. At zero percent intensity, the bright inner glow compresses to nothing and is overlapped by the more subtle gradient stops; as power increases, those color stops expand outward.
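A sketch of that stop-position interpolation, with stop names and percentages invented for illustration: each stop carries a position at zero power and at full power, and the live position is a linear blend between the two.

```javascript
// Hypothetical radial-gradient stops: position (%) at zero and full power.
const STOPS = [
  { name: "core", min: 0,  max: 14 },  // bright inner glow
  { name: "body", min: 2,  max: 55 },  // main colored body
  { name: "edge", min: 60, max: 100 }, // dark falloff to the rim
];

function stopPositions(intensity) {
  // intensity: 0..1; returns percentage positions for the gradient stops.
  return STOPS.map((s) => ({
    name: s.name,
    pos: s.min + (s.max - s.min) * intensity,
  }));
}
```

At intensity 0, "core" sits at 0% and is immediately covered by "body" at 2%, which matches the collapsed-glow behavior described above; at full power the stops spread back out.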

To simulate a realistic camera bloom, I treated the light spill as a dynamic after-effect. The bloom grows in both opacity and radial size as the lens intensity increases. This creates the illusion that the lens is generating enough energy to glow past its physical edges.
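The bloom response can be sketched as a single function of intensity; the easing curve and the numeric ranges here are assumptions, not the project's measured values.

```javascript
function bloom(intensity) {
  // Ease-in curve: the spill stays subtle at low power and ramps up
  // quickly near full power.
  const t = intensity * intensity;
  return {
    opacity: 0.85 * t,       // capped below 1 so the bloom never goes opaque
    radiusPx: 40 + 160 * t,  // grows well past the lens's physical edge
  };
}
```

Driving both opacity and radius from the same eased value keeps the two channels in lockstep, which is what sells the "glowing past its edges" effect.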

Building the Main UI

When no high-resolution photo reference exists for a 1960s speaker grille, you build it yourself. I modeled the geometry in CAD to get the exact ridge spacing, hole size, and hole spacing. By rendering these custom assets, I was able to match the lighting of the user interface perfectly.

Visual Progression

As the project's visual language took shape, I explored a variety of layout options. Each iteration made a bit more sense, and eventually I landed on a macro-symmetric left-right layout, with some asymmetry within each macro region.

Visual progression: from early wireframes to the final cinematic surface.

Startup Sequence

The boot process is a thirteen-phase XState orchestration: a state machine that manages everything from real-time terminal logs to a global reduction factor that perceptually dims the entire interface during the power-up arc.
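The core idea can be reduced to a small sketch, assuming invented phase names and a linear dimming ramp (the real implementation is an XState machine with thirteen phases, not this plain-JS stand-in): the machine advances through ordered phases, and each phase maps to one global dim factor that the whole UI multiplies its brightness by.

```javascript
// Illustrative phase list, abbreviated; the actual sequence has thirteen.
const PHASES = ["dark", "terminal-init", "lens-prime", "ready"];

function createBoot() {
  let index = 0;
  return {
    get phase() { return PHASES[index]; },
    // Global reduction factor: near-black at the start, full brightness
    // once the arc completes.
    get dimFactor() { return 0.05 + 0.95 * (index / (PHASES.length - 1)); },
    next() {
      if (index < PHASES.length - 1) index += 1; // clamp at the final phase
      return PHASES[index];
    },
  };
}
```

Funneling every element's brightness through one shared factor is what lets a single orchestrator dim or reveal the entire surface in sync, rather than animating each component's opacity independently.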

Video Ratio Lab

A quick shape check using clips from the old website export, here to pressure-test square, portrait, and wider compositions before settling on a final playback format.

Mobile-First Refactor

Because the mobile version doesn't show the side panels, it needed a completely different startup sequence. Pulling this off took a grueling four-day manual refactor.