Since 2022, I’ve been creating audio-reactive visual experiences using Unreal Engine’s Niagara particle system. My work spans live music performances, VR DJ experiences, and experimental sound design, translating audio into dynamic, responsive motion. I’ve collaborated with teams at TLM Partners and Sensorium to build immersive VR environments where the visuals actually react to the music in real time. Every project starts with a simple question: how do I make particles dance to the beat?
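The core idea behind audio-reactivity can be sketched in a few lines: measure how loud the current audio frame is, then drive a particle parameter (such as spawn rate) from that loudness. This is a minimal, hypothetical illustration, not the actual Niagara pipeline; the function names and constants are invented for the example.

```python
import math

def frame_rms(samples):
    """Root-mean-square loudness of one audio frame (sample values in -1..1)."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def spawn_rate(samples, base=50.0, gain=400.0):
    """Particles per second: a quiet floor plus a loudness-driven boost.
    In practice this value would be pushed into a Niagara user parameter
    each tick; 'base' and 'gain' here are illustrative numbers."""
    return base + gain * frame_rms(samples)

# A loud frame spawns far more particles than a quiet one.
quiet = [0.01] * 512
loud = [0.8] * 512
print(spawn_rate(quiet))  # close to the base rate
print(spawn_rate(loud))   # strongly boosted
```

Real setups typically smooth the loudness over several frames or isolate a frequency band (e.g. the kick drum) before mapping it, so the motion locks to the beat rather than flickering with every transient.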
For over a decade, I’ve designed visual experiences for major festivals and events, from the Intro Electronic Music Festival in Beijing (2011-2018) to smaller experimental projects across Europe and Asia. I work closely with lighting teams, projection designers, and event producers to create cohesive stage environments. Whether it’s video mapping, LED installations, or live VJing, the goal is always the same: craft something that feels alive and connected to the performance happening in front of it.
Fader works across music visualization, stage design, and live visual production, building immersive experiences for concerts, festivals, VR projects, and experimental audiovisual work.
Real-time, audio-reactive visual systems created for live music, immersive environments, and performance-led experiences.
Stage concepts and visual identities designed to support the atmosphere, scale, and flow of a live event.
Video mapping, LED content, VJ workflows, and show visuals developed to feel fully connected to the performance.
A collaborative process with artists, studios, producers, and technical teams to turn ideas into production-ready visual experiences.