NoizBox

Crafting Spatial Soundscapes

Designed & Developed by Ben Westergreen
Personal Project, 2023
Created with Unity Engine for Meta Quest 2 (PCVR)

NoizBox is an experimental virtual reality spatial audio project developed with Unity Engine. It emerged from an exploration of how to create a synesthetic VR experience that celebrates body autonomy and empowers users to connect with rhythm and sound on a personal level.

Concept Development

The focus of developing NoizBox was designing for agency, giving users control over their experience. Unlike traditional rhythm games that dictate specific movements and note maps, NoizBox lets users move to music on their own terms. I developed a series of story points to explore aesthetic, expression, and scale for XR.

Prototype 1:
Interactive Sound Objects

The project introduced audio-reactive cubes representing sound objects, which users could play, move, and manipulate within the virtual space. These cubes dynamically changed color and size based on audio intensity, creating visually expressive particle effects that resonated with the audio.
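
As a rough illustration of that behavior, a minimal Unity C# component along these lines could sample the attached AudioSource and map loudness onto scale and color. The class and field names here are hypothetical; this is a sketch of the technique, not the project's actual code.

    using UnityEngine;

    // Hypothetical sketch of an audio-reactive cube: samples the attached
    // AudioSource and maps loudness to scale and color.
    [RequireComponent(typeof(AudioSource), typeof(Renderer))]
    public class AudioReactiveCube : MonoBehaviour
    {
        public float scaleMultiplier = 2f;   // how strongly loudness drives size
        public Gradient colorByIntensity;    // quiet-to-loud color ramp
        public float smoothing = 8f;         // eases the response between frames

        AudioSource source;
        Renderer rend;
        Vector3 baseScale;
        float level;
        readonly float[] samples = new float[256];

        void Awake()
        {
            source = GetComponent<AudioSource>();
            rend = GetComponent<Renderer>();
            baseScale = transform.localScale;
        }

        void Update()
        {
            // RMS of the current output buffer as a rough intensity measure.
            source.GetOutputData(samples, 0);
            float sum = 0f;
            for (int i = 0; i < samples.Length; i++) sum += samples[i] * samples[i];
            float rms = Mathf.Sqrt(sum / samples.Length);

            level = Mathf.Lerp(level, rms, Time.deltaTime * smoothing);
            transform.localScale = baseScale * (1f + level * scaleMultiplier);
            rend.material.color = colorByIntensity.Evaluate(Mathf.Clamp01(level));
        }
    }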

Playtesting: Users expressed a desire to explore the virtual environment more openly, often venturing off the guided path and into the virtual woods. As I explored ways to guide users while allowing for open exploration, striking a balance between narrative guidance and unrestricted movement proved to be a significant challenge. The dynamic nature of soundscapes and user interactions made it difficult to predict user trajectories and maintain a cohesive audio narrative.

Prototype 2:
Orbital Collecting

Prototype 2 centered on collecting audio objects and setting them into orbital motion around the user’s hand. By introducing an element of randomness to the orbits, I aimed to create an engaging, unpredictable experience. While the concept was experimental, I realized the importance of grounding ideas in the overall user flow.
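
A minimal sketch of the orbital idea, assuming a tracked hand transform and the illustrative names below: each collected object gets a slightly randomized radius, speed, and tilt so the orbits feel organic rather than mechanical.

    using UnityEngine;

    // Hypothetical sketch of orbital collecting: a collected audio object
    // orbits a hand anchor with per-object randomness in radius, speed, and tilt.
    public class OrbitingAudioObject : MonoBehaviour
    {
        public Transform handAnchor;      // assumed reference to the tracked hand/controller
        public float baseRadius = 0.25f;
        public float baseSpeed = 90f;     // degrees per second

        float radius, speed, angle;
        Vector3 orbitAxis;

        void Start()
        {
            // Randomize each orbit slightly so collected objects don't overlap.
            radius = baseRadius * Random.Range(0.8f, 1.4f);
            speed = baseSpeed * Random.Range(0.7f, 1.5f);
            angle = Random.Range(0f, 360f);
            orbitAxis = Quaternion.Euler(Random.Range(-25f, 25f), 0f, Random.Range(-25f, 25f)) * Vector3.up;
        }

        void Update()
        {
            if (handAnchor == null) return;
            angle += speed * Time.deltaTime;
            // Rotate a radius vector around the tilted axis and offset from the hand.
            Vector3 offset = Quaternion.AngleAxis(angle, orbitAxis) * (Vector3.forward * radius);
            transform.position = handAnchor.position + offset;
        }
    }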

Style Frames & Generative Space

Style frames were instrumental in honing the aesthetics of NoizBox. Through quick, gestural digital paintings, I experimented with various visual styles, eventually leading to the creation of the generative cube rooms. These captivating and ever-changing spaces allow users to lose themselves in a symphony of visuals that respond to their movements and actions.

Procedural Generation

The development of generative tools opened a world of possibilities for NoizBox’s environments. By adapting the “generative creature” code from Eliza Struthers-Jobin, I explored ways to sync her creature to a BPM and allow user-controlled acceleration based on the effect intensity of specific sound cubes.
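
Eliza’s generative-creature code isn’t reproduced here, but the BPM-sync idea can be sketched as a small clock component that other behaviours read their phase from; the intensity multiplier stands in for the effect level of a sound cube. All names below are illustrative.

    using UnityEngine;

    // Minimal sketch of clocking an animation to a BPM, with a user-controlled
    // multiplier (e.g. driven by a sound cube's effect intensity).
    public class BpmClock : MonoBehaviour
    {
        public float bpm = 120f;
        [Range(0.25f, 4f)] public float intensityMultiplier = 1f; // set by effect levers

        public float Phase { get; private set; }  // 0..1 progress within the current beat
        public int Beat { get; private set; }     // total beats elapsed

        float beatTimer;

        void Update()
        {
            float secondsPerBeat = 60f / bpm;
            beatTimer += Time.deltaTime * intensityMultiplier;
            while (beatTimer >= secondsPerBeat)
            {
                beatTimer -= secondsPerBeat;
                Beat++;
            }
            Phase = beatTimer / secondsPerBeat;
        }
    }

Any animated element, such as the generative creature, can then sample Phase each frame to stay locked to the beat while the user speeds it up or slows it down.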

Prototype 3:
Gyroscope Audio Control

Inspired by the Orba, Prototype 3 incorporated a gyroscope filter into NoizBox’s sound objects, giving users an exciting and satisfying way to control volume, distortion, and low-pass filtering. Combining multiple control types within a single object was intended to enhance the user experience and creative expression.
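
As a hedged sketch of that mapping using Unity’s built-in audio filters (the project’s actual signal chain may differ), the tilt of the grabbed sound object can be converted into 0–1 control values for volume, low-pass cutoff, and distortion:

    using UnityEngine;

    // Sketch of gyroscope-style control: the orientation of the transform holding
    // the sound object is mapped onto volume, low-pass cutoff, and distortion.
    [RequireComponent(typeof(AudioSource))]
    public class TiltAudioControl : MonoBehaviour
    {
        public AudioLowPassFilter lowPass;        // cutoff follows forward/back tilt
        public AudioDistortionFilter distortion;  // distortion follows sideways tilt

        AudioSource source;

        void Awake() => source = GetComponent<AudioSource>();

        void Update()
        {
            // Convert signed euler angles (-180..180) into 0..1 control values.
            Vector3 e = transform.localEulerAngles;
            float pitch = Mathf.InverseLerp(-90f, 90f, Mathf.DeltaAngle(0f, e.x));
            float roll  = Mathf.InverseLerp(-90f, 90f, Mathf.DeltaAngle(0f, e.z));
            float tilt  = Vector3.Angle(transform.up, Vector3.up) / 90f; // 0 upright, 1 on its side

            if (lowPass != null)    lowPass.cutoffFrequency = Mathf.Lerp(300f, 22000f, pitch);
            if (distortion != null) distortion.distortionLevel = roll;
            source.volume = 1f - Mathf.Clamp01(tilt) * 0.8f; // tilting away attenuates the loop
        }
    }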

Playtesting: While the gyroscope filter added a layer of interactivity, user testing exposed the challenges of combining various interactions without clear feedback. The lack of intuitive responses left users feeling uncertain about the changes they were making to the sound.

Prototype 4: Layered Audio Experience

Prototype 4 explored a linear approach to guide users through a curated series of sounds that naturally layered and built upon each other. Users unlocked the next sound object by interacting with the current one for a set time. The challenge lay in quickly prototyping different sound samples to discover how they could combine and layer, guiding users through the journey.
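
The unlock mechanic can be sketched as a simple timer, with assumed names: the interaction scripts report active play time, and once the threshold is crossed, the next (initially disabled) sound object is switched on.

    using UnityEngine;
    using UnityEngine.Events;

    // Sketch of the layered-unlock idea: once the user has interacted with this
    // sound object for long enough, the next object in the sequence is enabled.
    public class TimedUnlock : MonoBehaviour
    {
        public float requiredSeconds = 20f;
        public GameObject nextSoundObject;   // disabled in the scene until unlocked
        public UnityEvent onUnlocked;        // hook for visual/audio feedback

        float interactedTime;
        bool unlocked;

        // Assumed to be called each frame the user is actively playing this object
        // (e.g. from the grab or drum-pad interaction scripts).
        public void ReportInteraction(float deltaTime)
        {
            if (unlocked) return;
            interactedTime += deltaTime;
            if (interactedTime >= requiredSeconds)
            {
                unlocked = true;
                if (nextSoundObject != null) nextSoundObject.SetActive(true);
                onUnlocked?.Invoke();
            }
        }
    }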

Opportunity: During this exploration, a critical realization emerged: there was no tool to rapidly prototype different sound samples in a spatial environment. This obstacle presented an opportunity to pivot toward a spatial audio tool that lets users create and experiment with layered compositions.

Frontend Framework & Visual Design

The project’s frontend framework, Nova UI, accelerated prototyping by allowing basic shapes and assets to be created directly in the editor. Early testing and visual design for interactions were facilitated with a 2D wireframe and prototype in Figma.

Audio Objects & Modular Design

NoizBox’s audio objects embrace a modular design, offering a flexible and adaptable foundation for the project. Effect panels for loops and one-shots add depth to user interaction. Loops feature effect levers for control over filters and distortion, while one-shots include drum pads for triggering sounds. The modular audio objects in NoizBox become the canvas for users to paint their sonic stories.
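
One way to picture that modular split, with illustrative class names only: a shared base for every audio object, loops exposing continuous effect levers, and one-shots exposing pad triggers.

    using UnityEngine;

    // Illustrative sketch of the modular audio-object design described above.
    public abstract class AudioObject : MonoBehaviour
    {
        public AudioSource source;
    }

    public class LoopObject : AudioObject
    {
        public AudioLowPassFilter lowPass;
        public AudioDistortionFilter distortion;

        // Called by effect levers with a 0..1 value.
        public void SetFilterAmount(float value) =>
            lowPass.cutoffFrequency = Mathf.Lerp(300f, 22000f, value);

        public void SetDistortionAmount(float value) =>
            distortion.distortionLevel = value;
    }

    public class OneShotObject : AudioObject
    {
        public AudioClip[] padClips;

        // Called when the user strikes a drum pad.
        public void TriggerPad(int padIndex)
        {
            if (padIndex >= 0 && padIndex < padClips.Length)
                source.PlayOneShot(padClips[padIndex]);
        }
    }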

Data Visualization

To enrich the display of audio samples, I took inspiration from sound applications like FractalBits. Using the AI image generator Midjourney, I built a catalog of imagery for a curated selection of audio samples. My prompts were based on descriptions of how each sample sounded, allowing me to generate over 200 distinct images. The generated images were then sorted by affinity attributes and assigned to instruments.

Lighting & Optimization

I focused on making the experience smoother and more visually appealing through lighting and scene optimization. This involved baking lights and enabling static and dynamic batching, which led to significant performance improvements. Dimly lit scenes were carefully designed to create a comfortable environment for audio creation and extended play sessions.

Future Additions

Features I would like to explore to expand and enhance NoizBox:

  • User Customization: Import your own audio and art for a personalized experience.
  • Full AR Functionality: Embrace upcoming devices like Quest 3 and Apple Vision Pro for immersive AR experiences.
  • Audio Reactive Elements: Witness the environment come alive, responding to sound intensity.
  • Enhanced Effect Panels: More tools to shape sounds and compositions.
  • Collaboration with Professionals: Partner with sound designers, producers, and performers for specialized tools.
  • Live Performance and Collaboration: Connect in real-time for mesmerizing audio and visual experiences.
