Good music stirs by its mysterious resemblance to the objects and feelings which motivated it.
Jean Cocteau
Objects of Power is a new media art exhibit that allows attendees to experience physical texture within a spatial, atmospheric soundscape using sound localization.
Various objects were placed in a 3D space set up with spatial sound. As the attendees moved through the space, their position & orientation relative to the objects determined the intensity & the part of the soundscape they would hear.
The result was a unique interactive listening experience that amplified the attendees' senses & gave them a moment of pause to really think about the relation between the objects, the space & the soundscape.
Background
The idea behind this exhibit was to create a setting that allowed the attendees to experience a spatial soundscape with the sound itself being the main driving factor.
Initially the idea was to have the attendees in a completely dark room, navigating this 3D space & moving around the soundscape purely by following auditory cues.
I quickly moved away from this idea as being in a completely dark setting can be disturbing and even disorienting for some people.
Keeping that in mind, I began looking into various forms of visuals to use as guidance triggers, using projectors & VR headsets. However, I ultimately decided to minimize the technological aspect & instead create certain ‘objects’ that would, in an abstract sense, convey texture & an association with the elements of nature.
The Objects
After a very arbitrary decision process I chose to create five objects, each tied to a short atmospheric track that I carefully tailored to that object's form & texture.
The five objects were loosely inspired by the five elements of nature. They vary from found objects to carefully modelled & 3D-printed entities.
‘The Cube’ – Green modelling foam & wire mesh | Earth
‘Snow Globe’ – Floral foam inside glass ball | Air
‘Container’ – Oil inside glass ball | Water
‘Halo’ – Found metal part | Fire
‘Orb’ – 3D-printed PLA | Sky
The Soundscape
I created atmospheric sound signatures for each of the objects in the NMSA Sound Lab. When writing the pieces I took specific care that they would all mix together into a convincing soundscape no matter where the attendee was situated in the room. (P.S. you can play multiple audio files at once, so have fun making your own little soundscape.)
The exhibit itself was also held in the NMSA Sound Lab, owing to it being a space already set up for surround sound. I performed the spatial localization fully live, adjusting the volumes & panning of the tracks as the attendees moved through the space.
Live control via the Image-Line Remote app
Director's (producer's?) notes
Testing mockup of the objects in a scaled space
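I mixed entirely by ear, but the underlying mapping can be sketched in code. Below is a minimal, hypothetical model of how a track's gain & pan could follow an attendee's position & heading relative to an object; the function name, rolloff exponent & reference distance are my own illustrative choices, not values used at the exhibit:

```python
import math

def track_levels(listener, facing_deg, obj, ref_dist=1.0, rolloff=1.5):
    """Return (gain, pan) for one object's track, given the listener's
    2D position, heading in degrees & the object's 2D position.
    gain: 0..1 volume; pan: -1 (hard left) .. +1 (hard right)."""
    dx, dy = obj[0] - listener[0], obj[1] - listener[1]
    dist = math.hypot(dx, dy)
    # Inverse-distance rolloff: full volume inside ref_dist, fading beyond.
    gain = 1.0 if dist <= ref_dist else (ref_dist / dist) ** rolloff
    # Pan by the object's angle relative to the listener's facing direction.
    rel = facing_deg - math.degrees(math.atan2(dy, dx))
    pan = math.sin(math.radians(rel))
    return gain, pan

# Facing +y (90°): an object 2 m straight ahead stays centered & quieter,
# while an object 2 m to the right pans hard right.
ahead = track_levels((0.0, 0.0), 90.0, (0.0, 2.0))
right = track_levels((0.0, 0.0), 90.0, (2.0, 0.0))
```

Since every object's track plays continuously, summing each track at its computed gain & pan approximates the effect of walking through the soundscape.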
Future Direction & Reflections
Overall I was very happy with the reception of the exhibit. All the attendees, in one way or another, expressed that they felt sensory experiences that were not physically present in the space, like taste & touch, which was amazing to hear.
The experience was almost as I wanted it to be, but the exhibit could have used a bit more context & some kind of more concrete narrative to give it more meaning. On the flip side, the current context (or lack thereof) made for an interesting talking point, as everyone had their own unique experience.
That made me realize that perhaps I personally like to create experiential frameworks more than directed pieces. Both would still be highly curated, but by keeping it more abstract, everyone's mind was able to run free & they created their own associations & stories based on their personal feelings, emotions & experiences.
Moving forward I would love to scale the exhibit up & create an even more immersive soundscape. I envision it as a two-part exhibit: one part highly narrative, driven by some kind of visuals, & the other purely abstract, with just light & sound.
I do not know what the future truly holds for myself & my work going forward, but I do know that the support I have received from my friends, colleagues & faculty has truly inspired me to explore the boundaries of what I am capable of as an artist & designer.
“I’m half living my life between reality and fantasy at all times.”
Lady Gaga
In this project I aimed to explore the realm of mixed reality and how we could use it as a means to make a tabletop role playing game (TTRPG) experience more immersive for players. The result was a prototype of how we can incorporate mixed reality into world building & setting up scenes for campaigns / sessions.
Research.
I began by looking into research papers & books that discussed mixed reality and how it can be used as a tool for creating experiences, specifically in board games / TTRPGs. The idea was to understand how we can use this technology to augment the player experience, not for it to take over completely.
Based upon the research I was able to draw out the following points as a rough framework for designing mixed reality transformative play.
Seamless Integration
User Engagement
Balancing Realism & Fantasy
Scenario Building
Personalization
Storytelling & Narratives
I decided to focus on the scenario / world building portion of TTRPGs, as in my view that is something that can be augmented or even completely translated using mixed reality or similar technologies.
Experimentation.
My first experiment was using the passthrough feature of the Meta Oculus Quest 2.
Quest 2 headset & controllers
However, the low resolution, lack of color video & general bulkiness of a VR headset quickly drove me to look towards other options.
Passthrough + Controllerless Hand tracking test – Oculus Quest 2
The next series of experiments revolved around the Microsoft HoloLens, which is a true mixed reality headset, unlike the Quest 2, which is a VR headset capable of passthrough.
HoloLens 1 (left) & 2 (right)
The benefit of the HoloLens was the lack of controllers & higher onboard processing capacity. I settled on using the HoloLens 2 over the 1 due to its more accurate hand tracking & spatial mapping capabilities.
My initial exploration revolved around understanding the device and using its basic inbuilt features to be able to move forward with developing a more concrete outcome.
Basic 3D model placement & hand gestures
The initial pipeline that I settled on was the Mixed Reality Toolkit (MRTK), Unity & the onboard processing of the HoloLens 2.
The next experiment was testing out procedural generation using a modified Unity application. While there were issues with getting it to register hand gestures correctly, the overall effect was rather interesting to experience.
Procedural generation using Unity – the mesh boundaries of the spatial mapping are visible.
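The experiment itself ran as a Unity app on the HoloLens; the core idea, scattering generated content across the mapped room, can be illustrated with a small seeded sketch. The function, bounds & seed here are hypothetical stand-ins for the spatial mapping mesh, for illustration only:

```python
import random

def scatter_props(bounds, n, seed=7):
    """Deterministically scatter n prop positions inside a rectangular
    region of the mapped room (a stand-in for the spatial mapping mesh)."""
    (x0, y0), (x1, y1) = bounds
    rng = random.Random(seed)  # seeded: the same room yields the same layout
    return [(rng.uniform(x0, x1), rng.uniform(y0, y1)) for _ in range(n)]

# 12 props scattered over a 4 m x 3 m patch of floor.
props = scatter_props(((0.0, 0.0), (4.0, 3.0)), n=12)
```

Seeding the generator is what makes the layout repeatable between sessions, which matters if a game master wants the same scene to reappear in the same room.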
My next course of action was to understand in depth how to develop and design for the HoloLens and potentially build my own Unity app. After much trial and error, and within the timeframe of this project, I was able to construct a simple enough Unity application that allowed me to make a bare-bones world building demo.
World building Demo
This video demonstrates a bare-bones way for game masters to create, in mixed reality, scenarios that would normally be either paper maps or crafted models.
The players are able to view what would be their inventory & a game scenario that features both static & animated assets (some of which are my own, some of which are premade).
With MR, players are able to view parts of the world that they normally would not be able to; things like imagining you are on a floating island are easier to visualize when you can see one in front of your eyes.
Using this, game masters could create multiple scenes that could be switched with just an air tap & even mix in real-world figures or models to create even more interactive & immersive scenarios.
Backend insight.
This demo was made using Unity, Microsoft Azure and the Mixed Reality Toolkit (MRTK). I utilized the sensor data for hand tracking & spatial mapping via the MRTK, & spatial anchoring via Microsoft Azure (the mechanism used to place / suspend 3D models in the real world).
Spatial anchor data for the world building demo
Head tracking & some more anchor data
Future Design Pipeline & Reflections.
While Unity may be the current “standard” way to design & develop for mixed reality, it runs into multiple logistical issues stemming from, but not limited to, the limited onboard processing capacity of the HoloLens 2 (it came out in 2019 & was already dated on the processing front back then).
Keeping that in mind, a better approach would be to stream content to your headset of choice via Unreal Engine 5 & Figma. Creating interfaces in Figma & 3D asset scenes in UE5 with the help of the MRTK, then streaming that content to a headset like the Magic Leap 2, is the sweet spot with the technology currently available in the MR space.
Magic Leap 2 – A very compact but tethered MR goggle.
Thinking more about what I have done up till now I would definitely want to explore more ways to create & maintain immersion using techniques like soundscapes. I would also like to explore new ways of sensory input that could potentially be used to augment & enhance player experience in such gaming scenarios.
Bibliography.
Games as Blends: Understanding Hybrid Games – Ville Kankainen, Jonne Arjoranta, Timo Nummenmaa
Roles People Play: Key Roles Designed to Promote Participation and Learning in Alternate Reality Games – Elizabeth Bonsignore, Derek Hansen, Kari Kraus, Amanda Visconti, Ann Fraistat
Designing to Split Attention in a Mixed Reality Game – Hitesh Nidhi Sharma, Andruid Kerne, Zachary O. Toups, Ajit
Magia Transformo: Designing for Mixed Reality Transformative Play – Ke Jing, Natalie Nygaard, Theresa Jean Tanenbaum
Developing a Platform for Community-curated Mixed Reality Play Spaces – Joshua A. Fisher, Linying Shangguan, Joshua Scott Crisp
Guidelines on Successfully Porting Non-Immersive Games to Virtual Reality: A Case Study in Minecraft – John Porter III, Matthew Boyer, Andrew Robb
“The meaning of things lies not in the things themselves, but in our attitude towards them.”
Antoine de Saint-Exupery
The Meaning Making Machine (MMM) is the result of a 3-week collaboration between myself and my studio partner Logan Wilkinson.
The aim of this project was to create an object for collaborative sonic creation that was detached from the standardized concept of instruments.
Meaning Making Machine
Design Guidelines.
We had multiple conversations around creating soundscapes & how we could mold the creation process into a simpler one. Both of us having backgrounds in music production, we understood the need for a reduced barrier to entry.
Based on these discussions between us and with our peers, we were able to form the following basic design guidelines.
Creating music is often seen as a difficult task & thought to be out of reach.
Music making is a collaborative process & can be used as a means of connection between unknown people.
Ideation.
With my background in Industrial Design & Logan's in Psychology & Philosophy, we first began to discuss how we could design an entity that allows people to develop new connections using sound as a means of collaboration.
We specifically discussed what the experience should be for people who do not have a musical background or have no experience in music production.
Eventually we settled on creating an interpretation of a sequencer (the most basic of music production tools). The aim was to envision a sequencer-like device that would incorporate communal input in a shared space.
Mood board
Building on this mood board we aimed to infuse mysticism into our project, transforming our shared space from a mere work environment to a magical voyage. Contemplating occult and astrological elements like fortune teller booths and horoscope generators, we envisioned a unique atmosphere. Moving forward we leveraged our initial musical aspirations, drawing inspiration from the ECUAD Sound Lab’s array of synthesizers, anagrams, and imagery of moss-covered rock sculptures.
The MMM metaphorically utilizes ecosystems to depict collective meaning creation. In this context, each participant’s presence and their intuitive card selections influence the outcome. The interpretation of this outcome relies on the collective experience of the group.
The initial step in this process involves users selecting cards with words, which, when combined, form a poem whose interpretation can change based on the card sequence. Subsequently, these cards are slotted into one of six positions on the MMM’s body, with each slot representing a musical instrument. The order in which cards are placed in these slots determines which of the six instrument samples will be incorporated into the final song, played once all cards are inserted. This musical composition marks the second layer of meaning creation, influenced by both the group’s decisions regarding card placement and their interpretation of the resulting song. The words on the cards and the instrument samples hold inherent meanings, but their significance evolves and mutates through the dynamic interaction of the group.
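The two layers of meaning can be summarised in a small sketch. The card words & slot-to-instrument mapping below are hypothetical stand-ins, not the actual cards; the real MMM used six samples recorded in the Sound Lab:

```python
# Hypothetical card words & slot instruments, for illustration only.
CARD_WORDS = {"A": "river", "B": "ember", "C": "root",
              "D": "breath", "E": "stone", "F": "light"}
SLOT_INSTRUMENTS = ["pads", "arp", "bass", "drums", "guitar", "vocals"]

def read_machine(insertions):
    """insertions: list of (card_id, slot_index) in the order the group
    placed the cards. The card order yields the poem (first layer of
    meaning); the slot order yields the layered song (second layer)."""
    poem = " ".join(CARD_WORDS[card] for card, _ in insertions)
    song = [SLOT_INSTRUMENTS[slot] for _, slot in insertions]
    return poem, song

poem, song = read_machine([("C", 2), ("A", 0), ("F", 5)])
# poem: "root river light"; song: ["bass", "pads", "vocals"]
```

The same cards in a different order produce a different poem & a different song, which is exactly where the group's collective interpretation comes in.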
Prototyping.
We split the prototyping process into largely three parts:
Physical Form
Backend Development
The Soundscapes
Physical Form.
For the form, we had decided on a rock-like shape that would have the cards inserted into it in some way. I started off with some quick sketches to put down our collective thoughts.
Ideation sketches
From there I moved swiftly into making a 3D model so we could get an idea of the dimensions & proportions we would like. I used Shapr3D owing to its much faster workflow compared to traditional 3D design software.
Shapr3D model
We settled on having six cards, as that is the number of people remaining in our cohort when you exclude us two.
For the physical form we used blue foam, as it is a versatile & easy-to-use material for making organic shapes & requires little prior knowledge to work with. The blue foam pieces were glued together to form a block, which was then cut down & sanded into an orb / pebble-like shape.
Glued foam stack drying
Measuring for cutting
Bandsaw for initial cuts
Template for sanding into orb
Sanding process
Ready for primer
Once the initial form was made, I did a mock-up with some paper & tape to finalize the position & angle of the card slots. I then made the slits with a hot knife.
Card slot maquette
The next step was the finishing process, which Logan oversaw. This included priming & painting the orb & making the faux moss to give it an earthy texture.
Blending sponges with green dye
Primer + paint
Gluing the “moss”
1st “moss” layer
We also mixed in some actual dirt & leaves we foraged from outside the campus to give it a more natural look.
Closeup of the surface texture
The next step was to create the cards & the accompanying poem that would be, in a way, the user prompt. Logan used Procreate & Adobe Illustrator to create nature-inspired symbols rooted in this mystical vibe.
Each card also had a word on it that would allow for the creation of a poem based on the order the cards were inserted in.
Backend Development.
For the backend I decided on a simple card detection mechanism: a blue LED + LDR pair with a logic gate to detect when the beam was broken. I took the input via the analog pins of an Arduino Uno & routed it over serial into Processing, which was used to create the sequence of the audio that gets played.
All 6 LED modules
Single LED module
Arduino & Processing code
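The detection logic itself is simple enough to sketch. This is an illustrative Python rendition of the beam-break idea, not the actual Arduino / Processing code; the threshold & readings are made-up values. A card blocks the LED, so that slot's LDR reading drops:

```python
def detect_insertions(frames, threshold=300):
    """frames: successive lists of per-slot analog readings (0-1023, as
    from an Arduino ADC). Returns slot indices in the order each slot's
    beam was first broken (reading fell below the threshold)."""
    order, seen = [], set()
    for frame in frames:
        for slot, value in enumerate(frame):
            if value < threshold and slot not in seen:
                seen.add(slot)
                order.append(slot)
    return order

frames = [
    [900, 880, 910, 905, 895, 890],  # all beams clear
    [900, 120, 910, 905, 895, 890],  # card enters slot 1
    [900, 110, 910,  90, 895, 890],  # card enters slot 3
]
insertion_order = detect_insertions(frames)  # [1, 3]
```

That insertion order is what Processing then maps onto the instrument samples to build the final song.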
We chose blue LEDs as they added to the mystical & otherworldly ambiance & provided a stark contrast to the green, earthy, mossy form, blue being one of the rarest colors found in nature. The LEDs were positioned in the form such that they cast a blue glow inside the card slots.
LED placement
The Soundscapes.
All of the samples used to create the soundscapes were recorded by the two of us in the Sound Lab. We leveraged the sprawl of synthesizers at hand to create various samples, which we layered in Ableton & Audacity. We used the following equipment to record:
Prophet 5 – Atmospheric pads
Korg Minilogue – Arpeggio sequences
Moog Matriarch – Bass + Pads
Roland TR-8S – Drum pads
Elektron Digitakt – Guitar, Vocal chops
Arpeggio recording
Guitar sampling
Ableton session
The video below is a fun summation of this entire development process.
Project Demo.
We held the project demo in the Sound Lab as the space with its ambient lighting, surround sound audio setup & blinking instruments added to the mystical & mysterious vibe that we were aiming for.
The demo started with everyone picking a card & inserting it into the MMM in an order that the participants decided among themselves, based on the poem they had made using the words on the cards.
Cards ready for selection
MMM Demo
Reflections.
An interesting dialogue started around the MMM after the demo, in which it was viewed as a performance art piece. The mystery of the object & the cards lent it an almost ritualistic kind of interpretation, especially as the outcome was unknown. This prompted me to think of the possibility of the MMM being more of an interactive art piece as opposed to a tool or instrument for studio production or team building.
The flip side is that the unpredictability of the MMM makes it a very interesting tool for use during studio recording sessions. As it creates an entirely random output every time, one can use it as an ideation tool for writing melodies or for creating interesting fusion song structures.
Apart from technical difficulties, like an Arduino failure that prompted me to rewrite the code, I would also like the MMM to be completely random; due to the short duration of this project (2 weeks), various things were hard-coded to ensure that the demo progressed smoothly while still giving a similar end result.
Overall I was quite satisfied by what Logan & I created. I especially enjoyed deconstructing the music writing process to understand, from a psychological aspect, why people make the decisions they do & how we can use that understanding to create something that gives a large effect with very little cause.
“The Journey of a thousand miles begins with one step”
Lao Tzu
Jo described her journey of moving across the world from India to Canada as one of immense emotion, hesitation, uncertainty & an intense wondering of what’s to come.
As Jo & I were in different time zones & countries at the time of this project, I was only able to talk to her via Zoom. This got me thinking: what would be a meaningful way to give someone a gift that has no physical existence, yet still embodies & manages to invoke the same kind of excitement & pleasure as a more concrete gift would?
So I set out to create her gift in perhaps the most universal language of them all: Music. I jumped on some more Zoom sessions with Jo & asked her various questions to better gauge & understand her feelings & her personality.
Track composition.
Using that, I started to write an airy, atmospheric & soulful track aptly called “Portal”. I gave it that name because I aimed to capture her feelings leading up to the moment she travelled to Canada, the moment itself (the portal) & the uncertainties that lie ahead. Jo had likened her journey to that of a plant: how it grows, is nurtured & eventually becomes a large tree.
The track was produced, mixed & mastered in FL Studio 21.
The Artwork.
For the artwork I chose a glowing frame as the shape of the portal itself & set it inside a mystical landscape with mountains, green pasture & subtle amounts of fog, the idea being that the portal stands out as an almost mirage-like entity in the landscape. The portal signifies the ‘leap of faith’ of moving abroad.
Inside the portal you can just barely see a distorted image of what looks like a tree, which I chose from the plant analogy that Jo used to describe her journey.
This artwork was drawn in Procreate.
The Track.
So sit back, put a pair of headphones on, join us on a journey of growth & have a think: what place does this track take you to?
Reflections.
I found this to be a really emotion-evoking project for myself, as Jo & I both come from a similar cultural background & both our stories of arriving here have significant overlap. I enjoyed listening to the track in almost the same vein as Jo did.
I was glad that in her eyes I was able to capture & almost describe the journey using a sonic landscape, as it was quite difficult to frame using words alone.
To make it even more soulful, I would have loved to add more orchestral elements & structure it more like an orchestrated composition, as opposed to the ‘4/4 electronic beat’ it is right now.
But overall I am happy that she liked it & it also served as an almost meditative experience for the rest of the cohort as they listened to it.