“UNMUTE” is an interactive VR experience designed to retrain attention and perception, using sound as an entry point into a larger exploration of sensory awareness. It began as an inquiry into disappearing neighborhoods—how physical spaces and communities fade not just through urban change, but through our own disengagement from our surroundings. Now, it has evolved into an experiment in attunement—a process of noticing, sensing, and reconnecting with the world.
This project unfolds in four stages, guiding players from passive recognition to active exploration. Beginning with focused listening, players progress through sound-matching challenges, experience sonic illusions, and eventually take on the role of sound designers, reconstructing meaning through a dubbing mini-game. The final phase moves beyond structure, offering an open-world soundscape where players freely explore and apply their newfound awareness.
This experience is not just about sound—it is about perception itself. By blending VR’s immersive potential with the psychology of attunement, it invites players to break habitual disengagement and rediscover curiosity. When they remove the headset, they may find themselves more aware—not just of what they hear, but of what they have been missing all along.
Intro:
Resonance encourages people to interact with their community in an AR world. Players can move cubes to desired locations, build structures in their own style, and then pass through a portal into another scene, where the world reflects the changes they made in the AR scene.
Other players watch the active player's perspective on a monitor beside them and can suggest how they would move or change the cubes, which promotes teamwork. As a result, the future each group builds is different: everyone is encouraged to discover their own small or large interactions with the community, each of which shapes the future in its own way.
Goal:
Foster curiosity about surroundings
Highlight the effects of individual actions on the environment
Strengthen the connection between people and their communities
AR/VR Platform Conversion Demo:
Goal:
A smooth transition between the AR scene and the VR scene: while the player moves items in the AR scene, the VR scene needs to generate corresponding items.
Challenge:
We needed to record cube positions in the AR scene and synchronize them to the VR scene. I first used SharedState to let the system track objects across the two scenes, but the test site was outdoors with poor network connectivity, and SharedState caused the app to freeze. In the end, I recorded the number of cube movements in the AR scene and used that count to activate prefabs already laid out in the VR scene, simulating a synchronized display.
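The count-based fallback can be sketched as follows. This is an illustrative Python stand-in for the Unity logic, not the project's actual code; the class, method, and prefab names are all hypothetical.

```python
# Sketch of the count-based sync fallback: instead of streaming cube
# positions over the network, count AR moves and reveal that many
# pre-placed VR prefabs. Works entirely offline.

class VRSceneSync:
    def __init__(self, prelaid_prefabs):
        # Prefabs already laid out in the VR scene, initially hidden.
        self.prefabs = [{"id": p, "active": False} for p in prelaid_prefabs]
        self.move_count = 0

    def record_ar_move(self):
        # Called each time the player moves a cube in the AR scene.
        self.move_count += 1

    def apply_to_vr(self):
        # Reveal one pre-laid prefab per recorded move; no live
        # position sync is needed, so poor network is not a problem.
        for prefab in self.prefabs[: self.move_count]:
            prefab["active"] = True
        return [p["id"] for p in self.prefabs if p["active"]]

sync = VRSceneSync(["tree", "lamp", "bench", "tower"])
sync.record_ar_move()
sync.record_ar_move()
print(sync.apply_to_vr())  # → ['tree', 'lamp']
```

The trade-off is fidelity: the VR scene only mirrors *how much* was built in AR, not *where*, which is why it works as "a simulation of the display" rather than true synchronization.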
For richness, the VR prefabs also need to be generated randomly.
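Random selection keeps repeated playthroughs from looking identical. A minimal sketch, again in Python with illustrative variant names (the real project would instantiate Unity prefabs):

```python
import random

# Sketch: pick a random prefab variant each time a slot is revealed in
# the VR scene, so different groups see different worlds.

PREFAB_VARIANTS = ["house_a", "house_b", "garden", "fountain"]

def spawn_random_prefab(rng=random):
    """Return the variant to instantiate for the next revealed slot."""
    return rng.choice(PREFAB_VARIANTS)

print(spawn_random_prefab())  # one of the four variants, chosen at random
```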
Cubes in the AR scene need to be able to regenerate (take one away, and a new one is generated).
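One way to express "take one away, generate one again" is a spawner that refills itself on every take, so the spawn point is never empty. A hypothetical sketch (names assumed, not from the project):

```python
class CubeSpawner:
    """Keeps exactly `capacity` cubes available: whenever a player
    takes one, a replacement is spawned immediately."""

    def __init__(self, capacity):
        self.next_id = capacity
        self.available = [f"cube_{i}" for i in range(capacity)]

    def take(self):
        cube = self.available.pop(0)
        # Regenerate: spawn a fresh cube to replace the taken one,
        # so the number of available cubes stays constant.
        self.available.append(f"cube_{self.next_id}")
        self.next_id += 1
        return cube

spawner = CubeSpawner(3)
spawner.take()
print(len(spawner.available))  # → 3 (the pool refilled itself)
```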
Memory problem: the Quest can't record too much data without causing lag.
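A common fix for this kind of memory pressure is to cap the interaction log with a fixed-size ring buffer, so memory use stays constant no matter how long the session runs. A sketch under that assumption (the cap value is an assumed tuning knob, not from the project):

```python
from collections import deque

# Sketch: bound the move log with a ring buffer. Once full, the oldest
# entries are dropped automatically, keeping memory use constant.

MAX_RECORDS = 200  # assumed cap; would be tuned for the Quest's budget

move_log = deque(maxlen=MAX_RECORDS)

def record_move(position):
    # deque with maxlen evicts the oldest entry when the cap is hit.
    move_log.append(position)

for i in range(500):
    record_move((i, 0, 0))
print(len(move_log))  # → 200
```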