Kyan Rowse

Freelance Work

Around week 3 of this trimester, I started work on a VR game called The Causeway as a Technical Sound Designer.

The project was made for Liminal VR to use on their VR platform. Liminal VR creates calming VR experiences for Oculus headsets such as the Oculus Go.

As the Technical Sound Designer, my role was to use Unity to implement all the sound and music. I worked alongside my classmate Nyle Gibbens, who did the sound design, and another SAE student named Will, who did the music.

I had done Technical Sound Design on a couple of games before this. For those projects I used an audio middleware tool called FMOD, which makes it easier to create dynamic audio without knowing much code.

Summary of the Game

The game is set on an island, where some pillars respond to music. While you are listening to the music, a giant wave slowly approaches and will eventually destroy the island.

I had some initial meetings with the developers, and they wanted the pillars to react to the music. I suggested a system that would make the pillars sensitive to certain frequencies.

Concept Video

I had watched a tutorial series on audio visualization by a YouTuber called Audiopeer, but if we went with Audiopeer's visualization setup, I wouldn't be able to use FMOD, as that setup would not work with it.

I decided not to use FMOD and instead did the audio implementation with Unity's built-in audio engine. This meant more coding, but I already had some experience from my previous projects, and over the Christmas holidays I had progressed quite far into an online C# course. The project would be a good challenge and a way to improve my coding skills.

It didn't take long to get the audio visualization script working in Unity. I ran some tests with the pillars and sent the results to the developers for feedback, and they were happy with them. I then improved the system, making the pillars bounce more smoothly, and wrote some documentation to help the developers use the audio visualization system.
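The core idea behind that kind of visualization can be sketched as below: sample the spectrum of the playing music each frame and scale a pillar by the average amplitude of one frequency band. This is a minimal illustration only; the component, field names, and band ranges are my assumptions, not the project's actual script.

```csharp
using UnityEngine;

// Hypothetical sketch of a music-reactive pillar: read the current
// frequency spectrum from the AudioSource and scale the pillar's
// height by the average amplitude of one band of spectrum bins.
[RequireComponent(typeof(AudioSource))]
public class PillarVisualizer : MonoBehaviour
{
    public Transform pillar;          // the pillar this script drives
    public int bandStart = 8;         // first spectrum bin of this pillar's band
    public int bandEnd = 16;          // last spectrum bin of this pillar's band
    public float heightScale = 50f;   // amplitude-to-height multiplier
    public float smoothing = 8f;      // higher = snappier response

    private AudioSource source;
    private readonly float[] spectrum = new float[512];
    private float currentHeight = 1f;

    void Start()
    {
        source = GetComponent<AudioSource>();
    }

    void Update()
    {
        // Fill the buffer with the current frequency spectrum.
        source.GetSpectrumData(spectrum, 0, FFTWindow.BlackmanHarris);

        // Average the bins in this pillar's band.
        float sum = 0f;
        for (int i = bandStart; i <= bandEnd; i++)
            sum += spectrum[i];
        float target = 1f + (sum / (bandEnd - bandStart + 1)) * heightScale;

        // Lerp toward the target so the pillar bounces smoothly
        // instead of snapping every frame.
        currentHeight = Mathf.Lerp(currentHeight, target, smoothing * Time.deltaTime);
        Vector3 s = pillar.localScale;
        pillar.localScale = new Vector3(s.x, currentHeight, s.z);
    }
}
```

Smoothing with a lerp rather than applying the raw spectrum value directly is what keeps the motion from looking jittery, since spectrum amplitudes fluctuate heavily from frame to frame.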

By around week 6, I started to receive audio drafts from the sound designer and composer. Before I put the sounds in the game, I decided to research sound design for VR games, and I found some useful information in a paper titled:

"Introduction to sound design for virtual reality games: a look into 3D sound, spatializer plugins and their implementation in Unity game engine"

(Nuora, 2018)

and also in documentation on the Oculus website

(“Introduction to Audio in Virtual Reality,” 2014).

After doing the research, I started implementing the ambience and music into the game. For all the ambience and music I used 3D sounds with Oculus attenuation, as my reading suggested this is the best approach for VR audio: it creates a more immersive experience because sounds seem to come from an actual source in the world.
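Setting up a 3D ambience source along those lines might look like the following. This is an illustrative sketch, assuming the Oculus Spatializer is selected as the project's spatializer plugin in Unity's audio settings; the distance values are placeholder examples.

```csharp
using UnityEngine;

// Illustrative setup for a spatialized 3D ambience source. Assumes the
// Oculus Spatializer plugin is enabled in Project Settings > Audio.
// All numeric values here are example placeholders.
public class AmbienceSource3D : MonoBehaviour
{
    public AudioClip ambienceClip;

    void Start()
    {
        AudioSource src = gameObject.AddComponent<AudioSource>();
        src.clip = ambienceClip;
        src.loop = true;
        src.spatialBlend = 1f;   // fully 3D rather than 2D
        src.spatialize = true;   // route through the spatializer plugin
        src.rolloffMode = AudioRolloffMode.Logarithmic;
        src.minDistance = 1f;    // full volume within this radius
        src.maxDistance = 40f;   // hypothetical attenuation range
        src.Play();
    }
}
```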

By week 11, I was making final adjustments to the audio. I used the Unity Audio Mixer to balance the ambience sounds, and I used automation to bring the ambience up as the giant wave got closer and then fade it out as the game ended. To do this, I attached Audio Mixer snapshots to trigger boxes that fire when the player passes through them.
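A trigger box driving a snapshot transition can be as small as the sketch below. The snapshot reference and the "Player" tag are assumptions for illustration; the actual snapshot names and fade times in the project may differ.

```csharp
using UnityEngine;
using UnityEngine.Audio;

// Hypothetical trigger box: when the player enters, transition the
// Audio Mixer to a target snapshot (e.g. one with the ambience raised
// as the wave approaches, or faded out for the ending).
[RequireComponent(typeof(Collider))]
public class SnapshotTrigger : MonoBehaviour
{
    public AudioMixerSnapshot targetSnapshot;  // set in the Inspector
    public float transitionTime = 3f;          // fade length in seconds

    void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("Player"))
            targetSnapshot.TransitionTo(transitionTime);
    }
}
```

Using snapshots this way keeps the mix changes in the mixer asset itself, so levels can be tweaked without touching code.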

Final Product

The project can be found on the website, where you can download a copy of the game and try it out:


Overall, I had loads of fun working on this project. Before it, I had mainly worked on 2D games, so it was good to finally work on a 3D game; 3D audio is just a lot of fun to work with.

The game developers said they enjoyed working with me and will definitely try to work with me again.

I am very happy with the final product, but I really want to try it out on a VR headset, which hasn't been possible due to the COVID-19 crisis. As soon as I can get hold of a headset, I will give it a go.

