1st Place Winners - Computershare Awards, 2022
- Rob Crawford
- Nov 4, 2022
- 3 min read
Updated: Oct 3, 2024

Introduction
The Module
Within Edinburgh Napier University, I participated in a module called Group Project. All students in the School of Computing took part, regardless of their major. We had to assemble our own teams, mixing students from different majors, and bid for a real-life client and their project. This allowed each of us to act as a specialist in our respective field while working within a team, balancing our tasks to meet the deadline.
The Team
After the bidding, I was onboarded into a group of six people, including myself. Our members were very diverse, coming from programming and 3D modelling degrees, with myself being the only team member from an audio background, excitedly taking responsibility for the full sonic experience.
The Client and Project
The project we bid for and won came from Digitalnauts, a VR application development company based in Glasgow, UK. The brief was to create a VR fire safety training application, functional enough to teach workers fire safety in a simulated environment.
Development
Preparation
We started by setting up the game engine Unity and the audio middleware Wwise. Once they were integrated, a repository for the project was created on Plastic SCM, a form of source control, which allowed us to work remotely on our own machines. The headset of choice was the Meta Quest 2, due to its wireless capability allowing it to be used in practically any location, such as the workplace. We also used Jira, an issue-tracking tool, to set tasks and maintain a timeline, keeping each sprint on track to meet the deadlines.
We created a Discord server as our main point of communication, holding weekly meetings to discuss tasks and problems. To ensure everything was properly optimised for the Quest 2 headset, we didn't use the Unity Asset Store; instead, everything, from the 3D models to the code and sound, was made from scratch.
Sound Creation
When creating the assets, wavetable synthesis and field recording were used. Research went into understanding what frequencies are emitted by fire alarms and a range of fire extinguishers in order to replicate them. The extinguisher types were water, foam, powder, and CO2, with video references of their real sounds used as guidance. Manipulating white noise and other field-recorded source material helped imitate these extinguisher types, so only one real fire extinguisher spray needed to be recorded. I kept all the audio at 44.1 kHz, 16-bit to save as much storage as possible, since the storage on the Quest 2 cannot be replaced or upgraded. I also recorded and created the Foley, ambience, and music.
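To put the 44.1 kHz / 16-bit choice in perspective, here is a rough back-of-the-envelope sketch of how sample rate and bit depth drive uncompressed PCM size. The numbers are illustrative only, not the project's actual asset sizes, and the helper function is my own:

```python
# Rough storage math for uncompressed mono PCM audio.
# Illustrative figures, not the project's real asset sizes.

def pcm_bytes(seconds, sample_rate, bit_depth, channels=1):
    """Size in bytes of raw PCM audio (no container overhead)."""
    return int(seconds * sample_rate * (bit_depth // 8) * channels)

# One minute of mono audio at the shipped format vs a heavier format:
one_minute_44k16 = pcm_bytes(60, 44_100, 16)  # 5,292,000 bytes (~5.0 MiB)
one_minute_48k24 = pcm_bytes(60, 48_000, 24)  # 8,640,000 bytes (~8.2 MiB)

print(one_minute_44k16, one_minute_48k24)
```

Over dozens of dialogue lines, loops, and Foley files, that gap adds up quickly on a headset with fixed storage.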
I came up with the idea of having characters in the application to guide the player while also building immersion. I helped write a script with the project leader, as well as a character sheet. After this, the remaining sound assets required professional voice talent. I created an ad and hosted auditions, checking in with the team and project leader for their opinions on the voices until we hired the preferred performers and ultimately obtained their dialogue files.
Sound Implementation
The sounds were implemented in Wwise using containers and events. Any sound that would be localised in the environment was kept mono to ensure it could be spatialised in 3D. An event was set up for each sound asset, with the spray event having multiple layers, such as a start and a stop around a looping spray sound. The loop was achieved by cutting the spray sound in half within my DAW (Digital Audio Workstation), moving the cut end to the other side of the file, and crossfading the two halves together. When looped in Wwise, the asset plays continuously until the player releases the Quest 2 controller's trigger. RTPCs (Real-Time Parameter Controls) were also used to adjust a low-pass filter and volume to imitate distance, while some sound assets only needed a static low-pass filter to emulate sound occlusion. The dialogue was set up in a similar way, triggered by the sequencer in Unity.
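The cut-and-crossfade loop trick can be sketched in a few lines. This is a minimal illustration assuming a mono float sample array; the function name and linear fade shape are my own, not the project's actual code:

```python
import numpy as np

def make_seamless_loop(samples: np.ndarray, fade_len: int) -> np.ndarray:
    """Swap the halves of a sustained sound so its new start/end boundary
    is the original middle (where adjacent samples already match), then
    crossfade the rough join that now sits in the middle of the file."""
    half = len(samples) // 2
    head, tail = samples[:half], samples[half:]
    # The new loop point (file start/end) is the original middle: seamless.
    # The original end now meets the original start mid-file, so crossfade
    # tail's last fade_len samples into head's first fade_len samples.
    fade_in = np.linspace(0.0, 1.0, fade_len)
    mixed = tail[-fade_len:] * (1.0 - fade_in) + head[:fade_len] * fade_in
    return np.concatenate([tail[:-fade_len], mixed, head[fade_len:]])
```

Because the file's first and last samples come from adjacent points in the original recording, the loop point is inaudible when Wwise repeats the asset.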
Conclusion
In conclusion, we finished the project on time and were able to present it. After the presentation, our project was shortlisted as one of the top projects, and along with 10 other groups, we went to the Computershare office in Edinburgh for one more presentation in front of professional judges. After all the groups had presented, the judges voted, and our group won first place.