Individual Contributions
Designing Gameplay
Led brainstorming sessions for the application's gameplay.
Voice Recognition
Used Wit.ai and the Meta Voice SDK to trigger the out-breath animation. Ultimately this approach failed for breath detection, but I reused the voice-recognition feature for the application menus.
Breathing Animator Cycle
The gameplay loop of the breathing game requires a range of triggered animations, sounds, and effects. To manage this I used a large animation state machine, pictured below.
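The essence of such a state machine can be sketched as a transition table. This is a minimal illustration only; the state and trigger names here are hypothetical, and the real version is a Unity Animator controller with animation, sound, and effect hooks attached to each state.

```python
# Hypothetical breathing-cycle states and triggers; the actual
# project uses a Unity Animator state machine.
TRANSITIONS = {
    ("Idle", "start"): "Inhale",
    ("Inhale", "lungs_full"): "Hold",
    ("Hold", "timeout"): "Exhale",
    ("Exhale", "lungs_empty"): "Idle",
}

def step(state, trigger):
    """Return the next state, or stay in place if the trigger doesn't apply."""
    return TRANSITIONS.get((state, trigger), state)

state = "Idle"
for trigger in ["start", "lungs_full", "timeout", "lungs_empty"]:
    state = step(state, trigger)
print(state)  # back to "Idle" after one full breath cycle
```

Keeping transitions in one table (or one Animator graph) makes it easy to attach a sound or particle effect to each state change without scattering the logic across scripts.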
Hardware and Audio
Finding hardware compatible with all of the application's audio features proved difficult, as the Quest's audio data is encrypted. The solution was to use a Link cable and an external microphone. However, each microphone then had to be calibrated individually.
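Per-microphone calibration can be reduced to a gain factor measured once per device. The sketch below is an assumption about how such a calibration step could work, not the project's actual code: the user performs a reference breath, and raw levels are scaled so that the reference lands at the same target on any microphone.

```python
# Hypothetical per-microphone calibration: scale raw input levels so a
# reference "normal breath" reads the same on every device.
def make_calibrator(measured_reference, target=0.5):
    """measured_reference: average raw level recorded during a
    calibration breath on this specific microphone."""
    gain = target / measured_reference
    return lambda raw: min(raw * gain, 1.0)  # clamp to the 0..1 range

# A quiet microphone (reference reads 0.25) gets a 2x gain.
calibrate = make_calibrator(measured_reference=0.25)
print(calibrate(0.25))  # -> 0.5, the common target level
```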
Audio Volume Detection
Used the Meta Voice SDK's volume-threshold callback to detect the "WHOOSH" sound. This proved an effective method for detecting the voice, but it was sensitive to calibration and required a smoothing function to keep the detected volume consistent.
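One common smoothing function for this kind of noisy per-frame level is an exponential moving average. The sketch below assumes a callback firing each frame with a raw 0..1 level and an illustrative threshold; the class and parameter names are hypothetical, not the SDK's.

```python
# Exponential-moving-average smoothing of a noisy volume signal,
# then a threshold test for the "WHOOSH". Names and values are
# illustrative, not from the Meta Voice SDK.
class SmoothedVolume:
    def __init__(self, alpha=0.2, threshold=0.6):
        self.alpha = alpha          # lower alpha = heavier smoothing
        self.threshold = threshold  # calibrated "WHOOSH" level
        self.level = 0.0

    def on_volume(self, raw):
        # Blend the new sample into the running level.
        self.level += self.alpha * (raw - self.level)
        return self.level >= self.threshold  # True -> trigger whoosh

sv = SmoothedVolume()
print(sv.on_volume(1.0))  # a single loud spike is smoothed away -> False
```

Only a sustained loud breath pushes the smoothed level over the threshold, which is what suppresses one-frame microphone spikes.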
Pitch Detection
Used an external script to detect the pitch of the user's humming: https://github.com/nakakq/AudioPitchEstimatorForUnity. This triggered a callback that updated the scene lighting and the firefly count.
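The callback boils down to mapping an estimated frequency onto lighting intensity and a firefly count. The sketch below shows one such mapping; the frequency range, counts, and function names are illustrative assumptions, not the tuned values from the project.

```python
# Hypothetical mapping from estimated hum pitch (Hz) to scene values.
LOW_HZ, HIGH_HZ = 80.0, 400.0  # assumed expected humming range

def on_pitch(freq_hz, max_fireflies=30):
    # Normalise the pitch into 0..1, clamped to the expected range.
    t = max(0.0, min(1.0, (freq_hz - LOW_HZ) / (HIGH_HZ - LOW_HZ)))
    light_intensity = 0.2 + 0.8 * t        # keep the scene never fully dark
    firefly_count = round(t * max_fireflies)
    return light_intensity, firefly_count

intensity, count = on_pitch(240.0)  # a mid-range hum
print(intensity, count)
```

Clamping before mapping matters here, because pitch estimators occasionally return outliers (octave errors, silence) that would otherwise flash the lighting.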
Firefly Hand Interaction
Added a firefly particle system that dances around the user's hands in the humming scene.
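The "dance" effect can be illustrated as fireflies orbiting the tracked hand position, each with its own phase. This is a sketch of the general idea under assumed parameters; the real effect is a Unity particle system, not per-fly position code.

```python
import math

# Illustrative firefly path: orbit the tracked hand position with a
# per-fly phase offset and a gentle vertical bob. All numbers are
# hypothetical tuning values.
def firefly_pos(hand, t, index, radius=0.15, speed=2.0):
    phase = index * 2.4                       # spread the flies apart
    angle = speed * t + phase
    x = hand[0] + radius * math.cos(angle)
    y = hand[1] + 0.05 * math.sin(3 * angle)  # vertical bob
    z = hand[2] + radius * math.sin(angle)
    return (x, y, z)

# Fly 0 at time 0 sits on the orbit circle beside the hand.
print(firefly_pos((0.0, 1.2, 0.4), t=0.0, index=0))  # -> (0.15, 1.2, 0.4)
```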
Hand and Mouth Particles
In the breathing scene, when the user breathes out, particles are emitted from their mouth and hands. The emission rate is correlated with the breath volume.
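Correlating emission with volume amounts to mapping the detected level onto a particle emission rate. The sketch below assumes a noise floor below which nothing is emitted and a linear ramp above it; the numbers and names are illustrative, not the project's tuned values.

```python
# Hypothetical mapping from (smoothed) breath volume in 0..1 to a
# particle emission rate, with a noise floor so idle mic hiss emits
# nothing.
def emission_rate(volume, floor=0.1, max_rate=200.0):
    if volume <= floor:
        return 0.0
    t = (volume - floor) / (1.0 - floor)   # 0..1 above the floor
    return min(t, 1.0) * max_rate

print(emission_rate(0.05))  # below the floor -> 0.0
print(emission_rate(0.55))  # half-strength breath -> moderate rate
```

Driving the rate from the smoothed volume rather than the raw signal keeps the particle stream from flickering on noisy frames.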