
Individual Contributions

Designing Gameplay

Led brainstorming sessions for the application's gameplay.


Voice Recognition

Used Wit.ai and the Meta Voice SDK to trigger the out-breath animation. Ultimately this approach failed for breath detection, but I reused the feature to drive the application's menus.

Breathing Animator Cycle

The gameplay loop of the breathing game requires a range of triggered animations, sounds and effects. To manage this, I used a large animation state machine, pictured below.
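The loop can be sketched as a small state machine. This is a language-agnostic Python sketch; the actual project uses a Unity Animator Controller, and the state and trigger names here are illustrative, not the project's real ones.

```python
# Minimal sketch of the breathing gameplay loop as a state machine.
# States and triggers are illustrative; the real animator has many
# more states, plus the sounds and effects fired on each transition.

TRANSITIONS = {
    ("Idle", "start"): "BreatheIn",
    ("BreatheIn", "inhale_done"): "Hold",
    ("Hold", "hold_done"): "BreatheOut",
    ("BreatheOut", "whoosh_detected"): "Reward",  # volume callback fires
    ("Reward", "reward_done"): "BreatheIn",       # loop back for next cycle
}

class BreathingCycle:
    def __init__(self):
        self.state = "Idle"

    def trigger(self, event):
        # Unknown (state, event) pairs leave the state unchanged,
        # mirroring how an animator ignores irrelevant triggers.
        nxt = TRANSITIONS.get((self.state, event))
        if nxt is not None:
            self.state = nxt  # also play the matching animation here
        return self.state
```

Keeping all transitions in one table is the same design choice as the animator graph: every legal path through the cycle is visible in a single place.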

[Screenshot: breathing animator state machine]

Hardware and Audio

Finding hardware compatible with all of the application's audio features proved difficult, as the Quest's audio data is encrypted. The solution was to use a Link cable with an external microphone. However, each microphone had to be calibrated individually.


Audio Volume Detection

Used the Meta Voice SDK's volume threshold callback to detect the "WHOOSH" sound of an out-breath. This proved an effective detection method but was sensitive to calibration, and required a smoothing function to keep the detected volume consistent.
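The smoothing-plus-threshold idea can be sketched as below. This is an illustrative Python sketch, not the project's C# code: the exponential moving average damps single-frame spikes so that only a sustained loud out-breath crosses the threshold, and the `alpha` and `threshold` values stand in for the per-microphone calibration.

```python
# Sketch of smoothed volume thresholding for "WHOOSH" detection.
# alpha and threshold are illustrative calibration values that would
# be tuned per microphone.

class WhooshDetector:
    def __init__(self, threshold=0.6, alpha=0.2):
        self.threshold = threshold  # smoothed volume needed to trigger
        self.alpha = alpha          # smoothing factor (higher = snappier)
        self.smoothed = 0.0

    def update(self, raw_volume):
        # Exponential moving average keeps the reading consistent
        # frame to frame, rejecting brief spikes of noise.
        self.smoothed = (self.alpha * raw_volume
                         + (1 - self.alpha) * self.smoothed)
        return self.smoothed >= self.threshold
```

Called once per audio frame, `update` only returns `True` after the volume has stayed high for several frames, which is why a lone pop or click does not trigger the animation.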


Pitch Detection

Used an external script to detect the pitch of the user's humming: https://github.com/nakakq/AudioPitchEstimatorForUnity. The estimated pitch triggered a callback that updated the scene lighting and firefly count.
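The mapping inside that callback can be sketched as below. This is an illustrative Python sketch: the pitch estimator yields a fundamental frequency in Hz, and the vocal range, intensity scale, and firefly cap here are assumed calibration values, not the project's actual numbers.

```python
def pitch_to_scene_params(frequency_hz, low=80.0, high=400.0,
                          max_fireflies=50):
    # Clamp the estimated humming pitch into an assumed vocal range,
    # then map it linearly to a light intensity in 0..1 and a firefly
    # count. All range and count values are illustrative.
    clamped = min(max(frequency_hz, low), high)
    t = (clamped - low) / (high - low)
    light_intensity = t
    firefly_count = round(t * max_fireflies)
    return light_intensity, firefly_count
```

Clamping first means a mis-estimated frame (e.g. an octave error from the estimator) cannot push the lighting outside its valid range.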


Firefly Hand Interaction

Added a firefly particle system that dances around the user's hands in the humming scene.

Hand and Mouth Particles

In the breathing scene, when the user breathes out, particles are emitted from their mouth and hands. The emission rate is correlated with the breath volume.
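The volume-to-emission correlation can be sketched as a simple clamped linear mapping. This is an illustrative Python sketch of the logic; in Unity this value would be fed to the particle system's emission rate each frame, and `max_rate` is an assumed tuning value.

```python
def emission_rate(volume, max_rate=200.0):
    # Particles per second scale linearly with the detected breath
    # volume, assumed normalised to 0..1. Clamping keeps a noisy or
    # miscalibrated reading from producing absurd emission rates.
    v = min(max(volume, 0.0), 1.0)
    return v * max_rate
```

A louder, fuller out-breath therefore produces a visibly denser stream of particles, which is the feedback the breathing exercise relies on.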
