Summary
Last week, we started development of XRcise. Our first focus was the jumping jack motion. For our MVP, we decided to treat a jumping jack as any motion that meets these two criteria:
- Raising both controllers above the user’s headset
- Jumping
We researched various articles, videos, and previous teams’ projects, and eventually figured out how to access the fields of the MRTK input system. Using that, we built a script that detects when the user raises their hands above their head. However, when we tested it on the Oculus headset, we realized that our script couldn’t detect the controllers and only worked with hand tracking. We are all currently working to fix this issue and aim to resolve it by the midterm presentation. Additionally, we plan to integrate the second criterion (the jump itself), add an environment to our scene, and add a basic level of gamification.
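For context, here is a minimal sketch of the hands-above-head check, assuming MRTK 2.x. This is illustrative rather than our actual JumpingJackDetector, but it relies on the same hand-joint API, which is consistent with the controller problem: `HandJointUtils` only returns poses for articulated hands.

```csharp
using Microsoft.MixedReality.Toolkit.Input;
using Microsoft.MixedReality.Toolkit.Utilities;
using UnityEngine;

// Illustrative sketch (not the actual JumpingJackDetector): checks each frame
// whether both palms are above the headset, using MRTK hand joint poses.
public class HandsAboveHeadSketch : MonoBehaviour
{
    void Update()
    {
        // HandJointUtils only returns poses for articulated hands, which is
        // consistent with the behavior we saw: this path fails for controllers.
        bool leftUp = HandJointUtils.TryGetJointPose(
            TrackedHandJoint.Palm, Handedness.Left, out MixedRealityPose left)
            && left.Position.y > CameraCache.Main.transform.position.y;
        bool rightUp = HandJointUtils.TryGetJointPose(
            TrackedHandJoint.Palm, Handedness.Right, out MixedRealityPose right)
            && right.Position.y > CameraCache.Main.transform.position.y;

        if (leftUp && rightUp)
        {
            Debug.Log("Both hands raised above the headset.");
        }
    }
}
```

Comparing against the camera’s Y position means the threshold follows the headset, so the check works regardless of the user’s height.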
What each team member worked on
All:
- Researched various ways of implementing the jumping jack motion.
- Experimented with different methods for detecting the jumping jack motion, such as using the motion controllers or the built-in MRTK gestures.
- Debugged the jumping jack motion detection script; one useful debugging step is logging the input sources MRTK detects, as sketched after this list.
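A minimal sketch of that kind of source logging, assuming MRTK 2.x (the class name is illustrative, and this is a debugging aid rather than our actual detection script):

```csharp
using Microsoft.MixedReality.Toolkit;
using Microsoft.MixedReality.Toolkit.Input;
using UnityEngine;

// Debugging sketch: logs every input source MRTK detects, which helps
// confirm whether controllers are surfaced by the input system at all.
public class InputSourceLogger : MonoBehaviour, IMixedRealitySourceStateHandler
{
    void OnEnable()
    {
        // Register for global source-detected/lost events.
        CoreServices.InputSystem?.RegisterHandler<IMixedRealitySourceStateHandler>(this);
    }

    void OnDisable()
    {
        CoreServices.InputSystem?.UnregisterHandler<IMixedRealitySourceStateHandler>(this);
    }

    public void OnSourceDetected(SourceStateEventData eventData)
    {
        Debug.Log($"Source detected: {eventData.InputSource.SourceName}");
    }

    public void OnSourceLost(SourceStateEventData eventData)
    {
        Debug.Log($"Source lost: {eventData.InputSource.SourceName}");
    }
}
```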
May:
- Continued researching technical challenges of motion tracking.
- Looked through previous teams’ projects, specifically 22wi-team7 (Extreme Red Light Green Light), and tested their motion tracking scripts.
- Experimented with alternative approaches to tracking the jumping jack motion based on their code.
Michelle:
- Experimented with alternative approaches to tracking the jumping jack motion, such as using OVR to get the controller inputs (sketched below).
- Helped team members configure build settings and resolve compiler-related issues.
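A minimal sketch of that OVR-based approach, assuming the Oculus Integration package is installed; the `trackingSpace` field is an assumption and would be wired to the OVRCameraRig’s tracking-space transform in the Inspector:

```csharp
using UnityEngine;

// Sketch of the OVR-based alternative (assumes the Oculus Integration package).
// OVRInput returns positions in tracking space, so they must be transformed
// into world space via the camera rig's tracking-space transform.
public class OvrControllerPositions : MonoBehaviour
{
    // Assumed to be assigned in the Inspector (e.g., the OVRCameraRig's TrackingSpace).
    [SerializeField] private Transform trackingSpace;

    void Update()
    {
        Vector3 leftLocal = OVRInput.GetLocalControllerPosition(OVRInput.Controller.LTouch);
        Vector3 rightLocal = OVRInput.GetLocalControllerPosition(OVRInput.Controller.RTouch);

        Vector3 leftWorld = trackingSpace.TransformPoint(leftLocal);
        Vector3 rightWorld = trackingSpace.TransformPoint(rightLocal);

        Debug.Log($"Left controller: {leftWorld}, Right controller: {rightWorld}");
    }
}
```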
Abas:
- Wrote Snippet Week 4.
- Built on Aaron’s script and wrote another script that detects when the user raises their hands above their head (currently doesn’t work with controllers).
- Implemented sound effect feedback to notify the user when they raise their hands above their head; the trigger pattern is sketched below.
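The feedback is an edge trigger: the clip should play once when the hands-above-head condition becomes true, not on every frame it stays true. A minimal sketch under that assumption; `BothHandsAboveHead`, `raiseClip`, and the class name are placeholders rather than our actual code:

```csharp
using UnityEngine;

// Sketch of the audio feedback: play a clip once when the hands-above-head
// condition becomes true, rather than every frame it remains true.
[RequireComponent(typeof(AudioSource))]
public class RaiseFeedbackSketch : MonoBehaviour
{
    [SerializeField] private AudioClip raiseClip; // assigned in the Inspector
    private AudioSource audioSource;
    private bool wasRaised;

    void Start()
    {
        audioSource = GetComponent<AudioSource>();
    }

    void Update()
    {
        bool raised = BothHandsAboveHead();
        if (raised && !wasRaised)
        {
            audioSource.PlayOneShot(raiseClip); // fire only on the rising edge
        }
        wasRaised = raised;
    }

    private bool BothHandsAboveHead()
    {
        // Placeholder for the MRTK hand-joint check sketched in the summary.
        return false;
    }
}
```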
Aaron:
- Wrote a script that allows us to extract the positional data for the controllers/hands.
- Attempted to get positional data from the controller meshes, logged input types, and performed other debugging tasks to help fix the JumpingJackDetector script; an example of that kind of logging is sketched below.
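A minimal sketch of controller logging, assuming MRTK 2.x: it enumerates the controllers the input system currently tracks and prints each one’s type, handedness, and first pointer position (all names are illustrative):

```csharp
using Microsoft.MixedReality.Toolkit;
using Microsoft.MixedReality.Toolkit.Input;
using UnityEngine;

// Debugging sketch: enumerate every controller MRTK currently tracks and log
// its type, handedness, and pointer position. Useful for checking whether
// Touch controllers show up at all.
public class ControllerPositionLogger : MonoBehaviour
{
    void Update()
    {
        foreach (IMixedRealityController controller in CoreServices.InputSystem.DetectedControllers)
        {
            var pointer = controller.InputSource?.Pointers?.Length > 0
                ? controller.InputSource.Pointers[0]
                : null;

            Debug.Log($"{controller.GetType().Name} ({controller.ControllerHandedness}): " +
                      $"{(pointer != null ? pointer.Position.ToString() : "no pointer")}");
        }
    }
}
```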
New features/functionality implemented
- Extracted the positional data from hand input.
- Wrote a script that detects when the user raises their hands above their head (currently doesn’t work with controllers), with sound effect feedback.
Files to be reviewed
abas-tracking branch: XRcise/Assets/JumpingJackDetector.cs
Blocking issues or help needed
- Currently unable to extract positional data from the controllers; our detection scripts only work with hand tracking.