Blog Post #3 - Starting work on MVP - April 22nd, 2021


All applications described can also be found in the notes section.

What we accomplished

This week we split into 2 groups. One group worked on building a Croquet application with voice chat and physics, and the other group worked on writing a component that would allow players to pick up and throw balls in a more natural-feeling way.

Voice Chat and Physics With Croquet

A link to an app demonstrating this feature can be found here.

Below is what a room might look like when you enter. The left-hand and right-hand sides correspond to different players, and each player can see the other. In the browser, you can move with the WASD keys; in VR mode, you can use the joysticks. You can also pick up the cubes and the sphere in the room and throw, drop, or stretch them (using superhands). We are using superhands at the moment, but next week we plan to switch to Eddie and Akash's code.

The blue balls represent users. When players enter the same room, they can now hold conversations with each other over voice chat. The audio is positional: if you move to the other side of another player, their voice will come from that direction. You can also see other players' positions update in real time.
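For context, this positional effect can be achieved by attaching each remote player's voice stream to their avatar as a three.js PositionalAudio source. The sketch below shows one way to wire that up inside an A-Frame component; the component name and the way the MediaStream arrives are assumptions, not the exact code in our demo.

```javascript
/* Sketch only: attaches a remote peer's voice stream to their avatar entity as
 * positional audio, so the voice appears to come from the avatar's location.
 * How `remoteStream` is obtained (from the voice-chat integration) is assumed. */
AFRAME.registerComponent('spatial-voice', {
  init: function () {
    const sceneEl = this.el.sceneEl;
    this.listener = new THREE.AudioListener();

    // Listen from the local player's head (the active camera).
    const addListener = () => sceneEl.camera.add(this.listener);
    if (sceneEl.camera) { addListener(); }
    else { sceneEl.addEventListener('camera-set-active', addListener, { once: true }); }

    // Call this with the MediaStream received from the remote peer.
    this.attachStream = (remoteStream) => {
      const sound = new THREE.PositionalAudio(this.listener);
      sound.setMediaStreamSource(remoteStream); // play the peer's microphone audio
      sound.setRefDistance(1);                  // distance at which volume starts to fall off
      this.el.object3D.add(sound);              // the audio source follows the avatar
    };
  }
});
```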

Users can interact with objects, and other players in the room collide with those objects rather than passing through them. One issue we ran into is that object positions are not always the same in each user's view (this is a blocking issue). Even so, this demo shows a few key features: Croquet with aframe-physics, voice integration, and multiple users joining the same room with their positions updated in real time. To unblock ourselves on the inconsistent object positions, the Croquet team (Timothy & Clarisa) plans to email our mentor this week and spend more time in office hours. Since next week's planned work was to integrate Eddie & Akash's work with superhands, and this week involved more coding than expected, we plan to take advantage of the buffer time we built into our schedule.
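For reference, the real-time position updates rely on Croquet's replicated Model/View pattern: each client publishes its avatar position to the shared model, and every replica applies the same update. Below is a minimal sketch of that pattern, assuming the standard Croquet API; the class and event names are illustrative rather than our exact code.

```javascript
/* Minimal sketch of the replicated-state pattern, assuming the standard
 * Croquet Model/View API. Names are illustrative. */
class AvatarModel extends Croquet.Model {
  init() {
    this.positions = {};                                  // viewId -> {x, y, z}
    this.subscribe(this.sessionId, "view-join", this.addUser);
    this.subscribe(this.sessionId, "view-exit", this.removeUser);
    this.subscribe("avatar", "move", this.onMove);
  }
  addUser(viewId)    { this.positions[viewId] = { x: 0, y: 0, z: 0 }; }
  removeUser(viewId) { delete this.positions[viewId]; }
  onMove(data) {                                          // data = { viewId, pos }
    this.positions[data.viewId] = data.pos;
    this.publish("avatar", "changed", data);              // every replica sees the same update
  }
}
AvatarModel.register("AvatarModel");

class AvatarView extends Croquet.View {
  constructor(model) {
    super(model);
    this.model = model;
    this.subscribe("avatar", "changed", this.render);
  }
  sendLocalPosition(pos) {                                // call each frame with the local avatar position
    this.publish("avatar", "move", { viewId: this.viewId, pos });
  }
  render(data) { /* update the corresponding A-Frame entity's position here */ }
}
```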

Throwing Balls

In our rapid prototype app, we had implemented throwing in a naive way. Previously, when a user picked up a ball and released it, the ball's velocity was set to the controller's velocity multiplied by some scalar. This led to unintended behavior: if you released the ball right after picking it up, even with no throwing motion, it would move much faster than expected. Additionally, the Superhands module had a significant delay between when you move your hand and when the grabbed ball follows, so if you moved your hand too fast, the ball would end up too far from your hand and the grab would end. Combined with the naive throwing implementation above, the ball would sometimes shoot off in an unexpected direction.
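In rough sketch form, the old release logic looked something like the snippet below (the names and the scaling constant are illustrative), assuming aframe-physics-system's CANNON bodies:

```javascript
/* Sketch of the naive release logic described above. On release, the ball's
 * physics body just takes the controller's instantaneous velocity scaled by a
 * constant, which is what produced the surprising speeds. */
const THROW_SCALE = 2.0;                       // assumed tuning constant

function releaseBallNaive(ballEl, controllerVelocity) {
  // ballEl.body is the CANNON.js body created by aframe-physics-system.
  ballEl.body.velocity.set(
    controllerVelocity.x * THROW_SCALE,
    controllerVelocity.y * THROW_SCALE,
    controllerVelocity.z * THROW_SCALE
  );
}
```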

This week, we implemented our own throwing and grabbing system that aims to solve these issues. A link to the new demo can be found here. The ball now tracks the motion of the controller with a nearly unnoticeable delay, so it is no longer possible to lose your grip on the ball through jerky motions. Throwing has also been improved: we now track the position of the ball over the last few frames before it is released to determine the throwing velocity. This leads to a much more natural-feeling throw and normal behavior when you drop the ball.
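Below is a simplified sketch of the new approach, again assuming aframe-physics-system's CANNON bodies; the component and event names are illustrative, not our exact code. While the ball is grabbed we record its world position each frame, and on release we estimate the throw velocity from that short history instead of from a single frame.

```javascript
/* Sketch: keep a short history of the grabbed ball's world positions and use
 * the average velocity over those frames as the throw velocity on release. */
AFRAME.registerComponent('tracked-throw', {
  init: function () {
    this.history = [];                                       // recent { t, pos } samples
    this.grabbed = false;
    this.el.addEventListener('grab', () => {                 // illustrative event names
      this.grabbed = true;
      this.history.length = 0;
    });
    this.el.addEventListener('release', () => this.onRelease());
  },

  tick: function (time) {
    if (!this.grabbed) { return; }
    const pos = new THREE.Vector3();
    this.el.object3D.getWorldPosition(pos);
    this.history.push({ t: time, pos });
    if (this.history.length > 5) { this.history.shift(); }   // keep only the last few frames
  },

  onRelease: function () {
    this.grabbed = false;
    if (this.history.length < 2) { return; }
    const first = this.history[0];
    const last = this.history[this.history.length - 1];
    const dt = (last.t - first.t) / 1000;                    // ms -> s
    if (dt <= 0) { return; }
    // Average velocity over the last few frames gives a stable throw direction and speed.
    this.el.body.velocity.set(
      (last.pos.x - first.pos.x) / dt,
      (last.pos.y - first.pos.y) / dt,
      (last.pos.z - first.pos.z) / dt
    );
    this.history.length = 0;
  }
});
```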


Individual work log


Plans for next week

Next week, we plan on combining the parts we worked on this week to create a basic game where multiple players can speak and throw items at each other. Timothy and Clarisa will also continue working on making sure that object positions are updated in real time for all users in the room.


Blocking issues


Notes

Deliverables

None this week.