Personal Reflection – Tomas Ondrejka
Contributions
AR
Free Hand Placement One of the bigger contributions I made to the AR project was free-hand placement for our quest (game board). This mode allows users to manually position the quest via two joysticks: the first joystick controls the rotation of the quest, and the second controls its movement in space. The other way the quest can be placed in our AR game is by identifying the corners of the table. Free-hand placement is useful when the table is irregularly shaped, making the corners hard to identify, or when a user wants greater flexibility in positioning. Free-hand placement also supports retaking the placement, as the user may not like how they previously positioned the board.
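A minimal sketch of how such a two-joystick placement can look in Unity is shown below; the component, field names, and speed values are illustrative, not our actual code:

```csharp
using UnityEngine;
using UnityEngine.XR;

// Illustrative sketch of two-joystick free-hand placement. The left
// thumbstick translates the board in the horizontal plane; the right
// thumbstick rotates it around the vertical axis.
public class FreeHandPlacement : MonoBehaviour
{
    [SerializeField] private Transform board;          // the quest / game board
    [SerializeField] private float moveSpeed = 0.5f;   // metres per second
    [SerializeField] private float rotateSpeed = 45f;  // degrees per second

    void Update()
    {
        var left  = InputDevices.GetDeviceAtXRNode(XRNode.LeftHand);
        var right = InputDevices.GetDeviceAtXRNode(XRNode.RightHand);

        // Move the board with the left thumbstick.
        if (left.TryGetFeatureValue(CommonUsages.primary2DAxis, out Vector2 move))
            board.position += new Vector3(move.x, 0f, move.y) * moveSpeed * Time.deltaTime;

        // Rotate the board with the right thumbstick.
        if (right.TryGetFeatureValue(CommonUsages.primary2DAxis, out Vector2 rot))
            board.Rotate(Vector3.up, rot.x * rotateSpeed * Time.deltaTime);
    }
}
```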
VR
Game Elements In the development of the game, I took responsibility for creating the warehouse model. I explored many model options and eventually decided to use an existing model pack and modify it to fit our use case. The model consists of empty rows and shelf rows. The empty rows contain only ground tiles and are used by a runner (forklift) to drive through. A shelf row contains two shelves, one on each side of the tile, each holding a pallet. I also needed to design a structure for the shelves and their IDs, as the picker in our game needs to know the position of a specific item. To achieve this, I decided to use a letter of the alphabet for each row and a number for each position within a row. This structure was then used in order generation, which tells the user the position from which to pick the box.
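The ID scheme can be illustrated with a short sketch; the row and position counts are made up for the example, and this is not our actual order-generation code:

```csharp
using UnityEngine;

// Illustrative sketch of the shelf-ID scheme described above: each shelf
// row gets a letter (A, B, C, ...) and each position within the row a number.
public class OrderGenerator : MonoBehaviour
{
    [SerializeField] private int rowCount = 5;        // rows A..E (assumed count)
    [SerializeField] private int positionsPerRow = 8; // positions 1..8 (assumed count)

    // Build an ID like "C4" from zero-based row and position indices.
    public static string ShelfId(int row, int position)
        => $"{(char)('A' + row)}{position + 1}";

    // Generate a random pick order, e.g. instructing the user to pick from "B7".
    public string NextOrder()
    {
        int row = Random.Range(0, rowCount);
        int pos = Random.Range(0, positionsPerRow);
        return ShelfId(row, pos);
    }
}
```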
Game Over Another contribution I made is the Game Over component, which is displayed when the user has failed the game. The implementation consists of multiple conditions that must be met for the game to switch to the Game Over state. When they are met, a game object is displayed, informing the user that they have failed the game.
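A minimal sketch of such a component is shown below; the specific fail conditions (a time limit and a wrong-pick count) are assumptions for illustration, not the conditions our game actually uses:

```csharp
using UnityEngine;

// Illustrative Game Over check: once any fail condition is met,
// the Game Over object is shown exactly once.
public class GameOverController : MonoBehaviour
{
    [SerializeField] private GameObject gameOverScreen; // hidden by default
    [SerializeField] private float timeLimit = 120f;    // hypothetical condition
    [SerializeField] private int maxWrongPicks = 3;     // hypothetical condition

    private float elapsed;
    private int wrongPicks;
    private bool isGameOver;

    public void RegisterWrongPick() => wrongPicks++;

    void Update()
    {
        if (isGameOver) return;
        elapsed += Time.deltaTime;

        // Switch to the Game Over state when a fail condition is met.
        if (elapsed > timeLimit || wrongPicks >= maxWrongPicks)
        {
            isGameOver = true;
            gameOverScreen.SetActive(true); // inform the player they failed
        }
    }
}
```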
Project Development During both projects, I actively contributed to ideation, feature extraction, feature prioritisation, and task creation. This was important at the start of the projects, as the project topics were open, meaning we could choose which problem to solve.
Learnings
Game Development I spent the previous semester studying abroad and missed the game development course, which I later found out was a prerequisite for the XR course. Because of this, much of what I learned through the course was actually related to game development. This included understanding Unity and its project structure, object references, object interfaces and lifecycles, object properties, game elements such as scenes and prefabs, and materials and how they can be applied. I found game development very interesting, especially how Unity takes care of many things in the background. At work, I had previously done game development in JavaScript, where a lot of low-level, game-engine-related functionality needed to be specifically designed and implemented, including game loops, object lifecycles, scenes, and the lifecycles of scenes. All of this was very error-prone, and bugs were hard to debug. After teaching myself Unity, I found it easy to use and intuitive.
AR I have learned many concepts and theories behind AR and applied them to our AR project. One example was the use of markerless AR, because the environment in our project varied greatly. This was implemented using different tools for plane detection, including the Point Cloud and Raycast Manager. These technologies ensured that we could place our quest in space effectively in any scenario.
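For illustration, placing an object against detected planes with a raycast manager can look roughly like this; I am assuming Unity's AR Foundation package here, and the script and field names are my own, not our actual code:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Illustrative markerless placement: raycast from a screen touch against
// detected planes and move the board to the hit pose.
public class BoardPlacer : MonoBehaviour
{
    [SerializeField] private ARRaycastManager raycastManager;
    [SerializeField] private Transform board;

    private static readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();

    void Update()
    {
        if (Input.touchCount == 0) return;
        Vector2 touch = Input.GetTouch(0).position;

        // Hit-test against the planes detected so far.
        if (raycastManager.Raycast(touch, hits, TrackableType.PlaneWithinPolygon))
        {
            Pose pose = hits[0].pose; // closest hit comes first
            board.SetPositionAndRotation(pose.position, pose.rotation);
        }
    }
}
```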
VR As with AR, many theories were applied to make sure the created app has a smooth experience. We opted out of using a CAVE, as we had limited access to that environment. Instead, we decided to use the Meta Quest headset, as it is feature-rich, simple to use, and has a good developer experience. It supports 6 DoF, uses inside-out tracking, and has good UX when it comes to interactions, as it uses its own intuitive controllers. I have also learned how to use the XR Interaction Toolkit. It allowed us to implement only application-specific code: it handles many VR features, including tracking and foveated rendering, and acts as an interface for interactions with the controllers.
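As a rough illustration of how little application code the toolkit requires, the sketch below makes an object grabbable with the controllers; in practice this is usually configured in the editor rather than at runtime, and the class name is hypothetical:

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Illustrative sketch: with the XR Interaction Toolkit, making a box
// pickable is mostly configuration. An XRGrabInteractable (plus a
// Rigidbody and collider) lets the controllers grab it without any
// custom tracking or input code.
[RequireComponent(typeof(Rigidbody))]
public class PickableBox : MonoBehaviour
{
    void Awake()
    {
        // Add the interactable at runtime if it was not set up in the editor.
        if (GetComponent<XRGrabInteractable>() == null)
        {
            var grab = gameObject.AddComponent<XRGrabInteractable>();
            grab.movementType = XRBaseInteractable.MovementType.VelocityTracking;
        }
    }
}
```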
Summary
Overall, I reflect on the course positively. It allowed us to take the theories learned in classes, implement them, and reflect upon them. I have learned many concepts of XR and game development.