Personal Reflection – Constantin Ginga
Working on the XR projects this semester was an interesting experience. I have worked with Unity before (as part of the game development course), but never with XR. Going in, I didn't know anything about the technical aspects of XR and the underlying mechanics. Needless to say, I learned a lot about creating an XR experience in Unity.
My responsibilities included:
- Marker-based AR character sheets
- VR order functionality
- VR Wrist UI
- VR money counter
Marker-based AR
Our AR project is a companion app for the Zombicide Black Plague board game. The app serves as a guide for the players, providing them with information about setting up the game and the different characters. The app uses both marker-based and markerless AR. The markerless AR is used to display the game board, while the marker-based AR is used to display the character sheets. We made this decision because we wanted to be able to place the board anywhere, while the markers for the character sheets are placed around the board itself so that the different players can see them.
My main responsibility for this project was the marker-based AR. I was responsible for the implementation of the interactable character sheets and displaying them based on the markers.
First, I had to create the prefabs for the character sheets. These prefabs contain the character and some UI elements, such as checkboxes for skill upgrades and a slider for the experience level. This part was quite straightforward, as I had already worked with prefabs before during the game development course.
The next step was to implement the logic for displaying the character sheets. I printed out the markers for the different character sheets (which were QR codes). The process of detecting one marker and displaying a prefab was fairly simple, as AR Foundation abstracts away most of the complexity. I had to create a reference image library and add the marker image. Then, I created a tracked image manager in the AR Session Origin, which would detect the markers and display a prefab I selected. Everything worked as expected, except for one detail: the character sheet was supposed to face the camera. I fixed this with a custom script that rotates the character sheet based on the camera's rotation.
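The rotation fix can be sketched as a small billboard behaviour. This is a minimal sketch, not our exact script; the class name is my own, and it assumes the standard Unity API with the AR camera tagged as the main camera:

```csharp
using UnityEngine;

// Sketch of the "face the camera" fix: each frame, rotate the character
// sheet so its front side points toward the AR camera, staying upright.
public class FaceCamera : MonoBehaviour
{
    private Camera arCamera;

    void Start()
    {
        // Camera.main returns the camera tagged "MainCamera" (the AR camera here).
        arCamera = Camera.main;
    }

    void LateUpdate()
    {
        // Direction from the camera to the sheet; flattened so the sheet
        // only rotates around the vertical axis and stays upright.
        Vector3 toSheet = transform.position - arCamera.transform.position;
        toSheet.y = 0f;
        if (toSheet.sqrMagnitude > 0.0001f)
            transform.rotation = Quaternion.LookRotation(toSheet, Vector3.up);
    }
}
```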
However, I ran into some issues when trying to detect multiple markers and display a different image for each of them. The AR Foundation documentation was not very helpful, so I had to implement a custom solution, pieced together from multiple tutorials and answers on the Unity forums. Instead of relying solely on the AR Tracked Image Manager, I made a separate script with a dictionary mapping markers to prefabs. This script would detect the markers and display the corresponding prefab, while also hiding the prefab when its marker left the camera's view. This solution worked, but it was not very elegant. I would have preferred to use the AR Tracked Image Manager alone, but I couldn't find a way to make it work with multiple markers.
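The custom multi-marker script followed roughly this shape. This is a hedged sketch rather than our exact code: the class and field names are my own, and it assumes the AR Foundation 4.x event API (`trackedImagesChanged`), where prefabs are matched to detected reference images by name:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Sketch of the dictionary-based multi-marker approach described above:
// spawn the prefab matching each detected reference image, and hide it
// whenever its marker is no longer actively tracked.
public class MultiImageTracker : MonoBehaviour
{
    [SerializeField] private ARTrackedImageManager trackedImageManager;
    [SerializeField] private GameObject[] characterSheetPrefabs; // named like the reference images

    private readonly Dictionary<string, GameObject> spawned =
        new Dictionary<string, GameObject>();

    void OnEnable()  => trackedImageManager.trackedImagesChanged += OnChanged;
    void OnDisable() => trackedImageManager.trackedImagesChanged -= OnChanged;

    void OnChanged(ARTrackedImagesChangedEventArgs args)
    {
        foreach (var image in args.added)
        {
            // Instantiate the prefab whose name matches the detected marker,
            // parented to the tracked image so it follows the marker's pose.
            foreach (var prefab in characterSheetPrefabs)
            {
                if (prefab.name == image.referenceImage.name)
                    spawned[image.referenceImage.name] =
                        Instantiate(prefab, image.transform);
            }
        }

        foreach (var image in args.updated)
        {
            // Hide the sheet when its marker leaves the camera's view.
            if (spawned.TryGetValue(image.referenceImage.name, out var sheet))
                sheet.SetActive(image.trackingState == TrackingState.Tracking);
        }
    }
}
```

One dictionary keyed by reference-image name is enough here because each marker appears at most once in the scene.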
VR Order functionality & Wrist UI
The VR project is a warehouse simulation, where the player can move around the warehouse and complete orders by picking up boxes and placing them on a pallet. Here, I was responsible for implementing the order functionality, the money counter, and the wrist UI.
Creating the Wrist UI was not too difficult, as there were plenty of resources online on how to attach a UI to the player's wrist. I created a canvas containing the time remaining for the order, the button to start/finish the order, and the text for the positions of boxes in the order. I attached it to the left controller and added a toggle that shows/hides the UI, by creating a new action in the XRI Input Actions.
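Wiring the new action to the canvas can be sketched like this. The names are my own and this is only a sketch, assuming a custom action added to the XRI Input Actions asset (bound to a left-controller button) and Unity's Input System package:

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

// Sketch of the wrist-UI toggle: when the custom XRI input action fires,
// flip the wrist canvas between shown and hidden.
public class WristUIToggle : MonoBehaviour
{
    [SerializeField] private InputActionReference toggleAction; // the new XRI action
    [SerializeField] private GameObject wristCanvas;            // canvas on the left controller

    void OnEnable()
    {
        toggleAction.action.performed += OnToggle;
        toggleAction.action.Enable();
    }

    void OnDisable() => toggleAction.action.performed -= OnToggle;

    private void OnToggle(InputAction.CallbackContext _)
        => wristCanvas.SetActive(!wristCanvas.activeSelf);
}
```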
The money counter was also quite simple to implement. I created a new world space canvas and attached it to the Main Camera. This canvas had a script that would update the text:
- Decrease the money when a box is destroyed
- Increase the money with a random value when an order is completed
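The two bullet points above can be sketched as a single script on the canvas. This is a hypothetical sketch: the class, method names, and the penalty/reward values are my own, and the methods would be called from the box and order scripts:

```csharp
using UnityEngine;
using UnityEngine.UI;

// Sketch of the money counter: updates the canvas text when a box is
// destroyed (penalty) or an order is completed (random reward).
public class MoneyCounter : MonoBehaviour
{
    [SerializeField] private Text moneyText;
    [SerializeField] private int boxPenalty = 10; // assumed value

    private int money;

    public void OnBoxDestroyed()
    {
        money -= boxPenalty; // decrease the money when a box is destroyed
        moneyText.text = $"$ {money}";
    }

    public void OnOrderCompleted()
    {
        money += Random.Range(50, 151); // random reward for a finished order
        moneyText.text = $"$ {money}";
    }
}
```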
The more complex part was the order functionality. Each position in the warehouse is represented by an invisible game object with a trigger collider, which detects when a box is taken from that position. When an order is started with the wrist UI, the positions are refilled with boxes (in case the player has already taken some boxes in a previous order) and the positions for the new order are randomly chosen from a list of all positions. When a box is taken from a position and placed on the runner, the box is removed from the position and added to a list of the boxes in the order. When the order is finished, the player is rewarded with money. This was the most challenging part of the project, as it involved multiple components and scripts, and I had to understand the flow of how the game would be played.
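A single warehouse position from the description above can be sketched as follows. This is a sketch under assumptions, not our exact implementation; the class and member names are my own, and the hand-off to the order's box list is only indicated by a comment:

```csharp
using UnityEngine;

// Sketch of one warehouse position: an invisible object whose trigger
// collider notices when its box is picked up, and which can be refilled
// when a new order starts.
[RequireComponent(typeof(BoxCollider))]
public class OrderPosition : MonoBehaviour
{
    [SerializeField] private GameObject boxPrefab;
    private GameObject currentBox;

    // Called when an order starts: refill this position with a fresh box
    // if the player already took the previous one.
    public void Refill()
    {
        if (currentBox == null)
            currentBox = Instantiate(boxPrefab, transform.position, Quaternion.identity);
    }

    void OnTriggerExit(Collider other)
    {
        // The box leaving the trigger means the player has taken it.
        if (other.gameObject == currentBox)
        {
            currentBox = null;
            // Here the order script would add this box to the order's list.
        }
    }
}
```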
Reflections on teamwork & work process
Working in a team of 4 people was quite difficult. We had a few physical meetings to work together and some online meetings for catching up, but most of our communication was done through Discord. This made it difficult to split up the work, as we would often work on parts that depend on each other, and team members had to wait for others to finish their work. Also, we often missed the internal deadlines we set, as the team members had different schedules and priorities. This led to some team members having to do more work than others, which was not ideal. However, we managed to finish both projects on time and I am happy with the final result.