Blog.

Personal Reflection – Jakub Platzek

Overall reflection on the XR Interaction Toolkit

Having to work on AR and VR projects for this course using Unity in 3D took me out of my comfort zone, as my experience so far has been focused primarily on 2D projects. That was also the most challenging part for me, since it meant considering another axis during development, as well as surprisingly different physics behavior of objects. The XR Interaction Toolkit itself I found quite intuitive and easy to pick up compared to the early days of just understanding how Unity itself works. I would definitely say that the XR Interaction Toolkit could use some improvements, most of them related to the flexibility of how grabbables currently work, but otherwise I found it quite interesting overall, and I would like to delve deeper into different parts of this framework in the future while developing something of my own.

Teamwork reflection

Reflecting now that I have finished the teamwork with the other members of the group, I unfortunately found most of the cooperation and interaction with my teammates discouraging, and the motivation to work on the project was spread quite unequally across the group. The main issue that makes me reflect negatively on my former teammates is probably primarily my own fault, as I prefer to start and finish my tasks as soon as I can, because I find leaving work to the last minute stressful. The opposite approach appeared in our group quite often throughout the development of both projects. I understand that everybody has the right to approach their responsibilities differently, which is why I usually find these kinds of group projects tiring to work on and would prefer to work on the project on my own instead.

AR Project

The main focus of my work on the AR project was implementing the “Manual placement” mode as well as the main menu used to access the different features of the app. My work also included updating the main scene after merges, quality assurance of features developed by other teammates, and final bug fixing of the product.
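To give an idea of what a manual placement mode like this can look like, here is a minimal, hypothetical tap-to-place sketch assuming AR Foundation with an ARRaycastManager in the scene. The class and field names (ManualPlacement, placedPrefab) are illustrative, not the project's actual code.

```csharp
// Hypothetical sketch of a tap-to-place "manual placement" mode, assuming
// AR Foundation and an ARRaycastManager on the session origin.
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

public class ManualPlacement : MonoBehaviour
{
    [SerializeField] ARRaycastManager raycastManager; // assumed to exist in the scene
    [SerializeField] GameObject placedPrefab;         // object the user places manually

    static readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();

    void Update()
    {
        // Place the prefab where the user taps, if the tap hits a detected plane.
        if (Input.touchCount == 0 || Input.GetTouch(0).phase != TouchPhase.Began)
            return;

        if (raycastManager.Raycast(Input.GetTouch(0).position, hits, TrackableType.PlaneWithinPolygon))
        {
            Pose hitPose = hits[0].pose;
            Instantiate(placedPrefab, hitPose.position, hitPose.rotation);
        }
    }
}
```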

I think that during both the brainstorming phase and the actual development process we did not think the whole idea through properly. The application has potential for real-world use cases, but it lacks refinement as well as additional features to be truly useful.

VR Project

The main focus of my work on the VR project was the implementation of the Runner vehicle, which is accessible after spawning in the warehouse. My work included managing the main scene, extending the warehouse model base implemented by Tomas, as well as handling the physics of boxes when they are placed on the Runner's pallet. Compared to the AR project, I found the VR project a lot more time consuming and challenging in terms of physics manipulation and in using components that I am not that familiar with. These components included the XR Grab Interactable and the Configurable Joint, which were crucial for the proper functioning of the Runner.
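One common way to keep boxes stable on a moving pallet, sketched below, is to freeze a box and parent it to the pallet while it sits inside a trigger volume above the pallet surface. This is only an illustrative approach under those assumptions, not necessarily the one used in the project, and the "Box" tag is a placeholder.

```csharp
// Minimal sketch: parent boxes to the pallet and make them kinematic while they
// rest on it, so they move with the Runner instead of jittering or sliding off.
using UnityEngine;

public class PalletBoxHolder : MonoBehaviour
{
    void OnTriggerEnter(Collider other)
    {
        if (!other.CompareTag("Box")) return; // assumed tag for the warehouse crates

        Rigidbody rb = other.attachedRigidbody;
        if (rb == null) return;

        // Freeze the box relative to the pallet so it follows the Runner.
        rb.isKinematic = true;
        other.transform.SetParent(transform, worldPositionStays: true);
    }

    void OnTriggerExit(Collider other)
    {
        if (!other.CompareTag("Box")) return;

        Rigidbody rb = other.attachedRigidbody;
        if (rb == null) return;

        // Hand the box back to the physics engine when it leaves the pallet.
        other.transform.SetParent(null, worldPositionStays: true);
        rb.isKinematic = false;
    }
}
```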

Regarding the process, I started my main task by looking for a suitable model for the Runner itself. As my search wasn't successful, I had to modify the closest thing I found in Blender so that the final product resembled a Runner as much as possible. The next step was implementing development controls (using WASD for movement) for the Runner to ensure proper handling before involving the XR Interaction Toolkit components. Once the handling was fine-tuned, implementing the controller was the next necessary step. As this still did not require XR components, the implementation was tested by manipulating the controller's transform (rotations) in play mode to verify the desired behavior. This involved configuring the Configurable Joint and attaching the controller to the body of the Runner.
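The temporary WASD development controls could look roughly like the sketch below, assuming the Runner body has a Rigidbody and is moved in FixedUpdate. The speed values and the component name are placeholders; the actual project script may differ.

```csharp
// Rough sketch of temporary WASD development controls for the Runner,
// used only to tune the handling before the XR components are involved.
using UnityEngine;

public class RunnerDevControls : MonoBehaviour
{
    [SerializeField] float moveSpeed = 2f;   // forward/backward speed in m/s
    [SerializeField] float turnSpeed = 60f;  // rotation speed in degrees/s

    Rigidbody body;

    void Awake()
    {
        body = GetComponent<Rigidbody>();
    }

    void FixedUpdate()
    {
        // W/S (Vertical axis) drives the Runner forward and backward.
        float forward = Input.GetAxis("Vertical") * moveSpeed * Time.fixedDeltaTime;
        // A/D (Horizontal axis) rotates it around the vertical axis.
        float turn = Input.GetAxis("Horizontal") * turnSpeed * Time.fixedDeltaTime;

        body.MovePosition(body.position + transform.forward * forward);
        body.MoveRotation(body.rotation * Quaternion.Euler(0f, turn, 0f));
    }
}
```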

The last step was adding the XR Grab Interactable component so that the user can actually pull, or otherwise manipulate, the controller. This step was quite frustrating for me, as setting the controller's physics layer to something other than "Default" caused the XR controllers to stop detecting the Runner's controller. This forced me to redesign some of my previous implementation, because the mesh colliders of the controller and of the Runner's body prevented the physics from working properly. In the end I had to create a custom collision box that avoids collisions between the controller and the Runner's body while still ensuring that the Runner does not fall through the floor.
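One way to get this kind of selective collision behavior, sketched below under assumed layer names ("RunnerController", "RunnerBody"), is to disable collisions between the two layers at startup while leaving both layers' collisions with the floor untouched. This is just one possible approach, not necessarily the exact fix used in the project.

```csharp
// Hedged sketch: ignore collisions between the grabbable controller and the
// Runner's body, without affecting either layer's collisions with the floor.
using UnityEngine;

public class RunnerCollisionSetup : MonoBehaviour
{
    void Start()
    {
        int controllerLayer = LayerMask.NameToLayer("RunnerController");
        int bodyLayer = LayerMask.NameToLayer("RunnerBody");

        // Stop the controller's collider from pushing against the Runner's body.
        Physics.IgnoreLayerCollision(controllerLayer, bodyLayer, true);
    }
}
```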