With a new Computer Science building under construction at the University of Illinois Chicago, this project proposes a potential kiosk located at the right side of the building entrance. Focused on immersive, room-scale Virtual Reality (VR), it allows a user to interact with the kiosk through a VR headset. Pokemon-themed, the kiosk is, for the most part, loosely modeled after the Pokemon Center found in Pokemon FireRed and LeafGreen (2004), and it sells items that can be found throughout the Pokemon franchise.
As shown in the image comparison above, the kiosk includes an escalator, map, Pokemon healer, TV, green shelf, potted plant, and red counter to closely resemble the Pokemon Center on the left.
In order to use the application, one must first download the project from GitHub and install the necessary software. Assuming that the steps described in the section titled "Instructions to Build and Run" have been followed, the user can now walk around the space and interact with the kiosk. The user is able to grab and toss over twenty items, such as food, books, Pokeballs, and Pokemon. Additionally, they can duplicate items by interacting with the Pokeball dispenser, microwave, Pokemon healer, and TV. Pressing the red button of the Pokeball dispenser spawns red Pokeballs that can be grabbed and thrown. Similarly, pressing the silver button of the microwave spawns many curry bowls that can be grabbed and thrown. Picking up Eevee and tossing them into the Pokemon healer causes them to evolve into Espeon. Finally, pressing the red button on the right side of the TV turns it on, which plays the opening theme of the Pokemon anime. Pressing the red button again turns the TV off.
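The interactions above boil down to a handful of simple rules. The following is a minimal, illustrative sketch of those rules in Python; the names are hypothetical, since the actual project implements these behaviors as Unity/VRTK components, not as the functions shown here.

```python
# Illustrative model of the kiosk's interactables (hypothetical names;
# the real project implements these behaviors in Unity with VRTK).

def press_dispenser_button():
    """The red button on the Pokeball dispenser spawns a grabbable Pokeball."""
    return ["Pokeball"]

def press_microwave_button(count=3):
    """The silver button on the microwave spawns many curry bowls at once."""
    return ["curry bowl"] * count

def heal(pokemon):
    """Tossing Eevee into the Pokemon healer evolves them into Espeon;
    other Pokemon come out unchanged."""
    return "Espeon" if pokemon == "Eevee" else pokemon

class TV:
    """The red button on the TV's right side toggles it on and off; while on,
    it plays the opening theme of the Pokemon anime."""
    def __init__(self):
        self.on = False

    def press_red_button(self):
        self.on = not self.on
        return "playing opening theme" if self.on else "off"
```

For example, `heal("Eevee")` returns `"Espeon"`, while pressing the TV's red button twice turns it on and then back off, matching the behavior described above.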
To begin, download the project from the following link by clicking on the green button that says "Code" and selecting "Download ZIP":
Group4.Project2

Extract the ZIP file as needed. The next step is to install Unity. To do so, go to https://unity.com and select the blue "Get Started" button at the top. Select the "Individual" tab and click on the blue "Get started" button underneath "Personal". Download Unity Hub for your preferred platform and install it; Unity Hub will host your Unity projects. In Unity Hub, click on "Installs", then "Install Editor" to install a new Unity version. This project uses version 2021.3.6f1. Visit the download archive at https://unity3d.com/get-unity/download/archive and select Unity 2021.x to find the correct version, then install it through Unity Hub, where it should then appear. Open the project by pressing the "Open" button under "Projects". While it does not need to be downloaded and installed separately, it should be noted that the project makes use of VRTK version 4.
Connect the Quest headset to the laptop using a suitable USB cable. Put on the Quest headset and use the controllers to tell the Quest to accept the PC connection and to always allow that connection. Make sure developer mode is on: click on the Quick Settings menu on the left side of the thin menu, then click on the Settings menu in the upper right. Next, scroll down in the menu on the left to Developer and turn on the USB Connection Dialog.
In the hierarchy of the Unity project, make sure CameraRigs.UnityXRPluginFramework is enabled while CameraRigs.SpatialSimulator is disabled. Under Unity's Build Settings, make sure you are building for the Android platform and that the connected Quest shows up in the Run Device list of compatible connected devices. If it does not show up, unplug and re-plug the USB cable, then tell the Quest to accept the connection to the laptop again. Save the project and restart Unity.
Click on Build and Run. It should take about 5 to 10 minutes for Unity to convert all the assets to a form suitable for the Quest. Disconnect the USB cable from the Quest, put on the Quest headset, and grab the two controllers. The app should start automatically.
The following is a list of all the 3D models that have been downloaded from the internet, along with their sources and credits. Included are short descriptions of how each relates to the requirements of the project. In the images of this section, the red numbers represent the downloaded 3D models.
The image above numbers the models found at the front side of the kiosk.
The following is a list of all the sounds that have been downloaded from the internet, along with their sources and credits. Included are short descriptions of how each relates to the requirements of the project.
The image above numbers the models found at the left side of the kiosk.
The following is a list of all the models created by Silver Vo and Farah Kamleh, all of which were made in Blender. In the images of this section, the green numbers represent the student-made models.
The image above numbers the rest of the models found at the front side of the kiosk.
As required, four new lights have been added to the kiosk. They include the yellow light in the microwave, the purple light in the Pokeball dispenser, the white light in the fridge, and the purple light on the healer.
The image above numbers the models found at the right side of the kiosk.
There are two human models in the scene, each representing a single student who worked on the project. Created using MakeHuman, the human models represent Silver Vo and Farah Kamleh. The model for Silver can be found standing near the fridge. He has an idle animation downloaded from Mixamo. When touched by the user, he speaks and states the price of the Pokeballs. The model for Farah, on the other hand, can be found sitting on the chair facing the TV. The downloaded Mixamo animation makes them point towards the screen of the TV. When touched, they yell, "It's Pikachu!".
When tested in the Quest 1 headset, the lowest frame rate observed was 31 while the highest was 72. The lowest frame rate occurs when turned towards the exit of the building; the highest occurs when looking upwards and downwards inside the building. When interacting with objects in the scene, the frame rate remains stable.
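To put those numbers in perspective, a frame rate converts to a per-frame time budget of 1000 / fps milliseconds. A quick sketch of the arithmetic (72 Hz is the Quest 1's display refresh rate, so 72 fps is the target):

```python
def frame_budget_ms(fps):
    """Milliseconds available to render one frame at a given frame rate."""
    return 1000.0 / fps

# At the Quest 1's 72 Hz refresh rate, a frame must finish in ~13.9 ms.
print(round(frame_budget_ms(72), 1))  # 13.9
# At the observed low of 31 fps, each frame is instead taking ~32.3 ms.
print(round(frame_budget_ms(31), 1))  # 32.3
```

In other words, the view towards the building exit costs more than twice the per-frame time budget, which is consistent with that direction having the most geometry to draw.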
There is a drastic difference between viewing and interacting with the Pokemon-themed kiosk through the Unity simulator and doing so through the Quest 1 headset. To work with the Unity simulator, one needs only to press the Play button in Unity, and the running project will be visible in Unity's Game View. The user is placed in front of the kiosk with the left and right controllers, colored yellow and red respectively, in view. To look around, the user can use their mouse or trackpad. To move forward, back, left, and right, they can use the WASD keys on their keyboard. To take control of the left controller, the 2 key on the keyboard must be pressed. Doing so allows the user to rotate the controller using the mouse or trackpad and move it forward, back, left, and right using the WASD keys, just as they would their body. The right controller works the same way, except that it is selected by pressing the 3 key instead of the 2 key. To return to controlling the body, the user must press the 1 key.
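The key bindings described above amount to a small focus-switching scheme: the 1, 2, and 3 keys select which element of the rig the mouse and WASD input drive. A minimal Python sketch of that mapping (illustrative only; the real behavior is provided by VRTK's spatial simulator, not by code like this):

```python
# Hypothetical model of the simulator's input focus; in the project itself,
# CameraRigs.SpatialSimulator (VRTK) handles this internally.
FOCUS_KEYS = {
    "1": "body",              # mouse look + WASD move the player body
    "2": "left controller",   # mouse look + WASD move the left controller
    "3": "right controller",  # mouse look + WASD move the right controller
}

def switch_focus(key, current="body"):
    """Return the rig element that mouse/WASD input applies to after a keypress.
    Unmapped keys leave the focus unchanged."""
    return FOCUS_KEYS.get(key, current)
```

For example, pressing 2 routes subsequent mouse and WASD input to the left controller, and pressing 1 returns control to the body, as described above.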
To deploy the project onto a Quest 1 headset, one must follow the instructions in the section titled "Building and Running on the Quest", ensuring that, in Unity's hierarchy, CameraRigs.UnityXRPluginFramework is enabled while CameraRigs.SpatialSimulator is disabled. Using the Quest 1 headset, the user can teleport around the kiosk as they please, but they also have the option to walk around the space on their own legs, which is the intention of the project, as it was created with the scale of human users in mind. As though in a real space in the real world, the user can grab and interact with objects using their "hands". With a controller in each hand, the user can grab over twenty objects and interact with four of them, producing spawns and duplicates. For example, to turn on the TV, the user simply stretches a hand out towards the red button on the right side and "presses" it. Similarly, to grab and toss an item, the user simply reaches towards the object, presses a button, raises their hand, and tosses it at a distance and with a strength of their choosing, as they would in the real world.
It can be argued that working with the Quest 1 headset, as opposed to the Unity simulator, is easier, especially when it comes to movement and interaction. The user can treat their virtual surroundings as they would their real-world surroundings. If they want to move to a specific location in the space, they simply walk there. If they want to pick up an object, they simply reach out and grab it. Using the Unity simulator, on the other hand, comes with its own challenges. For example, judging how far a controller will travel when a key is pressed proves difficult, making it tricky to grab an object and toss it. Attempting to do so often sends the controller far further forward or back than intended, until it eventually disappears from the user's view.
This second project was very interesting because it introduced me to virtual reality development in Unity. Another thing I like about virtual reality is that you can put most things to scale, which makes the experience more immersive: a door looks like a door in your view, a table is as big as a normal table, and so on.
Regarding the visual aspect, VR can't quite match the feel of real life, for obvious reasons. The first reason is display horsepower: we see the real world in incredibly high definition, with peripheral and central vision that not many headsets can approach at the moment. Human vision is complicated, and mimicking it within VR will take years of software and hardware development. The second reason is the power needed to drive those hypothetical displays. Right now, the Quest can run applications and games at 1920 x 1832 per eye, which is far lower than what would be needed, yet it already has performance issues with some games. We need these mobile chips to improve drastically, or alternatively a very high-end PC to run the application, connected to the headset with a cable.
Another thing to consider is the sense of touch: you can't really feel the objects you're picking up. Beyond what you see, you won't be able to tell the object's shape in your hand. Texture is also something to think about. Everything around us has a distinct texture, whether wood, fabric, or metal. You just won't have that context when handling something virtual rather than real.
Sound isn't as much of an issue, but it is a very important sense for immersing ourselves in the world around us; sometimes you close your eyes and listen to the sounds surrounding you for exactly that reason. That's not to say VR can't come close, but the way audio works in game engines isn't quite like real life just yet, and the quality of the audio system, whether separate or built into the headset itself, isn't enough to bridge the gap. Still, you can get pretty close, and your brain will fill in the missing pieces.
This is the final point and, I think, the most important one to cover. For me, when I play virtual reality games, the thing that always gets to me is motion sickness during extended play time. When we run, bike, or jump, our bodies know that we are in motion. Virtual reality, on the other hand, can't give your brain that context, so it becomes confused by the mismatch between what you're seeing and what you're physically experiencing. This is why it's hard for most people to get into virtual reality: they get motion sick when first starting out. It can only be mitigated with something like a racing setup whose pistons physically move you, or a floating chair that creates the illusion of movement. But this is one thing we'll never truly solve, because it's not a matter of hardware or software; it's a matter of physics.