AGI24 Group 2



The Team


Carl Holmqvist

Caholmq@kth.se

For Hedged-In, I focused primarily on the VR interactions and the procedural maze generation. The maze was generated with the recursive backtracking algorithm, chosen for its simple implementation and for the fact that it always generates a completable maze. Other elements of the maze, like the collectibles, were placed at random with some further conditions to avoid conflicts. The VR implementation used the Unity OpenXR plugin. Movement was facilitated by checking the position of the hand controllers in world space at set time intervals and measuring the differences between intervals. I also focused on the monster AI, making sure it moved around autonomously throughout the maze looking for the player, and on hindering players from poking their head into the geometry. I also made an attempt at adding volumetric clouds and fog to the project, which worked well on PC but unfortunately could not be integrated to function correctly in VR in time. This website is also made by me.
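To make the maze idea concrete, here is a minimal sketch of recursive backtracking over a grid (written iteratively with an explicit stack; all names and the wall layout are illustrative, not the project's actual code):

    using System;
    using System.Collections.Generic;

    public class MazeGenerator
    {
        // Four cardinal directions; walls[x, y, d] is the wall of cell (x, y)
        // facing direction d.
        static readonly (int dx, int dy)[] Dirs = { (0, 1), (1, 0), (0, -1), (-1, 0) };

        public static bool[,,] Generate(int width, int height, int seed)
        {
            var walls = new bool[width, height, 4];
            for (int x = 0; x < width; x++)
                for (int y = 0; y < height; y++)
                    for (int d = 0; d < 4; d++)
                        walls[x, y, d] = true; // start with every wall closed

            var visited = new bool[width, height];
            var stack = new Stack<(int x, int y)>();
            var rng = new Random(seed);

            visited[0, 0] = true;
            stack.Push((0, 0));

            while (stack.Count > 0)
            {
                var (cx, cy) = stack.Peek();

                // Collect the unvisited neighbours of the current cell.
                var options = new List<int>();
                for (int d = 0; d < 4; d++)
                {
                    int nx = cx + Dirs[d].dx, ny = cy + Dirs[d].dy;
                    if (nx >= 0 && nx < width && ny >= 0 && ny < height && !visited[nx, ny])
                        options.Add(d);
                }

                if (options.Count == 0) { stack.Pop(); continue; } // dead end: backtrack

                // Carve a passage to a random unvisited neighbour.
                int dir = options[rng.Next(options.Count)];
                int tx = cx + Dirs[dir].dx, ty = cy + Dirs[dir].dy;
                walls[cx, cy, dir] = false;           // open the shared wall...
                walls[tx, ty, (dir + 2) % 4] = false; // ...from both sides
                visited[tx, ty] = true;
                stack.Push((tx, ty));
            }

            // Every cell is visited exactly once, so the maze is always completable.
            return walls;
        }
    }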

For Infinite Sailing, I was responsible for the infinite terrain and river generation. With that, I also implemented the logic for spawning obstacles in the river and placing 3D objects onto the terrain as scenery. I also wrote shaders for the grass (built around a geometry shader) and for the terrain coloring. Finally, I was responsible for integrating hand controls into the project, which was done with a Google MediaPipe implementation in Python that communicated with Unity over a UDP socket.

Beatrice Galvanetto

beagal@kth.se

In developing Hedged-In, I focused on designing the monsters' AI and animations. I also worked on establishing core game mechanics, such as the "Game Over" and "Victory" conditions. I crafted detailed 3D models for key in-game elements, including collectibles, monster distractors, and the final exit. Sound design was a major focus; I incorporated ambient sounds and strategically placed sound triggers that respond to the monster's proximity, amplifying tension for the player. Lastly, I enhanced the game's atmosphere with improved fog effects and upgraded terrain, collaborating with other team members to create a visually creepier landscape that complements the VR experience.

In developing Infinite Sailing, on the graphics side, I used Blender to model the boat and rocks. Following that, I implemented the day and night cycle in the main scene, which was later refined by the team member responsible for the sky. Additionally, I took charge of the microphone interactions, adding UI elements to support this feature. I also collaborated with other team members to develop the initial tutorial presented before the game starts.

Erica Tjernell

ericatj@kth.se

My role in developing Hedged-In was mostly graphics related, such as the creation of the hedge maze material using shell texturing for the strands. I implemented this in a shader using GPU instancing to improve performance, and because we were using VR, it had to be adapted for stereo rendering. I also made the bush both receive and cast shadows. I investigated the use of subsurface scattering to make the bush more realistic, but I didn't end up completing it. Similarly, I worked on a visual effect for fire to be used for torches, but because we were close to the end of development, I ended up scrapping this due to time constraints. I also collaborated on some smaller features and bug fixes, such as tuning the settings of the fog particle system to make it look better, modeling more hedge pieces, and adapting the code so that the slightly rounded maze pieces tiled better.

For Infinite Sailing I worked on the clouds. The clouds are explained in the graphics section! I also helped with merging and managing the project in general, since we used my GitHub to create the project and my account on the computer that held the builds.

Helin Saeid

hsaeid@kth.se

For Hedged-In, my initial focus was implementing an algorithm for making random walls in the maze interactable in the computer view, since the maze was generated procedurally each time we ran the game. I also worked on adding immersive effects, such as sounds when the doors move and footstep sounds when the VR player walks through the maze. I implemented a shader with a visual effect that we intended to display in VR when encountering monsters in the game; the effect worked on the computer but not in VR and was therefore not included. During exhibitions I played an active part in recruiting people to try our game. I also conducted user studies by making a questionnaire for the testers and summarizing the results in text and graphs for evaluation. Finally, I made the demo video for the website.

For Infinite Sailing I worked on implementing some game mechanics. I implemented the collision logic for the boat and the terrain, so that the boat could collide with the rocks and riverbanks and lose life when doing so. I also made the wind elements using Unity's particle systems, and controlled these to spawn according to the user's movement inputs. Finally, I also made the demo video.

Konstantina Maria Sourmpati

kmsou@kth.se

In creating Hedged-In, I was responsible for game UI, environment graphics and artwork. I focused on visually interpreting game mechanics on-screen by designing UI for gameplay, game-over, and victory conditions. To enhance VR immersion, I developed environment elements and atmospheric effects - sky, terrain, fog, and rain - individually or in collaboration with the team. Lastly, I designed the in-game canvas art and promotional poster.

During the creation of Infinite Sailing I worked mainly on graphics related to water aesthetics. More specifically my work includes the creation of the water shader graph and designing the splash effect particle system as described in the corresponding section of the project. Also, I worked together with other team members on implementing the initial microphone interaction system for monitoring the boat’s tilt direction based on the detected audio input. Finally, I contributed to the UI design of the onboarding canvas.


Hedged-In


Goals and Motivation

The concept behind our project, Hedged-In, was to create a creepy yet fun experience for players, focusing heavily on the communication and collaboration between the two players. One player navigates a maze in virtual reality, while the other observes from a top-down view. The top-view player's main role is to guide the VR player to the exit and to help them avoid life-threatening monsters. They can also move certain parts of the maze and distract the monster so that it does not chase the player. Our goal with Hedged-In was to design a game that's both eerie and enjoyable, creating a fully collaborative experience for both players.

Interaction

Hedged-In features two primary forms of interaction. The VR player, equipped with a Meta Quest 2, is fully immersed in the maze, while the second player views the maze from above and uses the classic keyboard and mouse to assist. The VR player moves within the maze by lifting and lowering their arms, mimicking one's arm movement when running, while seated on a revolving chair; turning their body on the chair rotates their in-game perspective. The faster they move their arms, the faster they advance through the maze.
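A hypothetical sketch of this arm-swing locomotion, assuming a CharacterController-based rig (field names and constants are illustrative, not the project's actual code):

    using UnityEngine;

    public class ArmSwingLocomotion : MonoBehaviour
    {
        public Transform leftHand, rightHand, head; // XR rig transforms
        public CharacterController rig;             // moves the player root
        public float sampleInterval = 0.1f;         // seconds between samples
        public float speedScale = 5f;

        float timer, currentSpeed;
        Vector3 prevLeft, prevRight;

        void Update()
        {
            timer += Time.deltaTime;
            if (timer >= sampleInterval)
            {
                // Vertical distance each hand travelled since the last sample.
                float swing = Mathf.Abs(leftHand.position.y - prevLeft.y)
                            + Mathf.Abs(rightHand.position.y - prevRight.y);
                currentSpeed = swing / sampleInterval * speedScale;
                prevLeft = leftHand.position;
                prevRight = rightHand.position;
                timer = 0f;
            }

            // Move along the headset's yaw, so turning on the chair turns the
            // player in-game; faster swinging means faster movement.
            Vector3 forward = Vector3.ProjectOnPlane(head.forward, Vector3.up).normalized;
            rig.SimpleMove(forward * currentSpeed);
        }
    }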

Graphics

We aimed to create as much of the environment as possible ourselves, from the moon and clouds to the terrain textures, hedge leaves, ambient fog and rain. We also modeled various elements, including the collectibles, the monster distraction, and the mausoleum that signals the end of the maze. The hedge leaves were made using shell texturing. The ambient fog and rain were made using the Unity particle system, and the moon was added through a procedural skybox that casts real-time shadows upon the environment. An attempt was also made at adding volumetric clouds and post-processing effects, which worked on desktop but unfortunately could not be integrated properly in VR.
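As a simplified illustration of the shell-texturing setup: the hedge mesh is drawn once per shell layer, and the shader (not shown) pushes each layer outward along the normals while discarding fragments to form strands. The real implementation used GPU instancing and a stereo-rendering-aware shader; this sketch issues one draw call per shell, and the _ShellIndex property name is an assumption.

    using UnityEngine;

    public class ShellRenderer : MonoBehaviour
    {
        public Mesh hedgeMesh;
        public Material shellMaterial; // samples noise to cut strands per layer
        public int shellCount = 16;

        void Update()
        {
            var props = new MaterialPropertyBlock();
            for (int i = 0; i < shellCount; i++)
            {
                // The shader reads _ShellIndex (0 at the surface, 1 at the tip)
                // to offset vertices along the normal and thin the strands.
                props.SetFloat("_ShellIndex", i / (float)(shellCount - 1));
                Graphics.DrawMesh(hedgeMesh, transform.localToWorldMatrix,
                                  shellMaterial, 0, null, 0, props);
            }
        }
    }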

Challenges

One of the primary challenges was working with VR, which for us relied heavily on Wi-Fi connectivity. This was because we chose not to build the project onto the VR headset, but instead to connect it to a PC through the Meta Air Link feature in order to utilize the power of the PC, allowing for more resource-intensive graphics. When the connection was poor, however, gameplay became almost impossible. Additionally, integrating VR in Unity presented issues, as certain features (like post-processing effects, UI buttons and high frame rates) no longer functioned smoothly. There were also challenges with colliders and with making the monsters' animations fluid within the maze so they move naturally. Finally, creating realistic graphics for the hedge presented its own set of difficulties, as the shell-texturing technique was not as simple to adjust as we expected.

Related Work

One big inspiration for this project was a previous AGI project, CoCar from 2016, a cooperative driving game where one person drives in VR while other people help create a path using a PixelSense screen. We really liked the concept of collaboration, where the players have to communicate to help each other progress in the game, and this became the main purpose of our game design. One thing we wanted to change from their approach was to create a bigger sense of urgency, which we achieve through our timer and the monster in the maze.

We were also inspired by A-Maze from AGI22. That project takes the form of a game where one player wears VR goggles and navigates a labyrinth, while the other player helps clear the path using a PC. It features puzzles to solve, spiders to fight, and movable blocks in the labyrinth. One thing we did differently is the movement: in our game it is done by moving the controllers up and down, whereas A-Maze uses the joysticks. Using joysticks to propel the VR player can cause cybersickness, as it moves the player in an unnatural way; our more natural movement will hopefully reduce it.

Another source of inspiration was Have Mercy from AGI16. It features a maze environment where both players use their phones, and one of them uses a VR headset as well. That game is competitive: the non-VR user plays against the VR user by pulling down walls, among other things, and running is done by physically moving, with the phone registering the movement. We liked the collaborative aspect more, which is why we chose it over the competitive one: it adds more interest to the interaction by forcing communication. Also, while using phones makes the game easy to use for multiple people, the small view can be very restricting, which is why we think a big screen is a better approach.

Lessons Learned

Collaborative Teamwork: Working in a team highlighted the value of clear communication, shared responsibilities, and leveraging each member's strengths. Collaboration was essential in overcoming challenges and driving the project forward effectively. It was also essential to have a diverse team with varied skills and interests: for instance, if one team member had a strong interest in UI design, they could focus on that area while others concentrated on different aspects of the project, avoiding too much overlap.

Handling High-Pressure Technical Issues: Managing unexpected technical issues under tight deadlines taught us the importance of staying calm, troubleshooting methodically, and keeping backup plans ready, such as having two headsets charged and at hand at all times. Prioritizing tasks also helped us address the most critical issues first, ensuring the project continued moving forward despite setbacks.

Gathering User Feedback: Learning to capture meaningful user feedback emphasized listening actively and creating a comfortable environment for users to share their experiences. Structured surveys and informal conversations both proved valuable in gaining insights and iterating on our work.

Supporting Users with Cybersickness: Helping users experiencing cybersickness required patience and empathy. We learned techniques to make users feel more comfortable, like suggesting breaks and adjusting VR settings to reduce discomfort, while noting symptoms to inform future design choices. One such setting was allowing free movement of the head instead of just rotation, giving the player the ability to look around as one does in real life. We received feedback that this helped users avoid discomfort.

Guiding First-Time VR Users: Assisting new VR users taught us to simplify instructions, guide them step by step, and remain available for any questions. We discovered that clear, concise guidance is key to helping users feel confident and enjoy their VR experience.

User Testing


Infinite Sailing


Goals and Motivation

The game's main idea is that the player becomes a god-like creature controlling a boat traveling down a river while avoiding obstacles in the river, such as rocks and the river bank. Players can use hand movements in front of a webcam and sounds picked up by a microphone to control the wind, which moves the boat and helps avoid obstacles. The key goals for the game are:

Interactive Gameplay: Incorporate diverse types of interaction to engage users.
Visual Appeal: Design a visually captivating environment to enhance the gaming experience.
Sonic Appeal: Create a soothing yet immersive atmosphere through the power of sound.

In summary, our goal was to create a unique experience that used hand gestures and sound as interactive elements while also designing a visually appealing project that could be efficiently implemented within the limited time we had.

Interaction

This project incorporates two types of interaction: hand tracking and microphone input. Both are captured by a webcam positioned above the screen displaying the game, directly in front of the user.

Hand Tracking: The boat's movement to the left or right is controlled by the user's hand positions, which are captured by the web camera. The hands are tracked with Google MediaPipe, implemented in Python; the screen positions of the user's hands are then sent to Unity over a UDP socket. The boat's direction is determined by the "balance of the hands": even if the hands are far apart, as long as they are equally distanced from the center, the boat continues moving forward. Intuitively, when both hands move to the right, the boat tilts right, and when both hands move to the left, the boat tilts left.
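An illustrative sketch of the Unity side of this pipeline, assuming the Python sender transmits the two hands' normalized screen x positions as a "leftX;rightX" string (the port number and message format are assumptions):

    using System.Globalization;
    using System.Net;
    using System.Net.Sockets;
    using System.Text;
    using UnityEngine;

    public class HandBalanceReceiver : MonoBehaviour
    {
        UdpClient client;
        volatile float tilt; // -1 (full left) .. +1 (full right)

        void Start()
        {
            client = new UdpClient(5052); // must match the Python sender's port
            client.BeginReceive(OnReceive, null);
        }

        void OnReceive(System.IAsyncResult result)
        {
            var remote = new IPEndPoint(IPAddress.Any, 0);
            string msg = Encoding.UTF8.GetString(client.EndReceive(result, ref remote));
            string[] parts = msg.Split(';');
            float leftX = float.Parse(parts[0], CultureInfo.InvariantCulture);
            float rightX = float.Parse(parts[1], CultureInfo.InvariantCulture);

            // Hands equally far from the screen centre (0.5) cancel out, so the
            // boat keeps going straight; both hands on one side tilt it that way.
            tilt = Mathf.Clamp((leftX - 0.5f) + (rightX - 0.5f), -1f, 1f);
            client.BeginReceive(OnReceive, null); // keep listening
        }

        void Update()
        {
            // Bank the boat towards the current balance of the hands.
            transform.rotation = Quaternion.Euler(0f, 0f, -tilt * 30f);
        }

        void OnDestroy() => client.Close();
    }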

Microphone Interaction: The microphone detects the user's sound input and adjusts the boat's speed based on the volume. Initially, the boat moves slowly, encouraging the user to make noise. As the game progresses, the boat speeds up, and the microphone even allows an external person to join in by making noise, further increasing the challenge. At first, we considered using sound to control the tilting of the boat rather than its speed. Achieving this would have required processing of the microphone signal, utilizing two microphones (one for the left and one for the right), and implementing a system to detect specific inputs like blowing into the microphones, combined with some form of speech filtering. Given the limited amount of time, we decided to opt for hand tracking instead to control the boat's inclination.
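A hedged sketch of the volume-based speed control, assuming Unity's built-in Microphone API: the default microphone records into a looping one-second clip, and the RMS loudness of the most recent samples scales the boat's speed (names and constants are illustrative):

    using UnityEngine;

    public class MicSpeedControl : MonoBehaviour
    {
        public float baseSpeed = 2f;
        public float volumeBoost = 20f;

        AudioClip micClip;
        const int SampleWindow = 256;

        void Start()
        {
            // null selects the default microphone; record a 1 s looping clip.
            micClip = Microphone.Start(null, true, 1, 44100);
        }

        // Read by the boat controller each frame (an assumption) to set speed.
        public float CurrentSpeed()
        {
            int pos = Microphone.GetPosition(null) - SampleWindow;
            if (pos < 0) return baseSpeed;

            var samples = new float[SampleWindow];
            micClip.GetData(samples, pos);

            float sum = 0f;
            foreach (float s in samples) sum += s * s;
            float rms = Mathf.Sqrt(sum / SampleWindow); // loudness of the window

            return baseSpeed + rms * volumeBoost; // louder input, faster boat
        }
    }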

Graphics

Clouds And Sun: The clouds are created using 3D noise and ray marching. 3D noise of varying types and frequencies determines the cloudscape and the more local details of the clouds, such as the smoothness or roughness of the edges. Ray marching is done twice: first a ray is marched through the volume, sampling the density along it, and then a ray is marched towards the sun. Some lighting phenomena are taken into account, such as attenuation (Beer's law) and the silver lining (Henyey-Greenstein). Attenuation describes how much light reaches a point, depending on the distance to the light source and the density along the way. Henyey-Greenstein is a phase function that determines the direction in which light bounces: if it bounces inwards into the medium, it is called in-scattering, and if it bounces outwards, it is called out-scattering.
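In equation form (with illustrative symbols): Beer's law gives the transmittance T from the optical depth accumulated along the march, and the Henyey-Greenstein phase function weights light by the angle theta between the view and light directions, where the anisotropy g > 0 favours forward scattering and produces the silver lining near the sun:

    T = e^{-\tau}, \qquad \tau = \sigma \int \rho(x)\,\mathrm{d}x

    p_{\mathrm{HG}}(\theta) = \frac{1 - g^2}{4\pi\left(1 + g^2 - 2g\cos\theta\right)^{3/2}}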

Additionally, a day-night cycle was added to make the sky appear more realistic. The Unity script controlling the day-night cycle manages the rotation of the sun and moon to simulate their movement across the sky based on the time of day. The rotation is calculated using a normalized time value, where 0 represents midnight and 1 represents the end of the day. The orientation of the sun and moon is determined by combining vertical tilt and horizontal direction.
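A minimal sketch of that idea, where a normalized time of day drives the lights' rotation (field names, the day length, and the fixed horizontal direction are illustrative assumptions):

    using UnityEngine;

    public class DayNightCycle : MonoBehaviour
    {
        public Light sun, moon;
        public float dayLengthSeconds = 1440f; // one in-game day

        float timeOfDay; // 0 = midnight, 0.5 = noon, 1 = next midnight

        void Update()
        {
            timeOfDay = (timeOfDay + Time.deltaTime / dayLengthSeconds) % 1f;

            // One full revolution per day; at 0.25 the sun sits on the horizon.
            float sunAngle = timeOfDay * 360f - 90f;
            sun.transform.rotation = Quaternion.Euler(sunAngle, 170f, 0f);
            moon.transform.rotation = Quaternion.Euler(sunAngle + 180f, 170f, 0f);

            // The sun dominates by day and the moon by night (opposite pattern).
            float daylight = Mathf.Clamp01(Vector3.Dot(-sun.transform.forward, Vector3.up));
            sun.intensity = daylight;
            moon.intensity = 0.3f * (1f - daylight);
        }
    }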

In "Infinite Sailing," the sun's intensity increases during the day and decreases at night, while the moon's intensity follows the opposite pattern. However, the long duration of the day makes it nearly impossible for users to experience the night, as they typically lose before it gets dark. Ambient lighting shifts between user-defined day and night colors, with changes driven by the sun's position. Additionally, the sun's color temperature evolves throughout the day, transitioning from warm tones at sunrise and sunset to cooler tones at noon.

Models & Assets: In this project, we created various models. The first model that was created was the boat. Using Blender, we followed several YouTube tutorials to guide the process, and we opted for a low-poly design to match the stylistic direction we chose. The rocks were also created in Blender, using its built-in rock generator. By default, Blender generates rocks with a very high vertex count, which we decreased as needed to suit our requirements. The vegetation featured on the terrain consists of pre-made 3D assets, which we integrated into the environment.


Wind: The wind was created using Unity's particle systems. Three separate particle systems were created and combined into one prefab. Each system consists of a single particle whose movement is configured through its respective x, y, and z velocity curves. A trail was added, with its size adjusted to fade in and out, giving the system its wind-like visuals. This created a cartoon-style 3D wind element that fit the rest of our project.
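For illustration, one such wisp could be configured from script roughly like this (a sketch; the actual systems were tuned in the editor, and all curve values here are assumptions):

    using UnityEngine;

    [RequireComponent(typeof(ParticleSystem))]
    public class WindWisp : MonoBehaviour
    {
        void Start()
        {
            var ps = GetComponent<ParticleSystem>();

            var main = ps.main;
            main.maxParticles = 1; // each system drives a single wisp

            // Sideways swoop via an x velocity curve over the particle's lifetime.
            var vel = ps.velocityOverLifetime;
            vel.enabled = true;
            var sway = new AnimationCurve(
                new Keyframe(0f, 0f), new Keyframe(0.5f, 1f), new Keyframe(1f, -1f));
            vel.x = new ParticleSystem.MinMaxCurve(3f, sway);

            // The trail's width fades in and out for the cartoon wind look.
            var trails = ps.trails;
            trails.enabled = true;
            trails.widthOverTrail = new ParticleSystem.MinMaxCurve(0.2f, new AnimationCurve(
                new Keyframe(0f, 0f), new Keyframe(0.5f, 1f), new Keyframe(1f, 0f)));
        }
    }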

Procedural Terrain & River Generation: The terrain shape was created using Perlin noise and fractional Brownian motion. The terrain was also extended to be infinite, utilizing threading for improved performance, and the terrain meshes supported different levels of detail to improve performance further. The river running through the terrain was made by subtracting a fall-off map from the generated base mesh, which resulted in a carved-out passage through which the player could guide their boat. Obstacles in the river were placed randomly, with the frequency of placed obstacles depending on the distance the player had traversed in that play session. The scenery of the terrain consisted of trees, rocks and grass. The trees and rocks were simple 3D objects distributed onto the terrain, and the grass was implemented by a custom geometry shader that generated blades of grass at the terrain mesh vertices.
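A sketch of the heightmap idea: fractional Brownian motion sums octaves of Perlin noise, and a fall-off term carves the river channel. The octave count, the fall-off shape, and the assumption that the river runs along x = 0 are illustrative, not the project's actual values:

    using UnityEngine;

    public static class TerrainHeight
    {
        public static float Sample(float x, float z)
        {
            // fBm: each octave doubles the frequency and halves the amplitude.
            float height = 0f, amplitude = 1f, frequency = 0.01f;
            for (int octave = 0; octave < 5; octave++)
            {
                height += amplitude * Mathf.PerlinNoise(x * frequency, z * frequency);
                amplitude *= 0.5f;
                frequency *= 2f;
            }

            // River fall-off: subtract a smooth bump centred on the channel,
            // carving a passage the boat can travel through.
            float riverHalfWidth = 8f;
            float t = Mathf.Clamp01(Mathf.Abs(x) / riverHalfWidth);
            float falloff = 1f - t * t * (3f - 2f * t); // smoothstep: 1 at the centre
            return height - falloff * 2f;
        }
    }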

Water: The river water was created by developing a custom water shader graph in Unity. The graph consists of parts covering the water surface, the water reflection, the water refraction, and the underwater colouring. For the water surface, the graph simulates the feel of water by distorting a Voronoi-noise normal map and regulating the wave speed over time, giving the sense of water moving in the river's direction. For the reflection, a ready-made package (kMirrors) was used as a script reference in the graph, producing a mirror-like effect that reflects the surroundings of the riversides (grass, trees, sun). Within the reflection part, a 'reflection refraction' variable was added to account for the light passing through the water surface. Underwater, refraction was also used to add highlights to the water by applying the Fresnel equation paired with a glossy effect. The underwater colour was achieved by taking the scene colour and applying a transparent effect to it; in particular, the water colour combines the green and blue of the surroundings reflecting on the transparent water, just like in real-world settings.

Another element of the water system is the splash effect triggered when the boat collides with rock obstacles, enhancing both the fun and the realism of the water interactions. The splash was created as a particle system. For the particle material, a bubble-looking material using a bubble image sprite was created; its rendering mode is set to additive to keep only the bubble image and remove the background when the particles emit. The particle system is cone-shaped, with size, colour, and rotation changing over its lifetime to make the effect look more natural.
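For reference, Fresnel terms in shader graphs typically follow the common view-angle approximation below (stated here as an assumption about the graph's internals rather than a definitive description), where N is the surface normal, V the view direction, and p a user-chosen power; higher powers confine the highlight to grazing angles:

    F = \left(1 - \operatorname{saturate}(N \cdot V)\right)^{p}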

Challenges

The main challenge we faced was deciding on the type of interaction for the game. Initially, as mentioned earlier, we envisioned the experience being entirely microphone-based. The idea was for the user to act as the wind, blowing into the microphone to tilt the boat and avoid obstacles. However, we quickly realized that microphones are not designed to handle blowing directly into them.

Following this, we considered making the project frequency-based, requiring the user to produce a specific frequency by blowing into an instrument. This posed another issue: finding an instrument that could consistently produce the same frequency regardless of intensity. Furthermore, we recognized that the project would be presented in a noisy environment, and creating a game reliant entirely on sound would require advanced speech filtering—something none of us could achieve within the limited time available.

As a result, we shifted to a gesture-based interaction system, with sound controls playing only a minor role in the gameplay. This approach provided a more practical and effective solution to the challenges we encountered.

Related Work

We were initially inspired by a series of previous projects, such as Peak Panic from AGI2023, Plane Panic, and Astro-Rush from AGI 2024. These games shared a similar mechanic, where the player controlled a character or object along a path, aiming to avoid obstacles. In all three cases, the user controlled the object by tilting their phone. We decided to build on this concept but make a change by shifting to a gesture-based control mechanic instead.

Research Papers: The first paper explored microphone interaction in VR, which served as an inspiration for our study. It demonstrated that most participants preferred generating noise in the VR environment rather than using controllers. The paper presented various tasks. The first task allowed users to make a spring wheel rotate, while the second task involved making a ball roll on a surface in VR. Both tasks were performed either by blowing into the microphone or using controllers, with participants preferring the microphone interaction. Link To Paper

A master's thesis about real-time volumetric clouds inspired the clouds in our game: Link To Paper

Lessons Learned

Teamwork: From our previous project, we learned to manage multiple tasks effectively, prioritize them, and allocate responsibilities among team members. This course emphasized the importance of teamwork and effective communication, which became a cornerstone of our project's success.

Working Under Pressure: In contrast to our earlier project, this time we had only four weeks to conceive and develop Infinite Sailing. The increased pressure served as a catalyst, pushing us to work more efficiently and remain focused throughout the process.

Adapting to Circumstances: Adaptability was a key takeaway from this experience. We made significant changes towards the end of the project, including substituting the primary interaction method for controlling the boat, demonstrating our ability to pivot and adjust based on evolving circumstances.

New Forms of Interaction: This project introduced us to new technologies and techniques, such as hand tracking and microphone inputs in Unity. While we were already familiar with Unity, integrating Python scripts and utilizing Unity's microphone input functionality were entirely new challenges. These tasks pushed us to expand our skill set and explore innovative interaction methods within the platform.

User Testing