
"Normal for Norman?" VR // narrative experience in VR

"Normal for Norman?" - an experience exploring memory and narrative in virtual reality.

We made a short narrative VR game in which players experience different periods of the main character Norman's life: childhood, adulthood, and old age. I contributed to assets, environment design, lighting, game design, textures, and some interaction elements. We built the project in Unity.

credits:
Ece Seyrek Hasdogan
Doruk Hasdogan
Tommy Graven
Matthew Deline
Ben Tandy
Alex Fletcher
Jade Li Smith


Individual Contributions

Ece Seyrek Hasdoğan
   As a critical member of our art team, my responsibilities included creating the initial scene design of the room, finding asset models with the help of Alex and Doruk, modelling in Maya, finding, creating, and manipulating textures in Adobe Photoshop, making materials, attaching components (like colliders, rigidbodies, and throwables) to gameObjects, preparing and organizing the project's prefabs, and assisting with the placement of objects in rooms within the scene to make them distinct.
   I have a collection of free assets from a previous project that I used as a base for some of our initial assets, manipulating them significantly in Autodesk Maya to make them our own. Changes included making the objects low-poly, removing or adding detail, introducing structures, and creating collaged 3D objects by combining several different details from my library. I also made completely new objects like the wardrobe, a basic short square table, a simple lamp, and more.
   One way that I would add textures to these models was by finding black and white photographs to represent what I thought memories of Norman’s life should look like. There are at least four of these images in the game, each showing different life periods (like marriage, family, and trumpet playing) that were distorted with blur and stain effects to give the feeling that the memories are unclear.
    However, during our public playtest we received feedback that the narrative flow was confusing, so I continued to make new assets that told the story visually (like the black and white photographs) to make sure that our story transitions worked. I collaborated with Doruk to add new carpet textures, children's drawings, coffee mugs, a house in the exterior environment, and more. I also adjusted the colliders for all of these new objects (along with previous ones that weren't working properly) to make the experience more believable.

Doruk Hasdoğan
    During our initial discussions for the project, I didn't support the idea we chose. I felt it was too narrative-focused and wasn't unique to VR. Even though I believed the idea could also have been made as a first-person puzzle room game without VR, by voicing my opposition and pushing the team to think outside the box, we achieved something that genuinely needs VR. To do so, I drew on my previous experience developing for the Vive hardware, where I had become familiar with the assets we would need to start creating our scene. This is why I decided to focus on level design and the visuals that players interact with most.
    I also created and organized the GitHub repository, was responsible for component and model implementation, and attached scripts to game objects in an optimized way. I started by creating an empty Unity project and adding two assets, NewtonVR and SteamVR. My initial plan was to use NewtonVR's collision sound framework (which isn't implemented in the final build) with SteamVR's player controller, circular and linear drives for the drawers, and throwable objects for interactions. I made prefabs from Alex and Ece's assets by adding colliders, rigidbodies, weight and drag properties, and the relevant scripts.
    We used these prefabs while building the original version of the scene, which was then used along with Tommy's script to create our main scene with three distinct timelines. We placed the prefabs in copies of the room within the same scene, matching the folder organization the script needed to function. I used Unity's terrain, the tree generator, and the built-in Wind Zone component to create a natural-feeling outdoor environment. We briefly added depth of field and motion blur, but when we realized they caused severe simulation sickness we decided to stick with bloom and ambient occlusion as our only post-processing effects.
    During our public playtest we got feedback that the rooms weren't distinct enough, and testers weren't realizing that the scene was changing. Additionally, we had trumpet pieces that existed in every timeline, and severe bugs would occur when people put those pieces in odd places and then changed timelines. For example, incorrect objects (like drawers inside of beds) would spawn in the scene, causing their colliders to get stuck and launching the trumpet in a random direction outside of the play area, making the game impossible to finish. To fix this, we made and carefully laid out a significant number of new objects (like wheelchairs and musical instruments), materials (like wallpapers, posters, and new textures), and completely new exterior environments (like the city). This made the timeline changes much more noticeable.

Tommy Graven
    Throughout this project I worked as a developer: building unique scripts that were critical for the game to function, fixing bugs, optimizing performance, and working in person with the rest of the team to merge our work for each major milestone.
    I made my scripts as simple and flexible as possible, with intuitive variable names, clear comments, and loosely coupled functions, so that the rest of the team (primarily the developers) could easily implement new changes when testing in VR and our modeling team could understand them. This was a challenge for me, because I needed to plan for other team members, which is something I have not needed to do in the past.
    I started by implementing an FPS controller and a script for interacting with objects, and I used this controller throughout the whole project so I could iterate quickly between developing and playtesting without needing VR equipment at all times.
    Then I built a script that makes it possible to combine objects when they collide with each other, which we use for the trumpet pieces. It works by destroying the original gameObjects on collision and instantiating a new gameObject at the point of collision, based on which objects collided. This served as a base for our first playable prototype and was improved for the final build by factoring in the distance between the objects.
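    A minimal sketch of that combine-on-collision logic, assuming the script sits on one piece of each pair and a pre-built combined prefab (the "TrumpetPiece" tag and the field names are illustrative, not the project's actual identifiers):

```csharp
using UnityEngine;

// Sketch: combine two pieces into one prefab at the point of impact.
// Attach to only one piece of each pair so the combine runs once.
public class CombineOnCollision : MonoBehaviour
{
    public GameObject combinedPrefab; // hypothetical prefab of the joined pieces

    private void OnCollisionEnter(Collision collision)
    {
        // Only react to the matching trumpet piece.
        if (!collision.gameObject.CompareTag("TrumpetPiece")) return;

        // Spawn the combined object at the first contact point,
        // then remove both original pieces.
        Vector3 spawnPoint = collision.contacts[0].point;
        Instantiate(combinedPrefab, spawnPoint, transform.rotation);
        Destroy(collision.gameObject);
        Destroy(gameObject);
    }
}
```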
    To create the main mechanic, which gives the player the illusion that they are seamlessly changing timelines when they trigger certain events, we needed to make multiple rooms with different interiors, each representing a different timeline. When an event is triggered (such as picking up a trumpet piece), objects from the current interior progressively disappear while the player is not looking towards them, and are replaced with objects from the new interior so as not to break the illusion of being in that place.
    To make this easy, I designed the script so that when working on a new timeline interior, we make a new folder of objects and drag and drop it inside a list of timeline interiors within the inspector of a special gameObject (called RoomManager). Modelers can then customize the interior by adding and removing specific objects in these folders.
    Because this script would only work with objects that were organized in a certain way, I built a recursive function that iterates through each gameObject’s set of components to change the rigidbody, collider, and mesh renderer settings depending on whether or not the object is active in the scene.
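    A rough sketch of what such a recursive toggle could look like (the class and method names are my own illustration, not the project's actual script):

```csharp
using UnityEngine;

// Sketch: walk a timeline-interior folder and toggle the physics and
// rendering components on every object underneath it.
public static class InteriorToggler
{
    public static void SetInteriorActive(Transform root, bool active)
    {
        // Adjust this object's components, if present.
        Rigidbody body = root.GetComponent<Rigidbody>();
        if (body != null) body.isKinematic = !active;

        Collider col = root.GetComponent<Collider>();
        if (col != null) col.enabled = active;

        MeshRenderer rend = root.GetComponent<MeshRenderer>();
        if (rend != null) rend.enabled = active;

        // Recurse into children so nested prefabs are handled too.
        foreach (Transform child in root)
            SetInteriorActive(child, active);
    }
}
```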

Matthew Deline
    My primary responsibilities for this project were project management, narrative and experience design, and writing and editing the majority of our final report (aside from the individual and playtesting portions). If you’ve played the game, you will have also heard my voice as Norman.
     As group leader, I started by organizing brainstorming sessions and meetings to decide what we wanted our project to be and what roles each person would take during development. I created a Trello page for listing and assigning specific tasks, and was responsible for making sure that we stayed on track. Throughout the project, I organized meetings, internal playtests, and development sessions, and acted as liaison between teams when schedules didn't match up. I also organized, structured, and scheduled rehearsals for each of the in-class presentations for the project.
    Additionally, I co-wrote the narrative with Alex, assisted with the selection of objects for telling the story, and was responsible for the main puzzle and player experience design. I assisted with the implementation of audio assets (like public domain recordings, voice-over dialogue, and example audio for early prototypes), and recorded feedback for in-class playtests of said prototypes. In essence, I was the glue that kept the project together during development.

Benjamin Tandy
    My role within the team was to share C# programming duties with Tommy by implementing the VR interaction scripting outside of that afforded by the SteamVR asset and to focus on rapidly prototyping ideas for puzzles that we could test in VR. I also assisted Jade with the external playtesting session.
    My first task was to build an initial proof-of-concept prototype that showed household objects disappearing and reappearing outside of the player's vision (when they weren't looking) by writing a C# script that checked whether the object was within the view frustum of the "head camera". When the object was not, it would alter itself or remove itself completely. This was a successful effect that Tommy extended to work on a much larger scale.
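    A minimal sketch of that check, using Unity's GeometryUtility frustum test against the renderer's bounds (an assumption on my part; the component and field names are illustrative):

```csharp
using UnityEngine;

// Sketch: hide the object only while it is outside the head camera's
// view frustum, so the player never sees it change.
public class HideWhenUnseen : MonoBehaviour
{
    public Camera headCamera; // the VR headset camera

    private Renderer meshRenderer;

    private void Start()
    {
        meshRenderer = GetComponent<Renderer>();
    }

    private void Update()
    {
        Plane[] frustum = GeometryUtility.CalculateFrustumPlanes(headCamera);
        bool inView = GeometryUtility.TestPlanesAABB(frustum, meshRenderer.bounds);

        // Only act while the player cannot see the object.
        if (!inView)
            meshRenderer.enabled = false; // or alter / swap the object here
    }
}
```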
    I next built a simple experience using Unity and the NewtonVR plugin which allowed a player to pick up and try to combine parts of a trumpet that played different notes when the player held them to their mouth. We built it quickly over the course of an afternoon using primitives and playtested it the following week during class. It allowed us to confirm that players could understand the basic concept of picking up and attempting to combine objects in the context of our puzzle without explicit instruction in the form of a text or audio prompt.
    Because we had chosen to develop for the HTC Vive, we decided to build for room-scale tracking and attempted to design the rooms so that a teleportation mechanic would not be needed. We settled on a 2.5m x 2.5m space as a size we could rely on for development and showcasing the project. However, we found that when the player was in VR, a room of that size felt quite claustrophobic. We decided to use immovable furniture models (such as a bed and a coffee table) to extend the room beyond the bounds that the user could physically walk. A window was also implemented in the scene, which opened up the space further by showcasing the exterior environments built by Ece and Doruk.
    We also found that we needed to guide the player's attention to certain objects. To prompt the user to examine specific objects, I built a script that would dim the universal lights and bring up a spotlight when the player's eyeline was pointed at them. I also dropped the volume within the scene to mimic the dramatic quiet moments common in film-making. This had a pleasing effect but ended up causing confusion during playtesting, as it distracted from the primary puzzle of the trumpet pieces.
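    A sketch of how such a gaze-triggered spotlight might be wired up, using a ray along the head camera's forward vector as a stand-in for the player's eyeline (all names here are illustrative):

```csharp
using UnityEngine;

// Sketch: while the player looks at this object, dim the room light
// and fade up a spotlight aimed at it.
public class GazeSpotlight : MonoBehaviour
{
    public Camera headCamera;
    public Light roomLight;              // the "universal" light to dim
    public Light spotlight;              // spotlight aimed at this object
    public float dimmedIntensity = 0.2f;
    public float normalIntensity = 1f;
    public float fadeSpeed = 2f;

    private void Update()
    {
        RaycastHit hit;
        Ray gaze = new Ray(headCamera.transform.position,
                           headCamera.transform.forward);
        bool looking = Physics.Raycast(gaze, out hit, 10f)
                       && hit.transform == transform;

        // Cross-fade between normal lighting and the dramatic spotlight.
        float step = fadeSpeed * Time.deltaTime;
        roomLight.intensity = Mathf.Lerp(roomLight.intensity,
            looking ? dimmedIntensity : normalIntensity, step);
        spotlight.intensity = Mathf.Lerp(spotlight.intensity,
            looking ? 1f : 0f, step);
    }
}
```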
    Towards the end, I worked on implementing human hand models to replace the Vive controller models we had been using, as the controllers did not make sense in the historical era of the story. I worked with an asset from the Unity Asset Store, an untextured model that included a gripping animation. When I attached the hands to the default controller prefabs from SteamVR, it took a number of attempts to get the position and rotation of the hands to feel natural. Because the player grips the Vive controller like a gun, it was remarked during internal playtesting that it felt strange to see your hands open inside VR. We found that rotating the back of the hand model to be roughly in line with the back of the player's hand achieved the closest embodiment illusion we could hope for.
    My final duties were to implement programmatic cues within the project to play Alex and Matt's audio, build a script that allows the player to place and play vinyl records on the record player model, and do final testing and bug fixing before project completion.

Alex Fletcher
    As a member of both the audio and narrative teams, my assigned tasks for this project included sound design, storytelling, and collaborating on 3D model assets. My experience as a trumpet player, and my connections with musicians who have had to battle memory loss, helped us create a story about an elderly man who is trying to recollect his past memories as a professional trumpeter.
    To serve this purpose, I worked with our art team to establish objects and textures that would make sense in the world of the story. We collected a number of free assets and textures from the Unity Asset Store for each individual timeline. For unique objects (the window and blinds), we modelled them in Blender (using the built-in add-on 'Archimesh') and exported them as .obj files. I also modelled a trumpet to scale in Blender, made from multiple individual parts, which was used in the final build as the central piece of the puzzle. I also recorded a melody (Nat Adderley's "Work Song") on trumpet to be used when the trumpet is played.
    When designing sound, I worked with Jade to collect different sounds to tell a story through ambient effects. In combination with the narrative, this creates an embodied, nostalgic feeling through audio in VR. We layered foley in Logic Pro X, taking individual sound bites from a royalty-free sample website (zapsplat.com) and archived BBC radio shows. We then created two sets (for the outdoor and indoor environments) for each room. In total we have six .wav files of around 2min20sec each that can loop and tell their own story.
    For example, in the final room we have elderly people talking about their memories in the background. When you move closer to the window the outside sounds get louder, and when you move closer to the door the indoor sounds get louder.
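    A sketch of how that proximity mix could be scripted, cross-fading two looping AudioSources by distance (the names are illustrative, and Unity's built-in 3D rolloff on AudioSources is another way to get a similar effect):

```csharp
using UnityEngine;

// Sketch: fade the outdoor loop up near the window and the indoor
// loop up near the door, based on the player's distance to each.
public class RoomAmbienceMixer : MonoBehaviour
{
    public Transform player;        // the VR camera rig
    public Transform window;
    public Transform door;
    public AudioSource outdoorLoop; // looping outdoor ambience
    public AudioSource indoorLoop;  // looping indoor ambience
    public float falloff = 5f;      // metres over which a loop fades out

    private void Update()
    {
        float toWindow = Vector3.Distance(player.position, window.position);
        float toDoor = Vector3.Distance(player.position, door.position);

        // Louder outdoor sound near the window, louder indoor near the door.
        outdoorLoop.volume = Mathf.Clamp01(1f - toWindow / falloff);
        indoorLoop.volume = Mathf.Clamp01(1f - toDoor / falloff);
    }
}
```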
    In addition to environmental audio, I helped collect a set of audio effects to be used in interactions in the game, but we were unable to integrate it completely by the time of release. This would be what I would most like to work on if we continued development.

Jade Hall-Smith
    During this project, my role was to research disabilities which affect memory. Because dementia is a main influence for our VR application, most of my research was based on how dementia progresses and which parts of the brain it affects in each stage. This also covered other neurological conditions, including neurodevelopmental disorders that can cause difficulties with memory, to show how other disabilities impact memories.
    My other responsibility was to use my research in the narrative design, so we could create an accurate experience that was sensitive to people who suffer from these disorders. I created some ideas for puzzles that manipulate audio, relating to issues with the temporal lobe, the part of the brain affected during the early stages of dementia.
    I worked closely with Alex on the sound engineering, because we wanted players to really feel embodied in the VR application. To achieve this, I was responsible for documenting the sounds we collected from ZapSplat, YouTube, and Archive.org. We also used this document to keep track of which sounds should be played and when.
    It then fell to me to organize an external playtest with a group from the tabletop games society, ensuring there was a diverse mix for better results, including both inexperienced VR players and people with VR games experience. I collected feedback through a Google Form as well as interviews with each playtester, gathering data for analysis to help us build the final iteration of our VR application.
"Normal for Norman?" VR // narrative experience in VR
Published: