Here I keep my experiments with the Oculus Rift virtual reality headset. If you want to be notified when there is something new you can follow me on the Twitters: @Tamulur. If you have any questions, suggestions or feedback, please email me at firstname.lastname@example.org.
This is primarily a test of switching between first- and third-person perspective. Secondarily it is a test of dodging punches. Therefore it is best played standing; dodge with your upper body like a boxer. Explore the dungeon, switching between perspectives for better orientation and detailed view. But beware, a goblin roams the dungeon, the feared Iron Myke.
This is my first attempt to solve the problem of switching between first-person and top-down view in VR. The problem is that in first-person view you look straight ahead, while in top-down view you look about 45 degrees down. If you simply switch from one to the other with a button press, you find yourself staring into the void or at the ground after the switch, which is jarring. Making you look straight ahead in the top-down view by rotating it 90 degrees to stand in front of you like a map on a wall doesn't work either, because you feel that everything in the tiny world you see should be pulled down by gravity. The alternative solution tested in this demo (switching by head angle) works well enough for me so far.
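A minimal sketch of how such head-angle switching could work. The pitch thresholds and the hysteresis band are my own illustrative values, not the ones used in the demo; the hysteresis just keeps the view from flickering when your head hovers near the switch angle.

```python
# Switch view mode based on head pitch, with hysteresis.
# Thresholds are illustrative assumptions, not taken from the demo.

FIRST_PERSON, TOP_DOWN = "first_person", "top_down"

def update_view(mode, pitch_deg, down_threshold=-30.0, up_threshold=-15.0):
    """Return the new view mode for the current head pitch.

    pitch_deg: 0 = looking straight ahead, negative = looking down.
    Hysteresis: switch to top-down only below down_threshold, and back
    to first person only above up_threshold.
    """
    if mode == FIRST_PERSON and pitch_deg < down_threshold:
        return TOP_DOWN
    if mode == TOP_DOWN and pitch_deg > up_threshold:
        return FIRST_PERSON
    return mode
```

Calling this every frame with the headset's pitch gives a switch that happens only on a deliberate look down or up, never on small head jitter.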
Short experiment for the Razer Hydra. Not much replay value, but for a while it is interesting to have NPCs come up to you and interact with you like real humans.
A train arrives at the Groundhog Station. Initially, everybody is just standing idly. You are put into the role of one of the people on the platform or the train and can move around and act. After a minute or so, the train leaves, the scene resets, the train arrives again, and you are put into the role of a different person. The previous person plays back what you did last time (similar concept to Time Rifters). This way you can bring the scene to life by filling each person's role and making them act out different things, like greeting each other with a handshake, or one person waving another goodbye.
Windows, Oculus Runtime older than 0.7: GroundHogStation_1.1.zip
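The record-and-replay loop behind this could be sketched like this. What exactly counts as an "action" (a pose, a hand position) is up to the implementation; this is just the skeleton of recording one role per scene loop and replaying it after the reset.

```python
class RoleRecorder:
    """Records one player's actions frame by frame during a scene loop,
    then replays them when the scene resets and the role becomes an NPC.
    What an 'action' is (pose, hand positions, ...) is left abstract."""

    def __init__(self):
        self.frames = []

    def record(self, action):
        # Called every frame while the player controls this role.
        self.frames.append(action)

    def replay(self, frame_index):
        # Called every frame of the next loop for the now-NPC role.
        if frame_index < len(self.frames):
            return self.frames[frame_index]
        return None  # recording ended before this point: stand idle
```

One such recorder per person on the platform, and each reset promotes the player's last recording to an NPC playback track.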
Short experiment with delayed mimicry for the Razer Hydra. You need to be standing for this experiment. The NPCs mimic your dance moves, each with their own delay. The different delay is supposed to make them seem less automatic. I made this experiment to find out whether it feels fun to dance when NPCs dance with you in the same way.
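The per-NPC delay can be implemented as a simple ring buffer of recent player poses; each NPC reads from the buffer at its own offset. A minimal sketch (the pose here is just an opaque value; in the demo it would be joint rotations):

```python
from collections import deque

class DelayedMimic:
    """Replays the player's poses after a fixed per-NPC delay in frames,
    so each NPC dances the same moves slightly out of sync."""

    def __init__(self, delay_frames):
        self.buffer = deque()
        self.delay = delay_frames

    def update(self, player_pose):
        # Push the newest player pose, pop the one from delay_frames ago.
        self.buffer.append(player_pose)
        if len(self.buffer) > self.delay:
            return self.buffer.popleft()  # pose the NPC shows this frame
        return None                       # not enough history yet
```

Giving every NPC a different `delay_frames` is what makes the group feel less automatic than a perfectly synchronized mirror.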
Update: I was just dancing with the woman in the red vest and at the very moment when our virtual hands happened to touch, my real hand accidentally touched the Hydra base. My first flash of presence. Also, I almost crapped my pants.
IK system used: FinalIK
Button 4: Recenter view
START: Recalibrate hands
Some of the papers used:
- “Realistic Avatar Eye and Head Animation Using a Neurobiological Model of Visual Attention” Itti, Dhavale, Pighin
- “Eyes Alive” SIGGRAPH 02, Lee, Badler, Badler
Windows, Oculus Runtime 1.4 or newer: CoffeeWithoutWords_1.7.zip
Windows, Oculus Runtime 0.8: CoffeeWithoutWords_1.6.zip
Windows, Oculus Runtime older than 0.7: CoffeeWithoutWords_1.3.zip
Miniexperiment to test different movement systems to reduce simulator sickness.
Space: Recenter view
1: Standard movement mode
2: Canvas movement mode
3: Third person movement mode
4: Stepwise teleportation mode
5: Stroboscopic movement mode
Canvas mode: When you move or turn, a canvas that is anchored to your avatar’s body is partly faded in. The idea is that this makes the VR world look like it is projected onto a screen and the screen provides a frame of reference: when you turn, the world doesn’t turn around you, but only its projection on the screen turns. The screen is still relative to your avatar body, so your visual and vestibular system don’t get conflicting input. Press G to cycle through different canvas textures.
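The partial fade-in could be as simple as moving the canvas alpha toward a target value whenever you move or turn, and back to zero when you stop. The fade speed and maximum opacity below are illustrative assumptions, not the demo's values.

```python
def canvas_alpha(alpha, moving, fade_speed=4.0, dt=1.0 / 90.0, max_alpha=0.6):
    """Step the canvas opacity one frame toward its target:
    partly visible (max_alpha) while moving/turning, invisible when not.
    fade_speed and max_alpha are illustrative values."""
    target = max_alpha if moving else 0.0
    step = fade_speed * dt
    if alpha < target:
        return min(alpha + step, target)
    if alpha > target:
        return max(alpha - step, target)
    return alpha
```

Run once per frame at 90 Hz, this fades the canvas in over roughly a sixth of a second, which is fast enough to frame the motion but slow enough not to pop.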
Third person mode: To move, keep the right mouse button pressed. The environment will be shown in an ambient occlusion rendering style. With the right mouse button pressed, use WASD to move your avatar and the mouse to turn it. Your viewpoint itself doesn’t move, but you can keep looking around with your head. To teleport your view into the avatar’s new position, release the right mouse button.
Stepwise teleportation mode: Keep the right mouse button pressed to enter movement mode. Look where you want to go; you will see a ghost avatar at the target point and a path leading there. You can turn his look direction with the mouse. Release the mouse to move; you will be teleported there in steps.
7, 8: decrease/increase step size
9, 0: decrease/increase step duration
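Breaking the move into discrete hops could look like this: split the straight line to the target into hops of at most the step size, ending exactly on the target. Positions here are 2D ground coordinates; the actual demo follows a path, which this sketch simplifies to a straight line.

```python
import math

def teleport_steps(start, target, step_size):
    """Split a straight-line move into discrete teleport hops of at most
    step_size each, ending exactly at the target. Positions are (x, z)."""
    dx, dz = target[0] - start[0], target[1] - start[1]
    dist = math.hypot(dx, dz)
    n = max(1, math.ceil(dist / step_size))  # number of equal hops
    return [(start[0] + dx * i / n, start[1] + dz * i / n)
            for i in range(1, n + 1)]
```

The step duration from the key bindings above would then just be the pause inserted between consecutive hops.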
Stroboscopic: Stroboscopic view when you move or turn. It strobes by showing x frames in a row, then showing darkness for y frames in a row. To decrease/increase x (shown frames), press 7 and 8. To decrease/increase y (black frames), press 9 and 0.
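The strobe pattern itself is a one-liner on the frame counter, sketched here with the same x shown / y dark scheme:

```python
def strobe_visible(frame, shown_frames, dark_frames):
    """True if this frame should be rendered, False if blacked out.
    The cycle is shown_frames of visibility followed by dark_frames
    of darkness, repeating."""
    return frame % (shown_frames + dark_frames) < shown_frames
```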
I get less simulator sickness in both the Canvas and the 3rd person modes. The canvas mode is a bit more immersive, but the 3rd person mode lets me assume the avatar’s identity more.
Windows, Oculus runtime 1.4 or newer: MovementExperiments_0.9.zip
Windows, Oculus runtime 0.8: MovementExperiments_0.8.zip
Windows, Oculus runtime older than 0.7: MovementExperiments_0.7.zip
First game mode: When you walk around the subway car, people briefly look at you when they notice you. If you walk up to someone and keep staring at them, he or she will eventually stare back, the rest of the world fades into a different rendering style, and time slows down. This is supposed to intensify the effect of a computer character staring at you. There are other game modes as well; they are more or less about how you experience reality. Cycle through game modes with G. To show info about the current game mode, press H. But first try to find out what it does on your own.
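The "keep staring until they stare back" trigger is essentially a dwell timer on the player's gaze. A minimal sketch (the two-second trigger time is my own illustrative value):

```python
def update_stare(timer, is_staring_at_npc, dt, trigger_time=2.0):
    """Accumulate time spent staring at an NPC; looking away resets it.
    Returns (new_timer, triggered); triggered is True once the dwell
    time reaches trigger_time. trigger_time is an illustrative value."""
    timer = timer + dt if is_staring_at_npc else 0.0
    return timer, timer >= trigger_time
```

Once `triggered` fires, the demo's effects (the stare back, the rendering change, the time slowdown) would start.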
Mini-experiment for binaural sound, using the "cocktail party effect", the phenomenon that we can concentrate on one voice out of several. This experiment tests whether the effect is stronger if the voices are generated binaurally (via the 3Dception plugin) than if they just use Unity's standard 3D sound. The demo's first phase plays three voices with Unity's standard 3D sound. The second phase plays them binaurally. Start each phase by stepping into the light. You need headphones for the binaural effect, preferably good ones. You should turn off any surround-sound simulation you might have activated on your computer (CMSS 3D, Razer Surround, etc.). I find it much easier to concentrate on one story in the binaural phase.
Windows, Oculus runtime 1.4 or newer: ThreeVoices_1.6.zip
Windows, Oculus runtime 0.8: ThreeVoices_1.5.zip
Windows, Oculus runtime older than 0.7: ThreeVoices_1.4.zip
Mac, DK1: ThreeVoices_Mac_1.2.zip
This demo experiments with binaural sound effects in VR: Some sound effects, like the footsteps, are generated with the 3Dception Unity plugin. Others, like the knock or the whisper, are recordings from a real binaural setup and are played back in this demo when the direction of the player’s head coincides with the recording. You need headphones for the binaural effect, preferably good ones. Credits:
Miniexperiment. Idea: Rendering a scene with ambient occlusion only (no textures, a kind of black-and-white, lighting-independent rendering) makes the scene's spatial structure super-parseable for the brain's visual perception system. You immediately get the complete Where information of the whole scene just from your peripheral vision. But to get the What information of individual objects you need to see their texture, and if we let all objects show their texture we lose the super-parseability of the ambient occlusion. This demo experiments with rendering most of the scene in ambient occlusion only, with just the objects in the center of view or close to you rendered with texture.
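The per-object decision could be a simple test against gaze angle and distance. The angle and distance thresholds below are my own illustrative values, not the demo's:

```python
import math

def show_texture(obj_pos, head_pos, gaze_dir, max_angle_deg=15.0, near_dist=2.0):
    """True if the object should be rendered with its texture, False if it
    stays in the plain ambient-occlusion style: textured when near the
    center of view or close to the player. gaze_dir must be unit length.
    Thresholds are illustrative assumptions."""
    to_obj = [o - h for o, h in zip(obj_pos, head_pos)]
    dist = math.sqrt(sum(c * c for c in to_obj))
    if dist < near_dist:  # close objects are always textured
        return True
    # Angle between gaze direction and direction to the object.
    cos_angle = sum(a * b for a, b in zip(to_obj, gaze_dir)) / dist
    return cos_angle > math.cos(math.radians(max_angle_deg))
```

In practice you would also want to fade between the two styles rather than switch hard, so objects don't pop as your gaze sweeps past them.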
This is a short experiment to test relaxation/time perception in VR. It is based on being moved very slowly through the environment. Go to the statue, look up at her until she notices you, then walk onto the platform and wait for her to lower her hand. Go onto the hand. From then on the demo is non-interactive. Just sit back and enjoy the music. For copyright reasons most music is streamed from YouTube. If you don’t have a good internet connection the music might not play at the right time.
This is a quick little experiment to try out how VR can help with brainstorming. When thinking about an issue, it often helps to look at it from the viewpoint of someone else. This demo lets you alternate between being yourself and being Socrates, who keeps questioning everything you say, until some great insight emerges.
Start the demo and type whatever issue is on your mind. Keep it short, maybe just one or a few sentences. Whatever you type appears on your notepad. Then press Enter. Your viewpoint will change to Socrates, and what you wrote appears on the blackboard. As Socrates, type whatever cynical/critical remark the grumpy old philosopher would have come up with when confronted with such youthful naivety.
Continue the dialogue until either you are too confused to go on or you have a revelation. Then press Escape to quit the demo. A text file containing the whole dialogue will be saved in the demo directory.
May you be enlightened.
I created this small game with James and Alex to try out punching with Razer Hydra controls and the Oculus Rift headset.
Windows, DK1: PumpkinPunch_02.zip
A quick test of seeing your avatar in a mirror with Oculus Rift and Razer Hydra, based on Dessimat0r's demo GirlMirrorLook. There are several avatars in the scene; you can switch between them. If you have Hydras you can control your avatar's hands, but the demo runs without them as well.
Windows, DK1: AvatarExperiments_02.zip