Virtual Reality and ChatGPT

Here I keep my virtual reality and ChatGPT experiments. If you want to be notified when there is something new you can follow me on the Twitters: @Tamulur. If you have any questions, suggestions or feedback, please email me at tamulur@yahoo.com.

VR Review did some very nice videos of some of my experiments: Video 1, Video 2.


[ChatGPT Demo] ChatGPT Monk at Pond

Have a spiritual chat with a ChatGPT-driven monk.

This works in Virtual Reality or on normal PCs. VR requires SteamVR.
You need to create a free account on OpenAI and enter your API key into the config file for the chat to work.

Download from itch.io


[Video] ChatGPT NPC coaches me talking to people at a party in VR

This is my fourth experiment with ChatGPT-driven NPCs in VR.


[ChatGPT Demo] ChatGPT Therapy

Chat with a ChatGPT-driven therapist with your voice.

This works in Virtual Reality or on normal PCs. VR requires SteamVR.
You need to create a free account on OpenAI and enter your API key into the config file for the chat to work.

Download from itch.io


[Video] Voice Conversation with ChatGPT-driven Barkeep in a Fantasy Tavern in VR

This is my second experiment with ChatGPT-driven NPCs in VR.


[Video] “Spiritual” Voice Conversation with ChatGPT-driven Monks at a Pond in VR

This is my first experiment with ChatGPT-driven NPCs in VR. Uses OpenAI’s Whisper for speech-to-text, GPT-3.5 Turbo for the “brain” of the NPCs, and ElevenLabs for text-to-speech.


[VR Demo] Big Robots Walking

This is a simple short scene that shows increasingly bigger robots walking past you. It demos the formula for setting animation speed based on character scale, described in the blog post Game Development: How much to slow animation down for giant creatures.
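As a rough illustration of the idea (not necessarily the exact formula from the blog post): if you treat swinging limbs as pendulums, their period grows with the square root of their length, so a creature scaled up by a factor k should play its animations about 1/sqrt(k) as fast. A minimal sketch, with a function name of my own:

```python
import math

def animation_speed(scale: float, base_speed: float = 1.0) -> float:
    """Playback speed for a character uniformly scaled by `scale`.

    Treating swinging limbs as pendulums, their period grows with the
    square root of their length, so a creature k times larger should
    animate 1/sqrt(k) times as fast to look physically plausible.
    """
    return base_speed / math.sqrt(scale)

# A robot 4x normal size plays its walk cycle at half speed:
print(animation_speed(4.0))  # 0.5
```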

Requires SteamVR and a compatible headset. Download from itch.io


[VR Demo] VR House Disco

I made this VR experience for when I feel like dancing. In this VR disco, music videos are streamed from YouTube. You can edit the selection of videos to your own liking. There are two dancing modes for dancing with other characters: slow-mo and mirroring.

Requires SteamVR and a compatible headset. Download from itch.io


[VR Demo] Back to the Metaverse

The first episode of a new virtual reality series.

“Hey Torty, here’s the Doc. Come over to my garage when you get this message. I need your help with another experiment. I’ve had this cool idea and made a thing and need your feedback. Don’t worry, it won’t take long; this time it’s not a trip to other timelines. Or about visiting remote branches of the multiverse. Or exploring weird alien worlds. Promised. I’m sure. At a very high confidence level. Yeah, so come over to help me with this cool thing.”

Requires SteamVR and a compatible headset. Download from itch.io


[VR Demo] Draw Hanzi Tutor

In this experimental app a tutor teaches you how to draw Chinese characters (hanzi). The idea is that it’s easier for the brain to learn the larger arm movements when painting a hanzi large on a canvas than when drawing it small on paper. Also, you see the teacher draw the hanzi first, which hopefully engages your mirror neurons to help you learn even better.

Standing experience for Vive, Rift and Windows MR headsets, requires hand controllers and SteamVR running on your PC.

Requires SteamVR and a compatible headset. DrawHanziTutor_0.3.zip


[VR Demo] Pumpkin Punch

A VR minigame for hand controllers made in 2013 for Halloween, now ported to the Vive and Rift.

Punch the fruits to prevent them from hitting the bell! You get more points for harder hits.

Standing experience for Vive and Rift, requires hand controllers and SteamVR running on your PC.

Requires SteamVR and a compatible headset. PumpkinPunch_0.4.zip


Legacy Experiments

From the times of Oculus DK1 and DK2. Haven’t been updated for newer headsets.

[Legacy VR Demo] Little Big Dungeon

This is primarily a test of switching between first- and third person perspective. Secondarily it is a test of dodging punches. Therefore it is best played standing; dodge with your upper body like a boxer. Explore the dungeon, switching between perspectives for better orientation and detailed view. But beware, a goblin roams the dungeon, the feared Iron Myke.

This is my first attempt to solve the problem of switching between first-person and top-down view in VR. The problem is that in first-person view you look straight ahead, while in top-down view you look about 45 degrees down. If you just switch from one to the other with a button press, you find yourself staring into the void or at the ground after the switch, which is jarring. Making you look straight ahead in top-down view by rotating the view 90 degrees to hang in front of you like a map on a wall doesn’t work either, because you feel that everything in the tiny world you see should be pulled down by gravity. The alternative solution tested in this demo (switching by head angle) works well enough for me so far.
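Switching by head angle boils down to a tiny state machine on head pitch, with some hysteresis so the view doesn’t flicker when your pitch hovers near the threshold. The names and threshold values below are illustrative assumptions, not taken from the demo:

```python
# Hedged sketch of view switching by head pitch (names and thresholds
# are assumptions, not the demo's actual values).
FIRST_PERSON, TOP_DOWN = "first_person", "top_down"
DOWN_THRESHOLD = -40.0   # degrees; looking this far down switches to top-down
UP_THRESHOLD = -20.0     # hysteresis: must look back up past this to return

def next_view(current: str, head_pitch_deg: float) -> str:
    """Pick the view mode from head pitch, with hysteresis so the
    view doesn't flicker while the pitch hovers near one threshold."""
    if current == FIRST_PERSON and head_pitch_deg <= DOWN_THRESHOLD:
        return TOP_DOWN
    if current == TOP_DOWN and head_pitch_deg >= UP_THRESHOLD:
        return FIRST_PERSON
    return current
```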

Sitting experience for the Oculus Rift, controls: mouse+keyboard or Xbox controller.

Windows, Oculus Rift: LittleBigDungeon_0.7.zip (legacy)


[Legacy VR Demo] Groundhog Station

Short experiment for the Razer Hydra. Not much replay value, but for a while it is interesting to have NPCs come up to you and interact with you like real humans.

A train arrives at the Groundhog Station. Initially, everybody is just standing idly. You are put into the role of one of the people on the platform or the train and can move around and act. After a minute or so, the train leaves, the scene resets, the train arrives again, and you are put into the role of a different person. The previous person plays back what you did last time (similar concept to Time Rifters). This way you can bring the scene to life by filling each person’s role and making them act out different things, like greeting each other with a handshake, or one person waving another goodbye.
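The loop can be sketched as a simple record-and-replay structure: record the player’s per-frame poses, store them under the role they played, then replay them through that character on the next pass. Class and method names here are my own, not taken from the demo:

```python
# Hedged sketch of the record-and-replay loop (structure assumed,
# not taken from the demo's actual code).
class RoleRecorder:
    """Records the player's pose each frame, then replays it when the
    scene resets and the player takes over a different character."""

    def __init__(self):
        self.recordings = {}   # role -> list of per-frame poses
        self.current = []      # poses recorded this round

    def record(self, pose):
        """Call once per frame while the player acts out a role."""
        self.current.append(pose)

    def finish_round(self, role):
        """Store this round's performance; an NPC replays it next loop."""
        self.recordings[role] = self.current
        self.current = []

    def replay(self, role, frame):
        """Pose for an NPC replaying `role`, or None past the end."""
        frames = self.recordings.get(role, [])
        return frames[frame] if frame < len(frames) else None
```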

Windows, requires Razer Hydra, Oculus Runtime older than 0.7: GroundHogStation_1.1.zip (legacy)


[Legacy VR Demo] Let’s Dance

Short experiment with delayed mimicry for the Razer Hydra. You need to be standing for this experiment. The NPCs mimic your dance moves, each with their own delay. The different delay is supposed to make them seem less automatic. I made this experiment to find out whether it feels fun to dance when NPCs dance with you in the same way.
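Per-NPC delayed mimicry reduces to a ring buffer of the player’s poses, drained with a fixed lag. A minimal sketch (class and parameter names are assumptions, not from the demo):

```python
from collections import deque

# Hedged sketch of delayed mimicry (names are assumptions).
class DelayedMimic:
    """Replays the player's poses after a fixed per-NPC delay, so each
    NPC copies the dance moves with its own lag."""

    def __init__(self, delay_frames: int):
        self.buffer = deque()
        self.delay = delay_frames

    def update(self, player_pose):
        """Feed the current player pose; return the pose this NPC
        should take (the player's pose `delay` frames ago), or None
        while there isn't enough history yet (hold the idle pose)."""
        self.buffer.append(player_pose)
        if len(self.buffer) > self.delay:
            return self.buffer.popleft()
        return None
```

Giving each NPC a different `delay_frames` is what makes them seem less automatic.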

Update: I was just dancing with the woman in the red vest and at the very moment when our virtual hands happened to touch, my real hand accidentally touched the Hydra base. My first flash of presence. Also, I almost crapped my pants.

IK system used: FinalIK

Controls:

Left Hydra:
Button 4: Recenter view
START: Recalibrate hands

Windows, requires Razer Hydra, Oculus Runtime 0.8: LetsDance_1.9.zip (legacy)


[Legacy VR Demo] Coffee without Words

A quick test to experiment with procedurally generated NPC eye movement.

I summarize what I used from the relevant papers (and similar ones) in this post. If you want to save time and don’t want to code that yourself, I made a Unity asset for this.

Sitting experience for the Oculus Rift.

Windows, Oculus Rift: CoffeeWithoutWords_1.7.zip (legacy)


[Legacy VR Demo] Movement Experiments

Mini-experiment to test different movement systems for reducing simulator sickness.

Keyboard controls:

Space: Recenter view

1: Standard movement mode
2: Canvas movement mode
3: Third person movement mode
4: Stepwise teleportation mode
5: Stroboscopic movement mode

Canvas mode: When you move or turn, a canvas that is anchored to your avatar’s body is partly faded in. The idea is that this makes the VR world look like it is projected onto a screen and the screen provides a frame of reference: when you turn, the world doesn’t turn around you, but only its projection on the screen turns. The screen is still relative to your avatar body, so your visual and vestibular system don’t get conflicting input. Press G to cycle through different canvas textures.

Third person mode: To move, keep the right mouse button pressed. The environment will be shown in an ambient occlusion rendering style. With the right mouse button pressed, use WASD to move your avatar and the mouse to turn it. Your viewpoint itself doesn’t move, but you can keep looking around with your head. To teleport your view into the avatar’s new position, release the right mouse button.

Stepwise teleportation mode: Keep the right mouse button pressed to enter movement mode. Look where you want to go; you will see a ghost avatar at the target point and a path leading there. You can turn his look direction with the mouse. Release the mouse to move; you will be teleported there in steps.

Keys:
7, 8: decrease/increase step size
9, 0: decrease/increase step duration

Stroboscopic mode: Stroboscopic view when you move or turn. It strobes by showing x frames in a row, then showing darkness for y frames in a row. To decrease/increase x (shown frames), press 7 and 8. To decrease/increase y (black frames), press 9 and 0.
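The strobe pattern reduces to a frame counter: render for x frames, black out for y, repeat. A sketch of that logic (the demo’s actual implementation may differ):

```python
# Hedged sketch of the stroboscopic pattern: show `shown` frames,
# then black out for `dark` frames, repeating while the player moves.
def frame_visible(frame_index: int, shown: int, dark: int) -> bool:
    """True if this frame should be rendered, False if blacked out."""
    return frame_index % (shown + dark) < shown
```

For example, with shown=3 and dark=2, frames 0-2 render, frames 3-4 are black, frame 5 renders again, and so on.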

I get less simulator sickness in both the Canvas and the 3rd person modes. The canvas mode is a bit more immersive, but the 3rd person mode lets me assume the avatar’s identity more.

Sitting experience for the Oculus Rift, controls: mouse+keyboard or Xbox controller.

Windows, Oculus Rift: MovementExperiments_0.9.zip (legacy)
Source: GitHub


[Legacy VR Demo] Subway Car

First game mode: When you walk around the subway car, people briefly look at you when they notice you. If you walk up to someone and keep staring at them, they will eventually stare back, the rest of the world fades into a different rendering style, and time slows down. This is supposed to intensify the effect of a computer character staring at you. There are other game modes as well; they are more or less about how you experience reality. Cycle through game modes with G. To show info about the current game mode, press H. But first try to find out what it does on your own.
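The stare-back trigger described above can be sketched as a gaze-dwell timer: accumulate how long the player has gazed at the same NPC, and fire once that exceeds a threshold. Names and the threshold value are assumptions for illustration:

```python
# Hedged sketch of a gaze-dwell trigger (names/threshold assumed).
STARE_SECONDS = 2.0

class StareDetector:
    """Accumulates how long the player has gazed at the same NPC and
    reports the NPC once the dwell time exceeds a threshold."""

    def __init__(self):
        self.target = None
        self.time = 0.0

    def update(self, gazed_npc, dt):
        """Call once per frame with the currently gazed-at NPC (or
        None). Returns the NPC to trigger the stare-back on, or None."""
        if gazed_npc != self.target:
            # Gaze moved to a new target (or away); restart the timer.
            self.target, self.time = gazed_npc, 0.0
            return None
        if gazed_npc is None:
            return None
        self.time += dt
        return gazed_npc if self.time >= STARE_SECONDS else None
```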

Sitting experience for the Oculus Rift, controls: mouse+keyboard or Xbox controller.

Windows, Oculus Rift: SubwayCar_1.4.zip (legacy)


[Legacy VR Demo] Three Voices

Mini-experiment for binaural sound, using the “cocktail party effect”, the phenomenon that we can concentrate on one voice out of several. This experiment tests whether the effect is stronger if the voices are generated binaurally (via the 3Dception plugin) than if they just use Unity’s standard 3D sound. The demo’s first phase plays three voices with Unity’s standard 3D sound. The second phase plays them binaurally. Start each phase by stepping into the light. You need headphones for the binaural effect, preferably good ones. You should turn off any surround-sound simulation you might have activated on your computer (CMSS 3D, Razer Surround, etc.). I find it much easier to concentrate on one story in the binaural phase.

Sitting experience for the Oculus Rift, controls: mouse+keyboard or Xbox controller.

Windows, Oculus Rift: ThreeVoices_1.6.zip (legacy)


[Legacy VR Demo] KnockKnock

This demo experiments with binaural sound effects in VR: some sound effects, like the footsteps, are generated with the 3Dception Unity plugin. Others, like the knock or the whisper, are recordings from a real binaural setup and are played back when the direction of the player’s head coincides with the recording. You need headphones for the binaural effect, preferably good ones. Credits:

Binaural knock: audiocheck.net
Binaural whispering woman: HeatherFeather
Binaural sound engine: 3Dception

Sitting experience for the Oculus Rift, controls: mouse+keyboard or Xbox controller.

Windows, Oculus Rift: KnockKnock_0.7.zip (legacy)


[Legacy VR Demo] Ambient Occlusion Room

Mini-experiment. Idea: rendering a scene with ambient occlusion only (no textures; a kind of black-and-white, lighting-independent rendering) makes the scene’s spatial structure super-parseable for the brain’s visual perception system. You immediately get the complete Where information of the whole scene just from your peripheral vision. But to get the What information of individual objects you need to see their texture, and if we let all objects show their texture, we don’t have the super-parseability of the ambient occlusion anymore. This demo experiments with rendering most of the scene in ambient occlusion only, and only the objects in the center of view or close to you with texture.

Sitting experience for the Oculus Rift, controls: mouse+keyboard or Xbox controller.

Windows, Oculus Rift: AmbientOcclusionRoom_0.5.zip (legacy)


[Legacy VR Demo] Time of Statues

This is a short experiment to test relaxation/time perception in VR. It is based on being moved very slowly through the environment. Go to the statue, look up at her until she notices you, then walk onto the platform and wait for her to lower her hand. Go onto the hand. From then on the demo is non-interactive. Just sit back and enjoy the music. For copyright reasons most music is streamed from YouTube. If you don’t have a good internet connection the music might not play at the right time.

Sitting experience for the Oculus Rift, controls: mouse+keyboard or Xbox controller.

Windows, Oculus Rift: TimeOfStatues_1.3.zip (legacy)


[Legacy VR Demo] Ask Socrates

This is a quick little experiment to try out how VR can help with brainstorming. When thinking about an issue, it often helps to look at it from the viewpoint of someone else. This demo lets you alternate between being yourself and being Socrates, who keeps questioning everything you say, until some great insight emerges. Start the demo and type whatever issue is on your mind. Keep it short, maybe one or just a few sentences. Whatever you type appears on your notepad. Then press Enter. Your viewpoint will change to Socrates, and what you wrote appears on the blackboard. As Socrates, type whatever cynical/critical remark the grumpy old philosopher would have come up with when confronted with such youthful naivety. Continue the dialogue until either you are too confused to go on or you have a revelation. Then press Escape to quit the demo. A text file containing the whole dialogue will be saved in the demo directory. May you be enlightened.

Sitting experience for the Oculus Rift, controls: keyboard.

Windows, Oculus Rift: AskSocrates_0.6 (legacy)
Source: GitHub


[Legacy VR Demo] Avatar Experiments

A quick test for the old Oculus Development Kit 1 of seeing your avatar in a mirror with the Oculus Rift and Razer Hydra, based on Dessimat0r’s demo GirlMirrorLook. There are several avatars in the scene, and you can switch between them. If you have Hydras you can control your avatar’s hands, but the demo runs without them as well.

Windows, requires DK1 and Razer Hydra: AvatarExperiments_02.zip (legacy)
Source: GitHub