
Hello again folks!

This week, I pick up where I left off last week with Localizing FATED in Unreal Engine 4 – Part 1. Having already covered the basics of text localization, I’ll follow up with voice-over localization using Wwise, as well as other things to consider when localizing in Unreal Engine 4.

Localizing Voice-Over Using Wwise

Wwise is a really great tool, and I can’t imagine doing audio work without it ever again. The Unreal Engine integration is great, even if some stuff is missing, namely localization support. Fortunately, all the groundwork is pretty much done, meaning there are only a couple of things you need to modify to make it all work.

Adding localized audio files in your Wwise project

The first thing you want to do is actually set up your localized audio files in your Wwise project using the Wwise authoring tool. This is very easy to do and requires no additional step in the UE4 editor, since what you import in Unreal (Events and Banks) is language-agnostic. Basically, the event stays the same, but the audio files played will differ. The only thing you have to do is import the new localized audio files over the already existing audio files in Wwise. Right-click on an audio file (or folder) and select “Import Audio Files…”.
This screen will show up:

Audio File Importer
Be sure to select “Localize languages” in Import Mode, and the desired language in “Destination language”.


Select the preview language in the Wwise authoring tool

Now, you do this for every VO file in your game. It is important that every localized file name matches the original VO file name. You can import by folder to go faster. You can also set your project to the desired language; that way, Wwise recognizes which audio files have a counterpart file for the selected language, which makes it easy to know which audio does not yet have a valid localization. Once this is done, you are finished on the Wwise side. Let’s return to Unreal!

Generating banks for all supported languages

In AkAudioBankGenerationHelpers.cpp you will find the GenerateSoundBanks() function. In there, the following line specifies which bank language will be generated:

CommandLineParams += TEXT(" -Language English(US)");

For some reason, by default it only generates the English banks, so basically you have to add each language you want to generate. Even simpler, just comment out the line entirely: this makes it generate all the languages specified in your Wwise project.
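For reference, the change can be as small as this (a sketch of the relevant spot; the exact line may differ between versions of the Wwise integration):

// In AkAudioBankGenerationHelpers.cpp, inside GenerateSoundBanks():
// with the -Language flag commented out, WwiseCLI generates banks for every
// language defined in the Wwise project instead of English only.
//CommandLineParams += TEXT(" -Language English(US)");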

Specifying Culture in the Game

Now that the text and voice-overs have been generated in every supported language, you actually need to specify which language you want the game to show. In general, you’ll want to do this operation at the very start of the game. Unreal offers a simple way to launch the game in the desired culture using a launch argument: -culture="fr".

This will effectively trigger the change of culture on the text side; unfortunately, it does not do anything regarding the Wwise banks. Instead of using the -culture argument, let’s dive in and see the actual calls needed to make the culture switch happen.

For the text, the FInternationalization class is used to set different cultures. Here is an example of how to set the game in French:

FInternationalization::Get().SetCurrentCulture(TEXT("fr"));

This will only change the text, however. For the Wwise content, you need to make yet another small change. For that purpose, I added a function to the UAkGameplayStatics class that I called void SetLanguageToCurrentCulture().

In this function, the first step is to get the current culture:

 FString currentCulture = FInternationalization::Get().GetCurrentCulture()->GetName();

After that, you can assign the Wwise current language this way:

AK::StreamMgr::SetCurrentLanguage(AKTEXT("French(Canada)"));

The language name is a string matching how the language is named inside the Wwise authoring tool.

Optionally, you can also get the current Wwise language and unload the VO bank if the new language is not the same as what was loaded.

AK::StreamMgr::GetCurrentLanguage();
UnloadBankByName(TEXT("Act01_VO"));
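
Putting those pieces together, here is roughly what the whole helper could look like. This is a minimal sketch, not the exact FATED code: the culture-to-language mapping and the bank name are examples, it reuses the UnloadBankByName helper shown above (plus a matching LoadBankByName), and it assumes Windows, where AkOSChar converts to FString directly.

void UAkGameplayStatics::SetLanguageToCurrentCulture()
{
    FString currentCulture = FInternationalization::Get().GetCurrentCulture()->GetName();

    // Map the Unreal culture to the language name defined in the Wwise project.
    const AkOSChar* newLanguage = AKTEXT("English(US)");
    if (currentCulture.StartsWith(TEXT("fr")))
    {
        newLanguage = AKTEXT("French(Canada)");
    }

    // Only swap languages (and reload the VO bank) if something actually changed.
    if (FString(AK::StreamMgr::GetCurrentLanguage()) != FString(newLanguage))
    {
        UnloadBankByName(TEXT("Act01_VO"));
        AK::StreamMgr::SetCurrentLanguage(newLanguage);
        LoadBankByName(TEXT("Act01_VO"));
    }
}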

Now you should have everything you need to have your game’s text and voice-overs localized.

IMPORTANT NOTE: For some reason, Unreal does not allow the culture to be changed directly in the editor. This can be frustrating if you want to preview content in another culture. Fortunately, there is a way to remedy the situation quite easily.

In TextLocalizationManager.cpp, you can find the OnCultureChanged() function. What you want to do is remove the FApp::IsGame() from the ShouldLoadGame initialization.

//const bool ShouldLoadGame = bIsInitialized && FApp::IsGame(); // Before
const bool ShouldLoadGame = bIsInitialized; // After

IMPORTANT NOTE 2: Yet another “annoying” default behavior is that a packaged game will not fall back to the “root” culture when the specified culture is more specific. For example, if your game is set to “fr-CA”, it will not fall back to “fr” but rather to the default culture, which is “en”. Yet again, a small change will fix that.

In ICUInternationalization.cpp, you can find the SetCurrentCulture(const FString&) function. You simply want to allow fallback, which is something Unreal already supports, just not by default.

// Allow fallback: this makes it so that we can use "root" languages like "en" and "fr"
// without needing to specify "en-US", "fr-CA", "fr-CD", "fr-CF"...
FCulturePtr NewCurrentCulture = FindOrMakeCulture(Name, true /*Allow Fallback*/);

Getting the Steam App Language

I won’t go over how to set up your game for Steam (there are already a few good tutorials out there for that), but I’ll talk about the snippet of code needed to use the language currently selected in Steam.

First, I added FString GetSteamAppLanguage(); to OnlineSubsystemSteam. You could add it to your own version of OnlineSubsystemSteam instead, but I felt it was enough of a “core” feature to be added directly to OnlineSubsystemSteam.
Here is the function in its entirety:

FString FOnlineSubsystemSteam::GetSteamAppLanguage()
{
    FString langFString = TEXT("english"); // Default
    ISteamApps* steamApps = SteamApps();
    if (steamApps)
    {
        langFString = FString(ANSI_TO_TCHAR(steamApps->GetCurrentGameLanguage()));
    }
    return langFString;
}

SteamApps() won’t always be valid, depending on factors like whether the game was launched from Steam or not, so be sure to have a default value to fall back on.
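Note that Steam reports full language names like “english” or “french”, while Unreal cultures use codes like “en” and “fr”, so you will need a small translation step before calling SetCurrentCulture. A minimal sketch (the helper name and mapping are mine, and they only cover our two languages):

// Hypothetical helper: map Steam's API language names to Unreal culture codes.
static FString SteamLanguageToCulture(const FString& steamLanguage)
{
    if (steamLanguage == TEXT("french"))
    {
        return TEXT("fr");
    }
    return TEXT("en"); // Default culture
}

// At startup, something like:
// FInternationalization::Get().SetCurrentCulture(SteamLanguageToCulture(GetSteamAppLanguage()));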

Unreal is certainly going to add a lot to its localization toolset; in fact, it’s already happening in 4.10.1 with the OneSky localization service plugin, which will come in handy if you’re thinking about using that service for your game. There is also the experimental “Localization Dashboard” that can be activated in the “Experimental” section of Editor Preferences.


I did not use the Dashboard extensively, but it promises to one day remove all that “.ini” manipulation (which is not that user-friendly…) and make it all transparent through a nice visual tool. It seems to work well enough, but all it does is manipulate the files we talked about earlier, so it is still relevant to know how they work.

This is it for localization in UE4. Obviously, there is more to it than what I addressed in this post, but you should be well on your way to making it work in your own game. Hope this was helpful, and don’t hesitate to ask questions in the comments section. Stay tuned for more on FATED, as we quickly approach the release date!

Mick / Lead Programmer / @Mickd777
www.fatedgame.com

Hello again!

Last month, I was tasked with localizing FATED in French. As some of you may be aware, Frima Studio is a Québec City-based studio that predominantly speaks French, so it was fun to finally have the script integrated in our native language. One of our characters is even voiced by Nicolas Cage’s official French voice actor. How cool is that?!

Unreal is equipped with a lot of game localization tools, even if they’re not quite as polished as the rest of the engine. In this post, I’ll explain how localization works, from text to voice-over. I’ll also give a few tips on how to modify the engine to allow different cultures directly in the editor, which changes are needed to support localization for Steam builds, and, finally, which modifications are required to make the Wwise UE4 integration fully localization-ready. In short, how we managed to get a fully working localized build for FATED!

Before reading on, take note that I worked on the localization using Unreal 4.8.2. We recently upgraded to 4.10.1, so while I can confirm that this is all still valid, some new features may have been added that I’m not aware of.

Localizing Text

If you’re familiar with UE3, you’ll notice how different the localization system is now; Epic pretty much ditched their old system. While UE3’s localization system was lacking in some ways, I personally find that UE4’s sometimes trades simplicity for a more robust and versatile system that is unfortunately not quite ready yet, which sometimes leads to confusion. In UE3, you just had to use a LOCALIZE macro with some parameters that would point to the correct “.ini” file containing the localized text. In UE4, the process is a bit more convoluted, but once you’ve familiarized yourself with its intricacies, it’s quite easy to use. Now let’s dive in!

FString and FText
If you’re familiar with Unreal development, you already know about FString, the custom string implementation used throughout the engine. For localization, Epic introduced a new class, FText, which needs to be used whenever you wish to localize text content. Using FText marks the text to be collected in a later phase of the localization process, the “gathering” phase, which uses a specific commandlet (namely the GatherText commandlet).

NSLOCTEXT
When writing text directly in code, you need to use the NSLOCTEXT macro. This macro takes three parameters: the namespace of the text, a key to represent this text, and the string literal in the default language, which in our case is English. It looks something like this:

FText chapterSelectTxt = NSLOCTEXT("MainMenuUI", "ChapterSelect", "Chapter Selection");

This namespace and key will later determine how your language archive is generated; we will look at the generated .archive file in a moment.
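
As an aside, when a file contains many strings sharing the same namespace, Unreal also provides the LOCTEXT macro, which picks the namespace up from a #define so you don’t have to repeat it on every line:

#define LOCTEXT_NAMESPACE "MainMenuUI"

FText chapterSelectTxt = LOCTEXT("ChapterSelect", "Chapter Selection");

#undef LOCTEXT_NAMESPACE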

GatherText Commandlet
The next step is to actually gather all the FText from your project. This includes, for example, text from every blueprint instance in every map of your game. For this to work, you need to start the UE4 editor with a specific command line. I find that the best way to do this is to create a batch file, so I created LocalizeJotunn.bat (Jotunn is the internal codename for FATED), which is located in the same folder as the .uproject file.

set PROJECT_PATH=%CD%
..\..\Engine_4.10.1\Engine\Binaries\Win64\UE4Editor.exe %PROJECT_PATH%/Jotunn.uproject -Run=GatherText -config="%PROJECT_PATH%/Config/Localization/Game.ini" -log > localization.log

In that file, you can notice a reference to a file named Game.ini in Config/Localization/. You need to create that file: this is where the entire configuration for the GatherText commandlet resides. You can find our config file here: configFile
I strongly recommend you start from that file and adjust it to your needs. The file has different sections; let’s take a look at them.

[CommonSettings]
This is where you set where the localization files will reside. It is important to use the same path that is set under the [Internationalization] section of BaseGame.ini (or your custom %YourGameName%Game.ini). By default, this is the path you should see:

[Internationalization]
+LocalizationPaths=%GAMEDIR%Content/Localization/Game

CommonSettings is also where you set the native culture of your game (its default language). After that, using the CulturesToGenerate property, you can list all the languages for which you need to create localization files.

[GatherTextStep*]
There will be a number of GatherTextStep sections, each with its own commandlet class. The two most important ones to check are GatherTextFromSource and GatherTextFromAssets. GatherTextFromSource scans the source code for the NSLOCTEXT macros I mentioned earlier, while GatherTextFromAssets scans for FText in your .uasset and .umap files.

The documentation on this on the Web is not up to date (at least it wasn’t when I worked on the localization), so follow our file for that. You will mainly want to verify the paths for the objects to gather text from. BEWARE! Some of the required parameters are paths (SearchDirectoryPaths) and some are filters (IncludePathFilters). Both kinda look the same, but for filters you don’t want to miss out on the required asterisk (*)!
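To make this concrete, here is a rough sketch of the shape of such a config. This is illustrative only, not our actual file: key names changed between engine versions, so check the commandlet sources for yours.

; Config/Localization/Game.ini (illustrative sketch)
[CommonSettings]
SourcePath=Content/Localization/Game
DestinationPath=Content/Localization/Game
ManifestName=Game.manifest
ArchiveName=Game.archive
NativeCulture=en
CulturesToGenerate=en
CulturesToGenerate=fr

[GatherTextStep0]
CommandletClass=GatherTextFromSource
; SearchDirectoryPaths takes a path; IncludePathFilters takes a filter (note the asterisk)
SearchDirectoryPaths=Source/Jotunn
IncludePathFilters=Source/Jotunn/*
FileNameFilters=*.cpp
FileNameFilters=*.h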

I personally found that GatherTextFromAssets was getting too much FText I did not want to localize in the first place. We use a lot of TextRenderComponent in the game that are used in the editor only, and this was polluting the localization archive. Since FATED doesn’t have that much text anyway, I decided to only use GatherTextFromSource and force our texts to be localized in source using the NSLOCTEXT macro. It simplified the process for us, but it may not be what you need for your game.

The other steps (GenerateGatherManifest, GenerateGatherArchive, etc.) I did not change, but they are required to actually generate the files that will end up in your localization content folder (Content/Localization/Game).

Generated files: .archive and .manifest files
The main generated file that you will want to modify afterward is the .archive file. This is where your actual localization text is stored. For each language generated, a folder representing it is created in Content/Localization/Game; in our case, an “en” and a “fr” folder. You can open the .archive file using any text editor. For example:

"Source":
{
        "Text": "2.1 - A New Beginning"
},
"Translation":
{
        "Text": "2.1 - Un nouveau départ"
}

The .manifest file is not meant to be modified, but you can get information about the gathering process from it. It can be useful to track down where a text was gathered and the actual key used. Example:

"Source":
{
        "Text": "2.1 - A New Beginning"
},
"Keys": [
{
                "Key": "Chapter4Name",
                "Path": "Source/Jotunn/Private/UI/UIChapterSelect.cpp - line 196"
        }

]

That’s it for text localization. I’ll go over voice-over localization using Wwise, as well as other things to consider when localizing, in my next blog entry. I hope this guide has been helpful to you folks out there. See you next week!

Mick / Lead Programmer / @Mickd777
www.fatedgame.com

Meet your daughter, Lif

@Vincent_Martel — November 11, 2015

Hey,

A few weeks ago, I promised to show you the end result of our animation pipeline. It kind of slipped my mind (sorry), but here it is! The short video includes the use of motion capture, “traditional” animation, facial tracking and a “look-at” system.

We are very happy with the result; it looks great in VR, and the emotions really come through. The little girl featured in the clip is Lif, the daughter of the character you portray in FATED. She has the feeling that you’re mad at her for something, but she doesn’t know what.

What do you think?

Vincent Martel / Producer /  @Vincent_Martel
www.fatedgame.com

A couple of weeks ago, our producer Vincent gave a talk at Casual Connect entitled How to Make a VR Game, Not a Game Using VR.

If you’re curious about this new medium, you should take a moment to listen to it.

Vincent
Hello everyone!
Last week, I was at Blindlight Studios in Los Angeles with our Audio Director, Patrick Balthrop, to record the voice-overs for FATED. We were amazed by the quality of both the studio staff and the cast of actors. It was honestly one of the best professional experiences I’ve had.


Blindlight works on many of the big AAA projects, from Activision Blizzard to Naughty Dog all the way to Disney Interactive Studios. It’s our first time working on such a voice-acting-heavy title, and the recording experience was almost unreal. The actors were top-notch, not only in their voice acting, but also in the emotions they were able to inject into their lines. By the end of a particularly gripping scene, one of the actresses was literally in tears…

So I would like to take this opportunity to thank everyone who was there for their amazing work. The game will be a hundred times better because of you!

Also, for anyone looking for great voice actors, here’s a full listing of our cast, each with one of the amazing projects on which they’ve worked (because there’s not enough space to list them all):

• David Lodge [ Destiny ]
• Cherami Leigh [ Borderlands 2 ]
• Andy Pessoa [ King’s Quest ]
• Lex Lang [ Mass Effect 2-3-4 ]
• Matthew Waterson [ World of Warcraft ]
• Laura Post [ League of Legends ]
• Michelle Sparks [ Sunset Overdrive ]
• Reagan Rundus

Reagan Rundus, the little actress who portrays our character Lif, is only 7 years old.

Here’s a sample of what we recorded in our two-day sessions with Blindlight.

During the recording sessions, we also captured the actors’ faces using faceshift to track their facial expressions so we can use them in-game. We are hoping that this technique will give us richer facial animations, and that more emotion will come through.


The next step will be to merge everything together: the body animations we recorded during the motion capture sessions, the facial animations recorded with faceshift, and the voice-overs.

This step is both exciting and nerve-wracking. I’m really looking forward to seeing our characters finally come to life, and I hope they’ll look awesome! It’s our first time using this animation pipeline, and failing here would mean a lot more work for our animators. So, fingers crossed!

I’ll try and show you the final result, but until then if you have any questions, feel free to ask!

Have a nice day.

Vincent Martel / Executive Producer / @Vincent_Martel
www.fatedgame.com

I’ve given a lot of thought to how to write this post, as I’m going to touch on a very delicate subject, one on which my view (and that of the team) is not completely set yet. From my point of view as a developer, I’ve witnessed how the perception of virtual reality is evolving. Part of this is due to a better understanding of the strengths and weaknesses of this new medium, but part is due to the fear of VR, a fear mostly rooted in health concerns, with simulation sickness as its flag bearer. There’s a stigma around this topic, and I thought it was time to give it some well-deserved fresh perspective and maybe bring about a new point of view on the subject. Bear with me a little as I lay out my thoughts; I swear there is a point!

Evolution of VR Mentality

When we started tinkering with the DK1 back in early 2014, the VR scene was pretty much two things: first-person horror games and rollercoasters. A lot of people saw the future of VR entertainment as those kinds of experiences. It was about “thrills” and seeing the world from another person’s perspective, which was most easily associated with first-person games.

Motion sickness was a thing, but it was mostly seen as a technological limitation, something that would go away naturally as the tech evolved. It was not uncommon to see one person feel some simulation sickness, only for another to comment that “it’s going to be solved or get way better with the DK2”.

One of the first demos developed by Oculus to showcase the Rift was a first-person experience: the Tuscany Demo. This demo is what introduced a lot of developers I know to VR, and it contained the dreaded right-joystick yaw control (more on that later).

The concept of “VR legs”, the idea that you can acclimate to the discomfort caused by some VR experiences, had a pretty strong following. On the team, I remember we told ourselves that we would play the most hardcore experiences (like Cyber Space) in order to get better at VR. In some respects, this did work: we got better at handling some VR-related discomfort!

The reason I mention all this, and what primarily led me to write this piece, is that since then there has been such a paradigm shift in what a VR game should be that it made us reconsider many aspects of our game. Basically, as of now, creating a VR FPS is pretty much considered heresy. The demos that once were the greatest things in the world, the ones that made me a believer in virtual reality in the first place, are now scorned by most.

While first-person VR games are still coming, this evolution has led to the recent surge of third-person VR content. This can be seen in numerous articles around the web, but also in recent announcements at E3, like Insomniac’s “Edge of Nowhere”.

Don’t get me wrong, I’m not saying this is a bad thing! There are good reasons behind it: locomotion is way less intrusive in third person, and locomotion in VR is a problem.

Locomotion: The “Unsolvable” Problem of VR

While things like a low frame rate and high latency can create simulation sickness, the real issue lies in movement, and in how the brain reconciles what the eyes see with what the vestibular system senses. For some, moving in translation or rotation (or both) can get extremely uncomfortable, extremely quickly.

In parallel to the evolution of VR content creation, developers were hard at work finding solutions to this problem. As we entered pre-production on FATED, we were already aware of some of the issues. We knew about the major concern with standard FPS controls: player yaw rotation using the right stick. Some months after we started, John Carmack, one of the leading authorities in the domain, said this about it:

John Carmack Tweet

So, naturally, we wanted to figure it out: solve the issue and make our game the most comfortable VR experience out there. Thus began the hunt for the Great VR Holy Grail! *Cue the Indiana Jones theme song*

While we found some interesting (and some horrible) ways to control rotation, most of the popular ones were already out there.

Solution #1: The “Comfort Mode” Control

There is a pretty popular movement mechanic known as the “Comfort Mode” that strives to resolve this issue. You can see it explained here.

While this solved the problem at hand for some, it also created new ones. First, for a lot of people, it broke immersion: there is no way that “skipping” the actual rotation movement can pass as a “natural” way to move around. Second, disorientation: while using the feature sparingly could work, using it in a more action-based setting quickly made people unsure of their whereabouts in the game world. And then there is the matter of precision: if you get to a point in your game where you need to face a certain angle, this method is bound to fail.

While these are all new problems that may have solutions of their own, one single thing stays true: this does not work for everyone. In fact, some people reported feeling sicker using this new control mode.

Solution #2: Cockpits, Cockpits Everywhere!

Cockpit

The second solution came about quite naturally in various demos that actually needed it, like games where the player is sitting in a spaceship. I use the term “cockpit” here to designate any means of locomotion that has a reference frame the player can relate to. Racing games and mech-warrior-style games are two other examples.

Quite interestingly, this had a strong positive effect on the way players experienced simulation sickness, even going as far as completely removing the unpleasant feeling for some! Great, but once developers found out about this, the internet exploded with ways to add that frame of reference to everything. Here are some of my favorites, not always for good reasons.

Canvas Mode

There is a demo that is awesome for its novel ideas for handling movement in VR, simply called “Movement Experiments”. All its movement modes are interesting, but the one I want to point out is “Canvas Mode”. What this mode does is create a virtual “cockpit”, called a “canvas” in this case, whenever the player rotates. While the author says it works (and it probably does for some people), I found it quite intrusive and immersion-breaking. Still, the demo is worth a try, so check it out!

Floating HUD (Helmet Mode)

The floating HUD solution basically makes the UI the frame of reference for the player. That, coupled with something like the visor (or helmet) of the player character, can give the desired frame of reference. A good example of this can be seen in the upcoming game Adr1ft. Unfortunately, not every game setting permits this kind of implementation. A Viking with an astronaut helmet… maybe for FATED 2!

Virtual Nose

Nose

By far my favorite is the virtual nose, which is basically the natural frame of reference of humans. This research, which pretty much flooded the VR space when it was first released, is basically the extreme version of the “cockpit” concept. We tested the idea in FATED, and promptly removed it. For some reason (I’m blaming the low FOV), what the player ends up seeing is two partial noses, one on each side of the screen. It’s very distracting for the player and just feels out of place; I really don’t see this becoming the miracle pill that solves motion sickness.

Solution #3: Swivel Chair

The idea here is that since rotation in VR is such a problem, why not make players rotate on themselves in the real world? This does work, of course, but with the headset tethered by a metric ton of wires, it does not always end well. Some folks out there, like the people at Roto, made a chair especially for VR that aims to solve that. While the idea is sound, we believe that having to invest in yet another piece of equipment to enjoy VR is not the way to go.

All in all, these solutions can work, but there is no single “magic trick” to solving the locomotion issue for everyone. This single fact is fuel for VR skeptics.

The Fear of VR

The fear is very real

I’ve seen it numerous times: when someone mentions trying a VR demo, the automatic question is “but will I be sick?” It’s part fear of the unknown, part past experiences that were not exactly “perfect”. No one wants to say “maybe you’ll be sick testing my game”. It has become imperative, at all costs, to make sure every VR experience is free of simulation sickness.

However, fear of VR is not limited to consumers fearing the ill effects of the technology on their health; the industry giants bringing it to market also fear that bad content could kill its appeal. As VR development grew stronger over the last year, and with the consumer version finally nearing completion, it became abundantly clear that keeping motion sickness at bay was paramount. Palmer Luckey, Oculus’ founder, and John Carmack both stated it very eloquently:

“Virtual reality’s biggest enemy is bad virtual reality”.  –Palmer Luckey

“The fear is if a really bad V.R. product comes out, it could send the industry back to the ’90s.” –John Carmack

But this leaves us with a question that begs to be answered…

What Is BAD Virtual Reality?

All of this leaves us with a pretty bleak picture of what we can do in virtual reality, given the premise that a “good” VR product absolutely must cause zero simulation sickness, for EVERYONE. All of the above solutions can work for one person and not another. Or they can help, but not completely eliminate the effect.

Here is a small list of the things you will not be able to experience in VR if you are very, very affected by motion sickness in general:

  • All experiences with movement in them.

Yikes. No rollercoasters for you!

There is a great read by Lee Perry on Gamasutra that goes deep into the kind of thinking I believe VR content creators should adhere to; you should really check it out. If there is one thing I would like to emphasize from that article, it’s this: “People have extremely individualized issues with VR”.

So the definition of “bad” virtual reality is going to differ from person to person, as tolerance to certain movements varies. The point of this entire post is to make it clear that maybe some VR experiences will not be for everyone. While Oculus’ reluctance to push out demos that might alienate a segment of players is understandable, what would be worse is limiting the types of experiences people can enjoy. For example, while it’s true that yaw rotation makes a lot of people want to barf, for those it does not affect (or affects to a lesser degree), it’s a great and intuitive way to move around.

We don’t want to limit the kind of VR content we create to encompass every demographic; that is an impossible task that would have us throw out things we hold dear, experiences we really believe are powerful even if a smaller percentage of people can enjoy them comfortably. What we can do, however, is offer the maximum number of options and tweaks to let players mold the experience to their taste and sensitivity. We believe VR needs to come with its own rating system that has “comfort” as its unit of measure, the same way you know whether you can handle the rollercoasters at an amusement park or should rather stick with the big wheel. We just need to be honest with our players and present our experience as it is. Oculus already started doing this on Oculus Share with its comfort level scale, even if they seem to have shied away from the concept since then.

Well, I knew this post was going to be long, but I did not quite think it would get this long! While this part was mostly about exposing the problem, there is a second part coming that focuses much more on what we did, or plan to do, to resolve the simulation sickness problems in FATED. After all, we still plan to have this next to our game when it ships:

Comfort

In the meantime, if you have any questions or comment on the subject, please do share! I’ll be happy to read and discuss them with you!

Mick / Lead Programmer / @Mickd777

www.fatedgame.com

Hello there! It’s me again.

I’d like to share with you a few things about VR sickness that we discovered while experimenting with FATED. We already broached this topic in the technical posts, but today, instead of talking about technology again, we’re going to talk about how to avoid this unwanted feeling from a design point of view.

As we were looking for the cause of the sickness, we found that it is mainly caused by the inner ear and the eyes sending different messages to the brain at the same time. When the player moves in virtual reality, the eyes inform the brain of the movement, but the inner ear, an authority on the subject, says otherwise. This is where the dizziness starts. Some actions have a greater effect than others, so by strategically choosing which actions to ask the player to perform, you can diminish the undesired effects. Some of these actions are easy to identify, including height variations, rapid movements, acceleration and deceleration, rotation, and unprovoked movements.
By following a few basic rules, your level design can make a big difference in your game’s level of comfort.

Avoid making height variations in the environment.

Making the environment flatter keeps the player’s eyes at the same height, which also keeps his inner ear stabilized, and thus avoids the weird feeling. This doesn’t mean that your environment should be totally flat, only that the parts where the player is allowed to go should be “flatter”. After some testing, we also found that a floor angle of about 10 degrees is still comfortable for the player. If you have a slope with a greater incline, a feeling of vertigo might start to set in, though this can differ from one person to another. So try to avoid making a VR game about running up and down flights of stairs, and if you can choose between stairs and an elevator, go with the elevator. Also, downward vertical movements seem to be harder for the brain to handle; this is why roller coasters are so effective at creating a powerful feeling of vertigo and nausea.


Avoid making rapid movements.

When tweaking your character’s speed metrics, consider a real-life-scale speed instead of a fictitious one, to help the brain adapt more easily. Also, if you have elements like moving platforms or an open elevator, keep their movement slow.

Avoid acceleration and deceleration.

It is better to set a constant speed that is reached in one frame instead of accelerating over time. Stopping should also happen instantly. This works well with speeds that are not too high; I can’t speak for very high speeds, as I haven’t experimented with those kinds of parameters.
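
In Unreal terms, a minimal sketch of this rule could look like the following, assuming a standard ACharacter with a UCharacterMovementComponent (the values are illustrative, not our shipping metrics):

// Inside your ACharacter subclass: near-instant acceleration and braking for VR comfort.
UCharacterMovementComponent* Move = GetCharacterMovement();
Move->MaxWalkSpeed = 150.f;                  // real-life walking speed, in cm/s
Move->MaxAcceleration = 100000.f;            // reach full speed in roughly one frame
Move->BrakingDecelerationWalking = 100000.f; // stop almost instantly
Move->GroundFriction = 100.f;                // prevent sliding once input stops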

Don’t ask your player to rotate too much.

Try to avoid having the player make repetitive rotations, and completely avoid 180-degree rotations in your walkthrough if possible. Try to present the player’s options and interactions in front of him, within a narrow angle of perception. I don’t mean that the whole game should be played in a corridor; that would be bad and boring, and we didn’t make FATED like this. It is more about keeping repetitive rotations in mind and avoiding them where you can than about not allowing any; a quick way to check the “narrow angle” part is sketched below.
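
This is a hypothetical helper, not code from FATED, and the 60-degree threshold is an arbitrary example:

// Sketch: is a point of interest within a comfortable cone in front of the player?
static bool IsInComfortCone(const FVector& PlayerLocation, const FVector& PlayerForward,
                            const FVector& TargetLocation, float MaxAngleDegrees = 60.f)
{
    const FVector ToTarget = (TargetLocation - PlayerLocation).GetSafeNormal();
    const float CosAngle = FVector::DotProduct(PlayerForward.GetSafeNormal(), ToTarget);
    return CosAngle >= FMath::Cos(FMath::DegreesToRadians(MaxAngleDegrees));
}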


Completely avoid unprovoked movements.

By “unprovoked movement”, I mean any movement that is not triggered by the player’s input. So, no surprise elevators, no sudden floor rotations, and, above all, do not take control of the player’s point of view like we see in so many first-person and third-person cutscenes.

These few simple rules can help you avoid VR sickness in your game; it is only a matter of approaching level design with them in mind. I have seen some games’ designs change radically, from a first-person adventure to a fixed standing experience or a third-person point of view. That says a lot. VR sickness exists, and we can’t ignore it. We have to find a way to create our games so that this undesired feeling is as absent as possible. I may be wrong, but I don’t think technology alone will be able to counter this effect, as it is triggered by our own perception system. Trying different conceptual approaches may be our best hope of finding a solution to the issue, and if we want virtual reality to rise as a new medium, we have to create “sickless” experiences for the masses to enjoy.

Thanks for reading! If you have comments on what you’ve read here, or if you have tips and tricks that can help with VR sickness, please share them with us!

Philippe Dionne / Game Designer
www.fatedgame.com


Hello VR enthusiasts, my name is Philippe. I’m the game/level designer working on FATED.

My role on the project is wide and varied, as we are, like you, discovering what this new technology is all about and how to “tame” it into a new game medium. I’m basically working on translating the project’s vision into an interesting gameplay experience by strategically using space and game mechanics, while drawing on VR’s strengths to add new aspects to the game. I’m trying to find out how to make every good idea work with the constraints and opportunities that this awesome technology offers.


I feel that VR is “the” new medium. There is no comparison between this and the other mediums I’ve tried. 3D television, for example, was presented as the next big thing, but in fact it only allowed us to feel the depth of a picture. Virtual reality is the only medium that creates a sense of presence: the power to make us believe that we are in a precise place in another world. I think this is where this technology caught me. I have always been a big fan of immersion, but this technology brings a whole new level to it. It makes me forget where I am and believe that I’m part of a different reality. I cannot even stop my brain from making me physically react to situations. This medium is just incredible, and not only for gaming, but for a ton of other applications as well.


I would like to use this blog to present both conceptual and concrete aspects of designing a game for VR. I know I won’t sound as erudite as our lead programmer or be able to show you beautiful things like our art director does, but I will share what it’s like to design for virtual reality. I’ll share with you the VR-specific issues we encounter while making everyday design decisions for FATED. How should we approach level design for a fully immersed player? How can a simple gameplay element become tricky to design in first-person VR?

I hope to make it interesting enough to invite people to bring their own ideas and help push this new medium further. I’m also very open to taking part in discussions, so if you have questions or opinions about designing for VR that you would like to bounce around, please feel free to comment.

Philippe Dionne / Game Designer
www.fatedgame.com


Hello everyone,

I thought I’d give you an update on what the art team of FATED is working on at the moment.

On my end, I’m looking into the visuals we are missing to convey the storyline clearly in the intro scenes. The story delves into abstract concepts like death and the area between worlds. To get a better sense of timing and emotional involvement, we need to hook up a lot of things that aren’t final, and for everything that’s missing, we are using placeholders.

At this point, all research is based on what is missing or could be better in-game: a lot of the groundwork is covered, and we can focus on fixing weaknesses. One good example is a board I’m making that lists a number of ways to turn destroyed areas into more convincing war-stricken zones: things like burned soil patches, dead trees, charred buildings and heaps of debris.

Most of these assets are derived from things we already have – we are trying to make the most of our limited resources. They will decorate the world and emphasize drama in some parts of the story. The characters of FATED are facing hardship and destruction, so the game world needs to reflect that. Spoiler alert: you might see some dead people.

I’m also working on some textures for new asset kits that will help us dress up the environment when we get to that point. This can only happen once the level design is final, so in the meantime I find myself doing many different things, patching up holes here and there so that we have a more complete sample of the experience.


As far as the rest of the team is concerned, there is a lot of animation work going on right now. Our animators Yanick Bélanger and Isabelle Fortin are modifying motion capture files and testing out facial expressions for one of the main characters. Our 3D artist Ève Mainguy is finishing up many props and producing many of the items I’m adding when reviewing the scenes. Right now, the work is planned but more flexible than usual, as we realized we needed to re-assess progress almost daily.


I also wanted to share this piece with you. We don’t know what we’ll do with it yet, but it will probably serve as a design for a T-shirt or other marketing material.

WP_Tee_Blue1920x1080

I was hoping to illustrate, in a whimsical way, the wonder one might feel when faced with the full-blown power of Ragnarök: world-changing events over which the characters have little or no control. I felt that the strong little girl in the image was the perfect character to convey this awe, courage and vulnerability. The ornamentation around the piece is meant to bring in Viking decorative designs, and the runes are derived from actual runic texts. The image inserted in the post is actually the right size if you want to use it as a desktop wallpaper.

That’s it for this week, I hope you enjoyed!

Marianne Martin / Art Director
www.fatedgame.com