
Hello again folks!

This week, I pick up where I left off last week with Localizing FATED in Unreal Engine 4 – Part 1. Having already covered the basics of text localization, I'll follow up with voice-over localization using Wwise, as well as other things to consider when localizing in Unreal Engine 4.

Localizing Voice-Over Using Wwise

Wwise is a really great tool, and I can’t imagine doing audio work without it ever again. The Unreal Engine integration is great, even if some stuff is missing, namely localization support. Fortunately, all the ground work is pretty much done, meaning there are only a couple of things you need to modify to make it all work.

Adding localized audio files in your Wwise project

The first thing you want to do is actually set up your localized audio files in your Wwise project using the Wwise authoring tool. This is very easy to do and requires no additional step in the UE4 editor, since what you import in Unreal (Events and Banks) is language-agnostic. Basically, the event stays the same, but the audio files played will differ. The only thing you have to do is import the new localized audio files over the already existing audio files in Wwise. Right-click on an audio file (or folder) and select “Import Audio Files…”.
This screen will show up:

Audio File Importer
Be sure to select “Localize languages” in Import Mode, and the desired language in “Destination language”.

Suite

Select the preview language in the Wwise authoring tool

 

Now, you do this for every VO file in your game. It is important that every localized file name matches the original VO file name. You can import by folder to go faster. You can also set your project to the desired language; that way, Wwise recognizes which audio files have a counterpart for the selected language, making it easy to spot which audio does not have a valid localization yet. With that, everything on the Wwise side is done. Let's return to Unreal!

Generating banks for all supported languages

In AkAudioBankGenerationHelpers.cpp you will find the GenerateSoundBanks() function. In there, the following line specifies which bank language will be generated:

CommandLineParams += TEXT(" -Language English(US)");

For some reason, by default it only generates English banks; you would have to add each language you want to generate. Even simpler, just comment out the line. This makes it generate all the languages specified in your Wwise project.
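For clarity, here is the change as a sketch of the modified GenerateSoundBanks():

// Commented out: the bank generation now produces banks for every language
// defined in the Wwise project instead of English only.
//CommandLineParams += TEXT(" -Language English(US)");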

Specifying Culture in the Game

Now that the text and voice-overs have been generated in every supported language, you actually need to specify which language you want the game to show. In general, you'll want to do this operation at the very start of the game. Unreal offers a simple way to launch the game in the desired culture using a launch argument: -culture="fr".

This will effectively trigger the change of culture on the text side; unfortunately, it does not do anything regarding the Wwise banks. Instead of using the -culture argument, let's dive in and see the actual calls needed to make the culture switch happen.

For the text, the FInternationalization class is used to set different cultures. Here is an example of how to set the game in French:

FInternationalization::Get().SetCurrentCulture(TEXT("fr"));

This will only change the text, however. For the Wwise content, you need to do yet another small change. For that purpose, I added a function to the UAkGameplayStatics class that I called void SetLanguageToCurrentCulture().

In this function, the first step is to get the current culture:

FString currentCulture = FInternationalization::Get().GetCurrentCulture()->GetName();

After that, you can assign the Wwise current language this way:

AK::StreamMgr::SetCurrentLanguage(AKTEXT("French(Canada)"));

The name of the language is a string equivalent of how the language is called inside the Wwise authoring tool.
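Putting it together, here is a minimal sketch of what SetLanguageToCurrentCulture() can look like; the mapping from Unreal culture codes to Wwise language names is an assumption you will want to adapt to your own project:

void UAkGameplayStatics::SetLanguageToCurrentCulture()
{
	// Get the current Unreal culture (e.g. "fr").
	FString currentCulture = FInternationalization::Get().GetCurrentCulture()->GetName();

	// Map it to the language name used in the Wwise authoring tool (assumed mapping).
	if (currentCulture.StartsWith(TEXT("fr")))
	{
		AK::StreamMgr::SetCurrentLanguage(AKTEXT("French(Canada)"));
	}
	else
	{
		AK::StreamMgr::SetCurrentLanguage(AKTEXT("English(US)"));
	}
}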

Optionally, you can also get the current Wwise language and unload the VO bank if the new language is not the same as what was loaded.

AK::StreamMgr::GetCurrentLanguage();
UnloadBankByName(TEXT("Act01_VO"));

Now you should have everything you need to have your game’s text and voice-overs localized.

IMPORTANT NOTE: For some reason, Unreal does not allow the culture to be changed directly in the editor. This can be frustrating if you want to see the content in another culture directly in the editor. Fortunately, there is a way to remedy the situation quite easily.

In TextLocalizationManager.cpp, you can find the OnCultureChanged() function. What you want to do is remove the FApp::IsGame() from the ShouldLoadGame initialization.

//const bool ShouldLoadGame = bIsInitialized && FApp::IsGame(); //Before
const bool ShouldLoadGame = bIsInitialized; //After

IMPORTANT NOTE 2: Yet another “annoying” default behavior is that a packaged game will not fall back to the “root” culture when the specified culture is more specific. For example, if your game is set to “fr-CA”, it will not fall back to “fr” but rather to the default culture, which is “en”. Yet again, a small change will fix that.

In ICUInternationalization.cpp, you can find the SetCurrentCulture(const FString&) function. You simply want to allow fallback, which is something Unreal already supports, just not by default.

// Allow fallback; this makes it so that we can use "root" languages like "en" and "fr" and not need to specify "en-US", "fr-CA", "fr-CD", "fr-CF"...

FCulturePtr NewCurrentCulture = FindOrMakeCulture(Name, true /*Allow Fallback*/);

Getting the Steam App Language

I won't go over how to set up your game for Steam (there are already a few good tutorials out there for that), but I'll talk about the snippet of code needed to use the language currently selected in Steam.

First, I added FString GetSteamAppLanguage(); in OnlineSubsystemSteam. You can add it to your custom version of OnlineSubsystemSteam, but I felt it was enough of a “core” feature to be added directly to OnlineSubsystemSteam.
Here is the function in its entirety:

FString FOnlineSubsystemSteam::GetSteamAppLanguage()
{
	FString langFString = TEXT("english"); // Default
	ISteamApps* steamApps = SteamApps();
	if (steamApps)
	{
		langFString = FString(ANSI_TO_TCHAR(steamApps->GetCurrentGameLanguage()));
	}
	return langFString;
}

SteamApps() won't always be valid, depending on factors such as whether the game was launched from Steam or not, so be sure to have a default value to fall back on.
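To tie this into the culture switch described earlier, here is an illustrative bit of glue; the mapping is an assumption (Steam reports full lowercase language names such as “english” and “french”):

// Map the Steam language name to an Unreal culture code (assumed mapping).
FString steamLanguage = GetSteamAppLanguage();
FString culture = TEXT("en"); // Default
if (steamLanguage == TEXT("french"))
{
	culture = TEXT("fr");
}
FInternationalization::Get().SetCurrentCulture(culture);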

Unreal is certainly going to add a lot to its Localization toolset; in fact, it’s already happening in 4.10.1 with the OneSky localization service plugin. This will certainly come in handy if you’re thinking about using that service for your game. There is also the experimental “Localization Dashboard” that can be activated in the “Experimental” section of Editor Preferences.

U4

I did not use the Dashboard extensively, but it promises to one day remove all that “.ini” manipulation (which is not that user-friendly…) and make it all transparent through a nice visual tool. It seems to work well enough, but all it does is manipulate the files we talked about earlier, so it is still relevant to know how they work.

This is it for localization in UE4. Obviously, there is more to it than what I addressed in this post, but you should be well on your way to making it work in your own game. Hope this was helpful, and don’t hesitate to ask questions in the comments section. Stay tuned for more on FATED, as we quickly approach the release date!

Mick / Lead Programmer / @Mickd777
www.fatedgame.com

Hello again!

Last month I had the mandate to localize FATED in French. As some of you may be aware, Frima Studio is a Québec City-based studio that predominantly speaks French, so it was fun to finally have the script integrated in our native language. One of our characters is even voiced by Nicolas Cage’s official French voice actor. How cool is that?!

Unreal is equipped with a lot of game localization tools, even if they're not quite as polished as the rest of the engine. In this post, I'll explain how localization works, from text to voice-over. I'll also give a few tips on how to modify the engine to allow changing cultures directly in the editor, which changes are needed to support localization for Steam builds, and, finally, which modifications are required to make the Wwise UE4 integration fully localization-ready. In short, how we managed to have a fully working localized build for FATED!

Before reading on, take note that I worked on the localization using Unreal 4.8.2. We recently upgraded to 4.10.1, so while I can confirm that this is all still valid, some new features may have been added that I’m not aware of.

Localizing Text

If you're familiar with UE3, you'll notice how different the localization system is now; Epic pretty much ditched their old system. While UE3's localization system was lacking in some ways, I personally find that UE4's sometimes trades simplicity for a more robust and versatile system that is unfortunately not quite ready yet, which can lead to confusion. In UE3, you just had to use a LOCALIZE macro with some parameters that would point to the correct “.ini” file containing the localized text. In UE4, the process is a bit more convoluted, but once you've familiarized yourself with its intricacies, it's quite easy to use. Now let's dive in!

FString and FText
If you're familiar with Unreal development, you already know about FString, the custom string implementation used throughout the engine. For localization, Epic introduced a new class, FText, which needs to be used whenever we wish to localize text content. The usage of FText marks said text to be gathered in a later phase of the localization process: the “gathering” phase, which uses a specific commandlet (namely the GatherText commandlet).

NSLOCTEXT
When changing text directly in the code, you need to use the NSLOCTEXT macro. This macro uses three parameters: the namespace of the text, a key to represent this text, and the string literal in the default language, which in our case is English. It looks something like this:

FText chapterSelectTxt = NSLOCTEXT("MainMenuUI", "ChapterSelect", "Chapter Selection");

This will later determine how your language archive is generated. We will look at the .archive file generated in a moment.

GatherText Commandlet
The next step is to actually gather all the FText from your project. This also means that we will be able to get text from every blueprint instance for every map of your game, for example. For this to work, you need to start the UE4 editor with a specific command line. I find that the best way to do this is to create a batch file. So I created the LocalizeJotunn.bat file (Jotunn is the internal codename for FATED), which is located in the same folder as the .uproject file.

set PROJECT_PATH=%CD%
..\..\Engine_4.10.1\Engine\Binaries\Win64\UE4Editor.exe %PROJECT_PATH%/Jotunn.uproject ^
  -Run=GatherText -config="%PROJECT_PATH%/Config/Localization/Game.ini" -log > localization.log

From that file, you can notice a reference to a file named Game.ini in Config/Localization/. You need to create that file: this is where the entire configuration for the GatherText commandlet is going to reside. You can find our config file here: configFile
I strongly recommend you start from that file and adjust for your needs. This file has different sections; let’s take a look at them.

[CommonSettings]
This is where you set where the localization files will reside. It is important that you put the same path that is set under the [Internationalization] section of BaseGame.ini (or your custom %YourGameName%Game.ini). By default, this is the path you should see:

[Internationalization]
+LocalizationPaths=%GAMEDIR%Content/Localization/Game

CommonSettings is also where you get to set the native culture of your game (default language).  After that, using the CulturesToGenerate property, you can list all the languages for which you need to create localization files.
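For reference, here is a rough sketch of that section; the keys are patterned on the engine's own localization configs of that era, and the paths and cultures are illustrative:

[CommonSettings]
SourcePath=./Content/Localization/Game
DestinationPath=./Content/Localization/Game
ManifestName=Game.manifest
ArchiveName=Game.archive
ResourceName=Game.locres
SourceCulture=en
CulturesToGenerate=en
CulturesToGenerate=fr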

[GatherTextStep*]
There will be a number of GatherTextStep sections, each with its own Commandlet class. The two most important ones you will want to check are GatherTextFromSource and GatherTextFromAssets. GatherTextFromSource will scan source code for the NSLOCTEXT macros I mentioned earlier, while GatherTextFromAssets will scan for the FText in your .uasset and .umap files.

The documentation on this on the Web is not up to date (at least it wasn't when I worked on the localization), so follow our file for that. You will mainly want to verify the paths for the objects to gather text from. BEWARE! Some of the required parameters are paths (SearchDirectoryPaths) and some are filters (IncludePathFilters). Both look almost the same, but for filters you don't want to miss out on the required asterisk (*)! A sketch of such a step follows.
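As an illustration only (the step index, paths, and filters below are made up; use our config file linked above as your real starting point), a source-gathering step can look like this. Note the asterisk on the filter line:

[GatherTextStep0]
CommandletClass=GatherTextFromSource
SearchDirectoryPaths=Source/Jotunn
IncludePathFilters=Source/Jotunn/*
FileNameFilters=*.h
FileNameFilters=*.cpp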

I personally found that GatherTextFromAssets was picking up too many FText entries I did not want to localize in the first place. We use a lot of TextRenderComponents that exist for editor use only, and they were polluting the localization archive. Since FATED doesn't have that much text anyway, I decided to only use GatherTextFromSource and force our text to be localized in source using the NSLOCTEXT macro. It simplified the process for us, but it may not be what you need for your game.

The other steps (GenerateGatherManifest, GenerateGatherArchive, etc.) I did not change, but they are required to actually generate the files that will end up in your localization content folder (Content/Localization/Game).

Generated files: .archive and .manifest files
The main generated file that you will want to modify afterward is the .archive file. This is where your actual localization text is stored. For each generated language, a folder representing it will be created in Content/Localization/Game; in our case, an “en” and a “fr” folder. You can open the .archive file using any text editor. For example:

"Source":
{
        "Text": "2.1 - A New Beginning"
},
"Translation":
{
        "Text": "2.1 - Un nouveau départ"
}

The .manifest file is not meant to be modified, but you can get information on the gathering process there. It could be useful to track down where the text was gathered and the actual key used. Example:

"Source":
{
        "Text": "2.1 - A New Beginning"
},
"Keys": [
{
                "Key": "Chapter4Name",
                "Path": "Source/Jotunn/Private/UI/UIChapterSelect.cpp - line 196"
        }

]

That's it for text localization. I'll go over voice-over localization using Wwise, as well as other things to consider, in my next blog entry. I hope this guide has been helpful to you folks out there. See you next week!

 

Mick / Lead Programmer / @Mickd777
www.fatedgame.com

Memory Management in UE4

Michaël Dubé — October 22, 2015

Hey there!

Well, it's been a while since I've written something for the blog; we've been busy at Frima building a cool update to our demo for Oculus Connect 2, which Vincent and fellow programmer Marc-André attended. We now have facial animations and kick-ass voice-overs in the game, which bring it to a whole new level. There's also an amazing article about FATED published on Polygon, so things are getting exciting! Click here to read the article.

Since we are only two programmers on the project, and since we are needed for pretty much every technical aspect of the game, sometimes stuff falls through the cracks. That's pretty much what happened with memory management for FATED: since we were both new to Unreal 4, we made some rather critical mistakes, on top of not spending a minute investigating what was loaded, and when.

When the rush to get our OC2 demo out the door was over, it was taking a whole 75 seconds (!) to load our demo scene. That's pretty insane for any kind of game. It was time to dig a bit deeper into what was actually taking so long to load, and maybe add a bit of asynchronous loading in there. So while this post may be somewhat of a given for Unreal veterans, I think it could be helpful for developers with a bit less exposure to the tech.

 

WHAT’S UNDER THE HOOD?

First things first: identify the culprits. Unreal is pretty dope when it comes to commands you can use to investigate every aspect of the game, and memory statistics are no exception. One of the most useful commands out there is “memreport -full”. The “full” parameter will dump more information that could prove useful but is not required per se. Running this command will create a .memreport file in your project's “Saved\Profiling\MemReports” folder. It's a custom file extension, but it is actually a text file representing a snapshot of your game's current memory state.

You'll probably want to look first at the object list, which is the complete list of UObjects you have loaded, along with information on the amount of memory they take up. You will have a line looking like this for every object type:

AnimSequence    36     16628K    16634K    7578K      7578K
Texture2D       150    212K      219K      260996K    96215K

AnimSequence (or Texture2D) is the name of the class of the object, while the number next to it (36 or 150) is the number of instances of that class. After that, you mostly want to look at the first and last of the four other numbers. The first is the actual size of the objects, while the last is the size of the assets these objects exclusively reference. For example, the Texture2D objects take little memory themselves, holding only the texture metadata (212K), as opposed to the actual texture data (pixels) they exclusively reference (meaning no other object references that data), which amounts to 96215K.

The two lines above were extracted from a memreport for this scene:

MemoryTestsMap

That's right, there is absolutely nothing in there except the floor, a player start, and the skybox. So where the heck are those 36 animations and 150 textures coming from?

 

FINDING ASSET REFERENCES

Now that I knew there was something clearly wrong there, I had to find out which assets were loaded. Sure, there are 36 animation sequences in memory, but which ones and why? Again, UE4 can show that to you quite easily.

The memreport command is in fact a list of other commands executed one after the other to form a “complete” report of your game's memory. One of the commands it runs is “obj list”. This is what outputs the list of objects and the number of instances, but this command can also take parameters, one of which is the class of the objects.

 

obj list Class=AnimSequence

This will output all of the animation sequences in memory at that moment, ordered by what uses the most memory (the first number). Like this:

AnimSequence /Game/Character/Horse/Animation/AS_HorseCart_idle.AS_HorseCart_idle
491K       491K       210K       210K

 

Now, if you’re like me, you might want to make this look a bit clearer, and fortunately it’s quite easy to write your own Exec function that can output exactly what you want. It can, for instance, sort by exclusive resource size instead of object size. Here is a sample of code to get you started in that direction:

for (TObjectIterator<UAnimSequence> Itr; Itr; ++Itr)
{
	// Name and exclusive resource size (memory referenced only by this object).
	FString animName = Itr->GetName();
	int64 exclusiveResSize = Itr->GetResourceSize(EResourceSizeMode::Exclusive);
	UE_LOG(LogTemp, Log, TEXT("%s : %lld bytes"), *animName, exclusiveResSize);
}

 

Reference viewer

The Visual Reference Viewer

Now that I could pinpoint exactly which objects were loaded, I was able to investigate more easily what was referencing them in that seemingly empty scene. There are two ways to view object references in Unreal. First, there is the visual Reference Viewer (which you can see in the picture above) in the editor; this shows all potential references, not necessarily what is referencing the actually loaded asset in memory right now. Of course, there is also an easy way to figure out the current shortest reference chain with yet another console command.

 

obj refs name=AS_HorseCart_idle

 

This will output a chain of references for what is loaded in memory (it can take some time), and following this chain will usually lead to the culprit. In our case, we were making one mistake that was responsible for a lot of those loose asset references: direct asset references in some of our native class constructors. Like this one, for example:

 

static ConstructorHelpers::FObjectFinder<UMaterial>
TextMaterial(TEXT("Material'/Game/UI/Materials/UITextMat.UITextMat'"));

 

In the above example, there is a direct reference to a material: this material will always be present in memory. This is quite logical when you think about it, since the constructor is not only called when an instance of said class is created, but also at startup, when it runs to set default properties as the class default object is created. So avoid those if it's not something you need in memory at all times! Instead, opt for a UPROPERTY that can be assigned in the blueprint version of that class, even if it's always the same asset. That way, if you don't have an instance of that object loaded in your scene, the asset won't be in memory.
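A minimal sketch of that alternative (property name and category are illustrative):

// Assigned in the Blueprint subclass instead of hard-referenced in the
// constructor; the asset is only loaded along with instances that use it.
UPROPERTY(EditDefaultsOnly, Category = "UI")
UMaterial* TextMaterial;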

 

ASYNCHRONOUS LOADING USING THE STREAMABLE MANAGER

The wrongly referenced assets were not the only reason it was taking so long to load our demo, so there was still more work to do. We had some leftover temporary assets and some pretty huge textures (I'll talk about that in more detail below) but, more importantly, we still had a lot to load and we were trying to load everything at once. Let's see how to asynchronously load assets using what Unreal calls the Streamable Manager.

If you have an actor that references multiple Animation blueprints, for example, they will all be loaded when you instantiate it, even if you only use one at a time.

 

TSubclassOf<UAnimInstance> mAnimBPClass01;
TSubclassOf<UAnimInstance> mAnimBPClass02;
TSubclassOf<UAnimInstance> mAnimBPClass03;

The above lines should become this:

TAssetSubclassOf<UAnimInstance> mAnimBPClass01;
TAssetSubclassOf<UAnimInstance> mAnimBPClass02;
TAssetSubclassOf<UAnimInstance> mAnimBPClass03;

After this modification, the animation blueprint that mAnimBPClass01 references will only be loaded when you specifically ask for it. Doing so is quite simple: you need to use the FStreamableManager object. Just make sure to declare it somewhere that will always be in memory, on an object that won't ever be deleted (like the GameInstance of your game, for example). In my case, I dedicated a special “manager” object to it, which is created at game start and never deleted. It handles everything that is dynamically loaded in our game.

 

UPROPERTY()
FStreamableManager mStreamableManager;

 

There is more than one way to load an asset asynchronously, but here is one example: mArrayOfAssetToLoad being a TArray of FStringAssetReference.

 

mStreamableManager.RequestAsyncLoad(mArrayOfAssetToLoad,
	FStreamableDelegate::CreateUObject(this, &UStreamingManager::LoadAssetDone));

 

FStringAssetReference is the string representation of the full path of the asset in Unreal's internal file system. From a TAssetSubclassOf<> pointer, you can get it by calling ToStringReference().
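For example, a minimal sketch of filling that array from the soft references declared above:

// Build the list of assets to stream in from the TAssetSubclassOf members.
TArray<FStringAssetReference> mArrayOfAssetToLoad;
mArrayOfAssetToLoad.Add(mAnimBPClass01.ToStringReference());
mArrayOfAssetToLoad.Add(mAnimBPClass02.ToStringReference());
mArrayOfAssetToLoad.Add(mAnimBPClass03.ToStringReference());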

Furthermore, if you are using Wwise for your audio management, audio banks can become quite huge and slow to load. Fortunately, a wise (get it?) bank subdivision and using LoadAsync() on the UAkAudioBank instead of Load() will fix that for you. Be sure to uncheck AutoLoad on the banks in the editor before you do! Also, for some reason the LoadAsync() call is not exposed to Blueprint, so you need to do that in native code or expose it yourself.

 

LEVEL COMPOSITION

Loading assets asynchronously is one thing, but you probably also want to split your levels into chunks that load separately. Unreal allows that using “Level Composition”. Level loading used to be done on the main game thread of Unreal, but it is now possible to move the loading to a different thread.

 

In the DefaultEngine.ini file of your project, add this:

[Core.System]
AsyncLoadingThreadEnabled=True

 

We are still on Unreal 4.8.2, but from what I found on the subject, this may already be enabled by default in 4.9. Anyway, this should help make the asynchronous loading of levels smoother. If you are enabling that feature, however, you need to be careful about what you do in your class constructors, as some operations are not thread-safe, and using them could result in a lock or even crash the game.

Image3

Level streaming can be done “automatically” using volumes or a simple distance factor, but we decided to do it manually in our case. In the picture above, unchecking Streaming Distance will allow every sub-level linked to that layer to be loaded manually. This can be done in blueprint using Load Stream Level or in C++ using UGameplayStatics::LoadStreamLevel().
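For instance, a minimal C++ sketch (the level name is illustrative):

// Stream in a sub-level by name and make it visible once loaded.
FLatentActionInfo LatentInfo;
UGameplayStatics::LoadStreamLevel(this, TEXT("Act01_Forest"),
	true /*bMakeVisibleAfterLoad*/, false /*bShouldBlockOnLoad*/, LatentInfo);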

Image4

 

Texture Mip Bias

In pretty much every game, textures are what end up using most of your memory. In pretty much every game, there is also a bunch of oversized textures that are far from optimized pixel-density-wise. Fortunately, UE4 offers a really simple way to trim the fat without needing to reimport every such texture: LOD Bias.

Image6

Image5

You can see that the difference in resource size is considerable when we drop one mip, especially when the texture is 4096x4096! Of course, we couldn't do that for everything, but there was a lot of stuff in 4096 that really did not need to be.

Of course there is a lot more to memory optimization and management, but this is pretty much what I have done to get our demo from taking 75+ seconds to load down to somewhere around 10 seconds. Unreal is a great tool, and I keep learning and getting better at it. I hope this will help some of you out there in creating your own awesome content. If you have any questions or comments, I'll be happy to answer them! In the meantime, I'll go back to working on FATED. Stay tuned for more info!

 

Mick / Lead Programmer / @Mickd777
www.fatedgame.com


I've given a lot of thought to how to write this post, as I'm going to touch on a very delicate subject, one on which my view (and that of the team) is not completely set yet. From my point of view as a developer, I've witnessed how the perception of virtual reality is evolving. Part of this is due to a better understanding of the strengths and weaknesses of this new medium, but part of it is also due to the fear of VR, a fear mostly attributable to health concerns, with simulation sickness acting as the flag bearer. There's a stigma around this topic, and I thought it was time to get some well-deserved new perspective on what it entails and maybe bring about a new point of view on the subject. Bear with me a little as I lay out my thoughts on this; I swear there is a point!

Evolution of VR Mentality

When we started tinkering with the DK1 back at the beginning of 2014, the VR scene was pretty much two things: first-person view horror games and rollercoasters. A lot of people saw the future of VR entertainment as that kind of experience. It was about “thrills” and seeing the world from another person's perspective, which was most easily associated with first-person view games.

Motion sickness was a thing, but it was mostly seen as a technological limitation, something that would go away naturally as the tech evolved. It was not uncommon to see one person feel some simulation sickness only for another to comment that “It’s going to be solved or get way better with the DK2”.

One of the first demos developed by Oculus to showcase the Rift was a first-person experience: the Tuscany Demo. This demo is what introduced a lot of developers I know to VR, and it contained the dreaded right-joystick yaw control (more on that later).

The concept of “VR Legs”, the idea that you can acclimate to the discomfort caused by some VR experiences, had a pretty strong following. On the team, I remember telling ourselves that we would do the most hardcore experiences (like Cyber Space) in order to get better in VR. In some respects, this did work: we got better at handling some VR-related discomfort!

The reason why I mention all this, and what primarily led me to write this piece, is that since then, there has been such a paradigm shift in what a VR game should be that it made us reconsider many aspects of our game. Basically, as of now, creating a VR FPS is pretty much considered heresy. The demos that were once the greatest things in the world, that made me a believer in virtual reality in the first place, are now scorned by most.

While first-person VR games are still coming, this evolution led to the recent surge of third-person perspective VR content. This can be seen in numerous articles around the web, but also in recent announcements at E3, like Insomniac's “Edge of Nowhere”.

Don’t get me wrong, I’m not saying this is a bad thing! There are good reasons behind this; locomotion is way less intrusive in third person, and locomotion in VR is a problem.

Locomotion: The “Unsolvable” Problem of VR

While things like low frame rate and high latency can create simulation sickness, the real issue lies in movement: the mismatch between the motion the eyes see and what the vestibular system feels. For some, moving around in translation or rotation or both can get extremely uncomfortable, extremely quickly.

In parallel to the evolution of VR content creation, developers were hard at work finding solutions to this problem. As we entered pre-production on FATED, we were already aware of some of the issues. We knew about the major concern with standard FPS controls: player yaw rotation using the right stick. Some months after we started, John Carmack, one of the leading authorities in the domain, said this about it:

John Carmack Tweet

So, naturally, we wanted to figure it out: solve the issue and make our game the most comfortable VR experience out there. Thus came the time of the hunt for the Great VR Holy Grail! *Cue the Indiana Jones theme song*

While we found some interesting (and some horrible) ways to control rotation, most of the popular ones were already out there.

 

 

Solution #1: The “Comfort Mode” Control

There is a pretty popular movement mechanic known as the “Comfort Mode” that strives to resolve this issue. You can see it explained here.

While this solved the problem at hand for some, it also created new ones. First, for a lot of people this was breaking immersion: there is no way that “skipping” the actual rotation movement can pass as a “natural” way to move around. Second, disorientation: while using the feature sparingly could work, using it in a more action-based setting quickly made people unsure of their whereabouts in the game world. And then there is the matter of precision: if you get to a point in your game where you need to be at a certain angle, this method is bound to fail.

While these are all new problems that may also have their own solutions, there is one single thing that still stays true: this does not work for everyone. In fact, some people reported feeling sicker using this new control mode.

Solution #2: Cockpits, Cockpits Everywhere!

Cockpit

The second solution came about quite naturally in various demos that actually needed it, like games where the player sits in a spaceship. I use the term “cockpit” here to designate any locomotion method that gives a frame of reference the player can relate to. Racing games and mech-warrior-style games are two other examples.

Quite interestingly, this had a strong positive effect on the way players experienced simulation sickness, even going as far as completely removing the unpleasant feeling for some! Great, but once developers found out about this, the internet exploded with ways to add that frame of reference to everything. Here are some of my favorites, not always for good reasons.

Canvas Mode

There is a demo that is awesome for its novel ideas on handling movement in VR. It's simply called “Movement Experiments”. All of its movement modes are interesting, but the one I want to point to is the “Canvas Mode”. What this mode does is create a virtual “cockpit”, called a “canvas” in this case, whenever the player rotates. While the author says it works (and it probably does for some people), I found it quite intrusive and immersion-breaking. Still, the demo is worth a try, so check it out!

Floating HUD (Helmet Mode)

There's the floating HUD solution, which basically makes the UI the frame of reference for the player. That, coupled with something like the visor (or helmet) of the player character, can give the desired frame of reference. A good example of this can be seen in the upcoming game Adr1ft. Unfortunately, not all game settings permit this kind of implementation. A Viking with an astronaut helmet… maybe for FATED 2!

Virtual Nose

Nose

By far my favorite is the virtual nose, which is basically the natural frame of reference for humans. This research, which pretty much flooded the VR space when it was first released, is basically the extreme representation of the “cockpit” concept. We tested the idea on FATED, and promptly removed it. For some reason (I'm blaming low FOV), what the player ends up seeing is two partial noses, one on each side of the screen. It's very troubling for the player and just feels out of place; I really don't see this becoming the miracle pill that solves motion sickness.

Solution #3: Swivel Chair

The idea here is that since rotation in VR is such a problem, why not make players rotate on themselves in the real world? This does work, of course, but with the headset tethered by a metric ton of wires, it does not always end well. Some folks out there, like the people at Roto, made a chair especially for VR that aims to solve that. While the idea is sound, we believe that having to invest in yet another piece of equipment to enjoy VR is not the way to go.

All in all, these solutions can work, but there is no single “magic trick” to solving the locomotion issue for everyone. This single fact is fuel for VR skeptics.

The Fear of VR

The fear is very real

 

I’ve seen it numerous times: when someone mentions trying a VR demo, the automatic question is “but will I be sick?” It’s part fear of the unknown, part past experiences that were not exactly “perfect”. No one wants to say “maybe you’ll be sick testing my game”. It became imperative, at all costs, to make sure every VR experience was free of simulation sickness.

However, the fear of VR is not limited to consumers fearing the ill effects of the technology on their health; there is also the fear, among the industry giants bringing this to market, that bad content could kill the appeal. As VR development grew stronger over the last year, and with the consumer version finally nearing completion, it became abundantly clear that keeping motion sickness at bay was paramount. Palmer Luckey, Oculus' founder, and John Carmack both stated it very eloquently:

“Virtual reality's biggest enemy is bad virtual reality.” –Palmer Luckey

“The fear is if a really bad V.R. product comes out, it could send the industry back to the ’90s.” –John Carmack

But this leaves us with a question that begs to be answered…

What Is BAD Virtual Reality?

All that work leaves us with a pretty bleak picture of what we can do in virtual reality, given the premise that if we want a “good” VR product, we must absolutely have zero simulation sickness, for EVERYONE. Any of the above solutions can work for one person and not another, or it can help without completely eliminating the effect.

Here is a small list of the things you will not be able to experience in VR if you are very, very affected by motion sickness in general:

  • All experiences with movement in them.

Yikes. No rollercoasters for you!

There is a great read by Lee Perry on Gamasutra that you should really check out; it goes deep into the kind of thinking I believe VR content creators should adhere to. If there is one thing I would like to emphasize in this article, it's this: “People have extremely individualized issues with VR”.

So the definition of “bad” virtual reality is going to differ from person to person, as tolerance to certain movements varies. The point of this entire post is to make it clear that maybe some VR experiences will not be for everyone. While Oculus' reluctance to push out demos that might alienate a segment of players is understandable, what would be worse is us limiting the types of experience people can enjoy. For example, while it's true that yaw rotation makes a lot of people want to barf, for those it does not affect (or affects to a lesser degree), it's a great and intuitive way to move around.

We don't want to limit the kind of VR content we create to encompass every demographic; that is an impossible task that would have us throw out things we hold dear, experiences we really believe are powerful even if a smaller percentage of people can enjoy them comfortably. What we can do, however, is offer the maximum number of options and tweaks to let players mold the experience to their taste and sensitivity. We believe VR needs to come with its own “rating” system that has “comfort” as its unit of measure, the same way you know whether you can handle the rollercoasters at an amusement park or should rather stick with the big wheel. We just need to be honest with our players and present our experience as it is. Oculus already started doing this on Oculus Share with its comfort level scale, even if they seem to have shied away from the concept since then.

Well, I knew this post was going to be long, but I did not quite think it would get this long! While this part is pretty much about exposing the problem, there is a second part coming that focuses on what we did, or plan to do, to resolve the simulation sickness problems in FATED. After all, we still plan to have this next to our game when it ships:

Comfort

In the meantime, if you have any questions or comment on the subject, please do share! I’ll be happy to read and discuss them with you!

Mick / Lead Programmer / @Mickd777

www.fatedgame.com


Hi everyone!

With Oculus’s recent announcement regarding requirements and specs for the consumer version of their HMD (https://www.oculus.com/blog/powering-the-rift/), I figured it was the perfect time to write that performance bit I teased about the last time around. Let’s see how we’re dealing with optimization on FATED! First, some math to have a clear vision of what we’re trying to achieve.

Know Your Numbers!

FATED is pretty much fillrate bound (http://en.wikipedia.org/wiki/Fillrate), and it’s safe to assume that most early VR games will also be. This is why the following info is important.

A current generation game will generally push about 124 million pixels per second when running at 60 fps in 1080p. FATED is currently running on a GTX 970 (the recommended card for the consumer version of the Oculus) at ~305 million pixels per second.

1920X1080 upscaled by 140% = 2688X1512 * 75(Hz) = ~305 million

The CV1 native resolution and refresh rate looks like this:

2160X1200 * 90(Hz) = ~233 million

Oculus’ Atman Binstock revealed that “At the default eye-target scale, the Rift’s rendering requirements go much higher: around 400 million shaded pixels per second.” After some math, you can figure out that this pretty much means a 130% upscale seems to be ‘the norm’ for Oculus.

2160X1200 upscaled by 130% = 2808X1560 * 90(Hz) = ~394 million

While we don’t have the final hardware yet, we can have an estimate of what it will need in terms of processing power. The closest we can get using the DK2 is by pushing the screen percentage to around 160. This is what we are aiming to achieve!

Trade-offs A.K.A. Battle with the Art Team

Unreal can be pretty daunting when you first enter it, mainly because the engine cranks everything to 11 by default. The first thing you want to do is take a look at what is costing so much and turn off what you don’t need. Sometimes, the art team will hate you for asking to remove that extra post-process, but remember that a vomit-free experience is more important!

With FATED, we decided to go with a ‘stylized' look so we could remove some of Unreal's cost-heavy features while keeping the visuals as stunning as possible. We also made the conscious decision to keep each scene as tightly contained as possible. We want to control which objects are seen at each moment (limit those draw calls!), and so we design the levels accordingly. These assumptions allowed us to remove some of the features of the engine without overly affecting our visual target. Here are some decisions we made early on:

  • No dynamic shadows, everything is baked (with exceptions…)
  • No dynamic lights either (with exceptions…)
  • Limit post-processes: No DOF, motion blur, or lens flare

Console Command Options

Here are some interesting commands that we used to disable some costlier features:

r.HZBOcclusion 0: Disables hierarchical Z-buffer (HZB) occlusion culling

r.TranslucentLightingVolume 0: Disables translucent lighting

r.AmbientOcclusionLevels 0: No ambient occlusion! It’s a nice feature, but we don’t need it; remove that!

r.SSR.Quality 0: Screen space reflection is also cool, but it’s costly and we don’t need it; delete!
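If you would rather apply these at startup than type them into the console each time, one option (a sketch; [SystemSettings] is the engine's standard section for console variables) is to add them to your DefaultEngine.ini:

[SystemSettings]
r.HZBOcclusion=0
r.TranslucentLightingVolume=0
r.AmbientOcclusionLevels=0
r.SSR.Quality=0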

Profiling Limits: Dig Deeper with GPA

At some point, it becomes hard to pinpoint what is really happening GPU-side using only the GPU Profiler. To really get a sense of what is going on, we need something that digs even deeper! We're using a free tool called Intel GPA.

mic1

https://software.intel.com/en-us/gpa/faq

We won’t go in depth on how to use the tool, but there is one important thing to know: it won’t work in ‘Direct to HMD’ mode. So, to start a capture, you need to be in ‘Extend to Desktop’ mode. The quickest way we found to take a capture was to open the editor, set GPA to ‘Auto-detect launched app’, and then run the game in ‘Standalone’.

Now for the Juicy Tips!

Analyze your scene: I talked about the ‘show’ commands and ‘stat scenerendering’ in my last post; this is where you want to use them to determine what your bottleneck is.

Instancing Static Meshes + Foliage: If you have too many draw calls, this could be a life saver! Foliage is especially great if you want to have a dense forest or lots of smaller meshes. But keep in mind that the level of detail in foliage can easily multiply your draw calls. Also, instancing is not always the best option, so make sure it’s really going to help. Don’t hesitate to compare using GPA!

Particle System Bounds: While profiling FATED, I found out that a lot of particle systems we were not supposed to see were being rendered. It turns out the culling of particle systems is not set up by default!

mic2

Project rendering settings – Screen Clear: This is a minor optimization, but every microsecond is worth it! If you always render something on each pixel (you have a Skybox, say) this is worth setting to ‘No Clear’. Be aware that this should only be set for actual builds, since it will cause weird artifacts in the editor asset viewer viewports.

Project rendering settings – Early Z Pass: This is one of the best helpers for the fillrate. This will do more draw calls, but it’s such a huge help for the number of pixels drawn that it is worth enabling. Some frames got as much as 25% speed gain by enabling that!

mic3
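For reference, both project settings above map to console variables, so a sketch of the equivalent DefaultEngine.ini entries (values assume ‘No Clear' and an opaque-only early Z pass; double-check against your project settings UI) could look like this:

[/Script/Engine.RendererSettings]
r.ClearSceneMethod=0
r.EarlyZPass=2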

Disable post-processes when not using them: We got some really nice post-processes for some features in our game, but they are not always used. Be sure to remove those from the ‘Blendables’ array when they’re not needed!

Shipping Builds: It’s good to remember that your shipping build is going to run a bit faster than your dev build.

We're always looking for ways to improve performance, and we're not done optimizing, but this should give you a basic idea of how we work on FATED: always profile, add one feature at a time, and look for more ways to make the game run ever more smoothly. There is a whole section dedicated to performance in the Unreal 4 documentation (https://docs.unrealengine.com/latest/INT/Engine/Performance/index.html); I highly recommend it to those who want further insight!

Meanwhile, if you have tips to share, or any questions or comments, send them in and I’ll be happy to address them! ‘Til next time!

Mick / Lead Programmer


Hey guys and gals!

Following last week’s post, which introduced some basic VR concepts, I thought I would do the same but oriented toward Unreal Engine 4. Since the engine went free a few weeks ago, I bet there are a lot of newcomers who could use a hand! So without further ado, let’s delve into the subject.

THE UNREAL EDITOR AND VR

There are a lot of ways to preview your content in Unreal, two of which are particularly helpful for VR. The Play mode can be found in the toolbar above the viewport in the standard Unreal layout.

VR Preview:

This Play mode starts the game in VR mode directly inside the editor. This is new in Unreal 4.7 and is very useful for actually having HMD data inside the editor to debug (in blueprints, for example). This Play mode works well with Oculus' Direct to HMD mode, but it seems to have some issues in extended mode.

PlayMode

Standalone Game:

Prior to 4.7, this was the way to test your VR content. Pressing Alt+Enter starts the stereo rendering. We still use it to profile using Intel GPA. More on our usage of Intel GPA in another post!

World to Meters:

In the World settings, this value is under the VR category. This is the representation of Unreal units to real-world meters. 100 means that 100 Unreal Units equal 1 meter in real life. For FATED, we played a bit with this value to get the proportions we needed. It is recommended to set this value so that it fits with the real-life measurement of your in-game objects.

Head Mounted Display Blueprint Nodes:

There are a lot of exposed methods to do logic directly inside blueprints. This category should be the first place to look for HMD-related functionalities.

HMDBlueprint

Use Less CPU in Background:

You want the editor to stop most of its processing when testing your content. This option can be found under the Editor Preferences in the Miscellaneous section.

LessCPU

Scalability Settings:

I feel like I'm repeating myself, but performance is critical in a VR game! Unreal Engine 4 comes with great scalability features that can help define the range of performance for your game, from your minimum specs to your recommended specs. More on this in a performance post I'm planning to write soon.

Scalability

COMMAND LINE FUN!

If you're already familiar with Unreal development, you know about the console and all the commands that can be entered through it. This is also true of the HMD (Head-Mounted Display) interface. Most of these commands are accessed with the “hmd” keyword; here is a list of the ones we use the most in FATED's development.

hmd stats:

This command displays info about your HMD. A lot of the stuff I’ve mentioned in my previous post can be seen here. Your current latency, whether positional tracking is activated, whether the camera is detecting the HMD, etc. There is also information on Time Warping, and whether the engine is updating the HMD info in the render thread. These are techniques to reduce latency that come right out of the gate in Unreal 4, making it a top-of-the-line choice for VR development.

hmd sp YYY:

YYY must be replaced by a number between 30 and 300. This is one of the easiest performance boosters, at the expense of visual quality. To create a better-looking picture, Unreal scales the image up before the distortion process, sampling it down afterwards to fit your HMD screen. By default it is at 135, but you can easily bring it down to 100 for better performance. It does look better at a higher percentage, though!

hmd vsync on/off:

Deactivating vsync can be useful to track your actual framerate.

stat unit:

Figure out if you are render thread-bound or GPU-bound. For FATED, we are pretty much always GPU-bound.

stat fps:

Are we running at 75 fps?

stat scenerendering:

This will show useful information about the scene’s rendering. You want to keep an eye on the number of draw calls, for example.

show YYY:

The ‘show’ commands are useful to hide some elements of your scene to try and figure out how much each of these elements takes to render. ‘Show staticmeshes’ and ‘show postprocessing’ are two examples of commands we used in early profiling.

ShowCommands

profile GPU:

This will launch the Unreal Engine GPU Profiler. You can also do this at any time by pressing [Ctrl+Shift+,]. This is a good one to know about, as it is the first tool you will want to use for GPU performance profiling. It is kind of limited (and the reason we are using Intel GPA for deeper profiling), but it will point out some of the costlier effects in your scene.

GPUProfiler

These are the most useful ones for us, but go digging in there and you'll find plenty of other useful commands.

In my next episode of the FATED blog…

That’s it for now! Next time, I will talk a bit more about performance and how we got FATED running on a “normal” rig while still retaining great visuals!

Meanwhile, don’t hesitate to comment or ask a question; we’ll be happy to discuss with you guys!


Hi, my name is Michaël Dubé, Lead Programmer on the awesome FATED project!

Starting a new project is always thrilling and frightening at the same time. Venturing into virtual reality using a completely new technology, Unreal Engine 4, was something we were mostly excited about, though. Everyone on the team was super hyped for this project (and VR in general), which made the terrifying parts (hello performance!) not so terrible to bear.

Nevertheless, we learned a lot of really neat stuff during our first months of development, both about UE4 and VR, and my hope is for this blog to be half an entertaining view of the development of our game, and half a chronicle of all the blood and tears we shed to get there. It’s a Viking game after all; you can’t expect it to be all unicorns and rainbows!

For this first post, I wanted to keep it simple and talk about some basic but important virtual reality terminology, along with some tips on getting your rig ready to rock this new reality.

VR VOCABULARY

There are a lot of technical terms that come with VR development; I wanted to recap some of those and what they mean for FATED.

LATENCY
Basically, this is the time gap between your movement and its replication on-screen. You will often hear the term ‘motion-to-photons latency’ to talk about that. Low latency is very important in VR, as it is considered a major factor in reducing the effect of the infamous simulation sickness.

SIMULATION SICKNESS
This is what you absolutely don't want the user to feel! It is similar to motion sickness (reading in a car) or sea sickness. Research still hasn't pinpointed exactly what causes it, but everything points to cognitive dissonance. As soon as you make users experience something they don't do in real life, there is a chance it will trigger simulation sickness. The Oculus Best Practices Guide is a great place to get info on the dos and don'ts.

http://static.oculus.com/sdk-downloads/documents/Oculus_Best_Practices_Guide.pdf

RESOLUTION AND THE SCREEN DOOR EFFECT

“The ‘screen-door effect’ […] is a visual artifact of the projection technology […] where the fine lines separating the projector’s pixels become visible in the projected image.”
http://en.wikipedia.org/wiki/Screen-door_effect

This is one of the reasons the consumer version of the technology will need better resolution. However, for developers this means that we have to take extra care when it comes to performance. We have great performance tips that we applied to FATED, and I'm eager to share them with you!

REFRESH RATE
This is the frequency at which the screen refreshes. The DK2 is set at 75 Hz, so basically we need to update the screen at a constant 75 frames per second (a budget of roughly 13.3 ms per frame) to get the best result. In VR, it is of the utmost importance to keep a steady framerate. Unless you want to make your players sick, that is (which we don't, we swear!).

PRESENCE
This is what VR is all about! Presence is basically tricking your brain into thinking it's really in your game. We are designing with this principle in mind to create mind-blowing and/or powerful moments for the players to experience. FATED is all about making the player live a meaningful and poignant adventure. Because of this alone, I strongly believe VR is the next step in the narrative medium.

http://static.oculus.com/connect/slides/OculusConnect_Epic_UE4_Integration_and_Demos.pdf

HARDWARE TIPS

Installing the Oculus Runtime is as simple as it gets, but there are some important settings that every VR enthusiast should be aware of.

RIFT DISPLAY MODE
What Direct to HMD does is ensure that the application is synced to the actual refresh rate of your Rift. It also helps reduce latency, since the step of going through your desktop is bypassed. While I prefer the simpler Direct to HMD, we are still using the Extended Display Mode on FATED for debugging purposes (more on that in another post!).

OCULUS CONFIGURATION UTILITY
This is where you can set up some information about yourself to get a better experience. Developers can then use this information to adjust the experience so it better fits the user, like adjusting the camera position using your IPD (interpupillary distance, the distance between your eyes).


BEWARE OF THE NVIDIA CONTROL PANEL!

If you have an NVIDIA card, chances are you can get a huge performance boost (we definitely saw a difference on some cards) by doing two simple things in the NVIDIA Control Panel. You should, of course, have the latest drivers before doing that.
First, in Manage 3D Settings, there is a “Virtual Reality pre-rendered frames” setting that is set to 1 by default. You want to set that to “Use the 3D application setting”. You will also see “Power management mode”; set that to maximum performance.


Next, you may want to go in your PC’s Control Panel, in the Power Options, and set your minimum processor state to 100%.


The above settings were part of the changes I made for FATED when we encountered a problem known on the Unreal forums as the ‘37.5 fps lock'. This lock would normally happen when the frame rate drops below 75 frames per second, but that was not the case for us and some other developers. Some Oculus staff were involved in pointing this out, and it may not be necessary anymore once the consumer version is ready, but it definitely helped for FATED.

Well, that’s all for now. Obviously, there is a lot more to VR, and we will keep adding interesting content to this blog. In the meantime, if you have great tips to share with us or if you have any questions, we’d be happy to hear you out in the comments!

Stay tuned for more on FATED!