Archives For Tech

Hello again folks!

This week, I pick up where I left off last week with Localizing FATED in Unreal Engine 4 – Part 1. Having already covered the basics of text localization, I’ll follow up with voice-over localization using Wwise, as well as other things to consider when localizing in Unreal Engine 4.

Localizing Voice-Over Using Wwise

Wwise is a really great tool, and I can’t imagine doing audio work without it ever again. The Unreal Engine integration is great, even if some stuff is missing, namely localization support. Fortunately, all the ground work is pretty much done, meaning there are only a couple of things you need to modify to make it all work.

Adding localized audio files in your Wwise project

The first thing you want to do is actually setup your localized audio files in your Wwise project using the Wwise authoring tool. This is very easy to do, and requires no additional step in the UE4 editor, since what you import in Unreal (Events and Banks) is language-agnostic. Basically, the event stays the same, but the audio files played will differ. The only thing you have to do is import the new localized audio file over the already existing audio files in Wwise. Right-click on an audio file (or folder) and select “Import Audio Files…”.
This screen will show up:

Audio File Importer
Be sure to select “Localize languages” in Import Mode, and the desired language in “Destination language”.

Select the preview language in the Wwise authoring tool

Now, you do this for every VO of your game. It is important that every localized file name matches the original VO file name. You can import by folder to go faster. You can also set your project to the desired language; that way, Wwise shows which audio files have a counterpart for the selected language, making it easy to spot audio that does not yet have a valid localization. Once this is done, everything on the Wwise side is complete. Let’s return to Unreal!

Generating banks for all supported languages

In AkAudioBankGenerationHelpers.cpp you will find the GenerateSoundBanks() function. In there, the following line specifies which bank language will be generated:

CommandLineParams += TEXT(" -Language English(US)");

For some reason, by default it only generates English banks; basically you have to add each language you want to generate. Even simpler, just comment out the line. This will make it so that it generates all languages specified in your Wwise project.

Specifying Culture in the Game

Now that the text and voice-overs have been generated in every supported language, you actually need to specify which language you want the game to show. In general, you’ll want to do this operation at the very start of the game. Unreal offers a simple way to launch the game in the desired culture using a launch argument: -culture="fr".

This will effectively call for the change in culture on the text side; unfortunately, it does not do anything regarding the Wwise banks. Instead of using the -culture argument, let’s dive in and see the actual calls we need to make the switch of culture happen.

For the text, the FInternationalization class is used to set different cultures. Here is an example of how to set the game in French:

FInternationalization::Get().SetCurrentCulture(TEXT("fr"));

This will only change the text, however. For the Wwise content, you need to make yet another small change. For that purpose, I added a function to the UAkGameplayStatics class that I called void SetLanguageToCurrentCulture().

In this function, the first step is to get the current culture:

FString currentCulture = FInternationalization::Get().GetCurrentCulture()->GetName();

After that, you can assign the Wwise current language this way:

AK::StreamMgr::SetCurrentLanguage(AKTEXT("French(Canada)"));

The name of the language is a string equivalent of how the language is called inside the Wwise authoring tool.
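To avoid hard-coding the Wwise name at the call site, SetLanguageToCurrentCulture() can go through a small lookup table. Here is a minimal, engine-free sketch of that mapping (the helper name and table contents are my own; the values must match the language names defined in your Wwise project):

```cpp
#include <map>
#include <string>

// Hypothetical lookup table: keys are Unreal culture names, values are the
// language names exactly as they appear in the Wwise project's Language list.
static std::string CultureToWwiseLanguage(const std::string& culture)
{
    static const std::map<std::string, std::string> table = {
        { "en", "English(US)" },
        { "fr", "French(Canada)" },
    };
    const auto it = table.find(culture);
    // Fall back to the default language when the culture is unknown.
    return it != table.end() ? it->second : "English(US)";
}
```

With something like this in place, the function body boils down to passing the looked-up name to AK::StreamMgr::SetCurrentLanguage().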

Optionally, you can also get the current Wwise language and unload the VO bank if the new language is not the same as what was loaded.

AK::StreamMgr::GetCurrentLanguage();
UnloadBankByName(TEXT("Act01_VO"));

Now you should have everything you need to have your game’s text and voice-overs localized.

IMPORTANT NOTE: For some reason, Unreal does not allow the culture to be changed directly in the editor. This can be frustrating if you want to see the content in another culture directly in the editor. Fortunately, there is a way to remedy the situation quite easily.

In TextLocalizationManager.cpp, you can find the OnCultureChanged() function. What you want to do is remove the FApp::IsGame() from the ShouldLoadGame initialization.

//const bool ShouldLoadGame = bIsInitialized && FApp::IsGame(); //Before
const bool ShouldLoadGame = bIsInitialized; //After

IMPORTANT NOTE 2: Yet another “annoying” default behavior is that a packaged game will not fall back to the “root” culture when the culture specified is more specific. For example, if your game is in “fr-CA”, it will not default back to “fr” but rather to the default culture, which is “en”. Yet again, a small change will fix that.

In ICUInternationalization.cpp, you can find the SetCurrentCulture(const FString&) function. You simply want to allow fallback, which is something Unreal already supports, just not by default.

// Allow fallback: this makes it so that we can use "root" languages like "en" and "fr"
// without needing to specify "en-US", "fr-CA", "fr-CD", "fr-CF"...
FCulturePtr NewCurrentCulture = FindOrMakeCulture(Name, true /*Allow Fallback*/);
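The fallback logic itself is conceptually simple. As a plain-C++ illustration (not the engine code), here is the chain of candidates a culture like “fr-CA” would produce, from most to least specific:

```cpp
#include <string>
#include <vector>

// Walk from the most specific culture name to the root by trimming at each
// '-' separator, e.g. "fr-CA" -> "fr". Illustration only; Unreal's
// FindOrMakeCulture does the real work.
static std::vector<std::string> CultureFallbackChain(std::string culture)
{
    std::vector<std::string> chain;
    while (true)
    {
        chain.push_back(culture);
        const std::size_t dash = culture.rfind('-');
        if (dash == std::string::npos)
            break;
        culture.erase(dash); // drop the most specific part
    }
    return chain;
}
```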

Getting the Steam App Language

I won’t go over how to setup your game for Steam (there are already a few good tutorials out there for that), but I’ll talk about the snippet of code needed to use the language currently selected in Steam.

First, I added FString GetSteamAppLanguage(); in OnlineSubsystemSteam. You can add it to your custom version of OnlineSubsystemSteam, but I felt it was enough of a “core” feature to be added directly to OnlineSubsystemSteam.
Here is the function in its entirety:

FString FOnlineSubsystemSteam::GetSteamAppLanguage()
{
	FString langFString = TEXT("english"); // Default
	ISteamApps* steamApps = SteamApps();
	if (steamApps)
	{
		langFString = FString(ANSI_TO_TCHAR(steamApps->GetCurrentGameLanguage()));
	}
	return langFString;
}

SteamApps() won’t always be valid, depending on some factors (like if the game was launched from Steam or not), so be sure to have a default value to fallback to.
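Note that GetCurrentGameLanguage() returns Steam’s full API language names (“english”, “french”, and so on), not culture codes, so you’ll want a small conversion step before calling SetCurrentCulture(). A sketch, with a table you would extend for every language you support (the helper name is mine):

```cpp
#include <map>
#include <string>

// Steam reports full API language names; this small table (contents are ours)
// converts them to the culture codes Unreal expects.
static std::string SteamLanguageToCulture(const std::string& steamLang)
{
    static const std::map<std::string, std::string> table = {
        { "english", "en" },
        { "french",  "fr" },
    };
    const auto it = table.find(steamLang);
    return it != table.end() ? it->second : "en"; // default culture
}
```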

Unreal is certainly going to add a lot to its Localization toolset; in fact, it’s already happening in 4.10.1 with the OneSky localization service plugin. This will certainly come in handy if you’re thinking about using that service for your game. There is also the experimental “Localization Dashboard” that can be activated in the “Experimental” section of Editor Preferences.


I did not use the Dashboard extensively, but it promises to one day remove all that “.ini” manipulation (which is not that user-friendly…) and make it all transparent through a nice visual tool. It seems to work well enough, but all it will do is manipulate all the files we talked about earlier, so it is still relevant to know how these files work.

This is it for localization in UE4. Obviously, there is more to it than what I addressed in this post, but you should be well on your way to making it work in your own game. Hope this was helpful, and don’t hesitate to ask questions in the comments section. Stay tuned for more on FATED, as we quickly approach the release date!

Mick / Lead Programmer / @Mickd777
www.fatedgame.com

Hello again!

Last month, I was tasked with localizing FATED in French. As some of you may be aware, Frima Studio is a Québec City-based studio that predominantly speaks French, so it was fun to finally have the script integrated in our native language. One of our characters is even voiced by Nicolas Cage’s official French voice actor. How cool is that?!

Unreal is equipped with a lot of game localization tools, even if they’re not quite as polished as the rest of the engine. In this post, I’ll explain how localization works, from text to voice-over. I’ll also give a few tips on how to modify the engine to allow different cultures directly in the editor, which changes are needed to support localization for Steam builds, and, finally, which modifications are required to make the Wwise UE4 integration fully localization-ready. In short, how we managed to get a fully working localized build for FATED!

Before reading on, take note that I worked on the localization using Unreal 4.8.2. We recently upgraded to 4.10.1, so while I can confirm that this is all still valid, some new features may have been added that I’m not aware of.

Localizing Text

If you’re familiar with UE3, you’ll notice how different the localization system is now. Epic pretty much ditched their old system. While UE3’s localization system was lacking in some ways, I personally find that UE4’s sometimes trades simplicity for a more robust and versatile system that is unfortunately not quite finished yet, which can lead to confusion. In UE3, you just had to use a LOCALIZE macro with some parameters that would point to the correct “.ini” file containing the localized text. In UE4, the process is a bit more convoluted, but once you’ve familiarized yourself with its intricacies, it’s quite easy to use. Now let’s dive in!

FString and FText
If you’re familiar with Unreal development, you already know about FString, the custom string implementation used throughout the engine. For localization, Epic introduced a new class, FText, which needs to be used whenever we wish to localize text content. Using FText marks the text to be collected in a later phase of the localization process: the “gathering” phase, performed by a specific commandlet (namely the GatherText commandlet).

NSLOCTEXT
When changing text directly in the code, you need to use the NSLOCTEXT macro. This macro uses three parameters: the namespace of the text, a key to represent this text, and the string literal in the default language, which in our case is English. It looks something like this:

FText chapterSelectTxt = NSLOCTEXT("MainMenuUI", "ChapterSelect", "Chapter Selection");

This will later determine how your language archive is generated. We will look at the .archive file generated in a moment.

GatherText Commandlet
The next step is to actually gather all the FText from your project. This also means that we will be able to get text from every blueprint instance for every map of your game, for example. For this to work, you need to start the UE4 editor with a specific command line. I find that the best way to do this is to create a batch file. So I created the LocalizeJotunn.bat file (Jotunn is the internal codename for FATED), which is located in the same folder as the .uproject file.

set PROJECT_PATH=%CD%
..\..\Engine_4.10.1\Engine\Binaries\Win64\UE4Editor.exe %PROJECT_PATH%/Jotunn.uproject -Run=GatherText -config="%PROJECT_PATH%/Config/Localization/Game.ini" -log > localization.log

From that file, you can notice a reference to a file named Game.ini in Config/Localization/. You need to create that file: this is where the entire configuration for the GatherText commandlet is going to reside. You can find our config file here: configFile
I strongly recommend you start from that file and adjust for your needs. This file has different sections; let’s take a look at them.

[CommonSettings]
This is where you set where the localization files will reside. It is important that you put the same path that is set under the [Internationalization] section of BaseGame.ini (or your custom %YourGameName%Game.ini). By default, this is the path you should see:

[Internationalization]
+LocalizationPaths=%GAMEDIR%Content/Localization/Game

CommonSettings is also where you set the native culture of your game (default language). After that, using the CulturesToGenerate property, you can list all the languages for which you need to create localization files.
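For reference, the [CommonSettings] section we ended up with looks roughly like this (key names from our 4.8-era config; double-check them against the config file linked above, as they can change between engine versions):

```ini
[CommonSettings]
SourcePath=./Content/Localization/Game
DestinationPath=./Content/Localization/Game
ManifestName=Game.manifest
ArchiveName=Game.archive
SourceCulture=en
CulturesToGenerate=en
CulturesToGenerate=fr
```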

[GatherTextStep*]
There will be a number of GatherTextStep entries, each with its own Commandlet class. The two most important ones you will want to check are GatherTextFromSource and GatherTextFromAssets. GatherTextFromSource will scan source code for the NSLOCTEXT macros I mentioned earlier, while GatherTextFromAssets will scan for the FText in your .uasset and .umap files.

The documentation on this on the Web is not up to date; at least it wasn’t when I worked on the localization, so follow our file for that. You will mainly want to verify the paths for the objects to gather text from. BEWARE! Some of the required parameters are paths (SearchDirectoryPaths) and some are filters (IncludePathFilters). Both kinda look the same, but for filters you don’t want to miss out on the required asterisk (*)!

I personally found that GatherTextFromAssets was gathering too many FText entries that I did not want to localize in the first place. We use a lot of TextRenderComponents in the game that are editor-only, and this was polluting the localization archive. Since FATED doesn’t have that much text anyway, I decided to only use GatherTextFromSource and force our texts to be localized in source using the NSLOCTEXT macro. It simplified the process for us, but it may not be what you need for your game.

The other steps (GenerateGatherManifest, GenerateGatherArchive, etc.) I did not change, but they are required to actually generate the files that will end up in your localization content folder (Content/Localization/Game).

Generated files: .archive and .manifest files
The main generated file that you will want to modify afterward is the .archive file. This is where your actual localization text will be stored. For each language generated, a folder representing it will be created in Content/Localization/Game; in our case, an “en” and a “fr” folder. You can open the .archive file using any text editor. For example:

"Source":
{
        "Text": "2.1 - A New Beginning"
},
"Translation":
{
        "Text": "2.1 - Un nouveau départ"
}

The .manifest file is not meant to be modified, but you can get information on the gathering process there. It could be useful to track down where the text was gathered and the actual key used. Example:

"Source":
{
        "Text": "2.1 - A New Beginning"
},
"Keys": [
{
                "Key": "Chapter4Name",
                "Path": "Source/Jotunn/Private/UI/UIChapterSelect.cpp - line 196"
        }

]

That’s it for Text Localization. I’ll go over the Voice-Over localization using Wwise as well as other things to consider when localizing in my next blog entry next week. I hope this guide has been helpful to you folks out there. See you next week.

 

Mick / Lead Programmer / @Mickd777
www.fatedgame.com

Memory Management in UE4

Michaël Dubé —  October 22, 2015 — 8 Comments

Hey there!

Well, it’s been a while since I’ve written something for the blog; we’ve been busy at Frima building a cool update to our Oculus Connect 2 demo, which Vincent and fellow programmer Marc-André attended. We now have facial animations and kick-ass voice overs into the game, which bring it to a whole new level. There’s also an amazing article published on Polygon about FATED, so things are getting exciting! Click here to read the article.

Since we are only two programmers on the project, and since we are needed for pretty much every technical aspect of the game, sometimes stuff falls into the cracks. That’s pretty much what happened with memory management for FATED, and since we were both new to Unreal 4, we made some rather critical mistakes on top of not spending a minute on investigating what was loaded, and when.

When the rush to get our OC2 demo out the door was over, it was taking a whole 75 seconds (!) to load our demo scene. That’s pretty insane for any kind of game. It was time to dig a bit deeper into what was actually taking so long to load, and maybe add a bit of asynchronous loading in there. So while this post may be somewhat of a given for Unreal veterans, I think it could be helpful for some developers with a bit less exposure to the tech.

 

WHAT’S UNDER THE HOOD?

First things first: identify the culprits. Unreal is pretty dope when it comes to console commands you can use to investigate every aspect of the game, and memory statistics are no exception. One of the most useful commands out there is "memreport -full". The -full parameter will dump more information that could prove useful but is not required per se. Running this command will create a .memreport file in your "Saved\Profiling\MemReports" project folder. It’s a custom file extension, but this is actually a text file representing a snapshot of your game’s current memory state.

You’ll probably want to look first at the object list, which is the complete list of UObject you have loaded and information on the amount of memory space they take up. You will have a line looking like this for every object type:

AnimSequence     36     16628K     16634K      7578K      7578K
Texture2D       150       212K       219K    260996K     96215K

AnimSequence (or Texture2D) is the name of the class, while the number next to it (36 or 150) is the number of instances of that class. Of the four numbers that follow, you mostly want to look at the first and the last. The first is the actual size of the objects themselves, while the last is the size of the assets those objects exclusively reference (meaning no other object references that data). For example, the Texture2D objects take little memory as objects holding the texture metadata (212K), whereas the actual texture data (pixels) they exclusively reference amounts to 96215K.

The two lines above were extracted from a memreport for this scene:

MemoryTestsMap

That’s right, there is absolutely nothing in there except the floor, a player start, and the skybox. So where the heck are those 36 animations and 150 textures coming from?

 

FINDING ASSET REFERENCES

Now that I knew there was something clearly wrong there, I had to find out which assets were loaded. Sure, there are 36 animation sequences in memory, but which ones and why? Again, UE4 can show that to you quite easily.

The memreport command is in fact a list of other commands executed one after the other to form a “complete” report of your game memory. One of the commands it runs is “obj list”. This is what outputs the list of objects and the number of instances, but this command can also take some parameters, one of which is the class of the objects.

obj list Class=AnimSequence

This will output all of the animation sequences in memory at that moment, ordered by what uses the most memory (the first number). Like this:

AnimSequence /Game/Character/Horse/Animation/AS_HorseCart_idle.AS_HorseCart_idle
491K       491K       210K       210K

 

Now, if you’re like me, you might want to make this look a bit clearer, and fortunately it’s quite easy to write your own Exec function that can output exactly what you want. It can, for instance, sort by exclusive resource size instead of object size. Here is a sample of code to get you started in that direction:

for (TObjectIterator<UAnimSequence> Itr; Itr; ++Itr)
{
	FString animName = Itr->GetName();
	int64 exclusiveResSize = Itr->GetResourceSize(EResourceSizeMode::Exclusive);
}
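To flesh that out, the (name, exclusive size) pairs gathered by a loop like the one above just need a descending sort. A standalone sketch (the function name is mine, and in the engine the pairs would come from the iterator loop):

```cpp
#include <algorithm>
#include <cstdint>
#include <string>
#include <utility>
#include <vector>

// Sort (asset name, exclusive resource size) pairs so the biggest memory
// consumers come first.
static void SortByExclusiveSize(std::vector<std::pair<std::string, int64_t>>& objects)
{
    std::sort(objects.begin(), objects.end(),
              [](const auto& a, const auto& b) { return a.second > b.second; });
}
```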

 

The Visual Reference Viewer

Now that I could pinpoint exactly which objects were loaded, I was able to investigate with more ease what was referencing them in that seemingly empty scene. There are two ways to view object references in Unreal. First, there is the visual reference viewer (that you can see in the picture above) in the editor; this will show all potential references, not necessarily what is referencing the actual loaded asset in memory right now. Of course, there is also an easy way to figure out the current shortest reference with yet another console command.

 

obj refs name=AS_HorseCart_idle

 

This will output a chain of references for what is loaded in memory (it can take some time), and following this chain will usually lead to the culprit. In our case, we were making one mistake that was responsible for a lot of those loose asset references: direct asset references in some of our native class constructors. Like this one, for example:

 

static ConstructorHelpers::FObjectFinder<UMaterial>
	TextMaterial(TEXT("Material'/Game/UI/Materials/UITextMat.UITextMat'"));

 

In the above example, this is a direct reference to a material: this material will always be present in memory. This is quite logical when you think about it, since the constructor is not only called when an instance of said class is created, but also at startup as it is run to set default properties when the static version of the class is created. So avoid those if it’s not something you need at all times in memory! Instead, opt for a UPROPERTY that can be assigned in the blueprint version of that class, even if it’s always the same asset. At least that way if you don’t have an instance of that object loaded in your scene, it won’t be in memory.

 

ASYNCHRONOUS LOADING USING THE STREAMABLE MANAGER

The wrongly referenced assets were not the only reason why it was taking so long to load our demo, so there was still more work to do. We had some remaining temporary assets, some pretty huge textures (I’ll talk about that in more detail below) but, more importantly, we still had a lot to load and we were trying to load everything at once. Let’s see how to asynchronously load assets using what Unreal calls the Streamable Manager.

If you have an actor that references multiple Animation blueprints, for example, they will all be loaded when you instantiate it, even if you only use one at a time.

 

TSubclassOf<UAnimInstance> mAnimBPClass01;
TSubclassOf<UAnimInstance> mAnimBPClass02;
TSubclassOf<UAnimInstance> mAnimBPClass03;

The above lines should become this:

TAssetSubclassOf<UAnimInstance> mAnimBPClass01;
TAssetSubclassOf<UAnimInstance> mAnimBPClass02;
TAssetSubclassOf<UAnimInstance> mAnimBPClass03;

After this modification, the animation blueprint that mAnimBPClass01 references will only be loaded when you specifically ask for it to be loaded. Doing so is quite simple: you need to use the FStreamableManager object. Just make sure to declare it somewhere that will always be in memory, an object that won’t ever be deleted (like the GameInstance of your game, for example). In my case, I dedicated a special “manager” object to it, which is created at game start and never deleted. It handles everything that is dynamically loaded in our game.

 

UPROPERTY()

FStreamableManager mStreamableManager;

 

There is more than one way to load an asset asynchronously, but here is one example: mArrayOfAssetToLoad being a TArray of FStringAssetReference.

 

mStreamableManager.RequestAsyncLoad(mArrayOfAssetToLoad,
	FStreamableDelegate::CreateUObject(this, &UStreamingManager::LoadAssetDone));

 

An FStringAssetReference is the string representation of the full path of the asset in Unreal’s internal file system. From a TAssetSubclassOf<> pointer, you can get it by calling ToStringReference().

Furthermore, if you are using Wwise for your audio management, audio banks can become quite huge and slow to load. Fortunately, a wise (get it?) bank subdivision and using LoadAsync() on the UAkAudioBank instead of Load() will fix that for you. Be sure to uncheck AutoLoad on the banks in the editor before you do! Also, for some reason the LoadAsync() call is not exposed in Blueprint, so you need to do that in native code or expose it yourself.

 

LEVEL COMPOSITION

Loading assets asynchronously is one thing, but you probably also want to split your levels into chunks that load separately. Unreal allows that using “Level Composition”. Level loading used to be done on the main game thread, but it is now possible to split the loading across different threads.

 

In the DefaultEngine.ini file of your project, add this:

[Core.System]
AsyncLoadingThreadEnabled=True

 

We are still on Unreal 4.8.2, but from what I found on the subject, this may already be enabled by default in 4.9. Anyway, this should help make the asynchronous loading of levels smoother. If you enable that feature, however, you need to be careful about what you do in your class constructors, as some operations are not thread-safe, and using them could result in a lock or even crash the game.

Image3

Level streaming can be done “automatically” using volumes or a simple distance factor, but we decided to do it manually in our case. In the picture above, unchecking Streaming Distance will allow every sub-level linked to that layer to be loaded manually. This can be done in blueprint using Load Stream Level or in C++ using UGameplayStatics::LoadStreamLevel().

Image4

 

Texture Mip Bias

In pretty much every game, textures end up using most of your memory. In pretty much every game, there is also a bunch of oversized textures that are far from optimized pixel-density-wise. Fortunately, UE4 offers a really simple way to trim the fat without needing to reimport every such texture: LOD Bias.

Image6

Image5

You can see that the difference in resource size is considerable when we drop 1 mip, especially when the texture is 4096x4096! Of course, we couldn’t do that for everything, but a lot of textures were 4096 that really did not need to be.
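The math behind that gain: each LOD bias step removes the largest mip level, which holds about three quarters of the whole chain’s pixels. A rough, uncompressed-RGBA8 estimate (real block-compressed sizes are smaller, but the ratio holds):

```cpp
#include <cstdint>

// Approximate bytes used by a square RGBA8 texture with a full mip chain,
// after skipping `lodBias` top mips (4 bytes per pixel, uncompressed).
static std::uint64_t TextureBytes(std::uint32_t size, std::uint32_t lodBias)
{
    std::uint64_t total = 0;
    for (std::uint32_t mip = size >> lodBias; mip > 0; mip >>= 1)
        total += static_cast<std::uint64_t>(mip) * mip * 4;
    return total;
}
```

For a 4096 texture, one step of LOD bias takes the estimate from roughly 85 MB down to roughly 21 MB, a factor of four.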

Of course there is a lot more to memory optimization and management, but this is pretty much what I have done to get our demo from taking 75+ seconds to load to somewhere around 10 seconds. Unreal is a great tool, and I keep learning and getting better at it. I hope this will help some of you out there in creating your own awesome content. If you have any questions or comments, I’ll be happy to answer them! In the meantime, I’ll go back to working on FATED. Stay tuned for more info!

 

Mick / Lead Programmer / @Mickd777
www.fatedgame.com

Hello VR fans,

In FATED we want players to be immersed in the world we created. We want them to feel connected to the characters they meet. To achieve this, we’re working with various animation techniques designed to make sure that our characters move and act in a realistic way.

As we stated earlier in the blog, we’re using motion capture for the character movements. When the character hops down from the chariot in the preview, we see how smoothly she moves even though it’s a pretty complicated movement. For facial expressions, we’re using Faceshift, a markerless motion capture software.

Finally, the hair and the other items are animated manually. We’re looking into other options to give the objects a more realistic feel by simulating the physics in Maya and baking it directly on the animated models.  

All three techniques are then combined in Unreal Engine 4.

What do you think? I think it’s going to look amazing!

Laurent Mercure / Community Manager / @laurentmercure


Hi everyone!

With Oculus’s recent announcement regarding requirements and specs for the consumer version of their HMD (https://www.oculus.com/blog/powering-the-rift/), I figured it was the perfect time to write that performance bit I teased about the last time around. Let’s see how we’re dealing with optimization on FATED! First, some math to have a clear vision of what we’re trying to achieve.

Know Your Numbers!

FATED is pretty much fillrate bound (http://en.wikipedia.org/wiki/Fillrate), and it’s safe to assume that most early VR games will also be. This is why the following info is important.

A current generation game will generally push about 124 million pixels per second when running at 60 fps in 1080p. FATED is currently running on a GTX 970 (the recommended card for the consumer version of the Oculus) at ~305 million pixels per second.

1920X1080 upscaled by 140% = 2688X1512 * 75(Hz) = ~305 million

The CV1 native resolution and refresh rate looks like this:

2160X1200 * 90(Hz) = ~233 million

Oculus’ Atman Binstock revealed that “At the default eye-target scale, the Rift’s rendering requirements go much higher: around 400 million shaded pixels per second.” After some math, you can figure out that this pretty much means a 130% upscale seems to be ‘the norm’ for Oculus.

2160X1200 upscaled by 130% = 2808X1560 * 90(Hz) = ~394 million

While we don’t have the final hardware yet, we can estimate what it will need in terms of processing power. The closest we can get using the DK2 is by pushing the screen percentage to around 160. This is what we are aiming to achieve!
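The figures above are easy to sanity-check yourself. A tiny helper (self-contained, not engine code) that reproduces the numbers in this post:

```cpp
#include <cmath>
#include <cstdint>

// Shaded pixels per second for a w x h render target, a linear upscale factor
// (Unreal's screen percentage / Oculus' eye-target scale), and a refresh rate
// in Hz.
static std::uint64_t PixelsPerSecond(std::uint64_t w, std::uint64_t h,
                                     double upscale, std::uint64_t hz)
{
    const std::uint64_t sw = static_cast<std::uint64_t>(std::llround(w * upscale));
    const std::uint64_t sh = static_cast<std::uint64_t>(std::llround(h * upscale));
    return sw * sh * hz;
}
```

PixelsPerSecond(1920, 1080, 1.4, 75) gives the ~305 million figure for our DK2 setup, and PixelsPerSecond(2160, 1200, 1.3, 90) the ~394 million target implied by Oculus’ numbers.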

Trade-offs A.K.A. Battle with the Art Team

Unreal can be pretty daunting when you first enter it, mainly because the engine cranks everything to 11 by default. The first thing you want to do is take a look at what is costing so much and turn off what you don’t need. Sometimes, the art team will hate you for asking to remove that extra post-process, but remember that a vomit-free experience is more important!

With FATED, we decided to go with a ‘stylized’ look so we could remove some of Unreal’s cost-heavy features while keeping the visuals as stunning as possible. We also made the conscious decision to keep each scene as tightly contained as possible. We want to control which objects are seen at each moment (limit draw calls!), so we design the levels accordingly. These assumptions allowed us to remove some of the engine’s features without overly affecting our visual target. Here are some decisions we made early on:

  • No dynamic shadows, everything is baked (with exceptions…)
  • No dynamic lights either (with exceptions…)
  • Limit post-processes: No DOF, motion blur, or lens flare

Console Command Options

Here are some interesting commands that we used to disable some costlier features:

r.HZBOcclusion 0: Disables hardware occlusion

r.TranslucentLightingVolume 0: Disables translucent lighting

r.AmbientOcclusionLevels 0: No ambient occlusion! It’s a nice feature, but we don’t need it; remove that!

r.SSR.Quality 0: Screen space reflection is also cool, but it’s costly and we don’t need it; delete!

Profiling Limits: Dig Deeper with GPA

At some point, it becomes hard to pinpoint what is really happening GPU-side using only the GPU Profiler. To really get a sense of what is going on, we need something that digs even deeper! We’re using a free tool called Intel GPA.

mic1

https://software.intel.com/en-us/gpa/faq

We won’t go in depth on how to use the tool, but there is one important thing to know: it won’t work in ‘Direct to HMD’ mode. So, to start a capture, you need to be in ‘Extend to Desktop’ mode. The quickest way we found to take a capture was to open the editor, set GPA to ‘Auto-detect launched app’, and then run the game in ‘Standalone’.

Now for the Juicy Tips!

Analyze your scene: I talked about the ‘show’ commands and ‘stat scenerendering’ in my last post; this is where you want to use them to determine what your bottleneck is.

Instancing Static Meshes + Foliage: If you have too many draw calls, this could be a life saver! Foliage is especially great if you want to have a dense forest or lots of smaller meshes. But keep in mind that the level of detail in foliage can easily multiply your draw calls. Also, instancing is not always the best option, so make sure it’s really going to help. Don’t hesitate to compare using GPA!

Particle System Bounds: While profiling FATED, I found out that a lot of particle systems we were not supposed to see were being rendered. Turns out culling of particle systems is not set up by default!

mic2

Project rendering settings – Screen Clear: This is a minor optimization, but every microsecond is worth it! If you always render something on each pixel (you have a Skybox, say) this is worth setting to ‘No Clear’. Be aware that this should only be set for actual builds, since it will cause weird artifacts in the editor asset viewer viewports.

Project rendering settings – Early Z Pass: This is one of the best helpers for fillrate. It adds draw calls, but it’s such a huge help for the number of pixels drawn that it is worth enabling. Some frames saw as much as a 25% speed gain from enabling it!
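Both of these project settings map to renderer cvars, so they can also be pinned in config. A sketch of the equivalent DefaultEngine.ini entries (the values are assumptions based on UE4’s renderer settings: 0 means no clear, and 2 puts opaque meshes in the depth-only pre-pass):

```ini
; DefaultEngine.ini -- a sketch of the equivalent renderer settings
[/Script/Engine.RendererSettings]
; 'Screen Clear' -> No Clear (only safe when every pixel is redrawn each frame)
r.ClearSceneMethod=0
; Early Z Pass -> depth-only pre-pass for opaque meshes
r.EarlyZPass=2
```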


Disable post-processes when not using them: We have some really nice post-process effects for certain features in our game, but they are not always in use. Be sure to remove them from the ‘Blendables’ array when they’re not needed!

Shipping Builds: It’s good to remember that your shipping build is going to run a bit faster than your dev build.

We’re always looking for ways to improve performance, and we’re not done optimizing, but this should give you a basic idea of how we’re working on FATED: always profile, add one feature at a time, and look for more ways to make the game run ever more smoothly. There is a whole section dedicated to performance in the Unreal Engine 4 documentation (https://docs.unrealengine.com/latest/INT/Engine/Performance/index.html); I highly recommend it to anyone who wants further insight!

Meanwhile, if you have tips to share, or any questions or comments, send them in and I’ll be happy to address them! ‘Til next time!

Mick / Lead Programmer

Étienne

Hi! Étienne Carrier here. I’m the Technical Artist on FATED.

Developing graphics for VR is an awesome challenge! It’s an all-new playing field with whole new constraints and rules. I’m learning new tricks every day, and through this blog I aim to share them with you.

ART STYLE

When we started developing the graphics pipeline, it was clear that we needed a visual style that would help us reach our performance target. In VR, there’s no slacking off. If the framerate drops even for a second, you get a hefty dose of simulation sickness. It was therefore a lot more natural to go with a stylized art style that would not only help us with performance, but also look good in VR. Smooth, non-noisy textures feel great in virtual reality, and they helped the 3D stay faithful to the awesome artwork that Marianne Martin (Art Director) and Marie-Hélène Morin-Fafard (Concept Artist) created.


VIEW DISTANCE & NORMAL MAPS

Applying tricks learned while developing mobile games can turn out to be a lifesaver when you have to run at 75+ fps. The art really needs to be planned accordingly. We built our environments in a way that limits the view distance and lets occlusion culling work for us. Normal maps only work well for micro details or from far away. We actually want to fade details out with distance, as they tend to get noisy due to the pixel density of VR headsets. Micro details go against our soft texture style, and distant normal maps go against fading details out with distance. So we decided to never use normal maps at all, which also buys us some performance.

GAME ENGINE

We chose Unreal Engine 4 to develop our game. We began by setting up our basic scenes, because even the default template scene was not reaching 75 fps on some PCs. We removed most of the post-processing, screen space reflections, and anti-aliasing. We also used static directional lights, then built from there, profiling every step of the way.


UNREAL TOOLS

Unreal has a great many tools to help us build our environments efficiently. Here is a video showing how we used some of these tools together. Along with Unreal’s landscape and foliage, we built a material that projects a texture on top of props like rocks to help them blend with the terrain. Blueprints allow us to dynamically create a material instance for each static mesh, so they can all have their own settings. Using the same texture as the landscape helps blend them in seamlessly.


That’s it for now, we’ll have more tips & tricks coming up later!

Cheers

Etienne Carrier
Technical Artist

Mik

Hey guys and gals!

Following last week’s post, which introduced some basic VR concepts, I thought I would do the same but oriented toward Unreal Engine 4. Since the engine went free a few weeks ago, I bet there are a lot of newcomers who could use a hand! So without further ado, let’s delve into the subject.

THE UNREAL EDITOR AND VR

There are a lot of ways to preview your content in Unreal, two of which are particularly helpful for VR. The Play mode can be found in the toolbar above the viewport in the standard Unreal layout.

VR Preview:

This Play mode starts the game in VR mode directly inside the editor. This is new in Unreal 4.7 and is very useful for getting actual HMD data inside the editor to debug with (in blueprints, for example). This Play mode works well with Oculus’ Direct to HMD mode, but it seems to have some issues in Extended mode.


Standalone Game:

Prior to 4.7, this was the way to test your VR content. Pressing Alt+Enter starts the stereo rendering. We still use it to profile using Intel GPA. More on our usage of Intel GPA in another post!

World to Meters:

In the World settings, this value is under the VR category. This is the representation of Unreal units to real-world meters. 100 means that 100 Unreal Units equal 1 meter in real life. For FATED, we played a bit with this value to get the proportions we needed. It is recommended to set this value so that it fits with the real-life measurement of your in-game objects.
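To make the scale math concrete, here is a standalone arithmetic sketch (plain Python, not engine code; the helper names are mine):

```python
# World To Meters: how many Unreal Units (uu) represent one real-world meter.
# These helpers are illustrative only, not part of the UE4 API.
def uu_to_meters(uu, world_to_meters=100.0):
    """Convert Unreal Units to meters at the given World To Meters scale."""
    return uu / world_to_meters

def meters_to_uu(meters, world_to_meters=100.0):
    """Convert meters to Unreal Units at the given World To Meters scale."""
    return meters * world_to_meters

# At the default scale of 100, a 180 uu tall character reads as 1.8 m in VR.
print(uu_to_meters(180.0))  # 1.8
```

If your in-game meshes were modeled at a different scale, adjusting World to Meters is often easier than rescaling every asset.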

Head Mounted Display Blueprint Nodes:

There are a lot of exposed methods to do logic directly inside blueprints. This category should be the first place to look for HMD-related functionalities.


Use Less CPU in Background:

You want the editor to stop most of its processing when testing your content. This option can be found under the Editor Preferences in the Miscellaneous section.


Scalability Settings:

I feel like I’m repeating myself, but performance is critical in a VR game! Unreal Engine 4 comes with great scalability features that can help define the performance range for your game, from your minimum specs to your recommended specs. More on this in a performance post I’m planning to write soon.


COMMAND LINE FUN!

If you’re already familiar with Unreal development, you know about the console and all the commands that can be entered through it. This is also true of the HMD (Head-Mounted Display) interface. Most of these commands are accessed with the “hmd” keyword; here is a list of the ones we used the most during FATED’s development.

hmd stats:

This command displays info about your HMD. A lot of the stuff I’ve mentioned in my previous post can be seen here. Your current latency, whether positional tracking is activated, whether the camera is detecting the HMD, etc. There is also information on Time Warping, and whether the engine is updating the HMD info in the render thread. These are techniques to reduce latency that come right out of the gate in Unreal 4, making it a top-of-the-line choice for VR development.

hmd sp YYY:

YYY must be replaced by a number between 30 and 300. This is one of the easiest performance boosters, at the expense of visual quality. To create a better-looking picture, Unreal scales the image up before the distortion process, sampling it back down afterwards to fit your HMD screen. By default it is at 135, but you can easily bring it down to 100 for better performance. It does look better at a higher percentage, though!

hmd vsync on/off:

Deactivating vsync can be useful to track your actual framerate.

stat unit:

Figure out if you are render thread-bound or GPU-bound. For FATED, we are pretty much always GPU-bound.
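To make the readout concrete, here is a small sketch of how we interpret those numbers (my own helper, not an Unreal command): the frame time tracks whichever thread is slowest.

```python
# Interpret 'stat unit' timings (all in milliseconds). The frame time follows
# whichever of the game thread, render thread (Draw), or GPU is slowest.
def bottleneck(game_ms, draw_ms, gpu_ms):
    times = {"game thread": game_ms, "render thread": draw_ms, "GPU": gpu_ms}
    return max(times, key=times.get)

# A typical FATED reading: GPU time sits right under the frame time.
print(bottleneck(game_ms=6.0, draw_ms=8.0, gpu_ms=13.2))  # GPU
```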

stat fps:

Are we running at 75 fps?

stat scenerendering:

This will show useful information about the scene’s rendering. You want to keep an eye on the number of draw calls, for example.

show YYY:

The ‘show’ commands are useful to hide some elements of your scene to try and figure out how much each of these elements takes to render. ‘Show staticmeshes’ and ‘show postprocessing’ are two examples of commands we used in early profiling.


profile GPU:

This will launch the Unreal Engine GPU Profiler. You can also do this at any time by pressing [Ctrl+Shift+,]. This is a good one to know about, as it is the first tool you will want to use for GPU performance profiling. It is kind of limited (and the reason we are using Intel GPA for deeper profiling), but it will point out some of the costlier effects in your scene.


These are the most useful ones for us, but go digging in there and you’ll find plenty of other useful commands.

In my next episode of the FATED blog…

That’s it for now! Next time, I will talk a bit more about performance and how we got FATED running on a “normal” rig while still retaining great visuals!

Meanwhile, don’t hesitate to comment or ask a question; we’ll be happy to discuss with you guys!

Mik

Hi, my name is Michaël Dubé, Lead Programmer on the awesome FATED project!

Starting a new project is always thrilling and frightening at the same time. Venturing into virtual reality using a completely new technology, Unreal Engine 4, was something we were mostly excited about, though. Everyone on the team was super hyped for this project (and VR in general), which made the terrifying parts (hello performance!) not so terrible to bear.

Nevertheless, we learned a lot of really neat stuff during our first months of development, both about UE4 and VR, and my hope is for this blog to be half an entertaining view of the development of our game, and half a chronicle of all the blood and tears we shed to get there. It’s a Viking game after all; you can’t expect it to be all unicorns and rainbows!

For this first post, I wanted to keep it simple and talk about some basic but important virtual reality terminology, along with some tips on getting your rig ready to rock this new reality.

VR VOCABULARY

There are a lot of technical terms that come with VR development; I wanted to recap some of those and what they mean for FATED.

LATENCY
Basically, this is the time gap between your movement and its replication on-screen. You will often hear the term ‘motion-to-photons latency’ to talk about that. Low latency is very important in VR, as it is considered a major factor in reducing the effect of the infamous simulation sickness.

SIMULATION SICKNESS
This is what you absolutely don’t want the user to feel! It is similar to motion sickness (reading in a car) or sea sickness. Research still hasn’t pinpointed exactly what causes it, but everything points to cognitive dissonance. As soon as you make users experience something they don’t do in real life, there is a chance it will trigger simulation sickness. The Oculus Best Practices Guide is a great place to get info on the dos and don’ts.

http://static.oculus.com/sdk-downloads/documents/Oculus_Best_Practices_Guide.pdf

RESOLUTION AND THE SCREEN DOOR EFFECT

“The ‘screen-door effect’ […] is a visual artifact of the projection technology […] where the fine lines separating the projector’s pixels become visible in the projected image.”
http://en.wikipedia.org/wiki/Screen-door_effect

This is one of the reasons the consumer version of the technology will need better resolution. For developers, however, it means we have to take extra care when it comes to performance. We have some great performance tips that we applied to FATED, and I’m eager to share them with you!

REFRESH RATE
This is the frequency at which the screen refreshes. The DK2 is set at 75 Hz, so basically we need to update the screen at a constant 75 frames per second to get the best result. In VR, it is of the utmost importance to keep a steady framerate. Unless you want to make your players sick, that is (which we don’t, we swear!).
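That refresh rate translates directly into a per-frame time budget. A quick arithmetic sketch (the helper name is mine):

```python
# At a given refresh rate, CPU and GPU work must both fit inside one frame.
def frame_budget_ms(refresh_hz):
    return 1000.0 / refresh_hz

# DK2 at 75 Hz leaves about 13.33 ms per frame; miss it and you get judder.
print(round(frame_budget_ms(75), 2))  # 13.33
```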

PRESENCE
This is what VR is all about! Presence is basically tricking your brain into thinking it’s really in your game. We are designing with this principle in mind to create mind-blowing and/or powerful moments for the players to experience. FATED is all about making the player live a meaningful and poignant adventure. Because of this alone, I strongly believe VR is the next step in the narrative medium.

http://static.oculus.com/connect/slides/OculusConnect_Epic_UE4_Integration_and_Demos.pdf

HARDWARE TIPS

Installing the Oculus Runtime is as simple as it gets, but there are some important settings that every VR enthusiast should be aware of.

RIFT DISPLAY MODE
What Direct to HMD mode does is ensure that the application is synced to the actual refresh rate of your Rift. It also helps reduce latency, since the step of going through your desktop is bypassed. While I prefer the simpler Direct to HMD, we are still using Extended Display Mode on FATED for debugging purposes (more on that in another post!).

OCULUS CONFIGURATION UTILITY
This is where you can set up some information about yourself to get a better experience. Developers can then use this information to adjust the experience so it better fits the user, like adjusting the camera position using your IPD (interpupillary distance – the distance between your eyes).


BEWARE OF THE NVIDIA CONTROL PANEL!

If you have an NVIDIA card, chances are you can get a huge performance boost (we definitely saw a difference on some cards) by doing two simple things in the NVIDIA Control Panel. You should of course have the latest drivers before doing that.
First, under Manage 3D Settings, there is a “Virtual Reality pre-rendered frames” setting that is set to 1 by default. You want to set that to “Use the 3D application setting”. You will also see “Power management mode”; set that to “Prefer maximum performance”.


Next, you may want to go in your PC’s Control Panel, in the Power Options, and set your minimum processor state to 100%.


The above settings were part of changes I made for FATED when we encountered a problem known on the Unreal forums as the ‘37.5 fps lock’. This lock would normally happen if the frame rate dropped below 75 frames per second, but that was not the case for us and some other developers. Oculus staff helped point this out, and the workaround may not be necessary once the consumer version is ready, but it definitely helped for FATED.

Well, that’s all for now. Obviously, there is a lot more to VR, and we will keep adding interesting content to this blog. In the meantime, if you have great tips to share with us or if you have any questions, we’d be happy to hear you out in the comments!

Stay tuned for more on FATED!