Memory Management in UE4

Michaël Dubé — October 22, 2015

Hey there!

Well, it’s been a while since I’ve written something for the blog; we’ve been busy at Frima building a cool update to our Oculus Connect 2 demo, which Vincent and fellow programmer Marc-André attended. We now have facial animations and kick-ass voice-overs in the game, which bring it to a whole new level. There’s also an amazing article about FATED published on Polygon, so things are getting exciting! Click here to read the article.

Since we are only two programmers on the project, and since we are needed for pretty much every technical aspect of the game, sometimes stuff falls through the cracks. That’s pretty much what happened with memory management for FATED; since we were both new to Unreal 4, we made some rather critical mistakes on top of not spending a minute investigating what was loaded, and when.

When the rush to get our OC2 demo out the door was over, it was taking a whole 75 seconds (!) to load our demo scene. That’s pretty insane for any kind of game. It was time to dig a bit deeper into what was actually taking so long to load, and maybe add a bit of asynchronous loading in there. So while this post may be somewhat of a given for Unreal veterans, I think it could be helpful for some developers with a bit less exposure to the tech.



First things first: identify the culprits. Unreal is pretty dope when it comes to commands you can input to investigate every aspect of the game, and memory statistics are no exception. One of the most useful commands out there is “memreport -full”. The “full” parameter will dump extra information that could prove useful but is not required per se. Running this command will create a .memreport file in your “Saved\Profiling\MemReports” project folder. It’s a custom file extension, but it is actually a text file representing a snapshot of your game’s current memory state.

You’ll probably want to look first at the object list, which is the complete list of UObjects you have loaded, along with information on the amount of memory they take up. You will have a line looking like this for every object type:

AnimSequence      36    16628K    16634K      7578K     7578K

Texture2D        150      212K      219K    260996K    96215K

AnimSequence (or Texture2D) is the name of the class of the object, while the number next to it (36 or 150) is the number of instances of that class. What you want to look at after that are mostly the first and last of the four other numbers. The first is the actual size of the objects themselves, while the last is the size of the assets that these objects exclusively reference. For example, the Texture2D objects take up little memory as objects holding the texture metadata (212K), as opposed to the actual texture data (pixels) they exclusively reference (meaning no other object references that data), which amounts to 96215K.

The two lines above were extracted from a memreport for this scene:

That’s right, there is absolutely nothing in there except the floor, a player start, and the skybox. So where the heck are those 36 animations and 150 textures coming from?



Now that I knew there was something clearly wrong there, I had to find out which assets were loaded. Sure, there are 36 animation sequences in memory, but which ones and why? Again, UE4 can show that to you quite easily.

The memreport command is in fact a list of other commands executed one after the other to form a “complete” report of your game memory. One of the commands it runs is “obj list”. This is what outputs the list of objects and the number of instances, but this command can also use some parameters, one of which is the class of the objects.


obj list Class=AnimSequence


This will output all of the animation sequences in memory at that moment, in order of what uses the most memory (the first number). Like this:


AnimSequence /Game/Character/Horse/Animation/AS_HorseCart_idle.AS_HorseCart_idle      

491K       491K       210K       210K


Now, if you’re like me, you might want to make this look a bit clearer, and fortunately it’s quite easy to write your own Exec function that can output exactly what you want. It can, for instance, sort by exclusive resource size instead of object size. Here is a sample of code to get you started in that direction:

for (TObjectIterator<UAnimSequence> Itr; Itr; ++Itr)
{
    FString animName = Itr->GetName();
    int64 exclusiveResSize = Itr->GetResourceSize(EResourceSizeMode::Exclusive);
}
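For instance, here is a sketch of what such an Exec function could look like; the function name, struct, and log category are placeholders, and it assumes the same UE 4.8-era GetResourceSize() API used above:

```cpp
// Hypothetical Exec command: collects every loaded UAnimSequence and logs
// them sorted by exclusive resource size (largest first).
void UMyCheatManager::SortedAnimList()
{
    struct FAnimEntry { FString Name; int64 Size; };
    TArray<FAnimEntry> Entries;

    for (TObjectIterator<UAnimSequence> Itr; Itr; ++Itr)
    {
        Entries.Add({ Itr->GetName(),
                      (int64)Itr->GetResourceSize(EResourceSizeMode::Exclusive) });
    }

    // Sort descending by exclusive resource size.
    Entries.Sort([](const FAnimEntry& A, const FAnimEntry& B)
    {
        return A.Size > B.Size;
    });

    for (const FAnimEntry& Entry : Entries)
    {
        UE_LOG(LogTemp, Log, TEXT("%s : %lld bytes"), *Entry.Name, Entry.Size);
    }
}
```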



The Visual Reference Viewer

Now that I could pinpoint exactly which objects were loaded, I was able to investigate with more ease what was referencing them in that seemingly empty scene. There are two ways to view object references in Unreal. First, there is the visual reference viewer (that you can see in the picture above) in the editor; this will show all potential references, not necessarily what is referencing the actual loaded asset in memory right now. Of course, there is also an easy way to figure out the current shortest reference with yet another console command.


obj refs name=AS_HorseCart_idle


This will output a chain of references to what is loaded in memory (it can take some time), and following this chain will usually lead to the culprit. In our case, we were making one mistake that was responsible for a lot of those loose asset references: direct asset references in some of our native class constructors. Like this one, for example:


static ConstructorHelpers::FObjectFinder<UMaterial> MaterialFinder(TEXT("/Game/Materials/M_Example")); // asset path shown here is illustrative



In the above example, this is a direct reference to a material: that material will always be present in memory. This is quite logical when you think about it, since the constructor is not only called when an instance of said class is created, but also at startup, when it is run to set default properties as the static version of the class is created. So avoid those if it’s not something you need in memory at all times! Instead, opt for a UPROPERTY that can be assigned in the blueprint version of that class, even if it’s always the same asset. At least that way, if you don’t have an instance of that object loaded in your scene, the asset won’t be in memory.
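As a sketch of that alternative (the class and property names are made up for the example), the reference becomes a property the blueprint fills in:

```cpp
// Instead of a ConstructorHelpers::FObjectFinder in the constructor, expose a
// property and assign the material on the blueprint subclass. The asset then
// only lives in memory while something that references it is loaded.
UCLASS()
class AMyProp : public AActor
{
    GENERATED_BODY()

public:
    // Assigned in the editor on the blueprint version of this class.
    UPROPERTY(EditDefaultsOnly, Category = "Visual")
    UMaterial* mMaterial;
};
```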



The wrongly referenced assets were not the only reason why it was taking so long to load our demo, so there was still more work to do. We had some remaining temporary assets, some pretty huge textures (I’ll talk about that in more detail below) but, more importantly, we still had a lot to load and we were trying to load everything at once. Let’s see how to asynchronously load assets using what Unreal calls the Streamable Manager.

If you have an actor that references multiple Animation blueprints, for example, they will all be loaded when you instantiate it, even if you only use one at a time.


TSubclassOf<UAnimInstance> mAnimBPClass01;

TSubclassOf<UAnimInstance> mAnimBPClass02;

TSubclassOf<UAnimInstance> mAnimBPClass03;


The above lines should become this:

TAssetSubclassOf<UAnimInstance> mAnimBPClass01;

TAssetSubclassOf<UAnimInstance> mAnimBPClass02;

TAssetSubclassOf<UAnimInstance> mAnimBPClass03;

After this modification, the animation blueprint that mAnimBPClass01 references will only be loaded when you specifically ask for it to be loaded. Doing so is quite simple: you need to use the FStreamableManager object. Just make sure to declare it somewhere that will always be in memory, an object that won’t ever be deleted (like the GameInstance of your game, for example). In my case, I dedicated a special “manager” object to it, which is created at game start and never deleted. It handles everything that is dynamically loaded in our game.



FStreamableManager mStreamableManager;


There is more than one way to load an asset asynchronously, but here is one example, mArrayOfAssetToLoad being a TArray of FStringAssetReference:


mStreamableManager.RequestAsyncLoad(mArrayOfAssetToLoad,
    FStreamableDelegate::CreateUObject(this, &UStreamingManager::LoadAssetDone));


An FStringAssetReference is the string representation of the full path of an asset in Unreal’s internal file system. If you have a TAssetSubclassOf<> pointer, you can get it by calling ToStringReference() on it.
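Putting it together, a sketch of the request and its completion callback might look like this (apart from LoadAssetDone and the members shown above, the helper names are assumptions):

```cpp
// Build the list of soft references and kick off the asynchronous load.
void UStreamingManager::LoadAnimBlueprints()
{
    mArrayOfAssetToLoad.Empty();
    mArrayOfAssetToLoad.Add(mAnimBPClass01.ToStringReference());

    mStreamableManager.RequestAsyncLoad(mArrayOfAssetToLoad,
        FStreamableDelegate::CreateUObject(this, &UStreamingManager::LoadAssetDone));
}

// Called by the FStreamableManager once everything is in memory; the soft
// pointers now resolve synchronously.
void UStreamingManager::LoadAssetDone()
{
    UClass* AnimBPClass = mAnimBPClass01.Get();
    if (AnimBPClass != nullptr)
    {
        // The class is loaded and safe to use, e.g. on a skeletal mesh:
        // SkeletalMeshComp->SetAnimInstanceClass(AnimBPClass);
    }
}
```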

Furthermore, if you are using Wwise for your audio management, Audio Banks can become quite huge and take a long time to load. Fortunately, a wise (get it?) bank subdivision and using LoadAsync() on the UAkAudioBank instead of Load() will fix that for you. Be sure to uncheck AutoLoad on the banks in the editor before you do! Also, for some reason the LoadAsync() call is not exposed in Blueprint, so you need to do that in native code or expose it yourself.



Loading assets asynchronously is one thing, but you probably also want to split your levels into chunks to load separately. Unreal allows that using level streaming and “World Composition”. Level loading used to be done on Unreal’s main game thread, but it is now possible to move the loading to a different thread.


In the DefaultEngine.ini file of your project, add this:


[/Script/Engine.StreamingSettings]
s.AsyncLoadingThreadEnabled=True


We are still on Unreal 4.8.2, but from what I found on the subject, this may already be on by default in 4.9. Anyway, this should help make the asynchronous loading of levels smoother. If you are enabling that feature, however, you need to be careful about what you do in your class constructors, as some operations are not thread-safe, and using them could result in a lock or even crash the game.


Level streaming can be done “automatically” using volumes or a simple distance factor, but we decided to do it manually in our case. In the picture above, unchecking Streaming Distance will allow every sub-level linked to that layer to be loaded manually. This can be done in blueprint using Load Stream Level or in C++ using UGameplayStatics::LoadStreamLevel().
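In native code, a sketch of that manual load could look like this (the sub-level name and callback function are placeholders):

```cpp
// Manually stream in a sub-level by name. The FLatentActionInfo lets the
// engine call us back (here on a hypothetical OnSubLevelLoaded function).
FLatentActionInfo LatentInfo;
LatentInfo.CallbackTarget = this;
LatentInfo.ExecutionFunction = FName("OnSubLevelLoaded");
LatentInfo.UUID = 1;
LatentInfo.Linkage = 0;

UGameplayStatics::LoadStreamLevel(
    this,                       // world context object
    FName("SubLevel_Interior"), // placeholder sub-level name
    true,                       // make visible after load
    false,                      // don't block on load (stay asynchronous)
    LatentInfo);
```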



Texture Mip Bias

In pretty much every game, textures end up using a lot of your space. In pretty much every game, there is also a bunch of oversized textures that are far from optimized pixel-density-wise. Fortunately, UE4 offers a really simple way to trim the fat without reimporting every such texture: LOD Bias.



You can see that the difference in resource size is considerable when we drop 1 mip, especially when the texture is 4096x4096! Of course, we couldn’t do that for everything, but there was a lot of stuff in 4096 that really did not need to be.

Of course there is a lot more to memory optimization and management, but this is pretty much what I have done to get our demo from taking 75+ seconds to load to somewhere around 10 seconds. Unreal is a great tool, and I keep learning and getting better at it. I hope this will help some of you out there in creating your own awesome content. If you have any questions or comments, I’ll be happy to answer them! In the meantime, I’ll go back to working on FATED. Stay tuned for more info!


Mick / Lead Programmer / @Mickd777

Hello everyone!

When I started working on this project, I had minimal experience with story-driven games. But I knew that music would be critical in order to create the emotional journey I wanted FATED to be. So I made sure we were working with one of the best Audio Directors out there: Patrick Balthrop. Patrick has more than a decade of experience creating award-winning sound for games like BioShock: Infinite; he also worked with us on Chariot, so I knew what I could expect to get from him: top-notch audio.

With Patrick on our team, I started looking for references that would help us define our musical signature. I’ve listened to over a hundred tracks and found inspiration in many of them, but I would say that the most important influence has been The Fountain’s soundtrack. I hadn’t even seen the movie the first time I listened to it, but with the music alone, I was able to feel every emotion the movie puts you through. I also really liked the fact that the soundtrack felt like one long musical piece. It rapidly became a benchmark for our musical track, and I think we’ve reached our objective.

I am very excited to share FATED’s musical theme with you today, and I really hope you’ll enjoy it as much as we do.

If you are interested in knowing more about the production of audio for VR, keep an eye on the blog, as we will shortly post an article that shares our learnings.

Talk to you later!

Vincent / Producer / @Vincent_Martel

Hi everyone,

We are excited to see Fated come to life in a whole new way with mocap, facial animations and voice-over now coming together and breathing life into our characters. At this point, we are also trying to bring the most advanced levels to a nice playable polish – it’s the moment of truth for many of our favorite scenes!
 We are putting the final touches to our exteriors as well as completing decor and lighting work on some new interior scenes. The visual plan has now been adapted to the final metrics for some of these rooms, and we’ve been busy squeezing the most “oomph” out of our assets, adding more props and effects to emphasize the uniqueness and moods of this mysterious zone…



The whole team is testing the experience as we go along and giving feedback in order to make Fated the exciting and moving journey we all want it to be.

Stay tuned!

Marianne / Art Director

A couple of weeks ago, our producer Vincent was giving a talk at Casual Connect. His talk was entitled: How to Make a VR Game Not a Game Using VR.

If you’re curious about this new medium, you should take a moment to listen to it.

Hello everyone!
Last week, I was at Blindlight Studios in Los Angeles with our Audio Director, Patrick Balthrop, to record the voice-overs for FATED. We were amazed by the quality of both the studio staff and the actor cast. It was honestly one of my best experiences, professionally.


Blindlight works on most of the big AAA projects, from Activision Blizzard to Naughty Dog all the way to Disney Interactive Studios. It’s our first time working on such a voice-acting-heavy title, and that recording experience was almost unreal. The actors were top-notch, not only on the voice acting, but also on the emotions they were able to inject into their lines. By the end of a particularly gripping scene, one of the actresses was literally in tears …

So I would like to take this opportunity to thank everyone who was there for your amazing work. The game will be a hundred times better because of you!

Also, for anyone looking for great voice actors, here’s a full listing of our cast, each with one of the amazing projects on which they’ve worked (because there’s not enough space to list them all):

• David Lodge [ Destiny ]
• Cherami Leigh [ Borderlands 2 ]
• Andy Pessoa [ King’s Quest ]
• Lex Lang [ Mass Effect 2-3-4 ]
• Matthew Waterson [ World of Warcraft ]
• Laura Post [ League of Legends ]
• Michelle Sparks [ Sunset Overdrive ]
• Reagan Rundus



Reagan Rundus, the little actress who portrays our character Lif, is only 7 years old.

Here’s a sample of what we recorded in our two-day sessions with Blindlight:


During the recording session, we were also capturing the actors’ faces to track facial expressions using faceshift so we can use them in-game. We are hoping that with this technique we will get richer facial animations, and that more emotions will come through.


The next step will be to merge everything together: the body animations we recorded during the motion capture sessions, the facial animations recorded with faceshift, and the voice-overs.

This step is both exciting and nerve-wracking. I’m really looking forward to seeing our characters finally come to life, and I hope they’ll look awesome! It’s our first time using this animation pipeline, and failing here would mean a lot of more work for our animators. So, fingers crossed!

I’ll try and show you the final result, but until then if you have any questions, feel free to ask!

Have a nice day.

Vincent Martel / Executive Producer / @Vincent_Martel


Hello everyone!

My name is Ève Mainguy, and I’m Lead 3D Artist on FATED.

My work is very much focused on creating the environments the players will explore. In an atmospheric and immersive Virtual Reality experience like the one we’re creating, believable environments are essential. We’ve shown a lot of exterior settings in previous posts, but a good chunk of the adventure will take place in ancient mythical ruins. These underground caves have been there for ages, and they need to look the part.

A room in the mythical ruins, work in progress.

My job consists of taking the concept pieces provided by our art director Marianne, translating them into models and textures, and integrating them in the game. We’re putting a lot of love in the overall art of FATED; even the simplest props are getting the full treatment.


Jute sacks concept art and their rendition in 3D

As a 3D artist, one of my duties involves overseeing the props to be made, and pinpointing every opportunity where a mesh or a texture can be reused in a given area. For instance, the same mesh piece used for a sculpted column headdress can be reused on an ornate swinging blade. FATED is highly inspired by classical Norse art, which includes a lot of intricate braided designs, so I had the opportunity to isolate these interlaced patterns and reuse them on my props. This approach saves time while keeping the quality at the desired level.


Pictured: Time saved

Aside from environments and props, I’ve also worked on some of the characters you’ll meet in the game. We have revealed very little about the events unfolding in FATED, but the character you see below is Oswald and he is a key character in the story.

Oswald Character

Oswald, from concept to model


That’s all for this week, folks. I hope you enjoyed this peek into the work of a 3D artist. What are your thoughts on the art direction we’re taking with FATED? What would you be excited to see or read about in our next post?

Ève Mainguy / 3D Artist



I’ve given a lot of thought on how to write this post, as I’m going to touch on a very delicate subject. It’s one on which my view (and that of the team) is not completely set yet. From my point of view as a developer, I’ve witnessed how the perception of Virtual Reality is evolving. Part of this is due to a better understanding of the strengths and weaknesses of this new medium, but part is also due to the fear of VR, a fear mostly attributed to health concerns, with simulation sickness acting as the flag bearer. There’s a stigma around this topic, and I thought it was time to bring some well-deserved new perspective on what it entails. Bear with me a little as I lay out my thoughts on this, I swear there is a point!

Evolution of VR Mentality

When we started tinkering with the DK1 back in the beginning of 2014, the VR scene was pretty much two things: first-person-view horror games and rollercoasters. A lot of people saw the future of VR entertainment as those kinds of experiences. It was about “thrills” and seeing the world from another person’s perspective, which was most easily associated with first-person-view games.

Motion sickness was a thing, but it was mostly seen as a technological limitation, something that would go away naturally as the tech evolved. It was not uncommon to see one person feel some simulation sickness only for another to comment that “It’s going to be solved or get way better with the DK2”.

One of the first demos developed by Oculus to showcase the Rift was a first-person experience: the Tuscany Demo. This demo is what introduced a lot of developers I know to VR, and it contained the dreaded right-joystick yaw control (more on that later).

The concept of “VR Legs”, the fact that you could acclimate to the discomfort caused by some VR experiences, had a pretty strong following. On the team, I remember we were telling ourselves that we would do the most hardcore experiences (like Cyber Space) in order to get better in VR. In some respects, this did work. We got better at handling some VR related discomfort!

The reason why I mention all this and what primarily led me to write this piece is that since then, there has been such a paradigm shift in what a VR game should be, that it made us reconsider many aspects of our game. Basically, as of now, creating a VR FPS is pretty much considered heresy. The demos that once were the greatest things in the world, that made me a believer in virtual reality in the first place, are now scorned by most.

While first-person VR games are still coming, this evolution led to the recent surge of third-person-perspective VR content. This can be seen in numerous articles around the web, but also in recent announcements at E3, like Insomniac’s “Edge of Nowhere”.

Don’t get me wrong, I’m not saying this is a bad thing! There are good reasons behind this; locomotion is way less intrusive in third person, and locomotion in VR is a problem.

Locomotion: The “Unsolvable” Problem of VR

While things like low frame-rate and high latency can create simulation sickness, the real issue lies in movements and how the vestibular system interprets what the brain is telling it. For some, moving around in translation or rotation or both can get extremely uncomfortable, extremely quickly.

In parallel to the evolution of VR content creation, developers were hard at work finding solutions to this problem. As we entered pre-production on FATED, we were already aware of some of the issues. We knew about the major concern of standard FPS control: player yaw rotation using the right stick. Some months after we started, John Carmack, one of the leading authorities in the domain, said this about it:

John Carmack Tweet

So, naturally, we wanted to figure it out: solve the issue and make our game the most comfortable VR experience out there. Thus came the time of the hunt for the Great VR Holy Grail! *Cue the Indiana Jones theme song*

While we found some interesting (and some horrible) ways to control rotation, most of the popular ones were already out there.



Solution #1: The “Comfort Mode” Control

There is a pretty popular movement mechanic known as the “Comfort Mode” that strives to resolve this issue. You can see it explained here.

While this solved the problem at hand for some, it also created new ones. First, for a lot of people it broke immersion. There is no way that “skipping” the actual rotation movement can pass as a “natural” way to move around. Second, disorientation: while using the feature sparingly could work, trying to use it in a more action-based setting quickly made people unsure of their whereabouts in the game world. And then there is the matter of precision: if you get to a point in your game where you need to be at a certain angle, this method is bound to fail.

While these are all new problems that may also have their own solutions, there is one single thing that still stays true: this does not work for everyone. In fact, some people reported feeling sicker using this new control mode.

Solution #2: Cockpits, Cockpits Everywhere!


The second solution came about quite naturally in various demos that actually needed them, like games where the player is sitting in a spaceship. I use the term “cockpit” here to designate any means of locomotion that gives the player a frame of reference to relate to. Racing games and mech-warrior-style games are two other examples.

Quite interestingly, this had a strong positive effect on the way players experienced simulation sickness, even going as far as completely removing the unpleasant feeling for some! Great, but once developers found out about this, the internet exploded with ways to add that frame of reference to everything. Here are some of my favorites, not always for good reasons.

Canvas Mode

There is this demo that is awesome for its novel ideas on handling movement in VR. It’s simply called “Movement Experiments”. All of its movement modes are interesting, but the one I want to point out is the “Canvas Mode”. What this mode does is create a virtual “cockpit”, called a “canvas” in this case, whenever the player rotates. While the author says it works (and it probably does for some people), I found it quite intrusive and immersion-breaking. Still, the demo is worth a try, so check it out!

Floating HUD (Helmet Mode)

There’s the floating HUD solution that is basically making the UI the frame of reference for the player. That, coupled with something like the visor (or helmet) of the player character can give the desired frame of reference. A good example of that can be seen in the upcoming game Adr1ft. Unfortunately, not all game settings permit this kind of implementation. A Viking with an astronaut helmet…maybe for FATED 2!

Virtual Nose


By far my favorite is the virtual nose, which is basically the natural frame of reference for humans. This research pretty much flooded the VR space when it was first released, and it is basically the extreme representation of the “cockpit” concept. We tested the idea on FATED, and promptly removed it. For some reason (I’m blaming the low FOV), what the player ends up seeing is two partial noses, one on each side of the screen. It’s very distracting for the player and just feels out of place; I really don’t see this becoming the miracle pill that solves motion sickness.

Solution #3: Swivel Chair

The idea here is that since rotation in VR is such a problem, why not make players rotate on themselves in the real world? This does work of course, but with the headset horribly limited by a metric ton of wires, this does not always end well. Some folks out there like the people at Roto made a chair especially for VR that aims to solve that. While the idea is sound, we believe that having to invest in yet another piece of equipment to enjoy VR is not the way to go.

All in all, these solutions can work, but there is no single “magic trick” to solving the locomotion issue for everyone. This single fact is fuel for VR skeptics.

The Fear of VR

The fear is very real


I’ve seen it numerous times: when someone mentions trying a VR demo, the automatic question is “but will I be sick?” It’s part fear of the unknown, part past experiences that were not exactly “perfect”. No one wants to say “maybe you’ll be sick testing my game”. It became imperative, at all costs, to make sure every VR experience was free of simulation sickness.

However, fear of VR is not limited to consumers fearing the ill effects of the technology on their health; it’s also the fear, for the industry giants bringing this to market, that bad content could kill the appeal. As VR development grew stronger over the last year and with the consumer version finally nearing completion, it became abundantly clear that keeping motion sickness at bay was paramount. Palmer Luckey, Oculus’ founder, as well as John Carmack, both stated it very eloquently:

“Virtual reality’s biggest enemy is bad virtual reality”.  –Palmer Luckey

“The fear is if a really bad V.R. product comes out, it could send the industry back to the ’90s.” –John Carmack

But this leaves us with a question that begs to be answered…

What Is BAD Virtual Reality?

All that work leaves us with a pretty bleak picture of what we can do in virtual reality, given the premise that if we want a “good” VR product, we must absolutely have zero simulation sickness, for EVERYONE. All of the above solutions can work for one person and not another. Or they can help, but not completely eliminate the effect.

Here is a small list of things you will not be able to experience in VR if you are very, very affected by motion sickness in general.

  • All experiences with movement in them.

Yikes. No rollercoasters for you!

There is a great read by Lee Perry on Gamasutra that goes deep into the kind of thinking I believe VR content creators should adhere to; you should really check it out. If there is one thing from it I would like to emphasize in this article, it’s this: “People have extremely individualized issues with VR”.

So the definition of “bad” virtual reality is going to differ from person to person, as tolerance to certain movements varies from one individual to the next. The point of this entire post is to make it clear that maybe some VR experiences will not be for everyone. While Oculus’ reluctance to push out demos that might alienate a segment of players is understandable, what would be worse is us limiting the types of experience people can enjoy. For example, while it’s true that yaw rotation makes a lot of people want to barf, for those it does not affect (or affects to a lesser degree), it’s a great and intuitive way to move around.

We don’t want to limit the kind of VR content we create to encompass every demographic; it’s an impossible task that would have us throw out things we hold dear, experiences we really believe are powerful even if a smaller percentage of people can experience it comfortably. What we can do, however, is have the maximum number of options and tweaks to permit players to mold their experience to their taste and sensibility. We believe VR needs to come with its own “rating” system that will have “comfort” as its unit of measure, the same way you know if you can handle rollercoasters in an Amusement Park or if you should rather stick with the Big Wheel. We just need to be honest with our players and expose our experience as it is. Oculus already started doing this on Oculus Share with its comfort level scale, even if they seem to have shied away from the concept since then.

Well, I knew this post was going to be long, but I did not quite think it would get this long! While it’s pretty much about exposing the problem, there is a second part to this that focuses way more on what we did, or plan to do, to resolve the simulation sickness problems in FATED. After all, we still plan to have this next to our game when it ships:


In the meantime, if you have any questions or comment on the subject, please do share! I’ll be happy to read and discuss them with you!

Mick / Lead Programmer / @Mickd777

Hello there! It’s me again.

I’d like to share with you a few things about VR sickness we discovered as we were experimenting with FATED. We already broached this topic in the technical posts, but today, instead of talking about technology again, we’re going to talk about how to avoid this unwanted feeling from a design point of view.

As we were looking for the cause of the sickness, we found that it was mainly caused by the inner ear and the eye sending different messages to the brain at the same time. When the player is moving in Virtual Reality, the eyes inform the brain of the movement, but the inner ear, an authority on the subject, says otherwise. This is where the dizziness starts. Some actions have a greater effect than others, so by strategically choosing which actions to ask the player to perform, you can diminish the undesired effects. Some of these actions can be easily identified, including: height variations, rapid movements, acceleration and deceleration, rotation, and unprovoked movements.
By following a few basic rules, your level design can make a big difference on your game’s level of comfort.

Avoid making height variations in the environment.

Making the environment flatter will keep the player’s eye at the same height. This will also keep his inner ear stabilized, thus no weird feeling. This doesn’t mean that your environment should be totally flat, only that the part where the player is allowed to go should be “flatter”. After some testing, we also found out that a floor angle of about 10 degrees is still comfortable for the player. If you have a slope with a greater incline, a feeling of vertigo might start to set in, but this could differ from one person to another. So try to avoid making a VR game about running up and down flights of stairs. If you can choose between stairs and an elevator, go with the elevator. Also, downward vertical movements seem to be easier for the brain to handle. This is why roller coasters are so effective at creating a powerful feeling of vertigo and nausea.




Avoid making rapid movements.

When tweaking your character's movement speed, use a realistic, real-life scale value rather than a fictitious one; the brain adapts more easily. Likewise, if you have elements like moving platforms or an open elevator, keep their movement slow.

Avoid acceleration and deceleration.

It is better to jump to a constant speed in a single frame than to accelerate over time, and stopping should also happen instantly. This works well with speeds that are not too high; I can't speak for very high speeds, as I haven't experimented with those kinds of parameters.

Don’t ask your player to rotate too much.

Try to avoid having the player make repetitive rotations, and completely avoid 180-degree rotations in your walkthrough if possible. Try to present the player's options and interactions in front of him, within a narrow angle of perception. I don't mean that the whole game should be played in a corridor; that would be bad and boring, and we didn't make FATED like this. The point is to keep repetitive rotations in mind and minimize them, not to eliminate rotation entirely.


Completely avoid unprovoked movements.

By "unprovoked movement", I mean any movement that is not triggered by the player's input. So, no surprise elevators, no sudden floor rotations, and, above all, do not take control of the player's point of view the way so many first-person and third-person cut scenes do.

These few simple rules can help avoid VR sickness in your game; it is just a matter of approaching level design with them in mind. I have seen game designs change radically from a first-person adventure to a fixed standing experience or a third-person point of view, and that says a lot. VR sickness exists, and we can't ignore it. We have to find a way to create our games so that this undesired feeling is as absent as possible. I may be wrong, but I don't think technology alone will be able to counter this effect, as it is triggered by our own perceptual system. Trying different conceptual approaches may be the best hope we have of solving the issue, and if we want virtual reality to rise as a new medium, we have to create "sickless" experiences for the masses to enjoy.

Thanks for reading! If you have comments on what you’ve read here, or if you have tips and tricks that can help with VR sickness, please share them with us!

Philippe Dionne / Game Designer


Hello VR enthusiasts, my name is Philippe. I’m the game/level designer working on FATED.

My role on the project is wide and varied; like you, we are discovering what this new technology is all about and how to "tame it" into a new game medium. I'm basically working on translating the project's vision into an interesting gameplay experience by strategically using space and game mechanics, while drawing on VR's strengths to add new aspects to the game. I'm trying to figure out how to make every good idea work with the constraints and opportunities that this awesome technology offers.


I feel that VR is "the" new medium. There is no comparison between this and the other media I've tried. 3D television, for example, was presented as the next big thing, but in fact it only let us perceive depth in a picture. Virtual reality is the only medium that has a sense of presence: the power to make us believe that we are in a precise place in another world. I think this is where the technology caught me. I have always been a big fan of immersion, but this technology takes it to a whole new level. It makes me forget where I am and makes me believe that I'm part of a different reality; I cannot even stop my brain from making me physically react to situations. This medium is just incredible, and not only for gaming, but for a ton of other applications as well.


On this blog, I would like to present both conceptual and concrete aspects of designing a game for VR. I know I won't be able to sound as erudite as our lead programmer or show you beautiful things like our art director does, but I will share what it's like to design for virtual reality. I'll share with you the VR-specific issues we're encountering while making everyday design decisions for FATED. How should we approach level design for our fully immersed player? How can a simple gameplay element become tricky to design in first-person VR?

I hope to make it interesting enough to invite people to bring their own ideas and help push this new medium further. I'm also very open to taking part in discussions, so if you have questions or opinions about designing for VR that you would like to bounce around, please feel free to comment.

Philippe Dionne / Game Designer


Hello everyone,

I thought I’d give you an update on what the art team of FATED is working on at the moment.

On my end, I'm looking into the visuals we are missing to convey the storyline clearly in the intro scenes. The story delves into abstract concepts like death and the area between worlds. To get a better sense of timing and emotional involvement, we need to hook up a lot of things that aren't final, and we are using placeholders for everything that's missing.

At this point, all research is based on what is missing or could be better in-game: a lot of the groundwork is covered, and we can focus on fixing weaknesses. One good example is a board I'm making that lists a number of ways to turn destroyed areas into more convincing war-stricken zones: burned soil patches, dead trees, charred buildings, and heaps of debris.

Most of these assets are derived from things we already have – we are trying to make the most of our limited resources. They will decorate the world and emphasize drama in some parts of the story. The characters of FATED are facing hardship and destruction, so the game world needs to reflect that. Spoiler alert: you might see some dead people.

I'm also working on some textures for new asset kits that will help us dress up the environment when we get to that point. This can only happen once level design is final, so in the meantime I find myself doing many different things: patching up holes here and there so that we have a more complete sample of the experience.


As far as the rest of the team is concerned, there is a lot of animation work going on right now. Our animators Yanick Bélanger and Isabelle Fortin are modifying motion capture files and testing out facial expressions for one of the main characters. Our 3D artist Ève Mainguy is finishing up many props and producing many of the items I’m adding when reviewing the scenes. Right now, the work is planned but more flexible than usual, as we realized we needed to re-assess progress almost daily.


I also wanted to share this piece with you. We don't know what we'll do with it yet, but it will probably serve as a design for a T-shirt or other marketing material.


I was hoping to illustrate, in a whimsical way, the wonder one might feel when faced with the full-blown power of Ragnarök: world-changing events over which the characters have little or no control. I felt that the strong little girl in the image was the perfect character to convey that awe, courage, and vulnerability. The ornamental borders are meant to evoke Viking decorative designs, and the runes are derived from actual runic texts. The image inserted in the post is actually the right size if you want to use it as a desktop wallpaper.

That’s it for this week, I hope you enjoyed!

Marianne Martin / Art Director