The Mighty Morphin' Power Reticles of Alekon

Morphing 2D Shapes in UE4 using Multi-Channel Distance Fields

Intro

Hi! I'm Max, the engineer on Alekon, and today I will show you how I implemented the morphing reticle and UI elements you see in the game, and more generally, how to easily and efficiently morph any SVG vector shape into any other, with perfect resolution-independent sharpness and custom transition effects. I'm also including a sample UE 4.25 project showing the technique. Feel free to use its contents in your product, whether free or commercial (public domain / CC0 license).

Implementation TL;DR

The rest of this article goes into details, but here's the gist if you're short on time:

  1. Make your SVG shapes, e.g. using Inkscape or Illustrator.
  2. Use msdfgen to generate a multi-channel distance field texture from each shape.
  3. Sample each texture and take the median value of its channels.
  4. Lerp between the respective medians to get morphed distance to shape.
  5. Feed the morphed distance into the Distance Field material function to render a resolution-independent shape.
  6. Control the progress of your morph animation by offsetting the lerp alpha with a grayscale texture.

Background

There's only three of us, so I get to wear all kinds of hats, including being the UX designer and UI artist. Since I started working on the game, I have been adamant that the only UI visible during normal gameplay should be the reticle. In a photography game that relies on ambiance like ours, I feel that having a HUD block any part of your vision really harms immersion.

Of course, this put a lot of pressure on the reticle to deliver relevant information while looking pretty. One of the things we needed to communicate was zoom level. I tried all kinds of sliders, and scale bars, and radial progress bars, but all of them looked awkward, out of place, or hard to understand.

I tried my hand at designing something prettier, which resulted in this circle with a curly tail, a motif we use throughout the game, from the logo, to our magical lanterns, to the staff held by Japley, the player's guide to the game. The idea was that the tail extends from the edge to the center as the zoom level increases. I generated a spritesheet from all the frames and picked a page based on the zoom level in the reticle material.

Although this approach suffered from blurry scaling at lower resolutions, at the native 1080p resolution it was pixel perfect and all was good for a time. Soon, however, I grew spoiled by our nicely morphing zoom reticle, and the instant flip to other contextual reticles, such as those indicating interactable portals, magnets or NPCs no longer looked good enough.

I went back to the drawing board to design a way to morph any reticle into any other. Alpha fading could hide some of the transitions, but it felt static. I considered making reticles into a flat camera-space 3D mesh and morphing that, but authoring reticles that way sounded very painful, and I couldn't figure out how to support topology changes, like one circle morphing into two separated arcs.

Eventually I realized that distance field textures, rarely used outside of rendering fonts, might be the answer. Distance fields are continuous, so they can be transformed smoothly, and rendered perfectly crisp at any reasonable resolution. And what's more, since each pixel in a distance field texture is rendered completely independently, topology doesn't matter.

A sleepless night later I had my first implementation of The Mighty Morphin' Power Reticles, and eventually ended up using this approach all throughout our UI. Below is a breakdown of the technique.

Concepts

A distance field is a way to represent an arbitrary shape using a raster. This works in any number of dimensions, but in this case we only care about 2D vector shapes described by 2D raster textures. Distance fields are also applicable in non-rendering contexts, such as collision and AI heatmaps, but we're only concerned with rendering here.

Using distance field textures to render 2D vectors in games was first popularized by a 2007 Valve paper, Improved Alpha-Tested Magnification for Vector Textures and Special Effects. This was quickly adopted as a common way to render font glyphs, but saw limited use in other vector contexts. I would hazard a guess that this was in part because UI was often built using Scaleform, which had its own vector renderer.

In a 2D distance field texture, each pixel represents the distance from the center of that pixel to the nearest edge of the shape being described. Traditionally, when encoded into 8 bits, minimum distance pixels are encoded as 255, or white, and maximum-distance pixels are encoded as 0, or black.

A triangle and its corresponding distance field. Pixels closest to the edge are at minimal distance, shown here as the brightest white; the farther you go, the larger the distance, fading to black. The field is capped at some maximum distance, with anything beyond it treated as infinity.

One problem with this is that we can't tell the interior of the shape from the exterior. To address this we use signed distance fields, where distances inside the shape are negative. Then we rescale the whole range back to [0, 255], so the edge sits at mid-gray, interior pixels are brighter, and exterior pixels are darker, and store it just like an unsigned distance field.
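As a concrete illustration of this encoding (my own sketch, not part of the article's pipeline), here is a brute-force signed distance field for a circle in pure Python, clamped and remapped so deep-inside pixels approach 255 and far-outside pixels approach 0. All sizes and names are arbitrary.

```python
def circle_sdf(size=8, cx=3.5, cy=3.5, radius=2.5, max_dist=4.0):
    """Brute-force 8-bit signed distance field for a circle (illustrative)."""
    field = []
    for y in range(size):
        row = []
        for x in range(size):
            # Signed distance: negative inside the circle, positive outside.
            d = ((x - cx) ** 2 + (y - cy) ** 2) ** 0.5 - radius
            # Clamp to the representable range, i.e. treat anything beyond
            # max_dist as "infinity".
            d = max(-max_dist, min(max_dist, d))
            # Remap [-max_dist, +max_dist] to [255, 0]: inside is bright,
            # the edge lands at mid-gray, outside is dark.
            row.append(round((1.0 - (d + max_dist) / (2.0 * max_dist)) * 255))
        field.append(row)
    return field

field = circle_sdf()
```

Thresholding the result at mid-gray (128) recovers the original circle, which is exactly what the shader will do later.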

A signed distance field discriminates between interior and exterior.

Another, more complicated, problem is that our textures have finite resolution, so sharp changes in the distance from a pixel to an edge can be lost due to quantization, meaning that corners are always rounded to some extent. This was solved in Viktor Chlumský's 2015 master's thesis, Shape Decomposition for Multi-channel Distance Fields. The key insight is that you can represent corners by decomposing the shape into 3 shapes, and any point that is in at least two of the shapes will also be in the original shape. Then, if you generate distance fields for each, the median of them at any pixel will be the distance to the original shape, corners included.

I haven't studied the paper enough to understand how you decompose the shape, but game development is a practical craft, and there's a working algorithm, which is good enough for me. If this topic interests you, this wonderful short talk about the paper by Zach Tellman is very digestible and requires no background knowledge, but it does skip over the exact decomposition algorithm.

A multi-channel distance field (MSDF) allows us to represent sharp corners.

To render the shape with this representation, all we need to do is sample it in the shader and take the median of the 3 channels to get the distance to the original shape. We then pick a "smoothness" range, representing the number of pixels across which the edge is interpolated, and output solid pixels where the distance is below this range, transparent pixels where it's above it, and partially translucent pixels within it, with their alpha interpolated based on where in the range they fall. This is exactly what the smoothstep() function does, and it's available in most if not all shading languages and material systems.
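In plain-Python terms (a sketch with illustrative names, not UE4's node names), the per-pixel logic above looks like this:

```python
def median3(r, g, b):
    # Median of the three channels recovers the true distance to the shape.
    return max(min(r, g), min(max(r, g), b))

def smoothstep(edge0, edge1, x):
    # Standard smoothstep: 0 below edge0, 1 above edge1, smooth in between.
    t = max(0.0, min(1.0, (x - edge0) / (edge1 - edge0)))
    return t * t * (3.0 - 2.0 * t)

def shade_pixel(sample_rgb, smoothness=0.05, threshold=0.5):
    """Turn an MSDF texture sample (normalized to [0, 1]) into an alpha."""
    d = median3(*sample_rgb)
    # Solid inside, transparent outside, blended across the smoothness range.
    return smoothstep(threshold - smoothness, threshold + smoothness, d)
```

A sample well above the threshold in all channels shades fully solid, one well below shades fully transparent, and only samples near the edge land in between.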

One of the most interesting aspects of this representation is that because it's a distance from the edge, you can trivially inflate and deflate the shape by adding or subtracting from the distance before applying the threshold.

Taking this one step further, you can morph from one shape to another by deflating one and inflating the other, with the two weights summing to 1, and taking their union. This just means lerping from the sample of the first texture to the sample of the second texture. You can even extrapolate the lerp alpha (i.e. use a weight higher than 1) to inflate one side of the lerp beyond its original size. We use this to make some morphs feel more cartoony, inflating a bit at the end of a transition before snapping back to the final size.
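Sketched in the same plain-Python style (my reconstruction, not the game's actual material code), the morph is just a lerp whose alpha is allowed to leave [0, 1]:

```python
def lerp(a, b, alpha):
    # alpha may exceed 1, which extrapolates past the target shape.
    return a + (b - a) * alpha

def morphed_alpha(d_a, d_b, alpha, threshold=0.5):
    """Hard-threshold the lerped distance (smoothstep omitted for brevity)."""
    return 1.0 if lerp(d_a, d_b, alpha) >= threshold else 0.0

# A pixel inside shape A (d = 0.8) but outside shape B (d = 0.2)
# stays solid early in the morph and drops out as alpha approaches 1:
early = morphed_alpha(0.8, 0.2, 0.25)
late = morphed_alpha(0.8, 0.2, 0.9)
# A pixel just outside B (d = 0.48) only becomes solid when we
# extrapolate past alpha = 1 -- the "cartoony" overshoot:
overshoot = morphed_alpha(0.3, 0.48, 1.2)
```

Because every pixel is evaluated independently, this works regardless of topology, exactly as described for the distance fields themselves.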

Now that we're done with the theory, let's dive into the fun part, which is getting this into the game!

Generating MSDF Textures

To start, you'll need SVG files for your shapes. There are many ways to make these. I use the FOSS Inkscape editor, but many use Adobe Illustrator, and I'm sure there are plenty of others.

Next up is converting those SVG files into an MSDF texture. Remember Viktor from a few paragraphs ago? Well, he was kind enough to release msdfgen, an open source implementation of his MSDF algorithm, which can take an SVG and produce an MSDF texture. It is not without its flaws, but I've managed to get it to spit out usable assets every time with some fiddling.

MSDFGen is a command-line program. There's a paid third-party UE4 marketplace plugin that wraps the MSDFGen library and lets you simply drag SVGs into your content browser to import, but at the moment I cannot recommend it: in my experience it fails to produce MSDFs from most non-trivial SVGs, though regular SVG rendering works well. The author promised to look into the issue, but I haven't heard from them since.

Before converting the SVG, you have to get it into as simple a form as possible, making sure that:

These are all painful steps that you'll have to repeat each time you want to convert while iterating, but for now it's the price we have to pay to be able to use this tool.

Once ready, grab the latest release of MSDFGen, or compile it if you aren't on Windows, then head over to your command line and run:

msdfgen.exe -svg in.svg -o out.png -angle 1D -size 128 128 -pxrange 16 -scale 1

The arguments are:

  - -svg: the input SVG file.
  - -o: the output texture image.
  - -angle: the minimum angle between adjacent edges to be treated as a corner (append D for degrees).
  - -size: the output texture dimensions, in pixels.
  - -pxrange: the width of the encoded distance range, in output pixels.
  - -scale: the scale used to convert shape units to pixels.

This will hopefully generate a PNG file with your texture. You will be able to tell roughly at a glance whether it's reasonable. If it doesn't look right, here's a bunch of troubleshooting steps:

Now import your texture like any other PNG into UE4. Once imported, it's very important to change the texture settings as follows:

Finally, make sure the Format line in the texture details says B8G8R8A8. If it says just R8, then your PNG happened to have no corners, and MSDFGen optimized it into a single channel, which will fail to render using the material described below. Resave the file as a 32-bit PNG using any raster editor and reimport it.

Side note: if your shapes are regular rasters, you can generate plain, single-channel distance fields using tools like image-sdf. They won't have sharp corners, but you can use them interchangeably with MSDFs, as long as they are stored with multiple channels as described in the previous paragraph.

Rendering a Shape

With your texture imported, let's look at the material setup. This builds a UI material, but you can certainly use this technique in other types of materials, like surfaces, decals and particles. For instance, we use distance fields in our carved rune decals to give them a gradient glow.

First, MF_CombineMSDFChannels combines the channels of the MSDF texture into a single distance value by taking their median:

Then MF_AdaptSmoothnessToResolution scales the smoothness range with resolution. This is not strictly necessary, but it makes it easier to scale your shapes without having to adjust the smoothness.
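The idea behind such a function is standard MSDF practice rather than anything UE4-specific: keep the blended edge roughly one on-screen pixel wide no matter how large the quad is drawn. A rough sketch of the arithmetic (names and the pxrange default are illustrative, not the actual material function's):

```python
def screen_px_range(tex_size, render_size, px_range=16.0):
    # How many on-screen pixels the encoded distance range spans when a
    # tex_size-wide texture is drawn render_size pixels wide.
    return px_range * (render_size / tex_size)

def adapted_smoothness(tex_size, render_size, px_range=16.0):
    # Half-width of the edge band in normalized distance units: one screen
    # pixel, mapped back through the distance range.
    return 0.5 / screen_px_range(tex_size, render_size, px_range)
```

Drawing a 128-px texture at 512 px quadruples screen_px_range, so the smoothness shrinks by the same factor and the edge stays equally crisp.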

Finally, M_Shape uses those two functions to convert a single MSDF texture into an opacity mask, and exposes a color parameter. The built-in DistanceField function is just a wrapper for smoothstep():

The result is this island shape:

You can increase the smoothness parameter to intentionally get blurry results, which can be useful as a glow to draw under your main shape.

Note that you will get sharp transitions at the corners. You could avoid them by using a plain single-channel distance field for the glow in addition to an MSDF for the shape, but I found it more practical to use a Retainer Box with a post-effect material to render much more flexible glows for arbitrary content, including text.

Morphing

Morphing from one shape to another is as simple as lerping between the two distances, after each has been combined separately. MF_MorphMSDF does that:

M_ShapeMorph does the equivalent of M_Shape using the morph:

That gives us a transition like this as we vary FadeAlpha:

Controlled Timing

One very interesting aspect of this morphing implementation is that you can control which parts of the shape get morphed earlier, and which get morphed later by using a grayscale texture that is added to the morph alpha (I call this the fade progress texture). Here's an example:

This offers a crazy amount of flexibility, but when creating this fade progress texture, it is hard to imagine the result exactly, so it tends to involve a good amount of tweaking to get it to look right. I usually start with a gradient then paint over it bit by bit until I get the effect I want.

M_ShapeFade implements this by simply adding the texture to the fade alpha, although you will need to compensate for it by subtracting a constant to prevent the fade from starting early.
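As a sketch of that arithmetic (my reconstruction, not the actual material graph): adding the progress sample and subtracting 1 keeps every pixel at zero until the global fade alpha exceeds what that pixel's progress value "owes", which also means the global alpha must be driven past 1 to finish the late pixels. Under this convention, brighter progress-texture pixels fade earlier.

```python
def local_fade_alpha(fade_alpha, progress_value):
    # progress_value: the fade progress texture sample at this pixel, in
    # [0, 1]; brighter pixels fade earlier under this convention.
    # Subtracting 1 compensates for the added texture so nothing fades
    # before the global fade begins; fade_alpha is then driven from 0 to 2
    # to sweep every pixel through its own fade.
    a = fade_alpha + progress_value - 1.0
    return max(0.0, min(1.0, a))
```

Driving fade_alpha from 0 to 2, a pixel with progress 1.0 completes its fade during the first half of the animation, while a pixel with progress 0.0 only starts at the halfway point.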

I've found fade progress textures especially useful for the special case of morphing between a shape and nothing, i.e. fading it in or out. If one of the MSDFs you are morphing between is plain black, you get an inflation effect that feels quite organic. The main rule of thumb to keep in mind when authoring your shapes for this is that by default, the thicker parts of the shape always appear before the thinner ones, and the fade progress texture effect is applied on top of that.

Here's an example of a pretty complex animation that is implemented very simply and efficiently with just an MSDF texture generated from an SVG and a fade progress texture cobbled together from two conical gradients and some hand tweaking. You could even stuff the fade progress texture into the alpha channel of the MSDF and only need to sample a single texture!

Conclusion

This is a very efficient and flexible technique, but it does have some limitations. The main one is that it only deals with binary shapes, not gradients or colors, so you can't transition from arbitrary SVG images. However, you can still control color with other textures or procedural gradients.

In the case of the UE4 implementation in particular, the morph is controlled by a single material parameter, which means it's super convenient to use by mapping to some gameplay variable (e.g. a progress bar's progress), or by controlling it directly in Sequencer or UMG animations.

You can find all of the examples shown here in the sample project, which is released into the public domain (or with a CC0 license where inapplicable), meaning that you can use it in any free or commercial project, even without attribution.

If you have any questions about this technique, or anything else in the game, tweet @AlekonTheGame, and in the meantime, you can get Alekon on Steam!