Animated Vector Rendering

Rendering animated vector graphics on the GPU in-game.

A Unity 3D asset bundle for rendering animated vector graphics in-game. The bundle is still work in progress, but a first preview is available here.

  • Type: Unity asset bundle (work in progress).
  • Team: Solo.
  • Engine: Unity 3D.
  • Main languages: C#, Cg/ShaderLab.
  • Platforms: Mac, Windows, WebGL.


While rasterized images certainly have their merit in photorealistic games, UI elements and, for example, cartoon-styled games can benefit from the ability to render vector images in-game. Vector images are infinitely scalable without jagged pixel edges. And because their information is stored per shape rather than per pixel, they hold the promise of using far less memory than a large bitmap texture.

Current GPU hardware is optimized for processing vertices and transforms in large quantities and rasterizing the results for on-screen display, which has historically served mostly 3D rendering. 2D vector graphics also consist of anchor points (vertices) that are transformed and rasterized for rendering, but they have mostly been rendered on CPUs, which are not optimized for such tasks.


In order to test the performance of the library and the viability of the techniques used, a small preview game was created using the library. It showcases many features and helped me squash a lot of bugs I wouldn't have noticed otherwise.

The menu is created from a single SVG document in which a number of different animations are defined. These animations are played when the arrow keys are pressed. Since the gameplay requires a few more animations to be played concurrently, the game scene is composed of three SVG documents.

The gameplay is simple: match the color shown in the middle to one of the colors on the rotating band. A match is made when you press the arrow pointing in the direction the color is at that moment.

››› Play Now! ‹‹‹


  • SVG importing: basic shapes, groups, animations.
  • Basic shape rendering: rectangle, ellipse, polygon and paths.
  • Groups: correctly grouping elements.
  • Strokes: for rectangle and ellipse.
  • Animations: attribute-based, transform-based and point-based.

Shape rendering

  • Shapes are rendered by generating a triangulated mesh.
  • Rectangles and ellipses are created on a quad.
  • Polylines are simply triangulated.
  • Paths are triangulated, and curves are rendered with the technique described by Loop and Blinn.


  • Animations are generic and can be applied to attributes of type float, Vector2, color, bool, point list, spline list, ...
  • Animations can be triggered through the VectorAnimator component, optionally with a playback speed and playing range.
  • Temporary animations can be created for a single attribute.


  • Holes in paths are not implemented yet.
  • Strokes on polygons and paths are not supported yet.
  • Gradients are not implemented yet.
  • Only basic morphing animations where the shapes cover a similar area are supported.


While creating this library, I encountered a lot of challenges, from importing vector data to storing and displaying it.


It's important that users of the library, mainly developers and artists, can easily interact with all of its functionality. That's why I created two components for them to interact with. These expose a UI for setting up the vector sprites and methods for editing them at run-time.


The two components attached to a GameObject.

  • The VectorRenderer component offers the bare-bones functionality of displaying imported vector sprites. Since it just shows the vector statically, it has a very low performance cost. Simply dragging the vector sprite into the corresponding slot will show the graphic. When the GameObject to which the component is attached is rotated, scaled or translated, the image transforms with it.
  • The VectorAnimator component hooks into the VectorRenderer and allows triggering and displaying the animations of the vector sprite. It exposes a few methods to start and stop animations, play an animation at a custom speed, play in reverse, or play only a small part of an animation. Since it requires changing the vector mesh at run-time, this component is a bit heavier.

SVG Importing

Because creating vector data in Unity is not the goal of this library, it has to be imported. After researching different formats, I opted for SVG files. The open SVG standard uses an XML structure and is thus easy to create and edit, even without vector editing software.

The SVG importer script inherits from Unity's AssetPostprocessor class and is called each time an asset is imported; if it is a .svg file, it gets imported. The importer does not cover every SVG feature yet, but it handles the features most important for this library.
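As a rough sketch of what such an import step does (shown in Python for brevity rather than the asset's C#; the function name and structure are illustrative, not the actual importer), the basic shape elements can be pulled straight from the XML tree:

```python
import xml.etree.ElementTree as ET

SVG_NS = "{http://www.w3.org/2000/svg}"

def parse_basic_shapes(svg_text):
    """Collect the basic shape elements and their attributes from an SVG document."""
    root = ET.fromstring(svg_text)
    shapes = []
    for elem in root.iter():
        tag = elem.tag.replace(SVG_NS, "")
        if tag in ("rect", "ellipse", "polygon", "path"):
            shapes.append((tag, dict(elem.attrib)))
    return shapes

doc = """<svg xmlns="http://www.w3.org/2000/svg">
  <rect x="0" y="0" width="10" height="5" fill="#ff0000"/>
  <ellipse cx="5" cy="5" rx="4" ry="2"/>
</svg>"""

print(parse_basic_shapes(doc))
# [('rect', {...'width': '10'...}), ('ellipse', {...})]
```

The real importer additionally walks groups and animation elements and emits Unity assets instead of tuples, but the traversal idea is the same.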

Mesh creation

The method used for rendering vector graphics on the GPU is the one described by Loop and Blinn. The basic concept behind the method is converting the shapes to triangulated meshes. For rectangles and ellipses that's quite an easy task, since they can be clipped mathematically on a quad. Polygons, with their straight edges, are easy as well, since they can simply be triangulated. Paths, on the other hand, are much more difficult, as described in Loop and Blinn's research. Each of these shapes is described below.


For each vector document, exactly one mesh is created when importing. This mesh consists of multiple submeshes, one for each shape. This allows for only using one MeshFilter component with an array of materials assigned.

Rectangles and ellipses

As said before, rectangles and ellipses are easily converted to a mesh. By setting the UV coordinates on the vertices of a quad, it is easy to calculate which parts to clip. The image below shows how an ellipse and a rounded rectangle are constructed.

Construction of a rounded rectangle and ellipse on a quad.

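The per-pixel clipping test itself is simple. As an illustration (a Python stand-in for the actual Cg fragment shader), a pixel on the quad is kept only when its UV coordinates fall inside the inscribed ellipse:

```python
def inside_ellipse(u, v):
    """UV runs 0..1 across the quad; keep the pixel when it falls inside
    the ellipse inscribed in that quad, discard (clip) it otherwise."""
    du, dv = u - 0.5, v - 0.5
    return (du * du) / 0.25 + (dv * dv) / 0.25 <= 1.0

# The centre of the quad is inside; the corners are clipped away.
print(inside_ellipse(0.5, 0.5))  # True
print(inside_ellipse(0.0, 0.0))  # False
```

A rounded rectangle works the same way, except the test only applies near the corners of the quad.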

Polylines and polygons

Polylines and polygons have straight edges, so these work very well as triangulated shapes. Currently I use the Triangle.NET library, slightly adjusted to work with Unity and Mono. The shader used for polylines is extremely simple, since the fragment shader can just output the fill color for each pixel.
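For intuition, here is what triangulation amounts to in the simplest (convex) case, sketched in Python; Triangle.NET handles the general, possibly concave, case:

```python
def fan_triangulate(points):
    """Triangulate a CONVEX polygon by fanning triangles out from the
    first vertex. Returns index triples into the point list."""
    return [(0, i, i + 1) for i in range(1, len(points) - 1)]

square = [(0, 0), (1, 0), (1, 1), (0, 1)]
print(fan_triangulate(square))  # [(0, 1, 2), (0, 2, 3)]
```

An n-gon yields n − 2 triangles; the resulting index triples map directly onto a mesh's triangle index buffer.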


To render paths with Bézier curves, a triangulated mesh is created as well; the triangles through which a curve flows require special UV coordinates. This mesh is created as one inner shape plus, for each curve, a shape following the curve's control and anchor points, as shown on the image below.

A triangulated path consists of one inner shape and a triangle fan for each curve segment.

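In Loop and Blinn's scheme, a quadratic segment's triangle gets the texture coordinates (0, 0), (1/2, 0) and (1, 1) at its start, control and end points, and the fragment shader keeps a pixel when the interpolated coordinates satisfy u² − v ≤ 0. A minimal Python sketch of that implicit test (the actual shader is Cg, not shown here):

```python
# Loop-Blinn UVs for a quadratic curve triangle: start, control, end point.
UV = [(0.0, 0.0), (0.5, 0.0), (1.0, 1.0)]

def inside_curve(w0, w1, w2):
    """Given a pixel's barycentric weights in the curve triangle,
    interpolate (u, v) and apply the implicit test u^2 - v <= 0
    (True when the pixel lies on the filled side of the curve)."""
    u = w0 * UV[0][0] + w1 * UV[1][0] + w2 * UV[2][0]
    v = w0 * UV[0][1] + w1 * UV[1][1] + w2 * UV[2][1]
    return u * u - v <= 0.0

# Triangle centroid: u = 0.5, v = 1/3, so 0.25 - 0.333 < 0 -> inside.
print(inside_curve(1/3, 1/3, 1/3))  # True
# At the control point: u = 0.5, v = 0, so 0.25 > 0 -> outside the curve.
print(inside_curve(0.0, 1.0, 0.0))  # False
```

The GPU performs the (u, v) interpolation automatically across the triangle, so the fragment shader only evaluates the sign of u² − v.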

Rendering a vector image is one thing, but animating it as well requires some additional work.

Since the curves can change when animating, their control points can flip from pointing inside the shape to pointing outside and vice versa. To cope with this, the curve triangles and the inner triangulated shape have to be created a bit differently. By giving each curve six vertices, with two vertices reserved for control points pointing inside and two for control points pointing outside, the mesh is prepared for animation. When the curve type changes during an animation, enough vertices are available for two control points on either side, as shown on the image below. The inner shape is also created with the default of four vertices per Bézier segment. While this results in some overhead in the form of extra vertices and triangles, it gives the flexibility to animate the mesh at run-time.

To make the path mesh flexible for animations, 6 vertices are used per curve, this allows for flexible repurposing of the vertices when the type of the curve changes.




Because different attributes of the shapes have to be animated, a flexible data structure has to be created as well.


To make the system easily extensible, the animation elements are generic classes, so there doesn't have to be a separate class for each attribute. So far, animation elements are implemented for animating floats, colors, Vector2s, bools, polygon point data and path segment data. Because these animation elements are generic, adding new animated attributes to the shapes is as easy as assigning which type of animation element to use.
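The idea can be sketched as follows (in Python rather than the asset's C# generics; class and method names are illustrative): a single keyframe track type that works for any value, given a blend function for that value's type:

```python
class AnimationElement:
    """Generic keyframe track: holds (time, value) pairs and a blend
    function, so one class serves floats, colors, point lists, etc."""

    def __init__(self, keys, blend):
        self.keys = sorted(keys)   # (time, value) pairs, sorted by time
        self.blend = blend         # blend(a, b, t) -> interpolated value

    def sample(self, t):
        """Return the animated value at time t, clamping outside the range."""
        keys = self.keys
        if t <= keys[0][0]:
            return keys[0][1]
        for (t0, v0), (t1, v1) in zip(keys, keys[1:]):
            if t0 <= t <= t1:
                return self.blend(v0, v1, (t - t0) / (t1 - t0))
        return keys[-1][1]

lerp = lambda a, b, t: a + (b - a) * t
opacity = AnimationElement([(0.0, 0.0), (1.0, 1.0)], lerp)
print(opacity.sample(0.25))  # 0.25
```

Animating a color or a point list only requires passing a blend function for that type; the track logic itself never changes.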

When an animation is running, each animation element's value is retrieved and applied to the element. For certain attributes only the material's properties need to be adjusted; others require an edit to the mesh. To do this efficiently, a hash set is kept each frame that tracks which elements should change the mesh. When all animation elements have been evaluated, the system iterates over the hash set and issues a mesh-edit command. Since the hash set only contains unique elements, each element will edit its sub-mesh at most once per frame.
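The pattern can be sketched like this (a Python illustration with hypothetical names, not the library's actual code): mesh-affecting attribute writes mark a shape dirty, and the expensive sub-mesh rebuild happens once per shape at the end of the frame.

```python
class Shape:
    def __init__(self, name):
        self.name = name
        self.mesh_rebuilds = 0

    def rebuild_submesh(self):
        self.mesh_rebuilds += 1  # stands in for the expensive mesh edit

MESH_ATTRS = {"points", "segments"}  # attributes that require a mesh edit
dirty = set()

def apply_animated_value(shape, attr, value):
    """Apply an animated value; mesh-affecting attributes only mark the
    shape dirty instead of rebuilding its sub-mesh immediately."""
    setattr(shape, attr, value)
    if attr in MESH_ATTRS:
        dirty.add(shape)  # the set keeps each shape at most once

def end_of_frame():
    for shape in dirty:
        shape.rebuild_submesh()
    dirty.clear()

s = Shape("path")
apply_animated_value(s, "points", [(0, 0), (1, 1)])
apply_animated_value(s, "segments", [0])  # same shape: still one rebuild
end_of_frame()
print(s.mesh_rebuilds)  # 1
```

The set's uniqueness guarantee is what caps the work at one sub-mesh edit per shape per frame, no matter how many of its attributes were animated.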