Builtin A* Pathfinding in Unreal Engine 4.25

A few years back I wrote Neatly replacing NavMesh with A* in UE4 and ever since I’ve had a vague notion that it’s probably gone wildly out-of-date. As it happens I was recently working on another project that would benefit from A* and I noticed UE4 already has an A* implementation called FGraphAStar, so I thought I’d write a little updated post talking about both. (Fun fact: FGraphAStar was added about two months before I wrote the earlier post… the lesson here is: always search the codebase…)

I’ve put a simple demo project up at https://bitbucket.org/chrismcr/astar425. Left-click the ground and watch ConeMan A* pathfind like a winner.

ANavigationData Replacement

The process of replacing NavMesh hasn’t changed much so I won’t go over it again, though it’s a little simpler to plug in: you don’t need to edit DefaultEngine.ini manually, you can just add a default Agent to the supported agents list in Project Settings/Navigation System and set its NavigationClass to your NavMesh replacement.


The more interesting part is implementing the graph that FGraphAStar searches. There’s a comment above the definition in GraphAStar.h that tells us what we need to implement. FGraphAStar is a template struct whose template parameter class, TGraph, has to define a type and a couple of functions. The type, FNodeRef, is just a value that uniquely identifies a node in the graph – in my simple grid demo it’s an FIntPoint, since a grid cell can be uniquely identified by its coordinates. The functions needed are:

int32 GetNeighbourCount(FNodeRef NodeRef) const

This returns the maximum number of neighbours the given node could have. In my grid-based demo, I return 8 since I allow diagonal movement. Notice I return 8 even for an edge cell – trying to filter at this point just creates unnecessary clutter.

bool IsValidRef(FNodeRef NodeRef) const

Does the given NodeRef refer to a valid node? In my demo I just return whether the coordinates are on the grid.

FNodeRef GetNeighbour(const FNodeRef NodeRef, const int32 NeighbourIndex) const

Now things get interesting – return the Xth neighbour of the given node. In my demo I count the neighbours clockwise from positive Y (this is why trying to cater for edge cells in GetNeighbourCount would get messy: if I’m a corner cell and return 3 there, I’d need to check again here whether I’m a corner cell to reinterpret NeighbourIndex correctly – but if I always report 8 neighbours, each value of NeighbourIndex always has the same meaning).
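Put together, the three functions can be sketched in plain C++ outside the engine. Point stands in for FIntPoint, and the 10×10 bounds and the exact winding of the offset tables are my assumptions for illustration, not taken from the demo:

```cpp
// Plain-C++ sketch of the TGraph requirements, outside the engine.
// "Point" stands in for FIntPoint; Width/Height are hypothetical grid bounds.
struct Point { int X; int Y; };

struct GridGraph
{
    int Width  = 10;
    int Height = 10;

    // Always 8: filtering edge cells here would force GetNeighbour to
    // reinterpret NeighbourIndex, so off-grid neighbours are rejected
    // later via IsValidRef instead.
    int GetNeighbourCount(Point) const { return 8; }

    bool IsValidRef(Point Node) const
    {
        return Node.X >= 0 && Node.X < Width && Node.Y >= 0 && Node.Y < Height;
    }

    // Neighbours counted clockwise starting from positive Y.
    Point GetNeighbour(Point Node, int NeighbourIndex) const
    {
        static const int DX[8] = { 0, 1, 1, 1, 0, -1, -1, -1 };
        static const int DY[8] = { 1, 1, 0, -1, -1, -1, 0, 1 };
        return { Node.X + DX[NeighbourIndex], Node.Y + DY[NeighbourIndex] };
    }
};
```

Note that asking for neighbour 5 of the corner cell (0, 0) happily returns (-1, -1); it’s IsValidRef that rejects it.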

So that gets the basic structure of our node graph into A*. The rest is in the query filter, which defines the following:

float GetHeuristicCost(const FNodeRef StartNodeRef, const FNodeRef EndNodeRef) const

This is the cheaply calculated estimated cost of pathing from StartNodeRef to EndNodeRef. For my grid demo I’m just returning the Manhattan distance.

bool IsTraversalAllowed(const FNodeRef NodeA, const FNodeRef NodeB) const

Can you go from NodeA to NodeB? For the grid demo, if I only allowed 4-directional movement this would be trivial: yes, if node A and node B are both open. Since I allow diagonal movement I also make sure e.g. north and east are both clear before allowing a move north-east.

float GetTraversalCost(const FNodeRef StartNodeRef, const FNodeRef EndNodeRef) const

This is the real (i.e. not estimated) cost of travelling from StartNodeRef to the adjacent node EndNodeRef. A* finds the cheapest path, so this can be used to mark preferred routes e.g. if the current node is ‘road’, and left is ‘road’ but right is ‘swamp’ we might return 1 to go left and 2 to go right.

float GetHeuristicScale() const

The comment says “used as GetHeuristicCost’s multiplier”; I haven’t used it in the demo (I just return 1) since the environment is trivial enough that heuristics don’t really matter. In a more complex environment (and a more complex heuristic) the scale can be used to adjust how much A* relies on the heuristic – if you return zero, the heuristic is ignored completely, making A* completely accurate but much slower. As you increase the value A* will search fewer paths and possibly settle on one that’s suboptimal.
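The filter functions can be sketched the same way in plain C++. Point and the IsOpen walkability callback are stand-ins for the engine types, and the √2 diagonal cost is my assumption for illustration – the demo’s actual costs may differ:

```cpp
#include <cstdlib>

// Plain-C++ sketch of the query-filter functions for a grid.
// "Point" and the IsOpen callback are stand-ins, not engine types.
struct Point { int X; int Y; };

struct GridQueryFilter
{
    bool (*IsOpen)(Point);  // hypothetical walkability test

    // Cheap estimate: Manhattan distance.
    float GetHeuristicCost(Point Start, Point End) const
    {
        return float(std::abs(Start.X - End.X) + std::abs(Start.Y - End.Y));
    }

    // Diagonal moves also require both adjacent orthogonal cells to be
    // open, so the path can't cut through a blocked corner.
    bool IsTraversalAllowed(Point A, Point B) const
    {
        if (!IsOpen(A) || !IsOpen(B))
            return false;
        if (A.X != B.X && A.Y != B.Y)  // diagonal step
            return IsOpen({A.X, B.Y}) && IsOpen({B.X, A.Y});
        return true;
    }

    // Real cost of one step; diagonals cost sqrt(2) (an assumption).
    float GetTraversalCost(Point A, Point B) const
    {
        return (A.X != B.X && A.Y != B.Y) ? 1.41421f : 1.f;
    }

    // 1 = use the heuristic as-is; 0 would ignore it entirely.
    float GetHeuristicScale() const { return 1.f; }
};
```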

And that’s about it. Nice and simple, but then A* is.

Just Cause-style wingsuit/grapple/gliding

I’ve clocked up over 60 hours on Just Cause 3, and most of it is just gliding about with the wingsuit. There’s something very compelling about that calm beauty interspersed with moments of complete panic as you very nearly smack into a tree/car/tower block.

I wanted to have a go at recreating that same feel, and I knew I’d need a nice terrain to fly over so I turned to the UE4 open world ‘Kite’ demo. I’ll upload the code to a repo just as soon as I have time to extricate it neatly from the (multiple gigabyte) project. In the meantime, here’s a soothing video:

VR Hub / Apartment Trashing Simulator

As part of the International Festival for Business last week I helped out at Realspace’s VR hub in the Liverpool Science Centre. Realspace have recently set up shop, and amongst other things they provide a space for local businesses to experiment with VR. They knew I was working on a VR game and asked if I could demo it at the event – but the game is such an early prototype it’s still full of coder art, which I’m not keen to show to people! Instead I showed the very first thing I made after getting my DK2: I took Epic’s ‘Realistic Rendering’ environment and added some physics-based gaze mechanics, so you could walk around Epic’s beautiful apartment lounge, turn the lamps on and off, and pick up and smash objects ‘with your mind’. Ludicrous but oddly compelling – it was fun to see Serious Business People light up with joy when they realised they could smash the place up.

For The Loot!

Match details: join us tonight at 20:00 BST (GMT+1). Steam has to be running, and change your download region to UK/Manchester. I’ll be hosting as ‘chrismcr’.

Download the post-jam game here!

Last weekend I took part in my first 4-day #ue4jam, as part of the Sleepless Beanbags team. The theme was ‘fire in the hole’ so obviously that implies a multiplayer brawler about miners chucking dynamite at each other.

Because we could only test 3-person multiplayer it was nerve-wracking to watch Allar set up a 10-person game on his livestream… but it just about held up!

Anyway – this post is to say we’ve done a post-competition release which fixes many of the bugs, most importantly the names are now visible everywhere so you can actually see which character you are!

But beyond that, because it’s a multiplayer game it’s not much fun on your own! So I’ll be hosting a game at 20:00 (UK time!) tonight, and we’d love it if you’d join us. You’ll need to run Steam first, set your download region in settings to UK/Manchester, then run the game and join the match called ‘chrismcr’.

So grab the post-jam game here and we’ll see you at 20:00!

Escape in UE4

Just over four years ago, for Ludum Dare 23, my friend Pete and I made a game called Escape in C++/OpenGL. The theme was ‘Tiny World’, which we chose to interpret as ‘the level shrinks as you play’. Inspired by the red weed from H.G. Wells’ War of The Worlds we decided the player is stranded on a planet, surrounded by alien goop which is slowly spreading towards them. Their task is to harvest resources strewn about the environment to fuel/repair their rocket and escape. Different types of resource are needed, and each type requires a different length of time to harvest, during which the player can’t move. Adding to the difficulty, the player can only carry three resources at once, after which they’ll need to drop off at the rocket before going out to harvest again. The player is also in a vehicle, controlled with the mouse – so it feels ever so slightly out of control. Pete wrote a great theme tune which lent the whole thing an unnerving, anxious feeling. The graphics would politely be called ‘lo-fi’, but overall I was rather pleased with it.

Fast-forward to last weekend, and I was investigating networking in UE4. It occurred to me Escape would be a good project to try out in Unreal, because I’d networked the original shortly after Ludum Dare and I remembered how long it had taken to find the balance between prediction and sending lots of data – it would be interesting to see how smooth that part would be in UE4.

Since then I’ve been recreating Escape in UE4, and in my network research I noticed a few people on the forums asking for complete networked demos so I thought it’d be useful to release the source. It’s not finished by any means, there’s only one type of resource right now and the logic detecting win and lose is decidedly crummy, but the networking is all there and you can play a complete game from start to finish in single and multiplayer. You can grab the project here. I wanted to do the whole thing in blueprints, but I also wanted to play online via Steam – and right now the blueprint session interface just isn’t up to the task, so I’m using Mordentral’s Advanced Sessions plugin. Which means it does need compiling, though all of the game code is still in blueprints.

I’ll be writing a bunch of posts about the project next week hopefully – the networking aspects, but also some of the effects were quite fun to reimagine so I’ll probably write about those too. And maybe a word or two about why the UE4 vehicle looks like Postman Pat’s van. For now though, it’s the #ue4jam!

Unity Visibility/Fog of War Effect

Here’s a method for ‘fog of war’ from an old Unity project. This is the kind used in games like XCOM where the world geometry outside the visible range of your units is darkened, but still at least partially visible.

I added a circle mesh (an extremely short cylinder will also work) to the player unit prefab, with a radius the same as the unit’s view radius. The circle is on its own layer (UnitViewDistance) and is textured with a radial gradient going from white to transparent black.


I added a camera (UnitViewCamera) with a depth of 0 (all other cameras are depth 1, so UnitViewCamera renders first). UnitViewCamera’s culling mask is set so that it only renders the UnitViewDistance layer, and it renders into a RenderTexture created in code. The result of all this is a transparent image with various white blobs, something like this:


The RenderTexture is then plugged into a shader and used as a mask. The intensity of the mask texel is used to lerp the colour of the final texel between full colour (at 1) and desaturated (at 0):

fixed4 frag (v2f i) : SV_Target
{
    fixed4 col = tex2D(_MainTex, i.uv);
    // Luminance-weighted greyscale version of the scene colour
    float bwCol = col.r*.3 + col.g*.59 + col.b*.11;

    fixed4 maskCol = tex2D(_viewTex, i.uv);
    float maskIntensity = maskCol.r;

    // Full colour where the mask is 1, desaturated where it's 0
    col = lerp(bwCol, col, maskIntensity);
    return col;
}

Here’s a screenshot of a programmer-art world demonstrating the (very subtle) effect – the unit is off-screen to the bottom-right, so you can see the cubes desaturating towards the top-left:


VR Molecular Visualisation

So this was a fun little demo I put together for a client: the first chain of insulin (apparently, I’m not a scientist!) loaded from a CML (Chemical Markup Language) file and displayed as a model in UE4. There are over 400 atoms and ~1200 bonds iirc, so I used one instanced static mesh per atom type and one for the bonds. Then I mounted the whole thing on an invisible sphere which is physically simulated and fixed in place – meaning I can then spin it like a globe using the Leap Motion.

Neat effect, and very quick to put together; the part that took the longest was reading the CML and figuring out where to put the bonds (I had no idea until Wikipedia told me about covalent radii!).
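For illustration, the bond-placement maths might look something like this in plain C++ (Vec3 standing in for FVector; this is a hypothetical sketch, not the project’s actual code): the bond instance sits at the midpoint of the two atoms, points along the atom-to-atom direction, and is scaled to the gap between the atoms’ surfaces using their covalent radii.

```cpp
#include <cmath>

// Hypothetical sketch: compute a bond mesh transform from two atom
// positions and their covalent radii. Vec3 stands in for FVector.
struct Vec3 { float X, Y, Z; };

struct BondTransform
{
    Vec3  Midpoint;
    Vec3  Direction;  // unit vector from atom A to atom B
    float Length;     // surface-to-surface, not centre-to-centre
};

BondTransform MakeBond(Vec3 A, float RadiusA, Vec3 B, float RadiusB)
{
    Vec3 D{ B.X - A.X, B.Y - A.Y, B.Z - A.Z };
    float CentreDist = std::sqrt(D.X * D.X + D.Y * D.Y + D.Z * D.Z);
    Vec3 Dir{ D.X / CentreDist, D.Y / CentreDist, D.Z / CentreDist };
    return {
        { (A.X + B.X) * 0.5f, (A.Y + B.Y) * 0.5f, (A.Z + B.Z) * 0.5f },
        Dir,
        CentreDist - RadiusA - RadiusB  // bond spans the gap only
    };
}
```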

UE4 Quick tip: ‘X-ray’ material

One of my prototypes needed an ability for the player to see certain objects through walls. I didn’t want the objects to be visible at all times, just when the player looked directly at them.

To prevent an object being obscured by the objects in front of it we just need to disable depth testing, which we can do in the object’s material. Set the material’s blend mode to Translucent and tick Disable Depth Test in the Translucency section of the material properties:


Apply the material to a mesh and you’ll be able to see it through whatever objects are in front of it. To make the effect localised I added a very simple bit of maths to the opacity node:


This calculates the distance of the current pixel from the centre of the screen and uses it to determine the opacity – so the pixels fade out as they leave a very narrow radius around the centre.
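The same maths can be sketched in plain C++ (the actual material uses nodes rather than code, and Radius and Fade here are hypothetical tuning values):

```cpp
#include <algorithm>
#include <cmath>

// Sketch of the opacity maths from the material graph. U,V is the
// pixel's screen position in [0,1]; Radius and Fade are hypothetical
// tuning values, not taken from the original material.
float XRayOpacity(float U, float V, float Radius = 0.05f, float Fade = 0.05f)
{
    float DX = U - 0.5f;
    float DY = V - 0.5f;
    float Dist = std::sqrt(DX * DX + DY * DY);
    // Fully opaque inside Radius, fading to invisible over Fade.
    return std::max(0.0f, std::min(1.0f, 1.0f - (Dist - Radius) / Fade));
}
```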

Here’s a video of the material applied to the Starter Content chair:

Neatly replacing NavMesh with A* in UE4


[ Update: Builtin A* Pathfinding in Unreal Engine 4.25 ]

I’ve been working on a tile-based game recently, and I wanted to use A* for pathfinding (NavMesh is overkill, and not a great fit). I could’ve just written an A* pathfinder and custom AI code that uses it, but I wondered if there might be a better way – ideally I’d like to just replace NavMesh with my pathfinder, and have the standard AIController/PathFollowingComponent code work with it seamlessly.

I came across MieszkoZ’s answer to this question, which got me started – turns out not only is it possible, it’s pretty simple (with a couple of caveats!).

A New Pathfinding Class

NavMesh is contained in a class called ARecastNavMesh, a subclass of ANavigationData. In theory all you need to do is create your own subclass and plug it into the engine. It’s worth looking at the source for ARecastNavMesh to see how it does things – the key function is FindPath:

static FPathFindingResult FindPath(const FNavAgentProperties& AgentProperties, const FPathFindingQuery& Query);

What’s a little odd is that FindPath isn’t virtual, it’s static. Comments in ANavigationData and ARecastNavMesh explain it’s for performance reasons: Epic are concerned that if a lot of agents call the pathfinder in the same frame the vtable lookups will be too slow, so instead the function is declared static and stored in a function pointer, ANavigationData::FindPathImplementation.

Another effect of FindPath being static is that it has no this pointer. Thankfully there is a weak pointer to this in Query.NavData, so you can Get() that and use it instead.
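The pattern is easy to illustrate in plain C++ (a toy sketch, not engine code): the pathfinder is a static function stored in a function pointer on the base class, so a call goes through one indirection with no vtable lookup, and the “this” object is recovered from the query instead of being passed implicitly.

```cpp
// Toy illustration of the static-function-pointer dispatch pattern.
struct NavData;

struct Query
{
    const NavData* Nav;  // stands in for Query.NavData
};

struct NavData
{
    using FindPathFn = int (*)(const Query&);
    FindPathFn FindPathImplementation = nullptr;  // assigned by subclasses
};

struct AStarNavData : NavData
{
    int Cost = 42;

    AStarNavData() { FindPathImplementation = &AStarNavData::FindPath; }

    // Static, so no "this": recover the instance from the query instead.
    static int FindPath(const Query& Q)
    {
        const AStarNavData* Self = static_cast<const AStarNavData*>(Q.Nav);
        return Self->Cost;
    }
};
```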

FindPath is expected to return an FPathFindingResult struct, which contains a success/failure enum and an FNavigationPath. FindPath’s Query parameter may contain an FNavigationPath for you to use (if Query.PathInstanceToFill is valid), or you’ll have to create a new one using ANavigationData::CreatePathInstance.

All of which is much easier to say in code! So here’s a template FindPath based on ARecastNavMesh::FindPath [updated for 4.12]:

FPathFindingResult AAStarNavigationData::FindPath(const FNavAgentProperties& AgentProperties, const FPathFindingQuery& Query)
{
	const ANavigationData* Self = Query.NavData.Get();
	AAStarNavigationData* AStar = const_cast<AAStarNavigationData*>(dynamic_cast<const AAStarNavigationData*>(Self));

	if (AStar == nullptr)
	{
		return ENavigationQueryResult::Error;
	}

	FPathFindingResult Result(ENavigationQueryResult::Error);
	Result.Path = Query.PathInstanceToFill.IsValid() ? Query.PathInstanceToFill : Self->CreatePathInstance(Query);

	FNavigationPath* NavPath = Result.Path.Get();

	if (NavPath != nullptr)
	{
		if ((Query.StartLocation - Query.EndLocation).IsNearlyZero())
		{
			Result.Result = ENavigationQueryResult::Success;
		}
		else if (Query.QueryFilter.IsValid())
		{
			// **** run your pathfinding algorithm from Query.StartLocation to Query.EndLocation here
			// add each point on the path with:
			// NavPath->GetPathPoints().Add(FNavPathPoint(WORLD_POSITION));
			// NOTE: the path must contain at least 2 non-start path points

			// if your algorithm can only find a partial path call NavPath->SetIsPartial(true),
			// but remember to check if partial paths are acceptable to the caller (Query.bAllowPartialPaths)
			// - if they aren't you should return ENavigationQueryResult::Fail

			Result.Result = ENavigationQueryResult::Success;
		}
	}

	return Result;
}
Connect your pathfinding algorithm, assign FindPath to FindPathImplementation in the constructor, and you’re done!

Plugging In

To plug your new pathfinder into the engine, you need to edit Config/DefaultEngine.ini. Find the section called:


And add the line:


In your map, create a NavMeshBoundsVolume as normal, and an instance of your pathfinder will be automatically added (instead of ARecastNavMesh-Default).

Now any AIController::MoveToLocation calls will automatically use your new pathfinder.

Other Changes for Tile-Based Games

There are three things about the default behaviour of an AI controller that don’t feel right to me in a tile-based game:

  1. Characters accelerate and brake as they run around, so they overshoot corners and bump into things
  2. Characters turn immediately in the direction of movement
  3. Characters get ‘close enough’ to their destination, and stop

So I also do the following:

  1. In the CharacterMovement component, set Requested Move Use Acceleration to false (or just lower the character’s max walk speed)
  2. In CharacterMovement, set Orient Rotation to Movement to true and Rotation Rate (Yaw) to 1440
  3. In calls to MoveToLocation, set the Acceptance Radius to 0 and Stop on Overlap to false

Which gives me characters neatly running around a tile-based world, not bumping into things, and stopping exactly where I want them. Marv!