I’ve clocked up over 60 hours on Just Cause 3, and most of it is just gliding about with the wingsuit. There’s something very compelling about that calm beauty interspersed with moments of complete panic as you very nearly smack into a tree/car/tower block.
I wanted to have a go at recreating that same feel, and I knew I’d need a nice terrain to fly over so I turned to the UE4 open world ‘Kite’ demo. I’ll upload the code to a repo just as soon as I have time to extricate it neatly from the (multiple gigabyte) project. In the meantime, here’s a soothing video:
As part of the International Festival for Business last week I helped out at Realspace’s VR hub in the Liverpool Science Centre. Realspace have recently set up shop, and amongst other things they provide a space for local businesses to experiment with VR. They knew I was working on a VR game and asked if I could demo it at the event – but the game is such an early prototype it’s still full of coder art, which I’m not keen to show to people! Instead I showed the very first thing I made after getting my DK2: I took Epic’s ‘Realistic Rendering’ environment and added some physics-based gaze mechanics. So you could walk around Epic’s beautiful apartment lounge, turning the lamps on and off and picking up and smashing objects ‘with your mind’. Ludicrous but oddly compelling – it was fun to see Serious Business People light up with joy when they realised they could smash the place up.
Last weekend I took part in my first 4-day #ue4jam, as part of the Sleepless Beanbags team. The theme was ‘fire in the hole’ so obviously that implies a multiplayer brawler about miners chucking dynamite at each other.
Anyway – this post is to say we’ve done a post-competition release which fixes many of the bugs, most importantly the names are now visible everywhere so you can actually see which character you are!
But beyond that, because it’s a multiplayer game it’s not much fun on your own! So I’ll be hosting a game at 20:00 (UK time!) tonight, and we’d love it if you’d join us. You’ll need to run Steam first, set your download region in settings to UK/Manchester, then run the game and join the match called ‘chrismcr’.
So grab the post-jam game here and we’ll see you at 20:00!
Just over four years ago, for Ludum Dare 23, my friend Pete and I made a game called Escape in C++/OpenGL. The theme was ‘Tiny World’, which we chose to interpret as ‘the level shrinks as you play’. Inspired by the red weed from H.G. Wells’ The War of the Worlds, we decided the player is stranded on a planet, surrounded by alien goop which is slowly spreading towards them. Their task is to harvest resources strewn about the environment to fuel/repair their rocket and escape. Different types of resource are needed, and each type requires a different length of time to harvest, during which the player can’t move. Adding to the difficulty, the player can only carry three resources at once, after which they’ll need to drop them off at the rocket before going out to harvest again. The player is also in a vehicle, controlled with the mouse – so it feels ever so slightly out of control. Pete wrote a great theme tune which lent the whole thing an unnerving, anxious feeling. The graphics would politely be called ‘lo-fi’, but overall I was rather pleased with it.
Fast-forward to last weekend, and I was investigating networking in UE4. It occurred to me Escape would be a good project to try out in Unreal, because I’d networked the original shortly after Ludum Dare and I remembered how long it had taken to find the balance between prediction and sending lots of data – it would be interesting to see how smooth that part would be in UE4.
Since then I’ve been recreating Escape in UE4, and in my network research I noticed a few people on the forums asking for complete networked demos so I thought it’d be useful to release the source. It’s not finished by any means, there’s only one type of resource right now and the logic detecting win and lose is decidedly crummy, but the networking is all there and you can play a complete game from start to finish in single and multiplayer. You can grab the project here. I wanted to do the whole thing in blueprints, but I also wanted to play online via Steam – and right now the blueprint session interface just isn’t up to the task, so I’m using Mordentral’s Advanced Sessions plugin. Which means it does need compiling, though all of the game code is still in blueprints.
I’ll be writing a bunch of posts about the project next week hopefully – the networking aspects, but also some of the effects were quite fun to reimagine so I’ll probably write about those too. And maybe a word or two about why the UE4 vehicle looks like Postman Pat’s van. For now though, it’s the #ue4jam!
Here’s a method for ‘fog of war’ from an old Unity project. This is the kind used in games like XCOM where the world geometry outside the visible range of your units is darkened, but still at least partially visible.
I added a circle mesh (an extremely short cylinder will also work) to the player unit prefab with a radius the same as the unit’s view radius. The circle is on its own layer (UnitViewDistance), and is textured with a radial gradient going from white to transparent black.
I added a camera (UnitViewCamera) with a depth of 0 (all other cameras are depth 1, so UnitViewCamera renders first). UnitViewCamera’s culling mask is set so that it only renders the UnitViewDistance layer, and it renders into a RenderTexture created in code. The result of all this is a transparent image with various white blobs, something like this:
The RenderTexture is then plugged into a shader and used as a mask. The intensity of the mask texel is used to lerp the colour of the final texel between full colour (at 1) and desaturated (at 0):
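The per-texel operation the shader performs can be sketched in plain C++ (struct and function names are mine, purely illustrative): desaturate by collapsing the colour to its luminance, then lerp back towards full colour using the mask value.

```cpp
#include <cassert>
#include <cmath>

struct Color { float r, g, b; };

// Lerp each channel between the pixel's luminance (mask = 0, fully
// desaturated) and the original colour (mask = 1, full colour),
// mirroring the shader's lerp driven by the RenderTexture mask.
Color ApplyFogOfWar(Color c, float mask)
{
    // standard Rec.601 luma weights for the greyscale value
    float grey = 0.299f * c.r + 0.587f * c.g + 0.114f * c.b;
    return {
        grey + (c.r - grey) * mask,
        grey + (c.g - grey) * mask,
        grey + (c.b - grey) * mask
    };
}
```

With mask 1 (inside a white blob) the colour passes through untouched; with mask 0 (outside all view circles) every channel collapses to the same grey.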
So this was a fun little demo I put together for a client: the first chain of insulin (apparently, I’m not a scientist!) loaded from a CML (Chemical Markup Language) file and displayed as a model in UE4. There are over 400 atoms and ~1200 bonds iirc, so I used one instanced static mesh per atom type and one for the bonds. Then I mounted the whole thing on an invisible sphere which is physically simulated and fixed in place – meaning I can then spin it like a globe using the Leap Motion.
Neat effect, and very quick to put together; the part that took the longest was reading the CML and figuring out where to put the bonds (I had no idea until wikipedia told me about covalent radii!).
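For anyone curious, the bond-detection logic boils down to a simple distance test: two atoms are treated as bonded when their separation is at most the sum of their covalent radii (plus a small tolerance), and the bond mesh instance sits at the midpoint between them. A rough sketch, with hypothetical names and an assumed 0.4 Å tolerance:

```cpp
#include <cassert>
#include <cmath>

struct Vec3 { float x, y, z; };

float Distance(const Vec3& a, const Vec3& b)
{
    float dx = b.x - a.x, dy = b.y - a.y, dz = b.z - a.z;
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}

// Two atoms are considered covalently bonded when their separation is
// no more than the sum of their covalent radii plus a tolerance
// (0.4 here is an assumption, not a value from the original project).
bool AreBonded(const Vec3& a, float radiusA,
               const Vec3& b, float radiusB,
               float tolerance = 0.4f)
{
    return Distance(a, b) <= radiusA + radiusB + tolerance;
}

// The bond's instanced mesh is placed at the midpoint, then scaled to
// the atom separation and oriented along the axis between the atoms.
Vec3 BondMidpoint(const Vec3& a, const Vec3& b)
{
    return { (a.x + b.x) * 0.5f, (a.y + b.y) * 0.5f, (a.z + b.z) * 0.5f };
}
```

For example, two carbon atoms (covalent radius ~0.76 Å) at a typical C–C separation of 1.54 Å pass the test, while the same pair 3 Å apart do not.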
One of my prototypes needed an ability for the player to see certain objects through walls. I didn’t want the objects to be visible at all times, just when the player looked directly at them.
To prevent an object being obscured by the objects in front of it we just need to disable depth testing, which we can do in the object’s material. Set the material’s blend mode to Translucent and tick Disable Depth Test in the Translucency section of the material properties:
Apply the material to a mesh and you’ll be able to see it through whatever objects are in front of it. To make the effect localised I added a very simple bit of maths to the opacity node:
This calculates the distance of the current pixel from the centre of the screen and uses it to determine the opacity – so the pixels fade out as they leave a very narrow radius around the centre.
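In plain code, the maths in the opacity node amounts to something like the following sketch (the radii here are illustrative placeholders, not the values from my material graph): pixels within an inner radius of screen centre are fully opaque, and opacity falls off linearly to zero at an outer radius.

```cpp
#include <algorithm>
#include <cassert>
#include <cmath>

// Opacity as a function of a pixel's distance from screen centre, in
// UV space where the centre is (0.5, 0.5). Fully opaque inside
// innerRadius, fading linearly to fully transparent at outerRadius.
float GazeOpacity(float u, float v,
                  float innerRadius = 0.05f, float outerRadius = 0.15f)
{
    float du = u - 0.5f, dv = v - 0.5f;
    float dist = std::sqrt(du * du + dv * dv);
    // normalise distance into [0, 1] across the falloff band and clamp
    float t = (dist - innerRadius) / (outerRadius - innerRadius);
    t = std::max(0.0f, std::min(1.0f, t));
    return 1.0f - t;
}
```

The material graph expresses exactly this chain - subtract, length, divide, clamp, one-minus - just as nodes rather than code.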
Here’s a video of the material applied to the Starter Content chair:
I’ve been working on a tile-based game recently, and I wanted to use A* for pathfinding (NavMesh is overkill, and not a great fit). I could’ve just written an A* pathfinder and custom AI code that uses it, but I wondered if there might be a better way – ideally I’d like to just replace NavMesh with my pathfinder, and have the standard AIController/PathFollowingComponent code work with it seamlessly.
I came across MieszkoZ’s answer to this question, which got me started – turns out not only is it possible, it’s pretty simple (with a couple of caveats!).
A New Pathfinding Class
NavMesh is contained in a class called ARecastNavMesh, a subclass of ANavigationData. In theory all you need to do is create your own subclass and plug it into the engine. It’s worth looking at the source for ARecastNavMesh to see how it does things – the key function is FindPath:
What’s a little odd is that FindPath isn’t virtual, it’s static. Comments in ANavigationData and ARecastNavMesh explain it’s for performance reasons: Epic are concerned that if a lot of agents call the pathfinder in the same frame the vtable lookups will be too slow, so instead the function is declared static and stored in a function pointer, ANavigationData::FindPathImplementation.
Another effect of FindPath being static is that it has no this pointer. Thankfully there is a weak pointer to this in Query.NavData, so you can Get() that and use it instead.
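Stripped of the engine types, the pattern looks something like this sketch (class names are mine, not Epic's): the base class holds a plain function pointer, each subclass assigns its own static function to it in the constructor, and the instance is passed back in through a parameter rather than an implicit `this`.

```cpp
#include <cassert>

// Base class stores a static function in a function pointer instead of
// declaring a virtual, avoiding a vtable lookup per pathfinding query -
// the same trick as ANavigationData::FindPathImplementation.
struct NavDataBase
{
    using FindPathFn = int (*)(const NavDataBase* self, int from, int to);
    FindPathFn FindPathImplementation = nullptr;
};

struct MyNavData : NavDataBase
{
    MyNavData() { FindPathImplementation = &MyNavData::FindPath; }

    // Static, so there is no 'this'; the instance comes back through
    // 'self', mirroring how UE4 recovers it from Query.NavData.
    static int FindPath(const NavDataBase* self, int from, int to)
    {
        return to - from; // trivial stand-in for real pathfinding
    }
};
```

Callers invoke `nav.FindPathImplementation(&nav, from, to)` and get whichever implementation the concrete subclass installed.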
FindPath is expected to return an FPathFindingResult struct, which contains a success/failure enum and an FNavigationPath. FindPath‘s Query parameter may contain an FNavigationPath for you to use (if Query.PathInstanceToFill is valid) or you’ll have to create a new one using ANavigationData::CreatePathInstance.
All of which is much easier to say in code! So here’s a template FindPath based on ARecastNavMesh::FindPath [updated for 4.12]:
FPathFindingResult AAStarNavigationData::FindPath(const FNavAgentProperties& AgentProperties, const FPathFindingQuery& Query)
{
	// FindPath is static: recover 'this' through the weak pointer in Query.NavData
	const ANavigationData* Self = Query.NavData.Get();
	const AAStarNavigationData* AStar = dynamic_cast<const AAStarNavigationData*>(Self);
	check(AStar != nullptr);
	if (AStar == nullptr)
	{
		return FPathFindingResult(ENavigationQueryResult::Error);
	}

	FPathFindingResult Result(ENavigationQueryResult::Error);
	Result.Path = Query.PathInstanceToFill.IsValid() ? Query.PathInstanceToFill : Self->CreatePathInstance<FNavigationPath>(Query);

	FNavigationPath* NavPath = Result.Path.Get();
	if (NavPath != nullptr)
	{
		if ((Query.StartLocation - Query.EndLocation).IsNearlyZero())
		{
			Result.Result = ENavigationQueryResult::Success;
		}
		else
		{
			// **** run your pathfinding algorithm from Query.StartLocation to Query.EndLocation here
			// add each point on the path with: NavPath->GetPathPoints().Add(FNavPathPoint(WorldLocation));
			// NOTE: the path must contain at least 2 non-start path points
			// if your algorithm can only find a partial path call NavPath->SetIsPartial(true),
			// but remember to check if partial paths are acceptable to the caller (Query.bAllowPartialPaths)
			// - if they aren't you should return ENavigationQueryResult::Fail
			Result.Result = ENavigationQueryResult::Success;
		}
	}
	return Result;
}
Connect your pathfinding algorithm, assign FindPath to FindPathImplementation in the constructor, and you’re done!
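For the algorithm itself, here’s a minimal standalone A* over a 4-connected grid – the sort of thing that would run inside the commented section of FindPath above. It’s illustrative only: a real implementation would translate between world-space FVectors and tile coordinates, and this one just works on cell indices.

```cpp
#include <cassert>
#include <climits>
#include <cstdlib>
#include <queue>
#include <utility>
#include <vector>

// Minimal A* over a width x height grid of 0 (walkable) / 1 (blocked)
// cells. Returns the path as a list of cell indices from start to
// goal, or an empty vector if the goal is unreachable.
std::vector<int> AStar(const std::vector<int>& grid, int width, int height,
                       int start, int goal)
{
    // Manhattan distance heuristic - admissible on a 4-connected grid
    auto heuristic = [&](int a) {
        return std::abs(a % width - goal % width)
             + std::abs(a / width - goal / width);
    };
    const int n = width * height;
    std::vector<int> cameFrom(n, -1), gScore(n, INT_MAX);
    using Node = std::pair<int, int>; // (f = g + h, cell index)
    std::priority_queue<Node, std::vector<Node>, std::greater<Node>> open;
    gScore[start] = 0;
    open.push({heuristic(start), start});
    while (!open.empty())
    {
        int cell = open.top().second;
        open.pop();
        if (cell == goal) break;
        int x = cell % width, y = cell / width;
        const int dx[] = {1, -1, 0, 0}, dy[] = {0, 0, 1, -1};
        for (int i = 0; i < 4; ++i)
        {
            int nx = x + dx[i], ny = y + dy[i];
            if (nx < 0 || nx >= width || ny < 0 || ny >= height) continue;
            int next = ny * width + nx;
            if (grid[next] == 1) continue;
            if (gScore[cell] + 1 < gScore[next])
            {
                gScore[next] = gScore[cell] + 1;
                cameFrom[next] = cell;
                open.push({gScore[next] + heuristic(next), next});
            }
        }
    }
    std::vector<int> path;
    if (gScore[goal] == INT_MAX) return path; // unreachable
    for (int c = goal; c != -1; c = cameFrom[c])
        path.insert(path.begin(), c);
    return path;
}
```

Each cell index the search produces would become one FNavPathPoint in the template above.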
To plug your new pathfinder into the engine, you need to edit Config/DefaultEngine.ini. Find the section called: