Just Cause-style wingsuit/grapple/gliding

I’ve clocked up over 60 hours on Just Cause 3, and most of it is just gliding about with the wingsuit. There’s something very compelling about that calm beauty interspersed with moments of complete panic as you very nearly smack into a tree/car/tower block.

I wanted to have a go at recreating that same feel, and I knew I’d need a nice terrain to fly over, so I turned to the UE4 open-world ‘Kite’ demo. I’ll upload the code to a repo just as soon as I have time to extricate it neatly from the (multiple-gigabyte) project. In the meantime, here’s a soothing video:


VR Hub / Apartment Trashing Simulator

As part of the International Festival for Business last week I helped out at Realspace’s VR hub in the Liverpool Science Centre. Realspace have recently set up shop, and amongst other things they provide a space for local businesses to experiment with VR. They knew I was working on a VR game and asked if I could demo it at the event – but the game is such an early prototype that it’s still full of coder art, which I’m not keen to show to people! Instead I showed the very first thing I made after getting my DK2: I took Epic’s ‘Realistic Rendering’ environment and added some physics-based gaze mechanics. So you could walk around Epic’s beautiful apartment lounge, turn the lamps on and off, and pick up and smash objects ‘with your mind’. Ludicrous but oddly compelling – it was fun to see Serious Business People light up with joy when they realised they could smash the place up.

For The Loot!

Match details: join us tonight at 20:00 BST (GMT+1). Steam has to be running, and you’ll need to change your download region to UK/Manchester. I’ll be hosting as ‘chrismcr’.

Download the post-jam game here!

Last weekend I took part in my first 4-day #ue4jam, as part of the Sleepless Beanbags team. The theme was ‘fire in the hole’, so obviously that implies a multiplayer brawler about miners chucking dynamite at each other.

Because we could only test 3-person multiplayer, it was nerve-wracking to watch Allar set up a 10-person game on his livestream… but it just about held up!

Anyway – this post is to say we’ve done a post-competition release which fixes many of the bugs. Most importantly, player names are now visible everywhere, so you can actually see which character you are!

Beyond the bug fixes, though, it’s a multiplayer game, so it’s not much fun on your own! I’ll be hosting a game at 20:00 (UK time!) tonight, and we’d love it if you’d join us. You’ll need to run Steam first, set your download region in Settings to UK/Manchester, then run the game and join the match called ‘chrismcr’.

So grab the post-jam game here and we’ll see you at 20:00!

Escape in UE4

Just over four years ago, for Ludum Dare 23, my friend Pete and I made a game called Escape in C++/OpenGL. The theme was ‘Tiny World’, which we chose to interpret as ‘the level shrinks as you play’. Inspired by the red weed from H.G. Wells’ The War of the Worlds, we decided the player is stranded on a planet, surrounded by alien goop which is slowly spreading towards them. Their task is to harvest resources strewn about the environment to fuel/repair their rocket and escape.

Different types of resource are needed, and each type requires a different length of time to harvest, during which the player can’t move. Adding to the difficulty, the player can only carry three resources at once, after which they’ll need to drop them off at the rocket before going out to harvest again. The player is also in a vehicle, controlled with the mouse – so it feels ever so slightly out of control. Pete wrote a great theme tune which lent the whole thing an unnerving, anxious feeling. The graphics would politely be called ‘lo-fi’, but overall I was rather pleased with it.

Fast-forward to last weekend, and I was investigating networking in UE4. It occurred to me Escape would be a good project to try out in Unreal, because I’d networked the original shortly after Ludum Dare and I remembered how long it had taken to find the balance between prediction and sending lots of data – it would be interesting to see how smooth that part would be in UE4.

Since then I’ve been recreating Escape in UE4, and in my network research I noticed a few people on the forums asking for complete networked demos, so I thought it’d be useful to release the source. It’s not finished by any means – there’s only one type of resource right now, and the win/lose detection logic is decidedly crummy – but the networking is all there and you can play a complete game from start to finish in single and multiplayer. You can grab the project here. I wanted to do the whole thing in blueprints, but I also wanted to play online via Steam – and right now the blueprint session interface just isn’t up to the task, so I’m using Mordentral’s Advanced Sessions plugin. That means the project does need compiling, though all of the game code is still in blueprints.

Hopefully I’ll be writing a bunch of posts about the project next week – the networking aspects, but also some of the effects, which were quite fun to reimagine. And maybe a word or two about why the UE4 vehicle looks like Postman Pat’s van. For now though, it’s the #ue4jam!

VR Molecular Visualisation

So this was a fun little demo I put together for a client: the first chain of insulin (apparently, I’m not a scientist!) loaded from a CML (Chemical Markup Language) file and displayed as a model in UE4. There are over 400 atoms and ~1200 bonds iirc, so I used one instanced static mesh component per atom type and another for the bonds. Then I mounted the whole thing on an invisible sphere which is physically simulated and fixed in place – meaning I can then spin it like a globe using the Leap Motion.

Neat effect, and very quick to put together; the part that took the longest was reading the CML and figuring out where to put the bonds (I had no idea until Wikipedia told me about covalent radii!).
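The atom half of that is only a handful of lines once you have instanced mesh components. Here’s a rough sketch of the idea (illustrative names only – FParsedAtom and MeshPerElement aren’t from the client project):

#include "Components/InstancedStaticMeshComponent.h"

// Sketch: one UInstancedStaticMeshComponent per element type,
// so 400+ atoms cost only a handful of draw calls.
struct FParsedAtom
{
	FName Element;      // e.g. "C", "N", "O", "S"
	FVector Position;   // atom centre, read from the CML file
};

void AddAtomInstances(const TArray<FParsedAtom>& Atoms, const TMap<FName, UInstancedStaticMeshComponent*>& MeshPerElement)
{
	for (const FParsedAtom& Atom : Atoms)
	{
		// Look up the instanced mesh component for this element and drop an instance at the atom's position
		UInstancedStaticMeshComponent* Instances = MeshPerElement.FindChecked(Atom.Element);
		Instances->AddInstance(FTransform(Atom.Position));
	}
}

The bonds work the same way – one more instanced component, with each instance transformed to sit between the pair of atoms it connects.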

UE4 Quick tip: ‘X-ray’ material

One of my prototypes needed an ability for the player to see certain objects through walls. I didn’t want the objects to be visible at all times, just when the player looked directly at them.

To prevent an object being obscured by the objects in front of it we just need to disable depth testing, which we can do in the object’s material. Set the material’s blend mode to Translucent and tick Disable Depth Test in the Translucency section of the material properties:

(Screenshot: the Translucency section of the material properties, with Disable Depth Test ticked)

Apply the material to a mesh and you’ll be able to see it through whatever objects are in front of it. To make the effect localised I added a very simple bit of maths to the opacity node:

(Screenshot: the opacity node graph – distance from the screen centre driving opacity)

This calculates the distance of the current pixel from the centre of the screen and uses it to determine the opacity – so the pixels fade out as they leave a very narrow radius around the centre.
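If you’d rather type than wire up nodes, roughly the same maths fits in a Custom node as a couple of lines of HLSL. This is only a sketch of the graph above – feed it a ScreenPosition node as ScreenUV, and the 0.15 radius is just an example value:

// Custom node body (output type: Float)
// Inputs: float2 ScreenUV (from a ScreenPosition node), float Radius (e.g. 0.15)
// Fades opacity to zero outside a narrow circle around the screen centre.
float Dist = distance(ScreenUV, float2(0.5f, 0.5f));
return saturate(1.0f - Dist / Radius);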

Here’s a video of the material applied to the Starter Content chair:

Neatly replacing NavMesh with A* in UE4

(Screenshot: the tile-based world prototype)

I’ve been working on a tile-based game recently, and I wanted to use A* for pathfinding (NavMesh is overkill, and not a great fit). I could’ve just written an A* pathfinder and custom AI code that uses it, but I wondered if there might be a better way – ideally I’d like to just replace NavMesh with my pathfinder, and have the standard AIController/PathFollowingComponent code work with it seamlessly.

I came across MieszkoZ’s answer to this question, which got me started – turns out not only is it possible, it’s pretty simple (with a couple of caveats!).

A New Pathfinding Class

NavMesh is contained in a class called ARecastNavMesh, a subclass of ANavigationData. In theory all you need to do is create your own subclass and plug it into the engine. It’s worth looking at the source for ARecastNavMesh to see how it does things – the key function is FindPath:

static FPathFindingResult FindPath(const FNavAgentProperties& AgentProperties, const FPathFindingQuery& Query);

What’s a little odd is that FindPath isn’t virtual, it’s static. Comments in ANavigationData and ARecastNavMesh explain it’s for performance reasons: Epic are concerned that if a lot of agents call the pathfinder in the same frame the vtable lookups will be too slow, so instead the function is declared static and stored in a function pointer, ANavigationData::FindPathImplementation.

Another effect of FindPath being static is that it has no this pointer. Thankfully there is a weak pointer to this in Query.NavData, so you can Get() that and use it instead.

FindPath is expected to return an FPathFindingResult struct, which contains a success/failure enum and an FNavigationPath. FindPath‘s Query parameter may contain an FNavigationPath for you to use (if Query.PathInstanceToFill is valid) or you’ll have to create a new one using ANavigationData::CreatePathInstance.

All of which is much easier to say in code! So here’s a template FindPath based on ARecastNavMesh::FindPath [updated for 4.12]:

FPathFindingResult AAStarNavigationData::FindPath(const FNavAgentProperties& AgentProperties, const FPathFindingQuery& Query)
{
	const ANavigationData* Self = Query.NavData.Get();
	AAStarNavigationData* AStar = const_cast<AAStarNavigationData*>(dynamic_cast<const AAStarNavigationData*>(Self));
	check(AStar != nullptr);

	if (AStar == nullptr)
	{
		return ENavigationQueryResult::Error;
	}

	FPathFindingResult Result(ENavigationQueryResult::Error);
	Result.Path = Query.PathInstanceToFill.IsValid() ? Query.PathInstanceToFill : Self->CreatePathInstance<FNavigationPath>(Query);

	FNavigationPath* NavPath = Result.Path.Get();

	if (NavPath != nullptr)
	{
		if ((Query.StartLocation - Query.EndLocation).IsNearlyZero())
		{
			Result.Path->GetPathPoints().Reset();
			Result.Path->GetPathPoints().Add(FNavPathPoint(Query.EndLocation));
			Result.Result = ENavigationQueryResult::Success;
		}
		else if(Query.QueryFilter.IsValid())
		{
			// **** run your pathfinding algorithm from Query.StartLocation to Query.EndLocation here
			// add each point on the path with:
			// NavPath->GetPathPoints().Add(FNavPathPoint(WORLD_POSITION));
			// NOTE: the path must contain at least 2 non-start path points

			// if your algorithm can only find a partial path call NavPath->SetIsPartial(true),
			// but remember to check if partial paths are acceptable to the caller (Query.bAllowPartialPaths)
			// - if they aren't you should return ENavigationQueryResult::Fail

			NavPath->MarkReady();
			Result.Result = ENavigationQueryResult::Success;
		}
	}

	return Result;
}

Connect your pathfinding algorithm, assign FindPath to FindPathImplementation in the constructor, and you’re done!
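In case it helps, here’s roughly what the header might look like – the class name matches the snippet above, the rest is standard boilerplate, and bear in mind the include path for ANavigationData differs between engine versions:

// AStarNavigationData.h – minimal sketch
#pragma once

#include "AI/Navigation/NavigationData.h" // "NavigationData.h" in the NavigationSystem module on newer engines
#include "AStarNavigationData.generated.h"

UCLASS()
class AAStarNavigationData : public ANavigationData
{
	GENERATED_BODY()

public:
	AAStarNavigationData(const FObjectInitializer& ObjectInitializer)
		: Super(ObjectInitializer)
	{
		// Point the engine at our static pathfinder (see FindPath above)
		FindPathImplementation = FindPath;
	}

	static FPathFindingResult FindPath(const FNavAgentProperties& AgentProperties, const FPathFindingQuery& Query);
};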

Plugging In

To plug your new pathfinder into the engine, you need to edit Config/DefaultEngine.ini. Find the section called:

[/Script/Engine.NavigationSystem]

And add the line:

RequiredNavigationDataClassNames=/Script/ProjectName.NavigationClass
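Note that the /Script/ path uses the reflected names: your game module’s name, then the class name without its A prefix. So for the AAStarNavigationData class above, in a (hypothetical) module called TileGame, the line would be:

RequiredNavigationDataClassNames=/Script/TileGame.AStarNavigationData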

In your map, create a NavMeshBoundsVolume as normal, and an instance of your pathfinder will be added automatically (instead of the usual RecastNavMesh-Default).

Now any AIController::MoveToLocation calls will automatically use your new pathfinder.

Other Changes for Tile-Based Games

There are three things about the default behaviour of an AI controller that don’t feel right to me in a tile-based game:

  1. Characters accelerate and brake as they run around, so they overshoot corners and bump into things
  2. Characters snap instantly to face the direction of movement
  3. Characters get ‘close enough’ to their destination, and stop

So I also do the following:

  1. In the CharacterMovement component, set Requested Move Use Acceleration to false (or just lower the character’s max walk speed)
  2. In CharacterMovement, set Orient Rotation to Movement to true and Rotation Rate (Yaw) to 1440
  3. In calls to MoveToLocation, set the Acceptance Radius to 0 and Stop on Overlap to false

Which gives me characters neatly running around a tile-based world, not bumping into things, and stopping exactly where I want them. Marv!
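If you’d rather make those three changes in code than in the editor, a rough C++ equivalent looks something like this (a sketch only – the character, controller and destination are hypothetical parameters, not lifted from my project):

#include "GameFramework/Character.h"
#include "GameFramework/CharacterMovementComponent.h"
#include "AIController.h"

// Sketch: apply the three tile-friendly tweaks above, then issue the move.
static void MoveTileCharacter(ACharacter* Character, AAIController* AIController, const FVector& Destination)
{
	UCharacterMovementComponent* Movement = Character->GetCharacterMovement();

	// 1. Don't accelerate/brake towards the requested path velocity
	Movement->bRequestedMoveUseAcceleration = false;

	// 2. Turn quickly (but smoothly) towards the movement direction instead of snapping
	//    (depending on your setup you may also want bUseControllerRotationYaw = false on the character)
	Movement->bOrientRotationToMovement = true;
	Movement->RotationRate = FRotator(0.f, 1440.f, 0.f);

	// 3. Stop exactly on the destination, rather than 'close enough'
	AIController->MoveToLocation(Destination, /*AcceptanceRadius*/ 0.f, /*bStopOnOverlap*/ false);
}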