Skeletons and sharks, two distinct AI systems in Sea of Thieves with their own interesting design secrets.
AI and Games is a crowdfunded series about research and applications of artificial intelligence in video games. If you like my work please consider supporting the show over on Patreon for early-access and behind-the-scenes updates.
'The AI of Sea of Thieves' is released in association with the UKIE's '30 Years of Play' programme: celebrating the past, present and future of the UK interactive entertainment industry. Visit their website for links to interviews, videos, podcasts and events.
In part 1 of my series looking at Rare’s Sea of Thieves, I explored the range of AI systems at play, how missions are generated for players at each of the three quest givers and how all of this is subsequently managed at server level. Having been invited to Rare’s offices, there was so much to talk about, and in this entry we’re going to hear first-hand from the developers themselves about the AI in the game at launch. First we’ll explore the pigs, snakes and skeletons roaming the treasure-laden islands: how they work and the surprising secret that powers the skeleton AI behaviour. Plus we dig deep into the completely distinct navigation system built into Unreal Engine 4 by Rare that allows for navigation in open waters, and just how difficult it is to stop AI sharks from swimming onto land.
So let’s begin by examining the land-based creatures. As explained in part 1, one of the main mission types in Sea of Thieves is the Order of Souls, where you must visit one or more specific islands in the world to kill high-ranking skeletons and sell their skulls for treasure. This requires skeletons to spawn in the world when necessary, but they can also just appear throughout your time on a given island if you’re in the midst of retrieving items for a Gold Hoarders or Merchant Alliance quest.
So how do they work? Well, they’re reliant on a commonly used AI paradigm called behaviour trees, which is the default AI tool built into Unreal Engine 4. As explained in my recent AI 101 episode on the topic, behaviour trees allow for branching of logic so that in certain situations, the AI will make one or more decisions that reflect the scenario. Plus they can react to changes in the world quickly and update their chosen behaviour to suit. Now as mentioned in part 1, many of the land-based AI characters such as the skeletons and the animals on the islands – which I’ll come to in a minute – are all using the original built-in AI toolchain. But there’s something special going on in the skeletons that I wanted to talk about, something unique that during my time working on AI and Games, I’ve simply never come across before.
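To make that branching concrete, here’s a minimal sketch in plain C++ of how a selector and sequence might be composed for a skeleton. This is purely illustrative – it is not Rare’s code or Unreal’s actual behaviour tree API, and the leaf tasks are hypothetical stubs:

```cpp
#include <functional>
#include <vector>

// A minimal behaviour tree sketch (not Unreal's UBehaviorTree API): each
// node is ticked and reports whether it succeeded or failed.
enum class Status { Success, Failure };
using Node = std::function<Status()>;

// Selector: try children in order, succeed on the first that succeeds.
Node Selector(std::vector<Node> children) {
    return [children] {
        for (const auto& child : children)
            if (child() == Status::Success) return Status::Success;
        return Status::Failure;
    };
}

// Sequence: run children in order, fail on the first that fails.
Node Sequence(std::vector<Node> children) {
    return [children] {
        for (const auto& child : children)
            if (child() == Status::Failure) return Status::Failure;
        return Status::Success;
    };
}

// Hypothetical leaf tasks for a skeleton - stubbed out for illustration.
Status IsHurt()       { return Status::Failure; }
Status EatBanana()    { return Status::Success; }
Status CanSeePlayer() { return Status::Success; }
Status AttackPlayer() { return Status::Success; }
Status Wander()       { return Status::Success; }

int main() {
    Node skeleton = Selector({
        Sequence({IsHurt, EatBanana}),          // heal if injured
        Sequence({CanSeePlayer, AttackPlayer}), // otherwise fight
        Wander                                  // otherwise amble about
    });
    skeleton(); // ticked every AI update on the server
}
```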
Y’see, when building an AI character in a game, while you might want to ensure they use some of the same mechanics and features as human players – especially if they’re humanoid – you’re often thinking about the behaviour you want the character to execute in a completely distinct way from how you would as a player. You’ll have the logic that dictates when a certain action or behaviour is going to be executed, and in Unreal you’ll write specific tasks in the behaviour tree in Blueprint that handle the execution at a minute level, often calling existing functions in the code that players may call to do a similar thing. Say, for example, in Sea of Thieves you want to heal yourself after being injured: you would open the inventory, grab a banana, then hit the right trigger or left mouse button to eat it, which triggers the Heal() function in the codebase for human players. Typically, if you want an AI character – such as a skeleton – to do the same thing, the logic would be to simply run either the same Heal() function or a similar one for that non-player character and ensure the appropriate banana-chomping animation is used so players understand what is happening. Ultimately, it looks like it’s doing the same thing, but under the hood they’re completely distinct.
So imagine my surprise when – having sat down with developers Rob Massella and Sarah Noonan – I learned that the skeletons are mimicking player input. So instead of simply triggering specific code behaviours, they’re pressing virtual equivalents of the controller/keyboard inputs and effectively ‘playing’ the game like humans are. Plus, the skeletons use the same base controller (or in UE4 terms, the same actor) as a human player, meaning they not only share some of the player’s animations but also the input interface. So returning to the banana example, for a skeleton to heal itself, it’s actually pressing virtual buttons that enable it to grab a banana from its inventory and subsequently eat it.
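To illustrate the pattern – and this is a hypothetical sketch of the idea, not Rare’s controller code – a skeleton’s heal task wouldn’t call Heal() directly; it would push the same sequence of inputs into the shared actor that a player’s controller would:

```cpp
#include <iostream>
#include <string>

// Hypothetical shared input interface: the player's device and the
// skeleton AI both drive the same actor through button presses.
struct InputInterface {
    virtual void Press(const std::string& button) = 0;
    virtual ~InputInterface() = default;
};

// The shared actor reacts to inputs rather than to direct AI calls.
struct PirateActor : InputInterface {
    void Press(const std::string& button) override {
        if (button == "OpenInventory")     std::cout << "Inventory opened\n";
        else if (button == "SelectBanana") std::cout << "Banana equipped\n";
        else if (button == "UseItem")      std::cout << "Chomp: Heal() fires, eat animation plays\n";
    }
};

// A behaviour tree heal task: rather than calling Heal() on the actor,
// the skeleton 'presses' the same sequence of buttons a human would.
void SkeletonHealTask(InputInterface& skeleton) {
    skeleton.Press("OpenInventory");
    skeleton.Press("SelectBanana");
    skeleton.Press("UseItem");
}

int main() {
    PirateActor skeleton;
    SkeletonHealTask(skeleton); // the AI can only do what a player could
}
```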
Though it’s worth mentioning that movement on the ground isn’t using virtual representations of the sticks; the skeletons just use the navigation meshes baked onto the islands to walk around. What’s amazing about this is that by doing the extra legwork to parse a given interaction or behaviour for the AI into the appropriate player inputs, it ensures that skeletons can only execute actions if a player can do them as well. This kinda makes sense, given they’re… well… undead humans, but more importantly it helps streamline testing of the skeleton AI, since if you can see them doing something that a player cannot, then you know something’s gone wrong. But also, in theory it means that if new gameplay mechanics are added for the player then – once a bit of extra coding has been completed – the skeletons will be able to do it as well!
Given this can take a bit of getting used to, Andy Bastable explained to me that the gameplay team had a little ‘assignment’ that they would give to new developers to help them get to grips with the toolchain. Developers are tasked with creating a ‘Mariachi band’, whereby a group of skeletons must come together on a piece of land, pull out their instruments and start playing a song together.
Now all of the AI behaviours are managed server-side – much like what we saw in my case study on Tom Clancy’s The Division – given it ensures players on each device have the same experience as they interact with them. But there’s still the issue of balance, which as I mentioned in part 1 is addressed by having systems in place that make sure skeletons scale in difficulty in accordance with the experience. Not only can the behaviours and base gameplay parameters such as hit points and available weapons change, but the types of skeletons are fairly broad, with Overgrown, Shadow and Gold skeletons forcing players to mix up their play styles to defeat them. When you start playing for the first time, skeletons are quite slow, not particularly aggressive and can only use claws to attack, or maybe a sword. As players increase their ranking in the Order of Souls, skeletons are given access to abilities they didn’t have before: they can strafe faster, hunt you more efficiently, back off if under attack, heal themselves with bananas and even start to use the pistol and blunderbuss to attack. This is all achieved through the use of data assets that can be plugged into the character AI at runtime and define how a specific skeleton will operate, with over 50 unique parameters that help diversify their attributes and behaviour.
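As a rough sketch of how that data-driven scaling might work – the field names here are invented for illustration, and Rare’s real assets carry over 50 parameters – swapping the asset plugged into the same AI is enough to change how a skeleton fights:

```cpp
#include <string>
#include <vector>

// Illustrative stand-in for a designer-authored data asset; the real
// assets expose over 50 parameters, and these fields are hypothetical.
struct SkeletonConfig {
    float MaxHitPoints;
    float StrafeSpeed;
    bool  CanBackOffUnderFire;
    bool  CanHealWithBananas;
    std::vector<std::string> Weapons;
};

// Low-rank skeletons: slow, melee only, no self-preservation.
const SkeletonConfig NoviceSkeleton{
    100.f, 150.f, false, false, {"Claws", "Cutlass"}
};

// Higher Order of Souls ranks unlock faster strafing, healing and guns.
const SkeletonConfig VeteranSkeleton{
    250.f, 300.f, true, true, {"Cutlass", "Pistol", "Blunderbuss"}
};

struct SkeletonAI {
    const SkeletonConfig* Config = nullptr;
    // The same behaviour tree reads these values at runtime, so the tree
    // never changes - only the data plugged into it does.
    void Initialise(const SkeletonConfig& config) { Config = &config; }
};

int main() {
    SkeletonAI skeleton;
    skeleton.Initialise(VeteranSkeleton); // chosen to match the crew's rank
}
```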
So while the game needs to provide threats to players on any given island, there’s also all of the ambient wildlife: the pigs, chickens and snakes. They can either prove a pain in the ass while you’re avoiding a horde of skeletons, be a resource you need to gather for Merchant Alliance quests or just add a bit of life to the surrounding environment.
In any case, they too use behaviour trees, and while their architecture is largely similar to the skeletons’, it is much reduced in scale: snakes attack the player if in proximity, while pigs and chickens just run away from you. The architecture is consistent across each type, with the data assets assigned to them helping to dictate how that specific animal will operate with the behaviour tree, as sketched below.
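A hedged sketch of that reuse, again with invented field names: a couple of flags in the data asset are enough to flip the same behaviour tree between ‘attack when close’ (snakes) and ‘flee when close’ (pigs and chickens):

```cpp
// Hypothetical wildlife data asset: the shared behaviour tree branches on
// these flags, so snakes, pigs and chickens reuse one architecture.
struct WildlifeConfig {
    bool  AttacksWhenClose;  // snakes: true; pigs and chickens: false
    bool  FleesWhenClose;    // pigs and chickens: true
    float ReactionRadius;    // how near a player must be to trigger it
};

const WildlifeConfig SnakeConfig   { true,  false, 400.f };
const WildlifeConfig PigConfig     { false, true,  600.f };
const WildlifeConfig ChickenConfig { false, true,  600.f };

int main() {
    // In this sketch, the tree would check the assigned config's flags to
    // decide whether the animal attacks, flees, or just idles about.
    const WildlifeConfig& assigned = SnakeConfig;
    (void)assigned;
}
```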
As mentioned in part 1, these are treated in much the same way as skeletons for load management and can be disabled or despawned when necessary if they’re consuming resources on the server that could be put to better use elsewhere.
Now having explored all the AI characters on land, what about at sea? Let’s check out the first real threat players are faced with in the murky depths: sharks. Lots and lots of sharks.
From a design perspective, the sharks are intended to add a new layer of challenge for players by ensuring you don’t sit idle in the water. They only operate within a short range and spawn in when necessary, meaning you won’t just stumble into a shark swimming the seas in the open world; instead, you will effectively cause a shark to teleport into the game near your position and then stalk you if the game feels you’ve been sitting in the water for too long.
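A minimal sketch of what that server-side trigger might look like – the thresholds and field names are assumptions for illustration, not values from the game:

```cpp
// A hypothetical server-side check, sketched from the behaviour described
// above: a shark is spawned close to a player who lingers in open water.
// The thresholds and fields are invented for illustration.
struct SwimmerState {
    float SecondsInWater;  // time spent swimming continuously
    float DistanceToShip;  // metres from their own vessel
};

bool ShouldSpawnShark(const SwimmerState& swimmer) {
    const float IdleThresholdSeconds = 30.f;
    const float MinDistanceFromShip  = 20.f;
    return swimmer.SecondsInWater > IdleThresholdSeconds
        && swimmer.DistanceToShip > MinDistanceFromShip;
}

int main() {
    SwimmerState swimmer{45.f, 60.f};
    if (ShouldSpawnShark(swimmer)) {
        // Spawn the shark a short distance away and have it stalk the player.
    }
}
```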
So while the shark behaviour trees are relatively straightforward – they only really circle their prey or attack it – there are two distinct problems that needed to be addressed. The first big problem is navigation: how do you ensure an AI shark knows how to move through a volume of water? We typically use a navigation mesh to support movement on a static surface. This works OK on land for characters such as the skeletons, given the nav mesh is a two-dimensional surface that models movement in a three-dimensional space. However, this doesn’t scale to surfaces that are constantly changing shape or to volumes of space such as water and air – meaning you need to create a custom solution to resolve it. This isn’t a problem unique to Sea of Thieves, as we saw in my recent case study on Horizon Zero Dawn, where Guerrilla Games had to build a separate navigation system for the flying enemy characters.
Rare tackled the problem head-on by building a navigation system that would integrate into the existing navigation framework in Unreal Engine but cater specifically for underwater movement. But before they could do that, there was a second design problem that needed to be addressed: a shark can’t stop moving. Whilst it varies between species, the majority of real-life sharks need to maintain movement in order to breathe. So the AI equivalent needs to replicate this behaviour, making lots of small corrective changes in direction at varying speeds. The movement system needed to ensure not only that the AI could navigate through water like a shark, but that it actually moved like a shark would too.
So first things first: unless the sharks are instructed to attack a player, they typically swim in arcs. This is achieved by effectively calculating the arc of a circle of a given diameter, which dictates the turning rate of the shark as it moves, and the designers can tweak the speed with which it travels along that arc – with that speed value also being sent to the movement components so that the animation reflects the current movement speed. The navigation system gives the shark a location in either 2D or 3D space to move towards, then creates a natural arc that will fit that location.
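Here’s a small 2D sketch of that arc-based steering, assuming the shark’s turn rate falls out of its speed and the radius of the designer-chosen circle – an illustration of the idea rather than Rare’s implementation:

```cpp
#include <cmath>

// A minimal 2D sketch of arc-based steering: the shark follows a circle
// of a designer-chosen radius towards a target point. An illustration of
// the idea, not Rare's implementation.
struct Vec2 { float x, y; };

struct Shark {
    Vec2  Position;
    float Heading;   // radians
    float Speed;     // metres per second
    float ArcRadius; // half the designer-tuned circle diameter
};

// Each tick, turn towards the target no faster than the arc allows, then
// move forward - producing a smooth curve rather than a straight beeline.
void TickArcMovement(Shark& shark, const Vec2& target, float dt) {
    const float TwoPi = 6.2831853f;
    float desired = std::atan2(target.y - shark.Position.y,
                               target.x - shark.Position.x);
    float error = std::remainder(desired - shark.Heading, TwoPi);

    // Angular speed along a circle of radius r at linear speed v is v / r.
    float maxTurn = (shark.Speed / shark.ArcRadius) * dt;
    float turn = std::fmax(-maxTurn, std::fmin(error, maxTurn));
    shark.Heading += turn;

    shark.Position.x += std::cos(shark.Heading) * shark.Speed * dt;
    shark.Position.y += std::sin(shark.Heading) * shark.Speed * dt;
}

int main() {
    Shark shark{{0.f, 0.f}, 0.f, 5.f, 10.f};
    Vec2 prey{30.f, 40.f};
    for (int i = 0; i < 200; ++i) TickArcMovement(shark, prey, 0.1f); // stalk
}
```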
A lot of effort is put into the turning rate of the sharks. The turn rates are constrained in such a way that sharks are prevented from turning too sharply at high speed. If a shark needs to make a tight corrective turn – given it’s about to attack the player – it will slow down – but never to the point it stops, of course – and ensure it’s lined up with the player before speeding up again. There’s also a small window of acceptable error for shark movement: they can sometimes overshoot a target they’re arcing towards, but provided they’re not going to collide with any obstacles – which I’ll come back to in a second – that’s fine, given it makes the sharks move more naturally.
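A hedged sketch of that speed/turn trade-off and the tolerance window – all constants here are invented for illustration:

```cpp
#include <cmath>

// Hedged sketch of the speed/turn trade-off: the sharper the corrective
// turn required, the more the shark slows down, but it never drops below
// a minimum speed (a shark must keep moving). All constants are invented.
float SpeedForTurn(float turnAngleRadians,
                   float maxSpeed  = 8.f,
                   float minSpeed  = 2.f,
                   float sharpTurn = 1.5f /* ~85 degrees */) {
    float sharpness = std::fmin(std::fabs(turnAngleRadians) / sharpTurn, 1.f);
    // Blend from full speed (swimming straight) down to the minimum speed
    // (tight turn); the shark accelerates again once it is lined up.
    return maxSpeed - (maxSpeed - minSpeed) * sharpness;
}

// The small window of acceptable error: if the shark slightly overshoots
// the point it was arcing towards, it carries on rather than correcting,
// which reads as more natural movement.
bool WithinAcceptableError(float distanceToIntendedPoint, float tolerance = 3.f) {
    return distanceToIntendedPoint <= tolerance;
}

int main() {
    float attackSpeed = SpeedForTurn(1.2f); // tight turn: slow down to line up
    bool closeEnough  = WithinAcceptableError(2.f);
    (void)attackSpeed; (void)closeEnough;
}
```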
This all largely assumes movement in 2D space, meaning that the player and the shark are at the same depth in the water. In the event they don’t line up, the shark will plot the same paths as usual, but generate a simple Bézier curve to allow it to swim up or down to the same depth.
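As a sketch, a simple quadratic Bézier is enough to illustrate that depth blend – the control-point choice here is an assumption, not necessarily how Rare builds the curve:

```cpp
// A minimal sketch of blending the shark's depth towards the player's
// using a quadratic Bezier curve while its horizontal arc plays out.
// The control-point choice is an assumption for illustration.
struct Vec3 { float x, y, z; };

Vec3 Lerp(const Vec3& a, const Vec3& b, float t) {
    return { a.x + (b.x - a.x) * t,
             a.y + (b.y - a.y) * t,
             a.z + (b.z - a.z) * t };
}

// Quadratic Bezier: start point, one control point, and an end point at
// the player's depth. Sampling t in [0, 1] gives a smooth dive or climb.
Vec3 QuadraticBezier(const Vec3& start, const Vec3& control,
                     const Vec3& end, float t) {
    return Lerp(Lerp(start, control, t), Lerp(control, end, t), t);
}

int main() {
    Vec3 sharkPos {0.f, 0.f, -20.f};   // shark swimming deep
    Vec3 playerPos{30.f, 40.f, -2.f};  // player treading water near the surface
    // A control point partway along, still at the shark's depth, keeps the
    // start of the climb shallow before it levels off at the player.
    Vec3 control{15.f, 20.f, -20.f};
    for (float t = 0.f; t <= 1.f; t += 0.1f) {
        Vec3 waypoint = QuadraticBezier(sharkPos, control, playerPos, t);
        (void)waypoint; // handed to the movement component in a real system
    }
}
```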
Now this is pretty cool, but there’s still one big problem left to deal with: collisions. Sharks need to avoid both ships and islands and rely on the Environment Query System in Unreal to spot obstacles in proximity, but they also have short-range whisker-like sensors just in case they’re about to swim face-first into a boat. This is pretty important given that there’s still a small chance, as a shark swims an arc, that it runs the risk of beaching onto an island… which was apparently a much bigger issue during development!
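Here’s a minimal sketch of the whisker idea – Raycast() is a stand-in for whatever line trace the engine provides, not a real API call, and the lengths and angles are invented:

```cpp
#include <cmath>

// A sketch of 'whisker' sensing: short probes cast ahead of the shark at
// slight angles, so an imminent hull or rock face triggers a corrective
// turn. Raycast() stands in for an engine line trace; it is not a real
// API call, and here it simply reports open water so the sketch compiles
// and runs on its own.
struct Vec2 { float x, y; };

bool Raycast(const Vec2& from, const Vec2& direction, float length) {
    (void)from; (void)direction; (void)length;
    return false; // stub: nothing blocking
}

// Returns a steering correction: positive to veer right, negative to veer
// left, zero when all whiskers are clear.
float WhiskerAvoidance(const Vec2& position, float heading,
                       float whiskerLength = 6.f,
                       float whiskerAngle  = 0.4f /* radians */) {
    auto dir = [](float angle) { return Vec2{std::cos(angle), std::sin(angle)}; };

    bool leftBlocked   = Raycast(position, dir(heading + whiskerAngle), whiskerLength);
    bool centreBlocked = Raycast(position, dir(heading), whiskerLength);
    bool rightBlocked  = Raycast(position, dir(heading - whiskerAngle), whiskerLength);

    if (centreBlocked) return leftBlocked ? 1.f : -1.f; // swing to the clearer side
    if (leftBlocked)   return 1.f;   // obstacle to the left: veer right
    if (rightBlocked)  return -1.f;  // obstacle to the right: veer left
    return 0.f;                      // open water ahead
}

int main() {
    float correction = WhiskerAvoidance({0.f, 0.f}, 0.f);
    (void)correction; // fed back into the arc steering as an extra turn input
}
```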
To keep the codebase maintainable, the source code for shark navigation is an extension of the existing navigation, movement and AI controller code built into Unreal Engine. This made life easier for the developers, given it was designed to behave in much the same way as land-based navigation when executed, and it streamlined testing – something you can expect to hear more about later in this series.
Even the simplest of AI characters needed for AAA titles can prove to be a challenge, and even more so once they deviate from the expected formats in games. Even having AI that swims can prove to be a problem, and it was exciting to see how these water-based threats were put together. But our journey through the Sea of Thieves on AI and Games is far from over. There are still some monstrous AI enemies that threaten to drag us down to Davy Jones’ Locker, and in part three of the AI of Sea of Thieves we’re going to tackle them head on:
The kraken, the mighty beast that has haunted players since launch.
The mighty Megalodon released during the Hungering Deep expansion.
And the Skeleton Ships, first seen thrashing the waves in Cursed Sails, which now more aggressively seek out players to plunder!