NextBot

NextBot is an AI system used by several multiplayer Source Engine games. Despite what its name implies, the system is not limited to bots; it can also drive entities that are not controlled by a player.

The NextBot system is currently unavailable in any public version of the Source SDK. The documentation provided here is based on the nb_debug console command and an official PDF covering Left 4 Dead's AI.

The differences between an NPC and a NextBot

Although Source NPCs (such as those from Half-Life 2) and NextBots are both used for AI, the two systems are not one and the same. Here are a couple of key differences that set them apart:

  • Source NPCs use nodes to determine not only where to navigate, but also what to do when they reach a specific node. NextBots do not use these nodes; instead, they navigate with the navigation mesh and, depending on how a specific area is marked, perform different actions (see the sketch after this list).

  • The NextBot system can be applied to both bots and non-playable entities. The inner mechanics of the Source NPC system only apply to entities which are not players; with the NextBot system, both playable and non-playable characters can be fitted with AI if needed.
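
To make the navigation mesh point concrete, here is a minimal sketch of a bot adjusting its movement based on how its current nav area is marked. The types and flag names below are simplified stand-ins for illustration, not the engine's real nav mesh classes:

  // Simplified stand-in types: a nav area carries attribute flags that a bot
  // checks while moving, instead of reading instructions from info_nodes.
  #include <cstdint>

  enum NavAttributeFlags : uint32_t
  {
      NAV_AREA_CROUCH = 0x1,   // bot should crouch while crossing this area
      NAV_AREA_JUMP   = 0x2,   // bot should jump to traverse this area
      NAV_AREA_AVOID  = 0x4    // bot should prefer another route if possible
  };

  struct NavArea
  {
      uint32_t attributeFlags;

      bool HasAttribute(uint32_t flag) const { return (attributeFlags & flag) != 0; }
  };

  // Called each update with the area the bot currently occupies.
  void AdjustMovementForArea(const NavArea &area, bool &shouldCrouch, bool &shouldJump)
  {
      shouldCrouch = area.HasAttribute(NAV_AREA_CROUCH);
      shouldJump   = area.HasAttribute(NAV_AREA_JUMP);
  }

Marks like these are authored on the navigation mesh itself, so no node placement is needed in the map.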

Valve-developed games which use the NextBot system

The NextBot system is used in the following games:

  • Left 4 Dead
  • Left 4 Dead 2
  • Team Fortress 2

Although Counter-Strike: Source and Counter-Strike: Global Offensive have AI bots, they do not use the NextBot system; because NextBot was not introduced until Left 4 Dead, these two Counter-Strike games use their own AI system to determine the behavior of their bots.

How a NextBot works

An example diagram of how an event affects a NextBot. This diagram is featured in the AI Systems of Left 4 Dead PDF.

A NextBot uses an overall structure, known as an Actor, to manage various factors such as locomotion. When an event occurs, such as the OnInjured event in the example diagram, the Actor responds by changing these factors to reflect the event. Here is a summary of the different factors a NextBot has:

Locomotion

This factor handles how a NextBot moves around in its environment. For example, if a NextBot was programmed to flee after being injured, it would rely on this factor to move to a different position in the map.
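
As a rough illustration of that flee example, the sketch below hands a goal position to a locomotion component and lets it handle the actual pathing and movement. The class and method names are placeholders, not the real SDK interface:

  // Placeholder locomotion interface: the bot's AI only states where it wants
  // to go; the locomotion component does the path following and movement.
  struct Vector { float x, y, z; };

  class Locomotion
  {
  public:
      virtual void Approach(const Vector &goal) = 0;          // move toward a position
      virtual void SetDesiredSpeed(float unitsPerSecond) = 0; // how fast to get there
      virtual ~Locomotion() = default;
  };

  // Event response: run toward a previously chosen retreat spot.
  void FleeAfterInjury(Locomotion &locomotion, const Vector &retreatSpot)
  {
      locomotion.SetDesiredSpeed(300.0f);  // illustrative sprint speed in units per second
      locomotion.Approach(retreatSpot);    // locomotion figures out the route itself
  }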

Body

This factor handles the animations of a NextBot. In the OnInjured example, a NextBot would rely on this factor to play a flinching animation.
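
A minimal sketch of the same idea, assuming a hypothetical body component that owns animation playback; the class name and the activity string are illustrative only:

  // Placeholder body interface: the AI asks for an animation by name and the
  // body component decides how to layer and play it.
  #include <string>

  class Body
  {
  public:
      // Play a one-shot gesture over the current movement animation.
      virtual void PlayGesture(const std::string &activityName) = 0;
      virtual ~Body() = default;
  };

  void FlinchAfterInjury(Body &body)
  {
      body.PlayGesture("ACT_FLINCH");  // activity name chosen for illustration
  }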

Vision

This factor handles how a NextBot sees certain entities in its environment. The field-of-view and line-of-sight functions also reside in this factor.

Keep in mind that this factor is not required for a NextBot to work. A tf_zombie, for example, will find and attack enemies regardless of whether it can see them.
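
The sketch below shows the field-of-view half of such a component as a simple view-cone test; the line-of-sight half would need an engine trace against the world, so it is only noted in a comment. All of the types and math here are simplified stand-ins:

  // Simplified view-cone test: a target is "in the field of view" when the
  // angle between the view direction and the direction to the target is
  // within half of the configured FOV.
  #include <cmath>

  struct Vector { float x, y, z; };

  static Vector Subtract(const Vector &a, const Vector &b) { return { a.x - b.x, a.y - b.y, a.z - b.z }; }
  static float  Dot(const Vector &a, const Vector &b)      { return a.x * b.x + a.y * b.y + a.z * b.z; }
  static float  Length(const Vector &v)                    { return std::sqrt(Dot(v, v)); }

  class Vision
  {
  public:
      Vision(const Vector &eyePos, const Vector &viewDir, float fovDegrees)
          : m_eyePos(eyePos), m_viewDir(viewDir),
            m_halfFovCos(std::cos(fovDegrees * 0.5f * 3.14159265f / 180.0f)) {}

      bool IsInFieldOfView(const Vector &target) const
      {
          Vector toTarget = Subtract(target, m_eyePos);
          float  dist     = Length(toTarget);
          if (dist < 0.001f)
              return true;  // target is on top of the eye position
          Vector dir = { toTarget.x / dist, toTarget.y / dist, toTarget.z / dist };
          return Dot(dir, m_viewDir) >= m_halfFovCos;
      }

      // A full vision component would also trace the world to confirm line of
      // sight before declaring the target visible; that is omitted here.

  private:
      Vector m_eyePos;
      Vector m_viewDir;   // assumed to be normalized
      float  m_halfFovCos;
  };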

Intention

This factor is where the actual AI of a NextBot resides. The Intention factor manages the different behaviors a NextBot might have, and it is responsible for switching between those behaviors depending on the event.

For a look at how this factor works in-game, see Example of NextBot Behaviors.
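
As a rough sketch of that idea, the hypothetical intention component below owns the current behavior and swaps it out when an injury event arrives. None of these class names come from the actual SDK:

  // Placeholder intention component: it decides which behavior is active and
  // reacts to events by switching behaviors.
  #include <memory>

  class Behavior
  {
  public:
      virtual void Update(float deltaTime) = 0;
      virtual ~Behavior() = default;
  };

  class WanderBehavior  : public Behavior { public: void Update(float) override { /* roam the nav mesh */ } };
  class RetreatBehavior : public Behavior { public: void Update(float) override { /* move away from the attacker */ } };

  class Intention
  {
  public:
      Intention() : m_current(std::make_unique<WanderBehavior>()) {}

      // Event hook: being injured switches the bot into a retreat behavior.
      void OnInjured() { m_current = std::make_unique<RetreatBehavior>(); }

      void Update(float deltaTime) { m_current->Update(deltaTime); }

  private:
      std::unique_ptr<Behavior> m_current;
  };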

Behavior

A Behavior contains a series of Actions, which it performs when the Intention factor selects it.

A Behavior can be considered the NextBot equivalent of a schedule, since both are lists of actions the AI needs to perform.
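
Taking the schedule analogy literally, a behavior can be sketched as an ordered list of actions run one after another. The names below are made up for illustration; in the system described by the Left 4 Dead PDF, Actions instead chain into one another through the transition results shown in the next sketch:

  // Simplified behavior-as-schedule: run the current action until it reports
  // that it is finished, then move on to the next one in the list.
  #include <cstddef>
  #include <memory>
  #include <vector>

  class Action
  {
  public:
      // Returns true while the action still has work to do.
      virtual bool Update(float deltaTime) = 0;
      virtual ~Action() = default;
  };

  class Behavior
  {
  public:
      void Add(std::unique_ptr<Action> action) { m_actions.push_back(std::move(action)); }

      void Update(float deltaTime)
      {
          if (m_index >= m_actions.size())
              return;                                  // every action has finished
          if (!m_actions[m_index]->Update(deltaTime))
              ++m_index;                               // current action is done, advance
      }

  private:
      std::vector<std::unique_ptr<Action>> m_actions;
      std::size_t m_index = 0;
  };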

Action

An Action contains the actual AI code of a NextBot, which runs when its parent Behavior is run by the Intention factor. An Action can also have a child Action, which runs at the same time as its parent.

An Action can be considered the NextBot equivalent of a task, since both contain the actual code that drives the AI itself.
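
The sketch below is written in the spirit of the Action interface described in the Left 4 Dead PDF, where an Action's update and event handlers return a transition result that can hand control to another Action. The exact names and signatures here are illustrative rather than the shipped code; MyBot stands in for whatever Actor the action controls, and the concurrent child Action mentioned above is omitted to keep the example short:

  // Illustrative action interface: an action returns a result saying whether
  // to keep running, change to another action, or finish.
  #include <memory>
  #include <string>

  class MyBot;      // stand-in for the Actor this action drives
  class BotAction;  // forward declaration for the transition result below

  struct ActionResult
  {
      enum Type { CONTINUE, CHANGE_TO, DONE };

      Type type = CONTINUE;
      std::unique_ptr<BotAction> next;  // only used when type == CHANGE_TO
      std::string reason;               // human-readable reason, useful when debugging
  };

  class BotAction
  {
  public:
      virtual ActionResult Update(MyBot *me, float interval) { return {}; }

      // Event handlers also return a result; the default is to keep going.
      virtual ActionResult OnInjured(MyBot *me) { return {}; }

      virtual ~BotAction() = default;
  };

  class RetreatAction : public BotAction
  {
  public:
      ActionResult Update(MyBot *me, float interval) override
      {
          // Drive the locomotion factor away from the threat here.
          return {};
      }
  };

  class AttackAction : public BotAction
  {
  public:
      // Getting hurt makes this action hand control to a retreat action.
      ActionResult OnInjured(MyBot *me) override
      {
          ActionResult result;
          result.type   = ActionResult::CHANGE_TO;
          result.next   = std::make_unique<RetreatAction>();
          result.reason = "Took damage, falling back";
          return result;
      }
  };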


See also

nb_debug

Example of NextBot Behaviors

External Links

The AI Systems of Left 4 Dead - An official PDF regarding various mechanics of Left 4 Dead, including NextBots and the AI Director