NextBot


NextBot is an AI system used by several multiplayer Source games. As its name implies, the system is used for bots, but it can also be used for entities which are not controlled by the player.

While code for the Navigation Mesh system is present, the NextBot code is not, and neither is compiled in any public SDK by default. As such, the documentation provided here is mostly based on certain console commands (most prominently nb_debug) and an official PDF regarding Left 4 Dead's AI.

The differences between a nodegraph NPC and a NextBot

Although Source NPCs (such as those from Half-Life 2) and NextBots are both used for AI, it is important to know that the two systems are not one and the same. Here are a few key differences that set the systems apart:

  • NextBots use the navigation mesh instead of a nodegraph to navigate.

Source NPCs use nodes to determine not only where to navigate, but also what to do when they reach a specific node. NextBots do not use these nodes; instead, they move around using the navigation mesh and, depending on how a specific nav area is marked, perform different actions (a small sketch of this appears after the list).

  • The NextBot system can be applied to both bots and non-playable entities.

The inner mechanics of the Source NPC system only apply to entities which are not players. With the NextBot system, both playable and non-playable characters can be fitted with AI if needed.

  • The NextBot system is almost entirely built with ground-based entities in mind.

Source NPCs can be defined to either navigate on the ground or use air nodes to fly around open spaces. Two NextBots (Monoculus and Merasmus) are capable of flight-based movement, but it is rather rudimentary: the system currently has no official support for "air navigation meshes" or any similar mechanic, so their only point of reference is the player(s).
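
To make the nav-mesh-driven approach above concrete, here is a minimal C++ sketch of a bot reacting to attribute flags marked on the nav area it occupies. The flag names and the NavArea struct are simplified stand-ins for illustration, not the engine's actual CNavArea code.

    #include <cstdint>
    #include <cstdio>

    // Simplified stand-ins for nav area attribute flags; the real engine stores
    // a larger set of flags on each nav area. Names here are illustrative only.
    enum NavAttributes : std::uint32_t {
        NAV_NONE   = 0,
        NAV_CROUCH = 1 << 0,   // bot should crouch while crossing this area
        NAV_JUMP   = 1 << 1,   // bot should jump when entering this area
        NAV_AVOID  = 1 << 2    // bot should path around this area if possible
    };

    struct NavArea {
        std::uint32_t attributes;
    };

    // Unlike a nodegraph NPC, which asks "what do I do at this node?", a NextBot
    // asks "what is this area marked with?" as it moves across the mesh.
    void AdjustMovementForArea(const NavArea &area) {
        if (area.attributes & NAV_CROUCH) std::printf("crouching through area\n");
        if (area.attributes & NAV_JUMP)   std::printf("jumping into area\n");
        if (area.attributes & NAV_AVOID)  std::printf("preferring another route\n");
    }

    int main() {
        NavArea vent{NAV_CROUCH};
        AdjustMovementForArea(vent);
    }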

Valve-developed games which use the NextBot system

The NextBot system is used for the following games:

Todo: Are there other characters/entities in Dota 2 which use NextBot?
  • Any game with the NextBot system built-in

Although Counter-Strike: Source and Counter-Strike: Global Offensive have AI bots (cs_bot), they use unique systems independent of NextBot to determine the bots' behavior.

How a NextBot works

An example diagram of how an event affects a NextBot. This diagram is featured in the AI Systems of Left 4 Dead PDF.

A NextBot uses an overall structure, known as an "Actor", to coordinate several more specific factors. When an event occurs, such as the example diagram's "OnInjured", the Actor responds by changing these factors to reflect the event. Here is a summary of all the different factors a NextBot has:
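
The diagram's structure can be pictured roughly as follows: an Actor owns one instance of each factor and forwards incoming events to all of them. This is a minimal sketch of that composition; the class and method names are assumptions for illustration, not the engine's actual NextBot interfaces.

    #include <cstdio>

    // Minimal stand-ins for the four factors; each reacts to the same event.
    struct Locomotion { void OnInjured() { std::printf("locomotion: flee to cover\n"); } };
    struct Body       { void OnInjured() { std::printf("body: play flinch animation\n"); } };
    struct Vision     { void OnInjured() { std::printf("vision: look toward attacker\n"); } };
    struct Intention  { void OnInjured() { std::printf("intention: switch to a retreat behavior\n"); } };

    // The "Actor" ties the factors together and forwards events to each of them.
    class Actor {
    public:
        void OnInjured() {
            intention.OnInjured();   // decide what to do next
            locomotion.OnInjured();  // move accordingly
            body.OnInjured();        // animate the reaction
            vision.OnInjured();      // update what the bot is looking at
        }
    private:
        Locomotion locomotion;
        Body body;
        Vision vision;
        Intention intention;
    };

    int main() {
        Actor bot;
        bot.OnInjured();  // the "OnInjured" event from the diagram
    }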

Locomotion

This factor handles how a NextBot moves around in its environment. For example, if a NextBot were programmed to flee after being injured, it would rely on this factor to move to a different position on the map.
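
As a rough sketch, a locomotion component mainly answers requests to move toward a goal: given a goal position and a movement speed, it advances the bot a little each update. The vector type and method names below are simplified assumptions, not the engine's actual locomotion interface.

    #include <cmath>
    #include <cstdio>

    struct Vec3 { float x, y, z; };

    // Simplified locomotion: step the bot toward a goal position every update.
    class Locomotion {
    public:
        explicit Locomotion(float speed) : m_speed(speed) {}

        void Approach(const Vec3 &goal) { m_goal = goal; m_hasGoal = true; }

        void Update(Vec3 &position, float dt) {
            if (!m_hasGoal) return;
            float dx = m_goal.x - position.x, dy = m_goal.y - position.y;
            float dist = std::sqrt(dx * dx + dy * dy);
            if (dist < 1.0f) { m_hasGoal = false; return; }  // arrived
            float step = m_speed * dt / dist;
            position.x += dx * step;
            position.y += dy * step;
        }

    private:
        Vec3 m_goal{};
        bool m_hasGoal = false;
        float m_speed;
    };

    int main() {
        Vec3 pos{0, 0, 0};
        Locomotion loco(100.0f);          // 100 units per second
        loco.Approach(Vec3{300, 0, 0});   // e.g. a retreat point chosen after OnInjured
        for (int i = 0; i < 3; ++i) {
            loco.Update(pos, 0.1f);
            std::printf("pos = (%.1f, %.1f)\n", pos.x, pos.y);
        }
    }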

Body

This factor handles the animations of a NextBot. With the "OnInjured" example, a NextBot would rely on this factor to play a flinching animation.
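
In code terms, the body factor mostly amounts to "play this animation when asked". The sketch below is a simplified assumption of how the flinch reaction might be requested; the activity name and method are illustrative only.

    #include <string>
    #include <cstdio>

    // Simplified body factor: tracks which animation (activity) is currently playing.
    class Body {
    public:
        void StartActivity(const std::string &activity) {
            m_currentActivity = activity;
            std::printf("playing animation: %s\n", activity.c_str());
        }
        const std::string &CurrentActivity() const { return m_currentActivity; }

    private:
        std::string m_currentActivity = "ACT_IDLE";  // illustrative default activity name
    };

    int main() {
        Body body;
        body.StartActivity("ACT_FLINCH");  // the "OnInjured" example from the text
    }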

Vision

This factor handles how a NextBot sees certain entities in its environment. The field-of-view and line-of-sight functions mainly reside in this factor.

Keep in mind that this factor is not required for NextBots to work. A Skeleton in Team Fortress 2, for example, will find and attack enemies regardless of whether or not it sees them.
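
The field-of-view and line-of-sight checks mentioned above boil down to an angle test plus an occlusion test. The sketch below shows that shape with the occlusion test stubbed out; it is an illustrative assumption, not the engine's actual vision interface. A bot like the Skeleton could simply skip such checks, which is why this factor is optional.

    #include <cmath>
    #include <cstdio>

    struct Vec3 { float x, y, z; };

    static Vec3 Sub(const Vec3 &a, const Vec3 &b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
    static float Dot(const Vec3 &a, const Vec3 &b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
    static float Length(const Vec3 &v) { return std::sqrt(Dot(v, v)); }

    // Stub: a real implementation would trace a line through the world geometry.
    static bool LineOfSightClear(const Vec3 &, const Vec3 &) { return true; }

    // A target is "seen" if it is inside the bot's view cone AND nothing blocks the line to it.
    bool CanSee(const Vec3 &eyePos, const Vec3 &viewDir, const Vec3 &target, float fovDegrees) {
        Vec3 toTarget = Sub(target, eyePos);
        float dist = Length(toTarget);
        if (dist < 0.001f) return true;
        float cosAngle = Dot(viewDir, toTarget) / dist;  // viewDir assumed normalized
        float cosHalfFov = std::cos(fovDegrees * 0.5f * 3.14159265f / 180.0f);
        return cosAngle >= cosHalfFov && LineOfSightClear(eyePos, target);
    }

    int main() {
        Vec3 eye{0, 0, 64}, forward{1, 0, 0}, enemy{200, 50, 64};
        std::printf("visible: %s\n", CanSee(eye, forward, enemy, 90.0f) ? "yes" : "no");
    }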

Intention

This factor is where the actual AI of a NextBot resides. The Intention factor manages the different behaviors a NextBot might have, and this factor is responsible for changing these behaviors depending on the event.

For a look at how this factor works in-game, see Example of NextBot Behaviors.
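
One minimal way to picture the Intention factor is as a component that tracks the current Behavior and swaps it when events arrive. The sketch below uses plain strings as behavior names purely for illustration; the real system selects among full Behavior objects.

    #include <string>
    #include <cstdio>

    // Simplified Intention: owns the current behavior and changes it in response to events.
    class Intention {
    public:
        void OnInjured()     { SetBehavior("Retreat"); }
        void OnEnemySeen()   { SetBehavior("Attack"); }
        void OnNothingToDo() { SetBehavior("Wander"); }

        const std::string &CurrentBehavior() const { return m_behavior; }

    private:
        void SetBehavior(const std::string &name) {
            if (name == m_behavior) return;
            std::printf("behavior change: %s -> %s\n", m_behavior.c_str(), name.c_str());
            m_behavior = name;
        }
        std::string m_behavior = "Wander";
    };

    int main() {
        Intention intention;
        intention.OnEnemySeen();  // Wander -> Attack
        intention.OnInjured();    // Attack -> Retreat
    }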

Behavior

A Behavior contains a series of Actions, which it will perform when the Intention factor chooses it.

This part of the system can be considered the NextBot equivalent of a schedule, since both are lists of actions the AI needs to perform.
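
Continuing the schedule analogy, a Behavior can be pictured as a named, ordered list of Actions that run once the Intention factor selects it. The sketch below is an illustration under that assumption, not the actual NextBot Behavior code.

    #include <cstdio>
    #include <functional>
    #include <string>
    #include <vector>

    // A Behavior as a named, ordered list of steps, run when Intention selects it.
    struct Behavior {
        std::string name;
        std::vector<std::function<void()>> actions;

        void Run() const {
            std::printf("running behavior: %s\n", name.c_str());
            for (const auto &action : actions)
                action();
        }
    };

    int main() {
        Behavior retreat{
            "Retreat",
            {
                [] { std::printf("  action: pick a cover position\n"); },
                [] { std::printf("  action: move to cover\n"); },
                [] { std::printf("  action: recover\n"); },
            }
        };
        retreat.Run();  // e.g. selected by Intention after OnInjured
    }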

Action

An Action contains the actual AI code of a NextBot, which runs when its parent Behavior is run by the Intention factor. An Action can also have a child Action, which runs at the same time as its parent.

This part of the system can be considered the NextBot equivalent of a task, since both contain the core programming which drives the AI itself.
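
The parent/child relationship between Actions can be sketched as an Action that optionally owns a child and updates the child alongside its own logic. The class below is an assumption for illustration; the real NextBot Action code is not reproduced here.

    #include <cstdio>
    #include <memory>
    #include <string>

    // An Action runs its own logic every update and, if it has a child Action,
    // runs the child at the same time.
    class Action {
    public:
        explicit Action(std::string name) : m_name(std::move(name)) {}

        void SetChild(std::unique_ptr<Action> child) { m_child = std::move(child); }

        void Update() {
            std::printf("updating action: %s\n", m_name.c_str());
            if (m_child)
                m_child->Update();  // child runs concurrently with its parent
        }

    private:
        std::string m_name;
        std::unique_ptr<Action> m_child;
    };

    int main() {
        Action moveToCover("MoveToCover");
        moveToCover.SetChild(std::make_unique<Action>("LookAtThreat"));
        moveToCover.Update();  // both parent and child logic run this tick
    }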


See also

  • nb_debug

  • Example of NextBot Behaviors

External links

  • The AI Systems of Left 4 Dead (PDF)

Confirm: Is there a video recording of this presentation?