{{LanguageBar}}
{{toc|style=display:none;}}


{{Capsule|
== What Is A NextBot? ==
The {{Code|NextBot}} system is mainly used for creating and controlling in-game entities that behave similarly to a player ''(commonly referred to as bots)''. Unlike [[NPC]]s, NextBots were designed for dynamic in-game thinking and movement.
{{Quote|As of February 2025, the code for the system is available under [[Source SDK 2013]].}}
}}


{{Capsule|
=== How a NextBot works ===
[[File:Nextbot_actor.JPG|thumb|400px|right|An example diagram of how an event affects a NextBot. This diagram is featured in the ''AI Systems of Left 4 Dead'' PDF.]]
 
A NextBot uses an overall structure, known as an "Actor", to run through more specific factors. When an event occurs, such as the example diagram's {{Code|Oninjured}}, the Actor responds by changing these factors to reflect the event. Here is a summary of all the different factors a NextBot has:
 
 
=== {{Tint|color=darkgreen|Locomotion}} ===
Handles how a NextBot moves around in its environment.
{{Quote|For example, if a NextBot was programmed to flee after being injured, it would rely on this factor to move to a different position in the map.}}
 
 
=== {{Tint|color=red|Body}} ===
Handles the animations of a NextBot.
{{Quote|With the "oninjured" example, a NextBot would rely on this factor to play a flinching animation.}}
 
 
=== {{Tint|color=blue|Vision}} ===
Handles how a NextBot sees certain entities in its environment.
* Functions such as {{Code|field-of-view}} and {{Code|line-of-sight}} checks are located here.
* Keep in mind that this factor is '''NOT''' required for NextBots to work.
{{Quote|For example, a skeleton in {{tf2|4}} will find and attack enemies regardless of whether or not it sees them.}}


=== {{Tint|color=yellow|Intention}} ===
The Intention factor is where the decision-making of a NextBot resides: it manages the different behaviors a NextBot might have and is responsible for changing them in response to events.
{{Quote|For a look at how this factor works in-game, see [[Example_of_NextBot_Behaviors | Example of NextBot Behaviors]].}}


=== {{Tint|color=orange|Behaviour}} ===
A Behavior contains a series of Actions, which it will perform when the Intention factor chooses it.
{{Quote|This function can be considered the NextBot equivalent of a [[Schedule]] from [[NPC]]s, since both are lists of actions for the AI to perform.}}


=== {{Tint|color=orange|Action}} ===
This contains the actual AI code of a NextBot, which will run when its parent {{Tint|color=orange|Behaviour}} is run by the {{Tint|color=yellow|Intention}} factor.
* Actions can have an additional child Action, which will run at the same time as its parent Action.
{{Quote|This function can be considered the NextBot equivalent of a [[Task]] from [[NPC]]s, since both contain the core programming that drives the AI itself.}}
|style=2}}
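To make the relationship between these factors more concrete, here is a minimal, self-contained C++ sketch of the flow in the diagram above: an "OnInjured" event reaches the {{Tint|color=yellow|Intention}} factor, its {{Tint|color=orange|Behaviour}} swaps the running {{Tint|color=orange|Action}} for a fleeing one, and that Action drives the {{Tint|color=darkgreen|Locomotion}} factor while the {{Tint|color=red|Body}} factor plays a flinch. Every class and method name below is an illustrative placeholder; this is ''not'' the actual NextBot code shipped with the [[Source SDK 2013]], which is considerably more involved.

<source lang=cpp>
// A simplified, self-contained model of the factor structure described above.
// Every name here is an illustrative placeholder, not a real SDK class.
#include <cstdio>
#include <memory>

struct Vector { float x, y, z; };

// Stand-ins for the Locomotion and Body factors: just enough to show which
// factor an Action talks to.
struct Locomotion
{
    void ApproachPoint( const Vector &goal )        // move toward a position
    {
        std::printf( "Locomotion: moving toward (%.0f, %.0f, %.0f)\n", goal.x, goal.y, goal.z );
    }
};

struct Body
{
    void PlayFlinch() { std::printf( "Body: playing flinch animation\n" ); }
};

struct Bot                                          // the "Actor" that owns every factor
{
    Locomotion locomotion;
    Body body;
};

// An Action is a small unit of AI code. Update() returns the Action that
// should run next tick: returning "this" means keep going.
struct Action
{
    virtual ~Action() = default;
    virtual Action *Update( Bot &bot ) = 0;
};

struct IdleAction : Action
{
    Action *Update( Bot & ) override { return this; }
};

struct FleeAction : Action
{
    Vector safeSpot{ 512, 0, 0 };
    int ticksLeft = 3;

    Action *Update( Bot &bot ) override
    {
        bot.locomotion.ApproachPoint( safeSpot );   // flee via the Locomotion factor
        if ( --ticksLeft == 0 )
            return new IdleAction;                  // change to a different Action when done
        return this;                                // otherwise keep running
    }
};

// The Behaviour owns the currently running Action and applies transitions,
// much like a Schedule owns its Tasks.
struct Behaviour
{
    std::unique_ptr<Action> current = std::make_unique<IdleAction>();

    void Update( Bot &bot )
    {
        Action *next = current->Update( bot );
        if ( next != current.get() )
            current.reset( next );                  // transition to the new Action
    }

    void OnInjured( Bot &bot )                      // event handler: change behavior state
    {
        bot.body.PlayFlinch();                      // the Body factor reacts to the event
        std::printf( "Behaviour: switching to Flee\n" );
        current = std::make_unique<FleeAction>();
    }
};

// The Intention factor hosts the Behaviour and forwards ticks and events to it.
struct Intention
{
    Behaviour behaviour;
    void Update( Bot &bot )    { behaviour.Update( bot ); }
    void OnInjured( Bot &bot ) { behaviour.OnInjured( bot ); }
};

int main()
{
    Bot bot;
    Intention intention;

    intention.Update( bot );      // bot idles
    intention.OnInjured( bot );   // the "OnInjured" event from the diagram
    intention.Update( bot );      // bot now flees via the Locomotion factor
}
</source>

The parent/child Action relationship mentioned above (a child Action running at the same time as its parent) is omitted here for brevity.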


{{Table
| align = center
| caption indent:top = 1em
| caption = Valve-developed games which use the NextBot system
| caption indent:bottom = 1em
| {{tr
| {{th|radius=3px 0 0 0| Game }}
  {{th|radius=0 3px 0 0| Example }}
}}
{{tr
| {{td|bgcolor=#303030| {{l4d|4}} & {{l4d2|4}} }}
  {{td| Survivor bots & All Infected }}
}}
{{tr
| {{td|bgcolor=#303030| {{tf2|4}} }}
  {{td| Local bots, Mann vs. Machine robots, Halloween event characters }}
}}
{{tr
| {{td|bgcolor=#303030| {{dota2|4}} }}
  {{td| Hero Bots }}
}}
{{tr
| {{td|colspan=2|bgcolor=#303030|radius=0 0 3px 0| Although {{css|4}} and {{csgo|4}} have bots, they use unique systems to determine behavior. }}
}}
}}


{{Capsule|
=== The differences between a nodegraph [[NPC]] and a NextBot ===
Although Source NPCs (such as those from {{hl2|4|}}) and NextBots are both used for [[AI]], it is important to know that the two systems are not one and the same. Here are a few key differences that set the systems apart:


==== 1. NextBots use [[Nav Mesh|navigation meshes]] to move around, not [[nodegraph]]s. ====
Source NPCs use [[Node | nodes]] to determine not only where to navigate, but also what to do when they reach a specific node. NextBots do not use these nodes to navigate; instead they use the navigation mesh to move around and, depending on what mark a specific area may have, perform different actions. ''(A brief illustrative sketch of this idea follows the comparison below.)''


==== 2. The NextBot system can be applied to both bots and non-playable entities. ====
The inner mechanics of the Source NPC system only apply to entities which are not players. With the NextBot system, both playable and non-playable characters can be fitted with AI if needed.


==== 3. The NextBot system is almost entirely built with ground-based entities in mind. ====
Source NPCs can be defined to either navigate on the ground or use [[Info_node_air | air nodes]] to fly around open spaces. Two NextBots ([[eyeball_boss|Monoculous]] and [[Merasmus]]) are capable of flight-based movement, but it is rather rudimentary: the system currently has no official support for "air navigation meshes" or any similar mechanic, so their only point of reference is the player(s).
|style=2}}
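As a rough illustration of difference #1, the following self-contained sketch models nav areas that carry attribute "marks" which change how a bot crosses them. The flag and class names are illustrative placeholders, not the engine's actual navigation mesh code.

<source lang=cpp>
// Illustrative only: nav-mesh areas carrying attribute flags ("marks") that
// change how a bot moves through them. Not the SDK's real nav mesh classes.
#include <cstdio>
#include <vector>

enum NavAttributes : unsigned
{
    NAV_NONE   = 0,
    NAV_CROUCH = 1 << 0,    // bot should crouch while crossing this area
    NAV_JUMP   = 1 << 1     // bot should jump to enter this area
};

struct NavArea
{
    int id;                 // which area of the mesh this is
    unsigned attributes;    // the marks placed on the area
};

// Walk a precomputed path of areas, reacting to each area's marks.
void FollowPath( const std::vector<NavArea> &path )
{
    for ( const NavArea &area : path )
    {
        if ( area.attributes & NAV_JUMP )
            std::printf( "Area %d: jump into the area\n", area.id );
        if ( area.attributes & NAV_CROUCH )
            std::printf( "Area %d: crouch while crossing\n", area.id );
        if ( area.attributes == NAV_NONE )
            std::printf( "Area %d: walk normally\n", area.id );
    }
}

int main()
{
    // A toy path: open ground, then a low gap, then a ledge.
    std::vector<NavArea> path = { { 1, NAV_NONE }, { 2, NAV_CROUCH }, { 3, NAV_JUMP } };
    FollowPath( path );
}
</source>

In the real system the path itself also comes from the mesh (adjacent areas form a searchable graph), whereas nodegraph NPCs path between hand-placed nodes.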




=== Additional Resources ===
* [[nb_debug]]
* [[Example_of_NextBot_Behaviors | Example of NextBot Behaviors]]


=== External links ===
* [https://steamcdn-a.akamaihd.net/apps/valve/2009/ai_systems_of_l4d_mike_booth.pdf The AI Systems of Left 4 Dead] - An official PDF regarding various mechanics of Left 4 Dead, including NextBots and the AI Director
** [https://www.youtube.com/watch?v=PJNQl3K58CQ Recorded video of the GDC presentation]


[[Category:Source]]
[[Category:NextBot]]
[[Category:Glossary]]
[[Category:Bots]]
