{{LanguageBar}}
{{toc|style=display:none;}}


'''NextBot''' is an [[AI]] system used by several multiplayer {{source|4}} games. The system can be used not only for [[bot]]s, as its name implies, but also for entities which are not controlled by the player.
{{Capsule|
== What Is A NextBot? ==
The {{Code|NextBot}} system is mainly used for creating and controlling in-game entities that behave similarly to a player ''(commonly referred to as bots)''. Unlike [[NPC]]s, NextBots were designed for dynamic in-game thinking and movement.
{{Quote|As of February 2025, the code for the system is available under [[Source SDK 2013]].}}
}}


{{Capsule|
== How a NextBot works ==
[[File:Nextbot_actor.JPG|thumb|400px|right|An example diagram of how an event affects a NextBot. This diagram is featured in the ''AI Systems of Left 4 Dead'' PDF.]]


A NextBot uses an overall structure, known as an "Actor", to run through more specific factors. When an event occurs, such as the example diagram's {{Code|OnInjured}}, the Actor responds by changing these factors to reflect the event. Here is a summary of all the different factors a NextBot has:
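Since the NextBot code now ships with [[Source SDK 2013]], this layout can be seen directly in the headers: an Actor is an entity that implements {{Code|INextBot}} and hands out one component interface per factor. The sketch below is a heavily abridged illustration of that pattern; the class and accessor names follow the SDK headers, but treat it as a sketch rather than a guaranteed match for the current API.
<syntaxhighlight lang="cpp">
// Abridged sketch of the "Actor" layout, based on the NextBot headers shipped
// with Source SDK 2013 (NextBotInterface.h and friends). Component creation,
// spawning and networking are omitted; include paths may differ per project.
#include "NextBot/NextBotCombatCharacter.h"

class CMyBot : public NextBotCombatCharacter
{
public:
	// Each factor is a component object owned by the bot (the "Actor").
	virtual ILocomotion *GetLocomotionInterface( void ) const { return m_locomotor; }  // movement
	virtual IBody       *GetBodyInterface( void ) const       { return m_body; }       // animation
	virtual IVision     *GetVisionInterface( void ) const     { return m_vision; }     // sight
	virtual IIntention  *GetIntentionInterface( void ) const  { return m_intention; }  // decision making

private:
	// Created alongside the bot (not shown); each component reacts to events
	// such as OnInjured in its own way.
	ILocomotion *m_locomotor;
	IBody       *m_body;
	IVision     *m_vision;
	IIntention  *m_intention;
};
</syntaxhighlight>
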
=== {{Tint|color=darkgreen|Locomotion}} ===
Handles how a NextBot moves around in its environment.  
{{Quote|For example, if a NextBot was programmed to flee after being injured, it would rely on this factor to move to a different position in the map.}}
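As a rough illustration, assuming the {{Code|ILocomotion}} interface from the SDK headers (where {{Code|Approach()}} steers the bot for a single update), fleeing toward a cover spot might look something like this:
<syntaxhighlight lang="cpp">
// Hypothetical helper: steer a bot toward a cover position this update.
// Approach() only steers for one tick, so real code calls it every update,
// usually via a PathFollower walking a route across the nav mesh.
void FleeTowards( INextBot *bot, const Vector &coverSpot )
{
	ILocomotion *loco = bot->GetLocomotionInterface();
	loco->Run();                  // request running speed
	loco->Approach( coverSpot );  // steer toward the spot for this tick
}
</syntaxhighlight>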
 
 
=== {{Tint|color=red|Body}} ===
Handles the animations of a NextBot.  
{{Quote|With the {{Code|OnInjured}} example, a NextBot would rely on this factor to play a flinching animation.}}
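As another rough fragment, assuming the SDK's {{Code|IBody}} interface, the flinch would be requested by starting an animation activity (the activity used here is only an example):
<syntaxhighlight lang="cpp">
// Hypothetical helper: ask the Body component to play a flinch animation.
// StartActivity() follows the IBody header; ACT_FLINCH_HEAD is one of the
// standard shared activities and is used purely as an example.
void PlayFlinch( INextBot *bot )
{
	bot->GetBodyInterface()->StartActivity( ACT_FLINCH_HEAD );
}
</syntaxhighlight>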


=== {{Tint|color=blue|Vision}} ===
Handles how a NextBot sees certain entities in its environment.  
* {{Code|field-of-view}} and {{Code|line-of-sight}} checks are handled here.
* Keep in mind that this factor is '''NOT''' required for NextBots to work.
{{Quote|For example, a Skeleton in {{tf2|4}} will find and attack enemies regardless of whether or not it sees them.}}
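Assuming the SDK's {{Code|IVision}} interface, a sight check is a single call, and a behavior is free to skip it entirely, which is how a sightless attacker like the Skeleton can work:
<syntaxhighlight lang="cpp">
// Hypothetical helper: decide whether a bot may target an enemy.
// IsAbleToSee() and the FieldOfViewCheckType values follow the SDK's IVision
// header; treat the exact signature as approximate.
bool CanTarget( INextBot *bot, CBaseEntity *enemy, bool requireSight )
{
	if ( !requireSight )
		return true;  // e.g. a Skeleton attacks whether or not it can see you

	return bot->GetVisionInterface()->IsAbleToSee( enemy, IVision::USE_FOV );
}
</syntaxhighlight>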


=== {{Tint|color=yellow|Intention}} ===
The Intention factor is where the decision-making of a NextBot resides: it manages the different Behaviors a NextBot might have and is responsible for changing them in response to events.
{{Quote|For a look at how this factor works in-game, see [[Example_of_NextBot_Behaviors|Example of NextBot Behaviors]].}}


=== {{Tint|color=orange|Behaviour}} ===
A Behavior contains a series of Actions, which it will perform when the Intention factor chooses it.
{{Quote|A Behavior can be considered the NextBot equivalent of a [[Schedule]] from [[NPC]]s, since both are lists of actions the AI needs to perform.}}


=== {{Tint|color=orange|Action}} ===
This contains the actual AI code of a NextBot, which runs when its parent {{Tint|color=orange|Behaviour}} is run by the {{Tint|color=yellow|Intention}} factor. A short code sketch of this relationship follows this overview.
* Actions can have an additional child Action, which will run at the same time as its parent Action.
{{Quote|An Action can be considered the NextBot equivalent of a [[Task]] from [[NPC]]s, since both contain the core programming which drives the AI itself.}}
|style=2}}
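To make the Intention, Behaviour and Action relationship concrete, here is a short sketch written in the style of the {{Code|Action}} template from the SDK's {{Code|NextBotBehavior.h}}: an event handler on the current Action asks to suspend it in favor of a new Action, and the hosting Behaviour performs the transition. The bot class {{Code|CMyBot}}, its {{Code|IsSafe()}} helper and both Actions are made up for illustration, and the snippet is abridged rather than guaranteed to compile against the current SDK.
<syntaxhighlight lang="cpp">
// Hypothetical flee Action for a hypothetical CMyBot NextBot.
class CMyBotRetreat : public Action< CMyBot >
{
public:
	virtual ActionResult< CMyBot > Update( CMyBot *me, float interval )
	{
		// Drive Locomotion toward cover here; once safe, end this Action so
		// the Action that suspended for us resumes.
		if ( me->IsSafe() )                  // hypothetical helper
			return Done( "Reached cover" );

		return Continue();
	}

	virtual const char *GetName( void ) const { return "Retreat"; }
};

// Hypothetical top-level Action. The Intention factor typically owns a
// Behavior< CMyBot > that is seeded with an initial Action like this one and
// updated every think.
class CMyBotMainAction : public Action< CMyBot >
{
public:
	virtual ActionResult< CMyBot > Update( CMyBot *me, float interval )
	{
		// Normal decision making goes here.
		return Continue();
	}

	// Event handlers translate game events into Action transitions.
	virtual EventDesiredResult< CMyBot > OnInjured( CMyBot *me, const CTakeDamageInfo &info )
	{
		// Suspend this Action and run Retreat until it reports Done().
		return TrySuspendFor( new CMyBotRetreat, RESULT_IMPORTANT, "Taking damage, falling back" );
	}

	virtual const char *GetName( void ) const { return "MainAction"; }
};
</syntaxhighlight>
In the SDK code, the hosting Behaviour drives the active Action each update and applies these requested transitions; see {{Code|NextBotBehavior.h}} for the full template.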


{{Table
| align = center
| caption indent:top = 1em
| caption = Valve-developed games which use the NextBot system
| caption indent:bottom = 1em
| {{tr
| {{th|radius=3px 0 0 0| Game }}
  {{th|radius=0 3px 0 0| Example }}
}}
{{tr
| {{td|bgcolor=#303030| {{l4d|4}} & {{l4d2|4}} }}
  {{td| Survivor bots and all variants of the Infected }}
}}
{{tr
| {{td|bgcolor=#303030| {{tf2|4}} }}
  {{td| RED and BLU bots; Mann vs. Machine robots and Tanks; Robot Destruction robots; Halloween NPCs such as the [[headless_hatman|Horseless Headless Horsemann]] and [[tf_zombie|Skeletons]]; unused/test NPCs such as [[bot_npc_archer]] and [[bot_npc_decoy]] }}
}}
{{tr
| {{td|bgcolor=#303030| {{dota2|4}} }}
  {{td| Hero Bots }}
}}
{{tr
| {{td|colspan=2|bgcolor=#303030|radius=0 0 3px 0| Although {{css|4}} and {{csgo|4}} have AI bots ([[cs_bot]]), they use unique systems independent of NextBot to determine the bots' behavior. }}
}}
}}


{{Capsule|
=== The differences between a nodegraph [[NPC]] and a NextBot ===
Although Source NPCs (such as those from {{hl2|4|}}) and NextBots are both used for [[AI]], it is important to know that the two systems are not one and the same. Here are a few key differences that set the systems apart:


==== 1. NextBots use [[Nav Mesh|navigation meshes]] to move around, not [[nodegraph]]s. ====
Source NPCs use [[Node|nodes]] to determine not only where to navigate, but also what to do when they reach a specific node. NextBots do not use these nodes to navigate; instead, they use the navigation mesh to move around and, depending on how a specific area is marked, perform different actions.
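As a rough illustration, assuming the SDK's nav-mesh helpers (the global {{Code|TheNavMesh}}, {{Code|CNavArea}} and its attribute flags), reading the mark on the area underneath a bot might look like this:
<syntaxhighlight lang="cpp">
// Hypothetical check of the nav area a bot is standing in. GetNearestNavArea(),
// GetAttributes() and NAV_MESH_CROUCH follow the SDK's nav mesh code, but the
// snippet is illustrative; real movement code reacts to attributes while
// following a computed path rather than polling like this.
void ReactToAreaMark( CBaseCombatCharacter *bot )
{
	CNavArea *area = TheNavMesh->GetNearestNavArea( bot->GetAbsOrigin() );
	if ( area == NULL )
		return;  // no nav mesh here, so a NextBot has nothing to path across

	if ( area->GetAttributes() & NAV_MESH_CROUCH )
	{
		// The mesh author marked this area crouch-only, so a ground-based
		// NextBot should duck while moving through it.
	}
}
</syntaxhighlight>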


==== 2. The NextBot system can be applied to both bots and non-playable entities. ====
The inner mechanics of the Source NPC system only apply to entities which are not players. With the NextBot system, both playable and non-playable characters can be fitted with AI if needed.




==== 3. The NextBot system is almost entirely built with ground-based entities in mind. ====
Source NPCs can be defined to either navigate on the ground or use [[Info_node_air | air nodes]] to fly around open spaces. Two NextBots ([[eyeball_boss|Monoculous]] and [[Merasmus]]) are capable of flight-based movement, but this is rather rudimentary, as the system currently doesn't have official support for "air navigation meshes" or any similar mechanic, with their only point of reference being the player(s).
|style=2}}




=== Additional Resources ===
* [[nb_debug]]
* [[Example_of_NextBot_Behaviors | Example of NextBot Behaviors]]
* [[Simple bot]] - available in any game with the NextBot system built in
 
=== External links ===
* [https://steamcdn-a.akamaihd.net/apps/valve/2009/ai_systems_of_l4d_mike_booth.pdf The AI Systems of Left 4 Dead] - An official PDF regarding various mechanics of Left 4 Dead, including NextBots and the AI Director
** [https://www.youtube.com/watch?v=PJNQl3K58CQ GDC Recorded Video of presentation]
 
[[Category:Source]]
[[Category:NextBot]]
[[Category:Bots]]
