Response System
The Response System is used to decide what line of speech (and/or animation) should be used by an NPC (or player) when they want to say something, or whether they should say anything at all.
Most Response System users are meant to speak in response to events unfolding in the game. Speech can be triggered from code or from inputs in the map. Triggering speech involves a response concept, which corresponds to responses in the system. Concepts are categories of speech tied to a specific event, like danger being noticed (TLK_DANGER) or an enemy being seen (TLK_STARTCOMBAT). While a concept can be used alone for basic and simple dialogue, speech can also be triggered with various criteria which describe things related to the concept and/or the current state of the speaker and the world around them. This allows for complex dialogue trees that involve different lines for different situations.
For example, npc_alyx in Half-Life 2's episodic series speaks the TLK_PLAYER_KILLED_NPC concept when the player kills an enemy NPC. This concept has additional criteria for conditions related to the NPC or the way it was killed. If the player killed the NPC with a headshot, the Response System will pick a response that requires that criterion, which usually results in Alyx complimenting the player's shot or remarking that it was a headshot.
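To give a taste of what this looks like in script form (the full format is covered below), such a rule might be sketched like this; the criterion key "hitgroup" and all of the names here are hypothetical rather than the actual Episodic modifiers:

criterion "ConceptPlayerKilledNPC" "concept" "TLK_PLAYER_KILLED_NPC" required
criterion "WasAHeadshot" "hitgroup" "head" required    // hypothetical modifier key

rule AlyxComplimentsHeadshot
{
    criteria    ConceptPlayerKilledNPC WasAHeadshot
    response    AlyxComplimentsHeadshotLines    // a response group of "nice headshot" lines
}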
The Response System normally uses scripts in the scripts/talker directory. When it receives a concept and its criteria, it searches through the response scripts for a rule that matches the concept and criteria. When it finds one, it uses that rule's response or selection of responses. A response can be a soundscript, a sentence, or even an instanced choreographed scene.
Despite the advantages, most NPCs do not use this system by default. For example, npc_pigeon only uses soundscripts and does not use the Response System.
Here's a quick list of places where the Response System is used:
- In Half-Life 2, most player companion NPCs (citizens, Alyx, etc.) use the Response System for NPC speech.
- In Team Fortress 2, all classes use the Response System for voice commands and taunt responses.
- In the Left 4 Dead series, all of the survivors use the Response System to respond to the players' actions and events in the world.
- In Portal 2, Atlas and P-body (the robots in co-op mode) both use the Response System for taunt gestures.
- In Counter-Strike: Global Offensive, all player characters use the Response System and keep track of which responses have already been used.
- In Half-Life: Alyx, see HLALYX:Response_rules.txt.
In code, the Response System is used by combining CBaseFlex with the CAI_ExpresserHost<> template class, which is automatically implemented on CAI_BaseActor. All NPCs and players already derive from CBaseFlex, so you'd usually just want to use CAI_ExpresserHost<>, but the Response System can also be used in a more limited way on any entity via the DispatchResponse() function, which is how env_speaker works. GetResponseSystem() can be used to make concepts search a specific response system tree and must be overridden if you are planning on using DispatchResponse() without CAI_ExpresserHost<>.
Purpose
The Response System was created so NPCs, players, etc. use a unified and robust system for managing speech. It not only allows developers and writers to create complex dialogue trees, but it also allows modders to easily modify or add new speech to the game without having to write code or modify a BSP file, while still having complete control over how and when speech will be used.
This system is best used for speech that comes out of the AI system since those lines will be spoken many times throughout a game. (e.g. Citizens say something when they reload their weapon. A lot of citizens will reload their weapons during the course of the game.)
Here's an example of a complex dialogue tree from an NPC utilizing the Response System:
- I just killed an enemy.
  - Did I use a shotgun?
    - "Eat shotgun!"
  - Was the enemy really close?
    - "You got too close!"
    - "Thanks for getting close!"
  - Was the enemy a npc_headcrab?
    - "You're not getting MY head!"
    - "I hate headcrabs!"
You could also use combinations of criteria.
- Did I use a shotgun against a npc_headcrab really close to me?
  - "You got too close to my shotgun, headcrab!"
You can also control individual lines so they aren't repeated for a set amount of time, or never repeat at all.
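To make the tree concrete, here is a minimal sketch of how its shotgun-versus-headcrab branch could be written as script rules. The concept name, soundscript, and rule/criterion names are hypothetical; the criteria keys (weapon, enemy, distancetoenemy) are the standard ones shown later in this article.

criterion "ConceptKilledEnemy" "concept" "TLK_ENEMY_DEAD" required
criterion "UsedShotgun" "weapon" "weapon_shotgun" required
criterion "EnemyWasHeadcrab" "enemy" "npc_headcrab" required
criterion "EnemyWasClose" "distancetoenemy" "<128" required

rule KilledHeadcrabWithShotgunUpClose
{
    criteria    ConceptKilledEnemy UsedShotgun EnemyWasHeadcrab EnemyWasClose
    response    KilledHeadcrabWithShotgunUpClose
}

response KilledHeadcrabWithShotgunUpClose
{
    speak "MyNPC.TooCloseToMyShotgun"    // "You got too close to my shotgun, headcrab!"
}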
Structure
The Response System is made up of four core pieces: Concepts, Criteria, Rules, and Response Groups. The way these are used is as follows:
- An NPC requests a line of speech for a speech concept.
  - For example, let's say our NPC requests a line of speech for the TLK_SHOT concept. An NPC speaks this concept whenever they're shot by an enemy.
- The NPC gathers a bunch of criteria reflecting the NPC's current state and other relevant data about the state of the world. Many concepts also have modifiers, which are criteria unique to specific concepts that usually reflect the event itself.
  - When the NPC requests a line for the TLK_SHOT concept, the game assembles the NPC's current health, etc. as part of the default criteria set. Then, it appends the concept's modifiers, which may include the type of enemy that shot them, the amount of damage they took, etc.
- The concept and criteria are passed into the NPC's response system, which is normally a single global instance shared by all NPCs.
- The Response System searches through its big list of rules.
- Each rule contains a list of criteria which is tested against the criteria assembled by the NPC. Note that the concept is treated as a high-priority criterion in this phase.
- Each rule gets a score based on how many of its criteria are true. If at least one criterion is marked as required and isn't satisfied, the rule will never be chosen. Most criteria are set to be required, but criteria that aren't required simply boost the rule's score when they match.
  - In our TLK_SHOT example, there might be multiple TLK_SHOT rules which contain different lines. One rule might contain criteria that test to see if the NPC is very close to the enemy who shot them. Another rule might test to see if it was a specific type of enemy that shot them (such as a Combine Soldier). Another rule might test to see if the NPC has <25% of its health left after the shot.
- The Response System scores all of the rules in its list and chooses the one that scores highest; that rule specifies a response group.
- A response group is simply a list of possible responses, each of which might be a line of speech and/or animation. One response is selected based on the settings of the response group and the individual responses. When a valid response is chosen, the NPC plays the response.
  - In our TLK_SHOT example, let's say the <25% health rule was chosen. This rule would have a response group which might contain a list of lines like "One more shot like that, and I'm done for!", "Oh man, I'm in trouble!", or "I need medical attention over here!"
  - Another rule for TLK_SHOT could check if the enemy shooting them is a Combine soldier and point to a response group with lines like "That Combine soldier's got me pinned down!" or "Help me with this soldier!". Another rule could check if the enemy is a Combine gunship and point to a group with "That gunship is kicking my butt!" and "Someone help me take down that gunship before it kills me!"
- If no rule matches the given criteria (or the chosen response group doesn't repeat itself and has been exhausted), the NPC doesn't say anything.
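Putting part of that walkthrough into script form, two competing TLK_SHOT rules might look roughly like the sketch below (following the example above rather than any shipped script; the response group names are placeholders). Because the low-health rule matches one more criterion, it scores higher and wins whenever both rules apply.

criterion "ConceptShot" "concept" "TLK_SHOT" required
criterion "ShotLowHealth" "healthfrac" "<0.25" required

// Generic fallback: matches any TLK_SHOT request (score 1).
rule GenericShotSpeech
{
    criteria    ConceptShot
    response    GenericShotLines
}

// Matches an extra criterion (score 2), so it is preferred when health is low.
rule LowHealthShotSpeech
{
    criteria    ConceptShot ShotLowHealth
    response    LowHealthShotLines
}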
Concepts
A concept is a string that represents the high-level reason for the character's speech attempt. There is a set of concepts defined inside the code which will be called automatically, but a concept is technically just a special, high-level criterion that characters keep track of. Concepts aren't tied to anything in particular, and you can freely create your own in the response files. You can invoke concepts in code or with an actor's DispatchResponse or SpeakResponseConcept input.
Here's a list of some of the predefined NPC concepts used in Half-Life 2:
- TLK_HELLO : When I first meet the player.
- TLK_IDLE : When I have been idle for a while.
- TLK_USE : When the player tries to +USE me.
- TLK_PLPUSH : When the player pushes me away.
- TLK_STARE : When the player stares at me for a while.
- TLK_DANGER : When I sense something nearby that's dangerous (e.g. a grenade).
- TLK_WOUND : When I've taken damage.
- TLK_HIDEANDRELOAD : When I decide to hide and reload my gun.
- TLK_PLYR_PHYSATK : When I've been hit by an object thrown by the player.
Not all NPCs speak all concepts, and not all NPCs speak concepts under the same circumstances. See list of response concepts for a full list.
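Because the concept is just another criterion, response scripts typically hook concepts up with criterion definitions along these lines (a representative sketch in the style of the HL2 talker scripts, not an exact excerpt):

criterion "ConceptTalkHello" "concept" "TLK_HELLO" required
criterion "ConceptTalkIdle" "concept" "TLK_IDLE" required
criterion "ConceptTalkStare" "concept" "TLK_STARE" required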
Criteria Set
A criteria set is a collection of conditions that contain data related to the speaker's current state and the circumstances of the concept whenever a speech attempt is made. It can be interpreted as a set of KeyValues. Here is an example of a criteria set created by an npc_alyx who's trying to speak because she just killed her current enemy:
- concept = TLK_ENEMY_DEAD : The concept the NPC is trying to speak.
- map = d3_c17_07 : The name of the current map.
- classname = npc_alyx : The classname of the speaking NPC.
- name = alyx : The targetname of the speaking NPC.
- health = 75 : The health of the speaking NPC.
- healthfrac = 0.9375 : The health of the speaking NPC, as a fraction of the NPC's max. (npc_alyx's max health is 80 by default.)
- skill.cfg = 1 : The current skill level.
- timesinceseenplayer = 0.090000 : The amount of time since the speaking NPC has seen the player.
- distancetoenemy = 312.639679 : The distance from the speaking NPC to its current enemy.
- activity = ACT_RUN : The animation activity the speaking NPC is running.
- npcstate = [NPCState::Combat] : The AI state of the speaking NPC.
- enemy = npc_combine_s : The classname of the speaking NPC's current enemy.
- speed = 79.235 : The movement speed of the speaking NPC.
- weapon = weapon_alyxgun : The current weapon being held by the speaking NPC.
- distancetoplayer = 211.240692 : The distance from the speaking NPC to the player.
- seeplayer = 1 : Whether or not the speaking NPC can see the player.
- seenbyplayer = 0 : Whether or not the speaking NPC is within the player's view.
- readiness = agitated : The readiness level of the speaking NPC.
- playerhealth = 100 : The player's current health.
- playerhealthfrac = 1.000 : The player's current health, as a fraction of the player's max.
- playerweapon = weapon_shotgun : The current weapon being held by the player.
- playeractivity = ACT_WALK : The animation activity the player is running.
- playerspeed = 0.000 : The movement speed of the player.
This concept does not have any modifiers by default. All criteria in the above list are general and gathered for each concept.
Criteria such as the ones in the list above can be checked by a rule's list of criteria, and used to make decisions about which response group to use for the desired concept. For instance:
- The enemy criterion could be used to pick the right response to the TLK_ENEMY_DEAD concept. Instead of making a general statement, Alyx could say "I took care of that soldier!" or "I took care of that headcrab!".
- The healthfrac field could be used to choose a "Phew, that was close!" line if her health was <20% when she finished off her enemy.
- The distancetoenemy field could be used to choose different lines for when she killed her enemy at long or short range, e.g. "Those guys are scary when they get that close!" or "It's not easy hitting 'em at that range."
Even though the criteria listed above are general and not concept-specific, the criteria will always vary under different circumstances and might not always be available (e.g. NPCs that aren't in combat won't have enemy or distancetoenemy criteria). Additionally, mapmakers can append extra criteria to specific NPCs, or to all NPCs in the game. See Response contexts for more info.
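For example, a rule implementing the first two ideas above for Alyx might be sketched as follows (the rule, criterion, and response group names are hypothetical; the criteria keys come from the set shown above):

criterion "ConceptEnemyDead" "concept" "TLK_ENEMY_DEAD" required
criterion "KilledASoldier" "enemy" "npc_combine_s" required
criterion "BarelySurvived" "healthfrac" "<0.2" required

rule AlyxKilledSoldierBarely
{
    criteria    ConceptEnemyDead KilledASoldier BarelySurvived
    response    AlyxKilledSoldierBarelyLines    // e.g. "Phew! I took care of that soldier!"
}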
Rule Criteria
Rules have a list of criteria that are tested against the character's criteria set. When a rule is scored, each criterion is checked against the given data, and the rule receives points for criteria that successfully match. The amount of points a criterion earns for the rule is determined by the criterion's weight.
Criteria are defined inside the script files (see below). The following format is used:
criterion <criterion name> <key to check> <desired value> <optional: weight X> <optional: required>
The parameters are as follows:
- criterion name : The name of the criterion. Must not match an existing criterion.
- key to check : The key within the character's criteria set that this criterion will check.
- desired value : The desired value of the key within the criteria set. This can take multiple forms:
  - Numeric values: "0", "1", or "100".
  - Inverse numeric values: "!=0" matches if the value is not equal to 0.
  - String values: "npc_alyx", "weapon_alyxgun", or "npc_combine_s".
  - Enumerated values: "[NPCState::Combat]".
  - Ranges:
    - ">0" : Matches if the value is greater than 0.
    - "<=0.5" : Matches if the value is less than or equal to 0.5.
    - ">10,<=50" : Matches if the value is greater than 10 and less than or equal to 50.
    - ">0,<[NPCState::Alert]" : Matches if the value is greater than 0 and less than the enumerated value of NPCState::Alert.
  - Note: Wildcards are not supported by default.
- weight X : An optional parameter, where X is the amount of points this criterion is worth if it matches. If unspecified, criteria are worth 1 point by default.
- required : An optional parameter that states that this criterion is required for rules containing it to be used at all. If a required criterion does not match successfully, rules containing it score 0 and are immediately skipped. Most criteria use this parameter.
Some examples from Half-Life 2:
- This defines a criterion named PlayerNear, which checks to make sure the player is within 500 units of the speaking NPC.
criterion "PlayerNear" "distancetoplayer" "<500" required
- This defines a criterion named IsCitizen, which checks to make sure the speaking NPC is a npc_citizen.
criterion "IsCitizen" "classname" "npc_citizen" "required"
- This defines a criterion named IsMap_d3_c17_12, which checks to make sure the game is currently on d3_c17_12.bsp. Useful for making all the citizens in one map say different lines than other maps.
criterion "IsMap_d3_c17_12" "map" "d3_c17_12" "required"
- This defines a criterion named IsBob, which checks to make sure the speaking NPC has a targetname of "bob". This is a unique citizen in the game, and this criterion makes it easy to have him say unique lines.
criterion "IsBob" "targetname" "bob" required
Rules
A rule contains a list of criteria and at least one response group. The rule receives points for each criterion that successfully matches the speaker's criteria set. The highest-scoring rule directs to one of its response groups, which is used to determine the exact speech the NPC will use. Rules are defined inside the script files (see below). The following format is used:
rule <rule name>
{
    criteria <criterion name 1> [optional: <criterion name 2> <criterion name 3> etc.]
    response <response group name> [optional: <response group name 2> etc.]
    [optional: matchonce]
    [optional: applyContext <data>]
}
The parameters are as follows:
- rule name : The name of the rule. Must not match an existing rule.
- criteria : The list of criteria the rule should score with.
- response : The list of response groups that should be chosen if this rule scores the highest.
- matchonce : An optional parameter which, if specified, causes this rule to be deactivated after it has been chosen once.
- applyContext : An optional parameter which applies a response context.
For example, the following text defines a rule called CitizenTalkStare. ConceptTalkStare is a criterion that checks to make sure the concept the speaking NPC wants to speak is "TLK_STARE". IsCitizen is a criterion that checks to make sure the speaking NPC is a citizen. NPCIdle is a criterion that checks to make sure the NPC's state is "NPCState::Idle". If this rule scores highest, the response group that will be used is CitizenTalkStare.
rule CitizenTalkStare
{
    criteria    ConceptTalkStare IsCitizen NPCIdle
    response    CitizenTalkStare
}
Note that the rule name and the response group name can be identical, because rule names need only be unique amongst rules, and response group names need only be unique amongst groups.
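For instance, a one-shot rule that also stamps a context onto the speaker might be sketched like this. The names are made up, IsBob is the criterion defined earlier, and the applyContext value follows the common "key:value" convention; check your game's scripts for the exact context format it expects.

rule BobGreetsPlayerOnce
{
    criteria    ConceptTalkHello IsBob
    matchonce                           // never fires again after being chosen once
    applyContext "GreetedPlayer:1"      // assumed key:value format
    response    BobGreetsPlayerLines
}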
Response Groups
A response group contains a set of possible responses, along with some optional data that defines how the responses should be used. When a response group is chosen by a rule, a response is chosen from the list and given back to the speaker to use. Response groups are defined inside the script files (see below). The following format is used:
response <response group name>
{
    [optional: permitrepeats]
    [optional: sequential]
    [optional: norepeat]
    <response type> <response> <optional: ...>
    <response type> <response> <optional: ...>
    <response type> <response> <optional: ...>
}
The response group parameters are as follows:
- permitrepeats : If specified, responses in this group are allowed to repeat. If unspecified, the default behavior is to use all responses in the list before repeating any.
- sequential : If specified, responses will be used in the order they're listed in the group. If unspecified, the default behavior is to randomly choose responses from the list.
- norepeat : If specified, once all responses in the list have been played, the response group will be disabled. Any rules that choose this response group will return no response to the speaker.
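For instance, a group that picks lines at random but shuts off once every line has been used might be sketched like this (the speak response type is covered in the next section, and the soundscript names are made up):

response MyNPCLowAmmoLines
{
    norepeat
    speak "MyNPC.LowAmmo01"
    speak "MyNPC.LowAmmo02"
    speak "MyNPC.LowAmmo03"
}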
Responses
A response is the actual piece of content selected by the system and used by the speaker. A response group can list as many responses as desired, with each response being one of the following types (a combined example follows the list):
- speak : The response is a soundscript or raw audio file.
- sentence : The response is a sentence name from sentences.txt.
- scene : The response is a .vcd file. See Choreography Implementation for more information.
- response : The response is a reference to another response group which should be selected instead.
- print : The response is some text that should be printed at the speaker's position in developer 1 (used for placeholder responses).
- entityio (not available in all games) : The response is an I/O event which fires on a specific entity with the speaker as the activator. The format is entityio "<name> <input> <param>". This is different from followup responses, which are covered in more detail further below.
- vscript (not available in all games) : The response is a line of VScript code that runs in the scope of the speaker.
- vscript_file (not available in all games) : The response is a path to a VScript file that runs in the scope of the speaker.
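A group mixing several of these types might look like the following sketch; the scene path, soundscript, and entity name are placeholders:

response MyNPCSpottedDanger
{
    scene "scenes/mynpc/spotted_danger.vcd"             // a choreographed scene
    speak "MyNPC.SpottedDanger"                         // a soundscript (or raw audio file)
    print "Placeholder: danger-spotted line goes here"  // only shown in developer 1
    entityio "danger_counter Add 1"                     // only in games that support entityio
}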
Post-response delay parameters
Each response supports a variety of optional parameters.
- nodelay : After the response has finished, the speaker is allowed to speak again immediately.
- defaultdelay : After the response has finished, the speaker won't be allowed to speak for a random amount of time between 2.8 and 3.2 seconds.
- delay X : After the response has finished, the speaker won't be allowed to speak for X seconds. X can also be a range, e.g. "5.5,7.5"
- weapondelay X : When the response starts, the speaker will not fire their weapon for X seconds. Only available on HL2 NPC allies by default.
- speakonce : Prevents the response from being used more than once.
- odds X : If specified, then when this response is chosen, there is a chance the speaker will say nothing instead of saying the response. X is a 0-100 percent chance of canceling the response; an odds of 25 gives a 25% chance of saying nothing.
- respeakdelay X : If specified, the response may only be used if the concept hasn't been spoken within the last X seconds. X can also be a range, e.g. "5.5,7.5"
- soundlevel : If specified, the given soundlevel is used for the response instead of the default SNDLVL_TALKING.
- displayfirst : If specified, this response should be used first (ignores the weight parameter).
- displaylast : If specified, this response should be used last (ignores the weight parameter).
- weight X : If specified, used to weight the selection of the responses in the list. By default, all responses have a weight of 1. Note that responses won't be repeated until all of the other responses have been chosen, so once all of the high-weight responses have been used, the system will only pick from the lower-weight ones. This can be counteracted with permitrepeats.
- noscene : Stops the Response System from creating an auto-generated scene for a speak response.
- stop_on_nonidle : When the response is spoken, stop the scene when the NPC enters a non-idle state. In Source 2013, only works in HL2 episodic mods on scene responses.
- predelay : When the response is chosen, it won't actually be spoken until X seconds have passed. Only works on scene responses. X can also be a range, e.g. "5.5,7.5"
For example, the following response group is used by citizens to respond to the TLK_STARE concept. Citizens will use the responses in the list in the order they're listed (due to the sequential parameter). Each response, when chosen, stops the NPC from talking for a random amount of time between 10 & 20 seconds.
response "CitizenTalkStare" { sequential scene "scenes/npc/$gender01/doingsomething.vcd" delay "10,20" scene "scenes/npc/$gender01/getgoingsoon.vcd" delay "10,20" scene "scenes/npc/$gender01/waitingsomebody.vcd" delay "10,20" }
Script files
The scripts/talker/response_rules.txt file is the base script file that contains all the criteria, rules, and response groups used by the Response System. The file can also include other files using the #include keyword, which allows you to cleanly divide the rules up according to NPC, map, and so on.
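For example, a manifest that splits rules across several files might look roughly like this (the file names, and whether the paths are given relative to scripts/ or scripts/talker/, vary by game; treat this as a sketch):

// scripts/talker/response_rules.txt
#include "talker/npc_citizen.txt"
#include "talker/npc_alyx.txt"
#include "talker/my_custom_npc.txt"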
Note that some entities, like env_speaker, specify their own script files that contain a subset of the criteria, rules, and response groups for the entity to use. See scripts/talker/terminal_pa.txt for an example used by the terminal announcement at the train station.
Followup responses
Left 4 Dead introduced "followups", events which occur after the response. They are specified as response parameters, similar to odds, predelay, etc.
- fire : Fires an input through the I/O system with the speaker as the activator and caller. The format is fire <target> <input> <delay>. Does not support parameters.
- then : Causes another response to be dispatched on an entity. The format is then <target> <concept> <response contexts> <delay>. Used for making characters dynamically respond to each other.
A few unique targetnames can be used:
- self : The entity speaking the response.
- subject : Uses the entity name found in the speaker's "Subject" context, if it exists. The game sets this for info_remarkable responses.
- from : Uses the entity name found in the speaker's "From" context, if it exists. The game sets this to the name of the previous followup respondent, allowing followups to bounce back and forth.
- any : Dispatched to any valid respondent within the range stored in rr_followup_maxdist (1800 by default).
- all : Dispatched to all valid respondents within the range stored in rr_followup_maxdist (1800 by default).
Example from Left 4 Dead 2 in coach.txt:
Response _c1m4startelevator4bCoach
{
    scene "scenes/Coach/WorldC1M4B01.vcd" then mechanic _c1m4startelevator5a foo:0 -2.313 //Son, you got a DEAL.
    scene "scenes/Coach/WorldC1M4B02.vcd" then mechanic _c1m4startelevator5a foo:0 -5.790 //Ha HA! All the way to New Orleans! Baby, that sounds like a PLAN.
    scene "scenes/Coach/WorldC1M4B05.vcd" then mechanic _c1m4startelevator5b foo:0 -6.334 //Normally I wouldn't do this. But in these circumstances, I think Mr. Gibbs, Jr. ain't gonna mind.
    scene "scenes/Coach/WorldC1M4B10.vcd" then mechanic _c1m4startelevator5b foo:0 -2.685 //Forgive us Jimmy, but we need your car.
}
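A smaller hypothetical sketch using the any and from targets, so that whichever nearby character is free answers and the reply then bounces back to the original speaker (all concepts, scenes, and contexts here are made up, and each concept would still need its own matching rule):

response PlayerSpotsSupplies
{
    scene "scenes/player/spots_supplies.vcd" then any TLK_ANSWER_SUPPLIES foo:0 0.1
}

response AnswerSupplies
{
    scene "scenes/buddy/on_my_way.vcd" then from TLK_SUPPLIES_THANKS foo:0 0.1
}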
Advanced response rules usage
The Response System can be used in many advanced and specialized ways. Here are a few tips and tricks for advanced usage of the system:
- Firing custom responses with DispatchResponse
  - Absolutely anything can be passed to an NPC as a response concept, so long as it can be found somewhere in the script files. You aren't limited to the AI's predefined TLK_* concepts at all.
  - For example, you might want NPCs from an onlooking group to speak congratulatory response concepts when a player successfully solves part of a puzzle (see the sketch below).
  - Player allies in HL2 also have a SpeakResponseConcept input with more advanced handling and conditions.
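As a sketch, that puzzle example could be wired up with a custom concept like the one below (all names, the concept, and the soundscripts are hypothetical; IsCitizen is the criterion defined earlier). The map would then fire the NPCs' SpeakResponseConcept (or DispatchResponse) input with TLK_PUZZLE_SOLVED as the parameter when the puzzle step is completed.

criterion "ConceptPuzzleSolved" "concept" "TLK_PUZZLE_SOLVED" required   // custom concept, never referenced in code

rule OnlookerPuzzleSolved
{
    criteria    ConceptPuzzleSolved IsCitizen
    response    OnlookerPuzzleSolvedLines
}

response OnlookerPuzzleSolvedLines
{
    speak "Onlooker.NiceWork01"
    speak "Onlooker.NiceWork02"
}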
See the response scripts for Alyx in EP1/EP2 or the response scripts for the L4D survivors for more examples of advanced response system usage.
Debugging
Notes
- To be able to use the Response Rules system, an NPC must be derived from the CAI_BaseActor class.
- See the scripts/talker/npc_*.txt files for examples of specific NPC response rules.
- The scripts/talker/response_rules.txt file is the manifest for the response rules system. If you add new response rule script files for your new NPC, make sure you #include them at the end of the scripts/talker/response_rules.txt file.
See also
External links
- Elan Ruskin's "Rule Databases for Contextual Dialog and Game Logic" presentation at GDC 2012, which describes the Response System and its evolution. (Slides: [1])
- Two Bots One Wrench: Environmentally Responsive Speech Video, Demo Video #1
- Two Bots One Wrench: Memory And Context, Demo Video #2
- Two Bots One Wrench: Conversation, Demo Video #3