Closed Captions

[Image: Closed captions in Half-Life 2. Dr. Breen's pre-recorded speech is made less distinct, as it is not as important.]

Closed captions, or subtitles, are text descriptions that accompany sound and dialogue. They cue players who can't hear what's going on to what people are saying, and to a certain extent what's happening around them. With a bit of ingenuity they can also be used to display dialogue that has not been recorded yet.

Subtitles or full closed captions can be enabled in the Source engine under Options > Audio.

Editing closed captions

Closed captions are stored in closecaption_%language%.dat (e.g. closecaption_english.dat) in <game dir>/resource/. A .dat file is a binary file compiled from a corresponding .txt file with the captioncompiler tool.

Although the engine does not read from the text file, Valve provides its originals for use by modders. If you don't already have it, you can extract it from the GCF of the game you are modding (or, failing that, the relevant Source engine GCF, e.g. C:\Program Files\Steam\steamapps\source engine.gcf, in the folder root\hl2\resource).

The format of the text file is:

lang
{ 
	Language "English" //(or "French", etc)
	Tokens 
	{ 
		// Captions defined here.
		nameofsound	"This is the caption."
 
		barn.chatter	"We're picking up radio chatter. They're looking for your car."
		// etc...
	}
}
  • The text file must be saved in the UCS-2 Little Endian format.
  • The name of the sound must be something defined in your soundscripts, not a raw filename (see the sketch after this list).
  • Remember to wrap "quote marks" around any block of data containing whitespace.
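
For context, a caption token corresponds to a soundscript entry of the same name. A minimal sketch of such an entry is shown below; the channel, soundlevel, and wave path are placeholders rather than values from an actual game file:

"barn.chatter"
{
	"channel"		"CHAN_VOICE"
	"volume"		"VOL_NORM"
	"soundlevel"	"SNDLVL_TALKING"
	"wave"			"vo/canals/barn_chatter.wav"
}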

Caption codes

<sfx>
Marks a line as a sound effect that will only be displayed with full closed captioning. If the user has cc_subtitles set to 1 (subtitles only), these lines will not be displayed.
<clr:255,255,255>
Sets the color of the caption using RGB color; 0 is no color, 255 is full color. For example, <clr:255,255,255> would be white.
<b>
Bolds all text following the tag.
<i>
Italicizes all text following the tag.
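
These codes go inside the caption text itself. Here are a couple of hypothetical entries showing them in use, as they would appear inside the Tokens block (the token names and color values are purely illustrative):

	barn.radiochatter	"<sfx><clr:255,255,255>[Radio chatter]"
	breencast.welcome	"<clr:200,200,200><i>Welcome. Welcome to City 17."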

Compiling

The captions must be compiled with captioncompiler.exe. Make sure the SDK launcher is set to the mod you're working on, then simply drag your caption file onto the executable. The caption file must be in the mod's resource directory for the .dat file to be produced there, e.g. C:\Program Files\Steam\steamapps\SourceMods\[mod name]\resource\.

This process becomes slightly easier if you create a batch file in the resource folder. Copy the following code into a text file and save it as a .bat file in that folder:

"[game/SDK root]\bin\captioncompiler.exe" %1
pause
Tip: With a batch file, you can use the -game parameter (followed by the path to the folder containing your mod's gameinfo.txt) to override the global vproject value. Place it after %1, separated by a space. This saves you from having to launch the SDK and change the current mod there.
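
For example, with the -game parameter added the batch file might look like this (both paths are placeholders for your own SDK and mod locations):

"[game/SDK root]\bin\captioncompiler.exe" %1 -game "C:\Program Files\Steam\steamapps\SourceMods\[mod name]"
pause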

Drag your caption .txt files onto the batch file to use it as you would the .exe itself.

Playing sounds in your map with closed captions

Sometimes you may want to play a sound with closed captions (or subtitles, really) when the player activates a trigger. Unfortunately, ambient_generic's "Play Everywhere" flag is broken for soundscripts, and subtitles only work with soundscripts (except for commentary nodes). Hence you'll have to resort to a bit of a hack: use a point_clientcommand to execute three commands:

  • play path/to/sound/file
  • cc_linger_time X, where X is a manually-tweaked value
  • cc_emit commentaryFileName, where commentaryFileName is a key from the closed captions text file.

Make sure both closecaption and cc_subtitles are set to 1.
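
As a hypothetical example, the point_clientcommand's Command input would be fired three times with values along these lines (the sound path, linger time, and token name are placeholders for your own):

play vo/mymod/radio_warning.wav
cc_linger_time 4
cc_emit mymod.radio_warning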

Captions without sounds

Recording dialogue is a very resource-expensive process. If, once dialogue has been recorded, you decide you want to change something, making those changes can prove very costly, so final dialogue recording is often best done as one of the final stages in a project.

However, while creating choreographed scenes, it is highly desirable to have the dialogue available as early as possible. One way around this conflict is for the developer to record their own placeholder audio, but even this can take up significant amounts of time. A cheaper way of creating placeholder dialogue is to use closed captions without any attached dialogue.

There are three steps involved in accomplishing this:

1. Creating the captions
Sound tokens (e.g. barn.chatter above) can have closed captions attached without corresponding entries in the sound manifest. To create new closed captions, simply add new entries to the closecaption_english.txt (or other language file), as described above.
2. Adding the captions in Faceposer
In Faceposer, dialogue-free closed captions are added in the same way normal dialogue is added (right-click on the timeline and choose WAV File...). However, because the closed caption tokens are not in the sound manifest, they will not appear on the list of available sounds. Instead, simply refer to the entries you added in step 1 and type the token name manually into the Sound textbox.
3. Setting the length of caption appearance time
Once the first two steps are complete, your scenes will display the closed captions when they are played in game (as long as closed captions are turned on in the game options, of course). However, because there is no associated sound file, the event has no length. This means that the closed captions will leave the screen almost as soon as they are displayed, leaving the player little time to read them. While this is acceptable for captions consisting of one or two words, some more work is required to keep longer captions on screen for a sufficient amount of time.

Faceposer doesn't support editing the length of WAV file events, but you can change it by hand in the .vcd file. Open the .vcd in Notepad (or any text editor) and find the speak event whose length you wish to change; searching for the name of the speak event will take you to the right place in the file.

There you will find an entry like this one:

event speak "Test"
{
	time 1.000000 -1.000000
	param "test.test"
	fixedlength
	cctype "cc_master"
	cctoken ""
}

The time line denotes the start and end time of the event. The end time is set to -1.000000 because the speak event currently has no length. Edit this number to be the time (in seconds, on the timeline) when you wish the closed caption to end and save the file.
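
For example, if you want the caption to disappear at the 4.5-second mark on the timeline (a value chosen purely for illustration), the time line becomes:

	time 1.000000 4.500000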

Obviously, the length of time a caption should appear for will differ depending on the length of its text. In general, a good first estimate is to time yourself reading the text aloud and then double that number, to account for its unfamiliarity to fresh eyes.
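
For example, a caption that takes roughly three seconds to read aloud at a comfortable pace would get around six seconds of screen time, so its speak event's end time would be set about six seconds after its start time.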

Note: You may find it useful to include a logic_auto and a point_clientcommand in your map, set to issue the console command closecaption 1 on map load. This will ensure that you don't need to remind anyone using your mod (e.g. other team members) to turn on their captions.
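
A minimal sketch of that setup, assuming the point_clientcommand is named clientcmd (the name is arbitrary), is a single output on the logic_auto:

	Output:     OnMapSpawn
	Target:     clientcmd
	Input:      Command
	Parameter:  closecaption 1
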
Warning: As a final note, this method could charitably be described as "experimental" and less charitably as a hack. As with all hacks, it may introduce unpredictable and unexpected behaviour. It is recommended that you make backups of your .vcd files before manually editing them, and that you keep regular backups afterwards, lest Faceposer suddenly undo those edits.

Captions for BINK videos

With the release of Left 4 Dead, it is now possible to put subtitles on Bink-based videos.

Notes

In Left 4 Dead 2 and later games, Valve stores subtitles in subtitles_%language% files instead of keeping them in the closecaption_%language% files alongside the "hearing impaired" captions. The probable reasoning is that "closed captions" refers specifically to text for the hearing impaired, while "subtitles" is the more widely accepted term for this purpose. Sounds can still be marked with <sfx>, however.
