Intro to Game Audio Scripting, by Kurt Larson


Herein I will say a few things about the more technical side of performing game-audio work, focusing especially on the concept of "scripting". I will not attempt to teach anyone "how to script" in this article, since that is a large topic, and most of the work is proprietary to the individual game teams. Instead I will try to describe the types of technical work you will encounter, along with some recommendations on how to prepare yourself to perform them.

If you work in-house on game audio, you won't just be doing audio work; you'll mostly be doing technical work. For example, my text editor is one of my main tools. When I am actually making or implementing (plugging in, wiring up, scripting, etc.) audio assets, I spend most of my cycles using three tools: Ultra-Edit, FMOD Designer, and Sound Forge. (Ultra-Edit is a great text editor for non-programmers. It can be used by programmers, but I think it is perfect for the technically minded non-programmer.) My point is that the text editor is at least as important as, and probably MORE important than, the other two more audio-centric tools. My text editor is where I create the files which the game actually uses to know HOW to play the audio assets I've created. Something I learned in my very first audio contract job, more than 20 years ago, is:



No matter how great the renders from your awesome ProTools project are, if the poor guy who has to wire them into the game doesn't know what he's doing, they will end up just being annoying noise and will get muted by the player. If you're lucky, you'll actually be working in-house on a game team and will be doing this implementation yourself. When you do, you will be doing a great deal of what you could call "scripting". In fact, not counting time spent in meetings and managing up, creating audio assets may be only about 20% of your time. The other 80% is implementation (technical) work.

What will you actually, literally be doing, though? Probably editing some .XML files, possibly working within your company's proprietary scripting system, and maybe even doing some work with a well-known scripting language like Python or ActionScript. What you will almost certainly NOT be doing (hopefully) is writing and compiling the actual game code. Here I provide a few examples, not as an attempt to teach anyone how to write audio scripts, but rather to show what those assets tend to look like:

download Music Playback Script

1. Music playback script; proprietary music-scripting language. This is a script to control music interactivity for the PlayStation 2 game "Whiplash". The script is fed to a proprietary music-interactivity system used in that game. It receives variable input from the game program and responds by changing the music according to the amount of action in which the player is engaged. The syntax and rules had to be explained directly by the programmer who created the scripting environment. The important thing to note here is how things are organized. The top section sets out some basic conditions, and then there is the "MAIN LOOP" section which drives the music. Below that are five different "States", which act to change the tempo and to bring different tracks in and out.
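The actual Whiplash syntax is proprietary, but the structure described above (a main loop that reads a game variable and switches among a handful of music states) can be sketched in ordinary Python. Every name here (`MUSIC_STATES`, `action_level`, the tempo and track values) is illustrative, not the real scripting language:

```python
# A minimal sketch of an interactive-music state machine, assuming a game
# that reports an "action level" between 0.0 and 1.0 each frame. Names and
# values are hypothetical, not the actual Whiplash script syntax.

MUSIC_STATES = {
    "calm":   {"tempo": 90,  "tracks": ["pads", "bass"]},
    "tense":  {"tempo": 110, "tracks": ["pads", "bass", "drums"]},
    "combat": {"tempo": 140, "tracks": ["bass", "drums", "lead"]},
}

def pick_state(action_level):
    """Map the game's action variable to one of the music states."""
    if action_level < 0.3:
        return "calm"
    elif action_level < 0.7:
        return "tense"
    return "combat"

def main_loop_step(action_level, current_state):
    """One pass of the 'MAIN LOOP': switch states only when the input changes.

    In the real system, a state change would issue commands to the music
    engine (set the tempo, crossfade tracks in and out); here we just
    return the new state name.
    """
    target = pick_state(action_level)
    if target != current_state:
        state = MUSIC_STATES[target]  # tempo/tracks the engine would apply
        return target
    return current_state
```

For example, `main_loop_step(0.9, "calm")` would move the music into the "combat" state, while a low action level leaves the current state alone.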

download SFX playback script

2. SFX playback script; proprietary game-design-scripting language. Using more than just "Start" and "Stop" commands, these scripts exercise fine control over the delivery of game sound and its sensitivity to the player's actions and contexts. In these examples, I am showing mostly just the audio-related lines which have been added to larger, more complex scripts created by the game designers. Frequently, the audio designer will edit existing scripts rather than creating entirely new ones from scratch.

download Simple XML file

3. Simple XML file. This is an example of what we call a "Virtual Sound Bank" file at my current company, Nihilistic Software. It serves mostly to simply connect game code to audio assets. In this example, we have a series of simple events and one more complex one. The one-line events simply make an association between a coded sound event (the value of the "event ID" tag) and an FMOD sound event (the value of the "event" tag). Below these simple events, however, we see an event which plays a sound when it receives one code event, and stops that sound when it receives another.
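A file like that might look something like the following. This is a sketch under assumed tag and attribute names, not Nihilistic's actual schema; the real file's structure would be whatever the audio programmer defined:

```xml
<!-- Hypothetical "Virtual Sound Bank" layout; all names are illustrative. -->
<soundbank>
  <!-- Simple one-line events: a coded sound event mapped to an FMOD event. -->
  <sound eventID="FOOTSTEP_STONE" event="player/footsteps/stone" />
  <sound eventID="UI_CONFIRM"     event="ui/menu/confirm" />

  <!-- A more complex event: one code event starts a looping sound,
       and a second code event stops it. -->
  <sound eventID="ENGINE_START" event="vehicle/engine_loop">
    <stopOn eventID="ENGINE_STOP" />
  </sound>
</soundbank>
```

The value of a file like this is that programmers only ever fire abstract event IDs, while the audio designer remains free to repoint those IDs at different FMOD events without touching game code.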

The programmer assigned to doing the audio support work created these capabilities for me at my request. As the audio designer, you will be expected to figure out what capabilities are needed, and you will have to then rigorously defend the actual need for them. The programming team will ALWAYS have more important things on which to work, so you'll be carefully negotiating these features as you go. As such, it is EXTREMELY helpful to be able to understand what programmers do and to more-or-less speak their language, even if not fluently. To help with this, I strongly recommend taking a few classes, like a propositional calculus class (logic), and a basic introductory 'C' programming class. Just understanding the fundamental concepts behind programming can be of immense help to you as you work with your programmers, not to mention how much it can help you when you need to work with the scripting resources they give you.


To sum up: Be ready to get techy. Very few people who are in a position to hire audio people know, understand, or care about ProTools, Waves, compression, e-Magic, the Fletcher-Munson curve, or anything like that. What they care about is hiring someone who is not afraid to work out technical details of asset implementation with the programmers using a text editor. Good luck!


Kurt Larson is an audio professional with 29 years of experience in technical audio work, including 17 years of experience in the game industry. His career has spanned positions as a Composer, SFX designer, Audio Implementer/Scripter, and Audio Director, with a strong emphasis on technical development and leadership. Kurt is currently the audio manager for Nihilistic Software. His portfolio can be viewed here.