Every new form of media steals ideas, techniques and formal elements from what came before. Novels stole from oral tradition, films from theater. And so video games steal from film - constantly. But every new form of media must also exploit and treasure what makes it different. Unfortunately, music in video games is often used in an exclusively filmic way. A movie may have a piece of music that plays throughout a scene, and a video game may have a piece that plays while players explore a particular location. Often it isn't taken beyond this. Far too many developers fail to realize that interactivity, the core element that makes video games what they are, can be explored in every aspect of their game's design. Music can be made to respond to interaction. And while we have music-based games like Guitar Hero and Audiosurf, it's far more instructive to discuss inventive uses of music in other genres: sidescrollers, shooters, adventure games. What happens when we link gameplay to music in these contexts? There's a lot of power in having the right track for the right moment.
The single most important use of gameplay-responsive music is encouragement. We can use a well-placed musical cue as a catalyst for improving player skill, or to encourage a particular way of playing. When we're encouraging players through the use of music (or through text or dialogue), subtlety is essential. If players can see the hand of the designer too clearly, the illusion is broken and the game world loses its magic.
In Super Mario World, when the player bounces on an enemy's head or an enemy is killed by a kicked turtle shell, a note plays. It’s a cute little high-pitched ‘donk!’, designed to help players forget that they’ve just killed a (likely) innocent and (definitely) endangered member of the Testudinidae family. If players manage to land on another enemy before touching the ground, the ‘donk!’ plays again at a higher pitch. It quickly becomes clear that these ‘donk!’s can be chained together, the pitch rising and rising as the player, through increased skill, manages to bounce on more and more enemies before touching the ground. The skill being trained is the ability to land accurately on enemies without missing, and it’s essential in some of the game’s more difficult sections, where leaps of faith must be made over bottomless pits and the only thing preventing players from falling to their deaths is the bouncy body of an enemy. The ‘donk!’ isn’t strictly music in this instance, but it’s clear that we could substitute a rising musical melody for the scale of ‘donk!’s, and it could have a similar effect.
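The chained-cue idea is simple to prototype. Here's a minimal sketch, assuming each consecutive stomp raises the cue's pitch by one semitone and the chain resets when the player touches the ground; the class and method names are illustrative, not from any real engine.

```python
def stomp_pitch_rate(chain_count: int) -> float:
    """Playback-rate multiplier for the nth consecutive stomp.
    Raising pitch one semitone multiplies the rate by 2**(1/12)."""
    return 2 ** (chain_count / 12)


class StompChain:
    """Tracks how many enemies the player has bounced on without landing."""

    def __init__(self) -> None:
        self.count = 0

    def on_stomp(self) -> float:
        """Call when the player bounces on an enemy; returns the pitch
        multiplier to hand to your audio engine for the 'donk!' sample."""
        rate = stomp_pitch_rate(self.count)
        self.count += 1
        return rate

    def on_land(self) -> None:
        """Touching the ground resets the chain (and the pitch)."""
        self.count = 0
```

The first stomp plays the sample at its recorded pitch (multiplier 1.0); each subsequent airborne stomp nudges it up a semitone, so the player hears their own skill climbing a scale.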
Super Mario World itself ran some real experiments with gameplay-responsive music. Compared to the dynamic music of recent games (or even the music of 1991’s Monkey Island 2, which had near-seamless song transitions powered by LucasArts’ ‘iMUSE’), the dynamic switching of tracks in Super Mario World isn’t all that technically impressive. What’s impressive is the effect that this switching has on the player’s experience of the gameplay. There are several extended pieces of music within the game, each assigned to one or more levels and each largely without percussion. It sounds fine, and players tend not to notice that anything is missing. As soon as they mount Yoshi, however, the percussion track kicks in.
This is a great way to influence players, to let them know that there’s a shift happening. In this case, the fundamental gameplay is changing. The game is vastly different when the player has Yoshi - they can sustain more damage, the damage they cause by jumping is increased, and they have access to all of Yoshi’s shell-based power-ups. As the gameplay changes, there’s a minor but noticeable change in the accompanying soundtrack. The music sounds richer. The percussion makes it a little more powerful. More importantly, it sounds complete. The implication is that riding Yoshi is experiencing the game at its fullest.
Another memorable instance of gameplay-responsive music as encouragement is in Portal 2, when the player encounters gels. In the game, the player gets access to paint-like gels that can be spread onto surfaces to change their properties. The blue repulsion gel makes a surface bouncy, while the orange propulsion gel is slippery, and enables players to run at higher speeds. In keeping with Valve’s three-step player education strategy (teach in a safe space, make it harder, then make it deadly), the player gets some time in a relatively empty room to mess around with these gameplay elements. The music in these rooms is subdued and simple. But when the player bounces on the repulsion gel, an extra track or two is added to the music, and the player suddenly senses that they're listening to a more complete soundtrack. The added tracks give the music a more hopeful, bright feel - it’s undoubtedly an encouraging change. The player is being given subtle, positive feedback about their actions. This happens throughout Portal 2, with many of the various puzzle elements having some impact on the music. It reaches fascinating levels of complexity - check out this post for more on that.
You don't need a plastic guitar or Dragonforce to play around with music in your game, to let players have an effect on the soundtrack, or to use these interactions to drive players. You don't even need to be that technically savvy. By all means, you could develop your own iMUSE or emulate Red Dead Redemption's mixing system. But you could just as easily separate one of your instrument tracks and add it back to the mix when certain conditions are met. Sometimes simple is better.
Interaction is important. This is not film. Video games aren't about telling a one-sided story - they're about an interaction between the developer and the player. The big question that needs to be asked when it comes to your game's music is this: is my music being played at the player, or with them?