Sound design in games

To achieve a unified sound design in games and a clearer understanding of how its elements fit together, game audio can be divided into several types: voice acting, atmospheres (ambient sounds), synchronous noises (foley), sound effects (SFX), and music.

There are several ways to use voice in games. Voiced dialogue serves as the main source of information for players. Announcer phrases reward the player (e.g., “Double kill”) or announce upcoming action (“Round one – Fight”). Voicing the main character’s actions, such as a cry when taking damage, a grunt when jumping, or breathing while running, strengthens the connection with the character and better conveys their state.

When working with voice acting, it is important to select actors whose timbre matches the character’s personality. An NPC’s voice should feel real to the player. If the character is a monster or an alien, the peculiarities of its pronunciation and vocal apparatus should be emphasized. To do this, sound designers use pitch shifting and layering: pitch shifting changes the pitch (and often the duration) of a sound, while layering builds a composite voice from several layers and textures. The use of effects must also be considered.
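
As an illustration, here is a minimal sketch of pitch shifting and layering using the Web Audio API; the buffer names and parameter values are hypothetical, and this is only one of many possible approaches.

```typescript
// Minimal sketch: a monster voice built from a pitched-down actor take
// plus a growl texture layered underneath.
// `voiceBuffer` and `growlBuffer` are hypothetical, pre-decoded AudioBuffers.
const ctx = new AudioContext();

function playLayeredMonsterVoice(voiceBuffer: AudioBuffer, growlBuffer: AudioBuffer): void {
  // Layer 1: the actor's take, pitched down.
  // playbackRate shifts pitch and duration together, which often suits monsters.
  const voice = ctx.createBufferSource();
  voice.buffer = voiceBuffer;
  voice.playbackRate.value = 0.7; // roughly a fifth lower (assumed value)

  // Layer 2: a low growl texture mixed in at reduced volume.
  const growl = ctx.createBufferSource();
  growl.buffer = growlBuffer;
  const growlGain = ctx.createGain();
  growlGain.gain.value = 0.4; // keep the texture underneath the voice

  voice.connect(ctx.destination);
  growl.connect(growlGain).connect(ctx.destination);

  voice.start();
  growl.start();
}
```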

Voice acting in a game is usually tied either to a dialogue tree or to a trigger.
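
A rough sketch of what these two attachment points can look like in code; all type and field names here are illustrative, not taken from any particular engine.

```typescript
// Illustrative data shapes: a voice line attached to a dialogue node
// or to a world trigger.
interface DialogueNode {
  id: string;
  voiceClip: string;         // audio asset key, e.g. "npc_greeting_01"
  text: string;
  responses: DialogueNode[]; // branches the player can choose
}

interface VoiceTrigger {
  zoneId: string;            // trigger volume in the level
  voiceClip: string;         // e.g. an announcer line like "round_one_fight"
  oncePerSession: boolean;   // announcer phrases usually fire only once
}

function onPlayerEnterZone(
  trigger: VoiceTrigger,
  play: (clip: string) => void,
  fired: Set<string>,
): void {
  if (trigger.oncePerSession && fired.has(trigger.zoneId)) return;
  fired.add(trigger.zoneId);
  play(trigger.voiceClip);
}
```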

To create atmosphere, looped audio samples are used, in which the end of the audio file transitions smoothly back to the beginning. Make sure there are no audible seams, and use sounds at least one minute long (preferably five minutes or more) to create the illusion of a continuous background. Any abrupt transition can break the player’s immersion.
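
A minimal sketch of a seamless ambience loop using the Web Audio API; the loop points here are placeholder values and would be tuned per file.

```typescript
// Sketch: a seamless ambience loop. loopStart/loopEnd let the file carry
// a lead-in and tail while only a clean inner region repeats.
const ctx = new AudioContext();

function startAmbience(buffer: AudioBuffer): AudioBufferSourceNode {
  const src = ctx.createBufferSource();
  src.buffer = buffer;
  src.loop = true;
  src.loopStart = 0.5;                 // skip any fade-in at the head (assumed)
  src.loopEnd = buffer.duration - 0.5; // avoid a click at the tail (assumed)
  src.connect(ctx.destination);
  src.start();
  return src; // keep a handle so the loop can be stopped later
}
```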

When adding background sounds to the virtual world, it is important to consider their positioning on the map. For example, in a jungle location, general sounds such as birdsong, the noise of a waterfall, and rustling foliage should always be present. Each sound-emitting object has its own sound zone, which can be divided into two levels: a rise zone and a constant-volume zone. This distance-based volume behavior is called attenuation.

When the player enters the first zone, the sound gets louder as they approach the source. This continues until the player is close enough to cross into the second zone, where the volume remains constant. Getting these zones right matters for positioning and blending, since sounds from different sources can overlap. Background sounds are placed directly in the locations, and it is important to make sure they do not bleed beyond their area of effect.
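
A minimal sketch of this two-zone attenuation model; the linear falloff curve and the example distances are assumptions for illustration (engines typically also offer logarithmic curves).

```typescript
// Sketch of two-zone attenuation: gain rises as the listener approaches
// within maxDistance, and stays at full volume inside minDistance.
function attenuationGain(distance: number, minDistance: number, maxDistance: number): number {
  if (distance <= minDistance) return 1.0; // constant-volume zone
  if (distance >= maxDistance) return 0.0; // out of earshot
  // rise zone: interpolate linearly between the two radii
  return 1.0 - (distance - minDistance) / (maxDistance - minDistance);
}

// Example: a waterfall audible from 50 m away, at full volume within 5 m.
console.log(attenuationGain(30, 5, 50).toFixed(2)); // "0.44"
```
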
When building an atmosphere that helps the player orient in the game world, the following parameters are taken into account:

  • Volume: the loudness level of the sound;
  • Positioning: where the sound is located in space;
  • Priority: how important the sound is relative to other sounds;
  • Reverb: reflections of the sound in enclosed spaces, creating an echo.

When working with sound, it is important to pay attention to the key game elements, the so-called accents, that shape the user experience. Sound mixing and prioritization are the tools for this.

For example, when an enemy appears, the sounds it makes must stand out clearly against the environment. There are several ways to achieve this:

  • Distributing sound layers across different audio channels and mixing them;
  • Prioritizing sounds based on game logic.

A third approach is a sidechain effect, which changes the volume of sounds depending on their priority: when an enemy appears and its sounds play, other loud sounds are automatically ducked.
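
A minimal sketch of this ducking behaviour using Web Audio API gain automation; the bus routing and timing values are assumptions for illustration.

```typescript
// Sketch of priority-based ducking ("sidechain" behaviour): when a
// high-priority enemy sound starts, the ambience bus ramps down,
// then recovers after the clip ends.
const ctx = new AudioContext();
const ambienceBus = ctx.createGain(); // ambience sources would connect here
ambienceBus.connect(ctx.destination);

function playWithDucking(enemyBuffer: AudioBuffer): void {
  const src = ctx.createBufferSource();
  src.buffer = enemyBuffer;
  src.connect(ctx.destination);

  const now = ctx.currentTime;
  // Duck the ambience to 30 % quickly, then restore it once the clip is over.
  ambienceBus.gain.cancelScheduledValues(now);
  ambienceBus.gain.setTargetAtTime(0.3, now, 0.05);
  ambienceBus.gain.setTargetAtTime(1.0, now + enemyBuffer.duration, 0.2);

  src.start();
}
```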

Reverb is also of great importance when positioning sounds on a map. A sound such as a finger snap will sound different in a room than in a cave because of reflections from walls and other surfaces. A reverb tool handles this task.

Reverb zones usually correspond to the rooms where the atmospheres are placed and can interact with one another. It is important to keep the sense of space consistent across all sound layers in a scene, and to use reverb as the main tool for creating the effect of sounds approaching and receding.
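
One possible sketch of zone-based reverb using a Web Audio API ConvolverNode; the impulse-response buffers and the send routing are assumed for illustration.

```typescript
// Sketch: switching reverb when the player crosses between zones,
// using one convolver whose impulse response changes per room.
// Sound sources would feed both dryBus and reverbSend.
const ctx = new AudioContext();
const dryBus = ctx.createGain();
const reverbSend = ctx.createGain();
const convolver = ctx.createConvolver();

dryBus.connect(ctx.destination);
reverbSend.connect(convolver).connect(ctx.destination);

function enterReverbZone(impulseResponse: AudioBuffer, wetLevel: number): void {
  convolver.buffer = impulseResponse; // e.g. a cave IR vs. a small-room IR
  reverbSend.gain.value = wetLevel;   // a cave would be wetter than a carpeted room
}
```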

Synchronous noise recording (foley) is usually done in a studio to achieve authenticity and high sound quality. In the early days of film, recording usable audio on set was difficult because the cameras were noisy. Jack Foley, an American sound engineer in the motion picture industry, developed a system in which sound was recorded separately from the picture on screen, but in sync with it.

Synchronous noises are still created in specialized studios today. It is not necessary to use the real material to achieve the desired sound: for example, paper combined with added effects can emulate the sound of electricity, and a bag of starch can produce the crunch of snow.

The most efficient way to link synchronous noises is to tie them directly to the animation. For example, to voice a character’s footsteps, you determine the moment when the foot touches the surface and attach the appropriate sound to that moment. Choosing the right sound matters, and it depends on the material the character steps on.
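
A minimal sketch of an animation-event handler for footsteps; the surface names and clip keys are hypothetical.

```typescript
// Sketch: an animation event fires on the frame where the foot lands,
// and the handler picks a footstep variant for the current surface.
const footstepBanks: Record<string, string[]> = {
  grass: ["step_grass_01", "step_grass_02", "step_grass_03"],
  stone: ["step_stone_01", "step_stone_02"],
  metal: ["step_metal_01", "step_metal_02"],
};

function onFootstepEvent(surface: string, play: (clip: string) => void): void {
  const bank = footstepBanks[surface] ?? footstepBanks["stone"]; // assumed fallback
  // A random variant avoids the "machine-gun" effect of one repeated sample.
  play(bank[Math.floor(Math.random() * bank.length)]);
}
```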

SFX are sound effects that do not exist in reality (the sounds of aliens or starships), as well as accents the player should notice: explosions, gunshots, interface sounds. It is very important for SFX to match the game’s setting. For example, when creating interface sounds, you can use sounds with the same texture (glass, stone, wood) to keep the sound consistent.

SFX are placed in locations in the same way as ambient sounds, or they can be tied to specific triggers or animations.

Music is used to create a particular mood in a scene; it conveys emotional state and dramaturgy extremely well. Music can be in-frame, or diegetic (the sound source is in the scene: a gramophone, a piano), or off-screen. Music is placed in the same way as background sounds. It is used in almost every cut scene and is a simple, effective way to tie a game’s audio design together.

Music in games is usually divided into two types:

  • Adaptive music;
  • Linear music.

Adaptive music interacts seamlessly with gameplay, matching the intensity and drama of what is happening on screen. To achieve this, transitions between different segments of the composition must be smooth.

For example, suppose a looped musical segment A accompanies the character simply walking. When the player approaches lava and activates a trigger, segment B begins. To ensure a smooth transition, the possibility of a change in the composition must be planned from the start.

In a horizontal transition, the music moves from the first segment to the second, from the second to the third, and so on. Each segment is divided into parts, each with its own conditions for transitioning to the next. For example, a special closing phrase can be recorded that plays when switching to another segment.

There is also a vertical method, in which a common background melody is supplemented with additional musical layers as the player moves between locations or the pace of the game changes. For example, a modified version of the main theme may come in. It is important that all these parts combine cleanly and create a harmonious sound, as in the sketch below.
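
A minimal sketch of vertical layering with the Web Audio API: all stems start in sync, and a gameplay intensity value fades layers in and out. The thresholds and fade times are assumptions.

```typescript
// Sketch: vertical layering. Stems play in lockstep from one shared
// start time; intensity decides which layers are audible.
const ctx = new AudioContext();

interface MusicLayer { gain: GainNode; threshold: number; }

function startLayers(stems: AudioBuffer[], thresholds: number[]): MusicLayer[] {
  const t0 = ctx.currentTime + 0.1; // one shared start time keeps stems locked
  return stems.map((buffer, i) => {
    const src = ctx.createBufferSource();
    src.buffer = buffer;
    src.loop = true;
    const gain = ctx.createGain();
    gain.gain.value = i === 0 ? 1 : 0; // base theme audible, extras muted
    src.connect(gain).connect(ctx.destination);
    src.start(t0);
    return { gain, threshold: thresholds[i] };
  });
}

function setIntensity(layers: MusicLayer[], intensity: number): void {
  const now = ctx.currentTime;
  for (const layer of layers) {
    const target = intensity >= layer.threshold ? 1 : 0;
    layer.gain.gain.setTargetAtTime(target, now, 0.5); // ~0.5 s crossfade (assumed)
  }
}
```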

To create a balanced sound design in games, it is important to adhere to certain principles:

  • Use of appropriate sound textures: do not replace the sound of a plastic object with the sound of a metal one;
  • Consistency with the planned sound;
  • Accents applied to draw attention to key elements;
  • Accounting for the priority of sounds.

In addition, it is important to understand how sound behaves physically. For example, when an airship is behind a mountain and can be heard but not seen, the high frequencies are filtered out. When the airship emerges from behind the mountain, the full frequency range returns, along with reverberation, an echo from the gorge. The closer the object, the less echo; the larger the object, the more low-frequency texture it should have. Many such nuances come with experience.
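
As one illustration, occlusion of this kind is often approximated with a low-pass filter; here is a minimal Web Audio API sketch, with the cutoff values assumed for the example.

```typescript
// Sketch: occlusion as a low-pass filter. While the airship is hidden
// behind the mountain, high frequencies are cut; when it emerges, the
// cutoff opens back up. The source would route through this filter.
const ctx = new AudioContext();
const occlusionFilter = ctx.createBiquadFilter();
occlusionFilter.type = "lowpass";
occlusionFilter.connect(ctx.destination);

function setOccluded(occluded: boolean): void {
  const now = ctx.currentTime;
  const cutoff = occluded ? 800 : 20000; // Hz: muffled vs. full spectrum (assumed)
  occlusionFilter.frequency.setTargetAtTime(cutoff, now, 0.3);
}
```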

