
Game Audio 301: Sound & Music Implementation using Wwise

Learn how to implement game sound, sfx and music, and how to optimize audio, using the middleware application Wwise.
4.3/5 (113 reviews)
1,005 students
Created by

CourseMarks Score®: 9.2
Freshness: 9.4
Feedback: 8.3
Content: 9.2

Platform: Udemy
Video: 7h 8m
Language: English
Next start: On Demand

Description

So you’ve written and produced your music, recorded your dialogue and created all your sfx – but how do you get these files into your game and make them behave the way you want?
In game audio, creating the audio files themselves (sfx, music, etc.) is a process, but so is creating the systems that govern how and when those sounds are played.
Instead of placing audio on a timeline, game audio professionals are usually asked to design audio events, which are then hooked up by a technical sound designer or programmer on the development team to work with Unreal, Unity or another game engine.
Wwise is one of the leading audio middleware applications used to design these audio events. It integrates with Unreal, Unity and most other game engines, giving the user the ability to implement all audio in a game space.
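To make that workflow concrete, here is a minimal, hypothetical sketch of how a programmer might trigger a designer-authored Wwise event from engine-side C++ using the Wwise SDK. It assumes the sound engine has already been initialized and the relevant SoundBank loaded; the event name "Play_Footstep", the "Player" label and the game object ID are invented for illustration and are not taken from the course.

    // Hypothetical sketch: posting a Wwise event from game code (Wwise C++ SDK).
    // Assumes AK::SoundEngine has been initialized and the SoundBank containing
    // the event is loaded elsewhere. All names and IDs below are made up.
    #include <AK/SoundEngine/Common/AkSoundEngine.h>

    static const AkGameObjectID PLAYER = 100; // arbitrary emitter ID for this sketch

    void RegisterPlayerEmitter()
    {
        // Register the game object that sounds will be posted from
        // (normally done once, when the player spawns).
        AK::SoundEngine::RegisterGameObj(PLAYER, "Player");
    }

    void OnPlayerFootstep()
    {
        // Post the event the sound designer authored in Wwise; Wwise decides
        // which containers, variations and mix settings actually apply.
        AK::SoundEngine::PostEvent("Play_Footstep", PLAYER);
    }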
Students will learn the basics of working within Wwise, general considerations for implementing audio and a musical score, and the available tools for troubleshooting.
During the course, students will implement all audio for the open-source game Cube while walking through the educational resources created by Wwise’s developer, Audiokinetic.
This class is designed for sound designers, composers, developers, and anyone else experienced in sound who wants a more holistic game audio skill set.

You will learn

✓ How to implement audio and music using the middleware application, Wwise
✓ Different considerations when implementing audio vs music
✓ Using single sounds for multiple applications
✓ Audio processing resource optimization
✓ Tools for audio implementation troubleshooting
✓ How to construct an interactive music system
✓ States, Switches, RTPCs, Stingers, Containers and other implementation-centric audio concepts (see the short sketch after this list)
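Purely as a hypothetical illustration (not part of the course materials), the sketch below shows what a few of those concepts can look like from the game-code side with the Wwise C++ SDK: an RTPC driven by a gameplay value, a State for interactive music, and a Switch for per-object variation. The parameter, state group and switch group names are invented, and the calls assume an initialized sound engine with the relevant SoundBanks loaded.

    // Hypothetical sketch: feeding gameplay information to Wwise (Wwise C++ SDK).
    // All group and parameter names are made up for illustration.
    #include <AK/SoundEngine/Common/AkSoundEngine.h>

    static const AkGameObjectID PLAYER = 100; // same hypothetical emitter as above

    void UpdateGameSideAudio(float playerSpeed, bool inCombat, bool onGravel)
    {
        // RTPC: a continuous game parameter the sound designer can map to
        // volume, pitch, filter cutoff, music intensity, etc.
        AK::SoundEngine::SetRTPCValue("Player_Speed", playerSpeed, PLAYER);

        // State: a global condition, commonly used to drive interactive music.
        AK::SoundEngine::SetState("Music_State", inCombat ? "Combat" : "Explore");

        // Switch: a per-object choice between variations, e.g. footstep surface.
        AK::SoundEngine::SetSwitch("Footstep_Surface", onGravel ? "Gravel" : "Concrete", PLAYER);
    }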

Requirements

• A basic knowledge of audio theory and concepts
• Ideally, completion of Game Audio 101 & 201 Udemy Courses

This course is for

• Sound Designers
• Composers
• Game Developers
• Programmers
Composer and Sound Designer for Games
Originally a violinist, Elliot began his journey in music and audio using the Suzuki method at the age of four. He began playing piano at eight, guitar at 12, and at 20, Daft Punk convinced him to give the computer a shot, too. (Thankfully!) Elliot has degrees in Music Composition and Sound Design, and his work has been featured in campaigns for United Airlines, Instagram, Spider-Man: Far From Home, GMC, The Godrej Group, Chevrolet Motors, the Goodman Theatre and the Joffrey Ballet, as well as many independent films and games.
Elliot always focuses on creating distinct personality and aesthetic in all audio while maintaining that memorable “hummable” factor in his music. He brings this sensibility to his work in games, film, trailer and commercial projects. He has served as Soundpost Co-Chair for the Chicago Symphony Orchestra Overture Council and teaches as Adjunct Faculty in the Film and Game programs at DePaul University.