Apple has designed the software interfaces to Core Audio using a layered, cooperative, task-focused approach. Read the first two sections in this chapter for a brief introduction to these interfaces and how they work together. Continue reading to understand the design principles, use patterns, and programming idioms that pervade Core Audio. The later sections in this chapter introduce you to how Core Audio works with files, streams, recording and playback, and plug-ins.

The programming interfaces for Core Audio are arranged into three layers, as illustrated in Figure 2-1.

Figure 2-1  The three API layers of Core Audio

The lowest layer includes:

- The I/O Kit, which interacts with drivers
- The audio hardware abstraction layer (audio HAL), which provides a device-independent, driver-independent interface to hardware
- Core MIDI, which provides software abstractions for working with MIDI streams and devices
- Host Time Services, which provides access to the computer's clock
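The HAL and Host Time Services are plain C interfaces, so a short sketch can make the layering concrete. The following is a minimal sketch, not drawn from this document, assuming macOS and linking against the CoreAudio framework: it asks the HAL for the default output device and reads the computer's clock through Host Time Services.

```c
// Minimal sketch: query the audio HAL and Host Time Services.
// Build (macOS, illustrative): clang demo.c -framework CoreAudio
#include <CoreAudio/CoreAudio.h>
#include <CoreAudio/HostTime.h>
#include <stdio.h>

int main(void) {
    // Describe the property we want: the system-wide default output device.
    AudioObjectPropertyAddress addr = {
        kAudioHardwarePropertyDefaultOutputDevice,
        kAudioObjectPropertyScopeGlobal,
        kAudioObjectPropertyElementMaster
    };
    AudioDeviceID device = kAudioObjectUnknown;
    UInt32 size = sizeof(device);

    // Ask the HAL, via the system audio object, which device that is.
    OSStatus err = AudioObjectGetPropertyData(kAudioObjectSystemObject,
                                              &addr, 0, NULL,
                                              &size, &device);
    if (err == noErr) {
        printf("Default output device ID: %u\n", (unsigned)device);
    }

    // Host Time Services: read the host clock, then convert ticks
    // to nanoseconds for a time base usable in scheduling.
    UInt64 ticks = AudioGetCurrentHostTime();
    UInt64 nanos = AudioConvertHostTimeToNanos(ticks);
    printf("Host time: %llu ticks (%llu ns)\n",
           (unsigned long long)ticks, (unsigned long long)nanos);
    return 0;
}
```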
Mac apps can be written to use these technologies directly when they require the highest possible real-time performance. Many audio applications, however, don't access this layer. Indeed, Core Audio in iOS provides ways to achieve real-time audio using higher-level interfaces. OpenAL, for example, employs direct I/O for real-time audio in games.
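To illustrate the kind of higher-level access described above, here is a minimal OpenAL sketch (assuming the OpenAL framework is available; the variable names are illustrative only). It opens the default device and creates a context and a source, the setup a game performs before streaming PCM buffers for low-latency playback.

```c
// Minimal sketch: OpenAL device and context setup.
// Build (macOS, illustrative): clang openal_demo.c -framework OpenAL
#include <OpenAL/al.h>
#include <OpenAL/alc.h>
#include <stdio.h>

int main(void) {
    // NULL selects the default output device.
    ALCdevice *device = alcOpenDevice(NULL);
    if (!device) {
        fprintf(stderr, "No OpenAL device available\n");
        return 1;
    }
    ALCcontext *context = alcCreateContext(device, NULL);
    alcMakeContextCurrent(context);

    // One source and one buffer; a game would fill the buffer with
    // PCM data via alBufferData() and queue it on the source.
    ALuint source, buffer;
    alGenSources(1, &source);
    alGenBuffers(1, &buffer);

    printf("OpenAL context ready\n");

    // Tear down in reverse order.
    alDeleteSources(1, &source);
    alDeleteBuffers(1, &buffer);
    alcMakeContextCurrent(NULL);
    alcDestroyContext(context);
    alcCloseDevice(device);
    return 0;
}
```

On Apple platforms these OpenAL calls are implemented on top of the Core Audio stack described above, which is what gives the API its real-time character without the app touching the HAL directly.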