On my last iPad project we needed the ability to record a sound clip and then play it back to the user with some visualizations. This is relatively easy with AVFoundation, but – as with many things in iOS – it takes quite a bit of boilerplate code to get it working.
To get started, make sure you add the AVFoundation framework to your linked libraries. In Xcode, go to your project target -> Build Phases -> Link Binary With Libraries.
We’re going to need to create an audio session, create a file to write to, and finally create the recorder with the settings we want.
As you can see we don’t need to actually use the audio session for anything, we just need to tell the device that we want to do a recording. In this case I’m just returning a boolean to indicate to the caller that the session was successfully started.
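Since the original snippet isn’t shown here, this is a minimal sketch of what that session setup might look like in Swift. The function name `startAudioSession` is an assumption; the AVFoundation calls are real API.

```swift
import AVFoundation

// Hypothetical helper: activates a play-and-record audio session.
// Returns true to tell the caller the session was started successfully.
func startAudioSession() -> Bool {
    let session = AVAudioSession.sharedInstance()
    do {
        // Tell the device we intend to both record and play back audio.
        try session.setCategory(.playAndRecord)
        try session.setActive(true)
        return true
    } catch {
        print("Failed to start audio session: \(error)")
        return false
    }
}
```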
Now we need to create a temporary file to write to.
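A sketch of that step, assuming a Core Audio Format (.caf) file in the app’s temporary directory; the helper name and file naming scheme are assumptions.

```swift
import Foundation

// Hypothetical helper: builds a URL for a temporary .caf file to record into.
// A UUID in the name avoids collisions between recordings.
func makeTemporaryRecordingURL() -> URL {
    let fileName = "recording-\(UUID().uuidString).caf"
    return URL(fileURLWithPath: NSTemporaryDirectory())
        .appendingPathComponent(fileName)
}
```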
Now we can go ahead and create the recorder.
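A sketch of recorder creation, assuming the temporary file URL from the previous step. The settings keys are real AVFoundation constants; the specific values shown are assumptions you would tune for your app.

```swift
import AVFoundation

// Sketch: creates an AVAudioRecorder that writes to `url`.
func makeRecorder(url: URL) throws -> AVAudioRecorder {
    let settings: [String: Any] = [
        AVFormatIDKey: Int(kAudioFormatAppleIMA4),   // compressed format
        AVSampleRateKey: 44_100.0,                   // samples per second
        AVNumberOfChannelsKey: 1,                    // mono keeps files small
        AVEncoderAudioQualityKey: AVAudioQuality.medium.rawValue
    ]
    let recorder = try AVAudioRecorder(url: url, settings: settings)
    recorder.prepareToRecord()  // creates the file and primes the hardware
    return recorder
}
```

Call `record()` on the returned recorder to start capturing, and `stop()` when you’re done.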
The recording settings here allow you to set various parameters such as the sampling rate and the number of channels. These values heavily influence the size and quality of the audio file you produce, so your specific requirements will dictate what you choose here.
As you can see, it’s mostly just a large amount of boilerplate code.
The playback is much easier. We simply need to point an AVAudioPlayer at the file we just recorded to and tell it to play.
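A sketch of that playback step; the property and function names are assumptions, and the player is held in a strong reference because AVAudioPlayer stops if it is deallocated mid-playback.

```swift
import AVFoundation

// Keep a strong reference, e.g. as a property on your view controller,
// or playback stops as soon as the player is deallocated.
var player: AVAudioPlayer?

// Sketch: points an AVAudioPlayer at the recorded file and plays it.
func playRecording(at url: URL) throws {
    player = try AVAudioPlayer(contentsOf: url)
    player?.prepareToPlay()
    player?.play()
}
```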
My code example is available on GitHub. Happy coding.