How to change an iPhone's microphone sampling rate

I'm working on adding a feature to an existing app that takes audio input from the device microphone, converts it to the frequency domain via an FFT, and sends it to a Core ML model. I'm using a standard AVCaptureDevice:

guard let microphone = AVCaptureDevice.default(.builtInMicrophone,
                                               for: .audio,
                                               position: .unspecified),
      let microphoneInput = try? AVCaptureDeviceInput(device: microphone) else {
    fatalError("Can't create microphone.")
}
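
For context, the input is attached to a capture session roughly like this (a sketch of a typical setup; the session wiring, queue label, and the spectrogram delegate object are illustrative assumptions, not the app's actual code):

import AVFoundation

// Sketch: wire the microphone input into a capture session and deliver
// sample buffers to the delegate shown further below. Names such as
// audioOutput, captureQueue, and spectrogram are placeholders.
let captureSession = AVCaptureSession()
let audioOutput = AVCaptureAudioDataOutput()

captureSession.beginConfiguration()
if captureSession.canAddInput(microphoneInput) {
    captureSession.addInput(microphoneInput)
}
if captureSession.canAddOutput(audioOutput) {
    captureSession.addOutput(audioOutput)
}
captureSession.commitConfiguration()

// Sample buffers arrive on a dedicated serial queue via
// AVCaptureAudioDataOutputSampleBufferDelegate.
let captureQueue = DispatchQueue(label: "audio.capture.queue")
audioOutput.setSampleBufferDelegate(spectrogram, queue: captureQueue)

captureSession.startRunning()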

The issue is that I need a custom sample rate for the microphone. According to Apple's documentation, setPreferredSampleRate(_:) should be able to set it anywhere in the range 8,000-48,000 Hz. However, no matter which value I choose, the sample rate doesn't change, and no error is thrown:

print("Microphone sample rate: ", AVAudioSession.sharedInstance().sampleRate)

do { try AVAudioSession.sharedInstance().setPreferredSampleRate(20000) }
catch { print("Unable to set microphone sampling rate!") }

print("Microphone sample rate: ", AVAudioSession.sharedInstance().sampleRate)

Output:

Microphone sample rate: 48000.0
Microphone sample rate: 48000.0

How can I set the microphone sampling rate on iOS devices?
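
For reference, here is a minimal sketch of the session configuration order I've been testing against (the .playAndRecord category and .measurement mode are assumptions about a typical recording setup, not necessarily what the app uses):

import AVFoundation

let session = AVAudioSession.sharedInstance()
do {
    // Assumed category/mode; the preferred rate is only a request and is
    // applied, if at all, when the session becomes active.
    try session.setCategory(.playAndRecord, mode: .measurement)
    try session.setPreferredSampleRate(20000)
    try session.setActive(true)
} catch {
    print("Audio session configuration failed: \(error)")
}

// The hardware may still not honor the request; compare what was granted.
print("Preferred: \(session.preferredSampleRate), actual: \(session.sampleRate)")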

EDIT:

Following the suggestion to use AVAudioConverter to resample the microphone input: what's the most efficient way of doing this, given that I'm using AVCaptureAudioDataOutputSampleBufferDelegate and the corresponding captureOutput(_:didOutput:from:) method to collect raw audio input from the microphone:

extension AudioSpectrogram: AVCaptureAudioDataOutputSampleBufferDelegate {

    public func captureOutput(_ output: AVCaptureOutput,
                              didOutput sampleBuffer: CMSampleBuffer,
                              from connection: AVCaptureConnection) {

        var audioBufferList = AudioBufferList()
        var blockBuffer: CMBlockBuffer?

        // Copy the sample buffer's audio data into an AudioBufferList.
        CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(
            sampleBuffer,
            bufferListSizeNeededOut: nil,
            bufferListOut: &audioBufferList,
            bufferListSize: MemoryLayout.stride(ofValue: audioBufferList),
            blockBufferAllocator: nil,
            blockBufferMemoryAllocator: nil,
            flags: kCMSampleBufferFlag_AudioBufferList_Assure16ByteAlignment,
            blockBufferOut: &blockBuffer)

        guard let data = audioBufferList.mBuffers.mData else {
            return
        }

        // Accumulate raw Int16 samples until there is enough to process.
        if self.rawAudioData.count < self.sampleCount * 2 {
            let actualSampleCount = CMSampleBufferGetNumSamples(sampleBuffer)

            let ptr = data.bindMemory(to: Int16.self, capacity: actualSampleCount)
            let buf = UnsafeBufferPointer(start: ptr, count: actualSampleCount)

            rawAudioData.append(contentsOf: Array(buf))
        }

        // Process overlapping windows of sampleCount samples, advancing by hopCount.
        while self.rawAudioData.count >= self.sampleCount {
            let dataToProcess = Array(self.rawAudioData[0 ..< self.sampleCount])
            self.rawAudioData.removeFirst(self.hopCount)
            self.processData(values: dataToProcess)
        }
    }
}
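
For what it's worth, the direction I'm experimenting with is to wrap the accumulated Int16 samples in an AVAudioPCMBuffer and push them through AVAudioConverter, roughly as sketched below (the sample rates, the mono interleaved format, and the resample helper itself are assumptions for illustration, not a drop-in for the app):

import AVFoundation

// Sketch: resample interleaved mono Int16 samples between two rates with
// AVAudioConverter. Rates, format, and the helper's shape are assumptions.
func resample(_ samples: [Int16],
              from inputRate: Double,
              to outputRate: Double) -> [Int16] {
    guard
        let inputFormat = AVAudioFormat(commonFormat: .pcmFormatInt16,
                                        sampleRate: inputRate,
                                        channels: 1,
                                        interleaved: true),
        let outputFormat = AVAudioFormat(commonFormat: .pcmFormatInt16,
                                         sampleRate: outputRate,
                                         channels: 1,
                                         interleaved: true),
        let converter = AVAudioConverter(from: inputFormat, to: outputFormat),
        let inputBuffer = AVAudioPCMBuffer(pcmFormat: inputFormat,
                                           frameCapacity: AVAudioFrameCount(samples.count))
    else { return [] }

    // Copy the raw samples into the input buffer.
    inputBuffer.frameLength = AVAudioFrameCount(samples.count)
    let channel = inputBuffer.int16ChannelData![0]
    for (i, sample) in samples.enumerated() { channel[i] = sample }

    // Size the output for the rate ratio, rounding up.
    let capacity = AVAudioFrameCount((Double(samples.count) * outputRate / inputRate).rounded(.up))
    guard let outputBuffer = AVAudioPCMBuffer(pcmFormat: outputFormat,
                                              frameCapacity: capacity)
    else { return [] }

    // Feed the input buffer exactly once, then report end of stream.
    var fed = false
    var error: NSError?
    let status = converter.convert(to: outputBuffer, error: &error) { _, outStatus in
        if fed {
            outStatus.pointee = .endOfStream
            return nil
        }
        fed = true
        outStatus.pointee = .haveData
        return inputBuffer
    }
    guard status != .error, error == nil else { return [] }

    let out = outputBuffer.int16ChannelData![0]
    return Array(UnsafeBufferPointer(start: out, count: Int(outputBuffer.frameLength)))
}

In the delegate above this would run once per window, and recreating the converter for every call seems wasteful, so I'd presumably hoist the AVAudioConverter and formats into properties and reuse them, but I'm unsure whether that's the intended pattern.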

Tags: ios, swift, fft, accelerate-framework
