My app (written in Swift) does real-time processing based on an audio signal.
I need a function that gives me left and right buffers from the input (2 channels coming from a USB microphone) and a function with a buffer for the output (also 2 channels).
I previously used EZAudio, but I ran into memory problems with the 2-channel, 96 kHz format. Since EZAudio has been discontinued, I want to switch to Superpowered or AudioKit.
My problem is that I cannot get a function with buffers from either of these libraries.
Superpowered: I added #import "SuperpoweredIOSAudioIO.h" to the bridging header.
I added SuperpoweredIOSAudioIODelegate to my ViewController. This automatically added the interruption, permission and mapChannels functions, but not the audioProcessingCallback.
Here is what I tried:
audio = SuperpoweredIOSAudioIO(delegate: self, preferredBufferSize: 12, preferredMinimumSamplerate: 96000, audioSessionCategory: AVAudioSessionCategoryPlayAndRecord, channels: 2, audioProcessingCallback: audioProcessingCallback, clientdata: UnsafeMutablePointer)
audio.start()
and
func audioProcessingCallback(buffers: UnsafeMutablePointer<UnsafeMutablePointer<Float>>, inputChannels: UInt32, outputChannels: UInt32, numberOfSamples: UInt32, sampleRate: UInt32, hostTime: UInt64) -> Bool {
    return true
}
But I get this error:
Cannot convert value of type '(UnsafeMutablePointer<UnsafeMutablePointer<Float>>, UInt32, UInt32, UInt32, UInt32, UInt64) -> Bool' to expected argument type 'audioProcessingCallback!' (aka 'ImplicitlyUnwrappedOptional<@convention(c) (Optional<UnsafeMutableRawPointer>, Optional<UnsafeMutablePointer<Optional<UnsafeMutablePointer<Float>>>>, UInt32, UInt32, UInt32, UInt32, UInt64) -> Bool>')
I cannot find any Swift example for this library.
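For context on that error: the parameter expects a C function pointer (@convention(c)), so it cannot be an instance method or a closure that captures state, and its pointer parameters are imported as optionals. Below is a minimal sketch of a non-capturing closure with the shape the error asks for, assuming the imported type really is the one shown in the error text; the name processingCallback and the parameter labels are illustrative only, not from the Superpowered headers.

let processingCallback: @convention(c) (
    UnsafeMutableRawPointer?,                            // clientdata
    UnsafeMutablePointer<UnsafeMutablePointer<Float>?>?, // buffers
    UInt32,                                              // input channels
    UInt32,                                              // output channels
    UInt32,                                              // number of samples
    UInt32,                                              // sample rate
    UInt64                                               // host time
) -> Bool = { clientdata, buffers, inputChannels, outputChannels, numberOfSamples, samplerate, hostTime in
    // Any real work here must be real-time safe; both answers below recommend
    // doing the actual DSP in C/C++ rather than in Swift.
    return true
}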
For AudioKit, here is what I did:
let mic = AKMicrophone()
installTap(mic)
AudioKit.output = mic
AudioKit.start()
func installTap(_ input: AKNode) {
    input.avAudioNode.installTap(onBus: 0, bufferSize: 1024, format: AudioKit.format) { [weak self] (buffer, time) -> Void in
        self?.signalTracker(didReceivedBuffer: buffer, atTime: time)
    }
}

func signalTracker(didReceivedBuffer buffer: AVAudioPCMBuffer, atTime time: AVAudioTime) {
    let samples = UnsafeBufferPointer(start: buffer.floatChannelData?[0], count: 1024)
    audioProcess.ProcessDataCaptureWithBuffer(samples, numberOfSamples: UInt32(1024))
}
It does get the buffers into my algorithm, but it doesn't seem to be "real time"; I mean, it is very slow. (Sorry, it's hard to explain.)
Thanks!
Posted on 2017-11-29 17:34:32
If you need to do real-time processing, you shouldn't be doing it in Swift (or Objective-C). Currently, the way to do this in AudioKit is to create an AUAudioUnit subclass and do your processing inside it. But if you just need your audio tap to be faster, AKLazyTap is a good solution. It differs from a normal tap in that you have to poll it for data, but this approach allows buffer re-use, so you can call it as often as you like.
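For the AUAudioUnit route mentioned above, a bare skeleton could look like the following. This is an assumed sketch, not code from this answer: a usable unit also needs its input/output busses and render resources set up, and the class name is made up. The per-buffer work lives inside internalRenderBlock.

import AVFoundation
import AudioToolbox

class ProcessingAudioUnit: AUAudioUnit {
    override var internalRenderBlock: AUInternalRenderBlock {
        return { actionFlags, timestamp, frameCount, outputBusNumber, outputData, renderEvents, pullInputBlock in
            // Pull the input with pullInputBlock and process the samples in
            // outputData here. This block runs on the real-time audio thread,
            // so it must not allocate memory, take locks, or block.
            return noErr
        }
    }
}

The point of that structure is that the sample loop runs directly in the render callback on the audio thread, instead of in a tap that delivers buffers later on another thread.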
Here is an example that uses AKLazyTap to get the peak value:
import UIKit
import AudioKit

class ViewController: UIViewController {

    let microphone = AKMicrophone()
    var tap: AKLazyTap?

    override func viewDidLoad() {
        super.viewDidLoad()

        AudioKit.output = microphone
        AKSettings.ioBufferDuration = 0.002 // This is to decrease latency for faster callbacks.

        tap = AKLazyTap(node: microphone.avAudioNode)
        guard tap != nil,
            let buffer = AVAudioPCMBuffer(pcmFormat: microphone.avAudioNode.outputFormat(forBus: 0), frameCapacity: 44100) else {
                fatalError()
        }

        // Your timer should fire equal to or faster than your buffer duration
        Timer.scheduledTimer(withTimeInterval: AKSettings.ioBufferDuration / 2, repeats: true) { _ in
            var timeStamp = AudioTimeStamp()
            self.tap?.fillNextBuffer(buffer, timeStamp: &timeStamp)
            if buffer.frameLength == 0 { return } // This is important, since we're polling for samples, sometimes it's empty, and sometimes it will be double what it was the last call.

            let leftMono = UnsafeBufferPointer(start: buffer.floatChannelData?[0], count: Int(buffer.frameLength))
            var peak = Float(0)
            for sample in leftMono {
                peak = max(peak, fabsf(sample))
            }
            print("number of samples \(buffer.frameLength) peak \(peak)")
        }

        AudioKit.start()
    }
}
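A small, assumed extension of the example above for the two-channel case in the question (not part of the original answer): with a non-interleaved stereo format, the right channel can be read from the second channel pointer in the same way inside the timer block.

if buffer.format.channelCount > 1, let channels = buffer.floatChannelData {
    let left  = UnsafeBufferPointer(start: channels[0], count: Int(buffer.frameLength))
    let right = UnsafeBufferPointer(start: channels[1], count: Int(buffer.frameLength))
    // Hand both channels to the processing code here, in place of the
    // leftMono/peak code above.
    _ = (left, right) // silences "unused" warnings in this sketch
}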
Posted on 2017-12-03 06:47:26
Superpowered has a C++ API because Swift is not recommended for real-time processing. Write your audio processing code in C++. Taps are slow; use SuperpoweredIOSAudioIO to get audio in real time.
https://stackoverflow.com/questions/47551246