I'm building a metronome with AVAudioEngine, AVAudioPlayerNode, and AVAudioPCMBuffer. The buffer is created like this:
/// URL of the sound file
let soundURL = Bundle.main.url(forResource: <filename>, withExtension: "wav")!
/// Create audio file
let audioFile = try! AVAudioFile(forReading: soundURL)
let audioFormat = audioFile.processingFormat
/// Create the buffer - what value to put for frameCapacity?
if let buffer = AVAudioPCMBuffer(pcmFormat: audioFormat, frameCapacity: ???) {
    buffer.frameLength = audioFrameCount
    try? audioFile.read(into: buffer)
    return buffer
}
What value should I pass for frameCapacity in the AVAudioPCMBuffer initializer?
The documentation says frameCapacity should be "the capacity of the buffer in PCM sample frames." What does that mean? Is it a static value, or is it taken from the audio file?
Posted on 2022-09-26 04:20:06
frameCapacity is the maximum number of frames an AVAudioPCMBuffer can hold. You don't have to use all of them. Consumers of an AVAudioPCMBuffer should only look at the first frameLength frames, where frameLength <= frameCapacity. A capacity different from the length can be useful if you're processing audio in chunks of N frames and, for whatever reason, might get a short read:
let buffer = AVAudioPCMBuffer(pcmFormat: audioFormat, frameCapacity: N)!
while readChunk(chunk) {
    let frameLength = min(buffer.frameCapacity, chunk.lengthInFrames)
    // copy frameLength frames into buffer.floatChannelData![0] or something
    buffer.frameLength = frameLength // could be less than N
}
But if you're only ever going to store audioFrameCount frames (the length of the file?) in the buffer, then set frameCapacity to exactly that:
let buffer = AVAudioPCMBuffer(pcmFormat: audioFormat, frameCapacity: audioFrameCount)
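Putting it together for the metronome case, here is a minimal sketch of loading a whole file into one buffer, sizing the capacity from the file's own length property (the function name loadBuffer(from:) is just for illustration). Note that AVAudioFile.read(into:) sets frameLength itself, so you don't need to assign it manually:

```swift
import AVFoundation

/// Load an entire audio file into a single PCM buffer.
/// `audioFile.length` is the file's length in sample frames,
/// which is exactly the capacity needed to hold the whole file.
func loadBuffer(from url: URL) throws -> AVAudioPCMBuffer? {
    let audioFile = try AVAudioFile(forReading: url)
    let frameCount = AVAudioFrameCount(audioFile.length)
    guard let buffer = AVAudioPCMBuffer(pcmFormat: audioFile.processingFormat,
                                        frameCapacity: frameCount) else {
        return nil
    }
    // read(into:) fills the buffer and updates buffer.frameLength
    // to the number of frames actually read.
    try audioFile.read(into: buffer)
    return buffer
}
```

The returned buffer can then be scheduled on an AVAudioPlayerNode with scheduleBuffer(_:at:options:) for looped metronome playback.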
https://stackoverflow.com/questions/73848877