I've been trying to use the AVFAudio framework to record audio, play it back, and change the audio data based on presets the user selects. I also want to figure out how to save the files locally to the user's device, but while reading Apple's documentation on AVFAudio I've struggled to understand what steps to take when creating these files. I've been following https://www.raywenderlich.com/21868250-audio-with-avfoundation/lessons/1 and have managed to set up a few functions.
Here I've set up saving the audio, but as you can see, this only saves it to a temporary directory. I'd like to know how to save the audio file locally to the user's device.
// MARK: Saving audio
var urlForVocals: URL {
    let fileManager = FileManager.default
    let tempDirectory = fileManager.temporaryDirectory
    let filePath = "TempVocalRecording.caf"
    return tempDirectory.appendingPathComponent(filePath)
}
When using AVFAudio, I'm generally confused by the AVFoundation framework, and the documentation at https://developer.apple.com/documentation/avfaudio doesn't go into much detail on how to implement each method. For example, the docs state that to create an audio player we need `AVAudioPlayer(contentsOf: url)`, but they never explain what the `url` is or why we use it. Can anyone help me understand the next steps? I feel like I'm going in circles trying to understand this framework and Apple's documentation.
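To make the `contentsOf: url` part concrete: the `url` is just a file URL pointing at an audio file on disk, and the reason the snippet above is only temporary is that the system may delete the contents of `temporaryDirectory` at any time. One way to persist a recording is to copy it from the temporary directory into the app's Documents directory. A minimal sketch, assuming the file name `TempVocalRecording.caf` from the snippet above; `persistRecording` is a hypothetical helper name:

```swift
import AVFoundation

// Hypothetical helper: copies the temporary recording into the app's
// Documents directory, which persists across launches and is backed up.
func persistRecording() throws -> URL {
    let fileManager = FileManager.default
    let tempURL = fileManager.temporaryDirectory
        .appendingPathComponent("TempVocalRecording.caf")
    let documentsURL = fileManager.urls(for: .documentDirectory,
                                        in: .userDomainMask)[0]
    let destinationURL = documentsURL.appendingPathComponent("VocalRecording.caf")

    // Remove any previous copy, then copy the temp file into Documents.
    if fileManager.fileExists(atPath: destinationURL.path) {
        try fileManager.removeItem(at: destinationURL)
    }
    try fileManager.copyItem(at: tempURL, to: destinationURL)
    return destinationURL
}
```

The URL this returns is exactly the kind of thing `AVAudioPlayer(contentsOf:)` expects, e.g. `let player = try AVAudioPlayer(contentsOf: savedURL)`.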
Posted on 2021-12-28 22:07:36
Here's a relatively simple version. See the inline comments for what's going on.
import SwiftUI
import AVFoundation

class AudioManager: ObservableObject {
    @Published var canRecord = false
    @Published var isRecording = false
    @Published var audioFileURL: URL?

    private var audioPlayer: AVAudioPlayer?
    private var audioRecorder: AVAudioRecorder?

    init() {
        // Ask for record permission. IMPORTANT: make sure you've set
        // `NSMicrophoneUsageDescription` in your Info.plist
        AVAudioSession.sharedInstance().requestRecordPermission() { [unowned self] allowed in
            DispatchQueue.main.async {
                self.canRecord = allowed
            }
        }
    }

    // The URL where the recording file will be stored
    private var recordingURL: URL {
        getDocumentsDirectory().appendingPathComponent("recording.caf")
    }

    private func getDocumentsDirectory() -> URL {
        let paths = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)
        return paths[0]
    }

    func recordFile() {
        do {
            // Set the audio session so we can record
            try AVAudioSession.sharedInstance().setCategory(.playAndRecord, mode: .default)
            try AVAudioSession.sharedInstance().setActive(true)
        } catch {
            print(error)
            self.canRecord = false
            return // bail out rather than crashing
        }
        // This describes the format that the file will be recorded in
        let settings = [
            AVFormatIDKey: Int(kAudioFormatMPEG4AAC),
            AVSampleRateKey: 12000,
            AVNumberOfChannelsKey: 1,
            AVEncoderAudioQualityKey: AVAudioQuality.high.rawValue
        ]
        do {
            // Create the recorder, pointing towards the URL from above
            audioRecorder = try AVAudioRecorder(url: recordingURL,
                                                settings: settings)
            audioRecorder?.record() // start the recording
            isRecording = true
        } catch {
            print(error)
            isRecording = false
        }
    }

    func stopRecording() {
        audioRecorder?.stop()
        isRecording = false
        audioFileURL = recordingURL
    }

    func playRecordedFile() {
        guard let audioFileURL = audioFileURL else {
            return
        }
        do {
            // Create a player, again pointing towards the same URL
            self.audioPlayer = try AVAudioPlayer(contentsOf: audioFileURL)
            self.audioPlayer?.play()
        } catch {
            print(error)
        }
    }
}

struct ContentView: View {
    @StateObject private var audioManager = AudioManager()

    var body: some View {
        VStack {
            if !audioManager.isRecording && audioManager.canRecord {
                Button("Record") {
                    audioManager.recordFile()
                }
            } else {
                Button("Stop") {
                    audioManager.stopRecording()
                }
            }
            if audioManager.audioFileURL != nil && !audioManager.isRecording {
                Button("Play") {
                    audioManager.playRecordedFile()
                }
            }
        }
    }
}
https://stackoverflow.com/questions/70512452