completionHandler of AVAudioPlayerNode.scheduleFile() is called too early

Stack Overflow user
Asked on 2015-04-03 06:26:27
7 answers · 6.2K views · 0 followers · 8 votes

I am trying to use the new AVAudioEngine in iOS 8.

It looks like the completionHandler of player.scheduleFile() is called before the sound file has finished playing.

I am using a sound file that is 5 s long, and the println() message appears about 1 second before the end of the sound.

Am I doing something wrong, or am I misunderstanding the idea of a completionHandler?

Thanks!

Here is some code:

import AVFoundation

class SoundHandler {
    let engine:AVAudioEngine
    let player:AVAudioPlayerNode
    let mainMixer:AVAudioMixerNode

    init() {
        engine = AVAudioEngine()
        player = AVAudioPlayerNode()
        engine.attachNode(player)
        mainMixer = engine.mainMixerNode

        var error:NSError?
        if !engine.startAndReturnError(&error) {
            if let e = error {
                println("error \(e.localizedDescription)")
            }
        }

        // Route the player through the engine's main mixer.
        engine.connect(player, to: mainMixer, format: mainMixer.outputFormatForBus(0))
    }

    func playSound() {
        let soundUrl = NSBundle.mainBundle().URLForResource("Test", withExtension: "m4a")
        let soundFile = AVAudioFile(forReading: soundUrl, error: nil)

        // This handler fires roughly 1 s before the file is audibly finished.
        player.scheduleFile(soundFile, atTime: nil, completionHandler: { println("Finished!") })

        player.play()
    }
}

Stack Overflow user

Posted on 2017-11-18 00:07:29

My bug report about this was closed as "works as intended," but Apple pointed me to new variants of the scheduleFile, scheduleSegment, and scheduleBuffer methods in iOS 11. These add a completionCallbackType parameter that you can use to specify that you want the completion callback when playback has finished:

[self.audioUnitPlayer
            scheduleSegment:self.audioUnitFile
            startingFrame:sampleTime
            frameCount:(int)sampleLength
            atTime:0
            completionCallbackType:AVAudioPlayerNodeCompletionDataPlayedBack
            completionHandler:^(AVAudioPlayerNodeCompletionCallbackType callbackType) {
    // do something here
}];
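
For comparison with the question's Swift code, here is a minimal Swift sketch of the same iOS 11 call. It assumes a player node and audio file set up as in the question; the variable names are placeholders, not taken from this answer:

if #available(iOS 11.0, *) {
    player.scheduleFile(soundFile,
                        at: nil,
                        completionCallbackType: .dataPlayedBack) { callbackType in
        // .dataConsumed:   the player has consumed the data (can fire early)
        // .dataRendered:   the engine has rendered the data
        // .dataPlayedBack: the data has actually finished playing audibly
        print("Finished!")
    }
}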

The documentation doesn't say anything about how this works, but I tested it and it works for me.

I had been using this workaround on iOS 8-10:

- (void)playRecording {
    [self.audioUnitPlayer scheduleSegment:self.audioUnitFile startingFrame:sampleTime frameCount:(int)sampleLength atTime:0 completionHandler:^() {
        float totalTime = [self recordingDuration];
        float elapsedTime = [self recordingCurrentTime];
        float remainingTime = totalTime - elapsedTime;
        [self performSelector:@selector(doSomethingHere) withObject:nil afterDelay:remainingTime];
    }];
}

- (float)recordingDuration {
    float duration = self.audioUnitFile.length / self.audioUnitFile.processingFormat.sampleRate;
    if (isnan(duration)) {
        duration = 0;
    }
    return duration;
}

- (float)recordingCurrentTime {
    AVAudioTime *nodeTime = self.audioUnitPlayer.lastRenderTime;
    AVAudioTime *playerTime = [self.audioUnitPlayer playerTimeForNodeTime:nodeTime];
    AVAudioFramePosition sampleTime = playerTime.sampleTime;
    if (sampleTime == 0) { return self.audioUnitLastKnownTime; } // this happens when the player isn't playing
    sampleTime += self.audioUnitStartingFrame; // if we trimmed from the start, or changed the location with the location slider, the time before that point won't be included in the player time, so we have to track it ourselves and add it here
    float time = sampleTime / self.audioUnitFile.processingFormat.sampleRate;
    self.audioUnitLastKnownTime = time;
    return time;
}
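
The same remaining-time trick can be sketched in Swift against the question's setup. This is a rough illustration, assuming a `player` (AVAudioPlayerNode) and `soundFile` (AVAudioFile) like those in the question, not a drop-in from this answer:

func playWithCompletion(onFinished: @escaping () -> Void) {
    player.scheduleFile(soundFile, at: nil) {
        // The handler fires when the data has been consumed, which can be
        // ahead of audible playback, so delay by the remaining time instead.
        let duration = Double(self.soundFile.length) / self.soundFile.processingFormat.sampleRate
        var elapsed = 0.0
        if let nodeTime = self.player.lastRenderTime,
           let playerTime = self.player.playerTime(forNodeTime: nodeTime) {
            elapsed = Double(playerTime.sampleTime) / playerTime.sampleRate
        }
        let remaining = max(duration - elapsed, 0)
        DispatchQueue.main.asyncAfter(deadline: .now() + remaining, execute: onFinished)
    }
    player.play()
}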
Votes: 3
Original content provided by Stack Overflow.
Original link: https://stackoverflow.com/questions/29427253
