
AVAssetWriter video output doesn't play appended audio

Stack Overflow user
Asked on 2021-10-25 20:26:23
1 answer · 791 views · 0 followers · Score 2

I have an AVAssetWriter recording a video with filters applied, which is then played back through an AVQueuePlayer.

My problem is that the audio output's sample buffers are appended to the audio input, but no sound plays during playback. I haven't come across any existing solution and would appreciate any guidance.

Secondly, my .AVPlayerItemDidPlayToEndTime notification observer (I need to loop the playback) also never fires.
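For context on the looping issue: a common reason this notification never fires is observing the wrong object. A minimal sketch of loop-on-end with AVQueuePlayer (`recordedVideoURL` is a hypothetical placeholder):

```swift
import AVFoundation

// Hypothetical looping setup; `recordedVideoURL` is an assumed placeholder.
let item = AVPlayerItem(url: recordedVideoURL)
let player = AVQueuePlayer(items: [item])
player.actionAtItemEnd = .none   // keep the item queued instead of advancing past it

// `object:` must be the item actually playing (or nil to match any item);
// observing the wrong object is a common reason the callback never fires.
NotificationCenter.default.addObserver(
    forName: .AVPlayerItemDidPlayToEndTime,
    object: item,
    queue: .main
) { _ in
    player.seek(to: .zero)       // rewind...
    player.play()                // ...and play again to loop
}
player.play()
```

On iOS 10+ / macOS 10.12+, AVPlayerLooper wraps this pattern for AVQueuePlayer and is usually the simpler option.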

AVCaptureSession setup

func setupSession() {
    
    let session = AVCaptureSession()
    session.sessionPreset = .medium
    
    guard
        let camera = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .front),
        let mic = AVCaptureDevice.default(.builtInMicrophone, for: .audio, position: .unspecified),
        let videoInput = try? AVCaptureDeviceInput(device: camera),
        let audioInput = try? AVCaptureDeviceInput(device: mic),
        session.canAddInput(videoInput), session.canAddInput(audioInput) else { return }
    
            
    let videoOutput = AVCaptureVideoDataOutput()
    let audioOutput = AVCaptureAudioDataOutput()
    guard session.canAddOutput(videoOutput), session.canAddOutput(audioOutput) else { return }
    let queue = DispatchQueue(label: "recordingQueue", qos: .userInteractive)
    videoOutput.setSampleBufferDelegate(self, queue: queue)
    audioOutput.setSampleBufferDelegate(self, queue: queue)
    
    session.beginConfiguration()
    
    session.addInput(videoInput)
    session.addInput(audioInput)
    session.addOutput(videoOutput)
    session.addOutput(audioOutput)
    
    session.commitConfiguration()
            
    if let connection = videoOutput.connection(with: AVMediaType.video) {
        if connection.isVideoStabilizationSupported { connection.preferredVideoStabilizationMode = .auto }
        connection.isVideoMirrored = true
        connection.videoOrientation = .portrait
    }
    
    _videoOutput = videoOutput
    _audioOutput = audioOutput
    _captureSession = session
    
    DispatchQueue.global(qos: .default).async { session.startRunning() }
}

AVAssetWriter setup + didOutput delegate

func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
            
    let timestamp = CMSampleBufferGetPresentationTimeStamp(sampleBuffer).seconds

    if output == _videoOutput {
        if connection.isVideoOrientationSupported { connection.videoOrientation = .portrait }
            
        guard let cvImageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        let ciImage = CIImage(cvImageBuffer: cvImageBuffer)
    
        guard let filteredCIImage = applyFilters(inputImage: ciImage) else { return }
        self.ciImage = filteredCIImage
    
        guard let cvPixelBuffer = getCVPixelBuffer(from: filteredCIImage) else { return }
        self.cvPixelBuffer = cvPixelBuffer
            
        self.ciContext.render(filteredCIImage, to: cvPixelBuffer, bounds: filteredCIImage.extent, colorSpace: CGColorSpaceCreateDeviceRGB())
            
        metalView.draw()
    }
            
    switch _captureState {
    case .start:
        
        guard let outputUrl = tempURL else { return }
        
        let writer = try! AVAssetWriter(outputURL: outputUrl, fileType: .mp4)
        
        let videoSettings = _videoOutput!.recommendedVideoSettingsForAssetWriter(writingTo: .mp4)
        let videoInput = AVAssetWriterInput(mediaType: .video, outputSettings: videoSettings)
        videoInput.mediaTimeScale = CMTimeScale(bitPattern: 600)
        videoInput.expectsMediaDataInRealTime = true
        
        let pixelBufferAttributes = [
            kCVPixelBufferCGImageCompatibilityKey: NSNumber(value: true),
            kCVPixelBufferCGBitmapContextCompatibilityKey: NSNumber(value: true),
            kCVPixelBufferPixelFormatTypeKey: NSNumber(value: Int32(kCVPixelFormatType_32ARGB))
        ] as [String:Any]
        
        let adapter = AVAssetWriterInputPixelBufferAdaptor(assetWriterInput: videoInput, sourcePixelBufferAttributes: pixelBufferAttributes)
        if writer.canAdd(videoInput) { writer.add(videoInput) }
                                
        let audioSettings = _audioOutput!.recommendedAudioSettingsForAssetWriter(writingTo: .mp4) as? [String:Any]
        let audioInput = AVAssetWriterInput(mediaType: .audio, outputSettings: audioSettings)
        audioInput.expectsMediaDataInRealTime = true
        if writer.canAdd(audioInput) { writer.add(audioInput) }
    
        _filename = outputUrl.absoluteString
        _assetWriter = writer
        _assetWriterVideoInput = videoInput
        _assetWriterAudioInput = audioInput
        _adapter = adapter
        _captureState = .capturing
        _time = timestamp
                    
        writer.startWriting()
        writer.startSession(atSourceTime: .zero)
        
    case .capturing:
        
        if output == _videoOutput {
            if _assetWriterVideoInput?.isReadyForMoreMediaData == true {
                let time = CMTime(seconds: timestamp - _time, preferredTimescale: CMTimeScale(600))
                _adapter?.append(self.cvPixelBuffer, withPresentationTime: time)
            }
        } else if output == _audioOutput {
            if _assetWriterAudioInput?.isReadyForMoreMediaData == true {
                _assetWriterAudioInput?.append(sampleBuffer)
            }
        }
        break
        
    case .end:
        
        guard _assetWriterVideoInput?.isReadyForMoreMediaData == true, _assetWriter!.status != .failed else { break }
        
        _assetWriterVideoInput?.markAsFinished()
        _assetWriterAudioInput?.markAsFinished()
        _assetWriter?.finishWriting { [weak self] in
            
            guard let output = self?._assetWriter?.outputURL else { return }
            
            self?._captureState = .idle
            self?._assetWriter = nil
            self?._assetWriterVideoInput = nil
            self?._assetWriterAudioInput = nil
            
            
            self?.previewRecordedVideo(with: output)
        }
        
    default:
        break
    }
}
1 Answer

Stack Overflow user

Accepted answer

Posted on 2021-10-28 04:45:02

Start the timeline at the presentation timestamp of the first audio or video sample buffer you encounter:

writer.startSession(atSourceTime: CMSampleBufferGetPresentationTimeStamp(sampleBuffer))

Previously you started the timeline at zero, but captured sample buffers carry timestamps that are usually relative to the time elapsed since system boot, so there is a large, undesirable gap between where the file "starts" (the sourceTime for AVAssetWriter) and where the video and audio actually appear.

Your question doesn't say that you see no video. I half expected some video players to skip ahead over that long empty stretch to the point where your samples begin, but either way the file is wrong.
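Applied to the `.start` case in the question's code, this is a small change; a sketch, assuming the rest of the setup stays as posted:

```swift
// In the .start case, after adding the inputs to the writer:
writer.startWriting()
// Start the timeline at the first buffer's presentation timestamp
// instead of .zero, so audio buffers appended with their original
// timestamps land inside the file's timeline.
writer.startSession(atSourceTime: CMSampleBufferGetPresentationTimeStamp(sampleBuffer))

// ...and in the .capturing case, append video frames with their original
// presentation timestamps rather than times rebased against `_time`,
// so video and audio stay aligned:
let time = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
_adapter?.append(self.cvPixelBuffer, withPresentationTime: time)
```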

Score 1
Original content provided by Stack Overflow; translation supported by Tencent Cloud's engine.
Original link: https://stackoverflow.com/questions/69714369
