
How do I write a movie with video and audio using AVAssetWriter?

Stack Overflow user
Asked on 2011-03-30 10:36:35
2 answers · 13.3K views · 0 followers · 21 votes

I want to export a movie with AVAssetWriter and can't figure out how to include both the video and the audio track in sync. Exporting only the video works fine, but when I add the audio the resulting movie looks like this:

First I see the video (without audio), then the video freezes (showing the last image frame until the end), and after some seconds I hear the audio.

I tried a few things with CMSampleBufferSetOutputPresentationTimeStamp (subtracting the first CMSampleBufferGetPresentationTimeStamp from the current one) for the audio, but none of it worked, and I don't think it is the right direction anyway, since video and audio in the source movie should already be in sync...

My setup in short: I create an AVAssetReader and two AVAssetReaderTrackOutputs (one for video, one for audio) and add them to the AVAssetReader, then I create an AVAssetWriter and two AVAssetWriterInputs (video and audio) and add them to the AVAssetWriter... I start with:

Code language: Objective-C
[assetReader startReading];
[assetWriter startWriting];
[assetWriter startSessionAtSourceTime:kCMTimeZero];
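
For context, the reader/writer setup described above might look roughly like the following sketch. Everything except the variable names reused from the snippets below (assetReader, assetWriter, assetReaderVideoOutput, assetReaderAudioOutput, assetWriterVideoInput, assetWriterAudioInput) is an assumption, in particular sourceURL/outputURL, the file type, and the nil (passthrough) output settings:

Code language: Objective-C
// Sketch only: the setup described in the question, not the original code.
// sourceURL and outputURL are assumed file URLs for the input and output movies.
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:sourceURL options:nil];
AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
AVAssetTrack *audioTrack = [[asset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];

NSError *error = nil;
AVAssetReader *assetReader = [[AVAssetReader alloc] initWithAsset:asset error:&error];

// nil output settings: samples are vended in their stored format (passthrough).
AVAssetReaderTrackOutput *assetReaderVideoOutput =
    [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:videoTrack outputSettings:nil];
AVAssetReaderTrackOutput *assetReaderAudioOutput =
    [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:audioTrack outputSettings:nil];
[assetReader addOutput:assetReaderVideoOutput];
[assetReader addOutput:assetReaderAudioOutput];

// File type is an assumption; the question exports an .mp4, so AVFileTypeMPEG4 is used here.
AVAssetWriter *assetWriter = [[AVAssetWriter alloc] initWithURL:outputURL
                                                       fileType:AVFileTypeMPEG4
                                                          error:&error];

// nil output settings: appended samples are written without re-encoding.
AVAssetWriterInput *assetWriterVideoInput =
    [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:nil];
AVAssetWriterInput *assetWriterAudioInput =
    [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio outputSettings:nil];
assetWriterVideoInput.expectsMediaDataInRealTime = NO;
assetWriterAudioInput.expectsMediaDataInRealTime = NO;
[assetWriter addInput:assetWriterVideoInput];
[assetWriter addInput:assetWriterAudioInput];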

Then I run two queues that handle the sample buffers:

Code language: Objective-C
dispatch_queue_t queueVideo=dispatch_queue_create("assetVideoWriterQueue", NULL);
[assetWriterVideoInput requestMediaDataWhenReadyOnQueue:queueVideo usingBlock:^
{
    // Copy video samples from the reader output and append them while the
    // writer input can accept more data.
    while([assetWriterVideoInput isReadyForMoreMediaData])
    {
        CMSampleBufferRef sampleBuffer=[assetReaderVideoOutput copyNextSampleBuffer];
        if(sampleBuffer)
        {
            [assetWriterVideoInput appendSampleBuffer:sampleBuffer];
            CFRelease(sampleBuffer);
        } else
        {
            // No more video samples: close the input and stop.
            [assetWriterVideoInput markAsFinished];
            dispatch_release(queueVideo);
            videoFinished=YES;
            break;
        }
    }
}];

dispatch_queue_t queueAudio=dispatch_queue_create("assetAudioWriterQueue", NULL);
[assetWriterAudioInput requestMediaDataWhenReadyOnQueue:queueAudio usingBlock:^
{
    // Same pattern for the audio track.
    while([assetWriterAudioInput isReadyForMoreMediaData])
    {
        CMSampleBufferRef sampleBuffer=[assetReaderAudioOutput copyNextSampleBuffer];
        if(sampleBuffer)
        {
            [assetWriterAudioInput appendSampleBuffer:sampleBuffer];
            CFRelease(sampleBuffer);
        } else
        {
            // No more audio samples: close the input and stop.
            [assetWriterAudioInput markAsFinished];
            dispatch_release(queueAudio);
            audioFinished=YES;
            break;
        }
    }
}];

In the main loop I wait for both queues until they finish:

Code language: Objective-C
// Wait until both inputs have been marked as finished before closing the writer.
while(!videoFinished || !audioFinished)
{
    sleep(1);
}
[assetWriter finishWriting];
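
As an aside, the sleep() polling can be replaced by a dispatch group (or, on iOS 6 and later, by -finishWritingWithCompletionHandler:). A minimal sketch of the dispatch-group variant, assuming the same two writer blocks as above:

Code language: Objective-C
// Sketch only: enter the group once per track before calling
// requestMediaDataWhenReadyOnQueue:usingBlock:, and call
// dispatch_group_leave(writingGroup) right after each markAsFinished.
dispatch_group_t writingGroup = dispatch_group_create();
dispatch_group_enter(writingGroup);   // video input
dispatch_group_enter(writingGroup);   // audio input

// ... set up the two requestMediaDataWhenReadyOnQueue: blocks here ...

dispatch_group_notify(writingGroup, dispatch_get_main_queue(), ^{
    [assetWriter finishWriting];
});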

Additionally, I try to save the resulting file to the photo library with the following code...

Code language: Objective-C
// url points at the exported movie file (path is the export path used above).
NSURL *url=[[NSURL alloc] initFileURLWithPath:path];
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
if([library videoAtPathIsCompatibleWithSavedPhotosAlbum:url])
{
    [library writeVideoAtPathToSavedPhotosAlbum:url completionBlock:^(NSURL *assetURL, NSError *error)
     {
         if(error)
             NSLog(@"error=%@",error.localizedDescription);
         else
             NSLog(@"completed...");
     }];
} else
    NSLog(@"error, video not saved...");

[library release];
[url release];

...but I get this error:

Video /Users/cb/Library/Application Support/iPhone Simulator/4.2/Applications/E9865BF9-D190-4912-9248-66768B1AB635/Documents/export.mp4 cannot be saved to the saved photos album: Error Domain=NSOSStatusErrorDomain Code=-12950 "Movie could not be played." UserInfo=0x5e4fb90 {NSLocalizedDescription=Movie could not be played.}

The code works without any problem in another program, so what is wrong with this movie...?


2 Answers

Stack Overflow user

Posted on 2013-10-19 17:58:34

Code language: Objective-C
-(void)mergeAudioVideo
{
    // Paths for the intermediate video, the recorded audio, and the merged output.
    NSString *videoOutputPath = [_documentsDirectory stringByAppendingPathComponent:@"dummy_video.mp4"];
    NSString *outputFilePath  = [_documentsDirectory stringByAppendingPathComponent:@"final_video.mp4"];
    if ([[NSFileManager defaultManager] fileExistsAtPath:outputFilePath])
        [[NSFileManager defaultManager] removeItemAtPath:outputFilePath error:nil];

    NSURL *outputFileUrl = [NSURL fileURLWithPath:outputFilePath];
    NSString *filePath = [_documentsDirectory stringByAppendingPathComponent:@"newFile.m4a"];
    AVMutableComposition *mixComposition = [AVMutableComposition composition];

    NSURL *audio_inputFileUrl = [NSURL fileURLWithPath:filePath];
    NSURL *video_inputFileUrl = [NSURL fileURLWithPath:videoOutputPath];

    CMTime nextClipStartTime = kCMTimeZero;

    // Insert the full video track into the composition at time zero.
    AVURLAsset *videoAsset = [[AVURLAsset alloc] initWithURL:video_inputFileUrl options:nil];
    CMTimeRange video_timeRange = CMTimeRangeMake(kCMTimeZero, videoAsset.duration);
    AVMutableCompositionTrack *a_compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    [a_compositionVideoTrack insertTimeRange:video_timeRange ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:nextClipStartTime error:nil];

    // Insert the full audio track into the composition at time zero.
    AVURLAsset *audioAsset = [[AVURLAsset alloc] initWithURL:audio_inputFileUrl options:nil];
    CMTimeRange audio_timeRange = CMTimeRangeMake(kCMTimeZero, audioAsset.duration);
    AVMutableCompositionTrack *b_compositionAudioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    [b_compositionAudioTrack insertTimeRange:audio_timeRange ofTrack:[[audioAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] atTime:nextClipStartTime error:nil];

    // Export the composition; AVFileTypeQuickTimeMovie is the constant for
    // @"com.apple.quicktime-movie".
    AVAssetExportSession *_assetExport = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetMediumQuality];
    _assetExport.outputFileType = AVFileTypeQuickTimeMovie;
    _assetExport.outputURL = outputFileUrl;

    [_assetExport exportAsynchronouslyWithCompletionHandler:^(void) {
        if (_assetExport.status == AVAssetExportSessionStatusCompleted) {
            // Write code here to continue
        }
        else {
            // Write fail code here
        }
    }];
}

You can use this code to merge the audio and the video.
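
One detail worth adding: exportAsynchronouslyWithCompletionHandler: may invoke its handler on a background queue, so any UI work in the completion block should be dispatched back to the main queue. A minimal sketch of the success branch:

Code language: Objective-C
if (_assetExport.status == AVAssetExportSessionStatusCompleted) {
    // The completion handler is not guaranteed to run on the main queue,
    // so hop back to it before touching UI state.
    dispatch_async(dispatch_get_main_queue(), ^{
        NSLog(@"export finished: %@", _assetExport.outputURL);
        // continue here, e.g. save the file or update the UI
    });
}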

Votes: 10

Stack Overflow user

Posted on 2013-06-17 15:57:59

It seems that assetWriterAudioInput ignores the sample buffer timestamps when writing the audio. Do it this way (a rough sketch follows the steps below):

1) Write the video track.

2) Once it is done, mark it as finished, i.e. [videoWriterInput markAsFinished];

3) Call [assetWriter startSessionAtSourceTime:timeRangeStart];

4) Instantiate the audio reader and start writing the audio.
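
A rough Objective-C sketch of that sequential approach (an interpretation of the steps above, not the answerer's code; note that in practice -startSessionAtSourceTime: has to be called before the first sample is appended):

Code language: Objective-C
// Sketch only: write the whole video track first, then the whole audio track.
// Variable names are taken from the question; back-pressure handling is crude
// and assumes this runs on a background thread.
[assetReader startReading];
[assetWriter startWriting];
[assetWriter startSessionAtSourceTime:kCMTimeZero];

// Steps 1) and 2): drain the video output completely, then close the input.
CMSampleBufferRef videoBuffer = NULL;
while ((videoBuffer = [assetReaderVideoOutput copyNextSampleBuffer]))
{
    while (![assetWriterVideoInput isReadyForMoreMediaData])
        [NSThread sleepForTimeInterval:0.01];
    [assetWriterVideoInput appendSampleBuffer:videoBuffer];
    CFRelease(videoBuffer);
}
[assetWriterVideoInput markAsFinished];

// Steps 3) and 4): then drain the audio output the same way.
CMSampleBufferRef audioBuffer = NULL;
while ((audioBuffer = [assetReaderAudioOutput copyNextSampleBuffer]))
{
    while (![assetWriterAudioInput isReadyForMoreMediaData])
        [NSThread sleepForTimeInterval:0.01];
    [assetWriterAudioInput appendSampleBuffer:audioBuffer];
    CFRelease(audioBuffer);
}
[assetWriterAudioInput markAsFinished];

[assetWriter finishWriting];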

Votes: -3
The original content of this page is provided by Stack Overflow.
Original link:

https://stackoverflow.com/questions/5481268
