I am using OpenGL on iOS 7 to render the front camera's video capture to a UIView on the iPhone display (an iPhone 5). I use AVCaptureSessionPreset640x480 and pass it to the AVCaptureSession method:
[captureSession setSessionPreset:AVCaptureSessionPreset640x480];
However, the rendered video appears to be of a lower resolution than the preset above; it looks like AVCaptureSessionPreset352x288. In fact, no matter which of the following constants I pass, it makes no difference; the resolution stays the same:
NSString *const AVCaptureSessionPresetPhoto;
NSString *const AVCaptureSessionPresetHigh;
NSString *const AVCaptureSessionPresetMedium;
NSString *const AVCaptureSessionPresetLow;
NSString *const AVCaptureSessionPreset352x288;
NSString *const AVCaptureSessionPreset640x480;
NSString *const AVCaptureSessionPreset1280x720;
NSString *const AVCaptureSessionPreset1920x1080;
NSString *const AVCaptureSessionPresetiFrame960x540;
NSString *const AVCaptureSessionPresetiFrame1280x720;
NSString *const AVCaptureSessionPresetInputPriority;
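For reference, the session is set up roughly like this (a minimal sketch; the device lookup and error handling are simplified, and the OpenGL rendering path is omitted):

#import <AVFoundation/AVFoundation.h>

AVCaptureSession *captureSession = [[AVCaptureSession alloc] init];

// Find the front camera (simplified lookup; not shown in the question).
AVCaptureDevice *frontCamera = nil;
for (AVCaptureDevice *device in [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo]) {
    if (device.position == AVCaptureDevicePositionFront) {
        frontCamera = device;
        break;
    }
}

NSError *error = nil;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:frontCamera
                                                                    error:&error];
if (input && [captureSession canAddInput:input]) {
    [captureSession addInput:input];
}

// canSetSessionPreset: reports whether the current configuration supports the preset;
// setSessionPreset: raises if it does not.
if ([captureSession canSetSessionPreset:AVCaptureSessionPreset640x480]) {
    [captureSession setSessionPreset:AVCaptureSessionPreset640x480];
} else {
    NSLog(@"640x480 preset is not supported with the current inputs");
}

[captureSession startRunning];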
How can I check the resolution the camera is actually capturing at?
Thanks
Posted on 2014-08-24 08:04:52
Read the size of the buffers being captured, as shown below (of course, for AVCaptureSessionPresetPhoto you would capture a still image rather than read video frames):
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    CVPixelBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer, 0);
    size_t width  = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
    // "width" and "height" now hold your dimensions...
}
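Note that the delegate method above only fires if you have added an AVCaptureVideoDataOutput to the session and set its sample buffer delegate with setSampleBufferDelegate:queue:. If you would rather not inspect sample buffers at all, a rough alternative on iOS 7 (not part of the original answer; "device" here stands for the AVCaptureDevice feeding your session) is to read the dimensions of the device's currently active format:

#import <AVFoundation/AVFoundation.h>
#import <CoreMedia/CoreMedia.h>

// "device" is assumed to be the AVCaptureDevice attached to the session.
CMFormatDescriptionRef formatDescription = device.activeFormat.formatDescription;
CMVideoDimensions dimensions = CMVideoFormatDescriptionGetDimensions(formatDescription);
NSLog(@"Active capture format: %d x %d", dimensions.width, dimensions.height);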
https://stackoverflow.com/questions/25069027