I need to get a UIImage containing the uncompressed image data from a CMSampleBufferRef. I'm using this code:
[captureStillImageOutput captureStillImageAsynchronouslyFromConnection:connection
completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error)
{
// that famous function from Apple docs found on a lot of websites
// does NOT work for still images
UIImage *capturedImage = [self imageFromSampleBuffer:imageSampleBuffer];
}];

Here is a link to the imageFromSampleBuffer function.
But it doesn't work correctly. :(
There is the jpegStillImageNSDataRepresentation:imageSampleBuffer method, but it gives compressed data (well, because it's JPEG).
How can I create a UIImage from the rawest, non-compressed data after capturing a still image?
Maybe I should specify some settings for the video output? Here is what I'm currently using:
captureStillImageOutput = [[AVCaptureStillImageOutput alloc] init];
captureStillImageOutput.outputSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };

I've noticed that the output has a default value for AVVideoCodecKey, namely AVVideoCodecJPEG. Can it be avoided in any way, and does it even matter when capturing still images?
I found something related there: Raw image data from camera like "645 PRO", but I need just a UIImage, without using OpenCV or other third-party libraries.
Posted on 2013-03-27 12:36:29
The method imageFromSampleBuffer does work; in fact I'm using a modified version of it. But if I remember correctly, you need to set the outputSettings properly. I think you need to set the key to kCVPixelBufferPixelFormatTypeKey and the value to kCVPixelFormatType_32BGRA.
For example:
NSString* key = (NSString*)kCVPixelBufferPixelFormatTypeKey;
NSNumber* value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA];
NSDictionary* outputSettings = [NSDictionary dictionaryWithObject:value forKey:key];
[newStillImageOutput setOutputSettings:outputSettings];

EDIT
I use those settings to take still images, not video. Is your sessionPreset set to AVCaptureSessionPresetPhoto? There may be problems with that:
AVCaptureSession *newCaptureSession = [[AVCaptureSession alloc] init];
[newCaptureSession setSessionPreset:AVCaptureSessionPresetPhoto];
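For context, here is a minimal sketch (my addition, not part of the original answer) of how the photo session preset and the BGRA outputSettings might be wired together in one setup routine; the device lookup, variable names, and error handling are illustrative only:

AVCaptureSession *session = [[AVCaptureSession alloc] init];
[session setSessionPreset:AVCaptureSessionPresetPhoto];

// Illustrative device/input setup; real code should handle errors and nil devices.
AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error = nil;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:camera error:&error];
if (input && [session canAddInput:input]) {
    [session addInput:input];
}

// Ask the still image output for uncompressed BGRA pixel buffers instead of JPEG.
AVCaptureStillImageOutput *stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
stillImageOutput.outputSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
if ([session canAddOutput:stillImageOutput]) {
    [session addOutput:stillImageOutput];
}

[session startRunning];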
EDIT 2

The part where it is saved to a UIImage is identical to the one from the documentation. That's why I was asking about other origins of the problem, but I guess that was just grasping at straws. There is another way I know of, but that requires OpenCV.
- (UIImage *) imageFromSampleBuffer:(CMSampleBufferRef) sampleBuffer {
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer, 0);
    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);

    // Get the number of bytes per row for the pixel buffer
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    // Get the pixel buffer width and height
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);

    // Create a device-dependent RGB color space
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();

    // Create a bitmap graphics context with the sample buffer data
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8,
        bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);

    // Create a Quartz image from the pixel data in the bitmap graphics context
    CGImageRef quartzImage = CGBitmapContextCreateImage(context);

    // Unlock the pixel buffer
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

    // Free up the context and color space
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);

    // Create an image object from the Quartz image
    UIImage *image = [UIImage imageWithCGImage:quartzImage];

    // Release the Quartz image
    CGImageRelease(quartzImage);

    return (image);
}

I guess that doesn't help you much, sorry. I don't know of any other origins for your problem.
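As a usage note (my addition, not from the answer above): the completion handler of captureStillImageAsynchronouslyFromConnection: is not guaranteed to be called on the main queue, so a sketch of calling imageFromSampleBuffer: and then updating the UI might look like this (self.imageView is an assumed property):

[captureStillImageOutput captureStillImageAsynchronouslyFromConnection:connection
                                                     completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error)
{
    if (error || imageSampleBuffer == NULL) {
        return;
    }
    // Convert the sample buffer while it is still valid, inside the handler.
    UIImage *capturedImage = [self imageFromSampleBuffer:imageSampleBuffer];
    dispatch_async(dispatch_get_main_queue(), ^{
        // Hop to the main queue before touching UIKit; imageView is hypothetical.
        self.imageView.image = capturedImage;
    });
}];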
Posted on 2016-08-20 07:47:58
Here is a more effective way to do this:
UIImage *image = [UIImage imageWithData:[self imageToBuffer:sampleBuffer]];
- (NSData *) imageToBuffer:(CMSampleBufferRef)source {
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(source);
    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);
    void *src_buff = CVPixelBufferGetBaseAddress(imageBuffer);

    NSData *data = [NSData dataWithBytes:src_buff length:bytesPerRow * height];

    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
    return data;
}

https://stackoverflow.com/questions/15572893
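One caveat worth adding (my note, not the answerer's): imageWithData: expects encoded image data such as JPEG or PNG, so passing it the raw BGRA bytes returned above may not yield a valid UIImage. Here is a sketch of rebuilding a UIImage directly from those raw bytes, assuming the buffer was kCVPixelFormatType_32BGRA and that width, height and bytesPerRow were recorded alongside the NSData (the helper name is hypothetical):

- (UIImage *)imageFromRawBGRAData:(NSData *)data
                            width:(size_t)width
                           height:(size_t)height
                      bytesPerRow:(size_t)bytesPerRow
{
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    // Wrap the raw bytes in a data provider and build a CGImage around them.
    CGDataProviderRef provider = CGDataProviderCreateWithCFData((__bridge CFDataRef)data);
    CGImageRef cgImage = CGImageCreate(width, height,
                                       8,            // bits per component
                                       32,           // bits per pixel (BGRA)
                                       bytesPerRow,
                                       colorSpace,
                                       kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst,
                                       provider,
                                       NULL, false, kCGRenderingIntentDefault);
    UIImage *image = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
    CGDataProviderRelease(provider);
    CGColorSpaceRelease(colorSpace);
    return image;
}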