
Taking a photo with a custom camera in Swift 3

Stack Overflow user
Asked on 2016-10-15 11:13:45
3 answers · 5.5K views · 0 following · Score: 5

In Swift 2.3, I used this code to take a photo with my custom camera:

```swift
func didPressTakePhoto() {
    if let videoConnection = stillImageOutput!.connection(withMediaType: AVMediaTypeVideo) {
        stillImageOutput?.captureStillImageAsynchronouslyFromConnection(videoConnection, completionHandler: { (sampleBuffer, error) -> Void in
            if sampleBuffer != nil {
                let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(sampleBuffer)
                let dataProvider = CGDataProviderCreateWithCFData(imageData)
                let cgImageRef = CGImageCreateWithJPEGDataProvider(dataProvider, nil, true, CGColorRenderingIntent.RenderingIntentDefault)
                let image = UIImage(CGImage: cgImageRef!, scale: 1.0, orientation: UIImageOrientation.Right)

                self.captureImageView.image = image
            }
        })
    }
}
```

But this line:

stillImageOutput?.captureStillImageAsynchronouslyFromConnection(videoConnection, completionHandler: { (sampleBuffer, error) -> Void in

shows this error:

Value of type 'AVCapturePhotoOutput' has no member 'captureStillImageAsynchronouslyFromConnection'

I tried to fix it myself, but I kept getting more and more errors, which is why I am posting my original code.

Does anyone know how to make my code work again?

Thanks.


3 Answers

Stack Overflow user

Accepted answer

Posted on 2016-11-02 22:32:12

Thanks to Sharpkit, I found my solution (this code works for me):

```swift
func capture(_ captureOutput: AVCapturePhotoOutput, didFinishProcessingPhotoSampleBuffer photoSampleBuffer: CMSampleBuffer?, previewPhotoSampleBuffer: CMSampleBuffer?, resolvedSettings: AVCaptureResolvedPhotoSettings, bracketSettings: AVCaptureBracketedStillImageSettings?, error: Error?) {

    if let error = error {
        print(error.localizedDescription)
    }

    if let sampleBuffer = photoSampleBuffer, let previewBuffer = previewPhotoSampleBuffer,
        let dataImage = AVCapturePhotoOutput.jpegPhotoDataRepresentation(forJPEGSampleBuffer: sampleBuffer, previewPhotoSampleBuffer: previewBuffer) {

        let dataProvider = CGDataProvider(data: dataImage as CFData)
        let cgImageRef = CGImage(jpegDataProviderSource: dataProvider!, decode: nil, shouldInterpolate: true, intent: .absoluteColorimetric)

        // The sensor delivers the image rotated, so fix the orientation here.
        let image = UIImage(cgImage: cgImageRef!, scale: 1.0, orientation: UIImageOrientation.right)

        let croppedImage = self.cropToSquare(image: image)
        let newImage = self.scaleImageWith(croppedImage, and: CGSize(width: 600, height: 600))

        self.tempImageView.image = newImage
        self.tempImageView.isHidden = false
    }
}
```
Score: 4

Stack Overflow user

Posted on 2016-10-15 12:17:28

You can use AVCapturePhotoOutput like this in Swift 3.

You need the AVCapturePhotoCaptureDelegate, which returns a CMSampleBuffer.

You can also get a preview image if you give the AVCapturePhotoSettings a previewFormat.

```swift
class CameraCaptureOutput: NSObject, AVCapturePhotoCaptureDelegate {

    let cameraOutput = AVCapturePhotoOutput()

    func capturePhoto() {
        let settings = AVCapturePhotoSettings()
        let previewPixelType = settings.availablePreviewPhotoPixelFormatTypes.first!
        let previewFormat = [kCVPixelBufferPixelFormatTypeKey as String: previewPixelType,
                             kCVPixelBufferWidthKey as String: 160,
                             kCVPixelBufferHeightKey as String: 160]
        settings.previewPhotoFormat = previewFormat
        self.cameraOutput.capturePhoto(with: settings, delegate: self)
    }

    func capture(_ captureOutput: AVCapturePhotoOutput, didFinishProcessingPhotoSampleBuffer photoSampleBuffer: CMSampleBuffer?, previewPhotoSampleBuffer: CMSampleBuffer?, resolvedSettings: AVCaptureResolvedPhotoSettings, bracketSettings: AVCaptureBracketedStillImageSettings?, error: Error?) {

        if let error = error {
            print(error.localizedDescription)
        }

        if let sampleBuffer = photoSampleBuffer, let previewBuffer = previewPhotoSampleBuffer,
            let dataImage = AVCapturePhotoOutput.jpegPhotoDataRepresentation(forJPEGSampleBuffer: sampleBuffer, previewPhotoSampleBuffer: previewBuffer) {
            // The JPEG data can be turned into a UIImage directly.
            if let image = UIImage(data: dataImage) {
                print(image.size)
            }
        }
    }
}
```
Score: 7

Stack Overflow user

Posted on 2017-02-09 05:40:50

Nice code. Thanks for the help and the example.

To clarify for those of us who are slower on the uptake: the capture(_ ...) method is called behind the scenes when self.cameraOutput.capturePhoto(with: settings, delegate: self) is called in your takePhoto method (or whatever you named it). You never call the capture method directly yourself; it happens automatically.
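In other words, the flow looks roughly like this (a minimal sketch; the CameraViewController name, the button action, and the session setup are assumptions, not part of the answers above):

```swift
import AVFoundation
import UIKit

class CameraViewController: UIViewController, AVCapturePhotoCaptureDelegate {

    let cameraOutput = AVCapturePhotoOutput()

    // 1. You call this yourself, e.g. from a button tap...
    func takePhoto() {
        let settings = AVCapturePhotoSettings()
        // ...which asks the framework to start a capture.
        cameraOutput.capturePhoto(with: settings, delegate: self)
    }

    // 2. The framework calls this back for you once the photo is ready.
    //    You never invoke it directly.
    func capture(_ captureOutput: AVCapturePhotoOutput, didFinishProcessingPhotoSampleBuffer photoSampleBuffer: CMSampleBuffer?, previewPhotoSampleBuffer: CMSampleBuffer?, resolvedSettings: AVCaptureResolvedPhotoSettings, bracketSettings: AVCaptureBracketedStillImageSettings?, error: Error?) {
        // Handle the captured photo here.
    }
}
```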

Score: 2
Original content provided by Stack Overflow; translation supported by Tencent Cloud.
Original link: https://stackoverflow.com/questions/40058320