I wrote a UICropperViewController that works perfectly for images taken in landscape mode. Images taken in portrait mode are a big problem. The following picture shows a simple image with the yellow crop box:
The cropping result is:
Now, with a portrait image, we get this situation:
And the result:
So what is happening here? The original image is automatically rotated to the left.
I did a lot of research and basically found two suggestions:
Suggestion 1
Save the image orientation before cropping and restore it afterwards.
func didTapCropButton(sender: AnyObject) {
    let originalOrientation = self.imageView.image?.imageOrientation; // raw value of originalOrientation is `3`, so it's rotated to the right
    let croppedCGImage = self.imageView.image?.cgImage?.cropping(to: self.cropArea); // create a cropped cgImage
    let croppedImage = UIImage(cgImage: croppedCGImage!, scale: (self.imageView.image?.scale)!, orientation: (originalOrientation)!); // create the UIImage from the cgImage cropping result and the original orientation

    if (self.callback != nil) {
        self.callback.croppingDone(image: croppedImage);
    }

    self.dismiss(animated: true, completion: nil);
}
But the result now is:
Obviously this suggestion does not work, because it simply rotates the image that has already been cropped.
Suggestion 2
An orientation fix. I found the following code, which promises to fix the problem:
func didTapCropButton(sender: AnyObject) {
    let image = self.imageView.image?.fixOrientation();
    let croppedCGImage = image?.cgImage?.cropping(to: self.cropArea);
    let croppedImage = UIImage(cgImage: croppedCGImage!);

    if (self.callback != nil) {
        self.callback.croppingDone(image: croppedImage);
    }

    self.dismiss(animated: true, completion: nil);
}

extension UIImage {

    /// Extension to fix orientation of an UIImage without EXIF
    func fixOrientation() -> UIImage {
        guard let cgImage = cgImage else { return self }

        if imageOrientation == .up { return self }

        var transform = CGAffineTransform.identity

        switch imageOrientation {
        case .down, .downMirrored:
            transform = transform.translatedBy(x: size.width, y: size.height)
            transform = transform.rotated(by: CGFloat(M_PI))
        case .left, .leftMirrored:
            transform = transform.translatedBy(x: size.width, y: 0)
            transform = transform.rotated(by: CGFloat(M_PI_2))
        case .right, .rightMirrored:
            transform = transform.translatedBy(x: 0, y: size.height)
            transform = transform.rotated(by: CGFloat(-M_PI_2))
        case .up, .upMirrored:
            break
        }

        switch imageOrientation {
        case .upMirrored, .downMirrored:
            transform.translatedBy(x: size.width, y: 0)
            transform.scaledBy(x: -1, y: 1)
        case .leftMirrored, .rightMirrored:
            transform.translatedBy(x: size.height, y: 0)
            transform.scaledBy(x: -1, y: 1)
        case .up, .down, .left, .right:
            break
        }

        if let ctx = CGContext(data: nil,
                               width: Int(size.width),
                               height: Int(size.height),
                               bitsPerComponent: cgImage.bitsPerComponent,
                               bytesPerRow: 0,
                               space: cgImage.colorSpace!,
                               bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue) {
            ctx.concatenate(transform)

            switch imageOrientation {
            case .left, .leftMirrored, .right, .rightMirrored:
                ctx.draw(cgImage, in: CGRect(x: 0, y: 0, width: size.height, height: size.width))
            default:
                ctx.draw(cgImage, in: CGRect(x: 0, y: 0, width: size.width, height: size.height))
            }

            if let finalImage = ctx.makeImage() {
                return (UIImage(cgImage: finalImage))
            }
        }

        // something failed -- return original
        return self
    }
}
But this leads to a wrong crop area. The result can now look like this:
So what is the real solution to this problem? And anyway, what is the point of automatically rotating an image when the user does not want that? Can this automatic rotation be disabled?
EDIT
The full source of my cropper:
import Foundation
import UIKit

protocol CropperCallback {
    func croppingDone(image: UIImage);
    func croppingCancelled();
}

class CropperViewController : UIViewController {

    @IBOutlet var imageView: UIImageView!;

    var imageViewScaleCurrent: CGFloat! = 1.0;
    var imageViewScaleMin: CGFloat! = 0.5;
    var imageViewScaleMax: CGFloat! = 5.0;

    @IBOutlet var cropAreaView: CropAreaView!;
    @IBOutlet weak var cropAreaViewConstraintWidth: NSLayoutConstraint!
    @IBOutlet weak var cropAreaViewConstraintHeight: NSLayoutConstraint!

    @IBOutlet var btnCrop: UIButton!;
    @IBOutlet var btnCancel: UIButton!;

    var callback: CropperCallback! = nil;

    var image: UIImage! = nil;
    var imageOriginalWidth: CGFloat!;
    var imageOriginalHeight: CGFloat!;

    var cropWidth: CGFloat! = 287;
    var cropHeight: CGFloat! = 292;
    var cropHeightFix: CGFloat! = 1.0;

    var cropArea: CGRect {
        get {
            let factor = self.imageView.image!.size.width / self.view.frame.width;
            let scale = 1 / self.imageViewScaleCurrent;
            let x = (self.cropAreaView.frame.origin.x - self.imageView.frame.origin.x) * scale * factor;
            let y = (self.cropAreaView.frame.origin.y - self.imageView.frame.origin.y) * scale * factor;
            let width = self.cropAreaView.frame.size.width * scale * factor;
            let height = self.cropAreaView.frame.size.height * scale * factor;
            return CGRect(x: x, y: y, width: width, height: height);
        }
    }

    static func storyboardInstance() -> CropperViewController? {
        let storyboard = UIStoryboard(name: String(describing: NSStringFromClass(CropperViewController.classForCoder()).components(separatedBy: ".").last!), bundle: nil);
        return storyboard.instantiateInitialViewController() as? CropperViewController;
    }

    override func viewDidLoad() {
        super.viewDidLoad();

        /*
        if (self.image.imageOrientation != .up) {
            self.image = UIImage(cgImage: self.image.cgImage!, scale: self.image.scale, orientation: UIImageOrientation(rawValue: 0)!);
        }
        */

        self.imageView.image = self.image;
        self.imageView.isUserInteractionEnabled = true;
        self.imageView.addGestureRecognizer(UIPanGestureRecognizer(target: self, action: #selector(self.handlePan(_:))));
        self.imageView.addGestureRecognizer(UIPinchGestureRecognizer(target: self, action: #selector(self.handlePinch(_:))));

        self.cropAreaViewConstraintWidth.constant = self.cropWidth;
        self.cropAreaViewConstraintHeight.constant = self.cropHeight;

        self.btnCrop.addTarget(self, action: #selector(self.didTapCropButton), for: UIControlEvents.touchUpInside);
        self.btnCancel.addTarget(self, action: #selector(self.didTapCancelButton), for: UIControlEvents.touchUpInside);
    }

    override func viewDidLayoutSubviews() {
        super.viewDidLayoutSubviews();

        let imageOriginalRect = self.getRectOfImageInImageView(imageView: self.imageView);
        self.imageOriginalWidth = imageOriginalRect.size.width;
        self.imageOriginalHeight = imageOriginalRect.size.height;

        self.createOverlay();
    }

    func createOverlay() {
        let path = UIBezierPath(rect: CGRect(x: 0, y: 0, width: self.view.frame.size.width, height: self.view.frame.size.height));
        let pathRect = UIBezierPath(rect: CGRect(x: self.cropAreaView.frame.origin.x, y: self.cropAreaView.frame.origin.y, width: self.cropWidth, height: self.cropHeight));
        path.append(pathRect);
        path.usesEvenOddFillRule = true;

        let fillLayer = CAShapeLayer();
        fillLayer.path = path.cgPath;
        fillLayer.fillRule = kCAFillRuleEvenOdd;
        fillLayer.fillColor = UIColor.white.cgColor;
        fillLayer.opacity = 0.1;

        self.view.layer.addSublayer(fillLayer);
    }

    func handlePan(_ gestureRecognizer: UIPanGestureRecognizer) {
        if gestureRecognizer.state == .began || gestureRecognizer.state == .changed {
            let rect = self.getRectOfImageInImageView(imageView: self.imageView);
            let xImage = rect.origin.x;
            let yImage = rect.origin.y;
            let widthImage = rect.size.width;
            let heightImage = rect.size.height;

            let xCropView = self.cropAreaView.frame.origin.x;
            let yCropView = self.cropAreaView.frame.origin.y;
            let widthCropView = self.cropAreaView.frame.size.width;
            let heightCropView = self.cropAreaView.frame.size.height;

            let translation = gestureRecognizer.translation(in: self.view);

            var x: CGFloat;
            var y: CGFloat;

            if (translation.x > 0) {
                if (!(xImage >= xCropView)) {
                    x = gestureRecognizer.view!.center.x + translation.x;
                } else {
                    x = gestureRecognizer.view!.center.x;
                }
            } else if (translation.x < 0) {
                if (!((xImage + widthImage) <= (xCropView + widthCropView))) {
                    x = gestureRecognizer.view!.center.x + translation.x;
                } else {
                    x = gestureRecognizer.view!.center.x;
                }
            } else {
                x = gestureRecognizer.view!.center.x;
            }

            if (translation.y > 0) {
                if (!(yImage >= (yCropView - self.cropHeightFix))) {
                    y = gestureRecognizer.view!.center.y + translation.y;
                } else {
                    y = gestureRecognizer.view!.center.y;
                }
            } else if (translation.y < 0) {
                if (!((yImage + heightImage) <= (yCropView + heightCropView + self.cropHeightFix))) {
                    y = gestureRecognizer.view!.center.y + translation.y;
                } else {
                    y = gestureRecognizer.view!.center.y;
                }
            } else {
                y = gestureRecognizer.view!.center.y;
            }

            gestureRecognizer.view!.center = CGPoint(x: x, y: y);
            gestureRecognizer.setTranslation(CGPoint.zero, in: self.view);

            self.fixImageViewPosition();
        }
    }

    func handlePinch(_ gestureRecognizer: UIPinchGestureRecognizer) {
        if let view = gestureRecognizer.view {
            let widthCropView = self.cropAreaView.frame.size.width;
            let heightCropView = self.cropAreaView.frame.size.height;

            if (((self.imageViewScaleCurrent * gestureRecognizer.scale * self.imageOriginalWidth) > widthCropView)
                && ((self.imageViewScaleCurrent * gestureRecognizer.scale * self.imageOriginalHeight) > (heightCropView + (2 * self.cropHeightFix)))
                && ((self.imageViewScaleCurrent * gestureRecognizer.scale) < self.imageViewScaleMax)) {
                self.imageViewScaleCurrent = self.imageViewScaleCurrent * gestureRecognizer.scale;
                view.transform = CGAffineTransform(scaleX: self.imageViewScaleCurrent, y: self.imageViewScaleCurrent);
            }

            gestureRecognizer.scale = 1.0;

            self.fixImageViewPosition();
        }
    }

    func fixImageViewPosition() {
        let rect = self.getRectOfImageInImageView(imageView: self.imageView);
        let xImage = rect.origin.x;
        let yImage = rect.origin.y;
        let widthImage = rect.size.width;
        let heightImage = rect.size.height;

        let xCropView = self.cropAreaView.frame.origin.x;
        let yCropView = self.cropAreaView.frame.origin.y;
        let widthCropView = self.cropAreaView.frame.size.width;
        let heightCropView = self.cropAreaView.frame.size.height;

        if (xImage > xCropView) {
            self.imageView.frame = CGRect(x: xCropView, y: self.imageView.frame.origin.y, width: widthImage, height: heightImage);
        }

        if ((xImage + widthImage) < (xCropView + widthCropView)) {
            self.imageView.frame = CGRect(x: ((xCropView + widthCropView) - widthImage), y: self.imageView.frame.origin.y, width: widthImage, height: heightImage);
        }

        if (yImage > yCropView) {
            self.imageView.frame = CGRect(x: self.imageView.frame.origin.x, y: (yCropView - self.cropHeightFix), width: widthImage, height: heightImage);
        }

        if ((yImage + heightImage) < (yCropView + heightCropView + self.cropHeightFix)) {
            self.imageView.frame = CGRect(x: self.imageView.frame.origin.x, y: ((yCropView + heightCropView + self.cropHeightFix) - heightImage), width: widthImage, height: heightImage);
        }
    }

    func getRectOfImageInImageView(imageView: UIImageView) -> CGRect {
        let imageViewSize = imageView.frame.size;
        let imageSize = imageView.image!.size;

        let scaleW = imageViewSize.width / imageSize.width;
        let scaleH = imageViewSize.height / imageSize.height;
        let aspect = min(scaleW, scaleH);

        var imageRect = CGRect(x: 0, y: 0, width: (imageSize.width * aspect), height: (imageSize.height * aspect));

        imageRect.origin.x = (imageViewSize.width - imageRect.size.width) / 2;
        imageRect.origin.y = (imageViewSize.height - imageRect.size.height) / 2;

        imageRect.origin.x += imageView.frame.origin.x;
        imageRect.origin.y += imageView.frame.origin.y;

        return imageRect;
    }

    func getCGImageWithCorrectOrientation(_ image : UIImage) -> CGImage {
        if (image.imageOrientation == UIImageOrientation.up) {
            return image.cgImage!;
        }

        var transform : CGAffineTransform = CGAffineTransform.identity;

        switch (image.imageOrientation) {
        case UIImageOrientation.right, UIImageOrientation.rightMirrored:
            transform = transform.translatedBy(x: 0, y: image.size.height);
            transform = transform.rotated(by: CGFloat(-1.0 * M_PI_2));
            break;
        case UIImageOrientation.left, UIImageOrientation.leftMirrored:
            transform = transform.translatedBy(x: image.size.width, y: 0);
            transform = transform.rotated(by: CGFloat(M_PI_2));
            break;
        case UIImageOrientation.down, UIImageOrientation.downMirrored:
            transform = transform.translatedBy(x: image.size.width, y: image.size.height);
            transform = transform.rotated(by: CGFloat(M_PI));
            break;
        default:
            break;
        }

        switch (image.imageOrientation) {
        case UIImageOrientation.rightMirrored, UIImageOrientation.leftMirrored:
            transform = transform.translatedBy(x: image.size.height, y: 0);
            transform = transform.scaledBy(x: -1, y: 1);
            break;
        case UIImageOrientation.downMirrored, UIImageOrientation.upMirrored:
            transform = transform.translatedBy(x: image.size.width, y: 0);
            transform = transform.scaledBy(x: -1, y: 1);
            break;
        default:
            break;
        }

        let contextWidth : Int;
        let contextHeight : Int;

        switch (image.imageOrientation) {
        case UIImageOrientation.left, UIImageOrientation.leftMirrored,
             UIImageOrientation.right, UIImageOrientation.rightMirrored:
            contextWidth = (image.cgImage?.height)!;
            contextHeight = (image.cgImage?.width)!;
            break;
        default:
            contextWidth = (image.cgImage?.width)!;
            contextHeight = (image.cgImage?.height)!;
            break;
        }

        let context : CGContext = CGContext(data: nil,
                                            width: contextWidth,
                                            height: contextHeight,
                                            bitsPerComponent: image.cgImage!.bitsPerComponent,
                                            bytesPerRow: image.cgImage!.bytesPerRow,
                                            space: image.cgImage!.colorSpace!,
                                            bitmapInfo: image.cgImage!.bitmapInfo.rawValue)!;

        context.concatenate(transform);
        context.draw(image.cgImage!, in: CGRect(x: 0, y: 0, width: CGFloat(contextWidth), height: CGFloat(contextHeight)));

        let cgImage = context.makeImage();
        return cgImage!;
    }

    func didTapCropButton(sender: AnyObject) {
        let fixedImage = self.getCGImageWithCorrectOrientation(self.imageView.image!);
        // let image = self.imageView.image?.fixOrientation();
        let croppedCGImage = fixedImage.cropping(to: self.cropArea);
        let croppedImage = UIImage(cgImage: croppedCGImage!);

        if (self.callback != nil) {
            self.callback.croppingDone(image: croppedImage);
        }

        self.dismiss(animated: true, completion: nil);
    }

    func didTapCancelButton(sender: AnyObject) {
        if (self.callback != nil) {
            self.callback.croppingCancelled();
        }

        self.dismiss(animated: true, completion: nil);
    }

}

extension UIImageView {

    func imageFrame() -> CGRect {
        let imageViewSize = self.frame.size;
        guard let imageSize = self.image?.size else {
            return CGRect.zero;
        }

        let imageRatio = imageSize.width / imageSize.height;
        let imageViewRatio = imageViewSize.width / imageViewSize.height;

        if (imageRatio < imageViewRatio) {
            let scaleFactor = imageViewSize.height / imageSize.height;
            let width = imageSize.width * scaleFactor;
            let topLeftX = (imageViewSize.width - width) * 0.5;
            return CGRect(x: topLeftX, y: 0, width: width, height: imageViewSize.height);
        } else {
            let scaleFactor = imageViewSize.width / imageSize.width;
            let height = imageSize.height * scaleFactor;
            let topLeftY = (imageViewSize.height - height) * 0.5;
            return CGRect(x: 0, y: topLeftY, width: imageViewSize.width, height: height);
        }
    }

}

extension UIImage {

    // Extension to fix orientation of an UIImage without EXIF
    func fixOrientation() -> UIImage {
        guard let cgImage = self.cgImage else { return self; }

        if self.imageOrientation == .up {
            return self;
        }

        var transform = CGAffineTransform.identity;

        switch self.imageOrientation {
        case .down, .downMirrored:
            transform = transform.translatedBy(x: self.size.width, y: self.size.height);
            transform = transform.rotated(by: CGFloat(M_PI));
        case .left, .leftMirrored:
            transform = transform.translatedBy(x: self.size.width, y: 0);
            transform = transform.rotated(by: CGFloat(M_PI_2));
        case .right, .rightMirrored:
            transform = transform.translatedBy(x: 0, y: self.size.height);
            transform = transform.rotated(by: CGFloat(-M_PI_2));
        case .up, .upMirrored:
            break;
        }

        switch self.imageOrientation {
        case .upMirrored, .downMirrored:
            transform.translatedBy(x: self.size.width, y: 0);
            transform.scaledBy(x: -1, y: 1);
        case .leftMirrored, .rightMirrored:
            transform.translatedBy(x: self.size.height, y: 0);
            transform.scaledBy(x: -1, y: 1);
        case .up, .down, .left, .right:
            break;
        }

        if let ctx = CGContext(data: nil,
                               width: Int(self.size.width),
                               height: Int(self.size.height),
                               bitsPerComponent: cgImage.bitsPerComponent,
                               bytesPerRow: 0,
                               space: cgImage.colorSpace!,
                               bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue) {
            ctx.concatenate(transform);

            switch self.imageOrientation {
            case .left, .leftMirrored, .right, .rightMirrored:
                ctx.draw(cgImage, in: CGRect(x: 0, y: 0, width: self.size.height, height: self.size.width));
            default:
                ctx.draw(cgImage, in: CGRect(x: 0, y: 0, width: self.size.width, height: self.size.height));
            }

            if let finalImage = ctx.makeImage() {
                return (UIImage(cgImage: finalImage));
            }
        }

        // something failed -- return original
        return self;
    }

}
Sulthan · 8
You have to understand the scale and orientation properties.
Your suggestion 1 (keeping the original image orientation) is obviously a correct one, and it would work if you were also able to rotate and scale your cropArea.
Your suggestion 2 handles the rotation nicely, but you still have to scale the cropArea. At the moment you are not handling the scale at all.
(A minor note: rotating the cropArea will probably perform better than rotating the whole image, see /sf/ask/17360801/.)
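To make that "rotate the cropArea" idea concrete, here is a minimal sketch for the .right case only (the orientation in this question). The helper name and the exact mapping are my own, not taken from the linked answer, and it assumes the crop rect is expressed in the oriented image's point coordinates, as the cropArea property above appears to compute it:

// Hypothetical helper: converts a crop rect given in the oriented,
// point-based coordinate space of a UIImage into the pixel coordinate
// space of its underlying cgImage. Only .up and .right are handled here.
func rawCropRect(for cropAreaInPoints: CGRect, in image: UIImage) -> CGRect {
    // First go from points to pixels.
    let r = cropAreaInPoints.applying(CGAffineTransform(scaleX: image.scale, y: image.scale))

    switch image.imageOrientation {
    case .right:
        // The raw bitmap is rotated 90° clockwise for display, so the crop
        // rect rotates 90° counter-clockwise back into raw coordinates.
        let displayWidthInPixels = image.size.width * image.scale
        return CGRect(x: r.minY,
                      y: displayWidthInPixels - r.maxX,
                      width: r.height,
                      height: r.width)
    default:
        return r // .up; the remaining orientations are left out of this sketch
    }
}

With such a mapping, suggestion 1 becomes viable: crop the untouched cgImage with the mapped rect and wrap the result with UIImage(cgImage:scale:orientation:), passing the original scale and orientation, so the bitmap never has to be redrawn.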
You have to:
Scale (multiply) the cropArea by the image's scale.
Use the original image's scale when creating the result.
For example, if your UIImage has a size of 200x100 and a scale of 2x (it is a retina image), your cgImage will have a size of 400x200, but you are still cropping with a 200x100-based crop area!
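To make that arithmetic concrete, here is a tiny illustration (the sizes are the ones from the example above, the crop rect itself is made up):

let pointSize = CGSize(width: 200, height: 100)   // UIImage.size, in points
let scale: CGFloat = 2                            // UIImage.scale (retina)

// The backing CGImage is measured in pixels: 400x200.
let pixelSize = CGSize(width: pointSize.width * scale, height: pointSize.height * scale)

// A crop rect computed in points has to be scaled the same way before
// it is passed to CGImage.cropping(to:), which works in pixels.
let cropInPoints = CGRect(x: 10, y: 20, width: 50, height: 40)
let cropInPixels = cropInPoints.applying(CGAffineTransform(scaleX: scale, y: scale)) // (20, 40, 100, 80)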
Something like:
func didTapCropButton(sender: AnyObject) {
    guard let image = self.imageView.image else {
        return
    }

    let cgImage = self.getCGImageWithCorrectOrientation(image)

    let scaledCropArea = CGRect(
        x: self.cropArea.origin.x * image.scale,
        y: self.cropArea.origin.y * image.scale,
        width: self.cropArea.width * image.scale,
        height: self.cropArea.height * image.scale
    )

    let croppedCGImage = cgImage.cropping(to: scaledCropArea)
    let croppedImage = UIImage(cgImage: croppedCGImage!, scale: image.scale, orientation: .up)

    if (self.callback != nil) {
        self.callback.croppingDone(image: croppedImage)
    }

    self.dismiss(animated: true, completion: nil)
}
The automatic rotation and transform of a UIImage is just an optimization. Thanks to that optimization, multiple images can share the same storage (the same memory data). The optimization is done in the asset loader and you cannot disable it.
Also, see /sf/ask/17360801/ for a simpler and safer implementation.
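If the goal is simply to get rid of the orientation flag before cropping, one possible shape of such a simpler implementation (my own sketch, not necessarily what the linked answer shows, and it assumes iOS 10's UIGraphicsImageRenderer is available) is to let UIKit redraw the image, because drawing a UIImage applies its orientation:

import UIKit

extension UIImage {

    /// Hypothetical helper: returns an equivalent image whose orientation is .up.
    /// UIKit applies the orientation flag while drawing, so the rendered copy
    /// needs no further rotation handling.
    func orientedUp() -> UIImage {
        guard imageOrientation != .up else { return self }

        let format = UIGraphicsImageRendererFormat()
        format.scale = scale   // keep the original scale so the pixel size is unchanged

        return UIGraphicsImageRenderer(size: size, format: format).image { _ in
            draw(in: CGRect(origin: .zero, size: size))
        }
    }
}

The rendered copy's cgImage then matches what is shown on screen, so only the point-to-pixel scaling of the cropArea is still needed before calling cropping(to:).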