Nuke: Gaussian blur without border artifacts (CoreImage bug)

Issue

The Gaussian blur CIFilter (CIGaussianBlur) naturally produces gray artifacts at the borders of the output image. When blurring an image, a common requirement is to avoid these artifacts.

Proposed solution

Add a new filter that extends the edge pixels indefinitely and crops the image back to its original extent after blurring.

/// A Gaussian blur filter that clamps to extent to avoid the gray border artefact.
struct GaussianBlurClampedToExtent: ImageProcessing, Hashable, CustomStringConvertible {    
    private let radius: Int
    
    /// Initializes the receiver with a blur radius.
    public init(radius: Int = 8) {
        self.radius = radius
    }
    
    /// Applies `CIGaussianBlur` filter to the image.
    public func process(image: Image, context: ImageProcessingContext?) -> Image? {
        
        // Get CI image
        let ciImageOptional: CoreImage.CIImage? = {
            if let image = image.ciImage {
                return image
            }
            if let image = image.cgImage {
                return CoreImage.CIImage(cgImage: image)
            }
            return nil
        }()
        
        // Ensure CI image was retrieved
        guard let ciImage = ciImageOptional else { return nil }
        
        // Remember original image extent
        let extent = ciImage.extent
        
        // Create image with infinitely extended border pixels to prevent gray edges from blur filter
        let inputImage: CIImage = ciImage.clampedToExtent()
        
        // Create blur filter
        let filter = CIFilter(name: "CIGaussianBlur", parameters: [kCIInputRadiusKey: radius, kCIInputImageKey: inputImage])
        
        // Get filtered image
        guard let filteredImage = filter?.outputImage else { return nil }
        
        // Create a Core Image context to render the output with low priority
        let context = CIContext(options: [.priorityRequestLow: true])
        
        // Create CI image cropped to original extent
        guard let imageRef: CGImage = context.createCGImage(filteredImage, from: extent) else { return nil }
        return UIImage(cgImage: imageRef, scale: image.scale, orientation: image.imageOrientation)
    }

    public var identifier: String {
        return "com.github.kean/nuke/gaussian_blur_clamped_to_extent?radius=\(radius)"
    }

    public var hashableIdentifier: AnyHashable {
        return self
    }

    public var description: String {
        return "GaussianBlurClampedToExtent(radius: \(radius))"
    }
}
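For context, a processor like this is attached per request. A minimal usage sketch, assuming a Nuke version whose ImageRequest accepts a processors array and whose ImagePipeline offers loadImage(with:completion:); the URL is illustrative:

```swift
import Foundation
import UIKit
import Nuke

// Hypothetical usage of the proposed processor (illustrative URL).
let url = URL(string: "https://example.com/image.jpeg")!
let request = ImageRequest(
    url: url,
    processors: [GaussianBlurClampedToExtent(radius: 8)]
)

ImagePipeline.shared.loadImage(with: request) { result in
    if case .success(let response) = result {
        // The delivered image is blurred without the gray border artifact.
        print(response.image.size)
    }
}
```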

Alternative solutions

Extend the existing GaussianBlur filter with a Boolean clampToExtent option: "com.github.kean/nuke/gaussian_blur?radius=\(radius)&clampToExtent=\(clampToExtent)"
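A rough sketch of what that alternative could look like, with a single processor and an opt-in clamp (the parameter name and the identifier format are illustrative, not a shipped API):

```swift
import CoreImage

/// Sketch: one Gaussian blur processor with an optional clamp (illustrative only).
public struct GaussianBlur {
    private let radius: Int
    private let clampToExtent: Bool

    public init(radius: Int = 8, clampToExtent: Bool = false) {
        self.radius = radius
        self.clampToExtent = clampToExtent
    }

    public var identifier: String {
        "com.github.kean/nuke/gaussian_blur?radius=\(radius)&clampToExtent=\(clampToExtent)"
    }

    func blurred(_ ciImage: CIImage) -> CIImage? {
        // Optionally extend the edge pixels before blurring, then crop back.
        let input = clampToExtent ? ciImage.clampedToExtent() : ciImage
        let filter = CIFilter(name: "CIGaussianBlur", parameters: [
            kCIInputRadiusKey: radius,
            kCIInputImageKey: input
        ])
        return filter?.outputImage?.cropped(to: ciImage.extent)
    }
}
```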

About this issue

  • Original URL
  • State: closed
  • Created 5 years ago
  • Reactions: 1
  • Comments: 17 (7 by maintainers)

Most upvoted comments

This is not a Core Image bug, and the method is not correct: a Gaussian blur natively loses the image border in proportion to the blur radius. It is the library code that is buggy; you need to crop the result, which can be done with CIAffineClamp:

//
//  NukeImageProcessor+gaussianCropped.swift
//

import Foundation
import CoreImage
import UIKit
import Nuke

extension ImageProcessors {
    /// Blurs an image using `CIGaussianBlur` filter.
    public struct GaussianBlurFixed: ImageProcessing, Hashable, CustomStringConvertible {
        private let radius: Int

        /// Initializes the receiver with a blur radius.
        public init(radius: Int = 8) {
            self.radius = radius
        }

        /// Applies `CIGaussianBlur` filter to the image.
        public func process(_ image: PlatformImage) -> PlatformImage? {
            /*
             let filter = CIFilter(name: "CIGaussianBlur", parameters: ["inputRadius": radius])
             return CoreImageFilter.apply(filter: filter, to: image)
             */
                       
            // Get the CI image
            guard let ciImage = CIImage(image: image) else { return nil }

            // "CIAffineClamp" extends the edge pixels infinitely, which prevents the gray border
            guard let affineClampFilter = CIFilter(name: "CIAffineClamp") else { return nil }
            affineClampFilter.setDefaults()
            affineClampFilter.setValue(ciImage, forKey: kCIInputImageKey)

            // The clamped image is used as input for the "CIGaussianBlur" filter
            guard let clampedImage = affineClampFilter.outputImage,
                  let filter = CIFilter(name: "CIGaussianBlur") else { return nil }
            filter.setDefaults()
            filter.setValue(clampedImage, forKey: kCIInputImageKey)
            filter.setValue(radius, forKey: kCIInputRadiusKey)

            // Render the result and crop it back to the original image extent
            let ciContext = CIContext(options: nil)
            guard let result = filter.outputImage,
                  let cgImage = ciContext.createCGImage(result, from: ciImage.extent) else { return nil }

            return UIImage(cgImage: cgImage)
        }

        public var identifier: String {
            "com.github.kean/nuke/gaussian_blur_fixed?radius=\(radius)&clamped=false"
        }

        public var hashableIdentifier: AnyHashable { self }

        public var description: String {
            "GaussianBlurFixed(radius: \(radius))"
        }
    }
}

Cool, makes sense. I would appreciate a pull request. I’m using this processor myself, so the help is welcome. If not, I can test and merge it later myself. Thanks.

Since it appears to be a Core Image bug, I’m going to close it. If you have other ideas, please let me know.

For the blur matrix of CIGaussianBlur, my assumption was that the radius defined in kCIInputRadiusKey is the number of surrounding pixels taken into account. That would lead to an n × n matrix where n = kCIInputRadiusKey * 2 + 1.

But that does not seem to be the case. The extent change of CIImage actually shows that kCIInputRadiusKey=50 changes an image of (origin, size) (0.0, 0.0, 100.0, 100.0) to (-150.0, -150.0, 400.0, 400.0). A radius of 100 creates an image of (-300.0, -300.0, 700.0, 700.0).

So for CIGaussianBlur it seems that n = kCIInputRadiusKey * 3 * 2 + 1, but I don’t know where the factor 3 comes from. The references I found online for Gaussian Blur all seem to indicate that the “radius” is usually the number of pixels in all directions from the center of the pixel to calculate.
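Those measurements can be sanity-checked with a small sketch (the radius × 3 padding per edge is an observation from the extents above, not documented behavior):

```swift
import CoreGraphics

// Sanity check of the measurements above: CIGaussianBlur appears to pad
// the extent by kCIInputRadiusKey * 3 on each edge (observed, not documented).
func predictedBlurredExtent(of extent: CGRect, radius: CGFloat) -> CGRect {
    extent.insetBy(dx: -radius * 3, dy: -radius * 3)
}

let original = CGRect(x: 0, y: 0, width: 100, height: 100)
predictedBlurredExtent(of: original, radius: 50)   // (-150.0, -150.0, 400.0, 400.0)
predictedBlurredExtent(of: original, radius: 100)  // (-300.0, -300.0, 700.0, 700.0)
```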

Anyway, instead of clampedToExtent() I will extend the image by kCIInputRadiusKey * 3:

extension CIImage {
    /// Returns a new image created by making the pixel colors along the edges extend by a given number of pixels in all directions.
    /// - Parameter pixels: The number of pixels to extend, e.g. a value of 2 adds 2 pixels in each direction.
    /// - Returns: The extended image.
    func clamped(byPixels pixels: CGFloat) -> CIImage {

        // Extend the image indefinitely in all directions
        let indefiniteImage = clampedToExtent()

        // Calculate the expanded extent
        let expandedExtent = extent.insetBy(dx: -pixels, dy: -pixels)

        // Crop the image to the expanded extent
        let clampedImage = indefiniteImage.cropped(to: expandedExtent)
        return clampedImage
    }
}
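To tie it together, a sketch of how that helper could replace clampedToExtent() in the blur path (assuming clamped(byPixels:) is declared on CIImage as above; the function name is illustrative):

```swift
import CoreImage

// Sketch: blur with a finite clamp instead of an infinite one.
func blurWithFiniteClamp(_ ciImage: CIImage, radius: CGFloat) -> CIImage? {
    // Extend the border only as far as the blur reaches (radius * 3, per the extent measurements).
    let input = ciImage.clamped(byPixels: radius * 3)
    let filter = CIFilter(name: "CIGaussianBlur", parameters: [
        kCIInputRadiusKey: radius,
        kCIInputImageKey: input
    ])
    // Crop back to the original extent after blurring.
    return filter?.outputImage?.cropped(to: ciImage.extent)
}
```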

After the new app release I will update here if that fixed the crashes.

@kean Unfortunately not.

I think looking for an alternative to clampedToExtent() to solve the gray frame problem would be a workaround. Maybe it would be enough to manually extend the pixels in each direction by n pixels, then blur, then cut off the extension.

The challenge would be to find a correct n depending on the image size. The larger the image, the larger the gray border becomes. So maybe n = image.width / 2 would be enough, or maybe only a quarter of the image width is needed. That’s something to find out empirically, I guess, unless you have more insight into how and why the gray border is created by the Gaussian blur?

I filed a bug report with Apple. An Apple engineer suggested that the recursive stack trace is likely an indication that the indefinitely extended edges may cause the crash in CIImage. I’ll update here.