
@steipete
Created August 13, 2011 20:52
Preload UIImage for super-smooth interaction. Especially great if you use JPGs, which otherwise produce a noticeable lag on the main thread.
- (UIImage *)pspdf_preloadedImage {
    CGImageRef image = self.CGImage;

    // Make a bitmap context of a suitable size to draw into, forcing the decode.
    size_t width = CGImageGetWidth(image);
    size_t height = CGImageGetHeight(image);
    CGColorSpaceRef colourSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef imageContext = CGBitmapContextCreate(NULL, width, height, 8, width * 4, colourSpace,
                                                      kCGImageAlphaPremultipliedFirst | kCGBitmapByteOrder32Little);
    CGColorSpaceRelease(colourSpace);

    // Draw the image into the context; this is where the decoding actually happens.
    CGContextDrawImage(imageContext, CGRectMake(0, 0, width, height), image);

    // Now get an already-decoded image ref back out of the context.
    CGImageRef outputImage = CGBitmapContextCreateImage(imageContext);
    UIImage *cachedImage = [UIImage imageWithCGImage:outputImage];

    // Clean up.
    CGImageRelease(outputImage);
    CGContextRelease(imageContext);
    return cachedImage;
}
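
A minimal usage sketch (the image name and the imageView property are just placeholders): force the decode on a background queue, then hand the result back to the main thread.

// Sketch: decode off the main thread, then use the preloaded image on the main queue.
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    UIImage *decodedImage = [[UIImage imageNamed:@"large-photo.jpg"] pspdf_preloadedImage];
    dispatch_async(dispatch_get_main_queue(), ^{
        self.imageView.image = decodedImage;
    });
});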
@mohamede1945

Thank you so much! This snippet is very helpful.

Here is a Swift version of it.

extension UIImage {

    func preloadedImage() -> UIImage {

        // make a bitmap context of a suitable size to draw to, forcing decode
        let width = CGImageGetWidth(CGImage)
        let height = CGImageGetHeight(CGImage)

        let colourSpace = CGColorSpaceCreateDeviceRGB()
        let imageContext =  CGBitmapContextCreate(nil,
                                                  width,
                                                  height,
                                                  8,
                                                  width * 4,
                                                  colourSpace,
                                                  CGImageAlphaInfo.PremultipliedFirst.rawValue | CGBitmapInfo.ByteOrder32Little.rawValue)

        // draw the image to the context, release it
        CGContextDrawImage(imageContext, CGRect(x: 0, y: 0, width: width, height: height), CGImage)

        // now get an image ref from the context
        if let outputImage = CGBitmapContextCreateImage(imageContext) {
            let cachedImage = UIImage(CGImage: outputImage)
            return cachedImage
        }

        print("Failed to preload the image")
        return self
    }
}

@rsaunders100

I think there is a bug: this adds an alpha channel to images that did not previously have one, which I believe slows down rendering slightly.

Could this be fixed by checking the original alpha info with CGImageGetAlphaInfo?

However, I am not sure whether kCGImageAlphaPremultipliedFirst was picked because it is generic or because it is optimal for iOS, and whether using a different alpha info would have other performance implications.
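
Something like this is what I have in mind (just a sketch, untested; whether kCGImageAlphaNoneSkipFirst is the right choice for opaque images is exactly the open question):

// Sketch (untested): only request an alpha channel if the source image actually has one.
CGImageRef image = self.CGImage;
size_t width = CGImageGetWidth(image);
size_t height = CGImageGetHeight(image);
CGColorSpaceRef colourSpace = CGColorSpaceCreateDeviceRGB();

CGImageAlphaInfo alphaInfo = CGImageGetAlphaInfo(image);
BOOL hasAlpha = (alphaInfo == kCGImageAlphaFirst ||
                 alphaInfo == kCGImageAlphaLast ||
                 alphaInfo == kCGImageAlphaPremultipliedFirst ||
                 alphaInfo == kCGImageAlphaPremultipliedLast);

// Opaque sources get a context without alpha; transparent ones keep premultiplied alpha.
CGBitmapInfo bitmapInfo = kCGBitmapByteOrder32Little |
    (CGBitmapInfo)(hasAlpha ? kCGImageAlphaPremultipliedFirst : kCGImageAlphaNoneSkipFirst);

CGContextRef imageContext = CGBitmapContextCreate(NULL, width, height, 8, width * 4,
                                                  colourSpace, bitmapInfo);
CGColorSpaceRelease(colourSpace);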

@carlosen14

carlosen14 commented Aug 26, 2016

Does this generate a smaller version of the original UIImage?

@mickeyl

mickeyl commented Jan 22, 2017

@rsaunders100: Using kCGImageAlphaNoneSkipFirst instead of kCGImageAlphaPremultipliedFirst gets you an image without an alpha channel.

@steipete: I have seen a similar snippet floating around which does a few additional checks, such as:

        CGImageRef imageRef = self.CGImage;
        
        CGImageAlphaInfo alpha = CGImageGetAlphaInfo( imageRef );
        BOOL anyAlpha = ( alpha == kCGImageAlphaFirst ||
                          alpha == kCGImageAlphaLast ||
                          alpha == kCGImageAlphaPremultipliedFirst ||
                          alpha == kCGImageAlphaPremultipliedLast );
        if ( anyAlpha )
        {
            return self;
        }
        
        size_t width = CGImageGetWidth( imageRef );
        size_t height = CGImageGetHeight( imageRef );
        
        // Check the image's current color space; fall back to device RGB if it can't be drawn into directly.
        CGColorSpaceModel imageColorSpaceModel = CGColorSpaceGetModel( CGImageGetColorSpace( imageRef ) );
        CGColorSpaceRef colorspaceRef = CGImageGetColorSpace( imageRef );
        
        // 0 is kCGColorSpaceModelMonochrome, -1 is kCGColorSpaceModelUnknown.
        bool unsupportedColorSpace = ( imageColorSpaceModel == 0 ||
                                       imageColorSpaceModel == -1 ||
                                       imageColorSpaceModel == kCGColorSpaceModelIndexed );
        if ( unsupportedColorSpace )
        {
            colorspaceRef = CGColorSpaceCreateDeviceRGB();
        }

What do you think about these – are they unnecessary?

@mickeyl

mickeyl commented Apr 10, 2017

Any idea whether this applies to watchOS as well?

@lastcc

lastcc commented Jul 20, 2017

Ah... why doesn't Apple just let us do:

let image = UIImage(...)

image.prepareForRenderingAsynchronously = true

@dannofx

dannofx commented Aug 2, 2017

Thanks for the snippet!
Swift 3 version:

import UIKit

extension UIImage {
    
    func forceLoad() -> UIImage {
        guard let imageRef = self.cgImage else {
            return self // Not backed by a CGImage; nothing to decode.
        }

        // Make a bitmap context of a suitable size to draw into, forcing the decode.
        let width = imageRef.width
        let height = imageRef.height
        let colourSpace = CGColorSpaceCreateDeviceRGB()
        let bitmapInfo: UInt32 = CGImageAlphaInfo.premultipliedFirst.rawValue | CGBitmapInfo.byteOrder32Little.rawValue
        guard let imageContext = CGContext(data: nil, width: width, height: height, bitsPerComponent: 8, bytesPerRow: width * 4, space: colourSpace, bitmapInfo: bitmapInfo) else {
            return self // Context creation failed.
        }

        // Draw the image into the context; this is where the decoding happens.
        let rect = CGRect(x: 0, y: 0, width: width, height: height)
        imageContext.draw(imageRef, in: rect)

        // Pull the already-decoded image back out of the context.
        if let outputImage = imageContext.makeImage() {
            return UIImage(cgImage: outputImage)
        }
        return self // Decoding failed.
    }
}

@v57

v57 commented Mar 8, 2018

Please don't forget to preserve the original image's scale and orientation with:
let cachedImage = UIImage(cgImage: outputImage, scale: scale, orientation: imageOrientation)
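
For the original Objective-C snippet, the equivalent would presumably be (untested sketch):

// Keep the receiver's scale and orientation when wrapping the decoded CGImage.
UIImage *cachedImage = [UIImage imageWithCGImage:outputImage
                                           scale:self.scale
                                     orientation:self.imageOrientation];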
