
I am using some Core Image filters to process an image. Applying the filter to my input image results in an output image, filterOutputImage, of type CIImage.

I now wish to display that image, so I tried:

self.modifiedPhoto = [UIImage imageWithCIImage:filterOutputImage];
self.photoImageView.image = self.modifiedPhoto;

The view, however, is blank: nothing is being displayed.

If I add logging statements that print details about both filterOutputImage and self.modifiedPhoto, they show that both variables appear to contain legitimate image data: their sizes are reported and neither object is nil.

So after doing some Googling, I found a solution that requires going through a CGImage stage, viz.:

CGImageRef outputImageRef = [context createCGImage:filterOutputImage fromRect:[filterOutputImage extent]];
self.modifiedPhoto = [UIImage imageWithCGImage:outputImageRef scale:self.originalPhoto.scale orientation:self.originalPhoto.imageOrientation];
self.photoImageView.image = self.modifiedPhoto;
CGImageRelease(outputImageRef);

This second approach works: I am getting the correct image displayed in the view.

Can someone please explain why my first attempt failed? What am I doing wrong with +imageWithCIImage: that results in an image that seems to exist but can't be displayed? Is it always necessary to "pass through" a CGImage stage in order to generate a UIImage from a CIImage?

Hoping someone can clear up my confusion :)

H.

Hamster

2 Answers


This should do it!

- (UIImage *)makeUIImageFromCIImage:(CIImage *)ciImage
{
    // Create the CIContext once and reuse it; creating a context is expensive.
    if (self.cicontext == nil) {
        self.cicontext = [CIContext contextWithOptions:nil];
    }

    // Render the CIImage into a CGImage, then wrap that in a UIImage.
    CGImageRef processedCGImage = [self.cicontext createCGImage:ciImage
                                                        fromRect:[ciImage extent]];
    UIImage *returnImage = [UIImage imageWithCGImage:processedCGImage];
    CGImageRelease(processedCGImage);

    return returnImage;
}
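
For example, using the property and variable names from the question:

self.photoImageView.image = [self makeUIImageFromCIImage:filterOutputImage];
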
Tony Million

I assume that self.photoImageView is a UIImageView? If so, ultimately, it is going to call -[UIImage CGImage] on the UIImage and then pass that CGImage as the contents property of a CALayer.

(See comments: my details were wrong)

Per the UIImage documentation for -[UIImage CGImage]:

If the UIImage object was initialized using a CIImage object, the value of the property is NULL.

So the UIImageView calls -CGImage, but that results in NULL, so nothing is displayed.
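
A quick way to confirm this, based on the documentation quoted above and using the variable name from the question:

UIImage *ciBackedImage = [UIImage imageWithCIImage:filterOutputImage];
NSLog(@"has CGImage? %@", ciBackedImage.CGImage ? @"YES" : @"NO"); // logs NO
NSLog(@"has CIImage? %@", ciBackedImage.CIImage ? @"YES" : @"NO"); // logs YES

The image data is there (hence the sensible size in your logs), but it is held as a CIImage, which the UIImageView's CGImage-based path can't use.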

I haven't tried this, but you could try making a custom UIView and then using UIImage's -draw... methods in -[UIView drawRect:] to draw the CIImage.
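
Something along these lines might work (an untested sketch; the class name is made up here, and the image property is assumed to hold a UIImage created with +imageWithCIImage:):

#import <UIKit/UIKit.h>

@interface CIBackedImageView : UIView
@property (nonatomic, strong) UIImage *image; // a UIImage wrapping a CIImage
@end

@implementation CIBackedImageView

- (void)setImage:(UIImage *)image
{
    _image = image;
    [self setNeedsDisplay]; // redraw whenever the image changes
}

- (void)drawRect:(CGRect)rect
{
    // UIImage's drawing methods can render a CIImage-backed image,
    // even though its CGImage property is NULL.
    [self.image drawInRect:self.bounds];
}

@end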

iccir
  • Ahh! Thank you. Yes, the photoImageView is a UIImageView. I had no idea it used the CGImage property of the UIImage. – Hamster Feb 01 '12 at 10:42
  • One more question: in which document is it explained how the UIImageView displays its image? I'm trying to find out where you learnt that from :) – Hamster Feb 01 '12 at 10:46
  • Actually, now that I look at the binary code: UIImageView overrides -drawRect: rather than calling setContents: on its own CALayer. It looks like it ultimately calls into the UIImage draw methods, which grab the CGImageRef and draw it. So, same result, but my details were wrong. – iccir Feb 01 '12 at 10:49
  • class-dump helps as well. The presence of -[UIImageView drawRect:] was a huge hint to me that my original statement about contents was wrong – iccir Feb 01 '12 at 11:23