I'm trying to blend a background with a foreground image, where the foreground image is a transparent image with lines on it.

I am trying to do it this way:

UIGraphicsBeginImageContext(CGSizeMake(320, 480));
CGContextRef context = UIGraphicsGetCurrentContext();   

// create rect that fills screen
CGRect bounds = CGRectMake( 0,0, 320, 480);

// This is my bkgnd image
CGContextDrawImage(context, bounds, [UIImage imageNamed:@"bkgnd.jpg"].CGImage);

CGContextSetBlendMode(context, kCGBlendModeSourceIn);

// This is my image to blend in
CGContextDrawImage(context, bounds, [UIImage imageNamed:@"over.png"].CGImage);

UIImage *outputImage = UIGraphicsGetImageFromCurrentImageContext();

UIImageWriteToSavedPhotosAlbum(outputImage, self, nil, nil);

// clean up drawing environment
UIGraphicsEndImageContext();

but it does not seem to work.

Any suggestions will be appreciated.

  • Technical note: this is referring to "alpha-blending" (compositing two images based on an alpha channel in one of the images), *not* to using Core Graphics "blend modes" - search again if that is what you need. Actually, [Eric's answer](http://stackoverflow.com/a/3188761/199364) does briefly show use of one blend mode; you could substitute a different blend mode there, and remove the "alpha:0.8" parameter. – ToolmakerSteve Mar 10 '17 at 04:19

7 Answers

91

This is what I've done in my app, similar to Tyler's, but without the UIImageView:

UIImage *bottomImage = [UIImage imageNamed:@"bottom.png"];
UIImage *image = [UIImage imageNamed:@"top.png"];

CGSize newSize = CGSizeMake(width, height);  // width/height: desired output size
UIGraphicsBeginImageContext( newSize );

// Use existing opacity as is
[bottomImage drawInRect:CGRectMake(0,0,newSize.width,newSize.height)];
// Apply supplied opacity
[image drawInRect:CGRectMake(0,0,newSize.width,newSize.height) blendMode:kCGBlendModeNormal alpha:0.8];

UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();

UIGraphicsEndImageContext();

If the image already has the opacity you want, you do not need to set it (as with bottomImage); otherwise you can supply one when drawing (as with image).

Eric
  • Thanks for this answer... it worked like a charm. But I want to do the same for two layers. – Alfa Apr 21 '14 at 04:03
  • For retina and non-retina display support, use `UIGraphicsBeginImageContextWithOptions(newSize, NO, 0.0f);`, which tells UIKit to grab and use the scale factor of the device’s main screen. – Mr. T Jun 12 '14 at 21:51
19
UIImage* bottomImage = [UIImage imageNamed:@"bottom.png"];  
UIImage* topImage    = [UIImage imageNamed:@"top.png"];
UIImageView* imageView = [[UIImageView alloc] initWithImage:bottomImage];
UIImageView* subView   = [[UIImageView alloc] initWithImage:topImage];
subView.alpha = 0.5;  // Customize the opacity of the top image.
[imageView addSubview:subView];
UIGraphicsBeginImageContext(imageView.frame.size);
[imageView.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage* blendedImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
[subView release];    // pre-ARC manual release; omit these two lines under ARC
[imageView release];

[self doWhateverIWantWith: blendedImage];
Tyler
  • Thank you for your replies... but isn't this the same as having 50% opacity? Maybe the manner in which I phrased my question is wrong; maybe I should say I want to merge 2 UIImages, and the image on top has alpha values. –  Aug 22 '09 at 16:54
  • If there are alpha values within the top image's PNG file, just leave out the "subView.alpha = 0.5;" line, and it will draw the top image, including its custom alpha values, on top of the bottom image. – Tyler Aug 22 '09 at 20:23
10

My answer is based on Eric's answer, but allows @2x images to retain their resolution after the merge. Please note the URL REFs in my comments; I am acknowledging the sources that contributed to my development of this function, which I use in my iOS apps.

- (UIImage*) mergeTwoImages : (UIImage*) topImage : (UIImage*) bottomImage
{
    // URL REF: http://iphoneincubator.com/blog/windows-views/image-processing-tricks
    // URL REF: https://stackoverflow.com/questions/1309757/blend-two-uiimages?answertab=active#tab-top
    // URL REF: http://www.waterworld.com.hk/en/blog/uigraphicsbeginimagecontext-and-retina-display

    int width = bottomImage.size.width;
    int height = bottomImage.size.height;

    CGSize newSize = CGSizeMake(width, height);
    static CGFloat scale = -1.0;

    if (scale<0.0)
    {
        UIScreen *screen = [UIScreen mainScreen];

        if ([[[UIDevice currentDevice] systemVersion] floatValue] >= 4.0)
        {
            scale = [screen scale];
        }
        else
        {
            scale = 0.0;    // Use the standard API
        }
    }

    if (scale>0.0)
    {
        UIGraphicsBeginImageContextWithOptions(newSize, NO, scale);
    }
    else
    {
        UIGraphicsBeginImageContext(newSize);
    }

    [bottomImage drawInRect:CGRectMake(0,0,newSize.width,newSize.height)];
    [topImage drawInRect:CGRectMake(0,0,newSize.width,newSize.height) blendMode:kCGBlendModeNormal alpha:1.0];

    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    return newImage;
}
Jason TEPOORTEN
4

Blending with alpha

UIGraphicsBeginImageContext(area.size);
CGContextRef context = UIGraphicsGetCurrentContext();
CGContextRetain(context);

// mirroring context
CGContextTranslateCTM(context, 0.0, area.size.height);
CGContextScaleCTM(context, 1.0, -1.0);

for (...) {
    CGContextBeginTransparencyLayer(context, nil);
    CGContextSetAlpha( context, alpha );
    CGContextDrawImage(context, area, tempimg.CGImage);
    CGContextEndTransparencyLayer(context);
}

// get created image
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
CGContextRelease(context);
UIGraphicsEndImageContext();
Volodymyr B.
  • Please explain how this is an improvement over earlier answers / or under what circumstances you would use it / or why this alternative might be helpful. Thanks. – ToolmakerSteve Mar 10 '17 at 04:15
4

Swift 3

This function takes two images and a CGSize, and returns an optional UIImage. It works best when both images are the same size. If your top image has alpha, the bottom image will show through it.

// composite two images
func compositeTwoImages(top: UIImage, bottom: UIImage, newSize: CGSize) -> UIImage? {
    // begin context with new size
    UIGraphicsBeginImageContextWithOptions(newSize, false, 0.0)
    // draw images to context
    bottom.draw(in: CGRect(origin: CGPoint.zero, size: newSize))
    top.draw(in: CGRect(origin: CGPoint.zero, size: newSize))
    // return the new image
    let newImage = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    // returns an optional
    return newImage
}

Usage

let outputSize = CGSize(width: 100, height: 100)
if let topImage = UIImage(named: "myTopImage") {
    if let bottomImage = UIImage(named: "myBottomImage") {
        // composite both images
        if let finalImage = compositeTwoImages(top: topImage, bottom: bottomImage, newSize: outputSize) {
            // do something with finalImage
        }
    }
}
skymook
1

Can you provide detail in what you mean by "it does not seem to work?" Does it draw only one image or the other image? Draw black? Noise? Crash? Why have you chosen kCGBlendModeSourceIn; what effect are you trying to achieve (there are dozens of ways to blend images)? Do either of your images have alpha already?

I assume what you're trying to do is mix two images such that each has 50% opacity? Use CGContextSetAlpha() for that rather than CGContextSetBlendMode().
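A minimal sketch of that suggestion, reusing the file names from the question (this is my illustration of the idea, not code from the question):

```objc
// Hypothetical sketch: mix two images by drawing the second one at 50% alpha.
UIGraphicsBeginImageContext(CGSizeMake(320, 480));
CGContextRef ctx = UIGraphicsGetCurrentContext();
CGRect bounds = CGRectMake(0, 0, 320, 480);

// Draw the background normally, then lower the context alpha so the
// second image is composited at 50% over it.
CGContextDrawImage(ctx, bounds, [UIImage imageNamed:@"bkgnd.jpg"].CGImage);
CGContextSetAlpha(ctx, 0.5);
CGContextDrawImage(ctx, bounds, [UIImage imageNamed:@"over.png"].CGImage);

UIImage *mixed = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
```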

Rob Napier
  • Thank you for your replies... Both my images have 100% opacity... When I blend both images, my context only draws the second image. In Mac OS X I use the source-over filter to composite both images, and the second image, which I want to blend in, already has alpha. –  Aug 22 '09 at 16:50
  • I believe you want to go back and study the basics of Quartz drawing. I believe you're confusing what it means to draw various layers with their own alpha versus what it means to blend (which is related, but different in how it is implemented). Go to the Quartz 2D Programming Guide; it will teach you the Core Graphics you need so that you can do what you want easily and with good performance, predictability and flexibility. http://developer.apple.com/documentation/graphicsimaging/conceptual/drawingwithquartz2d/ – Rob Napier Aug 24 '09 at 04:08
0

You can use UIImage's drawInRect: or drawAtPoint: instead of CGContextDrawImage (they draw to the current context). Does using them give you any difference in output?

It may also be helpful to make sure the UIImage* values you are getting back from imageNamed: are valid.
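A quick way to check that (my own sketch, using the file names from the question; `imageNamed:` returns nil if the file is missing from the bundle or the name/extension is wrong):

```objc
// Hypothetical sanity check for the images used in the question.
UIImage *bkgnd = [UIImage imageNamed:@"bkgnd.jpg"];
UIImage *over  = [UIImage imageNamed:@"over.png"];
NSAssert(bkgnd != nil, @"bkgnd.jpg not found in bundle");
NSAssert(over  != nil, @"over.png not found in bundle");
```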

fbrereto