
I have this code:

CIImage *input_ciimage = [CIImage imageWithCGImage:self.CGImage];
CIImage *output_ciimage =
    [[CIFilter filterWithName:@"CILanczosScaleTransform" keysAndValues:
                               kCIInputImageKey, input_ciimage,
                               kCIInputScaleKey, @0.72f, // e.g. @(800.0 / self.size.width)
                               nil] outputImage];
CIContext *context = [CIContext contextWithOptions:nil];
CGImageRef output_cgimage = [context createCGImage:output_ciimage
                                          fromRect:[output_ciimage extent]];
UIImage *output_uiimage = [UIImage imageWithCGImage:output_cgimage
                                              scale:1.0
                                        orientation:self.imageOrientation];
CGImageRelease(output_cgimage);
return output_uiimage;

When the scale key is greater than some value, output_uiimage is a black image.

In my case, if the value for kCIInputScaleKey is greater than @0.52, the result is a black image. When I rotate the image by 90 degrees I get the same result, but the threshold is 0.72 (not 0.52).

What's wrong here: is it a bug in the library, or a mistake in my code?

I have an iPhone 4, iOS 7.1.2, Xcode 6.0, if that matters.

Logioniz
  • I have this problem on an iPad Air (iOS 8.0.2, Xcode 6) with big images: CILanczosScaleTransform produces a black image with scale = 0.5 on a 4080x4080 image. On an image of size 4096 this filter produces a black image every time. The same problems happen with other built-in and custom filters. I think Core Image still has some problems with big images. – George Sep 30 '14 at 10:55
  • Maybe this needs to be reported as a bug to Apple. I can't do it through my account. – Logioniz Oct 01 '14 at 12:36
  • Yes. I've reported the bug to Apple. – George Oct 01 '14 at 13:29
  • Thanks. Please keep me informed. – Logioniz Oct 02 '14 at 17:57

2 Answers


Here's what Apple said:

This scenario exposes a bug in Core Image. The bug occurs when rendering requires an intermediate buffer that has a dimension greater than the GPU texture limits (4096) AND the input image fits into these limits. This happens with any filter that is performing a convolution (blur, lanczos) on an input image that has width or height close to the GL texture limit.
Note: the render is successful if one of the dimensions of the input image is increased to 4097.

Replacing CILanczosScaleTransform with CIAffineTransform (lower quality) or resizing the image with CG are possible workarounds for the provided sample code.
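For the CG-based workaround, a minimal sketch (assuming a UIImage category; the method name is illustrative) could look like this:

```objectivec
// Resize with Core Graphics instead of Core Image, sidestepping the
// intermediate-buffer bug entirely. Quality is decent but not Lanczos.
- (UIImage *)cgResizedImageWithScale:(CGFloat)scale {
    CGSize newSize = CGSizeMake(self.size.width * scale,
                                self.size.height * scale);
    // Opaque = NO so images with alpha are preserved; self.scale keeps
    // the point/pixel ratio of the source image.
    UIGraphicsBeginImageContextWithOptions(newSize, NO, self.scale);
    [self drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *resized = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return resized;
}
```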

George
  • Hey George! We encountered a similar issue while working on our product. I'm sure your input could be highly valuable. Any chance you can take a quick look at this? https://stackoverflow.com/questions/57153640/metal-resize-video-buffer-before-passing-to-custom-kernel-filter/57163285?noredirect=1#comment100842018_57163285 – Roi Mulia Jul 24 '19 at 15:33

I've updated the bug report after a request from Apple's engineers. Their answer:

We believe that the issue is with the Core Image Lanczos filter that occurs at certain downsample scale factors. We hope to fix this issue in the future.

The filter should work well with downsample factors that are powers of 2 (i.e. 1/2, 1/4, 1/8). So we would recommend limiting your downsample to these values and then using an affine transform to scale up or down further if required.

We are now closing this bug report.
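Apple's recommendation can be sketched as follows: run Lanczos only at the nearest power-of-two scale, then cover the remainder with an affine transform. This assumes a UIImage category; the method name and structure are illustrative, not Apple's sample code.

```objectivec
// Downsample per Apple's suggestion: Lanczos at a power-of-two factor,
// then CIAffineTransform for the rest.
- (UIImage *)resizedImageWithScale:(CGFloat)targetScale {
    // Largest power-of-two scale that does not overshoot the target,
    // e.g. targetScale 0.3 -> lanczosScale 0.25.
    CGFloat lanczosScale = 1.0;
    while (lanczosScale / 2.0 >= targetScale) lanczosScale /= 2.0;

    CIImage *ciImage = [CIImage imageWithCGImage:self.CGImage];
    CIFilter *lanczos = [CIFilter filterWithName:@"CILanczosScaleTransform"];
    [lanczos setValue:ciImage forKey:kCIInputImageKey];
    [lanczos setValue:@(lanczosScale) forKey:kCIInputScaleKey];
    CIImage *scaled = lanczos.outputImage;

    // Cover the remaining factor with an affine scale (lower quality,
    // but not affected by the intermediate-buffer bug).
    CGFloat remainder = targetScale / lanczosScale;
    CGAffineTransform t = CGAffineTransformMakeScale(remainder, remainder);
    scaled = [scaled imageByApplyingTransform:t];

    CIContext *context = [CIContext contextWithOptions:nil];
    CGImageRef cgImage = [context createCGImage:scaled fromRect:scaled.extent];
    UIImage *result = [UIImage imageWithCGImage:cgImage
                                          scale:1.0
                                    orientation:self.imageOrientation];
    CGImageRelease(cgImage);
    return result;
}
```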

George