
High-level summary: I would like to replace the rather low-resolution ALAssetsGroup poster images ([group posterImage]) with higher-quality versions so they can be shown larger on screen. Normally I would load them as needed by the interface, but [ALAssetsGroup enumerateAssetsAtIndexes:options:usingBlock:] is very slow. (I COULD preload a window somewhat wider than the visible area, and may still do that, but it seemed like more hassle than it was worth and still suffers from the slow response, especially on iOS 5.)

What I figured I could do was request the first asset in each group and then scale it, storing the result. However, even accounting for the larger size of the images, I am surprised by the memory allocations taking place. In VM Tracker I see a LARGE number of CGImage allocations as well as the 'mapped file' thumbnails I am creating. I am using ARC, so I expected the original large images to be released, but my VM Tracker results don't bear that out.

If I use the default posterImage implementation, my Resident memory is ~30 MB, Dirty memory ~80 MB, and Virtual tops out around 240 MB (large in themselves). 'Live' is under 10 MB per the Allocations instrument.

If I use the following code instead, I crash while loading roughly the 80th image out of 150. At that point my Resident memory is over 480 MB, Dirty memory over 420 MB, and Virtual a whopping 750 MB. Clearly this is untenable.

Here is the code I am using inside an NSOperationQueue to grab the first image of each group for use as a hi-res poster image.

NSIndexSet *indexSet = [NSIndexSet indexSetWithIndex:0];
ALAssetsGroupEnumerationResultsBlock assetsEnumerationBlock = ^(ALAsset *result, NSUInteger index, BOOL *stop) {

    if (result) {
        // pull the full resolution image and then scale it to fit our desired area
        ALAssetRepresentation *assetRepresentation = [result defaultRepresentation];
        CGImageRef ref = [assetRepresentation fullScreenImage];
        CGFloat imgWidth = CGImageGetWidth(ref);
        CGFloat imgHeight = CGImageGetHeight(ref);
        CGFloat minDimension = MIN(imgWidth, imgHeight);

        // grab a square subset of the image, centered, to use
        CGRect subRect = CGRectMake((imgWidth - minDimension) / 2,
                                    (imgHeight - minDimension) / 2,
                                    minDimension, minDimension);
        CGImageRef squareRef = CGImageCreateWithImageInRect(ref, subRect);

        // now scale it down to fit; `dimension` and `photoIndex` are
        // captured from the enclosing scope
        CGFloat heightScale = dimension / minDimension;
        UIImage *coverImage = [UIImage imageWithCGImage:squareRef
                                                  scale:1 / heightScale
                                            orientation:UIImageOrientationUp];

        if (coverImage) {
            [mainViewController performSelectorOnMainThread:@selector(imageDidLoad:)
                                                 withObject:[NSArray arrayWithObjects:coverImage, [NSNumber numberWithInt:photoIndex], nil]
                                              waitUntilDone:NO];
        }
        CGImageRelease(squareRef);
        // DO NOT release 'ref': fullScreenImage returns a CGImageRef we do
        // not own, so releasing it here would over-release ('zombie') it
        //CGImageRelease(ref);

        *stop = YES;
    }
    else {
        // default image grab....
    }
};
[group enumerateAssetsAtIndexes:indexSet options:NSEnumerationConcurrent usingBlock:assetsEnumerationBlock];
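A side note for later readers: the scale: argument to imageWithCGImage:scale:orientation: only changes the image's reported point size; the full-resolution backing CGImage stays in memory, which is one likely source of the growth seen above. A minimal sketch of genuinely resampling to the target size (the helper name and `dimension` parameter here are illustrative, not from the original code):

```objc
// Hypothetical helper: redraws the cropped square into a small bitmap
// context so the backing store is actually dimension x dimension pixels,
// instead of a full-resolution CGImage with a different point size.
static UIImage *downscaledSquareImage(CGImageRef squareRef, CGFloat dimension) {
    CGSize targetSize = CGSizeMake(dimension, dimension);
    UIGraphicsBeginImageContextWithOptions(targetSize, YES, 1.0);
    [[UIImage imageWithCGImage:squareRef]
        drawInRect:CGRectMake(0, 0, targetSize.width, targetSize.height)];
    UIImage *scaled = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return scaled;
}
```

Once redrawn this way, the large intermediate images are eligible for release as soon as the enclosing autorelease pool drains.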

Am I doing something wrong in the above, or am I just not being smart by loading all of the images up front? The more I think about it, the more I think loading a window of visible images plus a buffer around it is the way to go, but I would like to understand what I may have done wrong in the code above. Thanks!

MobileVet

1 Answer


I did the exact same thing as you, but thanks to this answer: Fetching the last image captured through iPhone camera, I managed to make it a lot faster without draining as much memory.

This of course makes the images a lot smaller, since you don't fetch the fullScreenImage, just its thumbnail.

My final code:

ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];

// Enumerate just the photos and videos group by using ALAssetsGroupSavedPhotos.
[library enumerateGroupsWithTypes:ALAssetsGroupSavedPhotos usingBlock:^(ALAssetsGroup *group, BOOL *stop) {

    // The end of the group enumeration is signaled by group == nil.
    if (group) {
        // Within the group enumeration block, filter to enumerate just photos.
        [group setAssetsFilter:[ALAssetsFilter allPhotos]];

        // Clamp to the number of assets actually in the group so the
        // index set never exceeds its bounds.
        NSInteger nrOfPhotos = MIN(200, [group numberOfAssets]);
        [group enumerateAssetsAtIndexes:[NSIndexSet indexSetWithIndexesInRange:NSMakeRange(0, nrOfPhotos)]
                                options:0
                             usingBlock:^(ALAsset *alAsset, NSUInteger index, BOOL *innerStop) {

                                 // The end of the enumeration is signaled by asset == nil.
                                 if (alAsset) {
                                     NSLog(@"Index: %lu", (unsigned long)index);

                                     CGImageRef imageRef = [alAsset thumbnail];
                                     UIImage *latestPhoto = [UIImage imageWithCGImage:imageRef scale:1.f orientation:UIImageOrientationUp];

                                     // Do something with latestPhoto
                                 }
                             }];
    }
}
                     failureBlock:^(NSError *error) {
                         // Typically you should handle an error more gracefully than this.
                         NSLog(@"No groups");
                     }];
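If memory still climbs during long enumerations, one option (my own suggestion, not part of the linked answer) is to wrap the per-asset work in an @autoreleasepool so the intermediate UIImage objects are released each iteration instead of accumulating until the whole enumeration finishes. A sketch of just the body of the inner block:

```objc
// Inside the per-asset enumeration block:
if (alAsset) {
    @autoreleasepool {
        // thumbnail returns a CGImageRef we do not own; no CGImageRelease here.
        CGImageRef imageRef = [alAsset thumbnail];
        UIImage *photo = [UIImage imageWithCGImage:imageRef
                                             scale:1.f
                                       orientation:UIImageOrientationUp];
        // Hand `photo` off to the UI (e.g. dispatch to the main queue)
        // before the pool drains at the end of this iteration.
    }
}
```

This keeps the high-water mark at roughly one image's worth of transient allocations rather than 200.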
278204