25

I'm looking at replacing ALAssetsLibrary with Photos framework in my app.

I can retrieve photos, collections, and asset sources just fine (even write them back out), but I don't see anywhere to access the metadata of the photos (the dictionaries such as {Exif}, {TIFF}, {GPS}, etc.).

ALAssetsLibrary has a way. UIImagePickerController has a way. Photos must have a way too.

I see that PHAsset has a location property, which will do for the GPS dictionary, but I'm looking to access all of the metadata, which includes faces, orientation, exposure, ISO, and tons more.

Currently Apple is at beta 2. Perhaps there are more APIs to come?

UPDATE

There is no official way to do this using only Photos APIs.

However you can read the metadata after you download the image data. There are a couple of methods to do this using either PHImageManager or PHContentEditingInput.

The PHContentEditingInput method requires less code and doesn't require you to import ImageIO. I've wrapped it up in a PHAsset category.
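For reference, here is a minimal sketch of what such a category could look like, built on the PHContentEditingInput approach. The method name matches the one mentioned in the comments below; the file name and the exact implementation are assumptions, not the original category verbatim.

// PHAsset+Metadata.h (hypothetical file name)
@import Photos;
@import CoreImage;

@interface PHAsset (Metadata)
- (void)requestMetadataWithCompletionBlock:(void (^)(NSDictionary *metadata))completionBlock;
@end

// PHAsset+Metadata.m
@implementation PHAsset (Metadata)

- (void)requestMetadataWithCompletionBlock:(void (^)(NSDictionary *metadata))completionBlock
{
    PHContentEditingInputRequestOptions *options = [[PHContentEditingInputRequestOptions alloc] init];
    options.networkAccessAllowed = YES; // pull the original from iCloud if it isn't on the device

    [self requestContentEditingInputWithOptions:options completionHandler:^(PHContentEditingInput *contentEditingInput, NSDictionary *info) {
        // CIImage exposes the full metadata dictionary via its properties accessor
        CIImage *image = [CIImage imageWithContentsOfURL:contentEditingInput.fullSizeImageURL];
        dispatch_async(dispatch_get_main_queue(), ^{
            completionBlock(image.properties);
        });
    }];
}

@end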

Axel Guilmin
  • 10,343
  • 7
  • 49
  • 59
VaporwareWolf
  • 9,573
  • 8
  • 48
  • 78
  • Did you figure out if there is a way to do this without downloading the image data? – WYS Aug 02 '16 at 13:22
  • I have checked your category but requestMetadataWithCompletionBlock doesn't return metadata for videos. Is there any other way to get metadata of videos without downloading the videos – Ekra Oct 18 '16 at 07:00

6 Answers

39

If you request a content editing input, you can get the full image as a CIImage, and CIImage has a property titled properties which is a dictionary containing the image metadata.

Sample Swift Code:

let options = PHContentEditingInputRequestOptions()
options.networkAccessAllowed = true // download asset metadata from iCloud if needed

asset.requestContentEditingInputWithOptions(options) { (contentEditingInput: PHContentEditingInput?, _) -> Void in
    if let url = contentEditingInput?.fullSizeImageURL, fullImage = CIImage(contentsOfURL: url) {
        print(fullImage.properties)
    }
}

Sample Objective-C Code:

PHContentEditingInputRequestOptions *options = [[PHContentEditingInputRequestOptions alloc] init];
options.networkAccessAllowed = YES; //download asset metadata from iCloud if needed

[asset requestContentEditingInputWithOptions:options completionHandler:^(PHContentEditingInput *contentEditingInput, NSDictionary *info) {
    CIImage *fullImage = [CIImage imageWithContentsOfURL:contentEditingInput.fullSizeImageURL];

    NSLog(@"%@", fullImage.properties.description);
}];

You'll get the desired {Exif}, {TIFF}, {GPS}, etc. dictionaries.
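If you only need one of the sub-dictionaries, you can pull it out of properties by key. A small sketch: the string literals below are the raw values of the ImageIO constants (e.g. kCGImagePropertyExifDictionary), used here so you still don't have to import ImageIO.

NSDictionary *properties = fullImage.properties;
NSDictionary *exif = properties[@"{Exif}"]; // raw value of kCGImagePropertyExifDictionary
NSDictionary *tiff = properties[@"{TIFF}"]; // raw value of kCGImagePropertyTIFFDictionary
NSDictionary *gps  = properties[@"{GPS}"];  // nil if the photo carries no location metadata
NSLog(@"ISO: %@", exif[@"ISOSpeedRatings"]); // raw value of kCGImagePropertyExifISOSpeedRatings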

Jordan H
  • 45,794
  • 29
  • 162
  • 306
  • Very cool. I am able to read the properties like you showed. Will try writing them out too. – VaporwareWolf Sep 08 '14 at 19:08
  • If you are able to write them out, please let me know! – Jordan H Sep 08 '14 at 22:30
  • This seems to be the fastest way to get the job done. It's also far less code than my example (downloading the image data), and this version doesn't depend on ImageIO. Accepted answer – VaporwareWolf Sep 25 '14 at 00:21
  • Thank you @Joey! I created gist based on your code to share it with my team. I'd like to share it here as well: [swift-photos-metadata](https://gist.github.com/MatthiasHoldorf/b4d0488641feb5d8ec55) – Matthias Holdorf Oct 26 '15 at 08:47
  • Can you comment on the performance of this? I presume it requires the full size image to be downloaded? I was hoping to access these properties with just a lower resolution placeholder. – Benjohn Jan 13 '16 at 16:06
  • @Benjohn This solution does require downloading the full image if it's not already available locally, i.e. stored in iCloud – Jordan H Jan 13 '16 at 17:38
  • Thanks @joey – useful to know! – Benjohn Jan 14 '16 at 09:41
  • I am not getting {TIFF}, {GPS} ? – karthikeyan Aug 17 '16 at 11:35
  • @karthikeyan Those will only appear if that photo contains that information. Metadata can be stripped from photos. – Jordan H Aug 17 '16 at 13:28
  • Yes, I noticed that sometimes it doesn't return that data. I got my work done with this. Thanks for the reply – karthikeyan Aug 18 '16 at 04:44
  • I assume that since this is a `CIImage` (i.e., " just a recipe" for an image, and not an actual bitmap, until actually rendered), the full-size image is **not** read from disk and the memory penalty isn't incurred? (as opposed to instantiating an `UIImage` with the same url?) – Nicolas Miari Oct 12 '18 at 06:53
8

I thought I'd share some code to read the metadata using the ImageIO framework in conjunction with the Photos framework. You must request the image data using a PHCachingImageManager.

@property (strong) PHCachingImageManager *imageManager;

Request the image and use its data to create a metadata dictionary:

-(void)metadataReader{
    PHFetchResult *result = [PHAsset fetchAssetsInAssetCollection:self.myAssetCollection options:nil];
    [result enumerateObjectsAtIndexes:[NSIndexSet indexSetWithIndex:myIndex] options:NSEnumerationConcurrent usingBlock:^(PHAsset *asset, NSUInteger idx, BOOL *stop) {
        [self.imageManager requestImageDataForAsset:asset options:nil resultHandler:^(NSData *imageData, NSString *dataUTI, UIImageOrientation orientation, NSDictionary *info) {
            NSDictionary *metadata = [self metadataFromImageData:imageData];
                           NSLog(@"Metadata: %@", metadata.description);
            NSDictionary *gpsDictionary = metadata[(NSString*)kCGImagePropertyGPSDictionary];
            if(gpsDictionary){
                NSLog(@"GPS: %@", gpsDictionary.description);
            }
            NSDictionary *exifDictionary = metadata[(NSString*)kCGImagePropertyExifDictionary];
            if(exifDictionary){
                NSLog(@"EXIF: %@", exifDictionary.description);
            }

            UIImage *image = [UIImage imageWithData:imageData scale:[UIScreen mainScreen].scale];
            // assign image where ever you need...
        }];

    }];
}

Convert NSData to metadata

-(NSDictionary*)metadataFromImageData:(NSData*)imageData{
    CGImageSourceRef imageSource = CGImageSourceCreateWithData((__bridge CFDataRef)(imageData), NULL);
    if (imageSource) {
        NSDictionary *options = @{(NSString *)kCGImageSourceShouldCache : [NSNumber numberWithBool:NO]};
        CFDictionaryRef imageProperties = CGImageSourceCopyPropertiesAtIndex(imageSource, 0, (__bridge CFDictionaryRef)options);
        if (imageProperties) {
            NSDictionary *metadata = (__bridge NSDictionary *)imageProperties;
            CFRelease(imageProperties);
            CFRelease(imageSource);
            return metadata;
        }
        CFRelease(imageSource);
    }

    NSLog(@"Can't read metadata");
    return nil;
}

This has the overhead of grabbing the image, so it's not nearly as fast as enumerating your assets or collections, but it's something at least.

VaporwareWolf
  • 9,573
  • 8
  • 48
  • 78
  • This code sample has a memory leak; if `imageProperties` is non-nil, the method will return before it calls `CFRelease(imageSource)`. – Riley Testut Aug 21 '14 at 05:09
6

I prefer the ImageIO solution to the CIImage one:

func imageAndMetadataFromImageData(data: NSData) -> (UIImage?, [String: Any]?) {
    let options = [kCGImageSourceShouldCache as String: kCFBooleanFalse]
    if let imgSrc = CGImageSourceCreateWithData(data as CFData, options as CFDictionary) {
        let metadata = CGImageSourceCopyPropertiesAtIndex(imgSrc, 0, options as CFDictionary) as? [String: Any]
        //print(metadata)
        let image = UIImage(data: data as Data)
        return (image, metadata)
    }
    return (nil, nil)
}

Below is the code to get the data from a PHAsset:

func getImageAndMeta(asset: PHAsset){
    let options = PHImageRequestOptions()
    options.isSynchronous = true
    options.resizeMode = .none
    options.isNetworkAccessAllowed = false
    options.version = .current
    var image: UIImage? = nil
    var meta:[String:Any]? = nil
    _ = PHCachingImageManager().requestImageData(for: asset, options: options) { (imageData, dataUTI, orientation, info) in
        if let data = imageData {
            (image, meta) = imageAndMetadataFromImageData(data: data as NSData)
            //image = UIImage(data: data)
        }
    }
    // options.isSynchronous is true, so image and meta are already populated here and can be returned
}
jcesarmobile
  • 45,750
  • 8
  • 107
  • 152
lbsweek
  • 4,230
  • 37
  • 37
5

PhotoKit limits access to metadata to the properties of PHAsset (location, creationDate, favorite, hidden, modificationDate, pixelWidth, pixelHeight...). The reason (I suspect) is that, due to the introduction of iCloud Photo Library, the images may not be on the device, so the full metadata is not available. The only way to get the full EXIF/IPTC metadata is to first download the original image from iCloud (if it is not available locally) and then use ImageIO to extract its metadata.
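For illustration, a quick sketch of the asset-level properties that are readable without touching the image data; these are documented PHAsset properties, but the helper function itself is just an example:

// Dump the metadata PhotoKit exposes directly on the asset,
// without downloading any image data (asset comes from any PHFetchResult).
static void LogAssetProperties(PHAsset *asset)
{
    NSLog(@"Created:   %@", asset.creationDate);
    NSLog(@"Modified:  %@", asset.modificationDate);
    NSLog(@"Pixels:    %lu x %lu", (unsigned long)asset.pixelWidth, (unsigned long)asset.pixelHeight);
    NSLog(@"Location:  %@", asset.location); // CLLocation, nil if the photo has no GPS data
    NSLog(@"Favorite:  %d  Hidden: %d", asset.isFavorite, asset.isHidden);
}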

holtmann
  • 5,957
  • 30
  • 42
  • Yep, that's all that can be done right now. I filed a bug report (feature request actually) with Apple. They gave me the "Thanks, we're always looking for new ideas" reply. I would be surprised to see it in 8.0. Too bad though. That could provide for some powerful filtering. They are already pulling some of it (location, for example). They could pull the whole dictionary and expose it as .metadata like in ALAsset.defaultRepresentation, even if it was read-only. – VaporwareWolf Aug 03 '14 at 22:35
  • To me it seems PhotoKit is still at a very early stage. There are unfortunately still many gaps in functionality. The new fetching concept, for example, is powerful, but the properties we can fetch on are very limited. – holtmann Aug 04 '14 at 10:45
5

A better solution I found, which worked well for me, is:

PHImageRequestOptions *reqOptions = [[PHImageRequestOptions alloc] init]; // configure as needed, e.g. networkAccessAllowed

[[PHImageManager defaultManager] requestImageDataForAsset:photoAsset
                                                  options:reqOptions
                                            resultHandler:
         ^(NSData *imageData, NSString *dataUTI, UIImageOrientation orientation, NSDictionary *info) {
             CIImage *ciImage = [CIImage imageWithData:imageData];
             NSLog(@"Metadata : %@", ciImage.properties);
         }];
Karthik
  • 1,356
  • 12
  • 7
0

You can modify the PHAsset (e.g. adding location metadata) using Photos Framework and the UIImagePickerControllerDelegate method. No overhead from third party libraries, no duplicate photos created. Works for iOS 8.0+

In the didFinishPickingMediaWithInfo delegate method, call UIImageWriteToSavedPhotosAlbum to first save the image. This will also create the PHAsset whose EXIF GPS data we will modify:

func imagePickerController(picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [String : AnyObject]) {

    if let myImage = info[UIImagePickerControllerOriginalImage] as? UIImage  {

        UIImageWriteToSavedPhotosAlbum(myImage, self, Selector("image:didFinishSavingWithError:contextInfo:"), nil)
    }    
}

The completion selector function will run after the save completes or fails with error. In the callback, fetch the newly created PHAsset. Then, create a PHAssetChangeRequest to modify the location metadata.

func image(image: UIImage, didFinishSavingWithError error: NSError?, contextInfo: UnsafePointer<Void>) {

    if (error != nil) {
        print("Error saving photo: \(error)")
    } else {
        print("Successfully saved photo, will make request to update asset metadata")

        // fetch the most recent image asset:
        let fetchOptions = PHFetchOptions()
        fetchOptions.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: true)]
        let fetchResult = PHAsset.fetchAssetsWithMediaType(PHAssetMediaType.Image, options: fetchOptions)

        // get the asset we want to modify from results:
        let lastImageAsset = fetchResult.lastObject as! PHAsset

        // create CLLocation from lat/long coords:
        // (could fetch from LocationManager if needed)
        let coordinate = CLLocationCoordinate2DMake(myLatitude, myLongitude)
        let nowDate = NSDate()
        // I add some defaults for time/altitude/accuracies:
        let myLocation = CLLocation(coordinate: coordinate, altitude: 0.0, horizontalAccuracy: 1.0, verticalAccuracy: 1.0, timestamp: nowDate)

        // make change request:
        PHPhotoLibrary.sharedPhotoLibrary().performChanges({

            // modify existing asset:
            let assetChangeRequest = PHAssetChangeRequest(forAsset: lastImageAsset)
            assetChangeRequest.location = myLocation

            }, completionHandler: {
                (success:Bool, error:NSError?) -> Void in

                if (success) {
                    print("Successfully saved metadata to asset")
                    print("location metadata = \(myLocation)")
                } else {
                    print("Failed to save metadata to asset with error: \(error!)")
                }
        })
    }
}
Null
  • 1,940
  • 9
  • 24
  • 29
kev8484
  • 482
  • 1
  • 7
  • 16
  • Yes you can. I shared a category in my original post which has a handy method to do this (similar to yours). Its block is called with the updated PHAsset. No dups. [asset updateLocation:location creationDate:nil assetBlock:(^assetBlock)]; – VaporwareWolf Oct 09 '15 at 21:22