I am using an AVAssetReader to read audio out of the iPod library (see here). I then take the read buffer and play it through an AudioUnit. I am trying to refactor the code to stream the audio in as I play it out. However, while an AVAssetReader is running, the AudioUnit stops receiving calls to the render callback it registered via kAudioUnitProperty_SetRenderCallback.
I have simplified my code down to just playing a file from within the AudioUnit's render callback:
    OSStatus UnitRenderCB(void* pRefCon, AudioUnitRenderActionFlags* flags,
                          const AudioTimeStamp* timeStamp, UInt32 busNum,
                          UInt32 numFrames, AudioBufferList* pData)
    {
        MySoundStream* pS = (MySoundStream*)pRefCon;

        // Decode the next chunk of the MP3/WAV/etc. straight into the output buffers
        OSStatus tErr = ExtAudioFileRead(pS->InRef, &numFrames, pData);
        if (tErr != noErr)
            return tErr;

        if (numFrames == 0) { // EOF: rewind to the start of the file
            ExtAudioFileSeek(pS->InRef, 0);
            return 1;
        }
        return noErr;
    }
Then I run the code below, and the AudioUnit stops playing; UnitRenderCB is never called again:
    NSError* error = nil;
    AVURLAsset* songAsset = [AVURLAsset URLAssetWithURL:songURL options:nil];
    AVAssetReader* reader = [[AVAssetReader alloc] initWithAsset:songAsset error:&error];
    {
        AVAssetReaderTrackOutput* output =
            [[AVAssetReaderTrackOutput alloc] initWithTrack:songTrack
                                             outputSettings:outputSettingsDict];
        [reader addOutput:output];
        [output release];
    }
    [reader startReading];
    [reader cancelReading];
    [reader release];
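For context, in the full (non-simplified) version I would not cancel the reader immediately; the plan is to drain it on a background queue and hand the PCM to the render callback through my own FIFO. A rough sketch of that loop, assuming `reader`/`output` stay alive and the output settings produce interleaved LPCM (the FIFO itself is a placeholder):

```objc
// Hypothetical streaming loop; run this off the render thread.
while (reader.status == AVAssetReaderStatusReading) {
    CMSampleBufferRef sample = [output copyNextSampleBuffer];
    if (!sample) break; // EOF or error

    CMBlockBufferRef block = CMSampleBufferGetDataBuffer(sample);
    size_t totalLength = 0;
    char* data = NULL;
    if (CMBlockBufferGetDataPointer(block, 0, NULL, &totalLength, &data) == kCMBlockBufferNoErr) {
        // Copy totalLength bytes of PCM into a ring buffer that
        // UnitRenderCB drains instead of calling ExtAudioFileRead.
    }
    CFRelease(sample);
}
```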
I am using the same AudioUnit setup as listed here.
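In case it matters, the callback above is attached to the RemoteIO unit roughly like this (`audioUnit` and `pMySoundStream` are stand-ins for my actual instances):

```objc
AURenderCallbackStruct cb;
cb.inputProc       = UnitRenderCB;
cb.inputProcRefCon = pMySoundStream; // passed back as pRefCon

// Install the render callback on the output unit's input scope, bus 0
OSStatus err = AudioUnitSetProperty(audioUnit,
                                    kAudioUnitProperty_SetRenderCallback,
                                    kAudioUnitScope_Input,
                                    0,
                                    &cb,
                                    sizeof(cb));
```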
Does AVAssetReader use the AudioUnit system internally? Is there any way to make the two work together?