
I am developing an application so that people can record and change their voices through the app and share them. Basically I have already done many things, and now it's time to ask for your help. Here is my play function, which plays the recorded audio file and adds effects to it.

private func playAudio(pitch : Float, rate: Float, reverb: Float, echo: Float) {
        // Initialize variables
        audioEngine = AVAudioEngine()
        audioPlayerNode = AVAudioPlayerNode()
        audioEngine.attachNode(audioPlayerNode)

        // Setting the pitch
        let pitchEffect = AVAudioUnitTimePitch()
        pitchEffect.pitch = pitch
        audioEngine.attachNode(pitchEffect)

        // Setting the playback-rate
        let playbackRateEffect = AVAudioUnitVarispeed()
        playbackRateEffect.rate = rate
        audioEngine.attachNode(playbackRateEffect)

        // Setting the reverb effect
        let reverbEffect = AVAudioUnitReverb()
        reverbEffect.loadFactoryPreset(AVAudioUnitReverbPreset.Cathedral)
        reverbEffect.wetDryMix = reverb
        audioEngine.attachNode(reverbEffect)

        // Setting the echo effect on a specific interval
        let echoEffect = AVAudioUnitDelay()
        echoEffect.delayTime = NSTimeInterval(echo)
        audioEngine.attachNode(echoEffect)

        // Chain all these up, ending with the output
        audioEngine.connect(audioPlayerNode, to: playbackRateEffect, format: nil)
        audioEngine.connect(playbackRateEffect, to: pitchEffect, format: nil)
        audioEngine.connect(pitchEffect, to: reverbEffect, format: nil)
        audioEngine.connect(reverbEffect, to: echoEffect, format: nil)
        audioEngine.connect(echoEffect, to: audioEngine.outputNode, format: nil)

        audioPlayerNode.stop()

        let length = 4000
        let buffer = AVAudioPCMBuffer(PCMFormat: audioPlayerNode.outputFormatForBus(0),frameCapacity:AVAudioFrameCount(length))
        buffer.frameLength = AVAudioFrameCount(length)

        try! audioEngine.start()


        let dirPaths: AnyObject = NSSearchPathForDirectoriesInDomains( NSSearchPathDirectory.DocumentDirectory,  NSSearchPathDomainMask.UserDomainMask, true)[0]
        let tmpFileUrl: NSURL = NSURL.fileURLWithPath(dirPaths.stringByAppendingPathComponent("effectedSound.m4a"))


        do{
            print(dirPaths)
            let settings = [AVFormatIDKey: NSNumber(unsignedInt: kAudioFormatMPEG4AAC), AVSampleRateKey: NSNumber(integer: 44100), AVNumberOfChannelsKey: NSNumber(integer: 2)]
            self.newAudio = try AVAudioFile(forWriting: tmpFileUrl, settings: settings)

            audioEngine.outputNode.installTapOnBus(0, bufferSize: (AVAudioFrameCount(self.player!.duration)), format: self.audioPlayerNode.outputFormatForBus(0)){
                (buffer: AVAudioPCMBuffer!, time: AVAudioTime!)  in

                print(self.newAudio.length)
                print("=====================")
                print(self.audioFile.length)
                print("**************************")
                if (self.newAudio.length) < (self.audioFile.length){

                    do{
                        //print(buffer)
                        try self.newAudio.writeFromBuffer(buffer)
                    }catch _{
                        print("Problem Writing Buffer")
                    }
                }else{
                    self.audioPlayerNode.removeTapOnBus(0)
                }

            }
        }catch _{
            print("Problem")
        }

        audioPlayerNode.play()

    }

I guess the problem is that I call installTapOnBus on audioPlayerNode, but the effected audio is on audioEngine.outputNode. However, when I tried installTapOnBus on audioEngine.outputNode, it gave me an error. I have also tried connecting the effects to audioEngine.mainMixerNode, but that was not a solution either. So, do you have any experience with saving an effected audio file? How can I get this effected audio?

Any help is appreciated

Thank you

Kaan Baris Bayrak

5 Answers


Here is my solution to the question:

func playAndRecord(pitch : Float, rate: Float, reverb: Float, echo: Float) {
    // Initialize variables

// These are global variables; if you prefer, you can initialize them locally here instead (e.g. let audioEngine = AVAudioEngine())
    audioEngine = AVAudioEngine()
    audioPlayerNode = AVAudioPlayerNode()
    audioEngine.attachNode(audioPlayerNode)
    playerB = AVAudioPlayerNode()

    audioEngine.attachNode(playerB)

    // Setting the pitch
    let pitchEffect = AVAudioUnitTimePitch()
    pitchEffect.pitch = pitch
    audioEngine.attachNode(pitchEffect)

    // Setting the playback-rate
    let playbackRateEffect = AVAudioUnitVarispeed()
    playbackRateEffect.rate = rate
    audioEngine.attachNode(playbackRateEffect)

    // Setting the reverb effect
    let reverbEffect = AVAudioUnitReverb()
    reverbEffect.loadFactoryPreset(AVAudioUnitReverbPreset.Cathedral)
    reverbEffect.wetDryMix = reverb
    audioEngine.attachNode(reverbEffect)

    // Setting the echo effect on a specific interval
    let echoEffect = AVAudioUnitDelay()
    echoEffect.delayTime = NSTimeInterval(echo)
    audioEngine.attachNode(echoEffect)

    // Chain all these up, ending with the output
    audioEngine.connect(audioPlayerNode, to: playbackRateEffect, format: nil)
    audioEngine.connect(playbackRateEffect, to: pitchEffect, format: nil)
    audioEngine.connect(pitchEffect, to: reverbEffect, format: nil)
    audioEngine.connect(reverbEffect, to: echoEffect, format: nil)
    audioEngine.connect(echoEffect, to: audioEngine.mainMixerNode, format: nil)


    // Good practice to stop before starting
    audioPlayerNode.stop()

    // Stop the previously playing audio, if any
    // (player is also a global variable, an AVAudioPlayer)
    if(player != nil){
    player?.stop()
    }

    // audioFile here is our original audio
    audioPlayerNode.scheduleFile(audioFile, atTime: nil, completionHandler: {
        print("Complete")
    })


    try! audioEngine.start()


    let dirPaths: AnyObject = NSSearchPathForDirectoriesInDomains( NSSearchPathDirectory.DocumentDirectory,  NSSearchPathDomainMask.UserDomainMask, true)[0]
    let tmpFileUrl: NSURL = NSURL.fileURLWithPath(dirPaths.stringByAppendingPathComponent("effectedSound2.m4a"))

//Save the tmpFileUrl into a global variable so we don't lose it (not important if you want to do something else)
filteredOutputURL = tmpFileUrl

    do{
        print(dirPaths)

        self.newAudio = try! AVAudioFile(forWriting: tmpFileUrl, settings:  [
            AVFormatIDKey: NSNumber(unsignedInt:kAudioFormatAppleLossless),
            AVEncoderAudioQualityKey : AVAudioQuality.Low.rawValue,
            AVEncoderBitRateKey : 320000,
            AVNumberOfChannelsKey: 2,
            AVSampleRateKey : 44100.0
            ])

        let length = self.audioFile.length


        audioEngine.mainMixerNode.installTapOnBus(0, bufferSize: 1024, format: self.audioEngine.mainMixerNode.inputFormatForBus(0)) {
            (buffer: AVAudioPCMBuffer!, time: AVAudioTime!) -> Void in


            print(self.newAudio.length)
            print("=====================")
            print(length)
            print("**************************")

            if (self.newAudio.length) < length {// Know when to stop saving the file, otherwise it keeps saving indefinitely

                do{
                    //print(buffer)
                    try self.newAudio.writeFromBuffer(buffer)
                }catch _{
                    print("Problem Writing Buffer")
                }
            }else{
                self.audioEngine.mainMixerNode.removeTapOnBus(0)// if we don't remove it, it will keep tapping indefinitely

                //DO WHAT YOU WANT TO DO HERE WITH EFFECTED AUDIO

             }

        }
    }catch _{
        print("Problem")
    }

    audioPlayerNode.play()

}
Kaan Baris Bayrak

This doesn't seem to be hooked up correctly. I'm just learning all this myself, but I found that the effects are correctly added when you connect them to a mixer node. Also, you'll want to tap the mixer, not the engine output node. I've just copied your code and made a few modifications to take this into account.

private func playAudio(pitch : Float, rate: Float, reverb: Float, echo: Float) {
    // Initialize variables
    audioEngine = AVAudioEngine()
    audioPlayerNode = AVAudioPlayerNode()
    audioEngine.attachNode(audioPlayerNode)

    // Setting the pitch
    let pitchEffect = AVAudioUnitTimePitch()
    pitchEffect.pitch = pitch
    audioEngine.attachNode(pitchEffect)

    // Setting the playback-rate
    let playbackRateEffect = AVAudioUnitVarispeed()
    playbackRateEffect.rate = rate
    audioEngine.attachNode(playbackRateEffect)

    // Setting the reverb effect
    let reverbEffect = AVAudioUnitReverb()
    reverbEffect.loadFactoryPreset(AVAudioUnitReverbPreset.Cathedral)
    reverbEffect.wetDryMix = reverb
    audioEngine.attachNode(reverbEffect)

    // Setting the echo effect on a specific interval
    let echoEffect = AVAudioUnitDelay()
    echoEffect.delayTime = NSTimeInterval(echo)
    audioEngine.attachNode(echoEffect)

    // Set up a mixer node
    let audioMixer = AVAudioMixerNode()
    audioEngine.attachNode(audioMixer)

    // Chain all these up, ending with the output
    audioEngine.connect(audioPlayerNode, to: playbackRateEffect, format: nil)
    audioEngine.connect(playbackRateEffect, to: pitchEffect, format: nil)
    audioEngine.connect(pitchEffect, to: reverbEffect, format: nil)
    audioEngine.connect(reverbEffect, to: echoEffect, format: nil)
    audioEngine.connect(echoEffect, to: audioMixer, format: nil)
    audioEngine.connect(audioMixer, to: audioEngine.outputNode, format: nil)

    audioPlayerNode.stop()

    let length = 4000
    let buffer = AVAudioPCMBuffer(PCMFormat: audioPlayerNode.outputFormatForBus(0),frameCapacity:AVAudioFrameCount(length))
    buffer.frameLength = AVAudioFrameCount(length)

    try! audioEngine.start()


    let dirPaths: AnyObject = NSSearchPathForDirectoriesInDomains( NSSearchPathDirectory.DocumentDirectory,  NSSearchPathDomainMask.UserDomainMask, true)[0]
    let tmpFileUrl: NSURL = NSURL.fileURLWithPath(dirPaths.stringByAppendingPathComponent("effectedSound.m4a"))


    do{
        print(dirPaths)
        let settings = [AVFormatIDKey: NSNumber(unsignedInt: kAudioFormatMPEG4AAC), AVSampleRateKey: NSNumber(integer: 44100), AVNumberOfChannelsKey: NSNumber(integer: 2)]
        self.newAudio = try AVAudioFile(forWriting: tmpFileUrl, settings: settings)

        audioMixer.installTapOnBus(0, bufferSize: (AVAudioFrameCount(self.player!.duration)), format: audioMixer.outputFormatForBus(0)){
            (buffer: AVAudioPCMBuffer!, time: AVAudioTime!)  in

            print(self.newAudio.length)
            print("=====================")
            print(self.audioFile.length)
            print("**************************")
            if (self.newAudio.length) < (self.audioFile.length){

                do{
                    //print(buffer)
                    try self.newAudio.writeFromBuffer(buffer)
                }catch _{
                    print("Problem Writing Buffer")
                }
            }else{
                audioMixer.removeTapOnBus(0)
            }

        }
    }catch _{
        print("Problem")
    }

    audioPlayerNode.play()

}

I also had trouble getting the file formatted properly. I finally got it working when I changed the output file's extension from .m4a to .caf. Another suggestion is not to pass nil for the format parameter; I used audioFile.processingFormat instead. I hope this helps. My audio effects/mixing is functional, although I did not chain my effects, so feel free to ask questions.
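A rough sketch of those two changes, reusing the Swift 2-era names from the code above (audioMixer, audioFile, newAudio are assumed to be the same variables), might look like this:

    // Sketch only: write to a .caf file and pass an explicit format instead of nil
    let docsDir = NSSearchPathForDirectoriesInDomains(.DocumentDirectory, .UserDomainMask, true)[0] as NSString
    let cafUrl = NSURL(fileURLWithPath: docsDir.stringByAppendingPathComponent("effectedSound.caf"))

    // The source file's processing format can be reused for the final connection and the tap
    let format = audioFile.processingFormat
    audioEngine.connect(audioMixer, to: audioEngine.outputNode, format: format)

    // Inside the existing do { } block, match the writer to that format
    self.newAudio = try AVAudioFile(forWriting: cafUrl, settings: format.settings)
    audioMixer.installTapOnBus(0, bufferSize: 1024, format: format) { buffer, time in
        // keep the length check / writeFromBuffer / removeTapOnBus logic from above
    }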

FromTheStix
  • @KBB One question I have is where is the original audio file? Is it previously recorded and then played? If so, you'll want to schedule that file - `audioPlayerNode.scheduleFile(recordedAudioFile, atTime: nil, completionHandler: nil)` and start the audioEngine before you call `audioPlayerNode.play()` – FromTheStix Aug 16 '16 at 01:09
  • This is definitely what I was going to write as an answer to my own question :D I solved it by connecting the last effect to the mixer node and putting a tap on the mixer node. I am going to write my own solution below – Kaan Baris Bayrak Aug 16 '16 at 07:10
  • Excellent! Glad it helped. I kept getting data from the buffer and the file length was increasing as the buffer wrote to file but there was no duration to the audio file. As soon as I changed the file type to `.caf` it was readable/playable. – FromTheStix Aug 17 '16 at 14:17

Just change the format parameter (the unsigned int) from kAudioFormatMPEG4AAC to kAudioFormatLinearPCM, and also change the file type to .caf. It will surely be helpful, my friend.
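As a sketch of that change against the settings dictionary from the question (names such as dirPaths and newAudio are assumed from the original code, and the PCM keys may need tuning, e.g. bit depth):

    // Sketch: linear PCM instead of AAC, and a .caf destination instead of .m4a
    let settings = [AVFormatIDKey: NSNumber(unsignedInt: kAudioFormatLinearPCM),
                    AVSampleRateKey: NSNumber(integer: 44100),
                    AVNumberOfChannelsKey: NSNumber(integer: 2)]
    let tmpFileUrl = NSURL(fileURLWithPath: dirPaths.stringByAppendingPathComponent("effectedSound.caf"))
    self.newAudio = try AVAudioFile(forWriting: tmpFileUrl, settings: settings)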


For anyone who has the problem of having to play the audio file TWICE before it gets saved, I just added the following line at the respective place and it solved my problem. It might help someone in the future.

P.S.: I used the EXACT same code as the accepted answer above; I just added this one line and it solved my problem.

//Do what you want to do here with effected Audio

   self.newAudio = try! AVAudioFile(forReading: tmpFileUrl)
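In context, that line goes in the else branch of the accepted answer's tap, right after the "DO WHAT YOU WANT" comment (a sketch, assuming the same newAudio and tmpFileUrl names; reopening the file for reading replaces the writing AVAudioFile, which appears to be what finalizes the file on disk):

            }else{
                self.audioEngine.mainMixerNode.removeTapOnBus(0)// if we don't remove it, it will keep tapping indefinitely

                //DO WHAT YOU WANT TO DO HERE WITH EFFECTED AUDIO
                // Replacing the writer with a reader closes the file that was being written
                self.newAudio = try! AVAudioFile(forReading: tmpFileUrl)
            }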
Arun Vinoth

I got this error after adding

self.newAudio = try! AVAudioFile(forReading: tmpFileUrl)

It returns the following:

Error Domain=com.apple.coreaudio.avfaudio Code=1685348671 "(null)" UserInfo={failed call=ExtAudioFileOpenURL((CFURLRef)fileURL, &_extAudioFile)}