Using AudioUnit with wav file

Discussion in 'iOS Development' started by jbonedev, Aug 7, 2009.

    I'm new to Core Audio (and audio programming in general).

    I want to mix two .wav files using a mixer unit of subtype kAudioUnitSubType_MultiChannelMixer (at least, that's how I think I want to do it...).

    I understand I need to create an AudioUnit of subtype
    kAudioUnitSubType_RemoteIO (specified in the ComponentDescription),
    and then a mixer unit of subtype kAudioUnitSubType_MultiChannelMixer.

    Then connect the two. That makes sense.
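    Here's roughly how I picture that part, pieced together from examples I've seen. This is only a sketch — the variable names are mine, and I've omitted the OSStatus error checking that real code would need:

    ```c
    #include <AudioToolbox/AudioToolbox.h>

    // Sketch: build an AUGraph with a MultiChannelMixer feeding RemoteIO.
    // Every AUGraph call returns an OSStatus that real code should check.
    static void buildGraph(AUGraph *outGraph, AudioUnit *outMixerUnit)
    {
        AUGraph graph;
        AUNode mixerNode, ioNode;

        NewAUGraph(&graph);

        // Multichannel mixer unit
        AudioComponentDescription mixerDesc = {0};
        mixerDesc.componentType = kAudioUnitType_Mixer;
        mixerDesc.componentSubType = kAudioUnitSubType_MultiChannelMixer;
        mixerDesc.componentManufacturer = kAudioUnitManufacturer_Apple;

        // RemoteIO output unit (speaker/headphones)
        AudioComponentDescription ioDesc = {0};
        ioDesc.componentType = kAudioUnitType_Output;
        ioDesc.componentSubType = kAudioUnitSubType_RemoteIO;
        ioDesc.componentManufacturer = kAudioUnitManufacturer_Apple;

        AUGraphAddNode(graph, &mixerDesc, &mixerNode);
        AUGraphAddNode(graph, &ioDesc, &ioNode);
        AUGraphOpen(graph);
        AUGraphNodeInfo(graph, mixerNode, NULL, outMixerUnit);

        // Mixer output bus 0 -> RemoteIO input bus 0
        AUGraphConnectNodeInput(graph, mixerNode, 0, ioNode, 0);

        AUGraphInitialize(graph);
        *outGraph = graph;
    }
    ```

    Is that right so far? (I used the AUGraphAddNode/AudioComponentDescription flavor of the API; some of the examples I've found use the older AUGraphNewNode/ComponentDescription calls instead.)
    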

    After that, I'm not clear on how to create AudioUnits that represent the .wav files and connect those to the mixer.

    I found an example of how to connect a MIDI synth to a mixer which looked like this:
    // AudioGraph and MixerNode are assumed to be set up elsewhere.
    AUNode SynthNode;
    AudioUnit SynthUnit;
    ComponentDescription cd;
    cd.componentManufacturer = kAudioUnitManufacturer_Apple;
    cd.componentFlags = 0;
    cd.componentFlagsMask = 0;
    cd.componentType = kAudioUnitType_MusicDevice;
    cd.componentSubType = kAudioUnitSubType_DLSSynth;
    // Create the synth node, fetch its AudioUnit, and wire its
    // output bus 0 into the mixer's input bus 0.
    AUGraphNewNode(AudioGraph, &cd, 0, NULL, &SynthNode);
    AUGraphGetNodeInfo(AudioGraph, SynthNode, NULL, NULL, NULL, &SynthUnit);
    AUGraphConnectNodeInput(AudioGraph, SynthNode, 0, MixerNode, 0);
    Is there some similar type of AudioUnit that represents a .wav file and can be connected to the mixer unit?

    Another question I have: I see a lot of people working with the AudioUnit APIs defining their own render callbacks to load buffers. Do I need to do that in this case? As I say, I'm new to audio programming, so I'm not sure what the callbacks need to do. Presumably they read samples from the .wav file and populate a buffer that is passed as a parameter to the callback, but how exactly that needs to be done is not at all clear to me (where in the .wav file do I read from, for example?).
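    For what it's worth, the callback pattern I've seen sketched elsewhere looks roughly like this — again, only my guess at the shape of it, with a hypothetical FilePlayer struct, and assuming the .wav file has been opened elsewhere with ExtAudioFileOpenURL and given a client format matching the mixer input:

    ```c
    #include <AudioToolbox/AudioToolbox.h>
    #include <string.h>

    // Hypothetical per-file state passed to the callback via inRefCon.
    typedef struct {
        ExtAudioFileRef file;   // the .wav file, opened elsewhere
    } FilePlayer;

    // Render callback installed on one mixer input bus (e.g. with
    // AUGraphSetNodeInputCallback). Core Audio calls this whenever the
    // mixer needs inNumberFrames of audio for that bus; we fill ioData by
    // reading the next chunk of the file. ExtAudioFile tracks the read
    // position, so each call continues where the previous one left off.
    static OSStatus renderCallback(void *inRefCon,
                                   AudioUnitRenderActionFlags *ioActionFlags,
                                   const AudioTimeStamp *inTimeStamp,
                                   UInt32 inBusNumber,
                                   UInt32 inNumberFrames,
                                   AudioBufferList *ioData)
    {
        FilePlayer *player = (FilePlayer *)inRefCon;
        UInt32 frames = inNumberFrames;

        // Reads from the file, converting to the client format, directly
        // into the buffers Core Audio hands us in ioData.
        OSStatus err = ExtAudioFileRead(player->file, &frames, ioData);

        if (err != noErr || frames == 0) {
            // End of file (or a read error): output silence for this slice.
            for (UInt32 i = 0; i < ioData->mNumberBuffers; i++)
                memset(ioData->mBuffers[i].mData, 0,
                       ioData->mBuffers[i].mDataByteSize);
        }
        return noErr;
    }
    ```

    If that's the right idea, then presumably I'd fill in an AURenderCallbackStruct with renderCallback and a FilePlayer pointer, and install one per file on mixer input buses 0 and 1. Is that how it's supposed to work?
    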
