I am trying to write a MIDIPlayer class that wraps an AVAudioEngine and an AVAudioUnitMIDIInstrument. I have written a loop that collects the names and AudioComponentDescriptions of all AudioComponents of type MusicDevice and then chooses the most desired unit according to a list, much like font substitution, with Apple's DLS MusicDevice as the ultimate fallback. Here's my sample code:
import AVKit
fileprivate func setupAVAudioEngine(engine: inout AVAudioEngine, instrumentAU: inout AVAudioUnitMIDIInstrument?) {
    // Wildcard description: subtype and manufacturer 0 match every installed MusicDevice component.
    var instrumentACD = AudioComponentDescription(componentType: kAudioUnitType_MusicDevice, componentSubType: 0, componentManufacturer: 0, componentFlags: 0, componentFlagsMask: 0)
    var instrumentComponents: [(AudioComponentDescription, String)] = []
    var instrumentComponent: AudioComponent? = nil
    repeat {
        instrumentComponent = AudioComponentFindNext(instrumentComponent, &instrumentACD)
        if instrumentComponent == nil {
            break
        }
        var compDescr = AudioComponentDescription()
        var name: Unmanaged<CFString>?
        AudioComponentCopyName(instrumentComponent!, &name)
        AudioComponentGetDescription(instrumentComponent!, &compDescr)
        // takeRetainedValue() balances the +1 retain from AudioComponentCopyName, so no extra release is needed.
        let nameString = name!.takeRetainedValue() as String
        instrumentComponents.append((compDescr, nameString))
    } while true
    // Preference list, most desired first; the DLS MusicDevice is the final fallback.
    let instrumentComponentSubstitutionList = ["MakeMusic: SmartMusicSoftSynth", "Apple: AUMIDISynth", "Apple: DLSMusicDevice"]
    var found = false
    for instrument in instrumentComponentSubstitutionList {
        for member in instrumentComponents {
            if member.1 == instrument {
                instrumentACD = member.0
                print("\(member.1) found")
                found = true
                break
            }
        }
        // Stop at the first (most preferred) match so a less preferred unit cannot overwrite it.
        if found { break }
    }
    print("Try to create InstrumentNode with ACD: \(instrumentACD)")
    instrumentAU = AVAudioUnitMIDIInstrument(audioComponentDescription: instrumentACD)
    print("InstrumentNode created: \(instrumentAU!.name)")
    print()
    engine.attach(instrumentAU!)
    engine.connect(instrumentAU!, to: engine.mainMixerNode, format: nil)
}
open class MIDIPlayer {
    private var audioEngine: AVAudioEngine
    private var instrumentUnit: AVAudioUnitMIDIInstrument
    private var mainMixer: AVAudioMixerNode
    public init() {
        self.audioEngine = AVAudioEngine()
        self.mainMixer = audioEngine.mainMixerNode
        var instrumentAU: AVAudioUnitMIDIInstrument?
        setupAVAudioEngine(engine: &audioEngine, instrumentAU: &instrumentAU)
        self.instrumentUnit = instrumentAU!
        try! audioEngine.start()
    }
    public func playMIDINote(_ note: UInt8) {
        print("Playing MIDI Note \(note)")
        instrumentUnit.startNote(note, withVelocity: 70, onChannel: 0)
        sleep(1)
        instrumentUnit.stopNote(note, onChannel: 0)
    }
}
let midiPlayer = MIDIPlayer()
midiPlayer.playMIDINote(60)
midiPlayer.playMIDINote(62)
midiPlayer.playMIDINote(64)
midiPlayer.playMIDINote(65)
midiPlayer.playMIDINote(67)
The code works perfectly fine in an Xcode 11.3.1 playground. However, when I use the exact same code outside the playground I get different kinds of errors. When I copy the same code into a command line project and run it from Xcode it still works, but the console gives me the following error:
[AudioHAL_Client] AudioHardware.cpp:666:AudioObjectGetPropertyData:  AudioObjectGetPropertyData: no object with given ID 0
When I run the executable without Xcode, no error is reported.
When I create a Single View App, put the class and its setup function in their own source file, and create an instance in the applicationDidFinishLaunching method of the AppDelegate (roughly as in the sketch further below), I get the following errors:
[AudioHAL_Client] HALC_ShellDriverPlugIn.cpp:104:Open:  HALC_ShellDriverPlugIn::Open: opening the plug-in failed, Error: 2003329396 (what)
[AudioHAL_Client] AudioHardware.cpp:666:AudioObjectGetPropertyData:  AudioObjectGetPropertyData: no object with given ID 0
These errors are written to the console even before my setup function is called. However, the code still works (I hear notes playing), but only if I change the instrumentComponentSubstitutionList so that one of the Apple AUs is found (in other words, not the SmartMusicSoftSynth). When I keep the SoftSynth as the preferred device the app crashes and I get this additional error:
[avae] AVAEInternal.h:103:_AVAE_CheckNoErr: [AUInterface.mm:461:AUInterfaceBaseV3: (AudioComponentInstanceNew(comp, &_auv2)): error -3000
Note: In the playground and the command line app the SoftSynth works.
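For completeness, this is roughly how the class is instantiated in the Single View App (a minimal sketch; the stored property name midiPlayer and the test note are just illustrative, not part of the failing code itself):
import Cocoa

@NSApplicationMain
class AppDelegate: NSObject, NSApplicationDelegate {

    // Keep a strong reference so the engine stays alive after launch.
    var midiPlayer: MIDIPlayer?

    func applicationDidFinishLaunching(_ aNotification: Notification) {
        // Create the player; this attaches the instrument unit and starts the engine.
        midiPlayer = MIDIPlayer()
        midiPlayer?.playMIDINote(60)
    }
}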
Some observations which may or may not relate to the issue:
In this blog post http://www.rockhoppertech.com/blog/multi-timbral-avaudiounitmidiinstrument/ Gene DeLisa mentions that AVAudioUnitSampler is the only subclass of the abstract class AVAudioUnitMIDIInstrument. The post is from 2016, and I have not found any further information about that. But, obviously, the DLS MusicDevice as well as the 3rd party SoftSynth do work, at least in some environments.
The AudioComponentDescriptions taken from the found AudioComponents for both of the Apple units have their componentFlags property set to 2. The documentation says this field must be set to zero unless a known specific value is requested. The componentFlags property of the SoftSynth is 0.
I still have a somewhat related problem when I try to capture audio from input (How do I allow Xcode to access the microphone?), related insofar as the command line app shows different behavior when run from Xcode or via the terminal, and CoreAudio is involved in both issues.
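For reference, this is the small check I used to look at those flag values (a minimal sketch with a hypothetical helper name dumpMusicDeviceFlags; it reuses the same AudioComponentFindNext enumeration as in the setup function above):
import AudioToolbox

// Prints the name and componentFlags of every installed MusicDevice component.
func dumpMusicDeviceFlags() {
    var searchACD = AudioComponentDescription(componentType: kAudioUnitType_MusicDevice, componentSubType: 0, componentManufacturer: 0, componentFlags: 0, componentFlagsMask: 0)
    var component: AudioComponent? = nil
    while let next = AudioComponentFindNext(component, &searchACD) {
        component = next
        var desc = AudioComponentDescription()
        var name: Unmanaged<CFString>?
        AudioComponentGetDescription(next, &desc)
        AudioComponentCopyName(next, &name)
        var nameString = "<unnamed>"
        if let cfName = name?.takeRetainedValue() {
            nameString = cfName as String
        }
        print("\(nameString): componentFlags = \(desc.componentFlags)")
    }
}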
My Question(s):
Why do I get these errors?
Why does the 3rd party plug-in work in the playground and in the command line app, but not in the Single View App?
Is AVAudioEngine ready to host instrument units other than the monotimbral sampler unit?
Or do I have to drop down a level and use the instrument AUs directly instead of the AVAudioUnitMIDIInstrument wrapper?