I'm trying to achieve something that seems like it should be simple: listen for MIDI messages in a Mac app, and use them to play notes on an existing AVAudioUnit instrument.
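For concreteness, here's roughly how my instrument side is set up. AVAudioUnitSampler is just a stand-in for whatever AVAudioUnit instrument you already have, and all the names are mine:

```swift
import AVFoundation

let engine = AVAudioEngine()
let sampler = AVAudioUnitSampler() // stand-in for any AVAudioUnit instrument

engine.attach(sampler)
engine.connect(sampler, to: engine.mainMixerNode, format: nil)
try engine.start() // in a playground or other throwing context
```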
My working hypothesis is that I need to write a bridge between the MIDIReadBlock attached to my CoreMIDI input port (created with MIDIInputPortCreateWithBlock from a MIDIClientRef) and the AUScheduleMIDIEventBlock exposed by my AVAudioUnit's underlying AUAudioUnit (via its scheduleMIDIEventBlock property). This seems more complex than it should be, though, since it means handling raw MIDI bytes myself; I'd expect audio units to support some MIDI abstraction that plugs straight into CoreMIDI, but I can't find any examples of one. Perhaps there's a way to use MIDIOutputPortCreate with an AVAudioUnit or AUAudioUnit?
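Here's a minimal sketch of the bridge I'm imagining, building on the setup above. I'm showing current Swift (the unsafeSequence() iterator over the packet list needs macOS 10.15+; under Swift 3 you'd walk the packets with MIDIPacketNext instead), and all the names are placeholders:

```swift
import CoreMIDI
import AudioToolbox

// Grab the schedule block from the instrument's underlying AUAudioUnit.
guard let scheduleMIDI = sampler.auAudioUnit.scheduleMIDIEventBlock else {
    fatalError("this audio unit doesn't accept scheduled MIDI events")
}

var client = MIDIClientRef()
MIDIClientCreateWithBlock("MIDIBridge" as CFString, &client, nil)

var inputPort = MIDIPortRef()
MIDIInputPortCreateWithBlock(client, "Input" as CFString, &inputPort) { packetList, _ in
    // This block runs on a CoreMIDI thread; forward each packet's raw
    // bytes to the audio unit to be rendered as soon as possible.
    for packet in packetList.unsafeSequence() {
        withUnsafeBytes(of: packet.pointee.data) { raw in
            scheduleMIDI(AUEventSampleTimeImmediate,
                         0, // cable number
                         Int(packet.pointee.length),
                         raw.baseAddress!.assumingMemoryBound(to: UInt8.self))
        }
    }
}

// Connect every available source for testing.
for i in 0 ..< MIDIGetNumberOfSources() {
    MIDIPortConnectSource(inputPort, MIDIGetSource(i), nil)
}
```

Even if something like this works, it feels like plumbing I shouldn't have to write by hand, which is why I'm asking.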
What I'm looking for is a working example of piping MIDI input directly into an audio unit (ideally using Swift 3), but if you know of any related resources that are relatively current, please share those links too. The sparse documentation for these APIs is pretty frustrating. Thanks!