I have been unable to find much information on CoreMIDI for iOS. Is it even possible to play a MIDI sound by sending a message to the device itself? Does the iPhone or iPad have a built-in MIDI device, or do you have to connect an external device to interface with?
You should take a look at Pete Goodliffe's blog; he generously provides an example project. It helped me a lot when I started programming with CoreMIDI.
Now, about your questions: on iOS, CoreMIDI network sessions are what is mostly used. The participants in the same "network session" send messages to each other.
For example, you configure a network session on your Mac (using the Audio MIDI Setup tool) and connect your iOS devices to it. This way, you can send messages from iOS to your OS X host and vice versa.
CoreMIDI network sessions rely on the RTP protocol to transport MIDI messages and on Bonjour to discover hosts.
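For reference, a minimal Swift sketch of enabling the default network session on the iOS side looks roughly like this (the connection policy here is just an example; adjust it to your needs):

import CoreMIDI

// Minimal sketch: publish this device's default CoreMIDI network session so
// that hosts browsing via Bonjour (e.g. Audio MIDI Setup on a Mac) can connect.
func enableNetworkMIDISession() {
    let session = MIDINetworkSession.default()
    session.isEnabled = true            // advertise the session via Bonjour
    session.connectionPolicy = .anyone  // or .hostsInContactList / .noOne
    // Once a host connects, session.sourceEndpoint() and session.destinationEndpoint()
    // can be wired to ordinary MIDI input/output ports to receive and send messages.
}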
Besides that, CoreMIDI can also handle MIDI interfaces connected to the system, but iOS devices don't have a physical MIDI interface by default. You have to buy external hardware if you want to connect your iPhone directly to a synthesizer. The iPad, however, can be connected to a USB class-compliant MIDI interface via the Camera Connection Kit.
One more thing: on a standalone iOS device, you can use the local CoreMIDI session to send messages to, or receive messages from, another CoreMIDI-compatible application.
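For example, a rough sketch of publishing a virtual source from your own app could look like this (the client and source names are just placeholders):

import CoreMIDI

// Sketch: expose a virtual MIDI source that other CoreMIDI-aware apps on this
// device can discover (it shows up among MIDIGetNumberOfSources()).
func publishVirtualSource() -> MIDIEndpointRef {
    var client = MIDIClientRef()
    var virtualSource = MIDIEndpointRef()
    MIDIClientCreateWithBlock("LocalSessionClient" as CFString, &client, nil)
    MIDISourceCreateWithProtocol(client, "My Virtual Source" as CFString, ._1_0, &virtualSource)
    // To emit MIDI from here, build a MIDIEventList and hand it to
    // MIDIReceivedEventList(virtualSource, eventListPointer).
    return virtualSource
}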
This is a couple of years too late, but it may help someone else out there the way it helped me. This website was instrumental in helping me read MIDI data from an external MIDI keyboard. The connections are the trickiest part, but the tutorial walks you through them.
Here's the class that I created.
MIDIController.h
#import <Foundation/Foundation.h>
@interface MIDIController : NSObject
@property NSMutableArray *notes;
@end
MIDIController.m
#import "MIDIController.h"
#include <CoreFoundation/CoreFoundation.h>
#import <CoreMIDI/CoreMIDI.h>
#define SYSEX_LENGTH 1024
#define KEY_ON 1
#define KEY_OFF 0
@implementation MIDIController
- (id)init {
    if (self = [super init]) {
        _notes = [[NSMutableArray alloc] init];
        [self setupMidi];
    }
    return self;
}
- (void)setupMidi {
    MIDIClientRef midiClient;
    checkError(MIDIClientCreate(CFSTR("MIDI client"), NULL, NULL, &midiClient), "MIDI client creation error");
    MIDIPortRef inputPort;
    checkError(MIDIInputPortCreate(midiClient, CFSTR("Input"), midiInputCallback, (__bridge_retained void *)self, &inputPort), "MIDI input port error");
    checkError(connectMIDIInputSource(inputPort), "connect MIDI Input Source error");
}
OSStatus connectMIDIInputSource(MIDIPortRef inputPort) {
    // Connect the input port to every available MIDI source (e.g. an attached keyboard).
    unsigned long sourceCount = MIDIGetNumberOfSources();
    for (unsigned long i = 0; i < sourceCount; ++i) {
        MIDIEndpointRef endPoint = MIDIGetSource(i);
        CFStringRef endpointName = NULL;
        checkError(MIDIObjectGetStringProperty(endPoint, kMIDIPropertyName, &endpointName), "String property not found");
        CFRelease(endpointName); // the name is only fetched to confirm the endpoint; release it
        checkError(MIDIPortConnectSource(inputPort, endPoint, NULL), "MIDI not connected");
    }
    return noErr;
}
void midiInputCallback(const MIDIPacketList *list, void *procRef, void *srcRef) {
    MIDIController *midiController = (__bridge MIDIController *)procRef;
    UInt16 nBytes;
    const MIDIPacket *packet = &list->packet[0]; // first packet in the list
    for (unsigned int i = 0; i < list->numPackets; i++) {
        nBytes = packet->length; // number of bytes in the packet
        handleMIDIStatus(packet, midiController);
        packet = MIDIPacketNext(packet);
    }
}
void handleMIDIStatus(const MIDIPacket *packet, MIDIController *midiController) {
    int status = packet->data[0];
    //unsigned char messageChannel = status & 0xF; // 16 possible MIDI channels
    switch (status & 0xF0) {
        case 0x80:
            updateKeyboardButtonAfterKeyPressed(midiController, packet->data[1], KEY_OFF);
            break;
        case 0x90:
            // data[2] is the velocity of the note; a note on with velocity 0 also means note off
            if (packet->data[2] != 0) {
                updateKeyboardButtonAfterKeyPressed(midiController, packet->data[1], KEY_ON);
            } else {
                updateKeyboardButtonAfterKeyPressed(midiController, packet->data[1], KEY_OFF);
            }
            break;
        default:
            //NSLog(@"Some other message");
            break;
    }
}
void updateKeyboardButtonAfterKeyPressed(MIDIController *midiController, int key, bool keyStatus) {
    NSMutableArray *notes = [midiController notes];
    if (keyStatus) { // key is being pressed
        [notes addObject:[NSNumber numberWithInt:key]];
    } else {         // key has been released
        // iterate backwards so removing an entry doesn't skip the one after it
        for (NSInteger i = [notes count] - 1; i >= 0; i--) {
            if ([[notes objectAtIndex:i] integerValue] == key) {
                [notes removeObjectAtIndex:i];
            }
        }
    }
}
void checkError(OSStatus error, const char *task) {
    if (error == noErr) return;
    char errorString[20];
    *(UInt32 *)(errorString + 1) = CFSwapInt32BigToHost(error);
    if (isprint(errorString[1]) && isprint(errorString[2]) && isprint(errorString[3]) && isprint(errorString[4])) {
        errorString[0] = errorString[5] = '\'';
        errorString[6] = '\0';
    } else {
        sprintf(errorString, "%d", (int)error);
    }
    fprintf(stderr, "Error: %s (%s)\n", task, errorString);
    exit(1);
}
@end
Additional Notes

midiInputCallback function
- midiInputCallback is the function that gets called when a MIDI event arrives from a MIDI device (e.g. a keyboard).
- NOTE: This is where you can start handling the MIDI information.

handleMIDIStatus function
- handleMIDIStatus takes the MIDI packet (which contains the information about what was played) and an instance of MIDIController.
- NOTE: You need the reference to MIDIController so that you can populate properties of the class. In my case I store all played notes, by MIDI number, in an array for later use.
- When the status is 0x90, a note has been triggered; if its velocity is 0, it is treated as not played. I added that check because many keyboards send a note on with velocity 0 instead of an explicit note off, and key releases weren't handled properly without it (see the short sketch after these notes).
- NOTE: I only handle key-on and key-off events, so you would augment the switch statement to handle more MIDI events.

updateKeyboardButtonAfterKeyPressed method
- This is the method I use to store the notes that are played; I remove a note from the array once its key has been released.
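If you are doing the same thing in Swift, a rough sketch of that status-byte handling could look like this (the helper name is just a placeholder):

// Split a raw MIDI 1.0 status byte and apply the convention that a
// note on with velocity 0 counts as a note off.
func interpret(statusByte: UInt8, note: UInt8, velocity: UInt8) {
    let command = statusByte & 0xF0   // e.g. 0x90 = note on, 0x80 = note off
    let channel = statusByte & 0x0F   // 16 possible MIDI channels (0-15)
    switch command {
    case 0x90 where velocity > 0:
        print("note on  \(note), velocity \(velocity), channel \(channel)")
    case 0x80, 0x90:                  // explicit note off, or note on with velocity 0
        print("note off \(note), channel \(channel)")
    default:
        print("other message, status 0x\(String(statusByte, radix: 16))")
    }
}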
I hope this helps.
import UIKit
import CoreMIDI
class ViewController: UIViewController {

    // MARK: - Properties -
    var inputPort: MIDIPortRef = 0
    var source: MIDIDeviceRef = 0
    var client = MIDIClientRef()
    var connRefCon: UnsafeMutableRawPointer?
    var endpoint: MIDIEndpointRef?

    // MARK: - Lifecycle -
    override func viewDidLoad() {
        super.viewDidLoad()
        // Do any additional setup after loading the view.
        print("viewDidLoad")
        // endpoint
        self.endpoint = MIDIGetSource(MIDIGetNumberOfSources() - 1)
        // USB device references
        let sources = getUSBDeviceReferences()
        if sources.count > 0 {
            self.source = sources.first!
        }
        print("source: \(source)")
        // create client
        DispatchQueue.global().async {
            self.createClient()
        }
    }
    // MARK: - USB Device References -
    /// Filters all `MIDIDeviceRef`'s for USB devices
    private func getUSBDeviceReferences() -> [MIDIDeviceRef] {
        var devices = [MIDIDeviceRef]()
        for index in 0 ..< MIDIGetNumberOfDevices() {
            print("index: \(index)")
            let device = MIDIGetDevice(index)
            var list: Unmanaged<CFPropertyList>?
            MIDIObjectGetProperties(device, &list, true)
            if let list = list {
                let dict = list.takeRetainedValue() as! NSDictionary
                print("dict: \(dict)")
                if dict["USBLocationID"] != nil {
                    print("USB MIDI DEVICE")
                    devices.append(device)
                }
            }
        }
        return devices
    }
    // MARK: - Client -
    func createClient() {
        print("createClient")
        let clientName = "Client" as CFString
        let err = MIDIClientCreateWithBlock(clientName, &client) { (notificationPtr: UnsafePointer<MIDINotification>) in
            let notification = notificationPtr.pointee
            print("notification.messageID: \(notification.messageID)")
            switch notification.messageID {
            case .msgSetupChanged: // Can ignore, really
                break
            case .msgObjectAdded:
                let rawPtr = UnsafeRawPointer(notificationPtr)
                let message = rawPtr.assumingMemoryBound(to: MIDIObjectAddRemoveNotification.self).pointee
                print("MIDI \(message.childType) added: \(message.child)")
            case .msgObjectRemoved:
                let rawPtr = UnsafeRawPointer(notificationPtr)
                let message = rawPtr.assumingMemoryBound(to: MIDIObjectAddRemoveNotification.self).pointee
                print("MIDI \(message.childType) removed: \(message.child)")
            case .msgPropertyChanged:
                let rawPtr = UnsafeRawPointer(notificationPtr)
                let message = rawPtr.assumingMemoryBound(to: MIDIObjectPropertyChangeNotification.self).pointee
                print("MIDI \(message.object) property \(message.propertyName.takeUnretainedValue()) changed.")
            case .msgThruConnectionsChanged:
                fallthrough
            case .msgSerialPortOwnerChanged:
                print("MIDI Thru connection was created or destroyed")
            case .msgIOError:
                let rawPtr = UnsafeRawPointer(notificationPtr)
                let message = rawPtr.assumingMemoryBound(to: MIDIIOErrorNotification.self).pointee
                print("MIDI I/O error \(message.errorCode) occurred")
            default:
                break
            }
        }
        // createInputPort from client
        self.createInputPort(midiClient: self.client)
        if err != noErr {
            print("Error creating MIDI client: \(err)")
        }
        // run on background for connect / disconnect
        let rl = RunLoop.current
        while true {
            rl.run(mode: .default, before: .distantFuture)
        }
    }
    // MARK: - Input Port -
    func createInputPort(midiClient: MIDIClientRef) {
        print("createInputPort: midiClient: \(midiClient)")
        MIDIInputPortCreateWithProtocol(
            midiClient,
            "Input Port" as CFString,
            MIDIProtocolID._1_0,
            &self.inputPort) { [weak self] eventList, srcConnRefCon in
                let midiEventList: MIDIEventList = eventList.pointee
                //print("srcConnRefCon: \(srcConnRefCon)")
                //print("midiEventList.protocol: \(midiEventList.protocol)")
                var packet = midiEventList.packet
                //print("packet: \(packet)")
                // NOTE: this only walks the words of the first packet; to visit every
                // packet in the list, iterate with eventList.unsafeSequence() instead.
                (0 ..< midiEventList.numPackets).forEach { _ in
                    //print("\(packet)")
                    let words = Mirror(reflecting: packet.words).children
                    words.forEach { word in
                        let uint32 = word.value as! UInt32
                        guard uint32 > 0 else { return }
                        // A MIDI 1.0 Universal MIDI Packet word is laid out as
                        // [message type | group] [status byte] [data 1] [data 2],
                        // so command and channel both come from the status byte.
                        let statusByte = UInt8((uint32 & 0x00FF0000) >> 16)
                        let midiPacket = MidiPacket(
                            command: statusByte & 0xF0, // e.g. 0x90 = note on, 0x80 = note off
                            channel: statusByte & 0x0F, // MIDI channel 0-15
                            note: UInt8((uint32 & 0x0000FF00) >> 8),
                            velocity: UInt8(uint32 & 0x000000FF))
                        print("----------")
                        print("MIDIPACKET")
                        print("----------")
                        midiPacket.printValues()
                    }
                }
        }
        MIDIPortConnectSource(self.inputPort, self.endpoint ?? MIDIGetSource(MIDIGetNumberOfSources() - 1), &self.connRefCon)
    }
}
class MidiPacket: NSObject {

    var command: UInt8 = 0
    var channel: UInt8 = 0
    var note: UInt8 = 0
    var velocity: UInt8 = 0

    init(command: UInt8, channel: UInt8, note: UInt8, velocity: UInt8) {
        super.init()
        self.command = command
        self.channel = channel
        self.note = note
        self.velocity = velocity
    }

    func printValues() {
        print("command: \(self.command)")
        print("channel: \(self.channel)")
        print("note: \(self.note)")
        print("velocity: \(self.velocity)")
    }
}