MidiPad – Midi Events

Musical Instrument Digital Interface (MIDI) has been around since the early 1980s and the basic specification has changed little since. It is a standard by which electronic musical instruments and other devices can communicate with each other. In Marshmallow (V6.0 – API 23) Android actually got some good MIDI support, and in this series of articles we'll take a look at how we can create a MIDI controller app. For the non-musicians and those who have no interest in MIDI, do not despair: there will be some custom controls we create along the way which may still be of interest. In this article we'll take a look at how we actually send MIDI events.

So far we’ve looked at how we can discover available MIDI devices, and created the UI to allow the user to trigger MIDI events to be sent to a selected sound module. In this final article in this series we’ll hook those two things up.

Although we have discovered a list of available devices and presented them in a list to the user to select the desired output device, we can't send any data yet because we have not actually opened a connection to the device. We first created MidiController in this article but we left out the code to actually connect to the device. Let's start by adding this:

fun open(midiDeviceInfo: MidiDeviceInfo) =
        close().also {
            midiDeviceInfo.ports.first {
                it.type == MidiDeviceInfo.PortInfo.TYPE_INPUT
            }.portNumber.also { portNumber ->
                midiManager.openDevice(midiDeviceInfo, {
                    midiDevice = it
                    midiInputPort = it.openInputPort(portNumber)
                }, handler)
            }
        }

We first close any existing connection so we are only ever connected to a single device. Next we obtain the first 'Input' port for the device, and obtain its port number. If this seems confusing because we're going to output data to this port, think of it as a port on the device which accepts input rather than as a port that our app is going to output to, and the naming makes complete sense. We then use our midiManager instance to open the device. The second argument of openDevice is actually a MidiManager.OnDeviceOpenedListener instance, and we supply a lambda as the implementation of the single method in that interface, which is a callback for when the device has been successfully opened. Within that we store the newly opened device to a MidiDevice variable so that we can cleanly close it later, and then open the input port and store that to a MidiInputPort variable.

The code to close the connection is pretty straightforward:

fun close() {
    midiInputPort?.close()
    midiDevice?.close()
    midiInputPort = null
    midiDevice = null
}

We close both the MidiInputPort and the MidiDevice instances that we obtained during the open call.

Now that we can open and close the input port, we can begin sending MIDI events. The MIDI specification is a pretty sizeable document with many addenda, and it is far too much to cover in any detail here. For the purposes of this article we'll stick to two separate types of MIDI event: NOTE-ON and NOTE-OFF. These represent the start and end of a specific note, and we'll implement them by sending a NOTE-ON when we receive an ACTION_DOWN touch event for a specific pad, and then sending a corresponding NOTE-OFF when we receive an ACTION_UP event. This will start a note playing when the user touches a pad, and keep that note playing until the user lifts their finger off again.

MIDI supports 16 logical channels for each device, and each channel can be assigned a different sound, or voice. The first byte of any MIDI event is named the "status byte" and identifies both the event type and the channel; it is followed by a payload which varies depending on the event type.

The high nibble of the status byte represents the event type, and the low nibble represents the channel. The event type for NOTE-ON is 0x9n and NOTE-OFF is 0x8n (where 'n' is the channel). So 0x90 would be a NOTE-ON event for channel 1, and 0x8F would be a NOTE-OFF event for channel 16.
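As a quick sanity check, the nibble arithmetic can be sketched in isolation. This standalone snippet is not part of the app, and statusByte is a hypothetical helper:

```kotlin
// Hypothetical helper: build a status byte from an event-type nibble
// and a zero-based channel (0x0–0xF maps to MIDI channels 1–16).
fun statusByte(eventType: Int, channel: Int): Int =
    (eventType and 0xF0) or (channel and 0x0F)

fun main() {
    // NOTE-ON (0x9n) on channel 1 → 0x90
    println("%02X".format(statusByte(0x90, 0)))
    // NOTE-OFF (0x8n) on channel 16 → 0x8F
    println("%02X".format(statusByte(0x80, 15)))
}
```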

For NOTE-ON and NOTE-OFF this payload is two bytes: the first represents the note in the range 0x00–0x7F (0x00 is C in octave 0, 0x7F is G in octave 10, and Middle C is 0x3C); the second byte is the velocity – how hard the note was hit – ranging from 0x00 to 0x7F. So a NOTE-ON for channel 4, middle C, with mezzo-forte velocity would be:

0x93 0x3C 0x64
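To see where values like 0x3C come from, a note number in this scheme is simply the octave multiplied by 12 plus the semitone offset within the octave. A quick sketch (noteNumber is a hypothetical helper, not part of the app):

```kotlin
// Hypothetical helper: MIDI note number using the numbering above,
// where 0x00 is C in octave 0 (so middle C is octave 5, semitone 0).
fun noteNumber(octave: Int, semitone: Int): Int = octave * 12 + semitone

fun main() {
    println(noteNumber(5, 0))   // Middle C → 60 (0x3C)
    println(noteNumber(10, 7))  // G in octave 10 → 127 (0x7F)
}
```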

To represent this we’ll create a class named MidiEvent:

import kotlin.experimental.and
import kotlin.experimental.or

class MidiEvent constructor(
        private val type: Byte,
        private val channel: Byte,
        private vararg val payload: Byte) {

    val bytes: ByteArray
        get() = ByteArray(payload.size + 1) {
            when (it) {
                0 -> (type and STATUS_MASK) or (channel and CHANNEL_MASK)
                else -> payload[it - 1]
            }
        }

    companion object {
        private val STATUS_MASK = 0xF0.toByte()
        private val CHANNEL_MASK = 0x0F.toByte()
        private val STATUS_NOTE_ON = 0x90.toByte()
        private val STATUS_NOTE_OFF = 0x80.toByte()

        fun noteOn(channel: Int, note: Int, velocity: Int) =
                MidiEvent(STATUS_NOTE_ON, channel.toByte(), note.toByte(), velocity.toByte())

        fun noteOff(channel: Int, note: Int, velocity: Int) =
                MidiEvent(STATUS_NOTE_OFF, channel.toByte(), note.toByte(), velocity.toByte())
    }
}

There are a couple of factory methods which will create MidiEvent instances representing NOTE-ON and NOTE-OFF events. The constructor for MidiEvent takes a Byte representing the event type, a second Byte representing the channel, then a variable number of Bytes representing the payload. The factory methods wrap this, and looking at them shows how the varargs payload comes in handy. If we later add event types which take different sized payloads, this will make our lives much easier.

The nice little trick here is in the bytes getter. This constructs a ByteArray representing the event. This ByteArray is the status byte followed by the payload, and we use the init argument of the ByteArray constructor to initialise the ByteArray, using the when expression to prepend the status byte to the payload.
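To see the getter in action, here is a condensed, self-contained copy of the class (masks inlined so the snippet runs on its own) with a quick check that a channel-4 NOTE-ON for middle C produces the three bytes from earlier:

```kotlin
import kotlin.experimental.and
import kotlin.experimental.or

// Condensed copy of MidiEvent so this snippet runs on its own.
class MidiEvent(
        private val type: Byte,
        private val channel: Byte,
        private vararg val payload: Byte) {

    val bytes: ByteArray
        get() = ByteArray(payload.size + 1) {
            when (it) {
                0 -> (type and 0xF0.toByte()) or (channel and 0x0F.toByte())
                else -> payload[it - 1]
            }
        }
}

fun main() {
    // NOTE-ON (0x90), channel 4 (n = 3), middle C (0x3C), mezzo-forte (0x64)
    val event = MidiEvent(0x90.toByte(), 3, 0x3C, 0x64)
    println(event.bytes.joinToString(" ") { "%02X".format(it) }) // 93 3C 64
}
```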

At first glance this would appear to be a perfect candidate for a Kotlin data class. However, vararg constructors are not supported for data classes, so we cannot use one in this instance. This actually fools detekt (a Kotlin static analysis tool), which incorrectly gives a warning that this class can be converted to a data class.

Now we add a couple of functions to MidiController to generate and send NOTE-ON and NOTE-OFF events:

fun noteOn(note: Int, pressure: Float) =
        midiInputPort?.send(MidiEvent.noteOn(CHANNEL, note, pressure.toMidiVelocity()))

fun noteOff(note: Int, pressure: Float) =
        midiInputPort?.send(MidiEvent.noteOff(CHANNEL, note, pressure.toMidiVelocity()))

private fun Float.toMidiVelocity(): Int =
        (Math.min(this.toDouble(), PRESSURE_CEILING) * PRESSURE_FACTOR).toInt()

private fun MidiInputPort.send(midiEvent: MidiEvent) =
        midiEvent.bytes.also { msg ->
            send(msg, 0, msg.size)
        }

companion object {
    private const val PRESSURE_CEILING = 1.0
    private const val PRESSURE_FACTOR = 0x7F
    private const val CHANNEL = 0
}

There are a couple of support extension functions here. Float.toMidiVelocity() takes a pressure value, which is a Float in the range 0.0–1.0, and converts it to an Int in the range 0x00–0x7F. On some devices the calibration of the touch screen may exceed 1.0, so we clip it to that level to prevent the conversion overflowing the required range.
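The clipping behaviour can be sketched in isolation. Here pressureToVelocity mirrors the extension function above, rewritten as a standalone function purely for illustration:

```kotlin
// Standalone sketch of the pressure-to-velocity mapping:
// clamp the pressure to 1.0, then scale to the 0x00–0x7F velocity range.
fun pressureToVelocity(pressure: Float): Int =
    (minOf(pressure.toDouble(), 1.0) * 0x7F).toInt()

fun main() {
    println(pressureToVelocity(0.5f)) // 63
    println(pressureToVelocity(1.2f)) // clipped to the ceiling → 127
}
```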

MidiInputPort.send() gets the byte array of the MidiEvent object and sends it to the connected device via the MidiInputPort.

All that remains is to call this from the touch() function in MidiPad:

private fun touch(note: Int, motionEvent: MotionEvent): Boolean =
        when (motionEvent.action) {
            MotionEvent.ACTION_DOWN -> {
                midiController?.noteOn(note, motionEvent.pressure)
                true
            }
            MotionEvent.ACTION_UP -> {
                midiController?.noteOff(note, motionEvent.pressure)
                true
            }
            else -> false
        }

Everything is now there, and we can start practicing our Bach!

That concludes our look at MidiPad, although we may revisit some time in the future to add some additional features.

The source code for this article is available here.

© 2017, Mark Allison. All rights reserved.
