Accessing and receiving MIDI from multiple devices


I’m making a VST plugin that needs to send MIDI messages to i) external MIDI gear via the Elk dev board’s DIN-5 MIDI out; and ii) a Push 2 hardware device connected via USB. It also needs to receive MIDI from i) the Elk dev board’s MIDI in port; and ii) the Push 2’s USB MIDI port.

For case i) I guess I have to configure Sushi to have a MIDI out/in and connect it to the Elk board’s MIDI out/in using aconnect (is that right?)

For case ii) I don’t know how to do it. Ideally I’d connect directly from my plugin to the Push’s USB MIDI ports (that is what I do on my laptop and it works like a charm). However, on the Elk board I can’t access the MIDI devices directly, because doing so makes the plugin crash (and I’ve been told not to open the device directly from the plugin).

What is, therefore, the proposed way to do it? For ii) I don’t need super precise timing; I could live, for example, with an intermediary app outside the plugin (connected via OSC/gRPC) which does the actual sending (and receiving) to/from the Push, if that is the recommended way.

As an additional note, I can already connect to the Push’s display via USB without problems.


For case i): yes, as I understand it you will be able to output MIDI from Sushi, and then use aconnect to route Sushi’s output to a MIDI device, which could well be the Push 2, assuming it is class compliant.
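For illustration, the aconnect routing could look something like the following — the client names and numbers below are made-up examples, so check what `aconnect -l` actually reports on your board:

```shell
# List all ALSA sequencer clients and their ports
aconnect -l

# Route Sushi's MIDI output to the Push 2's MIDI input, by client name
# (names are examples - substitute what aconnect -l shows on your system)
aconnect "Sushi:0" "Ableton Push 2:0"

# Or by numeric client:port, e.g. if Sushi is client 128 and the Push 2 is client 20
aconnect 128:0 20:0
```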

There are plugins already installed which output MIDI, though we do not have configuration files which demonstrate that - the LV2 plugin eg-fifths, for example, outputs MIDI in response to MIDI input.
Or you could use any of our internal plugins which also produce MIDI.

For point 2: Regarding the Push 2’s display, I’d be keen to see that working, but I haven’t researched extensively how that would work. As far as I know, Ableton transmits the images to the display over USB as if it were a class-compliant composite MIDI device - so far I see no reason why it wouldn’t work on Elk.

But the images transmitted need to be rendered off-screen, and I haven’t researched how that would work on Elk. I know Ableton used Qt for the original Push 2, and they also have a JUCE example for achieving the same, but to what extent Qt / JUCE can be used on Elk to render these images I am not sure, since they may need dependencies that are not part of Elk.

Qt with the eglfs backend has been tested to work on the Elk Pi - perhaps that could also be used for the Push 2?

Ilias B.

To amend, specifically regarding the Elk Pi MIDI connection, whether through the board you have from the very early workshops (which had a DIN-5 socket) or the newly released Blackboard’s minijack MIDI: neither works as a MIDI port at the time of writing. We are working to implement full support for it. In the meantime, any class-compliant MIDI device connected over USB will work.

Ilias B.

Hi @Ilias, thanks for your answer.

I think I might have posed my question in a confusing way. What I want to do is send and receive MIDI data to/from 2 different devices from a single plugin running in a Sushi track. One device could be an external synth, ideally connected via the DIN-5 of the board, but a USB2DIN5 MIDI interface could work as well, no problem with that. The other device would be the Push 2 (which, as you mention, is controlled via USB and works as a composite device: on one side it has class-compliant MIDI in/out ports, and on the other a USB device to connect with the screen). I’m already using the Push 2 screen from the Elk board and it works nicely, no problem at all with that :slight_smile: The problem is using the Push 2 MIDI ports, because I can’t open the ports directly from code (I’m using JUCE) - my program crashes (and @Stefano told me I should not try to interface directly with the hardware).

Possible solutions I can think of:

  • Make my plugin have 2 MIDI inputs and 2 MIDI outputs. Then use aconnect to properly route the ins/outs to the Push 2 and the USB2DIN5 MIDI interface. But is that even possible? Can a single plugin be configured in Sushi to have 2 MIDI ins and 2 MIDI outs? This solution could be a bit problematic due to some VST MIDI handling issues, but I can try it.

  • Ideally I’d like to open the Push 2 MIDI device directly without passing through the VST MIDI handling code. I guess this is not possible from my plugin running inside Sushi. But I imagine it should definitely be possible from a simple Python script running outside Sushi? Then I could connect that script and my plugin using OSC, for example (or gRPC), and talk to the Push 2 that way. Do you think this would work?


Now I understand, yes, that’s a whole other issue!

I will check about the possibility of outputting MIDI to two separate “devices” from a plugin in Sushi and get back to you here - that’s not a situation I’ve needed to deal with before.

Because I expect that just using the separate MIDI channels is not sufficient in your case.

Meanwhile, yes, I see no reason why having a standalone program which communicates with your plugin shouldn’t work! You can either use shared memory between the processes (not something I’ve needed recently, but it should be possible), or perhaps a cleaner but less efficient solution, in particular if one is C++ and the other Python: OSC/gRPC, as you say.
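To make the OSC option concrete, here is a minimal sketch of the plugin-to-bridge hop using only the Python standard library. The address `/push2/note_on`, port 9003, and the int32-only OSC encoding are my own simplifying assumptions; a real bridge would more likely use a library like python-osc on the script side and then forward the decoded messages to the Push 2 via mido:

```python
import socket
import struct

def osc_pad(data: bytes) -> bytes:
    """Null-terminate and pad to a multiple of 4 bytes, as OSC requires."""
    data += b"\x00"
    while len(data) % 4:
        data += b"\x00"
    return data

def osc_message(address: str, *args: int) -> bytes:
    """Build a minimal OSC message carrying int32 arguments only."""
    msg = osc_pad(address.encode("ascii"))
    msg += osc_pad(("," + "i" * len(args)).encode("ascii"))
    for a in args:
        msg += struct.pack(">i", a)
    return msg

def parse_osc_message(packet: bytes):
    """Decode an OSC message built by osc_message() (int32 args only)."""
    addr_end = packet.index(b"\x00")
    address = packet[:addr_end].decode("ascii")
    pos = (addr_end + 4) & ~3          # skip padding after the address
    tags_end = packet.index(b"\x00", pos)
    tags = packet[pos + 1:tags_end].decode("ascii")  # drop the leading ','
    pos = (tags_end + 4) & ~3          # skip padding after the type tags
    args = [struct.unpack(">i", packet[pos + 4 * i:pos + 4 * i + 4])[0]
            for i in range(len(tags))]
    return address, args

if __name__ == "__main__":
    # "Bridge" side: a UDP server that would forward decoded messages to the Push 2
    server = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    server.settimeout(2.0)
    server.bind(("127.0.0.1", 9003))   # port is an arbitrary example

    # "Plugin" side: report a pad press as /push2/note_on <note> <velocity>
    client = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    client.sendto(osc_message("/push2/note_on", 36, 127), ("127.0.0.1", 9003))

    packet, _ = server.recvfrom(1024)
    address, args = parse_osc_message(packet)
    print(address, args)  # -> /push2/note_on [36, 127]
```

UDP is connectionless and lossy in principle, but over the loopback interface on the same board the latency and reliability should be more than adequate for control data.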

I’m very excited to see what you are making by the way, I was always intrigued by the prospect of harnessing the Push 2 for making an instrument involving the Elk platform, so I am very much looking forward to seeing your take on it!

Ilias of Elk

Just to get back to you on outputting to multiple MIDI “devices” from a single plugin - I checked, and indeed I don’t see any provision for it in Sushi - nor do I remember the notion being supported in the plugin APIs I am familiar with.

With that said, your option 2 is still a very practical way forward.

Ilias of Elk.

ok thanks,
I’ll try with the second option then!

I have big plans, but all this will happen in my free time only, so it will go slowly. My idea is to do some sort of sampler + sequencer using Elk and the Push 2 as an interface. Inspiration comes from the Octatrack and other similar machines. However, what I want to do to make it different is include integration with Freesound to access samples, and with Essentia for advanced audio analysis. This is a big undertaking, but I’m building on top of existing software which makes things really easy. This includes of course your Elk Audio OS and board, but also the Tracktion engine (which in fact has already implemented most of what I need to start).

So far I’m building a technology demo project that connects all the pieces together. I already have a prototype (using the Push 2) running fine on my laptop, and even on the Raspberry Pi (without the Elk board/Audio OS, and of course without multi-channel audio). Now I’m making it work on the Elk platform. The main problem is that I have to adapt/patch a number of things in the JUCE/Tracktion engine code so that the app can work with the RT kernel. This is turning out to be a bit more complex than expected, but @Stefano is helping me (in this thread). The other problem was the Push 2 connection, but the display bit is already solved*, and I hope the MIDI communication will be easily solved as well using this intermediate script which does the MIDI comms with the Push 2.

As soon as I have the basic working demo I’ll share the repository here so it serves as a “template” for using Push2+tracktion engine+ELK.

I’ll keep you posted!

* Drawing on the screen is working fine, except for displaying text. I guess this is because the patched Elk version of JUCE mocks the text-drawing functions to avoid a dependency on FreeType. I have not started trying to find a solution for that, but I guess it will be to include these libraries as well.


Hi Frederic, just chiming in with some opinions on MIDI :slight_smile: Very interesting to hear that data for the Push screen is sent as MIDI, I didn’t know that.

But basically, as Ilias said, MIDI output from a plugin in Sushi can only be discriminated by channel. There is no provision to send MIDI to different ports from a Sushi plugin.

If you want to send MIDI screen data to the Push, then accessing the device from a non-RT thread inside your plugin could be an option. As I suppose you’re not generating the MIDI data for the screen in the audio thread, it makes little sense to pass it into the RT thread just to be able to output it through the main Sushi loop. Outputting MIDI from a plugin in the audio thread is primarily for sending keyboard data to the next plugin or to external devices, where we want low latency.

Just off the top of my head, would something like this setup help?

  • Configure Sushi to have 2 MIDI input ports. These will show up as separate ALSA ports that you can connect to using aconnect.

  • In the Sushi JSON config file, you can connect both of these ports to the same track (with the raw MIDI option enabled), to get data from both ports to your plugin.

  • Does the Push MIDI output show up as an ALSA port? In that case you could also configure Sushi to have 2 MIDI output ports, route the output of the track to both output ports, and send MIDI messages on different channels. This will basically duplicate the MIDI data to both ports, but maybe that can still be ok?
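For reference, a sketch of the kind of config fragment the second point would mean. The field names here are from my reading of the Sushi docs and may not match your Sushi version exactly, and `main_track` is a placeholder, so check this against the example configs that ship with Sushi:

```json
"midi": {
    "track_connections": [
        { "port": 0, "channel": "all", "track": "main_track", "raw_midi": true },
        { "port": 1, "channel": "all", "track": "main_track", "raw_midi": true }
    ]
}
```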


Very interesting to hear that data for the push screen is sent as midi, didn’t know that.

The Push 2 screen is not controlled using MIDI; you connect to it as a USB device (I don’t know the details of these things, but I just use example code provided by Ableton, based on libusb, and it works nicely on Elk). The MIDI sent to the Push 2 is to control the state of buttons, pads, etc. Sorry if I didn’t make this clear.

I think discriminating by channel won’t work because the Push 2 uses the channel information to specify things like “LED blinking animation”, so I need to use all channels.

I’m not trying to send MIDI to the Push from the RT thread. @Stefano told me the only part of the plugin which runs in the RT thread is the processBlock function (I think that’s what it’s called - using JUCE here). In my prototype, MIDI communication with the Push happens outside this function, so I guess this is not the RT thread. However, the problem is that I can’t directly open the MIDI hardware device and send messages (as I do in my prototype when running on the laptop) because that crashes the app, and I was told this is to be expected because all interfacing with audio/MIDI devices should be done through Sushi. Hence, I think using a proxy app that runs outside Sushi and does the MIDI communication with the Push will be the most sensible solution. I’ll experiment and report here!

Hi @frederic,

I confirm that opening up ALSA ports from inside a plugin - when the host is doing something similar - is not a good idea. But having a separate process should work and indeed I believe that’s the best solution.


Just confirmed that I can write a Python script that runs on Elk Audio OS and uses the mido Python MIDI library to send/receive MIDI to/from the Push 2. Great! Thankfully, a couple of years ago I wrote a Python library precisely to interface with the Push :slight_smile: That will come in very handy now. Now I just need to implement the OSC/gRPC interface with my plugin.
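For anyone following along, a quick sketch of the raw MIDI bytes such a script ends up exchanging with the Push 2, using only the standard library (in practice mido builds and sends these bytes for you). The pad note 36 and CC 85 below are my own example numbers, not values confirmed anywhere in this thread:

```python
def note_on(note: int, velocity: int, channel: int = 0) -> bytes:
    """Raw MIDI note-on message, e.g. a pad press (status byte 0x90 | channel)."""
    return bytes([0x90 | (channel & 0x0F), note & 0x7F, velocity & 0x7F])

def control_change(cc: int, value: int, channel: int = 0) -> bytes:
    """Raw MIDI control change, e.g. lighting a button LED (status byte 0xB0 | channel)."""
    return bytes([0xB0 | (channel & 0x0F), cc & 0x7F, value & 0x7F])

# Example: pad note 36 at full velocity, and a button CC (numbers are assumptions)
print(note_on(36, 127).hex())         # -> 90247f
print(control_change(85, 127).hex())  # -> b0557f
```

Note the channel parameter in the low nibble of the status byte - this is the per-channel behavior (e.g. LED animations) mentioned above, which is why channel-based routing in Sushi wasn’t an option.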


Got Push 2 communication working with a simple proxy OSC Python app! It seems to work quite nicely. I’m not sure if the latency will be problematic when using the Push 2 pads as input for MIDI notes, but the response seems pretty fast, so it looks good enough!

Now I need to fix the remaining issues I have with the tracktion engine and I will finally have a tech demo of all parts of the system connected and running. Will let you know and post code when done. Thanks for your help.


Thanks for the update, look forward to it!

In my experience OSC is fine for control-signal communication over a wired connection; the latency is low enough to be unnoticeable when playing. After all, that’s what OSC was made for primarily!

It was just me that misunderstood. Glad that you got it working, and looking forward to seeing the results of your work.

Just FYI, I published the code of my project so far.
I have a minimal technology demo which is usable (including the MIDI bridge app to connect to the Push 2), but I still have many issues (mostly related to the Tracktion engine and the RT kernel) that need fixes. I decided to make it public because I guess a lot of time may pass before I manage to fix everything, and I think what I have might already be interesting to others.


Hi Frederic,

I came across your thread by searching for ‘octatrack’, to see if there is an ongoing project to mimic the Octatrack but with more ins and outs, and overall fewer limitations. It’s sort of a dream come true seeing this thread and your octopush project!

I will follow your project with close attention! I am not a very capable programmer, but I’ll try my best to get involved and help it progress.

The fun part is that I came across Elk OS about two years ago, looking for an RTOS to do exactly that! But with limited resources I abandoned the idea and was waiting for the money to simply buy one. Now this!



Hi @Ganjouille,

Thanks for your kind words, and I’m happy you find this interesting. Unfortunately, the octopush project is now semi-abandoned because I came across too many difficulties to solve at once. Instead, I’ve been concentrating on a sampler without sequencing capabilities which runs on Elk (you can check it out here), and recently I’ve also been doing some experiments writing a sequencer. At some point both could be combined, and maybe the octopush project could be revived.



Well, I understand your decision. It would have been nice to use the existing Tracktion engine, but it looks like the Elk environment needs something more tailored to the specifics of its kernel, doesn’t it?

I am quite enthusiastic and back to learning C programming. I want to get better at it because I really like the idea of being able to harness the technology I use and tailor it to my needs. Especially in these disruptive times, where these technologies shape our world more every day. Neither bad nor good, but both. Quite brutal, in my opinion.
Yes, this goes way beyond the topic, but sometimes I like digressing (like most of what I say here…). Apologies!

Source looks like a good starting point - I’ve come across it already! I have quite a clear picture of the architecture of my ideal instrument, and it’s very inspired by the Octatrack… it needs some of its limitations, but a little more interfacing and a bit more intuitiveness could be a good thing. But the holy grail for it is to be open source. Knowledge is powerful! Hehe…

Have a nice day!