MIDI Controller Integration
-
- Posts: 10
- Joined: 14 Dec 2021 17:37
I would love for Echotopia to add support for MIDI controllers. Handling MIDI input natively, as most DAW software does, would greatly increase its functionality, especially as live performance software.
Yes, this would be great.
It's on our to-do list, but there is no clear roadmap for that feature yet, as it's very early. We are only in the prerelease phase now and moving towards release in the first quarter of this year. Then come the basic updates and fixes that the larger community of users will require, then a version for macOS, and maybe also an official release for Linux (not sure yet) around the middle of the year.
But it's definitely on our to-do list, and we have already researched the inclusion of two protocols:
MIDI 2.0: https://www.midi.org/midi-articles/deta ... y-exchange
and
OSC: https://opensoundcontrol.stanford.edu
I think those two will open up many possibilities for the use of Echotopia in installations, theater, architecture, and of course art, music, and live performances.
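To make the OSC side of that concrete, here is a minimal sketch of how an external controller could address Echotopia-like parameters over OSC. The address pattern `/echotopia/layer/1/volume` and the parameter it names are purely hypothetical examples, not an actual Echotopia API; the encoding itself (null-padded address, `,f` type tag, big-endian float32) follows the OSC 1.0 message format.

```python
import struct

def osc_pad(b: bytes) -> bytes:
    # OSC strings are null-terminated and padded to a 4-byte boundary
    return b + b"\x00" * (4 - len(b) % 4)

def osc_message(address: str, value: float) -> bytes:
    # Encode a single-float OSC message: address, type tag ",f", big-endian float32
    return osc_pad(address.encode()) + osc_pad(b",f") + struct.pack(">f", value)

# Hypothetical address: set a layer's volume to 75% from an external controller
msg = osc_message("/echotopia/layer/1/volume", 0.75)
```

A packet like this would typically be sent over UDP to whatever port the host application listens on; the point is just how little framing OSC needs compared to wrapping everything in MIDI CCs.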
First, we have to see how our users actually use Echotopia. Then we will be able to determine which functionality needs to be enhanced with outside communication, and the nature of that communication (what will be a note trigger, what will be a continuous controller, etc.), so we can map the variables better internally and make Echotopia really useful.
We first need to see Echotopia as a musical instrument to better understand how external control would benefit creativity and performance.