Deprecated, please use Chris Wilson’s WebMIDIAPI Shim
The new version allows you to generate midi events in AS3. The applet translates the AS3 midi events into midi messages that can be sent to your midi hardware or software.
The Alchemy version of Fluidsynth by Yoann Huiban has also been implemented. This makes it possible to use your own soundfonts in your app, so you no longer have to rely on the often quite poor midi sounds that ship with your soundcard.
You can check out the new version over here:
Code has been published to GitHub, see:
Or as a zip download at Google Code:
Generating midi events in AS3
I have updated the MidiEvent and MidiData classes. MidiEvent extends the regular flash.events.Event class. All you need to do is write some code that creates and dispatches the event, and add an event listener somewhere in your code that listens for these events.
AS3 midi events can be sent to the applet by passing the MidiData instance of the MidiEvent as an argument to the sendMidiData() method of the MidiConfig class. Once the midi data arrives in the applet it gets translated into a ShortMessage, which is then sent to the selected output device.
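Putting those two steps together, a minimal sketch could look like this. MidiEvent, MidiData and MidiConfig.sendMidiData() come from the description above; the event type constant, the MidiData constructor arguments and the `data` property name are assumptions:

```actionscript
// Sketch: dispatch an AS3 midi event and forward it to the applet.
// The MidiData constructor signature and MidiEvent.MIDI_EVENT are assumed.
import flash.events.EventDispatcher;

var dispatcher:EventDispatcher = new EventDispatcher();

// Listen for midi events somewhere in your code.
dispatcher.addEventListener(MidiEvent.MIDI_EVENT, onMidiEvent);

// Create and dispatch an event, e.g. a note-on for middle C at velocity 64.
var data:MidiData = new MidiData(0x90, 60, 64); // status, data1, data2 (assumed)
dispatcher.dispatchEvent(new MidiEvent(MidiEvent.MIDI_EVENT, data));

function onMidiEvent(event:MidiEvent):void {
    // Pass the MidiData on to the applet, which turns it into a ShortMessage.
    midiConfig.sendMidiData(event.data);
}
```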
New midi configuration panel
The MidiSetup class, now called MidiSystem, has been improved. This class instantiates helper classes that handle everything midi-related. One of them is MidiConfig, which allows the user to change or set midi devices, start or stop the applet, and save default midi devices.
Basically you only have to add a MidiSystem instance to your code to make your app midi-enabled.
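In its simplest form that could be as little as the following sketch; whether the constructor takes arguments, and whether MidiSystem is a DisplayObject you add to the stage, are assumptions:

```actionscript
// Sketch: midi-enable your app by adding a MidiSystem instance.
var midiSystem:MidiSystem = new MidiSystem();
addChild(midiSystem); // assuming it is a DisplayObject carrying the config panel
```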
To make your app easier to debug, I have improved the logging of midi traffic as well. If you click on ‘open midi configuration’ you’ll see all midi traffic appearing in the black square. Traffic such as notes is only logged while the midi configuration is open, because logging has some impact on performance.
Yoann Huiban did a great job creating a swc from the Fluidsynth code. He accomplished this using the very interesting technology of Adobe Labs’ research project Alchemy. Fluidsynth is an open source software synthesizer that supports SoundFont 2.0. Soundfonts are basically a set of samples that belong together.
For instance, a soundfont can consist of piano samples of a Steinway grand piano, covering all 88 keys at 5 different volume levels. When you play a note on your midi keyboard you provide the synthesizer with a note number and a velocity value, and Fluidsynth plays back the correct sample.
In this example there are 5 different samples available for every note number; each sample contains the same note (pitch), but played at a different volume. A piano tone gains more overtones as its volume increases, resulting in a brighter sound. In midi, velocity is measured in 127 steps, so roughly every 25 velocity steps a different sample is chosen and played back by Fluidsynth.
The swc version of Fluidsynth works roughly the same way, except that the sample is played back by ActionScript.
So let’s assume you play middle C on your keyboard at medium strength: you pass note number 60 and velocity 64 to Fluidsynth, and it returns the right sample as a ByteArray. This ByteArray is subsequently fed to the Sound class by handling its SampleDataEvent.
Unfortunately, by playing back audio this way you always end up with an unacceptable latency of at least 46 ms. This is because you need to provide at least 2048 samples every time the Sound object dispatches the SampleDataEvent; if you provide fewer, the Sound just completes and dispatches the Event.SOUND_COMPLETE event.
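A sketch of such a sampleData handler is shown below. The Sound and SampleDataEvent usage is the standard Flash API; `getNextSample()`, the call that asks the Fluidsynth swc for the next audio value, is a hypothetical helper:

```actionscript
// Sketch: streaming synthesizer output through the Sound class.
import flash.media.Sound;
import flash.events.SampleDataEvent;

var sound:Sound = new Sound();
sound.addEventListener(SampleDataEvent.SAMPLE_DATA, onSampleData);
sound.play();

function onSampleData(event:SampleDataEvent):void {
    // Provide at least 2048 samples per event, otherwise the Sound
    // completes and dispatches Event.SOUND_COMPLETE.
    for (var i:int = 0; i < 2048; i++) {
        var sample:Number = getNextSample(); // hypothetical Fluidsynth call
        event.data.writeFloat(sample); // left channel
        event.data.writeFloat(sample); // right channel
    }
}
```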
Because the Sound object expects the audio data to be at a 44.1 kHz samplerate, you can easily calculate the latency:
(1000/44100) * 2048 ≈ 46 ms
You can read more about latency in my earlier post.