Jack MIDI timestamping Was: Re: [linux-audio-dev] What parts of Linux audio simply suck ?

Benno Senoner sbenno at gardena.net
Mon Jun 20 08:33:05 EDT 2005

Stéphane Letz wrote:

> Good question....
> I would say that one of the main problems is that JACK MIDI does not 
> have the concept of time-stamps in the "future". The JACK MIDI "events" 
> are received in real-time (similar to what MidiShare does with its 
> Receive callback concept) but also have to be delivered in real-time.
> In the absence of any scheduling mechanism, I don't think the 
> MidiShare API could be rebuilt on top of the underlying JACK MIDI 
> buffer transport system.

I haven't looked at the JACK MIDI API yet, but what do you mean by 
time-stamps in the future?
MIDI events timestamped relative to the current audio fragment, or in 
the future in general, e.g. "deliver this MIDI event in 100000 samples 
starting from the current position"?

I think it's probably enough that JACK provides the former.
E.g. my audio fragment is 128 frames:
note on ch 1, note 60, velocity 100 at frame 20
note on ch 1, note 64, velocity 80 at frame 100

So the timestamp is always between 0 and 127 (fragment size - 1).
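To make the idea concrete, here is a minimal sketch of how a synth could consume fragment-relative timestamps. All names and types here are illustrative, not the actual JACK MIDI API: each event carries a frame offset within the current fragment, and the synth renders up to each event's frame before applying it.

```c
#include <assert.h>
#include <stddef.h>

/* Hypothetical event type: the timestamp is a frame offset into the
 * current audio fragment, so it always lies in [0, nframes). */
typedef struct {
    unsigned frame;         /* offset within the fragment */
    unsigned char note;
    unsigned char velocity;
} midi_event;

/* Render one fragment sample-accurately: synthesize up to each event's
 * frame, apply the event, continue from there. Instead of real audio,
 * this toy "synth" records at which frame each note starts. */
static void process_fragment(const midi_event *events, size_t n_events,
                             unsigned nframes,
                             unsigned *note_start /* indexed by note number */)
{
    unsigned pos = 0;
    for (size_t i = 0; i < n_events; ++i) {
        assert(events[i].frame >= pos && events[i].frame < nframes);
        /* ... render audio for frames [pos, events[i].frame) ... */
        pos = events[i].frame;
        note_start[events[i].note] = pos;   /* note-on takes effect here */
    }
    /* ... render the remaining frames [pos, nframes) ... */
}
```

With the two events from the example above (frames 20 and 100 in a 128-frame fragment), both notes start exactly at their timestamped frame, which is what "sample accuracy" means here.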

AFAIK VST does it that way, and sequencers are perfectly fine with that.

I don't believe future timestamping (with timestamp > fragment size) 
provides many advantages, since it leads to long queues and poor 
interactivity.
E.g. if I want to modify the MIDI data in real time, the first approach 
works better, since the latency before the modifications to the MIDI 
data become audible will be only 1-2 audio fragments.

I think the argument that we need a kernel-based MIDI API that 
timestamps events in the future because user-space processes cannot 
provide decent timing is moot these days, given the excellent RT 
performance of Linux. Also, sound generation is often internal (e.g. 
virtual MIDI instruments/samplers), so having JACK handle MIDI + audio 
can provide better timing (sample accuracy) than the ALSA sequencer, 
and it simplifies the programming of virtual MIDI synths/samplers (no 
RT MIDI sensing thread etc.).
Look at VST: two simple callbacks, processEvents() and process(), are 
enough to implement sample-accurate virtual instruments.
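The two-callback pattern can be sketched like this (types and names are illustrative, not the real VST SDK): the host first hands the plugin the block's events, then asks it to render the block, and the plugin splits the buffer at each event's frame offset.

```c
#include <assert.h>
#include <stddef.h>

/* Illustrative stand-in for a VST-style event: delta_frames is the
 * event's frame offset within the upcoming block. */
typedef struct { unsigned delta_frames; unsigned char note; } ev_t;

#define MAX_EV 64
static ev_t pending[MAX_EV];
static size_t n_pending;
static unsigned current_note;   /* 0 = silence in this toy instrument */

/* Callback 1: remember the events for the upcoming block. */
static void processEvents(const ev_t *events, size_t n) {
    n_pending = n;
    for (size_t i = 0; i < n; ++i)
        pending[i] = events[i];
}

/* Callback 2: render the block sample-accurately. Instead of audio,
 * out[f] records which note is sounding at frame f. */
static void process(unsigned *out, unsigned nframes) {
    size_t next = 0;
    for (unsigned f = 0; f < nframes; ++f) {
        while (next < n_pending && pending[next].delta_frames == f)
            current_note = pending[next++].note;
        out[f] = current_note;
    }
    n_pending = 0;   /* events were consumed for this block */
}
```

No separate RT MIDI sensing thread is needed: the host calls both functions from the audio thread, and the note changes land on exactly the frames the events name.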

Thoughts?
