Audio sync should be more stable because DAWs prioritize their audio engine above everything else, including MIDI, so if you have it set up right it should lock to whatever the audio is doing. That said, once you're ready to record I'd still disable all plugins on that track while recording. I know that sucks if you like to jam your arrangements live.
If you're layering the same digital sounds with relatively minimal processing on one, then a few samples off can make a difference. But otherwise, I agree. If you can hear a flam, and don't want it, try a nudge... or only use one of the sounds for the attack part of your stack. That's a classic solution - one layer for the stack's attack, and use the other layers for body, thickness, decay, etc.
Also, people might be surprised how sloppy MIDI jitter could be on slaved vintage sound sources - more than 1 ms. And people made some GREAT music with that jitter present.
Thanks for that awesome video Ricky, nice comeback from the last one :)) This is a complex subject, but let me give my professional opinion.
* What you are comparing is (A) the MIDI clock generated by Ableton and (B) the clock/pulses generated by the multiclock VST plugin. For now let's ignore any additional latency/jitter created by the multiclock device, by the playing device (Digitakt), or by the audio interface (we can assume they are very low).
* I'm sorry to say, but the MRCC has nothing to do with any of this - it just forwards the clock generated by Ableton (and I'm sure it does it well).
* What we want is precision between (A) or (B) and the DAW timing, which is the timing at which the analog audio is sampled by the audio interface.
* Let's look at (A): on your computer it runs in a different thread/priority than the audio stuff, so as soon as your CPU gets busy it won't be called "in time", which creates jitter and latency. On top of that it is transferred over USB-MIDI, which doesn't have the highest priority either - so extra jitter if the USB bus is busy (lots of USB devices, big data transfers, for example).
* Let's look at (B): it runs in a VST plugin, which is not only processed in real time with the other audio stuff, but more importantly the generated audio pulses are in perfect time with the other audio coming out of your DAW. So inherently the audio pulses you send to the multiclock are in perfect time with the rest of the DAW's output. Any jitter/latency in setup (B) is therefore from the stuff we ignored, i.e. within the multiclock itself, the playing device, or the audio interface.
Now, will OSes and DAWs get better so that nobody needs this kind of complex setup anymore? I bloody hope so, and U-SYNC is the first step towards it. Simon
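Simon's point (B) can be sketched in a few lines: when the clock is rendered as audio pulses, the pulse positions are computed in samples, so they are locked to the rest of the DAW's audio output by construction. A minimal Python illustration (assuming the standard 24-PPQN MIDI clock rate; not the actual multiclock plugin code):

```python
def pulse_sample_positions(bpm, sample_rate, num_ticks):
    """Sample positions of 24-PPQN clock pulses rendered as audio.

    One quarter note lasts 60/bpm seconds and contains 24 MIDI clock ticks.
    """
    samples_per_tick = sample_rate * 60.0 / (bpm * 24)  # may be fractional
    return [round(i * samples_per_tick) for i in range(num_ticks)]
```

At 120 BPM and 48 kHz a pulse lands exactly every 1000 samples; at other tempos the positions round to the nearest sample, so the worst-case "jitter" is half a sample period rather than whatever the OS scheduler feels like.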
For those interested, Simon has made a cool little box (a MIDI and audio metronome, linked to your DAW). I've had it in my setup for a year and it is awesome - check out his channel and support forum.
The Multiclock has a VST plugin!? And the plugin generates the clock? Then you are still relying on the PC, which isn't going to be as accurate as hardware can be.
@@elitebeatagent8993 I guess the current batch is sold out. Try contacting him. He is very responsive and kind. And believe me - the Midronome is worth it!
Hey Ricky, been watching your videos on the MultiClock, which I also heavily use. Love your channel and your approach! The power of the Multiclock is being able to shift things in time with knobs until it "feels right" and then hitting record. To me it's priceless just for that! Every plugin you add, and every EQ changed to linear phase etc., will affect the timing of everything, including the Multiclock, as Ableton's internal latency will be affected. Just setting it once and expecting it to stay in sync is not achievable IMO. Often I check it's tight and then adjust by feel so the groove feels better than 100% on grid - I very often do this on bass synths and toms. Don't sell it, you will regret it!
Device jitter, and small variable clock offset, is something that will always exist with the midi protocol. The midi message bytes are sent serially and received using a buffer which is then processed by the microprocessor in the device. That thing runs at its own clock rate. So a midi clock message will never be processed instantly when it arrives. The receiving device still has to figure out that it received a clock message and act accordingly. In other words, the Multiclock can sort of transmit the clock message in sync with a sample, but nothing can interpret that with sample accuracy because of how the midi protocol works. I think the accuracy might also get worse if you send additional midi messages like CC over the same port.
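To put numbers on the serial-transmission point above: a MIDI DIN byte takes 0.32 ms on the wire, which already spans about 15 audio samples at 48 kHz before the receiving microcontroller has even parsed it. A quick back-of-the-envelope check in Python:

```python
MIDI_BAUD = 31250      # MIDI 1.0 DIN transmission rate, bits per second
BITS_PER_BYTE = 10     # 8 data bits framed by a start and a stop bit

def byte_time_ms():
    """Time to transmit one MIDI byte (e.g. a one-byte clock message)."""
    return BITS_PER_BYTE / MIDI_BAUD * 1000.0

def samples_per_byte(sample_rate):
    """How many audio samples elapse while one MIDI byte is on the wire."""
    return byte_time_ms() / 1000.0 * sample_rate
```

So even a perfectly timed transmit can't produce sample-accurate reception: the message itself is ~15 samples wide at 48 kHz, plus whatever the receiver's buffering and processing add.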
There is eventually a bottleneck somewhere, right? So many tiny variables. I felt like I was trying to figure out a Manet painting by looking at the canvas stitching 😂
The only way to have perfect sample-accurate clock sync would be to base it on the word clock everywhere in the chain. Every device in the chain would need to know about tempo changes at the audio-rate level. I believe every connected device would probably also have to report its own internal latency so that latency compensation for all chains can be tracked, because even what is perceived to be "no" latency for a digital synth is at least one sample of latency, and what that amounts to depends on the internal sample rate. MIDI is very bad for syncing clock since it is only based on a low-rate tick with no time codes. You could probably improve on that a lot, but it would no longer be compatible with MIDI clock.
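For a sense of scale on the "one sample of latency" point: a sample at 48 kHz is about 0.02 ms, and per-device latencies in a chain add up. A rough Python sketch (the device latencies below are purely illustrative numbers, not measurements of any real gear):

```python
def sample_period_ms(sample_rate):
    """Duration of a single sample in milliseconds."""
    return 1000.0 / sample_rate

def chain_latency_ms(sample_rate, per_device_samples):
    """Total latency of a chain, given each device's reported latency in samples."""
    return sum(per_device_samples) * sample_period_ms(sample_rate)
```

A hypothetical chain of devices reporting 1, 32, and 64 samples of internal latency at 48 kHz totals about 2 ms, which is why the comment argues every device would need to report its latency for compensation to work.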
@@pleggli Interesting, I wonder if a unit speaking directly to the DAW through another protocol, functioning off something more like CV, might be a solution?
This gave me PTSD flashbacks of trying to get two kick drums to align across devices. The phasing in and out was an interesting resulting sound, but also soul-crushing. It's stuff like this that drives people to the dark arts of Eurorack drums.
Plus there is latency on the devices you are trying to sync. It never ends. I found the best is just have pulse audio tracks in my DAW and output direct to my clock in on my sequencer/drum machine. No midi.
Thank you, Ricky, for the information you provided. I have also been researching for a few years how to optimally synchronize outboard devices with the computer, and I have come to the realization that latency cannot be completely "overcome," but different approaches can be used, as you mentioned as well. In your specific case, I believe the drift is caused by the computer itself and not by the multiclock, because software record input monitoring is enabled. For an optimal "latency-free" recording and overdub experience, I would recommend an outboard mixer with direct monitoring on each channel, to avoid internal processing as much as possible. Regarding browsing presets while using the multiclock, I have sorted it out by installing VSL Server on the computer and offloading all my plugins from the DAW process, thus eliminating any processing that could affect the clock coming out of the DAW, and I can say that it works really well. Zero drift and phase issues with live tracking and overdub recording. Best regards, Robert Trifunovic
It really got me that, when MIDI-slaved to an MPC, my first PC used to display a wandering tempo, or 89.9 instead of 90 lol. Things haven't improved with all the D/A stuff in our rigs either.
@@johansebastianfauno5526 PC computers and notebooks aren't ideal for electronic music production, primarily due to latency issues and system interrupts. These devices were designed for general-purpose computing rather than real-time audio processing, which requires consistent, uninterrupted performance.
1. Latency and jitter: Standard computers often struggle with maintaining low and consistent latency, which is crucial for real-time audio processing. This can lead to noticeable delays or irregularities in sound output.
2. System interrupts: Modern operating systems constantly manage various tasks and processes, causing frequent interruptions that can disrupt the smooth flow of audio data. These interrupts can result in audio glitches, dropouts, or buffer underruns.
3. Non-real-time operating systems: Most consumer PCs run on operating systems not optimized for real-time processes, making it challenging to guarantee consistent performance for audio applications.
4. Hardware limitations: Standard PC hardware, including sound cards and drivers, may not be designed to handle the demands of professional audio production, leading to suboptimal performance.
This is why dedicated hardware gear remains popular in electronic music production. These devices often feature:
1. Purpose-built chip architectures: Designed specifically for audio processing, ensuring low latency and high reliability.
2. Real-time operating systems: Optimized for consistent performance and minimal interruptions.
3. Dedicated DSP (Digital Signal Processing) chips: Specialized processors that can handle complex audio calculations more efficiently than general-purpose CPUs.
4. Guaranteed "quality of service": Hardware units can provide consistent performance without the variability found in multi-purpose computers.
The same principle applies to other fields requiring real-time processing, such as telecommunications.
Cell phone towers, for instance, use dedicated hardware rather than general-purpose processors like Intel CPUs. This ensures reliable, low-latency communication essential for maintaining call quality and network stability. While modern PCs have improved significantly and can be optimized for audio production, dedicated hardware often remains the go-to solution for professional-grade electronic music production due to its reliability, consistency, and purpose-built design.
Gotta go old school on sync; workstation is master clock, all midi and audio clips consolidated into long sections (especially in Live), use automation over patch changes, use lots of pre roll, render or dump all midi parts to audio and immediately edit for sync. Abandon all illusions about MIDI - only audio recordings are ‘real’ and in sync. Put video on separate MIDI TC slave system. Finally, use tempo sync’d delays as a reference only. Once you know what you want delays to do, set times manually or with automation and let them free run. I hear that recording the cowbell from an 808 onto track 23 of a Studer ATR can drive the arpeggiator of a Juno 101 pretty well…
Here is my setup with the Multiclock in ableton that has been rock solid for me: I turn off all external midi sync in ableton because you want the devices to get the sync from the multiclock and not ableton via usb. Then use an external audio effect on the channel you have your multiclock plugin or audio pulse running through to route the audio to the multiclock. I find this keeps the latency compensation better when running plugins. You can then run one of your midi outs on the multiclock into the midi in on the MRCC to clock your outputs on the MRCC.
Thanks for going to all the trouble to clearly and methodically document everything! I have been doing similar, but I'm so much less familiar with Ableton, so I can't pin down how much of the variation is something I changed on one track without realizing it.
I’ve been using Ableton Live since version four and I will tell you that you have undertaken a fool’s quest. Ableton and external hardware synchronization has always been an epic nightmare. I gave up on caring, and I use Ableton as basically either all-software in the box or controlling hardware out of the box, but as soon as I try to get the two to work together and line up, everything goes to shit. The only answer for me has been to record and fix timing after. It’s not all Ableton’s fault, either. It depends on what you’re synchronizing to it, and how that device handles clock changes. MIDI being a serial protocol doesn’t help either. Regardless of what your sequencer’s internal timing may be, MIDI clock is limited to 24 pulses per quarter note of timing resolution, which was great in 1982 but just doesn’t work today. MIDI 2.0 can’t come fast enough.
Yeah, I had a feeling this was the case. 😂 I have tried other DAWs in the past, and one thing that always stuck out to me was how everything “just worked“ when it came to synchronization. I felt that both Bitwig and Logic gave me a more stable clocking experience. But I never really tested anything or questioned it further like I did today haha
Another great video Ricky! I got the MRCC after your last video and it’s amazing to have such a flexible MIDI patch bay, and especially to have the four virtual MIDI ins and outs. Game changer for me. But about MIDI jitter - I genuinely believe this to be an Ableton problem. I’ve done something similar to this in Logic and it was bang on, but yeah, Ableton *always* did weird stuff. Not even just clock, but even just sequencing MIDI notes. Funny thing, however, is that Elektron Overbridge seems to always be locked in for me regardless of DAW. Perhaps you could talk to some people and start building some ideas for an Elektron sync box? 😅
Also I just realized that in this video you’re recording with Monitoring set to Auto and not Off, without trying adjusting the Keep Latency settings. 🤔 There’s just *way* too much stuff to keep track of when recording into Ableton!
When you're tinkering with Ozone, the latency of the plugin gets altered, which results in a tiny audio dropout from Ableton. For sure the Multiclock then gets a hiccup...
@@RJ1J Agreed, during recording I'd turn off plugins or use the DAW's delay compensation. Not sure if Ableton has delay compensation or not, or if it's any good.
Interesting analysis of MIDI clock with the two devices and your DAW. It's hard to imagine that a half millisecond would be noticed. With MIDI keyboard latency, somewhere between 5-10 milliseconds I can start to notice a lag. That said, if you take a song at 120 BPM, you get 500 milliseconds per quarter note (two notes per second). I use Logic, which has a default MIDI PPQ resolution of 960 (I believe Ableton is the same?). So for each quarter note occurring every 500 milliseconds, there are 960 subdivisions the DAW is using to determine accurate MIDI clock timing. 500 ms / 960 = 0.521 ms per MIDI clock subdivision, or roughly a half millisecond. So the actual (DAW) clock data used to determine MIDI timing is only accurate, best case, to a half millisecond (at 120 BPM -- clearly this will change based on the song's tempo). It doesn't seem to me you are going to be more accurate than the clock frequency. And as you demonstrated, when the CPU in the DAW is heavily loaded, other CPU threads may delay the processing of the DAW/MIDI clock. I like your comment about missing the forest by focusing on the trees -- this analysis is great, but I suspect if one doesn't heavily load the DAW's CPU while recording with MIDI clock, any minor variance of + or - half a millisecond would be small enough to not be noticed. BTW, I ended up getting an MRCC 880 after seeing your video last week -- didn't realize this device was available. Really a nice MIDI router/merger/splitter -- thx for the tip!
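The tick-duration arithmetic in the comment above can be reproduced in a couple of lines of Python (960 PPQ is Logic's stated default; whether Ableton matches it is the commenter's assumption):

```python
def tick_ms(bpm, ppq):
    """Duration of one sequencer tick: one quarter note split into ppq parts."""
    return 60000.0 / bpm / ppq
```

At 120 BPM a 960-PPQ tick is ~0.52 ms (the "half millisecond" in the comment), while a raw 24-PPQN MIDI clock tick is ~20.8 ms, which shows how much coarser the wire protocol is than the DAW's internal grid.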
Again, he didn’t test multiple external sequencers. When I used a USB MIDI hub with one piece of gear, I could get it and Ableton synced after a little bit of tweaking. You would think I could then do the same process with other gear and run 3-4 external sequencers in sync, but it always fell apart. Now I use the Multiclock, and at the moment I have 5 pieces of gear all running their internal sequencers locked to Ableton. When I first got the Multiclock I would also measure the latency and adjust it that way to get it super tight. Now if I add a new piece of gear I just dial it in by ear. Sometimes a few milliseconds here and there get those separate sequences grooving together. Plus I can always nudge later in Ableton, if need be, after I’m done jamming and recording.
I had the ERM Multiclock plus a MOTU 128. After years of sync issues, and testing over many years between Ableton, Cubase and Bitwig, I came to the conclusion that Ableton was by far the worst for sync issues. So I sold Ableton Live and the ERM and a load of older hardware gear. I now do most of my live hardware compositions in Bitwig. I've never been happier. ;o)
Most desktop computer operating systems are technically not real-time OSes, so they will always be affected by CPU load. A real-time OS (RTOS) is designed to work within hard constraints and so can guarantee specific response times. I think a dedicated hardware clock with an RTOS would be best, but I don't know which of the hardware devices out there are designed that way. Someone mentioned the Midronome but I've never used it.
I think Ensoniq keyboards' MIDI jitter and timing are better than almost everything available, and that was in the late '80s/early '90s - it must have been because processing MIDI and running the synth were the machine's core functions.
Most MIDI devices like MRCC use simple single core microcontrollers and are bare metal programmed. That is, there is no operating system. However, some higher performance MCUs can run an RTOS which might be helpful if there are a lot of tasks to juggle.
This stuff really gets me… we have such ridiculously powerful computers these days, but we’re still struggling with audio and MIDI latency. The response times could easily be achieved on modern OSes if they just got out of their own damn way for real-time processing.
Something like Linux with a realtime kernel can do things on dedicated deadlines but it still won’t make latency super-low. ASICs or FPGAs will always win for stuff like that where microsecond-level timing is required. Once you’re looping through an interface or general-purpose OS for tight timing, forget it unless it’s simply an output and all the instruments are in-the-box. RME gear in general has great latency for tracking and stuff but if you really want phase-accurate you need to quantize or manually align stuff. Hybrid kinda sucks because of this.
I'm a bit late to the comment party, but as many other commenters have now suggested, there are a LOT of different choke points on both the computer running the DAW and all of the gear you're plugging into. Trying to track down where these transient issues are happening on any given take is likely the path to insanity (although arguably still easier than troubleshooting a complex HDMI audio setup). Some concepts that might help provide a bit more context... First, even fast modern CPUs are still usually running one instruction at a time per core, which means the processor is rapidly jumping between dozens of tasks. It's not just your running programs but all of the OS services, and everything done over USB (audio interface included) requires a not-insignificant amount of processor overhead. Processing streams while individual apps and plugins are all greedy for high-priority slices means that the more you listen for audio artifacts, the more you will absolutely hear them. This might actually be one of the most understated advantages DAWless setups have; the processors might be far less powerful, but they are not running a multitasking operating system. The second thing to keep in mind is that MIDI clock itself is a very janky thing. There's no concept of syncing to an atomic clock, and the speed a processor runs is directly influenced by the voltage it's receiving unless there's a dedicated clock source; does your MIDI gear have clock batteries that need to be replaced? Mine doesn't. So everyone is doing their best, but nobody can actually agree on how quickly time is flowing. When I was first learning the MIDI protocol as a developer, I was shocked to learn that there is no tempo message. Instead, you send timing "ticks" down the line at a rate of 24 messages per quarter note. That's right... time is defined in real time, based on the speed at which those ticks arrive. Your MIDI synth is just a hungry hippo for ticks. If you stop sending ticks, the music just stops.
To me, this is wild. This is also why you see the tempo occasionally jump up or down by 0.1 when you assume it should be stable. What you're seeing is ticks arriving too quickly or too slowly, or not arriving at all. Now, the other elephant in the room is that MIDI works at 31.25k baud, and the timing messages are not given special priority. So, if you have a lot going on, that bandwidth can get eaten up. That's an area where the MRCC can really help, by defining subsets of devices to talk to each other. Still, just because MIDI can support a given speed under ideal conditions doesn't mean that the client device is powerful enough to receive them that quickly. Right now I am talking with the Hologram folks because Chroma Console literally shuts off if you send it too many messages too quickly. Which is not amazing, but this is the world we're in. I'm not intimately familiar with the audio clock but if there's software running on your laptop to interpret what it's sending, then all you're really doing is shifting the point of failure from MIDI to your CPU's ability to prioritize messages from your audio clock. TL;DR: I wouldn't pay extra for the audio sync thing, and I wouldn't do CPU intensive things while recording. And yes, if it sounds good...
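The "ticks define the tempo" point from the two comments above is easy to demonstrate: a receiver can only infer BPM from the spacing of incoming ticks, so a fraction of a millisecond of jitter on one interval momentarily shifts the computed tempo. A small Python sketch (24 PPQN assumed):

```python
def bpm_from_tick_interval(seconds):
    """Tempo a receiver infers from the spacing between two clock ticks.

    MIDI clock carries no tempo value; at 24 PPQN a quarter note spans
    24 tick intervals, so tempo = 60 / (interval * 24).
    """
    return 60.0 / (seconds * 24)
```

At 120 BPM the ideal interval is ~20.83 ms; a single tick arriving just 0.1 ms late reads as roughly 119.4 BPM for that instant, which is why hardware tempo displays wobble by a tenth of a BPM even when the average tempo is rock solid.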
The main issue with those sub-10 ms changes is twofold: 1. Phase issues. If you are somehow trying to double up the same sound, which is not especially likely in this context, the phase can be totally off and thin things out. 2. More importantly, we CAN hear the substantive difference in transient response: if, say, 2 percussive elements are clocked separately and continuously overlapping, you can get a subtle shifting of where a really, really tight flam is happening. In a majority of cases I don't think this will really cause an issue, and in some it could even be pleasant, but in certain situations it could certainly be heard, and in some it might be undesirable. I think the worst-case scenario would be DJing - if you were to have similar parts overlapping poorly, it could sound really shit.
Oh yeah, I've heard that - putting the same tone or cowbell on 2 machines and they're never super dead-on. You hear the phase for sure. Would this happen if it was machine-to-machine MIDI? I guess it might depend on the machine, right? Haha
@@RickyTinez It would prolly depend on a few different things, but most of all on whatever hardware/software you were using. There's stuff like JACK Audio or a few other similar solutions too, which just go software-to-software; you could probably test something like that and it'd give you an idea whether it's common inside a PC, and if it is, it'd prolly come up computer-to-computer as well. I somewhat suspect all you'd deal with is latency, though, rather than jitter. Oh, and I'm realizing just now I might've misunderstood - if you mean between 2 MIDI hardware units, I absolutely think it would come up; my guess is it has to do with subtle analogue jitter. Most all clocks of the time-telling variety have jitter - that's why we've got the atomic clock, though even it has subtle jitter, I'm told. Generally, in the digital realm, my understanding is it's all 1s and 0s and the jitter doesn't really happen usually; the electricity coming in is all stabilized by the power supply to facilitate the extremely detailed operations, so that every 1 is a 1 and every 0 a 0, nothing in between. Then again, if I was all right, the atomic clock would be digital LOL, so I gotta be missing something. TL;DR: Clocks are actually gnarly AF 🤣
Hey, thanks for all your effort on this interesting research. Without knowing for sure, I think the Multiclock is so popular because it works great with older drum machines or synths that have built-in sequencers. That's where it really shows off its strengths, IMO. For instance, I like to use the audio out on channel 1 of the ERM to trigger the SH-101 or the JX-3P. Of course, that only makes sense if you're into those old-school internal sequencers. Plus, the built-in swing in the ERM is pretty nice in this context.
Great video, thanks Ricky. FWIW when I’m sending clock from my Digitakt to my Syntakt over MIDI DIN without a computer in sight, I can see the tempo on the Syntakt fluctuate by +/- 0.1 bpm throughout. Same with the Digitone. I think there’s inherent jitter in MIDI clock at the hardware level, quite apart from when a DAW is involved. But… it doesn’t matter because jitter that’s measured in around 1ms or so isn’t going to make an audible difference. And in reality MIDI clock sync on Elektron boxes is tighter than a camel’s arse in a sandstorm. If the MRCC is getting jitter via USB from a DAW down into the 1ms range then it’s going to be almost as solid as Elektron gear at the hardware level as a clock source. That’s good news because MRCC 880 is already on my shopping list and this test confirms it’s a good buy.
I own a Multiclock and I did jitter tests using different audio interfaces - a Focusrite 18i20 and two RME units (Babyface Pro FS and UCX II) - and this is where I found the RMEs differ from the others, being super stable with audio jitter, so the Multiclock performed very well. I also generally stay away from any other activities while recording, having a separate dedicated PC for junk.
I think you're seeing two different things: 1. Regular old MIDI jitter, which can vary in severity based on hardware and DAW - USB vs. MIDI DIN, I/O drivers, MIDI drivers, etc. I've done extensive tests and some DAWs do better than others. MIDI DIN seems to be better than USB. 2. On the Ozone tests, you are probably flipping through presets with varying values of lookahead. When you instantiate a plugin or preset with lookahead, it's going to momentarily throw Live's plugin delay compensation (PDC) out of whack.
Like the last one, it’s hard to take this video in good faith and not as cynical clickbait, but you’ve always seemed like a good dude, so I’m giving the benefit of the doubt and just assuming you have a bunch of things twisted.
1. You’re not comparing the ERM to the MRCC, because the MRCC doesn’t do anything to the clock other than route it. You’re comparing the ERM to your computer’s clock, with the MRCC as a dongle between your computer and the outside world. This is an apples-to-oranges comparison.
2. Every receiving device behaves differently, so measuring what your receiving device is doing is immaterial. Some devices are notorious for syncing very poorly no matter how perfect the clock coming at them is. A more thoughtful test would measure the actual clock data against an objective measurement device like a frequency counter or equivalent, to see what’s happening at the clock level.
3. Buddy, why are you running a mastering chain in this scenario? Come on, guy!
4. Ableton has always been notorious for being bad and weird with MIDI timing. Run the same test in other DAWs and you’ll likely see radically different results. It’s a shame but it’s true!
I use a USAMO to get around all this, BTW, and it’s generally rock solid.
Thank you for this video, I love this kind of stuff! Maybe there's a way to do some sort of shoot-out between different options?
- Expert Sleepers USAMO and Silent Way units
- CLOCKstep:MULTI
- Innerclock Systems
- ACME-4
- Circuit Happy Missing Link (because how stable is Ableton Link?)
- The M4L clock devices?
So many options but nothing fixes the issues.
I’m curious how the Akai MPC compares. Back in the day the MPC was famous for its internal MIDI timing. There was this test of using all of its voices to trigger the same rimshot at the same moment. Still sounded like a flanged rimshot, but way, way better than any MIDI sequencer in existence at the time…
@@Aristoper The MPC had its own internal sound engine; the Atari ST did not. MIDI is too slow a serial data interface, which gives sloppy timing as soon as you pump a lot of data through it (like a whole song, or a lot of pitch-bend data). The MPC used to have 4 separate MIDI outs, so you could use one MIDI out exclusively for external drums. The Atari also had MIDI extension boxes with multiple outs.
The Digitakt is always slightly off on the first bar when sending or receiving MIDI. Would love to see a test comparing audio over USB (Overbridge and class-compliant) vs. a solid interface like RME.
The RME is supposed to have the best jitter reduction. I wonder if it would correct issues if plugged into the UCX MIDI ports? I'm just throwing something out here - I'm not really familiar with using the MIDI ports on the UCX II. But I'm still curious about the mio by iConnectivity lol.
Interesting. What audio interface and connection method did you use? I don’t have any sync issues with the Multiclock via a MOTU 16A over Thunderbolt 2. Did you try with any other outboard modules besides the Digitakt? Did the audio pulses going to the Multiclock stay in sync?
Maybe the Multiclock has a much more precise internal clock? I would assume so, because with audio sync it's still a clock generated by that same CPU - only the audio output may or may not have lower latency itself. I would assume the internal clock on the Multiclock, compared to ANYTHING, is dead on.
Agreed, if clocking from the MIDI hardware, multiclock or sequencer or whatever, all of the hardware will be in sync as the port to port latency and jitter of a MIDI router is negligible. It’s the computer that monkeys it up, and needs the audio clock to compensate. This would be a good thing to test.
@@DarrylMcGee I totally agree. I was writing some code for MIDI sequencing and it's surprising how easy it is to get unstable clock times on an OS where the CPU is doing a billion other things already. In fact, I think he should try clocking the DAW from the multiclock.
Thank you Ricky, that was another video in a long history of helpful videos. Thank you for the time and effort you put into it. (The rest of this comment is probably not helpful but...) I understand people want to eliminate latency in all its variants, but be aware of the return on effort / diminishing returns / system variance. Depending on the gear I have active, the systems I'm syncing and the configuration of the system in total, if I can get a latency of under 10 ms round-trip from hitting a control/keyboard/pad to hearing the sound, I'm good. Can I get it lower by tweaking? Sure. Do I care? Not really. The bugaboos like phase cancellation are noticeable, and I deal with them using my ears if they are bad enough to matter. I'll spend the time on what I care about. If (for you) it's jitter and latency, that's cool, but I'll be making music more often than not. Have fun!
Latency compensation values reported by plugins are not set in stone; they can vary depending on what is configured within the plugin. One obvious case where this behaviour is noticeable is with the introduction of a limiter plugin, which was exactly the case here when you started switching between plugin settings in Ozone - I clearly saw that a Limiter module was present in some of the presets you browsed through. I believe in Ableton there's a mouse-over mechanism that lets you see the exact reported latency compensation value at a given time. I know, for example, that if you take FabFilter Pro-L 2 and switch between the limiter modes, say between "Modern" and "Transparent", there's a very significant latency variation involved.
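The size of this effect is easy to estimate: a plugin reports its latency in samples, and that converts directly to milliseconds. A quick Python sketch (the 2048-sample figure below is purely an illustrative lookahead value, not Pro-L 2's or Ozone's actual number):

```python
def pdc_shift_ms(reported_samples, sample_rate):
    """Plugin-reported latency (e.g. limiter lookahead) in milliseconds."""
    return reported_samples / sample_rate * 1000.0
```

A hypothetical jump from 0 to 2048 samples of lookahead at 48 kHz is a ~43 ms shift the DAW's delay compensation suddenly has to absorb mid-recording, which dwarfs the sub-millisecond MIDI jitter being measured in the video.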
I didn't want to make my original comment too long, but clearly you have to be very careful with the conclusions you come to with the kind of test you did in the video, because there are a lot of variables involved in the complex task of latency compensation across different use cases. In this case it appears evident to me that the issue you experienced is directly related to the introduction of Ozone, which effectively altered its latency compensation value in real time while you were recording as you switched presets. This was inevitably bound to cause issues, but the good news is there's almost certainly nothing wrong with your MIDI devices - you simply have to be mindful of what tools you choose to ensure "compatibility" for a given use case.
That was my suspicion too: that the Ozone plugin has some presets with look-ahead latency, that latency is reported to the DAW's PDC, and changing presets causes the PDC to recalculate.
Did the preset change in Ozone maybe change the latency of the plugin, so that Live had to recalculate and reapply its delay compensation chain? I suspect that a mastering plugin can do that, while an instrument plugin probably doesn't in most cases.
When you load different presets in Ozone, the PDC (plugin delay compensation) amount in samples changes. This will impact the audio engine and the position of the audio recording. MIDI is a serial command protocol running at 31250 bits per second, which means roughly 3125 bytes a second. A MIDI clock message is 1 byte, and one byte takes 0.32 ms. So it's normal to have a fluctuation of around 0.16 ms if no other MIDI data is sent. This can be bigger if more MIDI data is sent, like active sensing, CC controller data or song position.
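The byte-time arithmetic in the comment above checks out; here's a quick sketch of it, assuming the standard DIN-MIDI framing of 10 bits per byte:

```python
# MIDI DIN runs at 31250 baud; each byte is framed as 10 bits
# (1 start + 8 data + 1 stop), so throughput is 3125 bytes/s
# and one byte occupies the wire for 0.32 ms.

BAUD = 31250
BITS_PER_BYTE = 10  # start bit + 8 data bits + stop bit

bytes_per_second = BAUD / BITS_PER_BYTE  # 3125.0
byte_time_ms = 1000 / bytes_per_second   # 0.32 ms per byte

print(bytes_per_second, byte_time_ms)  # 3125.0 0.32
```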
@rickytinez could you test using multiple Elektron devices at the same time? I could never get them synced up correctly without using the Multiclock. I would have the Elektron gear running, creating sequences etc., and after 10 minutes or so they would be out of sync. The CPU spikes are definitely the main culprit. I think the audio engine has a higher priority than MIDI output, which is one of the reasons why the Multiclock works better.
I think the thing is that you have to choose one, learn its idiosyncrasies, and work with it. None of them will be perfect and the more time you spend hunting for the perfect tool the less time you are making music.
Now what happens when you start changing the buffer size after everything is set up? I may be tripping, but once I got the Multiclock set up, everything still stayed locked pretty dang tight even when changing buffer size.
Hey Ricky, great video as always. If I am not mistaken, at around @9:06 you made a comment about 20% of a millisecond being 1/20 of a millisecond. That is wrong: 20% = 20/100 = 1/5, which is way bigger than 1/20. Thinking in percentages, 1/20 is 5%.
Ricky, I really appreciate these videos on DAW/hardware sync. I have an RK-008 and I've adjusted latency per output for all of my drum machines and sequencers. I've gotten them all really good sounding/looking/grooving. The MAIN point to drive home first and foremost is setting Ableton's monitoring to Off. So I assume you're direct monitoring through your interface? I'm having great results, but it's frustrating when you're using hardware sends (Ext Inst) in Ableton. Side note: Renoise, to my ears, has the best sync to hardware. I'm convinced of it.
I just bought a used ERM Midiclock (which at €130 is a lot less pricey than the Multiclock). I had a problem with the sequencer providing the clock (a Polyend Seq with the BPM jumping anywhere between 169 and 183 instead of sitting at 176), and ANOTHER sequencer (Squarp Hapax) being the slave but creating an involuntary MIDI loop even though every Thru setting was OFF. So in my case, the Midiclock allowed me to physically unlink my sequencer cable mess with a tiny little device right next to my keyboard. I do have a MIO XL MIDI router (with all MIDI connectors at the back, mind you; I do not want a cable mess on my table), but it's hidden somewhere in my rack and I wouldn't want to crawl into my rackspace every time I need to start/stop the clock. So for me, an external ERM Midiclock was indeed the best option. And everything connected now shows exactly 176 BPM, not some random number between 169 and 183.
The MRCC is known to have high lag and jitter for a device of its kind. My clock is an audio pulse sent out of the DAW through an Erica Synths CV-to-MIDI clock converter; this is simple and sample accurate. When just using hardware, my MPC 4000 is the clock, and it is almost as tight as a dedicated sync box.
A couple of comments: 1) Jitter happens when MIDI beat clock events are not sent at precise intervals. Latency is irrelevant for the issue of MIDI beat clock jitter because there is no real-time response to MIDI clock events. In other words, gear that receives MIDI beat clock will not do anything immediately in response to MIDI clock. Instead, those events will be observed and their intervals averaged over time to calculate a tempo. 2) What you've tested here is mostly the ability of Live to deliver a jitter-free clock. The proof of that is the varying tempo in the Digitakt. That's why you didn't see any difference in the jitter between the two devices. I can tell you from experience that other DAWs generally do a better job of this than Live. 3) Changing Ozone presets will change the latency compensation in Live, so you should expect that to cause glitches when doing it in real time. CPU usage, either via plugins in Live or with other apps running, is not the issue.
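A minimal sketch of the interval-averaging idea described in point 1 (this is an illustration, not any specific device's firmware): a slave receiving 24 PPQN beat clock averages tick intervals to estimate tempo, which smooths out individual jittery ticks.

```python
# Sketch of tempo estimation from MIDI beat clock ticks (24 PPQN).
# Individual tick timings jitter, but averaging the intervals
# yields a stable tempo estimate.

PPQN = 24  # MIDI beat clock ticks per quarter note

def estimate_bpm(tick_times_s: list[float]) -> float:
    """Average tick interval -> BPM. One beat = 24 ticks."""
    intervals = [b - a for a, b in zip(tick_times_s, tick_times_s[1:])]
    avg = sum(intervals) / len(intervals)
    return 60.0 / (avg * PPQN)

# 120 BPM => one tick every 60/(120*24) s; add some fake +-0.5 ms jitter:
ideal = 60.0 / (120 * PPQN)
ticks = [i * ideal + (0.0005 if i % 2 else -0.0005) for i in range(48)]
print(round(estimate_bpm(ticks), 1))  # close to 120 despite the jitter
```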
Nice follow-up. There IS a difference between DAWs; MidiGAL lets you measure this, and Bitwig has way less jitter than Live out of the box. Then there is the issue of audio drivers: high-end audio interfaces will have a superior clock. On PC there is a utility called LatencyMon which shows the hardware interrupts interfering with audio timing; this lets you optimize performance. In general, things like graphics drivers and the TCP/IP stack, but also Bluetooth devices, can have some influence on audio timing. The ERM should be rock solid during the whole recording; other devices, like the ERM, also reset and get back in sync after a bar. For a recording situation this may not be crucial, but in a live situation you just want to stay in sync.
I wonder how much of that jitter can be attributed to floating point precision issues. Any time I see small decimal jumps in computing, float precision is usually the culprit
To me, sub-1 ms jitter is huge if you're recording more than one MIDI drum instrument at once, or trying to record a MIDI drum synth alongside VSTis: phase issues, transient issues, etc. I also think a test with a DAW that isn't notorious for terrible timing like Ableton would be better, as well as testing with multiple different MIDI devices. It also depends on what type of MIDI we are testing, whether it's clock or just note data. A good MIDI interface will take into account the buffers in USB and timestamp the MIDI, sending it a little early and then releasing it down the DIN cable at the right time. The other thing to consider is the resolution of MIDI when dealing with clock. There are a lot of factors to test here to come up with anything meaningful on the results side. And yes, what device you are sending MIDI to does matter!
Someone told me you should only use the internal clock of the ERM to get it right and tight, not the plugin, but I never tried it out. Maybe it works better.
I experimented with this as well some time ago, and I came to understand that even internal sequencers are not tight. For example, my Elektron Machinedrum mk2 is digital, but it still drifts slightly from the grid.
Maybe try testing on another DAW? Ableton has given me out-of-sync issues before! Also, external digital devices like your Digitakt or any other digital device have MIDI latencies. It would be worth testing MIDI tracks from DAW to DAW and seeing the results.
I bought the Midronome, put it up against a friend's Multiclock, and basically ended up with a similar set of stems :D I don't want to leave the comfort of Ableton, but surely something like Pro Tools handles latency better than my favorite DAW... right?
Ozone 10 and 11 hit my CPU hard, and I have an i9-1200k which has 24 cores if you include virtual cores. Either way it's got a lot, and I have 44 GB of RAM, and I don't get what's so taxing about Ozone presets. I know it's doing a lot of processing, but nothing else I've used since I got this CPU about 3 months ago does this. Maybe that could affect your test more negatively than others, and if you tried something else you might get different results? Ozone does sound damn good though, a big improvement from the first. Hope that helps; at least it's something to consider. Take 'er easy, player.
Hello Ricky, I think the problem comes from Live 12 stability problems. Many users have noticed the same issues as you, including me: Live 12 systematically drops out when you open a plugin, load presets, or open the internet or another application. It's a recurring problem. I have the ERM Multiclock and I have exactly the same audio dropout problem, and I think the ERM and the MRCC are not the cause.
Hey ! I use an ESI M4UeX connected via USB port to my Push Standalone to sync my different machines. There is the possibility of managing the latency of the M4UeX on the Push Standalone. This might seem like a weird question, but if I upgrade to one of the machines you present, could I achieve more stability with my MIDI sync?
Please check what happens when you use the Multiclock as clock source and not triggered by Ableton. It could be that the root cause for all the issues is Ableton and not the devices.
The Multiclock can clock analog and DIN sync too, and can also be clocked by any clock pulse. But: the ACME Multiclock has MIDI outs and TRS trigger outs for every channel. That's nice. And double/half tempo ;)
I know I'm super late to the party and you might not see this but, when you showed the DT's clock jumping .1 ms it got me thinking. Have you ever tried syncing the DT with transport only and not clock? But set the tempo manually to match the DAW tempo. That way the hardware is being clocked by itself which in theory is going to be much more stable than through MIDI (it's too late here to test this myself now but just wanted to leave this here).
You actually don't need any of these boxes to achieve great sync! I'm syncing my entire studio by sending clock via the headphone output on my audio interface to my Keystep Pro. You can use a simple square wave loaded in a VST sampler or a dedicated sync plugin from any of the manufacturers of these sync boxes. Personally I use Silent Way Sync by Expert Sleepers, because it also reliably sends START/STOP messages. The important thing is that your main sequencer has a CV clock input, and that it can forward clock to its MIDI outputs.
This! How reliably does this work? Is it right on the money with the grid? I saw another comment mentioning the Innerclock Sync-Gen II, but that requires purchasing a discontinued hardware sync box that is at least $500, and I shouldn't have to spend that kind of money to get good sync.
@@Dan_Delix It works as reliably as is possible. Meaning, it stays on beat as long as you don't have big CPU spikes, and incoming audio will of course be late because of the latency on your audio interface. The way I deal with this is by having a plugin after my clock source that delays the clock signal by a certain amount so that my audio will be recorded one bar late. To clarify, the latency might make the recorded audio late by 0.63 bars, and I make sure that instead it will be late by precisely 1.0 bars. Of course this might cause some trouble mixing hardware and software synths, but at least all your synced VST delay FX will work correctly. Also, remember that your hardware sequencer expects a loud and clean clock signal. Sync might drift if the signal is too weak, or the waveform might not be a true square wave anymore if you are distorting it. On my particular setup, I leave the clock output at maximum in my software and set my headphone amp to -10 dB. This setting has been the most stable for me. Last thing: my Keystep Pro is happy to abide whenever I send it a STOP signal, but currently it freaks out when I send it a START signal. That's why I only send it STOP, and manually arm the sequencer before the next take. I'm trying to resolve this with Arturia support at the moment. Good luck! 😃
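For anyone curious what that clock audio actually is, here's a rough sketch of rendering a 24 PPQN pulse train as samples; the parameters here are assumptions for illustration, not Silent Way's actual implementation. Because the pulses are ordinary audio, they stay sample-locked to everything else the DAW plays.

```python
# Sketch: render a 24 PPQN clock as an audio pulse train.
# A sampler playing a square-wave one-shot per tick, or a dedicated
# sync plugin, produces essentially this kind of signal.

SAMPLE_RATE = 48_000  # Hz, assumed
PPQN = 24             # pulses per quarter note

def clock_pulse_train(bpm: float, beats: int, duty: float = 0.5) -> list[float]:
    """One sample-accurate square pulse per clock tick."""
    period = int(round(SAMPLE_RATE * 60 / (bpm * PPQN)))  # samples per tick
    high = int(period * duty)
    one_tick = [1.0] * high + [0.0] * (period - high)
    return one_tick * (beats * PPQN)

samples = clock_pulse_train(bpm=120, beats=1)
print(len(samples))  # one beat at 120 BPM: 24 ticks * 1000 samples = 24000
```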
Would be interesting to know how the MRCC and Multiclock compare to a standard MIDI interface from, let's say, a sub-$250 audio interface? Because I'm surprised by the observations you made: last time I tried to sync with MIDI clock from Ableton over a standard MIDI interface, the synced sequencers showed tempos all over the place. Maybe the MRCC has some internal clock-smoothing magic? 🙂
Fun stuff, I always hate seeing tempo flicker. You should add a baseline recording of your Digitakt directly into the DAW, with the tempo run from the Digitakt. Also, math in 8ths is easier to compute.
Hmmmmmmmmm, OK, so I have to assume the preset changing in Ozone was a bit more than just CPU spiking as the culprit of your MIDI sync falling off with either device. I experimented with switching through presets in Ozone like you were doing, and each preset has a different total sample latency, with a pretty huge variance between them. It looked like you had latency compensation enabled while running the Ozone preset-swapping tests, so it's most likely that Live having to reorient itself to the new maximum latency after each preset switch was causing the audio buffer to stumble. The MIDI clock running in the background was probably keeping fairly steady, but because the audio buffer stumbled around it, it wound up falling out of sync with Live's timeline.
Computers suffer from DPC latency; you can easily measure it, and you should, as laptops are prone to it. As long as we're unable to produce a perfect clock and use it in perfect conditions, we're unable to produce a perfect device. My DrumBrute Impact is really bad; its clock is like a rubber band stretching during the 16 steps.
I am a beginner and these latency problems are driving me crazy. I just have one question you guys need to answer for me. If I have some MIDI stuff playing (drums, for example) and I want to play my guitar to it, monitoring through Ableton with effects, I jam over my drums until it fits. In that case, I need to record WITH latency to make it sound like what I played before, since I adapted my guitar playing to the latency for the timing. If an audio clip does not need to fit an existing track directly, I correct the delay in my clip. If I record a part for a song, I leave the delay in, if I want to keep the timing. Does this make sense?
Ozone causes more clocking problems than most plugins, since all presets do not have the same amount of latency (it depends on which modules are loaded in a given preset). The clocking can't keep up with the changing latency compensation.
Yes, each device has its own jitter. Elektron were not generally known as the most accurate, but they have greatly improved since the MD days. Innerclock Systems (who make the Sync-Lock stuff) have tested plenty of gear (you can look up Innerclock Systems Litmus to see their methodology and results, but remember they are selling an ERM-style product): an MPC 60 has a jitter of .33 ms, an MPC 4000 .6 ms, and the MPC X a whopping 2.1 ms. The only thing with no jitter is analogue sequencers like you might see in modular or the old stuff. Mostly I don't find jitter a problem unless you are overdubbing; then it can be a disaster.
I have also had wildly different results on different computers (sometimes USB sync is OK, but sometimes there is no fixing it), which is why I favour an audio sync solution despite it being slightly annoying.
Why the solution is to use Ableton as a slave and avoid using its MIDI clock to sync machines: The main problem with synchronization tests using a computer's MIDI clock is that jitter (variation in clock precision) is considerably higher compared to dedicated clocks. This is due to how operating systems handle USB MIDI communication, where data is processed in a more complex and parallel manner, introducing variable latencies and fluctuations that affect tempo precision. In contrast, traditional MIDI works in a simpler and more direct way, which contributes to greater stability. Here are some key reasons why the computer's MIDI clock is not ideal and why Ableton should be used as a slave instead of a master: Jitter in USB MIDI: When using the MIDI clock of a DAW like Ableton, even under low load, the typical jitter is in the range of 2-5 ms. As the project load increases, this jitter can reach 5-10 ms or more, making the clock inconsistent and affecting the groove and synchronization. For example, if you change presets or introduce effects in the mastering chain, latency spikes can occur, causing rhythmic elements like the kick to fall out of sync, resulting in offsets of up to a quarter of a beat or more. Comparison with dedicated clocks: Devices like the E-RM multiclock offer extremely low jitter, of only ±1 sample when synced with audio at 48 kHz, which is similar to what the E-RM midiclock⁺ offers. These devices guarantee precise synchronization without the fluctuations that often affect USB MIDI. Additionally, since the clock is external, it is not affected by latency generated by the computer's internal processing, always maintaining a stable tempo. More precise hardware machines: Well-designed equipment, such as Elektron's, handles the MIDI clock much better than a DAW, with jitter in the range of 1-2 ms, which is already considerably more precise than a computer under load.
Latency accumulation in MIDI chains: If you have multiple devices connected in series, latency accumulates linearly, which can negatively affect synchronization. Therefore, it is recommended not to connect more than 2 or 3 devices in series to avoid timing issues. Conclusion: If you are looking for precise and stable synchronization, it is essential to use a dedicated device like the E-RM multiclock as the clock master and avoid relying on the computer's MIDI clock. USB MIDI latency and jitter will negatively affect your timing, while a dedicated clock will ensure solid and fluctuation-free synchronization. Using Ableton as a slave and an external clock as the master is the best option to maintain perfect synchronization in your setup.
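Putting the quoted figures on one scale makes the gap vivid (these are the numbers claimed above, not independent measurements):

```python
# Compare the claimed jitter figures on a common scale:
# +-1 sample of audio-sync jitter at 48 kHz versus the low end
# of the 2-5 ms jitter described for USB MIDI clock.

SAMPLE_RATE = 48_000  # Hz

audio_sync_jitter_ms = 1 / SAMPLE_RATE * 1000  # one sample, in ms
usb_midi_jitter_ms = 2.0                       # low end of the 2-5 ms claim

print(round(audio_sync_jitter_ms, 4))                    # 0.0208
print(round(usb_midi_jitter_ms / audio_sync_jitter_ms))  # 96
```

Even at the most charitable end of the USB MIDI range, the audio-synced clock would be roughly two orders of magnitude tighter.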
The best bet might be to have an external clock that starts/stops the gear AND the DAW, with no processes getting in the way. I'm sure the issue is the DAW in the end, no matter what DAW; it's doing a lot to be a stable clock source.
You should do the same test with Logic. I was using Logic for years and really started having sync issues when switching to Ableton. But to be fair, I also have issues when syncing hardware to my Multiclock in master mode. Ableton is great for in-the-box MIDI tasks, but it handles MIDI in and out of the box very poorly compared to other solutions. Maybe also try the same test with an MPC as master.
OK, a couple of things. Things are still way more complicated. First of all, jitter is any deviation in, or displacement of, signal pulses in a high-frequency signal. The Multiclock audio plugin hosted in a track of the sequencer is not excluded from the automatic latency compensation engine of your sequencer. Sticking to the way you've tested: I think Ableton is able to adjust the start time of the Multiclock plugin if you are using the External Device plugin. Ableton measures the delay between the MIDI trigger sent and the audio received from the HW device. What I'm describing is not in any way mentioned in the manual of the Multiclock. Why? That is not how the Multiclock is supposed to work or be used. The computer doesn't sync external MIDI gear with a sample-accurate audio signal; the audio plugin only syncs the Multiclock. Back to your computer. First of all, you do not browse the internet or open applications that are not managed by the sequencer. I don't know much about the Apple SoC, but you are requesting the impossible. Opening your browser means opening a different IRQ/DMA channel on your processor to start the browser app. The app is stored on disk, which is dependent on the CPU. Then the browser populates the screen using the integrated GPU (pictures and animation). The browser will read the disk again for the cached browser data and load it into memory. It will also refresh the open tabs of pages from the internet (which involves operating your NIC, with interrupts to the CPU). Your keyboard and mouse strokes are going to another application in the foreground. I'm trying to keep stuff simple here, but you are not making this easy. :)) How is Ozone 9 configured? Is it using OpenGL? Your mouse actions in the GUI of Ozone clearly interrupt your processor. In short, I see so many issues in your test and on your computer. I would be in tears if my Wintel machine behaved like your Mac. All in good spirit!
About your outcome with the Multiclock being so similar: what if this jitter was an issue back when the Multiclock came out, but computers have improved so much that a clock generated by the DAW and transmitted via USB is similarly stable? You made me curious: I will do a similar test. I have no MRCC, but I can try clocking the DT via the Blokas Midihub and then via the Multiclock. So far I can confirm the Multiclock generates, in conjunction with the DTII, around 1 ms of jitter. Not sure what the source is. Without external clock the DTII also jitters, but a lot less: around 0.25 ms. When I clock the OT MkII with the Multiclock, it jitters less than the DTII; the maximum offset I found was 0.6 ms and the median was more like 0.3 ms. I'll have to test this a bit more extensively.
TL;DR: I don't think under 1 ms matters unless you're trying to hit 2 of the same sounds at the same time. Ask "does it sound good?" Good! Now hit record ❤
Audio sync should be more stable because DAWs are focused on their audio engine before anything else, including MIDI, so if you have it set up right it should lock to whatever the audio is doing. That said, once you are ready to record I'd still disable all plugins on that track while recording. I know that sucks if you like to jam your arrangements live.
@@X-101 If you disable a plugin in Ableton Live, it keeps the latency; you have to delete it. I'm using v11, don't know if it changed with v12.
If you're layering the same digital sounds with relatively minimal processing on one, then a few samples off can make a difference. But otherwise, I agree. If you can hear a flam, and don't want it, try a nudge... or only use one of the sounds for the attack part of your stack. That's a classic solution - one layer for the stack's attack, and use the other layers for body, thickness, decay, etc.
Also, people might be surprised how sloppy MIDI jitter could be on slaved vintage sound sources: more than 1 ms. And people made some GREAT music with that jitter present.
Yea the MPC60 sequencer has some slop in it but it gives it the life
Thanks for that awesome video Ricky, nice comeback from the last one :))
This is a complex subject, but let me give my professional opinion.
* What you are comparing is (A) the MIDI clock generated by Ableton and (B) the clock/pulses generated by the multiclock VST plugin.
-> For now let's ignore any additional latency/jitter created by the multiclock device, by the playing device (digitakt), or by the audio interface (we can assume they are very low).
* I'm sorry to say, but the MRCC has nothing to do with any of this - it just forwards the clock generated by Ableton (and I'm sure it does it well)
* What we want is precision between (A) or (B) and the DAW timing, which is the timing the analog audio is being sampled by the audio interface
* Let's look at (A): on your computer, it runs in a different thread/priority than the audio stuff, so as soon as your CPU gets busy, it won't be called "in time", and that will create jitter and latency. On top of that, it is transferred over USB-MIDI, which does not have the highest priority either, so there's also extra jitter if the USB bus is busy (lots of USB devices or big data transfers, for example)
* Let's look at (B): it runs in a VST plugin, which is not only processed in real time with the other audio stuff, but more importantly the generated audio pulses are inherently in perfect time with the other audio coming out of your DAW. Any jitter/latency using setup (B) is therefore from the stuff we ignored, i.e. within the multiclock itself, the playing device, or the audio interface.
Now will OSes and DAWs get better so that nobody will need this kind of complex setup anymore? Yes I bloody hope so, and U-SYNC is the first step towards it.
Simon
For those interested, Simon has made a cool little box (a MIDI and audio metronome, linked to your DAW).
I've had it in my setup for a year and it is awesome; check his channel and support forum.
Simon's Midronome is an amazing MIDI clock and a great alternative to the aforementioned two.
Greetings from Macedonia
The Multiclock has a VST plugin!? And the plugin generates the clock? Then you are still relying on the PC, which isn't going to be as accurate as hardware can be.
Simon's Midronome may be awesome, but it's not available anywhere. What is an alternative that can actually be purchased today, not in Oct 2024?
@@elitebeatagent8993 I guess the current batch is sold out.
Try contacting him. He is very responsive and kind.
And believe me - the Midronome is worth it!
Hey Ricky, been watching your videos on the Multiclock, which I also heavily use. Love your channel and your approach! The power of the Multiclock is being able to shift things in time with knobs until it "feels right" and then hitting record. To me it's priceless just for that! Every plugin you add, and also EQs being changed to linear phase etc., will affect the timing of everything including the Multiclock, as Ableton's internal latency will be affected. Just setting it once and expecting it to stay in sync is not achievable IMO. Often I check it's tight and then adjust by feel so the groove feels better than 100% on grid; I very often do this on bass synths and toms. Don't sell it, you will regret it!
💯
Could not agree more!
Device jitter, and small variable clock offset, is something that will always exist with the midi protocol. The midi message bytes are sent serially and received using a buffer which is then processed by the microprocessor in the device. That thing runs at its own clock rate. So a midi clock message will never be processed instantly when it arrives. The receiving device still has to figure out that it received a clock message and act accordingly.
In other words, the Multiclock can sort of transmit the clock message in sync with a sample, but nothing can interpret that with sample accuracy because of how the midi protocol works.
I think the accuracy might also get worse if you send additional midi messages like CC over the same port.
There is eventually a bottle neck somewhere right? So many tiny variables. I felt like I was trying to figure out a Manet painting by looking at the canvas stitching 😂
This is correct. Aftertouch on some USB devices can quickly overrun a MIDI serial buffer if it isn't properly throttled.
The only way to have a perfect sample accurate clock sync would be to base it on the word clock everywhere in the chain.
Every device in the chain would need to know the tempo changes at the audio rate level.
I believe that every connected device would probably also have to report its own internal latency so that latency compensation for all chains can be tracked, because even what is perceived to be "no" latency for a digital synth is at least one sample of latency, and what that amounts to depends on the internal sample rate.
MIDI is very bad for syncing clock since it is only based on a low-rate tick with no timecodes. You could probably improve that a lot, but it would also not be compatible with MIDI clock.
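For reference, the tick-interval math behind that low-rate tick: MIDI beat clock carries no timestamp, just 24 bare ticks per quarter note, so the finest timing information a slave ever receives is the interval between ticks.

```python
# MIDI beat clock resolution: 24 pulses per quarter note, no timestamps.
# The interval between ticks is the only timing signal a slave gets.

def tick_interval_ms(bpm: float, ppqn: int = 24) -> float:
    """Milliseconds between consecutive MIDI clock ticks."""
    return 60_000 / (bpm * ppqn)

print(round(tick_interval_ms(120), 3))  # 20.833 ms between ticks at 120 BPM
```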
Interesting i hadn't figured it was a midi-protocol problem, that does make a lot of sense.
@@pleggli Interesting, I wonder if a unit speaking directly to the DAW through another protocol, functioning off something more like CV, might be a solution?
I highly appreciate your further research (and extensive video production) on this topic. Let‘s hear for the real Pros..
This was PTSD of trying to get two kick drums to align across devices. The phasing in and out was an interesting resulting sound, but also soul crushing. It's stuff like this that drive people to the dark arts of eurorack drums.
it drove me to OSC syncing - it's wild, you can dependably slide in and out of phasing of sounds.
Hahaha! Yea in that case I can see the fetal position incoming while the phase gets louder and louder.. (or quieter 😳)
Or just nudge the track a few samples after you've recorded?
Plus there is latency on the devices you are trying to sync. It never ends. I found the best is just have pulse audio tracks in my DAW and output direct to my clock in on my sequencer/drum machine. No midi.
Thank you, Ricky, for the information you provided. I have also been researching for a few years how to optimally synchronize outboard devices with the computer, and I have come to the realization that latency cannot be completely "overcome," but different approaches can be used, as you mentioned as well. In your specific case, I believe the drift is caused by the computer itself and not by the multiclock, because software record input monitoring is enabled. For an optimal "latency-free" recording and overdub experience, I would recommend a mixer with direct monitoring on each channel, to avoid internal processing as much as possible. Regarding browsing presets while using the multiclock, I have sorted it out by installing VSL Server on the computer and offloading all my plugins from the DAW process, thus eliminating any processing that could affect the clock coming out of the DAW, and I can say that it works really well. Zero drift and phase issues with live tracking and overdub recording. Best regards, Robert Trifunovic
So many years and this is still a problem…
With Digitakt or Ableton or just jitter?
It really got me that when MIDI-slaved to an MPC, my first PC used to display a wandering tempo, or 89.9 instead of 90 lol
Things haven't improved with all the d/a stuff in our rigs too
@@johansebastianfauno5526 PC computers and notebooks aren't ideal for electronic music production primarily due to latency issues and system interrupts. These devices were designed for general-purpose computing rather than real-time audio processing, which requires consistent, uninterrupted performance.
1. Latency and jitter: Standard computers often struggle with maintaining low and consistent latency, which is crucial for real-time audio processing. This can lead to noticeable delays or irregularities in sound output.
2. System interrupts: Modern operating systems constantly manage various tasks and processes, causing frequent interruptions that can disrupt the smooth flow of audio data. These interrupts can result in audio glitches, dropouts, or buffer underruns.
3. Non-real-time operating systems: Most consumer PCs run on operating systems not optimized for real-time processes, making it challenging to guarantee consistent performance for audio applications.
4. Hardware limitations: Standard PC hardware, including sound cards and drivers, may not be designed to handle the demands of professional audio production, leading to suboptimal performance.
This is why dedicated hardware gear remains popular in electronic music production. These devices often feature:
1. Purpose-built chip architectures: Designed specifically for audio processing, ensuring low latency and high reliability.
2. Real-time operating systems: Optimized for consistent performance and minimal interruptions.
3. Dedicated DSP (Digital Signal Processing) chips: Specialized processors that can handle complex audio calculations more efficiently than general-purpose CPUs.
4. Guaranteed "quality of service": Hardware units can provide consistent performance without the variability found in multi-purpose computers.
The same principle applies to other fields requiring real-time processing, such as telecommunications. Cell phone towers, for instance, use dedicated hardware rather than general-purpose processors like Intel CPUs. This ensures reliable, low-latency communication essential for maintaining call quality and network stability.
While modern PCs have improved significantly and can be optimized for audio production, dedicated hardware often remains the go-to solution for professional-grade electronic music production due to its reliability, consistency, and purpose-built design.
Would you like me to elaborate on any specific aspect of this explanation?
I bought a monitor that negates any jitter or timing farts I see so it’s not a problem anymore.
So let’s invent a perfect timing box to go along with pitch correction bullshit. Then let’s get a microscope and look at your dead brain cells.
Gotta go old school on sync; workstation is master clock, all midi and audio clips consolidated into long sections (especially in Live), use automation over patch changes, use lots of pre roll, render or dump all midi parts to audio and immediately edit for sync. Abandon all illusions about MIDI - only audio recordings are ‘real’ and in sync. Put video on separate MIDI TC slave system. Finally, use tempo sync’d delays as a reference only. Once you know what you want delays to do, set times manually or with automation and let them free run. I hear that recording the cowbell from an 808 onto track 23 of a Studer ATR can drive the arpeggiator of a Juno 101 pretty well…
Here is my setup with the Multiclock in ableton that has been rock solid for me:
I turn off all external MIDI sync in Ableton, because you want the devices to get their sync from the Multiclock and not from Ableton via USB. Then I use an External Audio Effect device on the channel that has the Multiclock plugin or audio pulse running through it, to route that audio to the Multiclock. I find this keeps the latency compensation more accurate when running plugins.
You can then run one of your midi outs on the multiclock into the midi in on the MRCC to clock your outputs on the MRCC.
Interesting
Thanks for going to all the trouble to clearly and methodically document everything! I have been doing something similar, but I'm much less familiar with Ableton, so I can't pin down how much of the variation is something I changed on one track without realizing it.
I’ve been using Ableton Live since version four, and I will tell you that you have undertaken a fool’s quest. Ableton and external hardware synchronization has always been an epic nightmare. I gave up on caring, and I use Ableton as basically either all software in the box or controlling hardware out of the box, but as soon as I try to get the two to work together and line up, everything goes to shit. The only answer for me has been to record and fix the timing after.
It’s not all Ableton’s fault, either. It depends on what you’re synchronizing to it, and how that device handles clock changes. MIDI being a serial protocol does not help, either. Regardless of what your sequencer’s internal timing may be, MIDI clock is limited to a 24-pulse-per-quarter-note timing resolution, which was great in 1982 but just doesn’t work today. MIDI 2.0 can’t come fast enough.
Yeah, I had a feeling this was the case. 😂 I have tried other DAWs in the past, and one thing that always stuck out to me was how everything “just worked“ when it came to synchronization. I felt that both Bitwig and Logic gave me a more stable clocking experience. But I never really tested anything or questioned it further like I did today haha
@@RickyTinez you should test that then, and see if your feelings have any basis in reality
The best sync I've gotten so far is from the Midronome. Also not too expensive. I have it providing sync to everything through an MRCC.
Yeah, MIDI 2.0’s timestamps are the solution to this. After that, everything should be great.
Brilliant video Ricky and a very interesting subject. Of course, now you'll have to do it all again Vs Ableton Link 🤣🤣🤣
Another great video Ricky! I got the MRCC after your last video and it’s amazing to have such a flexible MIDI patch bay, and especially to have the four virtual MIDI ins and outs. Game changer for me.
But about MIDI jitter - I genuinely believe this to be an ableton problem. I’ve done something similar to this in Logic and it was bang on but yeah, Ableton *always* did weird stuff. Not even just clock, but even just MIDI sequencing notes.
Funny thing however is that Elektron Overbridge seems to always be locked-in for me regardless of DAW. Perhaps you could talk to some people and start building some ideas for an Elektron sync box? 😅
Also, I just realized that in this video you’re recording with Monitoring set to Auto and not Off, without trying to adjust the Keep Latency settings. 🤔
There’s just *way* too much stuff to keep track of when recording into Ableton!
when you're tinkering with Ozone, the latency of the plugin gets altered, which results in a tiny audio dropout from Ableton. For sure the Multiclock then gets a hiccup...
Yeah i think DAW latency compensation changes might be significant enough here to outweigh device inaccuracies.
Exactly what I was thinking.
Yep. I have no idea why somebody would use a heavy CPU mastering plug-in during a recording in Ableton.
@@RJ1J Agreed, during recording I'd turn off plugins or use the DAW's delay compensation. Not sure if Ableton has delay compensation or not, or if it's any good.
Interesting analysis of MIDI clock with the two devices and your DAW. It's hard to imagine that a half millisecond would be noticed. With MIDI keyboard latency, somewhere between 5-10 milliseconds is where I start to notice a lag. That said, if you take a song at 120 BPM, you get 500 milliseconds per quarter note (two notes per second). I use Logic, which has a default MIDI PPQ resolution of 960 (I believe Ableton is the same?). So for each quarter note occurring every 500 milliseconds, there are 960 subdivisions the DAW is using to determine accurate MIDI clock timing: .500/960 = .000521 seconds per MIDI clock subdivision, or roughly half a millisecond. So the actual (DAW) clock data used to determine MIDI timing is only accurate, best case, to half a millisecond (at 120 BPM -- clearly this will change based on the song's tempo). It doesn't seem to me you are going to be more accurate than the clock resolution. And as you demonstrated, when the CPU in the DAW is heavily loaded, other CPU threads may delay the processing of the DAW/MIDI clock. I like your comment about missing the forest by focusing on the trees -- this analysis is great, but I suspect if one doesn't heavily load the DAW's CPU while recording with MIDI clock, any minor variance of plus or minus half a millisecond would be small enough not to be noticed. BTW, I ended up getting an MRCC 880 after seeing your video last week -- didn't realize this device was available. Really a nice MIDI router/merger/splitter -- thx for the tip!
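The subdivision arithmetic above is easy to verify in a few lines of Python (a rough check, taking the commenter's 960 PPQ figure for Logic as given):

```python
# Worked check of the numbers in the comment above: at 120 BPM with a
# 960 PPQ resolution, one DAW clock subdivision is about half a millisecond.
bpm = 120
ppq = 960                              # pulses per quarter note (assumed)
quarter_note_s = 60 / bpm              # 0.5 s per quarter note at 120 BPM
tick_ms = quarter_note_s / ppq * 1000  # ms per DAW clock subdivision
print(f"{tick_ms:.4f} ms per subdivision")  # ~0.5208 ms
```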
Again he didn’t test multiple external sequencers. When I’ve used a usb midi hub with one piece of gear I could get it and Ableton synced after a little bit of tweaking. You would think that now I can do the same process with other gear and then I can run 3-4 external sequencers and they would be synced. But, it always fell apart. Now I use the multiclock and at the moment I have 5 pieces of gear all running their internal sequencers locked to Ableton. When I first got the multiclock I would also measure the latency and adjust it that way to get it super tight. Now if I add a new piece of gear I just dial it in by ear. Sometimes a few milliseconds here and there get those separate sequences grooving together. Plus I could always nudge later in Ableton, if need be, after I’m done jamming and recording.
I had the ERM Multiclock plus a MOTU 128. After years of sync issues and testing between Ableton, Cubase and Bitwig, I came to the conclusion that Ableton was by far the worst for sync issues. So I sold Ableton Live and the ERM and a load of older hardware gear. I now do most of my live hardware compositions in Bitwig. I've never been happier. ;o)
Most desktop computer operating systems are technically not real-time OSes, so they will always be affected by CPU load. A real-time OS (RTOS) is designed to work within hard constraints and so can guarantee specific response times. I think a dedicated hardware clock with an RTOS would be best, but I don't know which of the hardware devices out there are designed that way. Someone mentioned the Midronome, but I've never used it.
I think Ensoniq keyboards' MIDI jitter and timing were better than almost anything available, and that was in the late 80s/early 90s, so it must have been because processing MIDI and the synth was the unit's core function.
Most MIDI devices like MRCC use simple single core microcontrollers and are bare metal programmed. That is, there is no operating system. However, some higher performance MCUs can run an RTOS which might be helpful if there are a lot of tasks to juggle.
@@rgeraldc80s In the 80s, DAWs did not exist. At best you had things like Steinberg Pro 24 sequencing software running on say, an Atari ST.
This stuff really gets me… we have such ridiculously powerful computers these days, but we’re still struggling with audio and midi latency.
The response times could easily be achieved on modern OSes if they just got out of their own damn way for real-time processing
Something like Linux with a realtime kernel can do things on dedicated deadlines but it still won’t make latency super-low. ASICs or FPGAs will always win for stuff like that where microsecond-level timing is required. Once you’re looping through an interface or general-purpose OS for tight timing, forget it unless it’s simply an output and all the instruments are in-the-box. RME gear in general has great latency for tracking and stuff but if you really want phase-accurate you need to quantize or manually align stuff. Hybrid kinda sucks because of this.
I'm a bit late to the comment party, but as many other commenters have now suggested, there are a LOT of different choke points on both the computer running the DAW and all of the gear that you're plugging into. Trying to track down where these transient issues are happening on any given take is likely the path to insanity (although arguably still easier than troubleshooting a complex HDMI audio setup). Some concepts that might help provide a bit more context... first, even fast modern CPUs are still usually running one instruction at a time, which means the processor is rapidly jumping between dozens of tasks. It's not just your running programs but all of the OS services, and everything done over USB (audio interface included) requires a not-insignificant amount of processor overhead. Processing streams while individual apps and plugins are all greedy for high priority slices means that the more you listen for audio artifacts, the more you will absolutely hear them. This might actually be one of the most understated advantages DAWless setups have; the processors might be far less powerful, but they are not running a multitasking operating system.
The second thing to keep in mind is that MIDI clock itself is a very janky thing. There's no concept of syncing to an atomic clock, and the speed a processor runs is directly influenced by the voltage it's receiving unless there's a clock running; does your MIDI gear have clock batteries that need to be replaced? Mine doesn't. So, everyone is doing their best, but nobody actually can agree on how quickly time is flowing. When I was first learning the MIDI protocol as a developer, I was shocked to learn that there is no tempo message. Instead, you send timing "ticks" down the line at a rate of 24 messages per quarter note. That's right... time is defined in real time based on the speed at which those ticks arrive. Your MIDI synth is just a hungry hippo for ticks. If you stop sending ticks, the music just stops. To me, this is wild. This is also why you see the tempo occasionally jump up or down by 0.1 when you assume it should be stable. What you're seeing is ticks arriving too quickly or too slowly, or not arriving at all.
Now, the other elephant in the room is that MIDI works at 31.25k baud, and the timing messages are not given special priority. So, if you have a lot going on, that bandwidth can get eaten up. That's an area where the MRCC can really help, by defining subsets of devices to talk to each other. Still, just because MIDI can support a given speed under ideal conditions doesn't mean that the client device is powerful enough to receive them that quickly. Right now I am talking with the Hologram folks because Chroma Console literally shuts off if you send it too many messages too quickly. Which is not amazing, but this is the world we're in.
I'm not intimately familiar with the audio clock but if there's software running on your laptop to interpret what it's sending, then all you're really doing is shifting the point of failure from MIDI to your CPU's ability to prioritize messages from your audio clock.
TL;DR: I wouldn't pay extra for the audio sync thing, and I wouldn't do CPU intensive things while recording. And yes, if it sounds good...
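The tick-based tempo mechanism described in this comment can be sketched in a few lines of Python. The MIDI 1.0 spec defines clock at 24 ticks per quarter note, and the receiver works backwards from the tick spacing; the function below is an illustration, not any particular device's algorithm:

```python
# Minimal sketch of how a slaved device derives tempo from MIDI clock:
# there is no tempo message, only ticks, so the receiver infers BPM
# from how quickly the ticks arrive.
TICKS_PER_QUARTER = 24  # per the MIDI 1.0 specification

def bpm_from_tick_interval(tick_interval_s: float) -> float:
    quarter_note_s = tick_interval_s * TICKS_PER_QUARTER
    return 60 / quarter_note_s

# At 120 BPM, ticks should arrive every ~20.8 ms.
print(bpm_from_tick_interval(60 / (120 * 24)))  # ~120
```

Jittery tick arrival times fed into the same formula are exactly what makes a slaved tempo display wobble by a tenth of a BPM.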
Nice vid!
The main issue with those sub-10ms changes is two-fold: 1. Phase issues: if you are somehow trying to double up the same sound (which is not especially likely in this context), the phase can be totally off and thin things out. 2. More importantly, we CAN hear the substantive difference in transient response: if, say, two percussive elements are clocked separately and continuously overlapping, you can have a subtle shifting of where a really, really tight flam is happening. In a majority of cases I don't think this will really cause an issue, and in some it could even be pleasant, but in certain situations it could certainly be heard, and in some it might be undesirable. I think the worst-case scenario would be DJing: if you were to have similar parts overlapping poorly, it could sound really shit.
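A hypothetical back-of-envelope for the phase point: summing a signal with a delayed copy of itself acts as a comb filter, and the first spectral null lands at 1/(2t) Hz for a delay of t seconds, so even sub-millisecond offsets thin out the top end.

```python
# Illustrative only: delay-and-sum comb filtering. Mixing a sound with a
# copy of itself offset by delay_ms notches the spectrum, with the first
# null at 1 / (2 * delay) Hz.
def first_null_hz(delay_ms: float) -> float:
    delay_s = delay_ms / 1000
    return 1 / (2 * delay_s)

print(first_null_hz(0.5))  # a 0.5 ms offset puts the first notch near 1 kHz
```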
Oh yeah, I've heard that: putting the same tone or cowbell on two machines, they're never super dead-on. You hear the phase for sure.
Would this happen if it was machine to machine midi? I guess it might depend on the machine right? Haha
@@RickyTinez It would prolly depend on a few different things but most of all on whatever hardware/software you were using. Theres stuff like JackAudio or a few other similar solutions, too, which just go software-to-software, you could probably test something like that and it'd give you an idea whether its common inside a PC, and if it is, it'd prolly come up computer-to-computer as well. I somewhat suspect all you'd deal with is latency though, rather than jitter. Oh and im realizing just now i might've misunderstood, if you mean between 2 midi hardware units, i absolutely think it would come up, my guess is it has to do with subtle analogue jitter. Most all clocks of the time-telling variety have jitter, thats why we've got the atomic clock, but even it has subtle jitter i'm told. Generally, in the digital realm, my understanding is its all 1s and 0s and the jitter doesn't really happen usually, the electricity coming in is all stabilized by the power supply to facilitate the extremely detailed operations, so that every 1 is a 1 and every 0 a 0, nothing in between. Then again, if i was all right, the atomic clock would be digital LOL, so i gotta be missing something. TLDR; Clocks are actually gnarly AF 🤣
Hey,
thanks for all your effort on this interesting research. Without knowing for sure, I think the Multiclock is so popular because it works great with older drum machines or synths that have built-in sequencers. That's where it really shows off its strengths, imo. For instance, I like to use the audio out on Channel 1 of the ERM to trigger the SH-101 or the JX-3P. Of course, that only makes sense if you’re into those old-school internal sequencers. Plus, the built-in swing on the ERM is pretty nice in this context.
Great video, thanks Ricky. FWIW when I’m sending clock from my Digitakt to my Syntakt over MIDI DIN without a computer in sight, I can see the tempo on the Syntakt fluctuate by +/- 0.1 bpm throughout. Same with the Digitone. I think there’s inherent jitter in MIDI clock at the hardware level, quite apart from when a DAW is involved. But… it doesn’t matter because jitter that’s measured in around 1ms or so isn’t going to make an audible difference. And in reality MIDI clock sync on Elektron boxes is tighter than a camel’s arse in a sandstorm. If the MRCC is getting jitter via USB from a DAW down into the 1ms range then it’s going to be almost as solid as Elektron gear at the hardware level as a clock source. That’s good news because MRCC 880 is already on my shopping list and this test confirms it’s a good buy.
I own a Multiclock and I did jitter tests using different audio interfaces (a Focusrite 18i20 and two RMEs, a Babyface Pro FS and a UCX II), and this is where I found the RMEs differ from the others, being super stable with audio jitter, so the Multiclock performed very well. I also generally stay away from any other activities while recording, having another dedicated PC for junk.
great video !! thank you 💪💪⚡⚡
About to watch the video. What I found using a bunch of Elektron gear is I would always get bad jitter... using the Multiclock fixed this for me.
I definitely need a MIDI clock
Without looking through the video yet, I want to say: great that you did this test, as I also complained about your previous video.
I think you're seeing two different things:
1. Regular old MIDI jitter, which can vary in severity based on hardware and DAW: USB vs. MIDI DIN, I/O drivers, MIDI drivers, etc. I've done extensive tests and some DAWs do better than others. MIDI DIN seems to be better than USB.
2. On the Ozone tests, you are probably flipping through presets with varying values of lookahead. When you instantiate a plugin or preset with lookahead, it's going to momentarily throw Live's plugin delay compensation (PDC) out of whack.
Like the last one, it’s hard to take this video in good faith and not as cynical clickbait but you’ve always seemed like a good dude so I’m giving the benefit of the doubt and just assuming you have a bunch of things twisted.
1. You’re not comparing the ERM to the MRCC because the MRCC doesn’t do anything to the clock other than route it. You’re comparing the ERM to your computer’s clock with the MRCC as a dongle between your computer and the outside world. This is an apples to oranges comparison.
2. Every receiving device behaves differently so measuring what your receiving device is doing is immaterial. Some devices are notorious for syncing very poorly no matter how perfect the clock coming at them is. A more thoughtful test would be measuring the actual clock data against an objective measurement device like a frequency counter or equivalent to see what’s happening at the clock level.
3. Buddy, why are you running a mastering chain in this scenario? Come on, guy!
4. Ableton has always been notorious for being bad and weird with MIDI timing. Run the same test in other DAWs and you’ll likely see radically different results. It’s a shame but it’s true!
I use a USAMO to get around this all, BTW, and it’s generally rock solid.
Innerclock Systems
Thank you for this video, I love this kind of stuff!
Maybe there's a way to do some sort of a shoot out between different options?
- Expert Sleepers Usamo and Silent way units
- CLOCKstep: multi
- Innerclock Systems
- ACME-4
- Circuit Happy Missing link. Because how stable is Ableton link?
- The m4l clock devices?
So many options but nothing fixes the issues.
sync rabbit holes are endless 😖
I’m curious how the Akai MPC compares. Back in the day the MPC was famous for its internal MIDI timing. There was this test of using all of its voices to trigger the same rimshot at the same moment. It still sounded like a flanged rimshot, but way, way better than any MIDI sequencer in existence at the time…
Kind of curious how the ATARI ST compared to old Akai MPC 60 at sequencing in a head to head comparison.
@@Aristoper the MPC had its own internal sound engine; the Atari ST did not. MIDI is too slow a serial data interface, which gives sloppy timing as soon as you pump a lot of data through it (like a whole song, or a lot of pitch-bend data). The MPC used to have 4 separate MIDI outs, so you could use one MIDI out exclusively for external drums. The Atari also had MIDI extension boxes with multiple outs.
The Digitakt is always slightly off on the first bar when sending or receiving MIDI. I'd love to see a test comparing audio over USB (Overbridge and class-compliant) vs. over a solid interface like RME.
I'm convinced, nothing is better than Innerclock Sync Gen II. You have to try it.
Who says it's the Multiclock or the MRCC that are jittering, and not the machine receiving those clocks, which is then recorded?
That’s what I kept thinking
Been Following you for 5 years and your obsession with this is intense.
The RME is supposed to have the best jitter reduction. I wonder if it would correct issues if plugged into the UCX MIDI ports? I'm just throwing something out here; I'm not really familiar with using the MIDI ports on the UCX II. But I'm still curious about the Mio by iConnectivity lol.
Interesting. What audio interface and connection method did you use? I don’t have any sync issues with the Multiclock via a MOTU 16A using TB2. Did you try with any other outboard modules besides the Digitakt? Did the audio pulses going to the Multiclock stay in sync?
This stresses me out. I'd give up or lose my mind.
Maybe the Multiclock has a much more precise internal clock? I would assume so, because with audio sync it's still a clock generated by that same CPU; only the audio output may or may not have lower latency itself. I would assume the internal clock on the Multiclock, compared to ANYTHING, is dead on.
Agreed, if clocking from the MIDI hardware, multiclock or sequencer or whatever, all of the hardware will be in sync as the port to port latency and jitter of a MIDI router is negligible. It’s the computer that monkeys it up, and needs the audio clock to compensate. This would be a good thing to test.
@@DarrylMcGee I totally agree, I was writing some code for midi sequencing and it's surprising how easy it is to get unstable clock times on an OS where the cpu is doing a billion other things already, in fact I think he should try clocking the DAW from the multiclock
Check out the MIDI Fact Sheet in the Ableton Live manual.
Thank you Ricky, that was another video in a long history of helpful videos. Thank you for the time and effort you put into it.
(The rest of this comment is probably not helpful but...)
I understand people want to eliminate latency in all its variants but be aware of the return on effort / diminishing returns / system variance. Depending on the gear I have active, the systems I'm syncing and the configuration of the system in total, if I can get a latency of under 10ms round-trip from hitting a control/keyboard/pad and hear the sound, I'm good.
Can I get it lower by tweaking? Sure. Do I care? Not really. The bugaboos like phase cancellation are noticeable, and I deal with them using my ears if they are bad enough to matter.
I'll spend the time on what I care about. If (for you) it's jitter and latency, that's cool, but I'll be making music more often than not.
Have fun!
Latency compensation values reported by plugins are not set in stone; they can vary depending on whatever is configured within the plugin. One obvious case where this behaviour is noticeable is with the introduction of a limiter plugin, which was exactly the case here when you started switching between plugin settings in Ozone, as I clearly saw that a Limiter module was present in some of the presets you browsed through. I believe in Ableton there's a mouse-over mechanism that allows you to see what the exact reported latency compensation value is at a given time. I know, for example, that if you take FabFilter Pro-L 2 and switch between the limiter modes, say between "Modern" and "Transparent", there's a very significant latency variation involved.
I didn't want to make my original comment too long, but clearly you have to be very careful with the conclusions you're coming to with the kind of test you did in the video, because there are a lot of variables involved in the complex task of dealing with latency compensation in different use cases. In this case it appears evident to me that the issue you've experienced is directly related to introducing Ozone and effectively altering its latency compensation value in real time while you were recording, as you switched presets. This was inevitably bound to cause issues, but the good news is there's almost certainly nothing wrong with your MIDI devices; you simply have to be mindful of what tools you choose to ensure "compatibility" for a given use case.
That was my suspicion too: that the Ozone plugin has some presets with look-ahead latency. That latency is reported to the DAW's PDC, and changing presets causes the PDC to recalculate.
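As a rough illustration of why a lookahead change shows up as an audible glitch, here is the sample-to-milliseconds conversion the DAW effectively performs; the 1024-sample lookahead below is a made-up example value, not Ozone's actual figure:

```python
# When a preset changes the plugin's reported lookahead, the DAW must
# shift its compensation by that many samples, which interrupts playback.
def latency_ms(samples: int, sample_rate: int = 44100) -> float:
    return samples / sample_rate * 1000

print(round(latency_ms(1024), 2))  # a 1024-sample lookahead ≈ 23.22 ms at 44.1 kHz
```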
Did the preset change in Ozone maybe change the latency of the plugin, so that Live had to recalculate and reapply its delay compensation chain? I suspect that a mastering plugin can do that, while an instrument plugin probably does not in most cases.
When you load different presets in Ozone, the PDC (plugin delay compensation) amount in samples changes. This will impact the audio engine and the position of the audio recording.
MIDI is a serial command protocol running at 31250 bits per second. With 10 bits per byte on the wire, that means roughly 3125 bytes a second. A MIDI clock message is 1 byte, and the time of one byte is 0.32 ms. So it's normal to have a fluctuation of around 0.16 ms even if no other MIDI data is sent. This can be bigger if more MIDI data is sent, like active sensing, CC controller data, or song position.
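The byte-timing numbers above check out, assuming the standard serial framing of 10 bits per byte (1 start + 8 data + 1 stop):

```python
# Checking the wire-speed arithmetic: MIDI runs at 31250 baud, and each
# byte occupies 10 bit-times on the cable.
baud = 31250
bits_per_byte = 10                       # 1 start + 8 data + 1 stop
bytes_per_second = baud / bits_per_byte  # 3125 bytes/s
byte_time_ms = 1000 / bytes_per_second   # 0.32 ms per byte
print(byte_time_ms)  # 0.32
```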
@rickytinez could you test using multiple Elektron gear at the same time? I could never get them synched up correctly without using the Multiclock. I would have the Elektron gear running, creating sequences etc, and after 10mins or so they would be out of sync.
The CPU spikes are definitely the main culprit. I think the audio engine has a higher priority than MIDI output, which is one of the reasons why the Multiclock works better.
I think the thing is that you have to choose one, learn its idiosyncrasies, and work with it. None of them will be perfect and the more time you spend hunting for the perfect tool the less time you are making music.
Try a different DAW? Like Cubase, or even better (for recording), Reaper?
Now what happens when you start changing the buffer size after everything is set up?
I may be tripping, but once I got the Multiclock set up, even when changing buffer size everything was still locked pretty dang tight.
Hey Ricky, great video as always. If I am not mistaken, at around @9:06 you made a comment about 20% of a millisecond being 1/20 of a millisecond. That is wrong: 20% = 20/100 = 1/5, which is way bigger than 1/20. Thinking in percentages, 1/20 is 5%.
Hi Ricky loving your videos. What audio interface would you recommend I'm trying to decide which one to get. Thanks
I wonder what effect changing cpu priority for the daw would have.
Ricky, I really appreciate these videos on DAW/hardware sync. I have an RK-008 and I’ve adjusted latency per output for all of my drum machines and sequencers. I’ve gotten them all really good sounding/looking/grooving. The MAIN point to drive home, first and foremost, is setting the Ableton monitoring to Off. So I assume you’re direct monitoring through your interface? I’m having great results, but it’s frustrating when you’re using hardware sends (Ext. Instrument) in Ableton.
Side note: Renoise, to my ears, has the best sync to hardware. I’m convinced of it.
I just bought a used ERM Midiclock (which, at €130, is a lot less pricey than the Multiclock). I had a problem with a sequencer providing the clock (a Polyend Seq with the bpm
jumping anywhere between 169 and 183 instead of sitting at 176) and with ANOTHER sequencer (Squarp Hapax) being the slave but creating an involuntary MIDI loop, even though every Thru setting was OFF. So in my case, the Midiclock allowed me to physically unlink my sequencer cable mess with a tiny little device right next to my keyboard. I do have a MIO XL MIDI router (with all MIDI connectors at the back, mind you; I do not want a cable mess on my table), but it's hidden somewhere in my rack and I wouldn't want to crawl into my rackspace every time I need to start/stop the clock. So for me, an external ERM Midiclock was indeed the best option. And everything connected now shows exactly 176 bpm, not some random number between 169 and 183.
The MRCC is known to have high lag and jitter for a device of its kind. My clock is an audio pulse out of the DAW through an Erica Synths CV-to-MIDI clock converter; this is simple and sample-accurate. When just using hardware, my MPC 4000 is the clock, and it is almost as tight as a dedicated sync box.
A couple of comments:
1) Jitter happens when midi beat clock events are not sent at precise intervals. Latency is irrelevant for the issue of midi beat clock jitter because there is no real-time response to midi clock events. In other words, gear that receives midi beat clock will not do anything immediately in response to midi clock. Instead, those events will be observed and their intervals averaged over time to calculate a tempo.
2) What you've tested here is mostly the ability of Live to deliver a jitter-free clock. The proof of that is the varying tempo in the Digitakt. That's why you didn't see any difference in the jitter between the two devices. I can tell you from experience that other DAWs generally do a better job of this than Live.
3) Changing Ozone presets will change the latency compensation in Live, so you should expect that to cause glitches when done in real time. CPU usage, either via plugins in Live or with other apps running, is not the issue.
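A minimal sketch of the averaging idea in point 1, assuming a hypothetical receiver that smooths over the last 24 tick intervals (real devices use their own schemes); this is why jitter appears as a wobbling BPM readout rather than an immediate stutter:

```python
# Toy tempo estimator: average recent MIDI clock tick intervals, then
# convert to BPM using the spec's 24 ticks per quarter note.
def estimate_bpm(tick_times_s, window=24):
    recent = tick_times_s[-(window + 1):]
    intervals = [b - a for a, b in zip(recent, recent[1:])]
    avg = sum(intervals) / len(intervals)
    return 60 / (avg * 24)  # 24 MIDI clock ticks per quarter note

# Perfectly spaced ticks at 120 BPM come back as 120; jittered ticks
# would make this estimate drift up and down around the true tempo.
ticks = [i * (60 / (120 * 24)) for i in range(48)]
print(round(estimate_bpm(ticks), 1))  # 120.0
```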
Just curious! It's been a minute; have you settled on a solution?
Nice follow-up. There IS a difference between DAWs; MidiGAL lets you measure this, and Bitwig has way less jitter than Live out of the box. Then there is the issue of audio drivers: high-end audio interfaces will have a superior clock. On PC there is a utility called LatencyMon which shows the hardware interrupts interfering with audio timing; this lets you optimize performance. In general, stuff like graphics drivers and the TCP/IP stack, but also Bluetooth devices, can have some influence on audio timing. The ERM should be rock solid during the whole recording. Other devices like the ERM also reset and get in sync again after a bar; in a recording situation this may not be crucial, but in a live situation you just want to stay in sync.
I wonder how much of that jitter can be attributed to floating point precision issues. Any time I see small decimal jumps in computing, float precision is usually the culprit
To me, sub-1ms jitter is huge if you're recording more than one MIDI drum instrument at once, or trying to record a MIDI drum synth alongside VSTis: phase issues, transient issues, etc. I also think a test with a DAW that isn't notorious for terrible timing like Ableton would be better, as well as testing with multiple different MIDI devices. It also depends on what type of MIDI we are testing, whether it's clock or just note data. A good MIDI interface will take into account the buffers in USB and timestamp the MIDI while sending it a little early, then release it down the DIN cable at the right time. The other thing to consider is the resolution of MIDI when dealing with clock. There are a lot of factors to test here to come up with anything meaningful on the results side. And yes, what device you are sending MIDI to does matter!
What if Ableton followed the clock generated by ERM? 🤔 Could even use MRCC to route it.
someone told me you should only use the internal clock of the ERM to get it right and tight, not the plugin, but I never tried it out. Maybe it works better
I experimented with this as well some time ago, and I found that even internal sequencers are not tight.
For example, my Elektron Machinedrum mk2 is digital, but it still drifts slightly from the grid.
are we just chasing the groove?
Maybe try testing on another DAW? Ableton has given me out-of-sync issues before! Also, external digital devices like your Digitakt or any other e-device have MIDI latencies. It would be worth testing MIDI tracks from DAW to DAW and seeing the results.
I bought the Midronome, put it up against a friend's Multiclock, and basically ended up with a similar set of stems :D I don't want to leave the comfort of Ableton, but surely something like Pro Tools handles latency better than my favorite DAW... right?
I'm surprised you don't know this. I thought it was common knowledge that you should shut down all other applications when using a DAW.
Every time I use a DAW, I shut down other processes. But these days I'm 100% DAWless.
Did you try this with Reaper? I've done similar tests and have no problems... it's probably Ableton.
Ozone 10 and 11 hit my CPU hard, and I have an i9-12900K, which has 24 cores if you include virtual cores (either way it's got a lot), plus 44 GB of RAM. I don't get what's so taxing about Ozone presets. I know it's doing a lot of processing, but nothing else I've used since I got this CPU about 3 months ago does that. Maybe that could affect your test more negatively than other plugins, and if you tried something else you might get different results? Ozone does sound damn good though, a big improvement from the first one. Hope that helps; at least it's something to consider. Take 'er easy, player.
Hello Ricky. I think the problem comes from stability problems in Live 12, with sampling among other things. Many users have noticed the same problems as you, including me: Live 12 systematically drops out when you open a plugin, load presets, or open the internet or another application. It's a recurring problem. I have the ERM Multiclock and I get exactly the same audio dropout problem, and I think the ERM and the MRCC are not the cause.
Hey!
I use an ESI M4UeX connected via USB to my Push Standalone to sync my different machines. It's possible to manage the M4UeX's latency on the Push Standalone. This might seem like a weird question, but if I upgrade to one of the machines you present, could I achieve more stability with my MIDI sync?
Very interesting, could it be a problem with your audio interface? Does the same thing happen with a Syntakt or Octatrack...?
Midi jitter is the whole reason I gave up on a hybrid setup. I spent way more time trying to sort it out, rather than making music
I'm curious as to how the Expert Sleepers USAMO would compare in this test.
I've used the USAMO, but it was a disaster. Can't recommend it.
Please check what happens when you use the Multiclock as the clock source rather than triggered by Ableton. It could be that the root cause of all the issues is Ableton and not the devices.
The Multiclock can clock analog and DIN-sync gear too, and it can itself be clocked by any clock pulse. But: the ACME Multiclock has MIDI outs and TRS trigger outs for every channel. That's nice. And double/half tempo ;)
I know I'm super late to the party and you might not see this, but when you showed the DT's clock jumping 0.1 ms it got me thinking. Have you ever tried syncing the DT with transport only and not clock, with the tempo set manually to match the DAW tempo? That way the hardware is clocking itself, which in theory should be much more stable than through MIDI (it's too late here to test this myself now, but I just wanted to leave this here).
TBH, I think you're mostly testing the midi implementations of Ableton and the Digitakt…
You actually don't need any of these boxes to achieve great sync!
I'm syncing my entire studio by sending clock via the headphone output on my audio interface to my Keystep Pro. You can use a simple square wave loaded in a VST sampler or a dedicated sync plugin from any of the manufacturers of these sync boxes. Personally I use Silent Way Sync by Expert Sleepers, because it also reliably sends START/STOP messages.
The important thing is that your main sequencer has a CV clock input, and that it can forward clock to its MIDI outputs.
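The approach above (a square pulse played out of a spare audio output) can be sketched in a few lines. This is a minimal illustration, not the Silent Way plugin itself; the pulse rate (one pulse per 16th note), pulse width, and sample rate are all assumptions, so check what your sequencer's clock input actually expects.

```python
# Sketch: render an audio clock track (one short square pulse per 16th
# note) that could be sent out a spare audio/headphone output to a
# hardware sequencer's clock input. PPQN, pulse width, and level are
# assumptions; verify against your device's specs.
import struct
import wave

RATE = 48_000          # sample rate, Hz (assumption)
BPM = 120.0
PULSES_PER_BEAT = 4    # one pulse per 16th note (assumption)
BEATS_PER_BAR = 4
PULSE_MS = 5.0         # pulse width in milliseconds
BARS = 4

def render_clock(path: str) -> int:
    """Write a mono 16-bit WAV of clock pulses; return total frames."""
    interval = int(RATE * 60.0 / (BPM * PULSES_PER_BEAT))  # frames between pulses
    width = int(RATE * PULSE_MS / 1000.0)                  # frames per pulse
    total = interval * PULSES_PER_BEAT * BEATS_PER_BAR * BARS
    samples = bytearray()
    for i in range(total):
        # Full-scale during the pulse, silence otherwise: hardware clock
        # inputs want a loud, clean edge (as the comment below notes).
        value = 32767 if (i % interval) < width else 0
        samples += struct.pack("<h", value)
    with wave.open(path, "wb") as w:
        w.setnchannels(1)
        w.setsampwidth(2)
        w.setframerate(RATE)
        w.writeframes(bytes(samples))
    return total

if __name__ == "__main__":
    render_clock("clock_120bpm.wav")
```

Loading a file like this into a sampler, or looping it on a spare track, gives pulses that are sample-locked to the DAW's audio engine, which is the whole point of audio-based sync.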
This! How reliably does this work? Is it right on the money with the grid? I saw another comment mentioning the Innerclock Sync-Gen II, but that requires purchasing a discontinued hardware sync box that costs at least $500, and I shouldn't have to spend that kind of money to get good sync.
@@Dan_Delix It works as reliably as is possible. Meaning, it stays on beat as long as you don't have big CPU spikes, and incoming audio will of course be late because of the latency on your audio interface.
The way I deal with this is by having a plugin after my clock source that delays the clock signal by a certain amount so that my audio will be recorded one bar late.
To clarify, the latency might make the recorded audio late by 0.63 bars, and I make sure that instead it will be late precisely 1.0 bars. Of course this might cause some trouble mixing hardware and software synths, but at least all your synced VST delay FX will work correctly.
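The "pad the latency out to exactly 1.0 bars" trick described above is just modular arithmetic: given the measured round-trip latency and the tempo, compute how much extra delay brings the total to the next whole bar. A minimal sketch (the numbers are the hypothetical 0.63-bar example from the comment):

```python
# Sketch: compute the extra clock delay needed so recorded audio lands
# exactly one bar late instead of some awkward fraction of a bar.

def extra_delay_ms(latency_ms: float, bpm: float, beats_per_bar: int = 4) -> float:
    """Return the additional delay (ms) that rounds the total latency
    up to the next whole bar."""
    bar_ms = beats_per_bar * 60_000.0 / bpm
    return bar_ms - (latency_ms % bar_ms)

if __name__ == "__main__":
    # e.g. 1260 ms of latency at 120 BPM (one bar = 2000 ms, i.e. 0.63
    # bars late) -> pad the clock by 740 ms to land exactly 1.0 bars late.
    print(extra_delay_ms(1260.0, 120.0))
```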
Also, remember that your hardware sequencer expects a loud and clean clock signal. Sync might drift if the signal is too weak, or the waveform might not be a true square wave anymore in case you are distorting it.
On my particular setup, I leave the clock output at maximum in my software, and set my headphone amp to -10 dB. This setting has been the most stable for me.
Last thing: my Keystep Pro is happy to obey whenever I send it a STOP signal, but currently it freaks out when I send it a START signal. That's why I only send it STOP and manually arm the sequencer before the next take. I'm trying to resolve this with Arturia support at the moment.
Good luck! 😃
It would be interesting to know how the MRCC and Multiclock compare to a standard MIDI interface from, let's say, a sub-$250 audio interface. I'm surprised by the observations you made: last time I tried to sync with MIDI clock from Ableton over a standard MIDI interface, the synced sequencers showed tempos all over the place. Maybe the MRCC has some internal clock-smoothing magic? 🙂
Well, Ozone is a mastering plugin, which will add latency. You don't want to use it except when you bounce, or on the final mix. It will always run late.
You should use a very fast click sample to increase the precision of your tests.
Fun stuff, I always hate seeing the tempo flicker. You should add a baseline recording of your Digitakt directly into the DAW, with the tempo run from the Digitakt. Also, math in 8ths is easier to compute.
Hmm, OK, so I have to assume the preset changing in Ozone was a bit more than just CPU spiking as the culprit of your MIDI sync falling off with either device. I experimented with switching through presets in Ozone like you were doing, and each preset has a different total sample latency, with a pretty huge variance between them. It looked like you had latency compensation enabled while running the Ozone preset-swapping tests, so it's most likely that Live having to reorient itself to the new maximum latency after each preset switch was causing the audio buffer to stumble. The MIDI clock running in the background was probably keeping fairly steady, but because the audio buffer stumbled around it, it wound up falling out of sync with Live's timeline.
Computers suffer from DPC latency; you can easily measure it, and you should, as laptops are prone to it. As long as we're unable to produce a perfect clock and use it in perfect conditions, we're unable to produce a perfect device. My DrumBrute Impact is really bad; its clock is like a rubber band stretching over the 16 steps.
I am a beginner and these latency problems are driving me crazy. I just have one question you guys need to answer for me. Say I have some MIDI playing (drums, for example) and I want to play my guitar over it, monitoring through Ableton with effects, and I jam over the drums until it fits. In that case, I need to record WITH latency to make it sound like what I played, because I adapted my guitar playing to the latency for the timing. So: if an audio clip does not need to fit an existing track directly, I correct the delay in the clip; if I record a part for a song and want to keep the timing, I leave the delay in. Does this make sense?
Ozone causes more clocking problems than most plugins, since not all presets have the same amount of latency (it depends on which modules are loaded in a given preset). The clocking can't keep up with the changing latency compensation.
I assume you used the latest Floatingpoint firmware?
Yes. Each device has its own jitter. Elektron were not generally known as the most accurate but have greatly improved since the MD days. Innerclock Systems (who make the Sync-Lock stuff) have tested plenty of gear (you can look up Innerclock Systems Litmus to see their methodology and results, but remember they are selling an ERM-style product): an MPC 60 has a jitter of 0.33 ms, an MPC 4000 0.6 ms, and the MPC X a whopping 2.1 ms. The only thing with no jitter is analog sequencers, like you might see in modular or the old stuff. Mostly I don't find jitter a problem unless you are overdubbing; then it can be a disaster.
I have also had wildly different results on different computers (sometimes USB sync is OK, but sometimes there is no fixing it), which is why I favour an audio sync solution despite it being slightly annoying.
Why the solution is to use Ableton as a slave and avoid using its MIDI Clock to sync machines
The main problem with synchronization tests using a computer's MIDI clock is that jitter (variation in clock precision) is considerably higher compared to dedicated clocks. This is due to how operating systems handle USB MIDI communication, where data is processed in a more complex and parallel manner, introducing variable latencies and fluctuations that affect tempo precision. In contrast, traditional MIDI works in a simpler and more direct way, which contributes to greater stability.
Here are some key reasons why the computer's MIDI clock is not ideal and why Ableton should be used as a slave instead of a master:
Jitter in USB MIDI: When using the MIDI clock of a DAW like Ableton, even with low load, the typical jitter is in the range of 2-5 ms. As the project load increases, this jitter can reach 5-10 ms or more, making the clock inconsistent and affecting the groove and synchronization. For example, if you change presets or introduce effects in the mastering chain, latency spikes can occur, causing rhythmic elements like the kick to fall out of sync, resulting in offsets of up to a quarter of a beat or more.
Comparison with dedicated clocks: Devices like the E-RM multiclock offer extremely low jitter, of only ±1 sample when synced with audio at 48kHz, which is similar to what the E-RM midiclock⁺ offers. These devices guarantee precise synchronization without the fluctuations that often affect USB MIDI. Additionally, since the clock is external, it is not affected by latency generated by the computer's internal processing, always maintaining stable tempo.
More precise hardware machines: Well-designed equipment, such as those from Elektron, handle the MIDI clock much better than a DAW, with jitter in the range of 1-2 ms, which is already considerably more precise than a computer under load.
Latency accumulation in MIDI chains: If you have multiple devices connected in series, latency accumulates linearly, which can negatively affect synchronization. Therefore, it is recommended not to connect more than 2 or 3 devices in series to avoid timing issues.
Conclusion: If you are looking for precise and stable synchronization, it is essential to use a dedicated device like the E-RM multiclock as the clock master and avoid relying on the computer's MIDI clock. USB MIDI latency and jitter will degrade your timing, while a dedicated clock will ensure solid, fluctuation-free synchronization. Using Ableton as a slave and an external clock as the master is the best option for maintaining tight sync in your setup.
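For anyone wanting to reproduce jitter numbers like those quoted above, here is a minimal sketch of the underlying measurement: MIDI clock runs at 24 pulses per quarter note, so each captured pulse interval can be compared against the ideal spacing. The timestamps below are hypothetical; in practice you would capture them from a MIDI input.

```python
# Sketch: compute MIDI clock jitter from a list of pulse timestamps (ms)
# by comparing each interval against the ideal 24 PPQN spacing.
from statistics import mean

def clock_jitter_ms(timestamps_ms, bpm):
    """Return (max deviation, mean deviation) in ms from the ideal
    MIDI clock interval at the given tempo."""
    ideal = 60_000.0 / (bpm * 24)  # 24 MIDI clocks per quarter note
    intervals = [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]
    deviations = [abs(iv - ideal) for iv in intervals]
    return max(deviations), mean(deviations)

if __name__ == "__main__":
    # At 120 BPM the ideal spacing is ~20.83 ms, so a 5 ms deviation is
    # already about a quarter of one clock pulse.
    ts = [0.0, 20.83, 41.67, 64.0, 83.33]  # hypothetical capture
    print(clock_jitter_ms(ts, 120.0))
```

Note this measures pulse-to-pulse jitter only; a constant latency offset would not show up here at all, which is why latency and jitter have to be tested separately.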
Rumour has it that if you switch out your DT2 FUNC key for a grey A4 key you unlock an extra LFO
The best bet might be to have an external clock that starts/stops both the gear AND the DAW, so no processes get in the way.
I'm sure the issue is the DAW in the end, no matter which DAW. It's doing too much else to be a stable clock source.
You should do the same test with Logic. I used Logic for years and really started having sync issues when switching to Ableton. But to be fair, I also have issues when syncing hardware to my Multiclock in master mode.
Ableton is great for in-the-box MIDI tasks, but it handles MIDI into and out of the box very poorly compared to other solutions.
Maybe also try the same test with an MPC as master.
I think that is the limit of accuracy of midi clock, even when derived from sample accurate audio.
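For context on that accuracy limit: DIN MIDI is a 31250-baud serial link, and each byte takes 10 bit-times on the wire (start bit, 8 data bits, stop bit). That puts a hard floor under timing precision no matter how good the clock source is. A minimal sketch of the arithmetic:

```python
# Sketch: the serial-transmission floor on DIN MIDI timing. The wire
# runs at 31250 baud with 10 bits per byte, so each byte occupies
# 0.32 ms regardless of the clock's quality.
BAUD = 31_250
BITS_PER_BYTE = 10  # start bit + 8 data bits + stop bit

def message_time_ms(n_bytes: int) -> float:
    """Time to transmit an n-byte MIDI message over DIN MIDI, in ms."""
    return n_bytes * BITS_PER_BYTE / BAUD * 1000.0

if __name__ == "__main__":
    print(message_time_ms(1))  # MIDI clock (0xF8) is a single byte
    print(message_time_ms(3))  # a note-on is three bytes
```

So a single clock byte takes 0.32 ms, and a three-byte note-on almost a millisecond, before any jitter from the OS or USB is even counted.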
I wish the iConnectivity mio or mioXL had been part of this shootout. It could be the best of both worlds.
OK, a couple of things. It's still way more complicated than this. First of all, jitter is any deviation in, or displacement of, signal pulses in a high-frequency signal. The Multiclock audio plugin, hosted on a track of the sequencer, is not excluded from your sequencer's automatic latency-compensation engine. Sticking to the way you've tested: I think Ableton can adjust the start time of the Multiclock plugin if you are using the external device plugin, since Ableton measures the delay between the MIDI trigger sent and the audio received from the hardware device. What I'm describing is not mentioned anywhere in the Multiclock manual. Why? Because that is not how the Multiclock is supposed to be used: the computer doesn't sync external MIDI gear with a sample-accurate audio signal; the audio plugin only syncs the Multiclock. Back to your computer. You should not browse the internet or open applications that are not managed by the sequencer. I don't know much about the Apple SoC, but you are requesting the impossible. Opening your browser means opening a different IRQ/DMA channel on your processor to start the browser app; the app is stored on disk, which depends on the CPU; then the browser populates the screen using the integrated GPU (pictures and animation); the browser reads the disk again for the cached browser data and loads it into memory; it also refreshes the open tabs with pages from the internet (which involves operating your NIC, with interrupts to the CPU); and your keyboard and mouse strokes go to another application in the foreground. I'm trying to keep things simple here, but you are not making this easy. :)) How is Ozone 9 configured? Is it using OpenGL? Your mouse actions in the Ozone GUI clearly interrupt your processor. In short, I see so many issues in your test and on your computer. I would be in tears if my Wintel machine behaved like your Mac. All in good spirit!
About your outcome with the Multiclock being so similar: what if this jitter was an issue back when the Multiclock came out, but computers have improved so much since then that a clock generated by the DAW and transmitted via USB is now similarly stable?
You made me curious: I will do a similar test. I have no MRCC, but I can try clocking the DT via the Blokas Midihub and then via the Multiclock.
So far I can confirm that the Multiclock, in conjunction with the DTII, generates around 1 ms of jitter. Not sure what the source is.
Without an external clock the DTII also jitters, but a lot less: around 0.25 ms.
When I clock the OT MkII with the Multiclock, it jitters less than the DTII; the maximum offset I found was 0.6 ms, and the median was more like 0.3 ms.
I'll have to test this a bit more extensively.