Audio sync should be more stable because DAWs focus on their audio engine before anything else, including MIDI, so if you have it set up right it should lock to whatever the audio is doing. That said, once you're ready to record I'd still disable all plugins on that track while recording. I know that sucks if you like to jam your arrangements live.
If you're layering the same digital sounds with relatively minimal processing on one, then a few samples off can make a difference. But otherwise, I agree. If you can hear a flam, and don't want it, try a nudge... or only use one of the sounds for the attack part of your stack. That's a classic solution - one layer for the stack's attack, and use the other layers for body, thickness, decay, etc.
Also, people might be surprised how sloppy MIDI jitter could be on slaved vintage sound sources - more than 1 ms. And people made some GREAT music with that jitter present.
Thanks for that awesome video Ricky, nice comeback from the last one :)) This is a complex subject, but let me give my professional opinion.
* What you are comparing is (A) the MIDI clock generated by Ableton and (B) the clock/pulses generated by the multiclock VST plugin. For now let's ignore any additional latency/jitter created by the multiclock device, by the playing device (Digitakt), or by the audio interface (we can assume they are very low).
* I'm sorry to say, but the MRCC has nothing to do with any of this - it just forwards the clock generated by Ableton (and I'm sure it does that well).
* What we want is precision between (A) or (B) and the DAW timing, which is the timing at which the analog audio is sampled by the audio interface.
* Let's look at (A): on your computer, it runs in a different thread/priority than the audio stuff, so as soon as your CPU gets busy it won't be called in time, and that creates jitter and latency. On top of that it is transferred over USB-MIDI, which does not have the highest priority either - so extra jitter if the USB bus is busy (lots of USB devices, big data transfers, for example).
* Let's look at (B): it runs in a VST plugin, which is not only processed in real time with the other audio, but more importantly the generated audio pulses are in perfect time with the rest of the audio coming out of your DAW. Any jitter/latency in setup (B) is therefore from the stuff we ignored, i.e. within the multiclock itself, the playing device, or the audio interface.
Now, will OSes and DAWs get better so that nobody needs this kind of complex setup anymore? Yes, I bloody hope so, and U-SYNC is the first step towards it. Simon
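A minimal sketch (in Python, not the actual Multiclock plugin code) of Simon's point about (B): when the sync pulse is rendered inside the audio callback, its position is computed in samples, so it rides on exactly the same clock as every other sample the DAW produces and can't be scheduled late by the OS.

```python
# Minimal sketch (not the actual Multiclock plugin code): rendering sync
# pulses inside an audio callback. The pulse position is computed in
# samples, so it shares exactly the same clock as the rest of the DAW's
# audio -- there is no separate MIDI thread to be scheduled late.

SAMPLE_RATE = 48000
BPM = 120.0
PPQN = 24  # MIDI clock resolution: 24 pulses per quarter note

samples_per_pulse = SAMPLE_RATE * 60.0 / (BPM * PPQN)  # 1000 samples at 120 BPM

def process_block(buffer, block_start_sample):
    """Fill one audio block, placing a one-sample click on every clock pulse."""
    for i in range(len(buffer)):
        absolute_sample = block_start_sample + i
        # A pulse edge lands here if a multiple of samples_per_pulse falls
        # within this one-sample slot.
        buffer[i] = 1.0 if (absolute_sample % samples_per_pulse) < 1.0 else 0.0
    return buffer

block = process_block([0.0] * 256, block_start_sample=0)
```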
For those interested, Simon has made a cool little box (a MIDI and audio metronome, linked to your DAW). I've had it in my setup for a year and it is awesome - check his channel and support forum.
The Multiclock has a VST plugin!? And the plugin generates the clock? Then you are still relying on the PC, which isn't going to be as accurate as hardware can be.
@@elitebeatagent8993 I guess the current batch is sold out. Try contacting him. He is very responsive and kind. And believe me - the Midronome is worth it!
This gave me PTSD from trying to get two kick drums to align across devices. The phasing in and out was an interesting resulting sound, but also soul crushing. It's stuff like this that drives people to the dark arts of eurorack drums.
Plus there is latency on the devices you are trying to sync. It never ends. I found the best approach is just to have pulse audio tracks in my DAW and output them directly to the clock in on my sequencer/drum machine. No MIDI.
Device jitter, and small variable clock offset, is something that will always exist with the midi protocol. The midi message bytes are sent serially and received using a buffer which is then processed by the microprocessor in the device. That thing runs at its own clock rate. So a midi clock message will never be processed instantly when it arrives. The receiving device still has to figure out that it received a clock message and act accordingly. In other words, the Multiclock can sort of transmit the clock message in sync with a sample, but nothing can interpret that with sample accuracy because of how the midi protocol works. I think the accuracy might also get worse if you send additional midi messages like CC over the same port.
There is eventually a bottleneck somewhere, right? So many tiny variables. I felt like I was trying to figure out a Manet painting by looking at the canvas stitching 😂
The only way to have perfect, sample-accurate clock sync would be to base it on the word clock everywhere in the chain. Every device in the chain would need to know about tempo changes at the audio-rate level. I believe every connected device would probably also have to report its own internal latency so that latency compensation for all chains could be tracked, because even what's perceived as "no" latency on a digital synth is at least one sample of latency, and how much depends on the internal sample rate. MIDI is very bad for syncing clock since it's only a low-rate tick with no timestamps; you could probably improve that a lot, but it would no longer be compatible with MIDI clock.
@@pleggli Interesting, I wonder if a unit speaking directly to the DAW through another protocol, functioning off something more like CV, might be a solution?
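To illustrate the latency-reporting idea from the comment above, here's a hypothetical sketch (device names and latency figures are invented for the example): if every device reported its latency, the host could pad the faster chains to match the slowest one, exactly like a DAW's plugin delay compensation.

```python
# Hypothetical illustration of the "devices report their own latency" idea.
# The host pads every chain so they all line up with the slowest one,
# just like a DAW's plugin delay compensation.

chains = {
    "digital synth -> interface input": [1, 32],   # reported latencies, samples
    "drum machine -> interface input":  [5, 32],
    "soft synth -> master bus":         [0, 0],
}

totals = {name: sum(latencies) for name, latencies in chains.items()}
slowest = max(totals.values())
compensation = {name: slowest - total for name, total in totals.items()}

print(compensation)
# {'digital synth -> interface input': 4,
#  'drum machine -> interface input': 0,
#  'soft synth -> master bus': 37}
```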
Hey Ricky, been watching your videos on the Multiclock, which I also use heavily. Love your channel and your approach! The power of the Multiclock is being able to shift things in time with knobs until it "feels right" and then hitting record. To me it's priceless just for that! Every plugin you add, and even EQs being switched to linear phase etc., will affect the timing of everything, including the Multiclock, because Ableton's internal latency changes. Just setting it once and expecting it to stay in sync is not achievable IMO. Often I check it's tight and then adjust by feel so the groove feels better than 100% on grid - I very often do this on bass synths and toms. Don't sell it, you will regret it!
Here is my setup with the Multiclock in Ableton, which has been rock solid for me: I turn off all external MIDI sync in Ableton, because you want the devices to get sync from the Multiclock and not from Ableton via USB. Then I use an External Audio Effect on the channel carrying the Multiclock plugin or audio pulse to route the audio to the Multiclock. I find this keeps the latency compensation better when running plugins. You can then run one of the MIDI outs on the Multiclock into the MIDI in on the MRCC to clock the MRCC's outputs.
Gotta go old school on sync; workstation is master clock, all midi and audio clips consolidated into long sections (especially in Live), use automation over patch changes, use lots of pre roll, render or dump all midi parts to audio and immediately edit for sync. Abandon all illusions about MIDI - only audio recordings are ‘real’ and in sync. Put video on separate MIDI TC slave system. Finally, use tempo sync’d delays as a reference only. Once you know what you want delays to do, set times manually or with automation and let them free run. I hear that recording the cowbell from an 808 onto track 23 of a Studer ATR can drive the arpeggiator of a Juno 101 pretty well…
Thank you, Ricky, for the information you provided. I have also spent a few years researching how to optimally synchronize outboard devices with the computer, and I have come to the realization that latency cannot be completely "overcome," but different approaches can be used, as you mentioned as well. In your specific case, I believe the drift is caused by the computer itself and not by the Multiclock, because software record input monitoring is enabled. For an optimal "latency-free" recording and overdub experience, I would recommend using the interface's onboard mixer with direct monitoring on each channel, to avoid internal processing as much as possible. Regarding browsing presets while using the Multiclock, I have sorted it out by installing VSL Server on the computer and offloading all my plugins from the DAW process, thus eliminating any processing that could affect the clock coming out of the DAW, and I can say that it works really well. Zero drift and phase issues with live tracking and overdub recording. Best regards, Robert Trifunovic
I've been using Ableton Live since version four, and I will tell you that you have undertaken a fool's quest. Ableton and external hardware synchronization has always been an epic nightmare. I gave up on caring, and I use Ableton as basically either all software in the box, or controlling hardware out of the box; as soon as I try to get the two to work together and line up, everything goes to shit. The only answer for me has been to record and fix the timing after. It's not all Ableton's fault, either. It depends on what you're synchronizing to it and how that device handles clock changes. MIDI, being a serial protocol, does not help. Regardless of what your sequencer's internal timing may be, MIDI clock is limited to 24 pulses per quarter note of timing resolution, which was great in 1982 but just doesn't work today. MIDI 2.0 can't come fast enough.
Yeah, I had a feeling this was the case. 😂 I have tried other DAWs in the past, and one thing that always stuck out to me was how everything "just worked" when it came to synchronization. I felt that both Bitwig and Logic gave me a more stable clocking experience. But I never really tested anything or questioned it further the way I did today haha
Interesting analysis of MIDI clock with the two devices and your DAW. It's hard to imagine that half a millisecond would be noticed. With MIDI keyboard latency, somewhere between 5-10 milliseconds is where I start to notice a lag. That said, if you take a song at 120 BPM, you get 500 milliseconds per quarter note (two notes per second). I use Logic, which has a default MIDI PPQ resolution of 960 (I believe Ableton is the same?). So for each quarter note, occurring every 500 milliseconds, there are 960 subdivisions the DAW is using to determine accurate MIDI clock timing: 500 ms / 960 = 0.52 ms per subdivision, or roughly half a millisecond. So the actual (DAW) clock data used to determine MIDI timing is only accurate, best case, to half a millisecond (at 120 BPM - clearly this changes with the song's tempo). It doesn't seem to me you are going to be more accurate than the clock resolution. And as you demonstrated, when the CPU in the DAW is heavily loaded, other threads may delay the processing of the DAW's MIDI clock. I like your comment about missing the forest by focusing on the trees - this analysis is great, but I suspect if one doesn't heavily load the DAW's CPU while recording with MIDI clock, any minor variance of plus or minus half a millisecond would be small enough not to be noticed. BTW, I ended up getting an MRCC 880 after seeing your video last week - didn't realize this device was available. Really a nice MIDI router/merger/splitter - thx for the tip!
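Reproducing the arithmetic in the comment above as a quick check (the 960 PPQN figure is the commenter's assumption about Logic's/Ableton's internal resolution, not something verified here):

```python
# Tick resolution at 120 BPM with an assumed 960 PPQN internal resolution.

bpm = 120
ppqn = 960

quarter_note_ms = 60_000 / bpm       # 500.0 ms per quarter note at 120 BPM
tick_ms = quarter_note_ms / ppqn     # 0.5208... ms per internal tick

print(f"{tick_ms:.4f} ms per tick")  # ~0.52 ms, i.e. roughly half a millisecond
```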
I'm a bit late to the comment party, but as many other commenters have suggested, there are a LOT of different choke points, both on the computer running the DAW and in all of the gear you're plugging into. Trying to track down where these transient issues are happening on any given take is likely the path to insanity (although arguably still easier than troubleshooting a complex HDMI audio setup). Some concepts that might help provide a bit more context.
First, even fast modern CPUs are rapidly jumping between dozens of tasks. It's not just your running programs but all of the OS services, and everything done over USB (audio interface included) requires a not-insignificant amount of processor overhead. Processing streams while individual apps and plugins are all greedy for high-priority slices means that the more you listen for audio artifacts, the more you will absolutely hear them. This might actually be one of the most understated advantages DAWless setups have; the processors might be far less powerful, but they are not running a multitasking operating system.
The second thing to keep in mind is that MIDI clock itself is a very janky thing. There's no concept of syncing to an atomic clock, and every device's sense of time drifts slightly on its own oscillator; does your MIDI gear have a precision reference clock? Mine doesn't. So everyone is doing their best, but nobody can actually agree on how quickly time is flowing. When I was first learning the MIDI protocol as a developer, I was shocked to learn that there is no tempo message. Instead, you send timing "ticks" down the line at a rate of 24 messages per quarter note. That's right: time is defined in real time, based on the speed at which those ticks arrive. Your MIDI synth is just a hungry hippo for ticks. If you stop sending ticks, the music just stops. To me, this is wild. This is also why you see the tempo occasionally jump up or down by 0.1 when you assume it should be stable: what you're seeing is ticks arriving too quickly, too slowly, or not at all.
Now, the other elephant in the room is that MIDI works at 31.25k baud, and the timing messages are not given special priority. So if you have a lot going on, that bandwidth can get eaten up. That's an area where the MRCC can really help, by defining subsets of devices to talk to each other. Still, just because MIDI can support a given speed under ideal conditions doesn't mean the receiving device is powerful enough to handle messages that quickly. Right now I am talking with the Hologram folks because the Chroma Console literally shuts off if you send it too many messages too quickly. Which is not amazing, but this is the world we're in. I'm not intimately familiar with the audio clock, but if there's software running on your laptop to interpret what it's sending, then all you're really doing is shifting the point of failure from MIDI to your CPU's ability to prioritize messages from your audio clock.
TL;DR: I wouldn't pay extra for the audio sync thing, and I wouldn't do CPU-intensive things while recording. And yes, if it sounds good...
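A small sketch of the "tempo is defined by tick arrival rate" point above, assuming the standard 24 ticks per quarter note: the receiver only averages the intervals between incoming clock bytes, so a single tick arriving 1 ms late already nudges the displayed BPM.

```python
# Sketch of how a receiver infers tempo purely from the arrival rate of
# MIDI clock ticks (24 per quarter note). A tick arriving a little early
# or late shows up directly as a wobble in the displayed BPM.

PPQN = 24

def bpm_from_tick_times(tick_times_s):
    """Estimate tempo from a list of tick arrival times, in seconds."""
    intervals = [b - a for a, b in zip(tick_times_s, tick_times_s[1:])]
    return 60.0 / (sum(intervals) / len(intervals) * PPQN)

# Perfect 120 BPM: a tick every 60 / (120 * 24) seconds (~20.8 ms).
perfect = [i * 60.0 / (120 * PPQN) for i in range(25)]
print(round(bpm_from_tick_times(perfect), 1))        # 120.0

# Delay the most recent tick by just 1 ms and the short-term estimate dips.
late = perfect[:11]
late[-1] += 0.001
print(round(bpm_from_tick_times(late), 1))           # ~119.4
```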
When you're tinkering with Ozone, the latency of the plugin changes, which results in a tiny audio dropout in Ableton. For sure the Multiclock then gets a hiccup...
@@RJ1J Agreed, during recording I'd turn off plugins or use the DAW's delay compensation. Not sure whether Ableton has delay compensation, or whether it's any good.
Another great video, Ricky! I got the MRCC after your last video and it's amazing to have such a flexible MIDI patch bay, and especially to have the four virtual MIDI ins and outs. Game changer for me. But about MIDI jitter - I genuinely believe this to be an Ableton problem. I've done something similar to this in Logic and it was bang on, but yeah, Ableton *always* did weird stuff. Not even just clock, but even just sequencing MIDI notes. Funny thing, however, is that Elektron Overbridge seems to always be locked in for me regardless of DAW. Perhaps you could talk to some people and start building some ideas for an Elektron sync box? 😅
Also, I just realized that in this video you're recording with Monitoring set to Auto and not Off, without trying to adjust the Keep Latency settings. 🤔 There's just *way* too much stuff to keep track of when recording into Ableton!
The main issue with those sub-10 ms shifts is twofold: 1. Phase issues. If you are somehow trying to double up the same sound - which is not especially likely in this context - the phase can be totally off and thin things out. 2. More importantly, we CAN hear the substantive difference in transient response: if, say, two percussive elements are clocked separately and continuously overlapping, you get a subtle shifting of where a really, really tight flam is happening. In the majority of cases I don't think this will really cause an issue, and in some it could even be pleasant, but in certain situations it could certainly be heard, and in some it might be undesirable. I think the worst-case scenario would be DJing - if you had similar parts overlapping poorly, it could sound really shit.
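To put a number on point 1 above: mixing a signal with a copy of itself delayed by t seconds comb-filters the result, with the first notch at 1/(2t), so even sub-millisecond offsets land notches right in the audible range.

```python
# First comb-filter notch frequency for a given offset between two copies
# of the same sound: notch = 1 / (2 * delay_in_seconds).

def first_notch_hz(offset_ms):
    return 1.0 / (2.0 * offset_ms / 1000.0)

for offset_ms in (0.5, 1.0, 5.0):
    print(f"{offset_ms} ms offset -> first notch at {first_notch_hz(offset_ms):.0f} Hz")
# 0.5 ms -> 1000 Hz, 1.0 ms -> 500 Hz, 5.0 ms -> 100 Hz
```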
Oh yeah, I've heard that - putting the same tone or cowbell on two machines, they're never super dead on. You hear the phase for sure. Would this happen with machine-to-machine MIDI? I guess it might depend on the machine, right? Haha
@@RickyTinez It would probably depend on a few different things, but most of all on whatever hardware/software you were using. There's stuff like JACK Audio and a few other similar solutions, too, which just go software-to-software; you could test something like that and it'd give you an idea of whether it's common inside a PC, and if it is, it'd probably come up computer-to-computer as well. I somewhat suspect all you'd deal with is latency, though, rather than jitter. Oh, and I'm realizing just now I might've misunderstood - if you mean between two MIDI hardware units, I absolutely think it would come up; my guess is it has to do with subtle analogue jitter. Almost all clocks of the time-telling variety have jitter, which is why we've got the atomic clock, but even that has subtle jitter, I'm told. Generally, in the digital realm, my understanding is it's all 1s and 0s and jitter doesn't really matter as much - the power supply stabilizes the incoming electricity so that every 1 is a 1 and every 0 a 0, nothing in between. Then again, if I were completely right, the atomic clock would be digital LOL, so I gotta be missing something. TL;DR: clocks are actually gnarly AF 🤣
I had the ERM Multiclock plus a MOTU 128. After years of sync issues and testing between Ableton, Cubase and Bitwig, I came to the conclusion that Ableton was by far the worst for sync issues. So I sold Ableton Live, the ERM and a load of older hardware gear. I now do most of my live hardware compositions in Bitwig. I've never been happier. ;o)
Most desktop operating systems are technically not real-time OSes, so they will always be affected by CPU load. A real-time OS (RTOS) is designed to work within hard constraints and so can guarantee specific response times. I think a dedicated hardware clock running an RTOS would be best, but I don't know which of the hardware devices out there are designed that way. Someone mentioned the Midronome, but I've never used it.
I think the MIDI jitter and timing of Ensoniq keyboards is better than almost everything available today - and that was in the late 80s/early 90s - so it must be because processing MIDI and the synth engine was the hardware's core function.
Most MIDI devices like MRCC use simple single core microcontrollers and are bare metal programmed. That is, there is no operating system. However, some higher performance MCUs can run an RTOS which might be helpful if there are a lot of tasks to juggle.
This stuff really gets me… we have such ridiculously powerful computers these days, but we're still struggling with audio and MIDI latency. The response times could easily be achieved on modern OSes if they just got out of their own damn way for real-time processing.
Something like Linux with a realtime kernel can do things on dedicated deadlines but it still won’t make latency super-low. ASICs or FPGAs will always win for stuff like that where microsecond-level timing is required. Once you’re looping through an interface or general-purpose OS for tight timing, forget it unless it’s simply an output and all the instruments are in-the-box. RME gear in general has great latency for tracking and stuff but if you really want phase-accurate you need to quantize or manually align stuff. Hybrid kinda sucks because of this.
Again he didn’t test multiple external sequencers. When I’ve used a usb midi hub with one piece of gear I could get it and Ableton synced after a little bit of tweaking. You would think that now I can do the same process with other gear and then I can run 3-4 external sequencers and they would be synced. But, it always fell apart. Now I use the multiclock and at the moment I have 5 pieces of gear all running their internal sequencers locked to Ableton. When I first got the multiclock I would also measure the latency and adjust it that way to get it super tight. Now if I add a new piece of gear I just dial it in by ear. Sometimes a few milliseconds here and there get those separate sequences grooving together. Plus I could always nudge later in Ableton, if need be, after I’m done jamming and recording.
I own a Multiclock and I did a jitter test using different audio interfaces - a Focusrite 18i20 and two RMEs (Babyface Pro FS and UCX II) - and that's where I found the RMEs differ from the others, being super stable with audio jitter, so the Multiclock performed very well. I also generally stay away from any other activities while recording, and keep another dedicated PC for junk.
Thanks for going to all the trouble to clearly and methodically document everything! I have been doing something similar, but I'm much less familiar with Ableton, so I can't pin down how much of the variation is something I changed on one track without realizing it.
Hey, thanks for all your effort on this interesting research. Without knowing for sure, I think the Multiclock is so popular because it works great with older drum machines and synths that have built-in sequencers. That's where it really shows its strengths, IMO. For instance, I like to use the audio out on channel 1 of the ERM to trigger the SH-101 or the JX-3P. Of course, that only makes sense if you're into those old-school internal sequencers. Plus, the built-in swing in the ERM is pretty nice in this context.
The Digitakt is always slightly off on the first bar when sending or receiving MIDI. I'd love to see a test comparing audio over USB (Overbridge and class-compliant) vs over a solid interface like an RME.
It really got me that, when MIDI-slaved to an MPC, my first PC used to display a wandering tempo, or 89.9 instead of 90 lol. Things haven't improved with all the D/A stuff in our rigs either.
@@johansebastianfauno5526 PCs and notebooks aren't ideal for electronic music production, primarily due to latency issues and system interrupts. These devices were designed for general-purpose computing rather than real-time audio processing, which requires consistent, uninterrupted performance.
1. Latency and jitter: standard computers often struggle to maintain low and consistent latency, which is crucial for real-time audio processing. This can lead to noticeable delays or irregularities in the sound output.
2. System interrupts: modern operating systems constantly manage various tasks and processes, causing frequent interruptions that can disrupt the smooth flow of audio data. These interrupts can result in audio glitches, dropouts, or buffer underruns.
3. Non-real-time operating systems: most consumer PCs run operating systems not optimized for real-time processes, making it challenging to guarantee consistent performance for audio applications.
4. Hardware limitations: standard PC hardware, including sound cards and drivers, may not be designed for the demands of professional audio production, leading to suboptimal performance.
This is why dedicated hardware remains popular in electronic music production. These devices often feature:
1. Purpose-built chip architectures, designed specifically for audio processing, ensuring low latency and high reliability.
2. Real-time operating systems, optimized for consistent performance and minimal interruptions.
3. Dedicated DSP (digital signal processing) chips: specialized processors that handle complex audio calculations more efficiently than general-purpose CPUs.
4. Guaranteed quality of service: hardware units provide consistent performance without the variability found in multi-purpose computers.
The same principle applies to other fields requiring real-time processing, such as telecommunications: cell towers use dedicated hardware rather than general-purpose processors like Intel CPUs, ensuring the reliable, low-latency communication essential for call quality and network stability. While modern PCs have improved significantly and can be optimized for audio production, dedicated hardware often remains the go-to solution for professional-grade electronic music production due to its reliability, consistency, and purpose-built design.
Like the last one, it’s hard to take this video in good faith and not as cynical clickbait but you’ve always seemed like a good dude so I’m giving the benefit of the doubt and just assuming you have a bunch of things twisted. 1. You’re not comparing the ERM to the MRCC because the MRCC doesn’t do anything to the clock other than route it. You’re comparing the ERM to your computer’s clock with the MRCC as a dongle between your computer and the outside world. This is an apples to oranges comparison. 2. Every receiving device behaves differently so measuring what your receiving device is doing is immaterial. Some devices are notorious for syncing very poorly no matter how perfect the clock coming at them is. A more thoughtful test would be measuring the actual clock data against an objective measurement device like a frequency counter or equivalent to see what’s happening at the clock level. 3. Buddy, why are you running a mastering chain in this scenario? Come on, guy! 4. Ableton has always been notorious for being bad and weird with MIDI timing. Run the same test in other DAWs and you’ll likely see radically different results. It’s a shame but it’s true! I use a USAMO to get around this all, BTW, and it’s generally rock solid.
When you load different presets in Ozone, the PDC (plugin delay compensation) amount in samples changes. This will impact the audio engine and the position of the audio recording. MIDI is a serial command protocol running at 31250 bits per second - roughly 3125 bytes a second. A MIDI clock message is 1 byte, and one byte takes 0.32 ms on the wire. So it's normal to have a fluctuation of around 0.16 ms even if no other MIDI data is sent, and it can be bigger if more MIDI data is sent, like Active Sensing, CC controller data or Song Position.
Thank you for this video, I love this kind of stuff! Maybe there's a way to do some sort of shoot-out between the different options?
- Expert Sleepers USAMO and Silent Way units
- CLOCKstep:MULTI
- Innerclock Systems
- ACME-4
- Circuit Happy Missing Link (because how stable is Ableton Link, really?)
- The M4L clock devices?
So many options, but nothing fixes the issues.
I just bought a used ERM Midiclock (which, at €130, is a lot less pricey than the Multiclock). I got it because I had a problem with the sequencer providing the clock (a Polyend Seq, with the BPM jumping anywhere between 169 and 183 instead of sitting at 176) and ANOTHER sequencer (Squarp Hapax) being the slave but creating an involuntary MIDI loop even though every Thru setting was OFF. So in my case, the Midiclock allowed me to physically unlink my sequencer cable mess... with a tiny little device right next to my keyboard. I do have a mio XL MIDI router (with all the MIDI connectors at the back, mind you - I do not want a cable mess on my table), but it's hidden somewhere in my rack, and I wouldn't want to crawl into my rack space every time I need to start/stop the clock. So for me, an external ERM Midiclock was indeed the best option. And everything connected now shows exactly 176 BPM, not some random number between 169 and 183.
The MRCC is known to have high lag and jitter for a device of its kind. My clock is an audio pulse sent out of the DAW through an Erica Synths CV-to-MIDI clock converter; this is simple and sample accurate. When just using hardware, my MPC 4000 is the clock - it's almost as tight as a dedicated sync box.
Latency compensation values reported by plugins are not set in stone; they can vary depending on how the plugin is configured. One obvious case where this behaviour is noticeable is the introduction of a limiter, which was exactly the case here when you started switching between plugin settings in Ozone - I clearly saw that a Limiter module was present in some of the presets you browsed through. I believe in Ableton there's a mouse-over mechanism that lets you see the exact reported latency compensation value at a given time. I know, for example, that if you take FabFilter Pro-L 2 and switch between the limiter modes, say between "Modern" and "Transparent", there's a very significant latency variation involved.
I didn't want to make my original comment too long, but clearly you have to be very careful with the conclusions you draw from the kind of test you did in the video, because there are a lot of variables involved in the complex task of latency compensation across different use cases. In this case it appears evident to me that the issue you experienced is directly related to introducing Ozone and effectively altering its latency compensation value in real time while recording, as you were switching presets. This was inevitably bound to cause issues, but the good news is there's almost certainly nothing wrong with your MIDI devices - you simply have to be mindful of what tools you choose to ensure "compatibility" for a given use case.
That was my suspicion too: that the Ozone plugin has some presets with look-ahead latency. That latency is reported to the DAW's PDC, and changing presets causes the PDC to recalculate.
Great video, thanks Ricky. FWIW when I’m sending clock from my Digitakt to my Syntakt over MIDI DIN without a computer in sight, I can see the tempo on the Syntakt fluctuate by +/- 0.1 bpm throughout. Same with the Digitone. I think there’s inherent jitter in MIDI clock at the hardware level, quite apart from when a DAW is involved. But… it doesn’t matter because jitter that’s measured in around 1ms or so isn’t going to make an audible difference. And in reality MIDI clock sync on Elektron boxes is tighter than a camel’s arse in a sandstorm. If the MRCC is getting jitter via USB from a DAW down into the 1ms range then it’s going to be almost as solid as Elektron gear at the hardware level as a clock source. That’s good news because MRCC 880 is already on my shopping list and this test confirms it’s a good buy.
Ricky, I really appreciate these videos on DAW/hardware sync. I have an RK-008 and I've adjusted latency per output for all of my drum machines and sequencers. I've gotten them all really good sounding/looking/grooving. The MAIN point to drive home, first and foremost, is setting Ableton's monitoring to Off. So I assume you're direct monitoring through your interface? I'm having great results, but it's frustrating when you're using hardware sends (Ext. Instrument) in Ableton. Side note: Renoise, to my ears, has the best sync to hardware. I'm convinced of it.
I'm curious how the Akai MPC compares... Back in the day the MPC was famous for its internal MIDI timing. There was this test of using all of its voices to trigger the same rimshot at the same moment. It still sounded like a flanged rimshot, but way, way better than any MIDI sequencer in existence at the time…
@@Aristoper the MPC had its own internal sound engine; the Atari ST did not. MIDI is too slow a serial data interface, which gives sloppy timing as soon as you pump a lot of data through it (like a whole song, or a lot of pitch bend data). The MPC used to have 4 separate MIDI outs, so you could use one MIDI out exclusively for external drums. The Atari also had MIDI extension boxes with multiple outs.
I think you're seeing two different things: 1. Regular old MIDI jitter, which can vary in severity based on hardware and DAW - USB vs. MIDI DIN, I/O drivers, MIDI drivers, etc. I've done extensive tests and some DAWs do better than others; MIDI DIN seems to be better than USB. 2. On the Ozone tests, you are probably flipping through presets with varying amounts of lookahead. When you instantiate a plugin or preset with lookahead, it momentarily throws Live's plugin delay compensation (PDC) out of whack.
A couple of comments: 1) Jitter happens when midi beat clock events are not sent at precise intervals. Latency is irrelevant for the issue of midi beat clock jitter because there is no real-time response to midi clock events. In other words, gear that receives midi beat clock will not do anything immediately in response to midi clock. Instead, those events will be observed and their intervals averaged over time to calculate a tempo. 2) What you've tested here is mostly the ability of Live to deliver a jitter-free clock. The proof of that is the varying tempo in the Digitakt. That's why you didn't see any difference in the jitter between the two devices. I can tell you from experience that other DAWs generally do a better job of this than Live. 3) Changing Ozone presets will change the latency compensation in Live, so you should expect that will cause glitches when doing that in real-time. CPU usage, either via plugins in Live or with other apps running, is not the issue
Interesting. What audio interface and connection method did you use? I don't have any sync issues with the Multiclock via a MOTU 16A using Thunderbolt 2. Did you try any other outboard modules besides the Digitakt? Did the audio pulses going to the Multiclock stay in sync?
Thank you Ricky, that was another video in a long history of helpful videos - thank you for the time and effort you put into it. (The rest of this comment is probably not helpful, but...) I understand people want to eliminate latency in all its variants, but be aware of the return on effort, diminishing returns, and system variance. Depending on the gear I have active, the systems I'm syncing and the configuration of the system in total, if I can get a round-trip latency under 10 ms from hitting a control/keyboard/pad to hearing the sound, I'm good. Can I get it lower by tweaking? Sure. Do I care? Not really. Bugaboos like phase cancellation are noticeable, and I deal with them using my ears if they're bad enough to matter. I'll spend the time on what I care about. If (for you) it's jitter and latency, that's cool, but I'll be making music more often than not. Have fun!
To me, sub-1 ms jitter is huge if you're recording more than one MIDI drum instrument at once, or trying to record a MIDI drum synth alongside VSTis - phase issues, transient issues, etc. I also think a test with a DAW that isn't notorious for terrible timing, like Ableton is, would be better, as well as testing with multiple different MIDI devices. It also depends on what type of MIDI we are testing - clock or just note data. A good MIDI interface will take into account the buffers in USB and timestamp the MIDI, sending it a little early and then releasing it down the DIN cable at the right time. The other thing to consider is the resolution of MIDI when dealing with clock. There are a lot of factors to test here before coming up with anything meaningful on the results side. And yes, the device you are sending MIDI to does matter!
Maybe the Multiclock has a much more precise internal clock? I would assume so, because with audio sync it's still a clock generated by that same CPU - only the audio output may or may not have lower latency itself. I would assume the Multiclock's internal clock is dead on compared to ANYTHING.
Agreed, if clocking from the MIDI hardware, multiclock or sequencer or whatever, all of the hardware will be in sync as the port to port latency and jitter of a MIDI router is negligible. It’s the computer that monkeys it up, and needs the audio clock to compensate. This would be a good thing to test.
@@DarrylMcGee I totally agree. I was writing some code for MIDI sequencing, and it's surprising how easy it is to get unstable clock timing on an OS where the CPU is doing a billion other things already. In fact, I think he should try clocking the DAW from the Multiclock.
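If you want to see that instability for yourself, a rough sketch like this (plain Python, nothing audio-specific) tries to wake up once per MIDI clock interval and reports how far off the OS scheduler lets it drift; run it while a plugin scans presets or a browser loads a page and watch the worst case grow.

```python
# Try to wake up once per MIDI clock interval (120 BPM, 24 PPQN -> ~20.8 ms)
# from ordinary user space and measure how far off each wake-up lands.

import time

PERIOD = 60.0 / (120 * 24)   # seconds between clock ticks, ~0.0208 s
N = 100

start = time.perf_counter()
worst_ms = 0.0
for i in range(1, N + 1):
    target = start + i * PERIOD
    time.sleep(max(0.0, target - time.perf_counter()))
    worst_ms = max(worst_ms, (time.perf_counter() - target) * 1000.0)

print(f"worst wake-up error over {N} ticks: {worst_ms:.2f} ms")
```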
Hey Ricky, great video as always. If I am not mistaken, at around @9:06 you made a comment about 20% of a millisecond being 1/20 of a millisecond. That is wrong: 20% = 20/100 = 1/5, which is way bigger than 1/20. In percentages, 1/20 is 5%.
Nice follow-up. There IS a difference between DAWs - a MidiGAL lets you measure this, and Bitwig has way less jitter than Live out of the box. Then there is the issue of audio drivers: high-end audio interfaces will have a superior clock. On PC there is a utility called LatencyMon which shows the hardware interrupts interfering with audio timing; this lets you optimize performance. In general, things like graphics drivers and the TCP/IP stack, but also Bluetooth devices, can have some influence on audio timing. The ERM should be rock solid during the whole recording; other devices like the ERM also reset and get back in sync after a bar - for a recording situation that may not be crucial, but in a live situation you just want to stay in sync.
I experimented with this as well some time ago, and I realized that even internal sequencers are not tight. For example, my Elektron Machinedrum mk2 is digital, but it still drifts slightly from the grid.
The Multiclock can clock "analog" and DIN sync too, and it can also be clocked by any clock pulse. ...but: the ACME has MIDI outs and TRS trigger outs for every channel - that's nice. ...and double/half tempo ;)
I lost my mind trying to troubleshoot this - in my case, my audio interface was adding major issues, sometimes up to ±16 ms off, changing every time I hit record, with just one audio track in the session. The interface would work fine on my MacBook but not on my PC. I switched to an RME interface and Innerclock for sync and I'm finally at peace! So Windows users might have this extra bit to troubleshoot.
Computers suffer from DPC latency; you can easily measure it, and you should, as laptops are prone to it. As long as we're unable to produce a perfect clock and use it in perfect conditions, we're unable to produce a perfect device. My DrumBrute Impact is really bad - its clock is like a rubber band stretching over the 16 steps.
Fun stuff - I always hate seeing the tempo flicker. You should add a baseline recording of the Digitakt straight into the DAW with the tempo run from the Digitakt. Also, math in 8ths is easier to compute.
@rickytinez could you test using multiple pieces of Elektron gear at the same time? I could never get them synced up correctly without using the Multiclock. I would have the Elektron gear running, creating sequences etc., and after 10 minutes or so they would be out of sync. The CPU spikes are definitely the main culprit. I think the audio engine has a higher priority than the MIDI output, which is one of the reasons why the Multiclock works better.
The best bet might be to have an external clock that starts/stops both the gear AND the DAW - no processes getting in the way. I'm sure the issue is the DAW in the end. No matter which DAW, it's doing too much else to be a stable clock source.
The RME is supposed to have the best jitter reduction. I wonder if it would correct the issues if the gear were plugged into the UCX MIDI ports? I'm just throwing something out there - I'm not really familiar with using the MIDI ports on the UCX II. But I'm still curious about the mio by iConnectivity lol.
OK, a couple of things - it's still way more complicated than this. First of all, jitter is any deviation in, or displacement of, the signal pulses of a high-frequency signal. The Multiclock audio plugin, hosted on a track in the sequencer, is not excluded from the sequencer's automatic latency compensation engine. Sticking to the way you've tested: I think Ableton can adjust the start time of the Multiclock plugin if you are using the External Audio Effect device, because Ableton measures the delay between the MIDI trigger sent and the audio received from the hardware device. What I'm describing is not mentioned anywhere in the Multiclock manual. Why? Because that is not how the Multiclock is supposed to be used: the computer doesn't sync the external MIDI gear with a sample-accurate audio signal; the audio plugin only syncs the Multiclock itself.
Back to your computer. First of all, you should not browse the internet or open applications that are not managed by the sequencer. I don't know much about the Apple SoC, but you are asking for the impossible. Opening your browser means opening different IRQ/DMA channels on your processor to start the browser app. The app is loaded from disk, which depends on the CPU. Then the browser populates the screen using the integrated GPU (pictures and animation), reads the disk again for cached browser data and loads it into memory, and refreshes the open tabs from the internet (which involves operating your NIC, with more interrupts for the CPU). Your keyboard and mouse strokes go to another application in the foreground. I'm trying to keep this simple, but you are not making it easy. : )) How is Ozone 9 configured? Is it using OpenGL? Your mouse actions in the Ozone GUI clearly interrupt your processor. In short, I see many issues in your test and on your computer - I would be in tears if my Wintel machine behaved like your Mac. All in good spirit!
Yes, each device has its own jitter - Elektron were not generally known as the most accurate, but have greatly improved since the Machinedrum days. Innerclock Systems (who make the Sync-Lock stuff) have tested plenty of gear (look up Innerclock Systems Litmus to see their methodology and results, but remember they are selling an ERM-style product): an MPC 60 has a jitter of 0.33 ms, the MPC 4000 0.6 ms, and the MPC X a whopping 2.1 ms. The only thing with no jitter is the analogue sequencers, like you might see in modular, or the old stuff. Mostly I don't find jitter a problem unless you are overdubbing - then it can be a disaster.
I have also had wildly different results on different computers (sometimes USB sync is OK, but sometimes there is no fixing it), which is why I favour an audio sync solution despite it being slightly annoying.
I'm not the world's foremost expert on this, but IMHO much of what you're experiencing is basically the limitations of the MIDI protocol (originating from the early 80s) and the way USB MIDI is handled by a computer. The (DIN) MIDI protocol works at a fixed speed of 31250 baud (31250 bits/second). That means sending 1 byte of information (10 bits including start and stop bits) takes 0.32 ms. The way clock messages work is that 24 of these clock bytes are sent per quarter note. Most other MIDI messages take 2 or 3 bytes (CC, Note On, Note Off, etc.), and all these bytes are sent serially, one after the other. As said, sending 1 byte already takes 0.32 ms, so it's not hard to see how small variations quickly occur. Of course, the USB protocol isn't restricted to this 31250 bits/sec rate, so in theory sending MIDI over USB shouldn't give these inaccuracies. However, the receiving end puts these USB MIDI messages into a buffer, and this buffer is only read at a limited rate (typically around 1000 times per second, as far as I know). The MIDI is only processed after the buffer is read, so here too you will find a small amount of jitter.
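A quick back-of-the-envelope check of the numbers above (the 1 ms USB figure just echoes the roughly 1 kHz buffer service rate mentioned in the comment, not a measured value):

```python
# Rough timing floors for the two transports described above.

# DIN MIDI: 31250 baud, each byte framed as start + 8 data + stop = 10 bits.
din_byte_ms = 10 / 31250 * 1000        # 0.32 ms per byte on the wire
note_on_ms = 3 * din_byte_ms           # a 3-byte Note On occupies ~0.96 ms

# USB MIDI: if the receive buffer is serviced roughly 1000 times a second,
# a message can sit for up to ~1 ms before the application even sees it.
usb_service_ms = 1.0

print(f"DIN byte:          {din_byte_ms:.2f} ms")
print(f"3-byte Note On:    {note_on_ms:.2f} ms")
print(f"USB service frame: {usb_service_ms:.2f} ms worst-case added wait")
```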
Even more madness may ensue with drum parts where multiple events fall at the same time. I recall Colin of Cirklon fame mentioning this as a reason why drums over MIDI never really land consistently when there's that kind of demand. Not sure if taking DIN out of the equation and using USB instead changes this. I'm not even talking about timing over clock, but rather MIDI patterns in the DAW triggering the samplers etc. I was into this for a while and was looking to use triggers on the Rossum Assimil8or etc., as that ought to be tighter. In reality, do we care? It's hard to know. Supposedly the brain can detect micro-timing, so 1 ms is in that realm. Funny business.
Ozone causes more clocking problems than most plugins, since not all presets have the same amount of latency (it depends on which modules are loaded in a given preset). The clocking can't keep up with the changing latency compensation.
I actually think the problem is Ableton. Over the last couple of days I've been setting up a live MIDI recording rig. Between Logic Pro X, Bitwig and Ableton, Ableton was the DAW that drifted the most. Logic was great but lacked a few features, and Bitwig was consistently stable. Same setup BTW, with the MRCC and ERM. Ableton is not good. 👍
I think the thing is that you have to choose one, learn its idiosyncrasies, and work with it. None of them will be perfect and the more time you spend hunting for the perfect tool the less time you are making music.
I know I'm super late to the party and you might not see this, but when you showed the DT's tempo jumping by 0.1 it got me thinking. Have you ever tried syncing the DT with transport only, not clock, with the tempo set manually to match the DAW tempo? That way the hardware is clocked by itself, which in theory is going to be much more stable than over MIDI (it's too late here to test this myself now, but I just wanted to leave this here).
You actually don't need any of these boxes to achieve great sync! I'm syncing my entire studio by sending clock from the headphone output of my audio interface to my Keystep Pro. You can use a simple square wave loaded in a VST sampler, or a dedicated sync plugin from any of the manufacturers of these sync boxes. Personally I use Silent Way Sync by Expert Sleepers, because it also reliably sends START/STOP messages. The important thing is that your main sequencer has a CV clock input and that it can forward clock to its MIDI outputs.
This! How reliably does this work? Is it right on the money with the grid? I saw another comment mentioning the Innerclock Sync-Gen II, but that requires buying a discontinued hardware sync box that costs at least $500, and I shouldn't have to spend that kind of money to get good sync.
@@Dan_Delix It works as reliably as is possible - meaning it stays on beat as long as you don't have big CPU spikes, and incoming audio will of course be late because of the latency of your audio interface. The way I deal with this is by having a plugin after my clock source that delays the clock signal by a certain amount, so that my audio is recorded exactly one bar late. To clarify: the latency might make the recorded audio late by 0.63 bars, and I make sure that instead it is late by precisely 1.0 bars. This might cause some trouble when mixing hardware and software synths, but at least all your synced VST delay FX will work correctly. Also, remember that your hardware sequencer expects a loud and clean clock signal. Sync might drift if the signal is too weak, or the waveform might not be a true square wave anymore if you are distorting it. On my particular setup, I leave the clock output at maximum in my software and set my headphone amp to -10 dB; this has been the most stable for me. Last thing: my Keystep Pro is happy to obey whenever I send it a STOP signal, but currently it freaks out when I send it a START signal. That's why I only send it STOP and manually arm the sequencer before the next take. I'm trying to resolve this with Arturia support at the moment. Good luck! 😃
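For anyone wanting to try the approach described above without a dedicated sync plugin, here's a rough sketch (the 48 kHz rate, 2 PPQN, 5 ms pulse width and output level are arbitrary assumptions - check what your hardware's clock input actually expects) that renders a clock pulse train to a WAV you could drop on a track or load into a sampler:

```python
# Render a clock pulse train at a given BPM/PPQN to a mono 16-bit WAV.

import struct
import wave

SR = 48000
BPM = 120.0
PPQN = 2            # pulses per quarter note expected by the clock input
BARS = 8
PULSE_MS = 5.0

samples_per_pulse = SR * 60.0 / (BPM * PPQN)      # 12000 samples at 120 BPM
pulse_len = int(SR * PULSE_MS / 1000.0)           # 240 samples high per pulse
total_samples = int(samples_per_pulse * PPQN * 4 * BARS)

frames = bytearray()
for n in range(total_samples):
    high = (n % samples_per_pulse) < pulse_len
    frames += struct.pack("<h", 32000 if high else 0)   # 16-bit mono sample

with wave.open("clock_120bpm.wav", "wb") as wav:
    wav.setnchannels(1)
    wav.setsampwidth(2)
    wav.setframerate(SR)
    wav.writeframes(bytes(frames))
```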
I've been down this rabbit hole as well (see my latest video), and to get any sort of accuracy you need to record the MIDI signals themselves; otherwise you are also looking at the latency of the DAW or of the hardware that generates the sound... which MAY be near zero. And then we have to remember that MIDI just isn't particularly accurate. 😀
I went with the Innerclock Systems Sync-Gen. An amazing piece of gear at a cheaper price than the ERM - an audio clock generator with expanders to allow more outputs. It also has modular run and clock outs ;) I'll have to do some longer-scale tests.
@@RickyTinez it's locally made (for us Aussies 🤪) and has some big names using them. I'm still looking at something like the MRCC so that it's easier to change routing in the studio. From the computer to the MPC, the requirement is to do it as quickly as possible, with as little rewiring as possible ;p
You should do the same test with Logic. I used Logic for years and really only started having sync issues when I switched to Ableton. But to be fair, I also have issues when syncing hardware to my Multiclock in master mode. Ableton is great for in-the-box MIDI tasks, but it handles MIDI in and out of the box very poorly compared to other solutions. Maybe also try the same test with an MPC as master.
Ozone 10 and 11 hit my CPU hard, and I have an i9 with 24 cores if you include virtual cores - either way it's got a lot - plus 44 GB of RAM, and I don't get what's so taxing about Ozone presets. I know it's doing a lot of processing, but nothing else I've used since I got this CPU about 3 months ago hits it like that. Maybe that affects your test more negatively than other plugins would, and if you tried something else you might get different results? Ozone does sound damn good though, a big improvement from the first one. Hope that helps - at least it's something to consider. Take 'er easy, player.
This is why I work in the box. I can't understand the world of electronic music being constantly out of phase. Musicians spend a lifetime fine-tuning their groove and timing; then you get a drum machine and try incorporating it, only for it to be literally out of the pocket. Playing an MPC in standalone feels nice - they have it so dialed in that it feels spot on playing the pads. Once a computer is involved, or any syncing of clocks between devices, things are shot; the groove is gone. I find it easier to just play a plugin inside the DAW.
Hey! I use an ESI M4U eX connected via USB to my Push standalone to sync my different machines. It's possible to manage the latency of the M4U eX on the Push standalone. This might seem like a weird question, but if I upgraded to one of the machines you present, could I achieve more stability with my MIDI sync?
Hmmmm, ok, so I have to assume the preset changing in Ozone was a bit more than just CPU spiking as the culprit for your MIDI sync falling off with either device. I experimented with switching through presets in Ozone like you were doing, and each preset has a different total sample latency, with a pretty huge variance between them. It looked like you had latency compensation enabled while running the Ozone preset-swapping tests, so it's most likely that Live having to reorient itself to the new maximum latency after each preset switch was causing the audio buffer to stumble. The MIDI clock running in the background was probably keeping fairly steady, but because the audio buffer stumbled around it, it wound up falling out of sync with Live's timeline.
Please check what happens when you use the Multiclock as the clock source rather than having it triggered by Ableton. It could be that the root cause of all the issues is Ableton and not the devices.
Hello Ricky, I think the problem comes from Live 12 - a sampling/stability problem or something else. Many users have noticed the same problems as you, including me: Live 12 systematically drops out when you open a plugin, load presets, or open the internet or another application. It's a recurring problem. I have the ERM Multiclock and I have exactly the same audio dropout problem, and I think the ERM and the MRCC are not the cause.
Maybe try testing in another DAW? Ableton has given me out-of-sync issues before! Also, external digital devices like your Digitakt, or any other digital device, have their own MIDI latencies. It would be worth testing MIDI tracks from DAW to DAW and seeing the results.
I wonder how much of that jitter can be attributed to floating-point precision issues. Any time I see small decimal jumps in computing, float precision is usually the culprit.
I bought the Midronome and put it up against a friend's Multiclock, and basically ended up with a similar set of stems :D I don't want to leave the comfort of Ableton, but surely something like Pro Tools handles latency better than my favorite DAW... right?
Why the solution is to use Ableton as a slave and avoid using its MIDI clock to sync machines:
The main problem with synchronization tests using a computer's MIDI clock is that jitter (variation in clock precision) is considerably higher compared to dedicated clocks. This is due to how operating systems handle USB MIDI communication, where data is processed in a more complex and parallel manner, introducing variable latencies and fluctuations that affect tempo precision. In contrast, traditional MIDI works in a simpler and more direct way, which contributes to greater stability. Here are some key reasons why the computer's MIDI clock is not ideal and why Ableton should be used as a slave instead of a master:
- Jitter in USB MIDI: when using the MIDI clock of a DAW like Ableton, even with a low load, the typical jitter is in the range of 2-5 ms. As the project load increases, this jitter can reach 5-10 ms or more, making the clock inconsistent and affecting the groove and synchronization. For example, if you change presets or introduce effects in the mastering chain, latency spikes can occur, causing rhythmic elements like the kick to fall out of sync, resulting in offsets of up to a quarter of a beat or more.
- Comparison with dedicated clocks: devices like the E-RM multiclock offer extremely low jitter, of only ±1 sample when synced with audio at 48 kHz, which is similar to what the E-RM midiclock⁺ offers. These devices guarantee precise synchronization without the fluctuations that often affect USB MIDI. Additionally, since the clock is external, it is not affected by latency generated by the computer's internal processing, always maintaining a stable tempo.
- More precise hardware machines: well-designed equipment, such as Elektron boxes, handles MIDI clock much better than a DAW, with jitter in the range of 1-2 ms, which is already considerably more precise than a computer under load.
- Latency accumulation in MIDI chains: if you have multiple devices connected in series, latency accumulates linearly, which can negatively affect synchronization. It is therefore recommended not to connect more than 2 or 3 devices in series to avoid timing issues.
Conclusion: if you are looking for precise and stable synchronization, it is essential to use a dedicated device like the E-RM multiclock as the clock master and avoid relying on the computer's MIDI clock. USB MIDI latency and jitter will negatively affect your timing, while a dedicated clock will ensure solid, fluctuation-free synchronization. Using Ableton as a slave and an external clock as the master is the best option to maintain perfect synchronization in your setup.
Guess we need to move the recording onto a separate medium, or a second laptop - its only function being to record, giving it a single job rather than doing everything.
Did the preset change in Ozone maybe change the latency of the plugin, so that Live had to recalculate and reapply its delay compensation across the chain? I suspect a mastering plugin can do that, while an instrument plugin probably doesn't in most cases.
Ok, now do the same test, but comparing humans playing instruments. Reckon there’d be a fair bit more variation. 😜 That push/pull of a groove is a large part of what makes music magical, right? I get the sample-level phasing issue, but how often are you likely to encounter that in real-world use?
About your outcome with the Multiclock being so similar: what if this jitter was an issue back when the Multiclock came out, but computers have improved so much that a clock generated by the DAW and transmitted via USB is now similarly stable? You made me curious - I will do a similar test. I have no MRCC, but I can try clocking the DT via the Blokas Midihub and then via the Multiclock. So far I can confirm the Multiclock, in conjunction with the DT II, generates around 1 ms of jitter. Not sure what the source is. Without external clock the DT II also jitters, but a lot less - around 0.25 ms. When I clock the OT MkII with the Multiclock, it jitters less than the DT II... the maximum offset I found was 0.6 ms, and the median was more like 0.3 ms. I'll have to test this a bit more extensively.
Also, you're fighting the CPU of your Mac, so you're causing the individual device's CPU to get flooded with late sync messages - and how well a device can handle that depends on its hardware, so cost is going to be a fair indicator of capability here. I bet both would be great if you had a UAD accelerator running your plugins, so that there were limited or no CPU spikes when dealing with VSTs.
That's what I was thinking too, I wonder how much of a difference it would make if the ERM was primary clock and everything else including Ableton Live followed it
When MIDI was first introduced at whateverthefuck music fair, it failed publicly to perform the sync it was meant to do, and was branded MUDI by some press - MUDI being short for: Musically Unusable Digital Interface. It's the same standard still. And the "reserved for the future" parts of the concept are still somewhere in the future to be revealed, and it's unlikely there is a distant enough future in which they will solve shit.
It seems to me that the Multiclock has spoiled you and now you want ideal microscopic accuracy) you need to remember the time when you didn't have a Multiclock)))) these are all jokes, of course. I wonder if you have tried recording the same track from an electronic device, but without synchronization, just with its internal tempo generator? It seems to me that the result will be similar. Maybe these are permissible errors in the hardware, which make the music not so smooth, with a certain amount of error and "humanize"? Personally, I bought a Multiclock about 5 months ago and I am incredibly happy. It has greatly simplified the jam process, and its post-editing, with a mountain of my own hardware. Still, this is much, much better than its absence. Plus rich possibilities for MIDI routing, what could be cooler) In any case, thanks for what you do!
When I really started to dial in the ERM multiclock that I spent so much time and money on….I sadly started feeling the same way about it. Am I still happy I own one? yes. Knowing what I know now would I have invested in one when I did….. definitely not lol
it is almost like the DAW should be the clock for the hardware. at least everything should be off together. Also, there is no magic bullet. Latency is everywhere.
Now what happens when you start changing the buffer size after everything is set up? I may be tripping, but once I got the Multiclock set up, everything stayed locked pretty dang tight even when changing the buffer size.
TL;DR: I don't think under 1 ms matters unless you're trying to hit 2 of the same sounds at the same time. Ask "does it sound good?" Good! Now hit record ❤
Audio sync should be more stable because DAWS are focused on their audio engine before anything else including midi thus if you have it setup right it should lock to what ever the audio is doing. That said once you are ready to record I'd still disable all plugins on that track while recording, IU know that sucks if you like to jam your arrangements live
@@X-101 If you disable a plugin in Ableton Live it keeps the latency. You have to delete it. I'm using v11, don't know if it changed with v12.
If you're layering the same digital sounds with relatively minimal processing on one, then a few samples off can make a difference. But otherwise, I agree. If you can hear a flam, and don't want it, try a nudge... or only use one of the sounds for the attack part of your stack. That's a classic solution - one layer for the stack's attack, and use the other layers for body, thickness, decay, etc.
Also, people might be surprised how sloppy MIDI jitter could be on slaved vintage sound sources - nore than 1ms. And people made some GREAT music with that jitter present.
Yea the MPC60 sequencer has some slop in it but it gives it the life
Thanks for that awesome video Ricky, nice come back from the last one :))
This is a complex subject, but let me give my professional opinion.
* What you are comparing is (A) the MIDI clock generated by Ableton and (B) the clock/pulses generated by the multiclock VST plugin.
-> For now let's ignore any additional latency/jitter created by the multiclock device, by the playing device (digitakt), or by the audio interface (we can assume they are very low).
* I'm sorry to say, but the MRCC has nothing to do with any of this - it just forwards the clock generated by Ableton (and I'm sure it does it well)
* What we want is precision between (A) or (B) and the DAW timing, which is the timing the analog audio is being sampled by the audio interface
* Let's look at (A): on your computer, it runs in a different thread/priority than the audio stuff, so as soon as your CPU gets busy, it won't be called "in time" and that will create jitter and latency. On top of that it is transfered over USB-MIDI, which does not have the highest priority either - so also extra jitter if the USB bus is busy (lots of USB devices for example, big data transfers, for example)
* Let's look at (B): it runs in a VST plugin, which is processed not only real time with the other audio stuff, but more importantly the generated audio pulses are in perfect time with the other audio coming out of your DAW. So inherently the audio pulses you send to the multiclock are in perfect time with the other audio coming out of the DAW. Any jitter/latency using the setup (B) is therefore from the stuff we ignored, i.e. within the multiclock itself, or from the playing device, or from the audio interface.
Now will OSes and DAWs get better so that nobody will need this kind of complex setup anymore? Yes I bloody hope so, and U-SYNC is the first step towards it.
Simon
For those interested, Simon has made a cool little box(midi and audio Metronome, linked to your DAW),
I have it in my setup since a year and it is awsome - check his channel and support forum.
Simon's Midronome is an amazing MIDI clock and a great alternative to the aforementioned two.
Greetings from Macedonia
The Multiclock has a VST plugin!? And the plugin generates the clock? Then you are still relying on the PC, which isn't going to be as accurate as hardware can be.
Simon's Midronome may be awesome but it's not available anywhere. What is an alternative that can actually be purchased today, not in Oct 2024?
@@elitebeatagent8993 I guess the current batch is sold out.
Try contacting him. He is very responsive and kind.
And believe me - the Midronome is worth it!
This was PTSD of trying to get two kick drums to align across devices. The phasing in and out was an interesting resulting sound, but also soul crushing. It's stuff like this that drive people to the dark arts of eurorack drums.
it drove me to OSC syncing - it's wild, you can dependably slide in and out of phasing of sounds.
Hahaha! Yea in that case I can see the fetal position incoming while the phase gets louder and louder.. (or quieter 😳)
Or just nudge the track a few samples after you've recorded?
Plus there is latency on the devices you are trying to sync. It never ends. I found the best is just have pulse audio tracks in my DAW and output direct to my clock in on my sequencer/drum machine. No midi.
Device jitter, and small variable clock offset, is something that will always exist with the midi protocol. The midi message bytes are sent serially and received using a buffer which is then processed by the microprocessor in the device. That thing runs at its own clock rate. So a midi clock message will never be processed instantly when it arrives. The receiving device still has to figure out that it received a clock message and act accordingly.
In other words, the Multiclock can sort of transmit the clock message in sync with a sample, but nothing can interpret that with sample accuracy because of how the midi protocol works.
I think the accuracy might also get worse if you send additional midi messages like CC over the same port.
There is eventually a bottle neck somewhere right? So many tiny variables. I felt like I was trying to figure out a Manet painting by looking at the canvas stitching 😂
This is correct. Aftertouch on some USB devices can quickly overrun a MIDI serial buffer if it isn't properly throttled.
The only way to have a perfect sample accurate clock sync would be to base it on the word clock everywhere in the chain.
Every device in the chain would need to know the tempo changes at the audio rate level.
I believe every connected device would probably also have to report its own internal latency so that latency compensation for all chains could be tracked, because even what is perceived to be "no" latency on a digital synth is at least one sample of latency, and what that amounts to depends on the internal sample rate.
MIDI is very bad for syncing a clock since it is only based on a low-rate tick with no time codes; you could probably improve that a lot, but it would then no longer be compatible with MIDI clock.
Interesting, I hadn't figured it was a MIDI-protocol problem; that does make a lot of sense.
@@pleggli Interesting, I wonder if a unit speaking directly to the DAW through another protocol, functioning off something more like CV, might be a solution?
Hey Ricky, been watching your videos on the Multiclock that I also heavily use. Love your channel and your approach! The power of the Multiclock is being able to shift things in time with knobs until it "feels right" and then hitting record. To me it's priceless just for that! Every plugin you add, and even EQs being changed to linear phase etc., will affect the timing of everything including the Multiclock, as Ableton's internal latency will be affected. Just setting it once and expecting it to stay in sync is not achievable IMO. Often I check it's tight and then adjust by feel so the groove feels better than 100% on grid; I very often do this on bass synths and toms. Don't sell it, you will regret it!
💯
Could not agree more!
Here is my setup with the Multiclock in ableton that has been rock solid for me:
I turn off all external midi sync in ableton because you want the devices to get the sync from the multiclock and not ableton via usb. Then use an external audio effect on the channel you have your multiclock plugin or audio pulse running through to route the audio to the multiclock. I find this keeps the latency compensation better when running plugins.
You can then run one of your midi outs on the multiclock into the midi in on the MRCC to clock your outputs on the MRCC.
Interesting
Gotta go old school on sync; workstation is master clock, all midi and audio clips consolidated into long sections (especially in Live), use automation over patch changes, use lots of pre roll, render or dump all midi parts to audio and immediately edit for sync. Abandon all illusions about MIDI - only audio recordings are ‘real’ and in sync. Put video on separate MIDI TC slave system. Finally, use tempo sync’d delays as a reference only. Once you know what you want delays to do, set times manually or with automation and let them free run. I hear that recording the cowbell from an 808 onto track 23 of a Studer ATR can drive the arpeggiator of a Juno 101 pretty well…
I highly appreciate your further research (and extensive video production) on this topic. Let's hear it from the real pros..
Thank you, Ricky, for the information you provided. I have also been researching for a few years how to optimally synchronize outboard devices with the computer, and I have come to the realization that latency cannot be completely "overcome," but different approaches can be used, as you mentioned as well. In your specific case, I believe the drift is caused by the computer itself and not by the Multiclock, because software record input monitoring is enabled. For an optimal "latency-free" recording and overdub experience, I would recommend an outboard mixer with direct monitoring on each channel, to avoid internal processing as much as possible. Regarding browsing presets while using the Multiclock, I have sorted it out by installing VSL Server on the computer and offloading all my plugins from the DAW process, thus eliminating any processing that could affect the outgoing clock from the DAW, and I can say that it works really well. Zero drift and phase issues with live tracking and overdub recording. Best regards, Robert Trifunovic
I've been using Ableton Live since version four and I will tell you that you have undertaken a fool's quest. Ableton and external hardware synchronization has always been an epic nightmare. I gave up on caring and I use Ableton as basically either all software in the box, or controlling hardware out of the box, but as soon as I try to get the two to work together and line up, everything goes to shit. The only answer for me has been to record and fix timing after.
It's not all Ableton's fault, either. It depends on what you're synchronizing to it, and how that device handles clock changes. MIDI, also being a serial protocol, does not help. Regardless of what your sequencer's internal timing may be, MIDI clock is limited to 24 pulses per quarter note of timing resolution, which was great in 1982, but just doesn't work today. MIDI 2.0 can't come fast enough.
Yea I had a feeling this is the case. 😂 I have tried other DAW‘s in the past, and one thing that always stuck out to me was how everything “just worked“ when it came to synchronization. I felt that both Bitwig and Logic gave me a more stable clocking experience. But I never really tested anything, or questioned it further as I did today haha
@@RickyTinez you should test that then, and see if your feelings have any basis in reality
The best sync I've gotten so far is from the Midronome. Also not too expensive. I have it providing sync to everything through an MRCC.
Yeah, MIDI 2.0’s timestamps is the solution to this. After that, everything should be great.
Interesting analysis of MIDI clock with the two devices and your DAW. It's hard to imagine that a half millisecond would be noticed. With MIDI keyboard latency, somewhere between 5-10 milliseconds is where I start to notice a lag. That said, if you take a song at 120 BPM, you get 500 milliseconds per quarter note (two notes per second). I use Logic, which has a default MIDI PPQ resolution of 960 (I believe Ableton is the same?). So for each quarter note occurring every 500 milliseconds, there are 960 sub-divisions the DAW is using to determine accurate MIDI clock timing. 0.500/960 = 0.000521 seconds per MIDI clock sub-division, or roughly a half millisecond. So the actual (DAW) clock data used to determine MIDI timing is only accurate, best case, to a half millisecond (at 120 BPM -- clearly this will change based on the song's tempo). It doesn't seem to me you are going to be more accurate than the clock frequency. And as you demonstrated, when the CPU in the DAW is heavily loaded, other CPU threads may delay the processing of the DAW/MIDI clock. I like your comment about missing the forest by focusing on the trees -- this analysis is great, but I suspect if one doesn't heavily load the DAW's CPU while recording with MIDI clock, any minor variance of plus or minus half a millisecond would be small enough not to be noticed. BTW, I ended up getting an MRCC 880 after seeing your video last week -- didn't realize this device was available. Really a nice MIDI router/merger/splitter -- thx for the tip!
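To put numbers on that, here's a quick Python sketch of the same back-of-the-envelope calculation (the 960 PPQ figure is the commenter's assumption about the DAW's resolution, not something verified here):

```python
# Rough sanity check of the sequencer-resolution math above.
def tick_interval_ms(bpm: float, ppq: int) -> float:
    """Duration of one sequencer tick in milliseconds."""
    quarter_note_ms = 60_000.0 / bpm   # 120 BPM -> 500 ms per quarter note
    return quarter_note_ms / ppq       # 500 ms / 960 ticks ~= 0.52 ms

if __name__ == "__main__":
    for bpm in (90, 120, 174):
        print(f"{bpm} BPM at 960 PPQ: {tick_interval_ms(bpm, 960):.3f} ms per tick")
```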
I'm a bit late to the comment party, but as many other commenters have now suggested, there are a LOT of different choke points on both the computer running the DAW and all of the gear that you're plugging into. Trying to track down where these transient issues are happening on any given take is likely the path to insanity (although arguably still easier than troubleshooting a complex HDMI audio setup). Some concepts that might help provide a bit more context... first, even fast modern CPUs are still usually running one instruction at a time, which means the processor is rapidly jumping between dozens of tasks. It's not just your running programs but all of the OS services, and everything done over USB (audio interface included) requires a not-insignificant amount of processor overhead. Processing streams while individual apps and plugins are all greedy for high priority slices means that the more you listen for audio artifacts, the more you will absolutely hear them. This might actually be one of the most understated advantages DAWless setups have; the processors might be far less powerful, but they are not running a multitasking operating system.
The second thing to keep in mind is that MIDI clock itself is a very janky thing. There's no concept of syncing to an atomic clock, and the speed a processor runs is directly influenced by the voltage it's receiving unless there's a clock running; does your MIDI gear have clock batteries that need to be replaced? Mine doesn't. So, everyone is doing their best but nobody actually can agree on how quickly time is flowing. When I was first learning the MIDI protocol as a developer, I was shocked to learn that there is no tempo message. Instead, you send timing "ticks" down the line at a rate of 24 messages per quarter note. That's right... time is defined in real-time based on the speed at which those ticks arrive. Your MIDI synth is just a hungry hippo for ticks. If you stop sending ticks, the music just stops. To me, this is wild. This is also why you see the tempo occasionally jump up or down by 0.1 when you assume it should be stable. What you're seeing is ticks arriving too quickly or too slowly, or not arriving at all.
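A minimal sketch of that "hungry hippo" behaviour, assuming a receiver that derives its displayed tempo purely from how fast the 24-per-quarter-note ticks arrive and averages the last few intervals; the jitter figure below is invented for illustration:

```python
import random

PPQN = 24  # MIDI beat clock: 24 ticks per quarter note

def tempo_from_intervals(intervals_s):
    """Average the recent tick intervals and convert to BPM."""
    avg = sum(intervals_s) / len(intervals_s)
    return 60.0 / (avg * PPQN)

if __name__ == "__main__":
    bpm = 120.0
    ideal = 60.0 / (bpm * PPQN)  # ~20.8 ms between ticks at 120 BPM
    random.seed(1)
    # Simulate two beats of ticks arriving with up to +/-0.1 ms of jitter.
    intervals = [ideal + random.uniform(-0.0001, 0.0001) for _ in range(PPQN * 2)]
    window = 8  # the receiver only ever sees a handful of recent intervals
    for i in range(window, len(intervals) + 1):
        print(f"displayed tempo: {tempo_from_intervals(intervals[i - window:i]):.1f} BPM")
```

Run it and the displayed value flickers by about a tenth of a BPM, the same sort of wobble people see on hardware tempo displays.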
Now, the other elephant in the room is that MIDI works at 31.25k baud, and the timing messages are not given special priority. So, if you have a lot going on, that bandwidth can get eaten up. That's an area where the MRCC can really help, by defining subsets of devices to talk to each other. Still, just because MIDI can support a given speed under ideal conditions doesn't mean that the client device is powerful enough to receive them that quickly. Right now I am talking with the Hologram folks because Chroma Console literally shuts off if you send it too many messages too quickly. Which is not amazing, but this is the world we're in.
I'm not intimately familiar with the audio clock but if there's software running on your laptop to interpret what it's sending, then all you're really doing is shifting the point of failure from MIDI to your CPU's ability to prioritize messages from your audio clock.
TL;DR: I wouldn't pay extra for the audio sync thing, and I wouldn't do CPU intensive things while recording. And yes, if it sounds good...
When you're tinkering with Ozone, the latency of the plugin gets altered, which results in a tiny audio dropout from Ableton. For sure the Multiclock then gets a hiccup...
Yeah i think DAW latency compensation changes might be significant enough here to outweigh device inaccuracies.
Exactly what I was thinking.
Yep. I have no idea why somebody would use a heavy CPU mastering plug-in during a recording in Ableton.
@@RJ1J Agreed, during recording I'd turn off plugins or use the DAW's delay compensation. Not sure if Ableton has delay compensation or not, or if it's any good.
Another great video Ricky! I got the MRCC after your last video and it’s amazing to have such a flexible MIDI patch bay and especially have the four virtual MIDI in and outs. Game changer for me.
But about MIDI jitter - I genuinely believe this to be an ableton problem. I’ve done something similar to this in Logic and it was bang on but yeah, Ableton *always* did weird stuff. Not even just clock, but even just MIDI sequencing notes.
Funny thing however is that Elektron Overbridge seems to always be locked-in for me regardless of DAW. Perhaps you could talk to some people and start building some ideas for an Elektron sync box? 😅
Also I just realized that in this video you’re recording with Monitoring set to Auto and not Off, without trying adjusting the Keep Latency settings. 🤔
There’s just *way* too much stuff to keep track of when recording into Ableton!
The main issue with those sub-10ms changes is 2-fold:
1. Phase issues. If you are somehow trying to double up the same sound, which is not especially likely in this context, the phase can be totally off and thin things out.
2. More importantly, we CAN hear the substantive difference in transient response if, say, 2 percussive elements are clocked separately and continuously overlapping; you can have a subtle shifting of where a really, really tight flam is happening.
In a majority of cases I don't think this will really cause an issue, and in some it could even be pleasant, but in certain situations it could certainly be heard, and in some it might be undesirable. I think the worst-case scenario would be DJing; if you were to have similar parts overlapping poorly, it could sound really shit.
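For anyone who wants point 1 in numbers: two copies of the same sound offset by a constant amount behave like a comb filter, with the first complete cancellation around 1/(2 x offset). A rough back-of-the-envelope sketch (not from the video):

```python
# First frequency fully cancelled (180 degrees out of phase) for a given time offset.
# Nulls then repeat at odd multiples of that frequency (classic comb filtering).
def first_null_hz(offset_ms: float) -> float:
    return 1.0 / (2.0 * offset_ms / 1000.0)

if __name__ == "__main__":
    for offset_ms in (0.1, 0.5, 1.0, 5.0):
        print(f"{offset_ms} ms offset -> first cancellation around {first_null_hz(offset_ms):,.0f} Hz")
```

So even a half-millisecond offset puts the first notch right at 1 kHz, well inside the range where a doubled kick or snare gets audibly thinner.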
Oh yea ive heard that, putting the same tone or cowbell on 2 machines and they are never super dead on. You hear the phase for sure.
Would this happen if it was machine to machine midi? I guess it might depend on the machine right? Haha
@@RickyTinez It would prolly depend on a few different things, but most of all on whatever hardware/software you were using. There's stuff like JACK Audio or a few other similar solutions, too, which just go software-to-software; you could probably test something like that and it'd give you an idea whether it's common inside a PC, and if it is, it'd prolly come up computer-to-computer as well. I somewhat suspect all you'd deal with is latency though, rather than jitter. Oh, and I'm realizing just now I might've misunderstood - if you mean between 2 MIDI hardware units, I absolutely think it would come up; my guess is it has to do with subtle analogue jitter. Most all clocks of the time-telling variety have jitter, that's why we've got the atomic clock, but even it has subtle jitter I'm told. Generally, in the digital realm, my understanding is it's all 1s and 0s and the jitter doesn't really happen usually; the electricity coming in is all stabilized by the power supply to facilitate the extremely detailed operations, so that every 1 is a 1 and every 0 a 0, nothing in between. Then again, if I was all right, the atomic clock would be digital LOL, so I gotta be missing something. TLDR; Clocks are actually gnarly AF 🤣
I had the ERM Multiclock plus a MOTU 128. After years of sync issues, and testing over many years between Ableton, Cubase and Bitwig, I came to the conclusion that Ableton was by far the worst for sync issues. So I sold Ableton Live and the ERM and a load of older hardware gear. I will do most of my live hardware compositions in Bitwig. I've never been happier. ;o)
Most desktop computer operating systems are technically not real-time OSes so they will always be affected by CPU. A real-time OS (RTOS) is designed to work within hard constraints and so can guarantee specific response times. I think a dedicated hardware clock with a RTOS would be best but I dont know which of the hardware devices out there are designed that way. Someone mentioned Midronome but I've never used it.
I think Ensoniq keyboards' MIDI jitter and timing were better than almost everything available, and that was in the late 80s / early 90s, so it must have been because processing MIDI and the synth engine was their core function.
Most MIDI devices like MRCC use simple single core microcontrollers and are bare metal programmed. That is, there is no operating system. However, some higher performance MCUs can run an RTOS which might be helpful if there are a lot of tasks to juggle.
@@rgeraldc80s In the 80s, DAWs did not exist. At best you had things like Steinberg Pro 24 sequencing software running on say, an Atari ST.
This stuff really gets me… we have such ridiculously powerful computers these days, but we’re still struggling with audio and midi latency.
The response times could easily be achieved on modern OS’s if they just got out of their own damn way for real time processing
Something like Linux with a realtime kernel can do things on dedicated deadlines but it still won’t make latency super-low. ASICs or FPGAs will always win for stuff like that where microsecond-level timing is required. Once you’re looping through an interface or general-purpose OS for tight timing, forget it unless it’s simply an output and all the instruments are in-the-box. RME gear in general has great latency for tracking and stuff but if you really want phase-accurate you need to quantize or manually align stuff. Hybrid kinda sucks because of this.
Again he didn’t test multiple external sequencers. When I’ve used a usb midi hub with one piece of gear I could get it and Ableton synced after a little bit of tweaking. You would think that now I can do the same process with other gear and then I can run 3-4 external sequencers and they would be synced. But, it always fell apart. Now I use the multiclock and at the moment I have 5 pieces of gear all running their internal sequencers locked to Ableton. When I first got the multiclock I would also measure the latency and adjust it that way to get it super tight. Now if I add a new piece of gear I just dial it in by ear. Sometimes a few milliseconds here and there get those separate sequences grooving together. Plus I could always nudge later in Ableton, if need be, after I’m done jamming and recording.
I own a Multiclock and I did jitter tests using different audio interfaces: a Focusrite 18i20 and two RMEs (a Babyface Pro FS and a UCX II). This is where I found the RMEs differ from the others, being super stable with audio jitter, so the Multiclock performed very well. Now I generally stay away from any other activities while recording, keeping another dedicated PC for junk.
Thanks for going to all the trouble to clearly and methodically document everything! I have been doing something similar, but I'm so much less familiar with Ableton, so I can't pin down how much of the variation is something I changed on one track without realizing it.
Hey,
thanks for all your effort on this interesting research. Without knowing for sure, I think the Multiclock is so popular because it works great with older drum machines or synths that have built-in sequencers. That's where it really shows off its strengths, imo. For instance, I like to use the audio out on Channel 1 of the ERM to trigger the SH-101 or the JX-3P. Of course, that only makes sense if you're into those old-school internal sequencers. Plus, the built-in swing in the ERM is pretty nice in this context.
Digitakt is always slightly off on the first bar when sending or receiving MIDI. I'd love to see a test comparing audio over USB (Overbridge & class-compliant) vs over a solid interface like RME.
So many years and this is still a problem…
With Digitakt or Ableton or just jitter?
It really got me that, when MIDI-slaved to an MPC, my first PC used to display a wandering tempo, or 89.9 instead of 90 lol
Things haven't improved with all the d/a stuff in our rigs too
@@johansebastianfauno5526 PC computers and notebooks aren't ideal for electronic music production primarily due to latency issues and system interrupts. These devices were designed for general-purpose computing rather than real-time audio processing, which requires consistent, uninterrupted performance.
1. Latency and jitter: Standard computers often struggle with maintaining low and consistent latency, which is crucial for real-time audio processing. This can lead to noticeable delays or irregularities in sound output.
2. System interrupts: Modern operating systems constantly manage various tasks and processes, causing frequent interruptions that can disrupt the smooth flow of audio data. These interrupts can result in audio glitches, dropouts, or buffer underruns.
3. Non-real-time operating systems: Most consumer PCs run on operating systems not optimized for real-time processes, making it challenging to guarantee consistent performance for audio applications.
4. Hardware limitations: Standard PC hardware, including sound cards and drivers, may not be designed to handle the demands of professional audio production, leading to suboptimal performance.
This is why dedicated hardware gear remains popular in electronic music production. These devices often feature:
1. Purpose-built chip architectures: Designed specifically for audio processing, ensuring low latency and high reliability.
2. Real-time operating systems: Optimized for consistent performance and minimal interruptions.
3. Dedicated DSP (Digital Signal Processing) chips: Specialized processors that can handle complex audio calculations more efficiently than general-purpose CPUs.
4. Guaranteed "quality of service": Hardware units can provide consistent performance without the variability found in multi-purpose computers.
The same principle applies to other fields requiring real-time processing, such as telecommunications. Cell phone towers, for instance, use dedicated hardware rather than general-purpose processors like Intel CPUs. This ensures reliable, low-latency communication essential for maintaining call quality and network stability.
While modern PCs have improved significantly and can be optimized for audio production, dedicated hardware often remains the go-to solution for professional-grade electronic music production due to its reliability, consistency, and purpose-built design.
I bought a monitor that negates any jitter or timing farts I see so it’s not a problem anymore.
So let’s invent a perfect timing box to go along with pitch correction bullshit. Then let’s get a microscope and look at your dead brain cells.
Like the last one, it’s hard to take this video in good faith and not as cynical clickbait but you’ve always seemed like a good dude so I’m giving the benefit of the doubt and just assuming you have a bunch of things twisted.
1. You’re not comparing the ERM to the MRCC because the MRCC doesn’t do anything to the clock other than route it. You’re comparing the ERM to your computer’s clock with the MRCC as a dongle between your computer and the outside world. This is an apples to oranges comparison.
2. Every receiving device behaves differently so measuring what your receiving device is doing is immaterial. Some devices are notorious for syncing very poorly no matter how perfect the clock coming at them is. A more thoughtful test would be measuring the actual clock data against an objective measurement device like a frequency counter or equivalent to see what’s happening at the clock level.
3. Buddy, why are you running a mastering chain in this scenario? Come on, guy!
4. Ableton has always been notorious for being bad and weird with MIDI timing. Run the same test in other DAWs and you’ll likely see radically different results. It’s a shame but it’s true!
I use a USAMO to get around this all, BTW, and it’s generally rock solid.
When you load different presets in Ozone, the PDC (plugin delay compensation) amount in samples changes. This will impact the audio engine and the position of the audio recording.
MIDI is a serial command protocol. MIDI runs at a speed of 31250 bits per second, which means roughly 3125 bytes per second. A MIDI clock message is 1 byte, and the time for one byte is 0.32 ms. So it's normal to have a fluctuation of around 0.16 ms even if no other MIDI data is sent. This can be bigger if more MIDI data is sent, like active sensing, CC controller data or song position.
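That byte timing is easy to verify - a quick sketch, assuming the standard 10-bit serial frame (start bit, 8 data bits, stop bit) that DIN MIDI uses on the wire:

```python
# DIN MIDI runs at 31250 baud; each byte is framed as 10 bits on the wire.
BAUD = 31250
BITS_PER_BYTE = 10  # 1 start + 8 data + 1 stop

byte_ms = 1000.0 * BITS_PER_BYTE / BAUD  # ~0.32 ms per byte
print(f"one MIDI byte: {byte_ms:.2f} ms")

# A clock tick (0xF8) is a single byte; a note-on is typically 3 bytes.
# If a tick has to queue behind a note-on, it arrives late by up to:
print(f"tick queued behind a note-on: up to {3 * byte_ms:.2f} ms extra delay")
```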
Thank you for this video, I love this kind of stuff!
Maybe there's a way to do some sort of a shoot out between different options?
- Expert Sleepers Usamo and Silent way units
- CLOCKstep: multi
- Innerclock Systems,
- ACME-4
- Circuit Happy Missing Link. Because how stable is Ableton Link?
- The m4l clock devices?
So many options but nothing fixes the issues.
I just bought a used ERM Midiclock (which, at €130, is a lot less pricey than the Multiclock). I had a problem with a sequencer providing the clock (a Polyend Seq with the bpm jumping anywhere between 169 and 183 instead of sitting at 176) and with ANOTHER sequencer (Squarp Hapax) being the slave but producing an involuntary MIDI loop even though every Thru setting was OFF. So in my case, the Midiclock allowed me to physically unlink my sequencer cable mess ... with a tiny little device right next to my keyboard. I do have a MIO XL MIDI router (with all MIDI connectors at the back, mind you, I do not want a cable mess on my table), but it's hidden somewhere in my rack and I wouldn't want to crawl into my rackspace every time I need to start/stop the clock. So for me, an external ERM Midiclock was indeed the best option. And everything connected now shows exactly 176 bpm, not some random number between 169 and 183.
About to watch the video. What I found using a bunch of Elektron gear is that I would always get bad jitter.. using the Multiclock fixed this for me.
The MRCC is known to have high lag and jitter for a device of its kind. My clock is an audio pulse out from the DAW, through an Erica Synths CV-to-MIDI clock converter; this is simple and sample accurate. When just using hardware, my MPC 4000 is the clock, and it is almost as tight as a dedicated sync box.
Brilliant video Ricky and a very interesting subject. Of course, now you'll have to do it all again Vs Ableton Link 🤣🤣🤣
Latency compensation values reported by plugins are not set in stone; they can vary depending on what is configured within the plugin. One obvious case where this behaviour is noticeable is with the introduction of a limiter plugin, which was exactly the case here when you started switching between plugin settings in Ozone, as I clearly saw that a Limiter module was present in some of the presets you browsed through. I believe in Ableton there's a mouse-over mechanism that lets you see the exact reported latency compensation value at a given time. I know, for example, that if you take FabFilter Pro-L 2 and switch between the limiter modes, say between "Modern" and "Transparent", there's a very significant latency variation involved.
I didn't want to make my original comment too long, but clearly you have to be very careful with the conclusions you come to with the kind of test you did in the video, because there are a lot of variables involved in the complex task of handling latency compensation in different use cases. In this case it appears evident to me that the issue you've experienced is directly related to introducing Ozone and effectively altering its latency compensation value in real time while recording, as you were switching presets. This was inevitably bound to cause issues, but the good news is there's almost certainly nothing wrong with your MIDI devices; you simply have to be mindful of what tools you choose to ensure "compatibility" for a given use case.
That was my suspicion too. That the Ozone plugin has some presets with a look-ahead latency. That latency is reported to the DAW pdc. Changing presets causes the PDC to recalculate.
Great video, thanks Ricky. FWIW when I’m sending clock from my Digitakt to my Syntakt over MIDI DIN without a computer in sight, I can see the tempo on the Syntakt fluctuate by +/- 0.1 bpm throughout. Same with the Digitone. I think there’s inherent jitter in MIDI clock at the hardware level, quite apart from when a DAW is involved. But… it doesn’t matter because jitter that’s measured in around 1ms or so isn’t going to make an audible difference. And in reality MIDI clock sync on Elektron boxes is tighter than a camel’s arse in a sandstorm. If the MRCC is getting jitter via USB from a DAW down into the 1ms range then it’s going to be almost as solid as Elektron gear at the hardware level as a clock source. That’s good news because MRCC 880 is already on my shopping list and this test confirms it’s a good buy.
Ricky, I really appreciate these videos, on DAW/Hardware sync. I have an RK-008 and I’ve adjusted latency per output for all of my drum machines and sequencers. I’ve gotten them all really good sounding/looking/grooving. The MAIN point to drive home first and foremost, is setting the ableton monitoring to be Off. So I assume you’re direct monitoring through your interface? I’m having great results but it’s frustrating when you’re using hardware sends (Ext Inst) in ableton.
Side note: Renoise, to my ears, has the best sync to hardware. I'm convinced of it.
I'm curious how the Akai MPC compares. Back in the day the MPC was famous for its internal MIDI timing. There was this test of using all of its voices to trigger the same rimshot at the same moment. It still sounded like a flanged rimshot, but way, way better than any MIDI sequencer in existence at the time….
Kind of curious how the ATARI ST compared to old Akai MPC 60 at sequencing in a head to head comparison.
@@Aristoper the MPC had its own internal sound engine, the Atari ST did not. MIDI is a too-slow serial data interface, which gives sloppy timing as soon as you pump a lot of data through it (like a whole song, or a lot of pitchbend data). The MPC used to have 4 separate MIDI outs, so you could use one MIDI out exclusively for external drums. The Atari also had MIDI extension boxes with multiple outs.
I think you're seeing two different things:
1. Regular old MIDI jitter, which can vary in severity based on hardware and DAW: USB vs. MIDI DIN, IO drivers, MIDI drivers, etc. I've done extensive tests and some DAWs do better than others. MIDI DIN seems to be better than USB.
2. On the Ozone tests, you are probably flipping through presets with varying amounts of lookahead. When you instantiate a plugin or preset with lookahead, it's going to momentarily throw Live's plugin delay compensation (PDC) out of whack.
A couple of comments:
1) Jitter happens when midi beat clock events are not sent at precise intervals. Latency is irrelevant for the issue of midi beat clock jitter because there is no real-time response to midi clock events. In other words, gear that receives midi beat clock will not do anything immediately in response to midi clock. Instead, those events will be observed and their intervals averaged over time to calculate a tempo.
2) What you've tested here is mostly the ability of Live to deliver a jitter-free clock. The proof of that is the varying tempo in the Digitakt. That's why you didn't see any difference in the jitter between the two devices. I can tell you from experience that other DAWs generally do a better job of this than Live.
3) Changing Ozone presets will change the latency compensation in Live, so you should expect that will cause glitches when doing that in real-time. CPU usage, either via plugins in Live or with other apps running, is not the issue
Interesting. What audio interface and connection method did you use? I don't have any sync issues with the Multiclock via a MOTU 16A using TB2. Did you try with any other outboard modules besides the Digitakt? Did the audio pulses going to the Multiclock stay in sync?
Thank you Ricky, that was another video in a long history of helpful videos. Thank you for the time and effort you put into it.
(The rest of this comment is probably not helpful but...)
I understand people want to eliminate latency in all its variants but be aware of the return on effort / diminishing returns / system variance. Depending on the gear I have active, the systems I'm syncing and the configuration of the system in total, if I can get a latency of under 10ms round-trip from hitting a control/keyboard/pad and hear the sound, I'm good.
Can I get it lower by tweaking? Sure. Do I care? Not really. The bugaboos like phase cancellation are noticeable, and I deal with them using my ears if they are bad enough to matter.
I'll spend the time on what I care about. If (for you) it's jitter and latency, that's cool, but I'll be making music more often than not.
Have fun!
To me sub-1ms jitter is huge if you're recording more than 1 MIDI drum instrument at once, or trying to record a MIDI drum synth alongside VSTis: phase issues, transient issues, etc. I also think a test with a DAW that isn't notorious for terrible timing like Ableton would be better, as well as testing with multiple different MIDI devices. It also depends on what type of MIDI we are testing, whether it's clock or just note data. A good MIDI interface will take into account the buffering in USB and timestamp the MIDI, sending it a little early and then releasing it down the DIN cable at the right time. The other thing to consider is the resolution of MIDI when dealing with clock. There are a lot of factors to test here before coming up with anything meaningful on the results side. And yes, what device you are sending MIDI to does matter!
Maybe the Multiclock has a much more precise internal clock? I would assume so, because with audio sync it's still a clock generated by that same CPU; only the audio output may or may not have lower latency itself. I would assume the internal clock on the Multiclock, compared to ANYTHING, is dead on.
Agreed, if clocking from the MIDI hardware, multiclock or sequencer or whatever, all of the hardware will be in sync as the port to port latency and jitter of a MIDI router is negligible. It’s the computer that monkeys it up, and needs the audio clock to compensate. This would be a good thing to test.
@@DarrylMcGee I totally agree, I was writing some code for midi sequencing and it's surprising how easy it is to get unstable clock times on an OS where the cpu is doing a billion other things already, in fact I think he should try clocking the DAW from the multiclock
Hey Ricky, great video as always. If I am not mistaken, at around @9:06 you made a comment about 20% of a millisecond being 1/20 of a millisecond. That is wrong. 20%=20/100=1/5. That’s way bigger than 1/20. Thinking in percentages 1/20 is 5%.
Nice followup; there IS a difference between DAWs. A MidiGAL lets you measure this, and Bitwig has way less jitter than Live out of the box. Then there is the issue of audio drivers; high-end audio interfaces will have a superior clock. On PC there is a utility called LatencyMon which shows the hardware interrupts interfering with audio timing; this lets you optimize performance. In general, stuff like graphics drivers and the TCP/IP stack, but also Bluetooth devices, can have some influence on audio timing. The ERM should be rock solid during a whole recording; other devices like the ERM also reset and get back in sync after a bar, which may not be crucial in a recording situation, but in a live situation you just want to stay in sync.
Check out the MIDI Fact Sheet in the Ableton Live manual.
I experimented with this as well some time ago and I understood that even internal sequencers are not tight.
For example, my Elektron Machinedrum mk2 is digital, but it still drifts slightly from the grid.
The Multiclock can clock "analog" and DIN-sync gear too, and can also be clocked by any clock pulse. ..but: the ACME has MIDI outs and TRS trigger outs for every channel; that's nice. ...and double/half tempo ;)
I'm convinced, nothing is better than Innerclock Sync Gen II. You have to try it.
I lost my mind trying to troubleshoot this - in my case, my audio interface was adding major issues. Sometimes up to +-16ms off, changing every time I hit record with one audio track in the session. The interface would work fine on my MacBook but not on PC. Switched to an RME interface and Innerclock for sync and I’m finally at peace! So windows users might have this extra bit to troubleshoot.
Computers suffer from DPC latency; you can easily measure it and you should, as laptops are prone to it. As long as we're unable to produce a perfect clock and use it in perfect conditions, we're unable to produce a perfect device. My DrumBrute Impact is really bad; its clock is like a rubber band stretching over the 16 steps.
Fun stuff, I always hate seeing the tempo flicker. You should add a baseline recording of your Digitakt directly into the DAW with the tempo run from the Digitakt. Also, math in 8ths is easier to compute.
@rickytinez could you test using multiple Elektron gear at the same time? I could never get them synched up correctly without using the Multiclock. I would have the Elektron gear running, creating sequences etc, and after 10mins or so they would be out of sync.
The CPU spikes are definitely the main culprit. I think the audio engine has a higher priority than MIDI output, which is one of the reasons why the Multiclock works better.
Midi jitter is the whole reason I gave up on a hybrid setup. I spent way more time trying to sort it out, rather than making music
I'm surprised you don't know this. I thought it was common knowledge that you should shut down all other applications when using a DAW.
Every time I use a DAW, I shut down other processes. But these days I'm 100% DAWless.
I definitely need a MIDI clock
The best bet might be to have an external clock that starts/stops gear AND DAW, with no processes getting in the way.
I'm sure the issue is the DAW in the end. No matter what DAW. It's doing too much else to be a stable clock source.
The RME is supposed to have the best jitter reduction. I wonder if it would correct the issues if plugged into the UCX MIDI ports? I'm just throwing something out here; I'm not really familiar with using the MIDI ports on the UCX II. But I'm still curious about the Mio by iConnectivity lol.
Ok, a couple of things. Things are still way more complicated. First of all, jitter is any deviation in, or displacement of, signal pulses in a high-frequency signal. The Multiclock audio plugin hosted in a track of the sequencer is not excluded from the automatic latency compensation engine of your sequencer. Sticking to the way you've tested: I think Ableton is able to adjust the start time of the Multiclock plugin if you are using the external device plugin. Ableton measures the delay between the MIDI trigger sent and the audio received from the HW device. What I'm describing is not in any way mentioned in the Multiclock manual. Why? Because that is not how the Multiclock is supposed to work or be used. The computer doesn't sync external MIDI gear with a sample-accurate audio signal; the audio plugin only syncs the Multiclock. Back to your computer. First of all, you do not browse the Internet or open applications that are not managed by the sequencer. I do not know much about the Apple SOC, but you are requesting the impossible. Opening your browser means opening a different IRQ/DMA channel on your processor to start the browser app. The app is stored on disk, which depends on the CPU. Then the browser populates the screen using the integrated GPU (pictures and animation). The browser will read the disk again for the cached browser data and load it into memory. It will also refresh the open tabs of pages from the Internet (which involves operating your NIC, which interrupts the CPU). Your keyboard and mouse strokes are going to another application in the foreground. I'm trying to keep stuff simple here, but you are not making this easy. : )) How is Ozone 9 configured? Is it using OpenGL? Your mouse actions in the GUI of Ozone clearly interrupt your processor. In short, I see so many issues in your test and on your computer. I would be in tears if my Wintel behaved like your Mac. All in good spirit!
Yes, each device has its own jitter - Elektron were not generally known as the most accurate but have greatly improved since the MD days. Innerclock Systems (who make the sync-lock stuff) have tested plenty of gear (you can look up Innerclock Systems Litmus to see their methodology and results, but remember they are selling an ERM-style product): an MPC 60 has a jitter of .33 ms, an MPC 4000 .6 ms, and the MPC X a whopping 2.1 ms. The only thing with no jitter is all those analogue sequencers like you might see in modular, or the old stuff. Mostly I don't find jitter a problem unless you are overdubbing; then it can be a disaster.
I have also had wildly different results on different computers (sometimes USB sync is OK, but sometimes there is no fixing it), which is why I favour an audio sync solution despite it being slightly annoying.
I'm not the world's foremost expert on this, but imho much of what you experience is basically the limitations of the MIDI protocol (originating from the early 80s) and the way USB-MIDI is handled by a computer.
The (DIN) MIDI protocol works at a fixed speed of 31250 baud (31250 bits/second). That means sending 1 byte takes about 0.32 ms on the wire (10 bits per byte once you include the start and stop bits). The way clock messages work is that 24 of these clock bytes are sent per quarter note. Most other MIDI messages take 2 or 3 bytes (CC, note-on, note-off etc.), and all these bytes are sent serially, one after the other. Since a single byte already occupies about a third of a millisecond, it's not hard to see how small variations quickly occur.
Of course, the USB protocol isn't restricted to this 31250 bits/sec rate, so in theory sending MIDI over USB shouldn't give these inaccuracies. However, the receiving end puts incoming USB MIDI messages into a buffer, and this buffer is only read at a limited rate (typically around 1000 times per second, as far as I know). The MIDI is only processed after the buffer is read, so here too you will find a small amount of jitter.
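A minimal simulation of that buffer-polling effect, taking the comment's "around 1000 reads per second" figure as an assumption (the real service rate varies by OS, driver and interface):

```python
import math

POLL_MS = 1.0  # assumed USB-MIDI service interval
PPQN = 24

def simulate(bpm: float, ticks: int = 8):
    ideal_ms = 60_000.0 / (bpm * PPQN)  # ideal spacing between clock ticks
    for n in range(1, ticks + 1):
        sent = n * ideal_ms
        seen = math.ceil(sent / POLL_MS) * POLL_MS  # only visible at the next poll
        print(f"tick {n}: sent {sent:8.3f} ms, seen {seen:8.3f} ms, error {seen - sent:+.3f} ms")

if __name__ == "__main__":
    simulate(120.0)
```

Even with a perfectly steady sender, the quantisation alone smears each tick by up to one polling interval, which lines up with the sub-millisecond jitter figures people measure.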
Even more madness may ensue with drum parts where multiple events fall at the same time. I recall Colin of Cirklon fame mentioning this as a reason why drums over MIDI never really land consistently when there's that kind of demand. Not sure if removing DIN from the equation and using USB instead alters this. I'm not even talking about timing over clock, but rather MIDI patterns in the DAW triggering the samplers etc. I was into this for a while and looking to use triggers on the Rossum Assimil8or etc., as it ought to be tighter. In reality, do we care? It's hard to know. Supposedly the brain can detect micro-timings, so 1 ms is in that realm. Funny business.
Well, Ozone is a mastering plugin which will add latency. You don't want to use it unless you're bouncing or on the final mix. It will always run late.
Ozone causes more clocking problems than most plugins, since all presets do not have the same amount of latency (it depends on which modules are loaded in a given preset). The clocking can't keep up with the changing latency compensation.
Hi Ricky loving your videos. What audio interface would you recommend I'm trying to decide which one to get. Thanks
I actually think the problem is Ableton.
The last couple of days I’ve been setting up a live midi recording rig.
Between LPX, Bitwig and Ableton, Ableton was the DAW that drifted the most.
Logic was great, but lacked a few features, and Bitwig was consistently stable. Same set up BTW with the MRCC and ERM.
Ableton is not good. 👍
I think the thing is that you have to choose one, learn its idiosyncrasies, and work with it. None of them will be perfect and the more time you spend hunting for the perfect tool the less time you are making music.
I know I'm super late to the party and you might not see this but, when you showed the DT's clock jumping .1 ms it got me thinking. Have you ever tried syncing the DT with transport only and not clock? But set the tempo manually to match the DAW tempo. That way the hardware is being clocked by itself which in theory is going to be much more stable than through MIDI (it's too late here to test this myself now but just wanted to leave this here).
You actually don't need any of these boxes to achieve great sync!
I'm syncing my entire studio by sending clock via the headphone output on my audio interface to my Keystep Pro. You can use a simple square wave loaded in a VST sampler or a dedicated sync plugin from any of the manufacturers of these sync boxes. Personally I use Silent Way Sync by Expert Sleepers, because it also reliably sends START/STOP messages.
The important thing is that your main sequencer has a CV clock input, and that it can forward clock to its MIDI outputs.
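For anyone wanting to try the audio-pulse route, here's a rough sketch of generating a 24 PPQN clock-pulse file you could load into any sampler or drop on a track. This is a generic illustration, not how Silent Way Sync actually builds its signal, and the pulse width, level and 24 PPQN rate are assumptions - check what your own hardware expects:

```python
import wave, struct

# One bar of 24 PPQN clock pulses as a mono 16-bit WAV.
SR = 48000
BPM = 120.0
PPQN = 24
PULSE_MS = 5.0  # length of each high pulse (assumption)

samples_per_tick = SR * 60.0 / (BPM * PPQN)
pulse_samples = int(SR * PULSE_MS / 1000.0)
total = int(samples_per_tick * PPQN * 4)  # 4 quarter notes = one 4/4 bar

frames = bytearray()
for i in range(total):
    high = (i % samples_per_tick) < pulse_samples
    frames += struct.pack("<h", 30000 if high else 0)  # square pulse, silence between

with wave.open("clock_120bpm_24ppqn.wav", "wb") as f:
    f.setnchannels(1)
    f.setsampwidth(2)
    f.setframerate(SR)
    f.writeframes(bytes(frames))
```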
This! How reliably does this work? Is it right on the money with the grid? I saw another comment mentioning the Innerclock Sync Gen II, but that requires buying a discontinued hardware sync box that is at least $500, and I shouldn't have to spend that kind of money to get good sync.
@@Dan_Delix It works as reliably as is possible. Meaning, it stays on beat as long as you don't have big CPU spikes, and incoming audio will of course be late because of the latency on your audio interface.
The way I deal with this is by having a plugin after my clock source that delays the clock signal by a certain amount so that my audio will be recorded one bar late.
To clarify, the latency might make the recorded audio late by 0.63 bars, and I make sure that instead it will be late precisely 1.0 bars. Of course this might cause some trouble mixing hardware and software synths, but at least all your synced VST delay FX will work correctly.
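The arithmetic behind that "round it up to exactly one bar" trick is simple enough to sketch; the latency figure below is made up, you would use whatever your own interface actually measures:

```python
# Extra delay needed so recorded audio lands exactly one bar late
# instead of some awkward fraction of a bar.
BPM = 120.0
ROUND_TRIP_MS = 12.7  # hypothetical interface round-trip latency

bar_ms = 4 * 60_000.0 / BPM                   # one 4/4 bar at this tempo
extra_delay_ms = bar_ms - (ROUND_TRIP_MS % bar_ms)

print(f"one bar = {bar_ms:.1f} ms")
print(f"delay the clock output by {extra_delay_ms:.1f} ms so audio arrives exactly 1 bar late")
```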
Also, remember that your hardware sequencer expects a loud and clean clock signal. Sync might drift if the signal is too weak, or the waveform might not be a true square wave anymore in case you are distorting it.
On my particular setup, I leave the clock output at maximum in my software, and set my headphone amp to -10 dB. This setting has been the most stable for me.
Last thing is that my Keystep Pro is happy to obey whenever I send it a STOP signal, but currently it freaks out when I send it a START signal. That's why I only send it STOP, and manually arm the sequencer before the next take. I'm trying to resolve this with Arturia support at the moment.
Good luck! 😃
I've been down this rabbithole as well (see my latest video) and to get any sort of accuracy you need to record the midi signals, or you are looking at the latency of the DAW or the hardware that generates the sound... Which MAY be near zero.
And then we have to remember that MIDI just isn't particularly accurate. 😀
I went with the Innerclock Systems Sync-Gen. Amazing piece of gear at a cheaper price than the ERM.
Audio clock generator with expanders to allow more outputs. Also has modular run and clock out ;)
I’ll have to do some longer scale tests.
Oooo ive heard only good things about that one!
@@RickyTinez it's locally made (for us Aussies 🤪) and has some big names that use them.
I'm still looking at something like the MRCC so that it's easier to change the routing in the studio. From the comp to the MPC, the requirement is to do it as quickly as possible, with as little rewiring as possible ;p
You should use a very fast click sample to increase the precision of your tests.
You should do the same test with Logic. I was using Logic for years and really started having sync issues when switching to Ableton. But to be fair, I also have issues when syncing hardware to my Multiclock in master mode.
Ableton is great for in-the-box MIDI tasks, but it handles MIDI going in and out of the box very poorly compared to other solutions.
Maybe also try the same test with an MPC as master.
Ozone 10 and 11 hit my CPU hard, and I have an i9-12900K, which has 24 cores if you include virtual cores. Either way it's got a lot, and I have 44 GB of RAM, and I don't get what's so taxing about Ozone presets. I know it's doing a lot of processing, but nothing else I've used since I got this CPU about 3 months ago hits it like that. Maybe that could affect your test more negatively than other plugins would, and if you tried something else you might get different results? Ozone does sound damn good though, big improvement from the 1st. Hope that helps, at least it's something to consider. Take 'er easy, player.
This is why I work in the box. I can’t understand the world of electronic music being constantly out of phase. Musicians spend a lifetime fine tuning their groove and timing. Then you get a drum machine and try incorporating it only for it to just literally be out of the pocket.
Playing an Mpc in stand-alone feels nice, they have it so dialed in that it feels spot on playing the pads. Once a computer is involved or any syncing of clocks between devices, things are shot. Groove is gone. I find it easier to just play a plug-in inside the daw.
Hey !
I use an ESI M4UeX connected via USB port to my Push Standalone to sync my different machines. There is the possibility of managing the latency of the M4UeX on the Push Standalone. This might seem like a weird question, but if I upgrade to one of the machines you present, could I achieve more stability with my MIDI sync?
hmmmmmmmmm ok so I have to assume the preset changing in Ozone was a bit more than just CPU spiking as the culprit of your MIDI sync falling off with either device. I experimented with switching through presets in Ozone like you were doing, and each preset has a different total sample latency, with a pretty huge variance between them. It looked like you had latency compensation enabled while running the Ozone preset-swapping tests, so it's most likely that Live having to reorient itself to the new maximum latency after each preset switch was causing the audio buffer to stumble. The MIDI clock running in the background was probably keeping fairly steady, but because the audio buffer stumbled around it, it wound up falling out of sync with Live's timeline.
Try a different DAW? Like Cubase, or even better (for recording), Reaper?
I think that is the limit of accuracy of midi clock, even when derived from sample accurate audio.
Please check what happens when you use the Multiclock as clock source and not triggered by Ableton. It could be that the root cause for all the issues is Ableton and not the devices.
Hello Ricky, I think the problem comes from Live 12, a sampling/stability problem or something similar. Many users have noticed the same problems as you, including me: Live 12 systematically drops out when you open a plugin, load presets, or open the internet or another application. It's a recurring problem. I have the ERM Multiclock and I have exactly the same audio dropout problem, so I think the ERM and the MRCC are not the cause.
Maybe try testing on another DAW? Ableton has given me out-of-sync issues before! Also, external digital devices like your Digitakt, or any other digital device, have MIDI latencies of their own. It would be worth testing MIDI tracks from DAW to DAW and seeing the results.
I wish the iConnectivity mio or mioXL had been part of this shootout. It could be the best of both worlds.
I wonder how much of that jitter can be attributed to floating point precision issues. Any time I see small decimal jumps in computing, float precision is usually the culprit
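If anyone wants to see what that looks like, here's a generic single-precision accumulation demo in Python. It isn't a claim about what the Multiclock, MRCC, or Live do internally, just an illustration of why repeated float math produces those small decimal jumps:
```python
import numpy as np

BPM, PPQN = 120, 24
tick_ms = 60_000.0 / (BPM * PPQN)   # ideal spacing between MIDI clock ticks (~20.83 ms)
ticks = 10_000

acc32 = np.float32(0.0)
for _ in range(ticks):
    acc32 += np.float32(tick_ms)    # repeated single-precision adds accumulate rounding error

exact = tick_ms * ticks
print(f"float32 total: {float(acc32):.3f} ms, double total: {exact:.3f} ms, "
      f"drift: {float(acc32) - exact:.3f} ms")
```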
Who says it's the Multiclock or the MRCC that's jittering, and not the machine receiving those clocks, which is what's actually being recorded?
That’s what I kept thinking
Very interesting, could it be a problem with your audio interface? Does the same thing happen with a Syntakt or Octatrack...?
I bought the Midronome, put it up against a friend's Multiclock, and basically ended up with a similar set of stems :D I don't want to leave the comfort of Ableton, but surely something like Pro Tools handles latency better than my favorite DAW... right?
Why the solution is to use Ableton as a slave and to avoid using its MIDI clock to sync machines:
The main problem with synchronization tests using a computer's MIDI clock is that jitter (variation in clock precision) is considerably higher compared to dedicated clocks. This is due to how operating systems handle USB MIDI communication, where data is processed in a more complex and parallel manner, introducing variable latencies and fluctuations that affect tempo precision. In contrast, traditional MIDI works in a simpler and more direct way, which contributes to greater stability.
Here are some key reasons why the computer's MIDI clock is not ideal and why Ableton should be used as a slave instead of a master:
Jitter in USB MIDI: When using the MIDI clock of a DAW like Ableton, even with low load, the typical jitter is in the range of 2-5 ms. As the project load increases, this jitter can reach 5-10 ms or more, making the clock inconsistent and affecting the groove and synchronization. For example, if you change presets or introduce effects in the mastering chain, latency spikes can occur, causing rhythmic elements like the kick to fall out of sync, resulting in offsets of up to a quarter of a beat or more.
Comparison with dedicated clocks: Devices like the E-RM multiclock offer extremely low jitter, of only ±1 sample when synced with audio at 48kHz, which is similar to what the E-RM midiclock⁺ offers. These devices guarantee precise synchronization without the fluctuations that often affect USB MIDI. Additionally, since the clock is external, it is not affected by latency generated by the computer's internal processing, always maintaining stable tempo.
More precise hardware machines: Well-designed equipment, such as those from Elektron, handle the MIDI clock much better than a DAW, with jitter in the range of 1-2 ms, which is already considerably more precise than a computer under load.
Latency accumulation in MIDI chains: If you have multiple devices connected in series, latency accumulates linearly, which can negatively affect synchronization. Therefore, it is recommended not to connect more than 2 or 3 devices in series to avoid timing issues (the sketch after this comment gives a rough sense of the numbers).
Conclusion: If you are looking for precise and stable synchronization, it is essential to use a dedicated device like the E-RM multiclock as the clock master and avoid relying on the computer's MIDI clock. USB MIDI latency and jitter will negatively affect your timing, while a dedicated clock will ensure solid and fluctuation-free synchronization. Using Ableton as a slave and an external clock as the master is the best option to maintain perfect synchronization in your setup.
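To put the chain point above in perspective, here's a rough calculation of how per-hop delays stack up against the clock itself, using assumed figures (24 PPQN at 120 BPM, roughly 1 ms of soft-THRU forwarding delay per device; both numbers are illustrative, not measurements):
```python
BPM, PPQN = 120, 24
tick_interval_ms = 60_000 / (BPM * PPQN)   # ~20.8 ms between MIDI clock messages
HOP_DELAY_MS = 1.0                          # assumed per-device soft-THRU forwarding delay

for devices in range(1, 5):
    total = devices * HOP_DELAY_MS
    print(f"{devices} device(s) downstream: ~{total:.1f} ms late "
          f"({total / tick_interval_ms:.0%} of one clock tick)")
```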
Guess we need to move the recording onto a separate medium or laptop, its only function being to record, so it has a single job rather than doing everything.
sync rabbit holes are endless 😖
Did the preset change in Ozone change the latency of the plugin, maybe, so that Live had to recalculate and reapply its delay compensation chain? I suspect a mastering plugin can do that, while an instrument plugin probably doesn't in most cases.
Ok, now do the same test, but comparing humans playing instruments. Reckon there’d be a fair bit more variation. 😜 That push/pull of a groove is a large part of what makes music magical, right? I get the sample-level phasing issue, but how often are you likely to encounter that in real-world use?
About your outcome with the Multiclock being so similar: what if that jitter was an issue back when the Multiclock came out, but computers have improved so much since then that a clock generated by the DAW and transmitted via USB is now similarly stable?
You made me curious: I will do a similar test. I have no MRCC, but I can try clocking the DT via the Blokas Midihub and then via the Multiclock.
So far I can confirm the Multiclock, in conjunction with the DTII, generates around 1 ms of jitter. Not sure what the source is.
Without an external clock the DTII also jitters, but a lot less, around 0.25 ms.
When I clock the OT MkII with the Multiclock it jitters less than the DTII... the maximum offset I found was 0.6 ms, and the median was more like 0.3 ms.
I'll have to test this a bit more extensively.
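For anyone wanting to reproduce this kind of measurement, here's one possible way to pull per-hit offsets out of two recorded click stems, assuming Python with librosa for onset detection and that the hits pair up one-to-one. The file names and the whole approach are my assumptions, not how the commenter actually measured it:
```python
import numpy as np
import librosa

# Hypothetical file names: one stem recorded from the DAW's own click,
# one recorded from the externally clocked device.
ref, sr = librosa.load("daw_click.wav", sr=None)
test, _ = librosa.load("digitakt_click.wav", sr=sr)

ref_onsets = librosa.onset.onset_detect(y=ref, sr=sr, units="time")
test_onsets = librosa.onset.onset_detect(y=test, sr=sr, units="time")

# Assumes both files contain the same hits in the same order.
n = min(len(ref_onsets), len(test_onsets))
offsets_ms = (test_onsets[:n] - ref_onsets[:n]) * 1000

print(f"median offset: {np.median(offsets_ms):.2f} ms, "
      f"peak-to-peak jitter: {np.ptp(offsets_ms):.2f} ms")
```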
Also, you're fighting the CPU of your Mac, so you're causing each device's CPU to get flooded with late sync messages. How well a device handles that depends on its hardware, so cost is going to be a fair indicator of capability here. I bet both would be great if you had a UAD accelerator running your plugins, so that there were limited or no CPU spikes when dealing with VSTs.
That's what I was thinking too. I wonder how much of a difference it would make if the ERM were the primary clock and everything else, including Ableton Live, followed it.
Did you try this with Reaper? I've done similar tests and have no problems... it's probably Ableton.
Innerclock Systems
This stresses me out. I'd give up or lose my mind.
TBH, I think you're mostly testing the MIDI implementations of Ableton and the Digitakt…
When MIDI was first introduced at whateverthefuck music fair it was, it failed publicly to perform the sync it was meant to do, and some press branded it MUDI, short for Musically Unusable Digital Interface. It's the same standard still. And the "reserved for the future" parts of the spec are still somewhere in the future, waiting to be revealed, and it's unlikely any future is distant enough for them to solve this.
It seems to me that the Multiclock has corrupted you and now you want ideal, microscopic accuracy :) You need to remember the time when you didn't have a Multiclock :))) These are all jokes, of course.
I wonder if you have tried recording the same track from an electronic device, but without any synchronization, just running on its internal tempo generator? It seems to me the result would be similar.
Maybe these are acceptable errors in the hardware, which keep the music from being perfectly smooth, a certain amount of error and "humanization"?
Personally, I bought a Multiclock about 5 months ago and I am incredibly happy. It has greatly simplified the jam process, and the post-editing, with a mountain of my own hardware. It is still much, much better than not having it. Plus rich possibilities for MIDI routing, what could be cooler :) In any case, thanks for what you do!
When I really started to dial in the ERM Multiclock that I spent so much time and money on... I sadly started feeling the same way about it. Am I still happy I own one? Yes. Knowing what I know now, would I have invested in one when I did? Definitely not, lol.
It's almost like the DAW should be the clock for the hardware; at least then everything would be off together. Also, there is no magic bullet. Latency is everywhere.
Now what happens when you start changing the buffer size after everything is set up?
I may be tripping, but once I got the Multiclock set up, everything stayed locked pretty dang tight even when changing buffer size.