Drive capacity vs slew rate?

We're using an S5D9 with external 120 MHz SDRAM. It's working fine.

We're having EMI problems with 120 MHz harmonics, which isn't too unexpected. One way to mitigate this is to slow down the slew rate, since unnecessarily sharp clock edges contain high-frequency components. The SDRAM clock runs constantly whether or not you're accessing the memory, so it's an easy target.

Now, the Synergy S5D9 has a "Drive Capacity" setting, which is expressed somewhat confusingly in the datasheet/manual. It says low strength is 2 mA average, 4 mA max; medium is 4/8 mA; and high is much higher, e.g. 16/32 mA.
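For reference, the setting ends up as a field in the pin's PmnPFS register. Here's a sketch of computing the register value; the DSCR field position (bits 11:10) and the low/middle/high encodings are my reading of the S5 hardware manual, so double-check against your copy, and in practice you'd set this through the SSP pin configurator (writes to PFS also need the PWPR write-protect lifted first):

```c
#include <stdint.h>

/* Assumed from the S5D9 hardware manual: PmnPFS bits 11:10 (DSCR)
 * select the pin's drive capability. Verify before relying on this. */
#define PFS_DSCR_POS   10u
#define PFS_DSCR_MASK  (3u << PFS_DSCR_POS)

enum drive_capacity {
    DRIVE_LOW    = 0u,  /* 2 mA avg / 4 mA max  */
    DRIVE_MIDDLE = 1u,  /* 4 mA avg / 8 mA max  */
    DRIVE_HIGH   = 3u   /* high-drive, e.g. 16/32 mA */
};

/* Read-modify-write of the DSCR field in a PFS register value. */
static uint32_t pfs_set_drive(uint32_t pfs, enum drive_capacity d) {
    return (pfs & ~PFS_DSCR_MASK) | ((uint32_t)d << PFS_DSCR_POS);
}
```

The question below is about what this field physically does to the output stage, not how to set it.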

Is this a current-limiter type deal, or is it adjusting the output slew rate, i.e. the output impedance? Below is an oscilloscope capture of the 120 MHz SDCLK signal: the dark reference trace is with "high" drive and the bright trace is with "low" drive. You'd expect a significant difference between the two, especially since the probe adds another 12 pF of capacitance to the measurement, and that takes current to charge. However, there's very little difference. So it appears the output impedance is actually the same?