Drive capacity vs slew rate?

We're using an S5D9 with external SDRAM clocked at 120 MHz. It's working fine.

We've got EMI problems with 120 MHz harmonics, which is not too unexpected. One way to mitigate such problems is to slow down slew rates, since unnecessarily sharp clock edges have high-frequency components. The SDRAM clock runs constantly whether or not you're accessing the memory, so it's an easy target.

Now, the Synergy S5D9 has a "Drive Capacity" setting, which is expressed a bit confusingly in the datasheet/manual. It says low strength is 2 mA average, 4 mA max; medium is 4/8 mA; and high is very high, e.g. 16/32 mA.

Is this a current-limiter type deal, or is it adjusting the output slew rate, i.e. output impedance? Below is an oscilloscope capture of the 120 MHz SDCLK signal: the dark reference trace is with "high" drive and the bright trace is with "low" drive. You'd expect a significant difference between the two, especially since the probe adds another 12 pF of capacitance to the measurement, which takes current to charge. However, there's very little difference. So it appears the output impedance is actually the same?


Reply
  • This is what I've started to suspect. I know the drive strength setting works for GPIO: I'm driving some LEDs, and you can clearly see that low drive limits the brightness compared to high drive.

    However, based on that measured signal, the drive strength setting does nothing, or close to nothing, for SDRAM. I first suspected that ThreadX was changing the setting behind the scenes, but I've verified (or had our SW guys verify) that the DSCR[1:0] drive capacity bits in PFS.P602PFS are set as expected.
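    A runtime sanity check of those bits might look like the sketch below. The DSCR field position (I'm assuming bits 11:10 of the PmnPFS register, check your hardware manual) and the field values are assumptions; the sketch operates on a plain variable instead of the real memory-mapped PFS.P602PFS so it compiles anywhere.

    ```c
    #include <stdint.h>
    #include <stdio.h>

    /* Assumed PmnPFS layout: DSCR[1:0] at bits 11:10 (verify against the
     * S5D9 hardware manual). 0 = low, 1 = medium, 3 = high drive. */
    #define PFS_DSCR_POS   10u
    #define PFS_DSCR_MASK  (3u << PFS_DSCR_POS)

    enum drive_capacity { DRIVE_LOW = 0u, DRIVE_MID = 1u, DRIVE_HIGH = 3u };

    /* Clear and rewrite the DSCR field; on target you'd do this on the
     * actual register, here on a stand-in variable. */
    static uint32_t set_drive_capacity(uint32_t pfs, enum drive_capacity dc)
    {
        pfs &= ~PFS_DSCR_MASK;
        pfs |= ((uint32_t)dc << PFS_DSCR_POS);
        return pfs;
    }

    int main(void)
    {
        uint32_t p602pfs = 0u;  /* stand-in for PFS.P602PFS */

        p602pfs = set_drive_capacity(p602pfs, DRIVE_LOW);
        printf("DSCR field = %u\n",
               (unsigned)((p602pfs & PFS_DSCR_MASK) >> PFS_DSCR_POS));
        return 0;
    }
    ```

    Reading the field back after all SSP/ThreadX initialization has run (rather than right after your own pin setup) is the part that catches middleware silently rewriting it.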

    Is the SDRAM driving behaviour documented anywhere?