
I'm looking into an inductor datasheet (page 59) and I'm having a bit of trouble understanding the difference between the ratings.

My interpretation is as follows:

1 - Rated Current based on Inductance Change is: how much current you can put through the inductor before its inductance drops 30% from its nominal value.

2 - If you "don't care" about inductance, the Rated Current based on Temperature Rise is the absolute maximum current the inductor can carry before thermal failure.

Is that it? Am I missing something?



2 Answers


Almost complete. The other factor is that the temperature rise plus ambient must stay below 125°C. So if your ambient temperature exceeds 85°C, you must derate the maximum current linearly, reaching zero at 125°C.

In other words, if your ambient is 100°C you can only pass about 0.63 of the thermal maximum current: 1 − (100°C − 85°C)/40°C = 0.625.
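That linear derating can be sketched as a small helper. This is a minimal sketch, not from the datasheet; the function name and the example rated current of 1 A are assumptions, while the 85°C full-rating point and 125°C limit come from the answer above.

```python
# Hypothetical sketch: linear thermal derating of an inductor's rated
# current between 85 °C (full rating) and 125 °C (zero current).
def derated_current(i_rated_a, t_ambient_c,
                    t_full_rating_c=85.0, t_max_c=125.0):
    """Return the allowed current (A) at a given ambient temperature."""
    if t_ambient_c <= t_full_rating_c:
        return i_rated_a                      # full rating applies
    if t_ambient_c >= t_max_c:
        return 0.0                            # no headroom left
    span = t_max_c - t_full_rating_c          # 40 °C derating window
    factor = 1.0 - (t_ambient_c - t_full_rating_c) / span
    return i_rated_a * factor

# At 100 °C ambient: 1 - (100 - 85)/40 = 0.625 of the thermal rating.
print(derated_current(1.0, 100.0))  # → 0.625
```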

You should make sure both constraints are satisfied simultaneously, so the lower of the two currents is what matters.
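Taking the lower of the two ratings can be shown in a couple of lines. The specific rating values here are made up for illustration, not taken from the datasheet:

```python
# Hypothetical values: the usable current is the lower of the
# saturation-based and (thermally derated) temperature-based ratings.
i_sat_a = 1.8      # rated current for 30 % inductance drop (assumed)
i_thermal_a = 2.2  # rated current for 40 °C rise (assumed)
derate = 0.625     # thermal derating factor at 100 °C ambient

i_max_a = min(i_sat_a, i_thermal_a * derate)
print(i_max_a)  # → 1.375 (the thermal limit governs here)
```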

  • Does this mean that if my ambient temp is lower than 85°C, "rated curr based on Inductance change" dictates my limitations? I guess this would explain why a switching psu started to lose regulation past ~80°C even though I was using less than both ratings...
    – Wesley Lee
    Commented May 29, 2016 at 17:06

The inductance decreases because of either current or temperature.

Higher currents cause the magnetic material to saturate (lose permeability). This occurs at (nearly) any temperature.

High temperatures also cause the material to lose permeability. Temperatures are raised by I²R losses in the winding as well as core losses in the magnetic material.
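The copper-loss contribution to self-heating can be estimated with the I²R relation. The DC resistance and thermal resistance below are assumed values for illustration only, and core losses are ignored:

```python
# Hypothetical illustration: copper (I²R) loss and the resulting
# steady-state temperature rise for assumed part values.
i_a = 1.0           # inductor current, A (assumed)
r_dcr_ohm = 0.120   # winding DC resistance, ohms (assumed)
r_th_c_per_w = 40.0 # thermal resistance to ambient, °C/W (assumed)

p_loss_w = i_a ** 2 * r_dcr_ohm     # copper loss only, no core loss
t_rise_c = p_loss_w * r_th_c_per_w  # temperature rise above ambient
print(p_loss_w, t_rise_c)  # → 0.12 4.8
```

Note that the rise scales with the square of the current, which is why the thermal rating falls off quickly once currents climb.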

