\$\begingroup\$

A step-up transformer increases voltage while decreasing current, because we can't get more energy out than we put in.

The thing that puzzles me is how this output power can have a current rating. Normally 5 V is just 5 V. No current information is given, since the current is determined by V = I × R. How can this output have a current rating?

Let's say we connect this output side to a circuit with resistance R1. Will the current still be the same regardless of R1? If it changes with R1, what is the point of the current rating?

\$\endgroup\$
  • \$\begingroup\$ “since the current is determined by V = I × R” This only applies to resistors. A transformer is not a resistor. You can model the wire in it as a resistor though. \$\endgroup\$
    – winny
    Commented May 26 at 22:06

2 Answers

\$\begingroup\$

The current rating of a transformer or power supply is the maximum current that it can safely supply, without overheating or otherwise damaging itself.

The current that the transformer will actually deliver at any time is determined by the connected load.

A transformer will not normally limit the delivered current to its rating; it will attempt to deliver whatever current the load demands, and it may be damaged in the process if that demand is sufficiently higher than its rating.
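For example (numbers assumed purely for illustration): a supply rated at 5 V, 2 A connected to a 10 Ω load delivers \$ I = \frac{5\text{ V}}{10\ \Omega} = 0.5\text{ A} \$, well within its rating. Connected to a 1 Ω load it would be asked for 5 A, and the rating tells you that is more than it can safely provide.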

\$\endgroup\$
\$\begingroup\$

V = I × R applies to a resistor. For a transformer, the power equation P = V × I is more relevant.

Given the power rating and the output voltage you can calculate the maximum current that can be drawn without exceeding the ratings.
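For example (values assumed for illustration): a 100 VA transformer with a 20 V secondary can deliver at most \$ I_{max} = \frac{100\text{ VA}}{20\text{ V}} = 5\text{ A} \$ without exceeding its rating.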


So, back to your questions:

Let's say we connect this output side to a circuit with resistance R1. Will the current still be the same regardless of R1?

No. The transformer will try to maintain its rated output voltage. (In practice the voltage droops with increasing current draw due to losses in the winding resistance. For example, you might see the voltage rise by 10% when going from fully loaded to unloaded.) The current will be given by Ohm's law, \$ I = \frac V {R_1} \$. Decrease R1 and the current will increase.
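To put numbers on it (assumed values, continuing a 20 V example): \$ R_1 = 40\ \Omega \$ draws \$ \frac{20}{40} = 0.5\text{ A} \$, while \$ R_1 = 10\ \Omega \$ draws \$ 2\text{ A} \$. The voltage stays roughly the same; the current follows the load.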

If it changes with R1, what is the point of the current rating?

The current rating tells you when to stop reducing R1. The transformer's windings have resistance, and there is a maximum current they can handle before self-heating becomes an issue and the smoke starts to come out.
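In other words (keeping the assumed 20 V example), a 5 A current rating implies a minimum safe load resistance of \$ R_{1,min} = \frac{20\text{ V}}{5\text{ A}} = 4\ \Omega \$. Go lower than that and you exceed the rating.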

Most transformer manufacturers will have a range of standard-sized cores, each with a VA rating. If you require a 50 V supply capable of 3.7 A, then you need a 50 × 3.7 = 185 VA transformer. The next-largest standard size might be 250 VA, so that is what the manufacturer will offer. They now have the option of winding it with wire capable of handling only 3.7 A, saving some cost but derating the transformer to 185 VA, or using 5 A wire and rating it at the full 250 VA.
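As a small illustration of that sizing step, here is a minimal sketch in Python, assuming a purely hypothetical list of standard core sizes (actual catalogues vary by manufacturer):

```python
# Minimal sketch of transformer VA sizing.
# STANDARD_VA is a hypothetical catalogue of core sizes, for illustration only.
STANDARD_VA = [50, 100, 160, 250, 500, 1000]

def pick_core(voltage: float, current: float) -> int:
    """Return the smallest standard core that covers the required apparent power."""
    need = voltage * current  # required VA on the secondary
    for size in STANDARD_VA:
        if size >= need:
            return size
    raise ValueError(f"No standard core large enough for {need:.0f} VA")

# Example from the answer: a 50 V, 3.7 A supply needs 185 VA,
# so the next-largest standard core (250 VA here) is selected.
print(pick_core(50, 3.7))  # -> 250
```

Whether the finished part is then rated at 185 VA or 250 VA is the cost/derating trade-off described above.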

\$\endgroup\$
