V = I × R is relevant for the resistor. For the transformer the power equation, P = V × I, is more relevant.
Given the power rating and the output voltage you can calculate the maximum current that can be drawn without exceeding the ratings.
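As a quick sketch of that calculation (the 100 VA / 24 V numbers here are illustrative, not from any particular transformer):

```python
# Maximum continuous output current from a transformer's VA rating.
# Example values only - substitute your transformer's ratings.
va_rating = 100.0   # power rating in volt-amperes
v_out = 24.0        # rated secondary voltage in volts

i_max = va_rating / v_out  # from P = V * I, so I = P / V
print(f"Maximum output current: {i_max:.2f} A")
```

Draw less than `i_max` and you stay within the rating; draw more and you exceed it.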
So, back to your questions:
> Let's say we connect this output side to a circuit with R1 amount of resistance. Will the current still be the same regardless of R1?
No. The transformer will try to maintain its rated output voltage. (In practice it droops with increasing current draw due to losses in the winding resistance. For example, you might see the voltage increase by 10% when switching from fully loaded to fully unloaded.) The current will be given by Ohm's law, \$ I = \frac V {R_1} \$. Decrease R1 and current will increase.
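To make that concrete, here is the same Ohm's-law relationship evaluated for a few load resistances. The 24 V secondary is an assumed example value:

```python
v_out = 24.0  # assumed rated secondary voltage (illustrative)

currents = []
for r1 in (48.0, 24.0, 12.0):  # decreasing load resistance in ohms
    i = v_out / r1             # Ohm's law: I = V / R1
    currents.append(i)
    print(f"R1 = {r1:5.1f} ohm -> I = {i:.2f} A")
```

Halving R1 doubles the current, so the load, not the transformer, sets the current drawn.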
> If it changes with R1 what is the point of current rating?
The current rating tells you when to stop reducing R1. The transformer windings have resistance and there is a maximum current they can handle before self-heating becomes an issue and the smoke starts to come out.
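Put another way, the current rating implies a smallest safe load resistance. A minimal sketch, again with assumed example ratings:

```python
# The current rating sets a floor on how small R1 can safely go:
# R1_min = V / I_rated. Example values only, not from a datasheet.
v_out = 24.0     # rated secondary voltage in volts
i_rated = 4.0    # rated maximum secondary current in amperes

r1_min = v_out / i_rated
print(f"Smallest safe load resistance: {r1_min:.1f} ohm")
```

Any load below `r1_min` would pull more than the rated current.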
Most transformer manufacturers will have a range of standard-sized cores, each with a VA rating. If you require a 50 V supply capable of 3.7 A then you need a 50 × 3.7 = 185 VA transformer. The next largest standard size might be 250 VA, so that is what the manufacturer will offer. They now have the option of winding with wire capable of handling only 3.7 A, saving some cost but derating the transformer to 185 VA, or using 5 A wire and rating it at 250 VA.
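That sizing step can be sketched as picking the smallest standard core at or above the required VA. The list of standard sizes below is illustrative; real catalogues vary by manufacturer:

```python
# Transformer sizing: required VA from voltage and current, then
# the smallest standard core that covers it. Catalogue is made up.
v_out = 50.0   # required output voltage in volts
i_req = 3.7    # required output current in amperes

required_va = v_out * i_req  # 185 VA for the example in the text

standard_sizes_va = [50, 100, 160, 250, 400, 630]  # illustrative catalogue
chosen = min(s for s in standard_sizes_va if s >= required_va)
print(f"Required: {required_va:.0f} VA -> standard core: {chosen} VA")
```

With this made-up catalogue the 185 VA requirement lands on the 250 VA core, matching the example above.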