What is the internal resistance of a voltage source if its terminal potential drops by 2.00 volts when the current supplied increases by 5.00 amps?
We’ll call the drop in terminal potential, 2.00 volts, 𝑉. And the increase in the current supplied, 5.00 amps, we’ll call 𝐼. We want to solve for the internal resistance of the voltage source. This resistance we’ll symbolize with a capital 𝑅.
To get started on our solution, let’s recall Ohm’s law, which tells us that the voltage across a resistor is equal to the current through it times its resistance 𝑅. In our case, the current supplied by the source flows through its own internal resistance 𝑅. So when that current increases by 𝐼, the voltage dropped across the internal resistance grows by 𝐼 times 𝑅, and the terminal potential falls by that same amount, 𝑉.
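To make that step explicit, here is a short derivation. It writes ℰ for the source’s EMF, a symbol the transcript itself doesn’t introduce, and uses Δ to mark the changes the transcript calls 𝑉 and 𝐼:

```latex
% Terminal potential of a source with EMF ε and internal resistance R:
%   V_{\text{terminal}} = \varepsilon - I R
% The EMF is fixed, so a change in current changes the terminal potential by
%   \Delta V_{\text{terminal}} = -R \,\Delta I
% Taking magnitudes (the transcript's V and I are the drop and the increase):
%   V = R \, I \quad\Rightarrow\quad R = \frac{V}{I}
```

This is why Ohm’s law applies directly to the *changes* in voltage and current, not just to their absolute values.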
If we rearrange Ohm’s law to solve for 𝑅, we find it’s 𝑉 divided by 𝐼, or the potential drop, 2.00 volts, divided by the increase in current, 5.00 amps. This fraction is equal to 0.400 ohms. That’s the internal resistance the voltage source must have in order for an increase in current of this size to produce a voltage drop of 2.00 volts.
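The arithmetic above can be sketched as a short calculation (the variable names here are illustrative, not from the transcript):

```python
# Internal resistance from the drop in terminal potential caused by
# an increase in supplied current, via Ohm's law: R = V / I.
delta_v = 2.00  # drop in terminal potential, volts
delta_i = 5.00  # increase in supplied current, amps

r_internal = delta_v / delta_i  # internal resistance, ohms
print(f"R = {r_internal:.3f} ohms")  # R = 0.400 ohms
```

Keeping the two changes paired like this (a voltage drop per unit of current increase) is exactly what the units of ohms, volts per amp, express.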