hightower
I'm trying to fathom the science behind why certain transformers need a certain minimum load power in order to work. What I've got is this, and I'm wondering if someone can confirm my thinking:
12 V output, 30 W load - resistance of 4.8 ohms
12 V output, 2.5 W load - resistance of 57.6 ohms
Is it because as the power reduces, the resistance in the circuit increases and it gets to a point where there is too much resistance for the circuit to work? So then:
12 V output, 6 x 2.5 W loads in parallel - resistance of 9.6 ohms
...which might be a low enough resistance to allow the circuit to work again?
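In case it helps to check the arithmetic, here's a quick Python sketch of the R = V²/P reasoning above (the function name is just my own, not from any library):

```python
def load_resistance(volts, watts):
    """Effective resistance of a load, from R = V^2 / P (Ohm's law)."""
    return volts ** 2 / watts

r_30w = load_resistance(12, 30)    # 144 / 30  = 4.8 ohms
r_2p5w = load_resistance(12, 2.5)  # 144 / 2.5 = 57.6 ohms

# Six identical 2.5 W loads in parallel: the powers add (15 W total),
# so the combined resistance is one sixth of a single load's.
r_parallel = r_2p5w / 6            # 57.6 / 6  = 9.6 ohms
```

So yes, adding loads in parallel drops the total resistance and raises the power drawn, which is the direction you'd need to go to get above a transformer's minimum load.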
Am I on the right lines here, or have I got the complete wrong end of the stick?
Thanks,