Hi, I'm new to electronics and exploring the basics, so please forgive me.
Currently I have a simple circuit with a 12 V, 1 A PSU, a bulb, and a multimeter.
When using the multimeter to measure the bulb's resistance I get 4.5 ohms, and then 12 V for the voltage across the bulb. According to Ohm's law I should get about 2.6 A for the current, and I do: the multimeter reads 2.6 when set to 200m with the red probe in the 10A connector.
My question is: how can I have 2.6 A in my circuit when the PSU says its output is only 1 A? Also, what is the 200m setting for on my multimeter?
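For what it's worth, here's the Ohm's law arithmetic I'm relying on, as a quick sanity check (values are my measurements from above):

```python
# Sanity check of the Ohm's law calculation from my measurements.
V = 12.0   # measured voltage across the bulb (volts)
R = 4.5    # measured resistance of the bulb (ohms)

I = V / R  # Ohm's law: current = voltage / resistance
print(f"Expected current: {I:.2f} A")
```

This gives roughly 2.67 A, which matches what the meter shows (2.6) within its display precision.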