Using only PV electricity (Solar PV Forum, Electricians Forums)

mrs

I am not an electrician, so I am trying to understand, and report back to a customer, whether the electricity he uses while the PV array is generating is 100% from the array, or whether the power is 50% from the grid and 50% from the array because the system balances itself. The customer was an electrical engineer, so I guess this may be a valid question, and when I asked our electrician he admitted that it makes sense and could be the case - but he didn't know. This is worrying me, and I don't know if I am making myself clear - but does anyone have an opinion?
 
If he is an electrical engineer, I would have thought he would be pretty genned up on this.

If you have a 4kW PV system and, say, you are using 3kW of power in your home at one time, then all of that power will be taken from the PV system and the balance, less tolerances, will be fed back into the mains, which is why some older meters go backwards.

If, though, you up the demand, i.e. put on a kettle and take the power over 4kW, then the chances are the whole load will be taken from the mains, as the power drop across the inverter will drop the array voltage and in essence the inverter stops. Once you go back below the threshold, and the voltage from the array is again higher than the mains voltage, the system starts to work again.

There are better inverters now, with better tolerances, and I'm sure better and more balanced distribution of the systems. But in essence, if you're using less power in your home than the power being generated by the PV, then all of that power is coming from your system.
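A rough sketch of that balance in Python (the figures and sign convention are my own, purely for illustration - not from any real meter or inverter):

    def grid_flow_kw(generation_kw, demand_kw):
        # Net flow at the meter: negative = exporting to the grid,
        # positive = importing from it.
        return demand_kw - generation_kw

    # 4kW array, 3kW house load: all 3kW comes from the PV,
    # and the 1kW balance (less tolerances) is fed back to the mains.
    print(grid_flow_kw(4.0, 3.0))  # -1.0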
 
Here's a real example - mine!

I have a 3.75kWp system on my roof.
Today there's full cloud cover, and it was only generating about 0.4kW when I looked just a minute ago. Generation is poor when there's no direct sunshine.
I boiled the kettle a few minutes ago, and it draws about 3kW.
For those three minutes while the kettle was on, I probably used all of the 0.4kW my panels were generating and drew 2.6kW from the grid.

When I boiled the kettle yesterday morning around the same time, it was quite bright and sunny, so my panels were producing at about 2.5kW, and the remaining 0.5kW shortfall to run the kettle would have been drawn from the grid.
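A quick arithmetic check of those two mornings (a sketch, assuming the figures held steady over the few minutes the kettle ran):

    kettle_kw = 3.0
    print(kettle_kw - 0.4)  # 2.6kW from the grid today, under full cloud
    print(kettle_kw - 2.5)  # 0.5kW from the grid yesterday, bright and sunny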
 
If, though, you up the demand, i.e. put on a kettle and take the power over 4kW, then the chances are the whole load will be taken from the mains, as the power drop across the inverter will drop the array voltage and in essence the inverter stops. Once you go back below the threshold, and the voltage from the array is again higher than the mains voltage, the system starts to work again.
No matter what power you draw, the inverter shouldn't 'stop'. If you are drawing more than the inverter is kicking out, you would use all of that and then start drawing on the grid to make up the shortfall.
 
I am not an electrician, so I am trying to understand, and report back to a customer, whether the electricity he uses while the PV array is generating is 100% from the array, or whether the power is 50% from the grid and 50% from the array because the system balances itself. The customer was an electrical engineer, so I guess this may be a valid question, and when I asked our electrician he admitted that it makes sense and could be the case - but he didn't know. This is worrying me, and I don't know if I am making myself clear - but does anyone have an opinion?


Surely if the client has access to both import and generation figures, and the generation figure is greater than the import meter reading, you could safely assume he is using the energy he produces. This can be backed up manually by viewing the electricity (DNO) meter and seeing that no electricity is actually being imported at any given time while the system is producing. Night time would obviously be taken entirely from the grid, unless the customer has some sort of battery backup system in place.

However, if the "pressure" from the PV system at any time isn't equal to or greater than what the house is drawing, the house will indeed consume from the National Grid to make up the difference.

I believe the actual input (grid) and generation (inverter) voltages may play a part in this too... but under normal circumstances it wouldn't be detected.
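A minimal sketch of that check, assuming you can take an instantaneous import reading from the DNO meter and a generation reading from the inverter (the function and parameter names are invented for illustration):

    def self_consuming(import_kw, generation_kw):
        # If nothing is being imported while the array is generating,
        # the house load must be coming entirely from the PV.
        return generation_kw > 0 and import_kw == 0

    print(self_consuming(import_kw=0.0, generation_kw=2.1))  # True
    print(self_consuming(import_kw=0.6, generation_kw=2.1))  # False - topping up from the grid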
 
No matter what power you draw, the inverter shouldn't 'stop'. If you are drawing more than the inverter is kicking out, you would use all of that and then start drawing on the grid to make up the shortfall.

I will have to bow to you there, YM, as I don't work on PV as such.

I just thought that, because of how an inverter works, it will always open to the side with the highest voltage, so if your array is producing 270 volts, say, and the mains is at 243V, then it allows the PV-side generation to "flow".

When the load in the installation demands more than the array produces, doesn't the voltage drop on the PV side to the point where the mains voltage takes over? Or is the design such that the array will always over-volt the mains?
 
My understanding (which may of course be flawed!) is that the inverter always pumps out at just over mains voltage.
 
The inverter works dynamically as conditions change and will adjust its terminal voltage to ensure all the current it is generating is used, so if the voltage at the head changes, the inverter compensates at its own terminals. How it does it I don't know!
 
I guess this is the reason voltage drop tolerances are so low on PV: the inverter can measure the voltage at its input and increase its output voltage by X (any ideas what X is?) to ensure the PV current gets priority over imported current? If the voltage drop is too big on the cable and the inverter can't tell what the voltage is at the other end, then the imported voltage may be greater, meaning inefficient use of the generated current?

Is that correct?
 
I am ... trying to understand and report back to a customer whether the electricity he uses while the PV array is generating is 100% from the array.

The MPP tracker(s) in the inverter will optimise the current drawn from the array to keep it at the ideal working point. To do this it will also vary its AC terminal voltage slightly so that it supplies the corresponding amount of power to the consumer unit/Henley block (the power has to go somewhere!). It does this by measuring its own output current, so it does not need to know the voltage at the other end.

Hence its terminal potential will, as moggy and yvm say, be slightly above the mains voltage at the CU/HB. (If the cabling is sized as per the DTI recommendations, the difference will be at most 2.4V.)

This in turn means that while the generated power is sufficient for the house loads, the excess power will run "downhill" to the grid and so no power will flow "uphill" into the house.

You can therefore reassure the customer that the power he generates will be used fully in the house before any of it is exported, and conversely he will only import if the array power is insufficient.
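As a toy illustration of the inverter lifting its terminal voltage until its current finds a home (purely a sketch: the resistance, gain and target figures are invented, and a real inverter's control loop is far more involved):

    # Toy model: the inverter nudges its AC terminal voltage until the
    # current it pushes down the cable matches what the array is producing.
    cable_resistance = 0.08   # ohms - assumed tail from inverter to CU
    mains_v = 240.0           # assumed voltage at the consumer unit / Henley block
    target_a = 10.0           # current the MPP tracker wants to deliver

    terminal_v = mains_v
    for _ in range(100):
        current_a = (terminal_v - mains_v) / cable_resistance
        terminal_v += 0.01 * (target_a - current_a)   # small corrective nudge

    print(round(terminal_v, 2))  # ~240.8 - slightly above the mains, as described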

HTH
 
The inverter works dynamically as conditions change and will adjust its terminal voltage to ensure all the current it is generating is used, so if the voltage at the head changes, the inverter compensates at its own terminals. How it does it I don't know!

It's part of the installation process, I really would have thought a man in your position would know this!!
What you do is cut a chicken's head off, shake it over the inverter and then it works!
 
I guess this is the reason voltage drop tolerances are so low on PV: the inverter can measure the voltage at its input and increase its output voltage by X (any ideas what X is?) to ensure the PV current gets priority over imported current? If the voltage drop is too big on the cable and the inverter can't tell what the voltage is at the other end, then the imported voltage may be greater, meaning inefficient use of the generated current?

Is that correct?
The inverter can't measure the voltage at the other end of the cable because it's got no direct access to it. My guess is that if it's trying to output 10A (for instance), it'll increase its terminal voltage until it's successful. If a cloud comes over and the DC power drops so that it's now only trying to output, say, 4A, then it'll reduce its terminal voltage to compensate for the lower voltage drop in the supply cable coming from it!

The reduced voltage drop tolerance is because inverters can only work over a certain range, and if the head voltage is at its maximum, the voltage needed at the inverter's terminals to force the current down the line could be quite a bit higher, so having a 1% limit ensures the inverter's voltage range is not exceeded.
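Putting rough numbers on that (a sketch: the 253V figure is the UK 230V +10% statutory maximum, while the 264V trip point is an assumed figure, not one I can quote from a standard - check the real setting):

    nominal_v = 230.0
    head_max_v = nominal_v * 1.10        # 253.0V - worst-case grid voltage (+10%)
    drop_allowance = 0.01 * nominal_v    # 2.3V - the 1% cable-drop limit
    inverter_v = head_max_v + drop_allowance
    print(round(inverter_v, 1))          # 255.3V at the inverter terminals

    assumed_trip_v = 264.0               # assumed over-voltage trip point
    print(inverter_v < assumed_trip_v)   # True - still inside the working range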

This is only my interpretation of how I think it all works; I'd welcome any comments to correct me if it's wrong!
 
So, if your mains is at 235V, say, you could allow more than a 1% voltage drop in your cables and it would still work?
 
So, if your mains is at 235V, say, you could allow more than a 1% voltage drop in your cables and it would still work?

In theory what you are saying is correct, but your voltage wouldn't stay at 235V all the time - it varies with your loadings and your neighbours'. You have to allow for worst-case conditions.
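For instance (same assumed 264V trip point as in the earlier sketch):

    head_v = 235.0
    assumed_trip_v = 264.0                 # same assumed trip point as above
    print(assumed_trip_v - head_v)         # 29.0V of headroom at a 235V head

    worst_case_head_v = 253.0              # 230V + 10%
    print(assumed_trip_v - worst_case_head_v)  # 11.0V - the case the design must cover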
 
