I'm not convinced by this sort of thing.
Different types of appliances will react differently to the reduction in voltage. But simply:
Resistive loads (eg convector heaters)
Their output will be reduced. The only time this is a benefit is if the heater was oversized for the room anyway.
If thermostatically controlled, the same amount of energy will be used overall, as the heater will need to be on for longer (a higher duty cycle) to reach the set temperature.
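The arithmetic behind that can be sketched in a few lines: a heating element is a fixed resistance, so power scales with voltage squared, and the thermostat simply stretches the on-time to deliver the same energy. The 230 V / 2000 W rating below is an illustrative example, not a figure from any particular heater.

```python
# Resistive load: fixed R, so P = V^2 / R. A thermostat compensates for
# lower power with a longer on-time, keeping total energy the same.

def resistive_power(voltage, rated_voltage=230.0, rated_power=2000.0):
    """Power drawn by a fixed-resistance element, given its rating."""
    resistance = rated_voltage ** 2 / rated_power  # R = V^2 / P
    return voltage ** 2 / resistance

p_230 = resistive_power(230.0)  # 2000 W at rated voltage
p_220 = resistive_power(220.0)  # reduced output at lower voltage

# Same energy into the room => on-time stretches by the power ratio.
duty_cycle_factor = p_230 / p_220
print(f"{p_220:.0f} W, on-time x{duty_cycle_factor:.3f}")
# -> 1830 W, on-time x1.093
```

So a ~4% voltage drop cuts the heater's output by ~9%, and the thermostat just runs it ~9% longer.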
Lighting loads (filament and magnetic ballast discharge)
Light output will be reduced, colour temperature will shift.
Lamp life will increase in most cases.
Motor loads (Fridge, Freezer)
A small reduction in available torque, but the appliance should have been designed to work correctly across the whole voltage range, so a small efficiency gain is possible.
Equipment with linear power supplies (Hi-Fi amplifier, portable audio, alarm panel, etc.)
Small efficiency gains are possible with items containing linear regulators.
But the maximum output of the amplifier will be reduced, as the power supply for the output stage generally has no regulation.
Equipment with switching power supplies (PC, TV, Freeview/Sky, advanced microwaves, HF Lighting ballasts,...)
THIS IS THE KILLER.
Most switching power supply designs are wide-range (90-264V), to use on any electrical system in the world.
They exhibit a constant-power characteristic: the supply draws the same power regardless of input voltage.
So, as you reduce the input voltage, the input current increases.
In this case there can be no benefit from reducing the voltage; the higher current actually causes more I²R voltage-drop losses in the wiring.
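To put rough numbers on the constant-power point: for a fixed load power, input current is inversely proportional to voltage, and wiring loss goes as I²R. The 100 W load, 90% efficiency, and 0.5 Ω circuit resistance below are all illustrative guesses, not measured values.

```python
# Wide-range SMPS: fixed output power, so input current rises as the
# input voltage falls, and I^2*R losses in the wiring rise with it.

def smps_input_current(voltage, load_power=100.0, efficiency=0.9):
    """Input current for a switching supply delivering a fixed load power."""
    input_power = load_power / efficiency  # supply draws more than it delivers
    return input_power / voltage

WIRING_R = 0.5  # ohms; assumed circuit resistance, for illustration only

for v in (243.0, 226.0):
    i = smps_input_current(v)
    loss = i ** 2 * WIRING_R  # heat dissipated in the wiring
    print(f"{v:.0f} V: {i:.3f} A, wiring loss {loss:.3f} W")
```

Running it shows the lower supply voltage drawing more current and wasting more in the cable, which is exactly the opposite of what the "voltage optimisation" sales pitch promises.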
My next-door neighbours just moved to a new house and wanted a few little jobs doing. They asked me if something was wrong with the electrics at the new place, as the kettle was taking longer to boil and the shower was pathetic (funnily enough, the exact same models and ratings).
Old house: 243V. New house: 226V.
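The kettle arithmetic checks out: a resistive element at 226 V delivers (226/243)² of the power it did at 243 V, so the boil takes that much longer. This ignores heat loss from the kettle body, which only makes the slowdown worse.

```python
# Same kettle, two supply voltages: power scales with V^2 for a
# resistive element, so boil time scales with the inverse.

v_old, v_new = 243.0, 226.0
power_ratio = (v_new / v_old) ** 2  # fraction of the old power
time_factor = 1.0 / power_ratio     # how much longer the boil takes

print(f"power: {power_ratio:.1%} of old, boil time x{time_factor:.2f}")
# -> power: 86.5% of old, boil time x1.16
```

A 7% drop in voltage costs about 13.5% of the kettle's power, which is easily noticeable at the kitchen counter.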
Does anyone remember those magnets that you could stick onto your car's fuel line to increase mpg lol?