You would not be concerned with the VD at 32 amps?
Is that because the VD is only a few percent over, or because you doubt they would ever be loaded heavily enough to reach 32 amps?
A combination of both, really. Here's how I look at it:
A supply voltage of 216V (the lower limit permitted to be supplied) on a circuit with a 5% voltage drop gives 205.2V at its furthest point under load (216 x 0.95 = 205.2) - quite low, yet perfectly acceptable under the regulations.
A supply voltage of 245V (common in my experience - far more likely to come across than a nominal 230V) on a circuit with a 10% voltage drop gives 220.5V at its furthest point under load (245 x 0.90 = 220.5). That's twice the permitted voltage drop as a percentage, so non-compliant with the regs, yet the working voltage is roughly 15V higher than in the previous, compliant example, and much closer to the 230V the equipment is designed for.
So the working voltage in the 2nd example is 'better' than in the 1st, despite the circuit being non-compliant.
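A quick sketch of the arithmetic behind both examples - the supply voltages and drop percentages are the ones above, nothing else is assumed:

```python
def working_voltage(supply_v: float, drop_pct: float) -> float:
    """Voltage at the furthest point of the circuit under full load."""
    return supply_v * (1 - drop_pct / 100)

# Example 1: low supply, compliant 5% drop
v1 = working_voltage(216, 5)    # 205.2 V

# Example 2: high supply, non-compliant 10% drop
v2 = working_voltage(245, 10)   # 220.5 V

print(f"Compliant circuit:     {v1:.1f} V")
print(f"Non-compliant circuit: {v2:.1f} V ({v2 - v1:.1f} V higher)")
```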
The downside to the second example is that more power is wasted as heat in the cables under load. For a circuit expected to be fully loaded for long periods (e.g. an EV charger), this would be a costly, wasteful problem. However, the average ring final is unlikely to be fully loaded often, if ever - and even when it is, it's unlikely to stay fully loaded for any length of time, or to have the bulk of the load concentrated at the mid-point of the circuit.
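For a sense of scale, here's a minimal sketch of the worst case: the full 32A from the question sitting at the far end of the circuit, so the entire rated drop applies. The power lost in the cable is then the voltage drop times the current (P = V_drop x I). The loss figures are illustrative assumptions, not measurements:

```python
def cable_loss_watts(supply_v: float, drop_pct: float, load_a: float) -> float:
    """Power dissipated as heat in the cable at the given load.

    Assumes the full rated drop occurs at this load, i.e. the whole
    load is at the far end of the circuit (worst case).
    """
    v_drop = supply_v * drop_pct / 100
    return v_drop * load_a

p1 = cable_loss_watts(216, 5, 32)    # 10.8 V drop -> ~346 W
p2 = cable_loss_watts(245, 10, 32)   # 24.5 V drop -> ~784 W
print(f"5% drop circuit:  {p1:.0f} W lost in the cable")
print(f"10% drop circuit: {p2:.0f} W lost in the cable")
```

Even that ~784W worst case only costs anything while the circuit is actually held at full load, which, as above, is rare for a ring final.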
In short, it's unlikely to happen, and if it does, the consequences are likely to be negligible. I'd say the cost of rewiring the circuits would outstrip any savings from reduced cable losses many times over across the life of the circuit.