I'll try to explain as I understand it:
At the origin of a single phase installation, you have a Line at 230V, and a Neutral at 0V - a potential difference of 230V.
Let's build a simple circuit - 10m of 2.5mm² T+E feeding a heater pulling 13A. That 13A will flow from the origin, through 10m of line conductor, through the heater, through 10m of neutral conductor back to the origin.
The bulk of the circuit resistance is in the heater, so the bulk of the potential difference will be across the heater. But the circuit conductors have a small resistance too, so there will be a small potential difference across the 10m of line conductor, and across the 10m of neutral conductor. Added together, these two small potential differences are the 'voltage drop'.
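If it helps to see where the volts go, here's a rough sketch in Python (my own illustration, not from the OSG). I've modelled the heater as a fixed resistance of 230V / 13A, which is a simplification, and borrowed the 70 deg C conductor resistance that gets worked out below:

```python
# Rough series-circuit picture: heater vs. cable resistance.
V_SUPPLY = 230.0            # volts at the origin
R_HEATER = V_SUPPLY / 13.0  # ~17.7 ohms (assumption - a real element varies with temperature)
R_CABLE = 0.17784           # line + neutral, 20m of 2.5mm² at 70 deg C (calculated below)

current = V_SUPPLY / (R_HEATER + R_CABLE)
v_across_cable = current * R_CABLE
v_across_heater = current * R_HEATER

print(f"Current: {current:.2f} A")
print(f"Across the heater: {v_across_heater:.2f} V")
print(f"Across the conductors (the 'voltage drop'): {v_across_cable:.2f} V")
```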
From the OSG, p196:
1m of 2.5mm² copper conductor has a resistance of 0.00741 ohms at 20 deg C.
Our circuit has 20m of conductor in total (10m of line plus 10m of neutral), so: 0.00741 X 20 = 0.1482 ohms at 20 deg C.
The resistance of a conductor increases as it gets warmer, so we want to calculate the voltage drop at the maximum operating temperature of the conductor, when the resistance, and therefore voltage drop would be at its greatest. For T+E this is 70 deg C. A conductor at 70 deg C will have roughly 1.2 times the resistance of a conductor at 20 deg C.
So, 0.1482 ohms X 1.2 = 0.17784 ohms at 70 deg C.
V=IR
V = 13 X 0.17784
V = 2.31192V. This is your voltage drop for the circuit.
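Here's that first method as a few lines of Python, just to show the steps in one place (the figures are the OSG ones quoted above):

```python
# Method 1: resistance per metre, corrected to operating temperature.
R_PER_M_20C = 0.00741   # ohms per metre, 2.5mm² copper at 20 deg C (OSG)
LENGTH_M = 20           # 10m line + 10m neutral
TEMP_FACTOR = 1.2       # roughly, 20 deg C -> 70 deg C
CURRENT_A = 13

r_20 = R_PER_M_20C * LENGTH_M   # 0.1482 ohms
r_70 = r_20 * TEMP_FACTOR       # 0.17784 ohms
v_drop = CURRENT_A * r_70       # V = I x R

print(f"Resistance at 70 deg C: {r_70:.5f} ohms")
print(f"Voltage drop: {v_drop:.5f} V")   # ~2.31 V
```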
Let's check that it tallies with the other method. From the OSG, p161:
1m of 2.5mm² T+E has a VD of 0.018V/A/m. That's the VD for both line and neutral conductors together, at 70 deg C.
0.018 X 13 X 10 = 2.34V
More or less the same; the very slight difference is, I expect, down to rounding.
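And the second method as the same sort of sketch, for comparison:

```python
# Method 2: tabulated voltage drop figure (18 mV/A/m for 2.5mm² T+E,
# which already covers line + neutral at 70 deg C).
VD_PER_A_PER_M = 0.018   # volts per amp per metre
CURRENT_A = 13
RUN_LENGTH_M = 10        # one-way circuit length

v_drop = VD_PER_A_PER_M * CURRENT_A * RUN_LENGTH_M
print(f"Voltage drop: {v_drop:.2f} V")   # 2.34 V - agrees with method 1 to within rounding
```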
Note that I prefer to always work in volts rather than millivolts, which I think can sometimes confuse the calculations.