The gist of the OP's question is why not make the default single-phase voltage, i.e. the lowest voltage available on the system, 400V rather than 230V, regardless of the phase arrangement. So discount whether it's Y or delta, single or three phase, and just consider whether 400V light bulbs, hairdryers, phone chargers etc. are sensible, practical and worth the saving in copper.
The answer, I suspect, is no.
Historically, the choice of voltage was significantly dictated by both carbon-arc and filament lamp design, and 120 years ago both worked better at 120V than at 240V. Metal filament lamps are stronger, last longer and are more efficient at lower voltages (hence 12V halogens etc.). In the UK we actually preferred 120V lamps in series pairs for certain stage-lighting purposes instead of 240V, and projector lamps were 120V fed from a transformer, because 230V lamps were so fragile and inefficient. We got good at making 230V general-purpose lamps, but 400V was basically unachievable, so there never was a 400V incandescent lighting option. None of that really applies in 2020.
The fragility argument also applies to many wound components: a small 400V transformer primary or relay coil is more expensive to make and more prone to failure than a 230V one, because it needs many turns of very fine wire. Even 230V can be a problem - many small mechanical timeswitch motors (e.g. the defrost timer in the freezer and plug-in timers) actually used a 120V or lower-voltage motor fed by a capacitor dropper, because a 230V type would be too expensive and fragile. 400V would be much more of a problem still.
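To put rough numbers on the dropper problem, here's a minimal Python sketch. The 120V winding voltage and 10mA motor current are invented, illustrative figures, not from any particular product:

```python
import math

F = 50.0         # mains frequency, Hz
I = 0.010        # motor current, A (assumed illustrative figure)
V_MOTOR = 120.0  # motor winding voltage, V (assumed illustrative figure)

def dropper(v_supply):
    # Treat the motor as roughly resistive; the series capacitor's
    # voltage is in quadrature with it, so the voltages add vectorially.
    v_cap = math.sqrt(v_supply**2 - V_MOTOR**2)  # volts across the capacitor
    xc = v_cap / I                               # required reactance, ohms
    c = 1.0 / (2 * math.pi * F * xc)             # capacitance, farads
    return v_cap, c

for v_supply in (230.0, 400.0):
    v_cap, c = dropper(v_supply)
    print(f"{v_supply:.0f}V supply: capacitor drops {v_cap:.0f}V, "
          f"~{c * 1e9:.0f} nF, rated for {v_supply * math.sqrt(2):.0f}V peak")
```

The current through the motor is the same either way; the 400V version just needs a capacitor rated for a ~566V peak rather than ~325V, which is the expensive part.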
Discharge lamps with ballasts dictate their own voltage, and the ballast takes up the difference between that and the supply. So a metal halide lamp running on 400V will take the same current as one on 230V, just at a lower power factor, as the (more expensive) ballast has to drop an extra 170V. Only a transformer would solve this.
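As a rough check on that claim, here is a small sketch assuming an illustrative lamp with a ~100V arc voltage and 1.8A lamp current (both invented figures), treated as roughly resistive in series with an ideal reactor ballast:

```python
import math

V_LAMP = 100.0  # arc voltage, V (assumed illustrative figure)
I_LAMP = 1.8    # lamp current, A (assumed illustrative figure)

for v_supply in (230.0, 400.0):
    # The reactor's voltage is in quadrature with the lamp's, so it
    # drops the vector difference between supply and arc voltage.
    v_ballast = math.sqrt(v_supply**2 - V_LAMP**2)
    pf = V_LAMP / v_supply  # displacement power factor
    print(f"{v_supply:.0f}V: ballast drops ~{v_ballast:.0f}V, "
          f"current still {I_LAMP}A, power factor ~{pf:.2f}")
```

Under these assumptions the lamp current is identical on either supply, but the power factor falls from roughly 0.43 to 0.25 - all the extra voltage is burned off reactively in the ballast.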
But the main event is the switched-mode power supply, where the electronics on the primary side have to operate at the peak voltage of the supply. In most electronic devices, and any appliances containing electronics, the incoming 230V AC is rectified and smoothed to about 325V DC, whereas on 400V the DC rail would be about 565V. This requires both smoothing capacitors and chopper transistors of a different tier of performance, which, with the components available today, would significantly impact the price. 565V is right at the top end of what electrolytic capacitors are capable of, and at this voltage it is not uncommon to have to use series pairs with balancing resistors. Possible, but probably not economic, given that many such power supplies draw so little power that they would contribute no realistic saving in copper. Really small loads that use capacitor droppers would, like the ballasted discharge lamps, simply have to use a more expensive capacitor and drop more volts, with no saving in current.
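A quick sketch of the rail voltages involved; the ~450V ceiling for a single aluminium electrolytic is my assumption based on typical part ratings:

```python
import math

# The smoothed DC rail of an off-line SMPS sits near the peak of the
# incoming sine wave, and every primary-side part must withstand it.
SINGLE_CAP_MAX = 450.0  # typical single-electrolytic rating, V (assumed)

for v_rms in (230.0, 400.0):
    v_dc = v_rms * math.sqrt(2)  # ideal rail, ignoring ripple and diode drops
    fix = ("one capacitor is fine" if v_dc <= SINGLE_CAP_MAX
           else "needs a series pair with balancing resistors")
    print(f"{v_rms:.0f}V AC -> ~{v_dc:.0f}V DC rail: {fix}")
```

On 230V the ~325V rail fits comfortably under a single 400-450V capacitor; on 400V the ~566V rail does not, hence the series pairs and the jump in transistor voltage class.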
Therefore I think with the state of the art, small power and lighting is still best served in the 120-230V range, with advantages to both voltages but increasingly in favour of 230V now that filament lamps are not a driving factor.