"You will have approx. 115V across each of them"
Nope. Only if they have the same resistance, which they won't. You can't even guarantee that a 120V lamp and a 120V contactor of equal power would work, because an AC contactor's impedance and power factor vary with armature position. The lamp would receive an impulse of more than 120V during pull-in, possibly blowing it, while its resistance would shoot up, reducing the voltage available to pull the contactor in. These are not simple resistors that divide the voltage ratiometrically.
Leaving aside the safety considerations and thinking only about the electrical theory...
To sense a load current, you want to impose a minimum of voltage drop, so that the load sees as near the full supply voltage as possible. E.g., with a 230V supply and 230V load, if you allow say a 1% voltage drop in the sensing circuit, that circuit must operate on 2.3V maximum. The load will behave like a current source, not a voltage source, i.e. it will dictate the current. If the load is only ever going to be a 15W lamp, its current will be 15/230 = 0.065A. That makes 0.065 × 2.3 ≈ 0.15W available to trigger the circuit. A sensitive miniature relay inside a bridge rectifier could detect this and control a contactor.
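To make that arithmetic concrete, here's a minimal Python sketch. It uses only the figures from the paragraph above (230V supply, 1% drop budget, 15W lamp):

```python
# How much power is available to a series sensing circuit
# that is only allowed to drop 1% of the supply voltage?
supply_v = 230.0
load_w = 15.0
drop_fraction = 0.01                 # 1% of supply allowed across the sensor

i_load = load_w / supply_v           # the load dictates the current: ~0.065A
v_sense = drop_fraction * supply_v   # 2.3V maximum across the sensing circuit
p_sense = v_sense * i_load           # ~0.15W available to work the detector

print(f"I = {i_load:.3f}A, V = {v_sense:.1f}V, P = {p_sense:.2f}W")
```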
It's better to do it electronically. If the load will only ever be a 15W lamp, a bridge rectifier in series with the lamp, with zener diodes in its legs, will generate a roughly constant voltage that can drive an opto-isolator. The opto's output can then trigger a thyristor to energise the contactor coil. Better still is to use a current transformer, as zenering its output will allow a wide range of loads to be detected. This makes a non-linear detection circuit: it can cope with a very wide range of load powers between the detection threshold and the maximum.
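As a rough illustration of why the zenered CT output copes with such a wide load range, here's a hypothetical model. The 500:1 turns ratio, 10kΩ burden resistor and 5.1V zener are assumed example values for the sake of the sketch, not figures from any particular design:

```python
# Hypothetical zener-clamped current-transformer output across the load range.
# Assumed example values: 500:1 CT, 10k burden, 5.1V zener clamp.
turns_ratio = 500.0
burden_ohm = 10_000.0
zener_v = 5.1

def ct_output_v(load_current_a: float) -> float:
    """Burden voltage, clamped by the zener once it exceeds the zener voltage."""
    secondary_i = load_current_a / turns_ratio
    return min(secondary_i * burden_ohm, zener_v)

# From a 15W lamp up to a full 13A load, the output stays in a narrow band,
# always enough to drive an opto-isolator LED but never destructively high.
for i in (0.065, 0.5, 2.0, 13.0):
    print(f"{i:6.3f}A -> {ct_output_v(i):.1f}V")
```

The clamp is what makes the circuit non-linear: above the detection threshold the output sits at the zener voltage regardless of load, instead of scaling with load current.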
Let's look at why a simple resistive series detection circuit, or something like a contactor coil, can't practically be used as a load detector for a range of loads.
If the sensing circuit also has to be able to carry a full 13A load, then you can't realistically allow a 1% volt drop, as the power dissipated would be 13 × 2.3 = 30W, so something in there would have to shift as much heat as a soldering iron. Let's make it 0.5V, for 13 × 0.5 = 6.5W (ample to power a large contactor). Now, when the load is the 15W lamp, it will only have 0.5/13 × 0.065 = 0.0025V across it, so the detector would have to respond to 0.0025 × 0.065 ≈ 0.00016W, i.e. about a sixth of a milliwatt, not even enough to light a high-sensitivity LED.
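The same arithmetic as a sketch, for a sense resistor sized to drop 0.5V at the full 13A load (all figures from the paragraph above):

```python
# What does a fixed resistive sensor see at 13A versus a 15W lamp?
supply_v = 230.0
sense_drop_v = 0.5               # allowed drop across the sensor at full load
i_full = 13.0                    # maximum load current
i_lamp = 15.0 / supply_v         # 15W lamp current, ~0.065A

r_sense = sense_drop_v / i_full  # fixed sense resistance, ~38 milliohms
p_full = i_full * sense_drop_v   # 6.5W dissipated at full load
v_lamp = r_sense * i_lamp        # ~2.5mV across the sensor with the lamp
p_lamp = v_lamp * i_lamp         # ~0.16mW available to detect the lamp

print(f"R = {r_sense * 1000:.0f} milliohms: 13A -> {p_full:.1f}W, "
      f"lamp -> {v_lamp * 1000:.1f}mV, {p_lamp * 1000:.2f}mW")
```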
So we see that if the sensing device is resistive, it has to deal with a very wide range of power dissipations: the square of the ratio of maximum load current to detection threshold current, here (13/0.065)² ≈ 40,000:1. From this it follows that a non-linear sensing device is needed, such as the electronically sensed CT.
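A one-liner confirms the square law, since dissipation in a fixed resistance goes as I²:

```python
# Span of powers a fixed resistive sensor must handle is the
# square of the span of currents it must detect.
i_max = 13.0           # maximum load current
i_min = 15.0 / 230.0   # detection threshold: the 15W lamp, ~0.065A

current_ratio = i_max / i_min      # ~200:1
power_ratio = current_ratio ** 2   # ~40,000:1

print(f"current span {current_ratio:.0f}:1 -> power span {power_ratio:.0f}:1")
```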