Fuses of any sort do not have a single blowing point. Instead they have a current-time curve, from which you can read off how long the fuse takes to blow at a given current level.
Generally, once fusing times get beyond several minutes, the result becomes quite environment-dependent: it comes down to how fast heat escapes versus how fast the wire approaches its melting point. As the fusible wire heats up, its resistance increases, which further increases the I²R heating (assuming the fuse resistance is much lower than the total circuit resistance, so the current stays roughly constant) and so accelerates the process. As a result, fuses show a rapid decrease in fusing time as the current increases.
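You can see this runaway behaviour in a toy lumped-element model: heat the wire with I²R, lose heat proportionally to the temperature rise, and let resistance climb with temperature. All the parameter values below (cold resistance, tempco, heat capacity, loss coefficient, melting point) are made-up illustrative numbers, not from any real fuse datasheet:

```python
# Minimal lumped-element sketch of thermal runaway in a fuse wire.
# All parameters are illustrative assumptions, not real fuse data.

def time_to_blow(current, dt=1e-3, t_max=600.0):
    """Euler-integrate the wire temperature until it reaches T_melt.

    Returns fusing time in seconds, or None if the wire never melts
    within t_max (heat loss balances the I^2*R heating).
    """
    R0, alpha = 0.05, 0.004       # cold resistance (ohm), tempco (1/K) -- assumed
    C, k = 0.01, 0.005            # heat capacity (J/K), loss coeff (W/K) -- assumed
    T_amb, T_melt = 25.0, 1000.0  # ambient and melting temperature (deg C)
    T, t = T_amb, 0.0
    while t < t_max:
        R = R0 * (1 + alpha * (T - T_amb))        # resistance rises with temperature
        net = current**2 * R - k * (T - T_amb)    # I^2R heating minus heat loss
        T += net / C * dt
        t += dt
        if T >= T_melt:
            return t
    return None

for amps in (4, 5, 7, 10):
    t = time_to_blow(amps)
    print(f"{amps:>2} A -> {'never blows' if t is None else f'{t:.2f} s'}")
```

With these made-up numbers, 4 A settles at a safe equilibrium temperature and never blows, while the fusing time collapses quickly as the current rises past the runaway threshold, just like the steep left-hand side of a real time-current curve.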
But to answer your question: to characterise a fuse you would typically subject it to a known overload, say 2-5 times its rated current, time how long it takes to blow, and compare that against the expected characteristics. Since fuses are one-time items, properly characterising them gets expensive!
Usually what you want to know is the maximum continuous current. That is typically below the acting point by a factor of about 1.45 for HRC (high rupture capacity) fuses such as BS88 types, and by a factor of about 2 for old open-wire fuses. Bear in mind, though, that reaching the acting point can take many hours!
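As a quick sanity-check helper, those rule-of-thumb factors translate into a one-liner. The 1.45 and 2 figures are the ballpark values quoted above, not datasheet values for any specific part:

```python
# Rule-of-thumb factors from the answer above: the acting (conventional
# fusing) current sits roughly 1.45x above the continuous rating for a
# BS88-style HRC fuse, and roughly 2x for an old open-wire fuse.
# These are ballpark figures, not datasheet values.

FUSING_FACTOR = {"hrc": 1.45, "open_wire": 2.0}

def conventional_fusing_current(rated_amps, fuse_type):
    """Estimate the current at which the fuse will (eventually) act."""
    return rated_amps * FUSING_FACTOR[fuse_type]

print(conventional_fusing_current(10, "hrc"))        # 14.5
print(conventional_fusing_current(10, "open_wire"))  # 20.0
```

So a 10 A HRC fuse may carry close to 10 A indefinitely, yet not be guaranteed to act until the current approaches 14.5 A, and even then only after a long time.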
In terms of matching a fuse to a cable there are many factors to consider, mainly how well the cable is thermally insulated and what sort of overload events might occur. But a quick search for automotive fusible links suggests stepping down 4 AWG sizes (i.e. a 14-gauge link protects 10-gauge wire).
Here is a typical set of curves for an HRC fuse. You can see that a 1 A fuse needs about 2 A to blow in 5 minutes and about 2.6 A to blow in 1 second, whereas a 10 A fuse likewise needs about 20 A to blow in 5 minutes (the same ×2 factor) but needs about 45 A to blow in 1 second:
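(If you want rough numbers between the plotted points, these curves are close to straight lines on log-log axes over short stretches, so you can fit a power law through two readings. The fit below uses the two values read off the 1 A curve above; real curves are not exactly power laws, so treat the result as a rough estimate only:)

```python
import math

# Log-log interpolation between two points read off the 1 A HRC curve
# above (2 A -> ~300 s, 2.6 A -> ~1 s). A rough estimate only: real
# time-current curves are not exact power laws.

def fit_power_law(p1, p2):
    """Fit t = a * I**b through two (current, time) points."""
    (i1, t1), (i2, t2) = p1, p2
    b = (math.log(t2) - math.log(t1)) / (math.log(i2) - math.log(i1))
    a = t1 / i1**b
    return a, b

a, b = fit_power_law((2.0, 300.0), (2.6, 1.0))
for amps in (2.0, 2.2, 2.6):
    print(f"{amps} A -> ~{a * amps**b:.1f} s")
```

The fitted exponent comes out around -20, which shows just how steep these curves are: a few percent more current cuts the fusing time by a large factor.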