The starter current does dwarf the alternator current yes. And the cable can of course handle the starter current. But what difference does that make to the resistance of the cable?
It means that ANY cable which can handle the HUNDREDS of amps to turn the starter motor without any significant voltage drop across its length must -by mathematical inference- be capable of passing the few amps required to charge the battery. -Remember that the typical alternator can only generate in the region of one hundred amps maximum, and the battery charge rate is usually a small fraction of that.
Thus ohms law tells us that the charging voltage drop must be TINY, if the cable can turn a starter motor.
THAT's what 'difference' it makes to the analysis.
An increased resistance in a charging circuit does NOT in any way alter the voltage to which a circuit component such as a battery or a capacitor charges. -It limits the maximum RATE at which the component can charge.
For a capacitor (capacitors are simpler to analyse than batteries, which are a more complex component), the standard relationship is given as T=RC, where T is the time to reach 63% of the supply voltage in seconds, R is the total resistance in circuit in ohms, and C is the capacitance in Farads.
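To make that T=RC relationship concrete, here's a quick Python sketch. The numbers are purely illustrative (a 1-Farad 'supercap' and a 14.4 V alternator-style supply are my assumptions, not anything from a real car):

```python
import math

# Illustrative values only:
R = 0.01         # total circuit resistance, ohms
C = 1.0          # capacitance, farads (a big 'supercap' for easy numbers)
V_supply = 14.4  # a typical alternator-style output, volts

T = R * C        # the RC time constant, in seconds

def v_cap(t):
    """Capacitor voltage t seconds after charging starts from 0 V."""
    return V_supply * (1 - math.exp(-t / T))

# After one time constant, the capacitor sits at 1 - 1/e (about 63%)
# of the supply voltage, which is where the '63%' in T=RC comes from:
print(f"T = {T} s; voltage at t=T: {v_cap(T):.2f} V "
      f"({v_cap(T)/V_supply:.1%} of supply)")
```

Note how a smaller R only makes T shorter: it changes how FAST the capacitor gets there, never WHERE it ends up.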
A battery is more complex because it involves a chemical transfer of energy (which in a capacitor can be considered analogous to dielectric absorption), but PART of the model is an outright refusal to let the charge voltage across its terminals rise beyond a (chemically-defined) fixed value, with any excess converted into heat. In a 'capacitor' model, this would be like a reverse-biased zener diode in parallel. -Any attempt to charge beyond a certain point (no matter what the series resistance) is dissipated as heat.
...and you DON'T want excess heat pointlessly dissipated through a fully charged battery.
Now let's briefly consider the scale of the numbers involved:
For a load to pull 12 amps DC from 12 volts DC, its resistance has to be 1 ohm. For a load to pull 120 amps, its resistance must be one tenth of an ohm. If the cable in series has even so much as one tenth of an ohm of resistance, it would comprise 50% of the total circuit resistance, and as such would dissipate 50% of the power, and the starter would only see six of the original twelve volts. -That doesn't just go for the cable's resistance either: contact resistance, cable inductance, -everything- in short, the TOTAL impedance of the circuit to and from the starter motor must be under a hundredth of an ohm, or you're going to get a slow crank.
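Here's that starter-circuit arithmetic as a quick Python sketch, treating the cable and starter as a simple voltage divider (the 120 A / 0.1-ohm starter figure is the illustrative one from above):

```python
# Illustrative starter-circuit numbers:
V_batt = 12.0       # battery voltage
R_starter = 0.1     # ohms -- a starter pulling 120 A at 12 V

def starter_voltage(r_cable):
    """Voltage actually seen at the starter, given the total series
    cable/contact impedance r_cable (simple voltage divider)."""
    return V_batt * R_starter / (R_starter + r_cable)

# A 0.1-ohm cable splits the voltage 50/50 with the starter:
print(starter_voltage(0.1))    # ~6 V at the starter -- slow crank
# Keep the total impedance under a hundredth of an ohm instead:
print(starter_voltage(0.01))   # ~10.9 V -- healthy crank
```

The point of the sketch: a tenth of an ohm anywhere in that loop halves the starter voltage, which is exactly why any cable that cranks properly must be well under a hundredth of an ohm end to end.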
Now, if a battery INITIALLY charges at 20 amps (for example, after a significant discharge from a long period of cranking, trying to start an engine with some sort of problem) then the voltage drop across the one-hundredth of an ohm at 20 amperes is 0.2 volts.
So you MIGHT raise the voltage of the alternator by 0.2 volts, and that would fix it, -right?
-Wrong. -Because as the battery's internal charge was restored, its charge current would drop to a trickle. -Say it would be around 1 amp, just for some easy mathematics. -Now the voltage drop across the one-hundredth of an ohm total circuit impedance would only be a hundredth of a volt... so we'd be overcharging by 0.19 volts (which is actually a significant amount to a battery), and the current would in fact never drop off to that small degree, because the battery would be trying to 'sink' the extra voltage.
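The same Ohm's-law arithmetic, sketched in Python with the illustrative figures above (20 A bulk charge, 1 A trickle, a hundredth of an ohm of circuit impedance):

```python
# Illustrative numbers from the scenario above:
R_circuit = 0.01   # total charging-circuit impedance, ohms

def cable_drop(i_charge):
    """Voltage lost across the circuit at a given charge current."""
    return R_circuit * i_charge

drop_bulk = cable_drop(20.0)    # freshly discharged battery: 0.2 V
drop_trickle = cable_drop(1.0)  # nearly full battery: 0.01 V

# Raise the alternator by 0.2 V to 'compensate' for the bulk-charge
# drop, and once the current tapers off the battery is overcharged:
overcharge = 0.2 - drop_trickle
print(f"bulk drop: {drop_bulk:.2f} V, trickle drop: {drop_trickle:.2f} V")
print(f"overcharge once current tapers: {overcharge:.2f} V")
```

Which is the whole point: a fixed voltage bump cannot compensate for a drop that varies with current.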
Adjusting the alternator voltage to address this problem would be entirely the WRONG thing to do. Minimising the circuit impedance is the RIGHT thing to do.
I've relocated batteries on a couple of race cars before now (to the front in a rear-engined aircooled VW and to the rear in a front-engined Porsche; to the spare wheel well in both cases), and welding cable works well enough for a racecar. In neither case did the starter motor slow down AT ALL appreciably, and the batteries charged just perfectly, to the correct voltage, with absolutely NO measurable difference in charging current WHATSOEVER.
Super-low-gauge welding cable is perfect for an occasional track car, but I'd say it's NOT recommended for daily street use -and not because it can't handle the current. It's usually VERY finely stranded (which makes it flexible and easier to use on welding jobs... though such soft, supple flexibility isn't much of a benefit in a permanent install like on a car), and as such it has a high surface-area-to-sectional-area ratio. Around a car battery there are frequently lots of nasty corrosive gases in the environment, and they tend to gain ingress to the cable through the end. Once sulphuric gases get into the end of welding cable, that end quickly turns into a green, brittle, high-resistance nightmare. -Depending on the type of insulation (often a rubbery compound), that too may not cope well with either the inhospitable chemical environment near the battery, or the heat at the engine-bay end. -But like I say, it's GREAT for a racecar, where changing it out is just part of the 'cost of doing business'.
So what's the summary? -For the cable resistance to slow the charging rate enough to have ANY effect on battery charge, that resistance would have to be so appreciable that your starter motor wouldn't turn... -simple as that.
Different types of battery may require slightly different alternator output voltages for reliable charging, but for this hypothetical analysis, you don't need to change it at all. The voltage drop across ANY cable that can run the starter motor will be so minuscule that it will fall WELL within the alternator output's window of regulation, even when everything is in perfect condition.