coxy321 wrote: Also, having a flow rate too fast would impede on the front heat transfer unit's ability to cool the coolant as it passes through.

The reason this isn't right is that the radiator pulls heat from whatever water is in it, continuously. It's a continuous cycling process, so it simply comes down to how much heat the intercooler is adding versus how much heat the radiator is removing. It's more like a duty cycle: coolant is in the radiator 50% of the time and in the intercooler 50% of the time.
The extra factor is that the greater the difference between the coolant temperature and the ambient air temperature, the more effective the radiator is: heat transfer is proportional to the temperature difference (Newton's law of cooling).
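To put a rough number on that proportionality, here is a minimal sketch. The conductance value `hA` is a made-up illustrative figure, not a real radiator spec:

```python
# Newton's law of cooling sketch: heat rejected scales linearly with
# (coolant temperature - ambient temperature).
hA = 250.0                    # W/K, radiator conductance (assumed value)
for dT in (10, 20, 40):       # coolant minus ambient, in K
    # Q = hA * dT: doubling the temperature difference doubles the heat rejected
    print(f"dT = {dT:2d} K -> Q = {hA * dT:.0f} W")
```

So a radiator running 40 K above ambient sheds four times the heat of the same radiator at 10 K above ambient, which is why letting the coolant in the radiator get hot is not automatically bad for total heat rejection.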
In theory, with perfect flow, optimal cooling comes from infinitely fast water flow and the worst cooling from zero flow. At infinitely fast flow the intercooler, the water, and the radiator would all sit at the same intermediate temperature. In reality the intercooler runs hotter, and the radiator cooler, than that ideal.
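A simple lumped two-node model shows this limit. Treat the loop as a hot slug in the intercooler and a cold slug in the radiator: at steady state the radiator must reject all the intercooler heat (which fixes the radiator-side temperature regardless of flow), while the hot/cold split across the loop shrinks as flow rate rises. All the numbers below are assumed, illustrative values:

```python
# Steady-state lumped model of an intercooler/radiator coolant loop.
# Energy balance gives two relations:
#   Q = hA * (T_rad - T_amb)        -> radiator-side temp, flow-independent
#   T_ic - T_rad = Q / (mdot * c)   -> loop temperature split, shrinks with flow
Q = 5000.0      # W, heat dumped into the coolant by the intercooler (assumed)
hA = 250.0      # W/K, radiator conductance to ambient (assumed)
c = 4186.0      # J/(kg*K), specific heat of water
T_amb = 25.0    # deg C, ambient air

T_rad = T_amb + Q / hA                # same for every flow rate
for mdot in (0.05, 0.2, 1.0, 5.0):    # coolant mass flow, kg/s
    T_ic = T_rad + Q / (mdot * c)     # intercooler-side coolant temperature
    # faster flow -> intercooler-side temp approaches the radiator-side temp
    print(f"flow {mdot:4.2f} kg/s -> radiator {T_rad:.1f} C, "
          f"intercooler {T_ic:.1f} C")
```

In this model the intercooler-side temperature falls monotonically toward the radiator-side temperature as flow increases, and there is no flow rate at which "too fast" hurts: the supposed "not enough time in the radiator" penalty never appears, because less heat per pass is exactly offset by more passes per second.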
In practice I'd guess there are some mechanical reasons why this breaks down at really high flow rates (pump power and cavitation, say), but it's nothing to do with how long the water spends in the intercooler as such, or "having enough time" to take heat out; it would be down to other factors.
I don't want to spoil this thread; it might be better to discuss this elsewhere.