I used to lean toward a similar assumption, until someone with an actual engineering degree explained why it isn't true. According to engineers, the thing that matters is temperature delta. If the water stays in the rad longer, then yes, it will cool down more. But (a big but) once the water temp starts dropping, the heat exchange efficiency drops rapidly. Let's say you get the 80* drop someone mentioned. If the water starts at 200* and outside air temp is 100*, heat will flow great at the 100* delta. But what happens to the water that's down to, say, 130*? Now you only have a 30* delta, but you're flowing the same quantity of air through the heat exchanger. If it's cooling a 1500 lb diesel pump engine sitting beside a pond, that might be OK, but it's a massive quantity of wasted cooling drag in an a/c. And a 3-pass rad will have 3 times the flow resistance of a single pass, all else being equal, so it takes more power (at ~30% engine efficiency, 70% added heat) to pump the water against the extra resistance. The only exception I've seen that works is a 2-pass with the hotter flow behind the cooler flow, to keep the temp delta as high as possible. And it's typically done to compensate for packaging issues (face area on the a/c) rather than to improve rad efficiency.

Did you ever see a radiator with a blockage welded into it to make the water take two or even three passes across the core? A racer's trick. The flow wasn't increased, but the time the coolant was in contact with the aluminum surface was. The metal inside the engine, or the inside of the radiator, only has a certain contact area to transfer heat. The radiator didn't change size, but the coolant was "essentially" slowed to provide more contact time. You will reach a point of no return where coolant at a normal temperature will not give up additional heat because of reduced contact time.
If this were not true, a radiator could be a simple straight tube instead of a series of tubes and fins. If we needed more cooling, we would then simply increase the rate of flow with a bigger water pump.
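The two views above (fast flow / small delta vs. slow flow / more contact time) can be checked against a textbook effectiveness-NTU model of a heat exchanger. This is a sketch, not anyone's actual radiator: the `UA` value (overall conductance), coolant temps, and flow rates are all illustrative numbers I picked, and the air side is simplified to a constant temperature. What it shows is that, for a fixed core, pushing more coolant through always rejects at least as much heat; slowing the coolant shrinks the rejected heat even though the per-pass temperature drop gets bigger.

```python
import math

def heat_rejected(m_dot, UA=3000.0, cp=4186.0, t_in=93.0, t_air=38.0):
    """Effectiveness-NTU model for one fluid against constant-temp air.

    m_dot : coolant mass flow, kg/s (water cp ~4186 J/(kg*K))
    UA    : overall conductance of the core, W/K (assumed value)
    Returns (heat rejected in W, coolant outlet temp in C).
    """
    C = m_dot * cp                 # coolant capacity rate, W/K
    ntu = UA / C                   # number of transfer units
    eff = 1.0 - math.exp(-ntu)     # effectiveness, C_min side
    q = eff * C * (t_in - t_air)   # heat rejected, W
    t_out = t_in - q / C           # coolant outlet temp, C
    return q, t_out

for m_dot in (0.2, 0.5, 1.0, 2.0, 4.0):
    q, t_out = heat_rejected(m_dot)
    print(f"flow {m_dot:4.1f} kg/s: rejects {q/1000:6.1f} kW, outlet {t_out:5.1f} C")
```

At low flow the coolant leaves nearly at air temperature (a huge drop), yet total heat rejected is lowest, because most of the core is working at a tiny delta. That is the first poster's point in equation form.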
edit: Another thing to ponder: what will be the effect on engine longevity if water is exiting the block at ~200* and re-entering the block at 120*?
The guys I know who are cooling *efficiently* see a *much* lower water temp drop through the rad; <20* instead of ~80*. Once everything is optimized, there's very little block-outlet-to-rad-outlet temp drop, because the rad is keeping the temp very close to set point.
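The link between flow rate and that temp drop is just the energy balance: drop = heat load / (flow × specific heat). A quick sketch with made-up SI numbers (a 50 kW heat load is assumed, not taken from anyone's engine; the post's figures are in *F, this is in *C) shows how cranking the flow up shrinks the drop for the same heat rejected:

```python
# Coolant temperature drop across the rad for a fixed heat load:
#   delta_T = Q / (m_dot * cp)
cp = 4186.0    # J/(kg*K), water
q = 50_000.0   # W of heat to reject (illustrative assumption)
for m_dot in (0.15, 0.6, 2.4):  # kg/s, illustrative flows
    print(f"{m_dot:4.2f} kg/s -> drop {q / (m_dot * cp):5.1f} C")
```

Same heat leaves the rad in every row; only the drop changes. A big drop is a sign of low flow, not of a better radiator.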
The thing that I've never seen any good data on is how much flow is needed for a given output, and how much energy is wasted when you pump too much water (obviously no single number; every situation would be different). Just a wild guess, but with mileage and emission laws being what they are, I'd be fairly confident that if the car mfgrs didn't see some efficiency improvement from being able to modulate water flow using an electric pump, they wouldn't be adding the complexity and the need to convert mechanical energy to electrical and then back to mechanical to run the pump.

Having said that, I wonder if it makes sense in an a/c, unless there's a 'packaging' issue. ex: the old Mazda 13B engines had their water pump 'snouts' sticking up about 6" above the block; almost impossible to package in a typical cowl. Even then, a decent TIG welder could solve the problem of the stock pump.
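On the "energy wasted pumping too much water" question, the scaling is steep even without good data. For a fixed plumbing circuit the pressure loss is roughly quadratic in flow, so pump power (pressure × flow) goes roughly as flow cubed, and a 3-pass core with ~3x the resistance costs ~3x the pumping power at the same flow. The loss coefficient below is an illustrative guess, not a measured number:

```python
# Pump power vs. flow for a fixed circuit with quadratic pressure loss:
#   dP = k * Q**2   =>   power = dP * Q = k * Q**3
k = 2.0e10  # Pa/(m^3/s)^2, illustrative loss coefficient (assumed)
for q in (0.0005, 0.001, 0.002):  # m^3/s = 0.5, 1, 2 L/s
    dp = k * q**2
    print(f"{q*1000:.1f} L/s: dP {dp/1000:5.1f} kPa, pump power {dp*q:6.1f} W")
# Doubling the flow costs ~8x the pumping power; tripling the circuit
# resistance (a 3-pass core) costs ~3x at the same flow.
```

Which is presumably why the mfgrs bother with modulating flow: you only pay the cube-law price when you actually need the cooling.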