It seems to me that the only way to know if an engine is truly overfilled is to find a way to measure the clearance between the bottom of the pan and the bottom of the crank. You would hope the dipstick is calibrated for this, but I doubt dipsticks are very accurate.
I'm pretty sure the |xxxxxx| band on the dipstick is there just to give the customer some wiggle-room when topping off. If it were just a single line, people would go crazy adding a few ounces here and there to make sure it was close. With a range, the customer knows he just has to add a quart when it reads "add".
The downside of aeration is that the more air bubbles the oil holds, the greater its volume. So the level rises, the crank contacts it, and it just whips up even more.
This just made me wonder whether sump capacity and sludge/varnish/OCI are related. Just as an anecdote, it seems like the cars I've had with higher sump capacities have cleaner engines. My Ford with the 2.5 Duratec took darn near 6 qts and was clean as a whistle after 8 years and 130,000 miles. My Grand Prix with a 4.5 qt sump is filthy, and the oil always smells awful.
It would seem to have a direct effect on OCI. An engine in operation has a certain amount of oil "working" in it: for each engine rev, there is, say, an ounce of oil getting smashed by the bearings. The actual amount doesn't really matter; a given engine works the oil by the same fixed amount with each revolution. So in identical engines, one with a 5 qt sump and one with a 6 qt sump, the oil in the engine with the smaller sump is getting worked 20% harder (6/5 = 1.2). Or putting it another way, after a million engine revs, that oil is 20% more worn out. And since it theoretically runs that much hotter, you might be reducing its life by another 10-20%.
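The 20% figure above is just the ratio of sump sizes. A minimal back-of-the-envelope sketch, assuming (hypothetically) that a fixed amount of oil is sheared per revolution so the total work is spread across whatever volume the sump holds:

```python
def relative_wear(sump_qts, reference_qts):
    """How much harder each quart is worked compared to a larger reference sump.

    Assumes each revolution does a fixed amount of "work" on the oil,
    split evenly across the sump volume.
    """
    return reference_qts / sump_qts

# Identical engines, 5 qt sump vs 6 qt sump:
print(relative_wear(5, 6))  # 1.2 -> the oil in the 5 qt sump is worked 20% harder
```

The function names and the fixed-work-per-rev assumption are mine, not from any manufacturer data; it's only meant to show where the 20% comes from.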
Something to consider... I wonder if the OCI computers take this into account?