Originally Posted by JAG
There have been many studies showing increased wear when the oil is cool or cold. There are a variety of potential causes, like fuel and water condensing on cylinder walls, increased acidic corrosive wear, anti-wear additives being less active, ill-fitting parts due to dissimilar expansion rates, less than full-film thickness on cylinder walls, etc. I'm talking about wear during the warmup phase, not just the very brief start-up event. Here is one study that focused on valve-train wear:
http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.924.5777&rep=rep1&type=pdf
A perfect example of a misunderstanding of study results (no reflection upon you personally, JAG).
The study linked above is essentially a HALT (highly accelerated life test).
The general protocol is that they ran an engine under prescribed conditions and then measured the wear on the valve-train. OK - fine by me so far.
But it's the way (the method) they did it, the way they had to do it, to induce the result they sought. Read on ...
They first ran the engine at 1500 rpm for 40 hours with the oil held at 40 deg C, then they ran the engine at 3000 rpm for 60 hours with the oil at 100 deg C. The test is a continuous run of 100 total hours. (Note: the coolant was also held near the oil temps, so the engine oil and coolant were within 5 deg of each other.)
Now, I don't know about you and your car/truck/tractor/generator/whatever, but my engine does NOT run at "cold" oil temps for 40 straight hours. Think about it ... The reason the wear analysis showed up to 5x more wear is that THEY RAN THE ENGINE FOR 40 HOURS WITH THE OIL TEMP AT 40 C (104 F), and then contrasted that wear rate with running the engine at 100 C (212 F) for 60 hours. (They also reversed the test: 60 hours on hot oil, then 40 hours on cold oil.)
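To put numbers on that mismatch, here's a quick back-of-the-envelope comparison. The test's figures (40 of 100 hours cold) come from the study itself; the "real world" warm-up and trip lengths are my own hypothetical assumptions for illustration, so plug in your own driving pattern:

```python
# Duty-cycle comparison: the study vs. a hypothetical daily driver.
# Study figures: 40 of 100 total hours with oil held at 40 C.
test_cold_hours = 40
test_total_hours = 100
test_cold_fraction = test_cold_hours / test_total_hours

# Hypothetical real-world assumptions (NOT from the study):
# oil spends ~5 minutes below operating temp on a ~40-minute average trip.
warmup_minutes_per_trip = 5
avg_trip_minutes = 40
real_cold_fraction = warmup_minutes_per_trip / avg_trip_minutes

print(f"Test cold-oil duty cycle:        {test_cold_fraction:.0%}")   # 40%
print(f"Hypothetical driver cold cycle:  {real_cold_fraction:.1%}")   # 12.5%
```

And even that understates the difference, because the test holds the oil steady at 40 C, while a real engine passes through 40 C on its way up to operating temp.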
Typically, any decent water-cooled engine has a thermostat, which allows the engine to come up to temp fairly quickly, even in winter. That warms the oil to a point where it's doing its job the best it can. I WANT my oil around 200 F - 225 F; it's supposed to be there. The reason this linked test is not really valid is that it does not, in any manner, represent the reality of what happens in our vehicles.
HALTs are great for revealing the things they are designed to reveal, be it this "cold oil" test, the infamous GM filtration study, etc. All these "tests" are typically designed to reveal a disparity between two or more options of some conditional variable. That's fine.
But you MUST stop and ask yourself: do the conditions of the test actually represent the reality of your world? If they don't, then the results of that test will not manifest into reality in your garage.
If all you ever did with your vehicle was start it up, move it from the garage to the driveway in the morning and shut it down after 30 seconds, and then reverse this process at night to put the car away, then the claim of 5x more wear from "cold" oil might be valid.
But most of us don't do that every single day. In fact, we rarely do that. Typically, even if the cabin does not get warm, the engine will get warm within a few minutes of normal driving. We not only start our engines, we drive them too! And once it's under load, even moderate driving warms the engine coolant/oil fairly quickly. Additionally, thinner lubes tend to come up to their operating viscosity sooner; hence a lower wear rate is anticipated sooner.
IMO, one of the worst things you can do is start your very cold engine and let it "warm up" at idle in winter, because it takes longer for the coolant and oil to come up to temp. Just let her idle only as long as it takes for the "flare" idle to come down (that which warms the cats up), then put her in "D" and go about your business in a moderate manner.
I don't disagree that cold oil and cold engines experience a bit more wear, but it has to be kept in perspective. That is typically unavoidable, and the study also completely ignores the topic of the TCB (tribochemical barrier) effect. As oil matures it oxidizes, and as that oxidation is laid down it coats the surfaces (SAE study 2007-01-4133 from Ford and Conoco). The HALT study which JAG linked does not address the TCB, and if they were constantly introducing fresh oil, they were also stripping away the TCB each and every time; bare, unprotected metal is more susceptible to "wear" than metal coated by the TCB.
Again - my data shows that this is a completely overblown (and often misunderstood) topic. Real-world wear-rate data does not lie. I don't care what happens in the lab; I care about what happens in our collective driveways, and over 15,000 UOAs cannot lie: start-up wear is a moot point because the conditions that contribute to that kind of wear represent a very small share of total operational hours.
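One more back-of-the-envelope to show why the share of cold hours matters so much. Even taking the study's 5x figure at face value, the blended wear rate over a duty cycle depends on how much of your runtime is actually cold. The cold fraction below is my own assumption for illustration, not a number from any UOA:

```python
# Blended wear rate over a duty cycle, accepting the study's 5x figure.
# All inputs are illustrative assumptions, not measured UOA data.

cold_wear_multiplier = 5.0   # worst-case multiplier from the linked study
cold_fraction = 0.10         # assumed share of runtime below operating temp
warm_fraction = 1.0 - cold_fraction

# Time-weighted average of the two wear rates (warm baseline = 1.0x):
blended = warm_fraction * 1.0 + cold_fraction * cold_wear_multiplier
print(f"Blended wear rate: {blended:.2f}x baseline")  # 1.40x
```

So a "5x more wear when cold" headline turns into roughly a 1.4x blended rate if only 10% of your operating hours are cold, and it shrinks further the more you actually drive the vehicle warm.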