Originally Posted by ZeeOSix
Originally Posted by oil_film_movies
Originally Posted by Gokhan
As Jim Allen once said, the optimal oil viscosity grade is the thinnest oil that is thick enough (or something to that effect -- I can't remember his exact words).
My take: Find out what the required HTHSV is for your car and don't go any higher. Then, find an oil with the least viscosity-index improver (VII) content that meets this HTHSV spec. This will result in the best fuel economy, the smoothest- and cleanest-running engine, and probably the least engine wear as well.
I've been recommending never going more than about +0.5 HTHS over the lowest HTHS recommended by the engine maker.
The paper you cited above blames partial oil starvation (lower flow) with extremely high-HTHS oils for the gradual increase in wear as HTHS rises excessively. ----> Take an engine with an HTHS 2.7 (0W-20) oil recommendation, for example: you can easily use an HTHS 3.2 oil in that engine and get slightly better wear performance (probably even HTHS 3.5 on the higher end). Much higher than that and you may encounter the starvation issues the engineer saw on the rod big end. (See
https://www.bobistheoilguy.com/foru...rstanding-viscosity-and-hths#Post5110672 for the citation.)
To add ... I'm not so sure I totally buy the "oil starvation" claim as a function of oil viscosity made in that paper. As shown by the table posted earlier, journal bearings flow much more oil as their RPM increases (even though flow at a constant RPM is reduced by higher viscosity). Given that, plus the forced lubrication from a positive-displacement oil pump, I can't really believe bearings would be starved of oil enough to cause damage just because a higher viscosity was used. Sure, there is more heat produced from shearing at higher RPM and with higher viscosity, but the MOFT is also increased, which helps ensure no metal-to-metal contact. That is the basic reason manufacturers of high-performance engines recommend a higher-viscosity oil for track use: to protect the engine better. Even though the oil heats up more in the sump and in the bearings under extreme use, the MOFT is still increased for added protection (less wear).
Wear only occurs in journal bearings when there is metal-to-metal contact; if the MOFT always keeps the parts from rubbing on each other, there is no wear. IMO, the only way thicker oil could cause more wear is if the heating of the oil inside the bearing were so great that the MOFT went to zero and metal-to-metal contact occurred. Too-tight bearing clearances (which cut flow, heat the oil more, and reduce MOFT) and inadequate PD oil-pump performance have more to do with starving the bearings of oil flow than the viscosity alone does.
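For a rough sense of how the shear-heating side of this scales, Petroff's law for a lightly loaded journal bearing says frictional torque (and hence heat generation) grows linearly with dynamic viscosity at a fixed speed. Here's a minimal Python sketch; the bearing geometry is entirely hypothetical (nothing in this thread gives real dimensions), and the two viscosities are just HTHS-like placeholder values:

```python
import math

def petroff_friction_power(mu_pa_s, n_rev_s, r_m, length_m, c_m):
    """Frictional power (W) dissipated in a lightly loaded journal bearing
    per Petroff's law: torque = 4*pi^2 * mu * N * R^3 * L / c."""
    torque = 4 * math.pi**2 * mu_pa_s * n_rev_s * r_m**3 * length_m / c_m
    return torque * 2 * math.pi * n_rev_s  # power = torque * angular speed

# Hypothetical geometry: 25 mm journal radius, 20 mm wide,
# 25 micron radial clearance, 100 rev/s (6000 RPM).
geom = dict(n_rev_s=100.0, r_m=0.025, length_m=0.020, c_m=25e-6)

# Placeholder dynamic viscosities at operating temperature (Pa*s),
# chosen to mirror HTHS 2.7 vs 3.5 mPa*s.
p_thin = petroff_friction_power(2.7e-3, **geom)
p_thick = petroff_friction_power(3.5e-3, **geom)

# Heat generation scales linearly with viscosity:
print(f"{p_thick / p_thin:.2f}")  # → 1.30 (i.e. 3.5/2.7)
```

So going from HTHS 2.7 to 3.5 raises shear heating in the bearing by roughly 30% in this idealized model, which is the extra heat both sides of the argument above are debating; it says nothing by itself about starvation.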
I don't buy the oil-starvation hypothesis either -- although it could happen in certain conditions when the viscosity is too high.
There are two factors that determine bearing wear, and you don't seem to know about the latter, which I think is the dominant factor in most engines: (1) contact, either directly or through particles or surface imperfections, and (2) corrosion.
Detergents and antioxidants protect against corrosion, while certain additives and base oils increase it. Another thing that obviously increases corrosion is the bearing temperature, and the bearing temperature rises with increasing HTHSV as the hydrodynamic friction increases. Therefore, I think the increase in bearing wear with increasing HTHSV is due to the higher bearing temperature accelerating the corrosion. Most modern engine bearings seem to run just fine on 0W-20 with no noticeable wear, and corrosion seems to be the main factor causing bearing wear. That's another reason to stick with the manufacturer's oil-change intervals, so you don't have a build-up of acids.
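The temperature-corrosion link can be sketched with the Arrhenius equation, k = A*exp(-Ea/RT): a modest rise in bearing temperature gives a disproportionate rise in chemical reaction rate. The activation energy below is an assumed, illustrative value (real oil/bearing chemistry varies widely), not a figure from this thread:

```python
import math

R_GAS = 8.314  # gas constant, J/(mol*K)

def arrhenius_rate_ratio(t1_c, t2_c, ea_j_mol=55_000.0):
    """Ratio of reaction (corrosion) rates at t2_c vs t1_c per the
    Arrhenius equation k = A*exp(-Ea/RT). ea_j_mol is an assumed,
    illustrative activation energy."""
    t1_k, t2_k = t1_c + 273.15, t2_c + 273.15
    return math.exp(ea_j_mol / R_GAS * (1.0 / t1_k - 1.0 / t2_k))

# A bearing running, say, 10 C hotter due to higher hydrodynamic friction:
print(f"{arrhenius_rate_ratio(110, 120):.2f}")
```

With this assumed activation energy, 10 C hotter means roughly 1.5x the corrosion rate, which is the kind of acceleration the argument above relies on; the exact multiplier depends entirely on the chemistry involved.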