Originally Posted By: ZeeOSix
Originally Posted By: dnewton3
What we all need to realize is that, to my knowledge, there is no current, relevant SAE study regarding filtration.
Almost all of our information for discussion comes from decades-old data (the GM filter study; the bus study). Those two in particular are often referenced as the gold standards, but one is an accelerated life test (ALT) that GROSSLY DISTORTS REALITY, and the other was done on 2-stroke DD engines that were notorious for soot generation and sub-standard air filtration.
My point in all this is that there's no relevant data that directly proves the points we discuss.
In all our discussions about the SAE Bus Study you regarded that one as having pretty good validity, as I do. IMO, it doesn't really matter if those engines were 2-stroke DD engines or not. Fact is, their study showed a clear correlation between oil filter efficiency and Fe levels/engine wear.
What this all boils down to is that there certainly is an effect on engine wear with better filtration, but on well-maintained vehicles you're not going to see an engine "blow up" or "totally wear out" beyond FSM specs before the car rusts away or other issues cause its total demise. I've never alluded otherwise; I've only ever said that higher efficiency oil filters will help keep engine wear down compared to a much less efficient filter, just as the bus study showed.
Personally I use high efficiency oil filters because I don't care about saving $3 on an oil filter (it's easier to save way more on other purchases in life), and because having an efficient oil filter to complete the "anti-wear triangle" (oil, air filter & oil filter) is worth doing. And I really don't care if someone uses a low efficiency oil filter; it's their car and money. But I do care that some people (not you, dnewton) will argue and say that cleaner oil doesn't mean less engine wear just because they think they've proven otherwise on their own, with no official data of any sort to back up their misconceived claim.
Zee and I agree on this, but for those who doubt, let me do this:
I guess I need to make an absurd analogy to put this in perspective ...
Let's say you want to make sure you don't get any bears around your home. And so you think "I'll get a dog; he'll bark and scare the bears away." You can get a small yappy dog, or a large dog. In fact, there are some hounds that are specific to certain regions that actually are bred for this. But you cannot decide if the extra money spent on the higher efficiency animal (small dog vs big dog) is worth it. But the issue you're overlooking is the rate of bear occurrence in your area. If you live in downtown Indianapolis, there are no bears here at all. So it really does not matter how much money you spend on a more efficient dog, because the occurrence rate of the bears is pretty much non-existent. Even if there was a bear in Indianapolis, downtown Indy is a big place, and the likelihood that the bear will camp out in your lawn is very small, and the likelihood he'll remain after any dog barks at him is practically zero.
My point is this: the efficiency of the product can only have a tangible effect if the occurrence rate of the objectionable offender is reasonably present. Rare occurrences of offense are not greatly manipulated by efficiency delta in competing choices.
Now return to the oil filter discussion. Trying to show a real-world effect between two normal off-the-shelf filters is insane, because the particulate loading in a modern, well designed and made, fuel-injected engine with a good air filter system is MOOT, and I mean a COMPLETELY AND TOTALLY USELESS CONVERSATION. The bus study is valid, and I do like most of what it shows, but it does not relate to our cars or trucks much because the old buses they ran were DIRTY running, and had poor air filtration. Had the air filters been better, there would be far less wear overall. Had the engines run clean (not dirty 2-stroke diesels), there'd be far less contamination in the crankcase. The reason the bus study showed a large disparity is because there was a LOT for the lube filters to clean up. But today's vehicles do not present that same operating environment to full-flow (FF) filters.
SAE study 952557 from Donaldson shows us that TOTAL WEAR is what we need to look at. Wear contributors are not just lube filter related, but also air intake, soot generation, fuel related, corrosive, etc.
The vast majority of potential wear comes from ambient air contamination. Let me repeat that ...
THE VAST MAJORITY OF POTENTIAL WEAR COMES FROM AMBIENT AIR CONTAMINATION. The greatest disparity of wear induction comes from the air intake tract. You can DOUBLE the wear of your engine by changing air filters too often, especially in the first 30% of the filter's lifecycle. Way down the list in terms of effect is the topic of oil filtration efficiency. The study does show that "better" filtration has a positive effect, but again, it has to use a huge magnitude of disparity to illuminate this effect. IOW, the difference in wear protection between a 20um and a 10um rated filter is tangible, but that's not what we are discussing here. We're discussing the differences between filters that are 99% or 95% or 90%, all at 20um. Folks, you're not going to see a significant effect by pitting a 95% filter against a 99% filter; the disparity is just too small in terms of tangible effect.
And not one study I have ever seen actually takes into account "normal variation" (standard deviation). Whereas the "average" (mean) effect might be a few particles, the normal variation of typical everyday use of any equipment drowns out the filtration disparity. It's like trying to hear your phone ring in a rock-concert arena; you cannot distinguish a few decibels' difference in ringer volume during a 100-decibel ode to Bon Jovi. There is so little contamination in a sump system today that the minor differences in filtration have ZERO practical effect on overall wear.
Examples:
- You first run a series of UOAs and PCs using a 95% filter (say a Wix/NG). The average Fe wear rate at 5k miles might be 2.53ppm/1k miles. And the variation of that might be 2ppm. Hence, your "normal" magnitude of expected wear over several successive UOAs might be anywhere from 0.53 to 4.53ppm.
- Now you swap to a 99% filter (TG), and run the tests all over again. Now you get an average wear rate of 2.48ppm/1k miles. Your variation is still around 1.8ppm. Hence your range is now 0.68-4.28ppm. Your mean shift was only 0.05ppm, but your range is still 3.6ppm wide!
You are either ignorant or arrogant if you think you can discern real wear in a tangible manner when your standard deviation is far greater than your mean shift! Roughly 95% of your wear-rate range overlaps between the filter choices! And because we only care about wear increases (that's what's detrimental), you can cut that 5% disparity in half; only 2.5% of the wear reduction might be tangible at the upper end of wear rates.
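For anyone who wants to see that overlap argument in numbers, here's a quick Python sketch using the hypothetical example values above (means of 2.53 vs 2.48 ppm, spreads of about 2.0 and 1.8 ppm). Treating the wear rates as normal distributions is an assumption for illustration, not measured UOA data:

```python
# Quick sketch of the overlap argument, using the post's hypothetical
# example numbers -- these are illustrations, not measured UOA results.
from statistics import NormalDist

# 95% filter: mean 2.53 ppm Fe per 1k miles, spread ~2.0 ppm
# 99% filter: mean 2.48 ppm Fe per 1k miles, spread ~1.8 ppm
filter_95 = NormalDist(mu=2.53, sigma=2.0)
filter_99 = NormalDist(mu=2.48, sigma=1.8)

# overlap() returns the shared area under the two distributions:
# the fraction of wear outcomes you cannot attribute to either filter.
shared = filter_95.overlap(filter_99)
print(f"{shared:.0%} of the two wear-rate distributions overlap")
```

With a 0.05 ppm mean shift buried under ~2 ppm of run-to-run variation, the two distributions are nearly the same curve, which is exactly the point: you can't tell the filters apart from UOA wear data.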
At some point, oil filtration is "good enough" to make wear a non-issue in terms of sump cleanliness. Having a clinically clean sump might make your engine last indefinitely, but there's no practical return on the investment. The difference in wear rates might accumulate to perhaps an 8k-mile difference after going 350k miles! In other words, if you had used the premium filter, your engine may have experienced the equivalent of 350k miles of wear, but if you had used the lesser filter, it may have experienced the equivalent of 358k miles of wear.

And for what? Spending 2x the money on filtration (a $12 filter vs a $6 filter) does NOT double the life expectancy of your engine. The disparity in lifecycle improvement is so freakin' small that it's practically impossible to measure in REAL LIFE. I cannot assure you these values are 100% correct; they are examples. But the meaningful point to take away is that there is not a simple linear relationship between filter expenditure and wear reduction; it's a curve of diminishing returns. And the disparity we're discussing is so stupidly small that we cannot even accurately measure it.

ALL THE FILTER STUDIES I'VE EVER READ use a large disparity between pore sizes, so that the disparity becomes clear in the data. But our choices today (80% vs 90% vs 95% vs 99%) do NOT show such disparity in wear control. Ever wonder why a Honda or Toyota engine running OEM filters lasts so long, despite the generally loose filters they employ? Did it ever occur to you that oil filters are important only to a level of "XY"%, and then they are usurped by other controlling factors like air filtration and TCB?????
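To put that back-of-the-envelope cost-vs-benefit math in one place, here's a short Python sketch. The 350k/358k wear-equivalent figures and $6/$12 prices are the illustrative numbers from this post, and the 5k-mile change interval is my assumption; none of it is measured data:

```python
# Illustrative numbers only, taken from the example above -- not measurements.
miles = 350_000                  # actual miles driven
lesser_equiv_wear = 358_000      # wear-equivalent miles with the lesser filter
change_interval = 5_000          # assumed filter change interval (miles)
premium_price, budget_price = 12.0, 6.0   # example filter prices

# Total extra spend on premium filters over the vehicle's life
extra_spend = (miles / change_interval) * (premium_price - budget_price)

# Fraction of wear-equivalent mileage the premium filter "saved"
wear_saved = (lesser_equiv_wear - miles) / lesser_equiv_wear

print(f"Extra filter spend over {miles:,} miles: ${extra_spend:,.0f}")
print(f"Wear-equivalent reduction bought: {wear_saved:.1%}")
```

Run with these assumptions, that's a few hundred dollars of extra filters buying a low-single-digit-percent reduction in wear-equivalent mileage, which is the diminishing-return point being made.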
Spending $10 on a FU over $6 for a TG will not get you 65% less wear for the ~65% more money you spent; they are both 99% efficient and have way more capacity than you'll ever need. Even a Wix at 95% is more than your engine will ever need. Even an EG at 95% for $4 will give your engine all the protection it ever needs. Spending more does not equate to less wear in a tangible, real sense.
Sure - lab tests can show a disparity in filtration effect on the lube; that is a direct relationship. But it can only imply an effect on the overall lifespan of the equipment, because once lube filtration is "good enough", improvements in filtration have a sharply diminishing rate of return. This is true because of three things:
1) the oil filter is secondary to a good air filter and ambient dust conditions
2) there is a law of diminishing returns in terms of filtration in modern, clean-running, low-contamination engines
3) the TCB effect runs in concert with the OCI duration, thereby affecting wear rates
I triple-dog-dare anyone to find an oil filter study that takes into account "normal" total wear variation of a typical OCI, and then proves that minor filtration differences matter. Go on - try to find one. I'm waiting ..............