Filtration Efficiency vs Pressure Delta

Does the pressure differential across the filter media have an effect on the filtration efficiency? If so, how much?
 
Originally Posted by SubieRubyRoo
It's probably highly filter-specific, but I'd say that generally, as the delta-p goes up, efficiency likely goes down slightly.


For sure. Look at Figures 3 and 4 in the Machinery Lubrication article linked below, then look at the efficiency graph Purolator/M+H got in their efficiency-vs-media-loading testing. In all cases there is a point where the efficiency goes down with more media loading. That's because - depending on the media design and performance - as the pressure across the media increases, at some point the media starts to shed captured particles, and those dislodged particles contaminate the downstream fluid and drag the measured efficiency down.

That's why people who always throw out the claim that "oil filters get more efficient as they load up" haven't looked into it enough to make that blanket statement.

https://www.machinerylubrication.com/Read/564/filter-beta-ratios

Figures from the Machinery Lubrication article:
Figure 3: Starts off with high efficiency when new, then loses efficiency as the delta-p increases. Similar to the Purolator/M+H data.
Figure 4: Starts off with low efficiency when new, then increases in efficiency until the delta-p hits a certain point, then loses efficiency with increased delta-p.

They also talk about "Beta Stability," which describes how much efficiency a filter loses as the delta-p across it increases.
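
For reference, here's how the article's beta ratio relates to percent efficiency, and how shedding shows up in the numbers (a minimal Python sketch; the particle counts are made-up illustrative values, not data from the article):

```python
def beta_ratio(upstream_count, downstream_count):
    """Beta ratio at a given particle size: particles counted
    upstream of the filter divided by particles counted downstream."""
    return upstream_count / downstream_count

def efficiency_pct(beta):
    """Capture efficiency implied by a beta ratio."""
    return (1.0 - 1.0 / beta) * 100.0

# Illustrative counts only: 1,000 particles >= 20 microns upstream.
# Passing 50 of them gives beta_20 = 20, i.e. 95% efficiency.
print(efficiency_pct(beta_ratio(1000, 50)))   # 95.0
# If rising delta-p dislodges captured particles so that 200 show up
# downstream, beta_20 collapses to 5, i.e. 80% efficiency.
print(efficiency_pct(beta_ratio(1000, 200)))  # 80.0
```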

In both graphs, there's a point where the delta-p starts dislodging captured particles in the media. I think all filters will have this characteristic to some degree - some worse than others. No filter is going to be able to hold every single particle that is initially captured, given the design of the media and enough delta-p.

IMO, full synthetic media tends to produce less delta-p, and that also helps keep captured particles from breaking loose.



As mentioned before in these discussions, the ISO 4548-12 efficiency rating is the average efficiency over the length of the test (i.e., from new filter to nearly clogged filter). So if a filter has a high ISO 4548-12 efficiency, that also means, essentially by definition, that it does not shed particles as badly as it loads up as a filter with a low ISO 4548-12 efficiency does. If a filter sheds particles badly as the delta-p increases, it's not going to be stellar in the ISO 4548-12 test. A low ISO 4548-12 rating reflects less efficient media combined with particle shedding on top of that.
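
To make the averaging point concrete, here's a minimal sketch of how an overall test-average efficiency falls out of cumulative particle counts (hypothetical numbers; not the actual ISO 4548-12 procedure or any filter's data):

```python
# Hypothetical (upstream, downstream) counts at one particle size for
# successive loading stages of a single test, new -> nearly clogged.
stages = [(10_000, 100),    # new: 99% at this size
          (10_000, 150),    # loading up: 98.5%
          (10_000, 300),    # delta-p rising: 97%
          (10_000, 900)]    # near bypass, shedding: 91%

total_up = sum(u for u, _ in stages)
total_down = sum(d for _, d in stages)
overall = (1 - total_down / total_up) * 100
print(f"test-average efficiency: {overall:.1f}%")  # 96.4%
# A filter that sheds hard at high delta-p drags its own average down,
# which is why heavy shedders can't post a high ISO 4548-12 number.
```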
 
I agree that what Zee shows is sound, but like most things, I would caution anyone who thinks this info translates directly to their daily life.

Nearly all lab tests are HALTs (highly accelerated life tests). They manipulate the conditions of the test to produce a desired result; they are good at that. HALTs show relative data in concert with the conditional parameters, but those parameters are not always (if even frequently) present in the real world. The good thing about HALTs is that they give one the ability to control variables and focus on the one to be manipulated. The bad thing about HALTs is that they very often rely on conditions which are either non-existent or unsustainable in the real world. And HALTs typically eliminate other contributors which also control the overall output, as if they never existed. That, too, is unlike the real world.

I agree that with the loading presented in the ISO test, the efficiency goes down as the filter use matures. That's undeniable.
I am not convinced, however, that this result actually manifests in the real world in a typical "normal" OCI. And even if it does, it clearly does not have any substantial effect.

Why do I state this? Because we also have reams of data showing that wear rates drop substantially as the OCI matures (SAE 2007-01-4133 and my UOA normalcy study). This is REAL WORLD DATA that represents actual use of fluids and filters in the exact manner we all experience. Now, we have two sets of study data to reconcile here. One tells us that filters become less efficient as they mature, driven by lab studies. The other shows us that wear rates drop with absolute predictability as the OCIs get longer. So a few conclusions are possible here:
- the lab study does not manifest in the real world; common sense tells us that more particles should make for more wear, and yet we don't get more real-world wear, i.e. the HALT does not manifest in reality under life's typical conditions
- the lab study is correct, but particulate loading does not have the influence over wear that we would be led to believe (there are other factors which control wear to a larger degree than filter efficiency does)
In short, either the loss of filter beta does not happen in the real world, or it does happen but cannot produce a tangible effect. Either way, there's plenty of proof that neither condition has a meaningful effect, because wear rates always go down as the OCI matures.
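
To illustrate the distinction being drawn here between total accumulated wear metals and the wear rate, here's a minimal sketch with hypothetical UOA iron numbers (illustrative only; not data from SAE 2007-01-4133 or any actual UOA database):

```python
# Hypothetical iron readings across one extending OCI: total ppm keeps
# climbing, yet the incremental wear rate per mile keeps falling.
samples = [(3_000, 8), (6_000, 13), (10_000, 17), (15_000, 20)]  # (miles, Fe ppm)

prev_mi, prev_fe = 0, 0
for miles, fe in samples:
    rate = (fe - prev_fe) / (miles - prev_mi) * 1000  # ppm Fe per 1000 mi
    print(f"{prev_mi:>6}-{miles} mi: {rate:.2f} ppm/1000 mi")
    prev_mi, prev_fe = miles, fe
# 2.67, 1.25, 1.00, 0.60 -> the rate declines as the OCI matures, even
# though the total Fe in the sump is higher late in the interval.
```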

Now - I'm going to speak to "normal" products and typical choices in this next paragraph ... stuff that we'd find at AZ, AAP, or W/M ...
In fact, I don't even care much about filtration efficiency as long as a decent minimum beta is upheld. This is because REGARDLESS of the filter used, wear rates still go down. This tells me that OTHER CONTRIBUTORS are in control of wear rates, and not filtration, at least out to 15k miles. Again - I state this clearly ... I am not saying that filtration is useless; that is patently untrue. We need filtration to be good. But once it's "good enough", making it "better" has not been shown to actually have an effect in our garages. Once a minimum threshold of beta exists, it's sufficient to sustain good wear rates. Despite the general belief that we need really fine filtration, there are bazillions of examples of engines which survive quite well on fairly pedestrian filters (for example, Toyota engines using common Toyota filters; typically a "moderate" beta value and nothing to write home about).

If you have a variable that cannot be shown to correlate with results, then it's fair to conclude its influence is minimal if not non-existent.
Wear rates drop regardless of what filter efficiency is used. Hence, filtration efficiency is not a contributor to overall wear control (past the admitted minimum threshold needed to be "good enough").
There is no correlation between filter efficiency and wear rates in real-world, normal product applications.
And without correlation, there can be no causation.

Does the filtration efficiency drop as the loading increases in a lab test? Yes - proven so.
Does the filtration efficiency drop actually matter in our garage? Nope - proven not to be so.
 
Originally Posted by dnewton3
Why do I state this? Because we also have reams of data showing that wear rates drop substantially as the OCI matures (SAE 2007-01-4133 and my UOA normalcy study). This is REAL WORLD DATA that represents actual use of fluids and filters in the exact manner we all experience. Now, we have two sets of study data to reconcile here. One tells us that filters become less efficient as they mature, driven by lab studies. The other shows us that wear rates drop with absolute predictability as the OCIs get longer. So a few conclusions are possible here

Do we have "reams of data"? I have only seen that one SAE study referenced on here; are there other peer-reviewed studies?

Also, that publication has been discussed here before and shown to be somewhat less than what it appears:

https://www.bobistheoilguy.com/foru...-often-will-harm-your-engine#Post5034352

https://www.bobistheoilguy.com/foru...s-better-for-your-engine-sae#Post4918594

Are those comments incorrect?
 
Lab testing filters has a cost; they can't test each filter for weeks with minute amounts of test dust. And test dust isn't the same as what's in the field. How do the scanners determine a 20 micron particle when the particles may be potato-shaped? Still, it shows which filters are best in that test, assuming the maker makes all the media perfectly uniform.

In the old days they first used bypass filtration, which to me is the best way - not just with TP, but with regular elements, which can be made finer than full-flow media. I guess it was a good idea to filter all the oil before it goes to the engine, but that came at the price of allowing more particles through. Now engines run cleaner, and maybe full flow isn't such a necessity.
 
Good information.

Aside from efficiency vs delta-p from clogging, I'm thinking of new vs new filter. Say a car has a small filter like a Fram PH6607 and you switch it to a remote mount with a large PH8A, with far more surface area and lower PSID. Assuming they both have the same efficiency rating, would the lower PSID of the larger filter allow it to be more efficient from the start? Per the information posted above, it would remain more efficient through the interval, since it can hold a lot more crud before clogging.
 
Originally Posted by dnewton3
I agree that with the loading presented in the ISO test, the efficiency goes down as the filter use matures. That's undeniable.
I am not convinced, however, that this result actually manifests in the real world in a typical "normal" OCI. And even if it does, it clearly does not have any substantial effect.


It could affect someone in everyday use if they decided to run an oil filter way beyond its intended use. I agree that in a "normal" OCI most people aren't in danger of pushing a filter beyond its rating. Nobody knows for sure how the delta-p across the filter changes with time, so running a filter way longer than it's rated for could make the filter much less efficient than one would expect. Filters are cheap (even at $10-12) in comparison to other maintenance items on a vehicle ... no reason to try and obtain a world record on its use.

Originally Posted by dnewton3
Why do I state this? Because we also have reams of data showing that wear rates drop substantially as the OCI matures (SAE 2007-01-4133 and my UOA normalcy study). This is REAL WORLD DATA that represents actual use of fluids and filters in the exact manner we all experience. Now, we have two sets of study data to reconcile here. One tells us that filters become less efficient as they mature, driven by lab studies. The other shows us that wear rates drop with absolute predictability as the OCIs get longer. So a few conclusions are possible here


Yes, many factors going on. Less wear as the OCI matures is mostly driven by the oil's anti-wear additive layer accumulating on surfaces - not by an oil filter becoming "more efficient", IMO. Wasn't it you, Dave, who used to discuss how the anti-wear layer on parts can get stripped off when an oil change is done and the new oil removes some of that built-up layer?

At least established, industry-accepted lab tests on filter efficiency vs delta-p across the media are somewhat controlled. I don't think it matters a whole lot that the tests are accelerated; that needs to be done to get the loading and resulting delta-p up high enough to see how the media behaves as the delta-p increases.

Originally Posted by dnewton3
In fact, I don't even care much about filtration efficiency as long as a decent minimum beta is upheld.


If I recall, you use filters that are at least 95% @ 20u. In my book that fits into the "efficient filter" category. Filters that are 50% @ 20u or 99% @ 40u don't fit into the "efficient filter" category, IMO. They still work pretty well, but I think using a more efficient filter can help in the long run. It's been shown with PC data from the UOA forum that more efficient filters do correlate with cleaner oil, and it's not a big leap to believe that cleaner oil results in less wear - all the technical papers I've read on that subject support it.
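
One way to see the gap between those ratings is to compare what each filter lets through rather than what it catches (a quick back-of-envelope sketch; the incoming count is an arbitrary illustrative number):

```python
# Arbitrary illustrative count of >= 20 micron particles reaching the
# filter in some interval; single-pass comparison only.
incoming = 10_000

for label, eff in [("95% @ 20u", 0.95), ("50% @ 20u", 0.50)]:
    passed = incoming * (1 - eff)
    print(f"{label}: {passed:,.0f} particles pass downstream")
# 95% -> 500 passed; 50% -> 5,000 passed. The "50%" filter lets through
# ten times as many particles per pass, and a recirculating system
# compounds that difference over many passes.
```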
 
Originally Posted by RDY4WAR
Good information.

Aside from efficiency vs delta-p from clogging, I'm thinking of new vs new filter. Say a car has a small filter like a Fram PH6607 and you switch it to a remote mount with a large PH8A, with far more surface area and lower PSID. Assuming they both have the same efficiency rating, would the lower PSID of the larger filter allow it to be more efficient from the start? Per the information posted above, it would remain more efficient through the interval, since it can hold a lot more crud before clogging.


There were theories that the slower the oil goes through a filter, the more efficient it might be, which seems logical. If the flow goes through the media more slowly, the delta-p is also lower, which helps the media not shed captured particles. And as you say, the larger filter will also have less delta-p as it loads up, because more surface area lets it hold more debris. So yes, I'd say a filter with more surface area (assuming the same exact media) will probably be more efficient over the life of the filter. New vs new, between a small and a larger filter with the same exact media, I'd say they are probably both about the same efficiency. So this is another reason not to run small oil filters past their intended use. Running a much larger filter longer makes more sense if someone is into long filter change intervals.
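
On the surface-area point, the first-order physics is Darcy flow through porous media: at the same volumetric flow, the delta-p across clean media scales inversely with media area. A rough sketch below, under the simplifying assumption of identical clean media and ignoring end caps, center tube, and loading (the 2.5x area ratio is a made-up illustration, not measured PH6607 vs PH8A data):

```python
# Darcy flow through filter media: dP = mu * Q * t / (k * A).
# With viscosity mu, flow Q, media thickness t, and permeability k held
# constant, relative dP reduces to Q / A (arbitrary units).

def relative_dp(flow: float, area: float) -> float:
    """Relative delta-p across clean media at a given flow and area."""
    return flow / area

small = relative_dp(flow=10.0, area=1.0)  # baseline small filter
large = relative_dp(flow=10.0, area=2.5)  # hypothetical 2.5x media area
print(f"larger filter's media dP: {large / small:.0%} of the smaller's")
# -> 40%: lower face velocity and lower dP from day one, plus more
# capacity before delta-p climbs into shedding/bypass territory.
```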

Also keep in mind the delta-p vs captured debris phenomenon. I'd say every filter will shed captured particles to some degree as the delta-p increases. So consider those instances when someone revs an engine high before the oil is fully warmed up, or cold winter start-ups where the delta-p spikes to the bypass point. With delta-p that high, not only is some dirty oil bypassing the filter media, but the media is certainly shedding some particles under that condition. The engine gets a "burst of crud" when that happens; the oil gets cleaned back up after a few circulations, of course, but those crud bursts can add some extra wear to the engine.
 
What is the benefit of oil holding particles in suspension versus particles settling out or being drained out with frequent oil changes?
 
Originally Posted by Farnsworth
What is the benefit of oil holding particles in suspension versus particles settling out or being drained out with frequent oil changes?

In diesel oil (HDEO), the particles of soot can be so fine that they pass right through the filter, so the oil needs to keep those particles in suspension, or agglomerate them so the filter can catch them. Not sure what soot percentage it takes for that to happen, vs. the oil just turning jet black and suspending it.
 
So I'm going to combine some of the great data provided by Zee and dnewton, and form it into a slightly different hypothesis:

1. Filters load up and efficiency degrades as this effect occurs.
2. As dnewton references, UOAs generally show a decreasing wear rate as mileage goes up, even though overall counts may increase slightly.
3. Combining points 1 and 2, and setting aside the filter companies' advertising so the data and hypothesis can be assessed accurately, it MAY be possible that:
a. once past a certain particle size, gravity overcomes the oil's ability to keep the particle in suspension, and it precipitates out, either in the filter can or into the sump, where it is drained away at an OCI since the drain plug is the lowest point (see the rough settling estimate after this list);
b. because particles beyond some size are larger than internal passages and clearances, they will never enter areas where the particle size exceeds the MOFT between two highly loaded metallic surfaces that are in contact, meaning there will be no or minimal scarring;
c. the combination of hypotheses A & B jibes with the facts of points 1 and 2, therefore both observations are true, and even a 40 micron filter is sufficient to keep things happy...
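
As a sanity check on hypothesis (a), here's a rough Stokes'-law estimate of settling speed for iron spheres in still oil. All values are ballpark assumptions (particle density, oil density, and viscosity near 40C), and real sump oil circulates and carries dispersants that fight settling:

```python
# Stokes' law terminal velocity for a small sphere in a viscous fluid:
# v = (rho_p - rho_f) * g * d^2 / (18 * mu)
G = 9.81        # m/s^2
RHO_P = 7_870   # kg/m^3, iron (assumed particle material)
RHO_F = 850     # kg/m^3, engine oil (assumed)
MU = 0.05       # Pa*s, oil dynamic viscosity near 40 C (assumed)

def settle_mm_per_hour(dia_um: float) -> float:
    d = dia_um * 1e-6                           # diameter in meters
    v = (RHO_P - RHO_F) * G * d**2 / (18 * MU)  # m/s
    return v * 1_000 * 3_600                    # mm per hour

for dia in (5, 20, 40):
    print(f"{dia} um iron sphere: ~{settle_mm_per_hour(dia):.0f} mm/hour")
# Roughly 7, 110, and 440 mm/hour: larger particles can plausibly settle
# out of a still sump overnight; fine particles stay suspended far longer.
```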

This may explain how Toyota filters, Subaru filters, etc. are all "terrible" in the eyes of the "most efficient filtration possible" proponents, yet these vehicles soldier on for many hundreds of thousands of miles on OEM filters. In my limited view, it points to two things being the main drivers of the "wear" we are all concerned about: the tribofilm layer, which is essentially ripped off every time a fully new sump of oil is introduced (and backed up by testing that shows effective wear rates skyrocket with fresh oil), and localized hot spots that cause deposit formation (coking), which then reduces or eliminates oil flow through the area, driving temps higher than the oil flow could normally cope with anyway and perpetuating metal-to-metal contact.

Sure, there is going to be some wear any time particles are circulating in oil... but after the engine is shut off and the oil drains back, a good majority of suspended particles are going to end up in the sump and eventually be overcome by gravity, yes? I'm willing to bet that a full-pan-drain PC test would have significantly different results than a dipstick-tube PC test, due to "wear" fallout. Your views?
 
Originally Posted by SubieRubyRoo
In my limited view, it points to two things being the main drivers of the "wear" we are all concerned about: the tribofilm layer, which is essentially ripped off every time a fully new sump of oil is introduced (and backed up by testing that shows effective wear rates skyrocket with fresh oil)

Can you link to this testing?
 
I don't buy that... New oil has a lot more lubricity vs old oil, which would more than make up any real-world difference there... And oils do not generally have drastically different additives, so it is not a huge change from one oil to another... The zinc and phos levels are remarkably similar between all oils today... The molybdenum levels are very similar across different oil brands too... Chevron Havoline and Quaker State oils are almost exactly the same in terms of those three additives, plus others as well.
 
How about you not be ridiculous and not question kschachn's intelligence?? Try that on for a change...
And how about him not bringing a positive response?? AKA agreeing with you... He actually has a different way of thinking about this, with good reason and rationale for why.
 
Heavens, it was only a question driven by dnewton3's reference to the flawed SAE 2007-01-4133 study that's always dragged out to support the notion that "old oil is better than new oil." I have not seen a credible (nor a second) study that shows this.

I looked at the link you provided - does it show that? If it does, then I missed it.

From the Machinery Lubrication article:

Quote
While the wear rate is not greatly escalated at the front end of the oil change interval, it certainly is not lessened by frequent oil changes either. Changing your oil early does not reduce wear rates, presuming you did not allow the sump load to become compromised. When you have reasonably healthy oil, the wear rate slope is generally flat.

Only after the oil becomes compromised in some manner would you see a statistical shift in wear rates. Thus, higher wear at the front of an oil change interval is plausible, but the claim of lesser wear with fresh oil is most certainly false. Those who change oil frequently at 3,000 miles are not helping their engine, and those who leave it in for longer periods are not hurting the engine.

Reading through that article, it seems to indicate that you don't reduce wear with more frequent oil changes, but you don't increase it significantly either. That is not supportive of your statement above that it's "backed up by testing that shows effective wear rates skyrocket with fresh oil." The article surely does not say that, and this is exactly what I was asking you about.
 
^^^ Notice who wrote that Machinery Lubrication article.

I think Dave (or maybe it was someone else) has mentioned the stripping of the anti-wear layer upon a fresh oil change in prior discussions here on BITOG. Maybe he can chime in on that.
 
Not saying you are a bad guy, Subie... Not at all... You are a good guy on here...
 
Originally Posted by ZeeOSix
^^^ Notice who wrote that Machinery Lubrication article.

Yes, I saw that later. One should also note that his article is really not about increased wear with new oil; that's only a topic he invokes during his discussion of oil analysis in general.

And the idea of stripping the tribofilm layer is proposed as one mechanism for the higher iron counts, but he provides no real evidence that that's what is happening. In fact, he doesn't even make that statement; what he does say is: "Studies have shown that elevated wear levels after an oil change can be directly linked to chemical reactions of fresh additive packages." So if you use the same oil as you did the last time, then this should not happen? That seems easy enough to test.

What do you want to bet that the "studies" he mentions are the one SAE study that isn't applicable to already-formed films?
 