What's happening to our wedge when...

After being recently schooled on hydrodynamic lubrication (wipe that silly smirk off your face, Shannow), I was wondering what happens to the wedge under different driving modes that affect its ability to prevent wear.

I know engines show very little wear when they are operated at constant revs, munching miles on the motorway, and after understanding the above I can see why.

A lot of engine wear takes place during the warm-up phase of any engine, so what is happening to the wedge when the oil is cold and thick that increases wear like this?

Also, excessive wear takes place when the engine is labouring at low speed/high torque (probably only relevant to manual gearboxes), and in high-torque situations in general, so what's happening to the wedge here?

Is this why film strength and extreme pressure performance are so critical in protecting our engines from wear?
 
Riggaz...there are a couple of rules (not rules of thumb, but rules of physics) that apply.

In hydrodynamics, the parts are separated by a film of lubricant (the lubricant can be air if you design it correctly):
1) the greater the viscosity, the greater the parts separation;
2) the greater the speed difference across the lubricant, the greater the parts separation;
3) the greater the specific load (the load in pounds force divided by the projected area, diameter x length), the lower the parts separation; and
4) for a given load as per condition 3, the narrower the bearing clearance, the greater the parts separation.
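
To put rough numbers on those four rules (my illustration only, with made-up values), they are all bundled into the Sommerfeld number used for journal bearings, where a higher number broadly means a thicker film:

```python
# Illustrative sketch only: the Sommerfeld number for a journal bearing.
# Higher S generally corresponds to greater parts separation (thicker film).
# All values below are invented for demonstration.

def sommerfeld(viscosity_pa_s, rev_per_s, load_n, diameter_m, length_m, clearance_m):
    """S = (r/c)^2 * mu * N / P, where P is load over the projected area (D x L)."""
    r = diameter_m / 2.0
    p = load_n / (diameter_m * length_m)   # specific load, per rule 3
    return (r / clearance_m) ** 2 * viscosity_pa_s * rev_per_s / p

base = dict(viscosity_pa_s=0.01, rev_per_s=50.0, load_n=5000.0,
            diameter_m=0.05, length_m=0.02, clearance_m=25e-6)

print(f"baseline S     : {sommerfeld(**base):.3f}")
# Rule 1: double the viscosity -> S doubles (thicker film)
print(f"2x viscosity   : {sommerfeld(**{**base, 'viscosity_pa_s': 0.02}):.3f}")
# Rule 2: double the speed -> S doubles (thicker film)
print(f"2x speed       : {sommerfeld(**{**base, 'rev_per_s': 100.0}):.3f}")
# Rule 3: double the load -> S halves (thinner film: the lugging case)
print(f"2x load        : {sommerfeld(**{**base, 'load_n': 10000.0}):.3f}")
# Rule 4: halve the clearance -> S quadruples via the (r/c)^2 term
print(f"half clearance : {sommerfeld(**{**base, 'clearance_m': 12.5e-6}):.3f}")
```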

That's bearings only, and the easiest duty on them is clearly, as you state, steady, lightly loaded speed. Worst is lugging.

If you need to explore the film's strength, there's a really good chance you will be causing pounding damage to the bearing.

Other places in the engine, like cams, are very much different, in that they have much higher contact pressures and aren't conducive to hydrodynamics...at temperature and speed they are usually mixed/boundary, and they need surface-active additives to prevent wear.

With cold, thick oil, they can sit more toward the hydrodynamic side.

So at start-up, provided the oil can flow and not starve the oil pump, these parts will be as far apart as they can be, and wear will be low.

During warm-up, the viscosity drops very, very quickly, and the surface-active additives take heat to become active.

That's the damaging point in the warm-up cycle, when the lubricant is thinning and the additives aren't yet functioning. Usually you have left-over tribofilms from last time, so wear isn't an issue.
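
To show how quickly that thinning happens (a sketch with assumed numbers: a typical 5W-30 at KV40 = 64 cSt and KV100 = 11 cSt, run through the standard Walther/ASTM D341 viscosity-temperature relation):

```python
import math

# Sketch with assumed oil data (KV40 = 64 cSt, KV100 = 11 cSt, roughly a 5W-30),
# using the Walther / ASTM D341 relation:
#   log10(log10(v + 0.7)) = A - B * log10(T_kelvin)

def walther_fit(t1_c, v1_cst, t2_c, v2_cst):
    """Fit the A, B constants from two (temperature C, viscosity cSt) points."""
    def z(v): return math.log10(math.log10(v + 0.7))
    x1, x2 = math.log10(t1_c + 273.15), math.log10(t2_c + 273.15)
    b = (z(v1_cst) - z(v2_cst)) / (x2 - x1)
    return z(v1_cst) + b * x1, b

def viscosity_cst(t_c, a, b):
    """Kinematic viscosity in cSt at temperature t_c."""
    return 10 ** (10 ** (a - b * math.log10(t_c + 273.15))) - 0.7

a, b = walther_fit(40.0, 64.0, 100.0, 11.0)
for t in (0, 20, 40, 60, 80, 100):
    print(f"{t:>4} C : {viscosity_cst(t, a, b):6.0f} cSt")
# Roughly 540 -> 160 -> 64 -> 31 -> 18 -> 11 cSt: most of the thinning
# happens early in warm-up, before the additives are up to temperature.
```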

But see the Sequence IVA wear test: it purposely holds the engine at low revs, with full lubrication but low operating temperature, and specifically measures cam wear.

Does that help with your question?
 
Shannow, I agree almost completely with you, except in one detail. Although loads on bearings are very high with high load and low rpm, and the oil film is weakest in these conditions, I don't think that's when most wear occurs on bearings.
I think most wear (and stress) occurs at high rpm combined with high load, because other things get involved, like imperfectly balanced crankshafts, torsional forces and so on.
This is why we see bearing failures at high speeds on motorways, in racing, etc.
High load at low rpm really affects the cylinders the most, because of the way loads are transmitted via the pistons.
 
chrisri,
point taken...and a couple of thoughts.

Inertial loads rise with the square of engine speed, especially if there's an imbalance.
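
To put a number on that (a rough sketch with assumed piston mass and geometry, using the usual slider-crank approximation for the peak reciprocating force):

```python
import math

# Rough sketch with assumed values: peak reciprocating inertial force for a
# slider-crank is approximately F = m * r * w^2 * (1 + r/l), i.e. it scales
# with the square of engine speed.

m = 0.8      # assumed piston + small-end mass, kg
r = 0.046    # assumed crank radius (half the stroke), m
l = 0.150    # assumed conrod centre-to-centre length, m

for rpm in (1500, 3000, 6000):
    w = rpm * 2.0 * math.pi / 60.0         # crank speed, rad/s
    f = m * r * w ** 2 * (1.0 + r / l)     # peak force at TDC, newtons
    print(f"{rpm:>5} rpm : ~{f / 1000.0:5.1f} kN on the rod bearing")
# Doubling the rpm quadruples the inertial load; an imbalance adds to it.
```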

Viscosity drops quickly with heat; in my 3800 Buick L67, for example, I can get well over 130C within ten minutes at 4,000RPM...115C after 20 minutes at 1,800RPM on the highway...and the ambient's not 30C yet, speed only 100km/h.

Throw a loaded engine at high autobahn speeds, and I'm out of my familiarity, but it can't be pretty.

Add thinning, the oil pressure drop that comes with it, and sustained revs, and I'm not sure how all that relates to big-end bearing stability/life.

All points to add to the discussion, not an end-all statement.
 
Originally Posted By: Shannow
So at start-up, provided the oil can flow and not starve the oil pump, these parts will be as far apart as they can be, and wear will be low.

During warm-up, the viscosity drops very, very quickly, and the surface-active additives take heat to become active.

That's the damaging point in the warm-up cycle, when the lubricant is thinning and the additives aren't yet functioning. Usually you have left-over tribofilms from last time, so wear isn't an issue.


That's an interesting concept. If I understand you correctly, wear is (or can be) very low immediately after startup when the oil is still cold and relatively thick, and wear is (or can be) very low once all parts are fully warmed and the engine is operated in a more-or-less constant state.

But wear is (or can be) higher in that transitory period where the oil has begun to thin, but heat-activated additives have yet to "wake up"?

Just for the sake of discussion, what oil temperature range are we generally talking here? Or...at what oil temperature have most of the heat-activated additives become energized?
 
Originally Posted By: chrisri
Shannow, I agree almost completely with you, except in one detail. Although loads on bearings are very high with high load and low rpm, and the oil film is weakest in these conditions, I don't think that's when most wear occurs on bearings.
I think most wear (and stress) occurs at high rpm combined with high load, because other things get involved, like imperfectly balanced crankshafts, torsional forces and so on.
This is why we see bearing failures at high speeds on motorways, in racing, etc.
High load at low rpm really affects the cylinders the most, because of the way loads are transmitted via the pistons.


That is probably very dependent on engine design, too.

I will cite one example I've seen, which backs Shannow's argument for low-speed lugging being the worst. Big-block Chryslers, which are often lugged because they show absolutely no complaints when lugged, tend to show wear on the upper rod bearing halves and lower main bearing halves LONG before the lower rod and upper main halves. Those are the surfaces that take the load when the engine is lugging. I've pulled the bearings on a 160,000 mile stock 440 that was running perfectly quietly, other than low-ish oil pressure, and found the upper rod bearing shells worn to the copper and the lowers looking brand new.
 
Shannow;
Agreed, inertial loads rise with the square of speed, and this can pose serious problems with older engine designs that were quite unbalanced. This was a real problem in the past, limiting both power and longevity. Some manufacturers counter that with stronger internals and turbo- or supercharging while limiting rpm for longevity.
This is a cheap way to get serious power from an engine.

There is of course the other way, building highly balanced, refined quad-cam engines, but this is a costly route for mainstream manufacturers.

I have a first-hand anecdote from 10-12 years ago.
My now wife, then girlfriend, had a Mk2 Golf GTI, the 8-valve version, with a conrod bearing problem. So a mate and I bought bearings for the 1.6 turbo diesel VW, because they were half the price of the GTI ones and exactly the same.
The part numbers were different, though. For the first 1k, while bedding in, everything was great: no sounds, all good. But after that and some proper driving, the problem would recur within 2-3k km. I was stubborn and cheap, so I did this twice with the same result. The third time I bought the GTI bearings and never had that problem again.
What I'm trying to say here is that a supposedly stronger bearing from a high-torque TD could not cope in a low-torque but high-rpm engine. That's why I think high rpm combined with a poorly balanced crankshaft has a greater effect than lugging.
 
440 Magnum;
I agree, every engine acts differently. An American pushrod V8 with massive torque and an over-engineered bottom end can be lugged without problems. But how will this engine respond to 7,500 rpm for prolonged periods on the same internals?
Not well, I believe.
After 160k, some wear is normal one way or another.
 
Originally Posted By: chrisri
Shannow;
Agreed, inertial loads rise with the square of speed, and this can pose serious problems with older engine designs that were quite unbalanced. This was a real problem in the past, limiting both power and longevity. Some manufacturers counter that with stronger internals and turbo- or supercharging while limiting rpm for longevity.
This is a cheap way to get serious power from an engine.


Don't be fooled that "perfect balance" reduces or eliminates internal stress. No matter how perfectly balanced the entire rotating assembly may be, an individual rod bearing will always see the full inertial load of reversing the direction of the piston twice every turn of the crank. Yes, the perfect balance job does take load off of the main bearings (mostly) but even then a stress may be transmitted from the crank, through a main bearing to the block structure, and then back to the crank at another bearing (or pair of bearings) where it is effectively cancelled- but those bearings still see an unbalanced force against the block. An engine that spins without a trace of external vibration may still be generating enormous internal forces between components, especially in the case of 4-bangers where paired balance shafts are used to "fix" the natural imbalance of an inline 4. The crank and block see no less stress if you were to remove the balance shafts- only the engine mounts (and driver, and accessories) would notice a difference.
 
Originally Posted By: chrisri
440 Magnum;
I agree, every engine acts differently. An American pushrod V8 with massive torque and an over-engineered bottom end can be lugged without problems. But how will this engine respond to 7,500 rpm for prolonged periods on the same internals?
Not well, I believe.
After 160k, some wear is normal one way or another.



Another interesting anecdote from "the old days" with Mopars. For several years, the big-block 440 was made in several power levels, but all versions used the exact same connecting rod- the so-called "LY" factory forged rod. I think the rod actually dates back to the 413 and 426 wedge-head engines that preceded the 440 (the 426 Hemi was a different animal, as was the "Max Wedge"). They were pretty much bulletproof, including high-RPM operation (relatively speaking, for a 7.2L engine). When the 3x2 "six pack" carburetion setup was introduced, a stronger/heavier connecting rod was also introduced and eventually became the norm for all production 440s through the end of production in '78. But the aftermarket hot-rodders discovered an interesting thing: under high-RPM conditions that an "LY" rod engine would survive just fine, "six-pack" rod engines would sometimes fail. Sometimes violently. There was no net difference in balance, but the added stress of swinging the heavier rod/piston combination at high speed was enough to fail bearings or the rods themselves. It was never a problem for production engines- the stock cams, head porting, etc. all "ran out of breath" before the bottom end would come apart. But once the breathing was opened up and higher RPM potential enabled, the "stronger" rods were the ones that became less desirable because of the added weight. People looking for big power jumped straight to aftermarket rods anyway, but a lot of weekend warrior engine builders came to rely on replacing factory "six-pack" rods with the older/lighter factory "LY" rods and lightening the rotating assembly.
 
Yes, perfect balancing with the flywheel on the engine will reduce stress on the internals, and then you can safely raise the max rpm. I agree, the inertial stress is still there, with the crank twisting and acting on the bearings.
This is where real engineering comes in, to limit those forces to a minimum. When Euro manufacturers went OHC/DOHC, the first things they did were 5 main bearings, stronger blocks and forged, properly balanced crankshafts.
My 1300 Fiat 128 could do 8,500 rpm easily with an electronic distributor.
My 1300 pushrod Ford wanted to destroy itself at 5k rpm.

You can say all this about big V8 engines too. Ferrari engines in the '70s could do 7k; today they go to 9k. How? It's magic.
 
Originally Posted By: 440Magnum
Originally Posted By: chrisri
440 Magnum;
I agree, every engine acts differently. An American pushrod V8 with massive torque and an over-engineered bottom end can be lugged without problems. But how will this engine respond to 7,500 rpm for prolonged periods on the same internals?
Not well, I believe.
After 160k, some wear is normal one way or another.



Another interesting anecdote from "the old days" with Mopars. For several years, the big-block 440 was made in several power levels, but all versions used the exact same connecting rod- the so-called "LY" factory forged rod. I think the rod actually dates back to the 413 and 426 wedge-head engines that preceded the 440 (the 426 Hemi was a different animal, as was the "Max Wedge"). They were pretty much bulletproof, including high-RPM operation (relatively speaking, for a 7.2L engine). When the 3x2 "six pack" carburetion setup was introduced, a stronger/heavier connecting rod was also introduced and eventually became the norm for all production 440s through the end of production in '78. But the aftermarket hot-rodders discovered an interesting thing: under high-RPM conditions that an "LY" rod engine would survive just fine, "six-pack" rod engines would sometimes fail. Sometimes violently. There was no net difference in balance, but the added stress of swinging the heavier rod/piston combination at high speed was enough to fail bearings or the rods themselves. It was never a problem for production engines- the stock cams, head porting, etc. all "ran out of breath" before the bottom end would come apart. But once the breathing was opened up and higher RPM potential enabled, the "stronger" rods were the ones that became less desirable because of the added weight. People looking for big power jumped straight to aftermarket rods anyway, but a lot of weekend warrior engine builders came to rely on replacing factory "six-pack" rods with the older/lighter factory "LY" rods and lightening the rotating assembly.



You are right, I forgot about internal mass altogether. Low mass is a must when going to high rpm. Cool story with the conrods. Supercharged cars benefit from beefier rods that then don't work well in high-revving cars. They probably didn't rev as well due to the heavier internals, either.
 
Awesome thread.



So are we saying that most wear isn't caused at start-up, but during warm-up, when the oil gets warm enough to thin out but the additives aren't hot enough to do their thing?


I love this site.


Shannow is the bearing king. I've learned more about bearings just reading his posts than I ever thought I'd need to know.
Great question OP
 
Originally Posted By: Hokiefyd
Originally Posted By: Shannow
So at start-up, provided the oil can flow and not starve the oil pump, these parts will be as far apart as they can be, and wear will be low.

During warm-up, the viscosity drops very, very quickly, and the surface-active additives take heat to become active.

That's the damaging point in the warm-up cycle, when the lubricant is thinning and the additives aren't yet functioning. Usually you have left-over tribofilms from last time, so wear isn't an issue.


That's an interesting concept. If I understand you correctly, wear is (or can be) very low immediately after startup when the oil is still cold and relatively thick, and wear is (or can be) very low once all parts are fully warmed and the engine is operated in a more-or-less constant state.

But wear is (or can be) higher in that transitory period where the oil has begun to thin, but heat-activated additives have yet to "wake up"?

Just for the sake of discussion, what oil temperature range are we generally talking here? Or...at what oil temperature have most of the heat-activated additives become energized?


So is a low VI oil better? Or maybe even a straight weight? Should I return my M1 0w40?
 
When GF-6 is rolled out, with the addition of SAE 16, VIs as a group may be lower in PCMOs.

The point I believe aa1986 is making is that a lower-VI lubricant will warm up faster, reducing the transitional period.
 
Originally Posted By: used_0il
When GF-6 is rolled out, with the addition of SAE 16, VIs as a group may be lower in PCMOs.

The point I believe aa1986 is making is that a lower-VI lubricant will warm up faster, reducing the transitional period.



A lower viscosity index lubricant will warm up faster?
Why would that be?
A higher-VI lube will thicken less when cold, therefore staying closer to its ideal operational viscosity; what I'm failing to understand is why there would be a difference in oil warm-up time between two lubricants of the same grade.
Can someone elaborate, or point me to where this is explained, please?

Great thread guys.
 
Originally Posted By: aa1986

So is a low VI oil better? Or maybe even a straight weight? Should I return my M1 0w40?


Not that I can see. Lower VI means that the slope of the viscosity vs. temp curve is steeper, so that when the engine starts warming up the oil thins out EVEN FASTER before the temp is high enough to kick in the additives. If you're thinking that it'll start out so much thicker that it'll protect better until the addpack kicks in, I still don't think that's true, because I'm betting the addpack "kick in" temperature is much closer to operating temp than startup temp. In other words, the oil will be almost to operating-temp thickness by the time the addpack kicks in no matter what, so all you get at lower temps with a low-VI oil is poorer flow and possible oil pump starvation.
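
To illustrate the slope point (my sketch, with invented KV40/KV100 figures for two oils that share the same hot viscosity, run through the standard Walther/ASTM D341 viscosity-temperature relation):

```python
import math

# Sketch with invented data: two oils with the same KV100 (11 cSt) but
# different VIs, modeled with the Walther / ASTM D341 relation. The lower-VI
# oil starts thicker and sheds viscosity faster on the way up to temperature.

def fit(t1, v1, t2, v2):
    """Fit Walther constants (a, b) from two (temperature C, cSt) points."""
    z = lambda v: math.log10(math.log10(v + 0.7))
    x = lambda t: math.log10(t + 273.15)
    b = (z(v1) - z(v2)) / (x(t2) - x(t1))
    return z(v1) + b * x(t1), b

def visc(t, a, b):
    """Kinematic viscosity in cSt at temperature t (C)."""
    return 10 ** (10 ** (a - b * math.log10(t + 273.15))) - 0.7

low_vi  = fit(40.0, 85.0, 100.0, 11.0)   # assumed lower-VI oil
high_vi = fit(40.0, 60.0, 100.0, 11.0)   # assumed higher-VI oil

print(" T(C)   low-VI   high-VI   (cSt)")
for t in (10, 30, 50, 70, 90):
    print(f"{t:>4}   {visc(t, *low_vi):7.1f}   {visc(t, *high_vi):7.1f}")
# Both converge on ~11 cSt hot, but the low-VI oil loses far more viscosity
# per degree during warm-up -- a steeper, not gentler, transition.
```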


At least that's how I look at it...


Shannow- aren't addpacks formulated with some non-heat-activated AW components to help bridge this gap? And are modern engines all that dependent on the heat-activated components (like ZDDP) anymore now that so much sliding high-pressure contact has been eliminated through engine design?
 
Another thought I had on this is the roll-out of PC-11, which will separate PCMOs and HDMOs by grade.

It looks like Chevron was the first in North America to offer 15W30 as a viable alternative to 15W40.

I'm with aa1986 on this one.

Is the faster warm-up time a lower VI provides a desirable objective?

Which is less wrong?

An engine oil with a high VI in a grade that may not be needed at one end or the other?

Or an engine oil that is more suited to the ambient and operating conditions with a lower VI?
 
Originally Posted By: used_0il
Another thought I had on this is the roll-out of PC-11, which will separate PCMOs and HDMOs by grade.

It looks like Chevron was the first in North America to offer 15W30 as a viable alternative to 15W40.

I'm with aa1986 on this one.

Is the faster warm-up time a lower VI provides a desirable objective?

Which is less wrong?

An engine oil with a high VI in a grade that may not be needed at one end or the other.

Or an engine oil that is more suited to the ambient and operating conditions with a lower VI?



Good question.
I'm thinking that the higher viscosity index oil would be the better alternative vs the other option.


Edit.

I'm going the other way with this, actually. The latter would be better. I don't think the viscosity index is as important as matching the oil to the operating conditions, rather than going thicker with a higher viscosity index.
 