Why does compression usually test so high?

This is something I have been curious about for a while and have never been able to figure out. On the majority of automotive engines, cranking compression generally reads way higher than expected for the engine's compression ratio and I'm wondering why this is.

For example, my 05 Civic has a compression ratio of 9.9 to 1. With simple math (14.7 PSI atmospheric pressure x 9.9), we can calculate that the compression should theoretically be about 145 PSI, not accounting for imperfect volumetric efficiency or inevitable leakage. But when tested, that engine actually produces about 215 PSI across all 4 cylinders, which is a typical result for that engine despite seeming scientifically impossible. In theory, assuming no leakage at all and perfect cylinder filling (which is obviously impossible), it would need a compression ratio of about 14.6 to 1 to generate 215 PSI, which it definitely does not have.

Does anyone know why this is? The only potential explanation I can think of is gauge inaccuracy, but I doubt that's the answer, since reading significantly higher than theoretically possible compression is the expected result on healthy engines, not just an anomaly in a few test results.
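For reference, here is the arithmetic from this post as a minimal Python sketch; the 9.9:1 ratio and the 215 PSI reading are the numbers quoted above, and 14.7 PSI is standard sea-level atmospheric pressure:

```python
# Naive estimate from the post above: atmospheric pressure times compression
# ratio, i.e. assuming the trapped air does not heat up while it is compressed.
ATM_PSI = 14.7  # standard sea-level atmospheric pressure, PSI absolute

def naive_compression(compression_ratio, atm=ATM_PSI):
    return atm * compression_ratio

print(naive_compression(9.9))  # ~145.5 PSI expected for the 9.9:1 Civic
print(215 / ATM_PSI)           # ~14.6, the ratio a 215 PSI reading would imply
```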
 
There's more to it, like valve lift and timing. You can't calculate it the way you did.
I completely understand that, but unless somehow valve timing and lift resulted in higher than 100% volumetric efficiency while cranking (which I highly doubt), how could that cause such high compression?

To end up with higher than theoretically possible compression like I described due to valve timing and lift, wouldn't the cylinders need to be starting at higher than atmospheric pressure with the piston at bottom dead center before the compression stroke begins?
 
I'm with Chris: I don't think it's that simple. Since I had my '57 Plymouth FSM handy, I looked there. My 230 flathead 6 has a CR of 8.0:1. Using your formula gives a theoretical maximum of 117.6 PSI. Yet the FSM specs compression at 120-150 PSI.
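Running the same naive formula on those numbers, just as a quick check:

```python
# 8.0:1 flathead 6 from this post, same atmospheric-pressure-times-ratio estimate.
print(14.7 * 8.0)  # 117.6 PSI -- already below the FSM's 120-150 PSI spec range
```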

 
Does Honda give a spec?
The only spec they give is that the compression should be a minimum of 135 PSI with a maximum of 28 PSI difference between cylinders, which is their spec for every one of their engines I have seen.

However, I have enough experience with these engines to know that 180-200+ PSI is expected if it's healthy and the test is performed properly (engine hot, all plugs out, fully charged battery, wide open throttle, etc).

To be clear, I'm not asking about that engine specifically, I was just using it as an example. Many other engines exhibit this same odd behavior.
 
I think engines never hit zero PSI on the exhaust stroke; the valve is too small and not open far enough to truly get back to zero. So there will always be a small amount of pressure in the cylinder once you have turned the engine over a few times when testing for compression. My untested thoughts on this.
 
Are you saying there's residual compression left in the cylinder before the next compression stroke?
 
Lol wut I don’t believe that is how that works.
 
Good question - I've often wondered the same.
 
One other issue is that you are also heating up the air. It's entering the chamber at, say, 68 F, and during the compression stroke the walls of the cylinder are at almost 200 F. The air won't reach 200, but it will be a lot more than 68. The behaviour of the air is governed by the ideal gas law. At the top of the compression stroke, for one microsecond, the volume is fixed and the temp has been raised. So now the pressure is high because the air has been compressed, but it is even higher because the temperature has been increased. 🤓
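A small Python sketch of that ideal gas law point; the 150 F charge temperature is an assumed number purely for illustration (the post only says the air ends up well above 68 F but below 200 F):

```python
# Ideal gas law: P1*V1/T1 = P2*V2/T2, with temperature in absolute units
# (Rankine = Fahrenheit + 459.67).

def rankine(deg_f):
    return deg_f + 459.67

P1 = 14.7           # PSI absolute at bottom dead center
CR = 9.9            # compression ratio, V1 / V2
T1 = rankine(68)    # intake charge temperature from the post
T2 = rankine(150)   # assumed warmer charge temperature, for illustration only

p_volume_only = P1 * CR               # pressure rise from the volume change alone
p_with_heating = P1 * CR * (T2 / T1)  # volume change plus the temperature rise
print(round(p_volume_only, 1), round(p_with_heating, 1))  # ~145.5 vs ~168 PSI
```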

 
It takes several seconds of cranking before the pressure in the gauge peaks out. Therefore, you are reading the pressure trapped in the gauge and not the cylinder. The velocity of the air being pushed up by the piston increases the pressure.
 

The intake charge heats up while being compressed, and that increases the pressure further.
 
VVT more than likely. No need for higher lift etc. during cranking…
 
I've never heard your idea of atmospheric pressure x compression ratio equaling expected cranking compression before, and I've been a mechanic for a living since 1987. Cam timing events have a lot to do with it, along with compression ratio.
 
When air is compressed it heats up.
At 8:1 the air is compressed to 1/8 of its former volume and heats up SIGNIFICANTLY.
This creates the additional pressure, but there are many variables in the engine, so you can't calculate it out to 1 psi.
Finally, the right answer appears! If air didn't heat up when compressed, compression-ignition engines (i.e., diesels) wouldn't be able to ignite their fuel.
If an engine were a perfect adiabatic compressor with no leaks, no intake restriction, etc., you could calculate the final pressure "out to 1 psi," but an engine isn't. Basic thermodynamics ...
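To put rough numbers on that, here is a minimal sketch of the two limiting cases, assuming dry air (gamma ≈ 1.4) and treating the geometric compression ratio as the effective one, which cam timing and leakage in a real engine do not allow:

```python
# Isothermal (no heating) vs adiabatic (no heat lost) compression of air.
# A real cranking test lands somewhere between the two bounds.
GAMMA = 1.4  # ratio of specific heats for air

def isothermal_psi(cr, atm=14.7):
    return atm * cr            # the "simple math" estimate from the first post

def adiabatic_psi(cr, atm=14.7):
    return atm * cr ** GAMMA   # ideal adiabatic compression

for cr in (8.0, 9.9):
    print(cr, round(isothermal_psi(cr)), round(adiabatic_psi(cr)))
# 8.0 -> 118 vs ~270 PSI; 9.9 -> 146 vs ~364 PSI
# The 215 PSI measured on the 9.9:1 Civic falls between the two bounds.
```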
 