I know there are many reports of problems with aftermarket MAF (mass air flow) sensors, and perhaps what I’m finding could be a factor – see what you think. There are many threads on specific brands of sensors, but most are now made in China regardless. I don’t want to get into a brand discussion (yes, use OEM, but that’s hard to source for 30+ year old vehicles), so first I’m trying to get the question below figured out...
I’m doing research on early 1990s GM OBD1 MAF sensors; my specific application is a 1993 Buick Century with the 3.3L. That said, this discussion should apply to other models with the 3300, and to all 3800 V6 applications of the early 1990s.
At issue is whether the OBD1 ECM (computer) expects an analog voltage signal or a frequency signal from the MAF. Were these OBD1 systems designed to interpret a frequency signal?
From what I can find, GM was using a hot-wire sensor during this time (vs. the hot-film type that became more common with their OBD2 systems).
I have installed a brand new MAF sensor from a major aftermarket brand, and it has three wires: the 12V supply, ground, and a signal wire.
Testing on that signal wire, I expected to find an analog voltage that varied with RPM. Typical numbers at idle would be around 0.8 to 1.0 V, increasing with a rise in RPM up to around 3.5 to 4.0 V. Instead I get 2.5 V with KOEO, 2.35 V at warm idle, and it drops down to 2.0 V with a throttle snap. (I wish I had an original OEM sensor to compare against, but I don't.)
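One thing worth noting about those numbers: a multimeter set to DC volts averages a fast square wave, so a 0-5 V frequency output would read as a steady mid-range voltage on the meter. Here is a minimal sketch of that averaging; the duty-cycle figures are purely hypothetical illustrations, not GM specifications.

```python
# Sketch: why a DC voltmeter can show ~2.5 V on a frequency-output MAF.
# A meter on DC volts reports roughly the time-average of a fast square
# wave, so a 0-5 V signal near 50% duty cycle reads about 2.5 V.
# The duty-cycle values below are hypothetical, chosen only to show
# how the measured readings could arise from a frequency signal.

def dc_meter_reading(high_v: float, low_v: float, duty: float) -> float:
    """Average voltage a DC meter would report for a square wave."""
    return high_v * duty + low_v * (1.0 - duty)

print(dc_meter_reading(5.0, 0.0, 0.50))  # 2.5 - like the KOEO reading
print(dc_meter_reading(5.0, 0.0, 0.47))  # 2.35 - like warm idle
print(dc_meter_reading(5.0, 0.0, 0.40))  # 2.0 - like the throttle snap
```

If the signal really is a square wave, the "voltage" readings above may just be its average, which would explain why they sit in a narrow band and barely move with RPM.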
It therefore appears this new MAF sensor is reporting frequency (Hz). Using the Hz function on my multimeter, I get 3,000 Hz at warm idle, and up to 7,000 to 8,000 Hz with throttle snaps (my scale might be wrong, but you get the idea).
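For anyone with access to a scope or a sampling DAQ instead of a meter, the Hz reading can be cross-checked by counting rising edges in a captured trace. A minimal sketch, using a synthesized 3,000 Hz square wave as a stand-in for the MAF output; the sample rate and 2.5 V threshold are arbitrary choices for the example.

```python
# Sketch: estimating signal frequency from samples by counting rising
# edges through a threshold. The synthesized square wave below stands
# in for a captured MAF trace; sample rate and threshold are arbitrary.
import math

def estimate_frequency(samples, sample_rate_hz, threshold=2.5):
    """Count rising edges through the threshold, divide by capture time."""
    edges = sum(1 for a, b in zip(samples, samples[1:])
                if a < threshold <= b)
    duration_s = (len(samples) - 1) / sample_rate_hz
    return edges / duration_s

# Synthesize 0.1 s of a 3,000 Hz, 0-5 V square wave sampled at 1 MHz
rate = 1_000_000
signal = [5.0 if math.sin(2 * math.pi * 3000 * t / rate) > 0 else 0.0
          for t in range(int(rate * 0.1))]
print(round(estimate_frequency(signal, rate)))  # ~3000
```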
Now here is the concern. Was the OBD1 ECM back then analog-only, looking for a change in voltage? If so, the digital output would be incompatible and could cause the ECM to fall back to a default mode. That could contribute to rough idle, no-start, and other undesirable conditions.
In addition, I speculate it would not set an engine code (Code 34), because the ECM is seeing a voltage reading (2.0-2.5 V) that falls in the "normal range" for commonly used 2,500 to 3,000 RPM. (Under OBD1 it probably doesn't set the code unless voltage is less than 0.5 V or greater than perhaps 5 V.)
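To make the speculation concrete: if the ECM only runs a simple out-of-range check, a signal idling between 2.0 and 2.5 V would never trip it. A minimal sketch; the 0.5 V / 5.0 V limits are my guesses from above, not documented GM OBD1 thresholds.

```python
# Sketch of the speculated Code 34 logic: a bare out-of-range check.
# The 0.5 V and 5.0 V limits are guesses, not documented GM thresholds.

def would_set_code_34(signal_v: float,
                      low_limit: float = 0.5,
                      high_limit: float = 5.0) -> bool:
    """Hypothetical MAF out-of-range test for Code 34."""
    return signal_v < low_limit or signal_v > high_limit

for reading in (2.5, 2.35, 2.0):               # the readings I measured
    print(reading, would_set_code_34(reading))  # all False - no code set
```

Under this logic the ECM would happily accept the averaged square wave as a "valid" voltage and never flag the sensor, which is exactly the no-code scenario I am worried about.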
Surely the manufacturers of these MAF sensors would not be taking advantage of this to produce something that gets OBD1 vehicles "running" with no codes reported, but nowhere near optimal as designed?