The flash point is what’s used to infer fuel dilution.
And we have seen Blackstone estimates that are discrepant by far more than 0.013; more like several percentage points.
I wasn’t even looking at the flash point, just the fuel number itself.
I’m acutely aware of the costs and challenges associated with high-fidelity analytical equipment, and of what RSD is appropriate.
That said, absolute accuracy or consistency isn’t the whole name of the game. The right tool and the right fidelity at the right place and time are what’s necessary. I don’t need to know if it’s 0.5% or 0.513%. I need to know whether it’s varying by whole percentage points.
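To make that concrete, here’s a rough back-of-the-envelope sketch of how a flash point reading gets turned into a fuel number, and how a modest flash point error swings the estimate by whole percentage points. The linear calibration, the 15 °F-per-percent sensitivity, and all the sample numbers are assumptions for illustration only; the actual correlation Blackstone uses isn’t published here.

```python
# Hypothetical illustration only: the real flash-point-to-fuel-dilution
# correlation isn't published here. Assume a simple linear model in which
# each 1% of fuel dilution lowers the flash point by a fixed amount.
DEG_F_PER_PERCENT_FUEL = 15.0  # assumed sensitivity, illustrative only


def estimate_fuel_dilution(expected_flash_f: float, measured_flash_f: float) -> float:
    """Estimate fuel dilution (%) from flash point depression under the assumed linear model."""
    depression = expected_flash_f - measured_flash_f
    return max(depression, 0.0) / DEG_F_PER_PERCENT_FUEL


# How a modest flash point measurement error moves the fuel number:
expected = 410.0    # "should be" flash point for the virgin oil, deg F (illustrative)
measured = 380.0    # measured flash point of the sample, deg F (illustrative)
flash_error = 15.0  # assumed +/- error on the flash point measurement, deg F (illustrative)

nominal = estimate_fuel_dilution(expected, measured)
low = estimate_fuel_dilution(expected, measured + flash_error)
high = estimate_fuel_dilution(expected, measured - flash_error)
print(f"fuel estimate: {nominal:.1f}% (could be anywhere from {low:.1f}% to {high:.1f}%)")
```

Under those assumed numbers, a 15 °F flash point error alone moves the fuel estimate from 1% to 3%, which is exactly the kind of percentage-point swing I care about.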
Per Blackstone, their error rate on measuring flash point isn't very high:
"Based on the margin of error for the methodology we use for measuring the flashpoint, the lowest fuel dilution value you’ll see on one of our reports is <0.5%. That’s our way of essentially saying that no measurable fuel dilution was detected in the oil. If the flashpoint of your sample reads the same as the “should be” value, we’ll report a “TR” (or trace) of fuel dilution. In other words, it’s likely there was a very small amount of fuel dilution present, but not enough to quantify. After that, you’ll see fuel dilution reported as a percentage of the sample. The most fuel our test can accurately read is 10%. If you have more than that, we’ll report >10% (and you should head to a mechanic)."
Source: What is a Flash Point? | Blackstone Laboratories (www.blackstone-labs.com)
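For reference, the reporting rules in that quote boil down to something like this. It's just a sketch mirroring their stated thresholds, not their actual code; the function name and the at_should_be_flash flag are my own.

```python
def report_fuel_dilution(estimated_percent: float, at_should_be_flash: bool = False) -> str:
    """Map an estimated fuel dilution to the report values described in the quote above."""
    if at_should_be_flash:
        return "TR"       # flash point reads the same as the "should be" value: trace
    if estimated_percent < 0.5:
        return "<0.5%"    # below the method's margin of error: nothing measurable detected
    if estimated_percent > 10.0:
        return ">10%"     # beyond what the test can accurately read
    return f"{estimated_percent:.1f}%"


print(report_fuel_dilution(0.2))   # <0.5%
print(report_fuel_dilution(2.3))   # 2.3%
print(report_fuel_dilution(12.0))  # >10%
```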