I wouldn't doubt that the lab method is a titration of some sort, which can be extremely exact and particular. R&D lab techs also tend to have much better skills at getting good, reproducible, exact results than standard analytical techs (I work in R&D as my day job, and I see this a lot).
ASTM methods are a good way of getting a general procedure that everyone can do. However, in my experience they leave too much up in the air. I recall one method for analyzing fuel where one of the instructions was to fill something with 2 +/-1 mL of fuel. With tolerances like that, there can certainly be 10% swings.
In much analysis, 6% RSD is acceptable. A "lab" TBN of 13 could be +/- 0.8 (6% of 13) and still be fully acceptable on many analytical instruments used in labs. That said, on some of our equipment we get RSDs of 20-30%, which for the sake of comparison (since we don't have to worry about fines from the EPA, FDA, etc.) is sometimes good enough, and I'd venture to guess that for a TBN measurement, 20% compared to a "lab" analysis might be good enough... You want to see that you have reserve, not wring out every last bit of buffering ability.
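To put rough numbers on that, here's a minimal Python sketch of how RSD translates into absolute scatter on a TBN result. The replicate values are made up for illustration, not real lab data:

```python
# Sketch: relative standard deviation (RSD) and the absolute spread
# it implies at a given TBN level. All values here are hypothetical.
import statistics

def rsd_percent(values):
    """Relative standard deviation as a percentage of the mean."""
    return statistics.stdev(values) / statistics.mean(values) * 100

# One-sigma spread implied by a given RSD at a given measurement level:
tbn = 13.0
rsd = 6.0                      # percent, a common analytical acceptance spec
spread = tbn * rsd / 100       # 6% of 13 is ~0.78 TBN units

# Hypothetical replicate titrations of the same oil sample:
replicates = [13.1, 12.5, 13.4, 12.8, 13.6]
print(f"replicate RSD: {rsd_percent(replicates):.1f}%")
print(f"6% RSD at TBN 13 -> +/- {spread:.2f} TBN units")
```

Same arithmetic for a 20% RSD: at TBN 13 that's +/- 2.6, which is why it only really tells you whether reserve exists, not its exact value.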
Remember, if TBN goes too low, the kinetics of neutralization likely get too slow as well. Think back to HS chemistry, doing titrations with an indicator... at first, whatever you titrate in reacts immediately and all is fine, but as you approach the equivalence point, the indicator takes noticeably longer to fade. The same thing is likely the case here.
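That slowdown can be sketched with a toy rate law (my illustration, not a real lubricant model): if neutralization is roughly second order, r = k[acid][base], then as the base reserve is depleted the rate for a fixed slug of acid drops in direct proportion:

```python
# Toy illustration: second-order rate law r = k [acid][base].
# As the base reserve (TBN-like number) drops, the neutralization
# rate for the same incoming acid drops proportionally.
def neutralization_rate(k, acid_conc, base_conc):
    """Instantaneous rate under a simple second-order rate law."""
    return k * acid_conc * base_conc

k, acid = 1.0, 0.01  # arbitrary units, chosen for illustration
for base in (13.0, 5.0, 1.0, 0.1):
    rate = neutralization_rate(k, acid, base)
    print(f"base reserve {base:5.1f} -> relative rate {rate:.4f}")
```

So an oil sitting at a TBN of 1 neutralizes incoming acid an order of magnitude slower than one at 13, even though both still have "reserve" on paper.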
JMH