I think what he is asking is: what is the variance of COAL (measured to the bullet nose) when we seat using the ogive?
So say we load using the ogive (not the flat nose point), plunk test and such, and come up with a COAL for the round and all that.
What is the variance of COAL (from the nose of the bullet) across 10 bullets or so?
For me, from memory, it is maybe .0005 or so. I do know that when I pull my pre-set 650 toolheads down to make a run of the same bullet (the dies are GTG), I double-check COAL, and if it's off by more than that I'll recheck/readjust things.
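If you want to put an actual number on that spread, measure a sample of rounds and compute the extreme spread and standard deviation. Here's a minimal sketch in Python, just to show the arithmetic; the measurements are made-up placeholders, not real data:

```python
# Summarize nose-referenced COAL spread across a sample of rounds.
# The numbers below are made-up placeholders, not real measurements.
from statistics import mean, stdev

coal = [1.1250, 1.1252, 1.1248, 1.1251, 1.1249,
        1.1253, 1.1250, 1.1247, 1.1252, 1.1250]  # inches, measured to the nose

extreme_spread = max(coal) - min(coal)
print(f"mean:           {mean(coal):.4f} in")
print(f"extreme spread: {extreme_spread:.4f} in")
print(f"std dev:        {stdev(coal):.4f} in")

# Flag the batch if the spread exceeds whatever tolerance you hold (.0005 here).
TOLERANCE = 0.0005
if extreme_spread > TOLERANCE:
    print("Spread exceeds tolerance -- recheck the seating die setup.")
```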
What would be really neat is a measurement tool that fits on calipers and measures COAL off the ogive instead of the nose, kind of like the rifle case comparators that measure off the shoulder of the case. If we are seating with the ogive, why aren't we measuring with the ogive?
Let's say there is one bullet that has a short nose and a higher ogive. You measure it, COAL is right on, and it goes into the "good" pile. Whoops, might be an issue. If you plunk your ammo, you would catch it, though.
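That case is exactly where having both measurements would pay off: nose COAL minus base-to-ogive gives you each round's effective nose length, and a round whose nose length is way off the rest of the batch is the short-nose/high-ogive bullet hiding in the "good" pile. A minimal sketch, assuming you had both readings for each round (all numbers are made-up placeholders):

```python
# Per-round nose length = nose COAL - base-to-ogive.
# A round can measure dead-on for COAL yet sit long against the lands
# if its ogive is farther forward; the nose-length deviation exposes it.
# All numbers below are made-up placeholders, not real measurements.
from statistics import median

coal_nose     = [1.1250, 1.1252, 1.1248, 1.1251, 1.1250]  # inches
base_to_ogive = [1.0400, 1.0401, 1.0399, 1.0400, 1.0440]  # inches

nose_len = [c - b for c, b in zip(coal_nose, base_to_ogive)]
typical = median(nose_len)  # median, so one oddball doesn't skew the baseline

for i, n in enumerate(nose_len, start=1):
    dev = n - typical
    flag = "  <-- odd nose/ogive, plunk-test this one" if abs(dev) > 0.002 else ""
    print(f"round {i}: nose length {n:.4f} in, deviation {dev:+.4f} in{flag}")
```

Note round 5 in the placeholder data: its COAL is dead on, but its ogive sits .004 farther forward, so it is exactly the round that passes the caliper check and could still fail the plunk.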
So when we plunk, we can see if a round passes or not, but I can't tell by eyesight alone how far off the failures are. I mean, you can tell if they spin or not, but if not, how much too long are they? You could measure to the bullet nose, seat a little deeper each time until it spins, and then recheck the nose measurement; that would give you the variance, but I would hate to do that just for testing/data purposes.

The only other thing I can think of is plunking ammo in the barrel and using calipers to measure from the base of the case to some known/repeatable spot on the barrel (like the edge of the first lug). I just don't know if that is possible, or easily repeatable.

It would also be nice to know the variance in ogive vs bullet nose across some of the major bullets we use, too.
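For that last bit, the same nose-length arithmetic works on bare bullets before they're ever loaded: measure each one overall (caliper jaws on the nose) and again off the ogive with a comparator insert, and the spread of the difference is your ogive-vs-nose variance for that bullet. A minimal sketch; the bullet names and numbers are made up, not real data:

```python
# Nose-length spread for bare bullets, grouped by bullet type.
# Each pair is (overall length, base-to-ogive) for one bullet, in inches.
# Names and numbers are made-up placeholders, not real measurements.
from statistics import mean

measurements = {
    "Bullet A 124gr": [(0.6010, 0.5160), (0.6008, 0.5158), (0.6015, 0.5159)],
    "Bullet B 147gr": [(0.6550, 0.5610), (0.6542, 0.5612), (0.6548, 0.5605)],
}

for name, pairs in measurements.items():
    nose = [oal - bto for oal, bto in pairs]   # nose length per bullet
    spread = max(nose) - min(nose)
    print(f"{name}: mean nose {mean(nose):.4f} in, spread {spread:.4f} in")
```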