Those more common values are probably based on something: either that small reduction below 0dB is enough to avoid all clipping, the probability of clipping at those levels is so small as to be negligible, or the recommendations are simply misguided. Who knows?
The -0.1 to -0.3dB values are there because some CD players have been known to make a huge fuss about the decoded value for 0dB, although to be fair, I think this was mainly the early ones. But since you can't tell what a disc will be played on, and this difference in level is held to be inaudible, it hardly matters. I don't know where the -6dB recommendation came from - I've never seen it - but I can hazard a guess, based on what happens in the following stages:
Under very adverse circumstances, as you noted, it's possible to get a value well above 0dB out of a D-A converter, although in general you'd be unlikely to see more than +2 to +3dB. And since this is not the result of incorrect codes, the only limit on what it does to the output is whether there is enough headroom in the reference voltage to reproduce these levels. If there is, they pass straight through to the next stage. If there isn't, and the next stage has the same headroom limitation, then you will get bad clipping on anything over 0dB anyway.
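To illustrate how legal, full-scale samples can reconstruct to well above 0dBFS, here's a sketch (my own hypothetical example, not from the post above): a sine at a quarter of the sample rate, sampled exactly at its 45-degree points, stores nothing but +/-1 values, yet the band-limited waveform between those samples peaks at sqrt(2), about +3dB - right in the +2-3dB range mentioned. The 8x upsampling by spectral zero-padding stands in for the interpolation filter in an oversampling DAC.

```python
import numpy as np

# An fs/4 sine sampled at its 45-degree points: every stored sample is
# exactly +/-1 (0 dBFS), but the underlying waveform has amplitude sqrt(2).
N = 64
n = np.arange(N)
x = np.sqrt(2) * np.sin(np.pi * n / 2 + np.pi / 4)   # samples are +/-1

# Upsample 8x by zero-padding the spectrum: ideal band-limited
# interpolation, roughly what an oversampling DAC filter approximates.
L = 8
X = np.fft.fft(x)
Y = np.zeros(L * N, dtype=complex)
Y[:N // 2] = X[:N // 2]          # positive-frequency bins
Y[-(N // 2):] = X[N // 2:]       # negative-frequency bins
y = np.fft.ifft(Y).real * L      # rescale for the longer transform

peak_db = 20 * np.log10(np.max(np.abs(y)))
print(f"stored sample peak: {np.max(np.abs(x)):.3f} (0 dBFS)")
print(f"reconstructed peak: +{peak_db:.2f} dB")   # about +3 dB
```

Note that trimming the stored samples by 0.3dB would only move the reconstructed peak from roughly +3dB to +2.7dB, which is the point made below.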
And cutting your peak level back 0.3dB won't prevent these overs. To absolutely guarantee that, you'd need about 6dB of headroom on the CD itself to stop the output from ever rising to these levels. In all honesty, I can't say that I've ever noticed or received notice of a problem with this myself, so I tend to stick to about -0.2dB. The +8dB figure was something of a contrivance, partly based on a subsample-rate trick, and I can't see it ever turning up for real.