If I've understood this correctly, doesn't this mean that there is a bug somewhere? I've always thought that AA queried the device sourcing the data and either sample rate converted as the data was incoming or gave a warning. I didn't think that under any normal circumstances you should end up with a header that didn't match the data.
Andy's correct, and so are you - almost - when it comes to CEP/AA
initiating a recording...
The software initiating a recording sends commands to the sound device telling it what rate to sample at, and what output format it would like to see. But when it comes to a S/PDIF feed, this doesn't work because those settings are over-ruled by the source. So for instance, if I connect my DAB radio to a sound device and let it provide sync for it, that's what the device streams out - regardless of what the software tells it.
But the software thinks it's issued a command - and doesn't check to see if it has been obeyed. So the 48k stream from my DAB radio gets a 44.1k file header if I'm not careful. And plays a semitone wrong.
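The size of that pitch error is easy to put a number on. Here's a quick sketch (names mine, not from any of the software discussed) showing why 48k data with a 44.1k header plays roughly a semitone and a half flat:

```python
import math

def pitch_error_semitones(data_rate_hz, header_rate_hz):
    """Pitch error, in semitones, when audio actually sampled at
    data_rate_hz is played back as if it were header_rate_hz.
    Negative means it plays flat (too slow)."""
    return 12 * math.log2(header_rate_hz / data_rate_hz)

# 48k DAB audio wrongly labelled as 44.1k:
error = pitch_error_semitones(48000, 44100)  # about -1.47 semitones
```

So "a semitone wrong" is if anything a slight understatement - it's nearly a semitone and a half.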
I did try editing the header to make the file look like a proper 48k one, but it still played incorrectly in AA 1.5. In fact the pitch was higher than it should have been when played from AA 1.5!
pwhodges' explanation seems to be the correct one. If you just edit the header, then you've got 48k audio playing at 44.1k. But if you play this back into a device still being clocked at 48k, you'll be streaming the data back into it 48/44.1 times too fast - and that's why it plays back at a higher pitch. The other thing you'd notice after a while is bloody great clonks in the audio when the data clocks don't coincide any more!
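For what it's worth, "just editing the header" amounts to rewriting one field in the WAV file's fmt chunk. A rough sketch of what that edit does (assuming a canonical 44-byte RIFF header, where the sample rate sits at byte offset 24 and the byte rate at offset 28; the function name is mine):

```python
import struct

def patch_wav_sample_rate(path, new_rate):
    """Rewrite the sample-rate field of a WAV file in place.
    Assumes a canonical RIFF header: channels (u16) at offset 22,
    sample rate (u32) at 24, byte rate (u32) at 28, bits (u16) at 34.
    The audio data itself is untouched - which is exactly why this
    alone doesn't fix the problem."""
    with open(path, 'r+b') as f:
        f.seek(22)
        channels, = struct.unpack('<H', f.read(2))
        f.seek(34)
        bits_per_sample, = struct.unpack('<H', f.read(2))
        f.seek(24)
        f.write(struct.pack('<I', new_rate))
        f.seek(28)  # byte rate must be kept consistent
        f.write(struct.pack('<I', new_rate * channels * bits_per_sample // 8))
```

Note this only changes what the header *claims* - every sample stays where it was, so the clock mismatch on playback remains.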
The procedure for correcting this is to do a proper sample rate conversion and simply save the resulting file. And I'm pretty sure there's no other simple way around it at all, even though in theory there ought to be.
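To illustrate what the conversion actually does - unlike the header edit, it rewrites the samples themselves so there are genuinely 44,100 (or 48,000) of them per second. This is only a naive linear-interpolation sketch for illustration; a real converter like AA's uses proper low-pass filtering to avoid aliasing:

```python
def resample(samples, src_rate, dst_rate):
    """Naive linear-interpolation sample rate converter.
    Illustrative only - no anti-aliasing filter, so don't use it
    for real audio work."""
    n_out = int(len(samples) * dst_rate / src_rate)
    out = []
    for i in range(n_out):
        pos = i * src_rate / dst_rate   # position in the source stream
        j = int(pos)
        frac = pos - j
        a = samples[j]
        b = samples[min(j + 1, len(samples) - 1)]
        out.append(a + (b - a) * frac)  # interpolate between neighbours
    return out

# 10 ms of 48k audio becomes 10 ms of 44.1k audio:
converted = resample([0.0] * 480, 48000, 44100)  # 441 samples out
```

The point is that after this, the header and the data agree, so the file plays at the right pitch on any device.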