Unorthodox Behaviour wrote:
Greetings to Rob and the list!
I've been lurking for a while, but now I'd like some advice. Lacking the money for a 24/96 card, I just bought a Hercules (ex-Guillemot) Game Theatre XP 6.1 card, which features the Crystal CS4630 DSP with an 18-bit ADC and a 20-bit DAC. (It's the same chip as in the Videologic Sonic Fury and the Turtle Beach Santacruz.)
...
Question 1 is: Is there a way to verify that the resulting sound actually takes advantage of the card's 20-bit resolution, and that it isn't just sound dithered down to 16 bits? (Provided, of course, that the card's drivers actually support the ADC's 20 bits...)
Unfortunately, if it's a 20-bit DAC and you feed it 24 bits of data, the 4 least significant bits just get chopped off, which is not desirable. I would venture to guess that 16 bits with MAD's dithering would sound better than 24 bits with the last 4 simply discarded.
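To illustrate the difference (this is just a toy sketch, not anything from MAD's actual source; the function names are made up): truncation throws away the low bits outright, producing quantization error that is correlated with the signal, while dithering adds a little triangular-PDF noise before requantizing, which decorrelates the error at the cost of a slightly higher noise floor.

```python
import random

def truncate(sample, drop_bits):
    # What a 20-bit DAC effectively does with 24-bit input:
    # the low drop_bits bits are simply chopped off.
    return (sample >> drop_bits) << drop_bits

def tpdf_dither(sample, drop_bits):
    # Add triangular-PDF dither of roughly +/- 1 LSB at the
    # target word length before requantizing, so the rounding
    # error becomes noise-like instead of signal-correlated.
    lsb = 1 << drop_bits
    noise = int((random.random() - random.random()) * lsb)
    dithered = sample + noise
    return (dithered >> drop_bits) << drop_bits
```

Either way the output only has values on the coarser grid; the audible difference is in the character of the error, not the number of output levels.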
Ian