Part Number: TLV320AIC34
Is there any documentation or test data describing limit cycles and idle tone generation in the AIC34's ADC? I can observe frequency tones in the ADC conversion result (sampled and reported at 96 kHz) that are not present at the ADC input to the codec. The frequencies depend on the common-mode DC level of the differential inputs: with the ADC inputs shorted together and either left floating or tied to various DC voltages, the value of the DC offset changes the observed frequencies.
If I apply a time-varying DC offset to the shorted differential ADC inputs (my approach to dithering the quantizer), the frequency tones disappear. From the delta-sigma ADC literature this can be considered normal behavior; however, if I knew more about the design of the ADC (the order of the modulator and the number of levels (bits) in the quantizer), I think I could predict when and where the idle tones would appear in the frequency spectrum.
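For what it's worth, here is the toy model I've been using to reason about this. It is a textbook second-order, 1-bit loop (Candy-style double integrator) with my own guesses for everything; the AIC34's actual modulator order, quantizer levels, and oversampling ratio are not published, so this is only a sketch of the dither-vs-static-offset experiment, not a model of the part:

```python
import numpy as np

def modulate(x):
    """Generic 2nd-order, 1-bit delta-sigma loop (double integrator,
    feedback to both stages). Illustrative only -- the AIC34's real
    modulator topology is undocumented."""
    i1 = i2 = 0.0
    y = np.empty(len(x))
    for k in range(len(x)):
        q = 1.0 if i2 >= 0.0 else -1.0   # 1-bit quantizer
        i1 += x[k] - q                   # first integrator
        i2 += i1 - q                     # second integrator
        y[k] = q
    return y

n = 1 << 16
static = 0.05 * np.ones(n)                      # fixed DC offset -> discrete idle tones
dithered = static + 0.01 * np.random.randn(n)   # small dither -> tones smear into noise

for name, x in (("static", static), ("dithered", dithered)):
    y = modulate(x)
    spec = np.abs(np.fft.rfft(y - y.mean()))
    # peak-to-average spectral ratio: rough indicator of how tonal the output is
    print(f"{name:8s} peak/avg spectral ratio: {spec.max() / spec.mean():.1f}")
```

With a static DC input the spectrum shows discrete lines; adding the small random dither spreads them out, which matches what I see on the bench.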
On a related note, if I place the codec and the PCB it is installed on in a thermal chamber, with the common-mode DC signal supplied from a quiet benchtop supply outside the chamber (so it is effectively perfectly stable over temperature), I get a very repeatable change in the frequency of the idle tones. I suspect this can be attributed to a slight change over temperature in the 3.3 V analog supply connected to DRVDD and AVDD_DAC. Would you expect a reference-voltage change on the order of a couple of mV to affect the frequency of the idle tones observed in the ADC conversion result?
I was under the impression that if the DC offset is near 0 V, the idle tones should appear at low frequency. If DRVDD is 3.3 V, should I aim for a DC offset near 0 V or near 1.65 V, the middle of the differential input range? The critical frequency range in which I need to guarantee the absence of idle tones is 5 kHz - 17 kHz.
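This is how I've been exploring that question empirically: sweep a small normalized DC offset through a generic first-order, 1-bit loop, decimate to 96 kHz, and look for the strongest line in the 5-17 kHz band. Again, the topology, the oversampling ratio, and the block-average decimator are all my assumptions, not the AIC34's actual design:

```python
import numpy as np

FS_OUT = 96_000   # output sample rate from my setup
OSR = 64          # assumed oversampling ratio (not published for the AIC34)

def first_order_bits(dc, n):
    """Generic 1st-order, 1-bit delta-sigma model -- an assumption, not the
    AIC34's actual (undocumented) modulator."""
    acc = 0.0
    y = np.empty(n)
    for k in range(n):
        q = 1.0 if acc >= 0.0 else -1.0   # 1-bit quantizer
        acc += dc - q                     # accumulate quantization error
        y[k] = q
    return y

def strongest_tone_in_band(dc, f_lo=5e3, f_hi=17e3, n=1 << 16):
    """Frequency (Hz) of the strongest spectral line in [f_lo, f_hi]
    after decimating the bitstream to FS_OUT."""
    bits = first_order_bits(dc, n)
    # crude decimation: block-average OSR modulator samples down to 96 kHz
    dec = bits[: (n // OSR) * OSR].reshape(-1, OSR).mean(axis=1)
    spec = np.abs(np.fft.rfft(dec - dec.mean()))
    freqs = np.fft.rfftfreq(len(dec), d=1.0 / FS_OUT)
    band = (freqs >= f_lo) & (freqs <= f_hi)
    return float(freqs[band][np.argmax(spec[band])])

# Sweep a few small DC offsets and see where the strongest in-band line lands.
for dc in (0.001, 0.01, 0.1):
    print(f"dc = {dc:>5}: strongest 5-17 kHz line at {strongest_tone_in_band(dc):7.1f} Hz")
```

In this toy model the dominant in-band line moves with the DC offset, which is consistent with what I observe; I just can't map my measured offsets onto the real part without knowing the actual modulator.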
Thanks for any thoughts!