I have a 16-bit ADC that I use on its ±1 V range with a differential channel. If I used the full range of the ADC, it would provide the full resolution (I am aware that non-linearity and noise may affect this), which calculates as:
2^16 = 65536 codes, therefore 2 V / 65536 codes = 30.52 µV per LSB
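For concreteness, here is the same arithmetic as a small Python sketch (the constants are just the numbers from my setup above):

```python
# LSB size for a 16-bit ADC on a +/-1 V differential range (2 V span).
FULL_SCALE_V = 2.0        # +/-1 V differential -> 2 V total span
N_BITS = 16
N_CODES = 2 ** N_BITS     # 65536 codes

lsb_v = FULL_SCALE_V / N_CODES
print(f"LSB size: {lsb_v * 1e6:.2f} uV")  # -> 30.52 uV
```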
However, my signal lies between 0 and 0.6 V, which covers only about a third of the ADC range.
I know I lose resolution and signal-to-noise ratio, but what is the calculation for this? I presume the effective resolution is going to be approximately 90 µV.
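To show what I mean by using only a third of the range, here is how I am counting the codes my signal actually spans (my own variable names, and the 0-0.6 V span is from my signal above):

```python
import math

FULL_SCALE_V = 2.0        # +/-1 V differential range
N_CODES = 2 ** 16         # 65536 codes
SIGNAL_SPAN_V = 0.6       # my signal only covers 0 to 0.6 V

# Number of ADC codes that fall within the signal's span.
codes_used = SIGNAL_SPAN_V / FULL_SCALE_V * N_CODES   # ~19661 codes
effective_bits = math.log2(codes_used)                # ~14.26 bits

print(f"Codes covering the signal: {codes_used:.0f}")
print(f"Effective bits over the signal span: {effective_bits:.2f}")
```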
Please clarify.