I'm working with a DAC and experimenting with removing the zero-scale error (ZSE) - the nonzero output voltage at an input code of zero. I'm on a breadboard, so it isn't particularly accurate, but the ZSE measures 5mV to 11mV.
I set up a differential amplifier as follows - image is copied from a website, but I've confirmed and re-confirmed that my setup matches:
So, I have the following inputs:
V1 = 0.005V (5mV)
V2 = DACoutput
R = 100K. All of them.
Opamp is an LM741, which is all I have on hand; V+ = 5V, V- = ground. It has a max input offset voltage of 6mV.
Given that, I would expect Vout = V2 - V1. Vout drives a 1K resistor to ground.
In reality the resistor values are not exactly 100K, but they are within the 10% spec, so I wouldn't expect Vout to be the exact difference, just 'more or less'.
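As a sanity check on that expectation: assuming an ideal op-amp and the standard four-resistor difference-amplifier formula, even worst-case 10% resistors should only move the output a couple of millivolts from the nominal difference. A quick sketch (the names R1/Rf/R2/Rg are just my labels for the four positions: input and feedback on the inverting side, input and ground leg on the non-inverting side):

```python
# Corner analysis: all four resistors nominally 100k, swept to the
# +/-10% tolerance extremes.  Ideal op-amp assumed.
# Vout = -V1*(Rf/R1) + V2 * (Rg/(R2+Rg)) * ((R1+Rf)/R1)
from itertools import product

V1, V2 = 0.005, 0.011          # volts: offset input and DAC output near zero code
corners = [90e3, 110e3]        # 100k +/-10%

outs = []
for R1, Rf, R2, Rg in product(corners, repeat=4):
    outs.append(-V1 * (Rf / R1) + V2 * (Rg / (R2 + Rg)) * ((R1 + Rf) / R1))

print(min(outs), max(outs))    # worst-case spread, still only millivolts
```

Even at the tolerance corners the ideal output stays within a few millivolts of V2 - V1 - nowhere near volts.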
This is what I'm measuring:
DACout (V2) = 4.06V, Vout = 4.04V
DACout (V2) = 0.011V, Vout = 3.6V
I've tried to work out why I'm still getting a large Vout with a low DACout. Measuring the resistors (values below in kΩ) and plugging them into the differential-amplifier gain formula:
Vout = -V1 * (Rf/R1) + V2 * (Rg/(R2+Rg)) * ((R1+Rf)/R1)
Vout = -0.005 * (99.03/99.5) + 0.011 * (99.21/(99.05+99.21)) * ((99.5+99.03)/99.5)
Vout = (-0.005 * 0.995) + (0.011 * 0.500 * 1.995)
Vout = -0.00498 + 0.01098 = 0.006
So, essentially, my calculation confirms what I would expect the amplifier to do with V1 = 0.005V and V2 = 0.011V: an output of about 6mV, not the 3.6V I'm measuring.
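The arithmetic above can be double-checked with a short script (resistor values in kilohms; R1 and Rf are the inverting-side input and feedback resistors, R2 and Rg the non-inverting-side divider - my labels, matching the formula above):

```python
# Recompute the expected output from the measured resistor values.
R1, Rf = 99.5, 99.03     # kOhm: inverting input and feedback
R2, Rg = 99.05, 99.21    # kOhm: non-inverting divider

V1 = 0.005               # V: offset applied to the inverting side
V2 = 0.011               # V: DAC output near zero code

Vout = -V1 * (Rf / R1) + V2 * (Rg / (R2 + Rg)) * ((R1 + Rf) / R1)
print(Vout)              # about 0.006 V
```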
I can't figure out where I've gone wrong, can anyone give me some pointers?