For every 100 people who tell you that they measure noise, only 1 will give you any detail on how they do it. I've been trying to do it for 35 years, and I still have more doubts than certainties as to what I am seeing. Plus, given that vibration also contributes noise when you are working with position measurement, there is an added layer of uncertainty.
So the question is: what constitutes an ideal setup to measure noise? I'm not talking about anything esoteric, just a band up to 1 MHz. And I'm trying to decipher microvolts in a less-than-ideal engineering environment.
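Just to calibrate what "microvolts" means here, a quick Johnson-noise check (a rough sketch; the source resistances are just example values I picked) of where the thermal floor sits over a 1 MHz band:

```python
import math

# Johnson-Nyquist thermal noise: v_rms = sqrt(4 * k * T * R * B)
k = 1.380649e-23  # Boltzmann constant, J/K
T = 290.0         # room temperature, K
B = 1e6           # measurement bandwidth, Hz (the 1 MHz band in question)

for R in (50.0, 1e3, 10e3):  # example source resistances, ohms
    v_rms = math.sqrt(4 * k * T * R * B)
    print(f"R = {R:8.0f} ohm -> {v_rms * 1e6:5.2f} uV RMS over 1 MHz")
```

That comes out to roughly 0.9 uV at 50 ohms, 4 uV at 1k, and 13 uV at 10k, so at ordinary source impedances a microvolt over this band is already in thermal-floor territory. Some specific things I wonder about: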
1) Line/Power Supply noise: I *think* I'm fighting this a lot here, but again, I'm not sure. Is there a cheap way to measure noise on the line? Or is it better to filter the crap out of it?
2) In the Air: How much do radio waves and other stuff in the air contribute to the noise readings? Is a Faraday cage important or overkill?
3) Out-of-Band Noise: Is it really there, or is it pickup introduced by the way I'm measuring?
Actually, I could go on and on with the list. Right now, the fact that I get different noise measurements on different days, and at different times of day, makes me wonder what I'm really seeing. I once had a tech at MIT state that you should always use the lowest reading, because noise can read higher from stuff that really isn't there, so the lower figure is closer to reality. But I can think of several arguments that cast doubt on that.
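His argument, as I understand it, is that uncorrelated noise sources add in quadrature, so extraneous pickup can only push a reading up, never down. Here's a minimal simulation of that idea (the floor value and pickup levels are made-up numbers, purely for illustration):

```python
import math
import random

random.seed(1)

V_INTRINSIC = 4.0  # assumed "true" noise floor, uV RMS (made-up number)
N_RUNS = 20        # repeated measurements on different days / times of day

readings = []
for _ in range(N_RUNS):
    # Intermittent pickup: often absent, sometimes a few uV of interference
    v_pickup = random.choice([0.0, 0.0, 0.0, 2.0, 5.0, 10.0])
    # Uncorrelated sources add in quadrature, so pickup only inflates readings
    readings.append(math.sqrt(V_INTRINSIC**2 + v_pickup**2))

print(f"mean reading: {sum(readings) / N_RUNS:.2f} uV")
print(f"min  reading: {min(readings):.2f} uV  <- closest to the true floor")
```

Of course, that only holds if the instrument itself never reads low; if the analyzer's bandwidth, detector, or averaging can under-report, the lowest-reading rule falls apart, and that's roughly where my doubts start.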
What are the noise experts out there using?
Thanks...Steve