Hello guys,
I want to implement a digital filter on an ATmega16 microcontroller. I am reading the input signal from an 8-bit ADC (AD9057) into the ATmega16's port C, converting each sample to an ASCII character (0-128) and displaying it on my terminal once per second. However, there is some noise in the output signal that I want to get rid of, so I want to use a digital filter for this purpose, but I am not sure which digital filter to implement to get the desired result.
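For what it's worth, one candidate I have been reading about is a first-order low-pass (an exponential moving average), since it needs no sample buffer at all. Below is a minimal sketch of my understanding of it; the name ema_filter and the 1/8 smoothing factor are just placeholders I picked for illustration, not anything from my actual code:

```c
#include <stdint.h>

/* First-order low-pass (exponential moving average) using the classic
 * integer trick: `acc` settles near 8x the running average, so no
 * floating point and no sample buffer are needed.
 */
static uint16_t acc = 0;

static uint8_t ema_filter(uint8_t sample)
{
    acc = acc - (acc >> 3) + sample;   /* new acc = 7/8 * old acc + new sample */
    return (uint8_t)((acc + 4) >> 3);  /* rounded average (acc is ~8x it)      */
}
```

As I understand it, a larger shift (with the matching scale factor) would smooth more but respond more slowly to real changes in the input.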
Just to give an example: when I apply a constant voltage to the ADC's analog input, I expect a single digital output code, but I get a range of values instead.
For example:
2.0 V gives codes 0-3
2.2 V gives codes 40-44
2.3 V gives codes 65-70
2.5 V gives codes 118-124
In each case the desired output should be just one value, a single stable digital reading (say, the average of those codes), not a range of data.
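To make that concrete, here is a rough sketch of the moving average I have in mind. The window of 8 samples, the names FILTER_LEN and filter_sample, and reading the sample from PINC are all my own assumptions based on the port C wiring:

```c
#include <avr/io.h>
#include <stdint.h>

#define FILTER_LEN 8  /* window size; a power of two keeps the math cheap */

static uint8_t  buf[FILTER_LEN];  /* last FILTER_LEN raw samples */
static uint8_t  pos = 0;          /* index of the oldest sample  */
static uint16_t sum = 0;          /* running sum of the window   */

/* Replace the oldest sample with the newest and return the window average. */
static uint8_t filter_sample(uint8_t sample)
{
    sum -= buf[pos];              /* drop the oldest sample from the sum */
    buf[pos] = sample;            /* overwrite it with the new one       */
    sum += sample;
    pos = (pos + 1) % FILTER_LEN;
    return (uint8_t)(sum / FILTER_LEN);
}

int main(void)
{
    DDRC = 0x00;                  /* port C as input from the AD9057 */
    for (;;) {
        uint8_t raw      = PINC;               /* one 8-bit sample */
        uint8_t filtered = filter_sample(raw);
        /* ... send `filtered` to the terminal instead of `raw` ... */
        (void)filtered;
    }
}
```

With an 8-sample window I would expect the 118-124 spread at 2.5 V to collapse to roughly its mean, at the cost of the output lagging a few samples behind real changes. Is this a reasonable approach, or is there a better filter for this kind of noise?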
Any ideas for this issue?
Regards,
Imed