I have an (alternative) idea for how to implement an FPGA-based DAC, but I haven't been able to find any info on the net...
With the traditional approach of delta-sigma modulating the signal and then lowpass-filtering it, the closer the signal frequency gets to the filter's cutoff frequency, the more it gets attenuated and phase-shifted...
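To put a number on it, for a simple first-order RC reconstruction filter with cutoff $f_c$ (just an illustrative model, the real filter may be different):

$$|H(jf)| = \frac{1}{\sqrt{1 + (f/f_c)^2}}, \qquad \angle H(jf) = -\arctan\!\left(\frac{f}{f_c}\right)$$

so right at $f = f_c$ the tone is already about 3 dB down and lagging by 45°.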
So as an alternative I thought the system could be handled like a control system, but since there is no feedback from the analog output, I would run a model of the LPF inside the FPGA in a feedforward way. That way, if I want to push the modulating frequency closer to the cutoff frequency, I could pre-compensate for the attenuation and phase shift so that (ideally) there is none at the output.
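A minimal sketch of the kind of pre-compensation I have in mind, assuming a first-order RC filter model and a single output tone (the cutoff, tone frequency and sample rate below are made-up placeholders, not my actual design):

```python
import numpy as np

# Hypothetical numbers, for illustration only:
fc = 10e3   # analog RC lowpass cutoff [Hz]
f  = 8e3    # desired output tone, fairly close to fc [Hz]
fs = 1e6    # sample rate of the signal fed into the delta-sigma modulator [Hz]

# First-order RC lowpass model: H(jf) = 1 / (1 + j*f/fc)
H = 1.0 / (1.0 + 1j * f / fc)
gain_correction  = 1.0 / abs(H)     # boost to undo the filter's attenuation at f
phase_correction = -np.angle(H)     # phase lead to undo the filter's lag at f

# Pre-distorted tone: after passing through the real RC filter it should
# come out at (roughly) unit amplitude and zero phase, within model accuracy.
n = np.arange(int(0.01 * fs))       # 10 ms worth of samples
x = gain_correction * np.sin(2 * np.pi * f * n / fs + phase_correction)

# x is what would be fed to the delta-sigma modulator instead of the raw sine.
print(f"|H| = {abs(H):.3f} ({20 * np.log10(abs(H)):.1f} dB), "
      f"phase = {np.degrees(np.angle(H)):.1f} deg")
```

(One obvious cost: the boosted amplitude still has to fit into the modulator's input range, so the closer I push towards the cutoff, the more of the usable full-scale swing this eats up.)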
The reason I'm thinking of doing this instead of just raising the LPF cutoff frequency is that the latter has drawbacks like reduced noise filtering. Compensating for the attenuation and phase shift would of course not be "free" either, but it still seems better than simply not being able to generate the desired signal amplitude or keep the phase shift to a minimum.
Any ideas? Does this make sense?