I've been searching for a while now, but I still can't find an explanation of how microcontrollers convert an analogue input into binary values they can understand, especially at the incredible speeds they achieve.
If anyone could explain the methods, or post a link to a circuit or another post, I would be extremely grateful.