I'm working on a small project that requires a three-minute timer. By default, at power-up an LED comes on. Push a button and the LED goes out for three minutes, then lights again. It's a simple circuit using a few pins on an Arduino Nano, with delay(180000) providing the three minutes. My daughter, a middle school teacher, asked me to build the LED timer (I call it an egg timer) to assist a child in her classroom who has a learning disability.
The delay code is a brute-force way of counting down time. I did some reading on millis() and thought maybe it's the better approach?
I don't really need to manage other tasks while I wait out the time, which is what millis() lets you do (I think). I tried putting together a block of code for the three-minute timer using millis() with little success, so I abandoned the exercise and used delay(). I have a mental Post-it note to follow up and learn how to build a working three-minute timer with millis(). The code would count down whatever time you set, in this case three minutes.
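For anyone following along, this is the sort of non-blocking pattern I understand millis() is meant for, as best I can piece it together. It's only a sketch of the idea, not my working code: the pin numbers are assumptions, and I've stubbed in a fake millis() so it can compile off the board; on a real Nano you'd drop the stub and use the core's millis(), pinMode(), digitalRead(), and digitalWrite().

```cpp
#include <cstdint>

// --- Stand-in for the Arduino core so this compiles off-board.
// On a real Nano, delete these two lines and let the core supply millis().
static uint32_t fakeNow = 0;
uint32_t millis() { return fakeNow; }

const uint32_t TIMER_MS = 180000UL;  // three minutes

bool timing = false;        // true while the countdown is running
uint32_t startMs = 0;       // millis() value when the button was pressed

// Core of the millis() pattern: call this on every pass through loop(),
// handing it the (debounced) button state. Returns true while the LED
// should be OFF, i.e. while the three minutes are still counting down.
// The unsigned subtraction millis() - startMs stays correct even when
// millis() rolls over after ~49 days.
bool ledShouldBeOff(bool buttonPressed) {
  if (!timing && buttonPressed) {   // button press starts the countdown
    timing = true;
    startMs = millis();
  }
  if (timing && millis() - startMs >= TIMER_MS) {
    timing = false;                 // three minutes elapsed: LED back on
  }
  return timing;
}
```

On the board, loop() would just read the button, call ledShouldBeOff(), and digitalWrite() the result to the LED pin; because nothing blocks, the sketch stays free to do other things while the time runs out.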
Any insight a member has to offer is welcome. I apologise up front if my responses to any replies are dumb. I'm a resurrectionist rather than a programmer: I have enough knowledge to kludge together different code parts to create something, but I lack the skills to actually program from scratch.