I will spare you my usual bloviation about accuracy in timekeeping and what “real” time is, as that discussion quickly expands into the universe and my mind ends up feeling like an M.C. Escher sketch. I will, however, say that a 32.768 kHz watch crystal with a +/- 20 ppm frequency tolerance means the full range of this countdown timer (99 hrs, 59 min, 59 sec) could be off by roughly +/- 7.2 seconds in a worst-case scenario. The worst-case error for a 2.5 hour setting would be merely +/- 0.18 seconds, plenty accurate for a kitchen timer. The astute will notice I am not mentioning crystal load capacitors, PCB layout, the fact that a tolerance is not a deterministic offset, or the fact that a kitchen is often a warmer environment than is ideal for accurate timekeeping. Ok – that's it. Not really, but I have to stop now.
The microcontroller is an ATmega16 because I had one around and there were no real plans to make another timer. The display uses three LTD-4708JR seven-segment displays, and the keypad lacks a part number, but I purchased several of them from AllElectronics.com. Nothing is solidly mounted inside the dollar-store Tupperware box yet except for the keypad, which is hot-glued into place. The PCBs were designed in Eagle and etched in my basement laboratory.
The software multiplexes the 7-segment displays and also scans the keypad matrix for button presses. The pound key starts the timer and the asterisk resets it. Time is entered in HH:MM:SS format with no error checking at the moment. That is to say, if you enter 90 and start the timer, the display will jump to 1 min 30 sec. This happens because the software converts whatever you enter into a uint32_t SecondsToGo, then reformats that value back to HH:MM:SS for display purposes.
I could make some lame joke about not releasing the source code because this kitchen timer is only a resistor, transistor, diode, and relay away from being a really sweet-looking bomb detonator, but the truth is that the software is commented worse than a Boxxy YouTube video.