Therefore, I have concentrated on writing code that I can compile and test as a native binary on Linux, and then (presumably) run on the Arduino with the same behavior.
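Roughly, the idea is that only the code that actually samples the signal needs to know which platform it is on; something like this sketch (function name and pin assignment are illustrative only, not necessarily how my code is organized):

/* Only the routine that samples the receiver output differs between
 * targets; the decoder itself is plain C that builds the same way on
 * both.  (Illustrative sketch; the pin choice is made up.) */
#include <stdint.h>

#ifdef __AVR__
#include <avr/io.h>
static uint8_t read_signal(void)
{
    return (PIND & _BV(PD2)) ? 1 : 0;   /* receiver output on PD2, for example */
}
#else
#include <stdio.h>
static uint8_t read_signal(void)
{
    int c = getchar();                  /* test harness feeds '0'/'1' samples on stdin */
    return c == '1';
}
#endif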
Last night, for the first time, the microcontroller set its time correctly from the received signal. Then for the next two hours I tried to determine what bug I had introduced that prevented it from setting a second time, or what wiring fault I had caused, or what have you.
Ultimately, I figured out that the first time was dumb luck: due to two serious errors in my code (not covered by my test harness), the device would only set if it started running somewhere in the first 1/10 of a WWVB second or so. With those bugs fixed, it set 4 more times.
Next, I worked on leap seconds and DST details. My test harness shows it properly handling "special" times related to these flags:
tests/2008leapsecond
set time 2008/366 23:58:59 ly=1 ls=1 dst=0
2008/366 23:59:00.0000 ly=1 ls=1 dst=0   2008/366 17:59:00.041 CST
2008/366 23:59:59.0000 ly=1 ls=1 dst=0   2008/366 17:59:59.000 CST
2008/366 23:59:60.0000 ly=1 ls=1 dst=0   2008/366 17:59:60.000 CST   <-- Leap second
2009/001  0:00:00.0000 ly=0 ls=0 dst=0   2008/366 18:00:00.000 CST
2009/001  0:00:59.0000 ly=0 ls=0 dst=0   2008/366 18:00:59.000 CST

tests/enddst
set time 2000/303 6:58:59 ly=0 ls=1 dst=1
2000/303  6:59:00.0000 ly=0 ls=1 dst=1   2000/303  1:59:00.041 CDT
2000/303  6:59:59.0000 ly=0 ls=1 dst=1   2000/303  1:59:59.000 CDT
2000/303  7:00:00.0000 ly=0 ls=1 dst=1   2000/303  1:00:00.000 CST   <-- Switch to CST
2000/303  7:01:59.0000 ly=0 ls=1 dst=1   2000/303  1:01:59.000 CST

tests/startdst
set time 2000/093 7:58:59 ly=0 ls=1 dst=2
2000/093  7:59:00.0000 ly=0 ls=1 dst=2   2000/093  1:59:00.041 CST
2000/093  7:59:59.0000 ly=0 ls=1 dst=2   2000/093  1:59:59.000 CST
2000/093  8:00:00.0000 ly=0 ls=1 dst=2   2000/093  3:00:00.000 CDT   <-- Switch to DST
2000/093  8:00:59.0000 ly=0 ls=1 dst=2   2000/093  3:00:59.000 CDT

tests/endyear-local-leap
set time 2005/001 5:58:59 ly=0 ls=0 dst=0
2005/001  5:59:00.0041 ly=0 ls=0 dst=0   2004/366 23:59:00.041 CST
2005/001  5:59:59.0000 ly=0 ls=0 dst=0   2004/366 23:59:59.000 CST   <-- Yesterday was day 366
2005/001  6:00:00.0000 ly=0 ls=0 dst=0   2005/001  0:00:00.000 CST

tests/endyear-local-nonleap
set time 2004/001 5:58:59 ly=0 ls=1 dst=0
2004/001  5:59:00.0041 ly=0 ls=1 dst=0   2003/365 23:59:00.041 CST
2004/001  5:59:59.0000 ly=0 ls=1 dst=0   2003/365 23:59:59.000 CST   <-- Yesterday was day 365
2004/001  6:00:00.0000 ly=0 ls=1 dst=0   2004/001  0:00:00.000 CST
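For reference, the rules these tests exercise amount to something like the following sketch. This is illustration only, not code from the repository; the dst values follow what the harness prints above, and I'm taking dst=3 to mean DST fully in effect even though no test above shows it.

/* Sketch: how many seconds the current UTC minute has.  A leap second
 * appears as 23:59:60 UTC in the last minute of the month (June 30 or
 * December 31) when the broadcast leap-second flag is set, as in
 * tests/2008leapsecond above. */
static int seconds_in_minute(int ls, int last_minute_of_utc_month)
{
    return (ls && last_minute_of_utc_month) ? 61 : 60;
}

/* Sketch: UTC offset in hours for US Central time, given the decoded DST
 * field: 0 = standard time, 1 = DST ends today, 2 = DST begins today,
 * 3 = DST in effect (assumed).  Changeovers happen at 2:00 local time,
 * which is why tests/enddst switches at 7:00 UTC and tests/startdst at
 * 8:00 UTC.  hour_cst is the UTC hour minus 6 (day wrap ignored here). */
static int utc_offset_hours(int dst, int hour_cst)
{
    switch (dst) {
    case 3:  return -5;                       /* CDT all day                      */
    case 2:  return hour_cst >= 2 ? -5 : -6;  /* spring forward at 2:00 CST       */
    case 1:  return hour_cst >= 1 ? -6 : -5;  /* fall back at 2:00 CDT (1:00 CST) */
    default: return -6;                       /* CST all day                      */
    }
}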
You may have noticed that the initial times all end with .041 seconds. This is an estimate of the delays, from all sources, that would otherwise make the clock run slow compared to the true NIST WWVB second. There are three major components to this delay: propagation delay from the transmitter to the receiver (under 5ms at my location), processing delay in the receiver module (the CMMR-6 datasheet only gives "Output pulse widths tolerance ±35ms"), and processing delay in the microcontroller (a 10ms digital filter). My program can hardcode any desired delay, and the test harness arbitrarily uses 41ms. I don't know whether I have any good way to measure the actual delay.
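Applying that correction is the easy part: once a frame has decoded, the resulting time just gets nudged forward by the configured delay. A sketch (the names here are made up, and 41ms is only the value the tests use):

#include <stdint.h>

/* Sketch: compensate the decoded time for the receive-chain delay
 * (propagation + receiver module + digital filter).  41 ms is the
 * arbitrary value the test harness uses. */
#define RX_DELAY_MS 41

struct decoded_time {
    uint32_t seconds;  /* whole seconds of the decoded time     */
    uint16_t millis;   /* milliseconds past that second, 0..999 */
};

static void apply_rx_delay(struct decoded_time *t)
{
    t->millis += RX_DELAY_MS;
    if (t->millis >= 1000) {
        t->millis -= 1000;
        t->seconds += 1;
    }
}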
As Chris points out, this is a lot of work for a "clock" that has no display.
The code I've developed so far lives in git: https://github.com/jepler-attic/wwvbdecode-2011