Sigh

I made some major changes to the way data gets loaded into my navaid.com waypoint generator database, mostly in the processing of the “combined user data”. The main goal was to make sure that if “Bob” provides me some data on Canadian airfields that includes communications frequencies but no runway data, it doesn’t wipe out the runway data from the dataset of Ontario airfields that “Alice” provided me last year, but only updates the fields that have actually changed in the overlapping part of those two datasets. Add in the possibility that a waypoint might have changed identifier or been resurveyed so the location has shifted a bit, and you can see that there are a lot of cases to consider.
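To make that concrete, here’s a minimal sketch of the field-level merge idea, assuming hypothetical field names and much-simplified records (this isn’t the actual loader code):

```perl
use strict;
use warnings;

# Merge an incoming waypoint record into an existing one, field by field.
# Only fields the new source actually supplies (defined and non-empty)
# overwrite what's already there, so Bob's frequency-only data can't
# wipe out the runway data Alice provided.
sub merge_waypoint {
    my ($existing, $incoming) = @_;
    my %merged = %$existing;
    for my $field (keys %$incoming) {
        my $value = $incoming->{$field};
        next unless defined $value && $value ne '';
        $merged{$field} = $value;
    }
    return \%merged;
}

my $alice = { ident => 'CYXU', runway_len => 8800, freq => undef };
my $bob   = { ident => 'CYXU', runway_len => undef, freq => '119.30' };
my $merged = merge_waypoint($alice, $bob);
# $merged keeps runway_len => 8800 and gains freq => '119.30'.
```

The hard part, matching records whose identifier has changed or whose surveyed position has drifted, needs a fuzzier lookup before the merge even starts; that’s where most of the cases come from.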

Unfortunately, considering all these possibilities is time-consuming. I’ve been testing these new scripts with a dataset from one person that covers the entire UK and some nearby locations in varying levels of detail, and another that covers Ireland in great detail but which is unfortunately no longer being updated because the person who provided it moved. Running both datasets would be an overnight job.

Now that I’m satisfied with those results, I decided it was time to reload the old DAFIF data through these scripts to get the combined user data exactly the way I want it. That has turned up a couple of bugs in the scripts, one of which only manifested itself after 36 hours or so of running. That one didn’t even give me enough information to tell why it failed, so I had to add some “use Carp” and “use Data::Dumper” magic to my scripts, and then I re-ran it and found the actual cause after another 36-hour run. I’ve been almost continually running load scripts all week. I’m hoping this run will be it, but I’m not sure.
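For anyone unfamiliar with those modules: Carp’s confess() dies with a full stack trace, and Data::Dumper pretty-prints the data structure that caused the trouble. A generic sketch of the kind of check I mean (not the actual load script):

```perl
use strict;
use warnings;
use Carp qw(confess);
use Data::Dumper;

sub load_record {
    my ($record) = @_;

    # confess() dies with a full stack trace, and Dumper() shows the
    # offending record, so a failure 36 hours into a run explains
    # itself instead of just dying quietly.
    confess 'record has no ident: ' . Dumper($record)
        unless defined $record->{ident};

    # ... the real load work would go here ...
}
```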

Since my new home box is so fast, I’m thinking one possibility might be to do the load processing on it, then mysqldump the database and bring the dump file up to the colo.
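Roughly, that workflow would look like this (database name and hostname invented for illustration):

```sh
# On the fast home box: run the load scripts locally, then dump the result.
mysqldump --opt navaid_db > navaid_db.sql

# Ship the dump to the colo machine and load it there.
scp navaid_db.sql colo.example.com:
ssh colo.example.com 'mysql navaid_db < navaid_db.sql'
```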