There was a play entitled ‘A Funny Thing Happened on the Way to the Forum’ from which I borrowed this title. The play’s title, according to Wikipedia, borrowed from an old vaudevillian line used to introduce a joke that had nothing to do with the theatre. This seems the most apt description of my recent presentation at FOSDEM 2019 related to DNS Flag Day.
Let’s start with DNS Flag Day. This was an event that occurred on 1 February 2019. It was a day when DNS software developers and public DNS recursive service operators discontinued using software that covered up errors made by DNS authoritative service operators.
In the run-up to DNS Flag Day, a savvy software developer produced a test script that would evaluate how a domain would fare post-flag day. In other words, would the authoritative service ‘break’ in the eyes of the reformed recursive servers?
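To make the idea concrete, here is a minimal sketch (not the actual flag-day test script) of the kind of probe such a test performs: send a plain DNS query and an EDNS-enabled query to an authoritative server, then compare the outcomes. A server that answers plain queries but mishandles EDNS is one that would ‘break’ post-flag day. Packet construction uses only the Python standard library; the verdict labels are my own shorthand, not the script’s official output.

```python
import struct

def build_query(qname, qtype=1, use_edns=False):
    """Build a raw DNS query packet; optionally append an EDNS0 OPT record."""
    # Header: ID, flags (RD set), QDCOUNT=1, ANCOUNT=0, NSCOUNT=0, ARCOUNT
    header = struct.pack('!HHHHHH', 0x1234, 0x0100, 1, 0, 0,
                         1 if use_edns else 0)
    # Question section: length-prefixed labels, then QTYPE and QCLASS=IN
    question = b''.join(
        bytes([len(label)]) + label.encode('ascii')
        for label in qname.rstrip('.').split('.')
    ) + b'\x00' + struct.pack('!HH', qtype, 1)
    packet = header + question
    if use_edns:
        # OPT pseudo-RR (RFC 6891): root name, TYPE=41, UDP size 4096,
        # extended flags/TTL zero, no options
        packet += b'\x00' + struct.pack('!HHIH', 41, 4096, 0, 0)
    return packet

def classify(plain_ok, edns_ok):
    """Rough verdict in the spirit of the flag-day test (my labels)."""
    if plain_ok and edns_ok:
        return 'compliant'
    if plain_ok and not edns_ok:
        return 'breaks post-flag-day'  # answers plain queries, chokes on EDNS
    return 'dead or unreachable'
```

In practice the real test sends many variations (unknown EDNS options, unusual flags, over UDP and TCP) per nameserver; this sketch shows only the basic plain-versus-EDNS comparison.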
The developer had been working on this script for years (at least as far back as 2014) and used it to get operators to address issues ‘high up’ in the DNS tree of names. A couple of DNS Top-Level Domain (TLD) operators, all from the country-code community (ccTLD), decided to bring the test a little deeper into the tree, the results of which they presented at RIPE 77 in October 2018.
Listening to this presentation at the time got me thinking, ‘wouldn’t it be interesting to run this over the commercial generic TLD (gTLD) zones?’ — a most apt presentation for the upcoming FOSDEM event.
A ccTLD usually has one or a small handful of zones, and possibly a million registrations, give or take an order of magnitude or two (that is, roughly 10 thousand to 10 million). Running the script might take a day or two for such a zone, depending on how aggressively the test was conducted. The goal was then for the registry to contact the registrants and let them know how they stood.
Within ICANN, there are over 1,200 commercial gTLD zones. The largest has over 100 million registrations, 11 others have over 1 million, and then there is a ‘long tail’ of many small zones. That alone is a lot to test.
In addition, we don’t have the contact information (that is held by others) that a ccTLD would have for its registrants.
(Therein lie the reasons why this is ‘a funny thing that happened on the way…’ and not a story about the testing room, the forum, or the theatre.)
While I had three months until DNS Flag Day to test such a large volume of zones, competing work and the end-of-year holiday break inevitably made the task less feasible. Adding to this, given the FOSDEM presentation itself would be delivered on 3 February, two days after the flag day, any warnings the test might generate would ultimately be too late to help operators. As such, my presentation focused instead on the testing process.
Processing the process
The presentation itself would make (only) a measurement geek and a stat-head smile, and while I encourage you to watch the video recording, here are a few highlights:
- 1,228 zones tested
- 193 million registrations
- 457 million NS records
- Around 35 million DNS queries made (if I had run it ‘full out’ I would have launched about 11 billion DNS queries)
- Internationalized Domain Name (IDN) and IPv6 use levels are still very low relative to ASCII and IPv4.
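The gap between the roughly 35 million queries actually sent and the roughly 11 billion a ‘full out’ run would have launched comes from the per-nameserver fan-out of the test. A back-of-envelope sketch follows; the per-NS probe count of ~24 is my assumption based on the style of EDNS compliance tests, not a figure from the presentation.

```python
# Rough arithmetic behind the query-volume figures above.
NS_RECORDS = 457_000_000    # NS records found across the tested zones
PROBES_PER_NS = 24          # assumed: plain, EDNS, odd flags, unknown
                            # options, over UDP and TCP, and so on

full_out = NS_RECORDS * PROBES_PER_NS
print(f'full out: ~{full_out / 1e9:.1f} billion queries')

actual = 35_000_000
print(f'actually sent: {actual / full_out:.1%} of a full-out run')
```

With those assumptions, a full-out run lands at about 11 billion queries, matching the figure above, and the 35 million actually sent amount to well under 1% of that.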
(In a sense, having glossed over the results of my presentation, this blog post ought to be entitled ‘A funny thing happened on the way to a funny thing that happened on the way to…’)
One aspect of the testing that deserves a mention is that I had a goal to keep the time for ‘taking data’ to less than 24 hours. The zone files I used as input are delivered each day. While the data is ‘on disk’ it is getting stale, as there are constant changes to zones. Speeding from the data’s arrival to the testing is a priority; less so, performing the analysis. Through the use of a set of virtual machines and running parallel data-taking scripts, I managed to keep the collection under 24 hours, with about two days to do the analysis.
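The parallel data-taking approach can be sketched as sharding the zone list across concurrent workers so collection fits in the 24-hour window. This is a simplified illustration, not the actual scripts used; `probe_zone` is a stand-in for the real per-zone test, and threads suit this shape of work because it is network-bound.

```python
from concurrent.futures import ThreadPoolExecutor
import time

def probe_zone(zone):
    """Stand-in for the real test: query each NS of the zone, record results."""
    time.sleep(0.01)  # simulate network latency of the real probes
    return zone, 'ok'

def collect(zones, workers=8):
    """Probe zones in parallel; total wall time is roughly
    (len(zones) / workers) * per-zone time instead of a serial pass."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return dict(pool.map(probe_zone, zones))
```

The same sharding idea extends across virtual machines: split the 1,228 zones into per-machine batches, run a pool like this on each, and merge the result files afterwards for the slower analysis phase.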
Was it worth the effort?
If the testing wasn’t about the results, was it worth the effort? This is an essential question for any research — is there a return on the investment?
For this work, I built upon some existing code that examines the DNS on a regular basis. For the testing, I extended the code base to perform many other measurements.
I have since torn down some of the capabilities, namely the machines used to launch queries and collect results, but the zone file parsing and analysis code is prepared to produce other data products. These products haven’t progressed just yet, due to other projects ‘stealing time’, but I’d be happy to share the findings with those interested — comment below.
Edward Lewis is a Senior Technologist in ICANN’s Office of the CTO.
The views expressed by the authors of this blog are their own and do not necessarily reflect the views of APNIC. Please note a Code of Conduct applies to this blog.