In this episode of PING, Christian Huitema discusses how looking into the IETF Datatracker gives a unique perspective on the current state of document production.
As the IETF has grown, and as the process of developing standards has become more complex, it’s understandable that producing a viable RFC takes longer, but there are questions about exactly where in the process the delays come from. Is it actually better or worse than it used to be, and why might that be?
Christian took an interesting approach to the problem, initially drawing a random sample of 20 documents from 2018 and manually collating the issues, then applying the same methodology to documents from 2008 and 1998. His approach to measurement was rigorous and careful, separating his own opinions from the underlying data to aid reproducibility.
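Christian explains the approach in more depth in the episode. As a rough illustration only, here is a minimal Python sketch of the cohort-sampling idea, not his actual tooling: the file name `rfc_index.csv` and its `rfc`/`pub_date` columns are hypothetical stand-ins for an export of RFC publication data (for example, from the IETF Datatracker), and a fixed random seed keeps the sample reproducible, echoing his emphasis on separating method from opinion.

```python
# Minimal sketch (assumptions noted above): draw a fixed-size random sample
# of RFCs per publication year, with a fixed seed so the sample is
# reproducible. Expects a CSV with columns: rfc, pub_date (ISO dates).
import csv
import random
from collections import defaultdict

SAMPLE_SIZE = 20
COHORT_YEARS = (1998, 2008, 2018)

def sample_cohorts(csv_path: str, seed: int = 42) -> dict[int, list[str]]:
    # Bucket document identifiers by publication year.
    by_year: dict[int, list[str]] = defaultdict(list)
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            year = int(row["pub_date"][:4])  # assumes YYYY-MM-DD dates
            if year in COHORT_YEARS:
                by_year[year].append(row["rfc"])
    rng = random.Random(seed)  # fixed seed -> reproducible sample
    # Guard against cohorts smaller than the requested sample size.
    return {
        year: sorted(rng.sample(docs, min(SAMPLE_SIZE, len(docs))))
        for year, docs in by_year.items()
    }

if __name__ == "__main__":
    for year, docs in sample_cohorts("rfc_index.csv").items():
        print(year, docs)
```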
Christian has a deep history of network development and research, with experience in the Internet industry and at the French national computing research institute INRIA before joining Bell Communications Research and then Microsoft. He worked on OSI systems, X.500 directories, satellite communications, the IPv6 stack (including the ‘Teredo’ transition technology), the HD-Ratio used in determining IPv6 allocations and assignments in the RIR model, and the QUIC transport layer protocol.
Read more research and insight related to the IETF on the APNIC Blog, and find more of Christian’s research and technology development ideas on his own blog.
Subscribe and share your story
You can stream and subscribe to PING via the following channels:
If you’re interested in sharing your insights or research, please get in touch — we’re always looking for great stories from the community. And please let us know what you think of the podcast and the APNIC Blog so we can keep improving.
The views expressed by the authors of this blog are their own and do not necessarily reflect the views of APNIC. Please note a Code of Conduct applies to this blog.