
Modern networks are more complex than ever. Operators today juggle multiple protocols — old and new, simple and encrypted, wired and wireless — across heterogeneous environments. Yet despite this complexity, the specifications guiding these protocols are often ambiguous, informal, or incomplete.
Most Internet standards are still written in prose and pseudocode, which is excellent for readability but open to interpretation. Different implementers may read the same standard differently, leading to interoperability issues, hidden vulnerabilities, and unexpected edge-case behavior.
Our paper ‘It Takes a Village: Bridging the Gaps Between Current and Formal Specifications for Protocols’ (published in Communications of the ACM) argues that it’s time to bring formal methods — mathematically precise models of protocols — into the network engineer’s toolkit. The goal isn’t to replace traditional specifications, but rather to complement them by making protocols more reliable without making standards unreadable.
Barriers to applying formal methods to network protocols
Bringing formal methods to the operational Internet is harder than it sounds. The paper identifies three main barriers:
- Cultural divide: Protocol engineers prioritize readability, flexibility, and implementation freedom. Formal methods researchers prioritize precision and proof.
- Tool fragmentation: Dozens of formal frameworks exist (for example, Tamarin, Alloy, Coq, Spin, and IVy), each using different languages and assumptions. Few are intuitive for engineers.
- Scale and drift: Protocols like Transport Layer Security (TLS), QUIC, and the 5G Authentication and Key Agreement (5G-AKA) are vast and evolving. Formalizing them fully is daunting. Moreover, implementations often drift away from the specification, effectively becoming de facto models of their own.
Despite these obstacles, the authors found success with lightweight, incremental formalization. The key was to start small, focus on core components, and use tools to test real implementations rather than trying to prove entire systems correct.
Success from a lightweight approach
Applying ‘lightweight’ formal methods to networking protocols has been successful, revealing correctness issues and security vulnerabilities.
Tamarin has been used to find weaknesses in 5G-AKA. Alloy models detected ambiguous assumptions in TLS 1.3 drafts, and revealed that Chord does not provide all the guarantees it claims to.
Randomized compositional testing revealed interoperability gaps in QUIC implementations and early specifications. Promela models allowed Spin to detect race conditions in the Session Initiation Protocol (SIP). Software Defined Networking (SDN)-based tools were used to formally specify and verify low-level implementations of network infrastructure.
These methods are practical because they require only partial formalization and can be incrementally expanded as the protocol evolves.
A little bit of formal methods goes a long way
Formal methods already add operational value. Even small models reveal ambiguities invisible in prose. Applying these techniques to QUIC, for example, uncovered inconsistencies in the standard as well as implementation divergences that caused real-world connection issues.
Adoption, not scale, is the challenge. The complexity of modern protocols is less problematic than the usability of tools and the lack of cross-disciplinary collaboration.
Lightweight formal methods can go a long way. Network operators don’t need theorem provers; they need tools that can automatically test conformance against formalized message sequences.
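As a very rough illustration (the toy handshake, its message names, and the conforms helper below are hypothetical, not taken from the paper or any real protocol), a formalized message sequence can be as small as a table of allowed transitions that a test harness replays an observed trace against:

```python
# A minimal sketch of conformance checking against a formalized message
# sequence. The protocol, states, and messages are hypothetical; a real
# model would be derived from the specification.

# The 'formal model': allowed (state, message) -> next-state transitions
# for a toy client handshake.
MODEL = {
    ("idle", "HELLO"): "waiting",
    ("waiting", "CHALLENGE"): "challenged",
    ("challenged", "RESPONSE"): "established",
    ("established", "CLOSE"): "idle",
}

def conforms(trace, start="idle"):
    """Replay an observed message trace against the model.

    Returns (True, final_state) if every message is allowed from the
    current state, or (False, reason) at the first deviation.
    """
    state = start
    for message in trace:
        nxt = MODEL.get((state, message))
        if nxt is None:
            return False, f"unexpected {message!r} in state {state!r}"
        state = nxt
    return True, state

if __name__ == "__main__":
    # A conforming trace, and one that skips the CHALLENGE step.
    print(conforms(["HELLO", "CHALLENGE", "RESPONSE", "CLOSE"]))
    print(conforms(["HELLO", "RESPONSE"]))
```

Even a toy check like this makes the specification's expectations executable: the second trace fails immediately, and the failure message says exactly where the implementation deviated.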
Next-generation protocols require tools the formal methods community has yet to develop. Only collaboration among all stakeholders can ensure that these tools will be usable and useful enough to handle the new challenges.
For operators, the implication is clear: Formal specifications can directly improve network reliability by turning vague text into executable, testable models. Operators should collaborate with the research community to develop the right tools.
A roadmap for collaboration between network engineers and researchers
The paper’s unique contribution is a roadmap for collaboration between engineers, tool builders, and researchers. Instead of reinventing protocol design, it proposes weaving formal techniques into existing processes:
- Embed formalization in standardization: Include machine-readable models alongside English prose in standards drafts.
- Automate conformance testing: Derive test suites directly from formal models (see the sketch after this list).
- Lower the barrier to entry: Build simpler, domain-specific tools tailored to network engineers.
- Encourage shared validation: Collaborate with industry and academic counterparts to co-analyze drafts before publication.
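To make the conformance-testing item concrete, here is an equally rough sketch (again using a hypothetical toy handshake model and a made-up derive_tests helper, not anything from the paper) of deriving a bounded test suite directly from a formal model by enumerating every message sequence the model allows up to a fixed length:

```python
# A minimal sketch of deriving a test suite from a formal model: enumerate
# every message sequence the model allows, up to a bounded length. The toy
# handshake here is hypothetical; a RETRY branch is included so the suite
# actually contains alternatives.
MODEL = {
    ("idle", "HELLO"): "waiting",
    ("waiting", "CHALLENGE"): "challenged",
    ("waiting", "RETRY"): "idle",
    ("challenged", "RESPONSE"): "established",
    ("established", "CLOSE"): "idle",
}

def derive_tests(max_len, start="idle"):
    """Breadth-first enumeration of all allowed message sequences."""
    suite = []
    frontier = [(start, [])]
    for _ in range(max_len):
        next_frontier = []
        for state, seq in frontier:
            for (src, msg), dst in MODEL.items():
                if src == state:
                    extended = seq + [msg]
                    suite.append(extended)
                    next_frontier.append((dst, extended))
        frontier = next_frontier
    return suite

if __name__ == "__main__":
    for case in derive_tests(max_len=4):
        print(" -> ".join(case))
```

In practice, each generated sequence would drive an implementation under test, and the same model would serve as the oracle that judges the responses; the point is that the test suite falls out of the model rather than being written by hand.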
Formalization isn’t just academic; it’s a practical engineering tool.
Tangible benefits, today and into the future
For network operators, this work translates into tangible benefits.
Tests generated automatically from formal models can improve reliability by ensuring consistent behavior across devices and vendors. When observed behavior deviates from what the model expects, the formal specification can speed up troubleshooting by pinpointing where to look.
Formal analysis can catch handshake or replay flaws before they reach deployment, improving security posture. Vendors can demonstrate accountability by validating conformance more transparently.
In the longer term, the integration of formal methods could reshape protocol engineering.
Standards bodies like the Internet Engineering Task Force (IETF) might provide reference models alongside prose RFCs. Vendors could advertise verified compliance as a differentiator. Operators could run automated conformance tests as part of routine maintenance.
Ultimately, the message of the research is pragmatic and hopeful. It takes a village — engineers, formalists, tool developers, and operators — to make network protocols both rigorous and usable.
More details are available in our paper, ‘It Takes a Village: Bridging the Gaps Between Current and Formal Specifications for Protocols’, published in the Communications of the ACM.
Lenore D. Zuck is a computer scientist whose research involves formal methods in software engineering, as well as information privacy. She is a research professor of computer science at the University of Illinois Chicago.
The views expressed by the authors of this blog are their own and do not necessarily reflect the views of APNIC. Please note a Code of Conduct applies to this blog.