Trust, but verify: An engaged community is the best defence

6 May 2021

Categories: Community, Tech matters


You may have seen this:

The tweet refers to a story doing the rounds in the software development community.

Computer science researchers at the University of Minnesota (UMN), interested in whether bad code could be introduced into the Linux kernel, abused the submission process, faking supportive reviews, to insert code deep in the operating system’s codebase. The experiment raised serious issues of ethical review, safety, and the use of a public commons for experimentation, and it ultimately ended with the university’s submission rights being revoked in a very public display of displeasure.

Suffice it to say, it caused quite an uproar.

Rightly so. The intent might have been good (to expose problems in the review chain), but the method was awful. Imagine someone probing gaps in airport security by smuggling in real bombs. Yes, they may have proved their point, but there were better ways to achieve the same result.

That said, it is important to sit down and give the point they were making some serious thought. If the underlying story here is what it seems to be, then the process controls over source code that is managed in common can have remarkably low bars at times. Normally, submitted patches and ‘pull requests’ are expected to meet some minimum bar, and there are (semi-)automated checks for code quality, integration, and builds.

But in the end, legal, well-written, yet malicious code can exist, and it could be ‘sneaked past’ the process.
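To see how little it can take, consider a minimal, hypothetical C sketch (the struct and function names are invented for illustration, not taken from the kernel) in which a single missing character turns a permission check into a privilege escalation, the sort of change that reads as a plausible typo in review:

```c
#include <stdio.h>

/* Hypothetical illustration only: names are invented, not kernel code.
 * A single missing '=' turns a comparison into an assignment, so the
 * 'check' silently grants uid 0 (root) instead of testing for it. */
struct task {
    int uid;
};

static int is_root(struct task *t)
{
    /* Intended: if (t->uid == 0)
     * As written, the assignment sets uid to 0; the expression then
     * evaluates to 0, so the branch is never taken and the function
     * appears to behave normally in casual testing. */
    if (t->uid = 0)
        return 1;
    return 0;
}

int main(void)
{
    struct task t = { .uid = 1000 };
    is_root(&t);
    printf("uid after check: %d\n", t.uid);  /* prints 0: escalated */
    return 0;
}
```

A reviewer skimming the diff sees a routine uid check; the compiler accepts it (at most emitting a warning), and most test suites would not notice that the caller’s uid has been silently zeroed.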

Is this a new story? No, it’s not. One of the creators of UNIX, Ken Thompson (of Thompson and Ritchie fame), wrote about this in his 1984 paper, Reflections on Trusting Trust [PDF], which explored the innate trust we place in compilers as the fundamental tool for turning source code into machine instructions.

As Thompson points out, if the compiler itself is written to insert code into your program, then inspecting the source tells you nothing. Logic checks on the semantic intent of a code patch don’t matter, because the backdoor is being inserted inside the compilation process underneath.
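To make the idea concrete, here is a toy sketch, not Thompson’s actual code and grossly simplified, of a compromised ‘compiler’ that recognizes when it is compiling a login-like program and splices a backdoor into its output; functions such as looks_like_login() and the ‘letmein’ password are invented for illustration:

```c
#include <stdio.h>
#include <string.h>

/* Toy illustration of Thompson's 'trusting trust' idea (not his actual
 * code): a compromised "compiler" watches what it is compiling and
 * quietly splices a backdoor into the output. The source being compiled
 * is completely clean; only the compiler binary carries the attack. */

/* Hypothetical check: does this source look like the login program? */
static int looks_like_login(const char *source)
{
    return strstr(source, "check_password") != NULL;
}

/* Stand-in for real code generation: here we just echo the source,
 * but the compromised compiler appends an extra function as well. */
static void compile(const char *source, FILE *out)
{
    fputs(source, out);
    if (looks_like_login(source)) {
        /* Injected logic: accept a hard-coded master password. */
        fputs("/* injected */ int backdoor(const char *p) "
              "{ return strcmp(p, \"letmein\") == 0; }\n", out);
    }
}

int main(void)
{
    const char *clean_source =
        "int check_password(const char *p) { return verify(p); }\n";
    compile(clean_source, stdout);  /* output contains the backdoor */
    return 0;
}
```

Thompson’s deeper point is that the same trick can be applied to the compiler’s own source, so even recompiling a clean compiler with the compromised binary reproduces the backdoor, and no amount of source inspection will reveal it.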

Is this a problem? Well… yes and no.

Yes, because ultimately trust has to be highly qualified, and we depend on a mixture of both good intent and rigorous checking by people who are trusted by the community.

No, because the system seems to work. Although the UMN researchers did smuggle their patches through, their test failed because they were caught.

The problem is not confined to source code, either: the supply chains behind significant building blocks of the machine can also be called into question. Examples abound. Take this case, involving microchips that some claimed were designed with a back door. The claim was highly contested, but the underlying principle, that implicit trust in chips and their designs can be abused, has to be taken seriously.

Any supply chain could be at risk. You can’t predict where the weakness is. All you can do is maintain vigilance. And, if you want to avoid ‘quis custodiet ipsos custodes?’ (who will guard the guards themselves?), perhaps the answer isn’t to restrict who sees the code, but to get more eyes on the game, and more people who will look out for bad actors.

Following this incident, UMN conducted a major internal review and completely changed its posture on the problem. The researchers apologized, withdrew the research from publication, and their submitted patches are being (cautiously) reconsidered, under scrutiny.


The views expressed by the authors of this blog are their own and do not necessarily reflect the views of APNIC. Please note a Code of Conduct applies to this blog.

One Comment

  1. Rich

    I’m not a lawyer, but I would think that what they did could be criminally prosecuted as a violation of US computer hacking laws. Maybe someone should ask the US attorney’s office in Minnesota about that.

Actions are the evidence of intent. It’s credulous to believe that their intentions might have been good when all we can see are objectively bad actions. Someone with good intentions would have obtained permission from the right people first.

    “Oh no, officer. I wasn’t really trying to sneak that loaded gun onto the airplane. I just wanted to see whether the security screeners were good at their jobs.” Yeah, right.

These kinds of threats against Linux and its distros are real. I remember a few years ago the Linux Mint website got hacked and the ISO downloads were replaced with ones that gave the hackers backdoor access to any machine that used them. I suppose it’s the price of popularity, but we shouldn’t tolerate it.

