The Right to Lie: Google’s “Web Environment Integrity” Proposal is a Geyser of Badness Threatening to Swamp the Open Web.

If your computer can’t lie to other computers, then it’s not yours.

This is a fundamental principle of free and open source software. The World Wide Web abides by this principle, although we don’t often think of it that way. The Web is just an agreed-on set of programmatic interfaces: if you send me this, I’ll send you that. Your computer can construct the “this” by whatever means it wants; it’s none of the other side’s business, because your computer is not their computer.

Google’s so-called “Web Environment Integrity” plan would destroy this independence. “Integrity” is exactly the wrong word for it — a better name would be the “Browser Environment Control” plan.

In the normal world, you show up at the store with a five dollar bill, pick up a newspaper, and the store sells you the newspaper (and maybe some change) in exchange for the bill. In Google’s proposed world, five dollar bills aren’t fungible anymore: the store can ask you about the provenance of that bill, and if they don’t like the answer, they don’t sell you the newspaper. No, they’re not worried about the bill being fake or counterfeit or anything like that. It’s a real five dollar bill, they agree, but you can’t prove that you got it from the right bank. Please feel free to come back with the right sort of five dollar bill.

This is not the Open Web that made what’s best about the Internet accessible to the whole world. On that Web, if you send a valid request with the right data, you get a valid response. How you produced the request is your business and your business alone. That’s what software freedom is all about: you decide how your machinery works, just as other people decide how their machinery works. If your machine and their machine want to talk to each other, they just need an agreed-on language (in the case of the Web, that’s HTTP) in which to do so.
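To make the point concrete, here is a small sketch (hostnames are placeholders) of why "how you produced the request is your business alone": two clients that build the same HTTP request by completely different internal means are indistinguishable on the wire, because only the final bytes cross the network.

```python
# Two clients, two internal designs, one identical request on the wire.
# A server could never tell them apart -- the bytes are the whole contract.

def handwritten_request(host: str, path: str) -> bytes:
    # Client one: a literal, hand-assembled HTTP/1.1 request.
    return (
        f"GET {path} HTTP/1.1\r\n"
        f"Host: {host}\r\n"
        f"Connection: close\r\n"
        f"\r\n"
    ).encode("ascii")

def assembled_request(host: str, path: str) -> bytes:
    # Client two: builds the same request from a header table.
    headers = {"Host": host, "Connection": "close"}
    lines = [f"GET {path} HTTP/1.1"]
    lines += [f"{name}: {value}" for name, value in headers.items()]
    return ("\r\n".join(lines) + "\r\n\r\n").encode("ascii")

a = handwritten_request("example.com", "/")
b = assembled_request("example.com", "/")
assert a == b  # byte-for-byte identical: the server sees no difference
```

The server receives the same bytes either way; everything before the bytes left the machine is invisible to it. That invisibility is exactly what Web Environment Integrity proposes to abolish.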

Google’s plan, though, steps behind this standard language to demand something no free and open source software can ever deliver: a magical guarantee that the user has not privately configured their own computer in any way that Google disapproves of.

The effrontery is shocking, to those with enough technical background to understand what is being proposed. It’s as though Google were demanding that when you’re talking to them you must somehow guarantee, in a provable way, that you’re not also thinking impure thoughts.

How could anyone ever agree to this nonsense? Must all our computers become North Korea?

The details of your own system’s configuration are irrelevant to — and unnecessary to accurately represent in — your communications with a server, just as your private thoughts are not required to be included, in some side-band channel, along with everything you say in regular language.

If a web site wants to require that you have a username and password, that’s fine. Those are just a standard part of the HTTP request your browser sends. But if a web site wants your browser to promise that it stores that username and password locally in a file named “google-seekritz.txt”, that’s not only weird and creepy, it’s also something that a free software (as in libre) browser can never reliably attest to. Any browser maintenance team worth its salt will just ship the browser with a default configuration in which the software reports that to Google when asked while, behind the scenes, storing usernames and passwords however it damn well pleases.
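To illustrate "just a standard part of the HTTP request": here is HTTP Basic authentication (RFC 7617), where the username and password travel as one ordinary header. The header is everything the server sees; where the client stored the credentials beforehand is entirely its own affair. (The credentials below are made-up examples.)

```python
# HTTP Basic auth: credentials become a plain request header.
# Nothing in this header reveals how or where the client stored them.
import base64

def basic_auth_header(username: str, password: str) -> str:
    # RFC 7617: base64-encode "username:password" and prefix "Basic ".
    token = base64.b64encode(f"{username}:{password}".encode("utf-8")).decode("ascii")
    return f"Authorization: Basic {token}"

print(basic_auth_header("alice", "s3cret"))
# → Authorization: Basic YWxpY2U6czNjcmV0
```

Whether the client read those credentials from an encrypted keyring, a plain text file, or the user's memory, the resulting header is identical, which is precisely why no honest protocol needs to ask.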

Indeed, the fundamental issue here is the freedom to have a “behind the scenes” at all. Environments in which people aren’t allowed to have a “behind the scenes” are totalitarian environments. That’s not an exaggeration; it’s simply the definition of the term. Whatever bad connotations the concept of totalitarianism may have for you, they come not from the fancy-sounding multi-syllabic word but from the actual, human-level badness of the scenario itself. That scenario is what Google is asking for.

My web browser (currently Mozilla Firefox running on Debian GNU/Linux, thank you very much) will never cooperate with this bizarre and misguided proposal. And along with the rest of the free software community, I will continue working to ensure we all live in a world where your web browser doesn’t have to either.

After I cross-posted the above in the Fediverse, a friend of mine there asked how Google’s proposal was different from CORS: “i’m sure that i don’t understand the google proposal, but all the browsers enforce CORS, and don’t let you load data in many contexts.” Since Google’s proposal is very different from CORS and similar browser-side protections, I replied to explain why:

This is not about the browser enforcing something by default for the purpose of being able to make security guarantees to its user. After all, if you wanted to modify and recompile your browser to not enforce same-origin policies, you could do so. (It would be a bad idea, of course, but that’s not a software freedom issue 🙂.)

Rather, this is about the browser being able to pass back a partially-hardware-based, cryptographically secure token that attests, to a central service, that you (the owner of the computer) have not made certain system modifications that would otherwise be invisible to and undetectable by another computer that you’re interacting with over the network. The central service can then pass that attestation along to relying parties. Those relying parties would then use it for all the expected purposes. For example, if they’re considering sending you a stream of video, they’d only do so if they see a promise from your computer that it has no side-band ability to save the video stream to a file (from which you could view it again later without their knowledge). And this promise would be dependable! Under this proposal, your computer would only be able to say it if it were true.
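A toy model may help here. The names and flow below are my own simplification, not Google's actual specification, and a real scheme would use hardware-rooted asymmetric keys rather than a shared HMAC secret. But the structure is the same: the attester signs a claim about the client's environment, and the relying party trusts the attester's key rather than the client, so the client cannot alter the claim without detection.

```python
# A toy attestation scheme (hypothetical names; HMAC stands in for a
# hardware-rooted signature). The whole design depends on the client NOT
# having access to ATTESTER_KEY -- i.e., on denying software freedom.
import hashlib
import hmac
import json

ATTESTER_KEY = b"hardware-rooted-secret"  # hypothetical: sealed in hardware

def attest(environment: dict) -> dict:
    # The attester signs a statement about the client's configuration.
    claim = json.dumps(environment, sort_keys=True)
    sig = hmac.new(ATTESTER_KEY, claim.encode(), hashlib.sha256).hexdigest()
    return {"claim": claim, "sig": sig}

def relying_party_accepts(token: dict) -> bool:
    # The relying party verifies the signature, then inspects the claim.
    expected = hmac.new(ATTESTER_KEY, token["claim"].encode(),
                        hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, token["sig"]):
        return False  # tampered token: signature no longer matches
    env = json.loads(token["claim"])
    return env.get("can_save_stream") is False

# A locked-down client gets service; one that admits it can save the
# stream does not -- and a client that edits its own claim fails the
# signature check, because it never had the key.
assert relying_party_accepts(attest({"can_save_stream": False}))
assert not relying_party_accepts(attest({"can_save_stream": True}))
tampered = dict(attest({"can_save_stream": True}),
                claim=json.dumps({"can_save_stream": False}))
assert not relying_party_accepts(tampered)
```

The last assertion is the crux: the only way the client's lie fails is if the signing key is physically withheld from the machine's owner. A system where the owner holds all the keys cannot make this promise, which is exactly the point of the paragraph above.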

Of course, by definition the only way such a system can work is if it does not have software freedom on the client side. It requires a cooperative relationship between the hardware manufacturer and the supplier of the software – cross-signed blobs and such – whereby your computer loses the physical ability to make the requested attestation to a third party unless your computer is in fact fully cooperating.

By analogy: right now, you can tell your browser to change its User-Agent string to anything you want. You might get weird effects from doing that, depending on what value you set it to (and it’s unfortunate that web developers let sites get so sensitized to User-Agent, but that’s another story, to be told along with a similar complaint about HTTP Referer – but I digress).

Now imagine a world in which, if you change your User-Agent string, your browser suddenly starts always sending out an extra header: “User-Agent-String-Modified-By-User: True” – and you have no choice about this. You can’t stop your browser from doing it, because your computer won’t let you.
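Today's reality, for contrast, can be sketched in a few lines: the User-Agent header is whatever the client chooses to say, a plain key/value pair in the request, with no side channel reporting that you changed it. (The header values below are illustrative, not exact Firefox strings.)

```python
# Under the open Web, User-Agent is just another header the client fills
# in. Changing it produces no tattletale side channel -- there is no
# "User-Agent-String-Modified-By-User" header, and no way to demand one.

def request_headers(user_agent: str) -> dict:
    return {
        "Host": "example.com",
        "User-Agent": user_agent,  # entirely under the user's control
    }

stock = request_headers("Mozilla/5.0 (X11; Linux x86_64) Firefox/115.0")
spoofed = request_headers("MyHandRolledBrowser/0.1")

# The two requests differ only in the value the user chose to send.
assert "User-Agent-String-Modified-By-User" not in stock
assert "User-Agent-String-Modified-By-User" not in spoofed
```

Under attestation, the difference between these two requests would stop being the user's private business: the hardware itself would append the confession, and no recompile could remove it.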

Does this help clarify what the problem is?
