On a private mailing list, a friend recently asked this:
Playing devil’s advocate here: what privacy are you trying to protect? Is it very important to you that websites not know what sort of products you’re interested in (and if so, why)? Or is it that you simply find targeted ads annoying?
I ask as someone who spent four years trying to help websites show less annoying ads.
Below is my response (after someone else on the list said “Sorely tempted to exfiltrate the hell out of this. Can we have it on a web page please?”):
I think Eben Moglen’s observation that privacy is really an ecological concept, not a transactional one, is the best answer to this. Thinking of privacy primarily in terms of the relationship between the user and various commercial third parties misses the point. This post gives the relevant passage from Eben (it’s not long, and there’s a link to his full talk):
He has also pointed out that these days it’s an explicit goal of the U.S. government to have and maintain the social graph of everyone — that is, all the relationships, at the highest degree of accuracy and resolution possible. So the information Google and other online services collect is now potential data for that graph. It’s already subpoenaed at some times and surreptitiously exfiltrated at others (though Google has done admirable work trying to prevent the latter; how successful that has been we can’t know, but it has probably had some limiting effect).
My point is: once all that data exists, it’s valuable to more parties than the ones who originally collected it. And by the Ashley Madison Principle, there’s no such thing as a confidential dataset. There are only datasets that have not yet been involuntarily shared, and those that have been. There is no guarantee you will be able to tell which category your particular dataset falls into.
So when you ask “Is it very important to you that websites not know what sort of products you’re interested in?”, you’re framing an ecological question in a transactional way. This unintentionally transforms the question from the one we should care about to the one collectors of large-scale data would prefer we ask :-).
I realize, of course, that there is a tradeoff here. Google really can improve the quality of ads — quality as seen not just from the advertiser’s point of view, but even from the user’s point of view — by tracking and analyzing everything everyone does. The benefits are near-term and (for Google and the advertisers) centralized; the costs are long-term and decentralized. But that doesn’t mean the costs aren’t significant. It’s very similar to the economics of a lot of environmental pollution, actually, which is partly why “ecological” is such a good word here. In some ways it’s almost the definition of an ecosystem to say it is a system from which short-term, easily measurable benefits can be extracted for particular members at long-term, hard-to-measure (but real) costs to all members. Privacy turns out to be such a system.
Does that help?