https://arstechnica.com/features/2017/04/the-secret-lives-of-google-raters/

As a software developer, I find this article not only sad but embarrassing. What embarrasses me is the allegation that Google engineers don’t communicate with raters. If even a handful of raters are willing to work directly with engineers to make their processes more efficient, why isn’t Google jumping on that?

Internet surveillance and BCI

The coincidence of Elon Musk’s “neural lace” announcement and the rollback of the FCC’s Internet privacy rules should make the hair stand up on the back of your neck.

In case the unabated personalization and proliferation of networked computing since the ARPANET hasn’t made it obvious, the Internet won’t always be a choice. Using the Internet for private affairs is only technically optional today; practically, it’s required for the majority of the American population and a great deal of the world.

Outside of those skilled in anti-surveillance techniques, people cannot avoid unwarranted surveillance while still taking part in the social sphere that has, in large part, migrated onto the Internet. Furthermore, that surveillance is acted upon to filter the information a person can access through the Internet, often without their knowledge of the mechanism doing the filtering. This hidden, largely unknowable manipulation of an arbitrarily large share of the public is antithetical to a democratic society: it undermines the rational, free choice of each individual, which is essential to a democracy that produces fair outcomes for the public.

I make the case that unwarranted surveillance endangers democracy right now, but the prospect of always-on neural interfaces to the Internet makes the threat even more urgent. The potential benefits of such technology are great, and if it becomes affordable, accessible, and medically safe for the majority of people, I think the majority of people will adopt it. Thus this technology carries the same essential risk to democracy, but this time acting through more sinister means.

Depending on the form neural interfaces to computer systems take, it may be difficult for a person to distinguish what they have learned and observed directly to be true from facts transmitted straight into their conscious minds. A more efficient interface would also naturally erode the inhibitions that limit use of the Internet. At those times when people want to query a search engine or visit a website but stop because they’re with a loved one, in a meeting, or driving, it is mostly the act of picking up a phone and redirecting the gaze that stops them. Would they stop if submitting that query were as easy as thinking it? To the extent that a person’s thoughts stray beyond what’s locally available in their own brains, they would emit traces of their habits of thought in ways that today are only approximated. It is impossible to say with certainty what form the prevailing brain-computer interfaces will take, but a technology that lets information be fed into and spill forth from the mind without the discrimination our bodies impose when we receive and transmit is a real possibility.

Elon Musk’s announcement of a company focused on the development of brain-computer interfaces is a signal that the risky technology described above is coming sooner rather than later. This article offers no solution to the problem it poses, and like most threats to democracy, this one will likely require continued vigilance to prevent its harms from being realized. That said, an excellent first step toward heading off the threat would be to write into law the strongest protections we can for the privacy of Internet communications between law-abiding citizens.

Right now there seems to be ambivalence in the United States about the importance of privacy on the Internet. For those who believe in the American experiment of self-governance, such protections should not be controversial. Teaching people what’s at stake here might help, but we also need to combat the insidious notion that “if you have nothing to hide, you have nothing to fear from surveillance”. Finally, the tools and techniques for protecting privacy need to be improved and made readily available to everyone. If that much can happen, there will be less to fear when new technology arrives.