NYT: Peril of Knowledge
The Peril of Knowledge Everywhere
Thanks to advances in technology, we may soon revisit a question raised four centuries ago: Are there things we should try not to know?
That’s because the collection of data is increasing, in both scale and type, at a phenomenal rate.
IBM says that 2.5 quintillion bytes of data are created each day. That is a number both unimaginable and somewhat unhelpful to real understanding. It’s not just the huge scale of the information, after all; it’s the novel types of data (incidental photographs stored in the cloud, for example, or requests to Google for driving directions) that governments, corporations, and individuals gain access to for all sorts of purposes.
Take Jetpac, a mobile app that uses some of the 60 million photos a day stored on Instagram to create visual guides to over 6,000 cities worldwide. So if you’re looking for hipsters in San Francisco, for example, its algorithms can identify by location the incidence of mustaches in snapshots, and determine (are you ready for it?) that the Mission neighborhood is a good place to try. Sounds like fun.
In addition, however, “we are able to identify gay bars in Tehran. Moscow too,” said Pete Warden, a co-founder of Jetpac. The company does not want to do that, he added, but he does think it’s important that “we make people aware, get people talking about this.”
Mr. Warden was speaking on Friday at a data science conference in Berkeley, Calif., where many participants expressed concern about the effects all this data would have on the ability of powerful institutions to control people, from state coercion to product marketing.
“Big Brother couldn’t have imagined we’d tell him where we were, who we talk to, how we feel – and we’d pay to do it,” said Vivek Wadhwa, a tech entrepreneur and social critic. “We need an amendment in the Constitution that says you own your data.”
That is a difficult, and quite possibly unworkable, idea. For one thing, if you walk by the camera at a cash machine, is that picture of you yours? Must you give permission every time someone like Mr. Warden wants to spot your mustache where you’ve publicly posted it? You’d spend all your time giving and denying permissions. And since much of the data lives in a transnational cloud, would what a foundational American document says about privacy even matter?
“People call for regulation, but regulation is slow-moving, and the analysis will just go somewhere else,” said Gilman Louie, a venture capitalist at Alsop Louie Partners, and the former head of In-Q-Tel, the venture firm affiliated with the C.I.A. “Many people are happy to share their information, but they can’t control the flow of it, and any piece of information is a fractal of me.”
That is, one bit of information here and another there, each innocuous on its own, may combine to reveal something personal that is hidden perhaps even from me.
If we want protection from the world we’re building, perhaps we’re asking that the wielders of algorithms choose not to know things, despite their being true. To some, that may sound a little like the 1616 order by the Catholic Church that Galileo cease teaching or discussing the idea that the Earth moves around the sun.
Since then, we have been living in something closer to the spirit of the 18th-century Enlightenment, when all forms of knowledge were acceptable, and learning was a good in its own right. Regulation has been based on actions, not on knowledge.
Now, however, there is so much to know, and the business of knowing new patterns can be done by so many people, for so many different ends. That changes things.
For Mr. Louie, the situation may be something like a vastly more difficult version of laws against redlining, the practice by some banks of denying mortgages to minorities who wanted to move into white neighborhoods. The banks were allowed to know about the neighborhood, but they could not use that knowledge to that end.
“Data companies will be told that certain correlations should not be applied to data,” he said. Since standards of behavior vary so much between one place and another, he said, “there will be a lot of interesting sociological debates among nations, states, even cities” about what you are allowed to know.
Other participants noted that we are also entering a new world in which individuals can be as powerful as institutions. That phone gives Big Brother lots of data goodies, but it can also run its own pattern-finding algorithms and publish those findings to the world.
“It’s not a one-way street, there are new ways to react against power structures too,” said Joe Reisinger, co-founder of Premise, a company that mines hyperlocal data, like prices in markets in India, to figure out national economic information. “What if you structured social action, or civil disobedience, into something that could be repeated at a huge scale?”
The mobile phone, he suggested, is akin to that other important relic of the 18th century, the Minuteman’s musket, leaning against the door as a guard against tyranny.