Those of you considering coming to Techonomy 2016 on November 9-11 in Half Moon Bay, California (and we hope you are) may find it interesting to hear the kinds of conversation we have internally as we try to develop a meaningful and useful program. Even as we create a rich and diverse palette of sessions on pragmatic themes like what companies need to do about the Internet of Things now, or how AI is changing healthcare and other industries, we ask ourselves difficult questions about society’s future. The aim is to develop the big-picture sessions our programs are known for. Getting such sessions right isn’t easy. But we think seeing big patterns and long-term possibilities lends clarity to our thinking and the resulting little decisions we all make every day. (See Josh Ramo’s excellent new book The Seventh Sense for an eloquent explanation of how critical that is.)
This year we’ve homed in on the reality that the world is entering a completely new era as data-gathering of all sorts accumulates across global society. The data will increasingly be collected in vast amounts, via ever-more-sophisticated marketing systems that record our net and mobile behavior, via health and legal and government records, as well as via the enormous data sets emerging as the Internet of Things instruments our surroundings. The IoT will measure and store tons of data about us and the world. (It is a major theme of this year’s conference.)
How will we, as a society, determine who gets to control and utilize data about people? How much will people get access to data about themselves and their communities? Who will arbitrate the relationships between corporations, governments, and individuals in determining who controls and accesses that data, and uses it to gain valuable insights about the world? We had a discussion about that in Techonomy’s conference room the other day. Here are some highlights, via a fast typist.
Simone Ross (Techonomy’s program director): We don’t have the systems in place to reach anything close to that utopian place techies love to talk about. You can talk about AR and VR to your heart’s content, but we’re far away from a time when everybody can take advantage of it. You have the divides between the haves and have-nots, but then within the haves there are further divides, because everyone is at varying distances from the data. So our actual ability to control and make decisions is in serious jeopardy.
David Kirkpatrick (chief techonomist): It’s a fault line in future society that most people have not yet acknowledged.
Ross: In the tech community many haven’t recognized it. But activists and people who think about social issues ARE thinking about it. That’s why it’s important to include them in the conversation. And we’re working on a session about the future of representative democracy.
Kirkpatrick: Ultimately we will need to be able to feed information like data about climate change back to individuals in real time so they can alter their behavior accordingly.
Ross: But what about people who don’t care about climate change? They can ignore the data and the knowledge. Think about city dwellers in the US versus those in Europe or India. It’s not going to work the same for everyone.
We think, “So much data, so much insight in the world—everything will be better.” But what we don’t think about is, “Is it neutral? Is it fair? Does everyone who wants to have input into this system have input?” They don’t. But it’s harder and harder to get your head around, because everything we do increasingly throws off a data point. We’re going to drown in it.
Kirkpatrick: There is a concept of “anonymity by obscurity.” There may be tons of data about you out there, but there’s also tons of data about everybody else out there, too, so any one piece of it may be less harmful or potentially embarrassing.
Glenda Cudaback (Techonomy program consultant): When the data is known, as it evolves, things will change. Your preferences will change. The things you want or don’t want will change. They’ll change biologically. We have speakers who are thinking about that kind of possibility in the future, like Justin Sanchez, who directs the biological technologies office at DARPA. Another person on this year’s program who’s thinking about this is computer scientist Leslie Valiant of Harvard, a winner of the Turing Award. I think it’s the eventual merging of biological systems and all this data.
Kirkpatrick: Will that be coded by a human?
Cudaback: Scientists assume that the machine will learn from the data it collects, and the machine will then evolve according to your preferences.
Kirkpatrick: But to use the phrase “your preferences” in that sentence is making a moral assumption that whoever ultimately controls the computing and analyses has the good will of individuals at heart.
Cudaback: You want a world where everyone has equal opportunities and is well cared for—that’s your ultimate goal. So the machine learns about all that’s happening in the world—wars, inoculations, medical breakthroughs. Meanwhile people are figuring out ways to connect that machine code to neurons, which are biological. And eventually the machines will be replaced by neurons. And the biological network will be more closely tuned to your preferences.
Kirkpatrick: It remains a moral question who has control of the system that defines what is meant by “equal opportunities” and “well cared for.” And you’re presuming an evolutionary leap that basically incorporates machines and biological systems.
Ross: That’s inevitable.
The conversation continued, and eventually wended its way around to sessions on pragmatic topics like “The New Rules of Business Engagement” and the role of artificial intelligence for today’s companies. But at Techonomy, we do like big ideas, and we think the best leaders will be those who are conversant in the language of rapid technological change, and even in the wildest future possibilities.