As connectivity and intelligence spread everywhere, a new set of interrelationships emerges between machines, people, processes, and the network. What does that mean for society? The biggest companies are rethinking business in the face of explosive change. Rob Chandhok of Qualcomm Internet Services, Cisco’s Dave Evans, Techonomy’s David Kirkpatrick, Paul Rogers of GE Global Software, and Ford’s Vijay Sankaran discuss the implications of the nascent Internet of Everything for business and society. Read the full transcript below.
Kirkpatrick: Let me just briefly introduce everybody here. We call this “Internet + Everything = ?” And that’s kind of, I think, indicative of where a lot of us are on this issue. We know it’s big, but we don’t know exactly how it’s going to manifest itself. From your left, Rob Chandhok, whose title is President of Qualcomm Internet Services. Next, Dave Evans of Cisco. His title is great, Chief Futurist in the Internet Business Solutions Group. Paul Rogers of GE Global Software, he’s the Chief Development Officer there. And Vijay Sankaran, who’s Application Development Director for Ford, which means he’s basically in charge of all application development for Ford that’s not in the actual vehicle—in the company, in the dealerships, in the relationships with partners, etcetera.
So I want to start with Dave. Just give us a little sense of whether this is something with a lot of gravitas that we ought to be thinking about—I hope you just heard most of Alex’s presentation. He’s made some big claims about what’s possible. How dramatic a scenario should we have in our heads about all this stuff?
Evans: Arguably, this is the biggest technology transition we’ve seen in our lifetime. Remarkably, we are at the early stages of a massive transition. If you consider that there are in the neighborhood of about 10 billion things connected to the network today, by 2020, we expect about 50 billion things. So five times as many things. But what’s interesting is if you think about the capabilities of those things, it’s pretty mind-blowing. You know, we think linearly. That’s how humans are programmed to think. We don’t think exponentially. And we’re all familiar with Moore’s Law and things like that, and if you start applying some of those principles to the Internet of things or the Internet of everything, the numbers are pretty staggering. I just captured a couple, and let’s just touch on a few of the technology components.
So let’s think about the physical size of things that are getting connected. The 3D volume of a thing decreases about 100-fold every decade or so. So if you do the math and fast-forward to 2023, it means you now have the power of something the size of a phone—an iPhone, an Android phone—in something about the size of a sugar cube. Think about what’s in a phone today: accelerometers, gyroscopes, multiple radios, GPS. Now imagine shrinking that set of capabilities into something about a centimeter cubed. We start to get very powerful, very sophisticated sensing devices. Network speeds are going to grow by orders of magnitude very, very quickly. Today an average connection to the home might be around 10 megabits or so, depending on whether you have fast DSL or a cable connection, and we’re seeing gigabit trials now become fairly common. And even if you just follow standard laws of acceleration, we’re looking, conservatively, at about 650 megabits to the average home—that’s not even to a high-end home, to the average home—in the next few years.
If you take wireless connectivity to the devices themselves—an LTE connection, you know, a 4G connection to your phone—real-world speeds of LTE today are about 18 megabits or so. And if you do the math, we’re looking at, conservatively, 300 megabits on the low end to wireless devices, IoT-connected devices, but it could be in the tens-of-gigabits range. Samsung, just a few days ago, had some breakthroughs with 5G technology. And if that comes to pass by 2020, which they predict it will, then we’re looking at multi-gigabit speeds by 2020. To put that in perspective, a Cisco Telepresence session, a single screen, is 5 or 6 megabits per second; we’re talking about 50 of those streams to a mobile device. What does that do in terms of—
Kirkpatrick: By 2020?
Evans: By 2020. On your handset device, your tablet, your phone, connected devices, your car. What does that do in terms of collaboration, self-driving, communication, entertainment, and so on? It completely changes the game. And then the last one I’ll touch on is energy. Energy consumption gets cut in half about every one and a half years or so—the amount of energy something uses to do the same work keeps halving. It means that a decade or so from now, these devices are using a fraction of the energy required today, yet doing the same amount of compute. So you start coupling these things together—incredibly fast network speeds, computation, and so on—and we’re talking about a very different connected world.
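Evans’ projections are all compound-growth extrapolations. As a rough sanity check, the arithmetic behind two of them can be sketched in a few lines; the rates here are the speaker’s stated assumptions from the talk, not measured data:

```python
# Compound-growth arithmetic behind the panel's projections.
# All rates are the speaker's stated assumptions, not measured data.

def project(value, factor, period_years, years):
    """Scale `value` by `factor` once per `period_years`, over `years`."""
    return value * factor ** (years / period_years)

# Device volume shrinks ~100x per decade: a ~100 cm^3 phone in 2013
# approaches sugar-cube size (~1 cm^3) by 2023.
volume_2023 = project(100.0, 1 / 100, 10, 10)   # -> 1.0 cm^3

# Energy per unit of compute halves every ~1.5 years: after a decade,
# the same work takes roughly 1% of today's energy.
energy_fraction = project(1.0, 0.5, 1.5, 10)

print(volume_2023, energy_fraction)
```

The same one-liner reproduces any of the other exponential claims in the talk once you plug in a rate and a horizon.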
And then the last point I’ll make is that these dumb devices, if you will, become very smart devices once they have a connection. Let me give you an example. As we move more and more intelligence into the cloud, take something like your front door lock, which we saw earlier. Imagine a low-cost sensor in your front door lock; now all that door lock needs is a connection to the cloud, and if it snaps a picture of your face, it’s now doing face recognition. Your Nest thermostat is now doing voice recognition, gesture recognition, and so on. Why? Not because the intelligence is in the device itself, but because the intelligence is in the cloud, and all it needs is that connection. And these connections are going to be very, very fast, so a dumb device plus a connection equals a very intelligent device. So yeah, we’re in for a wild ride for sure.
Kirkpatrick: I like ‘wild ride’ as a concept. Thank you for that, that’s good. So Paul, at GE you’re a pretty big picture software guy, thinking about how to connect locomotives and jet engines and power plants—
Rogers: We are connecting a jet engine to a locomotive.
Kirkpatrick: Are you really?
Rogers: No. We’re not.
Kirkpatrick: But it’s not inconceivable, right?
Rogers: It’s not.
Kirkpatrick: I mean, I could think of a scenario where—
Rogers: I mean, it could be the locomotive of the future—
Kirkpatrick: Maybe that’s for a CIA operation—
Rogers: It’d be really fast.
Kirkpatrick: But do you think that scope you’ve heard discussed so far is appropriate, and translate it a little bit into the industrial context, which is where GE—
Rogers: You bet—
Kirkpatrick: I mean, you’ve got this whole phrase, “the industrial Internet,” that you’ve talked a lot about. Is that a subset of this exact same set of things, in your view?
Rogers: If you look at the industrial Internet—and the answer is yes. The industrial Internet is GE’s vision of an ecosphere. A broad, bold, global, open ecosphere that allows machine-to-machine and machine-to-human interactions. And when you look at big industrial equipment, the jet engines, the locomotives, they’re participating in relatively inefficient ecospheres. So when you look at, say, a jet engine, we could spend an enormous amount of time and energy making a jet engine 1% more efficient. And that has a lot of business value for the airlines. But is that efficiency being realized when you think in terms of it sitting on the ground? When I was in Washington, D.C., I was talking about what could be in the future, and I was saying, if you’re a business traveler, and I think a lot of you are, imagine a world where there are no more maintenance-related flight delays, and think in terms of the magnitude of a change like that. Then you start thinking of the industrial Internet, and you think in terms of a jet engine that, in flight, could communicate with a maintenance supervisor. Rather than land and surprise that maintenance supervisor, it could tell that supervisor in flight: I’ve got an issue, and while I’m in flight for the next five hours, could you find a solution, could you get all of the necessary parts in place? And that sort of maintenance issue goes away. The jet lands, everybody gets off the plane, the issue’s fixed, everybody gets back on the plane, and now you start having better efficiencies at the airports, better efficiencies because that jet engine is in the air.
To put this in perspective, literally a 1% efficiency gain in fuel economy for the airlines is $2 billion a year. A 1% efficiency gain in fuel economy in the energy space is $4 billion. And when you look at the rail systems, locomotives, think of that as essentially a freeway. If locomotives can gain just one mile per hour of velocity overall, that is literally $1 billion worth of customer value. So when we talk about the industrial Internet, we’re talking about the system that big, manufactured OEM equipment plays in. Their ability to communicate together, to communicate their status to humans so humans can take action, or for the machines themselves to take action, leads to big gains in efficiency in a relatively inefficient world.
Kirkpatrick: Yeah, beautiful. Vijay, you’re sort of involved with a lot of that kind of stuff, but you also are in a consumer business. So there’s even a bigger ecosystem that in effect you have to think about at Ford. And I know you told me on the phone that you had done some work kind of laying out a landscape for Ford, based on some of the work of Mark Weiser, who I mentioned earlier, who’s from Xerox PARC, great innovator in this area. Talk a little bit about what connecting whole swathes of the activities of Ford could start to mean for not only Ford and its own efficiencies, like that airline, but for the driver of the car, much like the rider in the airplane.
Sankaran: That’s great. I mean, I think, David, what it comes down to is the operative word ‘connected.’ And to the point Dave was making, we really started to investigate this whole phenomenon of pervasive computing back in the 2004, 2005 timeframe, because all of the trends Dave talked about—micro-storage devices, increased connectivity, more intelligence in tinier and tinier devices—we could see already coming, just in the form of things like smartphones and RFID tags. We were already deploying active RFID tags in our plants to do location tracking, asset tracking, different things. And as we began to see the dynamics of computing take shape and the forces come about, we knew that trend was only going to become more profound. So we actually started on that journey back in 2004 and 2005. But the operative word really comes down to ‘connected,’ and it really has three ingredients: the more ubiquitous connectivity that will now exist throughout, with increased speeds and multimodal means of transporting information; plus software that allows us to have much more intelligent features inside of our vehicles, inside of our factories, all across our ecosystem; plus the data that you can now bring out of that to provide much more personalized experiences for the users of our vehicles, but also to make our plants more intelligent, make our product engineering process more intelligent. I mean, we are just on the cusp of profound changes. And as one of our product development chiefs told us a few years ago, we’re just at the beginning of moving from an automotive company to an automotive technology company. Really, the key disciplines going forward are going to be software, data sciences, and connectivity as we go forward in the twenty-first century. So there are some really profound changes.
As far as our users—and hopefully some of you got to see the C-MAX Energi sitting in front. As we move to a new society of hybrid and electric vehicles, that Internet of things, and having that whole array of sensors inside the vehicle that now begin to communicate with each other, is so critical. Because, just like Paul talked about efficiency, imagine that you’re driving on the highway and you’ve got to manage the capacity of the batteries inside the car. Every little efficiency, in terms of those sensors communicating in real time and being able to self-adjust the internal dynamics of that vehicle, gives you more efficient vehicles that can propel you. Also, those same sensors have the opportunity, through multimodal network connections, to go out to the cloud, look at all of our service partners across a broad ecosystem, and bring information back into the vehicle that can really enhance the user experience. And then you’ve got mobile phones, where if something’s happening to your vehicle, you can even have video-type features relayed to you in real time. So this whole interconnectedness of things will spawn a new era of features that we really need to think about as we go forward, and really figure out how we’d leverage for our consumers. I mean, it really is a magical time, I would say, in terms of technology and computing as we go forward, especially for a company like Ford Motor Company.
Kirkpatrick: Thanks Vijay. So Rob, I assume you could just spin as big picture of an optimistic scenario about what’s possible as these guys, and feel free. Don’t let me stop you. But Qualcomm is one of the great companies that’s facilitating this capability because of the extraordinary technology you’ve put in such a huge percentage of the mobile devices that are pervading the world these days. So I’m curious, as a tech company, what do you see as the tools that we need to get to this vision, that maybe we don’t have yet? And I know you’re specifically working on an architecture and an infrastructure product that really could significantly contribute. So talk a little about that.
Chandhok: Yeah, it’s interesting. We think about that kind of cloud connectivity and, of course, driving that big bandwidth; that’s our core business. Qualcomm is about the communication. But interestingly enough, the other premise of the company is built on mobility. I mean, we were the company that really said the Internet is not just going to be wired connections, but wireless mobile connections in particular. When you think about the fact that not only will you get these gigabit connections wirelessly, but they’re going to work when you’re going down the road at 80 mph, and work seamlessly—that’s where the magic comes in.
Kirkpatrick: And would the speeds that he was talking about be possible, in your—
Chandhok: I mean, everybody talks about Moore’s Law. There’s also this other one called Shannon’s Law. So with enough spectrum, sure. But at some point, spectrum is the beachfront property that everybody’s fighting over right now. But interestingly enough, the other way we’ve been thinking—particularly the group I work with, the team I have here—we’re trying to think of another part of it, where we don’t necessarily think the cloud always becomes the center of the intelligence. We think the architecture and the topology are actually going to change to smarter and smarter nodes communicating locally, with locality of reference, and then also communicating back to the cloud. So one of the architectures we’re working on is a peer-to-peer mesh framework for building services and interoperability in distributed computing that is more like the Internet of things near me, rather than the Internet of everything. So instead of saying, “I know I have this door lock and I know I have this light switch, because I’ve configured it,” you walk into a room you’ve never seen before and you discover what’s out there, and maybe the agents in the network talk to each other. And, in fact, interact with the cloud—
Kirkpatrick: Self-configuring [overlapping]—
Chandhok: —but we’re pushing the envelope on it. Because if you take the processing arguments, which I thought were good—what we get in a sugar-cube-sized stack, we’ve already seen happen, and smartphones are sort of the end product of that trend of computing power. We haven’t ever gotten to the point in our business where people have said we have enough computing power on the smartphone, let me just compute that in the cloud for you, right? And I don’t think we’ll ever get there, because every time we go to that next-level experience, it turns out it’s latency that bites you. So what we’re thinking is, if we’re going to get to that next level of things, it’s going to be a blend of the cloud, big data intelligence, plus smarter and smarter things near you that talk to each other.
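The “Internet of things near me” idea—devices discovered on the spot rather than pre-configured—can be illustrated with a toy local service bus. This is a generic sketch of the discovery pattern only, not the actual AllJoyn API; every name in it is invented for illustration:

```python
# Toy model of "Internet of things near me": devices announce services on
# a room-scale bus, and a newcomer discovers them with no prior
# configuration. Illustrative pattern only -- not the AllJoyn API.

class LocalBus:
    """Stand-in for a room-scale peer-to-peer mesh."""

    def __init__(self):
        self.services = {}                  # service name -> provider callable

    def announce(self, name, handler):
        """A device advertises a capability on the local bus."""
        self.services[name] = handler

    def discover(self):
        """A newcomer asks: what's out there?"""
        return sorted(self.services)

    def call(self, name, *args):
        """Invoke a discovered service locally -- no cloud round trip."""
        return self.services[name](*args)

bus = LocalBus()
bus.announce("door.lock", lambda: "locked")
bus.announce("light.toggle", lambda on: "on" if on else "off")

# A phone walks into the room: no configuration, just discovery.
print(bus.discover())                     # ['door.lock', 'light.toggle']
print(bus.call("light.toggle", True))     # 'on'
```

The design point Chandhok is making lives in `call`: the light toggles without any traffic leaving the room, and only the interactions that need the “big cloud” ever reach it.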
Kirkpatrick: And there’s this infrastructure he calls—you call it AllJoyn, right?
Chandhok: Yes, thanks for spelling it.
Kirkpatrick: Did I get that right? Yeah, because it does take a little spelling. But is that an alternative approach to what we saw from SmartThings? Do you see it that way—or could it be compatible? How do you think about that?
Chandhok: I think they’re compatible. I mean, I think the stuff Alex talked about is sort of a particular experience driver, and it has a particular design viewpoint, which is that the interoperability happens in the cloud. Now, I happen to think there are use cases where I don’t want that data flowing back to the cloud, for privacy, security, or scale reasons. If I have 2,000 things in my home and you have 2,000 things in your home, at some point it’s actually more efficient to do the communications locally. It actually bothers me that when I look at my Nest with my smartphone and it’s four inches away from my smartphone, the traffic is going back and forth to the cloud.
Kirkpatrick: Well there may be other reasons people want to keep it close to the chest, also, just because of security. You might not want people to know and have access to all those 2,000 interactions in your home.
Chandhok: Right, right. Or I’m driving down the road in my car and I want to talk to the things that are in my car, or in my friend’s car, which I just got into, so I’ve never seen it before and I have to discover it. And that’s like a little local cloud that sometimes talks to the big cloud, but oftentimes I can do everything I want just by computing locally. So I think it’s going to be an interesting blend. It’s not an either/or. My hypothesis, or our team’s hypothesis, is that it’s going to be a blend. Every five or six years we all predict that everything’s going to get pushed back to the cloud and we’re all going to have just dumb terminals. First it was Java; then basically everything’s going to be dumb and it’s all going to be in the cloud. And the user experiences just don’t work that way.
Kirkpatrick: Paul, I see you’re nodding a lot. Talk about why you’re nodding.
Rogers: When I think in terms of data in the cloud and processing speeds and everything necessary for a smart car, I think that’s all critical. But when you take it up a notch and think in terms of an ecosphere, you need less than you think. You can have a very, very smart car, which I’m a huge advocate of, that is hyper-efficient, and then think about what we’re all going to do at 4:00: get on the freeway, and what are we going to do? We’re going to stop. So take it up a notch and start thinking in terms of how you maximize or optimize the freeway system itself. Which then directly translates to things like locomotives. In a typical trip, let’s say a typical journey a locomotive takes, two-thirds of that trip is spent sitting doing nothing, because it’s effectively sitting in traffic. So you have this incredibly efficient locomotive sitting around doing nothing most of the time it’s on duty. So think of the ecosphere and the cloud-based technology and the processing speed just to get basic information: How fast am I going? Where am I? Where am I heading? And what actions do I need to take to remove that congestion? Then you start getting into hyper-efficiencies of a rail network system, and then you start seeing impacts to society.
Kirkpatrick: Rob, could that—say AllJoyn really caught on, and it really became a big deal. Would it potentially be the kind of thing that could allow cars on a highway to sort of self-configure into a kind of fleet that moved in tandem, kind of thing? Is that the long-term element of the vision?
Chandhok: Yeah, I mean, I don’t think that’s special about AllJoyn. I mean, there are DSRC kinds of things that are going on. It’s not so much that—
Kirkpatrick: What is DSRC? I’m sorry.
Sankaran: It’s a short-range communication. So it’s—
Kirkpatrick: So there’s other ways to do that.
Chandhok: I thought you were going to do the acronym.
Kirkpatrick: Is that something that crosses all manufacturers? So if that really gets adopted properly—
Sankaran: Yeah, it’s a standard that the automotive manufacturers are working on in terms of how vehicles communicate to each other.
Kirkpatrick: Working on—how close are they to agreeing on that?
Sankaran: I mean, they’ve got a standard that they’ve agreed on for how it would work. But it also requires a lot of federal highway and infrastructure-type investment, which is sort of a challenge right now in the US.
Kirkpatrick: Right, so it’s not just—right. And I interrupted you.
Chandhok: No, what I’m sort of saying is that the real thing there is actually the beginning of your premise, which is: do we have systems where swarms—and there’s lots of swarming technology we think about, but we usually think about it in terms of flying things. But with cars, swarms of cars on a freeway could autonomously smooth themselves out locally, or be managed in a broader system. So to me, when I think back on what I’ve learned as a computer scientist, and what I’ve learned at Qualcomm about communication, that hybrid of local plus cloud intelligence—which is actually really hard to do, and hard to do well, because the algorithms have to be partitioned and the system has to be thought about—that is actually what I think the future is. It’s that blend that is going to be really exciting. So it isn’t that I can put all the intelligence here or here; it actually probably will configure itself. You’ll probably get to places and offload computation, right?
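The local smoothing Chandhok describes can be caricatured with a few lines of consensus-style averaging: each car nudges its speed toward the mean of itself and its immediate neighbors, with no central controller. This is purely illustrative; real DSRC-based vehicle coordination involves far more (safety margins, latency, packet loss, control theory):

```python
# Toy sketch of local "swarm smoothing": each car repeatedly nudges its
# speed toward the average of itself and its adjacent cars, using only
# local communication. Illustrative only -- not a real DSRC protocol.

def smooth_step(speeds, gain=0.5):
    """One round of neighbor averaging along a line of cars."""
    out = []
    for i, v in enumerate(speeds):
        nbrs = speeds[max(0, i - 1):i + 2]   # self plus adjacent cars
        out.append(v + gain * (sum(nbrs) / len(nbrs) - v))
    return out

speeds = [70.0, 40.0, 65.0, 30.0, 60.0]      # mph, stop-and-go traffic
for _ in range(50):
    speeds = smooth_step(speeds)

# After repeated local rounds, the 40 mph spread collapses: the swarm
# smooths itself out with no central coordinator.
print(round(max(speeds) - min(speeds), 2))
```

The “blend” in the transcript is exactly where this sketch stops: the local rounds run car-to-car, while a cloud layer would only set the target for the whole corridor.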
Evans: You’re right on it. It’s not an either/or. I mean, and you’re right, every five, six years or so we have that debate. But it’s never been either/or; it’s always both. You’re going to have intelligence at the core and intelligence at the edge. For example, take a video camera today. A video camera today is dumb. It just streams a bunch of bits back to the core, and the core processes it. Tomorrow, that video camera is going to be smart. It’s going to have intelligence at the edge, so it’s going to know that it’s, say, Vijay that it’s looking at, not just a stream of bits. So when it sends back information, the information it’s sending back is going to be “Vijay.” And, oh, by the way, it’ll send some metadata along with that too—maybe his emotional state, or my location, or the weather conditions, things like that.
Kirkpatrick: You mean like with image recognition software built into the camera?
Evans: Correct, that will be in the edge. That will be in the edge as well as the core.
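Evans’ dumb-versus-smart camera contrast is essentially about where recognition runs and how much data crosses the network. A toy sketch of the difference, where the recognizer and the metadata fields are stand-ins invented here, not any real product’s API:

```python
# Toy contrast: a "dumb" camera ships raw frames to the core; a "smart"
# edge camera runs recognition locally and ships compact metadata.
# The recognize() callable and metadata fields are stand-ins; a real
# system would run an actual vision model at the edge.

import json

RAW_FRAME = bytes(640 * 480 * 3)          # one uncompressed VGA frame

def dumb_camera(frame):
    """Send everything upstream; let the core figure it out."""
    return frame                          # ~900 KB per frame

def smart_camera(frame, recognize):
    """Classify at the edge; send only what the core needs."""
    label = recognize(frame)
    return json.dumps({"person": label, "location": "stage", "mood": "calm"})

msg = smart_camera(RAW_FRAME, lambda frame: "Vijay")
print(len(RAW_FRAME), len(msg))           # ~900 KB vs. a few dozen bytes
```

The bandwidth asymmetry is the point: per frame, the edge-smart camera sends four orders of magnitude less data, and what it sends is already meaningful.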
Kirkpatrick: Well, Dave, one of the things I wanted to ask you, since you’re the chief futurist on the panel—I mean, you’re the only chief futurist on the panel, there may be other chief futurists. But what Vijay said about Ford thinking of itself as a technology company interests me tremendously, and I know Ford fairly well—
Evans: Me too.
Kirkpatrick: And I’m convinced that there’s some there, there. But how broadly applicable is that concept to companies generally? And if so, how aware are companies and their CEOs and leadership groups about that, as a chief futurist?
Evans: Yeah, it is broadly applicable. I remember six, seven years ago, when I was looking out the window at our parking lot, I thought, these are not cars I’m looking at anymore; these are mobile data centers. I mean, these have so much compute, so much storage, processing, connectivity. The way we start thinking about things is going to change. As we start thinking about, for example, the margins on vehicles—you know, they’re really slim. But what if that vehicle was no longer simply a transportation device, but a compute device? What if I could write applications for that platform? Today there are a billion cars on the road; within three to four decades that’s four billion cars on the road. Seventy-five percent will be autonomous, and connecting them will increase efficiency on the road by 273%.
Kirkpatrick: That’s a prediction or just a hypothesis?
Evans: That’s a calculated number. So if we look at cars in a different way and say that’s not a car anymore, it’s a compute platform, how does that change? If we look at a jet engine, which generates a terabyte of information per day, how does that change what that thing is?
Kirkpatrick: It’s also interesting in the context of the actual state of the planet, because if we have 4 billion cars, and we don’t get 273% increase in efficiency—
Evans: We have a problem.
Kirkpatrick: We’re going to choke on the smoke—
Evans: You’ve got a big problem.
Kirkpatrick: And Ford’s going to be in deep shit. There’s no question. Just to be technical.
Evans: But I think that’s the key concept here: to start looking at each element or piece of equipment differently. Rather than seeing it as just a jet engine, exactly as you’re saying, what is its part in the greater fabric or ecosphere? When we look at the industrial Internet or the Internet of things, we look at that as the fabric for these pieces of equipment to be self-aware and to recognize that they’re participating in a greater system.
Sankaran: Yeah, I think there’s something else, though, that’s very profound, right? I mean, we’ve been talking about either jet engines or vehicles, and primarily focused on the way we live here. But if you think about the future of megacities, which is something that, you know, Ford talks a lot about, it’s a profound concept, because people are going to have multimodal forms of transportation. They may drive to a park-and-ride, then take a train somewhere else and a subway somewhere else, or be able to navigate a crowded city to find parking. So really, one of the key things this Internet of things brings you is that connectivity between different components as part of a broader ecosystem, and how that ties the consumer in. The consumer is going to be constantly sending messages based on where they’re located or what form of transportation they’re in. And urban environments can much more intelligently coordinate transportation services based on all this connectivity and information being passed between different devices. So if you think about cities like Shanghai or Beijing or Rio de Janeiro, and how these technologies can really address a severe form of congestion—I mean, hundreds of times worse than here in Northern California—it’s a profound concept.
Kirkpatrick: Yeah, in fact, partly because Techonomy is a big believer in the significance of this transition to megacities and the challenges faced by cities, particularly in the United States, where we have so little clue about any of this compared to a lot of other countries. And I want to emphasize that: I really think we are way behind. I see nodding, which I’m happy about. We have a Techonomy Detroit event, which at least Vijay probably knows about, in September—it’s our second—really focused a lot on cities, on urban revival and jobs and the whole complexity of what’s happening with cities, particularly when they go down as far as Detroit has, which a surprisingly large number of American cities have. But Paul, I don’t think I quite got to asking you whether GE increasingly thinks of itself differently as a company because of this set of forces? I mean, the whole leadership requirement is different, right?
Rogers: You bet. It’s completely different, and I think our big center—we’ve made a billion-dollar investment in software and software infrastructure. And that is a completely different approach. We’re saying that we will always have that primary focus on our products, but we also recognize that we have to focus on the technology for our products to participate in something bigger. And so leadership within the company is approaching this differently. When we think in terms of value, the big kick in value to our customers is in the optimization of that ecosphere.
Kirkpatrick: Yeah. Do you think, Vijay, that we could get to the point where car companies compete as much on their software as on their design? I mean, we’ve seen sort of tiny moves in that direction, but is that a possibility?
Sankaran: I mean, I think it’s going to be a mix of ingredients, right? So I mean, I think aesthetically beautiful designs are always going to be really important to consumers, but I think once consumers get inside the vehicle, more traditional disciplines of software are going to come to the forefront. So how do all these things interact together in a meaningful and intelligent way? What does the HMI feel like inside the vehicle? I mean, is it more like an iPod or is it—
Kirkpatrick: Human-machine interface, just to be—
Sankaran: —more not like an iPod, right? And also, how do they really access the things they do in the rest of their lives, off-board, and how does that get leveraged in the context of their vehicles? So there’s this whole notion of, how do you become more productive, if you’re a businessperson, while you’re stuck in traffic? How do you get access to the information you would have in your home while you’re inside the vehicle? How do the functions of the vehicle coordinate in real time with the dealership? The operative word becomes ‘experience.’ So I think the customer experience inside the vehicle and outside the vehicle really come together. And that’s going to be a key differentiating factor for our consumers going forward.
Kirkpatrick: Interesting. Rob, anything you’ve heard you want to comment on, or I just—?
Chandhok: No, I was really glad Vijay said ‘experience,’ because that’s the thing—when I made that comment about computing near the edge, that’s all driven by experience, and we’re working on technologies that—I don’t know how to predict them the way you do—you know, the augmented reality technologies. And we’re just starting to see people use them in a different way. This is the kind of stuff that’s actually camera-based rather than compass- and GPS-location-based. And the transition came when we were able to start computing at 30 frames a second—matching something, rendering a 3D object onto it, and keeping it matched as the thing moved around—with enough capability to do that in the palm of my hand. With the computing power we’re going to have in five years, the whole thing—I mean, Google Glass is nothing compared to what we’re going to have in five years. So that kind of blend, like you were talking about, with the intelligence at the edge and in the core, is going to drive experiences. And if we have an open ecosystem of developers, we’re going to see that accelerate. That’s, I think, the thing that’s really interesting.
Kirkpatrick: Let me just invite the audience, anybody have questions or comments they’d like to throw out? Okay, let’s get the mic right down here real fast. Can we get it to this guy? And who was the other hand? Get one to that lady there, after. First him and then her. Speak out loudly because I’ll repeat your question, because I want it on the recording.
A: So I listened with a lot of interest to all the different aspects of your talks, and I noted that you all take it for granted that connectivity is a given, speed is a given, bandwidth is a given—at least as I understood it. But no one has talked about who’s going to pay for that. Is it the car industry that’s going to pay those who provide the connectivity? What are your thoughts on that?
Evans: Ultimately, the consumer is going to make the choice, right? But I think what’s going to happen is we’ll see some creative ways to monetize some of these connectivity options. So for example, look at what Amazon is doing with the Kindle. You have constant connectivity; there’s a cell phone inside that Kindle. But you don’t pay a monthly cellular fee; you pay every time you download a book. So there are a lot of creative things we can do to monetize some of these services. Ultimately, yes, the consumer is going to pay for this, because it’s going to be wrapped around services that are valuable to you. But I will submit that there is an opportunity for the industry here: if you itemize all the connectivity plans you have in your home today—your cell phone, your tablet, your cable, your wireless, and everything else—you have half a dozen. If we keep going down this path, a few years from now you’ve got a dozen, then two dozen; it’s getting out of control. So you could make an argument that there’s an opportunity for some sort of brokering service, some way to consolidate these connectivity options. You’re going to pay for it as a consumer, but I think there are opportunities to refine how the consumer pays, and more creative licensing models.
Rogers: Plus I think it’s a question of who’s getting the value. When you look at a $14 to $15 trillion opportunity, and you start talking in—
Kirkpatrick: But what does $14 or 15 trillion represent?
Rogers: That’s the amount of value expected to be derived from the Internet of things, or the industrial Internet. When I talk in terms of $1 billion here, $4 billion there, that’s simply the tip of the iceberg. And that’s yearly, on a single point of efficiency or a single mile per hour in velocity, which is minute relative to what’s available. So when you think in terms of that value, and you think in terms of the business case associated with a railway—when you start talking in billions—connectivity becomes very valid from a purchasing perspective.
Sankaran: I think it’s also going to be somewhat context-sensitive. Certain cities and municipalities are making investments in having their communities blanketed in WiFi. So it’s up to us as automakers to recognize that there are going to be some contexts where that’s available, and to enable more features in those particular contexts. Some consumers prefer an ‘always on’ kind of philosophy, and because of that they’re probably going to be willing to invest in an embedded modem that gives them 4G connectivity inside the vehicle. Others are still going to have basic smart phones, so in those kinds of contexts we provide more basic services based on their level of bandwidth. But that’s where software comes into such preeminence. What we don’t want to do is create so much complexity that we prescribe to consumers what they must have in order to get certain features. We acknowledge that there are going to be different consumers with different sorts of preferences out there. And based on what they bring in, what their preferences and choices are, and the context within which they operate, there are going to be different sets of software services offered in those contexts.
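[Editor’s note: the tiering Sankaran describes—offering different in-vehicle services depending on the connectivity the customer brings—can be sketched in a few lines. This is a minimal illustration, not Ford’s actual service architecture; all tier and feature names are assumptions.]

```python
from dataclasses import dataclass

# Hypothetical connectivity tiers mapped to feature sets.
# Names are illustrative only.
FEATURES_BY_TIER = {
    "embedded_4g":    {"live_traffic", "streaming_media", "remote_diagnostics"},
    "tethered_phone": {"live_traffic", "remote_diagnostics"},
    "municipal_wifi": {"map_updates", "remote_diagnostics"},
    "offline":        {"cached_navigation"},
}

@dataclass
class VehicleContext:
    has_embedded_modem: bool
    phone_connected: bool
    wifi_available: bool

def detect_tier(ctx: VehicleContext) -> str:
    """Pick the richest connectivity tier available in this context."""
    if ctx.has_embedded_modem:
        return "embedded_4g"
    if ctx.phone_connected:
        return "tethered_phone"
    if ctx.wifi_available:
        return "municipal_wifi"
    return "offline"

def offered_services(ctx: VehicleContext) -> set:
    """The software services enabled for whatever the customer brings in."""
    return FEATURES_BY_TIER[detect_tier(ctx)]
```

The point of the sketch is that the feature catalog, not the consumer, absorbs the complexity: the same software offers different service sets per context.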
Chandhok: I was going to say that it’s the perfect example of the locality-of-reference comment I made earlier. There’s a lot of data flow that’s going to happen best locally. I might run 60-gigahertz, ten-gigabit networks locally in my house, and trickle-feed content in during unused hours over the broadband connection, because I’m going to have smarter algorithms for managing my content. Just like when the web started and we didn’t have caching, and then all of a sudden caches started getting smarter and smarter, and CDNs came along to handle that imbalance. So there are a lot of topology and architecture things we can do that change where the cost is distributed, if you don’t think of it as just a hub-and-spoke network.
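[Editor’s note: the “trickle-feed over unused hours” idea Chandhok describes is essentially a write-back cache with an off-peak flush. A minimal sketch, assuming an illustrative 2–6 AM off-peak window; class and method names are invented for illustration.]

```python
from datetime import time

# Assumed off-peak window for the broadband link.
OFF_PEAK_START = time(2, 0)   # 2:00 AM
OFF_PEAK_END = time(6, 0)     # 6:00 AM

def in_off_peak(now: time) -> bool:
    return OFF_PEAK_START <= now < OFF_PEAK_END

class LocalCache:
    """Serve content locally; defer cloud syncs to off-peak hours."""

    def __init__(self):
        self.store = {}
        self.pending_sync = []   # keys waiting for the off-peak window

    def put(self, key, value):
        # Writes land on the local (fast, e.g. 60 GHz) network first;
        # the broadband upload is deferred.
        self.store[key] = value
        self.pending_sync.append(key)

    def get(self, key):
        # Reads never touch the cloud.
        return self.store.get(key)

    def maybe_sync(self, now: time) -> list:
        """Flush pending items over broadband only during off-peak hours."""
        if not in_off_peak(now):
            return []
        flushed, self.pending_sync = self.pending_sync, []
        return flushed
```

The smarter the local scheduling algorithm, the less the hub-and-spoke broadband link matters—which is the cost-distribution point made above.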
Kirkpatrick: I don’t want to spend the rest of the time on this, but will other countries where the government makes a commitment to ubiquity of broadband accessibility have an advantage or not? I mean, in other words, there are countries—I mean Japan did a big thing and now certainly China’s doing some surprising things. I think we’re going to see a lot of countries that you might not expect, like maybe Indonesia, countries like that, starting to make big investments and guarantee certain things that we think of as not government’s purview, because we have such a hands-off attitude. I’m curious, as a Qualcomm guy, is that possibly going to be a disadvantage for the US?
Chandhok: I don’t think it’s—and I’m not really speaking as a Qualcomm guy here—I don’t think it’s that much of a disadvantage for the US. The ubiquity you’re talking about doesn’t really match up to the rates we were talking about. We’re working on a 1,000-times improvement in bandwidth; that’s one of our big projects. In the limit of that scenario, you have one person being served by one base station. That’s how you get the spectral efficiency you need to reach the speeds you’re talking about. And that’s the locality of reference—
Kirkpatrick: And that’s not inconceivable.
Chandhok: No, it’s not. But it means that we have to drive the cost of a base station to be as cheap as a phone or cheaper. So that as you walk through your house, you might talk to a different base station in every room. But you’ll get the full capacity of that spectrum.
Kirkpatrick: So government’s just going to be left behind at that pace of change—
Chandhok: No, there ain’t no such thing as a free lunch! You’re going to have that connectivity to what? If you want connectivity back to the cloud at that speed, at every point, you have a big fat pipe following you around. If you can compute things locally and then connect back at different times, in different places, there’s more complexity in the algorithms, but you get the efficiency.
Sankaran: I will say, though, in the example we talked about at the beginning with regard to DSRC, where we want vehicle-to-vehicle or vehicle-to-infrastructure connectivity: those governments that invest in creating those infrastructure points—where information can be routed back, collected, and aggregated, and where vehicles can connect to each other more effectively—are going to have a big advantage in developing new safety applications and potentially collecting data on traffic patterns, weather, all kinds of stuff that we’re not going to have in this country. I mean, I’ve been witnessing—
Kirkpatrick: So you would agree that we’re not showing a lot of vision?
Sankaran: No, I mean, I’ve been witnessing V-to-V conferences in this country for the last fifteen years, and the topics of those conferences seem to be the same as they were fifteen years ago. It’s because every time a transportation bill comes through, it’s about fixing potholes, not really about putting intelligence on our highways. Compare that to Singapore, where they’ve created dynamic pricing mechanisms and tollways and things like that, and are much further advanced. Or even parts of the world that have cameras that can monitor traffic patterns. We’re well behind on that. So those countries are going to have an advantage as it relates to software applications.
Kirkpatrick: Okay, interesting. I don’t want to go away from the audience. Go back to her, please.
F: Hi, my name is Jennifer. Going back to Rob—
Kirkpatrick: Can we get the mic on? It doesn’t seem to be on. Just keep talking, it should go.
F: Going back to Rob, you mentioned you don’t believe that everything should be in the cloud, it should be local and universal. Can you give specific examples of what should be local and what should be in the cloud and what devices should be both?
Chandhok: I’m not sure there’s a hard line, but I’ll give some examples. For one thing, media and private information that I want to distribute around my home don’t need to go in and out to the cloud. It’s why we stream audio over Bluetooth rather than going back to the cloud—there are technical reasons around latency and so forth. There are other, cloud-based services that handle our media, but to me it’s about things that are private. I may, for example, want a machine-learning system that monitors the patterns in my house, so that instead of having to build a set of rules—if this door opens, turn this light on, then turn these things off—it learns them. I might want that data to stay very private. For example, we did a project that looks at the patterns on your cell phone of how you use applications, and after learning for two or three days, it starts to manage how the applications do background syncing and actually gets you extra battery life. It’s in the Android marketplace; it’s called Snapdragon BatteryGuru. The challenge was to build a machine-learning system that didn’t use more power than it saved. But what’s cool about it—what I’m really proud of—is that it doesn’t use the cloud. It’s not crowd-sourced, because each of our patterns is very individual: my pattern is very predictable, but it’s very different from yours. And with respect to EU privacy laws, it actually helped us a lot that we could put this app out into the marketplace, because it doesn’t send any information off the local device. So I actually think it’s going to be privacy first, and later system scalability. That’s the example I would give.
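[Editor’s note: the on-device learning Chandhok describes—observe usage locally, then gate background syncing to the hours an app is actually used—can be illustrated in a toy form. This is not BatteryGuru’s actual algorithm; the class, threshold, and method names are assumptions. The key property is that no observation ever leaves the device.]

```python
from collections import defaultdict

class UsagePatternLearner:
    """Learn per-app usage hours locally; allow sync only in those hours."""

    def __init__(self, threshold=3):
        # counts[app][hour] = observed launches of `app` in that hour of day
        self.counts = defaultdict(lambda: defaultdict(int))
        self.threshold = threshold  # launches required before an hour "counts"

    def observe_launch(self, app: str, hour: int):
        """Record a local observation; nothing is sent off the device."""
        self.counts[app][hour] += 1

    def should_sync(self, app: str, hour: int) -> bool:
        """Permit background sync only when the user actually uses the app."""
        return self.counts[app][hour] >= self.threshold
```

Because each person’s pattern is individual, as Chandhok notes, there is nothing to gain from crowd-sourcing this model—and keeping it on-device sidesteps the privacy question entirely.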
Rogers: From an equipment perspective, you can have supervisory recommendations come out of the cloud, but when you think in terms of safety-based control systems, you’d always want that on the edge. Simply because of a latency issue.
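[Editor’s note: the split Rogers describes—advisory setpoints from the cloud, hard safety enforcement at the edge—amounts to keeping the safety path free of any network round-trip. A minimal sketch; the limit value, names, and shutdown behavior are illustrative assumptions, not a GE control design.]

```python
class EdgeController:
    """Local control loop with a safety limit that is never delegated."""

    def __init__(self, safety_limit: float, setpoint: float):
        self.safety_limit = safety_limit  # enforced locally on every cycle
        self.setpoint = setpoint

    def apply_cloud_recommendation(self, recommended_setpoint: float):
        """Supervisory input from the cloud: advisory only, clamped locally."""
        self.setpoint = min(recommended_setpoint, self.safety_limit)

    def control_step(self, measured: float) -> float:
        """Runs at the edge regardless of connectivity or latency."""
        if measured > self.safety_limit:
            return 0.0              # immediate local derate/shutdown
        return self.setpoint        # otherwise track the current setpoint
```

Even if the cloud link drops or lags, `control_step` still enforces the limit—which is why latency-sensitive safety logic stays on the edge.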
Kirkpatrick: Okay, I want to move quickly, so quick questions and quick answers, please.
M: I’m Carl Hewitt; I’m an academic and an entrepreneur. It seems to me there’s a huge drag on all this, namely the productivity of our ability to write software. The increase in software productivity over the last decade is unmeasurable. It’s a cottage industry, it’s all done manually, and at this rate we won’t get to your dream of the future. I’m quite confident in the hardware engineers. Software can’t do it. Sorry, Charlie.
Kirkpatrick: That was a comment in itself, but go ahead.
Chandhok: Yes, you’re right. Am I disappointed that we’re still programming in C in 2013? Yes. I’m trying to encourage people to use Go and Haskell and Erlang. That’s going to be the kind of leap. But you know what the problem is? We’ve made the computers fast enough that people can be inefficient software programmers.
Kirkpatrick: Well, he’s a pretty knowledgeable guy, so we’ll call it a comment not a question.
Evans: I disagree, by the way.
Kirkpatrick: Well if you can say it in fifteen seconds—
Evans: Well I think it’s just a focus. We’re focused solely on software productivity, and we recognize it is an issue.
M: No progress.
Evans: Super amounts of progress.
Kirkpatrick: Okay, that can wait for later.
F: All right, I’ll jump in. Jessi Hempel with Fortune. So security issues have become more and more of a concern. Certainly I’ve been writing about them a heck of a lot more as the world becomes more connected. What needs to happen so that security doesn’t derail all of the cool things you’re talking about?
Kirkpatrick: Well, you were touching on that, Rob, with some of the things about keeping stuff in the house instead of sharing it widely, implicitly, right?
Chandhok: In all of this, I’m assuming your jet engines are secure.
Kirkpatrick: You better work hard on that one.
Kirkpatrick: But quickly, I think that’s a very substantial question.
Sankaran: So beyond the basics around encryption and all of those things, architecture really comes to the forefront—really thinking through the different scenarios of how you design the vehicle and how you look at its different functions. For us, there’s a consumer-facing function, really around entertainment and video, which can be very exposed to potential security threats, especially as you connect it to the Internet; but there’s also the control system inside the vehicle, with embedded controls. So making sure—and this is how we do our architecture—that the firewalls between those different sides of the system are very robustly secured is absolutely critical. But as we move into a world where the threats are different, we’re all going to have to think not just about inside the vehicle but all around us—more of a sense-and-respond, autonomic-learning model, more like the human body in terms of how we think about security. We’ve been working with companies like Cisco to really rethink the paradigm of security, from building walls around your fortress to something where you can look at invaders at multiple levels of abstraction. So architecture is going to be the key.
Rogers: Quick. I would argue that connectivity is actually going to help us with security. Because the best thing you can do to ensure security is to be able to upgrade the software. Because people are going to find surface area.
Kirkpatrick: And as the host I can tell you that this will come up again in a few minutes, at some length. So thank you for bringing it up now. So many more things we could discuss. I wanted to ask whether this whole chain of events, this whole evolution we’re talking about, is going to happen very, very inconsistently across the planet, kind of going to that national question I asked before? Or whether it might get spread quite widely as everybody increasingly has a smart phone? Maybe I do want to ask that really fast to the Qualcomm guy, or maybe Dave can quickly touch on that too?
Chandhok: I think it’s actually going to happen pretty fast.
Kirkpatrick: But widely across the planet?
Chandhok: Yeah, because I think it’s going to be driven by mobile devices as the computing platform, and then I think as Alex pointed out, the cost of actually providing this other kind of connectivity is coming down. What will gel it is more standard interoperability, not seven different radios to do the same thing.
Kirkpatrick: So we’ll see this in Nigeria as well as the UK?
Evans: Well, in some cases Nigeria might advance faster than first-world countries. I think we’re going to see some common threads—mobile phones, for example, will drive certain adoption—but in other areas we’ll see a thousand flowers bloom: different sorts of pockets of rational experimentation.
Kirkpatrick: Sadly, we’ve got to wrap. Thank you so much guys. That was a great, wide-ranging discussion.