What to Expect in 2026

Techonomy16 conference in Half Moon Bay, California, Wednesday, November 9, 2016. (Photo by Paul Sakuma Photography) www.paulsakuma.com

Speaker

Rachel Maguire
Research Director, Health Horizons Program, Institute for the Future


Rachel Maguire, Research Director at the Institute for the Future, gives a glimpse into the future of the health tech ecosystem. Where are all this data, analytics, and the cloud taking us? How will AI, automation, blockchain, and robotics impact healthcare? How will these innovations integrate into standards of care, and how broad will their impact be?

Kirkpatrick: Rachel Maguire is the Research Director at the Institute for the Future, a very eminent, long-standing institution out here in Silicon Valley with an enormously good track record of looking at what is going to happen in the future, which is what Rachel is going to try to help us understand when it comes to healthcare.

Maguire: Well, that is a tough and lofty challenge. I was thinking last week, when I was finally able to sit down and think, about what key idea I'd try to offer up in the ten minutes or so I have here as my contribution to this conversation. It's such a remarkable group of leaders and doers in the health and tech spaces. I re-read the agenda that Techonomy had put together, and there's a question in my overview that asks, "Where are all this data, analytics, and the cloud taking us?" That question made me think of the line often attributed to Marshall McLuhan, the social philosopher: "We shape our tools, and thereafter our tools shape us." If we're willing to submit that our work here today, and in these kinds of conversations, is to figure out not just the tools we're shaping but how we can anticipate they're going to shape us, as people or as participants in this health and healthcare space, I think that asks a different question. So what I'll offer up today is really a thought experiment for you all. Right now we're shaping the tools; we've got some of the leaders in this room who are in the business of shaping these tools and figuring out what they're going to look like. As we release them into the wild, which is to say, as we put them in our hands, how is that going to shape the way we confer authority and expertise in health and medicine over the next decade?

That’s my question for you all, and I’m going to unpack it a little bit with some of the ideas we’ve been playing around with at the Institute for the Future. To do that, I’m going to throw out a name: Abraham Flexner. For the physicians in the room, and anyone who studied public health in the United States, or US-trained physicians, I should say, this name means a lot to you. For those of you who have never heard of him, here’s the quick history: In the early 20th century the American Medical Association reached out to the Carnegie Foundation and said, “Can you look into the allegedly deplorable situation of medical education in the United States?” Abraham Flexner was an American educator at the time, and he led a group that investigated all 155 medical schools then in operation. His report verified the allegations everybody already knew about: that medical education in the United States was absolutely deplorable. There are all these famous stories from the report, of dissecting rooms that were also used as chicken yards. One of the most notorious stories is that he asked a Harvard medical school professor, “Why don’t you provide written exams?” and the professor responded, “Well, fewer than half my students can read or write, so there’s no need to test them through written exams.”

Anyway, this very comprehensive report had a lasting impact on how we train medical doctors today, and also on policy. Over half of the medical schools in operation at the start of the 20th century were shut down. That included, by the way, medical schools directed at women and people of color. The greater and, I think, more lasting impact of the Flexner report is that it drove the laws that created our state medical boards, as well as the laws establishing strict licensing and credentialing in medicine. When you think about what those laws did as they went through, it's three things: they determined how we become doctors; they determined who (and "who" is very critical over the next ten years), not what, has the right to certify an illness; and they determined who (again, not what) can dictate the standard of care and treatment protocol for an illness. In many ways you could argue that the Flexner report set structural authority, and with it what T.T. Paterson, the British scholar, calls sapiential authority: we conferred all our knowledge-based authority on physicians. I think when you look at the next ten years, and at how this emerging health tech ecosystem, as Techonomy calls it, how things like AI, blockchain, and the internet of things are generating new knowledge (David Weinberger calls it new forms of knowing, right?), the question is: how do we start to unpack this sapiential authority we've given physicians and spread some of that knowledge-based authority to other entities, including, in this case, objects, moving forward? What the Institute has done is identify four new sources of authority.

The first one we call computational authority. You could think about it a number of ways, but I think the easiest way in is to look at Max Little’s work at MIT on the Parkinson’s Voice Initiative. For those of you who don’t know Dr. Little’s work: using the digital microphone on a smartphone, voice recognition software, and machine learning, he built an algorithm that can determine where you are on the spectrum from ill to healthy when it comes to Parkinson’s, just from a sample of your voice. In other words, his algorithm can anticipate where you are, whereas right now we depend on observational diagnosis: we put people through a series of tests, and the human eye and those tests determine your risk for Parkinson’s, or where you are on the spectrum.

And Dr. Little is not an M.D. He’s the first to say he doesn’t know anything about Parkinson’s as a biological or neurological disease. He says, “I’m an applied mathematician, and in my mind, this was a math problem.” Microsoft calls cancer a computer problem, right? They think they’re going to be able to solve cancer in ten years, and they’re taking a real computational approach to it. They just opened their first wet lab, I think last summer. This kind of computational authority, in our minds, is one of the spaces emerging over the next decade, shaped by the technologies we’re seeing. The question is: from a structural authority point of view, how are we going to make space for that kind of authority to participate in health and medicine?

Now there are three other authorities that I’m not going to spend too much time on, because I really just wanted to provoke you all to think about how these new tools will shape authority. But I think they’re worth mentioning, particularly given what happened last night. The second authority we see coming along with these new technologies is networked authority. Who, or what, has the understanding and knows how to activate networks, whether of people or of devices, to get them to do something we all want done? Too often we talk about behavior change as an individual problem, when really the effective change agents are those who understand how to activate networks of people, how to move important nodes, to influence and nudge groups of people to change their behavior.

We see this notion of networked authority playing out first in things like clinical research. Look at who’s having success in clinical trials and how they’ve fundamentally revolutionized how quickly you can get a trial together; some of the work Stanford is doing with Apple HealthKit has changed the game in terms of how quickly you can put together a research study. Then there’s ambient authority, and I think this picks up where that last panel just left off. When we think about the internet of things and the future, what’s been fascinating to watch about the Amazon Echo is not what health systems are doing with it, but what people in their homes have done with it. They brought the Echo in for entertainment, or for whatever reason brought it into their home. But when you read the reviews, you start to see how many caregivers are using the Echo as a tool for a person who’s aging in place. Someone who needs care at home and hasn’t been able to turn on the lights in his or her own room for the last twenty years can, with an Echo hooked up to Philips Hue lights, now do all of these activities independently: turn on the TV, access the information they want, all using voice recognition. They don’t list it as a health tool. When you do ethnographic research and ask people what tools they have in their house that they use for health, they don’t mention things like the Echo at first, but when you ask them how they use the device, more often than not, if they’re managing a chronic disease, they’re using their digital technologies as critical health tools. That’s what we call ambient authority. How are we going to confer ambient authority on these devices, and on people like designers who really understand how to move the needle on the environmental and social determinants of health?

The final source of authority is narrative authority, and here we’ve drawn a lot of inspiration from Eric Topol’s work, especially the commencement speech he gave at Baylor after he himself had been a patient. He talked about how, in an era of big data, of labs and tests and more information than we could ever imagine, the true authority will come from who gets to tell the story. When are we truly going to confer narrative authority on people, not patients, not consumers, but people, so that they tell their story and it’s their story that carries the authority? Because at that point, maybe we’ve taken all of these emerging technologies and figured out a way to confer authority and expertise on the individual, so they can make the choices that make sense for them when it comes to their health.

Those are the four sources of authority that will be shaped, and that will shape us, as we shape these technologies: computational authority, networked authority, ambient authority, and narrative authority. If I come back to that first question of where it’s all going to take us over the next ten years, I suppose my snarky response is: “To $4 trillion. Or $4.2 trillion, why not? We can just keep spending.” But until we figure out how we’re going to change our understanding of expertise and knowledge in this space, and make way to confer authority on new entities that align with these emerging technologies, we’re just going to create duplicate systems. We’re basically going to do what we did with the home pregnancy test. Women now get to take our own pregnancy test at home; we get to find out the information on our own. But the first thing our physician does is redo that test the second we get into the office. So I think we really run the risk of a future where we have a connected pregnancy test at home, but our physician, for legal reasons or whatever else, reruns that test the second we enter the health system and redoes everything we did at home. We get these duplicate systems, and that’s how we get to $4 trillion in ten years. That’s our risk. If we don’t take seriously how we shape authority, along with shaping these tools, we’ll get ourselves an incredibly tech-savvy, but also incredibly expensive, health system.

[END]

Transcription by RA Fisher Ink
