COP26 Was a Warning. Digital Technologies Can Help

Industrial tech shows a proven pathway towards net-zero emissions while driving profitable sustainability for companies. The Internet of Things, AI, and digital twins are critical tools.

The recent urgent climate discussions at COP26 in Glasgow reinforced our understanding that current climate pledges by countries do not go far enough. Even with the latest Nationally Determined Contributions (NDCs) from 192 parties, the world can expect a mean temperature increase of 2.7°C by 2100 and a roughly 16% rise in global greenhouse gas emissions (GHGEs) by 2030, relative to 2010 levels. Every tenth of a degree of warming makes a difference if we are to avoid catastrophic climate change. Governments and businesses must act urgently to accelerate the transition to a net-zero carbon economy.

While disruption is inevitable, this moment offers an opportunity for incredible innovation. Research increasingly indicates the path to continued profitability lies in pivoting to sustainable growth.

Getting the U.S. on the road to a carbon-free economy will require $2.5 trillion in spending over the next decade, one authoritative study concluded. But we’re already seeing the results of early investments: clean technologies created $1 trillion in value for investors in 2020, reports the World Economic Forum. New investments aside, efficiency improvements can improve sustainability indicators across existing value chains by optimizing performance and identifying areas to reduce waste and GHGEs. Technology offers the toolkit for both use cases, and crucially for industrial leaders mapping a course for a sustainable future.

This mission for industry has been my passion for most of my professional career. It’s been particularly prominent for the last decade at Schneider Electric, where I was EVP for industrial automation until last May when I took the role of CEO at AVEVA, whose industrial software helps leading multinationals save 15-30% in energy costs, reduce carbon dioxide emissions by 9-15%, and cut industrial downtime and waste by 25%. AVEVA’s experience working with over 20,000 industrial customers across more than 40 countries has given us extensive insights on the data-led innovation critical to driving performance, profitability and sustainability. Here are three ways industrial companies are doing so.

Leveraging data to optimize the entire value chain

Aker, a Norway-headquartered leader in sustainable energy solutions, created the first plant of its kind to transform captured carbon dioxide into fertilizer. The plant not only enables Aker to significantly reduce CO2 emissions from industrial flue gases; it turns those emissions into productive assets. The captured CO2 can also be transported by ship or pipeline for permanent storage elsewhere. Aker is now sharing that know-how as an offering for other companies.

This innovation could have only happened by embedding digitized data across the entire operations lifecycle, and then using it to optimize production, improve efficiencies, and design new processes. AVEVA worked closely with Aker on this carbon capture engineering technology, and we extended our partnership this year so that other companies can leverage it at a global scale.

These types of innovations are possible and practical at a time when 80% of instruments in plants are connected to the internet of things (IoT). This data is essential currency for industrial enterprises in mapping every step of the production process, allowing complex businesses to develop a meaningful digitization strategy and execute it on the ground. Platforming this data across the organization and applying predictive artificial intelligence (AI) is also helping industrial companies and manufacturers optimize operational performance and reduce downtime while tracking for sustainability goals.
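To make the idea concrete, here is a minimal, hypothetical sketch (in Python, not any vendor's actual product) of the kind of analysis that predictive AI on platformed IoT sensor data relies on: flag readings that drift sharply from their recent baseline so a problem can be addressed before it becomes unplanned downtime. The sensor values, window size, and threshold are all illustrative.

```python
# A minimal sketch of predictive maintenance on streamed sensor data:
# flag readings that deviate far from their rolling baseline so a work
# order can be raised before the asset fails. Values are illustrative.
from statistics import mean, stdev

def flag_anomalies(readings, window=20, z_threshold=3.0):
    """Return indices of readings that deviate sharply from the rolling baseline."""
    flagged = []
    for i in range(window, len(readings)):
        baseline = readings[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and abs(readings[i] - mu) / sigma > z_threshold:
            flagged.append(i)
    return flagged

# Example: a pump's vibration readings with a late spike that would
# warrant maintenance before unplanned downtime.
vibration = [0.51, 0.49, 0.50, 0.52, 0.48] * 5 + [0.90]
print(flag_anomalies(vibration))  # -> [25]
```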

Digital Twins help map sustainability metrics to industrial data

We already appreciate that data-led technologies can similarly be put to work in the battle against climate change. What is harder is tracking and monitoring greenhouse gases across the value chain so that industries can reduce carbon emissions with a systematic, comprehensive strategy: one that meets net-zero targets, embeds sustainability into every facet of the business, and drives meaningful innovation.

To help calculate risk/reward scenarios both for financial gain and for ESG benchmarks, we employ Digital Twin technologies, which create a virtual representation of physical assets so that companies can predict the outcome of various operational changes. Digital Twins are vital in enabling step-by-step strategies toward measurable improvement for long-term change. The results also give stakeholders across the organization a common goal.
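As an illustration of the concept, the sketch below models a single pump as a very simple "twin" that can be run under hypothetical operating set-points to compare energy and emissions outcomes before anything changes on the real asset. The pump model, affinity-law approximation, and emission factor are assumptions for illustration, not any specific product's method.

```python
# An illustrative digital-twin sketch: a software model of a physical asset
# that can be simulated under what-if operating conditions. Coefficients are invented.
from dataclasses import dataclass

@dataclass
class PumpTwin:
    rated_power_kw: float          # nameplate power draw at 100% speed
    co2_kg_per_kwh: float = 0.4    # assumed grid emission factor

    def simulate(self, speed_pct: float, hours: float) -> dict:
        # Affinity laws: pump power scales roughly with the cube of speed.
        power_kw = self.rated_power_kw * (speed_pct / 100) ** 3
        energy_kwh = power_kw * hours
        return {
            "speed_pct": speed_pct,
            "energy_kwh": round(energy_kwh, 1),
            "co2_kg": round(energy_kwh * self.co2_kg_per_kwh, 1),
        }

twin = PumpTwin(rated_power_kw=75)
# Compare running a cooling pump at full speed vs. trimmed to 80% for a week.
for scenario in (100, 80):
    print(twin.simulate(speed_pct=scenario, hours=24 * 7))
```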

A good example is Henkel, a manufacturer of household products such as Persil laundry detergent. To support its customers and align with its sustainability commitments, Henkel built a digital backbone that connects its global operations in the cloud: 3,500 sensors at each site provide 1.5 billion data points, helping the company meet fluctuating demand while reducing energy usage. To date, Henkel has reduced its environmental footprint by one-third, using less energy and water and producing less waste.

Partnerships across the ecosystem to realize economies of scale

Technology cannot win the battle on its own. Alongside digitalization and innovation, cross-sector partnerships are fundamental in supporting organizations as they develop, invest and deploy technology to meet the world’s net-zero climate ambitions. Accelerating progress on sustainability requires breakthrough collaboration with suppliers and competitors to innovate at scale.

I’m particularly proud of AVEVA’s long-standing partnership with Microsoft to help customers facilitate sustainable business outcomes in industry by leveraging cloud technologies and big data analytics. The two companies, for example, support the energy transition vision of TechnipFMC, a leading technology provider to the energy industry. We work together on a range of sustainable implementations including green hydrogen, floating liquefied natural gas and other global projects. A joint solution deploying the AVEVA E3D software ran a dynamic flare header simulation that helped reduce the amount of steel used in the ultimate design by 34%. The exercise saved TechnipFMC €20 million, helped optimize the value chain and minimized the use of carbon-intensive resources.

These digital tools support collaboration across organizations and have proved their worth over the pandemic. Extending these solutions to partner organizations can help business leaders drive innovation, achieve systemic gains, and deliver enhanced value for stakeholders.

Make the choice to prioritize sustainability

Regulators and consumers are already pressing brands to embed sustainability across their operations. About 80% of major international companies now report on sustainability, and thousands of enterprises, AVEVA among them, have committed to net-zero emissions by 2050 through initiatives such as the Race to Zero and Business Ambition for 1.5°C. Now comes the hard part of achieving those objectives.

COP26 has made it clear that we must act now to protect the planet for ourselves and for our children.  UN General Assembly President Abdulla Shahid’s words from the opening of the conference still ring true: “We are entirely capable of turning this around, if we so choose.” We do. And we must.

The Evolution and Death of the Electronic Medical Record

The EMR we have come to know does a very important thing, but it's absolutely nothing compared to what it could, should, and maybe will do in the very near future.

This may sound odd coming from someone like me who has long worked in healthcare technology, and especially as the literal co-founder of an EMR company, but in the same way that we saw paper patient records heading toward extinction decades back, the EMR as we know it is beginning to wobble. It’s like an aging boxer in round 13 taking on a younger, faster opponent.

The word EMR is a noun, a static thing. In truth, when we use the term, we far more often have an active idea in our minds, making it more like a verb. And in fact, many paragraphs and entire books are dedicated to describing and understanding what an EMR actually is. The U.S. Government has gotten in on the act, with a corpus of literature, revisions, artifacts, checklists, commentary, and guides – all to define this one thing. 

So I ask you: What is an EMR?

It’s easy to hearken back to the pre-digital era and remember the patient chart, stored in a filing cabinet in our doctor’s office. Its simplicity and the simplicity of that analog era may seem quaint, but it was easy then to define a medical record.  It was one individual doctor’s record of what had been seen and done with their own eyes and hands for a given patient.

The promise of the electronic era was romantic and ambitious – to break down the silos of a world before computers and build a composite, complete picture of the core clinical information about a patient. It would encompass the work of many hands and eyes of all the doctors who touch a patient – untethered from geographical boundaries.

But that's not what's happened. Instead (and unsurprisingly) form has followed function. EMRs became mechanical turks for the original myopic recordkeeping of the doctors of yesteryear. Only this time, to pay for their added time and expense, they were way "more so." EMRs enabled far more documentation than a doctor's hand would ever bother to capture (thus nudging the billing codes a little higher). Anything the hands and eyes of a doctor did to that original paper chart was fully captured in highly compliant and extremely legal bloat-script. And to do that, EMRs included lots of other things we somehow added to their original definition. Like a deranged and perhaps half-blind baker, we threw together a recipe – a pinch of front-end workflow, a dash of software processing, and of course a handful of storage capabilities. We mixed it together haphazardly, and then we required it universally.

I think of today’s EMR as a three-layer cake.  

Layer one: The Workflow. You know the drill: Check-in (“insurance card please”), intake (“Smoker?”, “How many packs?”, “Can you step onto this scale?”), exam (“Cough please”), orders (drugs, labs), check-out (“that’ll be $95 and please take a card to schedule your radiology from the stack in the sliding window”). These are the step-by-step tasks that the EMR software walks staff and providers through in order to produce a consistently-documented and compliant encounter, one that does not lend itself to lawsuits but is as billable as possible. 

Layer two: The Compute. The data captured through those workflows in the first layer is then used to gonkulate new info.  CPT codes are calculated and defended by ICD codes, interview data, and time logs. “Meaningful Use” metrics are calculated and attested to, for quality, of course, and of course also for other payments. Quality scores are gonkulated again as referral forms and automated “consult letters”. Tremendous amounts of compute are funneled single-mindedly to turn the clinical experience into financial outcomes, whether fee-for-service or “value-based” care.

Finally, layer three: The Storage.  Unfortunately, this is not the storage we might want, a longitudinal profile of the patient with relevant aspects updated frequently. Instead, it is the storage of all of the above metrics generated by the compute, geared simply to be ready for audits and questions.  Armed with this storage, providers hope desperately not to be embarrassed when patients come back expecting to be remembered. This has led to many EMR cartoons that feature blind men and elephants, as even the most wizened experts struggle to adequately identify what EMR they are looking at. 

All this has contributed to an enormous explosion of healthcare vertical monopolies (and related price increases) across the country. Many have rationalized that if an  EMR is only to be a “core sample” of the Workflow, Compute, and Storage of one physician’s experience, then the logical thing to do is to move all the physicians a patient sees under the same EMR roof. Such consolidation has created a mirage of “continuity of care”. 

As EMRs have matured and regulators have blanched at the silos created by their original mandates and incentive programs a decade ago, the ecosystem has slowly nudged itself towards sharing. Record sharing (originally mandated by HIPAA) is becoming a norm, with common sharing services like Commonwell yielding a 20% match rate for Epic record queries and a 70% match rate for Athenahealth (the company I founded). It's increasingly common now, as a doctor looks at their patients, to find externally-generated data in a patient's EMR. Such a record is not machine-readable, allows no alerts to fire on that data, and leads to few algorithmically derived insights. But it's a helluva lot more than nothing, even as it's a helluva lot less than we deserve.

My summary is that EMRs are OK as they are today, but absolutely nothing compared to what they could, should, and maybe will be in the very near future.

So what should EMR mean? Let’s go back to those three aspects: Workflow, Compute, and Storage. 

First, we need to liberate that Workflow layer which today means a thousand things.  Most of the workflows are unrelated to moving people through old-school office visits. In old EMR speak, we called this concept “the encounter,” and it was tied tightly and rigidly to care delivered over an exam table or hospital bed. 

The new workflows are different. They are the always-on, often nearly free, constant contacts of the digital health age. Think of such workflows as the “edge computing of care”. K Health and Buoy Health will give you free bot-doctoring that is getting better and better. And a dozen more digital health players will give you weirdly-affordable super-fast online team-based care.

The new explosion of health communication means text threads, care plan alerts, and group messaging, which offer ongoing relationships and community with providers and with patients like you. These workflows will be more important than office visit workflows going forward. The original workflow engines, so long a part of traditional EMRs, should also be able to write to our records. But so should thousands of additional applications and sources.

All of this demands a totally different workflow engine. What about passive monitoring of exercise, or steps, or glucose levels? There are whole companies yet to be built that are centered on new care modalities that have nothing to do with exam rooms. 

Now let’s relook at Compute.  Sure, we should be able to compute codes and bill them. Why, though, are we stuck with EMR’s Compute being entirely geared towards claims? New age providers are increasingly creating products that don’t fit neatly within the world of the CPT code. Many are asking for a monthly subscription fee from employers and payers.  Then those organizations purchase pieces of their care product that they don’t provide with their own staff, not unlike a general contractor with subcontractors.

New age providers will need to buy care as often as they bill for it. Where is the Compute to direct that spend? Where are the ledgers?  Back in the third party administrator days in California, there were a few rudimentary capitation management apps, but they are ancient and not connected to anything.  Firefly Health, the company where I serve as executive chairman, bills employers for its care but also pays specialists for jumping in on sticky cases, which eliminates the need for a full-on referral.

There is also a whole category of marvelous Compute that will come as we more routinely ingest social-determinants-of-health data, which is super important. The dominant providers and their fee-for-service (FFS) agreements are technically interested in such things, but cannot make such things core to their systems and processes. There is simply not enough return for FFS-dependent providers to incentivize radically reducing the fees their patients generate.

Other Compute could surveil populations for what’s working and what’s not, for the purposes of making real-time trade outs. Again that’s not a thing traditional FFS business models can afford to care about, but will be a major win for all new age, value-based players. As more and more healthcare players are born digitally native, there will be more and more APIs to call to calculate care coordination. For example, can you schedule across all the siloed electronic calendars in medicine today? Looking outside healthcare, Kayak has no problem interacting with all the calendars of airlines worldwide. Such discrepancies point to a dramatic misappropriation of where we currently spend our industry’s Compute.

Finally, Storage. What should we store? Core samples? Blind man soundings? Of course not. The term EMR should refer to the record and only the record – a complete operational data store for every American. Meanwhile, digital health companies today are intentionally narrow. Ria Health, for example, makes no bones about focusing solely on alcohol treatment. They have no plans to do a middling job at X-rays and colonoscopies and all the other things traditional IDNs seek to cobble together. They plan to do alcohol treatment well and, to do so, they need a really complete, comprehensive and continuous view of their subjects.

We should all have an EMR, and it should operate like our LinkedIn profile – when we make a change to it, everyone following us should be able to know right away.  There can be no “Save as PDF” and re-sending to friends and prospects. This is what digital-native companies must have. Such companies have no interest in guarding their evidence of care on a patient. In fact, they know they do a much better job when their understanding of members is complete. In that way, they are natively sharers, for both selfish and inherently good reasons.
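A toy sketch of that "follow the record" idea, under purely illustrative assumptions: a single patient record notifies every subscribed provider the moment it changes, rather than static copies being exported and re-sent.

```python
# Illustrative publish-subscribe sketch of a shared patient record:
# every follower is notified immediately when the record changes.
class PatientRecord:
    def __init__(self, patient_id):
        self.patient_id = patient_id
        self.entries = []
        self.subscribers = []          # callables notified on every update

    def subscribe(self, callback):
        self.subscribers.append(callback)

    def add_entry(self, source, entry):
        event = {"source": source, "entry": entry}
        self.entries.append(event)
        for notify in self.subscribers:
            notify(event)              # every follower sees the change right away

record = PatientRecord("patient-123")
record.subscribe(lambda e: print("primary care sees:", e))
record.subscribe(lambda e: print("alcohol-treatment program sees:", e))
record.add_entry("urgent care", {"type": "lab", "a1c": 6.1})
```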

In my old EMR life, we talked about the scalability of technology. Today, “extensibility” is just as important. Can a tech-native provider build its own workflows and have them seamlessly co-exist with “rented” ones? Can they do the same with their compute? And of course, as with Spotify, can they dispense with keeping an out-of-date, memory-hogging copy of just their own songs alone and instead get a database of all the songs in the world? We know how to do this shit. In fact, in parts of our lives that we care about much less deeply, we do it all the time. 

Simply put, the EMR we have come to know does a very important thing. But god willing, we will need it less and less. EMRs grew up to do a certain job, at a certain time, in a certain place. They are not wrong or bad, but they are closely related to the rapidly-diminishing office-based, fee-for-service world. As digital health companies move from being cool parlor tricks and gadgetry to the mainstream, let’s hope that “place” moves into the rearview mirror.

 

With upstart EMR provider athenahealth, Jonathan Bush was one of the pioneers of the digital health movement enveloping healthcare today.  He is currently an active investor in numerous young healthcare companies, is CEO of Zus Health and Executive Chairman of Firefly Health.

Bridge the Digital Divide For a Brighter Future of Work

Preparing for the future of work means creating a socially responsible supply chain, a more equitable and inclusive workplace, and a more diverse workforce at every level. The key: bridging the digital divide.

It goes without saying that companies have immensely changed the way they think about the future of work. The old thinking was dominated by narrow business imperatives and focused largely on delivering for shareholders. The new thinking is much broader and expects businesses to create value for the long-term and for a much wider set of stakeholders.

This new paradigm has motivated business leaders to imagine a future of work that’s not just efficient and productive, but also more fulfilling, healthy, equitable, and sustainable. Increasingly, customers, partners, employees, and investors expect companies to leverage their power as a part of society perceived to be both ethical and competent in order to advance social good.

Today, preparing for the future of work means creating a socially responsible supply chain, a more equitable and inclusive workplace, and a more diverse workforce at every level. This shift in philosophy is welcome and necessary, but it doesn't go far enough. We – and I'm speaking particularly to my fellow technology leaders – have an opportunity to make a once-in-a-lifetime difference in the workforce. The key: We must work to bridge the digital divide.

The digital divide hinders innovation

Many in the tech industry believe strongly in technology’s potential to create a more just world. I’m one of them. But I must concede that technology inequitably distributed is more likely to reinforce the power imbalance than to correct it. The internet can’t really democratize information when only about 60% of the global population uses it. In some of the poorest countries in the world, that figure drops below 20%.

In the United States, computers are ubiquitous in households earning more than $100,000 a year. But among adults with household incomes below $30,000 a year, roughly 41% have no computer at home. Or consider the fact that in many countries, the video calls and conferences that have become a necessity in many of our lives are prohibitively expensive. Looking at mobile internet users, the research firm Cable found that a one-hour video call cost about $4 or less on average in the UK, but in Malawi it would be $14. Think of the children who are growing up in these wildly differing households. As tech becomes even more central to every career, which of these children will be more likely to land a job with flexible hours, extended parental leave, and a strong employee development program? It's self-evident that the future of work can't be fully transformative until it's fully inclusive.

There are compelling reasons for tech companies to advance digital equity besides desiring a better world. The tech industry obviously wins when our innovations see wider adoption. Even more importantly, we have much to gain from the existence of a diverse global talent pool. The people who will benefit from expanded digital equity aren’t just would-be consumers; they’re potential innovators. The digital divide robs the disadvantaged of access to education, health care, and opportunity. And it robs everyone of the innovations many of these people would create if given the chance.

Bridging the divide requires a sustained, cooperative effort

To bridge the digital divide, companies need to consider their work for digital equity as part of their business model. We need to recognize it as a goal we must meet together – part of the expanded brief of the modern company. And we must think about expanding tech access for citizens around the world with the same standards we apply to our own workforces. That means prioritizing best-in-class technology, from intuitive hardware to security-enhancing software.

There is no thermodynamic law driving the proliferation of digital access. You can’t gift a school district or an NGO a few thousand laptops and expect a transformed community the next time you drop in. The key is thinking holistically and thinking long-term. Companies committed to bridging the digital divide need to recognize that digital equity is about sustained access to current hardware, software and broadband service – as well as the skills to put those technologies to use.

Throughout these efforts, it’s important to keep the focus on the people whose lives we’re trying to improve. I serve as chief commercial officer for HP, and The HP Foundation tackles the digital divide by offering free IT and business skills training courses in seven languages through the HP LIFE program. The program measures success in learning outcomes that increase employability and foster business creation. Google’s Next Billion Users initiative conducts extensive research on the needs of new internet users around the world, to tailor its product offerings. And, crucially, it shares much of what is known about making technology accessible, relevant and empowering in different parts of the globe.

No single company can be expected to achieve global digital equity all by itself. But making any measurable impact, even in small communities, will require us to work together, making meaningful multiyear commitments to ensure the citizens of the future can access the information they need to live, work and innovate – no matter where in the world they are. 

Christoph Schell is the Chief Commercial Officer (CCO) and a member of the Executive Leadership Team of HP Inc.

Start at Step 7: Accelerating Innovation in MedTech and Life Sciences

Good innovators rarely start from scratch. They build on a platform of capabilities, tools, and insights already available. It’s time for health care to step up.

In the history of innovation breakthroughs, some myths remain strong. In the hardware world, game-changing devices get built in the garage. For life sciences, the eurekas come in basement labs. In these origin stories, the heroes are usually a pair of scrappy, go-it-alone engineers or scientists. They build everything from scratch and on their own, giving rise to the Apples, HPs, Medtronics, and Roches of today.

Yet, that’s not how innovation works—or has ever worked. The Bakken brothers built their first portable pacemakers by collaborating with customers who had decades of experience and know-how. Hoffmann-La Roche launched the modern pharmaceutical industry by bringing the recipes of local apothecaries to scale with industrial mass production. Jobs and Wozniak pulled ideas from Xerox, IBM, and the Homebrew Computer Club to catalyze the personal computer revolution. Their ideas and determination changed the world.

In health care, this understanding is just beginning to take hold, and it will profoundly change the way the best organizations differentiate themselves in the market and amplify their value and impact.

Innovation Without Reinvention

Ask any startup entrepreneur today whether they discovered or invented all the technology that goes into their product or service, and you’ll likely get a blank stare. They don’t start at “Step 1,” building everything they need from the ground up. They start at what I call “Step 7:” working from a platform of capabilities, tools, and insights already available. This allows them to focus as much of their resources and attention as possible on the unique value they (and only they) can bring to the market.

It’s a no-brainer. Resources are finite. Speed and efficiency are critical. Pride of ownership is irrelevant. The needs of the customer top the needs of internal stakeholders or established processes every time. Ergo, they meet every problem by looking for the best and most expedient solution, sourcing whatever tools, insights, research findings, capital infusions, or SaaS applications they need or can find.

Step 7 thinking is standard in dynamic or disruptive industries, where innovation, speed, and rapid growth are essential. Consider how the latest generation of tech giants have come to dominate their markets.

Zillow, the real estate listing platform, didn’t establish market prices for every house in America; painstakingly collate addresses on a digital map; invent a way for buyers and sellers to communicate; build massive warehouses to store and protect their data; or populate their website and search results with ads from service providers. Instead, they aggregated data from mortgage brokers, tapped Google for location services and ad buys, adopted Twilio for communication, and stored their data securely on AWS.

That freed Zillow to focus on the value they could bring to consumers. Imagine the time, resources, and energy they would have lost had they insisted on proprietary, home-grown solutions end to end. They’d be a startup footnote at best, not the dominant company in one of the biggest markets in the country.

New Ways, Old Thinking

Zillow isn’t alone in using Step 7 platforms to accelerate innovation. I dare say it’s become the norm in virtually every industry—that is, except health care. The traditional bias in health care is toward ownership and control over all assets, processes, and capabilities, including those that are non-core. This puts a tremendous drag on resources, and has a dampening effect on innovation, speed, and growth.

And while life sciences and medtech are far more experienced at outsourcing, collaborating, and acquiring key strategic assets, even these health care sectors have only begun to tap the potential of a Step 7 approach.

The technology stacks that Zillow, Uber, Netflix, Airbnb and others assemble help clarify consumer needs, create a unique user experience, and leverage digital scale and speed to build an expanding network of users and suppliers. In life sciences and medtech, the purpose of a Step 7 technology platform is to bring better processes, new solutions, and market-changing ideas to health care on behalf of patients.

Roadblocks to innovation come in many forms. For startups, the barriers are often technological and financial. For fast-growing companies, they often hinge on talent scarcity and the need for speed. For established organizations, barriers might involve entrenched beliefs or risk-averse cultures. And in discovery-driven enterprises such as life sciences and medtech, the barriers to innovation start with data.

The complexity is such that an organization rarely “knows what it knows” or can leverage its knowledge in a way that optimizes product development or drives new insights. Knowledge comes from many sources across and outside the organization, ranging from scientific and clinical information, to user knowledge and patient insights. Most of that information is siloed and non-standardized.

The Ecosystem is the Computer

The traditional approach to solving that challenge—constructing data warehouses, bridges, tunnels, side-roads, and workarounds needed to overcome a hodgepodge of legacy infrastructures—has gotten health care nowhere fast. Step 7 thinking means starting with solutions that aggregate all data streams in real-time, while also making that data standardized, accessible, and useful. This frees an organization to focus its resources on activities that build the future instead of fixing the past.

What comes after Step 7? A network effect of knowledge generation that few can imagine today. With access to all of their data enabled, life sciences and medtech organizations can cross-pollinate at immense scale and speed, across teams and inside and outside their walls, with data, information, and insight flowing in myriad directions simultaneously. As the number of participants (clinicians, vendors, business units, researchers, patients, and so on) expands and the level of activity proliferates, the ecosystem grows richer, more diverse, and more impactful.

Health care, as an industry, is highly path-dependent, replete with entrenched processes, legacy infrastructures, data silos, vested interests, rent-seeking, and paternalistic attitudes toward patient care. The ecosystem enabled by Step 7 thinking is a multiverse of possibility and potential, a marketplace of solutions where new discoveries can emerge from any source at any time.

On the brink of that capability, the question that discovery-driven organizations should ask themselves now is simple and profound: How innovative do you want to be?

Abhinav Shashank is the CEO and co-founder of Innovaccer, a leading San Francisco-based healthcare technology company. 

Six Degrees of Separation From Elizabeth Holmes

The trial is not about Theranos – it’s specifically about whether Elizabeth Holmes defrauded investors and deceived patients and doctors. The more separation Holmes’ counsel can place between her and what happened, the better it will ultimately be for her. 

As many experts expected, the fraud trial of Elizabeth Holmes began to heat up on Wednesday with testimony from whistleblower Erika Cheung.

Cheung, already well-known in innovation and technology circles because of her wildly popular TED Talk on her Theranos experience, took the stand on Wednesday in what was a palpable stepping up of a reasonably sedate but also occasionally bizarre trial so far. 

The issue the court needs to address, not only with the Cheung testimony, but that of any other witnesses the prosecution may seek to bring, is how many degrees of separation exist between the actions or inactions being described at trial and Elizabeth Holmes herself. 

While many people tend to think of this as the Theranos trial, it’s not. It’s actually and legally the trial of Elizabeth Holmes. To hold any startup founder legally responsible for everything their company did or failed to do is simply not a bar that a United States District Court is going to set. That is why lawyers for Holmes are trying to draw a line in the sand this week. One way they are attempting to do this is by framing testimony about what Theranos as a company did as being far removed from Elizabeth Holmes herself. 

Where the judge allows this to go, and how much leeway the prosecution is afforded, will be critically important for the rest of this case and for the January trial of co-defendant Sunny Balwani.

What is interesting from Wednesday is that in cross-examining Cheung, one of Holmes’ counsels read through the qualifications of the scientists and laboratory directors employed at Theranos. This was obviously done in an attempt to minimize the weight and validity of less-senior employee Erika Cheung’s testimony. But this strategy lays the foundation for what should be an important part of the Holmes defense – the experience and qualification of her board members and investors. 

It's one thing for a "self-described 'starry-eyed' 22-year-old scientist" to be fooled by Elizabeth Holmes, yet another thing entirely for the prosecution to prove that the adults in the room were fooled. But if there was ever a "they absolutely should have known" moment in startup history, l'affaire Theranos presents it. If senior scientists, managers, investors, and directors knew what Holmes knew, then the deception becomes a company deception, not a personal one. During Cheung's six hours on the stand on Wednesday, she shared that she and fellow whistleblower Tyler Shultz met with Theranos board member George Shultz. Shultz, as we remember, was a former Secretary of State, but was also reputed to be a very active, hands-on board member at Theranos.

Shultz, who died early this year at 100, was remembered for a life of service marred by the Theranos scandal. In a very kind and certainly charitable perspective on Shultz’s involvement with Theranos, back in February, The Critic observed: “Perhaps the defining story of Shultz’s life was a scandal which overtook his later years: the Theranos fraud. His involvement with the company attested to his willingness to trust even those who were unworthy of being trusted.”

It is going to be very difficult for prosecutors to argue that a 19-year-old Stanford dropout set out to defraud some of the most savvy investors, advisors, and board members in the world and actually succeeded. Painting Holmes, as the media has often done, as an evil Jobs-ian character in relentlessly black turtlenecks is easy, but it takes as its logical corollary that a large group of people (almost all of them men) with great track records of success and extremely high net worth were all deceived. Yet that corollary is inextricably intertwined with what the prosecution needs to prove, which suggests they may have set their bar too high. Again, unless they can show that Holmes herself was the fraudster, not the overall company, the prosecution will fail.

As we move from Cheung to Tyler Shultz to George Shultz, we are creating degrees of separation between them and Elizabeth Holmes, not actually bringing her more closely into the center of the case. 

No one should lose focus on the fact that this trial is not about Theranos as a failed startup. It is specifically about whether Elizabeth Holmes defrauded investors out of hundreds of millions of dollars and deceived hundreds of patients and doctors. The more separation Holmes’ counsel can place between her and what happened, the better it will ultimately be for her. 

—–

Aron Solomon, JD, is the Head of Strategy and Chief Legal Analyst for Esquire Digital. He has taught entrepreneurship at McGill University and the University of Pennsylvania, and was elected to Fastcase 50, recognizing the top 50 legal innovators in the world. Aron has been featured in CBS News, TechCrunch, The Hill, BuzzFeed, Fortune, Venture Beat, The Independent, Yahoo!, ABA Journal, Law.com, The Boston Globe, and many other leading publications. 

Digital Humanism for Life in the Cloud

Driven by love of people and planet, digital humanism is urgently needed as a counterweight to what will otherwise be an inevitable digital colonization driven by profit.

Never have we been so connected but so disconnected at the same time. Digital technology connects us around the world. We can now be physically in one place even as we are mentally thousands of miles away – talking, working, or playing with others. The mobile phone is becoming ubiquitous in every aspect of life around the planet. And now the Global South and the Global North are well on the way to being connected everywhere, every day, for better or worse.

These connections are generating collisions which were avoided when distance separated people and ideas. Wealth is meeting poverty online, and Western science and economics are meeting Eastern body-mind-spirit. Yet… all is not one. People are whipsawed, sometimes very painfully, between the corporatized dominance of Western science and economics and, in more and more places, an embrace of holistic well-being and one-ness with the planet. Many parts of the world are no longer so willing to accede to the West's dominance. So while we may find ourselves in the Cloud together, the buffers of understanding and empathy that enable human relations are not just worn thin. They are torn by growing waves of cultural and religious distrust.

Yet while digital may taketh away, it also gives: measurement, for example, is entering a golden age. The sensors of the internet of things and 5G are beginning to enable precise tracking of movement, temperature, and humidity. That will eventually help lead to enhanced food security and efficiencies in agriculture, transportation and the built environment, enhancing human wellbeing. Sensing as a tool to measure and monitor climate and the environment can enable us to better steward our planet at a time when that is desperately needed.

Digital accounting and tracking today primarily serves for-profit private enterprises, making a few people rich while controlling and constraining the opportunities of far more. But if the cloud can be used to benefit humankind, both production and distribution could be improved in innumerable ways, with new precision. If such tools are managed for everyone's benefit, it will be possible to dial the levers up or down for many important functions in society, especially at the subnational and neighborhood level.

Yet for now, divisions tear the human fabric, tilting the winnings to whichever tribe knows how best to leverage digital technology to win the power game. It is within our reach to change this, but we must collectively gather the will to do it.

We have a new dimension, and that is digital. Our early forays in the digital dimension have too often not enhanced or enriched our lives. But we can visualize hypothetical digital narratives, and in effect storyboard people's lives, to examine in advance the impact of interventions. Grameen Foundation, for example, helps farmers track 822 factors that can increase crop yields in Asia and Africa by many multiples. Our intuition from the analog world will need to evolve to better serve us in the digital realm, just as it took time to learn to read and write. And many new digital utilities are needed so we can cooperate better.

What do we mean by “home on the cloud”? In our real, physical and geographically-located home, work and family life have in the last year or so been radically merging. This Covid-forced cohabitation might last longer than we imagined. The more-digital lives many of us now lead have come sooner than we expected. The consequences are many. Not only has remote work reduced the need for office space, but it has also brought down barriers to migrating for work. We can live by the beach in Bali and work each day with colleagues in Brussels or Brazil or Baltimore, even as we socialize at night or on weekends in Bangkok, Barcelona, or Bogota.

Digital humanism driven by love of people and planet is urgently needed as a counterweight to what will otherwise be an inevitable digital colonization driven by profit.

Surely we all want to prioritize collective flourishing over the fury that will result if the world’s future is defined by conflicts between digital masters and the digitally unempowered. But too many feel helpless, blame others, and find comfort in complaining. It’s past time for such inaction. It’s time to get going —with eyes wide open, not denying that the shift towards a more humane and sustainable society, however inevitable and badly needed, will be painful and fitful.

We are connected now. The human factor in social, economic and relationship ecosystems can no longer be ignored. We can see, measure and model these interactions. We have an incredible opportunity to share and learn so we can find ways to flourish together on our planet home: what that visionary Marshall McLuhan called Spaceship Earth. 

—–

This is a summary of a chapter in the forthcoming book, "Society X.0 in the Rise of a New Epoch – The Human Factor in Social, Economic and Relational Ecosystems," sponsored by the Social Trend Institute. The editors are Marta Bertolaso, professor of Philosophy of Science in the Faculty of Engineering, and Luca Capone, Ph.D. candidate in Philosophy of Science and Human Development, at the University Campus Bio-Medico of Rome, along with Carlos Rodríguez-Lluesma, professor in the Department of Managing People in Organizations at IESE Business School, University of Navarra.

 

How the Pandemic Set the Future of Food and e-commerce in Motion

Data, technology-driven segmentation, and speed of delivery will continue to determine success among online food businesses.

The pandemic changed many consumer habits. Few were impacted as much as eating and shopping.

Before COVID shut down restaurants and prompted grocery store raids, apps had already simplified food delivery. Meal kits were becoming a more frequent part of weekly rotations in kitchens across the world and grocery delivery had moved from fringe to, well, less fringe. These were for most a splurge every so often. A way to incorporate a little convenience after a long day at the office. A luxury.

A year ago, in hardly a moment, these luxuries became necessities. Fortunately, much of the groundwork had been laid in building the apps and the infrastructure to at least have a starting point. The pairing of food and e-commerce, after all, was more than 25 years in the making, dating back to that first online pizza order over dial-up in the '90s.

The most substantial – and likely lasting – impact of the pandemic was the rapid acceleration of adoption of e-commerce-enabled food delivery, whether cooked meals, meal kits or basic ingredients. It forced the maturation of the industry, with consumers demanding safety, simplicity, sustainability, and convenience. The disparate food tech sector quickly realized that successfully scaling to meet the massive demand would require a deliberate e-commerce mindset rooted in data and supported by sophisticated production and distribution, all while digesting an outsized new mission-driven role: making sure people had food on the table.

Solving for how to serve record customer volumes during the pandemic was certainly better than the alternative but came with its own pressures. Meeting huge spikes in demand is a good problem to have, but nevertheless a problem. It is also very eye-opening.

Reflecting on the past year, there are several overlooked advances that have reshaped the e-commerce food industry for the long-haul:  

Customer data drives food industry success in an e-commerce environment.

Supporting individual tastes and dietary preferences has always been crucial to serving food. Originally there were waiters, then apps enabling customization. Not to mention certain grocery aisles a consumer may consistently frequent or avoid. Translating this into an e-commerce world has required not only choice and personalization – it has also demanded an interface built to anticipate tastes and diets, some of which are constants and others which fluctuate. Amazon’s data analysis, for example, does not need to understand how hungry a customer is or if they avoid meat on Fridays during Lent. But for companies providing food, these details can be critical.

As ordering food online has proliferated, personalization has become an important and expected part of the customer experience. Consumers are looking for easy ways to replicate orders, find suitable suggestions, and block out what they want to avoid. This requires a combination of questions and analysis to understand patterns and preferences, while still offering an easily navigable online store in case a customer wants different options; a minimal sketch of that logic follows the lesson below.

  • Lesson: data trumps all else in building a successful online food business
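Here is the minimal sketch referenced above, with hypothetical data and field names rather than HelloFresh's actual system: exclude what a customer avoids, then rank the remaining meals by overlap with tags from past orders.

```python
# Illustrative preference filtering and ranking for an online food store.
def suggest_meals(menu, avoided_tags, past_order_tags, top_n=3):
    def score(meal):
        # Rank by overlap with the tags seen in the customer's past orders.
        return len(set(meal["tags"]) & set(past_order_tags))
    eligible = [m for m in menu if not set(m["tags"]) & set(avoided_tags)]
    return sorted(eligible, key=score, reverse=True)[:top_n]

menu = [
    {"name": "Chili con carne", "tags": ["beef", "spicy"]},
    {"name": "Pesto gnocchi", "tags": ["vegetarian", "pasta"]},
    {"name": "Salmon teriyaki", "tags": ["fish", "asian"]},
    {"name": "Veggie stir-fry", "tags": ["vegetarian", "asian"]},
]
# A customer who avoids beef and has mostly ordered vegetarian and Asian dishes.
print(suggest_meals(menu, avoided_tags={"beef"},
                    past_order_tags={"vegetarian", "asian"}))
```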

Technology-driven segmentation is the key to growth.

As the industry matures, it is critical to recognize there are many different consumers, and that segmentation is the key to unlocking the multiple parallel total addressable markets. For example, at HelloFresh we have expanded our total addressable market by more than 60 percent over the past five years.

As a group, we have our namesake mainstream brand HelloFresh, but we also serve varying groups of customers with differentiated offerings including premium, value, and specialty. We also offer heat & eat meals through a fourth brand. Customers are good at self-identifying the product that suits their lifestyle.

This segmentation allows us to capture 60 percent of households, with surprisingly little overlap between these groups. Even during the pandemic, growth largely remained within segments, so we know we have distinct groups of customers that require different interaction and different products, even if the underlying platform is the same. Having a multi-brand strategy supports the choices that build a stronger product affinity.

  • Lesson: in an e-commerce environment a company can operate multiple unique brands on one underlying platform to expand its total addressable market

Consumers have been conditioned to equate e-commerce with fast delivery.

As the e-commerce leader, Amazon has set expectations of fast, free delivery. In the food sector, this was a significant shift. While restaurant delivery operated quickly in a local sense, other categories had to break away from scheduled deliveries to faster turnarounds.

In the meal kit category, for example, we have leveraged technology to update business practices to reduce the time between order and delivery, eliminating much of the long-term planning. The data showed that reducing the order to delivery timeframe led to infrequent customers ordering more often. We also found that shortening those times unlocked new customer segments.

Today, a brand offering a deliver-to-your-door service is not seen as a disruptor. That has become table stakes. Once Amazon threw down the gauntlet with two-hour delivery, new startups began promising speedier delivery of their services. To win in the e-commerce sector – especially in the food delivery space – companies must be prepared to meet and exceed these deeply entrenched consumer expectations. If you do not, someone else will.

  • Lesson: e-commerce expectations for speed apply to all categories including food

—-

Dominik Richter is the CEO and co-founder at HelloFresh.

Fuzzy Logic Makes AI More Explainable

Algorithms too often don’t reflect the complexity each stakeholder brings to the table. A branch of mathematics that has been around since the 1960s can help.

Despite its growth and popularity, AI is not as widely adopted as it could be—particularly by small- and mid-size businesses that lack the enormous data sets of business and tech giants. One reason is that AI designers are not doing enough to consider the end user. 

Consider the many stages an AI solution must pass through in order to be successfully integrated into a company's workflow. An algorithm begins with an AI expert who must incorporate knowledge from a subject matter expert. Then the tool needs approval from leadership, and finally has to be adopted by users. Every one of those steps may require the solution to meet a different need, yet too often our algorithms do not reflect the complexity each stakeholder brings to the table.

Take for instance an algorithm used for hiring. The subject matter expert who knows which job criteria matter might need to provide relevant data for the AI, such as a neural network, to learn from. Then the HR and executive leaders who must approve the tool may want to know how the algorithm combats bias, whether it can help them find non-obvious candidates, and even how to compare similarly ranked candidates. (For example, potential bias might be reduced by ensuring an applicant's years of experience "count" more than the name of their college.) Meanwhile, end users will care about ease of use and whether they get what they're looking for when they use it.

It’s a lot to ask of one tool, but we can design for it. This is where fuzzy logic, a branch of mathematics that has been around since the 1960s, can help. In essence, a fuzzy classification algorithm allows us to communicate grey when employing a neural network alone might show only black and white. Fuzzy logic is designed for uncertain data; it is designed to describe the degree to which something is true. Exploiting these and other fuzzy properties allows us to build transparency and explainability so that we can not only see an outcome, but understand why it emerges – in human terms. 

Let's look at a basic example: a mug of hot chocolate. Too hot and you'll burn your tongue. Too cool and it no longer warms you on a winter day. But there's not just one drinking temperature for hot chocolate; there's a window of temperatures, a range within which it's ideal. If we were trying to quantify this scenario using only the 1s and 0s of a typical classification algorithm, we would have to say 1 = good to drink and 0 = not good to drink. With fuzzy logic, we can represent the many points between 0 and 1 at which you might enjoy the beverage.
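A small worked version of the hot-chocolate example may help. The sketch below uses a trapezoidal membership function with illustrative breakpoints: instead of a yes-or-no label, each temperature gets a degree of "drinkability" between 0 and 1.

```python
# Trapezoidal fuzzy membership for "good to drink" (breakpoints are illustrative).
def drinkability(temp_c, too_cool=40, ideal_low=50, ideal_high=60, too_hot=70):
    """Degree (0..1) to which a temperature counts as 'good to drink'."""
    if temp_c <= too_cool or temp_c >= too_hot:
        return 0.0
    if ideal_low <= temp_c <= ideal_high:
        return 1.0
    if temp_c < ideal_low:                                  # warming toward ideal
        return (temp_c - too_cool) / (ideal_low - too_cool)
    return (too_hot - temp_c) / (too_hot - ideal_high)      # cooling past ideal

for t in (35, 45, 55, 65, 75):
    print(t, round(drinkability(t), 2))
# 35 -> 0.0, 45 -> 0.5, 55 -> 1.0, 65 -> 0.5, 75 -> 0.0
```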

Why does this matter for business? Because for too long we've used classification algorithms like neural networks that force a data point into a yes-or-no, 1-or-0 duality. The power of these neural networks is vast, but without blending in other mathematics like fuzzy logic, our ability to explain how they work is limited. Neural networks excel at uncovering patterns, but they don't account for nuance, and they are unable to notice when they don't notice something. Unlike humans, they lack meta-cognition, so the fact that we can't see how the algorithm reaches its conclusion is problematic. When we blend fuzzy logic into these models, however, we can better uncover the fuzzy relationships that show the "degree" of each data point. The software doesn't have to code the hot chocolate as either hot or cold; it can code it as warm.

For example, you can determine the ideal amount of perseverance someone would need to succeed in a specific role. Then, using psychometrics, you can find candidates who fit that criterion. Neural networks give us the ability to find the fuzzy patterns in a set of data. Provide our algorithm with the perseverance scores (and other metrics) of all VPs of sales, for example, and it can find the degree of impact (or fuzzy membership) of each score across the perseverance spectrum.
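As a hedged illustration (not the author's actual algorithm), one simple way to derive such a membership curve is to fit it to the scores of people already succeeding in the role and then score candidates by their degree of fit; the numbers below are invented.

```python
# Derive a fuzzy membership curve for "ideal perseverance" from the scores of
# current VPs of sales (hypothetical data), then grade candidates by degree of fit.
import math
from statistics import mean, stdev

vp_scores = [72, 78, 81, 75, 84, 79, 77]   # invented perseverance scores of current VPs
mu, sigma = mean(vp_scores), stdev(vp_scores)

def membership(score):
    """Degree (0..1) to which a candidate's score fits the observed success profile."""
    return math.exp(-((score - mu) ** 2) / (2 * sigma ** 2))

for candidate in (60, 74, 79, 95):
    print(candidate, round(membership(candidate), 2))
# Scores near the cluster of successful VPs get degrees near 1; outliers taper
# toward 0 instead of being cut off at a hard pass/fail threshold.
```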

As AI designers, we need to remember that not every tool is right for every problem. We need to ask: “What problem are we trying to solve? What matters to the people involved? Who are the end users?” As the saying goes, to someone with a hammer, everything looks like a nail. Neural networks have recently been our hammer. It’s time to mix in a new tool—fuzzy logic—and discover how to create more elegant solutions.

—–

Bryce Murray, PhD, is Director of Technology and Data Science at Aperio Consulting Group, where he builds algorithms to operationalize emerging technology and scale people analytics.
