The Data Challenges Behind ESG Investment

The recent explosion of environmental, social and governance (ESG) investing has created many opportunities for investors and financial institutions. But with no universally accepted definition of ESG, how does anyone measure performance?

The recent explosion of environmental, social and governance (ESG) investing has created many opportunities for investors and financial institutions. The adoption of the Paris Agreement and its recent amplification in Glasgow, along with the election of Joe Biden in the US, have accelerated the already-growing trend toward ESG. Annual cash flow into ESG funds has grown tenfold since 2018, and ESG assets under management are expected to account for one-third of all investments worldwide by 2025, or approximately $53 trillion.

But this massive shift also presents a set of profound data challenges. With no universally accepted definition of ESG, how does anyone measure performance? And with more companies and countries promising to reach carbon neutrality by a given year, how does anyone measure whether they are actually keeping up, much less standardize that performance across industries and national boundaries? The capture, organization, and modeling of the data are too complex and too expensive for the vast majority of organizations to do on their own.

ESG’s data challenges were the topic of a recent Techonomy virtual salon hosted by Red Hat, and moderated by James Ledbetter, Chief Content Officer of Clarim Media, parent company of Techonomy. The salon explored the roots of recent developments in ESG, how institutional players are tackling the new data challenges, and where the data approach to ESG is headed. 

Richard Harmon, vice president and global head of financial services at Red Hat, asserted that the company’s Linux/open source background lends itself well to the ESG data challenge. “From the roots of the open source community, open culture, which we [at Red Hat] have, topics like climate change and the wider ESG framework are natural things for us to dive into.” He noted that Red Hat had joined a consortium called OS Climate, launched last year by the Linux Foundation to “address climate risk and opportunity in the financial sector.” It currently includes Amazon, Microsoft, Allianz, BNP Paribas, Goldman Sachs, and KPMG among its members. This broad group of companies, many of which compete with one another, is necessary, Harmon said, because the “volume and complexity of the data that needs to be utilized is unmatched.”

Lotte Knuckles Griek, head of corporate sustainability assessments at S&P Global, explained that she has been measuring company performance on ESG metrics for 15 years; the company has been involved in ESG ranking since the beginning of the 21st century. The tremendous growth in the sector, she added, has shifted the stakeholders that S&P serves, which raises the question of how to present data in the most useful way. “There’s a huge variety of investors out there looking to consume ESG data, and do very different things with them,” Griek explained.

Some of the changes in recent years have been driven by changes in regulation and government policy. For example, in the United States the Biden Administration seems to favor some kind of uniform definition for what constitutes ESG investment. In Europe, the European Central Bank has already shifted the criteria for its massive pension investments toward “green bonds,” which in turn gives individual countries and companies an incentive to embrace ESG goals.

For her part, Griek said that she always welcomes new government standards and regulations, because they help draw attention to important ESG issues. “But it also totally raises the complexity faced by companies,” she continued. “One framework says report data in terawatt hours, another says report in megawatt hours…all these frameworks tend to cause confusion.”
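The terawatt-versus-megawatt confusion Griek describes is, at its core, a unit-normalization problem, one of the simpler data chores in any ESG pipeline. A minimal sketch in Python (the company names and figures below are hypothetical, not drawn from the article):

```python
# Normalize energy figures reported under different frameworks
# to a single canonical unit (MWh) so they can be compared.
UNIT_TO_MWH = {
    "kWh": 0.001,
    "MWh": 1.0,
    "GWh": 1_000.0,
    "TWh": 1_000_000.0,
}

def to_mwh(value: float, unit: str) -> float:
    """Convert a reported energy figure to megawatt hours."""
    try:
        return value * UNIT_TO_MWH[unit]
    except KeyError:
        raise ValueError(f"Unknown reporting unit: {unit}")

# Two companies reporting the same consumption under different frameworks:
reports = [("AcmeCo", 0.012, "TWh"), ("BetaCorp", 12_000.0, "MWh")]
normalized = {name: to_mwh(v, u) for name, v, u in reports}
# Both normalize to roughly 12,000 MWh, making them directly comparable.
```

Real ESG data standardization is vastly harder (differing scopes, methodologies, and disclosure boundaries), but even this trivial step must be agreed upon before figures can be compared.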

Andrew Lee, Managing Director and Global Head of Sustainable and Impact Investing at UBS Global Wealth Management, put the data dimension into the broader context of how UBS (and other institutional investors) serve their clients, who are increasingly interested in sustainability. “Data doesn’t necessarily drive a determination of what’s sustainable or not, for us,” Lee said, explaining that UBS clients and other investors are ultimately seeking transparency into how their money is being invested. “Data is a critical input into a process that looks at how sustainability can be incorporated into investments.”

One topic that recurred throughout the discussion was the need for data that not only captures performance to date, but that can also be used to build reliable models of the future. Lee noted that, while everyone in the ESG ecosystem praises the idea of standardization, business imperatives require something beyond standardization. “As an investor, you want a little bit of standardization, but you want to create alpha, right?” (Creating “alpha” is how investment professionals describe their competitive advantage.) “So certain things in your process, maybe it’s the forward-looking data or how you interpret that, give you that edge or ability to outperform.”
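For readers unfamiliar with the term, alpha is conventionally measured as return in excess of a benchmark. The sketch below uses the simplest possible definition (practitioners typically risk-adjust it, e.g. via CAPM), with hypothetical return figures:

```python
def simple_alpha(portfolio_return: float, benchmark_return: float) -> float:
    """Excess return over the benchmark, in percentage points.

    Simplified definition: no risk adjustment, fees, or beta.
    """
    return portfolio_return - benchmark_return

# Hypothetical: an ESG fund returning 9.2% against a 7.5% benchmark
# would have generated 1.7 points of (unadjusted) alpha.
edge = simple_alpha(9.2, 7.5)
```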

In the end, Harmon stressed the need for world-class analytics capability. “That’s what we see as the real differentiator, to solve this global problem that we’re all facing.” 


Process Transformation: The bridge to the modern enterprise, digital experiences

Brands must do more than just keep up. They must better anticipate consumer needs and rethink the relationship between operations, people and technology.

We’re living in an age of experience, extreme consumerization, and very rapid change. Just keeping up is a major challenge for companies. What is good today may not be relevant tomorrow, and a new service or product line today could be cannibalized by something much better depending on what comes next. But brands must do more than just keep up. They must rethink how they operate to better anticipate consumer needs. As a result, companies need to rethink the relationship between operations, people and technology.

Historically, a company could consider sudden market shifts as “black swan” events, but today such events (think natural disasters, pandemics, etc.) seem to happen more frequently. Businesses must adapt as quickly as markets change, and intuitively deliver experiences based on new-age product lines and services. To achieve this, they must cut across old silos. And they must build frictionless, seemingly invisible work processes that reflect the new need for greater operational agility and fluidity.

Enterprises have invested heavily in technology but little attention has been given to the processes sitting on top, which tend to be choppy. Many of these processes are, at best, inconsistent. And they often are completely broken. Leaders need to think about processes as the end-product of their technology work, the end outcome that serves customers and the market.

But even in a highly automated age, processes will still be driven by people. So thinking about how you motivate, train, and develop them will be the best path to better serving customers. Several key approaches will help guide your strategy. First, think of the outcomes you want to achieve. Second, consider that those outcomes will best be driven by having the right kind of people in place. Finally, take advantage of the new ways technology enables you to turn any businessperson into a programmer, or, put more simply, engage in “citizen coding.”

Optimizing operations with outcome-oriented thinking

We need to rethink the relationship between business operations and technology. Many leaders think their companies have been digital for the better part of a decade. However, the pandemic exposed critical gaps between how digital they thought they were and how far they still have to go. Now they aspire to a state where “everything is digital,” but it’s unclear what that means and what companies should do to get there.

To paraphrase author Stephen Covey, start with the end in mind. Digital transformation must begin with a clear sense of the brand’s purpose in the market, which then should lead to reconfiguring work processes to achieve your purpose. First, identify the problem you are trying to solve and the value you are trying to deliver. From there, consider the kind of customer experience you want to create, the one that best reflects the values of your brand.  Only then can you decide how best to render work in a digital manner that brings modern workplace efficiencies and differentiated convenience to customers.

Getting there requires a radical rethinking of the relationship between IT and business functions.  In fact, the lines begin to blur, and that’s good. Technology needs to be considered in the context of what you are trying to deliver to the market. When we talk about process transformation, or even enterprise digital transformation, technology should not be the first thing we think about.

The Threat of Digital Sameness

As nearly every company in recent years has found itself both inspired and intimidated by Amazon’s astonishing capabilities and the often-cited “Uberization of everything,” a dispiriting haze has afflicted the digital experience of brand after brand. Many pundits call this “digital sameness,” a too-formulaic customer experience that fails to convey the unique feeling and value of a specific brand. This sameness can be avoided by taking a resolutely customer-centric approach to work processes. It means knowing your customers and what they value about your products or services, and ensuring that their digital experience with you uniquely reflects that.

To avoid the cookie-cutter approach, it’s critical to create a new empowerment in two ways—among employees and across your entire corporate ecosystem. Allow and encourage employees to focus on customers in all they do, and reward them for creatively doing so. And find every possible way to coordinate the supply chain, too, to concentrate all its efforts on customers. Brands that work fluidly with their employees, suppliers, vendors, and partners will better be able to give customers a differentiated and memorable experience.  

“Intelligent automation” is what we call the collection of technologies that combine to transform your processes to create exceptional experiences for customers. It is a lever to create fluid and adaptive business processes that are critical to building and delivering innovative products and services.

Take, for example, an international agricultural company that sought to streamline its largely manual and fragmented order fulfillment, financial reporting and customer care processes. We worked with the client to take an end-to-end approach using intelligent automation. This allowed them to improve how they served customers while reducing unit cost and cycle time, from procurement to final delivery.

Automation unlocked 52,000 hours per month, allowing associates to take on higher value work. The client’s citizen coders manage 80% of the automation development process. To date, they have created a variety of bots from a strong pipeline of automation processes. Automating the order entry and fulfillment processes improved average handling time by 75% and saved $20 million so far, generating a 4X ROI.
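As a rough check on those figures (the program cost below is inferred from the stated $20 million in savings and 4X multiple; it is not given in the article):

```python
def roi_multiple(savings: float, cost: float) -> float:
    """Return on investment expressed as a multiple of cost."""
    return savings / cost

# The article states $20M saved at a 4X ROI, which implies roughly
# $5M invested in the automation program (an inference, not a quote).
implied_cost = 20_000_000 / 4

# Hours unlocked per year, from the stated 52,000 hours per month:
hours_per_year = 52_000 * 12  # 624,000 hours annually
```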

The softer side of digital business

During the pandemic, company after company declared they were responding aggressively with digital. The harsh reality, though, was that they realized how reliant they were on human beings.

Technology is evolving at lightning speed, and valuable capabilities are emerging that will make a big difference in every company. However, we have over-indexed on a workplace future that prioritizes machines over people. The implicit assumption is that as business becomes more digital, employees must adapt and fit in.  Instead, we need a better understanding of the digital worker ecosystem. Imagining a less “human-centric” future is misleading. The truth is, human-digital collaboration will be a key driver for how work will get done.

To drive the scale of ongoing business transformation, the key will be to unlock the talent base that already sits within an enterprise. The goal should not be to think about how to replace those people, but rather how to get them to embrace a culture of digital thinking.

Many business leaders have a mindset that can be characterized as “Hey, let me skill my workforce. Let me make them a little more technically trained, and so on and so forth.”

But if you think about it as a “skilling” exercise, all you’re going to do is give people better tools to use with existing systems, and make those a little more efficient. This approach can calcify existing operations and make it harder to evolve modern ways of working. To be a fully digitally-enabled enterprise, you need to ask: What type of employees do I need within that ecosystem? Will they be cubicle workers? Repetitive workers? Or experts who can sit alongside digital workers, or “bots,” including those created by Robotic Process Automation, to focus on the needs of customers? For the enlightened enterprise, the answer will be the latter.

Bringing on more bots doesn’t mean you can get rid of humans. Technology is still far from advanced enough for that. People have the ability to think deeply, which computer algorithms don’t possess. So you need to ask: “What kind of workflow do I need? Do I have that skill profile today?” Your people must be partners to help constantly evolve your digital capabilities. Yes, they will acquire new skills, but more importantly, they will partake in a new operational mindset.

Citizen coding

When I talk to operations teams inside companies, I’m often surprised by their lack of appreciation for technology. I don’t expect them to be coders – to be proficient in writing programs or even understanding how that is done at any level of depth. But they must appreciate what technology can do, what pathways it can create.

For those leaders trying to drive positive change inside companies, a fantastic way to drive digital transformation at the grassroots level has emerged. It’s the low-code/no-code movement. People who have no training in programming and don’t think of themselves as technologists can now begin to engineer software. I call it “citizen coding.”

Citizen coding will only increase the utility and relevance of software, since the people doing the coding are more likely to be ordinary business people. Over time, citizen coding will progress across many dimensions. Consider this: one day, citizen coders could invoke voice-enabled commands to guide the entire software engineering process.

Low-code/no-code programming tools can help drive corporate transformation at the grassroots level, and empower less technical employees to move forward without fear that technology might cannibalize their jobs. They’re the ones writing the code, and they know how to improve their work and their company. And ultimately, such an approach can take companies to significantly higher-value business models.  

Some IT purists may resist low-code/no-code because they don’t always appreciate the value and talent that can be unlocked by empowering business teams to use such tools. Having come from the IT side, I too was skeptical of this movement. End-user software built by amateurs? That’s almost like a shadow IT organization! But I’ve gone from skeptic to convert. IT leaders should recognize that such tools, if properly selected, governed and used, can accelerate the software development lifecycle when combined with traditional programming approaches.

In fact, I’ve been pleasantly surprised by the enthusiasm with which teams take to such tools once we orient and train them. And the way the low code/no code platforms are evolving increasingly includes governance and orchestration structures that ensure the resulting systems are safe and integrated well with existing enterprise technologies. Other new tools emerging alongside this movement, such as “process discovery and intelligence” software, help continually monitor processes and automatically generate suggestions for improvements, which further helps accelerate the pace and scale of transformation.


New IP-Based Lending Model Gains Momentum

One longtime patent attorney put it this way: “If data is the new oil, IP is the new gold.”

It was a classic technology CEO quandary. BrainScope, the medical device company that Susan Hertzberg leads, was ready to bring its device and software for assessing mild traumatic brain injury to a wider market. Unlike expensive, radiation-heavy and hard-to-schedule CT scans, BrainScope can help assess the likelihood of a brain bleed in 20 minutes while also providing an assessment of the severity of a concussion. Hertzberg was hesitant to ask the company’s true-blue investors, who had been loyal through 12 years and seven funding rounds, to provide more capital or to further dilute their financial stake. Nonetheless, seeking a way forward, Hertzberg says she began talking with potential equity partners.

During the summer of 2021, a new option presented itself. Aon, a leading global professional services firm that already served as an advisor to BrainScope, introduced Hertzberg to the idea of intellectual property (IP)-based valuation and related non- or minimally-dilutive funding. BrainScope could raise money based on the value of its IP—and lenders could have peace of mind, knowing the asset was valued using Aon’s data-enabled algorithms and insured like any tangible property. After consideration, Hertzberg opted to go down the path of IP-backed lending. 

BrainScope and Aon announced a $35 million IP-backed lending arrangement this fall. The funding arrangement provides BrainScope the resources to expand its commercial footprint to reach more hospitals and connect with concussion centers enabling doctors to get rapid, objective insights about the likelihood of brain bleeds and concussions. Further, the “concussion index” capability added earlier this year gives clinicians the first objective, brain-based biomarker tool to track concussed patients, athletes, and military personnel and aid in determining their readiness to return to activity. With 70 million head injuries reported annually around the world, the company will be able to expand globally as well. Next-gen versions of the device show promise in detecting stroke, dementia and depression efficiently.

Aon may be one of the first to help companies articulate and realize the full value of their IP portfolios, but IP looks likely to be a significant focus in 2022. Why now? A more apt question might be, why not earlier? Today, intangible assets account for 90 percent of the value of the S&P 500 (and 90 percent of the value of Fortune 500 companies), according to a report from Ocean Tomo. Translation: that’s $20 trillion plus in the S&P case. Then there’s the explosion in intellectual property that’s taken place in the past 30 years. To put the acceleration in context, it took more than a century for the U.S. Patent Office to record its millionth patent, but just three years to go from 9 million to 10 million, a milestone reached in 2018. Lewis Lee, a longtime patent attorney who heads Aon’s IP Solutions, puts it this way: “If data is the new oil, IP is the new gold.”

Yet the financial landscape has been slow to change. Much of the problem has been that IP is harder to put a cash value on compared to, say, real estate or product inventory. But Aon’s solution, which is “developing the ability to measure IP at scale” in Lee’s words, is changing that. “Our platform aggregates not only intellectual property data but also financial data. We look at risk data, we look at litigation data, we look at licensing data and then we use a series of algorithms to understand those in a material way.” AI, machine learning and natural language processing are put to use in the technique. The result not only presents a stronger assessment to funders and insurers of a company’s IP’s value, but ties it to revenue streams and shows how the two will only become more inextricable.

Skeptics might ask, how are lenders going to protect themselves if, say, a copyright infringement or cyber security breach occurs? Insuring a building against a fire is one thing, admits Lee. Losing a key trade secret can be the end of a business forever. But again, he points to valuation. If you can put a monetary value on something, it can be insured—and Aon was able to “wrap” the deal for BrainScope with “a collateral protection insurance wrapper.” Further mitigating risk, he continues, is that clients appear very motivated to deliver on their profit promises in this lending configuration. “All they really have for value is their IP. They’re loath to lose it, so it would be very reasonable to expect that they’re going to do their best to make good on these loans.”

The first companies to use Aon’s new valuation tools and capital solution, which include eco-minded agritech Indigo Ag and razor company Shavelogic, show signs of success. Hertzberg says Aon’s combination of nimbleness—BrainScope’s IP report, which establishes the scope and value of the portfolio, was completed in record time—with industry-spanning business expertise makes it an ideal collaborator. “They’re insuring health systems, they’re in the personal injury market, in corporate wellness and workers compensation [so they’ve become] a strategic partner who’s going to help me actually grow this business.” Here’s a solid indicator of the potential for this new lending arrangement: Aon is developing its own proprietary lending solution to help support this business. Today the company works with a multiplicity of lenders and borrowers, totaling in the multibillions on both sides, who want to partake in the new lending model. “It makes sense,” says Hertzberg. “If you think about intellectual property and how much that’s the driving force behind so many different kinds of businesses today, it’s kind of a gem. They’re finding ways to unlock that value to create more value. I don’t think that trend is going to go away. I think it’s going to accelerate.”


How the New Approach to Cybersecurity Can Create Trust

Watch the recap of our session, in which a top cybersecurity practitioner, a chief trust officer, and a leader of the new era of security automation discuss practical approaches to automation in light of cybersecurity threats.

Leading companies can’t merely be digital-first. They must win the trust of customers and partners. But cybersecurity threats are evolving rapidly, so establishing that trust is hard. Meanwhile, talent is in short supply just as companies need even more sophisticated expertise. To address the cybersecurity talent shortage, practical new approaches to automation are emerging. It’s all urgent and central to the business dialogue, because without secure systems, businesses will fail to effectively innovate. So what do CIOs and business leaders need to understand now to ensure their companies can be aggressive and not become victims? What is the new suite of cybersecurity tools, and how can companies best deploy them? How can company technologists reassure CEOs and boards of directors that their systems can be trusted? This panel brings together a top cybersecurity practitioner, a chief trust officer, and a leader of the new era of security automation, to chart the path forward.

SPEAKERS

  • Tony Buffomante – Senior Vice President, Global Head of Cybersecurity & Risk Services at Wipro Limited
  • Mike Kiser – Director of Strategy and Standards at SailPoint 
  • Elena Kvochko – Chief Trust Officer at SAP
  • Moderated by David Kirkpatrick – Founder and Editor-in-Chief at Techonomy Media

 

Download a recent white paper, “How the New Approach to Cybersecurity Can Create Trust: Five Ways to Minimize Cyber Risk in 2022.”

 



Navigating Platforms and Technologies for a New Healthcare Era

Here’s what payers and providers should look for as they update their core platform strategies for the new normal in healthcare.

As discussed in our previous article, many healthcare organizations we work with have set their sights on delivering holistic, frictionless customer journeys that span the boundaries between payers and providers and seamlessly blend physical and digital experiences. This vision stands in stark contrast to the disjointed, fragmented and inefficient processes that characterize today’s healthcare industry.

Before they can seize that grail, however, many healthcare organizations must first modernize, rationalize and reimagine their current core platforms and technologies. Historically, healthcare technology platforms have been optimized for an ecosystem formed in an era of proprietary data silos and adversarial market segments. Closed architectures required extensive custom code and manual workarounds to interact with external systems and devices.

These technology foundations can’t support many of the healthcare industry’s emerging attributes, including interoperable data sets across payer and provider settings, price transparency for medical services, on-demand and healthcare-anywhere care delivery models, and real-time transaction processing – to name just a few.

Put bluntly, the last-gen technology that many providers and payers currently rely on simply won’t support the demands of the new healthcare industry, and will be unable to provide a competitive advantage in the new normal.

An industry in motion

Recognizing this, some large healthcare industry players are reevaluating their technology strategies. We’re noticing an increase in market noise and hype as multiple industry stakeholders position themselves to create market value in the emerging interoperable, price-transparent healthcare economy. Some of the moves we’re seeing include:

  • Leading electronic medical record (EMR) vendors (most notably Epic but also others to varying degrees) are looking to expand beyond health system and hospital boundaries by actively diversifying into the health plan space. These vendors are investing in platforms and products that support traditional payer functionality (claims, membership, premium billing, etc.). Their belief is that an integrated, monolithic, end-to-end platform that spans payer and provider functions will drive value in a converging marketplace.

  • Many large national health plans have been moving into the platform and tech space for a while now, including UnitedHealth Group/Optum, CVS/Aetna, Anthem and others. These diversification strategies have historically included acquiring or monetizing and commercializing proprietary health plan assets (including platforms) and selling them to other health plans. 

    We expect that several of these companies will look to build out and commercialize next-generation healthcare core technology themselves, often in partnership with EMR vendors, to drive value by enabling seamless end-to-end experiences powered by shared data assets.

  • New entrants and Big Tech players such as Walmart Health, Amazon Care, Microsoft and Google are investing billions of dollars into introducing new business, operating and technology models to the industry. All of these efforts could significantly affect the evolution of core platforms in healthcare and the viability of legacy strategies.

Choosing a technology platform

We’re helping clients separate the hype from the substance of these emerging strategies by working through these key questions:

  • Will you be locked in? We’re hearing from healthcare C-suites who are initially enamored by the idea of buying a single, integrated, end-to-end solution from one vendor. While we understand the apparent appeal of this approach, in reality it can be a rigid and expensive option. The “monolithic” approach puts your company at the mercy of one vendor’s ability to deliver all required capabilities, and makes it more difficult to assemble best-of-breed solutions on open platforms.
  • How fast can you innovate? Organizations locked into a single vendor’s solution will be only as responsive to market and regulatory change as their vendor is. By contrast, those powered with open core technology will be able to vet and integrate new apps, devices and solutions as fast as the marketplace develops them.
  • How comfortable are you with a competitor powering your business? The large national health insurance carriers that have launched platform and technology operations still maintain their core insurance businesses. Our clients express concern about ensuring the companies/vendors they rely on to provide new platform and technology services are separate from the companies they compete with on their core businesses of health insurance and care delivery.

Evaluating open core technology

Many of our clients want to modernize while still getting value from their existing technology investments, data stores, and accumulated experience and knowledge. That flexibility is possible with core technology that has an intelligence layer that can absorb, integrate, orchestrate and act upon data across the ecosystem. By using core environments with open, standards-based APIs, business logic and workflows, organizations can assemble best-of-breed applications and tools to create experiences specific to their member and population health needs.

Other key qualities to look for in next-gen core technology include:

  • Real-time functionality. Core technology must have real-time change capabilities fed by insights and recommendations from inside and outside the system. Both internal audiences and members/patients should have access to insights for decision-making, from comparing prices to assessing quality ratings of providers for a specific procedure.

  • Support for next-generation payment models. Core technology must enable outcomes-based benefit and contract designs, bundled services, reference pricing, provider incentives and real-time adjudication and payment. As data barriers fall, systems increasingly will need to fully integrate claims and medical data with specialty, pharmacy and non-medical health services.

  • Event-driven insights and automation. When core systems incorporate engines that publish data generated from EMR, claims and other processes in real time, that data can be made available throughout the organization’s application ecosystem. Using artificial intelligence, organizations can then make key processes predictive and automated, such as a machine learning-based review of rejected claims that then approves claims that meet a specific threshold.
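As a rough illustration of the event-driven automation described above, here is a minimal sketch of a machine learning-style claims triage step. Everything in it (the event schema, rejection codes and scoring heuristic) is hypothetical, and the trained model is stubbed out with a toy rule; it only shows the shape of "score each published rejection event, auto-approve above a threshold, route the rest to a reviewer":

```python
from dataclasses import dataclass

@dataclass
class ClaimEvent:
    """A rejected-claim event published by a core system (hypothetical schema)."""
    claim_id: str
    amount: float
    rejection_code: str

def approval_score(event: ClaimEvent) -> float:
    """Stand-in for a trained ML model: confidence that the rejection was
    erroneous and the claim should be approved on review."""
    # Toy heuristic: common data-entry rejections are usually approvable;
    # very large claims are penalized so they fall back to a human.
    score = 0.9 if event.rejection_code == "DATA_ENTRY" else 0.2
    return score - (0.2 if event.amount > 10_000 else 0.0)

def triage(events, threshold=0.8):
    """Auto-approve claims whose score clears the threshold; route the
    rest to manual review."""
    approved, manual_review = [], []
    for e in events:
        (approved if approval_score(e) >= threshold else manual_review).append(e.claim_id)
    return approved, manual_review

events = [
    ClaimEvent("C-1", 250.0, "DATA_ENTRY"),
    ClaimEvent("C-2", 50_000.0, "DATA_ENTRY"),
    ClaimEvent("C-3", 120.0, "MISSING_AUTH"),
]
approved, manual = triage(events)
```

In a production system the heuristic would be a model trained on historical adjudication outcomes, and the events would arrive from a streaming platform rather than an in-memory list; the threshold is the governance knob that decides how much of the process is automated.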

Healthcare organizations’ core technology strategies go beyond one company’s balance sheet. They will influence how effectively the industry shifts to value-based models that promise improved outcomes and frictionless experiences at lower costs. Today’s interoperability regulations enable industry players to converge their data sets and use analytics to identify care gaps and social determinants of health. But friction points and outright obstacles among payers and providers with obsolete core systems still inhibit realization of these benefits.

Those that embrace open core technologies will be well positioned to tap the power of the emerging health ecosystem and its fluid data flows to deliver the end-to-end health experience consumers want. Organizations that lock in with one vendor may perpetuate these barriers, ultimately hurting their ability to compete.

William “Bill” Shea is a Vice-President within Cognizant Consulting’s Healthcare Practice. Patricia (Trish) Birch is Senior Vice-President & Global Practice Leader, Healthcare Consulting, at Cognizant.


The Forces Driving Healthcare to Rethink Its Platform Strategies

Senior healthcare leaders need to understand the drivers of industry change, as well as the technology underpinnings needed to support new value propositions.

The healthcare industry is rapidly being recontoured by the forces of digital innovation, new industry regulations and growing consumer expectations for better healthcare experiences. From our observations, healthcare payers and providers increasingly have both the technology tools and the regulatory incentives to overcome the industry’s traditionally adversarial silos and barriers.

The goal: delivering consumer experiences that seamlessly unify payer and provider functions and physical and digital channels. Soon, care, payment, follow-up, referrals and ancillary services will all be blended into a single, frictionless process that makes traditional boundaries between payers and providers transparent to consumers.

Already, payer-provider convergence is accelerating as interoperability, price transparency and aggressive new entrants with deep pockets erode the industry’s traditional barriers and business models. We’re seeing health plans executing diversification strategies designed to increase alignment and collaboration with providers. Some payers, including UnitedHealth Group/Optum and Blue Shield of California, have even bought provider practices.

On the provider side, some — including UPMC and Intermountain Health — have stood up their own health plans, taking on both insurance risk and the financial risk of shifting to more value-based contracts based on improving health outcomes. The race is on to deliver a seamless healthcare experience, and long-term incumbent players and new entrants alike are placing their bets.

To succeed in the emerging healthcare landscape, senior healthcare leaders need to first understand the forces reshaping healthcare and the experiences payers and providers deliver, as well as the impact of these forces on core healthcare technology.

In this article, we’ll look at the drivers of change and what they entail for technology selections, and in an upcoming essay we’ll evaluate emerging core technology strategies and explain the capabilities healthcare organizations need from next-gen technology.

A radical industry makeover

New market forces are driving out longstanding healthcare complexities and costs that historically have stymied innovation by preventing efficient data sharing and market-driven economics. These new developments include:

  • Consumer-empowering regulations. The federal data interoperability rule breaks open data silos and sets standards to enable data flows among entities both inside and outside of healthcare. Similarly, price transparency regulations that reveal contract data and negotiated rates are already influencing payer-provider negotiations. This is inching the industry closer to a market-driven model for pricing medical services.

    Together, these regulations are rendering legacy business models based on proprietary data ownership and opaque pricing strategies obsolete. Consumers stand to gain real control over their health information and the ability to easily comparison-shop on quality and value.

  • Mass “API-ification” of core payer and provider functions. With new advancements in technology — combined with interoperability regulations mandating standards-based application programming interfaces (APIs) — an ecosystem-powered approach is emerging to drive value from platform and technology investments. Standards-based APIs will help providers and payers overcome process gaps and inefficiencies by enabling real-time automated transactions.

    For example, an API could enable a best-of-breed quality management system to exchange data with a core administrative system, resulting in the ability to streamline quality reports and identify care gaps more quickly. APIs also will democratize data and functionality, enabling innovation marketplaces for third-party apps and developers. Imagine an App Store or Play Store specific to healthcare. Apps could range from FDA-vetted clinical functions to medical appointment reminders.

  • New value-based care and payment models. The adoption of new payment models is aligning payer and provider incentives and inspiring new business and care delivery models that are breaking down traditional adversarial silos. This is accelerating payer/provider convergence.
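To make the API-driven exchange above concrete: a best-of-breed quality management system might pull claims from a core administrative system's standards-based endpoint and flag members missing a measured procedure. The sketch below assumes a FHIR-style search Bundle of Claim resources; the member references and procedure codes are illustrative, not taken from the article:

```python
def care_gaps(bundle: dict, measure_code: str) -> list:
    """Scan a FHIR-style search Bundle of claims and return member
    references with no claim line carrying the quality measure's
    procedure code, i.e. care-gap candidates."""
    codes_by_member = {}
    for entry in bundle.get("entry", []):
        claim = entry["resource"]
        member = claim["patient"]["reference"]
        line_codes = {
            item["productOrService"]["coding"][0]["code"]
            for item in claim.get("item", [])
        }
        codes_by_member.setdefault(member, set()).update(line_codes)
    return sorted(m for m, codes in codes_by_member.items()
                  if measure_code not in codes)

# Canned response shaped like a FHIR search Bundle (e.g. from GET /Claim?...)
bundle = {
    "entry": [
        {"resource": {"patient": {"reference": "Patient/1"},
                      "item": [{"productOrService": {"coding": [{"code": "77067"}]}}]}},
        {"resource": {"patient": {"reference": "Patient/2"},
                      "item": [{"productOrService": {"coding": [{"code": "99213"}]}}]}},
    ]
}
gaps = care_gaps(bundle, "77067")  # members with no screening claim on file
```

Because both systems speak the same standards-based schema, the quality tool needs no proprietary integration work; it consumes the core system's API the same way any third-party app in the ecosystem would.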

A new platform for a reshaped industry

These forces — and the new industry topology they are creating — are testing the limits of traditional health industry technology platforms and surrounding applications. Traditional technology infrastructures are ill-equipped to support new value propositions based on interoperable data flows, transparent pricing and open ecosystems. 

Consider the friction points that exist at the historically adversarial payer/provider boundary: authorizations, referrals management, claims adjudication, denials management, appeals and grievances, etc. Both sides are heavily invested in a parallel revenue cycle management (RCM) arms race plagued by redundancy, inefficiency and costs.

Converting these friction points into value for consumers and healthcare organizations will require open platforms and an ecosystem approach to technology and automation. These must be built on an intelligence layer that draws on data and insights from combined consumer, payer and provider data sets — and increasingly, from combined payer/provider business models.

An evolution ahead

To support new business models based on payer/provider collaboration and even co-ownership of health systems, the industry’s technology platforms and strategies need to evolve. Yet we see many organizations trying to drive these alignments with outdated platform strategies that are technologically incapable of supporting the data fluidity and consumer-centricity these new business models require. Today’s profound business and operating model shifts require fundamental changes in how business and enabling technologies are architected.

At the same time, healthcare organizations need to resist the market noise and focus on making sensible investments that build on, and drive value from, their existing platform investments and leverage their incumbent advantages. Organizations must take time to understand the next-generation technology capabilities they’ll need.

In Part 2, we’ll dissect emerging solutions for delivering frictionless end-to-end consumer experiences and the characteristics healthcare organizations need from next-gen tech to generate opportunities in a reshaped consumer-focused industry.

 

William “Bill” Shea is a Vice-President within Cognizant Consulting’s Healthcare Practice. Patricia (Trish) Birch is Senior Vice-President & Global Practice Leader, Healthcare Consulting, at Cognizant.


Data Scores Big for the Sports Fan Experience

Here’s how sports leagues and teams can boost the fan experience through personalization and real-time data analytics.

Sports are about head and heart, or so the traditional thinking goes. Now, though, another crucial element is impacting the sports industry: data. Instead of eroding the inherently human qualities of playing and watching sports, data is enhancing them, enabling leagues and teams to capitalize on the direct-to-consumer (DTC) model and attract more players and fans. By promoting the excitement of sports, leagues and teams also reap big business outcomes through an improved fan experience.

Most leagues and teams are rookies at tackling data and managing fan relationships. They also face a DTC pipeline that’s more fragmented than in other industries, featuring multiple stakeholders with hugely varying needs that rely on different platforms and channels.

For example, while some fans consume sports through streaming services and traditional broadcast channels, others purchase tickets through stadiums and web-based ticket services. Enthusiasts shop for merchandise at online and physical stores. Teams share performance analytics in one form for coaches and players, and another for fans. Segmentation and personalization are highly complex.

What’s more, the sports industry faces stiff competition from the entertainment sector for consumers’ time and attention. The biggest competitors for leagues and teams aren’t necessarily other sports but YouTube and Netflix. They’re in a competition for eyeballs.

The role of personalization

Capitalizing on DTC means hyper-personalization on steroids. In the UK, the Football Association (FA) is the governing body for all of football, and it manages a diversity of revenue and stakeholders that includes not only grassroots, senior and professional teams but also 90,000-seat Wembley Stadium. After a year of cancelled matches and limited play, the FA is putting data to work to coax people back into the game — and seeing big gains.

To make it happen, it’s relying on a next-generation engagement platform that we created to power tailored messaging for the FA’s many stakeholders. For example, to feed the association’s need for building elite women’s and men’s senior leagues, we developed a digital app for pre-teens and teenagers that offers engagement tools such as online gaming and social media to successfully compete for their attention.

The FA’s future vision includes serving up real-time analytics during matches, offering data-hungry fans on-field performance details like the latest stats on England captain Harry Kane.

It’s personalization at its most complex, and the FA is already seeing results. An initiative that encourages young girls to join football clubs resulted in 60,000 new participants, according to the FA. In addition, its new stadium hub triggered a 25% jump in youth registration for ticket sales, the FA notes. Through the next-gen platform, the FA expects to manage 1.5 million players this year.

Breaking through barriers

The FA isn’t alone in taking a deep dive into personalization. Legendary Formula 1 racing champion and automaker Aston Martin is looking to target and nurture a diverse racing fanbase in addition to the high-net-worth individuals who buy its iconic cars. Global racing competition SailGP is similarly looking to broaden its followers. (Cognizant is a sponsor for both organizations.)

Both motorsports and sailing face the same hurdle: Opportunities to engage are limited compared with other sports. While it’s easy to start kicking around a ball, barriers to entry are much higher for getting behind the wheel or on the water.

Engaging and expanding the fanbase is key for both sports. To connect fans with a product they can identify with and consume, Aston Martin is exploring development of custom fan experiences tailored to individual interests. For example, it hopes to use data to create one experience for fans who are longtime admirers of the brand and focused on this season’s car and engineering feats, and another for those who are, say, avid followers of driver Sebastian Vettel. (Click here for more details on how Cognizant is partnering with Aston Martin to modernize its business.)

And then there’s the power of convergence: Nielsen attributes its prediction for big growth in F1’s fanbase not only to Netflix’s popular series Formula 1: Drive to Survive but also to young drivers’ presence on platforms such as Twitch and YouTube.

We’re also partnering with SailGP to improve audience insights and enable data visualization. Our team is leading a CRM implementation to help SailGP manage fan data in a single database and communicate better with fans to deepen engagement and increase the viewing audience. To capture the exhilaration of sailing in one of the race’s hydrofoil-supported catamarans, SailGP plans to include an immersive fan experience that shares real-time metrics such as sailors’ heart rate and supercharged boat speeds that can break 50 knots as the catamarans fly above the water.

Standing out in the world of entertainment

For the sometimes insular world of sports, DTC and personalization are about much more than technology. They require a mindset shift for leagues and teams as they take on increasing responsibility for fan relationships and experiences.

Monetization is a motivator, but so is necessity. As viewers face a tidal wave of content and entertainment options, mastering sports’ changing playbook is helping leagues and teams to stay in the mix.

 

David Ingham is a Cognizant Client Partner for Media, Entertainment & Sport, based in London. He manages The Football Association, Aston Martin Cognizant F1 Team and SailGP, among other clients in his portfolio. Across these clients, Cognizant is working on a mix of engagement, covering performance, audience analytics, grassroots engagement and digital product creation. David can be reached via email or LinkedIn.


Financial Disruption Is Global…and Accelerating

How fast can fintechs address global needs? Two industry leaders say the future of financial services firms will be built on data, automation, and the hybrid cloud.

“When you think about globalization, the question is how fast can fintechs address global needs and gaps.” That’s how BMC’s Strategic CTO, Greg Bukowski, began a discussion about disruption across the financial services industry. The conversation addressed how data, automation and the hybrid cloud are driving customer experience innovation. Bukowski was joined by Morgan McKenney, recently departed COO of Global Consumer Banking at Citi, and CDX Founder Drew Ianni at the recent CDX + FIN Accelerate FinTech Summit in New York City and online (speakers appeared in person at CDX’s offices).

“Financial disruption is accelerating as we speak,” said McKenney. “Most banks have realized they can’t do everything themselves and, even for large banks, partnering with fintechs – to improve capabilities with your customers – is a well-trodden path. [Disruption] has already happened on the consumer front, and soon it’s going to happen in a much deeper way at the institutional level, with new digital currencies and networks.”

Data is the New Digital Currency

BMC believes successful companies of the future will be “autonomous digital enterprises,” in which automation is used not just to streamline routine tasks and monitor and troubleshoot digital systems and networks, but also to continually analyze and refine corporate and consumer data to make it more powerful. “Data is the new digital currency,” Bukowski said. He added that BMC finds all financial services firms, including disruptive fintechs, are looking to leverage value-added data and insights derived from automated processes that allow them to provide superior customer experiences and journeys.

McKenney agreed and talked about the challenges created historically because Citibank was assembled over time as the result of dozens of acquisitions. Each of them had different databases and systems, so the challenge was integration, particularly in data processing and automation. She believes that to fully embrace automation and best take advantage of the cloud, companies “have to have an amazing data stack to be able to drive and automate continuous learning, while leveraging AI to help with things like personalization for consumer applications.”

Not Everything Has Moved to the Cloud

While there has been plenty of hype around public cloud providers and the innovations, solutions and cost savings they can deliver, most global financial services firms still continue to rely heavily on mainframe, on-premise computing to run their operations. 

Said McKenney: “As we think about cloud at a high level, Citi processes $4 trillion worth of payments a day, and holds $1.3 trillion of balances on behalf of our customers. And that means you need hyper-resilient, ultra-secure, always-on systems.” She noted that many financial services firms have moved some data and infrastructure to the cloud because of the added scalability and agility that allows, but that a significant amount of computing infrastructure, systems resiliency, and data storage remains on-premise, since security is so critically important.

“We work with several large banks and the vast majority of their business still runs on mainframes,” added Bukowski. “In fact, we just released a mainframe survey where we interviewed 1,600 customers, and 92% of them are growing workloads on their mainframe, on-premise systems.” He also acknowledged that the share of workload going to the cloud is increasing because it “provides superior scalability and the opportunity to efficiently create and deploy applications that impact the customer experience and journey.” Bukowski believes we will live in this “hybrid” IT world for some time, but as mission-critical business applications are redeveloped over time, they can migrate to the cloud and be accessed as SaaS-based applications and solutions.

Building For The Future – Mastering The Customer Experience

Looking ahead, Bukowski noted that larger financial services firms have a unique opportunity to “partner with the fintechs building out these platforms and services and migrate what would be mainframe workload into cloud-based SaaS services.” He currently sees this happening, for example, in the insurance industry which he describes as “a commoditized market taken over by fintechs – where the savvy players are reducing their mainframe footprint and using more SaaS-based (cloud) solutions.”

McKenney said the future is about “digitizing end-to-end,” adding “it’s not just about the glossy, customer-facing front-end. It’s everything in the middle and the back-office that ultimately delights the customer.” She believes that as far as customer experience goes, the bar is now being set as much by companies outside of financial services as by those inside it. The customer expects to have great experiences regardless of what kind of company they are interacting with or where they are engaging that brand. McKenney also believes traditional financial service providers can more proactively involve their customers in product development: “I would love to see Citi’s customers giving the bank product ideas, where they become our product developers as well.”

Bukowski closed the session by suggesting that the autonomous digital enterprise will recognize that its computing and application development infrastructures must ultimately be architected and deployed to make internal teams more efficient and productive. Automation will play a big role, and “as companies fundamentally transform their operational models, that will move workloads off the mainframe and into the cloud.” This will take time, of course, and enterprises of all sizes must continue to innovate and improve the customer experience by leveraging a hybrid IT stack.


12 Corporate Experts Talked Best-Practices for Innovating in Big Companies. Here Are Their 4 Conclusions.

Just because you can innovate, doesn’t mean you should. And sometimes core parts of the business need to be “gently blown up.”

We’re at a “nearly unprecedented inflection point” for every enterprise on the planet. That’s how Columbia Business School’s eminent strategy and innovation expert Rita McGrath began a wide-ranging discussion about the state of open corporate innovation, at a recent virtual roundtable with senior executives from a wide range of industries. She was referring to the potent intersection of pandemic response and the resulting global business epiphany about the urgency of digital transformation and digital innovation.

The session was hosted by Ruth Yomtoubian, Senior Director of the VSP Global Innovation Center and moderated by CDX. VSP Global is a leader in health-focused vision care that provides affordable access to eye care and eyewear for more than 80 million members through a network of more than 39,000 doctors. The VSP Global Innovation Center harnesses their expertise and infrastructure to bring innovative products, services, and experiences to market.

Yomtoubian, a veteran of managing open innovation strategy and labs for firms including AT&T before she came to VSP, provided insight into the Global Innovation Center’s (GIC) approach: “Our goal is to understand outside shifts for our company while staying connected to the larger organization.” Her colleague Zachary Poll added, “We see the external innovation ecosystem as our friends and partners to help us drive innovation forward.”

1. The crisis proved how fast companies can move when forced

In a presentation describing the challenges associated with driving innovation forward, particularly at legacy enterprises, McGrath reflected on the last eighteen months and discussed her recent work with big companies: “The first interesting thing I learned was how fast companies could go, when faced with the crisis…It took that pressure, that crisis, that moment of ‘We don’t have the option of just doing nothing – we’re going to have to do something now.’…When you’re moving through this type of inflection point, it can take your business to new heights, or it can take your business to catastrophe. Companies are really grappling with what that means. And what it means…is deep uncertainty.”

McGrath also discussed the importance of having executive-level knowledge about how to enable change, and how to “sculpt-in” that knowledge. “For many organizations,” she said, “there is this mythology that innovation is this ‘dark art’–nobody understands how to do it; it’s undisciplined; and people waste a lot of money. What I’m always astonished by is how much we do know and how little of that knowledge is actually mainstream in the management community.”

2. Just because you can innovate, doesn’t mean you should

The conversation continued with a group discussion on innovation best practices and how to avoid typical pitfalls associated with creating digital transformation.

A senior innovation executive at a global airline said successful innovation starts with deciding whether or not you even should innovate. “Just because you can doesn’t mean you have to,” he said. His team explored some emerging – and supposedly game changing – technologies like blockchain and beacons, but decided not to pursue them, because they did not solve a core customer or operational problem.

He said innovation is “not just about understanding your customers and their needs but empowering your employees to have agency to innovate.” Referring to his company’s core product, he said “it’s just a tube in the air. So what’s different? Our products, relative to the competition, are more or less the same. But it’s our people that are different and so we intently focus on our people because that’s the frontline. And we want to innovate with our frontline people because they have the best ideas.”

3. Core parts of the business need to be “gently blown up,” sometimes with new models

The open innovation leader for a big consumer products company said his team’s innovation approach is “multi-type innovation – thinking about the entire consumer experience on top of the product.” His company, he said, has become fairly adept at the discovery process. But the next and more difficult step for his team is “grappling with the profit-model side. That involves changing the innards of your company, because most organizations are optimized for the past and want to protect the core.” Impactful innovation, he said, requires not only exploring new business models but also questioning the models of the past. “And pitching new business models is difficult,” he conceded. “It’s really hard to make the case for ‘gently’ blowing up parts of your existing organization to build for the future.”

4. Putting the right startup in front of decision makers isn’t enough

There was significant interest in how companies committed to open innovation should engage with broader industry ecosystems. The head of an innovation center for an advanced-materials company talked about what it encountered when the company set up the center in Silicon Valley: “The initial mission was to accelerate opportunities across the enterprise. It has evolved into focusing on driving more strategic products, by leveraging the ecosystem that we engage through the innovation center.” As for so-called “demo days” that her center set up to help educate the enterprise? “They didn’t work,” she explained. “We said, ‘Hey, there’s this great thing you guys can leverage, and we can bring in all these startups.’ They were polite, and listened, but didn’t take it up. You need to have the subject matter experts in the right spot, and we weren’t the subject matter experts for someone else’s project. So it just created a difficulty.”

Yomtoubian closed the intensely interactive session by saying that what she took away from it all was “that to be successful, there is a certain level of exposure and education that needs to happen first-hand across the company if innovation is to move forward and thrive.”
