The Data Challenges Behind ESG Investment

The recent explosion of environmental, social and governance (ESG) investing has created significant opportunities for investors and financial institutions. But with no universally accepted definition of ESG, how does anyone measure performance?

The recent explosion of environmental, social and governance (ESG) investing has created significant opportunities for investors and financial institutions. The adoption of the Paris Agreement and its recent reinforcement in Glasgow, along with the election of Joe Biden in the US, have accelerated the already-growing trend toward ESG. Annual cash flow into ESG funds has grown tenfold since 2018, and ESG assets under management are expected to account for one-third of all investments worldwide by 2025, or approximately $53 trillion.

But this massive shift also presents a set of profound data challenges. With no universally accepted definition of ESG, how does anyone measure performance? And with more companies and countries promising to reach carbon neutrality by a given year, how does anyone measure whether they are actually on track, much less standardize that performance across industries and national boundaries? The capture, organization, and modeling of the data involved are too complex and too expensive for the vast majority of organizations to handle on their own.

ESG’s data challenges were the topic of a recent Techonomy virtual salon hosted by Red Hat and moderated by James Ledbetter, Chief Content Officer of Clarim Media, parent company of Techonomy. The salon explored the roots of recent developments in ESG, how institutional players are tackling the new data challenges, and where the data-driven approach to ESG is headed.

Richard Harmon, vice president and global head of financial services at Red Hat, asserted that the company’s Linux and open source background lends itself well to the ESG data challenge. “From the roots of the open source community, open culture, which we [at Red Hat] have, topics like climate change and the wider ESG framework are natural things for us to dive into.” He noted that Red Hat had joined a consortium called OS Climate, launched last year by the Linux Foundation to “address climate risk and opportunity in the financial sector.” Its members currently include Amazon, Microsoft, Allianz, BNP Paribas, Goldman Sachs, and KPMG. This broad group of companies, many of which compete with one another, is necessary, Harmon said, because the “volume and complexity of the data that needs to be utilized is unmatched.”

Lotte Knuckles Griek, head of corporate sustainability assessments at S&P Global, explained that she has been measuring company performance on ESG metrics for 15 years, and that S&P has been involved in ESG rankings since the beginning of the 21st century. The tremendous growth in the sector, she added, has shifted the stakeholders that S&P serves, which raises the challenge of how to deliver data in the most useful way. “There’s a huge variety of investors out there looking to consume ESG data, and do very different things with them,” Griek explained.

Some of the shifts in recent years have been driven by changes in regulation and government policy. For example, in the United States the Biden Administration seems to favor some kind of uniform definition of what constitutes ESG investment. In Europe, the European Central Bank has already shifted the criteria for its massive pension investments toward “green bonds,” which in turn gives individual countries and companies an incentive to embrace ESG goals.

For her part, Griek said that she always welcomes new government standards and regulations, because they help draw attention to important ESG issues. “But it also totally raises the complexity faced by companies,” she continued. “One framework says report data in terawatt hours, another says report in megawatt hours…all these frameworks tend to cause confusion.”
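The unit mismatch Griek describes is one of the simpler harmonization problems, but it illustrates the kind of data plumbing ESG analysts face. Below is a minimal sketch of normalizing energy disclosures reported in mixed units into a single comparable figure; the conversion table, field names, and sample companies are hypothetical, not drawn from any particular framework.

```python
# Hypothetical sketch: normalize energy disclosures reported in mixed units
# (MWh, GWh, TWh) into one unit so figures can be compared across frameworks.

UNIT_TO_MWH = {   # conversion factors into megawatt hours
    "MWh": 1.0,
    "GWh": 1_000.0,
    "TWh": 1_000_000.0,
}

def to_mwh(value: float, unit: str) -> float:
    """Convert a reported energy figure into megawatt hours."""
    if unit not in UNIT_TO_MWH:
        raise ValueError(f"Unrecognized reporting unit: {unit!r}")
    return value * UNIT_TO_MWH[unit]

# Three imaginary companies reporting under different frameworks.
disclosures = [
    {"company": "A", "energy": 42_000, "unit": "MWh"},
    {"company": "B", "energy": 3.1, "unit": "TWh"},
    {"company": "C", "energy": 870, "unit": "GWh"},
]

normalized = {d["company"]: to_mwh(d["energy"], d["unit"]) for d in disclosures}
print(normalized)  # every value now expressed in MWh
```

In practice the harder problems are disagreements over scope and methodology rather than arithmetic, which is part of why the salon's participants emphasize shared, open approaches to the underlying data.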

Andrew Lee, Managing Director and Global Head of Sustainable and Impact Investing at UBS Global Wealth Management, put the data dimension into a broader context of how UBS (and other institutional investors) serve their clients, who are increasingly interested in sustainability. “Data doesn’t necessarily drive a determination of what’s sustainable or not, for us,” Lee said, explaining that UBS clients or other investors are ultimately seeking transparency into how their money is being invested. “Data is a critical input into a process that looks at how sustainability can be incorporated into investments.”

One topic that recurred throughout the discussion was the need for data that not only captures performance to date, but that can be used to build reliable models of the future. Lee noted that, while everyone in the ESG ecosystem praises the idea of standardization, business imperatives require something beyond it. “As an investor, you want a little bit of standardization, but you want to create alpha, right?” (In investment parlance, creating “alpha” means generating returns that beat the market: a manager’s competitive edge.) “So certain things in your process, maybe it’s the forward-looking data or how you interpret that, give you that edge or ability to outperform.”

In the end, Harmon stressed the need for world-class analytics capability. “That’s what we see as the real differentiator, to solve this global problem that we’re all facing.” 


New IP-Based Lending Model Gains Momentum

One longtime patent attorney put it this way: “If data is the new oil, IP is the new gold.”

It was a classic technology CEO quandary. BrainScope, the medical device company that Susan Hertzberg leads, was ready to bring its device and software for assessing mild traumatic brain injury to a wider market. Unlike expensive, radiation-heavy and hard-to-schedule CT scans, BrainScope can help assess the likelihood of a brain bleed in 20 minutes while also gauging the severity of a concussion. Hertzberg was hesitant to ask the company’s true-blue investors, who had been loyal through 12 years and seven funding rounds, to provide more capital or to further dilute their financial stake. Seeking a way forward, Hertzberg says, she nonetheless began talking with potential equity partners.

During the summer of 2021, a new option presented itself. Aon, a leading global professional services firm that already served as an advisor to BrainScope, introduced Hertzberg to the idea of intellectual property (IP)-based valuation and related non- or minimally-dilutive funding. BrainScope could raise money based on the value of its IP—and lenders could have peace of mind, knowing the asset was valued using Aon’s data-enabled algorithms and insured like any tangible property. After consideration, Hertzberg opted to go down the path of IP-backed lending. 

BrainScope and Aon announced a $35 million IP-backed lending arrangement this fall. The funding gives BrainScope the resources to expand its commercial footprint, reaching more hospitals and connecting with concussion centers so doctors can get rapid, objective insights about the likelihood of brain bleeds and concussions. Further, the “concussion index” capability added earlier this year gives clinicians the first objective, brain-based biomarker tool to track concussed patients, athletes, and military personnel and aid in determining their readiness to return to activity. With 70 million head injuries reported annually around the world, the company will be able to expand globally as well. Next-gen versions of the device show promise in efficiently detecting stroke, dementia and depression.

Aon may be one of the first to help companies articulate and realize the full value of their IP portfolios, but IP looks likely to be a significant focus in 2022. Why now? A more apt question might be, why not earlier? Today, intangible assets account for 90 percent of the value of the S&P 500 (and 90 percent of the value of Fortune 500 companies), according to a report from Ocean Tomo. Translation: in the S&P’s case, that’s more than $20 trillion. Then there’s the explosion in intellectual property that’s taken place in the past 30 years. To put the acceleration in context, it took more than a century for the U.S. Patent Office to record its millionth patent, but just three years to go from 9 million to 10 million, a milestone reached in 2018. Lewis Lee, a longtime patent attorney who heads Aon’s IP Solutions, puts it this way: “If data is the new oil, IP is the new gold.”

Yet the financial landscape has been slow to change. Much of the problem has been that IP is harder to put a cash value on than, say, real estate or product inventory. But Aon’s solution, which in Lee’s words is “developing the ability to measure IP at scale,” is changing that. “Our platform aggregates not only intellectual property data but also financial data. We look at risk data, we look at litigation data, we look at licensing data and then we use a series of algorithms to understand those in a material way.” AI, machine learning and natural language processing are put to use in the technique. The result not only presents funders and insurers with a stronger assessment of a company’s IP value, but ties that value to revenue streams and shows how the two will only become more inextricable.
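Aon has not published its algorithms, so Lee's description above is the only public outline. Purely to illustrate the general pattern he describes (blending IP, financial, litigation and licensing signals into a single view of value), here is a toy scoring sketch. Every feature name and weight is invented for the example; it is not Aon's method.

```python
# Illustrative only: a toy weighted-score model blending several data sources
# into one IP portfolio score. Feature names and weights are invented for the
# sketch; a real platform would learn them from market and licensing outcomes.

from dataclasses import dataclass

@dataclass
class IPPortfolioSignals:
    patent_citation_strength: float    # 0-1, normalized forward-citation measure
    revenue_linkage: float             # 0-1, share of revenue tied to the IP
    litigation_risk: float             # 0-1, higher means more disputed claims
    licensing_income_stability: float  # 0-1, consistency of licensing cash flows

def portfolio_score(s: IPPortfolioSignals) -> float:
    """Combine signals into a 0-100 score; litigation risk counts against it."""
    raw = (
        0.35 * s.patent_citation_strength
        + 0.30 * s.revenue_linkage
        + 0.25 * s.licensing_income_stability
        - 0.20 * s.litigation_risk
    )
    return max(0.0, min(1.0, raw)) * 100

print(portfolio_score(IPPortfolioSignals(0.8, 0.7, 0.2, 0.6)))  # -> 60.0
```

A real platform would treat such a score as one input to underwriting and insurance pricing, not a final answer, and would update it as litigation, licensing and revenue data change.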

Skeptics might ask: how are lenders going to protect themselves if, say, a copyright infringement or cybersecurity breach occurs? Insuring a building against a fire is one thing, admits Lee. Losing a key trade secret can be the end of a business forever. But again, he points to valuation. If you can put a monetary value on something, it can be insured, and Aon was able to “wrap” the deal for BrainScope with “a collateral protection insurance wrapper.” Further mitigating risk, he continues, is that clients appear very motivated to deliver on their profit promises in this lending configuration. “All they really have for value is their IP. They’re loath to lose it, so it would be very reasonable to expect that they’re going to do their best to make good on these loans.”

The first companies to use Aon’s new valuation tools and capital solution, which include eco-minded agritech Indigo Ag and razor company Shavelogic, show signs of success. Hertzberg says Aon’s combination of nimbleness (BrainScope’s IP report, which establishes the scope and value of the portfolio, was completed in record time) with industry-spanning business expertise makes it an ideal collaborator. “They’re insuring health systems, they’re in the personal injury market, in corporate wellness and workers compensation [so they’ve become] a strategic partner who’s going to help me actually grow this business.” Here’s a solid indicator of the potential for this new lending arrangement: Aon is developing its own proprietary lending solution to help support this business. Today the company works with a multiplicity of lenders and borrowers, totaling in the multibillions on both sides, who want to partake in the new lending model. “It makes sense,” says Hertzberg. “If you think about intellectual property and how much that’s the driving force behind so many different kinds of businesses today, it’s kind of a gem. They’re finding ways to unlock that value to create more value. I don’t think that trend is going to go away. I think it’s going to accelerate.”


How the New Approach to Cybersecurity Can Create Trust

Watch the recap of our session, in which a top cybersecurity practitioner, a chief trust officer, and a leader of the new era of security automation discuss practical approaches to automation in light of cybersecurity threats.

Leading companies can’t merely be digital-first. They must win the trust of customers and partners. But cybersecurity threats are evolving rapidly, so establishing that trust is hard. Meanwhile, talent is in short supply just as companies need ever more sophisticated expertise. To address the cybersecurity talent shortage, practical new approaches to automation are emerging. It’s all urgent and central to the business dialogue, because without secure systems, businesses will fail to innovate effectively. So what do CIOs and business leaders need to understand now to ensure their companies can be aggressive without becoming victims? What is the new suite of cybersecurity tools, and how can companies best deploy them? How can company technologists reassure CEOs and boards of directors that their systems can be trusted? This panel brings together a top cybersecurity practitioner, a chief trust officer, and a leader of the new era of security automation to chart the path forward.

SPEAKERS

  • Tony Buffomante – Senior Vice President, Global Head of Cybersecurity & Risk Services at Wipro Limited
  • Mike Kiser – Director of Strategy and Standards at SailPoint 
  • Elena Kvochko – Chief Trust Officer at SAP
  • Moderated by David Kirkpatrick – Founder and Editor-in-Chief at Techonomy Media

 

Download a recent white paper, “How the New Approach to Cybersecurity Can Create Trust: Five Ways to Minimize Cyber Risk in 2022.”

 



Navigating Platforms and Technologies for a New Healthcare Era

Here’s what payers and providers should look for as they update their core platform strategies for the new normal in healthcare.


As discussed in our previous article, many healthcare organizations we work with have set their sights on delivering holistic, frictionless customer journeys that span the boundaries between payers and providers and seamlessly blend physical and digital experiences. This vision stands in stark contrast to the disjointed, fragmented and inefficient processes that characterize today’s healthcare industry.

Before they can seize that grail, however, many healthcare organizations must first modernize, rationalize and reimagine their current core platforms and technologies. Historically, healthcare technology platforms have been optimized for an ecosystem formed in an era of proprietary data silos and adversarial market segments. Closed architectures required extensive custom code and manual workarounds to interact with external systems and devices.

These technology foundations can’t support many of the healthcare industry’s emerging attributes, including interoperable data sets across payer and provider settings, price transparency for medical services, on-demand and healthcare-anywhere care delivery models, and real-time transaction processing – to name just a few.

Put bluntly, the last-gen technology that many providers and payers currently rely on simply won’t support the demands of the new healthcare industry, and will be unable to provide a competitive advantage in the new normal.

An industry in motion

Recognizing this, some large healthcare industry players are reevaluating their technology strategies. We’re noticing an increase in market noise and hype as multiple industry stakeholders position themselves to create market value in the emerging interoperable, price-transparent healthcare economy. Some of the moves we’re seeing include:

  • Leading electronic medical record (EMR) vendors (most notably Epic but also others to varying degrees) are looking to expand beyond health system and hospital boundaries by actively diversifying into the health plan space. These vendors are investing in platforms and products that support traditional payer functionality (claims, membership, premium billing, etc.). Their belief is that an integrated, monolithic, end-to-end platform that spans payer and provider functions will drive value in a converging marketplace.

  • Many large national health plans have been moving into the platform and tech space for a while now, including UnitedHealth Group/Optum, CVS/Aetna, Anthem and others. These diversification strategies have historically included acquiring or monetizing and commercializing proprietary health plan assets (including platforms) and selling them to other health plans. 

    We expect that several of these companies will look to build out and commercialize next-generation healthcare core technology themselves, often in partnership with EMR vendors, to drive value by enabling seamless end-to-end experiences powered by shared data assets.

  • New entrants and Big Tech players such as Walmart Health, Amazon Care, Microsoft and Google are investing billions of dollars into introducing new business, operating and technology models to the industry. All of these efforts could have a significant impact on the evolution of core platforms in healthcare and the viability of legacy strategies.

Choosing a technology platform

We’re helping clients separate the hype from the substance of these emerging strategies by working through these key questions:

  • Will you be locked in? We’re hearing from healthcare C-suites who are initially enamored by the idea of buying a single, integrated, end-to-end solution from one vendor. While we understand the apparent appeal of this approach, in reality it can be a rigid and expensive option. The “monolithic” approach puts your company at the mercy of one vendor’s ability to deliver all required capabilities, and makes it more difficult to assemble best-of-breed solutions on open platforms.
  • How fast can you innovate? Organizations locked into a single vendor’s solution will be only as responsive to market and regulatory change as their vendor is. By contrast, those powered with open core technology will be able to vet and integrate new apps, devices and solutions as fast as the marketplace develops them.
  • How comfortable are you with a competitor powering your business? The large national health insurance carriers that have launched platform and technology operations still maintain their core insurance businesses. Our clients express concern about ensuring the companies/vendors they rely on to provide new platform and technology services are separate from the companies they compete with on their core businesses of health insurance and care delivery.

Evaluating open core technology

Many of our clients want to modernize while still getting value from their existing technology investments, data stores and experience and knowledge. That flexibility is possible with core technology that has an intelligence layer that can absorb, integrate, orchestrate and act upon data across the ecosystem. By using core environments with open, standards-based APIs, business logic and workflows, organizations can assemble best-of-breed applications and tools to create experiences specific to their member and population health needs.
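In concrete terms, “open, standards-based APIs” in healthcare usually means HL7 FHIR-style REST interfaces. The sketch below shows what consuming such an interface can look like; the server URL, patient ID, and token are placeholders, and a real integration would also handle OAuth flows, paging, and error recovery.

```python
# Simplified sketch of consuming an open, standards-based healthcare API
# (HL7 FHIR-style REST). The server URL, patient ID, and token are
# placeholders; this is not any particular vendor's endpoint.

import requests

BASE_URL = "https://fhir.example-payer.com/r4"   # hypothetical FHIR endpoint
TOKEN = "replace-with-oauth-token"               # placeholder credential

def get_patient(patient_id: str) -> dict:
    """Fetch a FHIR Patient resource as JSON from the hypothetical server."""
    resp = requests.get(
        f"{BASE_URL}/Patient/{patient_id}",
        headers={
            "Accept": "application/fhir+json",
            "Authorization": f"Bearer {TOKEN}",
        },
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()

# Example usage (requires a reachable server):
# patient = get_patient("12345")
# print(patient.get("name"))
```

Because the resource formats are standardized, the same client code can be pointed at different vendors' servers, which is what makes assembling best-of-breed applications practical.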

Other key qualities to look for in next-gen core technology include:

  • Real-time functionality. Core technology must have real-time change capabilities fed by insights and recommendations from inside and outside the system. Both internal audiences and members/patients should have access to insights for decision-making, from comparing prices to assessing quality ratings of providers for a specific procedure.

  • Support for next-generation payment models. Core technology must enable outcomes-based benefit and contract designs, bundled services, reference pricing, provider incentives and real-time adjudication and payment. As data barriers fall, systems increasingly will need to fully integrate claims and medical data with specialty, pharmacy and non-medical health services.

  • Event-driven insights and automation. When core systems incorporate engines that publish data generated from EMR, claims and other processes in real time, that data can be made available throughout the organization’s application ecosystem. Using artificial intelligence, organizations can then make key processes predictive and automated, such as a machine learning-based review of rejected claims that then approves claims that meet a specific threshold.
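To make the claims example above concrete, here is a minimal sketch of that pattern: score rejected claims with a model trained on past review outcomes and auto-approve only those it is very confident about, routing the rest to human reviewers. The features, training data, and threshold are invented for the illustration.

```python
# Hypothetical sketch of ML-assisted claims triage: auto-approve rejected
# claims the model is confident about, send everything else to a reviewer.
# Features and threshold are illustrative, not a real payer's logic.

import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy training data: [claim_amount_thousands, prior_denials, coding_mismatch_flag]
X_train = np.array([
    [0.12, 0, 0],
    [4.50, 2, 1],
    [0.08, 0, 0],
    [2.30, 1, 1],
    [0.15, 0, 1],
    [0.06, 0, 0],
])
y_train = np.array([1, 0, 1, 0, 0, 1])  # 1 = rejection was overturned on manual review

model = LogisticRegression().fit(X_train, y_train)

APPROVE_THRESHOLD = 0.9  # only auto-approve when the model is very confident

def triage(claim_features):
    p_overturn = model.predict_proba([claim_features])[0, 1]
    return "auto-approve" if p_overturn >= APPROVE_THRESHOLD else "human review"

print(triage([0.10, 0, 0]))  # likely auto-approved
print(triage([5.00, 3, 1]))  # likely routed to a reviewer
```

In production, such a model would sit behind the event stream described above, consuming claim events in real time and logging every automated decision for audit.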

Healthcare organizations’ core technology strategies go beyond one company’s balance sheet. They will influence how effectively the industry shifts to value-based models that promise improved outcomes and frictionless experiences at lower costs. Today’s interoperability regulations enable industry players to converge their data sets and use analytics to identify care gaps and social determinants of health. But friction points and outright obstacles among payers and providers with obsolete core systems still inhibit realization of these benefits.

Those that embrace open core technologies will be well positioned to tap the power of the emerging health ecosystem and its fluid data flows to deliver the end-to-end health experience consumers want. Organizations that lock in with one vendor may perpetuate these barriers, ultimately hurting their ability to compete.

William “Bill” Shea is a Vice-President within Cognizant Consulting’s Healthcare Practice. Patricia (Trish) Birch is Senior Vice-President & Global Practice Leader, Healthcare Consulting, at Cognizant.


Data Scores Big for the Sports Fan Experience

Here’s how sports leagues and teams can boost the fan experience through personalization and real-time data analytics.

Sports are about head and heart, or so the traditional thinking goes. Now, though, another crucial element is impacting the sports industry: data. Instead of eroding the inherently human qualities of playing and watching sports, though, data is enhancing them, enabling leagues and teams to capitalize on the direct-to-consumer (DTC) model and attract more players and fans. By promoting the excitement of sports, leagues and teams also gain big business outcomes through an improved fan experience.

Most leagues and teams are rookies at tackling data and managing fan relationships. They also face a DTC pipeline that’s more fragmented than in other industries, featuring multiple stakeholders with hugely varying needs that rely on different platforms and channels.

For example, while some fans consume sports through streaming services and traditional broadcast channels, others purchase tickets through stadiums and web-based ticket services. Enthusiasts shop for merchandise at online and physical stores. Teams share performance analytics in one form for coaches and players, and another for fans. Segmentation and personalization are highly complex.
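As a trivial illustration of the segmentation step, here is a sketch that assigns fans to coarse segments based on behavior a team can actually observe; the segment names and rules are invented, not drawn from any league's CRM.

```python
# Illustrative fan segmentation: assign each fan to a coarse segment based on
# observed behavior. Segment names and rules are invented for this sketch.

def segment_fan(fan: dict) -> str:
    if fan.get("season_ticket_holder"):
        return "matchday core"
    if fan.get("streams_per_month", 0) >= 4:
        return "digital regular"
    if fan.get("merch_orders_last_year", 0) > 0:
        return "merchandise buyer"
    return "casual follower"

fans = [
    {"id": 1, "season_ticket_holder": True},
    {"id": 2, "streams_per_month": 6},
    {"id": 3, "merch_orders_last_year": 2},
    {"id": 4},
]
print({f["id"]: segment_fan(f) for f in fans})
```

Real segmentation blends far more signals across ticketing, streaming, and merchandise systems, but the principle is the same: each segment gets its own messaging and experience.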

What’s more, the sports industry faces stiff competition from the entertainment sector for consumers’ time and attention. The biggest competitors for leagues and teams aren’t necessarily other sports but YouTube and Netflix. They’re in a competition for eyeballs.

The role of personalization

Capitalizing on DTC means hyper-personalization on steroids. In England, the Football Association (FA) is the governing body for football, and it manages a diverse set of revenue streams and stakeholders that includes not only grassroots, senior and professional teams but also 90,000-seat Wembley Stadium. After a year of cancelled matches and limited play, the FA is putting data to work to coax people back into the game, and it is seeing big gains.

To make it happen, it’s relying on a next-generation engagement platform that we created to power tailored messaging for the FA’s many stakeholders. For example, to feed the association’s need for building elite women’s and men’s senior leagues, we developed a digital app for pre-teens and teenagers that offers engagement tools such as online gaming and social media to successfully compete for their attention.

The FA’s future vision includes serving up real-time analytics during matches, offering data-hungry fans on-field performance details like the latest stats on England captain Harry Kane.

It’s personalization at its most complex, and the FA is already seeing results. An initiative that encourages young girls to join football clubs resulted in 60,000 new participants, according to the FA. In addition, its new stadium hub triggered a 25% jump in youth registration for ticket sales, the FA notes. Through the next-gen platform, the FA expects to manage 1.5 million players this year.

Breaking through barriers

The FA isn’t alone in taking a deep dive into personalization. Aston Martin, the legendary automaker and Formula 1 competitor, is looking to target and nurture a diverse racing fanbase in addition to the high-net-worth individuals who buy its iconic cars. Global racing competition SailGP is similarly looking to broaden its followers. (Cognizant is a sponsor of both organizations.)

Both motorsports and sailing face the same hurdle: Opportunities to engage are limited compared with other sports. While it’s easy to start kicking around a ball, barriers to entry are much higher for getting behind the wheel or on the water.

Engaging and expanding the fanbase is key for both sports. To connect fans with a product they can identify with and consume, Aston Martin is exploring development of custom fan experiences tailored to individual interests. For example, it hopes to use data to create one experience for fans who are longtime admirers of the brand and focused on this season’s car and engineering feats, and another for those who are, say, avid followers of driver Sebastian Vettel. (Click here for more details on how Cognizant is partnering with Aston Martin to modernize its business.)

And then there’s the power of convergence: Nielsen attributes its prediction for big growth in F1’s fanbase not only to Netflix’s popular series Formula 1: Drive to Survive but also to young drivers’ presence on platforms such as Twitch and YouTube.

We’re also partnering with SailGP to improve audience insights and enable data visualization. Our team is leading a CRM implementation to help SailGP manage fan data in a single database and communicate better with fans to deepen engagement and increase the viewing audience. To capture the exhilaration of sailing in one of the race’s hydrofoil-supported catamarans, SailGP plans to offer an immersive fan experience that shares real-time metrics such as sailors’ heart rates and supercharged boat speeds that can break 50 knots as the catamarans fly above the water.

Standing out in the world of entertainment

For the sometimes insular world of sports, DTC and personalization are about much more than technology. They require a mindset shift for leagues and teams as they take on increasing responsibility for fan relationships and experiences.

Monetization is a motivator, but so is necessity. As viewers face a tidal wave of content and entertainment options, mastering sports’ changing playbook is helping leagues and teams to stay in the mix.

 

David Ingham is a Cognizant Client Partner for Media, Entertainment & Sport, based in London. He manages The Football Association, Aston Martin Cognizant F1 Team and SailGP, among other clients in his portfolio. Across these clients, Cognizant is working on a mix of engagement, covering performance, audience analytics, grassroots engagement and digital product creation. David can be reached via email or LinkedIn.


Uber-Inspired Software Flags COVID-19 Variants Before They Explode

Until now there was no way to predict the most transmissible variants, or to guide policy as new strains emerged. Now software originally built for ride-hailing can analyze variant genomes with precision.

Data scientists took a tool originally developed for Uber and built a new prediction model to help make sense of emerging variants.

Until recently, there was no scientific way to predict which COVID-19 variants would be the most transmissible, and therefore no way to guide public policy as new strains emerged. For the past year and a half, public health experts have had to base their planning on simple observation, and to some extent, guesswork: which variants were becoming dominant in other regions or countries, and how soon could we expect to see them here?

Now, though, a collaborative team of data scientists, biologists, and infectious disease experts has applied machine learning advances originally designed, believe it or not, for the ride-sharing industry to this challenge. The result: a new tool that can actually predict the transmissibility of variants well ahead of time, accurately forecasting variant transmission patterns for the next one to two months. (Note: this tool and a description of its scientific validation have been posted as a preprint, which is a scientific paper that has not yet undergone the rigors of peer review.)

The tool would not have been possible without an unusual pairing. In the summer of 2020, data scientists who had previously worked for Uber joined one of the world’s leading genomic institutes, teaming up with scientists dedicated to fighting the COVID-19 pandemic. Last year, the Broad Institute (in this case, Broad rhymes with “rode”) in Cambridge, Mass., quickly converted some of its industrial-scale genomics lab capacity into a pandemic testing facility. In addition to determining whether samples were positive or negative for the SARS-CoV-2 virus, the team also sequenced tens of thousands of viral genomes.

Around the world, many laboratories are contributing to the database of viral genomes as well; the GISAID repository has had 3.7 million submissions. That’s a wealth of data, but running any kind of comparison across so many genomes is prohibitively costly in computational terms.

At the Broad, scientists wanted to do more with this data, and they had just the team to make it happen — three data scientists recruited from Uber’s AI team who had created a machine learning tool called Pyro to help customize models of traffic patterns and other elements for cities or regions. The tool was particularly good for building new models that contained many uncertain variables. When it was publicly released by Uber as an open-source platform, it got a surprising amount of uptake in the life science community, where it could be used for probabilistic modeling of biological experiments. “It’s actually more useful for science than it is for a ride-sharing company,” says Fritz Obermeyer, one of the developers who formerly worked at Uber.

At the Broad, Obermeyer and his colleagues quickly took up the challenge of mining the millions of available SARS-CoV-2 genomes to try to forecast the transmissibility of new variants. Rather than comparing every genome to every other genome, they streamlined the process by analyzing clusters of closely related variants. Their preprint describes the analysis of 2.1 million genomes, clustered into nearly 1,300 lineages representing more than 1,000 different regions around the world.

The machine learning tool they built is based on the original Pyro framework — this one is called PyR0, a play on the R0 metric used to assess disease transmissibility. It models variant patterns based on specific mutations in the viral genome. “The predictive capability of this model relies on the repeated emergence of the same mutation in different strains independently,” Obermeyer says. “That allows us to predict the growth rate of a particular strain based on the new mutations it has acquired.”

While the model relies on mutations that have been seen before, one of its most important features is that it does not need to know what any given mutation does. Typically, scientists seeking to assess transmissibility of a variant have to perform a series of lab experiments to tease out the precise function of each new mutation. For Obermeyer’s tool, those time-consuming functional tests aren’t necessary for forecasting. The model has access to all of the mutations from genome sequence data, and can infer from the data which ones are associated with increased transmissibility. That is a huge leap in capability for epidemiological researchers focusing on the COVID-19 pandemic.
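The core statistical idea is easier to see in miniature: treat each lineage's mutations as features and its observed growth rate as the target, fit a model, then score a new lineage from its mutations alone. The sketch below uses ordinary least squares on invented numbers; the actual PyR0 model is a Bayesian regression fit with Pyro over millions of genomes, not this toy.

```python
# Toy illustration of the core idea: regress observed lineage growth rates on
# the mutations each lineage carries, then predict growth for a new lineage
# from its mutations alone. All numbers here are invented.

import numpy as np

mutations = ["S:L452R", "S:P681R", "S:N501Y", "ORF1a:T3255I"]

# Rows = lineages, columns = 1 if the lineage carries that mutation.
X = np.array([
    [1, 1, 0, 0],
    [0, 0, 1, 0],
    [1, 0, 1, 1],
    [0, 1, 0, 1],
    [0, 0, 0, 0],
], dtype=float)

# Observed per-week log growth rate of each lineage (invented).
growth = np.array([0.35, 0.15, 0.40, 0.20, 0.00])

# Least-squares estimate of each mutation's contribution to growth.
coef, *_ = np.linalg.lstsq(X, growth, rcond=None)
for m, c in zip(mutations, coef):
    print(f"{m}: {c:+.3f}")

# Predict the growth rate of a newly observed lineage from its mutations only.
new_lineage = np.array([1, 1, 1, 0], dtype=float)
print("predicted log growth:", float(new_lineage @ coef))
```

Because the same mutations recur across independent lineages, their contributions can be estimated even for strains the model has never seen assembled together, which is exactly the property Obermeyer describes.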

According to Bronwyn MacInnis, an infectious disease scientist at the Broad who described this work in a presentation at the recent AGBT Precision Health conference, the PyR0 tool accurately predicted both the explosive growth of the Delta variant and the relatively minor emergence of the Mu variant (originally detected in Colombia earlier this year), long before conventional scientific approaches could have. Using genomic data for epidemiology and infectious disease has “really come of age” in the pandemic, she said. But genomic tools were not built for this kind of use. “The field really needs some great and quick innovation to keep up with the data,” she added, pointing to the former Uber team’s work as a great example.

Obermeyer points out that the model only works as well as it does because it has access to such an enormous trove of genomic data collected around the world. “It’s really important to be able to share observations [of mutations] across countries and across cities,” he says.

Now that the tool is available, public health experts have one more arrow in the quiver to help guide the pandemic response. Mask mandates, indoor capacity limits, and other measures can all be used in a more targeted manner if we can predict the likelihood of the spread of specific new COVID-19 variants. “As soon as we see that there’s a more highly transmissible strain in a particular region, then we [can] react to that by changing these intervention measures,” Obermeyer says.


Software Quality Rises When Considered from the Start

Improving software quality is one of the best ways companies can save money, improve their brand, and generate more business. But building quality software requires planning, teamwork, and clear communication.

It is always cheaper to get software right the first time than to fix it later. To achieve higher quality levels for our software, we need to rethink how it is produced. Software today must be innovative, and it must be developed with agility. But often, when software is handed over to the end consumer, or just before that, testers or end users find malfunctions or discover that the code does not serve its purpose. This is true for both business and consumer applications. Making the resulting fixes is time-consuming, costly, and frustrating: just the opposite of what we want. Improving software quality is thus one of the biggest and best ways companies can save money, improve their brand and generate more business.

Quality is a strange animal: nobody really cares about it until it is missing. But like property and casualty insurance, as soon as there is an incident, software quality (or the lack thereof) becomes everyone’s focus.

At Cognizant we have worked with clients for more than two decades to build quality into their software from the start, using a lifecycle approach. As in manufacturing, quality needs to be built in and monitored at every stage. From the first idea through development and into production support, the appropriate quality assurance activities need to be planned and executed to ensure a market-ready, industrial-strength solution emerges at the end of the software development lifecycle.

As digital technology and connectedness accelerate in the pandemic age, the challenge of inferior software affects an increasing number of industries and end users. Increasingly, the mechanics of software development are merging with the extended business value of what software enables. That creates a big challenge for industrial manufacturers of physical products that include software, because the quality of most software is not close to the maturity and quality of most hardware.

Hardware product quality has made huge strides in recent decades. When we went on a car trip back in the 1980s, we accepted that on a long-distance journey something would go wrong and we would need to call AAA to help us get moving again. Today, cars require little more than routine maintenance. They rarely break down, because their mechanical engineering is top quality. When Henry Ford built his Model T, quality was not much of a consideration. But as the auto industry matured, a major opportunity to differentiate with quality emerged. And then a small player, Toyota, shifted the paradigm of mass production to “stop building if the quality is not right,” rather than fixing problems at the end. The result was a dramatic reduction in per-unit costs, improved brand recognition, and a complete rethinking of quality in a mass production context. Toyota eventually grew to become the world’s largest carmaker. “The Toyota Way” was eventually applied across industries.

Today, we are at a similar inflection point with IT. As software becomes more and more integrated with hardware, the quality of both components has to be on par. If it is not, the quality of an integrated system will be determined by its weakest component. The consequences can be dire and expensive, as recent cases show. For example, Porsche had to recall its electric car because of an error in the battery management system. Boeing had to ground the 737 MAX after a software problem in the MCAS system led to two disastrous crashes.

Software must meet the quality standards of the mechanical components it gets integrated with. The same rigor that is applied during the physical assembly process will have to be applied to the software manufacturing process. To get there, we have to roll out best practices that have proven successful in many projects in the past.

In my many decades of experience working on software quality and testing, I’ve found that the real mistakes typically happen during what we call the requirements phase. The key is to make sure you provide software engineers with complete, consistent and unambiguous requirements before they start coding. It’s like building a house. The people building the house and the architects debate, based on the blueprints, until they are all sure they’re ready. Only then do they get the materials and start building. The same kind of conversation is happening for all products. There will be design reviews, customer workshops, and all kinds of well-established reviews driven by the product owner to make sure the final version meets the market’s expectations and thus is likely to sell.

For most IT solutions, that does not happen. Instead, developers are left with vague requirements and roughly-sketched expectations. When it comes to user acceptance tests, everyone is frustrated when the delivered version does not meet expectations. Then, through many cycles, the system is gradually improved until it can be accepted for production.

Building quality software is very much a team sport. It requires the participation of end users of the software, developers with the right skills, and quality engineers. Together they need to set up and maintain continuous automated quality assurance. Modern agile teams are set up exactly like that.
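What “continuous automated quality assurance” looks like at its smallest scale is a requirement captured as an executable test that runs on every change. The sketch below uses Python and pytest; the discount rule and function are invented for the example, not taken from any client project.

```python
# A requirement written as an executable test alongside the code it describes.
# The discount rule and function are invented for this illustration; in an
# agile team these tests run automatically on every commit (CI pipeline).

def order_total(items, loyalty_member=False):
    """Sum item prices; loyalty members get 10% off orders over 100."""
    total = sum(price for _, price in items)
    if loyalty_member and total > 100:
        total *= 0.90
    return round(total, 2)

def test_plain_order_is_simple_sum():
    assert order_total([("widget", 40.0), ("gadget", 25.0)]) == 65.0

def test_loyalty_discount_applies_only_above_threshold():
    assert order_total([("widget", 60.0), ("gadget", 60.0)], loyalty_member=True) == 108.0
    assert order_total([("widget", 60.0)], loyalty_member=True) == 60.0
```

Run under pytest in a continuous integration pipeline, tests like these turn the requirements conversation into something the whole team can verify on every change.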

In addition, the pure software functionality needs to be fit for purpose. Today, the keyword is experience, and quality is essential to an exceptional one. A nice example of how quality changes the perceived value of a product is the iPhone. When it entered the market, a typical price for a phone was around $100. Because of the vastly superior customer experience that an iPhone provided, a price of $600 suddenly became acceptable to many consumers.

Quality helps to sell more at better price points and saves operating costs. Another good example is gaming software. A Sony PlayStation or Xbox uses complex and sophisticated software, but works to perfection. That’s because gaming companies have no choice. Why? Because children are very unforgiving. If there’s a problem, they will throw the product away or never use it again.

So we recommend that before an organization starts developing a piece of software, it asks what kind of persona it is targeting. An old or young person? Men or women, or both? Will it be a mobile app? Should it be portable to all mobile devices? Does it require internet access at all times? Such basic design questions must be answered. But surprisingly, too often they are not until much later in the process.

Our most successful clients are using software quality as a differentiator and a unique selling point. The payback period for better quality assurance is surprisingly short, and that only factors in the operating expenses for development and day-one support. If you add accelerated time to market, better customer experience and improved brand recognition, enhancing software quality becomes the only smart move.

As military hero William A. Foster once said: “Quality is never an accident; it is always the result of high intention, sincere effort, intelligent direction and skillful execution; it represents the wise choice of many alternatives.”

Andreas Golze is Vice President, Quality Engineering and Assurance at Cognizant.


Digital Humanism for Life in the Cloud

Driven by love of people and planet, digital humanism is urgently needed as a counterweight to what will otherwise be an inevitable digital colonization driven by profit.

Never have we been so connected yet so disconnected at the same time. Digital technology connects us around the world. We can now be physically in one place even as we are mentally thousands of miles away, talking, working, or playing with others. The mobile phone is becoming ubiquitous in every aspect of life around the planet. And now the Global South and the Global North are well on the way to being connected everywhere, every day, for better or worse.

These connections are generating collisions that were avoided when distance separated people and ideas. Wealth is meeting poverty online, and Western science and economics are meeting Eastern body-mind-spirit. Yet all is not one. People are whipsawed, sometimes very painfully, between the corporatized dominance of Western science and economics and, in more and more places, an embrace of holistic well-being and oneness with the planet. Many parts of the world are no longer so willing to accede to the West’s dominance. So while we may find ourselves in the Cloud together, the buffers of understanding and empathy that enable human relations are not just worn thin. They are torn by growing waves of cultural and religious distrust.

Yet while digital may taketh away, it also gives: measurement, for example, is entering a golden age. The sensors of the internet of things and 5G are beginning to enable precise tracking of movement, temperature, and humidity. That will eventually help lead to enhanced food security and efficiencies in agriculture, transportation and the built environment, enhancing human wellbeing. Sensing as a tool to measure and monitor climate and the environment can enable us to better steward our planet at a time when that is desperately needed.

Digital accounting and tracking today primarily serve for-profit private enterprises, making a few people rich while controlling and constraining the opportunities of far more. But if the cloud can be used to benefit humankind, both production and distribution could be improved in innumerable ways, with new precision. If such tools are managed for everyone’s benefit, it will be possible to dial the levers up or down for many important functions in society, especially at the subnational and neighborhood level.

Yet for now, divisions tear the human fabric, tilting the winnings to whichever tribe knows how best to leverage digital technology to win the power game. It is within our reach to change this, but we must collectively gather the will to do it.

We have a new dimension, and that is digital. Our early forays in the digital dimension have too often not enhanced or enriched our lives. But we can visualize hypothetical digital narratives, and in effect storyboard people’s lives, to examine in advance the impact of interventions. Grameen Foundation, for example, helps farmers track 822 factors that can increase crop yields in Asia and Africa by many multiples. Our intuition from the analog world will need to evolve to better serve us in the digital realm, just as it took time to learn to read and write. And many new digital utilities are needed so we can cooperate better.

What do we mean by “home on the cloud”? In our real, physical and geographically-located home, work and family life have in the last year or so been radically merging. This Covid-forced cohabitation might last longer than we imagined. The more-digital lives many of us now lead have come sooner than we expected. The consequences are many. Not only has remote work reduced the need for office space, but it has also brought down barriers to migrating for work. We can live by the beach in Bali and work each day with colleagues in Brussels or Brazil or Baltimore, even as we socialize at night or on weekends in Bangkok, Barcelona, or Bogota.

Digital humanism driven by love of people and planet is urgently needed as a counterweight to what will otherwise be an inevitable digital colonization driven by profit.

Surely we all want to prioritize collective flourishing over the fury that will result if the world’s future is defined by conflicts between digital masters and the digitally unempowered. But too many feel helpless, blame others, and find comfort in complaining. The time for such inaction has passed. It’s time to get going, with eyes wide open, not denying that the shift toward a more humane and sustainable society, however inevitable and badly needed, will be painful and fitful.

We are connected now. The human factor in social, economic and relationship ecosystems can no longer be ignored. We can see, measure and model these interactions. We have an incredible opportunity to share and learn so we can find ways to flourish together on our planet home: what that visionary Marshall McLuhan called Spaceship Earth. 

—–

This is a summary of a chapter in the forthcoming book “Society X.0 in the Rise of a New Epoch: The Human Factor in Social, Economic and Relational Ecosystems,” sponsored by the Social Trend Institute. The editors are Marta Bertolaso, professor of Philosophy of Science in the Faculty of Engineering, and Luca Capone, Ph.D. candidate in Philosophy of Science and Human Development, both at the University Campus Bio-Medico of Rome, along with Carlos Rodríguez-Lluesma, professor in the Department of Managing People in Organizations at IESE Business School, University of Navarra.

 


How the Pandemic Set the Future of Food and e-commerce in Motion

Data, technology-driven segmentation, and speed of delivery will continue to determine success among online food businesses.

The pandemic changed many consumer habits. Few were impacted as much as eating and shopping.

Before COVID shut down restaurants and prompted grocery store raids, apps had already simplified food delivery. Meal kits were becoming a more frequent part of weekly rotations in kitchens across the world, and grocery delivery had moved from fringe to, well, less fringe. For most people these were an occasional splurge, a way to add a little convenience after a long day at the office. A luxury.

A year ago, in hardly a moment, these luxuries became necessities. Fortunately, much of the groundwork had been laid in building the apps and the infrastructure to provide at least a starting point. The pairing of food and e-commerce, after all, was more than 25 years in the making, dating back to that first online pizza order over dial-up in the ’90s.

The most substantial, and likely most lasting, impact of the pandemic was the rapid acceleration of e-commerce-enabled food delivery, whether cooked meals, meal kits or basic ingredients. It forced the maturation of the industry, with consumers demanding safety, simplicity, sustainability, and convenience. The disparate food tech sector quickly realized that successfully scaling to meet the massive demand would require a deliberate e-commerce mindset rooted in data and surrounded by sophisticated production and distribution, all while absorbing an outsized new mission-driven role: making sure people had food on the table.

Solving for how to serve record customer volumes during the pandemic was certainly better than the alternative, but it came with its own pressures. Meeting huge spikes in demand is a good problem to have, but a problem nevertheless. It was also eye-opening.

Reflecting on the past year, there are several overlooked advances that have reshaped the e-commerce food industry for the long haul:

Customer data drives food industry success in an e-commerce environment.

Supporting individual tastes and dietary preferences has always been crucial to serving food. Originally there were waiters; then came apps enabling customization. Not to mention the grocery aisles a consumer may consistently frequent or avoid. Translating this into an e-commerce world has required not only choice and personalization; it has also demanded an interface built to anticipate tastes and diets, some constant and others fluctuating. Amazon’s data analysis, for example, does not need to understand how hungry a customer is or whether they avoid meat on Fridays during Lent. But for companies providing food, these details can be critical.

As ordering food online has proliferated, personalization has become an important and expected part of the customer experience. Consumers are looking for easy ways to replicate orders, find suitable suggestions, and block out what they want to avoid. This requires a combination of questions and analysis to understand patterns and preferences, while still offering an easily navigable online store in case a customer needs different options (a simple sketch of this kind of filtering follows the lesson below).

  • Lesson: data trumps all else in building a successful online food business
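As promised above, here is the simplest version of “block out what they want to avoid”: a preference filter over the weekly menu. The recipes, tags, and profile fields below are invented for the illustration and are not HelloFresh’s actual data model.

```python
# Minimal sketch of preference-aware menu filtering: hide recipes that clash
# with a customer's stated exclusions and surface the rest, favorites first.
# Recipes, tags, and profile fields are invented for the illustration.

menu = [
    {"name": "Beef ragu",        "tags": {"beef", "pasta"}},
    {"name": "Chickpea curry",   "tags": {"vegetarian", "spicy"}},
    {"name": "Salmon teriyaki",  "tags": {"fish"}},
    {"name": "Mushroom risotto", "tags": {"vegetarian"}},
]

profile = {
    "exclude": {"beef"},              # hard constraints
    "favorite_tags": {"vegetarian"},  # soft preferences used for ordering
}

def personalized_menu(menu, profile):
    allowed = [r for r in menu if not (r["tags"] & profile["exclude"])]
    # Favorites first, keeping the original order otherwise (sort is stable).
    return sorted(allowed, key=lambda r: not (r["tags"] & profile["favorite_tags"]))

for recipe in personalized_menu(menu, profile):
    print(recipe["name"])
```

A production system layers learned preferences, order history, and inventory on top of rules like these, but the goal is the same: an easily navigable store that already reflects what the customer wants.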

Technology-driven segmentation is the key to growth.

As the industry matures, it is critical to recognize there are many different consumers, and that segmentation is the key to unlocking the multiple parallel total addressable markets. For example, at HelloFresh we have expanded our total addressable market by more than 60 percent over the past five years.

As a group, we have our namesake mainstream brand HelloFresh, but we also serve varying groups of customers with differentiated offerings including premium, value, and specialty. We also offer heat & eat meals through a fourth brand. Customers are good at self-identifying the product that suits their lifestyle.

This segmentation allows us to capture 60 percent of households, with surprisingly little overlap between these groups. Even during the pandemic, growth largely remained within segments, so we know we have distinct groups of customers that require different interaction and different products, even if the underlying platform is the same. Having a multi-brand strategy supports the choices that build a stronger product affinity.

  • Lesson: in an e-commerce environment a company can operate multiple unique brands on one underlying platform to expand its total addressable market

Consumers have been conditioned to equate e-commerce with fast delivery.

As the e-commerce leader, Amazon has set expectations of fast, free delivery. In the food sector, this was a significant shift. While restaurant delivery operated quickly in a local sense, other categories had to break away from scheduled deliveries to faster turnarounds.

In the meal kit category, for example, we have leveraged technology to update business practices to reduce the time between order and delivery, eliminating much of the long-term planning. The data showed that reducing the order to delivery timeframe led to infrequent customers ordering more often. We also found that shortening those times unlocked new customer segments.

Today, a brand offering a deliver-to-your-door service is not seen as a disruptor. That has become table stakes. Once Amazon threw down the gauntlet with two-hour delivery, new startups began promising speedier delivery of their services. To win in the e-commerce sector – especially in the food delivery space – companies must be prepared to meet and exceed these deeply entrenched consumer expectations. If you do not, someone else will.

  • Lesson: e-commerce expectations for speed apply to all categories including food

—-

Dominik Richter is the CEO and co-founder at HelloFresh.
