Expert Group Finds Covid Response a “Massive Global Failure”

Infectious disease and public policy experts investigated the global pandemic response, cataloged its many failures, and assembled recommendations for the future. But sadly, their words will probably fall flat.

This month, a group of 40 experts in infectious disease and public policy from around the world published the results of a major new review of the global response to the Covid-19 pandemic, along with recommendations for how to emerge from this pandemic and prepare for future public health threats. They also sought to draw lessons about how to better address other overarching global challenges, such as climate change and the United Nations’ Sustainable Development Goals for 2030. Prof. Jeffrey Sachs of Columbia, a prominent cheerleader for the SDGs, was a major contributor to the effort. Unfortunately, while their advice might have been useful two years ago for the pandemic, it seems unlikely to be heeded as countries increasingly choose to act as though the pandemic is over.

Assembled by The Lancet, the highly regarded medical journal, the special global commission was tasked with investigating various aspects of the Covid pandemic. According to a comment from some of the authors, their mandate was “to help speed up global, equitable, and lasting solutions to the pandemic.”

The group’s report pulls no punches. An editorial published by the journal alongside the study says that it “lays bare what has been nothing less than a massive global failure—a failure of rationality, transparency, norms of public health practice, operational coordination, and international solidarity.”

While the report itself is scathing, its findings will sadly come as a surprise to almost no one. Government responses came too slowly, the authors note. They let vulnerable populations fall through the cracks. Their attempts to combat disinformation were unsuccessful. “The result was millions of preventable deaths and a reversal in progress towards sustainable development for many countries,” the editorial continues. By the end of May, the study’s authors report, there were nearly 7 million reported global deaths from Covid, with more than twice that — a total of more than 17 million deaths — believed to be attributable to Covid but not accurately counted. Thousands are still dying each day around the world from the disease (over 400 daily in the U.S. alone, on average).

The authors blame much of this on a lack of cooperation among governments. Failures on this front, they write, include delays in notification about the original outbreak, a lack of equitable distribution of important supplies, taking too long to acknowledge that Covid is airborne, poor coordination about strategies to prevent transmission, insufficient funding, and a shortage of useful data, among others.

Perhaps the most useful part of the report comes in its recommendations for how we emerge from the pandemic. Proving that hope does indeed spring eternal, even as the authors acknowledge two-plus years of global cooperation failure, they say that “putting such cooperation into place is still urgent.” They recommend that governments remain vigilant about new virus variants and sound the alarm about waning immunity, both from natural infections and vaccines, which keeps populations at high risk of infection. They also call for coordinated surveillance systems. However, as more and more countries relax mitigation measures and stop collecting key data, that seems unrealistic. And the commission tasks the World Health Organization with the responsibility to ensure mass immunization across all countries. Oddly, the authors call for further investigation into the origin of the SARS-CoV-2 virus responsible for Covid, even though there is now fairly broad scientific consensus and strong evidence to indicate that the virus spilled over from an animal or animals at the Huanan seafood market in Wuhan, China.

While these recommendations seem exceedingly unlikely to be enacted, especially on a global scale, the authors also addressed what needs to happen to prepare us better for future pandemics. Here, perhaps, there is a chance at least for individual governments and public health agencies to step up. The review also advises strengthening the World Health Organization with new regulatory authority — including the ability to investigate outbreaks on the ground without political interference — and with a bigger budget to allow it to oversee and coordinate the response to new infectious diseases. The commission calls for a new global surveillance system that would build on existing programs and add new capabilities in disease forecasting and mitigation strategies.

“It is certain that future pandemics will arise from interactions between humans and animals,” the group writes. Whether that happens a year from now or 100 years from now, being prepared for it can only help. While our response to the Covid pandemic may have been a failure, we could still salvage something from this crisis if we learned important lessons and used them to do better next time.

The Business of Business is Climate

The climate is telling us something’s wrong. Business is starting to hear the message. U.S. Steel, Ericsson, and other companies are changing, and finding new ways to work with environmentalists.

How bad does it need to get? As we at Techonomy prepped for our recent conference on business and climate, we talked to businesspeople of all sorts, as well as climate scientists, environmentalists, and investing experts. As we did, global weather was in a historic tailspin, one that is surely just the tip of a melting iceberg yet to come. Tens of millions of Pakistanis are fleeing still-rising waters. A heatwave with no precedent afflicts much of western China. Europe had its hottest summer on record. And the Yangtze, Rhine, and Colorado rivers are all drastically lower than in living memory.

Across the U.S., we’re feeling it strongly. Temperatures recently broke 100 degrees Fahrenheit in Palo Alto, the center of our old world, back when we thought the internet was the most important topic. (And it hit 116 in Sacramento.) And in just a hint of the kinds of second-order effects that are likely to begin afflicting more and more of us, 200,000 residents in predominantly Black Jackson, Mississippi, were without safe water after floods devastated the city’s infrastructure – an example of what some are calling “environmental racism”.

Yet the world proceeds on its blithe course. It continues to be considered, at least in the U.S., acceptable discourse to deny that global warming is caused by humans. Some states, notably Texas and Florida, are actually trying to punish investment firms that take global warming seriously. And just about all of us still behave as if things are basically the way they’ve always been. We drive our guzzlers, we keep buying things we don’t need. Many of us still fly around, even though that is generally how individuals contribute most to climate-warming emissions. 

It feels heartening, on a day-to-day basis, when we talk to people like Rich Fruehauf, chief strategy and sustainability officer at U.S. Steel. Contrary to what many might expect, that company is taking extensive steps to reduce emissions. Steel is a big contributor to global warming–some estimates suggest the industry contributes up to 8 percent of global CO2 emissions. But we need it for the green transition, for electric infrastructure, charging stations, new utility transformers, and transmission lines, not to mention e-vehicles. Did you know that two-thirds of American steel mills now use “electric arc furnaces”, which create only one-quarter the greenhouse gases that traditional blast furnace steel plants do? Such electrically powered plants produce new steel from recycled scrap—old cars, washing machines, and the like.
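For a rough sense of what that mix implies, here is a back-of-envelope sketch using only the figures cited above. The blast-furnace emissions intensity (about 2 tonnes of CO2 per tonne of steel) is an assumed round number for illustration, not a figure from U.S. Steel, and the sketch treats the share of mills as a stand-in for the share of output, which it is not exactly.

```python
# Back-of-envelope: fleet-average CO2 intensity of U.S. steelmaking,
# using the shares and ratio cited above. The blast-furnace intensity
# (~2.0 t CO2 per t of steel) is an assumed illustrative figure.

BLAST_FURNACE_INTENSITY = 2.0                  # t CO2 per t steel (assumption)
EAF_INTENSITY = BLAST_FURNACE_INTENSITY / 4    # "one-quarter the greenhouse gases"

EAF_SHARE = 2 / 3                              # "two-thirds of American steel mills"
BF_SHARE = 1 - EAF_SHARE

fleet_average = EAF_SHARE * EAF_INTENSITY + BF_SHARE * BLAST_FURNACE_INTENSITY
print(f"Fleet-average intensity: {fleet_average:.2f} t CO2 / t steel")
print(f"Reduction vs. all-blast-furnace: {1 - fleet_average / BLAST_FURNACE_INTENSITY:.0%}")
```

Under those assumptions the fleet-wide intensity works out to roughly half the all-blast-furnace figure, which is the rough magnitude of savings that a two-thirds EAF share implies.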

Another speaker who gives us hope is Heather Johnson, chief sustainability officer at telecommunications giant Ericsson. It has been active in this area for well over a decade, and is committed to “net zero” emissions by 2040 not just in all its operations, but also in the use of its products by customers. We had a good conversation in our prep call about how that sort of ecosystem pressure will generate reductions across supply chains, as companies of all sorts, we hope, adopt increasingly stringent goals. (A similar conversation emerged in our recent virtual roundtable on carbon accounting–it’s worth watching.)

Speakers at the conference were enthusiastic about the just-enacted U.S. Inflation Reduction Act, which this crowd more often thinks of as the “Carbon Reduction Act,” even though they wish it had done much more. The bill allocates a historic amount of federal dollars for climate tech and climate action. Among many follow-on benefits is that investors will gain reassurance that climate-friendly corporate actions will be profitable. Climate tech companies will also see prospects brighten as their products and services find larger and more reliable markets.

We need systems thinking to address our interlocking crises, and there’s not enough of it. One entire session at the conference focused on that, with speakers from our partner Deloitte as well as Citi and environmental group RMI. And Elizabeth Sturcken, who heads the EDF+Business program at the Environmental Defense Fund, another of our partners, interviewed authors of the important recent book Speed & Scale: An Action Plan for Solving our Climate Crisis Now.

But there’s just so much to do, and still not nearly enough being done. It’s the inevitably slow pace of emissions-reduction progress, alongside the quickening pace of climate degradation, that is most disturbing. And the challenge is made dramatically greater by the fact that well over half the planet remains justifiably intent on accelerating their pace of economic development. It is not just incongruous but also unethical for us to live the way we do while billions have so little power or water or infrastructure. As we steadily reduce the global inequality gap, which we must, we and they have to find ways for those billions to improve their standard of living without concomitant increases in emissions. 

If we don’t, and don’t simultaneously reduce our own developed-world carbon footprint, the prognosis is unbelievably grim. Temperatures will increase considerably from here anyway, no matter what we do. We’re barely past one degree Celsius over historic levels so far, and even major emissions reductions will likely not keep us to less than two degrees in coming decades. That’s one reason a recent study calculated that, as a Guardian headline put it, “Major sea-level rise caused by melting of Greenland ice cap is ‘now inevitable.’” And Bill McKibben points out this week in his superb newsletter that weather extremes grow worse exponentially–not linearly–as temperatures go up.

Our conference was all about the ways this complex and scary set of realities must transform the conduct and nature of business globally. We’re happy to have been able to bring together not just the speakers we’ve mentioned but also pioneers like Eileen Fisher and newly-passionate business thinkers like Seth Godin. It’s just the beginning of a transformed economy. We hope.

Q&A: Do We Have to Choose Between Speedy Development and Environment?

Environmental review is under scrutiny on Capitol Hill. Here’s what a NEPA expert thinks policymakers should keep in mind.

On August 16, President Joe Biden signed into law a tax reform and spending bill that is also the country’s largest climate change investment. A surprise and historic compromise between Sen. Joe Manchin of West Virginia and Senate Majority Leader Chuck Schumer, the Inflation Reduction Act is projected to cut domestic greenhouse gas emissions by at least 40 percent by 2030 compared to 2005 levels, mostly through hundreds of billions of dollars in renewable energy tax credits.

In exchange for his vote, Manchin, who has a stake in a family coal business, secured a legislative concession: the promise of a side deal to speed the permitting process for big energy projects.

What little is known about the deal comes from a leaked, single-page summary obtained by the Washington Post and what appears to be an incomplete legislative text from Bloomberg News. Among the deal’s most enduring provisions could be sweeping changes to the National Environmental Policy Act, the bedrock environmental law which governs virtually every major infrastructure project in the country.

From pipelines and dams to power plants and critical mineral mining, there are few industries or environments that NEPA doesn’t touch. Likewise, the law’s critics cover the American political spectrum. Development-friendly coalitions say its requirements are expensive and time-consuming; some decarbonization advocates say it could interfere with a clean energy buildout. Equity-focused groups say it’s abused by wealthy areas to stop much-needed public works.

The side deal could bring those criticisms to the forefront. Its proposed changes to NEPA include:

  • Creating a two-year time limit for permitting reviews of major projects, and a one-year limit for lower-impact projects.
  • Boosting inter-agency coordination for environmental reviews.
  • Addressing “excessive litigation delays” by creating a statute of limitations for legal challenges, and requiring agencies to respond quickly when permits are knocked down in court.

Jamie Pleune is a professor of law at the University of Utah. She was part of a team that analyzed over 41,000 NEPA decisions in a paper published earlier this year. Pleune spoke with Circle of Blue to sort out myth from fact about NEPA. She discussed whether NEPA is responsible for hindering infrastructure projects in the United States, what’s in the side deal, and what she hopes policymakers will keep in mind as they enter negotiations.

This conversation has been edited for length and clarity.

Laura Gersony: Let’s start with the basics: what is NEPA? 

Jamie Pleune: NEPA is often called the Magna Carta of environmental law. It requires the federal government to stop and consider the possible effects of an action that may have a significant impact on the environment, before they irretrievably commit resources.

NEPA is a procedural law; it doesn’t mandate particular results. All it requires is the federal government to consider what they think the impacts are, and consider alternatives, or ways that they can mitigate the impact — and it requires them to make all of that analysis public.

There are three levels of review under NEPA. The highest level of review is an Environmental Impact Statement (EIS). That’s a very thorough analysis, and that’s reserved for really big projects. There’s also an Environmental Assessment (EA), which is a more limited analysis that the government can do when they’re not sure if an action will have a significant impact or not. And the third is a Categorical Exclusion, which requires a very truncated review. These are for small actions that typically do not have a significant effect on the environment: things like re-paving a parking lot, maintaining an existing campground, stuff like that.

LG: It’s a fairly common belief — and a key premise of the side deal — that NEPA, and particularly the EIS process, holds up big infrastructure projects. This angers decarbonization advocates, who worry it’s slowing a transition to clean energy, and also supporters of fossil fuels, who want to build out oil and gas infrastructure. Last year, you and your colleagues analyzed over 41,000 NEPA decisions by the U.S. Forest Service. Can you discuss your key findings, and whether they support this argument? 

JP: Sure. Almost all of the discussions about NEPA focus on Environmental Impact Statements, the highest level of review. But one of our interesting findings was that, of those 41,000 decisions between 2004 and 2020, only 2 percent were EISs. Seventeen percent of decisions were EAs, and 81 percent were Categorical Exclusions. So our first observation is that streamlining already happens: the vast majority of decisions are made at a level of review that’s appropriate to the complexity of the project.

Secondly, the vast majority of decisions were also made within a pretty efficient and reasonable amount of time for the level of review. The median time to complete an EIS was 2.8 years; the median time to complete an EA was 1.2 years; and the median time to complete a categorical exclusion was four months.

However, although the majority of decisions were made in an efficient timeframe, there was this very extended tail on the graph: in other words, even though most of the decisions are made within two years, the slowest 5 percent of EISs average eight years. That caused us to wonder: what causes these projects to get bogged down?

We found that there are three main sources of delay: a lack of agency capacity, delays that were attributable to the operator or market conditions, and delays that were caused by compliance with other laws. NEPA-specific factors could only explain 25 percent of the variation in decision making times. This tells us that most often, there are factors outside of NEPA that cause these delays.
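For readers curious what this kind of analysis looks like in practice, here is a minimal sketch of how the summary statistics Pleune describes might be computed from a table of decisions. The file name, column names, and values are hypothetical illustrations, not the research team’s actual code or dataset.

```python
# Illustrative only: computing review-level shares, median review times,
# and the slow "tail" described above from a hypothetical table of NEPA
# decisions. "nepa_decisions.csv", "review_level", and "years_to_complete"
# are made-up names for the sake of the example.
import pandas as pd

decisions = pd.read_csv("nepa_decisions.csv")  # hypothetical dataset

# Share of decisions at each level of review (EIS, EA, Categorical Exclusion)
print(decisions["review_level"].value_counts(normalize=True))

# Median completion time per level of review
print(decisions.groupby("review_level")["years_to_complete"].median())

# The long tail: average duration of the slowest 5% of EISs
eis = decisions.loc[decisions["review_level"] == "EIS", "years_to_complete"]
slowest_5pct = eis[eis >= eis.quantile(0.95)]
print(f"Slowest 5% of EISs average {slowest_5pct.mean():.1f} years")
```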

LG: Your colleague at the University of Utah, John Ruple, published a paper earlier this year analyzing over 600 critical habitat rulings. He found that decisions that underwent the NEPA process were completed three months faster than those that did not, on average. What do you make of that finding?

JP: Right. There was a natural experiment set up, because there was a circuit court split on whether or not a critical habitat designation required a NEPA analysis. What he found was, perhaps contrary to expectations, the group of decisions that went through a NEPA analysis were completed, on average, three months faster than the group of decisions that did not go through the NEPA process.

I think this shows that, when done properly, NEPA actually does help projects go faster, because it provides a backbone and a framework for coordinated information sharing between agencies. Rather than a hindrance, NEPA has the potential to be a tool.

But in order to use it that way, we have to engage in practices that facilitate efficiency. Actually, some of the reforms that are proposed in this one-page leaked document that we have from Sen. Manchin are consistent with practices that have already been implemented by the Federal Permitting Improvement Steering Council and the FAST-41 program. Specifically, practices like designating a lead agency to coordinate inter-agency review and facilitating concurrent agency review processes are already a part of the FAST-41 program.

I’m hopeful that the suggestion to provide additional funds to the Federal Permitting Improvement Steering Council will encourage wider adoption of these best practices.

LG: I want to bring in here the equity critique of NEPA. There’s good data on the cost of highway building, showing that NEPA’s passage in 1969 was an inflection point for spending on highway projects. Almost immediately, higher project costs became strongly correlated with affluent areas. The inference is that NEPA is being abused by politically powerful areas, and in some cases, interfering with climate-friendly projects like public transportation or clean energy. In the words of Jerusalem Demsas, a reporter who has studied this issue: are we just “paying for wealthy people to exert their preferences over everyone else?” 

JP: I have to say that I disagree with Demsas’ premise–that NEPA created an avenue for wealthy, influential groups to exert their preferences. In my opinion, wealthy individuals and powerful groups have always had the ability to exert their preferences–usually in a backroom. That is the nature of politics and the reality of wealth. NEPA can be used–and has been used–as a tool by the wealthy and influential, but if NEPA did not exist, this sort of influence would be exerted in another way. Anecdotes focused on obstreperous wealthy groups abusing the process divert attention from the democratic purpose of public participation. NEPA’s process of transparency and public participation allows the possibility that non-wealthy, non-powerful groups will also have an opportunity to influence projects.

It’s not perfect, and it certainly needs to be implemented more efficiently. But NEPA’s goal–to level the playing field and to promote better governance by forcing public participation–is a very important one that I do not think we should turn our backs on easily.

LG: Let’s turn now to the IRA side deal. Based on the draft obtained by the Post, one key provision of this deal is a two-year “maximum timeline” for NEPA reviews. All else being equal, how would this rule play out?

JP: It’s hard to answer based on this worksheet we have, because I don’t know how it’ll be set up. But I can speak to what’s good about mandatory deadlines, and what’s bad about mandatory deadlines.

First, deadlines can be good. It is human nature to put off a project if we don’t have a deadline. So creating deadlines, and making it public whether or not you’ve met that deadline, that’s a good thing. However, the mandatory deadlines should be project-specific and responsive to the data needs of that project.

Secondly, there needs to be an “escape hatch” for missing the deadline with justification. If an agency is able to say, a month before the deadline, “I’m not going to meet it,” then they need to explain why not. For example, perhaps the agency requested additional information from the permit applicant and did not receive a timely response. Alternatively, perhaps the decision required collecting baseline water quality data and the samples received were unreliable. These are justifiable reasons for a delayed decision.

And thirdly, there should be an escape hatch for unforeseen circumstances. For example, perhaps an element of the project was redesigned in response to stakeholder concerns. Perhaps the scope of the project changed. Perhaps an unexpected environmental condition, such as contaminated soil in a building project was discovered. These unexpected circumstances may require additional time. So as long as the deadlines are project specific and include escape hatches for justified delays and unforeseen circumstances, I think that’s sufficient flexibility.

Mandatory deadlines are bad if they are paired with a presumptive approval: that is, if an agency misses the deadline, the applicant gets an approval. You’re really depriving the public of a meaningful review that way. Secondly, they’re bad if they don’t acknowledge delays that are caused by waiting for information from an operator or permit applicant — which, we found in our research, is often a significant cause of delay.
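To make the distinction concrete, here is a minimal sketch of a deadline rule with the features Pleune describes: project-specific deadlines, documented escape hatches, and no presumptive approval when a deadline is missed. The structure and names are hypothetical illustrations, not language from the leaked side deal.

```python
# Hypothetical sketch of a review-deadline rule with the features discussed
# above: project-specific deadlines, documented "escape hatches," and no
# presumptive approval when a deadline is missed.
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class ReviewStatus:
    deadline: date                           # set per project, based on its data needs
    completed_on: Optional[date] = None
    justified_delay: Optional[str] = None    # e.g. "awaiting applicant data"
    unforeseen_event: Optional[str] = None   # e.g. "project scope changed"

def evaluate(review: ReviewStatus, today: date) -> str:
    if review.completed_on is not None:
        return "decision issued"
    if today <= review.deadline:
        return "on schedule"
    if review.justified_delay or review.unforeseen_event:
        # Missing the deadline requires a public, documented justification.
        return "deadline missed with justification: " + (
            review.justified_delay or review.unforeseen_event)
    # Crucially, a missed deadline is flagged for oversight; it does NOT
    # become an automatic ("presumptive") approval of the permit.
    return "deadline missed without justification: flag for oversight"
```

The design choice that matters is the final branch: under this rule the consequence of missing a deadline without justification is public accountability, not an automatic permit.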

The purpose of NEPA is to make better decisions. It’s not to make fast decisions. And so if you impose a mandatory deadline, and it causes the agency to not fulfill its duty under NEPA, correcting that decision is going to take a lot longer than it takes to make a good decision in the first place. One 2008 study by Piet and Carole deWitt found that when agencies were required to supplement an EIS after completing it, the supplement resulted in an average delay of 2.3 years. Similarly, my colleague John Ruple found that supplementing the analysis associated with a Resource Management Plan to cure a defect resulted in an average delay of 363.4 days.

So if mandatory deadlines get in the way of agency decision-making, they might actually sacrifice efficiency for the sake of speed.

LG: On that note, the IRA includes about $750M for the implementation of NEPA. Is this the kind of measure that could solve the capacity issues your work has outlined? How far will those dollars go?

JP: The real answer is: it depends on how it’s spent.

Under the Trump administration, several agencies lost a huge number of employees. Between 2016 and 2020, the EPA lost 750 senior scientists, which is one in four of their environmental protection specialists. The Fish and Wildlife Service lost 231 staff scientists. The Bureau of Land Management lost 300 senior Washington staff. So we’re talking about 1,281 missing employees alone, just to get back to the capacity that we were at in 2016. And it’s not to say that agencies were robust before 2016; there are reports showing that a lack of agency capacity has been the number one source of delay in hardrock mining permitting, going back 20 years.

I don’t know if this funding is going to help fill that gap, but it’s a great start.

LG: Among environmentalists, NEPA reform is often discussed as a trade-off: that we have to choose between NEPA and quick clean energy development. It forces us to litigate a deeper moral question: what casualties are we willing to permit for a rapid buildout of clean energy? But your research suggests that maybe this isn’t as much of a dichotomy as we might think. Do we have to choose between safety on the environmental front lines and speedy development?

JP: I think it’s a false dichotomy. As you noted, there is research demonstrating that, especially for complex projects, using the NEPA process properly can improve both the speed of the process and the quality of the project at the end.

There are some really great examples of how using the process efficiently can facilitate permitting down the road, especially when you’re talking about projects that cross jurisdictions and require coordination among local, state, and federal agencies. When agencies use the process in a way that makes stakeholder meetings meaningful, and stakeholders are genuinely engaged, it really does facilitate faster decision-making, and a better project.

LG: Is there anything else you think is important to keep in mind over the next few weeks, as we learn more about this deal?

JP: Maybe just that it’s important to step back and remember: what is the purpose of NEPA? You have instances like the Love Canal, where permits were granted or maybe not even required, and decisions were made without consideration of the long-term effect.

It is easy to complain about costs and delays associated with NEPA’s public process. But it is also important to recognize that there are bigger costs and delays caused by going too quickly and getting a decision wrong. Pilots do not take off in an aircraft without checking the weather and doing a thorough pre-flight inspection. We should demand the same of government decisions. Before making an irretrievable commitment of resources, the government should disclose the impacts of the decision and consider alternatives. Tragedies like the Love Canal or the Deepwater Horizon accident help remind us that big decisions with big consequences should be made deliberately, even if it takes just a little bit of extra time.

Why Environmentalists Must Work With Business

Governments worldwide have punted in the face of overwhelming evidence of environmental decay, and now we are in trouble. More responsive these days, ironically, are companies, which face pressure from consumers, partners, and especially hard-to-retain employees. Not to mention that they are often where pollution and greenhouse gasses come from.

Environmentalists are often hesitant to work with such companies, even though businesses are increasingly tractable partners. But the Environmental Defense Fund led the way in recognizing the opportunity decades ago, and has refined its toolkit and approach.

In this virtual session we heard from the leader of the “EDF+Business” group, Elizabeth Sturcken, alongside Michelle Lancaster – a longtime partner from Microsoft, perhaps tech’s most pragmatic company.

Is it Ethical to Patent Emerging Climate Tech?

The U.S. is fast-tracking patent applications for climate tech. Is this a good move?

Way back in January 2021, newly inaugurated President Joe Biden signed Executive Order (EO) 14008. If you missed it, don’t worry. Most news was covering the events from earlier in the month, and this act of executive power went through with relatively little fanfare — despite its far-reaching implications for the climate tech community.

Essentially, EO 14008 placed the effort to mitigate climate change back onto the priority list of the federal government. One way in which this has manifested is with the U.S. Patent and Trademark Office’s (USPTO) June announcement of an exciting new program, the Climate Change Mitigation Pilot Program (the USPTO is not famous for its witty way with words). Henceforth, this program will be known as CCMPP for brevity’s sake.

The CCMPP was designed to “positively impact the climate by accelerating examination of patent applications for innovations that reduce greenhouse gas emissions,” according to the official announcement. While the pilot is intended to scale up production and distribution of climate tech, for potential funders a U.S. patent also increases the viability of that tech, protecting the value of the intellectual property and reassuring investors about their stakes.

Patents provide the holder exclusive intellectual property rights to the invention. In a capitalist system, the idea and practicality of a patent make sense. Profits made directly from the use of that invention go exclusively to the holder. Even President Abraham Lincoln himself once said, “The Patent System added the fuel of interest to the fire of genius.”

So, assuming a given technology meets CCMPP’s required criteria, it will bypass the average 18-month wait for patent application approval. Additionally, the application fees are waived.

But what if this pilot does just as much – or more – harm than good?

Lincoln did not foresee climate change

When an invention, such as a hypothetical GHG-mitigating technology, serves a public benefit and could alter the course of the climate crisis, the implications of sole ownership become potentially insidious, given the literal life-or-death stakes.

Aidan Hollis, professor of economics at the University of Calgary, laid out the downsides of climate tech patent holding in a recent Grist op-ed. Citing a new technology that can smelt aluminum without emitting carbon dioxide, Hollis describes the boon the mechanism will bring to wealthy aluminum producers and countries that can afford to license the machine. However, Hollis continues, less affluent countries and producers will have to wait 20 years for the patent to expire and the tech to enter the public domain, “but the world can’t wait that long to access breakthroughs that could help to save our planet.”

And thus the woes of patenting climate tech become apparent. In a time when communities across the globe are acutely and painfully feeling the impact of climate change, sequestering the rights to potentially life-changing technology poses moral and ethical quandaries. Should one person (or a board of people) be allowed the absolute power and distributive control of technology that can, hypothetically, prevent flooding or drought and with it the associated loss of life? Additionally, with the timeline to halt global temperature rise by 2030 becoming shorter by the hour, should patenting any technology aiding that goal even be allowed?

In terms of accessibility, does the concept of patenting climate tech reinforce the notion that the wealthy will weather the storm of the climate crisis and emerge on the other side while the poor are expected to bear the brunt of extreme weather and commodity shortages? Conversely, should the individuals who create the technology be expected to shoulder the burdens of solving the climate crisis without any kind of compensation for their efforts?

And who, precisely, is the patent program meant for? Due to imprecise language in the “Qualifications” section of the pilot program, an immediate question comes to my mind: Can a technology with ancillary GHG emitting effects qualify for the expedited application process offered in the CCMPP?

An example. If Company X figures out a way to cool homes without emitting hydrochlorofluorocarbons, can Company X apply for expedited patent approval through the CCMPP, even though GHG emission mitigation effects are ancillary to the main purpose of the machine, a.k.a. cooling buildings? It would make sense to allow these types of applications to proceed.

But, because the language isn’t clear, those arguing against ancillary impact could easily assert that applicants could take advantage of the pilot program to further their own economic ends. This notion is supported by one of the CCMPP’s vague application requirements: “the applicant has a good faith belief that the expediting patent examination will likely have a positive impact on the climate.” So much subjectivity lies within the meaning of “good faith belief.”

Another example. Company X has eliminated the toxic release of the hydrochlorofluorocarbons from its product, but the true purpose of Company X is to cool buildings and sell cooling units, not mitigate GHG emissions to have a positive impact on the climate. The CCMPP could be seen as an expedited path to market for Company X, providing an unfair economic advantage.

The supercharged injection of the Executive Branch

The Biden Administration aims to address the historical precedent of uneven accessibility. The White House created the Justice40 Initiative, pledging that 40 percent “of the overall benefits of certain Federal investments flow to disadvantaged communities that are marginalized, underserved, and overburdened by pollution.”

Michelle Moore, CEO of Groundswell and a senior adviser of the White House Office of Management and Budget during the Obama administration, spoke to GreenBiz regarding the connection between this initiative and the CCMPP. Moore said, “The USPTO’s program to supercharge the patent process for climate mitigating technologies is also going to benefit scientists, researchers, innovators, entrepreneurs [and] inventors who have historically been underrepresented … and that really helps to realize the wealth-building promise of the innovation that’s being unleashed by the IRA’s historic investment.”

Both the Justice40 program and the CCMPP got their start with EO 14008, indelibly connecting the effectiveness of the former to the benefits of the latter.

And the CCMPP is moving forward. As of last week, 71 applications had been filed, with 28 granted. The CCMPP will allow only 1,000 applications to proceed through June 5.

With luck, federal bureaucracy will actually help stimulate the climate tech sector while broadening access that has hitherto been far from universal. As with all government initiatives, only time will reveal the true success or failure of the program.

Biden Admin Expands Access to the Latest Scientific Findings

For the first time, the White House tells scientific publishers to take down the paywalls around federally funded research.

Last week, the Biden administration announced a new policy designed to expand access to results of the latest published scientific reports, such as new findings about Covid. Updated policy guidelines may sound like a snoozefest, but in this case they deserve your attention.

The dirty little secret of scientific publishing is that companies have found a way to charge people again for something they’ve already paid for. The vast majority of scientific research in this country is funded with money collected from American taxpayers. All of those experiments, all of those in-depth analyses — yep, you paid for that. Scientists diligently write up the results of that work, shop the manuscript around to various research journals, and get it published so other people can learn from or build on what they’ve done.

But in most cases, only subscribers can read those papers. And subscriptions to these journals are not cheap. In 2019, the University of California famously canceled its system-wide subscription to all journals from the publisher Elsevier, revealing in the process that it had been paying more than $10 million annually to ensure that scientists at all of its universities could access those journals.

For those of us who don’t work at a university and might want to read a few articles to learn about the latest research into a particular disease, for example, we have to shell out for each article. Science, one of the most influential journals in the research world, will gladly give you access to a single article for $30 a pop. JAMA, a leading medical journal, will do the same for $40. That adds up fast.

Over the past couple of decades, some scientists have been pushing for open-access publishing, a model in which papers are freely accessible upon publication after manuscript authors pay an upfront fee (ostensibly to cover the costs of managing the peer-review process and publishing, though the scientists who actually perform the work of peer review will be quick to tell you they get no payment for it).

Nearly a decade ago, the White House Office of Science and Technology Policy (OSTP) used its clout to direct all federal funding agencies to embrace open-access publishing for taxpayer-funded research. But they gave science publishers a huge concession: a 12-month “embargo period” during which papers could be kept behind a paywall. After one year, those papers had to be freely accessible.

That change was enormously helpful in expanding access to scientific research. But the 12-month delay was problematic from the start. In the era of Covid, when data needs to be available to the entire research community immediately to help address the pandemic, a year-long delay would have greatly slowed progress. (To their credit, some scientific publishers eliminated paywalls for Covid-related content, at least for a period of time. And researchers used preprints to get results to colleagues and the public even faster.)

Now, the OSTP has struck again, expanding access more dramatically. With the new policy, the 12-month embargo period is gone, and all federally funded research has to be available to anyone immediately upon publication. The data underlying those publications also has to be shared to make it easier for other scientists to check each other’s work.

In an OSTP statement explaining the rationale for the updated guidelines, Christopher Steven Marcum and Ryan Donohue wrote: “In too many cases, discrimination and structural inequalities — such as funding disadvantages experienced by minority-serving colleges and institutions — prevent some communities from reaping the rewards of the scientific and technological advancements they have helped to fund.” The revised policy was designed to overcome those challenges so that “all communities [can] take part in America’s scientific possibilities,” they added.

Michael Eisen, a leading advocate for open-access scientific publishing, applauded the change in a Twitter thread. “The reason this is a big deal is that it is the first time, in over twenty years of #openaccess initiatives from both Congress and White House, that the policy focused exclusively on what is best for the public, without any baked in concessions to publishers,” he wrote.

The Kyiv Tech Summit: Hacking for Ukraine

The hackathon will engage the blockchain community and others. Programmers, designers, and innovators aim to build tech to “make wartime life easier.”

Americans have notoriously short attention spans. After six months of unrelenting war as Russia occupies parts of Ukraine and attacks the rest, Americans seem to want steadily more distance from the war and its casualties.

“Don’t forget about Ukraine,” warned President Volodymyr Zelensky during an interview with CBS way back in April. The sentiment was prescient. Each day the battle continues, our hopes for improving the outcome diminish and worldwide impatience about a protracted battle grows.

About a month ago, I met a vivacious young Ukrainian woman, Inna Kosianets. She’s beautiful, brilliant, and exhilarated by the power of tech to solve real-world problems. By day Kosianets is a Senior VP at an NYC-based investment firm. After hours she and other activists, many of them part of the Ukrainian diaspora, are working feverishly to coordinate a hackathon that they believe can help address some of Ukraine’s most pressing wartime problems. The event is called Kyiv Tech Summit and will run from September 6th to 9th. It will be live, but only local Kyiv residents will be able to attend in person, and the location is undisclosed for safety reasons. The event will also be streamed online. Kyiv Tech Summit’s mission is to engage the local blockchain community, programmers, designers, and innovators, as well as others across the globe, to build technology that will “make war-time life easier for Ukrainians and the world.”

The hackathon’s focus is on solutions that include:

  • Resisting disinformation and information manipulation.
  • Supporting access and communication.
  • Helping displaced people.
  • Collecting donations and coordinating humanitarian aid.
  • Rebuilding Ukraine’s economy.

Learn more about each area of the hackathon.

“Kyiv, and Ukraine, in general, have an enormous amount of tech talent,” Ms. Kosianets reminded me. Recent statistics show that there are about 300,000 IT professionals, about 285,000 of them developers. “The Ukrainian IT sector has shown unprecedented resilience, flexibility, and ability to withstand conditions and keep ‘coding as usual’ during these months of the war,” she added.

In a recent tweet, Vitalik Buterin, a Canadian of Russian descent and creator of Ethereum, commended the Kyiv Tech Summit for its bravery and innovation.

I asked Kosianets how tech event programming could possibly continue under the present conditions. “It depends where you are in Ukraine,” she answered,  pointing me to this Wall Street Journal article. “One thing is certain – you are not safe anywhere and any city can be shelled by the Russians. No matter if it’s a school, supermarket, residential area, public park,” she added, sadly.

Being in Kyiv now almost feels like normal, she said, except for air sirens going off every now and then and the still-lingering fear of being bombed. “But, if you look at my hometown Kherson,” she continued, “it is now called ‘a place of quiet terror.’” The city has been occupied by Russians since the early days of the war. Life there, Ms. Kosianets reiterated, is miserable. Food and medicine are in short supply, as are other basic goods. People are being tortured and kidnapped by Russians just for being pro-Ukrainian. She told me about a close friend of hers who was taken from her home on her birthday in May. Her whereabouts are still unknown. Kosianets herself said she just got permission to relocate her family members to the U.S.

The core team behind the Kyiv Tech Summit hackathon is diverse. Four members are Ukrainians. The two other key organizers are from England and the U.S. All have friends or family still in Ukraine. Other active organizers include Rev Miller of Unchain Fund; CJ Hetherington, a co-founder of the metaverse Atlantis World; Alona Shevchenko, a co-founder of Ukraine DAO; Tyler Morrey, co-founder of pieFi; and Nadiia Yakubets, head of communications at Unchain Fund. Links to the team as well as the event’s sponsors can be found on its website.

The organizers report that 140+ hackers have already signed up. Sponsoring organizations include NEAR, Aurora, Celo, Aragon, Consensys Academy, NYM, Aave Grants DAO, ZKX, Coindesk, Incrypted, Cointelegraph, ForkLog, and the Blockchain Association of Ukraine. Organizers are also working with media and documentary filmmakers to document this historic hackathon, which would showcase Ukraine’s potential in the Web3 world. (Netflix and ETH film crews both are planning to film the event!)

Ukraine is Rich in Tech

Ukraine stands 4th among the most educated nations in the world. Over 99.7% of Ukrainians are literate. Today in Ukraine there are 198 universities, 62 academies, 83 institutes, 245 colleges, 97 technical schools, 117 specialized schools, and 1 conservatory. Six Ukrainian universities are listed in the international ranking “QS World University Rankings”. Russia also enjoys a high quality of education, though many studies find Russia falls short in innovation and modernization.

Ukraine also has its fair share of successful startups. Each year Ukrainians patent somewhere between 10,000 and 15,000 inventions and ideas. Founders of such companies as PayPal (Max Levchin) and WhatsApp (Jan Koum) were born and raised in Ukraine. Zelensky himself has appeared at events as a hologram, emphasizing the importance of tech in getting out information from Ukraine.

How can we help? Spread the news about the Kyiv Tech Summit. There are still opportunities for sponsorships and collaborations. Partner with them as a media or tech partner. Mentor the hackathon entrants. Offer to judge the events. Whatever you can contribute, it all starts with this form. And hackers can still register. The event comes right after Ukraine celebrated its Independence Day on August 24th. There’s something appropriate about that.

Official Links: Website | Twitter | Telegram | Discord. We’ll report back on the winners.
