Emerging Challenges Fund

About the ECF

Over the next decade, emerging technologies will pose significant challenges to global security. Rapid advances in AI, for example, could lower the barriers for malicious actors to carry out large-scale biological or cyber attacks, place democratic processes under unprecedented strain, and accelerate scientific and economic progress like never before. At present, we are neither ready to face the next deadly pandemic nor equipped to navigate escalating geopolitical tensions as global superpowers build more nuclear weapons, of more types, on more platforms.

The ECF aims to prepare the world for these challenges. We prefer to support projects that meet Longview’s usual grantmaking criteria and pass two further tests:

  1. Does the project have a legible theory of impact? ECF grantees must have a compelling and transparent case for their impact that a range of donors will appreciate.
  2. Will the project benefit from diverse funding? Often, support from a large number of donors, rather than from a single organisation or donor, is of particular value.

In 2024, we allocated over half of the ECF to civil society organisations invited to help draft the EU AI Act’s Code of Practice. This grant round was especially well suited to the Fund: (i) providing expertise to those shaping the implementation of the EU AI Act is both vitally important and legible; (ii) it is important that these civil society organisations are funded from a diversity of sources and remain credibly independent of any single interest group; and (iii) the Fund allowed us to make the grants quickly, as they were needed urgently. We aim for this round to exemplify the ECF’s grantmaking strategy—quickly supporting clear opportunities that other philanthropists overlook or are not well placed to support.

For those seeking to invest in a safer future, this fund provides unique expertise across beneficial AI, biosecurity, and nuclear weapons policy, and fills critical funding gaps at organisations in need of rapid financial support and a diversity of donors.

Longview also delivers additional services for larger donations:

  1. Recommendations. For donors wishing to make large gifts, we offer access to grant recommendations drawn from our top opportunities. These concise analyses help donors find and fill the most critical funding gaps.
  2. End-to-End Effective Giving. For major donors seeking to develop significant philanthropic portfolios, we provide a bespoke end-to-end service at no cost. This includes detailed analysis, expert-led learning series, residential summits, tailored strategic planning, grant recommendations, due diligence, and impact assessment. Please get in touch with our CEO, Simran Dhaliwal, at sim@longview.org.
Fund Managers
Simran Dhaliwal
CEO
Simran coordinates Longview Philanthropy’s research, grantmaking, and advising work. Prior to joining, she was a research analyst at Goldman Sachs, working on a two-person team recognised as the best sell-side stockpickers in London in 2018. While there, she also became a Chartered Financial Analyst (CFA) charterholder and was donating to high-impact charities. Simran read philosophy, politics, and economics at the University of Oxford, where she first came across the concept of using evidence and reason to do the most good at a Giving What We Can talk.
Carl Robichaud
Nuclear Weapons Policy Programme Director
Carl leads Longview’s programme on nuclear weapons policy and co-manages Longview’s Nuclear Weapons Policy Fund. For more than a decade, Carl led grantmaking in nuclear security at the Carnegie Corporation of New York, a philanthropic fund which grants over $30 million annually to strengthen international peace and security. Carl previously worked with The Century Foundation and the Global Security Institute, where his extensive research spanned arms control, international security policy, and nonproliferation.
Matthew Gentzel
Nuclear Weapons Policy Programme Officer
Matthew conducts grant investigations for Longview’s programme on nuclear weapons policy and co-manages its Nuclear Weapons Policy Fund. His prior work spanned emerging technology threat and policy assessment, focusing on how advancements in AI may shape influence operations, nuclear strategy, and cyber attacks. He has worked as a policy researcher with OpenAI, an analyst in the US Department of Defense’s Innovation Steering Group, and director of research and analysis at the US National Security Commission on Artificial Intelligence.
Aidan O’Gara
AI Programme Officer
Aidan conducts grant investigations in artificial intelligence (AI), with a particular focus on technical research in AI safety. Before joining Longview, he conducted research on machine learning and AI policy at GovAI, Epoch, Cornell University, AI Impacts, and the Center for AI Safety. He also spent three years leading the data science team at a fintech startup. Alongside his work at Longview, Aidan is a DPhil candidate in AI at the University of Oxford.
Dr Zach Freitas-Groff
Senior Programme Associate
Zach conducts grant investigations in artificial intelligence (AI). He completed his PhD in economics at Stanford University, where he received support from the National Science Foundation, the Forethought Foundation for Global Priorities Research, and the Stanford Institute for Economic Policy Research. Zach has conducted research covered by The New York Times, Reuters, Marginal Revolution, and, most importantly, Saturday Night Live. Before that, he was a Research Analyst at Innovations for Poverty Action and the Global Poverty Research Lab at Northwestern University.