ABOUT US
At Founders Pledge, we empower entrepreneurs to do immense good. Since 2015, our members have pledged over $10 billion to high-impact social causes. We help leading tech founders and philanthropists turn their intent into real-world impact through world-class research, tailored advice, and direct fund management. Our research team is small, expert, and deeply mission-driven.
ABOUT THE GCR TEAM
The mission of the Global Catastrophic Risks (GCR) team at Founders Pledge is to prevent extreme risks threatening human civilization, including risks from advanced artificial intelligence, engineered pandemics, and large-scale nuclear war. We view relations between the world’s great powers as a major risk factor for all of these threats. For example, unconstrained U.S.-China competition on advanced technologies could lead to dangerous racing dynamics and the deployment of unsafe AI systems. Our team has launched new organizations working on GCRs, funded major policy advocacy efforts, supported U.S.-China multi-track diplomacy on artificial intelligence, and more. Our funding runs through a mixture of philanthropic advising, partnerships with UHNW donors, and the GCR Fund. We believe this work has already substantially influenced international policy designed to reduce extreme risks.
ABOUT THE ROLE
We're hiring a Cause Area Lead to head our work on Global Catastrophic Risks (GCRs): threats that could cause significant harm to civilization today and to humanity's long-term potential. In this role, you'll drive our GCR strategy forward, manage a small team, and co-manage the GCR Fund alongside Christian Ruhl. This is a hybrid research-leadership role, ideal for someone who wants to shape one of the most important philanthropic agendas in the world.
You'll split your time between strategy, fund management, and people leadership, as detailed under Key Responsibilities below. This is a highly entrepreneurial role with direct influence over tens of millions of dollars in grantmaking decisions.
KEY RESPONSIBILITIES
Strategy & Leadership
Fund Management (with Christian Ruhl)
People Management
Cross-Team Collaboration
ABOUT YOU
You’re a sharp, impact-oriented thinker who can combine rigorous analysis with strategic leadership. You have a track record of excellent independent research, preferably in areas related to global catastrophic risks (AI governance, biosecurity, nuclear risk, geopolitics), and you’re excited to guide a team and move large sums of money to the highest-impact opportunities.
You likely have:
Nice to Have
WHY WORK WITH US
We bring the whole team together once a year for a global offsite: a week of collaborative learning, idea generation, and socialising. In addition, each team has the opportunity to meet in person at least once per year.
We are proud to be an equal opportunity employer and value diversity at Founders Pledge. We seek people with different strengths, experiences and backgrounds, who share our drive to understand and solve complex social challenges.
We do not discriminate on the basis of race, religion, colour, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status. Finding solutions to the world's most pressing problems requires different perspectives and unique ways of thinking, and we are committed to building an inclusive and diverse workplace where everyone can do their best work.
Please note: When you apply to this role, you’ll be asked to complete an equal opportunities form. As this role is being advertised in multiple countries, you’ll be asked to complete a UK equal opportunities form, as this is where our HQ is based.