PauseAI US is a grassroots organization demanding a pause on frontier AI development until it is safe to proceed. We are a coalition of all types of Americans, of all political affiliations, all ages and backgrounds, from AI experts to AI novices, all belonging to the 70% of Americans who want to Pause AI. We make the voice of the people heard through public engagement, petitioning, grassroots lobbying, and more.
Surveys of thousands of AI researchers estimate a 1 in 6 chance of human extinction from uncontrollable, superintelligent AI. Uncontrollable, superintelligent AI is exactly where we are headed if the AI industry gets its way. There are almost no guardrails on the development of frontier AI: companies are allowed to train models of any size, regardless of danger level. Some AI company CEOs have admitted there’s a chance their technology destroys humanity – but they’re willing to roll the dice anyway.
Our proposal is simple: Don’t build powerful AI systems until we know how to keep them safe. Pause AI. We advocate government regulation of the chips that are necessary to train and run the most powerful AI, pausing training runs above a threshold of computing power, and international cooperation to ensure that no company or country builds unsafe AI.
If we Pause AI, we can enjoy the benefits of safe AI systems below the Pause threshold while we take the time to ensure it’s safe to build even more powerful systems that could be even more beneficial.
Volunteer Opportunity: Local Group Leader – PauseAI US | United States (Remote)