Nonprofit Organization

PauseAI US

San Francisco, CA
|
www.pauseai-us.org/

  • Mission

    PauseAI US is a grassroots organization demanding a pause on frontier AI development until it is safe to proceed. We are a coalition of all types of Americans, of all political affiliations, all ages and backgrounds, from AI experts to AI novices, all belonging to the 70% of Americans who want to Pause AI. We make the voice of the people heard through public engagement, petitioning, grassroots lobbying, and more.

    About

    Our proposal is simple:

    Don’t build powerful AI systems until we know how to keep them safe. Pause AI.

    Surveys of thousands of AI researchers estimate a 1 in 6 chance of human extinction from uncontrollable, superintelligent AI. Uncontrollable, superintelligent AI is exactly where we are headed if the AI industry gets its way. There are almost no guardrails on the development of frontier AI. AI companies are allowed to train models of any size, regardless of danger level. Some AI company CEOs have admitted there’s a chance their technology destroys humanity – but they’re willing to roll the dice anyway.

    Our proposal is simple: Don’t build powerful AI systems until we know how to keep them safe. Pause AI. We advocate government regulation of the chips that are necessary to train and run the most powerful AI, pausing training runs above a threshold of computing power, and international cooperation to ensure that no company or country builds unsafe AI.

    If we Pause AI, we can enjoy the benefits of the safe AI systems below the Pause threshold while we have the time to work to ensure that it’s safe to build even more powerful systems that could be even more beneficial.

    Impact Areas include

    • Civic Engagement
    • International Cooperation
    • Policy
    • Science & Technology
    • Volunteering

    Information and Contact

    Recent Notices
