Community Group
Published 11/17/25, 5:47 PM

AI & Emotional Safety Architect (Foundational Volunteer Role → Paid Position After Funding)

Remote; volunteers can be anywhere in the world


  Details

    Available Times: Weekdays (daytime)
    Time Commitment: Flexible
    Recurrence: Recurring
    Cause Areas: Community Development, Education, Mental Health, Research & Social Science, Science & Technology
    Age Requirement: 18+

    Description

    TEG-Blue™ is building a new kind of emotional technology — a system that helps humans and AI understand emotional patterns, belonging, harm, and repair. We are looking for someone who can help us design the AI layer of the project: a person who understands technology deeply and humanity gently.

    This role is for someone who wants to build ethically, slowly, and responsibly — a person who values emotional safety as much as innovation.

    About TEG-Blue™

    TEG-Blue is a large-scale emotional technology system created to map how emotions move through individuals, communities, institutions, and AI.

    We work at the intersection of psychology, neuroscience, design, and machine learning — building tools for emotional clarity, healing, and repair.

    We are now forming the AI & Emotional Safety Department, which will design how emotional logic is translated into machine-readable structures (TEG-Code, EMLU), and how AI systems can recognize trauma patterns, manipulation, belonging, and repair.

    We are looking for someone who can help us build this foundation with care, rigor, and integrity.

    Responsibilities

    This role combines architecture, research, and prototyping.

    You will:

    • Help design the high-level architecture for the AI layer of TEG-Blue (TEG-Code, emotional logic schemas, embeddings, datasets)
    • Work with Anna (Founder) to understand the emotional and relational logic behind the maps
    • Translate emotional patterns into technical structures (JSON schemas, embeddings, model attributes), as sketched below
    • Build early prototypes that show how AI can detect emotional modes (Belonging, Defense, Manipulation, Tyranny)
    • Explore how machine learning models could be trained to identify:
      • trauma patterns
      • emotional safety cues
      • belonging signals
      • relational harm
    • Work with the research team to connect empirical findings to AI structures
    • Design ethical and safety guardrails for all AI usage and outputs
    • Build small tools that help the platform understand user states ethically and safely
    • Support future development of the EMLU (Emotional Multitask Language Understanding) layer

    You don’t need to do everything alone — but you will help design the path.
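    For illustration only: one way the "translate emotional patterns into technical structures" responsibility could take shape is a small, machine-readable record for each annotated observation. TEG-Code's actual format is not public, so every name in the sketch below (EmotionalPattern, mode, intensity, cues) is a hypothetical placeholder, and Python is used simply because it already appears in the technical skills list.

        # Hypothetical sketch only: TEG-Code's real schema is not public, so all
        # field names here are invented placeholders for how an emotional pattern
        # might be captured as a machine-readable structure.
        from dataclasses import dataclass, field, asdict
        import json

        # The four emotional modes named in this posting.
        MODES = {"Belonging", "Defense", "Manipulation", "Tyranny"}

        @dataclass
        class EmotionalPattern:
            """One annotated observation of an emotional mode in a piece of text."""
            mode: str                                      # one of MODES
            intensity: float                               # 0.0 (faint) to 1.0 (strong)
            cues: list[str] = field(default_factory=list)  # surface signals the annotator noted

            def validate(self) -> None:
                if self.mode not in MODES:
                    raise ValueError(f"unknown mode: {self.mode!r}")
                if not 0.0 <= self.intensity <= 1.0:
                    raise ValueError("intensity must be between 0.0 and 1.0")

            def to_json(self) -> str:
                # Serialize so downstream tools (datasets, embeddings) can consume it.
                self.validate()
                return json.dumps(asdict(self))

        if __name__ == "__main__":
            example = EmotionalPattern(
                mode="Belonging",
                intensity=0.7,
                cues=["offers repair", "acknowledges the other person's feelings"],
            )
            print(example.to_json())

    A structure like this is only a starting point; the real schemas would be designed together with the founder and the research team.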

    Requirements (Realistic + Values-aligned)

    The ideal person has experience with:

    Technical Skills

    • AI system design / ML engineering
    • Python (PyTorch or TensorFlow)
    • NLP / language models
    • Embeddings, dataset design, vector stores
    • Building prototypes or research tools
    • Data ethics & responsible AI practices
    • JSON schemas, APIs, data modeling

    Knowledge / Orientation

    • Interest in emotional patterns, relational systems, or human behavior
    • Understanding of trauma-informed principles (or willingness to learn)
    • Experience working with complex, abstract systems
    • Ability to translate human concepts into structured logic

    Personal Capacities

    • Emotionally mature and self-aware
    • Comfortable working in ambiguity
    • Collaborative, grounded, low-ego
    • Curious, patient, and precise
    • Enjoys working at the frontier of disciplines

    Nice to Have

    • Experience with computational psychology
    • Experience with safety alignment research
    • Anthropic / OpenAI / HuggingFace experience
    • Experience building data labeling pipelines
    • Graph database understanding (Neo4j)
    • Experience designing ontology-level systems
    • Familiarity with ethics committees or sensitive data protection
    • Knowledge of neuroscientific or psychological models

    What Kind of Person Belongs Here

    This role is for someone who:

    • wants to build technology that protects people
    • cares about emotional safety
    • enjoys bridging science and human experience
    • thinks slowly and deeply
    • doesn’t rush for the sake of rushing
    • values clarity, honesty, and responsibility
    • believes AI should be built with integrity

    You don’t need to be “perfect.”

    Just grounded, curious, and aligned with the mission of making intelligence — human or artificial — safer and more connected.

    TEG-Blue™ is a place for people who care — about dignity, about repair, and about the emotional wellbeing of communities and future AI. We are building slowly and consciously. If this mission speaks to you, we would love to hear from you.

    Location

    Remote (volunteers can be anywhere in the world)
    Associated Location: Barcelona, Spain

    To apply, please fill out the application form.

    Instructions:

    This is a foundational volunteer role during the early building phase.

    Once funding arrives, the position transitions to a paid role with fair and transparent compensation.
