For the right fit, we're open to compensating at a much higher pay rate.
The Cambridge Boston Alignment Initiative (CBAI) is a nonprofit research organization advancing research and education to help society navigate a safe and beneficial transition to advanced AI systems. We do this by producing original research and by accelerating AI safety research through fellowship programs.
Our inaugural summer fellowship cohort has already published a spotlight paper at the Mechanistic Interpretability Workshop at NeurIPS and had papers accepted at ICLR, and some of our fellows have joined Goodfire and Redwood Research. After a successful 2025 launch, we're rapidly scaling in 2026: we will host multiple fellowship cycles, double our fellowship cohort, and quadruple our team.
CBAI has been the anchor organization for AI safety in Cambridge since late 2022. We serve as the connective tissue between the AI safety research community across Harvard, MIT, and Northeastern — running workshops, convenings, and networking events that bring together researchers, students, and practitioners working on the most important problems in the field.
Refer candidates to us, and receive $5,000 if we hire them.
You'll own CBAI's external-facing event and workshop portfolio — the programs that make Cambridge a genuine gathering point for AI safety researchers and that build the field's next generation of talent. This includes three distinct but interconnected responsibilities: scaling and improving the Harvard-MIT AI Safety Workshops held in Essex, MA; running CBAI's monthly city-level networking events; and sourcing, designing, and executing high-visibility specialized research workshops with Harvard, MIT, and Northeastern research groups. This is equal parts community strategy and operational execution — you'll need to be as comfortable designing a workshop curriculum with student group leaders as you are negotiating a venue contract or managing a speaker's travel logistics.
Own the end-to-end planning and execution of the Harvard-MIT AI Safety Workshops, currently held 5 times per academic year in Essex, MA, with ~40 participants and ~10 guests per workshop
Work with the Harvard AI Safety Student Team (AISST) and MIT AI Alignment (MAIA) to scale the program from 5 to 8 workshops per academic calendar year
Curate and invite guests from frontier AI safety research organizations — including Redwood Research, METR, Google DeepMind, Anthropic, OpenAI's safety teams, and Cambridge-area academics
Lead participant selection in collaboration with AISST and MAIA, ensuring cohort quality and diversity across partnering university groups
Design workshop programming: session formats, discussion structures, speaker slots, and career programming
Gather and synthesize participant and guest feedback to continuously improve workshop quality
Identify and source opportunities for high-visibility, weekend-long specialized research workshops in partnership with Harvard, MIT, and Northeastern research groups
Build relationships with faculty and research group leads to develop workshop concepts that serve genuine research needs
Own end-to-end execution: venue, logistics, programming design, speaker coordination, and participant experience
As the portfolio grows, take on a managerial dimension — coordinating contractors, volunteers, or support staff to run larger events effectively
Own and run CBAI's monthly Cambridge AI safety networking events, serving as the connective tissue between researchers, students, and practitioners across the city
Curate invitee lists, manage outreach, and create programming that makes each event genuinely worth attending
Build and maintain CBAI's relationships with the broader Cambridge AI safety community, using events as a vehicle for strengthening the ecosystem
We expect you to be characterized by most of the qualities listed below.
You're an exceptional event producer. You've planned and executed complex, multi-stakeholder events — ideally including residential or multi-day formats — and you take genuine pride in the quality of the experience you create. You anticipate problems before they happen, keep logistics airtight, and stay calm when things go wrong anyway.
You're a natural community builder. You understand that events are a means, not an end. You think about what makes a community worth belonging to, who should be in the room and why, and how to design experiences that strengthen relationships and generate real intellectual value — not just fill a calendar.
You're a credible interlocutor in research spaces. You don't need to be a researcher, but you need enough fluency to hold a substantive conversation with a safety researcher at Anthropic, understand why a particular workshop topic matters, and make good curatorial judgments about guests and programming. Candidates with meaningful familiarity with AI safety research will be strongly preferred.
You're an excellent relationship builder. A significant part of this role is maintaining trust with student group leaders at Harvard and MIT, researchers at frontier labs, and faculty across multiple universities. You know how to build those relationships thoughtfully and keep them warm over time.
You're operationally excellent. You manage vendors, timelines, budgets, and logistics without dropping things. You build systems and documentation that make each successive event easier to run than the last.
You're an excellent communicator. You write clearly, represent CBAI well to external audiences, and give and receive feedback effectively. You're proactive when you notice a problem.
You're mission-motivated. You care about AI safety and want to contribute to building the research community working on it — even if your contribution is through convening and community rather than research itself.
You're entrepreneurial. The specialized workshop portfolio is nascent. You'll need to identify opportunities, make the case to potential partners, and build something that doesn't yet fully exist. If you wait to be handed a clear brief, this role will be frustrating.
The ideal candidate will have meaningful experience in event production, academic program coordination, or research community building — ideally in a university, think tank, or mission-driven research organization context. What matters most is that you've built events that people genuinely valued, and that you bring demonstrated judgment in curating rooms that produce real intellectual and professional value.
Existing relationships in the AI safety, AI governance, or broader Cambridge research community
Experience with residential or retreat-style event formats
Familiarity with the Cambridge and greater Boston venue and vendor landscape
Experience working with or within university student organizations or academic departments
This is not a research role. You'll be deeply embedded in cutting-edge AI safety research conversations, but your job is to build the environments where those conversations happen — not to participate in them as a researcher. If you're primarily motivated by contributing to AI safety research directly, this role won't be satisfying.
The portfolio is ambitious and partly unbuilt. The Harvard-MIT workshops are established; the specialized research workshop portfolio is early-stage. You'll need to source opportunities, build relationships from scratch, and create programming where none exists yet. If you prefer inheriting a fully defined scope, this may be challenging.
Workload is uneven. Event-heavy periods are intense. Between major workshops, the pace shifts toward relationship-building and planning. If you need a steady, predictable rhythm to do your best work, this variability may be difficult.
Your impact is felt in the room, not on paper. When a workshop produces a collaboration that leads to an important paper, or a networking event connects someone with the opportunity that changes their career trajectory, that's your win — but your name won't be on it. If you need visible, attributable outputs to feel satisfied, this role will be frustrating.
If this sounds exciting to you, and you want to spend at least a year becoming excellent at building research communities, running high-quality convenings, and contributing to the growth of AI safety as a field from one of the world's great research cities, this role could be a great fit.
Team: You'll report to CBAI leadership.
Salary: $80,000 – $100,000, depending on experience.
We also provide:
5% 403(b) match contribution
Comprehensive health insurance
Generous PTO policy
Meals provided during weekdays
Employer-paid commuter benefits
Reimbursement for work-related technology and/or home office expenses
U.S. work authorization required (we accept OPT).
Location: Cambridge, MA. This role is primarily in-person given the nature of the work, though some flexibility exists between major events.
Start date: May 2026
Application Review: We review applications on a rolling basis. Your application will be reviewed in detail by a CBAI employee.
Initial Phone Screen (15 minutes): A conversation with the team manager to discuss your background, interest in the role, and initial questions.
Paid Test Task: Strong candidates will receive a paid test task mirroring actual responsibilities — such as designing a workshop agenda, drafting a guest outreach strategy, or proposing a programming format for a monthly networking event.
Interview: Top candidates will be invited for an interview, including discussion of your test task, an event design case study, a community strategy scenario, and conversation with CBAI team members and potentially a student group leader or partner.
Reference Checks: Conducted for top finalists, followed by a final conversation to ensure mutual fit.
Offer: Selected candidates receive an offer and onboarding information.
CBAI is an Equal Opportunity Employer and does not discriminate on the basis of race, religion, color, sex, gender identity or expression, sexual orientation, age, disability, national origin, veteran status, or any other basis covered by appropriate law.
In acknowledgement of the research that suggests that women, gender minorities, and other marginalized groups may be less likely to apply for roles where they don’t meet every criterion, we especially encourage people in these categories to apply.
We may use AI to assist in the initial screening of applications, including to detect whether candidates have used AI models in drafting their application. Decisions are always made by a human on our team.