The Fox Says

September 2025

How cost-effective are AI safety YouTubers?
Early work on "GiveWell for AI Safety"
Sep 12 • Marcus Abramovitch and Austin Chen

July 2025

AI protests from around the world
AI Protest Actions #1, a guest post by Rachel Shu
Jul 22 • Rachel Shu
Consider political giving for AI safety
AI policy and the unique advantages of individual donors
Jul 16 • Erin Braid

June 2025

Announcing Manival
An LLM-powered grant evaluator
Jun 18 • Lydia Nottingham

May 2025

‘GiveWell for AI Safety’: Lessons learned in a week
Finding effective giving opportunities: how can Manifund help?
May 30 • Lydia Nottingham

April 2025

What makes a good "regrant"?
Reviewing some of our favorite AI safety regrants - and some less good fits
Apr 24 • Jesse Richardson

Manifund 2025 Regrants
Announcing 10 AI safety regrantors, with $2m+ total to distribute
Apr 22 • Austin Chen

March 2025

Fundraising for Mox, our space in SF
Coworking & events for AI safety, AI labs, EA charities & startups
Mar 31 • Manifund

AI for Epistemics Hackathon
Seeking truth via LLMs; 9 projects built in 8 hours
Mar 14 • Manifund

December 2024

Come to minifest, our cozy one-day unfestival
Saturday, Dec 14 at Lighthaven, Berkeley
Dec 9, 2024 • Manifund

October 2024

5 homegrown EA projects, seeking small donors
Plus updates on what Manifund has been up to
Oct 28, 2024 • Manifund

August 2024

Claim your funds now for EA Community Choice!
$100k airdrop, while supplies last~
Aug 21, 2024 • Manifund