The Fox Says

How cost-effective are AI safety YouTubers?
Early work on “GiveWell for AI Safety”
Sep 12 • Marcus Abramovitch and Austin Chen

July 2025

AI protests from around the world
AI Protest Actions #1, a guest post by Rachel Shu
Jul 22 • Rachel Shu
Consider political giving for AI safety
AI policy and the unique advantages of individual donors
Jul 16 • Erin Braid

June 2025

Announcing Manival
An LLM-powered grant evaluator
Jun 18 • Lydia Nottingham

May 2025

‘GiveWell for AI Safety’: Lessons learned in a week
Finding effective giving opportunities: how can Manifund help?
May 30 • Lydia Nottingham

April 2025

What makes a good "regrant"?
Reviewing some of our favorite AI safety regrants - and some less good fits
Apr 24 • Jesse Richardson
Manifund 2025 Regrants
Announcing 10 AI safety regrantors, with $2m+ total to distribute
Apr 22 • Austin Chen

March 2025

Fundraising for Mox, our space in SF
Coworking & events for AI safety, AI labs, EA charities & startups
Mar 31 • Manifund
AI for Epistemics Hackathon
Seeking truth via LLMs; 9 projects built in 8 hours
Mar 14 • Manifund

December 2024

Come to minifest, our cozy one-day unfestival
Saturday, Dec 14 at Lighthaven, Berkeley
Dec 9, 2024 • Manifund

October 2024

5 homegrown EA projects, seeking small donors
plus updates on what Manifund has been up to
Oct 28, 2024 • Manifund

August 2024

Claim your funds now for EA Community Choice!
$100k airdrop, while supplies last~
Aug 21, 2024 • Manifund
© 2025 Austin Chen