<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:itunes="http://www.itunes.com/dtds/podcast-1.0.dtd" xmlns:googleplay="http://www.google.com/schemas/play-podcasts/1.0"><channel><title><![CDATA[The Fox Says]]></title><description><![CDATA[A newsletter for https://manifund.org]]></description><link>https://manifund.substack.com</link><image><url>https://substackcdn.com/image/fetch/$s_!GkUu!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fac5397c1-51f3-4fec-aa69-068f192bba24_500x500.png</url><title>The Fox Says</title><link>https://manifund.substack.com</link></image><generator>Substack</generator><lastBuildDate>Fri, 10 Apr 2026 11:16:14 GMT</lastBuildDate><atom:link href="https://manifund.substack.com/feed" rel="self" type="application/rss+xml"/><copyright><![CDATA[Austin Chen]]></copyright><language><![CDATA[en]]></language><webMaster><![CDATA[manifund@substack.com]]></webMaster><itunes:owner><itunes:email><![CDATA[manifund@substack.com]]></itunes:email><itunes:name><![CDATA[Manifund]]></itunes:name></itunes:owner><itunes:author><![CDATA[Manifund]]></itunes:author><googleplay:owner><![CDATA[manifund@substack.com]]></googleplay:owner><googleplay:email><![CDATA[manifund@substack.com]]></googleplay:email><googleplay:author><![CDATA[Manifund]]></googleplay:author><itunes:block><![CDATA[Yes]]></itunes:block><item><title><![CDATA[One year of Mox, and fundraising for year 2]]></title><description><![CDATA[Support the largest AI safety & EA hub in SF]]></description><link>https://manifund.substack.com/p/one-year-of-mox-and-fundraising-for</link><guid isPermaLink="false">https://manifund.substack.com/p/one-year-of-mox-and-fundraising-for</guid><dc:creator><![CDATA[Manifund]]></dc:creator><pubDate>Mon, 09 Mar 2026 18:51:22 GMT</pubDate><enclosure 
url="https://substack-post-media.s3.amazonaws.com/public/images/3a1620ac-f6f2-49cb-8868-a6e5034936aa_2048x1366.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>Mox is San Francisco&#8217;s primary AI safety incubator and Effective Altruism community space. Over the last year, we&#8217;ve supported high-impact work by hosting fellowships, events, offices, and coworking. We&#8217;re now looking for individual donors and institutional funding to carry us through the next year!</p><p>We&#8217;re aiming to raise $450k, and think we can effectively deploy up to $1.2m to run more great events, improve our space, and incubate new fellowships. To kick things off, one anonymous donor has offered Mox a 1:1 match of up to $100k &#8212; if we can raise $50k+.</p><p>Please support us through <a href="https://manifund.org/projects/mox-2026-fundraiser">our Manifund page</a>, or read on to learn more about our work!</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://manifund.org/projects/mox-2026-fundraiser&quot;,&quot;text&quot;:&quot;Donate&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://manifund.org/projects/mox-2026-fundraiser"><span>Donate</span></a></p><p><em>Interested in donating a large amount? Reach out to Rachel Shu, Mox Director, at <a href="mailto:rachel@moxsf.com">rachel@moxsf.com</a>.</em></p><h2><strong>Our first year in review</strong></h2><p>Mox launched on Feb 15, 2025. In a single year on a shoestring budget, we&#8217;ve become a primary nexus for the AI safety community and other EA-adjacent work in San Francisco. 
We&#8217;ve hosted hundreds of members and thousands of visitors, organized our own public events and fellowship, and supported dozens of other impactful orgs.</p><p><strong>By the numbers:</strong></p><ul><li><p>183 active members</p></li><li><p>15 private offices</p></li><li><p>377 events hosted</p></li><li><p>Partnerships with 19 AI Safety &amp; EA orgs</p></li><li><p>4 floor (40k sq ft) buildout</p></li></ul><h2><strong>Fellowships &amp; programs</strong></h2><p>We supported five residencies in the last year:</p><ul><li><p><strong><a href="https://aiforhumanreasoning.com/">FLF Fellowship on AI for Human Reasoning</a></strong>: 30 fellows exploring, researching, and developing potential beneficial AI for Human Reasoning tools.</p></li><li><p><strong><a href="https://pibbss.ai/symposium-25/">PIBBSS Fellowship 2025</a></strong> (now Principles of Intelligence): 17 fellows in residence for cross-disciplinary AI safety research.</p></li><li><p><strong><a href="https://seldonlab.com/blog/seldon-grande-finale-celebrating-a-successful-batch-1">Seldon Lab Accelerator</a>, Batch 1:</strong> 4 startups building AI safety infrastructure, including Andon Labs, Workshop Labs, and Lucid Computing.</p></li><li><p><strong>Seldon Lab Accelerator, Batch 2</strong>: 6 startups, currently in residence.</p></li><li><p><strong><a href="https://framefellowship.com/">The Frame Fellowship</a></strong>: An 8-week program for 8 video creators communicating about AI safety, developed in-house at Mox.</p></li></ul><p>We also offer our space for retreats, workshops, and hackathons, serving orgs like <a href="https://www.tarbellcenter.org/">Tarbell</a>, <a href="https://www.1daysooner.org/">1DaySooner</a>, <a href="https://elicit.com/">Elicit</a>, <a href="https://press.asimov.com/">Asimov Press</a>, <a href="https://futuresearch.ai/">FutureSearch</a>, and <a href="https://www.avainternational.org/">AVA International</a>.</p><p>Finally, we work with local conferences, hosting pre/post 
conference coworking and side events for <a href="https://www.effectivealtruism.org/ea-global/events/ea-global-san-francisco-2026">EAG Bay Area</a>, <a href="https://less.online/">LessOnline</a>, <a href="https://manifest.is/">Manifest</a>, and <a href="https://thecurve.goldengateinstitute.org/">The Curve</a>.</p><p><strong>How does Mox contribute to these programs&#8217; success?</strong></p><ul><li><p>Provides a fully furnished office</p></li><li><p>Situates them alongside other groups doing similar work</p></li><li><p>Provides event venue space directly connected to their workspace</p></li><li><p>Handles daily catering, janitorial and supplies</p></li><li><p>Troubleshoots participant tech</p></li></ul><h2><strong>Public Events</strong></h2><p>We hosted <strong>377</strong> events over the last year, including:</p><ul><li><p><strong><a href="https://luma.com/jxsanwtn">Senator Scott Wiener on AI safety legislation</a></strong> &#8212; live Q&amp;A with CA Senator and author of SB-53.</p></li><li><p><strong><a href="https://www.sentientfutures.ai/sfsbay2026">Sentient Futures Summit</a></strong> &#8212; a 350+ attendee, 3-day conference in February 2026 focused on AI and Animal Welfare, as <a href="https://sfstandard.com/2026/02/19/sentient-futures-ai-rights/">featured in the SF Standard</a>.</p></li><li><p><strong><a href="https://luma.com/45t9ec6t">Models in Moral Mazes</a></strong> &#8212; Anthropic research scholars previewed an unpublished paper on misalignment at Mox before public release.</p></li><li><p><strong><a href="https://luma.com/xjcpo8tp">Man vs Machine Hackathon</a></strong> (<a href="https://metr.org/">METR</a> &#215; <a href="http://factory.ai/">Factory.AI</a>) &#8212; a 300 person, live RCT on AI coding agent productivity</p></li><li><p>Q&amp;As and fireside chats with <a href="https://partiful.com/e/0amUhez4xeatKYZI6GBL">Joe Carlsmith</a>, <a href="https://partiful.com/e/LRh3wHKLIywblTtYm75W">Eli Lifland</a>, <a 
href="https://luma.com/u6r1itp9">Nate Soares</a>, <a href="https://partiful.com/e/j5Ar45JijdiGvfRANbAz">Scott Aaronson</a>, <a href="https://luma.com/9j8wpppr?tk=sOXOYR">Joel Becker</a> and <a href="https://luma.com/6m18zbbo">Bryan Caplan</a>.</p></li></ul><p>Mox also hosts recurring community events, such as:</p><ul><li><p><strong>Effective Altruism SF</strong>, biweekly events and meetups</p></li><li><p><strong>Astral Codex Ten SF</strong>, monthly meetups</p></li><li><p><strong>90/30 Club</strong>, machine learning paper reading group</p></li><li><p><strong>Mathematics with Lean</strong>, the interactive theorem prover</p></li></ul><blockquote><p><em><strong>Mox has been an invaluable resource for us when running EA SF [Effective Altruism San Francisco], since its large and well-equipped facility allowed us to cater food, run speaker events, workshops, and otherwise host much larger and more ambitious events than we otherwise would have been able to.</strong></em></p></blockquote><p><em>&#8212; Lead organizers of EA SF</em></p><h2><strong>Individuals &amp; Coworking</strong></h2><p>We currently have 183 active members; on a typical coworking day, 50-80 people are at Mox. A sampling of individual members who are frequently at Mox:</p><ul><li><p>Justin Kuiper, AI safety video producer</p></li><li><p>Ross Rheingans-Yoo, philanthropic investor</p></li><li><p>Itsi Weinstock, Senterra Funders</p></li><li><p>Kamile Lukosuite, GovAI</p></li><li><p>Joshua Levy, Holloway</p></li><li><p>Ronak Mehta, AI safety researcher</p></li></ul><p>You can see a list of all members here: <a href="https://moxsf.com/people">https://moxsf.com/people</a></p><p><strong>Testimonials from our August 2025 feedback survey:</strong></p><blockquote><p><em><strong>It feels like a second home, but more lively. 
I can always expect to run into a friend who is down to cowork or hang.</strong></em></p></blockquote><p>&#8212; <em>Constance Li, founder of Sentient Futures</em></p><blockquote><p><em><strong>I can walk up to anyone and have an interesting conversation; every single person I&#8217;ve met here has welcomed questions about their work and been curious about mine.</strong></em></p></blockquote><p><em>&#8212; Gavriel Kleinwaks, Horizon Fellow</em></p><blockquote><p><em><strong>Mox has the best density of people with the values &amp; capabilities I care about the most. In general, it&#8217;s more social &amp; feels better organized for serendipity vs any coworking space I&#8217;ve been to before, comparable to perhaps like 0.3 Manifests per month.</strong></em></p></blockquote><p><em>&#8212; Venki Kumar</em></p><h2><strong>Private offices and partner organizations</strong></h2><p>In Year 1, Mox was home to 15 private offices, including:</p><ul><li><p><strong>Sentient Futures:</strong> promoting animal welfare and sentience research</p></li><li><p><strong>Tampersec:</strong> building physical computing infrastructure security</p></li><li><p><strong>Andon Labs:</strong> building autonomous organizations such as <a href="https://www.anthropic.com/research/project-vend-2">Project Vend</a>, via Seldon accelerator</p></li><li><p><strong>Pantograph:</strong> building a preschool for robots</p></li><li><p><strong>BlueDot Impact</strong> (pending visa): online courses for AI safety upskilling</p></li></ul><p>We also maintain a <a href="https://moxsf.com/guest-program">Guest Program</a> with 19 partner organizations, providing complimentary drop-in access for a variety of orgs we highly respect. Public program partners include: MIRI, FAR.AI, Redwood Research, Palisade, GovAI, Epoch, AI Impacts, Timaeus, Elicit, Evitable, FAI, and MATS.</p><blockquote><p><em><strong>Our teammates visit San Francisco a couple of times a month. 
Instead of renting a coworking spot, Mox gives us a familiar space with friendly faces that we reliably run into. It feels closer to going to the college library with friends than to an office. We hang out there for many hours after our work is done!</strong></em></p></blockquote><p><em>&#8212; Deger Turan, CEO of Metaculus</em></p><h2><strong>Past funding and budget updates</strong></h2><p><strong>Grant updates</strong></p><p>In our <a href="https://manifund.org/projects/mox-a-coworking--events-space-in-sf?tab=donations">initial fundraising post</a> a year ago, we proposed three budget tiers &#8212; minimal ($1.6M/year), mainline (~$2M), and ambitious ($3.6M).</p><p>What we spent annualized to roughly $1.2M, less than even our &#8216;minimal&#8217; tier projection. What we delivered landed closer to &#8216;mainline&#8217;: 183 members, 144 Guest Program participants, 15 offices, a team of 5, and 2 tentpole events most months. And from the &#8216;ambitious&#8217; tier, we succeeded at expanding Mox to all four floors of 1680 Mission.</p><p>Mox operates on a lean budget; we believe our per-member and total costs compare favorably to other AI safety hubs such as Constellation, Lighthaven, and LISA. We&#8217;ve done this by keeping our team small, finding good deals on rent and furnishings, and charging fair prices to our members and clients. 
We expect monthly revenue to continue growing by $10-15k/mo for the next 3-6 months, with offices and memberships both scaling steadily, and project steady state expenses to be ~$150k/month.</p><p>See our monthly revenue and expenses spreadsheet: <a href="https://docs.google.com/spreadsheets/d/18OJQzJ_CRt5ADhsl-zmbIlIbRVYjsTQDVK5-OmVaWkE/edit?usp=sharing">May 2025 - Jan 2026</a></p><h3><strong>Budget basics</strong></h3><p>Grant funding (Craig Falls &amp; EAIF): $300k<br>Grant funding (Manifund): $500k<br>Revenue to date: $600k<br>Projected revenue in 2026: $1.4m</p><ul><li><p>35% &#8212; Memberships</p></li><li><p>35% &#8212; Offices</p></li><li><p>20% &#8212; Programs</p></li><li><p>10% &#8212; Events</p></li></ul><p>Projected costs in 2026: $1.5m</p><ul><li><p>40% &#8212; Labor</p></li><li><p>30% &#8212; Rent &amp; utilities</p></li><li><p>25% &#8212; Office supplies</p></li><li><p>5% &#8212; Event costs</p></li></ul><p><em>(all figures approximate)</em></p><h2><strong>Upcoming Plans</strong></h2><h3><strong>1. 
Grow and improve our main offerings</strong></h3><p><strong>Events:</strong></p><ul><li><p>More major conferences like Sentient Futures Summit</p></li><li><p>More public talks with key speakers like Senator Wiener</p></li><li><p>Improve our first floor to make it highly usable and more publicly accessible; a better space primarily increases our impact, and secondarily our revenue.</p></li></ul><p><strong>Programs:</strong></p><ul><li><p>Serve repeat cohorts of the fellowship programs that have used our space so far</p></li><li><p>Additionally serve 3-7 new fellowships and workshops in this coming year</p></li></ul><p><strong>Coworking:</strong></p><ul><li><p>Continue growing our community of individual members to 120-150 daily users, 300+ total members</p></li><li><p>Maintain the ability to select private offices based on fit, rather than market rate</p></li><li><p>Create additional meeting rooms and other communal areas in the coworking space</p></li></ul><h3><strong>2. Attract international talent via Global Expert Fellowship</strong></h3><p>A key part of our second-year vision is the Global Expert Fellowship: hosting independent researchers, domain specialists, and builders through J-1 visa programs to create new frontier technology collaborations within the Mox community and internationally. Learn more <a href="https://www.notion.so/Researchers-and-Founders-Join-Mox-s-Global-Expert-Fellowship-30d54492ea7a80bc9c5ce70ccfceae07?pvs=21">here</a>.</p><p><strong>This may be the highest-impact thing we can achieve this year</strong>. It has immediate external impact by enabling independent researchers to quickly enter the US to do work, and it strengthens Mox by expanding our network of high-quality talent. Mox is in a rare position to pull this off, as we are able to meet State Department requirements for visa-qualifying cultural exchange that many other organizations cannot.</p><h3><strong>3. 
Incubate new workshops and programs</strong></h3><p>We have an advantage in creating our own programs, sourcing from the talent pool we&#8217;re developing.</p><p>Upcoming example: the <strong>Muybridge Fellowship for Visual Interpretability</strong>, which would bring together technical visual and interactive pioneers to improve the presentation of mechanistic interpretability research and broaden its accessibility. This builds on the experience gained running the existing Frame Fellowship.</p><h2><strong>Support Mox</strong></h2><p>Thank you for reading this! We&#8217;d appreciate your support, whether via:</p><ul><li><p>A direct donation (through Manifund, or contact Rachel Shu at <a href="mailto:rachel@moxsf.com">rachel@moxsf.com</a>)</p></li><li><p>Sharing this fundraiser with other potential donors</p></li><li><p>Leaving a comment about your own experience of Mox</p></li></ul><p>And of course, we&#8217;re always on the lookout for excellent members, orgs, or events to work with us; please send them our way!</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://manifund.org/projects/mox-2026-fundraiser&quot;,&quot;text&quot;:&quot;Donate&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://manifund.org/projects/mox-2026-fundraiser"><span>Donate</span></a></p><p></p>]]></content:encoded></item><item><title><![CDATA[How cost-effective are AI safety YouTubers?]]></title><description><![CDATA[Early work on &#8220;GiveWell for AI Safety&#8221;]]></description><link>https://manifund.substack.com/p/how-cost-effective-are-ai-safety</link><guid isPermaLink="false">https://manifund.substack.com/p/how-cost-effective-are-ai-safety</guid><dc:creator><![CDATA[Marcus Abramovitch]]></dc:creator><pubDate>Fri, 12 Sep 2025 16:30:27 GMT</pubDate><enclosure 
url="https://substackcdn.com/image/fetch/$s_!V_1u!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fef47ee73-bc48-4d4d-9ae5-cb5d3bda734e_1906x704.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p><em>Hey! Austin here. Some of Manifund&#8217;s <a href="https://manifund.org/projects/creating-making-god-an-accessible-feature-documentary-on-risks-from-agi">most</a> <a href="https://manifund.org/projects/finishing-the-sb-1047-documentary-in-6-weeks">popular</a> <a href="https://manifund.org/projects/development-of-a-cautionary-tale-feature-film-about-gradual-disempowerment">projects</a> have been videos on AI safety. We may push more in this direction &#8212; eg running a content creator fellowship out of Mox. At the same time, I&#8217;ve wanted better estimates of the impact of work we fund. Regrantor <a href="https://manifund.org/MarcusAbramovitch">Marcus Abramovitch</a> and I have started looking into data on these videos, and are excited to share early results. Here&#8217;s Marcus:</em></p><h2><strong>Intro</strong></h2><p>EA was founded on the principle of <em>cost-effectiveness</em>. We should fund projects that do more with less, and more generally, spend resources as efficiently as possible. And yet, while much interest, funding, and resources in EA have shifted towards AI safety, it&#8217;s rare to see any cost-effectiveness calculations. 
The focus on AI safety is based on vague philosophical arguments that the future could be very large and valuable, and thus whatever is done towards this end is worth orders of magnitude more than most short-term effects.</p><p>Even if AI safety is the most important problem, you should still strive to optimize how resources are spent to achieve maximum impact, since resources are limited.</p><p>Global health organizations and animal welfare organizations work hard to measure cost-effectiveness, evaluate charities, make sure effects are counterfactual, run RCTs, estimate moral weights, scope out interventions, and more. Conversely, in AI safety/x-risk/longtermism, very little effort is spent on measuring impact. It&#8217;s hard to find anything public that compares the results of interventions. Perhaps funders make these calculations in private, but one of the things that made GiveWell so great was that everything was out in the open. From day 1, you could see GiveWell&#8217;s thinking on their <a href="https://blog.givewell.org/2006/12/23/nice-to-meet-you/">blog</a>, examine their spreadsheets, and change parameters based on your own thinking.</p><p>I&#8217;m making a first attempt at a &#8220;GiveWell for AI safety&#8221;. The end goal is to establish units to measure the cost-effectiveness of AI safety interventions, similar to DALY/$ in global health. <a href="https://forum.effectivealtruism.org/posts/E7pkeDruknpSa7j3i/results-of-an-informal-survey-on-ai-grantmaking">Scott Alexander&#8217;s survey</a> from last year is a great example of different kinds of relevant units for AI safety work. Given units like these, a donor or funder can look at the outputs of different AI safety orgs and &#8220;buy&#8221; certain outcomes far more cheaply from some than from others.</p><p>To begin, I&#8217;m looking into the cost-effectiveness of AI safety communications. 
I&#8217;m starting with communications because it&#8217;s easiest to get metrics, and the outputs of comms work are publicly viewable (compared to, e.g., AI policy work) and more easily assessable (compared to, e.g., technical AI safety work). For this post, I&#8217;m specifically focused on YouTube videos, where metrics were easiest to gather; I&#8217;m introducing my framework for evaluating different YouTube channels, along with measurements and data I&#8217;ve collected.</p><h2><strong>Step 1: Gathering data</strong></h2><p>The end-goal metric I propose for AI safety videos&#8217; cost-effectiveness is &#8220;quality-adjusted viewer-minutes per dollar spent&#8221;. I started by collecting data for channel viewership and costs.</p><h3><strong>Viewer minutes</strong></h3><p>To calculate time spent watching videos, I first wrote a <a href="https://gist.github.com/marcussabramovitch/304772297a8b7bf23923fd202cb3cb04">Python script</a> to query the YouTube API for views and view lengths for every video in specific channels. I multiplied these, and adjusted by a factor for the average percentage of a video watched: 33%, based on conversations with creators. I also messaged creators asking them for their numbers, using those directly where possible. For creators who responded, I included their metrics (which were usually screenshotted, for authenticity).</p><h3><strong>Costs and revenue</strong></h3><p>To measure cost-effectiveness, we also need to estimate the costs of making the videos. In addition to direct costs (such as equipment and editing), I consider the value of the time of the people producing the videos to be a major cost. This is because they could be doing other things, such as earning to give. Because of this, I generally ask people to estimate their market-rate salaries if they are doing the work unpaid. For example, for AI in Context, this included the salaries paid to 80000 Hours employees for the production of their video. 
For other creators who had lots of personal savings and weren&#8217;t getting paid, I asked them to include the value of their time.</p><p>Some channels and podcasts also produce revenue through ads and sponsorships. This is a very good thing and is a sign that people want to see the content. In fact, I expect the best content will be able to self-fund after some time, and even be profitable. That said, for now, most aren&#8217;t profitable and subsist on donations; thus, I count revenues as offsetting costs for cost-effectiveness calculations, because this funding was produced organically, though I&#8217;m open to treating this differently.</p><h3><strong>Results</strong></h3><p>Before we get into quality adjustments, here&#8217;s a snapshot of where different channels stand:</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!V_1u!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fef47ee73-bc48-4d4d-9ae5-cb5d3bda734e_1906x704.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!V_1u!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fef47ee73-bc48-4d4d-9ae5-cb5d3bda734e_1906x704.png 424w, https://substackcdn.com/image/fetch/$s_!V_1u!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fef47ee73-bc48-4d4d-9ae5-cb5d3bda734e_1906x704.png 848w, https://substackcdn.com/image/fetch/$s_!V_1u!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fef47ee73-bc48-4d4d-9ae5-cb5d3bda734e_1906x704.png 1272w, 
https://substackcdn.com/image/fetch/$s_!V_1u!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fef47ee73-bc48-4d4d-9ae5-cb5d3bda734e_1906x704.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!V_1u!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fef47ee73-bc48-4d4d-9ae5-cb5d3bda734e_1906x704.png" width="1456" height="538" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/ef47ee73-bc48-4d4d-9ae5-cb5d3bda734e_1906x704.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:538,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:509104,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://manifund.substack.com/i/173411122?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fef47ee73-bc48-4d4d-9ae5-cb5d3bda734e_1906x704.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!V_1u!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fef47ee73-bc48-4d4d-9ae5-cb5d3bda734e_1906x704.png 424w, https://substackcdn.com/image/fetch/$s_!V_1u!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fef47ee73-bc48-4d4d-9ae5-cb5d3bda734e_1906x704.png 848w, 
https://substackcdn.com/image/fetch/$s_!V_1u!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fef47ee73-bc48-4d4d-9ae5-cb5d3bda734e_1906x704.png 1272w, https://substackcdn.com/image/fetch/$s_!V_1u!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fef47ee73-bc48-4d4d-9ae5-cb5d3bda734e_1906x704.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p></p><h2><strong>Step 2: Quality-adjusting</strong></h2><p>For quality adjustment, there are three factors I introduce: the quality 
adjustment for the audience, Qa, the quality adjustment for the fidelity of the message, Qf, and the alignment of the message, Qm. So the overall equation is:</p><blockquote><p>Quality-adjusted viewer minute = Views &#215; Video length &#215; Watch % &#215; Qa &#215; Qf &#215; Qm</p></blockquote><h3><strong>Quality of Audience (Qa)</strong></h3><p>This quality adjustment is meant to capture the average quality/importance of the audience. The default value is 1, as if the average viewer were an average person (a totally random sample from the human population). If an audience member is more influential, has more resources, or is otherwise more impactful on the world, this number goes up; if they are less so, it goes down. At the extreme, you might assign a value of 0 to someone who is about to die alone on their deathbed right after they watch the video, and perhaps as high as 1,000,000,000 for an audience of just the President of the US. Other things that make a viewer valuable include being at a pivotal time in their career, being extremely intelligent, etc.</p><p>This is comparable to, but not directly correlated with, CPM (cost per thousand views, i.e., how much advertisers would pay to advertise to this person). This value could, in theory, be negative (it&#8217;d be better for this person not to watch the video; we&#8217;d pay for a person who was about to watch it to have their internet shut down and the video not load), though I&#8217;d suggest ignoring that possibility.</p><p>Normal values for this factor will be between 0.1 and 100, but should center around 1-10.</p><h3><strong>Fidelity of Message (Qf)</strong></h3><p>This refers to how well the message intended for the audience is received by the audience, on average, across all viewer-minutes. It attempts to measure the importance of the message and how well the message is conveyed for your goal. 
If your goal is to explain instrumental convergence or give the viewer an understanding of mechanistic interpretability, how well the video conveys that message is what is being measured here, alongside how important the message is.</p><p>An intuitive way to grasp this metric is to consider how much you&#8217;d rather someone watch one minute of a certain video compared to a reference video. For now, I am somewhat arbitrarily setting the reference video to be the average of <a href="https://www.youtube.com/@RobertMilesAI/videos">Robert Miles&#8217; AI safety videos</a>. I&#8217;m seeking a better reference video; perhaps one that is more widely known or is simply considered to be the canonical AI safety video. If you would trade off X minutes of watching the video in question for 1 minute of the average Robert Miles video, then the Qf factor is 1/X.</p><p>Normal values of this factor for relevant videos will be between 0.01 and 10.</p><h3><strong>Alignment of Message (Qm)</strong></h3><p>This factor refers to the message being sent relative to your values. This value ranges from -1 to 1, where a value of 1 is &#8220;this is the message I most want to get across to the viewers&#8221; and a value of -1 is &#8220;this is the exact opposite of the message I want to get across to viewers&#8221;. 
For example, suppose your most preferred message is &#8220;change your career to an AI safety career&#8221;. If a particular video portrays the message &#8220;pause AI&#8221;, which you prefer half as much, Qm for that video is 0.5; if it portrays the exact opposite message, &#8220;accelerate AI as fast as possible&#8221;, you&#8217;d give a value of -1.</p><p>Importantly, this is perhaps the most subjective of the three factors, and depends greatly on your values.</p><h3><strong>Results</strong></h3><p><a href="https://docs.google.com/spreadsheets/d/1vEVv2Tbezrkx5yGyxdtmaxjsz5ntjptUEfiPNsZZawc/edit?usp=sharing">Here</a> are my results:</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!q1hW!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5ecabdfb-0f78-4c26-b450-7d70a98c9f75_2702x706.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!q1hW!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5ecabdfb-0f78-4c26-b450-7d70a98c9f75_2702x706.png 424w, https://substackcdn.com/image/fetch/$s_!q1hW!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5ecabdfb-0f78-4c26-b450-7d70a98c9f75_2702x706.png 848w, https://substackcdn.com/image/fetch/$s_!q1hW!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5ecabdfb-0f78-4c26-b450-7d70a98c9f75_2702x706.png 1272w, https://substackcdn.com/image/fetch/$s_!q1hW!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5ecabdfb-0f78-4c26-b450-7d70a98c9f75_2702x706.png 
1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!q1hW!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5ecabdfb-0f78-4c26-b450-7d70a98c9f75_2702x706.png" width="1456" height="380" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/5ecabdfb-0f78-4c26-b450-7d70a98c9f75_2702x706.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:380,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:659266,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://manifund.substack.com/i/173411122?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5ecabdfb-0f78-4c26-b450-7d70a98c9f75_2702x706.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!q1hW!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5ecabdfb-0f78-4c26-b450-7d70a98c9f75_2702x706.png 424w, https://substackcdn.com/image/fetch/$s_!q1hW!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5ecabdfb-0f78-4c26-b450-7d70a98c9f75_2702x706.png 848w, https://substackcdn.com/image/fetch/$s_!q1hW!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5ecabdfb-0f78-4c26-b450-7d70a98c9f75_2702x706.png 1272w, 
https://substackcdn.com/image/fetch/$s_!q1hW!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5ecabdfb-0f78-4c26-b450-7d70a98c9f75_2702x706.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>Feel free to make a copy of the Google sheet, insert your own values, play around with it, and compare things, as you can with <a href="https://www.givewell.org/how-we-work/our-criteria/cost-effectiveness/cost-effectiveness-models/September-2023-version">Givewell&#8217;s spreadsheets</a>. 
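</p><p>For those who prefer code to spreadsheets, here is a rough sketch of how the factors described in this post might combine. The multiplicative form below is my own assumption for illustration, not necessarily the spreadsheet&#8217;s exact formula:</p>

```python
def qavm(viewer_minutes, qa=1.0, qf=1.0, qm=1.0):
    """Quality-adjusted viewer-minutes: raw viewer-minutes scaled by
    audience quality (Qa), fidelity (Qf), and message alignment (Qm).
    Multiplying the three factors is an illustrative assumption."""
    return viewer_minutes * qa * qf * qm

def qavm_per_dollar(viewer_minutes, dollars, **factors):
    """Cost-effectiveness: quality-adjusted viewer-minutes per dollar."""
    return qavm(viewer_minutes, **factors) / dollars

# Super Bowl comparison from the post: an $8M, 30-second ad reaching
# ~120M US viewers is 60M viewer-minutes, i.e. 7.5 VM/$; a 3x audience
# quality adjustment brings it to 22.5 QAVM/$ (the post rounds these
# to ~8 and ~24).
superbowl_vm = 120e6 * 0.5
raw = qavm_per_dollar(superbowl_vm, 8e6)               # 7.5
adjusted = qavm_per_dollar(superbowl_vm, 8e6, qa=3.0)  # 22.5
```

<p>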
I intend to update the master spreadsheet with info as I receive it (costs, viewer minutes, etc.), so I don&#8217;t recommend changing the non-subjective values. I&#8217;ve added comments in the Google Sheet to explain the estimates I made. I&#8217;m seeking recommendations for the best way to distinguish estimates from reported data, or any other suggestions.</p><h2><strong>Observations</strong></h2><p>The top cost-effectiveness comes from creators who monetize their content (AI Species and Cognitive Revolution) and from well-produced videos of typical YouTube length (5-30 minutes), not from long podcasts or short-form videos.</p><p>For one dollar, good AI safety YouTube channels generate on the order of 150-300 QAVM (about 2.5-5 hours).</p><p>Some things to compare this to:</p><ul><li><p>Global average income is about $12k/year, or about $6/hr. We could pay people this wage to watch videos; this would be 10 viewer-minutes per dollar.</p></li><li><p>We could pay to promote existing videos on YouTube. Promotion pricing works out to about $0.03 per ~15-second view, i.e. $0.12/minute, or about 8 VM/$.</p></li><li><p>We could pay to show video ads elsewhere (e.g., Super Bowl ads). A 30-second ad on Super Bowl LIX would have cost <a href="https://www.usatoday.com/story/sports/ad-meter/2025/02/09/how-much-does-super-bowl-commercial-cost-2025/78371882007/">$8M</a> and been shown to <a href="https://www.notion.so/How-cost-effective-are-AI-safety-Youtubers-26254492ea7a80578d00fb3eb6ae7619?pvs=21">120M</a> US people, or ~8 viewer-minutes per dollar; with perhaps a 3x audience quality adjustment for its primarily US viewers, you get ~24 QAVM/$.</p></li></ul><p>Of course, these comparisons don&#8217;t include the cost of video production itself.</p><p>Other notes from this exercise:</p><ol><li><p>Over the course of this exercise, I asked a lot of people, formally and informally, who they thought of as AI safety communicators, YouTubers, etc. 
Essentially, everyone said Robert Miles was the first who came to mind, and a few said they have careers in AI safety at least partly because of his videos. This led me to create the &#8220;audience quality&#8221; category and rate his audience much higher.</p></li><li><p>Here I am measuring average cost-effectiveness, though we probably want to measure at the margin. While that is a different exercise, the averages are still illuminating and serve as a benchmark to aim for.</p></li><li><p>Where is Dwarkesh? Austin and I argued about this for a while: Austin thinks Dwarkesh should be included as someone whose channel reaches important people, while I think that very few of Dwarkesh&#8217;s videos count as AI safety content. I informally surveyed a bunch of people by asking them to name AI safety YouTubers, and Dwarkesh&#8217;s name never came up; when mentioned, he was not considered an &#8220;AI safety YouTuber&#8221;.</p></li><li><p>Many creators wanted me to include their future growth. I think this is too subjective and would introduce a lot of bias. These view counts are a snapshot in time; when deciding what to fund, you should also consider growth rates and look to fund things that could reach a given cost-effectiveness bar.</p></li><li><p>I don&#8217;t think this post should cause large-scale shifts in what gets funded or what people do, but I do think cost-effectiveness is one of the things people should be looking at for the projects they pursue.</p></li></ol><h2><strong>How to help</strong></h2><p>The main thing I need is data, both metrics and costs, including the salaries of the people making these videos. A lot of this data is hard to come by; where it&#8217;s missing, I make estimates. 
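</p><p>For instance, when a channel shares no analytics and I only have public view counts, viewer-minutes can be estimated as views times video length times an assumed average watch fraction. A sketch of that estimate (the function and default are my own illustration):</p>

```python
def estimated_viewer_minutes(views, video_length_min, watch_fraction=0.33):
    """Estimate total viewer-minutes for a video from public view counts.

    watch_fraction is the assumed average share of the video watched per
    view; ~33% is the default used here when no analytics are available.
    """
    return views * video_length_min * watch_fraction

# e.g. a 10-minute video with 1,000,000 views at a 33% average watch
# time contributes roughly 3.3M viewer-minutes.
vm = estimated_viewer_minutes(1_000_000, 10)
```

<p>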
Thanks to all the people who responded to my emails and text messages and gave me data already!</p><p>For next steps, I am planning to expand this to all media of AI communication (podcasts, books, articles, signed letters, article readings, website visits), collect data on metrics and costs, and come up with quality adjustments. If you make content or work for an org that has data, please message me at <a href="mailto:marcus.s.abramovitch@gmail.com">marcus.s.abramovitch@gmail.com</a>.</p><p>The end goal remains to cross-compare different &#8220;interventions&#8221; in AI safety, like fieldbuilding (MATS), policy interventions (policy papers, lobbying efforts), and research (quality/quantity of papers). Stay tuned!</p><h2><strong>Appendix: Examples of Data Collection</strong></h2><h3><strong>Rob Miles</strong></h3><p>The <a href="https://donations.vipulnaik.com/">Donations List Website</a> indicates Rob has received ~$300k across 3 years from the LTFF. Since Rob has been making videos for ~10 years, I estimate ~$100k/year for a total of $1M. Using my scraper, I estimated ~33% average watch time per view, for ~31.6M viewer-minutes. I will update this as soon as Rob responds to me, since Rob is considered the canonical AI safety YouTuber.</p><h3><strong>AI Species (Drew Spartz)</strong></h3><p>Drew told me he spent $100k on all the videos on his channel; including his time (~1 year at $100k/year), $200k would be fair for total resources spent. I used my script to pull views for each video and initially assumed each view averaged about 50% of a full watch. I sent this to Drew, who told me 30-35% was more realistic, and then he gave me his raw data. This was very helpful since it allowed me to calibrate % watch time for various video lengths.</p><h3><strong>Rational Animations</strong></h3><p>I looked at Open Phil grants, which totaled $2.785M. FTX FF gave a further $400k. 
I then added ~10% for other funding from individuals or other grantmakers that didn&#8217;t show up in public records. I scraped their view counts from the YouTube API and used a 33% watch-through rate.</p><p>After this, Rational Animations confirmed to me that they in fact spent more and received more grants, which summed to $4,395,132.</p><h3><strong>AI in Context</strong></h3><p>Aric Floyd told me that 80k spent $50.75k on the video and that staff time on it summed to ~$75.2k, for a total of $126k. He also sent me his watch data for the video.</p><h3><strong>Cognitive Revolution</strong></h3><p>I asked Nathan, and he said I was approximately correct that $500k ($250k of which went to production and ~$250k paid to him as salary) is about what has been spent over the last couple of years. He also shared his YouTube data and suggested that he gets more engagement from audio-only. (Not to worry, Nathan, I&#8217;m analyzing this next. It&#8217;s just hard to get data.)</p><p>Cognitive Revolution is unique in that it has very substantial revenues from sponsorships and YouTube views. This somewhat breaks my formulas for cost-effectiveness, since donations aren&#8217;t required; in other words, Cognitive Revolution makes money. To be conservative, I&#8217;m valuing Nathan&#8217;s time a bit higher, given his experience: $250k on production plus $250k/year for his time. Over 2 years, that puts Cognitive Revolution&#8217;s total cost at $750k, against $500k in revenues, though I am very open to changing these numbers. 
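</p><p>One possible treatment (my assumption, not a settled methodology) is to compute cost-effectiveness against net spend, i.e. total cost minus revenue:</p>

```python
def qavm_per_net_dollar(qavm, total_cost, revenue=0.0):
    """Cost-effectiveness with revenue netted against cost.

    Netting is one possible way to handle monetized channels like
    Cognitive Revolution; it is an assumption, not a settled choice.
    """
    net_cost = total_cost - revenue
    if net_cost <= 0:
        raise ValueError("self-sustaining channel: QAVM per dollar is undefined")
    return qavm / net_cost

# With the figures above (~$750k total cost, ~$500k revenue), only
# ~$250k of net spend enters the denominator, tripling measured
# cost-effectiveness relative to dividing by gross cost.
```

<p>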
I don&#8217;t know the best way to treat Nathan&#8217;s podcast given all these variables.</p>]]></content:encoded></item><item><title><![CDATA[AI protests from around the world ]]></title><description><![CDATA[AI Protest Actions #1, a guest post by Rachel Shu]]></description><link>https://manifund.substack.com/p/ai-protests-from-around-the-world</link><guid isPermaLink="false">https://manifund.substack.com/p/ai-protests-from-around-the-world</guid><dc:creator><![CDATA[Rachel Shu]]></dc:creator><pubDate>Tue, 22 Jul 2025 16:17:45 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!8izV!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8fccd043-fa98-4c0b-af75-94a384e941bb_1000x750.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p><em>Austin here. You may be familiar with the protests led by <a href="https://manifund.org/projects/pauseai-us-2025-through-q2">Pause AI</a> and <a href="https://www.stopai.info/">Stop AI</a> &#8212; but did you know that there are many other such protests, worldwide? My colleague <a href="http://blog.rachelshu.com">Rachel Shu</a> compiled this overview of different protests against AI, and I was surprised by their breadth.</em></p><p><em>Personally, I&#8217;m skeptical that pausing AI would be good, and also don&#8217;t know whether these protests are effective for achieving that goal. But Manifund aims to be a neutral platform, supporting different viewpoints on how to do good, recognizing that we&#8217;re not always sure what is the right thing to do. In that spirit, I wanted to share this list of protests with you all. 
Enjoy!</em></p><div><hr></div><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!LGkf!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd0c1629b-fead-4642-b1a9-c34cdd88190f_1194x448.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!LGkf!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd0c1629b-fead-4642-b1a9-c34cdd88190f_1194x448.png 424w, https://substackcdn.com/image/fetch/$s_!LGkf!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd0c1629b-fead-4642-b1a9-c34cdd88190f_1194x448.png 848w, https://substackcdn.com/image/fetch/$s_!LGkf!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd0c1629b-fead-4642-b1a9-c34cdd88190f_1194x448.png 1272w, https://substackcdn.com/image/fetch/$s_!LGkf!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd0c1629b-fead-4642-b1a9-c34cdd88190f_1194x448.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!LGkf!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd0c1629b-fead-4642-b1a9-c34cdd88190f_1194x448.png" width="1194" height="448" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/d0c1629b-fead-4642-b1a9-c34cdd88190f_1194x448.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:448,&quot;width&quot;:1194,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!LGkf!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd0c1629b-fead-4642-b1a9-c34cdd88190f_1194x448.png 424w, https://substackcdn.com/image/fetch/$s_!LGkf!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd0c1629b-fead-4642-b1a9-c34cdd88190f_1194x448.png 848w, https://substackcdn.com/image/fetch/$s_!LGkf!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd0c1629b-fead-4642-b1a9-c34cdd88190f_1194x448.png 1272w, https://substackcdn.com/image/fetch/$s_!LGkf!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd0c1629b-fead-4642-b1a9-c34cdd88190f_1194x448.png 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" 
xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption"><em><a href="https://x.com/catehall/status/1941178384010313740">Cate Hall on X</a></em></figcaption></figure></div><h2>Why I&#8217;m writing this</h2><p>Like Cate, I (Rachel Shu) think it&#8217;s inevitable that public opposition to AI will continue to grow quickly over the next two years. In particular, I think there is a medium-sized-but-salient chance (33%?) of some flashpoint in the next two years during which millions of people worldwide will be angry enough to lend their voice to protests against increasingly powerful AI. Most likely this flashpoint will center around large-scale labor replacement, or possibly a mismanaged government surveillance initiative, rather than around existential risk.</p><p>When such flashpoints happen, they create political momentum that advocacy groups, labor unions, civil rights organizations, and other actors can mobilize to advance their agendas. 
The organizations best positioned to capitalize on such a moment will be those with existing infrastructure, clear policy demands, and the resources to scale up quickly. Mapping out these potential actors, and the actions they&#8217;re taking, will be essential for understanding, and potentially influencing, how such a movement develops and what it ultimately accomplishes.</p><h2>What this series will cover</h2><p>To build that awareness, this is the first in a series of posts tracking global protest movements against AI and AI companies, covering both protests concerned with existential risk from AGI and those driven by more immediately pressing concerns.</p><p>The scope of this series is specifically <strong>mass advocacy</strong>: demonstrations, strikes, phone banks, petitions, and the like; as opposed to higher-level/insider political work, which I don&#8217;t have any comparative advantage in explaining. The series will cover both actions specifically protesting AI and actions against AI organizations for reasons that aren&#8217;t centrally about AI. Each post will also devote some space to explaining the context and history of specific issues or players. 
Hopefully this roundup will be useful to policymakers, journalists, historians, and protest organizations themselves.</p><p>I&#8217;ll analyze some recent actions and the organizations behind them, and at the bottom I&#8217;ll list the upcoming actions and meetings I know of, for anyone interested in getting involved!</p><h2>Recent demonstrations (since June 1, 2025)</h2><h3>July 15, Pittsburgh, USA: Indivisible, Sunrise Movement, ACT UP &#8212; &#8220;Stop the Summit&#8221;</h3><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!fkG6!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa8aa4217-dab6-4e48-8c9a-2981dfcbaf67_1800x1198.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!fkG6!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa8aa4217-dab6-4e48-8c9a-2981dfcbaf67_1800x1198.png 424w, https://substackcdn.com/image/fetch/$s_!fkG6!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa8aa4217-dab6-4e48-8c9a-2981dfcbaf67_1800x1198.png 848w, https://substackcdn.com/image/fetch/$s_!fkG6!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa8aa4217-dab6-4e48-8c9a-2981dfcbaf67_1800x1198.png 1272w, https://substackcdn.com/image/fetch/$s_!fkG6!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa8aa4217-dab6-4e48-8c9a-2981dfcbaf67_1800x1198.png 1456w" sizes="100vw"><img 
src="https://substackcdn.com/image/fetch/$s_!fkG6!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa8aa4217-dab6-4e48-8c9a-2981dfcbaf67_1800x1198.png" width="1456" height="969" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/a8aa4217-dab6-4e48-8c9a-2981dfcbaf67_1800x1198.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:969,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!fkG6!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa8aa4217-dab6-4e48-8c9a-2981dfcbaf67_1800x1198.png 424w, https://substackcdn.com/image/fetch/$s_!fkG6!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa8aa4217-dab6-4e48-8c9a-2981dfcbaf67_1800x1198.png 848w, https://substackcdn.com/image/fetch/$s_!fkG6!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa8aa4217-dab6-4e48-8c9a-2981dfcbaf67_1800x1198.png 1272w, https://substackcdn.com/image/fetch/$s_!fkG6!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa8aa4217-dab6-4e48-8c9a-2981dfcbaf67_1800x1198.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft 
icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption"><em>The Act Up Pittsburgh protest moved through Oakland on Tuesday, July 15, 2025, ahead of President Donald Trump&#8217;s arrival at the Pennsylvania Energy and Innovation Summit at Carnegie Mellon University. 
(<a href="https://www.publicsource.org/trump-mccormick-visit-cmu-energy-innovation-summit-pittsburgh/">Photos by Caleb Kaufman / PublicSource</a>)</em></figcaption></figure></div><p>Hundreds of people attended a <a href="https://www.publicsource.org/trump-mccormick-visit-cmu-energy-innovation-summit-pittsburgh/">series of demonstrations</a>, <a href="https://www.wesa.fm/education/2025-07-14/cmu-pushback-ai-summit">jointly organized by various advocacy groups</a>, against the Pennsylvania Energy and Innovation Summit at CMU, at which President Trump gave a speech. Riot police were deployed to disperse the protests. The summit centered on the creation of AI datacenters in Pennsylvania, and in response the protests featured speeches on datacenter environmental impacts and on the perceived prioritization of the international AI arms race over local concerns. From the signage, the protests also appear to have been a fairly generalized anti-Trump, anti-corporate affair, rather than being specifically about investment in AI.</p><h3>July 14, Seattle, USA: Jewish Voice for Peace (JVP) &#8212; &#8220;Purge Palantir&#8221;</h3><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!DWBq!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F40fb1e3c-5b5d-415b-8ff4-d722bb74634e_900x506.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!DWBq!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F40fb1e3c-5b5d-415b-8ff4-d722bb74634e_900x506.jpeg 424w, 
https://substackcdn.com/image/fetch/$s_!DWBq!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F40fb1e3c-5b5d-415b-8ff4-d722bb74634e_900x506.jpeg 848w, https://substackcdn.com/image/fetch/$s_!DWBq!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F40fb1e3c-5b5d-415b-8ff4-d722bb74634e_900x506.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!DWBq!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F40fb1e3c-5b5d-415b-8ff4-d722bb74634e_900x506.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!DWBq!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F40fb1e3c-5b5d-415b-8ff4-d722bb74634e_900x506.jpeg" width="900" height="506" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/40fb1e3c-5b5d-415b-8ff4-d722bb74634e_900x506.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:506,&quot;width&quot;:900,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;Seattle protest Palantir...&quot;,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="Seattle protest Palantir..." title="Seattle protest Palantir..." 
srcset="https://substackcdn.com/image/fetch/$s_!DWBq!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F40fb1e3c-5b5d-415b-8ff4-d722bb74634e_900x506.jpeg 424w, https://substackcdn.com/image/fetch/$s_!DWBq!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F40fb1e3c-5b5d-415b-8ff4-d722bb74634e_900x506.jpeg 848w, https://substackcdn.com/image/fetch/$s_!DWBq!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F40fb1e3c-5b5d-415b-8ff4-d722bb74634e_900x506.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!DWBq!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F40fb1e3c-5b5d-415b-8ff4-d722bb74634e_900x506.jpeg 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" 
stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption"><em>A photo of protesters holding a sign targeting tech company Palantir outside of the Fairview Market Hall in South Lake Union. (<a href="https://mynorthwest.com/local/seattle-protest-palantir-gaza/4109926">Photo: Jason Rantz / KTTH</a>)</em></figcaption></figure></div><p>Jewish Voice for Peace Seattle <a href="https://mynorthwest.com/local/seattle-protest-palantir-gaza/4109926">staged a demonstration against Palantir&#8217;s Seattle headquarters</a>, and <a href="https://www.kiro7.com/news/local/jews-say-let-gaza-live-protesters-rally-against-palantir-over-alleged-role-gaza-conflict-ice/Q6GIYYFF7NEYFIQ54GGIDRTGRU/">about 120 people attended</a>. Other JVP chapters and sympathetic organizations staged concurrent protests at Palantir locations around the US, <a href="https://www.democracynow.org/2025/7/15/doge_20">including Denver, New York, and Palo Alto</a>, each of which drew dozens. JVP is an organization of Jewish-Americans who oppose Israeli actions in Palestine and support a ceasefire.</p><p>Palantir Technologies provides surveillance services to federal, military, and police organizations, including Immigration and Customs Enforcement (ICE) and the Israel Defense Forces (IDF). 
Given that these two organizations have been the targets of recent large demonstrations in the US, it&#8217;s unsurprising that Palantir is also under fire for supporting them.</p><p>Previously <a href="https://www.theguardian.com/us-news/2025/jun/26/trump-palantir-protest-arrests">six protestors were arrested at Palantir&#8217;s NYC offices</a> on June 26 during a protest organized by climate justice group <a href="https://www.instagram.com/p/DLatvbYJ5AD/">Planet Over Profit</a> and immigrant rights group <a href="https://www.instagram.com/p/DLVaARcA0Rp/?img_index=1">Mijente</a>.</p><h3>July 13, Mexico City, Mexico: Mexican Association of Commercial Announcements (AMELOC) &#8212; &#8220;Creative industry workers united for the urgent regulation of artificial intelligence&#8221;</h3><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!beRj!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff274e5db-419c-444c-b9d0-3c8e7b94394c_1280x720.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!beRj!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff274e5db-419c-444c-b9d0-3c8e7b94394c_1280x720.png 424w, https://substackcdn.com/image/fetch/$s_!beRj!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff274e5db-419c-444c-b9d0-3c8e7b94394c_1280x720.png 848w, https://substackcdn.com/image/fetch/$s_!beRj!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff274e5db-419c-444c-b9d0-3c8e7b94394c_1280x720.png 1272w, 
https://substackcdn.com/image/fetch/$s_!beRj!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff274e5db-419c-444c-b9d0-3c8e7b94394c_1280x720.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!beRj!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff274e5db-419c-444c-b9d0-3c8e7b94394c_1280x720.png" width="1280" height="720" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/f274e5db-419c-444c-b9d0-3c8e7b94394c_1280x720.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:720,&quot;width&quot;:1280,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!beRj!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff274e5db-419c-444c-b9d0-3c8e7b94394c_1280x720.png 424w, https://substackcdn.com/image/fetch/$s_!beRj!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff274e5db-419c-444c-b9d0-3c8e7b94394c_1280x720.png 848w, https://substackcdn.com/image/fetch/$s_!beRj!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff274e5db-419c-444c-b9d0-3c8e7b94394c_1280x720.png 1272w, 
https://substackcdn.com/image/fetch/$s_!beRj!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff274e5db-419c-444c-b9d0-3c8e7b94394c_1280x720.png 1456w" sizes="100vw" loading="lazy"></picture></div></a><figcaption class="image-caption"><em>Voice actors demonstrate in Mexico City demanding regulation of artificial intelligence in their industry (<a href="https://www.france24.com/en/live-news/20250714-mexican-voice-actors-demand-regulation-on-ai-voice-cloning">Photo: Carl de Souza / AFP</a>)</em></figcaption></figure></div><p><a 
href="https://www.bssnews.net/news/292063">Dozens</a> of media professionals attended <a href="https://www.france24.com/en/live-news/20250714-mexican-voice-actors-demand-regulation-on-ai-voice-cloning">voice actor protests</a> in Mexico City calling for the regulation of AI voice cloning. The inciting event was the unauthorized and unattributed use of a deceased voice actor&#8217;s voice by the National Electoral Institute, a Mexican public agency.</p><p>According to this <a href="https://www.instagram.com/p/DL8tel4tKsl/?hl=en&amp;img_index=1">Instagram post</a> from the organizer, the protesters are advocating for regulations now before the Legislative Assembly of Mexico City (a federal district), which would require consent, attribution, and royalties for the use of voice likenesses.</p><p>In 2023, SAG-AFTRA and the WGA led months-long strikes in the US in which AI use in media was a major theme. SAG-AFTRA subsequently led a second strike of voice actors over AI use in video games.</p><h3>July 8, Geneva, Switzerland: Boycott, Divestment, and Sanctions (BDS) &#8212; &#8220;End UN Partnership with Genocide-Enabling Tech in &#8216;AI for Good&#8217; Conference&#8221;</h3><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!QHWR!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffd81d41f-635d-43cc-9498-962322b0d00d_864x487.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!QHWR!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffd81d41f-635d-43cc-9498-962322b0d00d_864x487.png 424w, 
https://substackcdn.com/image/fetch/$s_!QHWR!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffd81d41f-635d-43cc-9498-962322b0d00d_864x487.png 848w, https://substackcdn.com/image/fetch/$s_!QHWR!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffd81d41f-635d-43cc-9498-962322b0d00d_864x487.png 1272w, https://substackcdn.com/image/fetch/$s_!QHWR!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffd81d41f-635d-43cc-9498-962322b0d00d_864x487.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!QHWR!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffd81d41f-635d-43cc-9498-962322b0d00d_864x487.png" width="864" height="487" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/fd81d41f-635d-43cc-9498-962322b0d00d_864x487.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:487,&quot;width&quot;:864,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!QHWR!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffd81d41f-635d-43cc-9498-962322b0d00d_864x487.png 424w, 
https://substackcdn.com/image/fetch/$s_!QHWR!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffd81d41f-635d-43cc-9498-962322b0d00d_864x487.png 848w, https://substackcdn.com/image/fetch/$s_!QHWR!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffd81d41f-635d-43cc-9498-962322b0d00d_864x487.png 1272w, https://substackcdn.com/image/fetch/$s_!QHWR!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffd81d41f-635d-43cc-9498-962322b0d00d_864x487.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line 
x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption"><em>Protesters outside the AI for Good Summit. (<a href="https://www.aa.com.tr/en/europe/protest-in-geneva-slams-ai-for-good-summit-over-big-techs-reported-role-in-gaza-war/3625487">Anadolu Agency</a>)</em></figcaption></figure></div><p>The <a href="https://aiforgood.itu.int/">AI for Good summit</a> is organized by several dozen UN agencies and sponsored by many major tech companies, many of which have partnerships with the Israeli military or civilian government. Israel fields one of the world's most technologically advanced militaries, and deploys AI tools for mass surveillance and precision strikes in Gaza. <a href="https://www.aa.com.tr/en/europe/protest-in-geneva-slams-ai-for-good-summit-over-big-techs-reported-role-in-gaza-war/3625487">Around 100 activists</a> gathered at Geneva's Broken Chair monument for this protest. Anadolu Agency quotes an activist as saying &#8220;The UN is delegitimizing itself with this conference, with hosting big tech at an AI for Good conference during an AI-powered genocide.&#8221;</p><h3>July 7, San Francisco, USA: Sunrise Movement Bay Area &#8212; &#8220;Billionaire Bailout Bill vs The People #bullytherich&#8221;</h3><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!PA4B!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Febe71d94-4410-408c-bb63-5fde4050f675_1440x1150.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!PA4B!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Febe71d94-4410-408c-bb63-5fde4050f675_1440x1150.png 424w, 
https://substackcdn.com/image/fetch/$s_!PA4B!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Febe71d94-4410-408c-bb63-5fde4050f675_1440x1150.png 848w, https://substackcdn.com/image/fetch/$s_!PA4B!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Febe71d94-4410-408c-bb63-5fde4050f675_1440x1150.png 1272w, https://substackcdn.com/image/fetch/$s_!PA4B!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Febe71d94-4410-408c-bb63-5fde4050f675_1440x1150.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!PA4B!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Febe71d94-4410-408c-bb63-5fde4050f675_1440x1150.png" width="1440" height="1150" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/ebe71d94-4410-408c-bb63-5fde4050f675_1440x1150.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1150,&quot;width&quot;:1440,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!PA4B!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Febe71d94-4410-408c-bb63-5fde4050f675_1440x1150.png 424w, 
https://substackcdn.com/image/fetch/$s_!PA4B!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Febe71d94-4410-408c-bb63-5fde4050f675_1440x1150.png 848w, https://substackcdn.com/image/fetch/$s_!PA4B!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Febe71d94-4410-408c-bb63-5fde4050f675_1440x1150.png 1272w, https://substackcdn.com/image/fetch/$s_!PA4B!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Febe71d94-4410-408c-bb63-5fde4050f675_1440x1150.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" 
y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption"><em>Protesters holding signs denouncing Sam Altman (<a href="https://www.instagram.com/p/DLyk0ZMtFH3/">Sunrise Movement Instagram</a>)</em></figcaption></figure></div><p>A small gathering of Sunrise Movement members <a href="https://abc7news.com/post/protesters-rally-outside-openai-ceo-sam-altmans-san-francisco-home-praising-president-donald-trump-big-beautiful-bill/17004473/">protested outside Sam Altman&#8217;s house</a>, in particular against his support of the Big Beautiful Bill. Sunrise Movement is nominally a climate-change-focused organization, but it runs campaigns for a wider range of left-coded causes, such as racial and economic justice. Minor themes of this action <a href="https://www.instagram.com/p/DLwgNrkOI7P/">included</a> AI datacenter energy use and AI-driven job replacement.</p><p>Another Sunrise Movement chapter also featured in the Pittsburgh protests linked above.</p><h3>July 2, USA: No Azure for Apartheid &#8212; &#8220;When a child dies in Gaza, we get paid for aiming the air strike&#8221;</h3><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!dY7i!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F739a0b94-dfee-4130-adbc-4753649caf9f_1400x933.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!dY7i!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F739a0b94-dfee-4130-adbc-4753649caf9f_1400x933.png 424w, 
https://substackcdn.com/image/fetch/$s_!dY7i!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F739a0b94-dfee-4130-adbc-4753649caf9f_1400x933.png 848w, https://substackcdn.com/image/fetch/$s_!dY7i!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F739a0b94-dfee-4130-adbc-4753649caf9f_1400x933.png 1272w, https://substackcdn.com/image/fetch/$s_!dY7i!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F739a0b94-dfee-4130-adbc-4753649caf9f_1400x933.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!dY7i!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F739a0b94-dfee-4130-adbc-4753649caf9f_1400x933.png" width="1400" height="933" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/739a0b94-dfee-4130-adbc-4753649caf9f_1400x933.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:933,&quot;width&quot;:1400,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!dY7i!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F739a0b94-dfee-4130-adbc-4753649caf9f_1400x933.png 424w, 
https://substackcdn.com/image/fetch/$s_!dY7i!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F739a0b94-dfee-4130-adbc-4753649caf9f_1400x933.png 848w, https://substackcdn.com/image/fetch/$s_!dY7i!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F739a0b94-dfee-4130-adbc-4753649caf9f_1400x933.png 1272w, https://substackcdn.com/image/fetch/$s_!dY7i!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F739a0b94-dfee-4130-adbc-4753649caf9f_1400x933.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line 
x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption"><em>Protesters waving Palestinian flags &amp; a sign that reads &#8220;Microsoft powers genocide&#8221; at a protest organized by No Azure for Apartheid (<a href="https://medium.com/@noazureforapartheid/when-a-child-dies-in-gaza-we-get-paid-for-aiming-the-air-strike-a-letter-to-all-microsoft-db87b1fbedc7">No Azure for Apartheid</a>)</em></figcaption></figure></div><p>No Azure for Apartheid <a href="https://medium.com/@noazureforapartheid/investigations-reveal-microsofts-active-role-in-israel-s-genocide-in-gaza-campaign-statement-0eb180eea6f4">claims</a> that Microsoft, especially through its Azure division and partnership with OpenAI, has deep investment in Israel; it seems that much of the Israeli military relies on Azure as a platform. No Azure for Apartheid, founded in October 2024, has staged a <a href="https://apnews.com/article/microsoft-protest-employees-fired-israel-gaza-50th-anniversary-c5b3715fa1800450b8d0f639b492495e">string of in-person</a> and <a href="https://medium.com/@noazureforapartheid/when-a-child-dies-in-gaza-we-get-paid-for-aiming-the-air-strike-a-letter-to-all-microsoft-db87b1fbedc7">online protests</a> against the sale of AI weaponry to Israel, and has broadly attempted to get Microsoft to divest from Israel. 
Microsoft workers taking part in these actions are more or less signing up to get fired.</p><h3>June 30, London, UK: Pause AI &#8212; &#8220;<strong>AI companies are less regulated than sandwich shops</strong>&#8221;</h3><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!8izV!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8fccd043-fa98-4c0b-af75-94a384e941bb_1000x750.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!8izV!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8fccd043-fa98-4c0b-af75-94a384e941bb_1000x750.png 424w, https://substackcdn.com/image/fetch/$s_!8izV!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8fccd043-fa98-4c0b-af75-94a384e941bb_1000x750.png 848w, https://substackcdn.com/image/fetch/$s_!8izV!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8fccd043-fa98-4c0b-af75-94a384e941bb_1000x750.png 1272w, https://substackcdn.com/image/fetch/$s_!8izV!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8fccd043-fa98-4c0b-af75-94a384e941bb_1000x750.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!8izV!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8fccd043-fa98-4c0b-af75-94a384e941bb_1000x750.png" width="1000" height="750" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/8fccd043-fa98-4c0b-af75-94a384e941bb_1000x750.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:750,&quot;width&quot;:1000,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!8izV!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8fccd043-fa98-4c0b-af75-94a384e941bb_1000x750.png 424w, https://substackcdn.com/image/fetch/$s_!8izV!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8fccd043-fa98-4c0b-af75-94a384e941bb_1000x750.png 848w, https://substackcdn.com/image/fetch/$s_!8izV!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8fccd043-fa98-4c0b-af75-94a384e941bb_1000x750.png 1272w, https://substackcdn.com/image/fetch/$s_!8izV!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8fccd043-fa98-4c0b-af75-94a384e941bb_1000x750.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path 
d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption"><em>Protesters demonstrate at London's King's Cross. (<a href="https://www.businessinsider.com/protesters-accuse-google-deepmind-breaking-promises-ai-safety-2025-6">Hugh Langley/Business Insider</a>)</em></figcaption></figure></div><p>Pause AI <a href="https://www.businessinsider.com/protesters-accuse-google-deepmind-breaking-promises-ai-safety-2025-6">led a protest</a> against Google DeepMind for releasing Gemini 2.5 Pro without honoring its previous commitments on how it would assess the risks of newly released AI models. The organization&#8217;s name can be misleading: it is not specifically in favor of an AI moratorium like the widely known 2023 petition of a similar name, but rather supports members who take a broader range of positions, from increased regulation to full cessation of AI development. 
About 60 people took part in this protest.</p><p>Pause AI has previously led protests against other frontier AI labs.</p><h3>June 12, Annecy, France: Belgian Association of Animation Authors and Creators (ABRACA) &#8212; &#8220;<strong>GenAI Seeks Not To Support Artists, But To Destroy Them</strong>&#8221;</h3><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!sye2!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5974935f-447c-4216-90e2-da53bef9aab2_681x383.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!sye2!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5974935f-447c-4216-90e2-da53bef9aab2_681x383.png 424w, https://substackcdn.com/image/fetch/$s_!sye2!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5974935f-447c-4216-90e2-da53bef9aab2_681x383.png 848w, https://substackcdn.com/image/fetch/$s_!sye2!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5974935f-447c-4216-90e2-da53bef9aab2_681x383.png 1272w, https://substackcdn.com/image/fetch/$s_!sye2!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5974935f-447c-4216-90e2-da53bef9aab2_681x383.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!sye2!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5974935f-447c-4216-90e2-da53bef9aab2_681x383.png" width="681" height="383" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/5974935f-447c-4216-90e2-da53bef9aab2_681x383.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:383,&quot;width&quot;:681,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!sye2!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5974935f-447c-4216-90e2-da53bef9aab2_681x383.png 424w, https://substackcdn.com/image/fetch/$s_!sye2!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5974935f-447c-4216-90e2-da53bef9aab2_681x383.png 848w, https://substackcdn.com/image/fetch/$s_!sye2!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5974935f-447c-4216-90e2-da53bef9aab2_681x383.png 1272w, https://substackcdn.com/image/fetch/$s_!sye2!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5974935f-447c-4216-90e2-da53bef9aab2_681x383.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path 
d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption"><em>Anti-Generative AI protest at the Annecy International Animation Film Festival (<a href="https://deadline.com/2025/06/annecy-ai-protest-animation-guilds-1236431752/">Melanie Goodfellow</a>)</em></figcaption></figure></div><p>Annecy hosts the world&#8217;s premier animation festival, which ran this year from June 8 to June 14. Animators are increasingly under pressure from advances in AI-generated video, and animation unions have been nearly unanimous in opposing the widespread deployment of genAI in the entertainment industry. 
This protest was <a href="https://deadline.com/2025/06/annecy-ai-protest-animation-guilds-1236431752/">led by ABRACA</a> (Animation workers union of Belgium) and supported by over 20 other European media and entertainment industry labor unions.</p><h3>May 22-July 1, USA: Grassroots advocacy against AI state regulation moratorium clause</h3><p>A federal moratorium on state AI laws was proposed by Ted Cruz, but ultimately removed from the One Big Beautiful Bill Act before it was signed into law. The Senate removed the provision in a 99-1 vote, enabling states to continue regulating AI. This has been <a href="https://thezvi.substack.com/p/ai-moratorium-stripped-from-bbb">covered better elsewhere</a>, and in my understanding was mostly a higher-level policy push rather than a grassroots effort, so I&#8217;ll just point at some advocacy groups that helped with the push by phone banking: <a href="https://x.com/pauseaius/status/1939144759354003594">Pause AI</a>, and <a href="https://www.instagram.com/p/DKw1sFnR_EF/?hl=en">Encode Justice</a>, among others.</p><h2>Upcoming demonstrations and actions</h2><p><em>A protest being listed here doesn&#8217;t constitute an endorsement. 
Contact the author if you want to be listed.</em></p><h3>July 25, San Francisco: Stop AI &#8212; &#8220;Close OpenAI!&#8221;</h3><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!DpEv!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4e04dca8-31a0-461c-98c4-3c95342e98b2_1200x1200.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!DpEv!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4e04dca8-31a0-461c-98c4-3c95342e98b2_1200x1200.png 424w, https://substackcdn.com/image/fetch/$s_!DpEv!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4e04dca8-31a0-461c-98c4-3c95342e98b2_1200x1200.png 848w, https://substackcdn.com/image/fetch/$s_!DpEv!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4e04dca8-31a0-461c-98c4-3c95342e98b2_1200x1200.png 1272w, https://substackcdn.com/image/fetch/$s_!DpEv!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4e04dca8-31a0-461c-98c4-3c95342e98b2_1200x1200.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!DpEv!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4e04dca8-31a0-461c-98c4-3c95342e98b2_1200x1200.png" width="1200" height="1200" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/4e04dca8-31a0-461c-98c4-3c95342e98b2_1200x1200.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1200,&quot;width&quot;:1200,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!DpEv!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4e04dca8-31a0-461c-98c4-3c95342e98b2_1200x1200.png 424w, https://substackcdn.com/image/fetch/$s_!DpEv!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4e04dca8-31a0-461c-98c4-3c95342e98b2_1200x1200.png 848w, https://substackcdn.com/image/fetch/$s_!DpEv!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4e04dca8-31a0-461c-98c4-3c95342e98b2_1200x1200.png 1272w, https://substackcdn.com/image/fetch/$s_!DpEv!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4e04dca8-31a0-461c-98c4-3c95342e98b2_1200x1200.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" 
xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption"><a href="https://x.com/StopAI_Info/status/1941785596848173168">https://x.com/StopAI_Info/status/1941785596848173168</a></figcaption></figure></div><p>Self-description:</p><blockquote><p>&#8220;OpenAI is threatening our jobs, our lives, and our loved ones' lives. JOIN THE FIGHT! 
[&#8230;] We demand that the US government shut down OpenAI, close any other company building AGI, and permanently ban the development of AGI.&#8221;</p></blockquote><h2>Upcoming publicly advertised meetings and other events</h2><h3>San Francisco, CA, US:</h3><p><strong>Stop AI:</strong> <a href="https://x.com/StopAI_Info/status/1943180791695184112">Bar meeting every Wednesday 6-8 pm at Kiitos Sports Bar in San Francisco</a>.</p><h3>Berkeley, CA, US:</h3><p><strong>Stop AI:</strong> <a href="https://x.com/StopAI_Info/status/1942093239072534885">Bar meeting every Sunday 6-8 pm at Raleigh's Pub in Berkeley</a>.</p><h3>Bay Area, CA, US:</h3><p><strong>Sunrise Movement:</strong> <a href="https://www.mobilize.us/sunrisemovement/event/798414/">Monthly virtual hub meetings.</a></p><div><hr></div><p>If you&#8217;d like to help with this effort, please comment, reach out in a private message, or message me on Signal: <strong>@wearsshoes.77.</strong> I&#8217;m looking both for tips about upcoming or ongoing actions and for advice on how to shape this series going forward!
I&#8217;m planning to publish this at a monthly cadence initially, faster when things start to pick up.</p><p>Thanks to Rachel Weinberg, Caithrin Rintoul, Ross Rheingans-Yoo, Gavriel Kleinwaks, Venki Kumar and others for assistance.</p>]]></content:encoded></item><item><title><![CDATA[Consider political giving for AI safety]]></title><description><![CDATA[AI policy and the unique advantages of individual donors]]></description><link>https://manifund.substack.com/p/consider-political-giving-for-ai</link><guid isPermaLink="false">https://manifund.substack.com/p/consider-political-giving-for-ai</guid><dc:creator><![CDATA[Erin Braid]]></dc:creator><pubDate>Wed, 16 Jul 2025 18:30:45 GMT</pubDate><enclosure url="https://substack-post-media.s3.amazonaws.com/public/images/2d24ed3e-64d7-4311-a1f3-3b195a420a91_1536x1024.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p><em>A guest post by <a href="https://www.linkedin.com/in/erinbraid/">Erin Braid</a></em></p><p>On June 25, U.S. Representative Raja Krishnamoorthi opened a Congressional committee meeting by talking about AGI.</p><p>&#8220;Basically,&#8221; Krishnamoorthi&#8217;s <a href="https://democrats-selectcommitteeontheccp.house.gov/media/press-releases/transcript-ranking-member-krishnamoorthis-opening-statement-hearing-algorithms">opening statement</a> explained, &#8220;it's AI that meets or exceeds human capabilities and can take action without human intervention. [&#8230;] Whether it's American AI or Chinese AI, it should not be released until we know it's safe. That's why I'm working on a new bill&#8212;the AGI Safety Act&#8212;that will require AGI to be aligned with human values and require it to comply with laws that apply to humans.&#8221;</p><p>This &#8220;AGI Safety Act&#8221; has not been formally introduced, and we don&#8217;t know what it might ultimately contain.
But Krishnamoorthi&#8217;s statement, <a href="https://peterwildeford.substack.com/p/congress-has-started-taking-agi-more">among others</a>, clearly shows that some members of Congress are open to talking about and legislating on AI safety. This could be a key moment for AI policy, and so it might also be a key moment for donors who want to support AI policy orgs, especially orgs that work on building relationships with members of Congress. In this post, I want to explain why individual donors are especially important to those orgs, and why large foundations can&#8217;t just fund all of the policy work that they think is important. If you are an American citizen interested in donating thousands or tens of thousands of dollars towards AI policy and governance, this opportunity might be for you! (The rest of us will just get a little insight into American politics and policy.)</p><h3>The fundraising grind</h3><p>Members of Congress have full schedules of committee hearings, briefings, floor votes, meetings with their constituents, meetings with other Congresspeople, etc, but their schedules are also shockingly full of fundraising. It&#8217;s been reported that sitting members of Congress spend as much as half of their work time fundraising. They&#8217;re on the phone for hours, going down long lists of potential supporters, calling them one after another to ask for donations. A 2016 <a href="https://www.cbsnews.com/news/60-minutes-are-members-of-congress-becoming-telemarketers/">CBS segment</a> got some horrifying quotes from sitting Representatives:</p><blockquote><p>&#8220;[Before 2010] I'd have to put in about an hour, maybe an hour and a half, at most, two hours a day into fundraising. And that's the way it went until 2010, when Citizens United was enacted. At that point, everything changed. And I had to increase that to two, three, sometimes four hours a day, depending on what was happening in the schedule.&#8221; - Rep. 
Steve Israel.</p><p>&#8220;Both parties have told newly elected members of the Congress that they should spend 30 hours a week in the Republican and Democratic call centers across the street from the Congress, dialing for dollars.&#8221; - Rep. Rick Nolan.</p></blockquote><p>As Rep. Nolan&#8217;s quote suggests, this extreme fundraising schedule isn&#8217;t purely an individual decision for each politician. Members of Congress are expected to raise large sums of money for their parties, not just their own campaigns. And these <a href="https://issueone.org/wp-content/uploads/2025/03/IssueOneAnotherLookPriceofPower032025.pdf">&#8220;party dues&#8221;</a> are higher if you are on the most powerful legislative committees &#8211; so, in practice, you won&#8217;t be assigned to the most powerful legislative committees unless you can raise significant amounts.</p><p>This all means that if you give money to a politician&#8217;s campaign, alongside whatever effect you may have on their (re-)election odds, you can save them a lot of time and grief, and even enable them to actually serve in their current elected office.</p><h3>Not all dollars are created equal</h3><p>Some donation dollars are much more useful to a candidate than others. Under the current laws governing federal elections, a donor can give a maximum of $3,500 per election directly to a candidate&#8217;s campaign. (State-level elections have <a href="https://documents.ncsl.org/wwwncsl/Elections/Contribution-Limits-to-Candidates-2023-2024.pdf">different rules in each state</a>, but 38 of the 50 states also have this kind of donation limit.) Political Action Committees (PACs) can give a bit more: a qualifying PAC can contribute up to $5,000 per election. 
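</p><p>To make the cited limits concrete, here is a rough sketch of the per-cycle arithmetic (it assumes the standard treatment of the primary and the general as separate elections, each with its own limit):</p>

```python
# Rough sketch using the figures cited above: $3,500 per election for an
# individual, $5,000 per election for a qualifying PAC. Primary and
# general count as separate elections, so a donor can give up to the
# limit twice in a two-year cycle.
INDIVIDUAL_LIMIT = 3_500   # per election, direct to a campaign
PAC_LIMIT = 5_000          # per election, from a qualifying PAC
ELECTIONS_PER_CYCLE = 2    # primary + general

def per_cycle(per_election_limit: int) -> int:
    """Maximum one donor can give a single candidate per cycle."""
    return per_election_limit * ELECTIONS_PER_CYCLE

print(per_cycle(INDIVIDUAL_LIMIT))  # individual donor
print(per_cycle(PAC_LIMIT))         # qualifying PAC
```

<p>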
Other entities, like corporations and 501(c)(3) nonprofits, can&#8217;t contribute to campaigns or to PACs at all.</p><p>These legally-limited contributions, from individuals and PACs directly to a campaign, are the most valuable form of support: the highly desirable &#8220;hard dollars&#8221;. Candidates are allowed to use this money for any legitimate campaign expense. <a href="https://www.opensecrets.org/campaign-expenditures#:~:text=Expenditures%202024%20Cycle">Most importantly</a>, they can run ads to reach and convince voters.</p><p>Famously, &#8220;super PACs&#8221; can raise and spend unlimited amounts of money. However, the candidates themselves don&#8217;t control this kind of spending. By law, candidates and their campaigns can&#8217;t be involved in the creation of a super PAC&#8217;s messages in any way. So from a politician&#8217;s or a party&#8217;s perspective, while they appreciate super PAC spending on their behalf, they would definitely prefer &#8220;hard dollars&#8221; that they can use to run their own ads with their own messages.</p><p>This means that a new donor who can give, say, $3,000, is much more valuable to a politician than an additional $3,000 from someone who has already given significant amounts. This in turn opens up an opportunity for smaller donors to collectively have an impact that&#8217;s not available to a single large donor. Networks of small donors, who care about an issue and are willing to coordinate, can be very influential.</p><h3>The power of coordination</h3><p>Donors who are more willing to coordinate will have an easier time building relationships with politicians and leading the discussion on a policy issue. For example, with no coordination, individual donors might each independently contribute to a campaign, and each independently tell the candidate that issue X is important to them. 
This approach leaves it up to the candidate and their team to proactively notice that many different voters and donors care about X; then, even if they do notice, there will be no obvious way to learn more about X and discuss possible next steps. But if donors work together, they can collectively host X-themed fundraisers; they can unite behind specific experts and spokespeople, who can then build ongoing relationships with politicians; they can pursue a strategic mix of supporting new candidates who would champion issue X if elected, and also advocating for issue X with more established politicians.</p><p>Examples of successful donor networks:</p><ul><li><p>OutGiving and the Gill Action Fund. The Gill Action Fund was founded in 2005 by Tim Gill to advocate for marriage equality and related LGBT issues. OutGiving, the network of LGBT and allied donors that Gill had founded a decade earlier, partnered with the Action Fund to direct their individual donations to pro-equality candidates and ballot measures. After same-sex marriage was legalized nationwide in 2015, and with a majority of Americans supporting marriage equality, the Gill Action Fund wound down its operations, and the OutGiving network moved away from direct political donations.</p></li><li><p>The American Israel Public Affairs Committee, or <a href="https://www.aipac.org/">AIPAC</a>, is a pro-Israel lobbying organization. It has a large pool of affiliated donors, including &#8220;club members&#8221; who commit to contributing at least $1,800 per year. These contributions are bundled and directed to candidates across the country. Until 2020, AIPAC hosted a large annual policy conference; past speakers include Presidents Trump, Biden, Obama, and Bush. AIPAC&#8217;s policy positions have been influential in Congress for decades.</p></li><li><p><a href="https://emilyslist.org/">EMILYs List</a> is a PAC that aims to elect pro-choice Democratic women to office. 
Each election cycle, the organization endorses candidates in local, state, and federal races across the country. Affiliated donors can support these candidates by giving to EMILYs List, which bundles the contributions and sends them to the campaigns. EMILYs List also recruits and trains potential candidates, and runs an affiliated super PAC called Women Vote.</p></li></ul><h3>Giving opportunities</h3><p>Concretely, I know of two AI policy orgs that would benefit from a network of individual donors concerned about AI risk. <a href="https://theaipn.org/">The AI Policy Network</a> (AIPN) advocates for policies to help the United States prepare for rapid advances in AI capabilities, focusing on national security and on maintaining human control of AI. <a href="https://ari.us/">Americans for Responsible Innovation</a> (ARI) advocates for policies in the public interest on a range of AI issues, including current harms (e.g. nonconsensual deepfakes), national security (e.g. enforcing export controls), and emerging risks (misuse and misalignment).</p><p>You can reach out to these orgs if you are interested in supporting their work or hearing about upcoming donation opportunities. Note that AIPN and ARI are both 501(c)(4) nonprofits, so donations to them, like donations to political campaigns, are not tax-deductible. ARI also has a 501(c)(3) sister organization, the Center for Responsible Innovation, which is <a href="https://www.founderspledge.com/research/center-for-responsible-innovation">recommended</a> by Founders Pledge.</p><h3>Some cautionary notes</h3><p>When thinking about whether you should make some of your donations as part of an AI policy donor network, consider:</p><ul><li><p>Some people who work on AI safety have worried that getting involved in politics will result in the idea of AI risk becoming (ahem) politicized. 
In particular, you might worry that AI safety will become associated with one party, and then the other party will be negatively polarized against the idea.</p></li><li><p>On the flip side, if a donor network works with politicians all across the political spectrum, that will probably include supporting politicians that you, personally, strongly disagree with on other issues. You may not be willing to do that! (You can still participate in a donor network even if you're not willing to support all the politicians they might work with, but the more politicians you put in that category, the less valuable your participation in the network becomes.)</p></li><li><p>Remember that the point of building relationships with politicians, talking to them about AI risks, and electing more people who care about this issue, is ultimately to write and enact helpful legislation. I would only recommend being part of an AI policy donor network if you agree that there are specific policy asks that would mitigate AI risks.</p></li></ul>]]></content:encoded></item><item><title><![CDATA[Announcing Manival]]></title><description><![CDATA[An LLM-powered grant evaluator]]></description><link>https://manifund.substack.com/p/announcing-manival</link><guid isPermaLink="false">https://manifund.substack.com/p/announcing-manival</guid><dc:creator><![CDATA[Lydia Nottingham]]></dc:creator><pubDate>Wed, 18 Jun 2025 18:09:53 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!-A0s!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc171ea3e-515e-473d-8490-ac0e1dea13ee_2819x1644.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p><em>Status: Something we&#8217;ve hacked on for a couple of weeks; looking to get feedback and iterate!</em></p><p>Last time, I wrote about some <a href="https://manifund.substack.com/p/givewell-for-ai-safety-lessons-learned">considerations</a> for AI safety grant evaluation, but 
didn&#8217;t actually ship a cost-effectiveness model. Since then, Austin, Nishad, and I have:</p><ul><li><p>Developed <a href="https://manivaluator.org/">Manival</a>, an LLM-powered grant evaluator</p></li><li><p>Demoed it to an audience at <a href="https://manifest.is/">Manifest</a></p></li><li><p>Written and applied <a href="https://docs.google.com/spreadsheets/d/1t4GkdnurnDAb8N_tO5Kl6RuEY83Nll7hA0FscezqGFU/edit?usp=sharing">our own grantmaking criteria</a>&#8212;we&#8217;ll see if Manival can replicate our taste </p></li></ul><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!-A0s!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc171ea3e-515e-473d-8490-ac0e1dea13ee_2819x1644.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!-A0s!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc171ea3e-515e-473d-8490-ac0e1dea13ee_2819x1644.png 424w, https://substackcdn.com/image/fetch/$s_!-A0s!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc171ea3e-515e-473d-8490-ac0e1dea13ee_2819x1644.png 848w, https://substackcdn.com/image/fetch/$s_!-A0s!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc171ea3e-515e-473d-8490-ac0e1dea13ee_2819x1644.png 1272w, https://substackcdn.com/image/fetch/$s_!-A0s!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc171ea3e-515e-473d-8490-ac0e1dea13ee_2819x1644.png 1456w" sizes="100vw"><img 
src="https://substackcdn.com/image/fetch/$s_!-A0s!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc171ea3e-515e-473d-8490-ac0e1dea13ee_2819x1644.png" width="1456" height="849" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/c171ea3e-515e-473d-8490-ac0e1dea13ee_2819x1644.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:849,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:392437,&quot;alt&quot;:&quot;&quot;,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://lydianottingham.substack.com/i/165895064?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc171ea3e-515e-473d-8490-ac0e1dea13ee_2819x1644.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" title="" srcset="https://substackcdn.com/image/fetch/$s_!-A0s!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc171ea3e-515e-473d-8490-ac0e1dea13ee_2819x1644.png 424w, https://substackcdn.com/image/fetch/$s_!-A0s!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc171ea3e-515e-473d-8490-ac0e1dea13ee_2819x1644.png 848w, https://substackcdn.com/image/fetch/$s_!-A0s!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc171ea3e-515e-473d-8490-ac0e1dea13ee_2819x1644.png 1272w, 
https://substackcdn.com/image/fetch/$s_!-A0s!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc171ea3e-515e-473d-8490-ac0e1dea13ee_2819x1644.png 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><h3>How Manival works</h3><p>This is effectively a form of structured &#8216;Deep Research&#8217;.</p><p>First, we specify the fields that matter to us when evaluating a grant. These might include &#8216;domain expertise of project leads&#8217; or &#8216;strength of project&#8217;s theory of change&#8217;. 
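</p><p>As a rough sketch (the field names, weights, and 0-10 scale below are hypothetical illustrations, not Manival&#8217;s actual schema), a config of weighted evaluation fields might look like:</p>

```python
# Hypothetical config: each evaluation field carries a weight, and
# per-field scores on a 0-10 scale are combined into one overall score.
FIELDS = {
    "domain_expertise_of_leads": 0.40,
    "theory_of_change_strength": 0.35,
    "track_record": 0.25,
}

def overall_score(field_scores: dict) -> float:
    """Weighted average of per-field scores (0-10 scale)."""
    assert abs(sum(FIELDS.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(w * field_scores[name] for name, w in FIELDS.items())

print(overall_score({
    "domain_expertise_of_leads": 8,
    "theory_of_change_strength": 6,
    "track_record": 7,
}))
```

<p>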
We have RAG-based &#8216;data fetchers&#8217; (Perplexity Sonar) scour the internet and return a score, with reasoning, for each of these fields. We then feed these into an LLM synthesizer (Claude Opus), which provides an overall evaluation.</p><p>This is a pretty janky LLM wrapper compensating for the lack of Deep Research API. We&#8217;re aware of various RAG and Deep Research alternatives, and expect our evaluations to improve as we plug better models in.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!cApP!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd376df32-91e9-4252-8e7d-19c315ccece1_870x1478.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!cApP!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd376df32-91e9-4252-8e7d-19c315ccece1_870x1478.png 424w, https://substackcdn.com/image/fetch/$s_!cApP!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd376df32-91e9-4252-8e7d-19c315ccece1_870x1478.png 848w, https://substackcdn.com/image/fetch/$s_!cApP!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd376df32-91e9-4252-8e7d-19c315ccece1_870x1478.png 1272w, https://substackcdn.com/image/fetch/$s_!cApP!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd376df32-91e9-4252-8e7d-19c315ccece1_870x1478.png 1456w" sizes="100vw"><img 
src="https://substackcdn.com/image/fetch/$s_!cApP!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd376df32-91e9-4252-8e7d-19c315ccece1_870x1478.png" width="870" height="1478" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/d376df32-91e9-4252-8e7d-19c315ccece1_870x1478.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1478,&quot;width&quot;:870,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:131939,&quot;alt&quot;:&quot;&quot;,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://lydianottingham.substack.com/i/165895064?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd376df32-91e9-4252-8e7d-19c315ccece1_870x1478.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" title="" srcset="https://substackcdn.com/image/fetch/$s_!cApP!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd376df32-91e9-4252-8e7d-19c315ccece1_870x1478.png 424w, https://substackcdn.com/image/fetch/$s_!cApP!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd376df32-91e9-4252-8e7d-19c315ccece1_870x1478.png 848w, https://substackcdn.com/image/fetch/$s_!cApP!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd376df32-91e9-4252-8e7d-19c315ccece1_870x1478.png 1272w, 
https://substackcdn.com/image/fetch/$s_!cApP!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd376df32-91e9-4252-8e7d-19c315ccece1_870x1478.png 1456w" sizes="100vw"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><h3>Customizing the criteria</h3><p>Different people have different ideas of what should go into a grant evaluation config. Austin cares deeply about how great a team is; I&#8217;d like mine to consider counterfactual uses of a team&#8217;s time.</p><p>With Manival, you can apply any grant evaluation criteria of your choosing (go to Configs &#8594; AI Generate). 
Here&#8217;s one we made just for fun:</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!MgNi!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1bf45dd9-52a5-4e80-b612-09c55fafc796_1374x831.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><img src="https://substackcdn.com/image/fetch/$s_!MgNi!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1bf45dd9-52a5-4e80-b612-09c55fafc796_1374x831.png" width="1374" height="831" class="sizing-normal" alt="" loading="lazy"></picture></div></a></figure></div><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!jiZz!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6d6a5f52-0e43-4657-baae-210509b02ec5_1871x1507.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><img src="https://substackcdn.com/image/fetch/$s_!jiZz!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6d6a5f52-0e43-4657-baae-210509b02ec5_1871x1507.png" width="1456" height="1173" class="sizing-normal" alt="" loading="lazy"></picture></div></a></figure></div><h3>What&#8217;s next for Manival?</h3><p>Manival has lots of potential uses. Here are some main ones:</p><ol><li><p><strong>Estimating marginal cost-effectiveness: </strong>We could write a config that estimates how much of a difference marginal $ &lt;x&gt; would make.</p></li><li><p><strong>Predicting impact market cap:</strong> Right now, our configs evaluate projects on a scale from 0 to 10. In the real world, project size varies: some established projects seek 6-7 figures like a &#8216;Series A&#8217;; others seek 4-5 figures in &#8216;seed&#8217; / &#8216;pre-seed&#8217; funding. Can we use Scott Alexander&#8217;s <a href="https://forum.effectivealtruism.org/posts/E7pkeDruknpSa7j3i/results-of-an-informal-survey-on-ai-grantmaking">impact valuations</a> to estimate a project&#8217;s &#8216;impact market cap&#8217;?</p></li><li><p><strong>Improving project proposals: </strong>Grant applicants can run their project proposal through Manival to understand what might need clarifying.</p></li><li><p><strong>Project comparison: </strong>We can use Manival to rank a category on Manifund, funnelling its most underrated projects to the top of your feed.</p></li><li><p><strong>Recommendations: </strong>We can use Manival to recommend new projects to grantmakers based on projects they&#8217;ve already supported.</p></li><li><p><strong>Solving the &#8216;adverse selection&#8217; / &#8216;funging&#8217; problems: </strong>Grantmakers can estimate how likely a project might be to get funding elsewhere, or better understand why it hasn&#8217;t been funded when that&#8217;s the case.</p></li></ol><p>It might be valuable to simulate how
other grantmakers you respect might evaluate a project when deciding whether to make a grant. For example, here&#8217;s a simulation of Joe Carlsmith&#8217;s thinking:</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!ysc_!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fda5b104f-536d-4d13-aa87-39fef28dd6be_2799x1088.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><img src="https://substackcdn.com/image/fetch/$s_!ysc_!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fda5b104f-536d-4d13-aa87-39fef28dd6be_2799x1088.png" width="1456" height="566" class="sizing-normal" alt="" loading="lazy"></picture></div></a></figure></div><p>These &#8216;simulated scores&#8217; might differ from how a grantmaker actually thinks. Accordingly, over the next week we plan to develop configs that are maximally faithful to <a href="https://docs.google.com/spreadsheets/d/1t4GkdnurnDAb8N_tO5Kl6RuEY83Nll7hA0FscezqGFU/edit?usp=sharing">our own thinking</a>.</p><p>For now, I expect a lot of Manival&#8217;s value to come from &#8216;flagging potentially great projects to look into&#8217;, rather than from people deferring to it.</p><p>We&#8217;re excited for you to try Manival, and eager to know what you think, especially if you&#8217;re a donor, grantmaker, or someone else who cares a lot about evaluating grant proposals.
<a href="https://calendly.com/manival/">Schedule a call</a> with us to chat this through, or let us know in the comments!</p>]]></content:encoded></item><item><title><![CDATA[‘GiveWell for AI Safety’: Lessons learned in a week]]></title><description><![CDATA[Finding effective giving opportunities: how can Manifund help?]]></description><link>https://manifund.substack.com/p/givewell-for-ai-safety-lessons-learned</link><guid isPermaLink="false">https://manifund.substack.com/p/givewell-for-ai-safety-lessons-learned</guid><dc:creator><![CDATA[Lydia Nottingham]]></dc:creator><pubDate>Fri, 30 May 2025 10:57:09 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!r9XR!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe7400232-aa67-465c-9896-5c1d8ae02a51_992x800.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p><em>Epistemic status: I spent ~20h thinking about this. If I were to spend 100+ h thinking about this, I expect I&#8217;d write quite different things. I was surprised to find early GiveWell &#8216;learned in public&#8217;: perhaps this is worth trying.</em></p><p>The premise: EA was founded on cost-effectiveness analysis&#8212;why not try this for AI safety, aside from all the obvious reasons&#185;? A good thing about early GiveWell was its transparency. Some wish OpenPhil were more transparent today. That sometimes seems hard, due to strategic or personnel constraints. Can Manifund play GiveWell&#8217;s role for AI safety&#8212;publishing rigorous, evidence-backed evaluations?</p><p>With that in mind, I set out to evaluate the cost-effectiveness of marginal donations to AI safety orgs&#178;. Since I was evaluating effective giving opportunities, I only looked at nonprofits&#179;.</p><p>I couldn&#8217;t evaluate all 50+ orgs in one go. An initial thought was to pick a category like &#8216;technical&#8217; or &#8216;governance&#8217; and narrow down from there.
This didn&#8217;t feel like the most natural division. What&#8217;s going on here?</p><p>I found it more meaningful to distinguish between &#8216;guarding&#8217; and &#8216;robustness&#8217;&#8308; work.</p><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!vXD4!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F12841baf-23a7-42b1-8e83-06ce75f3778c_1766x330.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><img src="https://substackcdn.com/image/fetch/$s_!vXD4!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F12841baf-23a7-42b1-8e83-06ce75f3778c_1766x330.png" width="1456" height="272" class="sizing-normal" alt="" fetchpriority="high"></picture></div></a></figure></div><p>Some reasons you might boost &#8216;guarding&#8217;:</p><ol><li><p>You think it can reliably get AI developers to handle &#8216;robustness&#8217;, and you think they can absorb
this responsibility well</p></li><li><p>You think &#8216;robustness&#8217; work is intractable, slow, or unlikely to be effective outside large AI companies</p></li><li><p>You prioritize introducing disinterested third-party audits</p></li><li><p>You think &#8216;guarding&#8217; buys time for &#8216;robustness&#8217; work</p></li></ol><p>Some reasons you might boost &#8216;robustness&#8217;:</p><ol><li><p>You want more groups than just the AI developers working on &#8216;robustness&#8217;</p></li><li><p>You think &#8216;guarding&#8217; work is unlikely to succeed or to be effective against advanced models, or is fragile to sociopolitical shifts</p></li><li><p>You prioritize accelerating alignment / differential technological progress</p></li><li><p>You think &#8216;robustness&#8217; work makes &#8216;guarding&#8217; efforts more effective</p></li></ol><p>Finally, a note on &#8216;robustness&#8217;. I don&#8217;t expect safety protocols that work on current models to generalize to more capable models without justification and concerted effort.
Accordingly, I think it makes sense to separately classify orgs whose theory of change (ToC) focuses on superintelligent systems.&#8309;</p><p>Doing so&#8212;and categorizing other helpful work as &#8216;facilitating&#8217;&#8212;we get a typology roughly as follows:</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!r9XR!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe7400232-aa67-465c-9896-5c1d8ae02a51_992x800.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><img src="https://substackcdn.com/image/fetch/$s_!r9XR!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe7400232-aa67-465c-9896-5c1d8ae02a51_992x800.png" width="992" height="800" class="sizing-normal" alt="" loading="lazy"></picture></div></a></figure></div><p>As you can see, &#8216;technical&#8217; and &#8216;governance&#8217; orgs fall all over the map. Some orgs are particularly hard to categorize&#8212;e.g. CHAI has outputs that plausibly fall in all four quadrants&#8212;so I&#8217;ve tried to focus on orgs with narrower remits. Redwood, GovAI, and RAND are placed solely on the basis of their stated agendas. Some work (like METR&#8217;s) facilitates work east or north of it. In this diagram, which quadrant or quadrant boundary an org belongs to carries meaning, but where it lies within that region does not. The horizontal axis may collapse depending on your beliefs.</p><p>I think this typology is very much open to contention. Its main point is to convey how I arrived at exploring the funding landscape for &#8216;robustness&#8217; orgs.
I&#8217;m interested in what effective giving prescribes for these orgs, particularly those tackling superintelligence risks, because (in roughly decreasing order):</p><ol><li><p>I think their work is less legible to outsiders than &#8216;guarding&#8217; or &#8216;facilitating&#8217; work.</p></li><li><p>I think evaluation of their work is comparatively neglected, given other funders&#8217; current foci.</p></li><li><p>I endorse differential technological development, with safety outpacing capabilities.</p></li><li><p>I&#8217;m concerned about the fragility of a guarding-only strategy, particularly in fast takeoff scenarios.</p><ol><li><p>I think having concrete plans for a &#8216;pause&#8217; may strengthen the case for one.</p></li></ol></li></ol><p>When I searched for nonprofits working on superintelligence (ASI) safety &#8216;robustness&#8217;, I got a list as follows:</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!bDAc!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9ffe4a28-fcd3-43cc-a88a-b85a0cfd02f0_1787x921.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><img src="https://substackcdn.com/image/fetch/$s_!bDAc!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9ffe4a28-fcd3-43cc-a88a-b85a0cfd02f0_1787x921.png" width="1456" height="750" class="sizing-normal" alt="" loading="lazy"></picture></div></a></figure></div><p>FAQ:</p><blockquote><p>Why is Redwood included despite being categorized as &#8216;prosaic focus&#8217;
above?</p></blockquote><p>They have several public stories of how their work might contribute to ASI alignment. I think more orgs should have public stories like this!</p><blockquote><p>Why is &lt;org&gt; included / excluded?</p></blockquote><p>I categorized based on what I could easily find online. This method reflects what an external donor might see. It&#8217;s also imperfect, and I probably missed some orgs. Corrections are appreciated, particularly if I missed orgs that are currently fundraising.</p><blockquote><p>Why is there so much theory?</p></blockquote><p>The amount of compute required for empirical ASI work favors for-profit hosts. Yes, it&#8217;s possible this is the work that matters most. Sought / upcoming post: a theory of what safety-relevant goods it&#8217;s nonprofit orgs&#8217; comparative advantage to provide.</p><p>&#8212;</p><p>Because we&#8217;re looking for effective giving opportunities, we want to focus on the &#8216;Actively seeking donations&#8217; bucket. (We also want to make it easier for orgs to share when they&#8217;re in that bucket!)</p><p>Of the six such orgs I identified, all but Orthogonal have sought funds on Manifund&#8310;. Indeed, I only identified Coordinal Research, Luthien, and SaferAI because of this. Most of these are small orgs (&lt;10 FTE). I don&#8217;t claim this list is exhaustive: if told of another org addressing ASI threat models, I&#8217;ll include it.</p><p>I&#8217;m pretty unsure how to rank the six highlighted orgs actively seeking donations. High-quality, transparent evaluation and comparison would help.</p><blockquote><p>Hang on, isn&#8217;t this OpenPhil&#8217;s job?</p></blockquote><p>A major concern of would-be donors is that if OpenPhil / Longview / SFF / LTFF / &#8230; chose not to fund / recommend these projects, they might not be worthwhile&#8311;. 
I&#8217;d like to dig into this assumption.</p><p>Will OpenPhil fund all worthwhile &#8216;direct work&#8217; orgs seeking six figures for operating expenses? I can see that they funded <a href="https://www.openphilanthropy.org/grants/timaeus-operating-expenses/">Timaeus</a>, several months after Manifund <a href="https://manifund.substack.com/p/reviewing-our-ai-safety-regrants">supported</a> Timaeus with an initial start-up grant. They seem to fund a decent amount of academic work, as does Longview.</p><p>As a prospective donor, I&#8217;d like to understand why OpenPhil has funded Timaeus but not the other orgs on this list (it&#8217;s possible the others haven&#8217;t applied). I&#8217;d also appreciate a fuller list of Longview&#8217;s grants: it&#8217;s hard for me to understand their decision-making from only <a href="https://www.longview.org/artificial-intelligence/">three featured AI grants</a>.</p><p>It&#8217;s unfortunate that OpenPhil doesn&#8217;t share more about their grantmaking process: the director of CAIP, an org denied OpenPhil funding, claims not to know &#8220;what criteria or measurement system they&#8217;re using.&#8221;&#8312;</p><p>If OpenPhil and Longview could share more of the reasoning behind their grantmaking, that could really help other donors make effective giving decisions. It could also help orgs reflect more critically on their own activities!</p><p>&#8212;</p><p>Manifund is well-positioned to help great new projects get off the ground, as shown in the case of Timaeus. It also has its own limitations&#8212;often the only information available to prospective donors is an org&#8217;s own funding call, and sometimes the fact they haven&#8217;t gotten funding elsewhere.</p><p>Over the course of this investigation, it became apparent to me that Manifund lacks the resources to evaluate every org manually. With that in mind, here are some feature ideas I have to help address the &#8216;adverse selection&#8217; issue. 
Manifund plans on implementing some of these &#8211; I&#8217;d also love for you, reader, to share what&#8217;s missing!</p><ol><li><p><strong>&#8216;What the experts say&#8217; sidebar:</strong></p></li></ol><ul><li><p>Explicitly invite expert commentary and highlight endorsements</p></li><li><p>Offer verified anonymous input channels via:</p><ul><li><p>Fully anonymous posts</p></li><li><p>Aggregated feedback (visible only to donors or the fundraising org)</p></li></ul></li></ul><ol start="2"><li><p><strong>One-click granular reacts (with optional elaboration):</strong></p><ul><li><p>&#8220;Right idea, wrong team&#8221;</p></li><li><p>&#8220;Best this falls, making space for something new&#8221;</p></li><li><p>&#8220;Low-probability, high-upside bet: support&#8221;</p></li><li><p>&#8220;Accelerates capabilities at least as much as alignment&#8221;</p></li><li><p>&#8230;</p></li></ul></li><li><p><strong>&#8216;Why Manifund [rather than / in addition to other funders]&#8217;</strong>:</p><ul><li><p>Invite orgs to explain why they haven&#8217;t found funding elsewhere</p></li><li><p>Invite other funders to publish their assessment criteria</p></li></ul></li><li><p><strong>Recommendations welcome!</strong></p></li></ol><p>I hope this will make it easier for donors to find the best giving opportunities, and for great projects to get funded.</p><p><em>Thanks to Austin Chen, Justis Mills, and Nick Marsh for feedback.</em></p><p>Some &#8216;robustness&#8217; orgs currently fundraising: <a href="https://manifund.org/projects/research-staff-for-ai-safety-research-projects">CAIS</a>, <a href="https://manifund.org/projects/coordinal-research-accelerating-the-research-of-safely-deploying-ai-systems">Coordinal</a>, <a href="https://manifund.org/projects/luthien">Luthien</a>, <a href="https://manifund.org/projects/mathematical-theory-of-bounded-learning-agents">CORAL</a>.</p><p>Some &#8216;guarding&#8217; orgs currently fundraising: <a 
href="https://manifund.org/projects/general-support-for-saferai">SaferAI</a>, <a href="https://manifund.org/projects/support-caips-3-month-project-on-reducing-chem-bio-ai-risk-">CAIP</a>.</p><p>Some &#8216;facilitating&#8217; orgs currently fundraising: <a href="https://manifund.org/projects/asterisk-ai-blogging-fellowship">Asterisk</a>, <a href="https://manifund.org/projects/ai-forecasting-and-policy-research-by-the-ai-2027-team">AIFP</a>, <a href="https://manifund.org/projects/keep-apart-research-going-global-ai-safety-research--talent-pipeline">Apart</a>, <a href="https://manifund.org/projects/human-intelligence-amplification--berkeley-genomics-project">BGP</a>.</p><p>&#185;Long feedback loops, high downside risks, to name a few.</p><p>&#178;Some others&#8217; work on this topic: <a href="https://www.openphilanthropy.org/request-for-proposals-ai-governance/#id-3-criteria-by-which-applications-will-be-assessed">OpenPhil</a> (criteria only), <a href="https://www.alignmentforum.org/posts/C4tR3BEpuWviT7Sje/2021-ai-alignment-literature-review-and-charity-comparison">Larks</a> (org-by-org), <a href="https://www.longview.org/grantmaking/">Longview</a> (criteria only), <a href="https://thezvi.substack.com/p/the-big-nonprofits-post">Zvi Mowshowitz</a> (org-by-org), <a href="https://80000hours.org/2025/01/it-looks-like-there-are-some-good-funding-opportunities-in-ai-safety-right-now/">Ben Todd</a> (spotlighting orgs).</p><p>&#179;This is an obvious limitation: lots of safety work happens within for-profit companies. Sought / upcoming post: a theory of what safety-relevant goods it&#8217;s nonprofit orgs&#8217; comparative advantage to provide.</p><p>&#8308;&#8216;Robustness&#8217; is an overloaded term. But I wanted to describe &#8216;making the underlying systems safer&#8217;: calling this &#8216;solutions&#8217; would&#8217;ve been a disservice to &#8216;guarding&#8217;. 
Tough semantics.</p><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!uE0r!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F04f3dfae-645d-4cde-b018-8ad81634d681_1083x744.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!uE0r!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F04f3dfae-645d-4cde-b018-8ad81634d681_1083x744.png 424w, https://substackcdn.com/image/fetch/$s_!uE0r!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F04f3dfae-645d-4cde-b018-8ad81634d681_1083x744.png 848w, https://substackcdn.com/image/fetch/$s_!uE0r!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F04f3dfae-645d-4cde-b018-8ad81634d681_1083x744.png 1272w, https://substackcdn.com/image/fetch/$s_!uE0r!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F04f3dfae-645d-4cde-b018-8ad81634d681_1083x744.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!uE0r!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F04f3dfae-645d-4cde-b018-8ad81634d681_1083x744.png" width="274" height="188.23268698060943" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/04f3dfae-645d-4cde-b018-8ad81634d681_1083x744.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:744,&quot;width&quot;:1083,&quot;resizeWidth&quot;:274,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!uE0r!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F04f3dfae-645d-4cde-b018-8ad81634d681_1083x744.png 424w, https://substackcdn.com/image/fetch/$s_!uE0r!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F04f3dfae-645d-4cde-b018-8ad81634d681_1083x744.png 848w, https://substackcdn.com/image/fetch/$s_!uE0r!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F04f3dfae-645d-4cde-b018-8ad81634d681_1083x744.png 1272w, https://substackcdn.com/image/fetch/$s_!uE0r!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F04f3dfae-645d-4cde-b018-8ad81634d681_1083x744.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a></figure></div><p>I call &#8216;guarding&#8217; everything outside the blue line, which includes white-box evals, black-box evals, audits&#8230;whereas the name might imply everything outside the red line, which includes control and hardware mechanisms. 
I think my classification makes sense, because some safety/control mechanisms may be either &#8216;baked into&#8217; a model or &#8216;external&#8217; to it. Everything within the blue line could be shown to an evaluator to argue that a model is safe.</p><p>&#8309;I categorized based on what I could easily find online. This method reflects what an external donor might see. It&#8217;s also imperfect, and I welcome corrections.</p><p>&#8310;I tried to minimize selection effects, but I expect I missed at least one org, which makes this statement weaker.</p><p>&#8311;Thanks, <a href="https://forum.effectivealtruism.org/posts/qdKhLcJmGQuYmzBoz/larks-s-shortform">Larks</a>, <a href="https://forum.effectivealtruism.org/posts/9uZHnEkhXZjWzia7F/please-donate-to-caip-post-1-of-3-on-ai-governance?commentId=KsHptFHawnsCfHEkn">Nanda</a>, and <a href="https://forum.effectivealtruism.org/posts/sWMwGNgpzPn7X9oSk/select-examples-of-adverse-selection-in-longtermist">Linch</a> for articulating the &#8216;adverse selection&#8217; problem.</p><p>&#8312;<a href="https://forum.effectivealtruism.org/posts/9uZHnEkhXZjWzia7F/please-donate-to-caip-post-1-of-6-on-ai-governance?commentId=kw5eGmfgaBvNQonZw">Source</a></p>]]></content:encoded></item><item><title><![CDATA[What makes a good "regrant"?]]></title><description><![CDATA[Reviewing some of our favorite AI safety regrants - and some less good fits]]></description><link>https://manifund.substack.com/p/reviewing-our-ai-safety-regrants</link><guid isPermaLink="false">https://manifund.substack.com/p/reviewing-our-ai-safety-regrants</guid><dc:creator><![CDATA[Jesse]]></dc:creator><pubDate>Thu, 24 Apr 2025 17:07:25 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!K9Gn!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0e9b110f-e409-4c2b-b63b-bf589828f784_1920x545.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>AI moves at a crazy pace. 
To keep up, AI safety needs to move fast, and that means funding has to move with it. Take <a href="https://timaeus.co/">Timaeus</a>: our regrantors weren&#8217;t their biggest funders, but they were <em>first,</em> allowing Timaeus to start months earlier &#8212; and in a world of exponential curves, mere months matter.</p><p>This is why Manifund runs our regranting program: to quickly fund promising projects by delegating budgets of $50K-$400K to experts. As we&#8217;ve <a href="https://manifund.substack.com/p/manifund-2025-regrants">just announced our 2025 regrantors</a>, now is a good time to review past regrants &#8212; some we think were great, and others that weren&#8217;t such good fits.</p><p>We think our regranting program is great for donors who want to seed ambitious new projects, care about moving fast, and appreciate transparency. If you want to fund our 2025 program, please <a href="mailto:austin@manifund.org">contact us</a>!</p><p><em>About the author: Jesse Richardson recently joined Manifund after working at Mila - Quebec AI Institute, and also has a background in <a href="https://manifold.markets/Austin/will-i-regret-investing-150k-into-j">trading on prediction markets</a>. This post is basically Jesse&#8217;s low-confidence hot takes.</em></p><h2>Three awesome AI Safety regrants&#8230;</h2><p>What makes a great regrant? We look for early-stage projects that need quick funding, opportunities OpenPhil might miss, and chances to leverage our regrantors&#8217; unique expertise. 
Some of our favorites are:</p><ul><li><p><strong><a href="https://manifund.org/projects/scoping-developmental-interpretability-xg55b33wsfc">Scoping Developmental Interpretability</a></strong> to Jesse Hoogland - the first funding for Timaeus, accelerating its research by months</p></li><li><p><strong><a href="https://manifund.org/projects/support-for-deep-coverage-of-china-and-ai">Support for deep coverage of China and AI</a></strong> to ChinaTalk - reporting on DeepSeek, ahead of the curve</p></li><li><p><strong><a href="https://manifund.org/projects/shallow-review-of-ai-safety-2024">Shallow review of AI safety 2024</a></strong> to Gavin Leech - quick regrants inducing further funding from OpenPhil &amp; others</p></li></ul><h3><a href="https://manifund.org/projects/scoping-developmental-interpretability-xg55b33wsfc">Scoping Developmental Interpretability</a> - the first funding for Timaeus, accelerating research by months</h3><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!K9Gn!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0e9b110f-e409-4c2b-b63b-bf589828f784_1920x545.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!K9Gn!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0e9b110f-e409-4c2b-b63b-bf589828f784_1920x545.png 424w, https://substackcdn.com/image/fetch/$s_!K9Gn!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0e9b110f-e409-4c2b-b63b-bf589828f784_1920x545.png 848w, 
https://substackcdn.com/image/fetch/$s_!K9Gn!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0e9b110f-e409-4c2b-b63b-bf589828f784_1920x545.png 1272w, https://substackcdn.com/image/fetch/$s_!K9Gn!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0e9b110f-e409-4c2b-b63b-bf589828f784_1920x545.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!K9Gn!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0e9b110f-e409-4c2b-b63b-bf589828f784_1920x545.png" width="1456" height="413" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/0e9b110f-e409-4c2b-b63b-bf589828f784_1920x545.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:413,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:776027,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://manifund.substack.com/i/161941039?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0e9b110f-e409-4c2b-b63b-bf589828f784_1920x545.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!K9Gn!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0e9b110f-e409-4c2b-b63b-bf589828f784_1920x545.png 424w, 
https://substackcdn.com/image/fetch/$s_!K9Gn!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0e9b110f-e409-4c2b-b63b-bf589828f784_1920x545.png 848w, https://substackcdn.com/image/fetch/$s_!K9Gn!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0e9b110f-e409-4c2b-b63b-bf589828f784_1920x545.png 1272w, https://substackcdn.com/image/fetch/$s_!K9Gn!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0e9b110f-e409-4c2b-b63b-bf589828f784_1920x545.png 1456w" sizes="100vw" fetchpriority="high"></picture></div></a></figure></div><p>This regrant was made in late 2023 to Jesse Hoogland and the rest of what is now the <a href="https://timaeus.co/">Timaeus</a> team, for the purposes of exploring Developmental Interpretability (DevInterp) as a new AI alignment research agenda. Four regrantors made regrants to this project totalling $143,200: Evan Hubinger as the main funder, alongside Rachel Weinberg, Marcus Abramovitch and Ryan Kidd. Evan had previously mentored Jesse Hoogland as part of MATS, and therefore had additional context about the value of funding Hoogland&#8217;s future research. This is the sweet spot for regranting: donors may have the public information that Evan Hubinger is an expert who does good work, but regranting allows them to leverage his private information about other valuable projects, such as DevInterp.</p><p>Regarding the grant itself: success for this project looked like determining whether DevInterp was a viable agenda to move forward with, rather than producing seminal research outputs. 
I recommend <a href="https://timaeus.co/research">reading</a> <a href="https://www.lesswrong.com/s/SfFQE8DXbgkjk62JK/p/TjaeCWvLZtEDAS5Ex">more</a> about DevInterp if you&#8217;re interested, but my shallow understanding is that it aims to use insights from <a href="https://www.lesswrong.com/posts/fovfuFdpuEwQzJu2w/neural-networks-generalize-because-of-this-one-weird-trick">Singular Learning Theory (SLT)</a> to make progress on AI alignment through interpretability, focusing on how phase transitions in the training process lead to internal structure in neural networks.</p><p>I&#8217;m not well placed to form an inside view on how likely DevInterp was/is to succeed, but this proposed research agenda had numerous things going for it:</p><ul><li><p>it was <strong>novel</strong>; the application of SLT to alignment was largely unexplored prior to this work,</p></li><li><p>it seemed to be <strong>well thought out</strong>; the LessWrong write-up included plenty of detail about why we might expect phase transitions to be a big deal and how this would relate to alignment, as well as a solid six-month plan,</p></li><li><p>it had an element of <strong>&#8220;big if true&#8221;</strong>, i.e. it may be unlikely that the strong version of the DevInterp thesis is true, but this research has potential to make meaningful progress on AI alignment if it is.</p></li></ul><p>These are all markers of projects I am excited to see funded through Manifund regranting. In addition to the agenda itself, I also think this was a good team to bet on for this kind of work; they seem capable and have relevant experience, e.g. ML research and running the 2023 SLT &amp; Alignment Summit.</p><p>This regrant is a strong example of where Manifund&#8217;s regranting program can have the biggest impact: being early to support new projects &amp; organizations, and thereby providing strong signals to other funders as well as some runway for these organizations to move quickly. 
In this case, Manifund&#8217;s early funding helped Hoogland&#8217;s team get off the ground, and they subsequently started a new organization (Timaeus) and received significantly more funding from other sources, such as $500,000 from the Survival &amp; Flourishing Fund. It&#8217;s probable that they would&#8217;ve gotten this other funding regardless, but not guaranteed, and I&#8217;m happy that Manifund helped bring Timaeus into existence several months sooner and with increased financial security. Hoogland notes:</p><blockquote><p><em>Getting early support from Manifund made a real difference for us. This was the first funding we received for research and meant that we could start months earlier than we otherwise would have. The fact that it was public meant other funders could easily see who was backing our work and why. That transparency helped us build momentum and credibility for developmental interpretability research when it was still a new idea. I'm pretty sure it played a significant role in us securing later funding through SFF and other grantmakers.</em></p></blockquote><p>In terms of concrete outcomes, there&#8217;s a lot to be happy with here. Timaeus and its collaborators have published <a href="https://www.lesswrong.com/posts/gGAXSfQaiGBCwBJH5/timaeus-in-2024">numerous papers on DevInterp</a> since this regrant was made, and it seems that DevInterp&#8217;s key insight around the existence and significance of phase transitions has been validated. My sense is that the question of whether DevInterp is a worthwhile alignment research agenda to pursue has been successfully answered in the affirmative. 
It&#8217;s also nice to see strong outreach and engagement with the research community on the part of Timaeus: November 2023 saw the first DevInterp conference, and they&#8217;ve given talks at OpenAI, Anthropic, and DeepMind.</p><h3><a href="https://manifund.org/projects/support-for-deep-coverage-of-china-and-ai">Support for Deep Coverage of China and AI</a> - reporting on DeepSeek, ahead of the curve</h3><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!J4eM!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe9f61e95-457c-47d5-903a-1c64ee4c2b50_1500x735.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!J4eM!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe9f61e95-457c-47d5-903a-1c64ee4c2b50_1500x735.png 424w, https://substackcdn.com/image/fetch/$s_!J4eM!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe9f61e95-457c-47d5-903a-1c64ee4c2b50_1500x735.png 848w, https://substackcdn.com/image/fetch/$s_!J4eM!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe9f61e95-457c-47d5-903a-1c64ee4c2b50_1500x735.png 1272w, https://substackcdn.com/image/fetch/$s_!J4eM!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe9f61e95-457c-47d5-903a-1c64ee4c2b50_1500x735.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!J4eM!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe9f61e95-457c-47d5-903a-1c64ee4c2b50_1500x735.png" 
width="1456" height="713" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/e9f61e95-457c-47d5-903a-1c64ee4c2b50_1500x735.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:713,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;ChinaTalk&quot;,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="ChinaTalk" title="ChinaTalk" srcset="https://substackcdn.com/image/fetch/$s_!J4eM!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe9f61e95-457c-47d5-903a-1c64ee4c2b50_1500x735.png 424w, https://substackcdn.com/image/fetch/$s_!J4eM!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe9f61e95-457c-47d5-903a-1c64ee4c2b50_1500x735.png 848w, https://substackcdn.com/image/fetch/$s_!J4eM!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe9f61e95-457c-47d5-903a-1c64ee4c2b50_1500x735.png 1272w, https://substackcdn.com/image/fetch/$s_!J4eM!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe9f61e95-457c-47d5-903a-1c64ee4c2b50_1500x735.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" 
stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>In 2023 &amp; 2024, Manifund regrantors Joel Becker and Evan Hubinger granted a total of $37,000 to <a href="https://www.chinatalk.info/">ChinaTalk</a>, a newsletter and podcast covering China, technology, and US-China relations. ChinaTalk has over 50,000 subscribers and is also notable for the quality of its coverage and the <a href="https://www.chinatalk.media/about">praise and attention</a> it receives from elites and policymakers.</p><p>Before this regrant, ChinaTalk had been run by Jordan Schneider and Caithrin Rintoul, both part-time, on a budget of just $35,000/year. What they were able to accomplish in that time with those limited resources was impressive, and I believe merited additional funding, even just to allow Jordan to work on this full-time. 
More funding would also have meant ChinaTalk bringing on a full-time fellow who, per Jordan, &#8220;<em>would be, to my knowledge, the only researcher in the English-speaking world devoted solely to covering China and AI safety.&#8221;</em> ChinaTalk has since received further funding and is in the process of growing to five full-time employees, but we would&#8217;ve loved for this to happen sooner through an expanded regranting program.</p><p>Even putting aside the specific track record of ChinaTalk, it seems clear to me that the intersection of China and AI safety is an incredibly important area to cover, and at a high level it is valuable to fund organizations that are doing this kind of work. It can be hard to imagine plausible scenarios of how the next decade goes well with respect to AI that don&#8217;t run through US-China relations, and I am persuaded by Jordan&#8217;s case that the amount of energy currently being expended on this is grossly inadequate.</p><p>Since the first regrant, ChinaTalk&#8217;s Substack audience has grown from 26,000 subscribers to 51,000 and they&#8217;ve put out regular high-quality content, including an English translation of an <a href="https://www.chinatalk.media/p/deepseek-ceo-interview-with-chinas">interview with DeepSeek CEO Liang Wenfeng</a>, coverage of <a href="https://www.chinatalk.media/p/taiwan-vs-us-chip-subsidies-bolstering">chip policy</a>, and what important 2024 elections in the <a href="https://www.chinatalk.media/p/tim-walz-on-china">US</a> and <a href="https://www.chinatalk.media/p/taiwan-election-results-how-lai-won">Taiwan</a> mean for China. The ChinaTalk team has <a href="https://www.chinatalk.info/team">expanded</a> to six people, allowing for a greater diversity and quantity of coverage, including YouTube videos. 
Jordan has also <a href="https://www.chinatalk.media/p/whats-next-for-chinatalk">announced plans</a> to launch a think tank&#8212;ChinaTalk Institute&#8212;this year, in a similar vein to IFP.</p><p>Among their varied coverage, I was particularly impressed to see how ChinaTalk was ahead of the curve in covering the rise of DeepSeek, while most of the West seemed to be taken by total surprise in January 2025. As a trader and forecaster, I suspect that advance insight could have been worth a lot of money to me in anticipating the market freakout, which suggests I should pay more attention to ChinaTalk in the future.</p><p>ChinaTalk has continued on the strong trajectory it was on in late 2023, and it was great that Manifund was able to support it in this success. For more information about why this grant was likely good ex ante, I encourage you to look at regrantor Joel Becker&#8217;s <a href="https://manifund.org/projects/support-for-deep-coverage-of-china-and-ai?tab=comments#0ac8a37d-16bf-e616-ab8c-349bc9dfaeb3">comment</a> on the subject. Joel&#8217;s explanation of why ChinaTalk was, at the time, insufficiently funded</p><blockquote><p><em>Philanthropists are scared to touch China, in part because of lack of expertise and in part for political reasons. 
Advertisers can be nervous for similar reasons&#8230; Jordan was hoping to support this work through subscriptions only.</em></p></blockquote><p>makes me more optimistic that this regrant was the kind of thing the program should be doing: plugging holes in the funding landscape.</p><h3><a href="https://manifund.org/projects/shallow-review-of-ai-safety-2024">Shallow review of AI safety 2024</a> - quick regrants, nudging OpenPhil &amp; others to donate</h3><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!tppE!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb290acb1-392b-4210-ba8c-b3f9ee2e1f7f_1286x433.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!tppE!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb290acb1-392b-4210-ba8c-b3f9ee2e1f7f_1286x433.png 424w, https://substackcdn.com/image/fetch/$s_!tppE!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb290acb1-392b-4210-ba8c-b3f9ee2e1f7f_1286x433.png 848w, https://substackcdn.com/image/fetch/$s_!tppE!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb290acb1-392b-4210-ba8c-b3f9ee2e1f7f_1286x433.png 1272w, https://substackcdn.com/image/fetch/$s_!tppE!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb290acb1-392b-4210-ba8c-b3f9ee2e1f7f_1286x433.png 1456w" sizes="100vw"><img 
src="https://substackcdn.com/image/fetch/$s_!tppE!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb290acb1-392b-4210-ba8c-b3f9ee2e1f7f_1286x433.png" width="1286" height="433" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/b290acb1-392b-4210-ba8c-b3f9ee2e1f7f_1286x433.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:433,&quot;width&quot;:1286,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:955783,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://manifund.substack.com/i/161941039?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb290acb1-392b-4210-ba8c-b3f9ee2e1f7f_1286x433.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!tppE!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb290acb1-392b-4210-ba8c-b3f9ee2e1f7f_1286x433.png 424w, https://substackcdn.com/image/fetch/$s_!tppE!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb290acb1-392b-4210-ba8c-b3f9ee2e1f7f_1286x433.png 848w, https://substackcdn.com/image/fetch/$s_!tppE!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb290acb1-392b-4210-ba8c-b3f9ee2e1f7f_1286x433.png 1272w, https://substackcdn.com/image/fetch/$s_!tppE!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb290acb1-392b-4210-ba8c-b3f9ee2e1f7f_1286x433.png 1456w" 
sizes="100vw" loading="lazy"></picture><div></div></div></a></figure></div><p>Gavin Leech co-wrote <strong><a href="https://www.lesswrong.com/posts/zaaGsFBeDTpCsYHef/shallow-review-of-live-agendas-in-alignment-and-safety">Shallow review of live agendas in alignment &amp; safety</a></strong> in 2023, which was well-received and considered a useful resource for people looking to get a top-level picture of AI safety research. 
For a post intended to be a shallow review, it has a lot of helpful detail and links for the various research agendas, e.g. the amount of resources currently devoted to each, and notable criticisms.</p><p>Last year, he sought funding to create an updated 2024 version of this post. He received $9,000 from Manifund regrantors Neel Nanda and Ryan Kidd, as well as $12,000 from other donors through the Manifund site.</p><p>Big picture, I believe there should be an accessible and up-to-date resource of this kind: for people who are starting out in AI safety and don&#8217;t know anything, for funders trying to get a sense of the landscape, or for anyone else who might need it. In 2022 I was at a stage where I wanted to contribute to AI safety but didn&#8217;t know anything about it and was unsure where to start, and I would&#8217;ve likely found Gavin&#8217;s review useful, along with the other resources that existed. Based on this, Gavin&#8217;s record in a variety of fields, and the quality of the 2023 version, I think this regrant looked promising.</p><p>The new post (<strong><a href="https://www.lesswrong.com/posts/fAW6RXLKTLHC3WXkS/shallow-review-of-technical-ai-safety-2024">Shallow review of technical AI safety, 2024</a></strong>) came out in December 2024 and appears to be similarly comprehensive to the 2023 version, although it has gotten less attention (~half the upvotes on LessWrong, and not curated). That&#8217;s a somewhat worse outcome than I would&#8217;ve hoped for, but I still would have endorsed this grant had I known the result in advance. Presumably the updated version is less eye-catching than the original, while still being necessary.</p><p>The funding of this project also shows the advantages of the Manifund regranting program. Gavin asked for between $8,000 (MVP version) and $17,000 (high-end version) and was quickly funded for the MVP by Neel and Ryan. 
He then got an additional $5,000 from OpenPhil, after Matt Putz learned about this proposal <a href="https://forum.effectivealtruism.org/posts/CdHrwEiGsJqC8RAat/5-homegrown-ea-projects-seeking-small-donors?commentId=wbHMrQKFjmofrGyPs">via our EA Forum post</a>, and a further $12,000 from other donors. I am happy with how the regranting program is able both to provide the small amount of funding that gets a project off the ground, and to increase the visibility of that project so that other donors can step in and fund it to a greater extent. A couple of small negatives: (1) regrantor Neel Nanda is less optimistic than I am that this was a particularly good grant, and (2) the high-end version was supposed to include a &#8220;<em>glossy formal report optimised for policy people</em>&#8221; which didn&#8217;t get made (OpenPhil opted against funding it); however, the excess money is instead going towards the 2025 edition. I look forward to it!</p><h2>&#8230; And three that maybe weren&#8217;t a good fit for regranting</h2><ul><li><p><strong><a href="https://manifund.org/projects/pilot-for-new-benchmark-by-epoch-ai?tab=comments">Pilot for new benchmark</a></strong> to Epoch AI - a limited-info regrant that might have accelerated AI capabilities</p></li><li><p><strong><a href="https://manifund.org/projects/ai-safety--society">AI Safety &amp; Society</a></strong> to CAIS - cool new project on AI safety writing, less exciting for regranting specifically because the regrantor was also leading the project</p></li><li><p><strong><a href="https://manifund.org/projects/general-support-for-saferai">General support</a></strong> to SaferAI - an org doing important work in AI policy, but might be growing too fast</p></li></ul><div class="pullquote"><p>Austin here. FWIW, we are very grateful to the regrantors for recommending these, and aren&#8217;t saying that these grants are bad, <em>per se.</em> In fact, I think some of these grantees are amazing and deserve lots more funding! 
We just think that the facts around these grants mean that they&#8217;re not good examples of where Manifund&#8217;s regranting can be differentially good, compared to other mechanisms (such as LTFF or OpenPhil&#8217;s), or donors directly giving to these orgs, through Manifund or elsewhere.</p><p>This matters because when we (and our donors) consider whether to run regranting, we want to see that the program is counterfactually moving funds towards projects that wouldn&#8217;t have them otherwise. In startup terms, we want to see regrants that are like angel investments in <em>undervalued</em> people and projects; the regrants below feel more like post-Series A top-ups. Even if the money is used well, the grant itself doesn&#8217;t signal much to the broader AI safety community; the quality of these orgs is already priced in. Now back to Jesse!</p></div><h3><a href="https://manifund.org/projects/pilot-for-new-benchmark-by-epoch-ai?tab=comments">Pilot for new benchmark by Epoch AI</a> - unfortunately confidential</h3><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!Vvdo!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F879211a0-d4c2-4483-a414-49ddd96986a6_1248x626.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!Vvdo!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F879211a0-d4c2-4483-a414-49ddd96986a6_1248x626.png 424w, https://substackcdn.com/image/fetch/$s_!Vvdo!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F879211a0-d4c2-4483-a414-49ddd96986a6_1248x626.png 848w, 
https://substackcdn.com/image/fetch/$s_!Vvdo!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F879211a0-d4c2-4483-a414-49ddd96986a6_1248x626.png 1272w, https://substackcdn.com/image/fetch/$s_!Vvdo!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F879211a0-d4c2-4483-a414-49ddd96986a6_1248x626.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!Vvdo!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F879211a0-d4c2-4483-a414-49ddd96986a6_1248x626.png" width="1248" height="626" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/879211a0-d4c2-4483-a414-49ddd96986a6_1248x626.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:626,&quot;width&quot;:1248,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!Vvdo!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F879211a0-d4c2-4483-a414-49ddd96986a6_1248x626.png 424w, https://substackcdn.com/image/fetch/$s_!Vvdo!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F879211a0-d4c2-4483-a414-49ddd96986a6_1248x626.png 848w, 
https://substackcdn.com/image/fetch/$s_!Vvdo!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F879211a0-d4c2-4483-a414-49ddd96986a6_1248x626.png 1272w, https://substackcdn.com/image/fetch/$s_!Vvdo!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F879211a0-d4c2-4483-a414-49ddd96986a6_1248x626.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a></figure></div><p>This regrant consists of $200,000 from regrantor Leopold Aschenbrenner to <a href="https://epoch.ai/">Epoch AI</a> to support the 
pilot of a new frontier AI benchmark. Epoch is a research organization that does some great work on forecasting AI progress, AI hardware, the economics of AI, and more. For this reason, they are by default a good candidate for funding through our regranting program. <em>[Editor&#8217;s note: in fact, we&#8217;ve asked Tamay to serve as a regrantor for 2025!]</em> However, there are a few reasons why I am less excited about this regrant than about many of the other projects that could&#8217;ve been funded.</p><p>Firstly, this proposal presented very little information, for confidentiality reasons. While I understand why this was necessary, the lack of transparency is not ideal from Manifund&#8217;s perspective. Part of the value we hope potential grantees can get from the regranting program is increased awareness of their project, both so that other funders can support them and to improve broader understanding of what work is being done in AI safety. The lack of detail here makes that difficult.</p><p>Secondly, I think the program works best when regrantors give smaller amounts to many projects, compared with the (relatively) larger sum given to Epoch here. We generally want to be taking risks on getting smaller projects off the ground, with the hope that other funding sources can take over if/when their funding needs exceed what Manifund can offer. Part of <a href="https://www.notion.so/Manifund-2025-Regrants-launch-post-1c054492ea7a80af9de5ee10856487d0?pvs=21">our comparative advantage</a> lies in the flexibility and speed of the regranting program, so I place more value on regrants that lean on that advantage. On the other hand, this is a pilot, so the proposal does fit the loose remit in that sense. 
I would also be happier about the relatively large grant if it came from multiple regrantors on the site (as with Timaeus), as that&#8217;s further evidence that a proposal is worth funding according to multiple different experts.</p><p>Thirdly, and relatedly, Epoch has already received significant funding from <a href="https://www.openphilanthropy.org/grants/?q=epoch">OpenPhil</a> and <a href="https://survivalandflourishing.fund/recommendations">SFF</a>, suggesting they are not among the orgs that would benefit most from Manifund&#8217;s support. Aschenbrenner remarks that Epoch has other funding, but not for this project. I wonder whether that&#8217;s a signal that, in the eyes of Epoch&#8217;s leadership team and their previous funders, this project is not as valuable as Epoch&#8217;s other work.</p><p>Finally, while I highly value a lot of Epoch&#8217;s forecasting work, I think there is growing evidence that benchmarks of this kind help accelerate AI capabilities. The new reasoning-via-RL paradigm for frontier LLMs means that one of the biggest bottlenecks to AI progress is now <a href="https://x.com/davidad/status/1888945708272591161">high-quality reward signals to train on</a>, which new benchmarks may assist in providing. This is evident in <a href="https://epoch.ai/blog/openai-and-frontiermath">Epoch&#8217;s FrontierMath benchmark</a>, which OpenAI commissioned and has sole access to (except for a holdout set). While the math nerd in me thinks FrontierMath is really cool, it seems to me that OpenAI thought creating this benchmark would be to their benefit, either through directly speeding up their progress or for marketing reasons. I find this concerning. We don&#8217;t know whether this new benchmark would accelerate AI capabilities, or what its ownership structure would be; that uncertainty ultimately comes back to the lack of transparency. 
I would be more confident about the positive value of this regrant if there were more information about this thorny question. There is significant disagreement about whether contributing to AI capabilities is in fact negative for the world, and I want to defer some to that uncertainty, but my personal view is that it does increase the probability of global catastrophe and I am therefore more pessimistic about this regrant. I would&#8217;ve preferred to see our regrantors fund a wider array of projects, as well as projects that are more transparent and ideally carry less downside risk.</p><h3><a href="https://manifund.org/projects/ai-safety--society">AI Safety &amp; Society</a> - a direct donation may have made more sense</h3><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!s6CR!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F53dca5be-c0b1-4089-be9c-3bfa8460ea65_204x70.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!s6CR!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F53dca5be-c0b1-4089-be9c-3bfa8460ea65_204x70.png 424w, https://substackcdn.com/image/fetch/$s_!s6CR!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F53dca5be-c0b1-4089-be9c-3bfa8460ea65_204x70.png 848w, https://substackcdn.com/image/fetch/$s_!s6CR!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F53dca5be-c0b1-4089-be9c-3bfa8460ea65_204x70.png 1272w, 
https://substackcdn.com/image/fetch/$s_!s6CR!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F53dca5be-c0b1-4089-be9c-3bfa8460ea65_204x70.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!s6CR!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F53dca5be-c0b1-4089-be9c-3bfa8460ea65_204x70.png" width="204" height="70" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/53dca5be-c0b1-4089-be9c-3bfa8460ea65_204x70.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:70,&quot;width&quot;:204,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!s6CR!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F53dca5be-c0b1-4089-be9c-3bfa8460ea65_204x70.png 424w, https://substackcdn.com/image/fetch/$s_!s6CR!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F53dca5be-c0b1-4089-be9c-3bfa8460ea65_204x70.png 848w, https://substackcdn.com/image/fetch/$s_!s6CR!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F53dca5be-c0b1-4089-be9c-3bfa8460ea65_204x70.png 1272w, 
https://substackcdn.com/image/fetch/$s_!s6CR!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F53dca5be-c0b1-4089-be9c-3bfa8460ea65_204x70.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a></figure></div><p>AI Safety &amp; Society (AISS) was the proposed name for a new platform by the <a href="https://www.safe.ai/">Centre for AI Safety</a> (CAIS) designed to foster high-quality AI safety discourse, filling a gap between social media and formal academic publications. This platform launched two weeks ago, under a new name: <a href="https://www.ai-frontiers.org/">AI Frontiers</a>. It received a regrant of $250,000 from Manifund regrantor and CAIS Executive Director Dan Hendrycks, who is also advising the project and is now Editor-in-Chief of AI Frontiers.</p><p>There&#8217;s a lot here that I&#8217;m excited about: I agree with the need for more good AI safety writing, and ensuring that AI safety ideas reach a broader audience. I also trust Dan and the CAIS team to execute this well, and they have a lot of the right contacts and experience to get this off the ground. Looking at the articles published on AI Frontiers in the last two weeks, I&#8217;m impressed with the quality and scope.</p><p>Despite the positives, I&#8217;m unconvinced this was a great outcome for the Manifund regranting program, because it involved a regrantor using their budget to support their own organization&#8217;s project. To be clear, I think this was a reasonable use of money. It makes sense that Dan Hendrycks&#8217; reasons for supporting this project professionally also lead him to think it&#8217;s a good place to direct funding. Yet this doesn&#8217;t strike me as making valuable use of the regranting program, compared with other funding mechanisms. 
Consider the two main ways a hypothetical donor might want to defer to Hendrycks&#8217; judgment as an AI safety expert:</p><ul><li><p>they might specifically value the work he&#8217;s doing with CAIS and projects such as AI Frontiers, or</p></li><li><p>they might value his ability to discover and rate AI safety work more generally</p></li></ul><p>The former situation should probably lead that donor to just donate to CAIS directly; the latter is the kind of case the regranting program was specifically designed to assist with. This regrant feels more like the former, which I don&#8217;t think is a problem in itself, but I would feel less optimistic about continuing the regranting program if this were the main way in which it was used. That being said, I hope this project succeeds, and if AI Frontiers does have a big positive impact, I might end up thinking that the specifics of how it got funded were less important than getting it funded at all.</p><p>This post should not be seen as knocking AI Frontiers&#8212;a nascent project that could be highly impactful&#8212;but rather as articulating a view on the best use cases for regranting. Looking forward, I wouldn&#8217;t want Manifund regrantors to rule out funding projects they&#8217;re involved in, but such a decision should perhaps invite some extra skepticism about whether it is the best use of a regrantor&#8217;s budget.</p><p><em>[Editor&#8217;s note: we reached out to Dan about this grant, and he disputes this inclusion. 
In particular, he notes that AI Frontiers has only been out for a couple weeks, and that <a href="https://manifund.org/projects/removing-hazardous-knowledge-from-ais">his previous grant</a> to <a href="https://www.wmdp.ai/">the WMDP benchmark</a> would make a better example, as that grant has similar characteristics but has had more time to stand on its merits.]</em></p><h3><a href="https://manifund.org/projects/general-support-for-saferai">General support for SaferAI </a>- growing too fast?</h3><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!vqR4!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F219da98c-45e0-4d54-9887-b81777b64f35_208x223.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!vqR4!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F219da98c-45e0-4d54-9887-b81777b64f35_208x223.png 424w, https://substackcdn.com/image/fetch/$s_!vqR4!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F219da98c-45e0-4d54-9887-b81777b64f35_208x223.png 848w, https://substackcdn.com/image/fetch/$s_!vqR4!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F219da98c-45e0-4d54-9887-b81777b64f35_208x223.png 1272w, https://substackcdn.com/image/fetch/$s_!vqR4!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F219da98c-45e0-4d54-9887-b81777b64f35_208x223.png 1456w" sizes="100vw"><img 
src="https://substackcdn.com/image/fetch/$s_!vqR4!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F219da98c-45e0-4d54-9887-b81777b64f35_208x223.png" width="208" height="223" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/219da98c-45e0-4d54-9887-b81777b64f35_208x223.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:223,&quot;width&quot;:208,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!vqR4!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F219da98c-45e0-4d54-9887-b81777b64f35_208x223.png 424w, https://substackcdn.com/image/fetch/$s_!vqR4!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F219da98c-45e0-4d54-9887-b81777b64f35_208x223.png 848w, https://substackcdn.com/image/fetch/$s_!vqR4!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F219da98c-45e0-4d54-9887-b81777b64f35_208x223.png 1272w, https://substackcdn.com/image/fetch/$s_!vqR4!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F219da98c-45e0-4d54-9887-b81777b64f35_208x223.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a></figure></div><p>This $100,000 regrant was made by Adam Gleave to <a href="https://www.safer-ai.org/">SaferAI</a>, an organization that conducts 
research on AI risk management. As with Epoch AI and CAIS, I am confident that SaferAI is doing good work, such as their <a href="https://ratings.safer-ai.org/">&#8216;risk management maturity&#8217; ratings</a> for leading AI labs, featured in TIME Magazine.</p><p>Adam Gleave continues to support the grant, and is excited about further support to SaferAI:</p><blockquote><ol><li><p>They&#8217;ve done good work, and are one of really very very few policy-focused orgs in the EU that have gained any traction there. They&#8217;re the only one I can think of in France, which is such a key actor.</p></li></ol><ol start="2"><li><p>They can&#8217;t take [OpenPhil] money. This really limits the number of other sources.</p></li></ol></blockquote><p>However, I am concerned that SaferAI&#8217;s leadership team is relatively inexperienced for their rate of growth, a concern shared in Adam&#8217;s comments and SaferAI&#8217;s proposal. They intend to grow and take on further projects in a way that might require a lot of coordination and administrative overhead&#8212;tasks the leadership may not be fully equipped to handle. I am not personally familiar with the SaferAI team, so I&#8217;m taking this as given based on what they and Adam have written, but it seems to me that this may limit the usefulness of additional funding to SaferAI at this point in time. In other words, SaferAI seems like an organization that may be bottlenecked along multiple axes (funding, experience, organizational capacity) such that fixing one bottleneck alone is less valuable than making grants towards another effort that is solely bottlenecked by funding. 
It may be that SaferAI was not a strong candidate for a Manifund regrant at the time (February 2025) but will be in the future as their ability to usefully absorb more funding increases.</p><p>On the other hand, one of the ways that this grant may be used is to allow Sim&#233;on, the Executive Director, to start drawing a salary, which seems a more straightforwardly good use of money, assuming he doesn&#8217;t have perpetual funding from another source. It&#8217;s the bringing on of additional (junior) staff that gives me more pause, although I must stress that my judgment here is low-confidence given my limited information on SaferAI&#8217;s capacity for expansion.</p><p>Beyond the organizational capacity issues, I also note that SaferAI has &#8220;<em>multiple institutional and private funders from the wider AI safety space,</em>&#8221; which, while certainly not disqualifying for a regrant, is another reason this regrant may be lower value than many others made through this program.</p><p>Finally, I commend SaferAI for the 'What are the most likely causes and outcomes if this project fails? (premortem)' section of their proposal, which was especially helpful for thinking about the value of this grant. I imagine it can be quite easy to write a premortem that doesn&#8217;t really get to the nub of a project&#8217;s potential downsides. I hope to see SaferAI continue to have positive impact and receive support from the AI safety community, but given their other limitations, I am not sure it was the best fit for the regranting program at this stage.</p><div><hr></div><p><em>Austin again. 
I&#8217;m seeing some common patterns with these examples:</em></p><ol><li><p><em>A large regrant ($100k+),</em></p></li><li><p><em>Made to established orgs with lots of money already,</em></p></li><li><p><em>By a single regrantor,</em></p></li><li><p><em>Who is extraordinarily busy</em></p></li></ol><p><em>So these are heuristics for regrantors to keep in mind, and for Manifund to think about when designing our future program.</em></p><p><em>That said, I don&#8217;t think that pattern-matching to one of these heuristics is necessarily bad! Because:</em></p><ol><li><p><em>Sometimes it&#8217;s actually right for a regrantor to spend all their money on a single bet. This year especially, Manifund is diversifying the program with more regrantors, across more fields, with smaller per-person budgets</em></p></li><li><p><em>Sometimes orgs with good track records will have a lot of funding, but can still use the marginal $ better than a new org</em></p></li><li><p><em>Most regrants are made by a single regrantor; it&#8217;s rare to get contributions from multiple regrantors, or the general public.</em></p><ul><li><p><em>In fact, I remember a bit of &#8220;regrantor game of chicken&#8221; happening with orgs like Timaeus and Apollo &#8212; everyone thought they were obviously good, but it wasn&#8217;t clear who should spend their budget on it. IMO, funders should think like angel investors &#8212; getting early &#8220;allocation&#8221; into good projects is good!</em></p></li></ul></li><li><p><em>Regrantors who are busy are generally busy for good reason: they&#8217;re competent &amp; well connected. As the saying goes, &#8220;If you want something done, give it to a busy person.&#8221;</em></p><ul><li><p><em>Also, it&#8217;s good for Manifund to have famous (aka busy) people associated with the regranting program &#8212; it makes fundraising easier for us, and makes the program seem more legit. 
But we don&#8217;t want to overweight this; we definitely want regrantors who will actually make good grants too.</em></p></li></ul></li></ol><p><em>So overall, it&#8217;s really hard to say &#129335;. Making good grants is a hard problem, one we&#8217;re always trying to get better at. Once again, we&#8217;re very grateful to every one of our regrantors for finding and funding the opportunities that they feel will most help the world!</em></p>]]></content:encoded></item><item><title><![CDATA[Manifund 2025 Regrants]]></title><description><![CDATA[Announcing 10 AI safety regrantors, with $2m+ total to distribute]]></description><link>https://manifund.substack.com/p/manifund-2025-regrants</link><guid isPermaLink="false">https://manifund.substack.com/p/manifund-2025-regrants</guid><dc:creator><![CDATA[Austin Chen]]></dc:creator><pubDate>Tue, 22 Apr 2025 16:44:09 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!Q6fk!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1e387c98-dae5-4431-bffa-74fc56f2c933_1052x1564.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>Each year, Manifund partners with <em>regrantors</em>: experts in the field of AI safety, each given an independent budget of $100k+. 
Regrantors can initiate fast, small grants, seeding early-stage projects with $5k-$50k.</p><p>For 2025, we&#8217;ve raised $2.25m so far, and are excited to announce our first 10 regrantors:</p><ul><li><p>Neel Nanda &#8212; DeepMind</p></li><li><p>Lisa Thiergart &#8212; SL5 Task Force</p></li><li><p>Lauren Mangla &#8212; Constellation</p></li><li><p>Aidan O'Gara &#8212; Longview</p></li><li><p>Gavin Leech &#8212; Arb</p></li><li><p>Marius Hobbhahn &#8212; Apollo</p></li><li><p>Thomas Larsen &#8212; AI Futures Project</p></li><li><p>Tamay Besiroglu &#8212; Mechanize</p></li><li><p>Richard Ngo &#8212; Independent</p></li><li><p>Joel Becker &#8212; METR</p></li></ul><p>We deeply respect what our regrantors have each accomplished in their fields, and are excited to see what they choose to fund!</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!Q6fk!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1e387c98-dae5-4431-bffa-74fc56f2c933_1052x1564.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!Q6fk!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1e387c98-dae5-4431-bffa-74fc56f2c933_1052x1564.png 424w, https://substackcdn.com/image/fetch/$s_!Q6fk!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1e387c98-dae5-4431-bffa-74fc56f2c933_1052x1564.png 848w, https://substackcdn.com/image/fetch/$s_!Q6fk!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1e387c98-dae5-4431-bffa-74fc56f2c933_1052x1564.png 1272w, 
https://substackcdn.com/image/fetch/$s_!Q6fk!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1e387c98-dae5-4431-bffa-74fc56f2c933_1052x1564.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!Q6fk!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1e387c98-dae5-4431-bffa-74fc56f2c933_1052x1564.png" width="1052" height="1564" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/1e387c98-dae5-4431-bffa-74fc56f2c933_1052x1564.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1564,&quot;width&quot;:1052,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:541718,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://manifund.substack.com/i/161837590?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1e387c98-dae5-4431-bffa-74fc56f2c933_1052x1564.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!Q6fk!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1e387c98-dae5-4431-bffa-74fc56f2c933_1052x1564.png 424w, https://substackcdn.com/image/fetch/$s_!Q6fk!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1e387c98-dae5-4431-bffa-74fc56f2c933_1052x1564.png 848w, 
https://substackcdn.com/image/fetch/$s_!Q6fk!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1e387c98-dae5-4431-bffa-74fc56f2c933_1052x1564.png 1272w, https://substackcdn.com/image/fetch/$s_!Q6fk!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1e387c98-dae5-4431-bffa-74fc56f2c933_1052x1564.png 1456w" sizes="100vw" fetchpriority="high"></picture></div></a></figure></div><h2>Why does regranting matter?</h2><p>Regranting fills a critical role in the AI safety funding 
ecosystem:</p><ul><li><p><strong>Fast, low-friction grants:</strong> when needed, we can go from &#8220;grant recommended&#8221; to &#8220;money in grantee&#8217;s bank account&#8221; in &lt;1 week.</p></li><li><p><strong>Hits-based giving:</strong> regrantors can make grants solo, and are directly responsible for the quality of the grant. This encourages more speculative grants, and avoids problems in review-by-committee.</p></li><li><p><strong>Proactive funding, not reactive:</strong> regrantors can offer funding, knowing that they have the money to back their promises. Most other grant processes (OpenPhil, LTFF, SFF) are application-based.</p></li><li><p><strong>Support beyond money</strong>: regrantors form closer relationships with grantees, and can provide feedback, intros, mentorship, and publicity</p><ul><li><p>Startups will often accept small angel checks for the prestige, connections &amp; advice; regrantors can provide similar support to grantees</p></li><li><p>In contrast, OpenPhil program officers or LTFF fund managers are responsible for hundreds of grants and thus are often too busy to help</p></li></ul></li><li><p><strong>Efficient markets in grantmaking</strong>: regranting provides more influence (dollars as &#8221;votes&#8221;) to AI safety experts with good track records</p><ul><li><p>Similar to how tech startup exits create angel investors, who &#8220;vote&#8221; on the next batch of startups by funding them</p></li><li><p>Similar to how prediction markets provide more &#8220;votes&#8221; to good forecasters, becoming more accurate over time</p></li><li><p>Regranting budgets themselves are similar to retroactive funding for past good work!</p></li></ul></li><li><p><strong>Regranting is a decentralizing force,</strong> seeding new independent orgs, instead of concentrating talent within AI labs, or large charities with strong fundraising operations</p></li><li><p><strong>Our regrants are transparent</strong>, with project descriptions and grant 
writeups in public</p><ul><li><p>This provides a public track record and social proof for grantees, which helps them with future fundraising and hiring. For example, early public comments from regrantors Evan Hubinger and Ryan Kidd likely helped Timaeus raise more from SFF.</p></li><li><p>Transparency also helps the community understand which kinds of work are respected and funded. Manifund regrants provide more detail per grant than any other AI safety grantmaker.</p></li></ul></li><li><p><strong>Regranting builds up the Manifund network,</strong> helping us meet great grantees</p><ul><li><p>This gives us a picture of the landscape of AI safety</p></li><li><p>We can follow up with intros, referrals, funding down the line</p></li><li><p>Many of the 2025 regrantors started out as Manifund grantees from past regrants (eg Tamay Besiroglu from Epoch, Lisa Thiergart from MIRI, Marius Hobbhahn from Apollo)</p></li></ul></li></ul><h2>What makes a good regrantor?</h2><p>With regranting, the most important choice Manifund &amp; our donors face is: &#8220;who gets these $100k+ budgets?&#8221; Here are some of the criteria we look for when deciding who to invite:</p><ul><li><p><strong>Good taste in projects and people</strong></p><ul><li><p>Regranting is a similar skillset to angel investing. Good regrantors have a track record of being early and right; of finding undervalued opportunities.</p></li><li><p>Regranting is also similar to hiring: both require identifying great talent. Standard advice for investing in startups is to focus on the founders, not the ideas &#8212; we think this holds true for regrants, too. 
&#8220;<a href="https://nintil.com/hhmi-and-nih/">Fund people, not projects</a>&#8221;.</p></li></ul></li><li><p><strong>Has capacity to regrant, proactively</strong></p><ul><li><p>That is, regrantors should not be too busy to actually find, consider, and make grants</p></li><li><p>Willing to research potential grantees, reach out and form relationships, follow up with mentorship and support</p></li><li><p>Regrantors would ideally spend 5h+/month talking to grantees &amp; writing up the grants</p><ul><li><p>It helps if a regrantor&#8217;s day-to-day work already provides them with good leads and ideas, eg they are more of a manager than a researcher</p></li></ul></li></ul></li><li><p><strong>Extends coverage of AI safety funding.</strong> One goal of regranting is to spot new opportunities &#8212; a wide net helps with this. Some kinds of coverage we think about:</p><ul><li><p>Geographic coverage: Bay Area vs DC vs London vs China</p></li><li><p>Subject matter coverage: evals vs mech interp vs AI policy vs fieldbuilding</p></li><li><p>Different kinds of orgs: labs vs thinktanks vs startups vs charities</p></li><li><p>Different competing orgs: Anthropic vs GDM vs OpenAI</p></li></ul></li><li><p><strong>Doesn&#8217;t already have easy access to funding</strong></p><ul><li><p>People who are already grantmakers, or are well connected to and can recommend grants to philanthropists, might use regranting budgets less counterfactually. We mostly don&#8217;t invite existing grantmakers to regrant.</p></li><li><p>There are exceptions though! It depends on specific circumstances. 
For example, Aidan O&#8217;Gara already makes grants at Longview, but we were still excited to have him regrant.</p><ul><li><p>A Manifund regranting budget provides Aidan with more flexibility and risk appetite (can make regrants that Longview may be unwilling to publicly endorse), and is more lightweight (regrants are fast and can go out in small amounts, whereas Longview rarely considers grants of &lt;$200k).</p></li></ul></li><li><p>Grantmaking may be a skill that improves with practice, so budgets may go farther in the hands of experienced grantmakers</p></li></ul></li></ul><p>We&#8217;re still looking for more regrantors; consider <a href="https://airtable.com/appOfJtzt8yUTBFcD/shrZW7S069EmghCSV">applying here</a>!</p><h2>For regrantors: what makes a good regrant?</h2><p>Good regrants are often <strong>local, small &amp; fast:</strong></p><ul><li><p>A <strong>local</strong> grant is made within a regrantor&#8217;s unique network, without prior knowledge from OpenPhil or others &#8212; our donors aren&#8217;t excited to funge against OpenPhil dollars (neither are we!). At the same time, it&#8217;s important not to overthink this criterion. There are plenty of cases where OpenPhil sees an opportunity but can&#8217;t fund it due to e.g. reputational risk, and also cases where they may be lacking information or are just wrong about the value of a proposal.</p></li><li><p>If the org receiving the grant isn&#8217;t <strong>small</strong> (raised $1M+), it probably already has a public track record and a fundraising team, and should therefore be on the radar for OpenPhil and others. 
As an example, Evan Hubinger&#8217;s <a href="https://manifund.org/projects/mats-funding">grant to MATS itself</a> is less exciting to us than grants to <a href="https://manifund.org/projects/scoping-developmental-interpretability-xg55b33wsfc">his own MATS mentees</a>, who have less funding and less visibility.</p></li><li><p>One strength of regranting is that it can be <strong>fast</strong>, well-suited to time-sensitive grants, as Manifund can move dollars to grantees within days of the recommendation. Types of funding that might fit into this category include funding for compute or travel expenses, or for bringing on world-class talent ASAP. On the other hand, yearly planning for orgs is better captured by OpenPhil and SFF, who move more slowly but have a lot more money available.</p></li></ul><p>Some examples of regrants that we particularly liked (full review coming soon!):</p><ul><li><p><strong><a href="https://manifund.org/projects/scoping-developmental-interpretability-xg55b33wsfc">Scoping Developmental Interpretability</a></strong> by Jesse Hoogland - the first funding for Timaeus, accelerating its research by months</p></li><li><p><strong><a href="https://manifund.org/projects/support-for-deep-coverage-of-china-and-ai">Support for deep coverage of China and AI</a></strong> to ChinaTalk - reporting on DeepSeek, ahead of the curve</p></li><li><p><strong><a href="https://manifund.org/projects/shallow-review-of-ai-safety-2024">Shallow review of AI safety 2024</a></strong> to Gavin Leech - quick regrants inducing further funding from OpenPhil &amp; others</p></li></ul><p>Ultimately, regrantors have a lot of discretion for making grants, and we encourage you to use it! We expect many great regrants to come not just from looking at new proposals as they appear on the Manifund website, but from proactively finding them. 
This could look like asking your friends or colleagues for leads, launching a request for proposals, or running a prize contest.</p><p>See also &#8220;<a href="https://joel-becker.com/digital-garden/regrantor/">Some fun lessons I learned as a junior regrantor</a>&#8221; from Joel Becker.</p><h2>For donors: why fund regrantors?</h2><p>If you&#8217;re looking for a turnkey way to seed great AI safety projects, consider funding our regranting program! We think it&#8217;s a good fit for donors who are:</p><ul><li><p>Earning to give, and donating $50k+ a year</p></li><li><p>Concerned about AI safety, but less familiar with the landscape, and not interested in spending a lot of your time getting up to speed</p></li><li><p>Excited to back new, undiscovered opportunities</p></li></ul><p>For such donors, regranting provides a &#8220;part-time program officer, as a service&#8221;. It&#8217;s often hard to find great AI safety program officers because their opportunity costs are high; those with the appropriate context could just do direct work. But regranting doesn&#8217;t take an expert away from their day job. Instead, they can keep an eye out for good opportunities, while continuing to work in the field.</p><p>So rather than spinning up your own foundation, with the overhead that imposes, you can tap into the expertise of someone you trust &#8212; and have Manifund handle the logistics of sending out grants. 
We do ask large donors to contribute an extra 5% towards Manifund&#8217;s operational budget; in exchange, Manifund works closely with you to understand your giving objectives.</p><p>We&#8217;re quite flexible with how to distribute budgets:</p><ul><li><p>You can fund the regrantor program as a whole, allowing Manifund to allocate the budget</p></li><li><p>You can increase the budgets for specific regrantors you trust</p></li><li><p>You can nominate your own regrantors to join the program</p></li></ul><p>If you&#8217;re interested in donating via regranting, please reach out to <a href="mailto:austin@manifund.org">austin@manifund.org</a>!</p><p><em>Thanks to Jesse Richardson and Leila Clark for input~</em></p>]]></content:encoded></item><item><title><![CDATA[Fundraising for Mox, our space in SF]]></title><description><![CDATA[Coworking & events for AI safety, AI labs, EA charities & startups]]></description><link>https://manifund.substack.com/p/fundraising-for-mox-our-space-in</link><guid isPermaLink="false">https://manifund.substack.com/p/fundraising-for-mox-our-space-in</guid><dc:creator><![CDATA[Manifund]]></dc:creator><pubDate>Mon, 31 Mar 2025 18:17:31 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!xzwH!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F77fc9d7d-3289-47e9-9556-a0a7bcebeddc_1748x1436.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p><em>Hey! Austin here. At Manifund, I&#8217;ve spent a lot of time thinking about how to help AI go well. One question that bothered me: so much of the important work on AI is done in SF, so why are all the AI safety hubs in Berkeley? (I&#8217;d often consider this specifically while stuck in traffic over the Bay Bridge.)</em></p><p><em>I spoke with leaders at Constellation, Lighthaven, FAR Labs, OpenPhil; nobody had a good answer. 
Everyone said &#8220;yeah, an SF hub makes sense, I really hope somebody else does it&#8221;. Eventually, I decided to be that somebody else.</em></p><p><em>Now we&#8217;re raising money for our new coworking &amp; events space: Mox. We launched our beta in Feb, onboarding 40+ members, and are excited to grow from here. If Mox excites you too, we&#8217;d love your support; donate at <a href="https://manifund.org/projects/mox-a-coworking--events-space-in-sf">https://manifund.org/projects/mox-a-coworking--events-space-in-sf</a></em></p><h2>Project summary</h2><p>Mox is a 2-floor, 20k sq ft venue, established to bring together EA &amp; AI safety folks with the SF tech scene and labs. Since launching 6 weeks ago, we&#8217;ve onboarded 40+ coworking members and hosted 20 events: hackathons and bootcamps, dinners and retreats.</p><p>We&#8217;re now raising funding to expand Mox into a premier hub. We&#8217;re inspired by what Constellation, Lighthaven, and FAR Labs have achieved in Berkeley, and intend to build upon their example, in San Francisco: the city that is ground zero for transformative work.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!xzwH!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F77fc9d7d-3289-47e9-9556-a0a7bcebeddc_1748x1436.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!xzwH!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F77fc9d7d-3289-47e9-9556-a0a7bcebeddc_1748x1436.png 424w, https://substackcdn.com/image/fetch/$s_!xzwH!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F77fc9d7d-3289-47e9-9556-a0a7bcebeddc_1748x1436.png 848w, 
https://substackcdn.com/image/fetch/$s_!xzwH!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F77fc9d7d-3289-47e9-9556-a0a7bcebeddc_1748x1436.png 1272w, https://substackcdn.com/image/fetch/$s_!xzwH!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F77fc9d7d-3289-47e9-9556-a0a7bcebeddc_1748x1436.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!xzwH!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F77fc9d7d-3289-47e9-9556-a0a7bcebeddc_1748x1436.png" width="1456" height="1196" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/77fc9d7d-3289-47e9-9556-a0a7bcebeddc_1748x1436.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1196,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!xzwH!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F77fc9d7d-3289-47e9-9556-a0a7bcebeddc_1748x1436.png 424w, https://substackcdn.com/image/fetch/$s_!xzwH!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F77fc9d7d-3289-47e9-9556-a0a7bcebeddc_1748x1436.png 848w, 
https://substackcdn.com/image/fetch/$s_!xzwH!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F77fc9d7d-3289-47e9-9556-a0a7bcebeddc_1748x1436.png 1272w, https://substackcdn.com/image/fetch/$s_!xzwH!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F77fc9d7d-3289-47e9-9556-a0a7bcebeddc_1748x1436.png 1456w" sizes="100vw" fetchpriority="high"></picture></div></a></figure></div><h3>What are this project's goals? 
How will you achieve them?</h3><p>The main elements of Mox:</p><ul><li><p><strong>Coworking &amp; offices</strong>: We host daytime members, who use Mox as their primary workplace. Currently our members are small teams and individuals, with a mix of EA orgs, AI safety researchers, and startup founders. We&#8217;re also speaking with &#8220;anchor&#8221; orgs like Epoch AI to situate their offices here.</p></li><li><p><strong>Community space</strong>: We&#8217;re positioned as a &#8220;weekend office&#8221;, for folks at eg Anthropic, OpenAI, and METR to work and mingle. We encourage member-run gatherings like blog club, paper reading groups, lightning talks and yoga.</p></li><li><p><strong>Public events</strong>: As a large, central venue with easy access to both SF and East Bay, Mox is ideal for hackathons, speaker talks, happy hours, unconferences and the occasional party. We organize our own events, and also rent our space to aligned organizers.</p></li><li><p><strong>Project incubation</strong>: as a medium-term goal, we&#8217;d like to host external fellowships or incubators (eg for MATS, FLF, or Apart), or run our own in-house accelerator.</p></li></ul><p>Our north star is approximately: &#8220;bring together people who are insanely great.&#8221; In pursuit of this, we&#8217;ll move fast, stay flexible, try out many approaches, and double down on whatever shows promise.</p><p>Mox isn&#8217;t a WeWork; an explicit non-goal is to profit by selling coworking space. While we do charge for memberships and events, we do so at subsidized rates, to ground the value we provide. If Mox itself ends up becoming fiscally profitable, that will likely be through other models, eg equity from incubating amazing projects, YCombinator-style.</p><h3>How will this funding be used?</h3><p><em>All figures are very rough. 
We&#8217;d guess that philanthropic funding &amp; investment will cover between 20% and 50% of Mox expenses, with the rest made up via memberships, events, and other program revenue.</em></p><ul><li><p>Minimal budget (for 3rd &amp; 4th floors): $1.6m/year</p><ul><li><p>$200k in setup costs (furnishing &amp; labor)</p><ul><li><p>We inherited ~$100-200k worth of furnishing and labor from Solaris, the previous tenants, helping us hit the ground running</p></li></ul></li><li><p>$120k/mo * 12 mo of ongoing costs</p><ul><li><p>~$40k in fulltime benefits for team of 3</p></li><li><p>~$80k in rent, utilities, services, materials, snacks<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-1" href="#footnote-1" target="_self">1</a></p></li></ul></li><li><p>Aim to support ~150 members</p></li><li><p>Organize ~1 tentpole event a month</p></li></ul></li><li><p>Ambitious budget (for all 4 floors): $3.6m/year</p><ul><li><p>$800k in setup costs</p><ul><li><p>+$600k to build out the 2nd floor to support housing for ~100</p></li></ul></li><li><p>$240k/mo * 12 mo ongoing &#8212; doubling the ongoing values above</p></li><li><p>Support ~350 members, with a fulltime team of 6</p></li><li><p>Organize 3-4 tentpole events a month</p></li></ul></li></ul><h3>Who is on your team?</h3><p>Our current core team is:</p><ul><li><p>Austin Chen: CEO of Manifund</p></li><li><p>Rachel Shu: Space Manager for Mox</p></li><li><p>Mattie Reyes: Operations for Mox</p></li><li><p>Ara Hao: Interior Designer for Mox</p></li></ul><p>Manifund also employs ~3 other FTE for other projects, including Manifest.</p><h3>What's your track record on similar projects?</h3><p>Since launching 6 weeks ago, Mox has:</p><ul><li><p>Onboarded <a href="http://moxsf.com/people">~40 members</a>, contributing $8k per month. 
Our members include folks from Anthropic, METR, FLF, and numerous startup founders and AI safety researchers.</p></li><li><p>Hosted ~3 events a week, with ~$10k in revenue to date. Notable visitors include Aviv Ovadya (AIDF), Ben Goldhaber (FLF), David Rein (METR), Deger Turan (Metaculus), Dylan Patel (Semianalysis), Gideon Lichfield (Wired), Jan Leike (Anthropic), Joel Becker (METR), Jonas Vollmer (Macroscopic), Josh Morrison (1DaySooner), Jueyan Zhang (AISTOF), Leo Gao (OpenAI), Marci Harris (PopVox), Oliver Habryka (Lightcone), Owain Evans (Truthful AI), Paul Christiano (AISI), Tamay Besiroglu (Epoch), Tao Lin (METR), William Saunders (ex-OpenAI)<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-2" href="#footnote-2" target="_self">2</a>.</p></li><li><p>Organized the <a href="https://manifund.substack.com/p/ai-for-epistemics-hackathon">AI for Epistemics hackathon</a>, along with Elicit. Andreas Stuhlm&#252;ller (CEO of Elicit) remarks: </p></li></ul><blockquote><p>Mox was an amazing space for the AI for Epistemics hackathon. I loved the thoughtful layout with central communal gathering areas and quiet work areas around the edges, easily enough space for the 40+ participants. Austin was extremely helpful as well - the event was super smooth. I didn't have to think about anything, couldn't be happier. 
I'd definitely co-host again and am excited to do other projects together.</p></blockquote><p>Outside of Mox, Manifund has:</p><ul><li><p>moved &gt;$5m to projects in AI safety, biosecurity, animal welfare, and other EA and charitable causes</p></li><li><p>run AI safety regranting in 2023 &amp; 2024, with regrantors like Neel Nanda, Leopold Aschenbrenner, Dan Hendrycks, Adam Gleave, Ryan Kidd, and Evan Hubinger; and has raised funding for a larger 2025 version</p></li><li><p>partnered with Scott Alexander on ACX Grants in 2024, distributing $1.5m to early science &amp; developmental projects</p></li><li><p>organized 3 premier conferences of 200-500 people: Manifest 2023 &amp; 2024, on prediction markets &amp; forecasting; The Curve 2024, on bridging the AI safety and SF tech scenes</p></li></ul><p>Previously, Austin started Manifold, a prediction market platform which raised &gt;$4m in EA &amp; venture funding; hosts ~160k markets and ~7m bets; attracts 5000 MAU; and has engaged the NYT, Nate Silver, Paul Graham, Sam Altman, Elon Musk, and countless blogs.</p><h3>What are the most likely causes and outcomes if this project fails?</h3><p>While thinking about whether to do Mox, I made <a href="https://manifold.markets/Austin/will-i-start-a-coworking-or-event-s">this prediction market</a>, including reasons it might be a bad idea. Some that are still top of mind:</p><blockquote><ol><li><p>Coworking spaces are bad businesses, as far as I can tell</p><ul><li><p>Nowhere near the margins of software</p></li><li><p>The upsides mostly flow to the tenants I think?</p><ul><li><p>Maybe the answer is "charge more" but I'm somewhat allergic to that</p></li></ul></li></ul></li><li><p>Many other event/coworking spaces have failed or are failing</p><ul><li><p>E.g. 
Lightcone Offices, Atlantis, Solaris AI, Wytham Abbey if you squint</p></li><li><p>It's not obvious to me that SF Commons or Constellation are currently doing well (at least well enough to make me go "yeah there's no point in me starting my own")</p></li></ul></li><li><p>"Coworking" might be actively harmful (bad for focus, leads to groupthink).</p><ul><li><p>Famously, Paul Graham refused to offer coworking to YC startups</p></li><li><p>I do think the Constellation setup of "provide lots of private offices" might be good</p></li></ul></li><li><p>Physical spaces mostly serve human users; maybe there's more upside in serving AI users</p><ul><li><p>Though, as a human, I like humans, and probably will for a long time</p></li></ul></li><li><p>Maybe the most interesting work in SF happens inside of labs, so there's less need for this kind of space</p></li></ol></blockquote><p>Though: in just 6 weeks, Mox has already proven its value in counterfactual events hosted, connections made, and projects incubated. &#8220;Failure&#8221; now would be a matter of degree, aka ending up less than maximally amazing as a space and community.</p><h3>How much money have you raised in the last 12 months, and from where?</h3><p>Mox has not received any external funding at this time. Manifund started Mox with $300k for the initial deposit, rent, and operational costs; we have approximately 2 months of runway without further funding. We&#8217;ve applied to, and are still waiting to hear back from, the EA Infrastructure Fund (<a href="https://manifoldmarkets.notion.site/Mox-EAIF-app-notes-19354492ea7a8001abc6cf3c7d06a191?pvs=4">app</a>) and OpenPhil (<a href="https://manifoldmarkets.notion.site/OpenPhil-App-Mox-1b254492ea7a80599f75c75177a6ddca?pvs=4">app</a>).</p><p>Even without funding, we are committed to operating Mox for the remaining duration of the lease (~12 months), at the least. 
We&#8217;ll be more interested in finding clients who can pay well for coworking or events than in clients aligned with our mission. We&#8217;ll also need to focus more on the fundraising side of Manifund, to generate operational revenue. In the worst case, Austin may make substantial personal donations towards Manifund &amp; Mox.</p><h3>Impact certificates in Mox</h3><p><strong>It would be wise to view any donation to Mox in the spirit of an investment</strong>; specifically, as a purchase of an <a href="https://www.astralcodexten.com/p/impact-markets-the-annoying-details">impact certificate</a> for Mox, at a pre-money valuation of $8m. Your donations now entitle you to a share of the credit for whatever Mox accomplishes. And, extremely speculatively, you may be able to sell these impact certs for more funds to donate in the future.</p><p>There isn&#8217;t much precedent for impact certs of this size, but here are some comparisons to ground this $8m valuation:</p><ul><li><p>Constellation received <a href="https://www.openphilanthropy.org/grants/?organization-name=constellation">$20m in funding from OpenPhil</a> in 2024. 
In a typical investment round, investors might buy 10-20% of the total company; this heuristic would value Constellation at $100m-$200m.</p></li><li><p>Lighthaven made <a href="https://www.lesswrong.com/posts/5n2ZQcbc7r4R8mvqc#The_economics_of_Lighthaven">$1.8m in revenue</a> in 2024; <a href="https://claude.ai/share/97e45b6a-d1bc-4224-83be-d553f4190770">Claude</a> and ChatGPT ballpark Lighthaven&#8217;s for-profit valuation at $4m-$8m based on revenue multiples; this metric excludes Lighthaven&#8217;s significant impact, on which I would put a further 3-4x multiple, for $20m-$30m.</p></li><li><p>A typical YC-backed startup might raise a $2m seed at a $15m-$25m valuation after demo day, 3 months after starting.</p></li><li><p>Manifold raised at a $15m post-money valuation in our seed round in Mar 2022, 3 months after launching our website.</p></li></ul><p>To be clear, impact certificates are a very new concept, without any of the ecosystem surrounding for-profit startups. It&#8217;s very possible that your donation to Mox simply ends up as a one-time donation. </p><p>But we&#8217;re hoping that through our efforts and with your support, Mox will grow to be much more valuable than it is today. And if Manifund can establish a robust ecosystem of impact certificate purchases, buybacks, and retroactive funding (big if!), then donors today may be rewarded with a large return in charitable credit, to donate towards future projects that matter to them.</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://manifund.org/projects/mox-a-coworking--events-space-in-sf&quot;,&quot;text&quot;:&quot;Donate to Mox&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://manifund.org/projects/mox-a-coworking--events-space-in-sf"><span>Donate to Mox</span></a></p><p></p><h3>PS: apply for Mox!</h3><p>Want to join Mox? 
We're located at 1680 Mission St; our website is <a href="https://moxsf.com/">here</a>. Fill out <a href="https://moxsf.com/apply">https://moxsf.com/apply</a> if:</p><ul><li><p><strong>you&#8217;re looking for a space to work from</strong>. We support <a href="https://moxsf.com/membership">memberships</a> for hot desks, fixed desks, private offices and occasional visitors.</p></li><li><p><strong>you&#8217;d like to run an event</strong>. We&#8217;ve hosted hackathons, parties, dinners, reading clubs and weird events of all kinds.</p></li><li><p><strong>you&#8217;re interested in working for Mox or Manifund</strong>! We&#8217;re looking for great people for a variety of roles, from events coordinator to back office &amp; finances to &#8220;Director of Mox&#8221;.</p></li></ul><p>(or if you know someone who&#8217;d be a good fit, send them the link!)</p><p></p><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-1" href="#footnote-anchor-1" class="footnote-number" contenteditable="false" target="_self">1</a><div class="footnote-content"><p>Our landlord has asked us not to broadcast the rent we pay, hence this aggregation. 
If a more precise breakdown is important, reach out to <a href="mailto:austin@manifund.org">austin@manifund.org</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-2" href="#footnote-anchor-2" class="footnote-number" contenteditable="false" target="_self">2</a><div class="footnote-content"><p>This should not necessarily be read as a strong endorsement of Mox from these individuals; the amount of participation ranges from &#8220;brought their team in to cowork&#8221; to &#8220;participated in an all-day hackathon&#8221; to &#8220;played in a Magic: the Gathering event&#8221;.</p></div></div>]]></content:encoded></item><item><title><![CDATA[AI for Epistemics Hackathon]]></title><description><![CDATA[Seeking truth via LLMs; 9 projects built in 8 hours]]></description><link>https://manifund.substack.com/p/ai-for-epistemics-hackathon</link><guid isPermaLink="false">https://manifund.substack.com/p/ai-for-epistemics-hackathon</guid><dc:creator><![CDATA[Manifund]]></dc:creator><pubDate>Fri, 14 Mar 2025 20:39:05 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!1k-9!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F65cf0dd1-760a-4fc4-bc51-fe540904252d_3754x1091.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>AI for Epistemics is about leveraging AI for better truthseeking mechanisms &#8212; at the level of individual users, the whole of society, or in transparent ways within the AI systems themselves. <a href="http://manifund.org/">Manifund</a> &amp; <a href="https://elicit.com/">Elicit</a> recently hosted a hackathon to explore new projects in the space, with about 40 participants, 9 projects judged, and 3 winners splitting a $10k prize pool. 
Read on to see what we built!</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!1k-9!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F65cf0dd1-760a-4fc4-bc51-fe540904252d_3754x1091.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!1k-9!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F65cf0dd1-760a-4fc4-bc51-fe540904252d_3754x1091.png 424w, https://substackcdn.com/image/fetch/$s_!1k-9!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F65cf0dd1-760a-4fc4-bc51-fe540904252d_3754x1091.png 848w, https://substackcdn.com/image/fetch/$s_!1k-9!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F65cf0dd1-760a-4fc4-bc51-fe540904252d_3754x1091.png 1272w, https://substackcdn.com/image/fetch/$s_!1k-9!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F65cf0dd1-760a-4fc4-bc51-fe540904252d_3754x1091.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!1k-9!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F65cf0dd1-760a-4fc4-bc51-fe540904252d_3754x1091.png" width="1456" height="423" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/65cf0dd1-760a-4fc4-bc51-fe540904252d_3754x1091.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:423,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;&quot;,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" title="" srcset="https://substackcdn.com/image/fetch/$s_!1k-9!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F65cf0dd1-760a-4fc4-bc51-fe540904252d_3754x1091.png 424w, https://substackcdn.com/image/fetch/$s_!1k-9!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F65cf0dd1-760a-4fc4-bc51-fe540904252d_3754x1091.png 848w, https://substackcdn.com/image/fetch/$s_!1k-9!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F65cf0dd1-760a-4fc4-bc51-fe540904252d_3754x1091.png 1272w, https://substackcdn.com/image/fetch/$s_!1k-9!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F65cf0dd1-760a-4fc4-bc51-fe540904252d_3754x1091.png 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" 
xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><h2>Resources</h2><ul><li><p>See the project showcase: <a href="https://moxsf.com/ai4e-hacks">https://moxsf.com/ai4e-hacks</a></p></li><li><p>Watch the recordings: <a href="https://youtu.be/xGAfG0L_82g">project demos</a>, <a href="https://youtu.be/bMW3J3jjFUA">opening speeches</a></p></li><li><p>See the outline of project ideas: <a href="https://docs.google.com/document/d/18_h13T-Yx5dxN7DRP8WXgKk-twObsMgrOAXoM2GLJ1k/edit?tab=t.nip7ri77dom0">link</a></p><ul><li><p>Thanks to Owen Cotton-Barratt, Raymond Douglas, and Ben Goldhaber for preparing this!</p></li></ul></li><li><p>Lukas Finnveden on &#8220;What's important in &#8216;AI for epistemics&#8217;?&#8221;: <a href="https://lukasfinnveden.substack.com/p/whats-important-in-ai-for-epistemics">link</a></p></li><li><p>Automation of Wisdom and Philosophy essay contest: <a href="https://blog.aiimpacts.org/p/winners-of-the-essay-competition">link</a></p></li></ul><h2>Why this hackathon?</h2><p><em>From the opening speeches; lightly 
edited.</em></p><h3><strong>Andreas Stuhlm&#252;ller: Why I'm excited about AI for Epistemics</strong></h3><p>In short: AI for Epistemics is important and tractable.</p><p>Why is it important? If you think about the next few years, things could get pretty chaotic. As everyone rushes to integrate AI systems into every part of the economy, the world could change more rapidly than it does today. There's significant risk that people and organizations will make mistakes for relatively uninteresting reasons&#8212;simply because they didn't have enough time to think things through.</p><p>If we can make it easier for people to think clearly and carefully, that's really important. People will use AI tools to help them make decisions either way; eventually unassisted decision-making just won&#8217;t be competitive anymore. This is a lever: the more these tools actually help people make wise decisions, or help them figure out whether they're right or wrong about something, the better off we'll be.</p><p>AI for Epistemics is also tractable now in a way it wasn't before. We're just reaching the point where models are good enough and cheap enough to apply at scale. You can now realistically say, "Let's analyze all news articles," or "Let's review all scientific papers," or thoroughly check every sentence of a document, at a level of detail that wasn't feasible before.</p><p>Given good ideas for epistemic tools, the implementation cost has dropped dramatically. Building significant products in hackathons has become much easier. You can basically copy and paste your project description into Cursor, type "please continue" five times, and you'll have a working demo.</p><p>The key challenge we'll need to think about today is: how can we tell if we're actually making things better? What evidence can we see that would lead us to believe a tool genuinely improves people's thinking, rather than just being a fun UI with knobs to play with?</p><p>I'm really excited about this hackathon. 
This is the event I've been most excited about for quite a while. I'm very grateful to Austin for creating this space for us.</p><h3><strong>Austin Chen: Why a hackathon?</strong></h3><p>Andreas first talked to me a couple months ago, saying we want to do more for the AI for Epistemics field. We were thinking about some ideas: &#8220;oh, maybe we should do a grants program, or a fellowship program, or something like that&#8221;.</p><p>But I have a special place in my heart for hackathons specifically. So I really sold him hard: we're gonna do a hackathon. We can do all that other stuff too later, but: first things first. <em>(Andreas, wryly: &#8220;I was very hard to sell.&#8221;)</em></p><p>I like hackathons for a lot of reasons:</p><ul><li><p>Hackathons are a sandbox. They're a place where you can play with an idea a little bit. You don't have to worry about whether this thing will be great down the line, or even live past the end of the day. So it gives you a chance to be a bit more creative, try riskier things.</p></li><li><p>It's a blank canvas. You don't have to worry about what your current users will think, or &#8220;will this make money?&#8221;. You can just&#8230; do stuff.</p></li><li><p>Hackathons are a forcing function. There's the demos in eight hours. We're all gonna get up there and present and talk about what we did. So you have to sit there and actually build your idea. You can't just keep spinning your wheels, thinking forever.</p></li><li><p>And it's a chance to meet people. It's a filtering function to find people who care a lot about this particular niche. Right now, AI for Epistemics is a tiny field. All the people who care about it are maybe in this room right now (plus a few others who are remote). But hopefully, it will grow down the line. And this is your chance to meet each other, talk to each other, share your ideas, build stuff out.</p></li></ul><p>Those are some of the reasons I'm excited about hackathons. 
I'm glad that Andreas and the Elicit team are happy to host this with us today.</p><h2>Meet the projects</h2><p><em>We asked the participants to share more about their project after the hackathon ended. Comments are mostly Austin&#8217;s.</em></p><h3>Question Generator, by Gustavo Lacerda</h3><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!fctt!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa759bddd-0f7e-4222-8f30-23fbbef1f07f_2166x1576.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!fctt!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa759bddd-0f7e-4222-8f30-23fbbef1f07f_2166x1576.png 424w, https://substackcdn.com/image/fetch/$s_!fctt!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa759bddd-0f7e-4222-8f30-23fbbef1f07f_2166x1576.png 848w, https://substackcdn.com/image/fetch/$s_!fctt!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa759bddd-0f7e-4222-8f30-23fbbef1f07f_2166x1576.png 1272w, https://substackcdn.com/image/fetch/$s_!fctt!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa759bddd-0f7e-4222-8f30-23fbbef1f07f_2166x1576.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!fctt!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa759bddd-0f7e-4222-8f30-23fbbef1f07f_2166x1576.png" width="510" height="370.9409340659341" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/a759bddd-0f7e-4222-8f30-23fbbef1f07f_2166x1576.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1059,&quot;width&quot;:1456,&quot;resizeWidth&quot;:510,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;Question Generator&quot;,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="Question Generator" title="Question Generator" srcset="https://substackcdn.com/image/fetch/$s_!fctt!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa759bddd-0f7e-4222-8f30-23fbbef1f07f_2166x1576.png 424w, https://substackcdn.com/image/fetch/$s_!fctt!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa759bddd-0f7e-4222-8f30-23fbbef1f07f_2166x1576.png 848w, https://substackcdn.com/image/fetch/$s_!fctt!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa759bddd-0f7e-4222-8f30-23fbbef1f07f_2166x1576.png 1272w, https://substackcdn.com/image/fetch/$s_!fctt!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa759bddd-0f7e-4222-8f30-23fbbef1f07f_2166x1576.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" 
stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p><strong>Demo</strong>: Starts <a href="https://youtu.be/xGAfG0L_82g?si=oPE0_8bfj0s5tYDu&amp;t=378">6:18</a></p><p><strong>Description</strong>: This is a browser extension that generates forecasting questions related to the news page you are visiting.</p><p><strong>Comments:</strong> Good exploration of a promising form factor (chrome extension to make personal flow easier). I like that it ends with &#8220;create a Manifold question&#8221;, as a concrete thing to go next. I&#8217;m not sure if the questions were actually any good? 
But with LLMs, maybe it&#8217;s always a brainstorming aid: the LLM generates and humans filter (as with imagegen).</p><h3><strong>Symphronesis, by Campbell Hutcheson (winner)</strong></h3><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!0Mo_!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcab724c5-fc69-4cb8-9cae-7b16999e579d_1518x1172.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!0Mo_!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcab724c5-fc69-4cb8-9cae-7b16999e579d_1518x1172.png 424w, https://substackcdn.com/image/fetch/$s_!0Mo_!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcab724c5-fc69-4cb8-9cae-7b16999e579d_1518x1172.png 848w, https://substackcdn.com/image/fetch/$s_!0Mo_!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcab724c5-fc69-4cb8-9cae-7b16999e579d_1518x1172.png 1272w, https://substackcdn.com/image/fetch/$s_!0Mo_!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcab724c5-fc69-4cb8-9cae-7b16999e579d_1518x1172.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!0Mo_!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcab724c5-fc69-4cb8-9cae-7b16999e579d_1518x1172.png" width="406" height="313.4230769230769" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/cab724c5-fc69-4cb8-9cae-7b16999e579d_1518x1172.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1124,&quot;width&quot;:1456,&quot;resizeWidth&quot;:406,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;Symphronesis&quot;,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="Symphronesis" title="Symphronesis" srcset="https://substackcdn.com/image/fetch/$s_!0Mo_!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcab724c5-fc69-4cb8-9cae-7b16999e579d_1518x1172.png 424w, https://substackcdn.com/image/fetch/$s_!0Mo_!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcab724c5-fc69-4cb8-9cae-7b16999e579d_1518x1172.png 848w, https://substackcdn.com/image/fetch/$s_!0Mo_!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcab724c5-fc69-4cb8-9cae-7b16999e579d_1518x1172.png 1272w, https://substackcdn.com/image/fetch/$s_!0Mo_!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcab724c5-fc69-4cb8-9cae-7b16999e579d_1518x1172.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" 
xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p><strong>Demo</strong>: Starts <a href="https://youtu.be/xGAfG0L_82g?si=0UhhAfRYRL1BS1I5&amp;t=874">14:34</a></p><p><strong>Description</strong>: Automated comment merging for LessWrong; finds disputes between the comments and the text and then highlights the text with the disputes; color coded and you can mouse over and then jump to the comment.</p><p><strong>Why did you build this?</strong>: I&#8217;m interested in how LLMs will enable highly personalized UI/UX. One of my main contentions is that software became mass produced because the cost of development is very high and so it was prohibitive to create artisanal software solutions for individuals - but that LLMs - because they make software cheaper - give us the opportunity to return to a more artisanal software experience - where our interface to software is created dynamically. 
Moreover, as the cost-to-benefit ratio of software was even worse in design than elsewhere, good design has been essentially limited to software companies that aggressively focus on it as part of their core value prop (e.g. Apple, Notion, Linear). But this can now change, and we can have better, more personalized, richer experiences.</p><p><strong>What are you most proud of for this project?</strong>: It worked. It has nice bells and whistles. It enables me to have more control over a document as an organic thing.</p><p><strong>Source:</strong> <a href="https://github.com/chutcheson/Symphronesis">https://github.com/chutcheson/Symphronesis</a></p><p><strong>Comments:</strong> Interesting &amp; pretty UI, reasonable concept. Lots of audience questions about how it was implemented. Lukas: &#8220;Unfortunate to do this for LessWrong which is the website with the most support for this already&#8221;</p><h3>Manifund Eval, by Ben Rachbach &amp; William Saunders</h3><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!pUg1!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc5997501-fd8b-4737-8d53-e9ea99540313_1135x933.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!pUg1!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc5997501-fd8b-4737-8d53-e9ea99540313_1135x933.png 424w, https://substackcdn.com/image/fetch/$s_!pUg1!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc5997501-fd8b-4737-8d53-e9ea99540313_1135x933.png 848w, 
https://substackcdn.com/image/fetch/$s_!pUg1!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc5997501-fd8b-4737-8d53-e9ea99540313_1135x933.png 1272w, https://substackcdn.com/image/fetch/$s_!pUg1!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc5997501-fd8b-4737-8d53-e9ea99540313_1135x933.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!pUg1!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc5997501-fd8b-4737-8d53-e9ea99540313_1135x933.png" width="412" height="338.6748898678414" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/c5997501-fd8b-4737-8d53-e9ea99540313_1135x933.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:933,&quot;width&quot;:1135,&quot;resizeWidth&quot;:412,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;Manifund Eval&quot;,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="Manifund Eval" title="Manifund Eval" srcset="https://substackcdn.com/image/fetch/$s_!pUg1!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc5997501-fd8b-4737-8d53-e9ea99540313_1135x933.png 424w, https://substackcdn.com/image/fetch/$s_!pUg1!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc5997501-fd8b-4737-8d53-e9ea99540313_1135x933.png 848w, 
https://substackcdn.com/image/fetch/$s_!pUg1!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc5997501-fd8b-4737-8d53-e9ea99540313_1135x933.png 1272w, https://substackcdn.com/image/fetch/$s_!pUg1!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc5997501-fd8b-4737-8d53-e9ea99540313_1135x933.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p><strong>Demo:</strong> Starts <a 
href="https://youtu.be/xGAfG0L_82g?si=SrmPkK4lR31CfQUI&amp;t=98">1:38</a></p><p><strong>Description</strong>: Screens all Manifund projects to identify the ones worth looking into more closely for funding. It also identifies each grant&#8217;s story for how it would help transformative AI go well, so you can review that story and save time in your evaluation. This makes it feasible to quickly sift through the large number of Manifund projects and find promising ones to consider.</p><p><strong>Demo link</strong>: <a href="https://manifundeval-zfxpigvo8jemehaybdwwsw.streamlit.app/">https://manifundeval-zfxpigvo8jemehaybdwwsw.streamlit.app/</a></p><p><strong>Comments</strong>: Of course, soft spot in my heart for using the Manifund API. Pretty important and impactful project (Andreas: &#8220;I actually need this.&#8221;). Not sure the final scores or the reasoning it output were that good, though; they didn&#8217;t seem that great by my lights. Might be biased &#8212; I&#8217;d tried something similar (for giving feedback to potential new projects) and it was only okay. But def worth more experimentation. 
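<p>A minimal version of this screening loop might look like the sketch below. The scoring function is a stand-in for the LLM judge, and the sample data is a stand-in for what you&#8217;d fetch from the Manifund API; none of this is the project&#8217;s actual code.</p>

```python
def screen_projects(projects, score_fn, top_k=3):
    """Rank funding applications by a promise score plus impact story.

    `score_fn` is a placeholder for an LLM call that reads a project's
    description and returns (score, impact_story).
    """
    scored = []
    for p in projects:
        score, story = score_fn(p)
        scored.append({**p, "score": score, "impact_story": story})
    scored.sort(key=lambda p: p["score"], reverse=True)
    return scored[:top_k]

# Toy stand-in for the LLM judge: favors projects that mention evals.
def toy_score(project):
    score = 7 if "eval" in project["description"].lower() else 3
    return score, f"How {project['title']} helps transformative AI go well: ..."

sample = [  # stand-in for projects fetched from the Manifund API
    {"title": "Agent evals", "description": "Builds evals for agents"},
    {"title": "Meetups", "description": "Community meetups"},
]
print(screen_projects(sample, toy_score, top_k=1)[0]["title"])  # Agent evals
```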
I think I might want to issue a bounty to solve this problem for Manifund.</p><h3><strong>Detecting Fraudulent Research, by Panda Smith &amp; Charlie George (winner)</strong></h3><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!1N29!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb66d69c7-f5f5-4728-b499-582eae60e30c_1616x924.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!1N29!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb66d69c7-f5f5-4728-b499-582eae60e30c_1616x924.png 424w, https://substackcdn.com/image/fetch/$s_!1N29!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb66d69c7-f5f5-4728-b499-582eae60e30c_1616x924.png 848w, https://substackcdn.com/image/fetch/$s_!1N29!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb66d69c7-f5f5-4728-b499-582eae60e30c_1616x924.png 1272w, https://substackcdn.com/image/fetch/$s_!1N29!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb66d69c7-f5f5-4728-b499-582eae60e30c_1616x924.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!1N29!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb66d69c7-f5f5-4728-b499-582eae60e30c_1616x924.png" width="562" height="321.52884615384613" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/b66d69c7-f5f5-4728-b499-582eae60e30c_1616x924.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:833,&quot;width&quot;:1456,&quot;resizeWidth&quot;:562,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;Detecting Fraudulent Research&quot;,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="Detecting Fraudulent Research" title="Detecting Fraudulent Research" srcset="https://substackcdn.com/image/fetch/$s_!1N29!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb66d69c7-f5f5-4728-b499-582eae60e30c_1616x924.png 424w, https://substackcdn.com/image/fetch/$s_!1N29!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb66d69c7-f5f5-4728-b499-582eae60e30c_1616x924.png 848w, https://substackcdn.com/image/fetch/$s_!1N29!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb66d69c7-f5f5-4728-b499-582eae60e30c_1616x924.png 1272w, https://substackcdn.com/image/fetch/$s_!1N29!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb66d69c7-f5f5-4728-b499-582eae60e30c_1616x924.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" 
stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p><strong>Demo</strong>: Starts <a href="https://youtu.be/xGAfG0L_82g?si=tL5AITlwGDceJ_5i&amp;t=1259">20:59</a></p><p><strong>Description</strong>: There&#8217;s a lot of research. A lot of it seems bad. How much? We use language models to try to detect retraction-worthy errors in published literature. We reason purely from first principles, without using meta-textual information.</p><p><strong>Why did you build this?</strong> Panda: At Elicit, I spend a lot of time thinking about people&#8217;s info sources. I&#8217;ve also read metascience blogs for a long time. I assumed there would be some fraud/bad papers that modern reasoning models could catch pretty easily. 
(I didn&#8217;t think there&#8217;d be so much!)</p><p><strong>What are you most proud of for this project?</strong> Panda: Very happy with doing a mix of &#8220;research&#8221; (running the numbers on how effective our technique was) and prototyping, making something people can get their hands on.</p><p><strong>Source</strong>: <a href="https://github.com/CG80499/paper-retraction-detection">https://github.com/CG80499/paper-retraction-detection</a></p><p><strong>Demo link:</strong> <a href="https://papercop.vercel.app/">https://papercop.vercel.app/</a></p><p><strong>Comments</strong>: Had the most &#8220;wow this is fun to play with&#8221; factor, also &#8220;I can see this going viral&#8221;. I particularly liked that they had some semblance of evals (taking 100 papers and running it through), rather than just one or two demo cases; with LLM stuff it&#8217;s easy to focus on one or two happy cases, and I&#8217;m glad they didn&#8217;t.</p><h3>Artificial Collective Intelligence, by Evan Hadfield</h3><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!lmzw!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F13b388f2-4796-4b66-8d92-9e9a0ed12ad5_1232x1580.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!lmzw!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F13b388f2-4796-4b66-8d92-9e9a0ed12ad5_1232x1580.png 424w, https://substackcdn.com/image/fetch/$s_!lmzw!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F13b388f2-4796-4b66-8d92-9e9a0ed12ad5_1232x1580.png 848w, 
https://substackcdn.com/image/fetch/$s_!lmzw!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F13b388f2-4796-4b66-8d92-9e9a0ed12ad5_1232x1580.png 1272w, https://substackcdn.com/image/fetch/$s_!lmzw!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F13b388f2-4796-4b66-8d92-9e9a0ed12ad5_1232x1580.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!lmzw!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F13b388f2-4796-4b66-8d92-9e9a0ed12ad5_1232x1580.png" width="414" height="530.9415584415584" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/13b388f2-4796-4b66-8d92-9e9a0ed12ad5_1232x1580.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1580,&quot;width&quot;:1232,&quot;resizeWidth&quot;:414,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;Artificial Collective Intelligence&quot;,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="Artificial Collective Intelligence" title="Artificial Collective Intelligence" srcset="https://substackcdn.com/image/fetch/$s_!lmzw!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F13b388f2-4796-4b66-8d92-9e9a0ed12ad5_1232x1580.png 424w, https://substackcdn.com/image/fetch/$s_!lmzw!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F13b388f2-4796-4b66-8d92-9e9a0ed12ad5_1232x1580.png 848w, 
https://substackcdn.com/image/fetch/$s_!lmzw!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F13b388f2-4796-4b66-8d92-9e9a0ed12ad5_1232x1580.png 1272w, https://substackcdn.com/image/fetch/$s_!lmzw!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F13b388f2-4796-4b66-8d92-9e9a0ed12ad5_1232x1580.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p><strong>Demo</strong>: Starts <a 
href="https://youtu.be/xGAfG0L_82g?si=7-I32ei7N_OznSRK&amp;t=1753">29:13</a></p><p><strong>Description</strong>: ACI is a consensus-finding tool in the style of Polis / Community Notes, simulating a diverse range of perspectives. LLMs play the role of extra participants, submitting suggestions and voting on entries.</p><p><strong>Demo link</strong>: <a href="https://aci-demos.vercel.app/">https://aci-demos.vercel.app/</a></p><p><strong>Comment</strong>: Most ambitious IMO &#8212; an entire platform of sims. With more time to develop this, I could see this as my favorite entry. Unfortunately, lost points for not having a live working demo :(</p><h3>Thought Logger and Cyborg Extension, by Raymond Arnold</h3><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!f-iR!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F977f1b20-4415-4b98-9a86-15f4630841ef_1632x1290.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!f-iR!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F977f1b20-4415-4b98-9a86-15f4630841ef_1632x1290.png 424w, https://substackcdn.com/image/fetch/$s_!f-iR!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F977f1b20-4415-4b98-9a86-15f4630841ef_1632x1290.png 848w, https://substackcdn.com/image/fetch/$s_!f-iR!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F977f1b20-4415-4b98-9a86-15f4630841ef_1632x1290.png 1272w, 
https://substackcdn.com/image/fetch/$s_!f-iR!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F977f1b20-4415-4b98-9a86-15f4630841ef_1632x1290.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!f-iR!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F977f1b20-4415-4b98-9a86-15f4630841ef_1632x1290.png" width="446" height="352.5728021978022" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/977f1b20-4415-4b98-9a86-15f4630841ef_1632x1290.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1151,&quot;width&quot;:1456,&quot;resizeWidth&quot;:446,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;Thought Logger and Cyborg Extension&quot;,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="Thought Logger and Cyborg Extension" title="Thought Logger and Cyborg Extension" srcset="https://substackcdn.com/image/fetch/$s_!f-iR!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F977f1b20-4415-4b98-9a86-15f4630841ef_1632x1290.png 424w, https://substackcdn.com/image/fetch/$s_!f-iR!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F977f1b20-4415-4b98-9a86-15f4630841ef_1632x1290.png 848w, https://substackcdn.com/image/fetch/$s_!f-iR!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F977f1b20-4415-4b98-9a86-15f4630841ef_1632x1290.png 1272w, 
https://substackcdn.com/image/fetch/$s_!f-iR!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F977f1b20-4415-4b98-9a86-15f4630841ef_1632x1290.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p><strong>Demo</strong>: Starts <a href="https://youtu.be/xGAfG0L_82g?si=8EvhLZZiy_GM23IK&amp;t=2126">35:26</a></p><p><strong>Description</strong>: I have a pair of products: &#8211; a keylogger, which tracks all your keystrokes (except from apps you put on a blocklist), and exposes it on a local server &#8211; and a &#8220;prompt library&#8221; chrome 
extension, which lets me store fairly complicated prompts and quickly run them, while pulling a website or the keylogger logs into context.</p><p>For demo day, I worked on a &#8220;useful personal predictions&#8221; prompt for the prompt library, which takes in my keylogs from the past 2 days, extrapolates what projects I seem to be working on, and generates prediction-statements about my project that help guide my strategy (e.g. &#8220;I&#8217;ll get at least 3 positive reports from users about my product helping them, spontaneously, in the next 2 months.&#8221;). When I see ones I like, I enter them into Fatebook.</p><p><strong>Why did you build this?</strong> The general idea of the keylogger + prompt library is to set me up to leverage AI in all kinds of customized ways over the next couple years. I want to be an AI power user, and to have an easy affordance to invent new workflows that leverage AI in a repeatable way.</p><p>I think &#8220;decision-relevant predictions&#8221; is a good tool to help you get calibrated on whether your current plans are on track to succeed. But operationalizing them is kind of annoying.</p><p><strong>Source:</strong> The tools aren&#8217;t public yet, but message me at <a href="mailto:raemon777@gmail.com">raemon777@gmail.com</a> if you&#8217;d like to try them out.</p><p><strong>Comments</strong>: Interesting set of work; I like the keylogger idea and the picture of &#8220;record everything and have LLMs sort it out&#8221;. 
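<p>The tools aren&#8217;t public, so the sketch below is guesswork rather than Raymond&#8217;s actual code: a minimal picture of a keystroke log with an app blocklist, whose contents a prompt-library extension could pull into an LLM prompt.</p>

```python
from dataclasses import dataclass, field

@dataclass
class ThoughtLogger:
    """Hypothetical mini version of the keylogger described above.

    A real version would hook OS keyboard events and expose the log
    on a local HTTP server; here we just keep entries in memory.
    """
    blocklist: set = field(default_factory=set)
    entries: list = field(default_factory=list)

    def record(self, app: str, text: str) -> None:
        # Drop keystrokes from blocklisted apps entirely.
        if app not in self.blocklist:
            self.entries.append({"app": app, "text": text})

    def recent_context(self) -> str:
        # What a prompt-library extension would pull into its prompt.
        return "\n".join(e["text"] for e in self.entries)

log = ThoughtLogger(blocklist={"1Password"})
log.record("VSCode", "def plan(): ...")
log.record("1Password", "hunter2")  # never stored
print(log.recent_context())  # def plan(): ...
```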
In practice had the flavor of people optimizing their personal setup a bit too much, and being hard to scale out (see also: complex Obsidian thought mapping, or spaced repetition)</p><h3>Double-cruxes in the New York Times&#8217; &#8220;The Conversation&#8221;, by Tilman Bayer</h3><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!Dbmy!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F76099c7c-d5a4-448d-866f-539dc5202b02_2256x1504.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!Dbmy!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F76099c7c-d5a4-448d-866f-539dc5202b02_2256x1504.png 424w, https://substackcdn.com/image/fetch/$s_!Dbmy!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F76099c7c-d5a4-448d-866f-539dc5202b02_2256x1504.png 848w, https://substackcdn.com/image/fetch/$s_!Dbmy!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F76099c7c-d5a4-448d-866f-539dc5202b02_2256x1504.png 1272w, https://substackcdn.com/image/fetch/$s_!Dbmy!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F76099c7c-d5a4-448d-866f-539dc5202b02_2256x1504.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!Dbmy!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F76099c7c-d5a4-448d-866f-539dc5202b02_2256x1504.png" width="452" height="301.4368131868132" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/76099c7c-d5a4-448d-866f-539dc5202b02_2256x1504.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:971,&quot;width&quot;:1456,&quot;resizeWidth&quot;:452,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;Double-cruxes  in the New York Times&#8217;  &#8220;The Conversation&#8221;\n&quot;,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="Double-cruxes  in the New York Times&#8217;  &#8220;The Conversation&#8221;
" title="Double-cruxes  in the New York Times&#8217;  &#8220;The Conversation&#8221;
" srcset="https://substackcdn.com/image/fetch/$s_!Dbmy!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F76099c7c-d5a4-448d-866f-539dc5202b02_2256x1504.png 424w, https://substackcdn.com/image/fetch/$s_!Dbmy!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F76099c7c-d5a4-448d-866f-539dc5202b02_2256x1504.png 848w, https://substackcdn.com/image/fetch/$s_!Dbmy!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F76099c7c-d5a4-448d-866f-539dc5202b02_2256x1504.png 1272w, https://substackcdn.com/image/fetch/$s_!Dbmy!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F76099c7c-d5a4-448d-866f-539dc5202b02_2256x1504.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" 
stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p><strong>Demo</strong>: Starts <a href="https://youtu.be/xGAfG0L_82g?si=SChyf5O_GG-mxJb8&amp;t=2477">41:17</a></p><p><strong>Description</strong>: &#8220;The Conversation&#8221; is a weekly political debate format in the New York Times&#8217; &#8220;Opinion&#8221; section between conservative(ish) journalist Bret Stephens and liberal(ish) journalist Gail Collins, ongoing since 2014. I used Gemini 2.0 Flash Thinking to identify double-cruxes in each debate, with the aim of tracking both participants&#8217; shifts over time.</p><p><strong>Why did you build this?</strong>: Double-cruxes are a somewhat intricate epistemic concept that so far doesn&#8217;t seem to have made it very far beyond the LessWrong sphere. I wanted to explore whether one could use current LLMs to apply it at scale to a (non-cherrypicked) corpus of political debates aimed at a general audience.</p><p><strong>What are you most proud of for this project?</strong>: After some experimentation, I found a prompt+model combination that holds up quite well in vibe tests so far.</p><p><strong>Source</strong>: <a href="https://docs.google.com/presentation/d/1Igs6T-elz61xysRCMQmZVX8ysfVX4NgTItwXPnrwrak/edit">Presentation slides</a> from the hackathon</p><p><strong>Comments</strong>: Unclear to me whether double-cruxing is important epistemic tech, esp whether it has broad reach. 
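<p>For illustration only: one simple way to get machine-readable double-cruxes out of a model is to ask it for a fixed line format and parse that. The &#8220;CRUX:&#8221; format below is invented for this sketch; it is not the project&#8217;s actual prompt or schema.</p>

```python
def parse_double_cruxes(model_output: str):
    """Parse lines like
         CRUX: <claim> | A: <stance> | B: <stance>
    (an assumed output format) into structured records."""
    cruxes = []
    for line in model_output.splitlines():
        if not line.startswith("CRUX:"):
            continue  # ignore the model's surrounding chatter
        claim, a, b = (part.split(":", 1)[1].strip()
                       for part in line.split("|"))
        cruxes.append({"claim": claim, "stephens": a, "collins": b})
    return cruxes

reply = ("Here are the cruxes I found:\n"
         "CRUX: Tariffs will raise consumer prices | A: disagree | B: agree")
print(parse_double_cruxes(reply))
```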
Didn&#8217;t really have a working demo, sadly.</p><h3>Trying to make GPT 4.5 Non-sycophantic (via a better system prompt), by Oliver Habryka</h3><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!iYLM!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F271bec15-8407-4ed9-86de-995889f95899_1774x1458.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!iYLM!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F271bec15-8407-4ed9-86de-995889f95899_1774x1458.png 424w, https://substackcdn.com/image/fetch/$s_!iYLM!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F271bec15-8407-4ed9-86de-995889f95899_1774x1458.png 848w, https://substackcdn.com/image/fetch/$s_!iYLM!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F271bec15-8407-4ed9-86de-995889f95899_1774x1458.png 1272w, https://substackcdn.com/image/fetch/$s_!iYLM!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F271bec15-8407-4ed9-86de-995889f95899_1774x1458.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!iYLM!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F271bec15-8407-4ed9-86de-995889f95899_1774x1458.png" width="494" height="406.125" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/271bec15-8407-4ed9-86de-995889f95899_1774x1458.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1197,&quot;width&quot;:1456,&quot;resizeWidth&quot;:494,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;Trying to make GPT 4.5 Non-sycophantic (via a better system prompt)&quot;,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="Trying to make GPT 4.5 Non-sycophantic (via a better system prompt)" title="Trying to make GPT 4.5 Non-sycophantic (via a better system prompt)" srcset="https://substackcdn.com/image/fetch/$s_!iYLM!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F271bec15-8407-4ed9-86de-995889f95899_1774x1458.png 424w, https://substackcdn.com/image/fetch/$s_!iYLM!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F271bec15-8407-4ed9-86de-995889f95899_1774x1458.png 848w, https://substackcdn.com/image/fetch/$s_!iYLM!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F271bec15-8407-4ed9-86de-995889f95899_1774x1458.png 1272w, https://substackcdn.com/image/fetch/$s_!iYLM!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F271bec15-8407-4ed9-86de-995889f95899_1774x1458.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg 
role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p><strong>Demo</strong>: Starts <a href="https://youtu.be/xGAfG0L_82g?si=f1JhRTv310q31wye&amp;t=2821">47:01</a></p><p><strong>Description</strong>: I tried to make a system prompt for GPT 4.5 that actually pushes back on things I say and I can argue with in productive ways. It isn&#8217;t perfect, but honestly a bunch better than other experiences I&#8217;ve had arguing with LLMs.</p><p><strong>Prompt</strong>: <a href="https://gist.githubusercontent.com/akrolsmir/431fa35e2021db0fa1c1e6f3efc2cf62/raw/440a241ce18348823f3e47c9b06213c6c188cfb8/prompt.txt">link</a></p><p><strong>Comments</strong>: Many points for directly trying something out of the Owen&#8217;s spec. 
And for having the bravery to do a &#8220;non-technical&#8221; hack &#8212; as LLMs do more of the technical work, what&#8217;s left for humans is prompting well, imo. And for something that is immediately usable!</p><h3><strong>Squaretable, by David Nachman (winner)</strong></h3><div class="native-video-embed" data-component-name="VideoPlaceholder" data-attrs="{&quot;mediaUploadId&quot;:&quot;4a72e8d0-484f-4d65-98eb-70a0b817a443&quot;,&quot;duration&quot;:null}"></div><p><strong>Demo</strong>: Starts <a href="https://youtu.be/xGAfG0L_82g?si=TDVgAxAFeSX0sf59&amp;t=3327">55:27</a></p><p><strong>Description</strong>: To assist a user in decision-making, the app uses LLMs to help the user come up with weighted factors, possible options, and factor values for each option. The UI consists of an always displayed table of the factors, options, weights, and values. The final score for each option is computed symbolically as a weighted sum based on the values and weights.</p><p><strong>Comments</strong>: Great UI, information is pretty well laid out and yet compact, love the colors. Unfortunate that David didn&#8217;t seem to think that the LLM&#8217;s results were that good. Andreas: &#8220;maybe better UX if you add columns incrementally, easier to spot check&#8221;. Makes sense, kind of like git diffs or what Cursor does in chat mode.</p><h2>What went well</h2><ul><li><p>Hacks were pretty cool! Especially given that they all represented ~8 hours of work</p><ul><li><p>Many are minimal versions of products I really want to exist, and play with more</p></li><li><p>Almost all of them felt like a worthwhile showcase, exploration of something interesting, and relevant to this field</p></li></ul></li><li><p>Lots of great people came for this! 
Very hard to think of more central folks for AI for Epistemics:</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!1k-9!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F65cf0dd1-760a-4fc4-bc51-fe540904252d_3754x1091.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!1k-9!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F65cf0dd1-760a-4fc4-bc51-fe540904252d_3754x1091.png 424w, https://substackcdn.com/image/fetch/$s_!1k-9!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F65cf0dd1-760a-4fc4-bc51-fe540904252d_3754x1091.png 848w, https://substackcdn.com/image/fetch/$s_!1k-9!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F65cf0dd1-760a-4fc4-bc51-fe540904252d_3754x1091.png 1272w, https://substackcdn.com/image/fetch/$s_!1k-9!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F65cf0dd1-760a-4fc4-bc51-fe540904252d_3754x1091.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!1k-9!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F65cf0dd1-760a-4fc4-bc51-fe540904252d_3754x1091.png" width="1456" height="423" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/65cf0dd1-760a-4fc4-bc51-fe540904252d_3754x1091.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:423,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!1k-9!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F65cf0dd1-760a-4fc4-bc51-fe540904252d_3754x1091.png 424w, https://substackcdn.com/image/fetch/$s_!1k-9!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F65cf0dd1-760a-4fc4-bc51-fe540904252d_3754x1091.png 848w, https://substackcdn.com/image/fetch/$s_!1k-9!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F65cf0dd1-760a-4fc4-bc51-fe540904252d_3754x1091.png 1272w, https://substackcdn.com/image/fetch/$s_!1k-9!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F65cf0dd1-760a-4fc4-bc51-fe540904252d_3754x1091.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" 
xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">our lovely faces, once more</figcaption></figure></div></li><li><p><em>From left to right: Rafe Kennedy, Oli Habryka, Evan Hadfield, Kirill Chesnov, Owain Evans, Charlie George, Panda Smith, Gustavo Lacerda, Andreas Stuhlm&#252;ller, Austin Chen, David Nachbach (virtual), Lukas Finnveden, Tamera Lanham, Noa Nabeshima, Campbell Hutcheson, Keri Warr, Xyra Sinclair, Tilman Bayer, Raymon Arnold, Chris Lakin, Ozzie Gooen</em></p><p><em>Not pictured participants and viewers: William Saunders, Ben Goldhaber, Deger Turan, Vishal Maini, Ross Rheingans-Yoo, Ethan Alley, Dan Selsam, Stephen Grugett, David Chee, Saul Munn, Gavriel Kleinwaks and many others&#8230;</em></p><ul><li><p>Some of the goal of the hackathon, as with any event, is just to bring people together and have them stay in contact</p></li></ul></li><li><p>Good conversations at the beginning while participants were ideating, and throughout. 
Both on AI for Epistemics and other topics.</p></li><li><p>Overall event felt smooth and cohesive, especially for being pulled together on not that much organizer time</p><ul><li><p>Pretty happy with the continuing artifacts that we produced out of this hackathon (the showcase page, the video recordings, this writeup)</p></li><li><p>Somewhat more effortful to do all this compared to typical hackathons Austin has run, but hopefully worthwhile when trying to incubate a new field</p></li></ul></li><li><p><a href="http://moxsf.com/">Mox</a> seemed to be a good venue for this event! This was just our third week of operating, but I think our venue supported the hackathon well.</p><ul><li><p>One participant remarked:</p></li></ul></li></ul><blockquote><p>Something I like about your office is that it seems to naturally create the Cal Newport Deep Work architecture, where the further in you go the more deepworky it is</p></blockquote><h2>What could have gone better</h2><ul><li><p>Fewer submitted hacks than we&#8217;d hoped for</p><ul><li><p>Had ~40 people around but only ~10 submissions. 
Ideally more like 15-20?</p></li><li><p>Maybe we should have promoted this event harder, or cast a wider net?</p><ul><li><p>There&#8217;s a tradeoff on average participant quality vs number of submissions.</p></li><li><p>But maybe projects are hit-based, so having the best projects matters more than having a high average quality</p></li></ul></li><li><p>Maybe try to get higher commitment from folks, if we run this again</p></li></ul></li><li><p>We&#8217;d hoped to discover more people from outside our current networks who are excited about AI for Epistemics</p><ul><li><p>2 of the 3 prizes went to teams from Elicit</p><ul><li><p>(Which says something about how great the Elicit team is, in case anyone out there is thinking about finding a new job&#8230;)</p></li></ul></li></ul></li><li><p>Unclear path to deployment for these projects, or continuing impact</p><ul><li><p>Admittedly, this is a standard problem with a hackathon form factor, especially when the hackathon isn&#8217;t housed within an org/for specific product features</p></li></ul></li><li><p>Not sure we made great use of the ideas doc &amp; categories that Owen/Raymond/Ben compiled?</p><ul><li><p>But hopefully, their work will help set the stage for &#8220;this is what AI for Epistemics is about&#8221;</p></li><li><p>Perhaps having such specified categories was too confusing for participants</p><ul><li><p>One participant asked, &#8220;do I have to do something in these categories?&#8221; (A: no, but it&#8217;s bad that this wasn&#8217;t clear)</p></li></ul></li></ul></li><li><p>As judges: hard to give great judgements and feedback in a short amount of time, by just looking at demos and asking questions</p><ul><li><p>The format of &#8220;demos in front of an audience&#8221; does bias towards presentation ability and flashiness, over usability of a core product</p></li><li><p>Might change up the structure for next time</p><ul><li><p>More time for judges and audience to play with hackathon demos?</p></li><li><p>Open up 
voting to the public, so it&#8217;s more democratized?</p></li></ul></li></ul></li></ul><h2>Final notes</h2><p>Overall, we&#8217;re very happy with how this hackathon turned out. Building a new field from scratch is a difficult, high-dimensional problem, and this is just one step along the way; but I think we made meaningful progress, with the ideas we brainstormed, the hacks we demoed, and the people we gathered.</p><p>After the hackathon, a few of the judges and participants continued to discuss: &#8220;What&#8217;s next for AI for Epistemics? How does one build a nascent field? Is &#8216;AI for Epistemics&#8217; even a good name?&#8221; We&#8217;ll try to share more on this in the coming days; until then, if AI for Epistemics excites you, leave a comment or reach out to us!</p>]]></content:encoded></item><item><title><![CDATA[Come to minifest, our cozy one-day unfestival]]></title><description><![CDATA[Saturday, Dec 14 at Lighthaven, Berkeley]]></description><link>https://manifund.substack.com/p/come-to-minifest-our-cozy-one-day</link><guid isPermaLink="false">https://manifund.substack.com/p/come-to-minifest-our-cozy-one-day</guid><dc:creator><![CDATA[Manifund]]></dc:creator><pubDate>Mon, 09 Dec 2024 21:24:51 GMT</pubDate><enclosure url="https://substack-post-media.s3.amazonaws.com/public/images/81942975-9fab-41a1-bace-ae5f172215e4_2536x1692.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>Hey! As 2024 comes to an end, I wanted to thank all y&#8217;all who have used Manifund over this last year. Whether you are a donor, a project creator, or just a commenter, our little charity only works because you chose to invest your time, hard-earned money &#8212; and above all your trust. I&#8217;d love to show some of my appreciation, but how?</p><p>Today, Manifund is mostly a remote endeavor; we interact through website comments, emails, and Discord. So one thing I&#8217;ve always wanted to do is to bring y&#8217;all together and meet face-to-face. 
I&#8217;ve really enjoyed doing this with the Manifold community at our premier festival, <a href="https://www.manifest.is/">Manifest</a>; but that takes months of work to organize, and Manifund&#8217;s not yet at that scale. What would a more minimal, 80/20 festival look like?</p><p>And so: minifest! We've booked out Lighthaven for a single day: this coming Saturday, Dec 14. We'll have unconference-style sessions, a smattering of events like charity poker, and ample time to hang out and chat. We're running this on a shoestring budget, so that everyone can come &#8212; e.g., expect a home-cooked dinner instead of catering. It's a cozy, lowkey, experimental event; I hope to see you there!</p><p>Get a ticket here: <a href="https://minifest.is">https://minifest.is</a></p><p>&#8212; Austin</p><p><em>In other news: we&#8217;re in talks to start a 100-person office &amp; coworking space in San Francisco next year &#8212; <a href="https://manifold.markets/Austin/will-i-start-a-coworking-or-event-s">markets say 70% likely</a>. If you&#8217;d like to get involved as a funder, tenant, or organizer, let me know (austin@manifund.org)~</em></p>]]></content:encoded></item><item><title><![CDATA[5 homegrown EA projects, seeking small donors]]></title><description><![CDATA[plus updates on what Manifund has been up to]]></description><link>https://manifund.substack.com/p/5-homegrown-ea-projects-seeking-small</link><guid isPermaLink="false">https://manifund.substack.com/p/5-homegrown-ea-projects-seeking-small</guid><dc:creator><![CDATA[Manifund]]></dc:creator><pubDate>Mon, 28 Oct 2024 23:21:35 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!GRU9!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5a905edd-a223-4d3f-9fe6-4bef28814d37_1600x898.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>What do I mean by &#8220;homegrown&#8221;? 
These projects are:</p><ul><li><p><strong>Local</strong>: Creators have a good track record in the EA or AI Safety community</p></li><li><p><strong>Modest</strong>: The amount requested is not large; $5k would be meaningful</p></li><li><p><strong>Overlooked</strong>: Not already backed by large institutional funders like OpenPhil</p></li></ul><p>If you&#8217;re a small donor or earn to give, consider giving to projects like these:</p><h3><strong><a href="https://manifund.org/projects/finishing-the-sb-1047-documentary-in-6-weeks">1. Feature-length documentary on SB 1047</a></strong></h3><p>By Michael Trazzi &#8212; $16k raised of $55k</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!GRU9!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5a905edd-a223-4d3f-9fe6-4bef28814d37_1600x898.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!GRU9!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5a905edd-a223-4d3f-9fe6-4bef28814d37_1600x898.png 424w, https://substackcdn.com/image/fetch/$s_!GRU9!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5a905edd-a223-4d3f-9fe6-4bef28814d37_1600x898.png 848w, https://substackcdn.com/image/fetch/$s_!GRU9!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5a905edd-a223-4d3f-9fe6-4bef28814d37_1600x898.png 1272w, https://substackcdn.com/image/fetch/$s_!GRU9!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5a905edd-a223-4d3f-9fe6-4bef28814d37_1600x898.png 1456w" 
sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!GRU9!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5a905edd-a223-4d3f-9fe6-4bef28814d37_1600x898.png" width="1456" height="817" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/5a905edd-a223-4d3f-9fe6-4bef28814d37_1600x898.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:817,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!GRU9!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5a905edd-a223-4d3f-9fe6-4bef28814d37_1600x898.png 424w, https://substackcdn.com/image/fetch/$s_!GRU9!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5a905edd-a223-4d3f-9fe6-4bef28814d37_1600x898.png 848w, https://substackcdn.com/image/fetch/$s_!GRU9!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5a905edd-a223-4d3f-9fe6-4bef28814d37_1600x898.png 1272w, https://substackcdn.com/image/fetch/$s_!GRU9!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5a905edd-a223-4d3f-9fe6-4bef28814d37_1600x898.png 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft 
pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>Michael has already recorded interviews with the main characters of SB 1047: sponsors like Scott Wiener and Dan Hendrycks, proponents like Zvi Mowshowitz and Holly Elmore, and opponents like Dean Ball and Timothy B Lee. Now he needs the funding to turn it into a 1-hour feature documentary. This is a rare chance to sponsor a high-quality video narrative, and share it beyond our existing ecosystem. 
I&#8217;ve personally donated $10k towards this, and expect Michael to be able to effectively use much more.</p><p><em>More info &amp; donate here: <a href="https://manifund.org/projects/finishing-the-sb-1047-documentary-in-6-weeks">https://manifund.org/projects/finishing-the-sb-1047-documentary-in-6-weeks</a></em></p><h3><strong><a href="https://manifund.org/projects/shallow-review-of-ai-safety-2024">2. Overview of AI Safety in 2024</a></strong></h3><p>By Gavin Leech &#8212; $8k of $17.6k raised</p><p>Gavin Leech is a forecaster, researcher and founder of <a href="https://arbresearch.com/">Arb</a>; he&#8217;s proposing to re-run a 2023 survey of AI Safety. The landscape shifts pretty quickly, so I&#8217;d love to see what&#8217;s changed since last year.</p><p><em>As I was writing this, regrantor Neel Nanda funded it to the minimum $8k ask! Neel adds:</em></p><blockquote><p>I think collections like this add significant value to newcomers to the field, mostly by being a list of all areas worth maybe thinking about, and key links (rather than eg by providing a lot of takes on which areas are more or less important, unless the author has excellent taste). Gavin has convinced me that the previous post gets enough traffic for it to be valuable to be kept up to date.</p></blockquote><p><em>More info &amp; donate here: <a href="https://manifund.org/projects/shallow-review-of-ai-safety-2024">https://manifund.org/projects/shallow-review-of-ai-safety-2024</a></em></p><h3><strong><a href="https://manifund.org/projects/elizabeth-and-timothy-podcast-on-values-in-effective-altruism">3. Podcast series on Effective Altruism&#8217;s values</a></strong></h3><p>By Elizabeth Van Nostrand &#8212; $1.3k raised of $2.6k</p><p>Elizabeth &amp; Timothy&#8217;s initial podcast was very well received, drawing extensive, thoughtful comments <a href="https://www.lesswrong.com/posts/u9a8RFtsxXwKaxWAa/why-i-quit-effective-altruism-and-why-timothy-telleen-lawton">from a variety of folks</a>. 
I&#8217;d be excited to see them continue this series, especially if they bring in folks involved with steering the EA community (like Sarah Cheng, who has <a href="https://www.lesswrong.com/posts/u9a8RFtsxXwKaxWAa/why-i-quit-effective-altruism-and-why-timothy-telleen-lawton#N4YTvb7jR6Sxhg2vt">extensively engaged</a> with their points).</p><p><em>More info &amp; donate here: <a href="https://manifund.org/projects/elizabeth-and-timothy-podcast-on-values-in-effective-altruism">https://manifund.org/projects/elizabeth-and-timothy-podcast-on-values-in-effective-altruism</a></em></p><h3><strong><a href="https://manifund.org/projects/fund-sentinel-for-q4-2024">4. Sentinel, a foresight and emergency response team</a></strong></h3><p>By Nuno Sempere &#8212; $16k raised of $90k</p><p>Nuno has long been one of our community&#8217;s most outspoken forecasters; now he&#8217;s working with Rai Sur to spin up an emergency response team (think: Army Reserve Corps, but for responding to existential risks). They&#8217;re already putting out a <a href="https://sentinel-team.org/#latest">useful weekly report</a> on biosecurity, geopolitics and other such topics.</p><p><em>More info &amp; donate here: <a href="https://manifund.org/projects/fund-sentinel-for-q4-2024">https://manifund.org/projects/fund-sentinel-for-q4-2024</a></em></p><h3><strong><a href="https://manifund.org/projects/salaries-for-sae-co-occurrence-project">5. Research on co-occurrence of sparse autoencoder latents</a></strong></h3><p>By Matthew A. Clarke &#8212; $0 raised of $6.4k</p><p>TBH, I don&#8217;t know much about the merits for or against this line of research; I&#8217;m highlighting this grant because it&#8217;s overseen by <a href="https://manifund.org/projects/independent-researcher">Joseph Bloom, a past Manifund grantee</a> who I and others have been very impressed with. 
If mechanistic interpretability is your jam, check this one out!</p><p><em>More info &amp; donate here: <a href="https://manifund.org/projects/salaries-for-sae-co-occurrence-project">https://manifund.org/projects/salaries-for-sae-co-occurrence-project</a></em></p><h3>What else have we been up to?</h3><p>It&#8217;s been a quiet couple of months, but here&#8217;s what Rachel and I have been busy with:</p><ul><li><p><strong>Organizing <a href="https://www.notion.so/5-homegrown-EA-projects-seeking-small-donors-12d54492ea7a80ee9028c07b8124da41?pvs=21">The Curve</a>, a premier conference on transformative AI</strong></p></li><li><p>Wrapping up <a href="https://manifund.org/causes/ea-community-choice">EA Community Choice</a> and paying out grants (writeup to come, someday&#8230;)</p><ul><li><p>Did you see that <em>Marc Andreessen</em> donated $32k to <a href="https://manifund.org/projects/act-i-exploring-emergent-behavior-from-multi-ai-multi-human-interaction">one of the projects</a> &#128558;</p></li></ul></li><li><p>Filing our <s>taxes</s> Form 990 for 2023. 
Last year, we raised ~$3m and disbursed ~$2.6m!</p></li><li><p>Waiting on Survival and Flourishing Funds to get back to us on <a href="https://www.notion.so/3a614aa8fd8e4a459aa19735a8ed7adc?pvs=21">our funding request</a></p></li><li><p>Hacking on a system to <a href="http://yield.sh/">spin up web apps</a> from a single LLM prompt</p></li><li><p>Dreaming about a new coworking space like <a href="https://www.notion.so/Constellation-of-SF-acf2301d7b664a80af0fc04e11162eb5?pvs=21">Constellation or Lighthaven, but in SF</a></p></li><li><p>And of course, taking some time off to care for our newborn baby, Ada</p></li></ul><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!BxkG!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9786353a-7c76-4fc3-987d-0610a1363a3b_744x992.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!BxkG!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9786353a-7c76-4fc3-987d-0610a1363a3b_744x992.png 424w, https://substackcdn.com/image/fetch/$s_!BxkG!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9786353a-7c76-4fc3-987d-0610a1363a3b_744x992.png 848w, https://substackcdn.com/image/fetch/$s_!BxkG!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9786353a-7c76-4fc3-987d-0610a1363a3b_744x992.png 1272w, https://substackcdn.com/image/fetch/$s_!BxkG!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9786353a-7c76-4fc3-987d-0610a1363a3b_744x992.png 1456w" sizes="100vw"><img 
src="https://substackcdn.com/image/fetch/$s_!BxkG!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9786353a-7c76-4fc3-987d-0610a1363a3b_744x992.png" width="440" height="586.6666666666666" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/9786353a-7c76-4fc3-987d-0610a1363a3b_744x992.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:992,&quot;width&quot;:744,&quot;resizeWidth&quot;:440,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!BxkG!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9786353a-7c76-4fc3-987d-0610a1363a3b_744x992.png 424w, https://substackcdn.com/image/fetch/$s_!BxkG!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9786353a-7c76-4fc3-987d-0610a1363a3b_744x992.png 848w, https://substackcdn.com/image/fetch/$s_!BxkG!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9786353a-7c76-4fc3-987d-0610a1363a3b_744x992.png 1272w, https://substackcdn.com/image/fetch/$s_!BxkG!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9786353a-7c76-4fc3-987d-0610a1363a3b_744x992.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft 
icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>Cheers,</p><p>Austin</p><p></p><p><em>PS: want to make a larger grant? 
Manifund can facilitate donations via <a href="https://www.notion.so/02aee92e884a47e49efd4d93242e2080?pvs=21">donor-advised funds, crypto, and bank transfers</a>.</em></p>]]></content:encoded></item><item><title><![CDATA[Claim your funds now for EA Community Choice!]]></title><description><![CDATA[$100k airdrop, while supplies last~]]></description><link>https://manifund.substack.com/p/claim-your-funds-now-for-ea-community</link><guid isPermaLink="false">https://manifund.substack.com/p/claim-your-funds-now-for-ea-community</guid><dc:creator><![CDATA[Manifund]]></dc:creator><pubDate>Wed, 21 Aug 2024 17:28:40 GMT</pubDate><enclosure url="https://substack-post-media.s3.amazonaws.com/public/images/af5b9361-c1d7-4b48-a10f-9412496c67eb_1080x617.webp" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>Last week, we announced <a href="https://manifund.substack.com/p/announcing-the-200k-ea-community">EA Community Choice</a>: a $200k round for funding projects in the EA community. Now, you can claim $100 to $800 in funds to donate to your favorite projects!</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://manifund.org/edit-profile/roles&quot;,&quot;text&quot;:&quot;Claim your funds&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://manifund.org/edit-profile/roles"><span>Claim your funds</span></a></p><p></p><h3>Recap on how this works</h3><ul><li><p><strong>Phase 1: Project registrations open (Aug 13)</strong></p><ul><li><p><strong>50+ projects</strong> signed up to participate in this round, so far</p><ul><li><p>Project signups will remain open through phase 2; you can list your project <strong><a href="https://manifund.org/create?prize=ea-community-choice">here</a></strong>.</p></li></ul></li><li><p><strong>350+ members</strong> of the EA community registered their interest</p></li></ul></li><li><p><strong>Phase 2: Community members claim funds &amp; donate 
(Aug 21 - now!)</strong></p><ul><li><p>Fill out <a href="https://manifund.org/edit-profile/roles">this form about your roles in EA</a> to claim your funds.</p></li><li><p>Then, look through the <a href="https://manifund.org/causes/ea-community-choice">Community Choice projects here</a>; donate to your favorites!</p></li><li><p>You can also leave comments on each project proposal. This is a great way to ask questions to the project organizer, or share thoughts with the rest of the EA community.</p></li></ul></li><li><p><strong>Phase 3: Funds matched and sent to projects (Sep 3)</strong></p><ul><li><p>Projects and donations will be locked in on September 3rd. Then, all money donated will be matched against a $100k <em>quadratic funding pool</em>.</p></li><li><p><em>Note: we&#8217;ve pushed back the dates a bit to account for Labor Day.</em></p></li></ul></li></ul><p></p><p>More questions? Check out the <a href="https://manifund.substack.com/p/announcing-the-200k-ea-community">announcement post FAQ</a>, or hop in our <a href="https://discord.gg/ZGsDMWSA5Q">Discord</a>!</p>]]></content:encoded></item><item><title><![CDATA[Announcing the $200k EA Community Choice]]></title><description><![CDATA[$100k directed by community folks + $100k in quadratic funding]]></description><link>https://manifund.substack.com/p/announcing-the-200k-ea-community</link><guid isPermaLink="false">https://manifund.substack.com/p/announcing-the-200k-ea-community</guid><dc:creator><![CDATA[Manifund]]></dc:creator><pubDate>Wed, 14 Aug 2024 00:35:16 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!cjJc!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Faaab07a8-0ba5-4a78-83ea-ac79d5db5933_800x390.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>Manifund is hosting a $200k funding round for EA community projects, where the grant decisions are made by <em>you</em>. 
You can direct $100-$800 of funding towards the projects that have helped with your personal journey as an EA. Your choices then decide how $100k in matching will be allocated, via quadratic funding!</p><p>Sign up <strong><a href="https://airtable.com/appucH86FmSMha33p/pag9fHXZoBXj61ZIN/form">here</a></strong> to get notified when the projects are live, or read on to learn more!</p><h3>Timeline</h3><ul><li><p><strong>Phase 1: Project registrations open (Aug 13)</strong></p><ul><li><p><strong>Project organizers</strong> can now create a page for their projects <strong><a href="https://manifund.org/create?prize=ea-community-choice">here</a></strong>, to raise funding as part of this EA Community Choice round.</p></li><li><p><strong>Community members</strong> can sign up for updates <strong><a href="https://airtable.com/appucH86FmSMha33p/pag9fHXZoBXj61ZIN/form">here</a></strong>, or recommend projects to sign up <strong><a href="https://airtable.com/appucH86FmSMha33p/pagmSJZeW6XJiNvqj/form">here</a>.</strong></p></li></ul></li><li><p><strong>Phase 2: Community members receive funds (Aug 20)</strong></p><ul><li><p>We&#8217;ll give everyone $100 to donate; more if you&#8217;ve been active in the EA community. 
Fill out a 2-minute form to claim your $100, plus bonuses for:</p><ul><li><p><em>Donor:</em> $100 for taking the GWWC &#128312;10% Pledge</p></li><li><p><em>Organizer:</em> $100 for organizing any EA group</p></li><li><p><em>Scholar:</em> $100 for having 100 or more karma on the EA Forum</p></li><li><p><em>Volunteer:</em> $100 for volunteering at an EAG(x), Future Forum, or Manifest</p></li><li><p><em>Worker:</em> $100 for working full-time at an EA org, or full-time on an EA grant</p></li><li><p><em>Senior:</em> $100 for having done any of the above prior to 2022</p></li><li><p><em>Insider:</em> $100 if you had a Manifund account before August 2024</p></li></ul></li><li><p>You can then donate your money to any project in the Community Choice round!</p></li><li><p>You can also leave comments about any specific project. This is a great way to share your experiences with the project organizer, or the rest of the EA community.</p></li><li><p>Funds in Phase 2 will be capped at $100k, first-come-first-served.</p></li></ul></li><li><p><strong>Phase 3: Funds matched and sent to projects (Sep 1)</strong></p><ul><li><p>Projects and donations will be locked in at the end of August. Then, all money donated will be matched against a $100k <em>quadratic funding pool</em>.</p></li><li><p>Unlike a standard 1:1 match, quadratic funding rewards a project with lots of small donors more than a project with few big donors. The broader the support, the bigger the match!</p></li><li><p>Specifically, the match is proportional to the <em>square of the sum of square roots of individual donations</em>. 
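</p>

<p><em>For the curious, here&#8217;s a minimal sketch of that formula in Python. It assumes one common convention, where each project&#8217;s raw match is the square of the sum of square roots of its donations, minus the donations themselves, and raw matches are then scaled so they sum to the pool; the exact scaling used for this round isn&#8217;t specified here.</em></p>

```python
from math import sqrt

def quadratic_match(projects, pool):
    """Split a matching pool across projects via quadratic funding.

    projects: list of per-project donation lists, e.g. [[100, 100], [400]]
    pool: total matching funds to distribute.
    """
    # Raw match: (sum of sqrt of donations)^2 minus the donations themselves
    raw = [sum(sqrt(d) for d in ds) ** 2 - sum(ds) for ds in projects]
    total = sum(raw)
    if total == 0:  # no donations, or every project has a single donor
        return [0.0] * len(raw)
    # Scale raw matches so they add up to exactly the pool
    return [pool * r / total for r in raw]

# Four $100 donors vs. one $400 donor: equal totals, very different matches
print(quadratic_match([[100, 100, 100, 100], [400]], pool=100_000))
# -> [100000.0, 0.0]
```

<p><em>With equal donation totals, the project with four small donors takes the whole hypothetical pool in this sketch: the broader the support, the bigger the match.</em></p>

<p>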
A toy example:</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!cjJc!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Faaab07a8-0ba5-4a78-83ea-ac79d5db5933_800x390.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!cjJc!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Faaab07a8-0ba5-4a78-83ea-ac79d5db5933_800x390.png 424w, https://substackcdn.com/image/fetch/$s_!cjJc!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Faaab07a8-0ba5-4a78-83ea-ac79d5db5933_800x390.png 848w, https://substackcdn.com/image/fetch/$s_!cjJc!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Faaab07a8-0ba5-4a78-83ea-ac79d5db5933_800x390.png 1272w, https://substackcdn.com/image/fetch/$s_!cjJc!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Faaab07a8-0ba5-4a78-83ea-ac79d5db5933_800x390.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!cjJc!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Faaab07a8-0ba5-4a78-83ea-ac79d5db5933_800x390.png" width="800" height="390" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/aaab07a8-0ba5-4a78-83ea-ac79d5db5933_800x390.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:390,&quot;width&quot;:800,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:45235,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!cjJc!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Faaab07a8-0ba5-4a78-83ea-ac79d5db5933_800x390.png 424w, https://substackcdn.com/image/fetch/$s_!cjJc!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Faaab07a8-0ba5-4a78-83ea-ac79d5db5933_800x390.png 848w, https://substackcdn.com/image/fetch/$s_!cjJc!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Faaab07a8-0ba5-4a78-83ea-ac79d5db5933_800x390.png 1272w, https://substackcdn.com/image/fetch/$s_!cjJc!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Faaab07a8-0ba5-4a78-83ea-ac79d5db5933_800x390.png 1456w" sizes="100vw" fetchpriority="high"></picture></div></a></figure></div><p></p></li><li><p>Learn more about principles behind quadratic funding by reading <a href="https://vitalik.eth.limo/general/2019/12/07/quadratic.html">Vitalik Buterin&#8217;s explainer</a>, watching <a href="https://www.youtube.com/watch?v=xwY0UAk14Rk">this video</a>, or <a href="https://qf.gitcoin.co/?grant=100,50,1,1,5&amp;grant=100,100,5,5&amp;grant=400&amp;grant=5,5,5,5,5,5,5,5,5&amp;grant=&amp;match=10000">playing with this simulator</a>.</p></li></ul></li></ul><h3>What is an EA community project?</h3><p>We don&#8217;t have a strict definition, but roughly any project which helps you: learn about EA, connect with the EA community, or grow your impact in the world. We&#8217;re casting a wide net; <strong>projects do not have to explicitly identify as EA to qualify</strong> (though, we also endorse <a href="https://benthams.substack.com/p/be-proud-to-be-an-effective-altruist">being proud of being EA</a>). 
If you&#8217;re not sure if you count, just apply!</p><p>Examples of projects we&#8217;d love to fund:</p><ul><li><p>Community groups</p><ul><li><p>Regional groups like <a href="https://manifund.org/projects/support-a-thriving-and-talented-community-of-ea-filipinos-">EA Philippines</a></p></li><li><p>Cause-specific groups like <a href="https://joinhive.org/">Hive</a></p></li><li><p>University groups like EA Tufts</p></li></ul></li><li><p>Physical spaces</p><ul><li><p>Coworking spaces like <a href="https://manifund.org/projects/lightcone-infrastructure">Lighthaven</a>, <a href="https://epistea.org/">Epistea</a>, and <a href="https://manifund.org/projects/ai-safety-serbia-hub---office-space-for-frugal-ai-safety-researchers">AI Safety Serbia Hub</a></p></li><li><p>Housing like <a href="https://manifund.org/projects/ceealar">CEEALAR</a> and Berkeley REACH</p></li></ul></li><li><p>Events</p><ul><li><p>Conferences like Manifest, EAGx, LessOnline, and <a href="https://manifund.org/projects/ai-animals-and-digital-minds-conference-and-retreat">AI, Animals, and Digital Minds</a></p></li><li><p>Extended gatherings like Manifest Summer Camp or Prague Fall Season</p></li><li><p>Recurring meetups like local groups or <a href="https://forum.effectivealtruism.org/posts/nxfhxwQg4HJ7KQz4A/ea-coworking-lounge-space-on-gather-town">online EA coworking</a></p></li><li><p>Tournaments like <a href="https://www.metaculus.com/tournaments/">Metaculus Tournaments</a> or <a href="https://www.quantifiedintuitions.org/estimation-game">The Estimation Game</a></p></li><li><p>Essay competitions like EA Criticism &amp; Red Teaming Contest</p></li></ul></li><li><p>Software</p><ul><li><p>Tools like <a href="https://www.squiggle-language.com/">Squiggle</a>, <a href="https://carlo.app/">Carlo</a>, <a href="https://fatebook.io/">Fatebook</a>, or <a href="https://www.getguesstimate.com/">Guesstimate</a></p></li><li><p>Visualizations like <a href="https://theaidigest.org/">AI 
Digest</a></p></li><li><p>Datasets like <a href="https://manifund.org/projects/donations-list-website-retroactive">Donations List Website</a></p></li></ul></li><li><p>Educational programs</p><ul><li><p>Incubators like <a href="https://manifund.org/projects/10th-edition-of-ai-safety-camp">AI Safety Camp</a> and <a href="https://manifund.org/projects/help-apart-expand-global-ai-safety-research">Apart Hackathons</a></p></li><li><p>Course materials like <a href="https://aisafetyfundamentals.com/">AI Safety Fundamentals</a></p></li></ul></li><li><p>Information resources</p><ul><li><p>Websites like <a href="https://www.aisafety.com/">AISafety.com</a> </p></li></ul><ul><li><p>Youtube channels like Rational Animations, Rob Miles, and <a href="https://manifund.org/projects/a-happier-world-youtube-channel-promoting-ea-ideas">A Happier World</a></p></li><li><p>Podcasts like The 80k Podcast, The Dwarkesh Podcast, and <a href="https://manifund.org/projects/making-52-ai-alignment-video-explainers-and-podcasts">The Inside View</a></p></li></ul></li></ul><h3>FAQ</h3><ul><li><p>What is Manifund?</p><ul><li><p><a href="https://manifund.org/">Manifund</a> is a platform for funding impactful projects. We&#8217;ve raised over $5m for hundreds of projects across causes like AI safety, biosecurity, animal welfare, EA infrastructure, and scientific research. Beyond crowdfunding, we also run programs such as AI safety regrants, impact markets, and ACX Grants.</p></li></ul></li><li><p>Why are you doing this?</p><ul><li><p>We want to give the EA community a voice in what projects get funded within our own community. Today, most funding decisions are centralized in the hands of a few grantmakers, such as OpenPhil, EA Funds, and SFF. We greatly appreciate their work, but at the same time, suspect that local knowledge gets lost in this process. 
With EA Community Choice, we&#8217;re asking everyone to weigh in, from their own experiences, on what projects have helped with their personal journey towards doing good.</p></li></ul></li><li><p>Why these criteria for donation bonuses?</p><ul><li><p>We chose these to highlight the different ways that someone can contribute to the EA movement. EA Community Choice aims to be more democratic than technocratic; we want to ensure a wide range of activities get recognized, and that a broad swathe of the EA community feels bought in to these donation decisions.</p></li></ul></li><li><p>Why quadratic funding?</p><ul><li><p>Quadratic funding is the <a href="https://vitalik.eth.limo/general/2019/12/07/quadratic.html#:~:text=you%20can%20prove%20that%20this%20solves%20the%20tragedy%2Dof%2Dthe%2Dcommons%20problem%20as%20well%20as%20you%20can%20with%20that%20subsidy%20budget">theoretically optimal</a> way to distribute matching funds towards a selection of public goods (and we&#8217;re suckers for elegant theory). The crypto community has pioneered this with some success, eg with Gitcoin Grants and Optimism&#8217;s Retroactive Public Goods Funding rounds. Closer to home, the LessWrong Annual Review is an example of a quadratic voting system in practice, which produces pretty good results.</p></li></ul></li><li><p>Where did this $200k come from?</p><ul><li><p>An anonymous individual in the EA community. Manifund would love to thank them publicly, but alas, the donor wishes not to be named for now. (It&#8217;s not FTX.)</p></li></ul></li><li><p>Can I direct my funds to a project I work on or am involved with?</p><ul><li><p>Yes! We ask that you mention this as a comment on the project, but otherwise it&#8217;s fine to donate to projects you are involved with.</p></li></ul></li><li><p>How should I direct my funds? 
Eg should I fund projects based on their past work, or how they would use marginal funding?</p><ul><li><p>We suggest giving based on how much value you have gotten out of a project (aka retroactive instead of prospective), but it&#8217;s your charity budget; feel free to spend it as you wish.</p></li><li><p>We&#8217;d appreciate it if you leave a comment about what made you decide to give to a particular project, though this is optional.</p></li></ul></li><li><p>Can I update my donations before Phase 3?</p><ul><li><p>Yes! If later donations or comments change your mind about where you want to give, you can change your allocation.</p></li></ul></li><li><p>If I think a project has negative externalities, can I make a &#8220;negative vote&#8221; aka pay to redirect money away from it?</p><ul><li><p>TBD. This may be <a href="https://vitalik.eth.limo/general/2020/04/30/round5.html#:~:text=Responses%20to%20negative%20contributions">theoretically optimal</a> and has been used by other projects, but we&#8217;re leaning no because of bad vibes, potential for drama, and the additional complexity it introduces.</p></li></ul></li><li><p>Can I contribute my own money towards a community project?</p><ul><li><p>Yes! You can make a personal donation to any project in this community choice round; these donations will also be eligible for the quadratic funding match (as well as a 501(c)(3) tax deduction, if you&#8217;re based in the US).</p></li></ul></li><li><p>How about contributing towards the matching fund?</p><ul><li><p>Yes! We&#8217;re happy to accept donations to increase the size of the matching pool for this round. 
Reach out to <a href="mailto:austin@manifund.org">austin@manifund.org</a> and I&#8217;ll be happy to chat!</p></li><li><p>Or, if you&#8217;re excited by this structure but want to try a different focus (eg a funding round for &#8220;technical AI safety projects&#8221; or &#8220;animal welfare projects&#8221;), let us know!</p></li></ul></li></ul><h3>Get involved!</h3><p>As the name &#8220;EA Community Choice&#8221; implies, we&#8217;d love for all kinds of folks in the community to participate. You can:</p><ul><li><p><a href="https://manifund.org/create?prize=ea-community-choice">Register your project to receive funding in this round</a></p></li><li><p><a href="https://airtable.com/appucH86FmSMha33p/pagmSJZeW6XJiNvqj/form">Recommend a project to join the round</a></p></li><li><p><a href="https://airtable.com/appucH86FmSMha33p/pag9fHXZoBXj61ZIN/form">Sign up to get notified when funds get sent out</a></p></li></ul><p>Excited to support the projects that y&#8217;all choose!</p><p><em>Thanks to Rachel, Saul, Anton, Neel, Constance, Fin and others for feedback!</em></p>]]></content:encoded></item><item><title><![CDATA[Episode: Austin vs Linch on OpenAI]]></title><description><![CDATA[We&#8217;re trying the podcast thing!]]></description><link>https://manifund.substack.com/p/episode-austin-vs-linch-on-openai</link><guid isPermaLink="false">https://manifund.substack.com/p/episode-austin-vs-linch-on-openai</guid><dc:creator><![CDATA[Manifund]]></dc:creator><pubDate>Sat, 25 May 2024 16:11:05 GMT</pubDate><enclosure url="https://api.substack.com/feed/podcast/144916670/9a5f4cb5f73ece02c7c729121f35fa09.mp3" length="0" type="audio/mpeg"/><content:encoded><![CDATA[<p>In this inaugural episode, Austin and Linch (of EA Funds) explore our disagreements on OpenAI. 
Austin is approximately an apologist for OpenAI, and for-profit/tech companies overall, while Linch takes a more hardline stance.</p><p>Neither of us are experts on OpenAI or have insider information; we&#8217;re just hashing out our different worldviews in a casual setting. If you listen or follow along, let us know what you think!</p><p>Topics covered:</p><ul><li><p>00:33 OpenAI's contradictions</p></li><li><p>02:44 NDAs and tech vs rationalist culture</p></li><li><p>13:51 Is there enough criticism of OpenAI? Too much?</p></li><li><p>17:15 Superalignment's 20% compute promise</p></li><li><p>21:50 The EA departures from OpenAI</p></li><li><p>26:10 ScarJo</p></li><li><p>34:41 Should AGI be pursued by a for-profit company?</p></li><li><p>46:10 A hypothetical AGI "Manhattan Project"</p></li><li><p>51:45 Government vs company services</p></li><li><p>54:08 Is OpenAI democratic?</p></li></ul><p><em>Note that this podcast was recorded May 22, before Kelsey Piper&#8217;s expos&#233; on NDAs; subsequent NDA walkbacks by OpenAI; and reports that Sky was indeed commissioned before the ScarJo reachout.</em></p><p>Link with video &amp; transcripts: <a href="https://share.descript.com/view/1N6kvfXFLr9">https://share.descript.com/view/1N6kvfXFLr9</a></p><h1>1: OpenAI</h1><p><strong>Austin:</strong> [00:00:00] So we're here today to talk about OpenAI, a subject that Linch and I have talked about in some context before.</p><p>I think I, coming from a more tech startup background, have views that maybe are a little bit more alien to what I think of as the standard EA, like AI safety, maybe Doomer-ish views, which maybe Linch represents a little bit more of.</p><p>So I'm excited to have a chance to hash this out with you today, Linch.</p><p><strong>Linch:</strong> Yeah, happy to. I think if anything, I'm probably more negative about like OpenAI than the average like EA person, like for example, the OpenPhil cluster viewpoints. 
Interesting.</p><h2>[00:00:33] OpenAsteroidImpact &amp; OpenAI's contradictions</h2><p><strong>Austin:</strong> And when you created Open Asteroid Impact, right?</p><p>This is like a satirical thing, trying to get people to pay more attention to OpenAI and like ways that it could go badly. And I was working with you on that. I think in my mind, I didn't even think of this as "Oh, this is because we want to draw attention to like things OpenAI is doing badly."</p><p>Rather like it just seems funny to me.</p><p><strong>Linch:</strong> Yeah. Yeah. I think badly is like one conception of it. I think contradiction is another conception. Like I was trying to highlight the ways in which I thought the public messaging, or at least, yeah, public messaging as understood by people I know, conflicted with a lot of the actions.</p><p><strong>Austin:</strong> Yeah, maybe that's a great place to start actually.</p><p>With the overall sense of messaging being contradictory, can you say a bit more about that?</p><p><strong>Linch:</strong> Yeah. These things are like, oh, not necessarily super obvious, but I think there's definitely a very weird needle or like a very odd thing.</p><p>So one example is that they talk about, so we talk about this in Open Asteroid Impact, but they say a lot of the value they brought is like bringing attention to AI or AI risk by making like really splashy like AI demos and like basically building the thing that's potentially on the ramp to being dangerous.</p><p>I think that's just like an odd approach. I think it's like a pretty unusual approach. I couldn't think of many like historical precedents for doing something like this.</p><p><strong>Austin:</strong> That's not necessarily contradictory though, or, when you say contradictory, I think that usually implies like they're saying one thing and doing another, or saying two different things at the same time.</p><p><strong>Linch:</strong> Sure. 
I guess like talking about openness is like a pretty odd thing a lot of the time from like the software perspective. They're obviously not like really open source, they don't open source most of their work, but also like open means many things, right? From like an open society's perspective, you might think that people are free to talk about what they want to, or they're able to voice opinions.</p><p>I know many people in the AI safety crowd are worried about technological capabilities leaks or other specific technical things that are like dangerous to say, and that's why you might need like NDAs or other ways to cover that. But it seems, as we now know, and I think it was pretty obvious before, they don't say a lot of things, period.</p><p>There's a lot of stuff that they could talk about. Their employees basically can't. And that's pretty odd, I think.</p><h2>[00:02:44] NDAs and tech vs rationalist culture</h2><p><strong>Austin:</strong> Yeah. So I guess jumping to one of the topics of the day, the NDAs. My high level thought is that I feel like people in the area are making a mountain out of a molehill, so to speak.</p><p>It seems unlikely in my head to be the case that like the NDAs are significantly restricting speech. And I have more of a view that's like, Oh, these are like relatively standard. Maybe there's a few ways in which they're not that standard. 
I don't think, I especially don't think there was like a lot of like ill intent, like trying to suppress disparagement or disclosure of like things beyond what is like</p><p>roughly typical in Silicon Valley. And yeah, I guess I have some kind of model of: if there were like important things that the people who have recently left OpenAI would have wanted to say, they would be saying those things, and it's not they feel, oh, I have signed this contract and as a result can't say these things. I don't think the signing of the contract makes a huge difference to their, yeah,</p><p>internal sense of whether it's worth blowing a whistle on like bad behavior at OpenAI. So I just think there's like too much of a conspiratorial view on like how much these NDAs actually affect the speech actions of the people who have left, or people who are still at OpenAI.</p><p><strong>Linch:</strong> Yeah. So for context, for people who don't know this, there's like NDAs, which are non-disclosure agreements.</p><p>Those are fairly common in industry. And there's non-disparagement agreements, which are like somewhat common, but not super common, where it's not that you can't say like important information or like trade secrets or anything, but you just can't say bad things about a company. And my understanding is many, if not most or all, OpenAI ex-employees have signed like a non-disparagement agreement, which is much more significant to us.</p><p>And those are lifelong. So if you sign it, like legally, you're not supposed to ever criticize OpenAI. Which is pretty big. And third, my understanding is that these agreements were signed under the fear of losing the equity that they already owned, which makes it a much higher bar than I think is common for other cases I've heard of.</p><p><strong>Austin:</strong> Yeah. So thanks for that context. 
I think one other thing that I wanted to draw attention to was that somewhat recently, I think there was like a kerfuffle, maybe a small discussion on the, on LessWrong perhaps, where I believe Wave, who is like a company that does mobile payments in Africa.</p><p>[00:05:00] They're known for being like pretty EA affiliated. I think many of their like top employees are familiar with that. Wave mentioned they also have, I believe, non-disparagement agreements for people who leave. And this caused some, I think like clash of cultures between, like, I think the Wave founder Lincoln Quirk was like, Oh, this is like a pretty normal thing.</p><p>And then I think Oli Habryka of LessWrong was like, no, this is changing speech norms and like very much against the like rationalist culture. And I also took the like tech, maybe like Lincoln Quirk, line here, which is this seems like pretty fine and normal. And I guess to expand on that a little bit, I do think the non-disparagement agreement is there for some reason, mostly that when you have an employee leave, you don't want to get into a lot of bad, like ugly press battles over he-said-she-said, things like that. And I think there's like a degree of reasonableness basically in these kinds of contracts, which is if you say like things that are like generally like factually true and important to say, like something like,</p><p>yeah, I don't know, like this top exec at OpenAI had an affair with one of their interns and this is like a big scandal. I don't think this is the kind of thing that like, then OpenAI would go after you using the non-disparagement clause, right? 
So again, I'm modeling, like, society I think generally to be reasonable, and like the people at OpenAI to have a sense of which things a, like, disparagement agreement can cover, and I'm modeling, meanwhile, the like rationalist EA crowd as taking these things like too literally. If you look at the literal text of the contract, maybe it says, oh, you can't say anything bad about this.</p><p>But literally, if you try to do this and OpenAI tried to go after you with the contract, I think they would lose in the court of public opinion, and they understand this. And everyone understands this.</p><p><strong>Linch:</strong> It's just obviously a chilling effect though. If you're like on the fence about whether to say something or not, you're just not going to say it.</p><p>And like in aggregate, like it's often the case where you have an elephant and everybody sees one piece of the elephant, but they don't see all of it. And especially in like cases where there's like bad behavior, but there's no smoking gun, it's just like pretty hard to know, like, without all the pieces of information out there.</p><p><strong>Austin:</strong> Yeah, but I really think the like non-disparagement agreement is such a like tiny portion of the like reasons people don't speak up. I think like people don't say things for, yeah, mostly things like wanting to be nice, or not thinking that it is worth raising, or things like that.</p><p>And not</p><p><strong>Linch:</strong> One of the most powerful people in Silicon Valley may not like you and may go after you, your future employers. There's a ton of stuff like that. Like this is like one piece of that angle that we see; we don't know all the information we don't see.</p><p><strong>Austin:</strong> Yeah. Again, that seems kind of conspiratorial.</p><p>I guess I don't know Sam Altman, for example, in person. 
And I know some people say things like, oh he could be, like, vindictive, or some people are, like, concerned about retribution or things like that. Maybe I'm just much more of an optimist, or I believe in people in general, like Sam Bankman-Fried, sorry, Sam Altman and Sam Bankman-Fried included.</p><p>There were some other things said about SBF, that he could be, like, vindictive, and people were, like, afraid to say things. And I think all this is roughly BS, roughly, it wasn't that this or</p><p>I guess I don't want to say, put words in people's mouths, maybe, or put intentions in people's heads or whatever, possibly they actually believe this, but I just, I don't know, think that actually, there wouldn't have been, like, important retribution or something like that, or</p><p><strong>Linch:</strong> yeah.</p><p>Yeah. A lot of people seem to believe this about Sam Bankman-Fried. Like the former CEO of FTX US, I think, for example, had a bunch of specific things, like he was served by lawyers and stuff like that. Sued by lawyers? Served. Served, I see. I think. Like, where like he, sorry, that's not exactly it.</p><p>There was like internal disagreement about FTX US policy. And then instead of resolving it the way that like you might expect like two executives to hash it out, instead, like, Sam Bankman-Fried had his lawyer send like a specific thing to him. And that's just like par for the course, I think, for like at least some powerful people who are like, on the 90th percentile of like ruthlessness or sociopathy amongst CEOs.</p><p>Yeah, I definitely do want to make it clear that like my opinion of Sam is not that he's like being a typical CEO here. Like I do think he's, sorry, Sam Altman. Yeah. Yeah.</p><p><strong>Austin:</strong> Yeah. I definitely think there are many ways in which like Sam Altman is not a typical CEO, but there are many ways in which like every CEO is like not a typical CEO.</p><p>And I guess. 
Thus far, I have not been particularly concerned by Sam Altman specifically or his actions. And there was a point here: as we're talking about the NDA stuff, I think the way OpenAI and Sam responded to this publicly on Twitter was that this was a mistake, and especially the worst parts, the clawback-of-vested-equity things, were not intended to be a thing.</p><p>And they haven't actually enforced it. And they're trying to fix the contracts, or make things right with their employees. And I think that's a reasonable fix. One thing you might say is, oh, they only did this in response to large public outcry or something.</p><p>And it's evidence of bad behavior, that they need to be policed more. But I guess I wouldn't take that line. I would take the view that there are a hundred things to get right when you are an executive at a company, and oftentimes this kind of minor detail you don't pay that much attention to; your lawyers tell you this is a standard non-disparagement agreement.</p><p>I'm like, cool. Okay.</p><p><strong>Linch:</strong> I [00:10:00] really doubt this is what his lawyers dreamed up without somebody on the exec team. Probably him, but at least somebody.</p><p><strong>Austin:</strong> And I guess I haven't read the text of this non-disparagement agreement very closely, or</p><p><strong>Linch:</strong> None of us have.</p><p><strong>Austin:</strong> None of us have. Oh, is it part of the agreement that the non-disparagement agreement itself is private?</p><p><strong>Linch:</strong> Yeah. None of us has. Yeah.</p><p><strong>Austin:</strong> The whole self-sealing NDA thing. 
But to use a closer example: I think Lincoln Quirk, the CEO of Wave, also had this non-disparagement agreement in their contracts. Do you model him as having the same motives? I really think it's just, oh, you're filling out a template.</p><p>I do this a lot for Manifold contracts. I just go to a website, there's a template, I check some boxes. And in the moment I'm thinking, should I make them sign a non-disclosure agreement? Yeah, sure. This seems better for the company. It seems safer.</p><p>I don't see why I wouldn't do this. So I click that and move on. And that was not even a considered decision.</p><p><strong>Linch:</strong> Yeah. If this was the type of thing that 90 percent of companies do, I really think that there just isn't a case. In fact, for the Lincoln Quirk thing, I think I had a dialogue, one of those LessWrong dialogues with Ruby, where I said this thing's bad, but it's very reasonable under different cultural norms.</p><p>You shouldn't really expect it to be held up to the rationalist ones. But a lifelong non-disparagement agreement, with your vested equity being clawed back? As far as I can tell, it seems very unusual, and it's also self-silencing. And I don't know how unusual it is, because obviously, since they're all self-silencing, not a lot of people have come out saying, oh, this is really common, lots of companies do it. From what I've heard,
almost none actually do this.</p><p><strong>Austin:</strong> You don't think they get credit for walking it back? I think Sam actually said, yeah, this was my mistake, an oversight, this is my fault, I am in charge, and they're going to try to fix it.</p><p><strong>Linch:</strong> I think morally they don't, because this is pragmatically probably one of the better options they could take. They do get some competency points.</p><p><strong>Austin:</strong> Yeah. I don't know. I guess I'm just occupying a very different view, which is, oh, these people probably have good hearts in general.</p><p>They have a million things to manage, and this one became a big issue. Whoops. They're going to try to fix that, and they're going to move on and try to do other things.</p><p><strong>Linch:</strong> Yeah, so the details of their walk-back are also interesting, right? They didn't say that they will nullify the previous agreements, or nullify the specific non-disparagement condition in the previous ones, or anything like that, right?</p><p>They said they will talk with you individually if you have a disagreement with it, which is, you know, an odd thing. And Kelsey, and I think others, but especially Kelsey, I'm sure he's seen Kelsey's comments, have pushed them multiple times, both in official communication and on Twitter, to just say that they will not go after people who disparage the company after signing the original agreement, or to retract the original agreement. And it's just been silence from them so far.</p><p><strong>Austin:</strong> Yeah. I guess one thing I am keeping my eye on is the Daniel Kokotajlo prediction market. There's a Manifold market on: will he get his equity back?</p><p>I think, for listeners, he posted on the EA Forum that. 
He'd given up equity equivalent to about 80% of his net worth, which means, if he currently had, like,</p><p><strong>Linch:</strong> his future net worth</p><p><strong>Austin:</strong> Yeah, his future net worth. If he currently had, let's say, a million dollars, he possibly gave up $4 million worth of OpenAI equity to be able to not sign that non-disparagement agreement.</p><p><strong>Linch:</strong> Yeah.</p><p><strong>Austin:</strong> And I think a lot of people probably rightfully lauded him: oh, this is a good thing to do, this is a very brave maneuver. But OpenAI putting its money where its mouth is would be reinstituting that kind of equity to people like Daniel.</p><p>We'll see how that turns out. That would be at least some evidence to me that, oh yeah, they are trying to do the right thing, and this was a misstep.</p><h2>[00:13:51] Is there enough criticism of OpenAI? Too much?</h2><p><strong>Linch:</strong> Yeah. I think that would be some evidence to me. I think also, how do I put this?</p><p>If more people start criticizing OpenAI, and there's just not much out there afterwards, when it feels safe, then obviously that feels better to me. I feel like there's probably just a lot of pressure to not criticize OpenAI for various reasons. Like, I know within EA it's pretty common for people to want to get jobs within labs and stuff, and often they just don't criticize the labs as much as they could because of that, especially technical safety researchers.</p><p><strong>Austin:</strong> I think that's a thing that probably happens to some people. I feel the opposite though. I feel like in places like the EA Forum, there's too much nitpicking, or trying to find flaws with OpenAI in a way that's not very charitable. 
And I think there might be the opposite effect, where you gain more status points within the community if you are seen as being hard-headed, or pushing for the right thing, or something like that. And I don't think these cancel each other out.</p><p>I think these are probably very different people, actually: the ones staying silent, the researchers who hope to work at OpenAI, versus the people [00:15:00] who are more in the doomer camp and trying to push against it.</p><p><strong>Linch:</strong> I think it's the more informed people who choose to be silent, roughly. And the less informed people just see from a distance that this whole thing is really fucked, and foggy, and lots of black ink.</p><p><strong>Austin:</strong> Do you have examples of people who you think of as particularly informed who are notably not saying much?</p><p><strong>Linch:</strong> Almost all the ex-OpenAI employees.</p><p><strong>Austin:</strong> Like the ones who left and started Anthropic, or the ones who have left recently? Or I guess, yeah, both.</p><p><strong>Linch:</strong> Both camps. You'd expect more of the Anthropic people to at least have said something by now; most likely the split was acrimonious. So those are the obvious ones, I think. Let me see.</p><p>Yeah, I also know people who work at safety non-profits who say less than they probably should.</p><p><strong>Austin:</strong> Yeah.</p><p><strong>Linch:</strong> Although that said, I think if anything, critique of Anthropic is even more</p><p><strong>Austin:</strong> Hard to come by? Critique of Anthropic?</p><p><strong>Linch:</strong> 
Yeah, people are even more self-silencing within EA circles in critiques of Anthropic than of OpenAI, would be my guess.</p><p><strong>Austin:</strong> Yeah, I think that's plausible. I don't know. Again, I am operating from some other model, where people aren't saying much because there isn't much to say, is my best guess. There may be some small things that didn't quite seem right that they could put out there, but I don't know. I'm thinking of my recent leaving of Manifold, for example, where you could imagine somebody looking at my public statements and thinking, there's got to be a bigger story than this, or there have got to be bitter, acrimonious things that Austin isn't saying. I don't really think there are, I don't know.</p><p><strong>Linch:</strong> You're like one of the most patently transparent people out there. And you also criticized them when you left and stuff, and that seems fine. I don't know. I think it just looked like a pretty different situation from the outside.</p><p><strong>Austin:</strong> Sure, but I know Jan Leike, I hope I'm pronouncing his name right.</p><p>He did have this Twitter thread that was a critique of how things went. Do you think he has many things that he's not saying as well, or?</p><p><strong>Linch:</strong> I think he has fewer things that he's not saying, would be my guess. But there are probably standard NDA things that cover some of the things he can't say.</p><h2>[00:17:15] Superalignment's 20% compute promise</h2><p><strong>Linch:</strong> It's known that he hasn't publicly said much about the compute stuff, about going back on the compute agreements. It's only through press leaks, which may have been from him or from other people, that we have any details; we need more details about the compute stuff. I don't know. 
There's probably other things like that.</p><p><strong>Austin:</strong> Yeah. The compute stuff is interesting, because that is a concrete example. OpenAI committed 20 percent of the training compute, up till the point when superalignment was launched,</p><p><strong>Linch:</strong> to</p><p><strong>Austin:</strong> the superalignment team.</p><p><strong>Linch:</strong> Yeah.</p><p><strong>Austin:</strong> It roughly seems like they didn't get that compute, or maybe OpenAI didn't quite honor the agreement.</p><p>I imagine in the heads of the people who are managing OpenAI, they were like, oh, we promised this, but we can't deliver all of it right now; maybe next year, maybe the year after, or something like that. So maybe it doesn't feel like a broken promise to them. But it definitely seems a bit of a broken promise, or at least, insofar as the superalignment team is saying, oh, we didn't get enough compute, I probably mostly take them at their word. I do feel like compute is tricky though, because compute is basically like money, or influence, in this world of OpenAI, I imagine.</p><p>It is one of the most scarce resources that they have, and deciding where to spend your compute is, I don't know how to express it, it's like allocating your headcount, allocating your, yeah, it's one of the most</p><p><strong>Linch:</strong> important things. That's why it's not a trivial promise; it's an incredible one. Put a different way: if a politician says we're going to put 20 percent of money into Social Security or Medicare or something, and they end up giving like 1 percent, that obviously is a broken promise, right?</p><p>And money is a very important thing for the US government.</p><p><strong>Austin:</strong> Yeah, I agree. 
It seems a bit like a broken promise. But looking at the superalignment team case, right, at some point the people you promised it to all decided to leave, relatively early on; superalignment has only been around for a year or so, is my understanding.</p><p>And I assume this promise of 20 percent of the compute was not, oh, we will give this to you right now, even if you need us to stop all of what OpenAI is currently operating on and just do superalignment stuff, right? I don't know. I do feel like this is basically one of the less good things, one of the clear broken promises, perhaps. But still, I see lots of ways in which there's room for the two teams, the superalignment team and the broader OpenAI team,</p><p><strong>Linch:</strong> and</p><p><strong>Austin:</strong> both sides, to feel like they're doing their best, but</p><p>not quite getting to the right thing.</p><p><strong>Linch:</strong> Yeah, I think they didn't break the explicit wording of the promise, which I think was something like 20 percent of compute secured to date, over the next four years. One possible interpretation is that they give 20 percent of that year's quota per year, so up to 80 percent of the annual quota from that year, and then they're done going into 2026.</p><p>That's one possible interpretation. And that would in fact be the cheapest way for OpenAI to honor that commitment, given exponential trends [00:20:00] and things getting cheaper. But that is not how most people in the public interpreted it, and that is not how I assume people within superalignment interpreted it.</p><p><strong>Austin:</strong> I guess I would have interpreted it as 5 percent per year for each of those four years, or something like that, if it's 20 percent over four years.</p><p><strong>Linch:</strong> That's not how it works though. It's not 20% of the annual quota. 
I know.</p><p><strong>Austin:</strong> Right, but they had fixed the amount of compute that they promised: 20%. So I think a rough, very reasonable thing would be to dole it out equally over the four years' time.</p><p><strong>Linch:</strong> So to be clear, that would be 20% of that year's amount, spread over four years, right? So 5% per year?</p><p><strong>Austin:</strong> That's what I'm thinking, or</p><p><strong>Linch:</strong> I'm thinking of compute as not annualized. By default compute is continuous; usually we think of literal GPUs, and 20 percent of the GPUs being reserved for that time. And then, given exponential trends, as a percentage of the total stock of GPUs that gets less and less. But it's not 20 percent running for one year and then that's split into 5 percent over four years.</p><p><strong>Austin:</strong> I realize this is a thing I actually don't know very much about. I had assumed that there were a certain number of floating-point operations that had been used to train all of OpenAI's models up to that point, and one fifth of that moving forward is the commitment. And maybe that is like a billion floating-point operations, but not floating-point operations per second; just total operations. That was my understanding, but I haven't looked into it. It's a thing, I guess, we could research.</p><p><strong>Linch:</strong> I haven't thought about it that way, but that also makes sense. Yeah, I was thinking of the physical thing, but I guess, yeah, it's pretty reasonable to define compute as, yeah, FLOPs.</p><p><strong>Austin:</strong> Okay. Yeah, I guess in any case this is getting into the weeds somewhat, but yeah. 
Overall, I think this is an important point: if the superalignment team didn't get the compute that they thought they were due, according to them, based on this promise, that is a broken promise in terms of OpenAI trying to help align AGI.</p><h2>[00:21:50] The EA departures from OpenAI</h2><p><strong>Austin:</strong> Cool. I think another point, and we've already started talking about this, is the impact of all the people leaving OpenAI lately. I forget who it started with, but at this point there's Ilya and Jan, the two leads of superalignment, and a bunch of others: Daniel Kokotajlo, William Saunders, Leopold Aschenbrenner.</p><p>A few other people, I think.</p><p><strong>Linch:</strong> Yeah.</p><p><strong>Austin:</strong> Who are somewhat tied into the EA space. Other people who are roughly in my social circles, or friends of friends, or something like that.</p><p><strong>Linch:</strong> Yeah.</p><p><strong>Austin:</strong> Who have suddenly left in the last week or so, or have been leaving over the span of the last couple of months.</p><p><strong>Linch:</strong> Yep.</p><p><strong>Austin:</strong> What do you make of this?</p><p><strong>Linch:</strong> Yeah, for the world, pretty neutral. I assume this is already baked in: they couldn't get much done at OpenAI, so they left. Maybe they were pushed out, maybe they weren't; it was pretty voluntary. But yeah, hopefully</p><p>the best case scenario here is that Ilya and Jan weren't very good at politics and weren't able to do the thing, but deep in their hearts, Sam and Greg and the rest really do care about safety and would work on safety with a group of people who are better at politics.</p><p>Maybe John Schulman, I think, is the new person; maybe he would be that person. Maybe they'll find somebody else. That's the best case scenario. 
Assuming most of it was just a matter of political ability within the company. The worst case scenario is of course that most of the safety work was ultimately just some form of safety washing, in a way that's not super blatant, and no matter who's in charge, it's just not going to get much done.</p><p><strong>Austin:</strong> Yeah. Maybe I'm narratizing this a little bit too much, but I do roughly feel like it is a sad thing that once upon a time, or not even that long ago, there was at least somewhat of a good-faith effort, it seems, to have people working in the same company and talking to each other and sharing ideas there.</p><p>And now that split is much more visible, something like a split between AI safety and OpenAI, or EA and OpenAI, or something like that.</p><p><strong>Linch:</strong> Yeah. I think some reporting that came out said that was somewhat engineered from OpenAI's side. They didn't like being depicted as an EA-associated company, for various reasons, including the FTX stuff. So it's understandable why they feel that way. And it also makes sense: they aren't much of an EA company, so it makes sense that they don't want to be associated as one.</p><p><strong>Austin:</strong> Yeah, this kind of ties into another issue, which is something around many places wanting to turn away the EA brand at this point, viewing it as a negative, which I think is a whole other discussion, but we won't try to go too deep into that. I feel like this can't have been that big of a factor, though. What gives you the sense that the brand of EA being a negative asset was a major contribution to OpenAI kind of separating from these recent people?</p><p><strong>Linch:</strong> 5 percent? Maybe 3%. 
I'm not sure.</p><p><strong>Austin:</strong> Sorry, you think about 3 percent of the rationale for that was the EA brand? Okay. So maybe then [00:25:00] we're pretty aligned; I don't think it was that big of a deciding factor either.</p><p><strong>Linch:</strong> That sounds right. Yeah. According to, I think, the New York Times' reporting, there were internal discussions within OpenAI about how to distance themselves from EA, from just senior people. But I don't know; like you said, a startup CEO works on a hundred things, so it probably wasn't that big a priority.</p><p><strong>Austin:</strong> Sure. Yeah, I guess maybe I'm just very conflict-averse or something, but I very much like it when people get together and talk and work things out, and don't much like it when things are more like, oh, clear lines in the sand, we're going to get into a fight. I haven't liked this building e/acc-versus-EA narrative; it seems like it's going to worsen the quality of the discourse and make a lot of people mad and get a lot of clicks for not very good reasons. Things like that.</p><p><strong>Linch:</strong> Yeah, I see that. All else equal, I would also rather be in a world where people are actually talking to each other, understanding each other's points. I do notice myself being less smart, less capable of clear reasoning, when more tribal impulses take me.</p><p>That's not great; I like to be smart and capable of good reasoning.</p><h2>[00:26:10] ScarJo</h2><p><strong>Austin:</strong> I guess moving on from this, then, into the third topic du jour: Scarlett Johansson's... what is it? I'm not sure how to pronounce these names. ScarJo is how I've been seeing her, the easier way.</p><p>I think it's</p><p><strong>Linch:</strong> Johansson. Yeah. Johansson. Yeah. 
I don't really know.</p><p><strong>Austin:</strong> Was this a thing that you also had strong feelings about with regard to OpenAI? I think a lot of people are painting this, OpenAI choosing to use Scarlett Johansson's voice, or a voice-alike, for Sky, as a broken promise, or another example of them not being able to be trusted.</p><p>And this once again seemed to me to be a bit catastrophizing, or a bit making a mountain out of a molehill.</p><p><strong>Linch:</strong> Interesting. I don't think broken promise is quite the framing I would use; I don't think they promised much to her. I think there are a few things. I think impersonating people is kind of sus, but I'm not sure; I could be convinced otherwise here. That's one thing. And I'm saying, I don't know for sure, but it seems like a pretty odd coincidence, if it is a coincidence.</p><p><strong>Austin:</strong> Yeah. I listened to a very short clip of Sky versus Scarlett Johansson's voice in Her, and they seem similar, but similar in the way that two women coming from similar backgrounds might have similar voices.</p><p>They might seem similar, especially through the lens of a somewhat robotic screen filter or something like that, in the case of Sky's voice.</p><p><strong>Linch:</strong> Yeah. I think also the model has probably iterated multiple times over time, so it would be hard to find the ground truth. I would probably want to look at a lot of clips to know, and try to pinpoint the exact timeline. Scarlett claims that her friends said it sounded really similar as well.</p><p><strong>Austin:</strong> Yeah. So there's one question, which is around: are they telling the truth or lying? And that would be a big one. 
I think it's probably one we may never find out: whether Sam Altman or the people working on the Sky voice intentionally tried to make it similar to the ScarJo voice. I really hope we do find out, but it might be pretty hard. Absent that, I think it just reinforces your prior: if you were suspicious of the OpenAI people, you'd say, oh, this is another example of them doing sketchy things, especially breaking a promise, or again, not a promise, but infringing on ScarJo's rights or something like that.</p><p><strong>Linch:</strong> Yeah.</p><p><strong>Austin:</strong> And you said earlier that you weren't so sure about whether this kind of impersonation is even bad. For one, it's not exactly impersonation; I would typically think of impersonation as pretending to be a person. This is something more like the, whatchamacallit, the Midjourney artist issue.</p><p>Do artists, and voice actors, and I guess writers, have a right to their style? To the way they draw their work, to the way they sound in this case, to the way they write? And I guess some of my sympathy for OpenAI is that I don't really know that I believe creators should have this right.</p><p>This is possibly a very controversial take. And I know in many places, especially in video game development, for example, many video game artists are strongly of the belief that all this AI-generated art stuff is really bad. And you'll see things like Steam reviews, where a game that uses AI-generated art will get downvoted to oblivion, because people are like, no, we want to support the artists,</p><p>and not pitch in on the side of the corporations shilling AI. 
I kind of take the opposite side, the corporation side, or the AI side, in this. And maybe this is part of what gives me more sympathy for the OpenAI camp over the ScarJo camp.</p><p><strong>Linch:</strong> Yeah. I don't have strong opinions here. I feel like I don't have very good intuitions here, and this feels like the type of thing where, hopefully, reasoned discourse and public opinion and some form of public philosophy, and people trying to express their preferences honestly, ideally people who aren't super economically or socially incentivized to say things,</p><p>hopefully we'll get to a point where this is the type of thing that society can agree [00:30:00] on. I don't really feel like there's a clear truth of the matter here, so to speak, absent social context, the way that murder is probably bad in almost all contexts.</p><p><strong>Austin:</strong> Sure. It's one of those issues that is probably going to gain more salience over time, and the legal system and the society around it need to think a little bit more about what copyright means and what an artist's right to their likeness means, as those things become much easier to generate and reproduce.</p><p><strong>Linch:</strong> Yeah.</p><p>I do feel, at some level, a stronger intuition that if somebody writes in my style, that's flattering, but if somebody tries to talk in my style or act like me, even if they disclaim in loud letters that they're not me, that feels, I don't know, off. Creepy.</p><p><strong>Austin:</strong> Interesting. There's a thing where parody, for example, is protected under US law, at least. Aside from that, if somebody tried to use my voice, I guess I don't have a very strong association with my own voice. 
I never thought of my voice as core to my identity. But I can see how somebody like ScarJo cares a lot more about that; it is part of her life's work, how she makes her living.</p><p><strong>Linch:</strong> And it's also relevant that, I don't know, many of us have read "Reasons and Persons" or similar work. Many of us within EA don't tie that much to our personal identity, in a way that I think is probably almost the opposite for actors. I think normal people are somewhere in between, and professional actors are definitely on the extreme end: they just tie a lot to their personal identity,</p><p>for various reasons, including that it's very professionally relevant to them.</p><p><strong>Austin:</strong> Yeah. And I almost wonder if there's a place where our intuitions, where we see an actor, they are a monkey the way we're a monkey, lead us a little bit astray in terms of what the right thing to do would be from an overall social and economic perspective or something like that.</p><p>Where we think, yeah, the actors deserve their privacy rights, their image and likeness rights.</p><p><strong>Linch:</strong> Yeah.</p><p><strong>Austin:</strong> Whereas in the opposite world, where anyone could make deepfakes of anyone, there would just be much more creative media, and consumers would be a lot more satisfied.</p><p>I think there might be something like that. I'm just thinking about this for the first time; this is not a very well-reasoned argument, I think.</p><p><strong>Linch:</strong> I guess we'll see. I guess I'm not convinced that you get that much better art because of this. 
Especially, yeah, if you exclude parody and stuff. But I don't know.</p><p>If you think about art or science, I think some of it is like discovery, and it does feel weird that the first person to discover a style gets to really own it fully. But a lot of copyright doesn't look like this, right? People own their own works, not the broad idea; you can't own the hero's journey or anything. And I don't know, tying your identity to your voice is a pretty reasonable thing, historically, to do; it doesn't seem that extreme. That's one thing. And then, setting aside the deception point, I'm still really weirded out by OpenAI saying that they can't single the voice actors out because of privacy.</p><p>Speaking of privacy, that does feel a bit odd to me.</p><p><strong>Austin:</strong> Yeah. Definitely, if I were OpenAI, I think I would be asking the voice actress who voiced Sky to give up those privacy rights, maybe pay her extra money, and be like, can we tell people this was you?</p><p>And it would probably be good for this person's career.</p><p><strong>Linch:</strong> Yeah. On priors, how frequent is it that actors don't want to be credited for their work? Come on. It just feels very odd.</p><p><strong>Austin:</strong> And</p><p><strong>Linch:</strong> somehow OpenAI managed to find the few actors in the world who happen to really care about their privacy, the same way that their ex-employees just care so much about privacy that they can't complain about OpenAI. I, yeah, I dunno, man.</p><p><strong>Austin:</strong> I mean, it would definitely make sense to me as just a pure business move. Probably, as a business, OpenAI wants people to think of Sky as Sky, as opposed to as X person's voice.</p><p><strong>Linch:</strong> Yeah, but, except for this current situation, it's much lower liability if one of them does something shitty or speaks out against OpenAI. 
People don't want this to be identified as Sky speaking out against OpenAI. Or if they do something shitty, like get a DUI, they don't want that. Yeah. Obviously, there are lots of reasons why you don't want that.</p><p><strong>Austin:</strong> Sure. But yeah, I guess then there's some thinking of: are they, again, lying about it being a privacy thing? Could they have relatively easily gotten that unlocked, and are just choosing not to because it would be bad? Is it actually secretly Scarlett Johansson's voice in disguise?</p><p>I think these are all like,</p><p><strong>Linch:</strong> Yeah, or like, "the famous Scarlett Johansson impersonator that I hired."</p><p><strong>Austin:</strong> Yeah. And I guess I just, once again, don't really take the conspiratorial view on these things. I just feel like, oh yeah, maybe some of these things.</p><p>I don't know.</p><p><strong>Linch:</strong> Yeah, call me conspiratorial, but I feel like this is just straightforwardly an amoral business doing amoral business things. If they could get away with it, they would.</p><h2>[00:34:41] Should AGI be pursued by a for-profit company?</h2><p><strong>Austin:</strong> Okay. Maybe moving back away from the stuff going on today, and bringing it back out to a more overall question:</p><p>How do you feel about OpenAI's approach, overall, to date?</p><p><strong>Linch:</strong> Yeah. I think it's better than some worlds. I don't think it's among the worst [00:35:00] possible worlds out there.
I think, Sam Altman's personality aside, a company in the position of OpenAI can in fact do a lot worse, can in fact be a lot worse. I think they do talk about safety quite a bit.</p><p>They try to be reasonable a lot of the time. They spend some, not a lot, but some resources on public goods and safety. They've talked about it from the beginning. They care about near-term ethics issues and spend non-trivial resources on that, and so forth.</p><p>In that position, I think they do a fair number of reasonable things. But I talked about this with you, and I briefly mentioned this on the EA Forum: I think that putting themselves in that position is somewhat blameworthy.</p><p><strong>Austin:</strong> As a for-profit corporation</p><p><strong>Linch:</strong> trying to build AGI.</p><p><strong>Austin:</strong> And alternatively, they could have founded this, as they did originally found it, as a nonprofit. But you could have also imagined it as a government-sponsored initiative, or an academic initiative, or something else. Yeah.</p><p><strong>Linch:</strong> A lab within a company not doing AI stuff at all.</p><p>There are many options.</p><p><strong>Austin:</strong> Yeah. I think that was a very interesting point when you brought it up: one could have imagined trying to push to create AGI not inside of a for-profit company,</p><p><strong>Linch:</strong> specifically a for-profit company dedicated to AGI. Yes, dedicated to AGI.</p><p><strong>Austin:</strong> And I guess this is where my worldview comes in, which is very much influenced by thinking of tech startups as very good.
I feel the path of a startup doing a thing is just so much more likely to succeed or execute well, compared to the government, quote unquote, doing a thing, or an academic group or a bunch of researchers doing a thing.</p><p>That's on one hand. On the other are the fundraising considerations, where OpenAI actually started as a nonprofit. And, famously, they decided to switch over to more of a for-profit model and started raising billions of dollars when they realized: okay, actually, to make AGI work, we just need the funding, and as a nonprofit we're not going to get enough in donations to be able to pay the cost of all this compute.</p><p>So they were going to have to raise venture-backed funding from for-profit investors. What would the alternative have been? Maybe try to shut down?</p><p><strong>Linch:</strong> Yeah. If you can't do something morally, then don't do it at all.</p><p>That's the obvious answer. Stay a nonprofit. Don't think of yourselves as the people building AGI; be one of the groups helping to make AGI safe. There's lots of stuff you could do. They could become a policy nonprofit; they could do safety research the way that, say, FAR or Redwood does.</p><p>There are just a lot of options. Or get acquired.</p><p><strong>Austin:</strong> Yeah. I guess I feel like these are all pretty unimpressive slash very bad alternatives in some way.</p><p><strong>Linch:</strong> I guess I care more about the thing being good than the thing being impressive, which I think is a big difference between my internal ethos and that of Silicon Valley.</p><p>I think Oppenheimer was not a morality tale about Oppenheimer being a great person whom everybody should aspire to be like.</p><p><strong>Austin:</strong> Yeah, I hear that.
I think that trying to be good is pretty important, probably more important than being impressive. But at the same time, one thing the altruistic mindset, the good mindset, kind of undervalues is just the degree of uncertainty in the world. You can't get to really good outcomes, I think, just by trying to do good things.</p><p>You actually need feedback loops. You actually need the signal that comes from running a thing at larger and larger scales. And there's the classic question: where has a lot of the good in the world come from? Libertarian-pilled people tend to think that a lot of why we are happier and richer and more prosperous as human beings</p><p>is capitalism, or the liberal world order or something: the ability for people to start things with capital and grow those and scale those up. And that creates most of the good in the world. I think this is pretty compelling, and I think many people agree with this take already.</p><p>Yeah. Yeah. And then there's a specific sub-question: for AGI, should you use the same engine of capitalism to create this thing, or should you try to do something else? And I guess I feel like most of the alternatives you named would not do a very good job of getting to AGI.</p><p>And that being the case, some other for-profit might do that better job.
And I guess if you're sitting in Sam's position, or the founders of OpenAI's position, maybe the very reasonable thing is: okay, we will be the ones to do it instead.</p><p><strong>Linch:</strong> I guess for large-scale scientific projects,</p><p>I don't really buy this.</p><p><strong>Austin:</strong> Okay,</p><p><strong>Linch:</strong> I feel like there are many large-scale scientific projects that are not built by a company dedicated to doing them.</p><p><strong>Austin:</strong> Once upon a time, yes, but I think the track record of these things lately has been quite bad. Lately, for example, I think of SpaceX as a much more [00:40:00] impressive achievement in getting things into space than NASA, where once upon a time I was like, oh yeah, the</p><p>space program got astronauts to the moon.</p><p>Cool. But now, not so much.</p><p><strong>Linch:</strong> It's not true. I don't know, there are Mars rovers, and there are far-flung space things and deep-space probes and stuff that all come from not-SpaceX. SpaceX is mostly scaling up a thing that we already knew how to do, and doing it a lot cheaper, and a bunch of other things that are good.</p><p><strong>Austin:</strong> But the doing-it-a-lot-cheaper part is actually quite important. Oh, I agree. Yeah. Cheapness enables a qualitative shift in what is possible, and enables things like Starlink.</p><p>And I think it will enable sustainable Mars travel or something, which is the goal of the SpaceX launches followed by Starship development.</p><p><strong>Linch:</strong> Yeah.</p><p><strong>Austin:</strong> Yeah. So I don't know how closely you follow the space.
I'm not the most ardent SpaceX fan, but I have a really strong sense that a company, in this case, was a much better way of working on the scientific achievement of at least reusable rockets and things like that.</p><p>Maybe there are things the government seems to do better than for-profits. I was going to say something like fusion, but actually I'm not sure on this point either. Historically, a lot of the research into fusion reactors and things like that was government funded.</p><p>I know Sam Altman has invested in companies like Helion, and I don't know enough about the state of progress here to say which of these is more impressive or will be.</p><p><strong>Linch:</strong> Yeah. Yeah. That's fair. I think a lot of basic research is still funded by the government; for instance, government-funded things in academia eventually spin out once the science is established.</p><p><strong>Austin:</strong> I definitely agree with that, but it's not very clear to me that AGI looks like basic research. Especially if the scaling hypothesis is true, what it requires is scaling and engineering work, stuff that seems to progress much faster inside of large companies.</p><p>Yeah,</p><p><strong>Linch:</strong> so my guess is that if you wanted to build AGI within a government lab, it would be more expensive. Maybe it's 10x more expensive, maybe it's 100x more expensive. Okay. But I'm just like, this is such an important thing that it's worth paying the cost to do it well, in a way that I would not say for most other things.</p><p>I would not say this for spaceships. I would not say this for fusion, probably.</p><p><strong>Austin:</strong> Yeah. And being very expensive has other costs: time to deployment, time to us being able to use the thing.
But also, it doesn't strike me that if the government had thrown 100x as much money and people at the goal of making AGI, the resulting AGI would be more aligned, or better by our lights.</p><p>It's non-obvious. It could be the case, but it's not obviously the case.</p><p><strong>Linch:</strong> I'd be interested in looking at the empirical record of how often disasters happen from more government-y things versus non-government-y things. I would guess that governments, especially within democratic regimes, are safer.</p><p>Certainly you could go into communist territory and it could be bad, but yeah, I just have the intuition that government-run things are in fact safer, because they sacrifice other things for safety. And this is even more true if you control for competency of people.</p><p><strong>Austin:</strong> I think this is where safety becomes too broad a concept to be very useful, right? There's one kind of safety, which is: do people die or something like that as a result of the work here? But I don't think this is the useful kind of safety when it comes to AI safety.</p><p>It's something a lot more complicated and weird: will this thing that we are summoning out of the ether hold our beliefs and act the way we want it to? And I wouldn't expect much of a difference on the do-humans-die-as-a-result-of-AI kind of safety from GPT-4, GPT-5 level work, whether that happens in OpenAI or some government.</p><p><strong>Linch:</strong> Yeah. I don't think there will necessarily be a huge difference, but here are a couple of reasons to think there might be some. One is that alignment research might look
less like mech interp stuff or the scaling hypothesis, and a bit more like fundamental research, or somewhere in between. Even mech interp feels a bit less like you could just scale compute and make a thing happen. Although maybe you could, especially at higher levels of capabilities; I wouldn't necessarily bet on it, at least.</p><p>The other thing is just: are you willing to make the thing stop? Are you willing to slow down if things look scary? And my claim is that the incentives for a for-profit company that's focused on AI are basically the worst possible ones here.</p><p>Not literally the worst. You could have two countries that are really racing, in a hot war against each other, maybe. Even then, I don't know, we've had nuclear treaties and de-escalation before. But a company voluntarily shutting down its main product because of speculative claims? That just seems insane; that just seems really hard. You have to be an extremely honorable, never-risk-seeking, really well-meaning person. And I don't think of any of the AI labs as that. In fact, very few people are that. It just seems like a really weird and insane incentive structure to place yourself in.</p><p><strong>Austin:</strong> Yeah, I can see that. I [00:45:00] guess if I try to think of OpenAI as like an oil company or something like that, with a particular profit incentive to produce a lot of oil, and as a result they do things like lobby Congress or try to spread disinformation about what climate change is going to look like.</p><p>Then that would make me, that makes me a little bit more sympathetic to the idea that, oh, maybe the for-profit structure for OpenAI is not that good.</p><p><strong>Linch:</strong> Yeah, I think that's the main thing I want to emphasize.
But then a secondary point is that it's the main thing they're doing.</p><p>It's not like they're Microsoft or Google or some other company that has a bunch of product lines. It's not like asking Google to cancel, I don't know, the YouTube line because it's going poorly and is bad for people. It's like asking Google to cancel the ads, or asking Facebook to cancel social media.</p><p>It's just a very high bar to ask for a pause or a slowdown.</p><p><strong>Austin:</strong> That's a good point. But I feel like the societal, external response to this kind of thing, like regulation and protests and things like that, will probably do a reasonable job of keeping things in line. And I think this particular structure, of a company trying to do a thing with a very clear mission and a profit incentive,</p><h2>[00:46:10] A hypothetical AGI "Manhattan Project"</h2><p><strong>Austin:</strong> and, insofar as there are negative externalities, that being handled by other aspects of society designed to try to control for that, seems pretty okay. Compared to some hypothetical where this was a Manhattan Project-style initiative, where the government hired the smartest researchers to work for the government instead.</p><p>That doesn't seem better, I guess.</p><p><strong>Linch:</strong> Yeah, why not? You have a Manhattan-style project because people think AGI is going to be really good for the world, and lots of people have mixed opinions. Unlike the actual Manhattan Project, this one was not done because of the Nazis; either because the U.S. is way ahead of China, or because they figured out enough international agreements such that any other possible contenders also have their fingers in the pie. You had all this, and, I don't know.
Civil society is talking about the risk; people debate this regularly; it's a topic of conversation among public intellectuals; and things keep progressing, and there's really no way to stop it?</p><p><strong>Austin:</strong> I think the structure of the Manhattan Project was much more accountable to the U.S. government, or maybe a few specific generals and people inside the U.S. government.</p><p><strong>Linch:</strong> Yes.</p><p><strong>Austin:</strong> And that feels like a much less broadly likely-to-be-good-for-society incentive and reporting structure than what OpenAI has, which is, yes, responsible to its shareholders, but also to its customers and its employees, and a lot more of society.</p><p>Also, famously, the Manhattan Project was very secretive. They didn't tell very many people what they were doing. I think OpenAI as it is set up, and the way most corporations are set up, is much more open, in the sense of letting people know what they're doing and trying to appeal to public opinion, in a way that government projects do not have to be.</p><p><strong>Linch:</strong> I guess the version of the Manhattan Project for AI I'm imagining is one that's much more public,</p><p><strong>Austin:</strong> Much more open than OpenAI is today,</p><p><strong>Linch:</strong> yeah, much more responsive to civil society. It's not going to be a secret line in the government's budget, because unlike in a war, it's going to be pretty clear: we spend X dollars on this thing. If it's a trillion dollars or more of spending, it's going to be the type of thing that's covered in presidential debates.</p><p>It's the type of thing that intellectuals in nearby areas talk about regularly. There are going to be papers in Science. There are going to be position papers.
There are going to be specific details. If it turns out there are algorithmic breakthroughs that are scary, rather than just scale, then specific things will be hidden. But a lot of the details are going to be public.</p><p><strong>Austin:</strong> I guess,</p><p><strong>Linch:</strong> It might look more like Apollo, for example. Sure, I'll concede the American public did not want Apollo to happen.</p><p><strong>Austin:</strong> I'm not sure if this is just a bias because I'm, again, closer to this Silicon Valley world, but I have much more of a sense of how to get my own preferences, and I think a lot of other people's preferences, embedded into the way OpenAI currently operates.</p><p>Compared to if it were a government initiative like Apollo, right? If Apollo were doing something I think is very dangerous, what do I do? Write a letter to somebody, maybe, is the first thing I think of.</p><p>Versus with OpenAI, I go to Twitter. You can try to run a public campaign. You can change what you pay for. Being a consumer of this thing very much affects it.</p><p>The touch points between me, and I think most of society, and a company feel much better developed than the touch points between a government program and society.</p><p><strong>Linch:</strong> That just seems really wrong to me. For OpenAI specifically, I think the underlying reason is that they're not that far away from our social group. If I really want to complain about OpenAI, if I have a very specific technical proposal that I would like them to implement, I could find somebody and just ask them to do it.</p><p>And that's probably true for you as well, and many people in the audience. But this is not like normal civil society.
This is not the majority of people, who aren't</p><p><strong>Austin:</strong> Bay</p><p><strong>Linch:</strong> Area tech-adjacent.</p><p><strong>Austin:</strong> Maybe, if you thought about it, it's like Walmart versus the Kansas [00:50:00] City government or something like that.</p><p>Yeah.</p><p><strong>Linch:</strong> I think if you're somebody in Kansas, do you have a better way to change Kansas City's behavior, or Walmart's?</p><p><strong>Austin:</strong> I'm not sure. I do feel like Walmart is a lot more responsive to your requests or something like that. If you're unhappy, you can talk to somebody at Walmart; a Walmart store manager is going to listen to you and tell you what they can do about it.</p><p>That kind of thing.</p><p><strong>Linch:</strong> Yeah, Walmart is a very consumer-focused company. Sure. Suppose I don't like what Palantir does. Okay, maybe that's not a good example, because they really partner with the government. Blackwater, then, which is, like, a military contractor, or not quite a military contractor, but, it's not laws that make them scary.</p><p>What do I do?</p><p><strong>Austin:</strong> Yeah, it's hard for me to know, because I don't really know what Blackwater is, but I agree that there are many companies in this reference class. But OpenAI does not seem to be in this reference class. They seem to be in the try-to-put-a-thing-in-front-of-many-people</p><p><strong>Linch:</strong> kind of</p><p><strong>Austin:</strong> reference class.</p><p>And that, I think, gives them a lot more direct incentive to do things right by their customers.
I think that's actually a pretty important point. I'm developing this idea right now, just as we're talking, but I actually think it's pretty important: insofar as they're trying to create this AI that serves the world, they have a very strong incentive toward keeping their users happy and providing them a valuable service, in a way that, yeah, is a very powerful engine of corporations and does not exist nearly as much with things like NASA or the Manhattan Project.</p><p><strong>Linch:</strong> Yeah. The Manhattan Project especially. Yeah. But yeah, interesting.</p><p>I do agree that you get a lot more feedback from reality, more data, more ways to see what customers want your product to be.</p><h2>[00:51:45] Government vs company services</h2><p><strong>Linch:</strong> How much of that is a government-versus-company thing? Would you say that, like, Fox, okay, maybe that's not, say CNN, do you think that's more aligned than the BBC, for example?</p><p>Okay.</p><p>Would you say that?</p><p><strong>Austin:</strong> Oh, sorry, do I think CNN is more aligned than the BBC? Yeah, I don't really know what the differences are. The BBC is a British broadcasting company;</p><p>CNN is an American one. They're both relatively neutral in terms of the political spectrum.</p><p><strong>Linch:</strong> Yeah. And the BBC is owned by the British government, and CNN is not.</p><p><strong>Austin:</strong> Yeah, I have very little... Sure, sure.</p><p><strong>Linch:</strong> Yeah, that's fair. That's fair. What examples might we both be familiar with? NPR, any chance?</p><p>Maybe media is not a good example. I don't know. PG&amp;E versus if the government were running it.</p><p><strong>Austin:</strong> Doesn't the government run PG&amp;E? I actually don't really know. They seem very close to a state monopoly.
I guess I agree that the services are not very good, or something like that.</p><p><strong>Linch:</strong> Yeah, I don't know. UPS versus USPS.</p><p><strong>Austin:</strong> Sure. Or Amazon delivery versus UPS or something like that. Or</p><p><strong>Linch:</strong> Amazon delivery runs on UPS and USPS,</p><p><strong>Austin:</strong> among many others. I think Amazon has its own drivers and things like that. I guess I feel a generally higher level of service quality from the Amazon private delivery service than from public US Mail or something like that.</p><p>But okay.</p><p><strong>Linch:</strong> Yeah, I feel like they're similar. Sorry, not Amazon specifically, because it's trying to do a different thing, but if you're trying to get one of my packages from point A to point B, I'm not seeing a big difference between USPS and UPS. I think I agree with that.</p><p>Yeah. Yeah, but maybe that's not fair. I know that SOEs, state-owned enterprises, in general do somewhat worse. This is true in China at least, and I assume it's true across the world. But yeah, that's exactly it, though. The thing that matters to me is: if I'm not a consumer, can I get something to stop?</p><p>And it feels like I have more leverage over government things than corporation things. One reason I do is because governments are more powerful, but we are kind of conditioning on that if we actually have AGI.</p><p><strong>Austin:</strong> Cool. I think we're running at about an hour or so. Did you have any other things you wanted to chat about on the OpenAI side?
Things that you thought are pretty important to your worldview of how you think about OpenAI that you didn't get a chance to express?</p><h2>[00:54:08] Is OpenAI democratic?</h2><p><strong>Linch:</strong> I guess one thing, and I don't know how strongly I feel this, because I'm confused about how much I care about democracy and stuff. But I do think there's an important sense in which OpenAI, sorry, the mission of creating AGI, one of the most powerful technologies of all time, and doing who-knows-what with it, just feels very anti-democratic.</p><p><strong>Austin:</strong> What about it feels anti-democratic?</p><p><strong>Linch:</strong> I guess a lot of people did not consent to this danger being imposed on them. Assuming you have non-trivial pDoom, which, my understanding is, OpenAI leadership at least publicly says they do.</p><p><strong>Austin:</strong> Yeah, I agree that they have non-trivial pDoom. I again think that they expect that them working on it lowers pDoom compared to [00:55:00] them not working on it, compared to some other competitor working on it. Or</p><p><strong>Linch:</strong> yeah,</p><p><strong>Austin:</strong> I guess that they'll do the main thing, or like</p><p><strong>Linch:</strong> Yeah, I think making that decision on behalf of humanity feels a bit odd. And maybe that's the best choice they could make; I'm not unsympathetic to situations where you've just got to do this and wait for other people to catch up. But it does feel like, as a society, we should be asking questions or something.</p><p><strong>Austin:</strong> Maybe. I think many things about OpenAI seem quite democratic to me. The way they are opening up access to ChatGPT to many people is the kind of practical democracy that matters a lot.
Anyone can use the cheaper versions of the models for free, in a way that is helpful to their own lives.</p><p><strong>Linch:</strong> Yeah,</p><p><strong>Austin:</strong> I think that kind of giving things to lots of people and trying to help them is, I don't know if it's democratic exactly, but it's along the vision of trying to help many people, as opposed to concentrating the good things that come out of AI among the few, the people who pay a lot for it or the people in charge of it.</p><p><strong>Linch:</strong> Yeah. I think that's a framing of morality that many EAs have; it's a framing of altruism, or justice. Sure. It's not a framing of participation. It's not a framing of giving people the agency to choose whether things happen.</p><p><strong>Austin:</strong> I think giving people AI in their pockets is a kind of democracy. Democracy can mean one kind of thing: the ability to vote for legislation, or vote for people to enact legislation. But I think the spirit of democracy goes beyond that. It is something like giving people the ability to flourish and pursue happiness, and I think they're doing quite well in that regard.</p><p>On the question of whether the process by which OpenAI makes its decisions is democratic: again, going back to the fact that many people are consumers of OpenAI, they pay OpenAI money for a service that is provided. There's a kind of democracy there; more correctly, it's maybe a kind of plutocracy: the people with the money in this ecosystem influence the decisions of OpenAI.</p><p>But I think practically it's not just who is paying; their free users probably matter a lot to them too.
Like, OpenAI as an entity is trying to serve these people well, because in the future some of them might actually turn into paying customers, that kind of thing.</p><p><strong>Linch:</strong> Yeah. I think this is the type of experiment where, if you don't have massive negative externalities, or massive externalities period, then I'm happy with it; this is just part of being a good civil society. Democracy isn't everything; freedom is the underlying thing that matters more than democracy, and having lots of companies choose to do their own thing, and serve their customers and so forth, enhances freedom a lot.</p><p>I think the question does become different once you're talking about percentage points of doom, or honestly much, much less. It's the same type of thing: you really want free exploration of ideas, and, I don't know, maybe that's the safest path forward, the most freedom-enhancing path forward, in some light and perspective.</p><p><strong>Austin:</strong> Yeah. The standard way by which democracy will have its say in AI is that our elected leaders in Sacramento or Washington are going to sit down and think hard, and talk to lobbyists paid for by many sides of the equation, about what kind of legislation to enact on how the development of AGI happens.</p><p>And I guess OpenAI could be more democratic if they literally conducted polls or surveys of many people in the US and around the world about how they feel. Are you thinking that's the kind of thing that's missing from what a responsible steward of AI would be doing?</p><p><strong>Linch:</strong> Given the incentive structure they picked for themselves, I don't think they're being super irresponsible. They have put some resources into trying to figure out how to govern AI; obviously not enough, but they have many constraints.
They are thinking about how to get more inputs and so forth.</p><p>I think that's to their credit. I think some of the things they do are weird, like a bunch of the lobbying things: saying that they care a lot about safety but then quietly lobbying against it. They tried pretty hard to lobby for an LLM exemption in the EU AI laws, for example. Stuff like that is weird, but overall I'm not massively mad about how they've behaved in the position they put themselves in. It's more that, as a civil society, do we want this type of thing to happen? Are we excited about individual companies making choices on things that might doom us all?</p><p>Obviously it's probabilistic, right? There's a tiny probability that microplastics will be really bad for humanity, or whatever. Sure. That should not be enough to stop plastics companies from refining petroleum, at least not on the information we have.</p><p>Maybe we should spend a bit more resources on researching it or something. But at anywhere close to this level, anywhere close to a car-accident-level risk of killing off humanity, I don't know, it feels odd that civil society is not putting more pressure on this, and maybe rethinking whether we want it to be done at [01:00:00] all by private companies.</p><p><strong>Austin:</strong> Yeah, I do expect this kind of pressure to just continue to ramp up over time, as people start seeing what AI is capable of doing and tying it to their very normal takes on how bad it would be if very smart AI walked the earth.
Yeah.</p><p>Other than that, I guess the critique of OpenAI as somewhat anti-democratic doesn't really resonate with me. But I can see there's something in this reference class, maybe of, oh, a nuclear weapon, but much more dangerous, that maybe should be a thing that democratic processes have some kind of control over, to decide</p><p><strong>Linch:</strong> How it happens, and whether it should happen at all, and when it happens, and stuff like that.</p><p><strong>Austin:</strong> And be, yeah, the drivers in the space, as opposed to right now, where they're looking to play more like a check on the space. Yeah. Cool, cool.</p><p>I think that's a good place to end it. Thanks, that was a great conversation. Thanks for chatting.</p><p><strong>Linch:</strong> Yeah. Hope there are five people, or however many people, who'd like to view this.</p><p><strong>Austin:</strong> You five people, you are awesome.</p><p><strong>Linch:</strong> Yeah. One of my favorite people, I'm sure.</p>]]></content:encoded></item><item><title><![CDATA[Manifund Q1 Retro: Learnings from impact certs]]></title><description><![CDATA[Manifund is a philanthropic startup that runs a website and programs to fund awesome projects.]]></description><link>https://manifund.substack.com/p/manifund-q1-retro-learnings-from</link><guid isPermaLink="false">https://manifund.substack.com/p/manifund-q1-retro-learnings-from</guid><dc:creator><![CDATA[Manifund]]></dc:creator><pubDate>Wed, 01 May 2024 15:30:51 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!5i6s!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fef30b4de-e916-4a10-8276-fd1b91c91322_1374x1138.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>Manifund is a philanthropic startup that runs a website and programs to fund awesome projects.
Since January, we&#8217;ve wrapped up three programs for impact certificates (aka venture-style funding for charity projects): ACX Grants, Manifold Community Fund, and the ChinaTalk essay competition.</p><p>Overall, we&#8217;ve learned a lot and are happy with the projects we&#8217;ve funded, but are less excited by impact certs than before &#8212; it&#8217;s been hard to get investor interest, and we still haven&#8217;t found a use case where certs led to better funding decisions. For the next quarter, we&#8217;re trying out a bunch of things (including Manifest, regranting, and prize challenges) while looking for product-market fit.</p><h1>What we&#8217;ve been working on in Q1</h1><h2>ACX Grants 2024</h2><p>Manifund hosted the 2024 round of ACX Grants, which included both direct funding to some projects and an impact market for any project that opted in and did not receive direct funding. ACX directly funded 33 projects for a total of $1.35M; on the impact market, investors funded 12 projects to their minimum bars, for ~$50k total at a combined valuation of ~$200k.</p><h3>Background</h3><p><a href="https://www.astralcodexten.com/">Astral Codex Ten</a>&nbsp;(ACX) is a blog by Scott Alexander on topics like reasoning, science, psychiatry, medicine, ethics, genetics, AI, economics, and politics.&nbsp;<a href="https://www.astralcodexten.com/p/acx-grants-results-2024">ACX Grants</a> (ACXG)&nbsp;is a program in which Scott helps fund charitable and scientific projects &#8212; see the 2022 round&nbsp;<a href="https://www.astralcodexten.com/p/acx-grants-results">here</a>&nbsp;and his retrospective on ACX Grants 2022&nbsp;<a href="https://www.astralcodexten.com/p/so-you-want-to-run-a-microgrants">here</a>.</p><p>In this round (ACX Grants 2024), some of the applications were&nbsp;<a href="https://www.astralcodexten.com/p/acx-grants-results-2024">directly funded</a> by Scott; the rest were given the option to participate in an impact market, an alternative to grants
or donations as a way to fund charitable projects. Manifund transferred funds from donors to grantees; hosted project proposals on our website; and ran the impact marketplace.</p><h3>ACXG Direct Funding</h3><p>Scott evaluated all of the applications along with his team of judges, then directed a total of $1.35M to 33 projects. You can see the results on <a href="https://www.astralcodexten.com/p/acx-grants-results-2024">ACX</a> or on the <a href="https://manifund.org/causes/acx-grants-2024?tab=grants">&#8220;Grants&#8221; tab</a> of Manifund&#8217;s ACXG 2024 page. Example grants:</p><ul><li><p><a href="https://manifund.org/projects/create-open-sour">Open source predictors for polygenic screening</a>, by Gene Smith</p></li><li><p><a href="https://manifund.org/projects/build-anti-mosqu">Building anti-mosquito drones</a>, by Alex Toussaint</p></li><li><p><a href="https://manifund.org/projects/validate-a-solut">Solving Far-UVC ozone generation</a>, by Jacob Swett</p></li></ul><p>This was relatively straightforward to run, as making direct grants is similar to what we did in our regranting program last year. The biggest operational burden was collecting funding from a variety of donor-advised funds for donations to the ACX Grants program.</p><h3>ACXG Impact Market</h3><p>Projects that weren&#8217;t given a direct grant were eligible to participate in an impact market, and the ones that opted in are listed on the <a href="https://manifund.org/causes/acx-grants-2024?tab=certs">&#8220;Impact Market&#8221; tab</a>. You can read more in our announcement post about the impact market <a href="https://manifund.substack.com/p/acx-grants-2024-impact-market-is">here</a>.
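</p><p>To make the mechanics concrete, here&#8217;s a rough sketch of the basic cert economics (a hypothetical illustration with made-up numbers, not Manifund&#8217;s actual implementation): an investor who buys a fraction of a project&#8217;s certs at its seed valuation later receives that same fraction of any retroactive prize, paid out in charitable dollars.</p>

```python
# Hypothetical sketch of impact-cert payout math (not Manifund's real code).

def retro_payout(invested: float, seed_valuation: float, prize: float) -> float:
    """Charitable dollars returned to an investor who bought
    invested / seed_valuation of a project's impact certs, once
    retro funders award `prize` for the project's impact."""
    equity_share = invested / seed_valuation
    return equity_share * prize

# $1,000 invested at a $20,000 seed valuation buys 5% of the certs;
# a later $50,000 retro prize returns $2,500 in charitable dollars.
print(retro_payout(1_000, 20_000, 50_000))  # 2500.0
```

<p>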
Example projects that were funded:</p><ul><li><p><a href="https://manifund.org/projects/run-a-public-onl">Run a public online Turing Test</a>, by Cameron Jones</p></li><li><p><a href="https://manifund.org/projects/scaling-legal-im">Scaling Legal Impact for Chickens</a>, by Alene Georgia Anello</p></li><li><p><a href="https://manifund.org/projects/briico-helps-mig">Platform for migrants to start microbusinesses</a>, by Emily Kerr-Finell</p></li></ul><p>Five retroactive prize funders agreed to participate: next year&#8217;s ACX Grants, the Survival and Flourishing Fund, the Long-Term Future Fund, the Animal Welfare Fund, and the Effective Altruism Infrastructure Fund. In 2025, they&#8217;ll award prize funding to successful projects, and the investors who bet on those projects will receive their share in charitable dollars.</p><p>Some key differences between this impact market and <a href="https://manifund.org/causes/manifold-community">previous</a> <a href="https://manifund.org/causes/acx-mini-grants">impact</a> <a href="https://manifund.org/causes/ai-worldviews">markets</a> we&#8217;ve run:</p><ul><li><p><strong>Retro funding:</strong> This impact market:</p><ul><li><p>&#8230;was the first one we ran with multiple retroactive funders. We&#8217;ll have a better understanding of how this &#8220;went&#8221; after the retroactive evaluations happen (next year).</p></li><li><p>&#8230;had the widest range of potential retro funding. Retro funders committed to evaluating projects retrospectively as if they were a prospective grant with a 100% success likelihood, which meant that prizes could be competitive with the $5-$33 million that the retro funders disburse yearly. This made some of the investing process more difficult, as the likelihood and quantity of retro funding were less clear. However, the goals of the retro funders were broadly well-defined (and further collated in our announcement post).
Also, attempts to make this <em>more</em> well-defined would have traded off against flexibility for both the retro funders and the project creators &#8212; and we think it was worthwhile to test the limits of an impact market in this direction.</p></li><li><p>&#8230;had the least defined criteria. We broadly said &#8220;aim for the goals of the retro funders,&#8221; and let everything else figure itself out, rather than &#8220;here&#8217;s a specific goal, achieve it.&#8221; This also made some of the investing process more difficult, as it made the likelihood and quantity of retro funding less clear.</p></li></ul></li><li><p><strong>Project type &amp; scope:</strong></p><ul><li><p>Since this impact market was for projects that had already been rejected from a previous round, we experienced some selection effects, both advantageous and adverse. Most noticeably, a lot of the projects that would&#8217;ve been obviously good to fund had already received funding from the previous round, so only projects that were not obviously good were included (which included some fantastic projects, just not ones that were <em>obviously</em> fantastic at first glance).</p></li><li><p>There was no central or obvious &#8220;theme&#8221; (besides being for ACX Grants, which itself is incredibly broad). Previous impact markets had been about <a href="https://manifund.org/causes/ai-worldviews">AI worldviews</a>, or <a href="https://manifund.org/causes/acx-mini-grants">forecasting &amp; prediction markets</a>, or the <a href="https://manifund.org/causes/manifold-community">Manifold community</a>.
The wider theme led to a more diverse array of projects, making it harder for investors to compare them.</p></li></ul></li></ul><p>The ACXG Impact Market overall went okay, but there were definitely a few areas where it fell short:</p><ul><li><p><strong>Participation:</strong> we had less participation from investors &#8212; both in terms of external $ invested and comments/interaction &#8212; than we had hoped. One of the projects directly funded in ACX Grants (<a href="https://manifund.org/projects/build-anti-mosqu">Building anti-mosquito drones</a>, by Alex Toussaint) went viral on Twitter and received a bunch of small donations. We would&#8217;ve liked to see this kind of momentum on one of our impact market proposals, to build up a more sustained environment of donation &amp; interaction.</p><ul><li><p>We launched a $500 micro-regranting program to get more investors to participate; see below for more details on how that went (spoiler: pretty well!).</p></li></ul></li><li><p><strong>Complexity/confusion:</strong> the mechanisms behind an impact market are confusing, and the auction mechanisms we used were especially confusing &amp; complex. We&#8217;re considering ways to adapt the style of a <a href="https://www.investopedia.com/simple-agreement-for-future-equity-8414773">SAFE</a> or an <a href="https://capbase.com/most-favored-nation-mfn-clause-in-startup-investing-what-it-is-and-how-it-works/">MFN</a> clause from VC investing to an impact market.</p></li><li><p><strong>Financial operations:</strong> we ran into a few unexpected and uncorrelated issues with our payment processor, Mercury. For the vast majority of grants there wasn&#8217;t any problem, but for a few grants it led to delays of about 4-5 weeks.</p></li></ul><h3>Micro-Regranting Program</h3><p>Halfway through the impact market, we noticed that it was receiving less attention than we would&#8217;ve hoped.
We decided to run a &#8220;micro-regranting program,&#8221; an extremely scaled-down version of the <a href="https://manifund.org/about/regranting">regranting programs</a> we&#8217;ve run in the past.</p><p>We described the program in the application form:</p><blockquote><p>Manifund is assembling a small cohort of charitable-funding enthusiasts to try their hands at impact investing ... If selected,&nbsp;you'll get to allocate $500 of charity budget&nbsp;to invest in impact certificates for the 2024 Astral Codex Ten Grants program; you can donate any profit from the investments to the charity of your choice. We're looking for people who are excited to thoughtfully consider how to allocate their investments, similarly to Manifund's regrantors (see examples&nbsp;<a href="https://manifund.substack.com/p/what-were-funding-weeks-2-4">here</a>), and to post some feedback on the funding decisions they make.</p></blockquote><p>The description &#8220;the world&#8217;s tiniest and most wholesome hedge fund&#8221; was aptly used. And overall, the micro-regranting program was a huge success! Some parts that went particularly well:</p><ul><li><p>We posted a <a href="https://twitter.com/manifund/status/1768786442300379517">few</a> <a href="https://forum.effectivealtruism.org/posts/u7q3ppFy3XyNx3mXn/participate-in-manifund-microgrants-an-acx-grants-giving">announcements</a> to elicit applications, and we received 26 applications in just a few days (way more than we expected, much faster than we expected), and approved 24 of them. 
The high acceptance rate wasn&#8217;t reflective of an especially low bar for micro-regrantors; we just got a ton of fantastic applications!</p></li><li><p>We moved pretty fast on this &#8212; it took us just about a week from the first announcement of the micro-regranting program to having built up a solid cohort of micro-regrantors.</p></li><li><p>The micro-regrantors left many thoughtful comments, gave us and project creators detailed feedback over calls &amp; in the Discord, and helped allocate about $17k in funding. Project creators seemed to enjoy chatting with the micro-regrantors in the comments, helping everyone better understand these proposals.</p></li><li><p>The micro-regrantors overall enjoyed the process. On our feedback form (n=17), responses to &#8220;did you enjoy being a micro-regrantor?&#8221; on a scale of 1 (hated it) to 3 (neutral) to 5 (loved it) had a mean of 3.8 and a median of 4. Some anonymized quotes:</p><ul><li><p>&#8220;People talk on the [EA] Forum all the time about grantmaking, some suggesting it is wizardry for the talented few, others suggesting it should be for the democratic-EA masses. Unsurprisingly, it turns out the truth may be somewhere in between. I took on the role mainly for the educational (to me) / experiential value, and I wasn't disappointed with what I got in return for my time investment.&#8221;</p></li><li><p>&#8220;[I enjoyed] having to seriously think about what projects will get traction and stand a good chance of making impact, [as well as] engaging with other projects from a microgranting perspective - purpose to my engagement.&#8221;</p></li><li><p>&#8220;I aimed to build a small portfolio of projects and I did build it. It was fun!&#8221;</p></li></ul></li><li><p>We received a lot of helpful feedback about the Manifund site that might have otherwise been difficult to tease out of the average user.</p></li></ul><p>We were really happy with how the micro-regranting program went.
But some things went less well than we would&#8217;ve hoped:</p><ul><li><p>Most of the projects needed more than $500, which meant that micro-regrantors needed to coordinate with each other &#8212; or expect follow-on donations from external folks. Both of those happened, but neither worked as well as it could have. Some ideas we had for the future:</p><ul><li><p>Set up more coordination points, and make them more concerted. We had a channel in our Discord for micro-regrantors, which was really great for coordination between the few micro-regrantors who used it, but didn&#8217;t work well for most of them. Some possible ideas: a video call with all of the micro-regrantors to chat about projects they wanted to coordinate on; a dedicated section on Manifund for micro-regrantors to discuss grants; a shared email thread; asking micro-regrantors slightly more strongly to join &amp; use the Discord; etc.</p></li><li><p>Reach out to external donors. We had at least one large donor contribute additional funding based on decisions made by micro-regrantors, but there were obvious steps we didn&#8217;t take to seek out additional large donors; e.g., emailing some we know, making an announcement on the EA Forum, etc. Part of the problem here was that we were pretty time-crunched, which constrained the actions we felt comfortable requesting of external donors.</p></li></ul></li><li><p>Some of the micro-regrantors found the structure of the impact market (including the valuations, auction mechanism, equity structure, retroactive funding, etc.) extremely confusing. This was mirrored by some of the feedback we received from project creators, and we intend to reduce the confusion &amp; complexity in future rounds.</p></li><li><p>Some of the micro-regrantors strongly disliked the framing of an impact market as encouraging micro-regrantors to be profit-seeking.
We probably could&#8217;ve framed this better in our request for micro-regrantors, and selected for those who&#8217;d be especially interested in the profit-seeking component of an impact market.</p></li><li><p>The timing was pretty tight &#8212; we would&#8217;ve liked to give micro-regrantors a few more weeks to sift through projects and make decisions, which would&#8217;ve also let them coordinate more and let us contact some external donors.</p></li></ul><p>Again &#8212; overall, we were incredibly happy with the results of the micro-regranting program. We intend to run additional programs like this in the future.</p><p>Big thanks to <a href="https://manifund.org/AntonMakiievskyi">Anton Makiievskyi</a>, who funded all of the micro-regrantor budgets &#8212; and allocated an extra $5k to <a href="https://manifund.org/Jason">Jason&#8217;s</a> budget after noticing Jason&#8217;s deep &amp; thoughtful comments on the platform and on the Discord. (And that sort of dynamic is <em>exactly</em> the sort of thing we were hoping would come out of the micro-regranting program!)</p><h2>Manifold Community Fund</h2><p>We ran an impact market for Manifold community projects from November 2023 to February 2024. 
We had some ideas about changes that we wanted to make for the (then upcoming) ACX Grants 2024 impact market, and thought this would be a good way to test those on a smaller scale, in addition to being a way to get some cool contributions to Manifold.</p><p>In the end, we paid out $16k to 12 projects:</p><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!5i6s!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fef30b4de-e916-4a10-8276-fd1b91c91322_1374x1138.png"><img src="https://substackcdn.com/image/fetch/$s_!5i6s!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fef30b4de-e916-4a10-8276-fd1b91c91322_1374x1138.png" width="1374" height="1138" alt="Manifold Community Fund payouts to 12 projects" loading="lazy"></a></figure></div><p>Here are some changes we tested on the structure of impact markets, and how we think they went:</p><ul><li><p><strong>Ongoing evals and payouts</strong>: instead of allocating all of the prize money at once at the end of the program, we awarded prize money (i.e. made buy offers) once per month, for a total of three evaluation rounds. We think this provided useful feedback for creators, as evaluators generally wrote comments about why they valued the project the way they did, but that the actual offers were pretty irrelevant.
At each intermediate evaluation, we were only valuing work done so far, so expected valuation was always higher than the offers, and no offers were actually accepted until the final round.</p><ul><li><p>We also learned that evaluation takes a long time and isn&#8217;t that fun&#8212;participation in evals from the Manifold team went down each month, and in particular the final evals were delayed by an entire month.</p></li></ul></li><li><p><strong>Automated Market Maker (AMM)</strong>: in previous impact markets we had run, there was very little active trading on the certificates after the seed round, perhaps because the only way to trade was via limit orders, so you always needed to find a trade partner. To make active trading easier, we implemented an automated market maker. This did facilitate more active trading, but overall didn&#8217;t seem worth it. For one, all of the AMMs were overpriced, perhaps because we implemented buying and selling but not shorting, which made it impossible to bet against a project if you hadn&#8217;t previously bet for it. Additionally, some equity and some USD were needed to seed the AMM, which meant that the seed round valuations had to be artificially inflated.</p></li><li><p><strong>No auction</strong>: in order to simplify the fundraising process, we eliminated the auction from the seed funding process. Instead, creators just chose the amount they wanted to raise and made an offer at the implied valuation. This was indeed a simpler <em>mechanism</em> to understand, but put a lot more pressure on the creator to set their parameters wisely, because their funding minimum and their funding goal are the same number, and choosing it requires trading off the probability of raising any funding at all against the chance to raise more money. With the auction, on the other hand, they could set a low minimum and let the market give them more if it so pleased.
All in all, this change seemed to cause people <em>more</em> confusion, not less.</p></li></ul><p>In the end, we did not bring most of the ideas we tested with the Manifold Community Fund to the ACX impact market. There were, however, some really awesome projects that came out of it, specifically Case&#8217;s contributions to the Manifold codebase and wasabipesto&#8217;s updates to Calibration City.</p><p>Overall, this updated us against impact markets for this kind of use case. As with the ACXG impact market, we were disappointed with the amount of investor participation, both upfront and throughout the Community Fund. The best projects to come out of it didn&#8217;t actually get any upfront funding, which suggests a simple prize would have worked just as well in this case. And making an easy-to-understand UX for both the creators and the investors in an impact market remains an unsolved problem.</p><h2>ChinaTalk Essay Competition</h2><p>ChinaTalk is a twice-weekly newsletter and podcast on China, technology, and US-China relations, with an audience of over 35,000, hosted by Jordan Schneider. They have previously received a <a href="https://manifund.org/projects/support-for-deep-coverage-of-china-and-ai">$17k regrant</a> on Manifund.</p><p>Late last year, we partnered with ChinaTalk to sponsor and host <a href="https://manifund.org/causes/china-talk?tab=about">their essay competition</a>:</p><blockquote><p>ChinaTalk has teamed up with <a href="https://manifold.markets/">Manifold</a> to give away $6,000 in prize money to essays making a bold prediction on the future of China.
We&#8217;ll choose winners in February 2024.</p><p>The top 3 essays will win a cash prize and will be interviewed as a guest on the ChinaTalk podcast to discuss their essay with Jordan.</p><p>We&#8217;re inviting you to submit an essay making a bold prediction on a subject that ChinaTalk&#8217;s audience would connect with.</p></blockquote><p>We agreed to partner on this because:</p><ol><li><p>We wanted to try to make &#8220;impact certificates for essay contests&#8221; work. We&#8217;d previously tried this with OpenPhil&#8217;s AI Worldviews Contest; however, we didn&#8217;t receive many submissions (and none of the winning ones) through our platform, as we didn&#8217;t secure an official partnership with OpenPhil in time.</p></li><li><p>ChinaTalk offered to take on most of the promotional and organizational work, in exchange for Manifold paying for their time to do so. As promotion and organization are two of the most expensive pieces of organizing such a prize competition (the third being the cost of the prize itself), we were curious to see how outsourcing them would turn out.</p></li><li><p>We wanted to commission high-quality essays that would make good use of Manifold&#8217;s prediction markets. We&#8217;d seen other competitions (like the ACX Book Reviews Contest) produce extremely good essays that got a lot of organic virality, and thought ChinaTalk was well-positioned to do this given their reach and expertise in the subject matter.</p></li></ol><p>In the end, the competition was won by Lily Ottinger, with her piece &#8220;<a href="https://www.chinatalk.media/p/sino-soviet-split-20">Sino-Soviet Split 2.0</a>&#8221;. While we learned a lot from this partnership, I think that Manifund did not achieve the goals we&#8217;d set for this competition, for a few reasons:</p><ul><li><p>The impact certificate framing was confusing to the competitors; our site also presented some technical difficulties for essay submission (which we&#8217;ve now fixed).
Moreover, we didn&#8217;t catch the attention of any investors; the use case of &#8220;speculating on the best essays&#8221; did not seem to catch on.</p></li><li><p>We received fewer submissions than we&#8217;d hoped for, and most submissions did not make use of concrete predictions or markets in the way that we&#8217;d wanted. <a href="https://www.chinatalk.media/p/five-betting-markets-for-2024-london">Jordan himself did write up a great post</a> with concrete predictions, but this was largely orthogonal to the essay submissions themselves.</p></li><li><p>Though the ChinaTalk team (both Jordan and Caithrin Rintoul, who handled much of the competition logistics) were professional and excellent to work with, any kind of partnership across orgs is fraught &#8212; there are always many delays in communication and nuances to work out. In particular, this competition was something like our third priority (behind the Manifold Community Fund and ACX Grants), so we were less responsive and less focused on making the competition go well than we would have liked.</p></li></ul><h2>People Updates</h2><p>Manifund is growing! For the first time since we started our work, we&#8217;ve brought on a new full-time hire: Saul Munn, to run strategy and operations for Manifund. You might remember Saul from his work on Manifest last year, which he&#8217;s returning to organize this year; he also cofounded the <a href="https://www.opticforecasting.com/">OPTIC</a> intercollegiate forecasting tournament.</p><p>We also engaged two folks to help us out as consultants:</p><ul><li><p>Dave Kasten planned and conducted a variety of interviews with charity leaders outside of EA, to see whether they might be interested in working with Manifund on impact certs.</p></li><li><p>Lily Jordan wrote up explanations of how Manifund operates, kicked off our ACX Grants micro-regranting program, and improved a bunch of site UX.</p></li></ul><p>Thanks to Dave &amp; Lily for all your help!
We&#8217;d love to publish the research they&#8217;ve done for us at some point; in the meantime, you can view drafts <a href="https://www.notion.so/Dave-K-Takeaways-from-January-2024-interviews-1ed4e78ce622414f972dccfb6b105f4e?pvs=21">here</a> and <a href="https://www.notion.so/Lily-J-manifund-s-approach-65dc6e24d998448a93bb67b7e2ce76b2?pvs=21">here</a>.</p><p>Finally, Austin has <a href="https://manifold.markets/Austin/will-i-regret-leaving-manifold">officially left the Manifold team</a>, to focus on Manifund and related work. On a related note, we&#8217;re considering rebranding away from &#8220;Manifund&#8221; so that the separation between Manifold and Manifund is more clear (the two orgs are separately incorporated with separate finances, and no longer share employees).</p><h2>Site Updates</h2><p>Throughout this quarter, we&#8217;ve generally been more focused on wrapping up funding rounds, which required a lot of operational support, than on site changes. Still, here are a few significant features we added recently:</p><ul><li><p>Comment reactions: you can now react to comments with emojis, or even leave tips on comments which go to the commenter&#8217;s charity balance.</p><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!5RhD!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff2453a43-2571-431f-bd9b-29ebcc4842f7_1690x292.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!5RhD!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff2453a43-2571-431f-bd9b-29ebcc4842f7_1690x292.png 424w,
https://substackcdn.com/image/fetch/$s_!5RhD!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff2453a43-2571-431f-bd9b-29ebcc4842f7_1690x292.png 848w, https://substackcdn.com/image/fetch/$s_!5RhD!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff2453a43-2571-431f-bd9b-29ebcc4842f7_1690x292.png 1272w, https://substackcdn.com/image/fetch/$s_!5RhD!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff2453a43-2571-431f-bd9b-29ebcc4842f7_1690x292.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!5RhD!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff2453a43-2571-431f-bd9b-29ebcc4842f7_1690x292.png" width="1456" height="252" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/f2453a43-2571-431f-bd9b-29ebcc4842f7_1690x292.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:252,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:244755,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!5RhD!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff2453a43-2571-431f-bd9b-29ebcc4842f7_1690x292.png 424w, 
https://substackcdn.com/image/fetch/$s_!5RhD!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff2453a43-2571-431f-bd9b-29ebcc4842f7_1690x292.png 848w, https://substackcdn.com/image/fetch/$s_!5RhD!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff2453a43-2571-431f-bd9b-29ebcc4842f7_1690x292.png 1272w, https://substackcdn.com/image/fetch/$s_!5RhD!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff2453a43-2571-431f-bd9b-29ebcc4842f7_1690x292.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a></figure></div><p></p></li><li><p>Better agreement display: now, the grant agreement displays the grantee&#8217;s agreement as a signature rather than simply a checkbox, and displays Manifund&#8217;s countersignature.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!MNDS!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7705034e-2c7d-4060-b61a-66aa612574f8_1708x428.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!MNDS!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7705034e-2c7d-4060-b61a-66aa612574f8_1708x428.png 424w, https://substackcdn.com/image/fetch/$s_!MNDS!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7705034e-2c7d-4060-b61a-66aa612574f8_1708x428.png 848w, 
https://substackcdn.com/image/fetch/$s_!MNDS!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7705034e-2c7d-4060-b61a-66aa612574f8_1708x428.png 1272w, https://substackcdn.com/image/fetch/$s_!MNDS!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7705034e-2c7d-4060-b61a-66aa612574f8_1708x428.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!MNDS!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7705034e-2c7d-4060-b61a-66aa612574f8_1708x428.png" width="1456" height="365" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/7705034e-2c7d-4060-b61a-66aa612574f8_1708x428.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:365,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:186271,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!MNDS!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7705034e-2c7d-4060-b61a-66aa612574f8_1708x428.png 424w, https://substackcdn.com/image/fetch/$s_!MNDS!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7705034e-2c7d-4060-b61a-66aa612574f8_1708x428.png 848w, 
https://substackcdn.com/image/fetch/$s_!MNDS!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7705034e-2c7d-4060-b61a-66aa612574f8_1708x428.png 1272w, https://substackcdn.com/image/fetch/$s_!MNDS!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7705034e-2c7d-4060-b61a-66aa612574f8_1708x428.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a></figure></div><p></p></li></ul><ul><li><p>Draft mode: for the ACXG impact market, we added a &#8220;draft mode&#8221; that projects can be in before 
they&#8217;re published as proposals. In this mode, the creator can save edits to their project over multiple sessions, rather than needing to publish immediately.</p></li></ul><div><hr></div><h1>Where we&#8217;re headed in Q2</h1><p>With the ACX Grants impact market and other rounds wrapped up, we now want to take a step back and ask: &#8220;What do we need to get to product-market fit?&#8221; We have a bunch of ideas on directions we can take Manifund; over the next quarter, we&#8217;d like to get more concrete signal about which of these are promising. Some of the initiatives we already have in flight:</p><h3><strong>Manifest 2024</strong></h3><p>Manifest is a festival for forecasting and prediction markets. It&#8217;s more of a Manifold event in spirit, but it&#8217;s being planned by the Manifund team &#8212; co-led by Rachel and Saul! It officially kicks off Jun 7-9, but LessOnline and Summer Camp start the week before; we expect to be quite busy in the next few weeks leading up to the event.</p><p>(Get your tickets at <a href="https://manifest.is">https://manifest.is</a>; it&#8217;s going to be a blast!)</p><h3><strong>AI Safety Regranting</strong></h3><p>We&#8217;re renewing our AI Safety regranting program, allocating ~$1.6m to 6 world-class experts in the field: Adam Gleave, Dan Hendrycks, Evan Hubinger, Leopold Aschenbrenner, Ryan Kidd, and Neel Nanda. More details about this soon, but we&#8217;re excited to be running this again and hope to scale it up, with more funding and regrantors this year!</p><h3><strong>Mixtral Challenge (with Trenton Bricken &amp; Dwarkesh Patel)</strong></h3><p>On a <a href="https://www.dwarkeshpatel.com/p/sholto-douglas-trenton-bricken">recent podcast episode</a>, Dwarkesh, Trenton, and Sholto suggested the idea of having a prize challenge to better understand the open-source Mixtral model, based on the success of the Vesuvius Challenge. 
This got <a href="https://twitter.com/dwarkesh_sp/status/1773770379976409213">memed into existence on Twitter</a>, with Nat Friedman and others offering $45k in prize funding for this challenge. We&#8217;re working with Trenton and Dwarkesh to run this competition on Manifund: we host the contest site and handle funds transfer, while Trenton coordinates and judges.</p><h3><strong>Fiscal sponsorship</strong></h3><p>We&#8217;ve been talking with orgs like Asterisk, PauseAI, and Sage about providing a fundraising platform and fiscal sponsorship relationship (i.e., 501(c)(3) tax deductibility for donations to these orgs). We&#8217;re already informally sponsoring many of the projects on our site, but providing this as a proper service solves a pain point for orgs that need it to receive donations.</p><h3><strong>EA Funds collaboration</strong></h3><p>We&#8217;ve been in close contact with Linch and Caleb of EA Funds, as their work (especially on the LTFF) is the closest in nature to what Manifund has tackled. There are a bunch of ways Manifund and EAF could work together; we&#8217;d be excited to combine their grantmaking expertise with our focus on delivering great software.</p><h3><strong>Manifold Charity program</strong></h3><p>We&#8217;re still supporting mana donations to charity based out of our original Future Fund grant, but we&#8217;re likely to sunset this program once Manifold <a href="https://www.notion.so/A-New-Deal-for-Manifold-c6e9de8f08b549859c64afb3af1dd393?pvs=21">supports real money prizes via sweepstakes</a>.</p><h1>Other</h1><h2>People we&#8217;d love to chat with</h2><p>As we scale up, there are a lot of folks we&#8217;d like to talk to and learn from. 
If this describes you (or someone you can intro), reach out to <a href="mailto:austin@manifund.org">austin@manifund.org</a>!</p><ul><li><p><strong>Creators</strong> who&#8217;d like to launch a grants program or prize challenge for their audience (e.g. Scott Alexander, Dwarkesh Patel)</p></li><li><p><strong>Grantees and orgs</strong> who are curious about working with Manifund, whether as a fiscal sponsee, impact cert founder, or otherwise</p></li><li><p><strong>Medium-sized individual donors</strong> ($10k+/year) who find Manifund&#8217;s approach exciting and want to fund one of our programs, whether that&#8217;s regranting, impact certs, individual projects, or something else</p></li></ul><h2>Links &amp; callouts</h2><ul><li><p>Aaron Silverbook&#8217;s Lumina Probiotics has been on a tear lately, with a very hyped launch; coverage from Scott Alexander, Yishan Wong, Richard Hanania, and other bloggers and press; and sales going through the roof &#8212; we&#8217;re glad to have been <a href="https://manifund.org/projects/recreate-the-cavity-preventing-gmo-bacteria-bcs3-l1-from-precursor-">an early investor</a>!</p></li><li><p>Alex Toussaint&#8217;s <a href="https://manifund.org/projects/build-anti-mosqu">mosquito-killing drone project</a> recently blew up on Twitter, with a video demo of the sonar getting 1M views and attracting $2500+ in crowdfunded donations from 10 folks.</p></li><li><p><a href="https://manifund.org/projects/neuronpedia---ai-safety-game">Johnny Lin</a> and <a href="https://manifund.org/projects/independent-researcher">Joseph Bloom</a>, two separate Manifund grantees, have teamed up. 
Neuronpedia <a href="https://www.lesswrong.com/posts/BaEQoxHhWPrkinmxd/announcing-neuronpedia-platform-for-accelerating-research">now provides web tooling for sparse autoencoders</a>, using visualization tools similar to those used by Anthropic, and has since received follow-on funding from LTFF.</p></li><li><p>Joel Tan of CEARCH (<a href="https://manifund.org/projects/regrant-to-chari">ACXG 2024</a> grantee) published <a href="https://forum.effectivealtruism.org/posts/xGqpQKf2FpjvwJe6q/ea-meta-funding-landscape-report">this excellent report</a> on the landscape of EA meta funding.</p></li><li><p>Nonlinear Network is running a funding circle for AI safety and related fields, with hundreds of project applications, reviews from experts, and &gt;$1m moved in donations in just a couple weeks.</p></li><li><p>Alexander Berger of OpenPhil <a href="https://www.openphilanthropy.org/research/our-progress-in-2023-and-plans-for-2024">recaps 2023 and writes about their funding plans for 2024</a>.</p></li></ul><h2>Ambitious ideas we&#8217;d be excited about</h2><p>We&#8217;ve been brainstorming on a bunch of other things we could do; here are some very rough notes. 
If you&#8217;d like to help make these happen, comment below or reach out to <a href="mailto:austin@manifund.org">austin@manifund.org</a>!</p><p><strong>Extending the regranting model</strong></p><ul><li><p><a href="https://www.notion.so/celeb-regranting-as-a-service-221364b2b4c942218a7264a9442e0433?pvs=21">&#8220;celeb regranting as a service&#8221;</a> &#8212; manufacture more &#8220;ACX Grants &amp; Emergent Ventures&#8221;</p><ul><li><p>See this proposal we&#8217;d sent to Dwarkesh Patel: <a href="https://www.notion.so/Dwarkesh-Grants-proposal-c613774bb5df459f852a183e2cd60dbb?pvs=21">&#8220;Dwarkesh Grants&#8221; proposal</a>, which led to the Mixtral Challenge collab described above</p></li></ul></li><li><p><a href="https://www.notion.so/microregrantors-b7ff5a8e7bfc4b05a8370473719205ce?pvs=21">microregrantors</a> &#8212; crowdsourcing EA donation decisions</p></li></ul><p><strong>Building a robust impact marketplace</strong></p><ul><li><p><a href="https://www.notion.so/Invest-Anthropic-OpenAI-stock-2e1f404db6ab49a9a83a0f6d7bf948a0?pvs=21">Invest Anthropic &amp; OpenAI stock</a> &#8212; instead of certs on tiny orgs with little data, would there be more excitement for certs on larger orgs? This might just look like allowing</p></li><li><p><a href="https://www.notion.so/Manifund-DAF-e618f899b3064e80882c478d8db4344b?pvs=21">Manifund DAF</a> &#8212; medium-sized (~$10-500k) independent donations inside EA are currently a hodgepodge; we could build up an ecosystem of medium-sized donors and projects seeking funding.</p></li><li><p><a href="https://www.notion.so/SAFIE-Standard-Agreement-for-Future-Impact-Equity-6408a0387fe941ffb2fce741715e55e7?pvs=21">SAFIE (Standard Agreement for Future Impact Equity)</a> &#8212; The SAFE simplified early startup investing. 
Can we do this for early charity funding &#8212; and make it backwards compatible with regular grants?</p></li></ul><p><strong>Funding more home runs</strong></p><ul><li><p><a href="https://www.notion.so/Hosting-amazing-projects-hits-5c3dc0221cf245a1841d4e4258fd60c7?pvs=21">Hosting amazing projects (&#8220;hits&#8221;)</a> &#8212; in venture capital, a few hits dominate the portfolio; anecdotally, this seems to be the case in early-stage charity funding too. How do we find these projects and encourage them to come to Manifund?</p></li><li><p><a href="https://www.notion.so/E-Accelerator-e45d893cb8c64add9341ec59c3156248?pvs=21">E/Accelerator</a> &#8212; or do we try to directly create more of them, especially with a bias towards software (Manifund&#8217;s comparative expertise in the EA ecosystem)?</p></li><li><p><a href="https://www.notion.so/EA-Common-App-8fa102acc8854245bd4f60a0998ebc91?pvs=21">EA Common App</a> &#8212; when Manifold was first getting started, applying to a bunch of different funders, each asking for mostly the same information, was kind of a pain. 
Like the collegiate Common App, can we make one for &#8220;EA organizational funding&#8221;?</p></li></ul>]]></content:encoded></item><item><title><![CDATA[ACX Grants 2024: Impact market is live!]]></title><description><![CDATA[See our recs, meet the retro funders, and apply to regrant $500]]></description><link>https://manifund.substack.com/p/acx-grants-2024-impact-market-is</link><guid isPermaLink="false">https://manifund.substack.com/p/acx-grants-2024-impact-market-is</guid><dc:creator><![CDATA[Manifund]]></dc:creator><pubDate>Tue, 19 Mar 2024 16:12:44 GMT</pubDate><enclosure url="https://substack-post-media.s3.amazonaws.com/public/images/26257767-ff30-4fef-8577-160fb9bae661_1888x1236.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>The Astral Codex Ten (ACX) Grants impact market is <a href="https://manifund.org/causes/acx-grants-2024?tab=certs">live</a> on Manifund &#8212; invest in 50+ proposals across projects in biotech, AI alignment, education, climate, economics, social activism, chicken law, etc. You can now invest in projects that you think will produce great results, and win charitable dollars if you are right!</p><p>For this round, the retroactive prize funders include:</p><ul><li><p>next year&#8217;s ACX Grants</p></li><li><p>the Survival and Flourishing Fund</p></li><li><p>the Long-Term Future Fund</p></li><li><p>the Animal Welfare Fund</p></li><li><p>the Effective Altruism Infrastructure Fund</p></li></ul><p>Combined, these funders disburse roughly $5-33 million per year. A year from now, they&#8217;ll award prize funding to successful projects, and the investors who bet on those projects will receive their share in charitable dollars.</p><p>This post also highlights a few grants that the Manifund team is particularly excited about. 
And to kick things off, we&#8217;re launching a &#8220;micro-regranting&#8221; program: we&#8217;ll give you <strong>a $500 budget to invest in your favorite projects</strong>, and you get to keep the (charitable) profits.</p><p><strong>Click&nbsp;<a href="https://manifund.org/causes/acx-grants-2024?tab=certs">here</a>&nbsp;to browse open projects and start investing; click <a href="https://docs.google.com/forms/d/e/1FAIpQLSf-hheZ__cNFm6yRXgKs8biY_D7ZOOSkmHD-aTn54CEBBQG9Q/viewform">here</a> to apply to our micro-regranting program!</strong></p><p><em>[Edit: applications for the micro-regranting program are now closed. We received a lot of great applications, and are excited about doing this again in the future!]</em></p><h1>ACX Grants 2024 Impact Markets</h1><p><a href="https://www.astralcodexten.com/">Astral Codex Ten</a>&nbsp;(ACX) is a blog by Scott Alexander on topics like reasoning, science, psychiatry, medicine, ethics, genetics, AI, economics, and politics. <a href="https://www.astralcodexten.com/p/acx-grants-results-2024">ACX Grants</a> is a program in which Scott helps fund charitable and scientific projects &#8212; see the 2022 round&nbsp;<a href="https://www.astralcodexten.com/p/acx-grants-results">here</a> and his retrospective on ACX Grants 2022&nbsp;<a href="https://www.astralcodexten.com/p/so-you-want-to-run-a-microgrants">here</a>.</p><p>In this round (ACX Grants 2024), some of the applications were <a href="https://www.astralcodexten.com/p/acx-grants-results-2024">given direct grants</a>; the rest were given the option to participate in an impact market, an alternative to grants or donations as a way to fund charitable projects. 
You can read more about how impact markets generally work&nbsp;<a href="https://www.brasstacks.blog/explain-im/">here</a>, a canonical explanation of impact certificates on the EA Forum&nbsp;<a href="https://forum.effectivealtruism.org/topics/certificate-of-impact">here</a>, and an explanation thread from the Manifund Twitter <a href="https://www.astralcodexten.com/p/acx-grants-results-2024">here</a>.</p><p>If you invest in projects that end up being really impactful, you&#8217;ll receive a share of the charitable prize funding those projects win, proportional to your original investment.&nbsp;<strong>All funding remains charitable funding</strong>, so you&#8217;ll be able to donate it to whatever cause you think is most impactful (but not withdraw it for yourself). For example, if you invest $100 into a project that wins a prize worth twice its original valuation, you can then choose to donate $200 to any charity or project of your choice.</p><h1>Meet the retro funders</h1><p>Five philanthropic funders have so far expressed interest in giving retroactive prize funding (&#8220;retro funding&#8221;) to successful projects in this round. They&#8217;ll be assessing projects <em>retro</em>spectively using the same criteria they would use to assess a project <em>pro</em>spectively. Scott Alexander explains:</p><blockquote><p>[Retro] funders will operate on a model where they treat retrospective awards the same as prospective awards, multiplied by a probability of success. For example, suppose [the Long Term Future Fund] would give a $20,000 grant to a proposal for an AI safety conference, which they think has a 50% chance of going well. Instead, an investor buys the impact certificate for that proposal, waits until it goes well, and then sells it back to LTFF. 
They will pay $40,000 for the certificate, since it&#8217;s twice as valuable as it was back when it was just a proposal with a 50% success chance.</p><p>Obviously this involves trusting the people at these charities to make good estimates and give you their true values. I do trust everyone involved; if you don&#8217;t, impact certificate investing might not be for you.</p></blockquote><p>As a (very) rough approximation, the five philanthropic retro funders usually disburse about $5-33 million per year. They are:</p><h4><strong>1. ACX Grants 2025</strong></h4><p>Next year&#8217;s ACX Grants round (2025) will be interested in spending some of the money they normally give out as prizes for the projects that succeeded in this year&#8217;s (2024) round. ACX Grants 2025 will be giving out prizes to people who pursue novel ways to change complex systems, whether through technological breakthroughs, new social institutions, or targeted political change.</p><p>Previous rounds of ACX Grants have disbursed about $1-2 million per round, and you can find the lists of grants that those rounds gave money to here (<a href="https://www.astralcodexten.com/p/acx-grants-results">1</a>, <a href="https://www.astralcodexten.com/p/acx-grants-results-2024">2</a>).</p><h4>2. 
The Survival and Flourishing Fund (SFF)</h4><p>From their <a href="https://survivalandflourishing.fund/">website</a>:</p><blockquote><p>[SFF] is a website for organizing the collection and evaluation of applications for donations to organizations concerned with the long-term survival and flourishing of sentient life.</p></blockquote><p>Since 2019, SFF has recommended about $2-33 million per year in philanthropic disbursements ($75 million in total):</p><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!kusB!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcc81321e-7cfe-4951-8cf1-b7e21f075dd4_619x89.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!kusB!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcc81321e-7cfe-4951-8cf1-b7e21f075dd4_619x89.png 424w, https://substackcdn.com/image/fetch/$s_!kusB!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcc81321e-7cfe-4951-8cf1-b7e21f075dd4_619x89.png 848w, https://substackcdn.com/image/fetch/$s_!kusB!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcc81321e-7cfe-4951-8cf1-b7e21f075dd4_619x89.png 1272w, https://substackcdn.com/image/fetch/$s_!kusB!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcc81321e-7cfe-4951-8cf1-b7e21f075dd4_619x89.png 1456w" sizes="100vw"><img 
src="https://substackcdn.com/image/fetch/$s_!kusB!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcc81321e-7cfe-4951-8cf1-b7e21f075dd4_619x89.png" width="619" height="89" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/cc81321e-7cfe-4951-8cf1-b7e21f075dd4_619x89.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:89,&quot;width&quot;:619,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:10030,&quot;alt&quot;:&quot;&quot;,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" title="" srcset="https://substackcdn.com/image/fetch/$s_!kusB!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcc81321e-7cfe-4951-8cf1-b7e21f075dd4_619x89.png 424w, https://substackcdn.com/image/fetch/$s_!kusB!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcc81321e-7cfe-4951-8cf1-b7e21f075dd4_619x89.png 848w, https://substackcdn.com/image/fetch/$s_!kusB!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcc81321e-7cfe-4951-8cf1-b7e21f075dd4_619x89.png 1272w, https://substackcdn.com/image/fetch/$s_!kusB!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcc81321e-7cfe-4951-8cf1-b7e21f075dd4_619x89.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a></figure></div><p>To find out more about the philanthropic priorities of the SFF&#8217;s largest grant-maker, Jaan 
Tallinn, see&nbsp;<a href="https://jaan.online/philanthropy/">here</a>. To see past grants SFF has made, see&nbsp;<a href="http://survivalandflourishing.fund/#past-grant-recommendations">here</a>.</p><h4>3. The Long Term Future Fund (LTFF)</h4><p>From <a href="https://funds.effectivealtruism.org/funds/far-future">their website</a>:</p><blockquote><p>The Long-Term Future Fund aims to positively influence the long-term trajectory of civilization by making grants that address global catastrophic risks, especially potential risks from advanced artificial intelligence and pandemics. In addition, we [the LTFF] seek to promote, implement, and advocate for longtermist ideas, and to otherwise increase the likelihood that future generations will flourish.</p></blockquote><p>The LTFF usually disburses around $1-5 million per year, and sometimes disburses much more. You can view their yearly payout data <a href="https://funds.effectivealtruism.org/funds/far-future#payout-stats">here</a>.</p><p>You can read more about the LTFF&#8217;s scope and expected recipients <a href="https://funds.effectivealtruism.org/scope-and-limitations#long-term-future-fund">here</a>, and find their public grants database <a href="https://funds.effectivealtruism.org/grants?fund=Long-Term%2520Future%2520Fund&amp;sort=round">here</a>.</p><h4>4. 
The Animal Welfare Fund (AWF)</h4><p>From <a href="https://funds.effectivealtruism.org/funds/animal-welfare">their website</a>:</p><blockquote><p>The Animal Welfare Fund aims to effectively improve the well-being of nonhuman animals, by making grants that focus on one or more of the following:</p><ul><li><p>Relatively neglected geographic regions or groups of animals</p></li><li><p>Promising research into animal advocacy or animal well-being</p></li><li><p>Activities that could make it easier to help animals in the future</p></li><li><p>Otherwise best-in-class opportunities</p></li></ul></blockquote><p>The AWF usually disburses around $0.5-3 million per year, and sometimes disburses much more. You can view their yearly payout data <a href="https://funds.effectivealtruism.org/funds/animal-welfare#payout-stats">here</a>.</p><p>You can read more about the AWF's scope and expected recipients <a href="https://funds.effectivealtruism.org/scope-and-limitations#animal-welfare-fund">here</a>, and find their public grants database <a href="https://funds.effectivealtruism.org/grants?fund=Animal%2520Welfare%2520Fund&amp;sort=round">here</a>.</p><h4>5. The Effective Altruism Infrastructure Fund (EAIF)</h4><p>From <a href="https://funds.effectivealtruism.org/funds/ea-community">their website</a>:</p><blockquote><p>The Effective Altruism Infrastructure Fund (EA Infrastructure Fund) recommends grants that aim to improve the work of projects using principles of effective altruism, by increasing their access to talent, capital, and knowledge.</p><p>The EA Infrastructure Fund has historically attempted to make strategic grants to incubate and grow projects that attempt to use reason and evidence to do as much good as possible. 
These include meta-charities that fundraise for highly effective charities doing direct work on important problems, research organizations that improve our understanding of how to do good more effectively, and projects that promote principles of effective altruism in contexts like academia.</p></blockquote><p>The EAIF usually disburses around $1-3 million per year, and sometimes disburses much more. You can view their yearly payout data <a href="https://funds.effectivealtruism.org/funds/ea-community#payout-stats">here</a>.</p><p>You can read more about the EAIF&#8217;s scope and expected recipients <a href="https://funds.effectivealtruism.org/scope-and-limitations">here</a>, and find their public grants database <a href="https://funds.effectivealtruism.org/grants?fund=EA%2520Infrastructure%2520Fund&amp;sort=round">here</a>.</p><h4>&#8230;and (possibly) more.</h4><p>If you want to join these five institutions as a potential final oracular funder of impact certificates, see&nbsp;<a href="https://www.notion.so/ACX-Grants-2-Pitch-to-Retro-Funders-1e37f1ebf79b4df7bf38a9ad6cddb55e?pvs=21">this document</a>&nbsp;and email <a href="mailto:rachel@manifund.org">rachel@manifund.org</a>.</p><h1>Some projects we like</h1><p>Many of the projects are really great! We don&#8217;t have enough time or space to talk about all of the ones we&#8217;re excited about, but here are a few of our faves from each of us:</p><h3>Austin</h3><p><strong>&#8220;<a href="https://manifund.org/projects/run-a-public-onl">Run a public online Turing Test with a variety of models and prompts</a>, by <a href="https://manifund.org/cameron">camrobjones</a>.&#8221;</strong></p><p>Cam created a Turing Test game with GPT-4. I really like that Cam has already built &amp; shipped this project, and it appears to have gotten viral traction and had to be shut down due to costs; rare qualities for a grant proposal! 
The project takes a very simple premise and executes well on it; playing with the demo makes me want to poke at the boundaries of AI, and makes me a bit sad that it was just an AI demo (no chance to test my discernment skills); I feel like I would have shared this with my friends, had this been live.</p><p>Research on AI deception capabilities will be increasingly important, but I also like that Cam created a fun game that interactively helps players think a bit about how far the state of the art has come, especially with the proposal to let users generate prompts too!</p><p><strong>&#8220;<a href="https://manifund.org/projects/commission-an-ac">Quantifying the costs of the Jones Act</a>, by <a href="https://manifund.org/BalsaResearch">Balsa Research</a>.&#8221;</strong></p><p>Balsa Research is funding an individual economist or a team to conduct a counterfactual analysis assessing the economic impact if the Jones Act were repealed, to be published in a top economics journal.</p><p>I like this project because the folks involved are great. Zvi is famous enough to almost not need introduction, but in case you need one: he's a widely read blogger whose coverage of AI is the best in the field; also a former Magic: the Gathering pro and Manifund regrantor. Meanwhile, Jenn has authored <a href="https://jenn.site/2023/05/things-i-learned-by-spending-five-thousand-hours-in-non-ea-charities/">a blog post about non-EA charities</a> that has significantly shaped how I think about nonprofit work, runs an awesome <a href="https://manifold.markets/jenn/how-many-paper-units-will-be-in-the">meetup in Waterloo</a>, and on the side maintains this great <a href="https://codexcc.neocities.org/">database of ACX book reviews</a>.
(seriously, that alone is worth the price of admission)</p><p>I only have a layman's understanding of policy, economics, or academia (and am slightly bearish on the theory of change behind "publish in top journals"), but I robustly trust Zvi and Jenn to figure out the right way to move forward with this.</p><p><strong>&#8220;<a href="https://manifund.org/projects/write-and-publish-handbook?tab=comments#d32aacb2-1457-4eeb-b354-fd49e9ee4f34">Publish a book on Egan education for parents</a>, by <a href="https://manifund.org/brandonhendrickson">Brandon Hendrickson</a>.&#8221;</strong></p><p>Brandon wants to publish a book on education for parents based on Kieran Egan's educational theory. He walks the walk when it comes to education; his <a href="https://www.astralcodexten.com/p/your-book-review-the-educated-mind">ACX Book Review</a> contest entry on the subject was not only well written, but also well structured, with helpful illustrations and different text formats to drive home a point. (And the fact that he won is extremely high praise, given the quality of the competition!)
I'm not normally a fan of educational interventions, as their path to impact feels very long and uncertain, but I'd be excited to see what Brandon specifically can cook up.</p><p>(Disclaimer: I, too, have some skin in the game, with a daughter due in ~July)</p><h3>Lily</h3><p>&#8220;<strong><a href="https://manifund.org/projects/start-an-online-">Start an online editorial journal focusing on paradigm development in psychiatry and psychology</a>, by <a href="https://manifund.org/Psychcrisis">Jessica Ocean</a>.&#8221;</strong></p><p>Jessica&#8217;s project takes up the mantle of a favorite crusade of mine, which is &#8220;actually it was a total mistake to apply the scientific method to psychology, can we please do something better.&#8221; She&#8217;s written extensively on psychiatric crises and the mental health system, and I would personally be excited to read the work of people thinking seriously about an alternative paradigm. I&#8217;m not sure whether the journal structure will add anything on top of just blogging, but I&#8217;d be interested to see the results of even an informal collaboration in this direction.</p><p>(Note that I probably wouldn&#8217;t expect the SFF or LTFF to fund this; ACX Grants 2025 maybe, and the EAIF I&#8217;m not sure. But <em>I&#8217;d</em> be happy to see something like it exist.)</p><p><strong>&#8220;<a href="https://manifund.org/projects/an-online-scienc">An online science platform,</a> by <a href="https://manifund.org/pravsels">Praveen Selvaraj</a>&#8221;</strong></p><p>I think generating explanatory technical visuals is both an underrated use of image models, compared to generating images of mysteriously alluring women roaming the streets of psychedelic solarpunk utopias, and an underrated use of genAI for education, compared to chatbots that read your textbook over your shoulder.
I&#8217;d like to see more 3Blue1Brown in the world, and in general I&#8217;m optimistic about people building tools they already want for their personal use, as Praveen does.</p><h3>Saul</h3><p><strong>&#8220;<a href="https://manifund.org/projects/educate-the-publ">Educate the public about high impact causes</a>, by <a href="https://manifund.org/alexkhurgin">Alex Khurgin</a>.&#8221;</strong></p><p>Alex wants to build a high-quality YouTube show, and seeks funding to make three episodes of the show on AI risk, antimicrobial resistance, and farmed animal welfare. This is something that I could pretty easily imagine the LTFF, EAIF, and possibly SFF retrofunding, and I'd additionally be excited about more people knowing about these problems &amp; reducing their expected negative impact on the world.</p><p>Alex&#8217;s (and his team&#8217;s) track record is also pretty great: they&#8217;re clearly experienced &amp; know what they&#8217;re talking about. I&#8217;d be interested in seeing a clearer path to impact &#8212; what do they plan to do after they click publish on the videos? &#8212; but I&#8217;m sufficiently excited that I&#8217;ve invested a token $50 in Alex&#8217;s project to credibly signal my interest.</p><p><strong>&#8220;<a href="https://manifund.org/projects/distribute-hpmor?tab=comments#43bcf7dd-c866-409b-baf2-befd86f855af">Distribute HPMOR copies in Bangalore, India</a>, by <a href="https://manifund.org/adityaarpitha">Aditya Arpitha Prasad</a>.&#8221;</strong></p><p>Anecdotally, the answer &#8220;I got into HPMOR&#8221; has been quite a common response to the question &#8220;how did you become interested in alignment research?&#8221; Mikhail Samin has had (from what I&#8217;ve seen) a lot of success doing something like this in Russia, and I&#8217;m excited to see a similar initiative start in India. This grant seems to fall pretty clearly within the range of retrofunding from the LTFF and/or EAIF.
I&#8217;ve invested a token $50 in Aditya&#8217;s project to credibly signal my interest.</p><h1>Micro-Regranting</h1><p><em>[Edit: applications for the micro-regranting program are now closed. We received a lot of great applications, and are excited about doing this again in the future!]</em></p><p>Manifund is assembling a small cohort of charitable-funding enthusiasts who want to try their hands at impact investing. If selected, <strong>you'll get to allocate $500 of charity budget</strong> to invest in impact certificates in this ACX Grants round; you can donate any profit from the investments to the charity of your choice. We're looking for people who will thoughtfully consider how to allocate their investments, similarly to Manifund's regrantors (see examples <a href="https://manifund.substack.com/p/what-were-funding-weeks-2-4">here</a>, or the examples directly above), and will post feedback and rationales for the funding decisions they make.</p><p>In addition to the investment budgets, we'll feature our favorite comments in a future edition of the Manifund newsletter and offer a mana bounty on Manifold for excellent comments.</p><p>Anyone is eligible, unless you have a project in the current ACX impact market. <strong>Please submit <a href="https://docs.google.com/forms/d/e/1FAIpQLSf-hheZ__cNFm6yRXgKs8biY_D7ZOOSkmHD-aTn54CEBBQG9Q/viewform">this form</a> by the end of&nbsp;Friday, March 22 to participate!</strong></p><p>(Reminder that <em>anyone</em> can invest in the impact market.
The Micro-Regranting program will simply give you <em>a free charity budget</em> to invest on the impact market, rather than <em>your own charitable dollars</em>.)</p><h1>Links &amp; contact</h1><p>Click&nbsp;<a href="https://manifund.org/causes/acx-grants-2024?tab=certs">here</a>&nbsp;to browse open projects and start investing; click <a href="https://docs.google.com/forms/d/e/1FAIpQLSf-hheZ__cNFm6yRXgKs8biY_D7ZOOSkmHD-aTn54CEBBQG9Q/viewform">here</a> to apply to our micro-regranting program.</p><p>If you&#8217;re interested in learning more about investing on an impact market, donating to projects directly, or even just chatting about this sort of thing, you can email&nbsp;<a href="mailto:saul@manifund.org">saul@manifund.org</a>&nbsp;or book a call&nbsp;<a href="https://savvycal.com/saulmunn/manifund">here</a>.</p>]]></content:encoded></item><item><title><![CDATA[Manifund: 2023 in Review]]></title><description><![CDATA[Impact certs, regranting, and the year ahead]]></description><link>https://manifund.substack.com/p/manifund-2023-in-review</link><guid isPermaLink="false">https://manifund.substack.com/p/manifund-2023-in-review</guid><dc:creator><![CDATA[Manifund]]></dc:creator><pubDate>Thu, 18 Jan 2024 23:41:28 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!sjof!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5f4874cc-0f37-47f1-8ec9-73d81391b6de_2000x1088.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>Manifund is a new funding org that experiments with systems and software to support awesome projects. In 2023, we built a website (<a href="http://manifund.org/">manifund.org</a>) and donor ecosystem supporting three main programs: impact certificates, regranting, and an open call for applications. We allocated $2m across dozens of charitable projects, primarily in AI safety and effective altruism cause areas.
Here&#8217;s a breakdown of what Manifund accomplished, our current strengths and weaknesses, and what we hope to achieve in the future.</p><p><em>If you like our work, please consider <a href="https://manifund.org/about/donate">donating to Manifund</a>. Donations help cover our salaries &amp; operating expenses, and fund projects and experiments that institutional donors aren&#8217;t willing to back &#8212; often the ones that excite us most!</em></p><h1>At a glance</h1><p>Here are some high-level stats that provide a snapshot of our 2023 activities:</p><ul><li><p><strong>$2.06M sent to projects</strong>: $2.012M to grants &amp; $45K to impact certificates</p><ul><li><p>Of the totals above, $95K of the grant money and $40K of the cert money came from unaffiliated donors/investors, rather than regrantors.</p></li></ul></li><li><p><strong>88 projects</strong> were funded: 54 grants &amp; 34 certs</p></li><li><p><strong>$2.22M</strong> has been deposited into Manifund, and <strong>$1.62M</strong> has been withdrawn so far.</p></li><li><p>Below are the top cause areas of projects that got funded. Note that these are overlapping; that is, one project can be filed under multiple cause areas.</p><ul><li><p><strong>Technical AI Safety</strong>: 27 projects funded, $1.57M disbursed</p></li><li><p><strong>Science and Technology</strong>: 9 projects funded, $118K disbursed</p></li><li><p><strong>AI Governance</strong>: 10 projects funded, $112K disbursed</p></li><li><p><strong>Biosecurity</strong>: 4 projects funded, $97K disbursed</p></li><li><p>Honorable mention to <strong>Forecasting</strong>, which only received $76K total, but encompassed 35 projects.
This is because our two biggest impact certificate rounds so far&#8212;ACX Mini-Grants and the Manifold Community Fund&#8212;were centered around forecasting and funded lots of small projects.</p></li></ul></li></ul><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!sjof!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5f4874cc-0f37-47f1-8ec9-73d81391b6de_2000x1088.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!sjof!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5f4874cc-0f37-47f1-8ec9-73d81391b6de_2000x1088.png 424w, https://substackcdn.com/image/fetch/$s_!sjof!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5f4874cc-0f37-47f1-8ec9-73d81391b6de_2000x1088.png 848w, https://substackcdn.com/image/fetch/$s_!sjof!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5f4874cc-0f37-47f1-8ec9-73d81391b6de_2000x1088.png 1272w, https://substackcdn.com/image/fetch/$s_!sjof!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5f4874cc-0f37-47f1-8ec9-73d81391b6de_2000x1088.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!sjof!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5f4874cc-0f37-47f1-8ec9-73d81391b6de_2000x1088.png" width="1456" height="792" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/5f4874cc-0f37-47f1-8ec9-73d81391b6de_2000x1088.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:792,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!sjof!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5f4874cc-0f37-47f1-8ec9-73d81391b6de_2000x1088.png 424w, https://substackcdn.com/image/fetch/$s_!sjof!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5f4874cc-0f37-47f1-8ec9-73d81391b6de_2000x1088.png 848w, https://substackcdn.com/image/fetch/$s_!sjof!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5f4874cc-0f37-47f1-8ec9-73d81391b6de_2000x1088.png 1272w, https://substackcdn.com/image/fetch/$s_!sjof!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5f4874cc-0f37-47f1-8ec9-73d81391b6de_2000x1088.png 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" 
xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><h1>2023 Programs</h1><h2>Impact certificates</h2><p><strong>Summary</strong>: Impact certificates are venture funding for charitable endeavors. Investors fund founders by buying shares (&#8221;certs&#8221;) in their projects, which pay out if the project later receives a retroactive prize. See also <a href="https://manifund.org/about/impact-certificates">our</a> and <a href="https://www.astralcodexten.com/p/impact-markets-the-annoying-details">ACX&#8217;s</a> explainers.</p><p><strong>Assessment</strong>: 7/10</p><p>Impact certs have been discussed as a much more efficient system of funding charitable projects. 
Luminaries such as <a href="https://impactpurchase.org/why-certificates/">Paul Christiano, Katja Grace</a>, <a href="https://medium.com/ethereum-optimism/retroactive-public-goods-funding-33c9b7d00f0c">Vitalik Buterin</a> and <a href="https://www.astralcodexten.com/p/impact-markets-the-annoying-details">Scott Alexander</a> have championed the idea, but apart from a <a href="https://impactpurchase.org/">few small</a> <a href="https://vitalik.ca/general/2021/11/16/retro1.html">experiments</a> with retroactive funding, no notable impact marketplaces existed prior to Manifund.</p><p>In 2023, we built out the website and operational experience to run impact certs end to end, from handling project submissions, to investments, to retroactive prize payouts. We completed 2 impact cert rounds and started 3 more, each testing different setups and domains.</p><p>How did we do? I would describe impact certs as &#8220;working as intended, but not yet hitting product/market fit&#8221;. Some possible reasons:</p><ul><li><p><strong>Impact certs are a 3-sided marketplace, and come with the implied cold start problems</strong>: we need to get a retro funder, investors, and founders all on board.</p><ul><li><p>We think of the retro funder as the hardest part; they have to be convinced that it&#8217;s worth paying out money for work already accomplished. Most donors instead think in terms of funding things prospectively.</p></li><li><p>But finding good investors is also not trivial! 
The investors&#8217; decisions shape which projects actually get underway; they serve the role of grant evaluators in a traditional charity ecosystem, which requires a particular skillset and dedication.</p></li></ul></li><li><p><strong>A lot of education is needed to explain the whole system:</strong> it has a lot of moving parts, and most people are unfamiliar with the venture ecosystem.</p></li><li><p><strong>Feedback loops via impact certs are slow.</strong> Projects take a long time to develop; updates have been infrequent from all sides (founders, investors, and funders).</p></li><li><p><strong>In theory, impact certs should encourage investors to seek out good projects and help them, though this doesn&#8217;t seem to have happened much.</strong> Perhaps this is due to small dollar amounts or us not having found expert investors to participate.</p></li></ul><p>Despite these difficulties, I think impact certs are Manifund&#8217;s most exciting project, with the potential to transform the entire landscape of charitable funding. The economic theory behind them is elegant, and recent successes with advance market commitments (Operation Warp Speed, Stripe Frontier) are waking people up to the idea that prize funding is a great way of encouraging public goods.</p><p>Here are some reflections, broken down by round.</p><p><strong><a href="https://manifund.org/causes/acx-mini-grants">ACX Forecasting Minigrants</a></strong>: 7/10</p><ul><li><p><strong>Stats</strong>: $30k prize pool distributed to 20 projects, Jan to Oct 2023</p></li><li><p>See also: <a href="https://www.astralcodexten.com/p/announcing-forecasting-impact-mini">ACX announcement</a>, <a href="https://manifund.substack.com/p/acx-mini-grants-results">Manifund&#8217;s retro</a>, <a href="https://www.astralcodexten.com/p/impact-market-mini-grants-results">ACX&#8217;s retro</a></p></li><li><p>This was the project that kicked off Manifund!
Scott approached us saying that he wanted to run ACX Grants 2 on impact certs, but (understandably) wanted to try a lower-stakes test first. We picked &#8220;forecasting&#8221; as an area that Scott felt qualified to judge as a retro funder; I brought on Rachel to work on Manifund full-time, and together we shipped the MVP of the site in 2 weeks.</p></li><li><p>Lessons: Everything worked! We successfully created the world&#8217;s first ecosystem around investing in charitable projects and retroactively funding them. I don&#8217;t think the projects produced were quite as good as those in the original ACX Grants round; not sure if that is due to impact certs, the much lower funding &amp; prize pool ($20-40k), more limited scope, or something else.</p></li></ul><p><strong><a href="https://manifund.org/causes/ai-worldviews">OpenPhil AI Worldviews Essay Contest</a></strong>: 2/10</p><ul><li><p><strong>Stats</strong>: 5 essays cert-ified, Feb to Sep 2023.</p></li><li><p>See also: OpenPhil&#8217;s <a href="https://www.openphilanthropy.org/open-philanthropy-ai-worldviews-contest/">announcement</a>, <a href="https://www.openphilanthropy.org/research/announcing-the-winners-of-the-2023-open-philanthropy-ai-worldviews-contest/">results</a></p></li><li><p>This one was kind of a flop. We had launched ACX Minigrants and were waiting for results; we saw that OpenPhil had announced this contest and figured &#8220;$225k for essays? Great fit for contestants trying to hedge some of their winnings&#8221;. We reached out to Jason Schukraft, the contest organizer, and got his blessing &#8212; but unfortunately too late to secure an official partnership. We reached out to the essayists on our own, but most did not agree to create a Manifund impact cert (and none of the ultimate winners did).</p></li><li><p>Lessons: large dollar prizes + well-known brand are not sufficient to get a robust cert ecosystem started.
Unclear if &#8220;essays&#8221; are a compelling use case for impact certs, as the investment comes after all the work is done. On the plus side, this was a pretty cheap experiment to try, as all the infrastructure was already in place from the ACX Minigrants round.</p></li></ul><p><strong><a href="https://manifund.org/causes/china-talk">Chinatalk Prediction Essay Contest</a></strong>: ongoing (Nov 2023 to Jan 2024)</p><ul><li><p><strong>Stats</strong>: $6k prize pool; expecting 50-100 submissions.</p></li><li><p>See also: <a href="https://www.chinatalk.info/essay">Chinatalk&#8217;s essay website</a></p></li><li><p>Chinatalk (a Manifund grantee) approached us to sponsor their essay contest; I saw this as a chance to try out &#8220;impact certs for essays&#8221;, but this time with official partnership status. So far, Jordan and Caithrin have been amazing to work with; it remains to be seen if the added complexity of impact certs is worth the benefits of investor engagement.</p></li><li><p>As an aside: the Chinatalk contest makes me wonder if there&#8217;s space for &#8220;hosting contests-as-a-service&#8221;. There&#8217;s proven demand and a lot of good work generated via contests and competitions (ACX Book Reviews; Vesuvius Prize; AIMO prize), but each of them jury-rigs together its own website &amp; infra.
Manifund could become a platform that streamlines contest creation (think <a href="https://www.kaggle.com/">Kaggle</a> but for misc contests) and thereby fill in the &#8220;retro funder&#8221; part of the marketplace.</p></li></ul><p><strong><a href="https://manifund.org/causes/manifold-community">Manifold Community Fund</a></strong>: ongoing (Dec 2023 to Feb 2024)</p><ul><li><p><strong>Stats</strong>: $10k x3 prize payouts, ~20 projects proposed</p></li><li><p>See also: <a href="https://news.manifold.markets/p/manifolds-30k-community-fund">Manifold announcement</a></p></li><li><p>Partnering with other orgs has upsides (publicity, funding) but also downsides (higher communication costs and more negotiations). Could we move faster and experiment more by funding a prize round ourselves? We landed on &#8220;community projects for Manifold&#8221; as an area we were experts in judging, and could justify spending money on to get good results.</p></li><li><p>We set up the MCF with 3 rounds of ~$10k in funding once per month, instead of a single prize payout at the end. My hope is that more frequent prize funding will provide better feedback to investors &amp; founders, and also teach us more about how to actually allocate retro funding, which is a surprisingly nontrivial problem!</p></li></ul><p><strong>ACX Grants 2024</strong>: ongoing (Dec 2023 to Dec 2024)</p><ul><li><p><strong>Stats</strong>: $300k+ prize pool; predict there will be 400-800 proposals, with 10-30 funded</p></li><li><p>See also: <a href="https://www.astralcodexten.com/p/apply-for-an-acx-grant-2024">ACX announcement</a></p></li><li><p>This will be our largest impact cert round so far. Some changes we made, compared to the previous ACX Minigrants:</p><ul><li><p>Scott will be directly funding the proposals that he likes up front, instead of waiting to act as a retro funder. 
The other proposals may then be created as impact certificates.</p></li><li><p>We allow anyone to invest (via a donor-advised fund model), instead of restricting to accredited investors.</p></li><li><p>A group of EA funders have agreed to participate as retro funders (the Survival and Flourishing Fund, EA Funds, &amp; ACX). This is particularly exciting, as the first test case of having multiple different final buyers of impact.</p></li></ul></li><li><p>We&#8217;re happy that Scott found the first round of our impact certs compelling enough to want to expand it into the next official round! ACX Grants is also especially dear to our hearts, as it was counterfactually responsible for getting Manifold off the ground; to have the opportunity to come and help future ACX Grantees is quite the privilege.</p></li></ul><p><strong>Next steps for impact certs</strong></p><p>We&#8217;d like to continue running impact cert rounds, in a more frequent, standardized manner (perhaps even self-serve). We&#8217;d also like to find a large, splashy use case for impact certs that draws more attention to the concept and validates that it works well at larger scales. This would likely involve partnering closely with some deep-pocketed funder who wants to put up a large prize for a specific cause.</p><p>Potential causes we&#8217;ve been daydreaming about:</p><ul><li><p>Curing malaria (vaccine rollout? gene drives?)</p></li><li><p>Big, yearly AI Safety prize</p></li><li><p>Ending flu season in SF with FarUVC rollout</p></li><li><p><a href="https://sideways-view.com/2021/03/21/robust-egg-offsetting/">Offsets for factory farmed eggs</a></p></li><li><p>Carbon credits, similar to Stripe Climate</p></li><li><p>Political change (e.g. housing reform?)</p></li><li><p>General scientific research prizes in some weird field (e.g. longevity?
fertility?)</p></li></ul><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!U-Jh!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2eacdc45-5290-40b1-a6ed-d5125249df8b_2000x2000.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!U-Jh!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2eacdc45-5290-40b1-a6ed-d5125249df8b_2000x2000.png 424w, https://substackcdn.com/image/fetch/$s_!U-Jh!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2eacdc45-5290-40b1-a6ed-d5125249df8b_2000x2000.png 848w, https://substackcdn.com/image/fetch/$s_!U-Jh!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2eacdc45-5290-40b1-a6ed-d5125249df8b_2000x2000.png 1272w, https://substackcdn.com/image/fetch/$s_!U-Jh!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2eacdc45-5290-40b1-a6ed-d5125249df8b_2000x2000.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!U-Jh!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2eacdc45-5290-40b1-a6ed-d5125249df8b_2000x2000.png" width="464" height="464" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/2eacdc45-5290-40b1-a6ed-d5125249df8b_2000x2000.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1456,&quot;width&quot;:1456,&quot;resizeWidth&quot;:464,&quot;bytes&quot;:6121085,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!U-Jh!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2eacdc45-5290-40b1-a6ed-d5125249df8b_2000x2000.png 424w, https://substackcdn.com/image/fetch/$s_!U-Jh!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2eacdc45-5290-40b1-a6ed-d5125249df8b_2000x2000.png 848w, https://substackcdn.com/image/fetch/$s_!U-Jh!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2eacdc45-5290-40b1-a6ed-d5125249df8b_2000x2000.png 1272w, https://substackcdn.com/image/fetch/$s_!U-Jh!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2eacdc45-5290-40b1-a6ed-d5125249df8b_2000x2000.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" 
xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><h2>Regranting</h2><p><strong>Summary</strong>: A charitable donor delegates a grantmaking budget to individuals known as &#8220;regrantors&#8221;. Regrantors independently make grant decisions, based on the objectives of the original donor and their own expertise.</p><p><strong>Assessment</strong>: 7/10</p><p>Regranting was pioneered by the FTX Future Fund; among the grantmaking models they experimented with in 2022, they&nbsp;<a href="https://forum.effectivealtruism.org/posts/paMYXYFYbbjpdjgbt/future-fund-june-2022-update#Expectations_vs__reality">considered regranting to be the most promising.</a> We believed that regranting is a good way of allocating funding for a few reasons:</p><ul><li><p><strong>Faster turnaround times for grantees:</strong> Regrants involve lower overhead and less consensus, which leads to faster decisionmaking. 
Around the time we started this program, the EA funding space had just been severely disrupted by the collapse of FTX, which was making grant turnaround times especially long. We had experienced this ourselves, as had many other people in our network, and this seemed like a problem we could help solve.</p></li><li><p><strong>More efficient funding allocation and active grantmaking:</strong> Regranting utilizes regrantors&#8217; knowledge and networks, which may lead to above-the-bar uses of funding on the margin. Regrantors are often more connected to their grantees, which allows them to give more feedback or even initiate projects themselves, whereas most funders take a more passive approach.</p></li><li><p><strong>A better option for some donors:</strong> Delegating donations to regrantors is a unique donor experience, which offers a balance between maintaining control and minimizing effort, relative to either directly giving to projects or giving to grantmaking organizations. Donors can pick individuals they trust to give intelligently and with their values in mind, but who may be able to allocate the money more efficiently due to some combination of time, expertise, and connections.</p></li><li><p><strong>Scalable:</strong> Regranting can scale up to moving large amounts of funding. This was a clear upside for the Future Fund, which was aiming to distribute $100M to $1B+ a year, though it is less important for us now, as Manifund (and EA as a whole) are more funding-constrained.</p></li></ul><h3><strong>Large regrantors</strong></h3><p><strong>Stats</strong>: 5 regrantors with max budgets of ~$400k each, $1.4m total pool. 
5 grants initiated, 15 grants supported.</p><p><strong>Assessment:</strong> 7/10</p><p><em>See also: <a href="https://manifund.substack.com/p/announcing-manifund-regrants">regranting launch announcement</a></em></p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!grAp!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F232022b1-abd0-4408-a5be-25ae94798835_2000x1333.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!grAp!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F232022b1-abd0-4408-a5be-25ae94798835_2000x1333.png 424w, https://substackcdn.com/image/fetch/$s_!grAp!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F232022b1-abd0-4408-a5be-25ae94798835_2000x1333.png 848w, https://substackcdn.com/image/fetch/$s_!grAp!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F232022b1-abd0-4408-a5be-25ae94798835_2000x1333.png 1272w, https://substackcdn.com/image/fetch/$s_!grAp!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F232022b1-abd0-4408-a5be-25ae94798835_2000x1333.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!grAp!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F232022b1-abd0-4408-a5be-25ae94798835_2000x1333.png" width="576" height="383.7362637362637" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/232022b1-abd0-4408-a5be-25ae94798835_2000x1333.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:970,&quot;width&quot;:1456,&quot;resizeWidth&quot;:576,&quot;bytes&quot;:557344,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!grAp!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F232022b1-abd0-4408-a5be-25ae94798835_2000x1333.png 424w, https://substackcdn.com/image/fetch/$s_!grAp!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F232022b1-abd0-4408-a5be-25ae94798835_2000x1333.png 848w, https://substackcdn.com/image/fetch/$s_!grAp!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F232022b1-abd0-4408-a5be-25ae94798835_2000x1333.png 1272w, https://substackcdn.com/image/fetch/$s_!grAp!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F232022b1-abd0-4408-a5be-25ae94798835_2000x1333.png 1456w" sizes="100vw" loading="lazy"></picture></div></a></figure></div><p>We decided to do a regranting program after we were introduced to an anonymous donor, &#8220;D&#8221;, in May 2023. D liked Future Fund&#8217;s regranting setup, and wanted to fund their own regrantors, to the tune of $1.5m across this year. 
This seemed like a good fit for Manifund to run, as:</p><ul><li><p>We had already built a lot of the necessary infrastructure for ACX Minigrants&#8212;a website, 501c3 org, and payout processes&#8212;and we could reuse these for regranting.</p></li><li><p>We&#8217;d been beneficiaries of the regranting mechanism ourselves: Manifold&#8217;s seed round was initiated by an offer from a Future Fund regrantor.</p></li><li><p>We thought we could give our grantees a better experience, with faster turnaround times and more involved grantmakers.</p></li><li><p>We thought that by supporting regrantors and their grantees now, we could later convince them to participate in impact certs as investors &amp; founders.</p></li><li><p>We were in a lull anyway, waiting for ACX Minigrants projects to wrap up.</p></li></ul><p>We&#8217;re reasonably happy with the quality of grants made! At a midpoint review in September, our donor D judged that the regrants made were somewhat better than initially expected. They were open to renewing the program for next year, pending other considerations.</p><p>We do wish that we&#8217;d gotten more consistent active participation from the large regrantors. Relative to the small regrantors, they tended to make fewer grants and engage in less coordination and discussion with other regrantors&#8230; which is understandable! D chose a bunch of very well-credentialed folks with great judgement, but it turns out such people are pretty busy or in high demand, and $400k is not a large enough budget to warrant spending significant amounts of time doing active grantmaking.</p><p>We had one notable exception to this pattern: Evan Hubinger was very prolific and dedicated, making 9 different grants to projects in technical AI safety. 
We ended up increasing his budget to encourage him to continue finding good opportunities.</p><p>Still, until the last couple weeks of the year, it looked like a large portion of the large regrantor pot would be left unspent, but then Dan, Evan, and Adam came through with some last-minute recommendations and made use of their remaining budgets.</p><p>While we&#8217;re glad all of the money was sent to projects, this end-of-year influx wasn&#8217;t ideal. First, it&#8217;s unlikely that the best opportunities just happened to come along suddenly at the end of the year, which means something inefficient was going on. It seems more likely that they could have given to marginally better projects earlier in the year, or they could have given to these projects earlier. This will inform how we structure budgets if we do this again, so that we don&#8217;t incentivize waiting until a week before the expiration date to spend budgets.</p><p>Additionally, because we didn&#8217;t anticipate this influx, the regrantors offered more in funding than we had budget for, and we had to reject two grant recommendations that at other times we would have approved. This possibly created false expectations and a pretty bad experience for these two grantees, for which we are quite sorry.</p><p><strong>Highlighted grants</strong></p><ul><li><p><a href="https://manifund.org/evhub">Evan Hubinger</a>, $100k: <a href="https://manifund.org/projects/scoping-developmental-interpretability-xg55b33wsfc">Scoping Developmental Interpretability</a></p><ul><li><p>This also got donations from Marcus, Ryan, and Rachel, though all after Evan&#8217;s initial contribution and recommendation.</p></li><li><p>This was an example of regrantors using their professional connections to find opportunities that they had a lot of context on: Evan previously mentored Jesse, the recipient of this grant, and is familiar with the work of others on the team. 
With all of this context, he said he &#8220;believe[s] them to be quite capable of tackling this problem&#8221;.</p></li></ul></li><li><p><a href="https://manifund.org/AdamGleave">Adam Gleave</a>, $10.5k: <a href="https://manifund.org/projects/introductory-resources-for-singular-learning-theory">Introductory resources for Singular Learning Theory</a></p><ul><li><p>According to Adam, &#8220;There's been an explosion of interest in Singular Learning Theory lately in the alignment community, and good introductory resources could save people a lot of time. A scholarly literature review also has the benefit of making this area more accessible to the ML research community more broadly. Matthew seems well placed to conduct this, having already familiarized himself with the field during his MS thesis and collected a database of papers. He also has extensive teaching experience and experience writing publications aimed at the ML research community.&#8221; I&#8217;ll note that Evan also expressed excitement about Singular Learning Theory in his writeup for the above grant.</p></li><li><p>Like the above grant, this was an opportunity that Adam came across and could evaluate with lots of context, as he previously mentored Matthew and continues to collaborate with him.</p></li></ul></li><li><p><a href="https://manifund.org/LeopoldAschenbrenner">Leopold Aschenbrenner</a>, $400k: <a href="https://manifund.org/projects/compute-funding-for-seri-mats-llm-alignment-research">Compute and other expenses for LLM alignment research</a></p><ul><li><p>From Leopold&#8217;s comment explaining why he chose to give this grant: &#8220;Ethan Perez is a kickass researcher whom I really respect, and he also just seems very competent at getting things done. He is mentoring these projects, and these are worthwhile empirical research directions in my opinion. 
The MATS scholars are probably pretty junior, so a lot of the impact might be upskilling, but Ethan also seems really bullish on the projects, which I put a lot of weight on. I'm excited to see more external empirical alignment research like this!</p><p>Ethan reached out to me a couple days ago saying they were majorly bottlenecked on compute/API credits; it seemed really high-value to unblock them, and high-value to unblock them quickly. I'm really excited that Manifund regranting exists for this purpose!&#8221;</p></li><li><p>Leopold started with a donation of $200k, and then followed it up with another $200k two months later after seeing the progress they&#8217;d made and learning they were still funding constrained.</p></li></ul></li></ul><h3><strong>Small regrantors</strong></h3><p><strong>Stats</strong>: 11 regrantors with budgets of ~$50k each, $400k total pool. 11 grants initiated, 41 grants supported.</p><p><strong>Assessment</strong>: 8/10</p><p><em>See also: <a href="https://forum.effectivealtruism.org/posts/LqMiyLTy7gZ6vbWoo/some-fun-lessons-i-learned-as-a-junior-regrantor#comments">Joel Becker&#8217;s reflections as a regrantor</a></em></p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!E1np!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F06429050-1866-4814-8395-14b0b7cf6083_2000x2674.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!E1np!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F06429050-1866-4814-8395-14b0b7cf6083_2000x2674.png 424w, 
https://substackcdn.com/image/fetch/$s_!E1np!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F06429050-1866-4814-8395-14b0b7cf6083_2000x2674.png 848w, https://substackcdn.com/image/fetch/$s_!E1np!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F06429050-1866-4814-8395-14b0b7cf6083_2000x2674.png 1272w, https://substackcdn.com/image/fetch/$s_!E1np!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F06429050-1866-4814-8395-14b0b7cf6083_2000x2674.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!E1np!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F06429050-1866-4814-8395-14b0b7cf6083_2000x2674.png" width="422" height="564.309065934066" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/06429050-1866-4814-8395-14b0b7cf6083_2000x2674.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1947,&quot;width&quot;:1456,&quot;resizeWidth&quot;:422,&quot;bytes&quot;:1110519,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!E1np!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F06429050-1866-4814-8395-14b0b7cf6083_2000x2674.png 424w, 
https://substackcdn.com/image/fetch/$s_!E1np!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F06429050-1866-4814-8395-14b0b7cf6083_2000x2674.png 848w, https://substackcdn.com/image/fetch/$s_!E1np!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F06429050-1866-4814-8395-14b0b7cf6083_2000x2674.png 1272w, https://substackcdn.com/image/fetch/$s_!E1np!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F06429050-1866-4814-8395-14b0b7cf6083_2000x2674.png 1456w" sizes="100vw" loading="lazy"></picture></div></a></figure></div><p>After planning out the regranting program with D, we thought that 5 regrantors weren&#8217;t enough and that increasing the size of the regrantor cohort would be good for fostering more regrantor discussion and broadening the kinds of grants made. We decided to fund this from our own budget, primarily out of our general support grant from the <a href="https://survivalandflourishing.fund/sff-2023-h1-recommendations">Survival and Flourishing Fund</a>.</p><p>One main benefit of choosing our own regrantors was that we could bet on unconventional candidates. As a variant on &#8220;<a href="https://www.openphilanthropy.org/research/hits-based-giving/">hits-based giving</a>&#8221;, we were &#8220;hits-based delegating&#8221;, giving budgets to a wide variety of people, including many with little-to-no previous track record of grantmaking.</p><p>The small regrantor program is one of the places where Manifund&#8217;s experimentation has been most successful, in our opinion. Some of our regrantors have gone above and beyond in their commitment to initiating good grants and helping grantees. After launching with our initial cohort, we opened up an application for regrantors and got very high-quality applicants. We approved Ryan Kidd, Renan Araujo, Joel Becker, and Nuno Sempere, and waitlisted many more promising regrantor candidates. We also had some regrantors who weren&#8217;t active or decided to withdraw, which is fine &#8212; we were expecting this and structured the regrantor budget allocation with this in mind.</p><p>We thought that good grants made by our regrantors would encourage other people to donate to regrantor budgets. This doesn&#8217;t seem to have happened much; among donations made by individuals on Manifund, very little went towards regrantor budgets relative to specific projects listed via the open call. 
As a result, fundraising for regrantor budgets remains our main constraint on operating this program.</p><p><strong>Highlighted grants</strong></p><ul><li><p><a href="https://manifund.org/joel_bkr">Joel Becker</a>, $1.5K &amp; <a href="https://manifund.org/RenanAraujo">Renan Araujo</a>, $1.5K: <a href="https://manifund.org/projects/explainer-and-analysis-of-cncertcc-">Explainer and analysis of CNCERT/CC (&#22269;&#23478;&#20114;&#32852;&#32593;&#24212;&#24613;&#20013;&#24515;)</a></p><ul><li><p>Joel&#8217;s explanation of the origin of this grant: &#8220;Renan and I put out a call to an invite-only scholarship program, the "Aurora Scholarship," to 9 individuals recommended by a source we trust. We were aiming to support people who are nationals of or have lived in China with a $2,400-$4,800 scholarship for a research project in a topic related to technical AI safety or AI governance&#8230;Alexa was one of our excellent applicants.&#8221;</p></li><li><p>Empowering people in or with connections to China to do AI safety work seems pretty important. We were particularly impressed by the initiative Joel and Renan took in getting this off the ground&#8212;the prospect of facilitating active grantmaking like this is part of what motivated us to start the regranting program!</p></li></ul></li><li><p><a href="https://manifund.org/GavrielK">Gavriel Kleinwaks</a>, $41.7K: <a href="https://manifund.org/projects/optimizing-clinical-metagenomics-and-far-uvc-implementation">Optimizing clinical Metagenomics and Far-UVC implementation</a></p><ul><li><p>Gavriel works at 1Day Sooner and is our only biosecurity-focused regrantor; in line with this grant, her work has focused on Far-UVC implementation! 
Here are some quotes from Gavriel&#8217;s writeup explaining why she chose this project:</p><ul><li><p>&#8220;From my conversation with Miti and Ale&#353;, it sounded as though there was a pretty good chance to unlock UK government buy-in for an important biosecurity apparatus, through the relatively inexpensive/short-term investment of a proposal submitted to the government. Biosecurity doesn&#8217;t have a lot of opportunities for cheap wins as far as I normally see, so this is really exciting.&#8221;</p></li><li><p>&#8220;This is exactly the type of project Manifund is best poised to serve: the turnaround needs to be really fast, since Miti is targeting an October deadline, and it&#8217;s for a small enough amount that at my $50k regranting budget I can fully fund it.&#8221;</p></li></ul></li><li><p>Shout out to Joel again, who recommended this grant to Gavriel! Because of a COI with the recipient, he didn&#8217;t contribute financially, but he still deserves a lot of credit for making this happen.</p></li></ul></li><li><p><a href="https://manifund.org/MarcusAbramovitch">Marcus Abramovitch</a>, $25K: <a href="https://manifund.org/projects/independent-researcher">Joseph Bloom - Independent AI Safety Research</a></p><ul><li><p>Joseph Bloom had a strong track record with independent AI safety research&#8212;he maintains TransformerLens (the top package for mechanistic interpretability), his work has been listed by Anthropic, he teaches at the ARENA program, and he came highly recommended by Neel Nanda. 
Unsurprisingly, he seems to have lived up to this track record and made good progress on his research, according to the updates he sends Marcus each month.</p></li><li><p>This also received the biggest independent donation of any project on Manifund: $25k from Dylan Mavrides!</p></li></ul></li></ul><h2>Open Call</h2><p><strong>Summary</strong>: &#8220;Kickstarter for charitable projects&#8221;: allow anyone to post a public grant proposal on the Manifund site, for regrantors and the general public to fund.</p><p><strong>Assessment:</strong> 6/10</p><p><strong>Stats:</strong> 150 projects submitted, $95k raised among 40 individual donors</p><p>We started our open call to identify more opportunities for our regrantors to donate to. The open call worked well for this: it has surfaced many projects that we wouldn&#8217;t have seen otherwise. For example, I (Austin) allocated half of my own regrantor budget to projects that applied via the open call: Lantern Bioworks, Sophia Pung, Neuronpedia, and Holly Elmore.</p><p>To our surprise, many open call projects got support from individual donors that we had no pre-existing relationships with. We had about forty people donate this way, for a total of $95k.</p><p>Shout out to our top 10 individual donors of 2023:</p><ol><li><p>Dylan M - $25,000</p></li><li><p>Jalex S - $20,000</p></li><li><p>Anton M - $11,530</p></li><li><p>Vincent W - $10,500</p></li><li><p>Cullen O - $8,710</p></li><li><p>Carson G - $6,000</p></li><li><p>Peter W - $5,000</p></li><li><p>Gavin L - $5,000</p></li><li><p>Nik S - $5,000</p></li><li><p>Adrian K - $4,000</p></li></ol><p>There were still some downsides associated with this program. The biggest is that an always-open call imposes constant toil on our team to screen and process grants. Many projects that ask for funding don&#8217;t seem impactful or look like bad fits for Manifund. 
Finally, it seems like regrantors generally prefer to spend their budgets on projects they personally initiate.</p><p><strong>Highlighted grants</strong></p><ul><li><p><a href="https://manifund.org/projects/mats-funding?tab=donations">MATS Funding</a></p><ul><li><p>This received a total of $190K from 8 different sources: 6 donors and 2 regrantors.</p></li><li><p>I (Rachel) am a big fan of MATS, as, it seems, are lots of people. Insofar as AI safety is talent constrained rather than funding constrained, programs like MATS are a great way of converting abundant resources into more scarce and valuable ones, i.e. good technical AI safety researchers. MATS specifically occupies one of the hardest parts of that pipeline and does a great job. Many of their alums go on to work on the safety teams of the most important players in AI, like Anthropic and OpenAI, and it seems our two regrantors from Anthropic&#8212;Tristan Hume and Evan Hubinger&#8212;are both willing to pay for the talent that MATS brings to their company and field.</p></li><li><p>Tristan Hume explained his decision to direct $150k to MATS: &#8220;I've been very impressed with the MATS program. Lots of impressive people have gotten into and connected through their program and when I've visited I've been impressed with the caliber of people I met.</p><p>An example is Marius Hobbhahn doing interpretability research during MATS that helped inform the Anthropic interpretability team's strategy, and then Marius going on to co-found Apollo.&#8221; (n.b. 
Apollo Research is also a Manifund grantee!)</p></li></ul></li><li><p><a href="https://manifund.org/projects/experiments-to-test-ea--longtermist-framings-and-branding">Experiments to test EA / longtermist framings and branding</a></p><ul><li><p>This received a total of $26.8K from 5 different sources: 3 donors and 2 regrantors.</p></li><li><p>From Marcus&#8217; comment explaining why he decided to contribute: &#8220;We just need to know this or have some idea of it (continuous work should be done here almost certainly). Hard to believe nobody has done this yet.&#8221;</p></li><li><p>Other donors&#8217; comments expressed a similar sentiment: this is simply a really important question and people are curious to see the results!</p></li></ul></li><li><p><a href="https://manifund.org/projects/recreate-the-cavity-preventing-gmo-bacteria-bcs3-l1-from-precursor-">Recreate the cavity-preventing GMO bacteria BCS3-L1 from precursor</a></p><ul><li><p>This received a total of $40.6K from 10 different sources: 8 donors and 2 regrantors. As Lantern Bioworks is a for-profit company, this was structured as a SAFE investment as part of their seed round, rather than a donation.</p></li><li><p>This project is simply very cool. Since receiving the Manifund investment, they&#8217;ve successfully gotten hold of this bacteria and started administering it (including to us, COI disclosure). 
Now they are focused on selling their probiotic more widely, and remain on a good path to succeeding at their ultimate goal of curing all cavities forever.</p></li><li><p>See also: <a href="https://www.astralcodexten.com/p/defying-cavity-lantern-bioworks-faq">ACX writeup</a> and their launched product, <a href="https://www.luminaprobiotic.com/">Lumina</a></p></li></ul></li></ul><h2>Manifold Charity Program</h2><p><strong>Summary</strong>: Allow people to donate their Manifold mana to charities of their choice.</p><p><strong>Assessment</strong>: 4/10</p><p>See also: Donations on the <a href="https://manifold.markets/charity">charity page</a></p><p>This was actually the original reason Manifold created a 501c3, back in 2022. We raised $500k from Future Fund as seed funding, to distribute to other charities; the idea was to provide some backing value for Manifold mana, and put donation decisions in the hands of our best traders.</p><p>Manifold users like that this exists. They mention that they buy mana with the idea that they can donate it later. When Stripe <a href="https://news.manifold.markets/p/above-the-fold-donate-before-march">initially asked us to discontinue this program</a>, many of our users were vocally unhappy about this.</p><p>It also provides a cleaner story for why people participate on Manifold. Prediction markets are sometimes negatively viewed as &#8220;gambling&#8221;, and &#8220;gambling for fake money&#8221; is even less understandable, whereas &#8220;gambling for charity&#8221; is easier to explain and more wholesome.</p><p>This program is currently in maintenance mode, from the perspective of Manifund. We&#8217;re continuing to administer it, but it&#8217;s not an area we&#8217;re trying to improve upon. We&#8217;ll revisit this as part of Manifold&#8217;s monetization goals in 2024. 
For now, it&#8217;s being capped at $10k/mo (<a href="https://www.notion.so/The-New-Deal-for-Manifold-s-Charity-Program-1527421b89224370a30dc1c7820c23ec?pvs=21">The New Deal for Manifold&#8217;s Charity Program</a>).</p><p>There are many potential areas of improvement that we could invest in:</p><ul><li><p>Make donating more of a social experience</p></li><li><p>Support donations to any charity, make it more self-serve</p></li><li><p>Transfer responsibility for program administration fully into Manifund</p></li><li><p>Run matching programs to encourage more user donations</p></li><li><p>Partner with charities</p></li></ul><h2><a href="http://leaf-board.org">leaf-board.org</a>: EA Funds&#8217; grantee portal</h2><p><strong>Summary</strong>: Rachel built a dashboard for EA Funds grantees, which reads from the EA Funds system and tells applicants about their status in the grantee pipeline.</p><p><strong>Assessment</strong>: 7/10</p><p>Here&#8217;s <a href="https://leaf-board.org/recJxkz7MnnJdgF9V">Manifold&#8217;s grantee page</a> as an example:</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!4qLK!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa6b353bf-6c7e-44f9-b82b-f2a99e558859_2000x1135.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!4qLK!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa6b353bf-6c7e-44f9-b82b-f2a99e558859_2000x1135.png 424w, https://substackcdn.com/image/fetch/$s_!4qLK!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa6b353bf-6c7e-44f9-b82b-f2a99e558859_2000x1135.png 848w, 
https://substackcdn.com/image/fetch/$s_!4qLK!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa6b353bf-6c7e-44f9-b82b-f2a99e558859_2000x1135.png 1272w, https://substackcdn.com/image/fetch/$s_!4qLK!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa6b353bf-6c7e-44f9-b82b-f2a99e558859_2000x1135.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!4qLK!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa6b353bf-6c7e-44f9-b82b-f2a99e558859_2000x1135.png" width="1456" height="826" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/a6b353bf-6c7e-44f9-b82b-f2a99e558859_2000x1135.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:826,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:686887,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!4qLK!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa6b353bf-6c7e-44f9-b82b-f2a99e558859_2000x1135.png 424w, https://substackcdn.com/image/fetch/$s_!4qLK!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa6b353bf-6c7e-44f9-b82b-f2a99e558859_2000x1135.png 848w, 
https://substackcdn.com/image/fetch/$s_!4qLK!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa6b353bf-6c7e-44f9-b82b-f2a99e558859_2000x1135.png 1272w, https://substackcdn.com/image/fetch/$s_!4qLK!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa6b353bf-6c7e-44f9-b82b-f2a99e558859_2000x1135.png 1456w" sizes="100vw" loading="lazy"></picture></div></a></figure></div><p>This was a test of increased collaboration between Manifund and EA Funds, as our two orgs have a lot of overlap in cause alignment, 
check size, and org size. Beyond building this dashboard, we discussed many other options for collaborating as well, which may bear fruit down the line, such as creating a &#8220;common app&#8221; for EA, sharing notes on grantees, or merging our financial operations. Building this was also an experiment in &#8220;what if Manifund acted as a software consultancy, raising the quality waterline of software in EA&#8221;. We proved that we could quickly deliver high-quality websites&#8212;Rachel shipped the entire site from scratch in &lt;2 weeks.</p><h1>Other stuff we tried</h1><p>Beyond impact certs and regranting, we&#8217;ve experimented with other financial mechanisms to assist with charitable endeavors. Some of the weirder things:</p><ul><li><p>Did you know that nonprofits can make loans? We&#8217;ve loaned out $300k twice, to two orgs that we felt aligned with and that had a compelling pitch for how they would use the funds: <a href="https://manifold.markets/Austin/will-lightcone-repay-their-300k-loa">Lightcone Infrastructure</a>, and <a href="https://manifold.markets/Austin/if-manifund-loans-250k-to-marcusabr">Marcus Abramovitch&#8217;s trading firm AltX</a>. 
In both cases, we earned a nice return on investment for ourselves, while helping out other organizations in our network.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!zBC2!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb2691220-9653-49db-9f97-b7e6bf204b49_2000x1215.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!zBC2!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb2691220-9653-49db-9f97-b7e6bf204b49_2000x1215.png 424w, https://substackcdn.com/image/fetch/$s_!zBC2!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb2691220-9653-49db-9f97-b7e6bf204b49_2000x1215.png 848w, https://substackcdn.com/image/fetch/$s_!zBC2!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb2691220-9653-49db-9f97-b7e6bf204b49_2000x1215.png 1272w, https://substackcdn.com/image/fetch/$s_!zBC2!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb2691220-9653-49db-9f97-b7e6bf204b49_2000x1215.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!zBC2!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb2691220-9653-49db-9f97-b7e6bf204b49_2000x1215.png" width="1456" height="885" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/b2691220-9653-49db-9f97-b7e6bf204b49_2000x1215.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:885,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:267087,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!zBC2!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb2691220-9653-49db-9f97-b7e6bf204b49_2000x1215.png 424w, https://substackcdn.com/image/fetch/$s_!zBC2!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb2691220-9653-49db-9f97-b7e6bf204b49_2000x1215.png 848w, https://substackcdn.com/image/fetch/$s_!zBC2!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb2691220-9653-49db-9f97-b7e6bf204b49_2000x1215.png 1272w, https://substackcdn.com/image/fetch/$s_!zBC2!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb2691220-9653-49db-9f97-b7e6bf204b49_2000x1215.png 1456w" sizes="100vw" loading="lazy"></picture></div></a></figure></div><p><em>Though the market seemed to think this was a bad idea&#8230;</em></p></li><li><p>Did you know that nonprofits can make investments? We put in $40k as a SAFE into Lantern Bioworks, through Austin and Isaac&#8217;s regrantor budgets. 
We have a soft spot for venture investments made through nonprofits; Manifold Markets&#8217; own seed round started with a large investment from the Future Fund.</p></li></ul><p>Here are some other mechanisms that have caught our eye, as things to experiment with:</p><ul><li><p>Income share agreements, as a replacement for &#8220;upskilling grants&#8221;</p></li><li><p>Dominant Assurance Contracts, as a partial solution to public goods funding problems and a nice addition to our crowdfunding ecosystem</p></li><li><p>Quadratic funding and the S-process, as potential ways of calculating how much to allocate to retroactive payouts</p></li></ul><h1>What we were happy with in 2023</h1><p><strong>The core Manifund product: UI and grantee experience</strong></p><p>This year, we built a website and entire funding ecosystem from scratch, which has moved about $2 million to projects to date. Our two main areas of focus were building out novel funding mechanisms and delivering a good grantee experience, and we feel we&#8217;ve succeeded at both.</p><p>Manifund supports regranting, crowdfunding, and impact certificates. We&#8217;re the first site ever to support trading impact certificates &#8212; we think this represents a huge step forward, experimentally testing out an idea that&#8217;s been discussed for a while.</p><p>We also think it&#8217;s nicer to be a Manifund grantee than a grantee of some other orgs in the EA space: our grantees get their money faster, receive more feedback on their projects, and have an easier time communicating with us and their grantmakers.</p><p><strong>Transparency &amp; openness</strong></p><p>Our initial thesis was that grant applications and screening can largely be done in public, and should be. We followed through on trying this out, and feel it went very well.</p><p>We&#8217;ve formed the largest database of public EA grant applications, as far as we know. 
Whereas most other application and review processes happen over private writeups, Manifund enables these applications to be proposed and discussed on the public internet, including comments and feedback from grantmakers and others.</p><p>We think that more transparency in funding is a public good. For most people in EA, it&#8217;s something of a mystery how funding decisions get made, which can be frustrating and confusing. Manifund makes the thought processes of grantmakers less mysterious.</p><p>Grantees also seem to appreciate the open discussion. From Brian Wang, discussing details about their proposal for &#8220;<a href="https://manifund.org/projects/design-and-testing-of-broad-spectrum-antivirals">Design and testing of broad-spectrum antivirals</a>&#8221; with regrantor Joel Becker:</p><blockquote><p>it&#8217;s been a breath of fresh air to be able to have this real-time, interactive discussion on a funding request, so props to Manifund for enabling this!</p></blockquote><p>Finally, having applications in public has given projects greater exposure. Ryan Kidd told us that someone got in touch about funding SERI MATS after seeing the Manifund post&#8212;not to mention all of the proposals posted via open call that were supported directly by small donors who just saw them on our website. 
As another example of the benefits of transparent grant applications: Lantern Bioworks&#8217;s <a href="https://manifund.org/projects/recreate-the-cavity-preventing-gmo-bacteria-bcs3-l1-from-precursor-">Manifund proposal to cure cavities</a> got to <a href="https://news.ycombinator.com/item?id=36702911">#1 on Hacker News</a>, helping them share their plan widely when they were at a very early stage.</p><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!rTSn!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc5d83e55-9125-45f5-b60d-4fa2e42e6c51_2000x477.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!rTSn!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc5d83e55-9125-45f5-b60d-4fa2e42e6c51_2000x477.png 424w, https://substackcdn.com/image/fetch/$s_!rTSn!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc5d83e55-9125-45f5-b60d-4fa2e42e6c51_2000x477.png 848w, https://substackcdn.com/image/fetch/$s_!rTSn!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc5d83e55-9125-45f5-b60d-4fa2e42e6c51_2000x477.png 1272w, https://substackcdn.com/image/fetch/$s_!rTSn!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc5d83e55-9125-45f5-b60d-4fa2e42e6c51_2000x477.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!rTSn!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc5d83e55-9125-45f5-b60d-4fa2e42e6c51_2000x477.png" width="1456" height="347" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/c5d83e55-9125-45f5-b60d-4fa2e42e6c51_2000x477.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:347,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:616852,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!rTSn!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc5d83e55-9125-45f5-b60d-4fa2e42e6c51_2000x477.png 424w, https://substackcdn.com/image/fetch/$s_!rTSn!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc5d83e55-9125-45f5-b60d-4fa2e42e6c51_2000x477.png 848w, https://substackcdn.com/image/fetch/$s_!rTSn!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc5d83e55-9125-45f5-b60d-4fa2e42e6c51_2000x477.png 1272w, https://substackcdn.com/image/fetch/$s_!rTSn!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc5d83e55-9125-45f5-b60d-4fa2e42e6c51_2000x477.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a></figure></div><p><strong>Quality of projects we&#8217;ve been supporting</strong></p><p>This is both one of the most important metrics of success, and one of the hardest to evaluate. Part of the point of regranting is that the regrantors know more about their fields than we or our donor do&#8212;that&#8217;s the point of outsourcing the decisions to them! 
This means both that they make better decisions than we could, and that it&#8217;s really hard for us to make judgements about their decisions since they were based on expertise we don&#8217;t have.</p><p>All that aside, the quality generally seemed pretty high to us. Almost all grants were at least one of: evaluated by someone with special context on the team or the work, whom we feel comfortable deferring to; counterfactual, or counterfactually fast when it was important; or an obviously positive grant of the type that e.g. LTFF would make.</p><p>We&#8217;d be very curious to hear other people&#8217;s thoughts on whether this assessment seems accurate!</p><p><strong>Coordination with other EA funders</strong></p><p>Because the funding space is pretty fragmented, it seems like there&#8217;s a lot to be gained from a little bit of coordination. In our observation, EA funders rarely sync up, even on basic questions like &#8220;are you also planning on funding this project?&#8221; or &#8220;what are your plans for this quarter?&#8221;. We wanted to combat this and have worked closely with a handful of other orgs, and we&#8217;re happy with the value we&#8217;ve provided to them. These collaborations include:</p><ul><li><p><strong>Astral Codex Ten:</strong> we provided a polished website, fiscal sponsorship, and payout support for ACX Mini-Grants and ACX Grants 2024.</p></li><li><p><strong>EA Funds:</strong> we built a dashboard to improve their grantee experience.</p></li><li><p><strong>Lightcone:</strong> we gave them a fast loan when they were temporarily liquidity-constrained.</p></li></ul><p>And we&#8217;ll be working more with the Long-Term Future Fund and the Survival and Flourishing Fund, as they&#8217;ve agreed to be retroactive funders for ACX Grants 2024.</p><h1>Areas for improvement for 2024</h1><p><strong>Fundraising</strong></p><p>We haven&#8217;t raised much for Manifund&#8217;s core operations or our programs. 
Our successful attempts include the grant from the Survival and Flourishing Fund, the grant from the Future Fund, the donation from D, and the many small donations from individuals through the site. Unsuccessful attempts include our pitches to <a href="https://manifoldmarkets.notion.site/OpenPhil-Grant-Application-3c226068c3ae45eaaf4e6afd7d1763bc?pvs=4">OpenPhil</a>, <a href="https://manifoldmarkets.notion.site/Lightspeed-Grant-app-Jul-2023-7ce8b43ab15c40d7a2096b660c0beb4e?pvs=4">Lightspeed</a> (for regranting), and <a href="https://manifoldmarkets.notion.site/YC-W24-Application-Manifund-b39634e841b84fb48c71b26bd6a0fd01?pvs=4">YCombinator</a> (for impact certs).</p><p>We&#8217;ve been saying for a while that we&#8217;d like to find donors outside of traditional EA sources, though we haven&#8217;t followed through on giving this a serious try. Ideally, the Manifund product would appeal to &#8220;tech founder&#8221; or &#8220;quant trader&#8221; types, as a place where they can direct their money to charity in more efficient and aligned ways, but we&#8217;ve made few inroads into this demographic.</p><p><strong>Hiring</strong></p><p>Currently, Manifund consists of Rachel working full-time and Austin working about half-time. On one hand, we think our output per FTE is pretty impressive! On the other, 1.5 FTE is not really that much for all the things we want to accomplish. We&#8217;re open to bringing on:</p><ul><li><p>A full-stack software engineer, to build out new features and generally improve the site</p></li><li><p>A strategy/ops role, to lead one of our main programs (regranting, impact certs, the open call) via fundraising, partnerships &amp; communications</p></li></ul><p>On the other hand, as a nonprofit startup seeking product-market fit, we don&#8217;t want to overhire, either.</p><p>We&#8217;d also like to improve our nonprofit board. 
Our board currently consists of me (Austin Chen), Barak Gila, and Vishal Maini; Barak and Vishal signed on when Manifund was just running the Manifold Charity program. As we expand our operations, I&#8217;d like to bring on board members with connections and expertise in the areas we&#8217;re trying to grow into.</p><p><strong>Community engagement</strong></p><p>Unlike Manifold, Manifund doesn&#8217;t have much of a community of its own. People don&#8217;t spend their free time on Manifund, or chat with each other for fun on our Discord. I tentatively think this could be a major area of improvement.</p><p>In the early days, &#8220;forecasters hanging out&#8221; was a big part of making the Manifold community feel like a live, exciting place to talk with each other. Community was a key part of Manifold&#8217;s viral loops: people would create interesting prediction market questions, then share them outside of the site. Manifund is missing a similarly powerful viral loop.</p><p>Maybe it&#8217;s hard to replicate the Manifold community because Manifund feels more transactional. The nature of evaluating grant opportunities might make things seem less fun and more &#8220;let&#8217;s get down to business&#8221;. Or perhaps working with real money feels inherently more serious, compared with Manifold&#8217;s fake money.</p><p>The closest thing we&#8217;ve had to a lively community was the regrantors channel in the first couple of months of the regranting program, though the amount of collaboration and evaluation through discussion has tapered off. 
Still, it points to the possibility of creating a strong community among grantmakers or perhaps donors.</p><p><strong>Amount we help our grantees</strong></p><p>Part of the motivation for Manifund came from having participated in the existing funding ecosystem, feeling that the grantee experience was quite lacking, and thinking &#8220;huh, surely we could do better than that&#8230;&#8221;</p><p>While we think we&#8217;ve done a lot better on turnaround times, we&#8217;ve only done slightly better at grantee feedback. When making grants, our regrantors are encouraged to write comments about why they chose to give the grant, and in general our comment section can facilitate conversations between any user and the applicant. However, as far as we can tell, once the grant is made, there isn&#8217;t much more interaction between grantees and grantmakers, or founders and investors in the case of certs, and Manifund the organization doesn&#8217;t provide any further support either.</p><p>One possible mission for Manifund would be to achieve YCombinator levels of support for our project creators. We could nudge regrantors to stay in close contact with their grantees, for example by suggesting they check in every month to see where grantees need help. We could aim to run our own batch for incubating projects at Manifund (as in YC or Charity Entrepreneurship), and facilitate stronger connections between the grantees.</p><p><strong>Building a growth loop</strong></p><p>Right now, each of our programs requires a large amount of time to organize, fundraise for, and then facilitate. We&#8217;d like our platform to be more self-serve and require less intervention from our team. 
For impact certs, perhaps we could standardize the different aspects of creating a contest, and offer a form and pipeline for spinning up custom contests.</p><p>The open call is already moderately self-serve: we haven&#8217;t put much effort into soliciting project applications or donors, and despite that we&#8217;ve received many interesting applications and donor interest!</p><p><strong>Focus</strong></p><p>Possibly we&#8217;re trying too many things for an org of our size and resources. On one hand, we view Manifund&#8217;s comparative advantage in the EA funding ecosystem as the ability to rapidly experiment with new programs and mechanisms that other funders wouldn&#8217;t consider; on the other, we may be able to execute better if we winnowed down our programs to only one or two that are very promising or clearly working well.</p><h1>Ambitious projects &amp; moonshots for 2024</h1><p><strong>10x&#8217;ing impact certificates:</strong> we&#8217;re reasonably happy with how impact certificates have worked out so far, and we&#8217;ve learned a lot through our experiments. The next step is to see how they work at a larger scale. Here are some ideas for much bigger prizes that could use impact certificates:</p><ul><li><p>&#8220;Nobel Prize&#8221; for AI safety work, highlighting the best examples of technical and governance work in the space each year, and allowing people to bet on entries beforehand.</p></li><li><p>O-1 Visa impact certs: offer up e.g. $10k prizes for employers to bring in O-1 candidates; allow investors and lawyers to buy into a share of the prize?</p></li><li><p>Eliminating flu season in San Francisco, with an Advance Market Commitment towards deployment of Far UVC tech. 
This is inspired by conversations with regrantor Gavriel, who works at 1DaySooner and would be a natural partner org for this.</p></li></ul><p><strong>Building a site for &#8220;contests-as-a-service&#8221;:</strong> within EA (OpenPhil AI Worldviews, EA Criticism Contest) and outside of it (Vesuvius Challenge, AI IMO contest), there are open contests for different kinds of work; but there&#8217;s no website where you can easily host your own contest. One inspiration might come from the world of <a href="https://en.99designs.jp/logo-design/contests">logo design contests</a>. As a bonus, if we make prize funding more of a norm, impact certs become much more palatable.</p><p><strong>Pushing harder on the Donor Advised Fund angle of Manifund:</strong> we&#8217;ve been using the &#8220;DAF&#8221; approach to legitimize prediction markets &amp; impact certs, but on a relatively small scale (~$10k-100k/year). Could we convince large donors to store significant assets with Manifund (e.g. totalling $1M-10M/year), and offer exposure to other things that don&#8217;t work with real money (e.g. more liquid prediction markets; private stocks; ISAs)? And could we offer other services that DAFs currently don&#8217;t, like regranting as a product, or charity evaluations?</p><p><strong>Become the central hub for all giving &amp; donation tracking:</strong> an <a href="https://openbook.fyi/">OpenBook</a>/Manifund merge scenario where we become a hub for donations, where people track their donations, see the donations of others, and discuss donations in one place. Giving What We Can is kind of like this, though they are less forum-y and only allow donations to a small set of orgs.</p><p><em>Thanks to Dave Kasten, Joel Becker, Marcus Abramovitch, and others for feedback on this writeup. 
We&#8217;d love to hear what you think of our work as well, and what you&#8217;d be excited to see from us as we go into 2024!</em></p>]]></content:encoded></item><item><title><![CDATA[ACX Mini-Grants Results]]></title><description><![CDATA[And future plans for impact certs]]></description><link>https://manifund.substack.com/p/acx-mini-grants-results</link><guid isPermaLink="false">https://manifund.substack.com/p/acx-mini-grants-results</guid><dc:creator><![CDATA[Manifund]]></dc:creator><pubDate>Fri, 13 Oct 2023 15:30:36 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!v4G5!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc5da4eba-f1d4-4e12-9360-9d279141960e_728x351.webp" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>Scott and the other judges (Drethelin, Nathan, Marcus and Austin) have finished their retrospective evaluations of ACX Mini-Grants. The results are in!</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!v4G5!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc5da4eba-f1d4-4e12-9360-9d279141960e_728x351.webp" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!v4G5!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc5da4eba-f1d4-4e12-9360-9d279141960e_728x351.webp 424w, https://substackcdn.com/image/fetch/$s_!v4G5!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc5da4eba-f1d4-4e12-9360-9d279141960e_728x351.webp 848w, 
https://substackcdn.com/image/fetch/$s_!v4G5!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc5da4eba-f1d4-4e12-9360-9d279141960e_728x351.webp 1272w, https://substackcdn.com/image/fetch/$s_!v4G5!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc5da4eba-f1d4-4e12-9360-9d279141960e_728x351.webp 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!v4G5!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc5da4eba-f1d4-4e12-9360-9d279141960e_728x351.webp" width="728" height="351" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/c5da4eba-f1d4-4e12-9360-9d279141960e_728x351.webp&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:351,&quot;width&quot;:728,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:50396,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/webp&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!v4G5!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc5da4eba-f1d4-4e12-9360-9d279141960e_728x351.webp 424w, https://substackcdn.com/image/fetch/$s_!v4G5!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc5da4eba-f1d4-4e12-9360-9d279141960e_728x351.webp 848w, 
https://substackcdn.com/image/fetch/$s_!v4G5!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc5da4eba-f1d4-4e12-9360-9d279141960e_728x351.webp 1272w, https://substackcdn.com/image/fetch/$s_!v4G5!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc5da4eba-f1d4-4e12-9360-9d279141960e_728x351.webp 1456w" sizes="100vw" fetchpriority="high"></picture></div></a></figure></div><p><em>(see &#8220;PAYOUT&#8221; column)</em></p><p>Congrats to the investors and project creators who have earned a retro payout 
from Scott! You can read <a href="https://www.astralcodexten.com/p/impact-market-mini-grants-results">Scott&#8217;s reflections on ACX</a>.</p><h2>Next steps for creators and investors</h2><p>You should have gotten an email from Manifund with Scott&#8217;s buy offer on any projects you hold equity in. To redeem your certs for money:</p><ol><li><p>Follow the link in the email</p></li><li><p>Hit sell to accept his offer</p></li><li><p>Cash out from your profile page!</p></li></ol><p>Please take Scott up on his offer soon; we&#8217;ll keep the projects open for the next two weeks. You can email me at <a href="mailto:rachel@manifund.org">rachel@manifund.org</a> if you have any trouble.</p><h2>Reflections from Manifund team</h2><p>Like Scott, I&#8217;ll start by saying that this basically worked, which is great! That wasn&#8217;t guaranteed from the beginning&#8212;we worried we might not have enough investors, or they would make totally crazy decisions, or specific implementation details would be sticky, like defining what the final valuation actually <em><strong>means</strong></em> or deciding whether founders should be allowed to keep equity or how the IPO will work. But everything worked out: we got 18 projects funded, Scott only had to pay for the projects that actually worked, and a few did actually work quite well! You can read more about the top projects in <a href="https://www.astralcodexten.com/p/impact-market-mini-grants-results">Scott&#8217;s post</a>.</p><p>Now, on to things that could have gone better.</p><p><strong>First</strong>, Austin and I weren&#8217;t that impressed with the investment decisions, relative to, say, the previous ACX Grants round. 
There are a few factors that might feed into this:</p><ol><li><p><strong>These grants were smaller</strong>, so maybe this attracted fewer serious and competent people to apply, and fewer investors to participate.</p></li><li><p><strong>There were fewer total applicants, and we were less selective.</strong> ACX Grants round 1 had 600+ applications and funded 30; ACX Minigrants had ~30 applications and funded 18.</p></li><li><p><strong>Investors weren&#8217;t very profit-motivated</strong>, based on conversations with them. Many said they saw their investments more as donations than real investments, so they may not have chosen projects very carefully, or may not have had the same expertise that Scott brought to ACX Grants round 1.</p></li><li><p><strong>There weren&#8217;t very many investors.</strong> Much as a Manifold market with very few traders will be mispriced, an IPO auction with only a few bidders will spit out the wrong price&#8212;like Max&#8217;s project, which got only one bid, at a valuation of $300, but ended up being valued at $7500. Though it did exceed expectations, I think an initial valuation of $1000 would have been much more sensible.</p></li></ol><p><strong>Second</strong>, in terms of the quality of projects, we were both surprised by how often the initial proposal just&#8230;wasn&#8217;t delivered on. This isn&#8217;t necessarily such a bad thing&#8212;perhaps this is common with grants in general, especially small ones, and impact certificates remove the risk for the philanthropist, which is good! But it was still disappointing.</p><p>Austin noted that forecasting needs less thinking and more doing. I can kind of get behind this, with the caveat that I&#8217;m not sold on forecasting, period.
So I&#8217;m not that interested in building infrastructure and tools for making forecasts either (we have lots of that), but am more interested in popularizing forecasts and making them useful.</p><p><strong>Finally</strong>, on the judging side, Austin does think that evaluating &#8220;how good was this project and how much should it be worth&#8221; was indeed easier than &#8220;how good might this project be&#8221;. There were many projects where the initial valuations overvalued or undervalued the final outcomes; with hindsight, judges could focus on the delivered results rather than speculate based on their impressions of the project founders. This was what we expected before starting this certs round, but we&#8217;re happy to validate that this key benefit of impact certs does actually hold up!</p><p>The judging itself was done relatively quickly and without much consultation between judges; judges had a couple weeks, and Austin, for example, spent a total of ~4 hours on judging and writing comments. This seems reasonable for the $26k we disbursed, but we may want to encourage more judge discussion on larger retro rounds (see some <a href="https://vitalik.ca/general/2021/11/16/retro1.html#other-ideas-for-structuring-discussion">suggestions by Vitalik Buterin for this</a>).</p><h2>Future plans</h2><p>Scott is planning on a full ACX Grants round (tentatively late November), where he makes some grants directly and Manifund creates impact certs for the other proposals; he also wants to get other final funders to commit retro funding too. Manifold is planning an impact certs round focused just on Manifold community initiatives &#8212; stay tuned.
And we&#8217;re happy to partner with other folks who want to commit retro funding for their own impact cert round; reach out to <a href="mailto:austin@manifund.org">austin@manifund.org</a> if interested!</p><p>Some things we&#8217;d like to improve on for future rounds:</p><ul><li><p>a better trading UI, including an AMM, so prices will be set better and it&#8217;ll be more fun to participate on the investor side</p></li><li><p>opening up trading to non-accredited investors, by allowing them to fund projects and donate gains to charity but not withdraw them (similar to how mana works)</p></li><li><p>specifying up front what the final funders will be looking for</p></li><li><p>better guidance or defaults on how much equity founders should keep, and how much a project might be worth if it succeeds</p></li><li><p>more regular check-ins with people working on projects, both from Manifund and from investors</p></li></ul><p>We&#8217;d love your suggestions and feedback, in the comments here or in our <a href="https://discord.gg/ZGsDMWSA5Q">Discord</a>!</p><p>And finally, thanks to everyone who joined us for this first test run: the project creators, investors, judges, and most of all Scott, for taking a chance on this newfangled funding system.
This was a landmark use of impact certs, the first time retroactive grant funding was paired with upfront equity investment; we&#8217;re excited for what comes next!</p><p>Best,<br>Rachel</p>]]></content:encoded></item><item><title><![CDATA[What we're funding (weeks 2-4)]]></title><description><![CDATA[$600k granted; volunteers & regrantors; community events]]></description><link>https://manifund.substack.com/p/what-were-funding-weeks-2-4</link><guid isPermaLink="false">https://manifund.substack.com/p/what-were-funding-weeks-2-4</guid><dc:creator><![CDATA[Manifund]]></dc:creator><pubDate>Fri, 04 Aug 2023 15:54:14 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!MCGS!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcce580d5-1330-48f7-94dc-251637feed9f_2964x2180.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<h3>Overall reflections</h3><ul><li><p>Very happy with the volume and quality of grants we&#8217;ve been making</p><ul><li><p>$600k+ newly committed across 12 projects</p></li><li><p>Regrantors have been initiating grants and coordinating on large projects</p></li><li><p>Independent donors have committed $35k+ of their own money!</p></li><li><p>We plan to start fundraising soon, based on this pace of distribution</p></li></ul></li><li><p>Happy to be coordinating with funders, e.g. at LTFF, Lightspeed, Nonlinear and OpenPhil</p><ul><li><p>We now have a common Slack channel to share knowledge and plans</p></li><li><p>Currently floating the idea of setting up a common app between us&#8230;</p></li></ul></li><li><p>Happy with our experimentation!
Some things we&#8217;ve been trying:</p><ul><li><p>Equity investments, loans, dominant assurance contracts and retroactive funding</p></li><li><p>Grantathon, office hours, feedback on Discord &amp; site comments</p></li></ul></li><li><p>Less happy with our operations (wrt feedback and response times to applicants)</p><ul><li><p>Taking longer to support individual grantees, or start new Manifund initiatives</p><ul><li><p>Please ping us if it&#8217;s been a week and you haven&#8217;t heard anything!</p></li></ul></li><li><p>Wise deactivated our account, making international payments more difficult/expensive&#8230;</p></li><li><p>In cases where multiple regrantors may fund a project, we&#8217;ve observed a bit of &#8220;funding chicken&#8221;</p></li></ul></li></ul><h3>Grant of the month</h3><ul><li><p><strong>[$310k] <a href="https://manifund.org/projects/apollo-research-scale-up-interpretability--behavioral-model-evals-research">Apollo Research</a></strong></p><p>This is our largest grant to date! Many of our regrantors were independently excited about Apollo; in the end, we coordinated between Tristan Hume, Evan Hubinger and Marcus Abramovitch to fund this.</p><p>From Tristan:</p></li></ul><blockquote><p>I'm very excited about Apollo based on a combination of the track record of its founding employees and the research agenda they've articulated.</p><p>Marius and Lee have published work that's significantly contributed to <a href="https://transformer-circuits.pub/2023/may-update/index.html">Anthropic's work on dictionary learning</a>.
I've also met both Marius and Lee and have confidence in them to do a good job with Apollo.</p><p>Additionally, I'm very much a fan of alignment and dangerous capability evals as an area of research and think there's lots of room for more people to work on them.</p><p>In terms of cost-effectiveness I like these research areas because they're ones I think are very tractable to approach from outside a major lab in a helpful way, while not taking large amounts of compute. I also think Apollo existing in London will allow them to hire underutilized talent that would have trouble getting a U.S. visa.</p></blockquote><h3>New grants</h3><ul><li><p><strong>[$112k] <a href="https://manifund.org/projects/scoping-developmental-interpretability-xg55b33wsfc">Jesse Hoogland: Scoping Developmental Interpretability</a></strong></p><p>Jesse posted this through our open call:</p></li></ul><blockquote><p>We propose a 6-month research project to assess the viability of&nbsp;<strong><a href="https://www.lesswrong.com/posts/TjaeCWvLZtEDAS5Ex/towards-developmental-interpretability">Developmental Interpretability</a></strong>, a new AI alignment research agenda. &#8220;DevInterp&#8221; studies how phase transitions give rise to computational structure in neural networks, and offers a possible path to scalable interpretability tools.</p><p>Though we have both empirical and theoretical reasons to believe that phase transitions dominate the training process, the details remain unclear. We plan to clarify the role of phase transitions by studying them in a variety of models combining techniques from Singular Learning Theory and Mechanistic Interpretability. 
In six months, we expect to have gathered enough evidence to confirm that DevInterp is a viable research program.</p><p>If successful, we expect Developmental Interpretability to become one of the main branches of technical alignment research over the next few years.</p></blockquote><ul><li><p>Rachel was excited about this project and considered setting up a dominant assurance contract to encourage regrants, but instead offered 10% matching; Evan took her up on this!</p></li><li><p><strong>[$60k] <a href="https://manifund.org/projects/agency-and-disempowerment">Dam and Pietro: Writeup on Agency and (Dis)Empowerment</a></strong></p><p>A regrant initiated by Evan:</p></li></ul><blockquote><p>6 months' support for two people, Damiano and Pietro, to write a paper about (dis)empowerment&#8230; Its ultimate aim is to offer formal and operational notions of (dis)empowerment. For example, an intermediate step would be to provide a continuous formalisation of agency, and to investigate which conditions increase or decrease agency.</p></blockquote><ul><li><p><strong>[$40k] <a href="https://manifund.org/projects/recreate-the-cavity-preventing-gmo-bacteria-bcs3-l1-from-precursor-">Aaron Silverbook: Curing Cavities</a></strong></p><p>Following up from Aaron&#8217;s application last month (which went viral on Hacker News), Austin and Isaak split this regrant &#8212; structured as an equity investment into Aaron&#8217;s startup. A handful of small donors contributed as well~</p></li><li><p><strong>[$30k] <a href="https://manifund.org/projects/activation-vector-steering-with-bci">Lisa Thiergart &amp; David Dalrymple: Activation vector steering with BCI</a></strong></p><p>This regrant was initiated by Marcus, and matched by Evan. Marcus says:</p></li></ul><blockquote><p>When I talked with Lisa, she was clearly able to articulate why the project is a good idea. Often people struggle to do this.</p><p>Lisa is smart and talented and wants to expand her impact by leading projects.
This seems well worth supporting.</p><p>Davidad is fairly well-known to be very insightful and proposed the project before seeing the original results.</p><p>Reviewers from Nonlinear Network gave great feedback on funding Lisa for two projects she proposed. She was most excited about this one and, with rare exceptions, when a person has two potential projects, they should do the one they are most excited about.</p><p>I think we need to get more tools in our arsenal for attacking alignment. With some early promising results, it seems very good to build out activation vector steering.</p></blockquote><ul><li><p><strong>[$25k] <a href="https://manifund.org/projects/independent-researcher">Joseph Bloom: Trajectory models and agent simulators</a></strong></p><p>Marcus initiated this regrant last month; this week, Dylan Mavrides was kind enough to donate $25k of his personal funds, completing this project&#8217;s minimum funding goal!</p></li><li><p><strong>[$10k] <a href="https://manifund.org/projects/run-five-international-hackathons-on-ai-safety-research">Esben Kran: Five international hackathons on AI safety</a></strong></p><p>A $5k donation by Anton, and matched by a regrant from Renan Araujo:</p></li></ul><blockquote><p>I&#8217;m contributing to this project based on a) my experience running one of their hackathons in April 2023, which I thought was high quality, and b) my excitement to see this model scaled, as I think it has an outsized contribution to the talent search pipeline for AI safety. 
I&#8217;d be interested in seeing someone with a more technical background evaluate the quality of the outputs, but I don&#8217;t think that&#8217;s the core of their impact here.</p></blockquote><ul><li><p><strong>[$10k] <a href="https://manifund.org/projects/vaccinateca">Karl Yang: Starting VaccinateCA</a></strong></p><p>A retroactive grant initiated by Austin:</p></li></ul><blockquote><p>I want to highlight VaccinateCA as an example of an extremely effective project, and tell others that Manifund is interested in funding projects like it. Elements of VaccinateCA that endear me to it, especially in contrast to typical EA projects:</p><ul><li><p>They moved very, very quickly</p></li><li><p>They operated an object-level intervention, instead of doing research or education</p></li><li><p>They used technology that could scale up to serve millions</p></li><li><p>But were also happy to manually call up pharmacies, driven by what worked well</p></li></ul></blockquote><ul><li><p><strong>[$6k] <a href="https://manifund.org/projects/alignment-is-hard">Alexander Bistagne: Alignment is Hard</a></strong></p><p>Notable for being funded entirely by independent donors so far! Greg Colbourn weighs in:</p></li></ul><blockquote><p>This research seems promising. I'm pledging enough to get it to proceed. In general we need more of this kind of research to establish consensus on LLMs (foundation models) basically being fundamentally uncontrollable black boxes (that are dangerous at the frontier scale).
I think this can lead - in conjunction with laws about recalls for rule breaking / interpretability - to a de facto global moratorium on this kind of dangerous (proto-)AGI.</p></blockquote><ul><li><p>A series of small grants that came in from our open call, funded by Austin:</p><ul><li><p><strong>[$2.5k] <a href="https://manifund.org/projects/neuronpedia---ai-safety-game">Johnny Lin: Neuronpedia, an AI Safety game</a></strong> (see also <a href="https://www.lesswrong.com/posts/skKYznZyRtN87tHbB/neuronpedia-ai-safety-game">launch on LessWrong</a>!)</p></li><li><p><strong>[$500] <a href="https://manifund.org/projects/funding-for-solar4africa-app-development">Sophia Pung: Solar4Africa app development</a></strong></p></li><li><p><strong>[$500] <a href="https://manifund.org/projects/one-semester-living-expenses-for-mitharvard-based-researcher">Vik Gupta: LimbX robotic limb &amp; other projects</a></strong></p></li></ul></li></ul><h3>Cool projects, seeking funding</h3><p><strong>AI Safety</strong></p><ul><li><p><strong>[$5k-$10k] <a href="https://manifund.org/projects/exploring-ai-safety-career-pathways">Peter Brietbart: Exploring AI Safety Career Pathways</a></strong></p></li><li><p><strong>[$500-$190k] <a href="https://manifund.org/projects/the-rethink-priorities-existential-security-team-founder-in-residence-hire">Rethink Priorities: XST Founder in Residence</a></strong></p></li><li><p><strong>[$500-$38k] <a href="https://manifund.org/projects/discovering-latent-goals-mechanistic-interpretability-phd-salary">Lucy Farnik: Discovering Latent Goals</a></strong></p></li></ul><p><strong>Other</strong></p><ul><li><p><strong>[$25k-$152k] <a href="https://manifund.org/projects/riesgos-catastrficos-globales">Jorge Andr&#233;s Torres Celis: Riesgos Catastr&#243;ficos Globales</a></strong></p></li><li><p><strong>[$75k-$200k] <a href="https://manifund.org/projects/congressional-staffers-biosecurity-briefings-in-dc">Allison Burke: Congressional staffers' biosecurity briefings in 
DC</a></strong></p></li></ul><h3>Shoutout to Anton Makiievskyi, volunteer extraordinaire</h3><p>Anton has been volunteering in person with Manifund these last two weeks. As a former poker player turned earning-to-giver, Anton analyzed how the Manifund site comes across to small and medium individual donors. He&#8217;s held user interviews with a variety of donors and grantees, solicited applications from top Nonlinear Network applicants, and donated thousands of dollars of his personal funds towards Manifund projects. Thanks, Anton!</p><h3>Regrantor updates</h3><p>We&#8217;ve onboarded a new regrantor: Ryan Kidd! Ryan is the co-director of SERI MATS, which makes him well connected and high context on up-and-coming people and projects in the AI safety space.</p><p>Meanwhile, Qualy the Lightbulb is withdrawing from our regranting program for personal reasons &#128543;</p><p>Finally, we&#8217;ve increased the budgets of two of our regrantors by $50k each: Marcus and Evan! This is in recognition of the excellent regrants they have initiated so far.
Both have been reaching out to promising grantees as well as reviewing open call applications; thanks for your hard work!</p><h3>Site updates</h3><ul><li><p>The home page now emphasizes our regrantors &amp; top grants:</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!MCGS!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcce580d5-1330-48f7-94dc-251637feed9f_2964x2180.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!MCGS!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcce580d5-1330-48f7-94dc-251637feed9f_2964x2180.png 424w, https://substackcdn.com/image/fetch/$s_!MCGS!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcce580d5-1330-48f7-94dc-251637feed9f_2964x2180.png 848w, https://substackcdn.com/image/fetch/$s_!MCGS!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcce580d5-1330-48f7-94dc-251637feed9f_2964x2180.png 1272w, https://substackcdn.com/image/fetch/$s_!MCGS!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcce580d5-1330-48f7-94dc-251637feed9f_2964x2180.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!MCGS!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcce580d5-1330-48f7-94dc-251637feed9f_2964x2180.png" width="1456" height="1071" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/cce580d5-1330-48f7-94dc-251637feed9f_2964x2180.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1071,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:1407477,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!MCGS!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcce580d5-1330-48f7-94dc-251637feed9f_2964x2180.png 424w, https://substackcdn.com/image/fetch/$s_!MCGS!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcce580d5-1330-48f7-94dc-251637feed9f_2964x2180.png 848w, https://substackcdn.com/image/fetch/$s_!MCGS!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcce580d5-1330-48f7-94dc-251637feed9f_2964x2180.png 1272w, https://substackcdn.com/image/fetch/$s_!MCGS!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcce580d5-1330-48f7-94dc-251637feed9f_2964x2180.png 1456w" sizes="100vw" loading="lazy"></picture></div></a></figure></div><p></p></li></ul><ul><li><p>Our editor now saves changes locally, and prompts you to format your writing:</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!a0lz!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F06b851e4-6182-4f59-9573-e04592616421_1964x922.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!a0lz!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F06b851e4-6182-4f59-9573-e04592616421_1964x922.png 424w, https://substackcdn.com/image/fetch/$s_!a0lz!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F06b851e4-6182-4f59-9573-e04592616421_1964x922.png 848w,
https://substackcdn.com/image/fetch/$s_!a0lz!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F06b851e4-6182-4f59-9573-e04592616421_1964x922.png 1272w, https://substackcdn.com/image/fetch/$s_!a0lz!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F06b851e4-6182-4f59-9573-e04592616421_1964x922.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!a0lz!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F06b851e4-6182-4f59-9573-e04592616421_1964x922.png" width="1456" height="684" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/06b851e4-6182-4f59-9573-e04592616421_1964x922.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:684,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:154648,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!a0lz!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F06b851e4-6182-4f59-9573-e04592616421_1964x922.png 424w, https://substackcdn.com/image/fetch/$s_!a0lz!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F06b851e4-6182-4f59-9573-e04592616421_1964x922.png 848w, 
https://substackcdn.com/image/fetch/$s_!a0lz!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F06b851e4-6182-4f59-9573-e04592616421_1964x922.png 1272w, https://substackcdn.com/image/fetch/$s_!a0lz!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F06b851e4-6182-4f59-9573-e04592616421_1964x922.png 1456w" sizes="100vw" loading="lazy"></picture></div></a></figure></div></li></ul><h3>Community events</h3><p>We ran a couple experiments intended to help out our regrantors and grantees:</p><ul><li><p>A
grantathon (grant-writing workshop) on Jul 19, where we mixed three sessions of heads-down writing time with 1:1 feedback sessions.</p></li><li><p>Open office hours on Jul 30, where a few grantees and regrantors met on Discord to chat about their current projects.</p></li></ul><p>We&#8217;ll continue to run events like this every so often; hop on <a href="https://discord.gg/ZGsDMWSA5Q">our Discord</a> to hear more!</p><h3>Other links</h3><ul><li><p>Announcing <a href="http://manifestconference.net">Manifest 2023</a>, Manifold&#8217;s first conference (Sep 22-24 in Berkeley)</p></li><li><p>LTFF posts their <a href="https://forum.effectivealtruism.org/posts/zZ2vq7YEckpunrQS4/long-term-future-fund-april-2023-grant-recommendations">grant writeups</a> and is <a href="https://forum.effectivealtruism.org/posts/zt6MsCCDStm74HFwo/ea-funds-organisational-update-open-philanthropy-matching">fundraising</a>; <a href="https://forum.effectivealtruism.org/posts/9vazTE4nTCEivYSC6/reflections-on-my-time-on-the-long-term-future-fund">Asya Bergal</a> posts her reflections</p></li><li><p>Bet on whether <a href="https://manifold.markets/market/will-manifold-for-charity-get-accep">Manifund will get into YCombinator</a>&#8230;</p></li><li><p>Manifold team offsite in Puerto Rico!
Shoutout to Nonlinear for lending us their apartment:</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!Wgeu!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F11e88fe0-3fa2-49ee-87c6-7bad0a680c56_2280x1710.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!Wgeu!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F11e88fe0-3fa2-49ee-87c6-7bad0a680c56_2280x1710.png 424w, https://substackcdn.com/image/fetch/$s_!Wgeu!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F11e88fe0-3fa2-49ee-87c6-7bad0a680c56_2280x1710.png 848w, https://substackcdn.com/image/fetch/$s_!Wgeu!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F11e88fe0-3fa2-49ee-87c6-7bad0a680c56_2280x1710.png 1272w, https://substackcdn.com/image/fetch/$s_!Wgeu!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F11e88fe0-3fa2-49ee-87c6-7bad0a680c56_2280x1710.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!Wgeu!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F11e88fe0-3fa2-49ee-87c6-7bad0a680c56_2280x1710.png" width="1456" height="1092" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/11e88fe0-3fa2-49ee-87c6-7bad0a680c56_2280x1710.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1092,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:6545354,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!Wgeu!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F11e88fe0-3fa2-49ee-87c6-7bad0a680c56_2280x1710.png 424w, https://substackcdn.com/image/fetch/$s_!Wgeu!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F11e88fe0-3fa2-49ee-87c6-7bad0a680c56_2280x1710.png 848w, https://substackcdn.com/image/fetch/$s_!Wgeu!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F11e88fe0-3fa2-49ee-87c6-7bad0a680c56_2280x1710.png 1272w, https://substackcdn.com/image/fetch/$s_!Wgeu!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F11e88fe0-3fa2-49ee-87c6-7bad0a680c56_2280x1710.png 1456w" sizes="100vw" loading="lazy"></picture></div></a></figure></div><p></p><p></p></li></ul><p>Thanks for reading,</p><p>&#8212; Austin</p>]]></content:encoded></item></channel></rss>