Aaron's Blog
Pigeon Hour
#9: Sarah Woodhouse on discovering AI x-risk, Twitter, and more


Note: I can’t seem to edit or remove the “transcript” tab. I recommend you ignore that and just look at the much higher quality, slightly cleaned up one below.

Most importantly, follow Sarah on Twitter!


(Written by ChatGPT, as you can probably tell)

In this episode of Pigeon Hour, host Aaron delves deep into the world of AI safety with his guest, Sarah Woodhouse. Sarah shares her unexpected journey from fearing job automation to becoming a recognized voice on AI safety Twitter. Her story starts with a simple Google search that led her down a rabbit hole of existential dread and unexpected fame on social media. As she narrates her path from lurker to influencer, Sarah reflects on the quirky dynamics of the AI safety community, her own existential crisis, and the serendipitous tweet that resonated with thousands.

Aaron and Sarah’s conversation takes unexpected turns, discussing everything from the peculiarities of EA rationalists to the surprisingly serious topic of shrimp welfare. They also explore the nuances of AI doom probabilities, the social dynamics of tech Twitter, and Sarah’s unexpected viral fame as a tween. This episode is a rollercoaster of insights and anecdotes, perfect for anyone interested in the intersection of technology, society, and the unpredictable journey of internet fame.

Topics discussed

Discussion on AI Safety and Personal Journeys:

  • Aaron and Sarah discuss her path to AI safety, triggered by concerns about job automation and the realization that AI could potentially replace her work.

  • Sarah's deep dive into AI safety started with a simple Google search, leading her to Geoffrey Hinton's alarming statements, and eventually to a broader exploration without finding reassuring consensus.

  • Sarah's Twitter engagement began with lurking, later evolving into active participation and gaining an audience, especially after a relatable tweet thread about an existential crisis.

  • Aaron remarks on the rarity of people like Sarah, who follow the AI safety rabbit hole to its depths, considering its obvious implications for various industries.

AI Safety and Public Perception:

  • Sarah discusses her surprise at discovering the AI safety conversation happening mostly in niche circles, often with a tongue-in-cheek attitude that could seem dismissive of the serious implications of AI risks.

  • The discussion touches on the paradox of AI safety: it’s a critically important topic, yet it often remains confined within certain intellectual circles, leading to a lack of broader public engagement and awareness.

Cultural Differences and Personal Interests:

  • The conversation shifts to cultural differences between the UK and the US, particularly in terms of sincerity and communication styles.

  • Personal interests, such as theater and musicals (like "Glee"), are also discussed, revealing Sarah's background and hobbies.

Effective Altruism (EA) and Rationalist Communities:

  • Sarah points out certain quirks of the EA and rationalist communities, such as their penchant for detailed analysis, hedging statements, and the use of probabilities in discussions.

  • The debate around the use of "P(Doom)" (probability of doom) in AI safety discussions is critiqued, highlighting how it can be both a serious analytical tool and a potentially alienating jargon for outsiders.

Shrimp Welfare and Ethical Considerations:

  • A detailed discussion on shrimp welfare as an ethical consideration in effective altruism unfolds, examining the moral implications and effectiveness of focusing on animal welfare at a large scale.

  • Aaron defends his position on prioritizing shrimp welfare in charitable giving, based on the principles of importance, tractability, and neglectedness.

Personal Decision-Making in Charitable Giving:

  • Strategies for personal charitable giving are explored, including setting a donation cutoff point to balance moral obligations with personal needs and aspirations.


AARON: Whatever you want. Okay. Yeah, I feel like you said this on Twitter. The obvious thing is, how did you learn about AI safety? But maybe you’ve already covered that. That’s boring. First of all, do you want to talk about that? Because we don’t have to.

SARAH: I don’t mind talking about that.

AARON: But it’s sort of your call, so whatever. I don’t know. Maybe briefly, and then we can branch out?

SARAH: I have a preference for people asking me things and me answering them rather than me setting the agenda. So don’t ever feel bad about just asking me stuff because I prefer that.

AARON: Okay, cool. But also, it feels like the kind of thing where, of course, we have AI. Everyone already knows that this is just like the voice version of these four tweets or whatever. But regardless. Yes. So, Sarah, as Pigeon Hour guest, what was your path through life to AI safety Twitter?

SARAH: Well, I realized that a chatbot could very easily do my job and that my employers either hadn’t noticed this or they had noticed, but they were just being polite about it and they didn’t want to fire me because they’re too nice. And I was like, I should find out what AI development is going to be like over the next few years so that I know if I should go and get good at some other stuff.

SARAH: I just had a little innocent Google. And then within a few clicks, I’d completely doom pilled myself. I was like, we’re all going to die. I think I found Geoffrey Hinton because he was on the news at the time, because he just quit his job at Google. And he was there saying things that sounded very uncertain, very alarming. And I was like, well, he’s probably the pessimist, but I’m sure that there are loads of optimists to counteract that because that’s how it usually goes. You find a doomer and then you find a bunch of more moderate people, and then there’s some consensus in the middle that everything’s basically fine.

SARAH: I was like, if I just keep looking, I’ll find the consensus because it’s there. I’m sure it’s there. So I just kept looking and looking for it. I looked for it for weeks. I just didn’t find it. And then I was like, nobody knows what’s going on. This seems really concerning. So then I started lurking on Twitter, and then I got familiar with all the different accounts, whatever. And then at some point, I was like, I’m going to start contributing to this conversation, but I didn’t think that anybody would talk back to me. And then at some point, they started talking back to me and I was like, this is kind of weird.

SARAH: And then at some point, I was having an existential crisis and I had a couple of glasses of wine or something, and I just decided to type this big, long thread. And then I went to bed. I woke up the next morning slightly grouchy and hungover. I checked my phone and there were all these people messaging me and all these people replying to my thread being like, this is so relatable. This really resonated with me. And I was like, what is going on?

AARON: You were there on Twitter before that thread right? I’m pretty sure I was following you.

SARAH: I think, yeah, I was there before, but no one ever really gave me any attention prior to that. I think I had a couple of tweets that blew up before that, but not to the same extent. And then after that, I think I was like, okay, so now I have an audience. When I say an audience, like, obviously a small one, but more of an audience than I’ve ever had before in my life. And I was like, how far can I take this?

SARAH: I was a bit like, people obviously started following me because I’m freaking out about AI, but if I post an outfit, what’s going to happen? How far can I push this posting, these fit checks? I started posting random stuff about things that were completely unrelated. I was like, oh, people are kind of here for this, too. Okay, this is weird. So now I’m just milking it for all it’s worth, and I really don’t know why anybody’s listening to me. I’m basically very confused about the whole thing.

AARON: I mean, I think it’s kind of weird from your perspective, or it’s weird in general because there aren’t that many people who just do that extremely logical thing at the beginning. I don’t know, maybe it’s not obvious to people in every industry or whatever that AI is potentially a big deal, but there’s lots of truckers or whatever. Maybe they’re not the best demographic or the most conducive demographic, like, getting on Twitter or whatever, but there’s other jobs that it would make sense to look into that. It’s kind of weird to me that only you followed the rabbit hole all the way down.

SARAH: I know! This is what I…Because it’s not that hard to complete the circle. It probably took me like a day, it took me like an afternoon to get from, I’m worried about job automation to I should stop saving for retirement. It didn’t take me that long. Do you know what I mean? No one ever looks. I literally don’t get it. I was talking to some people. I was talking to one of my coworkers about this the other day, and I think I came up in conversation. She was like, yeah, I’m a bit worried about AI because I heard on the radio that taxi drivers might be out of a job. That’s bad. And I was like, yeah, that is bad. But do you know what else? She was like, what are the AI companies up to that we don’t know about? And I was like, I mean, you can go on their website. You can just go on their website and read about how they think that their technology is an extinction risk. It’s not like they’re hiding. It’s literally just on there and no one ever looks. It’s just crazy.

AARON: Yeah. Honestly, I don’t even know if I was in your situation, if I would have done that. It’s like, in some sense, I am surprised. It’s very few people maybe like one, but at another level, it’s more rationality than most humans have or something. Yeah. You regret going down that rabbit hole?

SARAH: Yeah, kind of. Although I’m enjoying the Twitter thing and it’s kind of fun, and it turns out there’s endless comedic material that you can get out of impending doom. The whole thing is quite funny. It’s not funny, but you can make it funny if you try hard enough. But, yeah, what was I going to say? I think maybe I was more primed for doom pilling than your average person because I already knew what EA was and I already knew, you know what I mean. That stuff was on my radar.

AARON: That’s interesting.

SARAH: I think had it not been on my radar, I don’t think I would have followed the pipeline all the way.

AARON: Yeah. I don’t know what browser you use, but (and you should definitely only do this if you actually think it would be cool or whatever) that day could still be in your browser history, and that would be hilarious. You could remove anything you didn’t want to show, but if it’s Google Chrome, they package everything into sessions. It’s one browsing session and it’ll have like 10,000 links.

SARAH: Yeah, I think for non-sketchy reasons, I delete my Google history more regularly than that. I don’t think I’d be able to find that. But I can remember the day, and I can remember my anxiety levels just going up and up somewhere between 1:00 p.m. and 7:00 p.m. And by the evening I’m like, oh, my God.

AARON: Oh, damn, that’s wild.

SARAH: It was really stressful.

AARON: Yeah, I guess props for… I don’t know if “props” is the right word. I guess, impressed? I’m actually somewhat surprised to hear that you said you regret it. I mean, that sucks though, I guess. I’m sorry.

SARAH: If you could unknow this, would you?

AARON: No, because I think it’s worth maybe selfishly, but not overall because. Okay, yeah, I think that would plausibly be the selfish thing to do. Actually. No, actually, hold on. No, I actually don’t think that’s true. I actually think there’s enough an individual can do selfishly such that it makes sense. Even the emotional turmoil.

SARAH: It would depend how much you thought that you were going to personally move the needle by knowing about it. I personally don’t think that I’m going to be able to do very much. It’s not like I was going to tip the scales. I wouldn’t selfishly unknow it and sacrifice the world. But me being not particularly informed or intelligent and not having any power, I feel like if I forgot that AI was going to end the world, it would not make much difference.

AARON: You know what I mean? I agree that it’s like, yes, it is unlikely for either of us to tip the scales, but.

SARAH: Maybe you can’t.

AARON: No, actually, in terms of, yeah, I’m probably somewhat more technically knowledgeable just based on what I know about you. Maybe I’m wrong.

SARAH: No, you’re definitely right.

AARON: It’s sort of just like a probabilities thing. I do think that ‘doom’ - that word - is too simplified, often too simple to capture what people really care about. But if you just want to say doom versus no doom or whatever, AI doom versus no AI doom. Maybe there’s like a one in 100,000 chance that one of us tips the scales. And that’s important. Maybe even, like, one in 10,000. Probably not. Probably not.

SARAH: One in 10,000. Wow.

AARON: But that’s what people do. People vote. This is old 80k material I’m regurgitating: in some article from a while ago, they made a case for why doing things that are unlikely to counterfactually matter can still be amazingly good. The classic example is voting if you’re in a tight race, say in a swing state in the United States, where it could go either way. It might be pretty unlikely that you are the single swing vote, but it could be one in 100,000. And that’s not crazy.

SARAH: It doesn’t take very much effort to vote, though.

AARON: Yeah, sure. But the core justification is also that the stakes are proportionally higher here, so maybe that accounts for some of it. But, yes, you’re absolutely right. Definitely different amounts of effort.

SARAH: I’m not putting in much effort toward saving the world from AI, though. I wouldn’t say that. I wouldn’t say that I’m sacrificing.

AARON: I don’t even know. Maybe it doesn’t feel like a sacrifice. Maybe it isn’t. But I do think there’s at least something to this (I don’t know if it really checks out, but I would bet that it does): having more reasonably calibrated people, I wanted to say reasonably well informed, but really what it is is some level of being informed and some level of knowing what you don’t know, and more just, like, normal people (sorry, I hope “normal” isn’t a bad thing to say; I’m saying not tech bros, people who are not coded as tech bros) talking about this on a public platform just seems actually, in fact, pretty good.

SARAH: As long as we like, literally just people that aren’t men as well. No offense.

AARON: Oh, no, totally. Yeah.

SARAH: Where are all the women? There’s a few.

AARON: There’s a few that are super. I don’t know, like, leaders in some sense, like Ajeya Cotra and Katja Grace. But I think the last EA survey was a third. Or I could be butchering this or whatever. And maybe even within that category, there’s some variation. I don’t think it’s 2%.

SARAH: Okay. All right. Yeah.

AARON: Like 15 or 20% which is still pretty low.

SARAH: No, but that’s actually better than I would have thought, I think.

AARON: Also, Twitter is, of all the social media platforms, especially male. I don’t really know.


AARON: I don’t like Instagram, I think.

SARAH: I wonder, it would be interesting to see whether it’s become more male-dominated since Elon Musk took over.

AARON: It’s not a huge difference, but who knows?

SARAH: I don’t know. I have no idea. It would just be interesting to know.

AARON: Okay. Wait. Also, there’s no scheduled time. I’m very happy to keep talking or whatever, but as soon as you want to take a break or hop off, just like. Yeah.

SARAH: Oh, yeah. I’m in no rush.

AARON: Okay, well, I don’t know. We’ve talked about the two obvious candidates. Do you have a take or something? Want to get out to the world? It’s not about AI or obesity or just a story you want to share.

SARAH: These are my two pet subjects. I don’t know anything else.

AARON: I don’t believe you. I know you know about house plants.

SARAH: I do. A secret, which you can’t tell anyone, is that I actually only know about house plants that are hard to kill, and I’m actually not very good at taking care of them.

AARON: Well, I’m glad it’s house plants in that case, rather than pets. Whatever.

SARAH: Yeah. I mean, I have killed some sea monkeys, too, but that was a long time ago.

AARON: Yes. So did I, actually.

SARAH: Did you? I feel like everyone has. Everyone’s got a little sea monkey graveyard in their past.

AARON: New cause area.

SARAH: Are there more shrimp or more sea monkeys? That’s the question.

AARON: I don’t even know what they even are. I mean, are they just plankton?

SARAH: No, they’re not plankton.

AARON: I know what sea monkeys are.

SARAH: There’s definitely a lot of them because they’re small and insignificant.

AARON: Yeah, but it depends if you’re talking about in the wild, which I guess probably favors sea monkeys, or farmed for food. I doubt sea monkeys are farmed, either for food or for anything.

SARAH: Yeah, no, you’re probably right.

AARON: Or they probably are farmed a tiny bit for this niche little.

SARAH: Or they’re farmed to sell in aquariums for kids.

AARON: Apparently they are a kind of shrimp, but they were bred specifically to, I don’t know, be tiny or something. I’m just skimming the Wikipedia page here.

SARAH: Sea monkeys are tiny shrimp. That is crazy.

AARON: Until we get answers, tell me your life story in whatever way you want. It doesn’t have to be like. I mean, hopefully not. Don’t straight up lie, but wherever you want to take that.

SARAH: I’m not going to lie. I’m just trying to think of ways to make it spicier because it’s so average. I don’t know what to say about it.

AARON: Well, it’s probably not that average, right? I mean, it might be average among people you happen to know.

SARAH: Do you have any more specific questions?

AARON: Okay, no. Yeah, hold on. I have a meta point, which is, I think some people have a thing on the top of their mind, and if I give any sort of open-ended question whatsoever, they’ll take it there and immediately just start slinging hot takes. But then other people (I think this category is very not-EA; people like my sister) are like, “No, I have nothing to talk about.” I don’t believe that, but they’re not, I guess, as comfortable.

SARAH: No, I mean, I have. Something needs to trigger them in me. Do you know what I mean? Yeah, I need an in.

AARON: Well, okay, here’s one. Maybe I’ll cut this; it’s kind of, like, narcissistic, I don’t know. But is there anything you want to ask, or are curious to ask? This does sound kind of weird. I don’t know. But we can cut it if need be.

SARAH: What does the looking glass in your Twitter name mean? Because I’ve seen a bunch of people have this, and I actually don’t know what it means.

AARON: People ask this. I respond to a tweet asking what it means at least, I don’t know, once every month or two. It’s basically from Spencer Greenberg. I don’t know if you’re familiar with him. He’s, like, a sort of.

SARAH: I know of him.

AARON: He literally just tweeted, like, a couple years ago: put this in your bio to show that you really care about finding the truth or whatever and are interested in good-faith conversations. Are you familiar with The Scout Mindset?

SARAH: Yeah.

AARON: Julia Galef. Yeah. That’s basically, like the short version.

SARAH: Okay.

AARON: I’m like, yeah, all right. And there’s at least three of us who have both a magnifying glass and a pause emoji, which is, like, my tightest-knit online community, I guess.

SARAH: I think I’ve followed all the pause people now. I just searched the emoji on Twitter, and I just followed everyone. Now I can’t find them all. And I also noticed when I was doing this that some people, if they’ve suspended their account or they’re taking time off, put a pause in their thing. So I was looking and I was like, oh, these are, like, AI people. But then in their bio they were, like, “not tweeting until X date,” “this is a suspended account.” And I was like, I see we have a messaging problem here. Nice. I don’t know how common that actually was.

AARON: I’m glad. That was, like, a very straightforward question. Educated the masses. Max Alexander said Glee. Is that, like, the show? You can also keep asking me questions, but again, this is like.

SARAH: Wait, what did he say? Is that it? Did he just say glee? No.

AARON: Not even a question mark. Just the word glee.

SARAH: Oh, right. He just wants me to go off about Glee.

AARON: Okay. Go off about it. Wait, what kind of Glee are we talking about? This is, like, a show or a movie or something, vaguely.

SARAH: Oh, my God. Have you not seen it?

AARON: No. I mean, I vaguely remember, I think, watching some TV, but maybe, like, twelve years ago or something. I don’t know.

SARAH: I think it stopped airing in, like, maybe 2015?

AARON: 2016. So go off about it.

SARAH: I don’t know what to say about this.

AARON: Well, why does Max think you might have a take about Glee?

SARAH: I mean, I don’t have a take about. Just see the thing. See? No, not even, like, I am just transparently extremely lame. And I really like cheesy. I’m like. I’m like a musical theater kid. Not even ironically. I just like show tunes. And Glee is just a show about a glee club at a high school where they sing show tunes and there’s, like, petty drama, and people burst into song in the hallways, and I just think it’s just the most glorious thing on Earth. That’s it. There are no hot takes.

AARON: Okay, well, that’s cool. I don’t have a lot to say, unfortunately, but.

SARAH: No, that’s totally fine. I feel like this is not a spicy topic for us to discuss. It’s just a good time.

AARON: Yeah.

SARAH: Wait.

AARON: Okay. Yeah. So I do listen to Hamilton on Spotify.

SARAH: Okay.

AARON: Yeah, that’s about it.

SARAH: I like Hamilton. I’ve seen it three times.

AARON: Oh, live? Or ever? Wow. Cool. Yeah, no, that’s okay. Well, what do people get right or wrong about theater kids?

SARAH: Oh, I don’t know. I think all the stereotypes are true.

AARON: I mean, that’s generally true, but usually it’s either over-moralized (there’s, like, a descriptive thing that’s true, but it’s over-moralized) or it’s just exaggerated.

SARAH: I mean, to put this in more context, I used to be in choir. I went every Sunday for twelve years. And then every summer we’d do a little summer school, and we’d go away and put on a production, a musical or something. So what have I been? I was in Guys and Dolls; I think I was just in the chorus for that. I was the reverend in Anything Goes, but he unfortunately gets kidnapped in, like, the first five minutes, so he’s not a big presence. Oh, I’ve been Tweedledum in Alice in Wonderland. I could go on, but right now, as I’m saying this, I’m looking at my notice board, and I have two playbills from when I went to Broadway in April, where I saw Funny Girl and Hadestown.

SARAH: I went to New York.

AARON: Oh, cool. Oh yeah. We can talk about when you’re moving to the United States, however.

SARAH: I’m not going to do that. Okay.

AARON: I know. I’m joking. I mean, I don’t know.

SARAH: I don’t think I’m going to do that. I don’t know. It just seems like you guys have got a lot going on over there. It seems like things aren’t quite right with you guys. Things aren’t quite right with us either.

AARON: No, I totally get this. I think it would be cool. But also I completely relate to not wanting to. I’ve lived within 10 miles (not even 10 miles, 8 miles) of one location my entire life. Obviously I’ve gone outside of that, but still.

SARAH: You’ve just always lived in DC.

AARON: Yeah, either in DC or, sorry, right now in Maryland, which is right next to DC on the Metro, or at Georgetown University, which is in DC. Trying to think: would I move to the UK? I could imagine situations that would make me move to the UK, but it would still be kind of annoying.

SARAH: Yeah, I mean, I guess it’s like they’re two very similar places, but there are all these little cultural things which I feel like kind of trip you up.

AARON: I don’t know. Do you want to say what they are?

SARAH: I just, I don’t know. I don’t have that much experience because I’ve only been to America twice. But people seem a lot more sincere, in a way that you don’t really get here. Here, people are never really being upfront. And in America, I just got the impression that people have less of a veneer up, which is probably a good thing. But it’s really hard to navigate if you’re not used to it or something. I don’t know how to describe it.

AARON: Yeah, I’ve definitely heard this at least. And yeah, I think it’s for better and for worse.

SARAH: Yeah, I think it’s generally a good thing.

AARON: Yeah.

SARAH: But it’s like there’s this layer of cynicism or irony or something that is removed, and then when it’s not there, everything just feels weird. I can’t describe it.

AARON: This is definitely, I think, also an EA rationalist thing. I feel like I’m pretty far along the spectrum towards: social niceties are fine, but, I don’t know, don’t obscure what you really think unless there’s a really good reason to, or something. But it can definitely come across as rude.

SARAH: Yeah. No, I think it’s actually a good rule of thumb to try not to obscure what you think most of the time, probably. I don’t know. But I would love to go over temporarily, for like six months or something, and just hang out for a bit. I think that’d be fun. I don’t know if I would go back to New York again. Maybe. I like the bagels there.

AARON: I should have a place. Oh yeah, remember, I think we talked at some point (we can cut this out if either of us doesn’t want it in), but we discussed that I should be getting a place you could stay at. I emailed the landlord like an hour before this. Hopefully (probably more than 50%) that is still an offer. Probably not for all six months, but I don’t know.

SARAH: I would not come and sleep on your sofa for six months. That would be definitely impolite and very weird.

AARON: Yeah. I mean, my roommates would probably grumble.

SARAH: Yeah. They would be like.

AARON: Although, I don’t know. Who knows? I wouldn’t be shocked if people were actually fine with it. Anyway, somebody asked a question. This is what he said: he might be interested in hearing how different backgrounds. Wait, sorry, this is not good grammar. Let me try to parse this: how does not having a super hardcore EA/AI-rationalist background shape how you think, or how you view AI and rationality?

SARAH: Oh, that’s a good question. I think it’s more happening the other way around, the more I hang around in these circles. You guys are impacting how I think.

AARON: It’s definitely true for me as well.

SARAH: Seeping into my brain and my language as well. I’ve started talking differently. I don’t know. That’s a good question, though. Yeah. One thing that I will say is that there are certain things that I find irritating about the EA style of doing things. One specific thing is, I don’t know, the kind of hand-wringing about everything. And I know that this is kind of the point, right? But it’s like, you know, when someone wants to take a stance on something, but then whenever they want to take a stance on something, they feel the need to write, like, a 10,000-word blog post where they’re thinking about the second- and third- and fifth-order effects of this thing. And maybe this thing that seems good is actually bad for this really convoluted reason. That’s just so annoying.

AARON: Yeah.

SARAH: I also understand that maybe that is a good thing to do sometimes, but I don’t know how anyone ever gets anywhere. It seems like everyone must be paralyzed by indecision all the time because they can’t commit to ever actually just saying anything.

AARON: I think this kind of thing is really good if you’re trying to give away a billion dollars. Oh yes, I do want the billion-dollar grantor to be thinking through the second- and third-order effects of how they give away their billion dollars. But also, no. I am super (the word’s on the tip of my tongue) not overwhelmed but intimidated when I go on the EA forum, because none of the posts are, like, normal five-paragraph essays. One of them, which I looked up for fun because I was going to make a meme about it (and still will, probably), was like 30,000 words or something. And even the short-form posts, which really gets me kind of, not even annoyed: the short-form posts, which are sort of the EA forum version of Twitter, are way too high quality, way too intimidating. So maybe I should just suck it up and post stuff anyway more often. It just feels weird. I totally agree.

SARAH: I was also talking to someone recently about how I lurked on the EA forum and LessWrong for months and months, and I couldn’t figure out the upvoting system. I was like, am I being stupid, or why are there four buttons? Eventually I had to ask someone because I couldn’t figure it out. And then he explained it to me and I was like, that is just so unnecessary.

AARON: No, I do know what you mean.

SARAH: I just think it’s annoying. It pisses me off. I just feel like sometimes you don’t need to add more things. Sometimes less is good. Yeah, that’s my hot take. Nice things.

AARON: Yeah, that’s interesting.

SARAH: But actually, a thing that I like that EAs do is the constant hedging and caveating. I do find it kind of adorable. I love that, because you’re having to constantly acknowledge that you probably didn’t quite articulate what you really meant, and that you’re not quite making contact with reality when you’re talking, so you have to clarify that you were probably imprecise when you said this thing. It’s unnecessary, but it’s kind of amazing.

AARON: No, definitely. I am super guilty of this; I’ll give an example in a second. I think I’ve been basically trained to try pretty hard, even in normal conversation with anybody, to never say anything that’s literally wrong, or at least, if I do, to caveat it.

AARON: I was driving home; my parents and I had visited our grandparents and were driving back, and we drove past a cruise ship that was in a harbor. And my mom, who was driving at the time, said, “Oh, Aaron, can you see if there’s anyone on there?” And I immediately responded, “Well, there’s probably at least one person.” Obviously, that’s not what she meant. But that was my technical best guess. Yes, there probably are people on there, even though I couldn’t see anybody on the decks or in the rooms. There’s probably a maintenance guy. I felt kind of bad.

SARAH: You can’t technically exclude that there are, in fact, no people.

AARON: Then I corrected myself. But I guess I’ve been trained into giving that as my first reaction.

SARAH: Yeah, I love that. I think it’s a waste of words, but I find it delightful.

AARON: It does go too far. People should be more confident. I wish that, at least sometimes, people would say, “Epistemic status: want to bet?” or “I am definitely right about this.” Too rarely do we hear, “I’m actually pretty confident here.”

SARAH: Another thing is, people are too liberal with using probabilities. The meaning of saying there is an X percent chance of something happening is getting watered down by people constantly saying things like, “I would put 30% on this claim.” Obviously, there’s no rigorous method that’s gone into determining why it’s 30 and not 35. That’s a problem and people shouldn’t do that. But I kind of love it.

AARON: I can defend that. People are saying upfront, “This is my best guess, but there’s no rigorous methodology.” In some parts of society, giving a numeric probability is seen as implying it came from a rigorous model. But if you say, “This is my best guess, but it’s not formed from anything rigorous,” people should take your word for that and not refuse to accept it at face value.

SARAH: But why do you have to put a number on it?

AARON: It depends on what you’re talking about. Sometimes probabilities are relevant, and if you don’t use numbers, it’s easy to misinterpret. People would say, “It seems quite likely,” but what does that mean? One person might think “quite likely” means 70%, another thinks it means 30%. Even though it’s weird to use a single number, it’s less confusing.

SARAH: To be fair, I get that. I’ve disagreed with people about what the word “unlikely” means. Someone’s pulled out a scale that the government or intelligence services use to determine what “unlikely” means. But everyone interprets those words differently. I see what you’re saying. But then again, I think people in AI safety talking about p(doom) makes people take us less seriously, especially because people’s probabilities are so vibey.

AARON: Some people are, but I take Paul Christiano’s word seriously.

SARAH: He’s a 50/50 kind of guy.

AARON: Yeah, I take that pretty seriously.
Obviously, it’s not as simple as him having a perfect understanding of the world, even after another 10,000 hours of investigation. But it’s definitely not just vibes, either.

SARAH: No, I came off wrong there. I don’t mean that everyone’s understanding is just vibes.

AARON: Yeah.

SARAH: If you were looking at it from the outside, it would be really difficult to distinguish between the ones that are vibes and the ones that are rigorous, unless you carefully parsed all of it and evaluated everyone’s background, or looked at the model yourself. If you’re one step removed, it looks like people just spitting out random, arbitrary numbers everywhere.

AARON: Yeah. There’s also the question of whether p(doom) is too weird or silly, or if it could be easily dismissed as such.

SARAH: Exactly, the moment anyone unfamiliar with this discussion sees it, they’re almost definitely going to dismiss it. They won’t see it as something they need to engage with.

AARON: That’s a very fair point. Aside from the social aspect, it’s also a large oversimplification. There’s a spectrum of outcomes that we lump into doom and not doom. While this binary approach can be useful at times, it’s probably overdone.

SARAH: Yeah, because when some people say doom, they mean everyone dies, while others mean everyone dies plus everything is terrible. And no one specifies what they mean. It is silly. But, I also find it kind of funny and I kind of love it.

AARON: I’m glad there’s something like that. So it’s not perfect. The more straightforward thing would be to say “the probability that existential risk from AI comes to pass.” That’s the long version, whatever.

SARAH: If I were in charge, I would probably make people stop using p(doom). I think it’s better to say it the long way around. But obviously I’m not in charge, and I think it’s funny and kind of cute, so I’ll keep using it.

AARON: Maybe I’m willing to go along and try to start a new norm. Not spend my whole life on it, but say, I think this is bad for X, Y, and Z reasons. I’ll use this other phrase instead and clarify when people ask.

SARAH: You’re going to need Twitter premium because you’re going to need a lot more characters.

AARON: I think there’s a shorthand, which is like p(x-risk) or p(AI x-risk).

SARAH: Maybe it’s just the word doom that’s a bit stupid.

AARON: Yeah, that’s a term out of the Bay Area rationalists.

SARAH: But then I also think it kind of makes the whole thing seem less serious. People should be indignant to hear that this meme is being used to trade probabilities about the likelihood that they’re going to die and their families are going to die. This has been an in-joke in this weird niche circle for years and they didn’t know about it. I’m not saying that to morally condemn people, but if you explain this to people…
People just go to dinner parties in Silicon Valley and talk about this weird meme thing, and what they really mean is the odds that everyone’s going to prematurely die. People should be outraged by that, I think.

AARON: I disagree that it’s a joke. It is a funny phrase, but the actual thing is people really do stand by their belief.

SARAH: No, I totally agree with that part. I’m not saying that people are not being serious when they give their numbers, but I feel like there’s something. I don’t know how to put this in words. There’s something outrageous about the fact that for outsiders, this conversation has been happening for years and people have been using this tongue-in-cheek phrase to describe it, and 99.9% of people don’t know that’s happening. I’m not articulating this very well.

AARON: I see what you’re saying. I don’t actually think it’s like. I don’t know a lot of jargon.

SARAH: But when I first found out about this, I was outraged.

AARON: I honestly just don’t share that intuition. But that’s really good.

SARAH: No, I don’t know how to describe this.

AARON: I think I was just a little bit indignant, perhaps.

SARAH: Yeah, I was indignant about it. I was like, you guys have been at social events making small talk by discussing the probability of human extinction all this time, and I didn’t even know. I was like, oh, that’s really messed up, guys.

AARON: I feel like I’m standing by the rationalists here, because it was always out in the open. No one was stopping you from going on LessWrong or whatever. It wasn’t behind closed doors.

SARAH: Yeah, but no one ever told me about it.

AARON: Yeah, that’s like a failure of outreach, I suppose.

SARAH: Yeah. I think maybe I’m talking more about… Maybe the people that I’m mad at are the people who are actually working on capabilities and using this kind of jargon. Maybe I’m mad at those people. They’re fine.

AARON: Do we have more questions? I think we might have more questions. We have one more. Okay, sorry, but keep going.

SARAH: No, I’m going to stop making that point now because I don’t really know what I’m trying to say and I don’t want to be controversial.

AARON: Controversy is good for views. Not necessarily for you. No, thank you for that. Yes, that was a good point. I think it was. Maybe it was wrong. I think it seems right.

SARAH: It was probably wrong.

Shrimp Welfare: A Serious Discussion

AARON: I don’t know what she thinks about shrimp welfare. Oh, yeah. I think it’s a general question, but let’s start with that. What do you think about shrimp? Well, today.

SARAH: Okay. Is this an actual cause area or is this a joke about how if you extrapolate utilitarianism to its natural conclusion, you would really care about shrimp?

AARON: No, there’s a charity called the Shrimp Welfare Initiative or Project; I think it’s the Shrimp Welfare Project. I can actually have a rant here about how it’s a meme that people find amusing. It is a serious thing, but I think people like the meme more than they’re willing to shift their donations in light of it. That’s kind of wrong, and at least distasteful.

No, but it’s an actual thing; if you Google “Shrimp Welfare Project,” you’ll find it. It’s only a couple of years old. And it’s also kind of a meme because it works both ways: it shows how we’re weird, but in the sense that we’re willing to care about things that are very different from us. Not like we’re threatening other people. That’s not a good description.

SARAH: Is the extreme version of this position that we should put more resources into improving the lives of shrimp than into improving the lives of people just because there are so many more shrimp? Are there people that actually believe that?

AARON: Well, I believe some version of that, but it really depends on who the ‘we’ is there.

SARAH: Should humanity be putting more resources?

AARON: No one believes that as far as I know.

SARAH: Okay. Right. So what is the most extreme manifestation of the shrimp welfare position?

AARON: Well, I feel like my position is kind of extreme, and I’m happy to discuss it. It’s easier than speculating about what the more extreme ones are. I don’t think any of them are that extreme, I guess, from my perspective, because I think I’m right.

SARAH: Okay, so what do you believe?

AARON: I think that for most people who have already decided to donate, say, $20, if they’re considering where to donate it, it would be morally better if they gave it to the Shrimp Welfare Project than if they gave it to any of the commonly cited EA organizations.

SARAH: Malaria nets or whatever.

AARON: Yes. I think $20 of malaria nets versus $20 of shrimp. I can easily imagine a world where it would go the other way. But given the actual situation, the $20 of shrimp is much better.

SARAH: Okay. Is it just purely because there’s just more shrimp? How do we know how much shrimp suffering there is in the world?

AARON: No, this is an excellent question. The numbers are a key factor, but no, it’s not as simple. I definitely don’t think one shrimp is worth one human.

SARAH: I’m assuming it’s based on the fact that there are so many more shrimp than there are people. I don’t know how many shrimp there are.

AARON: Yeah, that’s important, but at some level, it’s just the margin. What I think is that when you’re donating money, you should give to wherever it does the most good, whatever you think that means. But let’s just leave it at that: the most good, morally best, at the margin. That means you’re not donating according to how you think the world should expend its trillion-dollar wealth. All you’re doing is adding $20 at the current level, given the actual world. And so part of it is what you just said, and also some new research from Rethink Priorities.
Measuring suffering in reasonable ranges is extremely hard to do, but I believe it’s difficult to do a better job than Rethink Priorities has done on that, given what I’ve seen. I can provide some links. There are a few things to consider here: the numbers, times the enormity of the suffering. And I think there are a couple of other key elements, including tractability.

Are you familiar with the three-pronged framework people sometimes discuss, which encompasses importance, tractability, and neglectedness?

SARAH: Okay.

AARON: Importance is essentially what we just mentioned. Huge numbers and plausible amounts of suffering. When you try to do the comparison, it seems like they’re a significant concern. Tractability is another factor. I think the best estimates suggest that a one-dollar donation could save around 10,000 shrimp from a very painful death.

SARAH: In that sense…

AARON: You could imagine that even if there were a hundred times more shrimp than there actually are, we have direct control over how they live and die because we’re farming them. The industry is not dominated by wealthy players in the United States. Many individual farmers in developing nations, if educated and provided with a more humane way of killing the shrimp, would use it. There’s a lot of potential for improvement here. This is partly due to the last prong, neglectedness, which is really my focus.

SARAH: You’re saying no one cares about the shrimp.

AARON: I’m frustrated that it’s not taken seriously enough. One of the reasons the marginal cost-effectiveness is so high is that large amounts of money already go to the established, widely approved organizations. But individual donors often overlook this; they ignore their marginal impact. If you want to see even a 1% shift towards shrimp welfare, the thing to do is to donate to shrimp welfare, not donate $19 to human welfare and $1 to shrimp welfare, which is perhaps what they think the overall portfolio should be.

SARAH: Interesting. I don’t have a good reason why you’re wrong. It seems like you’re probably right.

AARON: Let me put the website in the chat. This isn’t a fair comparison since it’s something I know more about.

SARAH: Okay.

AARON: On the topic of obesity, neither of us were more informed than the other. But I could have just made stuff up or said something logically fallacious.

SARAH: You could have told me that there were like 50 times the number of shrimp in the world than there really are. And I would have been like, sure, seems right.

AARON: Yeah. And I don’t know; if I were in your position, I would say, “Oh, yeah, that sounds right.” But maybe there are other people who have looked into this way more than me who disagree, and I can get into why I think it’s less true than you’d expect in some sense.

SARAH: I just wonder if there’s like… This is a deeply non-EA thing to say, so I don’t know, maybe I shouldn’t say it, but are there not any moral reasons? Is there not any good moral philosophy behind just caring more about your own species than other species? Sorry, but that’s probably not right, is it? There’s probably no way to actually morally justify that, but it feels intuitively wrong. If you’ve got $20, donating 19 of them to shrimp and one to children with malaria feels like there should be something wrong with it, but I can’t tell you what it is.

AARON: Yeah, no, there is something wrong with that, which is that you should donate all 20 to shrimp, because you’re acting on the margin, for one thing. I do think caring more about your own species doesn’t check out morally, but basically me and everybody I know in real life do just care way more about humans. It’s hard to formalize or specify what you mean by “caring about” something. But I think you can basically just be a normal human who cares a lot about other humans, and that’s not negated by changing where your $20 donation goes, especially because there’s nothing else that I do for shrimp. I think you should be a kind person, an honest person. People should be nice to other humans, nice in the sense of not beating them. But if you see a pigeon on the street, you don’t need to say hi or give it a pet. But yeah, you should be basically nice.

SARAH: You don’t stop to say hi to every pigeon that you see on the way to anywhere.

AARON: I do, but I know most normal people don’t.

SARAH: This is why I’m so late to everything, because I have to do it. I have to stop for every single one. No exceptions.

AARON: Yeah. Or how I think about it is a little bit of compartmentalization, which is just a way to function normally and also do what you think really checks out at the end of the day. Like, okay, 99% of the time I’m going to just be a normal person who doesn’t care about shrimp. Maybe I’ll refrain from eating them, though even that I could see a person skipping; someone could still eat them and then do this. But during the 1% of the time when you’re deciding how to give money away, the beneficiaries are going to be totally out of sight either way (this is a neutral point, I guess, but it’s still worth saying), and then you can be a hardcore effective altruist or whatever and give your money to the shrimp people.

SARAH: Do you have this set up as like a recurring donation?

AARON: Oh, no. Everybody should call me out as a hypocrite because I haven’t donated much money, but I’m trying to figure it out, actually, given that I haven’t had a stable income ever. Maybe, hopefully, I will soon, but even then it’s still a part-time thing. I haven’t been able to do the standard 10%-or-more thing, and I’m trying to figure out the best thing to do, or how to balance it against consumption. To some extent, maybe I’m just selfish by sometimes getting an Uber; that’s totally true. I think I’m just a hypocrite in that respect. But mostly I think the trade-off is between saving, investing, and giving, based on the money that I have saved up from past things. So this is all sort of a defense of why I don’t have a recurring donation going on.

SARAH: I’m not asking you to defend yourself because I do not do that either.

AARON: I think if I was making enough money that I could give away $10,000 a year and plan on doing that indefinitely, I would be unlikely to set up a recurring donation. What I would really want to do is once or twice a year, really try to prioritize deciding on how to give it away rather than making it the default. This has a real cost for charities. If you set up a recurring donation, they have more certainty in some sense of their future cash flow. But that’s only good to do if you’re really confident that you’re going to want to keep giving there in the future. I could learn new information that says something else is better. So I don’t think I would do that.

SARAH: Now I’m just thinking about how many shrimp did you say it was per dollar?

AARON: Don’t quote me. I didn’t say an actual thing.

SARAH: It was like some big number. Right. Because I just feel like that’s such a brainworm. Imagine if you let that actually get in your head and then every time you spend some unnecessary amount of money on something you don’t really need, you think about how many shrimp you just killed by getting an Uber or buying lunch out. That is so stressful. I think I’m going to try not to think about that.

AARON: I don’t mean to belittle this. This is like a core, new-to-EA type of thinking. It’s super natural and also troubling when you first come upon it. Do you want me to talk about how I, or other people, deal with that or take action?

SARAH: Yeah, tell me how to get the shrimp off my conscience.

AARON: Well, for one thing, you don’t want to totally do that. But I think the main thing is that the salience of things like this just decreases over time. I would be very surprised if, even if you’re still very engaged in the EA adjacent communities or EA itself in five years, that it would be as emotionally potent. Brains make things less important over time. But I think the thing to do is basically to compartmentalize in a sort of weird sense. Decide how much you’re willing to donate. And it might be hard to do that, but that is sort of a process. Then you have that chunk of money and you try to give it away the best you can under whatever you think the best ethics are. But then on the daily, you have this other set pot of money. You just are a normal person. You spend it as you wish. You don’t think about it unless you try not to. And maybe if you notice that you might even have leftover money, then you can donate the rest of it. But I really do think picking how much to give should sort of be its own project. And then you have a pile of money you can be a hardcore EA about.

SARAH: So you pick a cut off point and then you don’t agonize over anything over and above that.

AARON: Yeah. And then people, I mean, the hard part is that if somebody says their cut off point is like 1% of their income and they’re making like $200,000, I don’t know. Maybe their cut off point should be higher. So there is a debate. It depends on that person’s specific situation. Maybe if they have a kid or some super expensive disease, it’s a different story. If you’re just a random guy making $200,000, I think you should give more.

SARAH: Maybe you should be giving away enough to feel the pinch. Well, not even that. I don’t think I’m going to do that. This is something that I do actually want to do at some point, but I need to think about it more and maybe get a better job.

AARON: Another thing is, if you’re wanting to earn to give as a path to impact, you could think and strive pretty hard. Maybe talk to people and choose your education or professional development opportunities carefully to see if you can get a better-paying job. That’s just much more important than changing how much you give from 10% to 11% or something. You should have this macro-level optimization: how can I have more money to give? It depends what life stage you’re at, but if you had just graduated college, or say you’re a junior in college, it could make sense to spend a good amount of time figuring out what that path might look like.

AARON: I’m a huge hypocrite because I definitely haven’t done all this nearly as much as I should, but I still endorse it.

SARAH: Yeah, I think it’s fine to say what you endorse doing in an ideal world, even if you’re not doing that, that’s fine.

AARON: For anybody listening: I tweeted a while ago asking if anyone has resources on how to think about giving away wealth. I’m not very wealthy but have some amount of savings, more than I really need. At the same time, maybe I should be investing it, because EA orgs feel like they can’t invest it themselves; there’s potentially a lot of blowback if they make poor investments, even though it would be higher expected value.

There’s also the question of, okay, having some amount of savings allows me to take somewhat higher-risk but higher-value opportunities, because I have a cushion. But I’m very confused about what I should do here. People should DM me on Twitter or anywhere if they have ideas.

SARAH: I think you should calculate how much you need to cover your very basic needs. Maybe you should work out, say, if you were working 40 hours a week in a minimum wage job, like how much would you make then? And then you should keep that for yourself. And then the rest should definitely all go to the shrimp. Every single penny. All of it.

AARON: This is pretty plausible. Just to make it more complicated, my best guesses about the best charities to give to have changed over time. So there are two competing forces. One is that I might get wiser and more knowledgeable as time goes on. The other is that, in general, giving now is better than giving later, all else equal, mainly because the charities don’t know that you’re going to give later.

AARON: So it’s like they can plan for the future much better if they get money now. And also there are just higher-leverage, higher-value-per-dollar opportunities now than there will be later, for a couple of reasons I don’t really need to get into. This is what makes it really complicated. So I’ve donated in the past to places that I don’t think, even at the time, were the best options. So then there’s the question of, okay, how long do I save this money? Do I sit on it for months until I’m pretty confident? A year?

AARON: I do think that probably, over the course of zero to five years or something, becoming more confident or changing your mind is the stronger effect than how much better it is for the charities to get the money now instead of later. But also that’s weird, because you’re never committing at all.
You might intend to give it away, and maybe you won’t. Maybe at that time you’re like, “Oh, I want a car, a house, whatever.” It’s less salient or something. Maybe something bad happened with EA and you no longer identify that way. Yeah, there’s a lot of really thorny considerations. Sorry, I’m talking way too much.

SARAH: Hang on, are you factoring AI timelines into this?

AARON: That makes it even more sketchy. But that could also go both ways. On one hand, if you don’t give away your money now and you die with it, it’s never going to do any good. On the other hand, especially high-leverage opportunities might come in the future. I can imagine something like: Open Phil needs as much money as it can get to do X, Y, and Z, and it’s really important right now, but I won’t know that until a few years down the line. So, just like everything else, it doesn’t neatly wash out.

SARAH: What do you think the AGI is going to do to the shrimp? I reckon it’s probably pretty neat, like one shrimp per paperclip. Maybe you could get more. I wonder what the sort of shrimp to paperclip conversion rate is.

AARON: Has anyone looked into that morally? I think like one to zero. I don’t think in terms of money. You could definitely price that. I have no idea.

SARAH: I don’t know. Maybe I’m not taking this as seriously as I should be because I’m.

AARON: No, I mean, humor is good. When people are giving away money or deciding what to do, they should be serious. But joking and humor is good. Sorry, go ahead.

SARAH: No, you go ahead.

AARON: I had a half-baked idea. At EA Global, they should have a comedy show where people roast everybody, but it’s a fundraiser. You have to pay to get 100 people to attend. They have a bidding contest to get into the comedy show. That was my original idea. Or they could just have a normal comedy show. I think that’d be cool.

SARAH: Actually, I think that’s a good idea because you guys are funny. There is a lot of wit on this side of Twitter. I’m impressed.

AARON: I agree.

SARAH: So I think that’s a very good idea.

AARON: Okay. Dear Events team: hire Aaron Bergman, professional comedian.

SARAH: You can just give them your Twitter as a source for how funny you are, and that clearly qualifies you to set this up. I love it.

AARON: This is not important or related to anything, but I used to be a good juggler for entertainment purposes. I have this video. Maybe I should make sure the world can see it. It’s like a talent show. So maybe I can do that instead.

SARAH: Juggling. You definitely should make sure the world has access to this footage.

AARON: It had more views than I expected. It wasn’t five views. It was 90 or something, which is still nothing.

SARAH: I can tell you a secret right now if you want. That relates to Max asking in the chat about Glee.


SARAH: We’ll also have to edit this bit out, but my having a public meltdown over AI was the second time that I’ve ever blown up on the Internet. The first time being… I can’t believe I’m telling you this. I think I’m delirious right now. Were you ever in any fandoms as a teenager?


SARAH: Okay. Were you ever on Tumblr?

AARON: No. I sort of know what the cultural vibes were; I sort of know what you’re referring to. There were people who were into Harry Potter stuff and bands, K-pop, stuff like that.

SARAH: So people would make these fan videos where they’d take clips from TV shows and then edit them together to music. Sometimes people would edit the clips to make it look like something had happened in the plot of the show that hadn’t actually happened. For example, say, what if X character had died? And then you edit the clips together to try and make it look like they’ve died, and you put a sad song, “How to Save a Life” by The Fray or something, over the top. And then you put it on YouTube.

AARON: Sorry, tell me what I should search, or just send the link here. I’m sending my link.

SARAH: Oh, no, this doesn’t exist anymore. It does not exist anymore. Right? So, say if you’re, like, eleven or twelve years old and you do this, and you don’t even have a mechanism to download videos because you don’t know how to do technology. Instead, you take your little iPod touch and you just play a YouTube video on your screen, and you literally just film the screen with your iPod touch, and that’s how you’re getting the clips. It’s kind of shaky because you’re holding the camera anyway.

SARAH: Then you edit it together in the iMovie app on your iPod touch, and then you put it on the Internet, and then you just forget about it. You forget about it. Two years later, you’re like, oh, I wonder what happened to that YouTube account? And you log in, and this little video that you’ve made, with edited clips that you’ve filmed off the screen, set to “How to Save a Life” by The Fray, with clips from Glee in it, has nearly half a million views.

AARON: Nice. Love it.

SARAH: Embarrassing because this is like, two years later. And then all the comments were like, oh, my God, this was so moving. This made me cry. And then obviously, some of them were hating and being like, do you not even know how to download video clips? Like, what? And then you’re so embarrassed.

AARON: I can totally see it. Creative, and honestly a reasonable solution. Yeah.

SARAH: So that’s my story of how I went viral when I was like, twelve.

AARON: It must have been kind of overwhelming.

SARAH: Yeah, it was a bit. And you can tell what time it is for me; it’s like twenty to eleven at night, and now I’m starting to really go off on one and talk about weird things.

AARON: It’s been like an hour, so, yeah, we can wrap up. And I always say this, but it’s actually true: it’s a low bar, low stakes, to do another one of these recordings sometime.

SARAH: Yeah, probably. We’ll have to get rid of the part about how I went viral on YouTube when I was twelve. I’ll sleep on that.

AARON: Don’t worry. I’ll send the transcription at some point soon.

SARAH: Yeah, cool.

AARON: Okay, lovely. Thank you for staying up late into the night for this.

SARAH: It’s not that late into the night. I’m just like, lame and go to bed early.

AARON: Okay, cool. Yeah, I know. Yeah, for sure. All right, bye.
