(Okay, one final side note: I myself have recently decided to stop following Noah Smith precisely because of this. He has shown himself to be "not consistently candid" and I've just reclassified him as untrustworthy; but I am sure he thinks, probably rightly, that he can convey his core ideas and beliefs and achieve his goals by being "not consistently candid".)
I'm curious, in what areas do you think he's not candid? I stopped following him as well, but more because I feel like an increasing proportion of his takes are half-baked.
Easier to say in which areas he is candid, but two recent examples I felt were dishonest: his AI doom take, and his denying that he ever supported DEI policies, then blocking people once he was shown retweets of his doing precisely that.
I don’t think Kelsey Piper “F’d up”. You can always feel regret for something after you know more information, even if what you did with the information you had was the responsible thing to do.
Even from the perspective of hindsight, I'd argue she comes out looking great. How insane and chaotic did the information environment become just weeks after she wrote those tweets? How quickly did people get fed up with the lockdowns in many places? If she had started out as a maverick who didn't defer to public health authorities at all, would she have been able to credibly criticize their mistakes later, as someone who doesn't break from their guidance lightly?
That was a really long (albeit interesting!) post, Aaron. I have mixed feelings about your thesis. I'll try to order them in this comment.
At the core, I feel you *are* being a bit unfair here. I'd model the situation thus: public intellectuals have to create and cultivate what I'd call 'cultural capital', which mainly works like other types of capital. This means they need to turn themselves into a reliable, credible, relatively stable brand. To the degree this is necessary, they have to negotiate with external realities that go beyond 'this is just what I believe and consider true, man'. They need to reach and ideally expand a constituency; they need to sell them something intellectually valuable; they need not to alienate their current and possible constituencies too much, and so on. As with everything else in life, this is a matter of trade-offs.
A big part of this is choosing what wars to fight, what arguments to present, and when and for whom to do it. This doesn't have to be intellectually dishonest, but it does cross into what truth-seeking and truth-saying fundamentalists (like us) can consider a gray zone. I'd say one has an obligation to seek the truth, and if one is a public intellectual, one probably has some obligation not to lie or deceive, or deal dishonestly with one's readers. But deciding to withhold some of the things you believe because you're uncertain of them, because they are not central to you or your discourse, or because you just feel they won't land well and will end up being counterproductive to your whole line of work and persuasion (a.k.a. building your reputation and cultural capital) seems like... okay? It feels like you're implicitly assuming a world in which EA and LW culture norms are writ large, but truth be told, the epistemic norms of that community, while feeling quite optimal to the likes of me, can be extremely weird and self-defeating if you're outside of that pale. Take MacAskill: I am pretty sure a lot of people (including a non-trivial number of EAs!) are likely already pissed off by his longtermist and (more recently) strong AI shift. As Peter Wildeford once wisely said, "You have a set amount of 'weirdness points'. Spend them wisely".
I think your POV is essentially the default mode most PIs are operating under (so you're in good company) - or at least highly compatible with it.
> This means they need to turn themselves into a reliable, credible, relatively stable brand. To the degree this is necessary, they have to negotiate with external realities that go beyond 'this is just what I believe and consider true, man'. They need to reach and ideally expand a constituency; they need to sell them something intellectually valuable; they need not to alienate their current and possible constituencies too much, and so on. As with everything else in life, this is a matter of trade-offs
Lol I was the first one to carelessly use the word "need" (in the title) so I can't really object, but they only "need" to do these things insofar as they need [money/respect/status/stability/???]
Quite possibly I *was* anchoring too hard on relatively big names with >=stable careers (afaict) and so am not being fair to those for whom there's a legit strong tradeoff between e.g. financial security and candidness. This is very fair. I guess I would emphasize the "virtue" part and say like "well how close are you really to the Pareto frontier of candidness and other things you value?"
I'd be even more sympathetic to eg the Piper case (I'm starting out pretty sympathetic tbc) if she was like straight out of college with minimal credibility to burn if wrong in an unpopular way
Well said. A point I would add about Kelsey Piper: she had no special insight into COVID becoming a large pandemic. She has sounded the alarm about various outbreaks that did not escalate into full-scale pandemics.
Her contribution was not that she saw a COVID pandemic as more likely than anyone else did. Every journalist looked at the data and said, "there's a 1% chance this escalates into a worldwide pandemic." (This is a made-up number; the exact number is not important to my point.) The difference is that other journalists took that as an invitation to write articles saying "worrying about the pandemic is racist," while Kelsey took it as an opportunity to write an article saying "worrying about the pandemic isn't irrational." She didn't think an outbreak was more likely; she was just more willing to reason under uncertainty.
Could she have been more strident, and written an article predicting doom? Yes, absolutely. But the modern information ecosystem would punish her if she turned out to be wrong in the other 99% of cases where the outbreak peters out. I'm inclined to give her a lot of slack for widening the Overton window to allow people to react sooner, even if she didn't burn her social capital to do so.
(Sorry, the window was doing weird stuff, so I had to send the post incomplete. I'll go on...)
As to candidness, do you really think most of the audience doesn't suspect that intellectuals don't mention *all the truth*? My base assumption is that they do, and that the disclosures you mention would 1) not really add much, and 2) likely contribute to undermining the intellectual's stance and brand. Again, it feels like you're seeing everything too much through EA-LW-tinted glasses. Intellectuals *do not* get bonus points among the public for making and recognizing mistakes. For a couple of recent examples, just look at how people like Noah Smith or Ezra Klein have 'unpersoned' some of their past opinions they now consider wrong.
As for 'Help me, Obi-Wan, you're my only hope': I mean... people will either be the type to consult many different sources on an issue, or the type that don't. I think you're placing an excessively heavy burden on *some* intellectuals and public speakers. And once again, "You have a set amount of 'weirdness points'. Spend them wisely". I think you vastly underestimate how intellectually poisonous it is to present people with some belief x that so weighs down how a person is perceived that all their intellectual arguments get filtered through 'this weirdo has a horrid and obviously stupid/wrong belief x, so it is not worth my time and thought to engage with anything else they say or to give them any credibility'.
I think I'll leave it here. Long enough rant already! xD
Good post! I think people should be more aware that in books, authors often put on a persona. This can be to adhere to what seem to be higher epistemic standards (as the Kelsey Piper case could be read; authors of academic papers do this very often as well), or to simplify things and avoid alienating the reader (as the Will MacAskill case seems to be).
So perhaps we need our literacy education to indicate that non-fiction books might not represent what the author ultimately thinks.
Which is a bit fucked up, because where can we actually learn what people think? Interviews? Not necessarily, if the author is polite and agreeable. Social media? Not necessarily, since they might be playing stuff up for an audience and there might be length restrictions.
So maybe this leads me down a road of skepticism where you can barely learn what people *actually* think. Which can sometimes be unfortunate.
(Though not always. Imagine Kelsey Piper had said in print that Covid was coming, but had been wrong. I think she was trying to anticipate and avoid the damage that could have caused to Vox.)
I am a bit more optimistic, because people are blabbermouths, and even if they are trying to Machiavellianly curate what they say, their true thoughts and beliefs can mostly be pulled out through extensive text analysis and close reading. You can also try to create truth-seeking, high-epistemics communities, but that's only really going to work for a finite subset of weirdos who train themselves into going against what is likely human nature (at least at this stage of our cultural and biological evolution). Humans respond to incentives, positive and negative (perhaps the deepest, most deceptively simple truth of economics). I can easily afford to be publicly committed to truth-saying not because of my superior ethics or backbone, but because it comes with little cost: 1) I'm a nobody and have little to no influence on decisions taken; 2) I have a job in which I am effectively unfireable; and 3) I write and post my thoughts in a language almost nobody in my face-to-face circle of concern will read anyway. Others don't enjoy such a luxury.
I actually think it's usually good for prominent public intellectuals not to spout off about all their speculations and intuitions that have weaker grounding. You can cite examples where their speculations were correct, but often they are not. Experts often have more speculative beliefs that are just as wild and implausible as the beliefs of randos, if you get them talking candidly. For example, I've known a few doctors who, by all accounts, were good doctors, but also believed in conspiracy theories and quasi-mystical phenomena. The outcome of airing these beliefs is either destroying their credibility, or misleading the audience into putting more credence in an idea than they should.
There are definitely cases where I think public intellectuals should air their unfiltered beliefs on a topic, but they should be very careful and thoughtful about when they do so. They should be pretty damn sure what they're about to say is worthwhile and likely to be correct.
(I also second everything Manuel says.)