Discussion about this post

Manuel del Rio:

(Okay, one final side note: I myself have recently decided to stop following Noah Smith precisely because of this. He has shown himself to be "not consistently candid," and I've reclassified him as untrustworthy; but I am sure he thinks, probably rightly, that he can convey his core ideas and beliefs and achieve his goals while being "not consistently candid.")

Manuel del Rio:

That was a really long (albeit interesting!) post, Aaron. I have mixed feelings about your thesis. I'll try to order them in this comment.

At the core, I feel you *are* being a bit unfair here. I'd model the situation thus: public intellectuals have to create and cultivate what I'd call 'cultural capital', which works much like other types of capital. This means they need to turn themselves into a reliable, credible, relatively stable brand. To the degree this is necessary, they have to negotiate with external realities that go beyond 'this is just what I believe and consider true, man'. They need to reach, and ideally expand, a constituency; they need to sell it something intellectually valuable; they need not to alienate their current and potential constituencies too much; and so on. Like everything else in life, this is a matter of trade-offs.

A big part of this is choosing which wars to fight, which arguments to present, and when and for whom to present them. This doesn't have to be intellectually dishonest, but it does cross into what truth-seeking and truth-saying fundamentalists (like us) might consider a gray zone. I'd say one has an obligation to seek the truth, and a public intellectual probably has some further obligation not to lie, deceive, or deal dishonestly with their readers. But deciding to withhold some of the things you believe because you're uncertain of them, because they're not central to you or your discourse, or because you just feel they won't land well and will end up being counterproductive to your whole line of work and persuasion (a.k.a. building your reputation and cultural capital) seems like... okay?

It feels like you're implicitly assuming a world in which EA and LW cultural norms are writ large, but truth be told, the epistemic norms of that community, while feeling quite optimal to the likes of me, can be extremely weird and self-defeating outside that pale. Take MacAskill: I am pretty sure a lot of people (including a non-trivial number of EAs!) are already pissed off by his longtermist and (more recently) strong AI shift. As Peter Wildeford once wisely said, "You have a set amount of 'weirdness points'. Spend them wisely."


