(Okay, one final side note: I myself have recently decided to stop following Noah Smith precisely because of this. He has shown himself to be "not consistently candid," and I've just reclassified him as untrustworthy; but I am sure he thinks, probably rightly, that he can convey his core ideas and beliefs and achieve his goals by being "not consistently candid".)
Well done. I detected his fundamental dishonesty—and, frankly, intellectual laziness—about a decade ago, when he still had his old pre-substack blog.
I’d block him on twitter, but sometimes I like to know the idiot take on issues.
I'm curious, in what areas do you think he's not candid? I stopped following him as well, but more because I feel like an increasing proportion of his takes are half-baked.
Easier to say in which areas he *is* candid, but two recent examples I felt were dishonest would be his AI doom take, and his denying he ever supported DEI policies and then blocking people once he was shown his own retweets doing precisely that.
Yeah, I stopped because his “medical costs are all evil providers’ fault and the insurance companies are the tragic victims” take was so half-baked I couldn’t take his judgment seriously anymore.
That was a really long (albeit interesting!) post, Aaron. I have mixed feelings about your thesis. I'll try to order them in this comment.
At the core, I feel you *are* being a bit unfair here. I'd model the situation thus: public intellectuals have to create and cultivate what I'd call 'cultural capital', which mainly works like other types of capital. This means they need to turn themselves into a reliable, credible, relatively stable brand. To the degree this is necessary, they have to negotiate with external realities that go beyond 'this is just what I believe and consider true, man'. They need to reach and ideally expand a constituency; they need to sell them something intellectually valuable; they need not to alienate their current and possible constituencies too much, and so on. As with everything else in life, this is a matter of trade-offs.
A big part of this is choosing what wars to fight, what arguments to present, and when and for whom to do it. This doesn't have to be intellectually dishonest, but it does cross into what truth-seeking and truth-saying fundamentalists (like us) can consider a gray zone. I'd say one has an obligation to seek the truth, and if one is a public intellectual, one probably has some obligation not to lie, deceive, or deal dishonestly with one's readers. But deciding to withhold some of the things you believe because you're uncertain of them, because they are not central to you or your discourse, or because you just feel they won't land well and will end up being counterproductive to your whole line of work and persuasion (a.k.a. building your reputation and cultural capital) seems like... okay? It feels like you're implicitly assuming a world in which EA and LW culture norms are writ large, but truth be told, the epistemic norms of that community, while feeling quite optimal to the likes of me, can be extremely weird and self-defeating if you're outside of that pale. Take MacAskill: I am pretty sure a lot of people (including a non-trivial number of EAs!) are likely already pissed off by his longtermist and (more recently) strong AI shift. As Peter Wildeford once wisely said, "You have a set amount of 'weirdness points'. Spend them wisely."
Well said. A point I would add about Kelsey Piper: she had no special insight that COVID would become a large pandemic. She has sounded the alarm about various outbreaks that did not escalate into full-scale pandemics.
Her contribution was not that she saw a COVID pandemic as more likely than anyone else did. Every journalist looked at the data and said, "there's a 1% chance this escalates into a worldwide pandemic." (This is a made-up number; the exact number is not important to my point.) The difference is that other journalists took that as an invitation to write articles saying, "worrying about the pandemic is racist." Kelsey took it as an opportunity to write an article saying, "worrying about the pandemic isn't irrational." She didn't think an outbreak was more likely; she was just more willing to reason under uncertainty.
Could she have been more strident, and written an article predicting doom? Yes, absolutely. But the modern information ecosystem would punish her if she turned out to be wrong in the other 99% of cases where the outbreak peters out. I'm inclined to give her a lot of slack for widening the Overton window to allow people to react sooner, even if she didn't burn her social capital to do so.
I think your POV is essentially the default mode most PIs are operating under (so you're in good company) - or at least highly compatible with it.
> This means they need to turn themselves into a reliable, credible, relatively stable brand. To the degree this is necessary, they have to negotiate with external realities that go beyond 'this is just what I believe and consider true, man'. They need to reach and ideally expand a constituency; they need to sell them something intellectually valuable; they need not to alienate their current and possible constituencies too much, and so on. As everything else in life, this is a matter of trade-offs
Lol I was the first one to carelessly use the word "need" (in the title) so I can't really object, but they only "need" to do these things insofar as they need [money/respect/status/stability/???]
Quite possibly I *was* anchoring too hard on relatively big names with >=stable careers (afaict) and so am not being fair to those for whom there's a legit strong tradeoff between e.g. financial security and candidness. This is very fair. I guess I would emphasize the "virtue" part and say like "well how close are you really to the Pareto frontier of candidness and other things you value?"
I'd be even more sympathetic to eg the Piper case (I'm starting out pretty sympathetic tbc) if she was like straight out of college with minimal credibility to burn if wrong in an unpopular way
I don’t think Kelsey Piper “F’d up”. You can always feel regret for something after you know more information, even if what you did with the information you had was the responsible thing to do.
Even from the perspective of hindsight I’d argue she comes out looking great. How insane and chaotic did the information environment become just weeks after she wrote those tweets? How quickly did people get fed up with the lockdowns in many places? If she started out as a maverick who didn’t defer to public health authorities at all would she have been able to credibly criticize their mistakes later as someone who doesn’t break from their guidance lightly?
I actually think it's usually good for prominent public intellectuals to not spout off about all their speculations and intuitions that have weaker grounding. You can cite examples where their speculations were correct, but often they are not. Experts often have more speculative beliefs that are just as wild and implausible as the beliefs of randos if you get them talking candidly. For example, I've known a few doctors who, by all accounts, were good doctors, but also believed in conspiracy theories and quasi-mystical phenomena. The outcome of airing these beliefs is either destroying their credibility, or misleading the audience to put more credence into an idea than they should.
There are definitely cases where I think public intellectuals should air their unfiltered beliefs on a topic, but they should be very careful and thoughtful about when they do so. They should be pretty damn sure what they're about to say is worthwhile and likely to be correct.
(I also second everything Manuel says.)
(Sorry, the window was doing weird stuff, so I had to send the post unfinished. I'll go on...)
As to candidness, do you really think most of the audience doesn't suspect that intellectuals don't tell *the whole truth*? My base assumption is that they do, and that the disclosures you mention would 1) not really add much, and 2) likely contribute to undermining the intellectual's stance and brand. Again, it feels like you're seeing everything too much through EA/LW-tinted glasses. Intellectuals *do not* get bonus points with the public for making and acknowledging mistakes. For a couple of recent examples, just look at how people like Noah Smith or Ezra Klein have 'unpersoned' some of their past opinions they now consider wrong.
As for 'Help me, Obi-Wan, you're my only hope': I mean... people will either be the type to consult many different sources on an issue, or the type that don't. I think you're placing an excessively heavy burden on *some* intellectuals and public speakers. And once again, "You have a set amount of 'weirdness points'. Spend them wisely." I think you vastly underestimate how intellectually poisonous it is for someone to hold some belief x that so weighs down how that person is perceived that all their arguments get filtered through 'this weirdo has a horrid and obviously stupid/wrong belief x, so it is not worth my time and thought to engage with anything else they say or to give them any credibility'.
I think I'll leave it here. Long enough rant already! xD
I think you're conflating two things here:
a) People not saying things that are relevant to a topic but which they're less certain about or don't think they can convince other people of (fine imo, this seems like a normal norm in both everyday conversation and public-facing writing)
b) People not saying things that are relevant to a topic b/c they want influence/popularity (seems bad)
Good post! I think people should be more aware that in books, authors often put on a persona. This can be to adhere to what seem to be higher epistemic standards (one reading of the Kelsey Piper case; authors of academic papers do this very often as well), or to simplify things and avoid alienating the reader (as the Will MacAskill case seems to be).
So perhaps we need our literacy education to indicate that non-fiction books might not represent what the author ultimately thinks.
Which is a bit fucked up, because where can we actually learn what people think? Interviews? Not necessarily, if the author is polite and agreeable. Social media? Not necessarily, since they might be playing stuff up for an audience and there might be length restrictions.
So maybe this leads me down a road of skepticism where you can barely learn what people *actually* think. Which can sometimes be unfortunate.
(Though not always, imagine Kelsey Piper had said that Covid was coming in print, but she had been wrong. I think she was trying to anticipate and avoid the damage that could have caused to Vox.)
I am a bit more optimistic, because people are blabbermouths, and even if they try to Machiavellianly curate what they say, their true thoughts and beliefs can mostly be pulled out through extensive text analysis and close reading. You can also try to create truth-seeking, high-epistemics communities, but that's only really going to work for a finite subset of weirdos who train themselves into going against what is likely human nature (at least at this stage of our cultural and biological evolution). Humans respond to incentives, positive and negative (perhaps the deepest, superficially-simple-but-really-profound truth of economics). I can easily afford to be publicly committed to truth-saying not because of my superior ethics or backbone, but because it comes with little cost: a) I'm a nobody and have little to no influence on decisions; b) I have a job in which I am effectively unfireable; and c) I write and post my thoughts in a language almost nobody in my face-to-face circle of concern will read anyway. Others don't enjoy such a luxury.
Regarding MacAskill, I think there's also a strategic miscalculation to be seen here: if you're presenting your ideas as novel already, people will be primed to find them implausible and default to more moderate and reasonable-sounding variants. Thus "abolish the police" becomes "defund the police" and "love your enemies" becomes "ordo amoris". I'd expect your average (uninitiated but interested) reader to hear about future multitudes of happy digital minds or what have you and default to supporting AI safety work, or to hear about wild animal suffering and default to supporting farmed animal welfare. Preempting skepticism by presenting the vanilla version first means your target audience will just default to an even more watered-down version of your proposal (say, from taking the GWWC Pledge to donating to charity once in a while).
I notice nowhere in this piece is a discussion of the fact that the Overton window is a thing: the % of total text devoted to your "real" argument is not the relevant number, since if your real argument is outside the Overton window, that will have dramatic effects on your dialogue. So this seems to fail the ideological Turing test for many actions in the world of policy.
I don't per se disagree with your final conclusion that there is virtue in saying what you actually believe (especially if you know more about a subject than your median reader), and that disclaiming which parts of what you are saying can be rigorously supported and which cannot is a good way to balance the tradeoff here. That being said, I think this is kind of a weak series of arguments on many levels and overlooks a lot.
To oversimplify here, I think there's essentially 4 broad goals someone might have when writing something or speaking publicly about something: information, interpretation, persuasion, and prediction. I think that, all else being equal, there's exactly one of these goals where it's reasonable, appropriate, and beneficial to want people to be fully candid: interpretation.
Information is essentially journalism. If your goal is to explain the facts as they are known and to only make recommendations which can be justified with a pretty objective level of evidence, I think it is very much not beneficial to mix in your opinions and intuitions as being commensurate with the facts. There's lots of reasons for this, but the most obvious one is that a less literate audience is unlikely to be able to cleanly differentiate between two different kinds of claims within the same piece. (I think explicitly disclaiming the difference can help with this, but I still don't think that's foolproof.) Another reason is that people interpret facts through the lens of their own values, and they should be given the room to do that. Smuggling in your own values along with facts taints the decisions a person might make given just the facts. For example, I'm a pretty risk-tolerant person in my own life, because I have a strong gut intuition that most things turn out fine most of the time. So if I were a doctor and I was recommending additional tests and someone asked what I would do, it might very well be true that I would do nothing if it were me. But that's not necessarily because I know more than you; it's because I'm more willing to shrug and say, “It'll probably be fine.” I'll do my best to tell you the facts as I understand them, but you have to decide based on your own risk tolerance what you want to do with those facts.
I think if someone is trying to do persuasion, it's fundamentally unreasonable to ask them to include points that make their argument weaker. We all construct persuasive arguments for maximum effectiveness and impact. As the person being persuaded, sure, I may want the person to be completely candid about their positions, and I do think it's their moral responsibility to not say things they don't believe to be true, but being candid will always come second to being persuasive if your goal is persuasion.
If the goal is prediction, I think it's maybe fine to think people should be candid about their best guesses, but I do think that Ball's point here is essentially correct; epistemic humility means knowing the difference between the conclusions I got to in a way that holds up to scrutiny and the ones I got to because my gut is really bad at imagining a world where my gut is wrong. I think I probably would prefer for PIs to still say their best guesses and disclaim the ones that aren't as rigorously supported, because their intuition for something they're in the weeds on might be better than mine, but I also think it's pretty reasonable for them to think, “Hey, I always think I'm right, which means I'm pretty bad at differentiating between the times where my intuition is good and the times where it's too dumb to know it's bad, so instead of coin flipping and looking bad I'd rather just take the conservative approach.”
I do think that if someone is trying to interpret the facts for the audience, they should be candid about their understanding of things. That being said, a PI has to earn credibility as an interpreter before they can fill that role, so there's a pretty limited list of people to whom that applies imo.
Jeez, I'm sorry I wrote so much; no one has ever accused me of brevity, clearly 😬