You can surely at least understand the mindset there. Basically, when party A is obviously lying, a party B that calls them out appears more trustworthy, and it’s easier to overlook the obvious flaws in party B’s alternative. Here’s the logic, specific to vaccines:
- group A claims vaccines are effective against contracting a given disease
- group B points to evidence of actual effectiveness, which falls vastly short of what the public thinks
- group B proposes an alternative to the vaccine, implying it’s effective and that group A doesn’t want others to know about it
- group A attacks group B’s alternative
This creates an us vs. them situation, so if you already distrust group A somewhat, it’s easy to side w/ group B, assuming you have no actual knowledge with which to parse the available information. The same logic works with anything; you just need a little bit of distrust of some authority, evidence of false/misleading statements, and a seemingly credible alternative.
The trick is to not lie/be misleading in the first place so you don’t break the trust. Trust takes years to build and a moment to break, so you need a very good reason to break the trust.
No, I really can’t understand the mindset. Especially not in the face of the constant undermining of trust by certain elements of society, including when they’re in government. We didn’t arrive here by accident. The same people who have eroded the trustworthiness of government and authority over decades (on purpose - see Reagan) are the ones who now exploit the results of their actions for their own gain.
If, in your scenario, group B were on the level, it would be a different story. But they aren’t. If A oversold their claim, B would have massively oversold theirs. And that was easy to prove and has been proven. B didn’t just oversell their own claim; they also exaggerated the claim they were refuting into something that, in this form, was never said - standard MO.
There is no trick to this. Being factual and getting people to believe you is much harder than telling an easy but good-sounding lie and getting people to believe you.
If A oversold their claim, B would have massively oversold theirs. And that was easy to prove and has been proven
Right. But if A is supposed to be the trusted authority and B proves A isn’t trustworthy, you’re more likely to dismiss criticisms of B, because “the establishment” has already been proven untrustworthy. That’s how conspiracies gain traction, and any hiding of information gives detractors fuel.
So people are going to ignore criticism of B because they’ll feel that B is the “underdog” being attacked by “the establishment.” That’s how these things work.
There is no trick to this. Being factual and getting people to believe you is much harder than telling an easy but good-sounding lie and getting people to believe you.
Sure, but trust is earned. You can’t lie 5% of the time and expect people to believe everything you say; if they find out about that 5%, the other 95% will be called into question. So you need to reserve the lies for when they really count.
Lying will work in the short term, but it has big consequences in the long term, so if you’re a long-term entity (e.g. the CDC or the FBI), you need to be very careful about how people interpret your message.