There’s a strong urge to believe what you wish instead of what you can prove. Rumors about computers and gadgets are a great example: many have no basis other than being a feature someone wants. There’s a name for it: "wishcasting".
We want the world to be black and white: any given statement is either true or false. But it’s not. Gödel #Godel describes at least three states: true, false, and unprovable (e.g., a statement like "This statement cannot be proved": a consistent system can neither prove nor refute it, so it’s stuck being unprovable. Maybe there’s a better name.)
But it’s worse than that.
In science, a theory isn’t true … it’s just the best explanation we have so far. The whole endeavor of science is to keep finding better explanations. To make good decisions you don’t need the absolute best explanation, just one good enough to guide you to beneficial choices. (I said "prove" before, but to be more accurate I should be talking not about what you can prove, but about what you can’t disprove.)
#Bayes (really #Laplace) says a given notion isn’t true, it’s actually true-with-some-probability. Each new thing you observe impacts that #Probability. This is the actual math behind the #ScientificMethod. And it’s the truth of the world. Your beliefs must adapt to your observations, constantly, forever.
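To make that concrete, here’s a minimal sketch of a single Bayesian update in Python. The "rumored feature" scenario, the function name, and every number are illustrative assumptions, nothing more:

```python
# One step of Bayes' rule: prior belief + new evidence -> updated belief.
# The scenario and every number below are made up purely for illustration.

def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Return P(hypothesis is true | we saw this evidence)."""
    numerator = p_evidence_if_true * prior
    denominator = numerator + p_evidence_if_false * (1.0 - prior)
    return numerator / denominator

# Start skeptical: 20% chance the rumored feature is real.
belief = 0.20

# A usually-reliable source reports it. Assume they'd report a real feature
# 80% of the time, but also pass along a false rumor 30% of the time.
belief = bayes_update(belief, p_evidence_if_true=0.80, p_evidence_if_false=0.30)
print(f"after one report:  {belief:.2f}")   # 0.40

# A second, independent source says the same thing.
belief = bayes_update(belief, p_evidence_if_true=0.80, p_evidence_if_false=0.30)
print(f"after two reports: {belief:.2f}")   # 0.64
```

Notice the belief never snaps to 1 or 0; each observation just nudges it, which is exactly the "adapt constantly, forever" part.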
If you have unshakable faith in some set of "facts", you’re probably doing it wrong. Even when you’re right, you could be righter.
Of course, if you don’t adjust your beliefs with new input; if you don’t test; if you have "facts" instead of "very probable theories"; if you believe things because of how strongly the person who convinced you believed them instead of what they could actually show you; if you believe simply because that’s what your parents taught you, then, well, you **might** be right (even a stopped clock is right twice a day). But at best you’re not going to make good decisions for yourself, and at worst you’re going to try to tell others what to do based on an inaccurate understanding.
It’s messy, and that’s just how it is.