TruthRank vs. PageRank

I was at dinner a few weeks ago with my long-time friend and first business partner Dave Jilk. We ended up talking about how difficult it is to separate signal from noise, fact from fiction, truth from bullshit, and bullshit from complete-and-total-bullshit.

I recently hit the wall with all the political stuff that was popping up everywhere. I think the thing that flipped my switch from on to off was a satirical article about Hillary Clinton and all the horrible things she had done that was being passed around by people who I think considered it to be factual. As I read through it, I imagined all the derivative articles building on the sarcasm embedded in the article and then making arguments which would be cited by others as truth because they showed up credibly somewhere.

I probably would have recovered from this in a few days if I wasn’t then confronted yesterday by a Wall Street Journal article that was sent to me with a clear set of assertions built around a statement that I knew to be factually incorrect, but I’ve seen written exactly the same way in other articles to make a specious argument.

Software should be able to solve this for us. It appears that whenever Google talks about working on ranking based on trustworthiness, anti-science advocates freak out about it. If you are interested in seeing the math (and some concepts) behind this, the paper by some Google folks titled Knowledge-Based Trust: Estimating the Trustworthiness of Web Sources – while chewy – is very interesting (at least the parts I understood).
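The paper's core idea – scoring a web source by how well its extracted facts agree with a reference knowledge base – can be sketched roughly like this. This is a toy illustration only, not the paper's actual model (which jointly estimates extractor accuracy and source accuracy); all names and facts below are made up:

```python
# Toy sketch of knowledge-based trust: a source's score is the
# (smoothed) fraction of its extracted (subject, predicate, object)
# facts that agree with a reference knowledge base. The real KBT
# model is a multi-layer probabilistic model; this ignores all of that.

KNOWLEDGE_BASE = {
    ("obama", "born_in", "honolulu"),
    ("earth", "orbits", "sun"),
}

def trust_score(extracted_facts, kb=KNOWLEDGE_BASE, prior=0.5, weight=2.0):
    """Smoothed agreement rate between a source's facts and the KB."""
    # Only facts whose (subject, predicate) the KB knows about are checkable.
    checkable = [f for f in extracted_facts if any(k[:2] == f[:2] for k in kb)]
    if not checkable:
        return prior  # nothing verifiable: fall back to the prior
    correct = sum(1 for f in checkable if f in kb)
    # Laplace-style smoothing pulls sparse sources toward the prior.
    return (correct + prior * weight) / (len(checkable) + weight)

# A source asserting one true and one false checkable fact:
facts = [("obama", "born_in", "honolulu"), ("earth", "orbits", "moon")]
print(trust_score(facts))  # (1 + 1) / (2 + 2) = 0.5
```

The smoothing matters: a source with one verified fact shouldn't get a perfect score, and a source with no checkable facts shouldn't be rated at all – it just keeps the prior.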

Dave sent me a presentation he’d done on this topic for a Defrag Conference several years ago. I tossed it up on SlideShare and embedded it below.

We went back and forth on it a little more and Dave ended with a strong statement around skepticism.

“It seems like a consequence of a few drivers has caused there to be more awareness of the notion and techniques of skepticism. However, people are using it indiscriminately, i.e., just to attack the other side. It’s another form of bullshit, actually – they don’t care whether the skeptical criticism is valid, but it has some additional polemical value because it has an aura of aiming for truth. Some of the drivers of the new Skepticism are all the problems with media, climate change, and probably some other things I’m not thinking of.”

When I ponder the notion of peak oil, I pine away for the concept of peak bullshit. But, like peak oil, I suspect it is an elusive construct.

  • Sachin Patel

    How do you know it was satirical – she’s done some pretty terrible stuff :p

    On a serious note. I’ve been thinking about this a lot too. A lot of news seems biased, incorrect or misleading. I think a crowdsourced Chrome extension similar to Genius.com would actually be pretty cool.

    See: http://genius.it/8782856/www.cnn.com/2016/03/06/opinions/nancy-reagan-impact-and-legacy-zelizer/index.html

    Perhaps highlighting misleading statements in red, with an upvoting mechanism that allows others to upvote the correct answer. The more upvotes an alternate correct answer gets, the deeper the shade of red of the highlight.

    • Keith Hughes

      My concern with crowd-sourced determination of truth is that for stories like the ones we are talking about here, any signal gets rapidly lost in the noise of things like these satirical articles, which then become truth to a segment of the population because they don’t realize the articles that started things off were satirical. Also, many stories are being proclaimed as truth by a lot of people merely because they have been repeated often enough, despite the debunking that has occurred.

    • DaveJ

      “Genius” is a really interesting idea. I had envisioned something like this but hadn’t quite made the connection because I have mostly just used it for poetry…

      I don’t think up/downvoting works for this application, because people will vote based on whether they agree with the sentiment rather than the veracity. But it could work with the right kind of user-assessment.

  • I think that would be great … But we still must take responsibility to question. As the parent of a 13-year-old, I work hard to teach him to question everything he reads. I often use this as an example http://www.theonion.com/article/wikipedia-celebrates-750-years-of-american-indepen-2007

  • Matt Kruza

    Ok, this may seem too obvious, but few things are objective facts. 2 * 10 = 20, yes – or so it seems. But that is only with the stipulation that we are in a base-ten system, so even *that* seemingly obvious fact needs to be caveated. Just like if someone weighs 200 lbs, I have an easy diet where they can lose 124 pounds… it just involves moving to Mars (I think my math is right there… 🙂 ). Of course there are ridiculous conspiracies (we never went to the moon) and anti-science views with meaningful impacts (most of the anti-vaxxers, etc.), but the best humans can do is objectively view each situation, and we will spend more time figuring out the truth for the things which truly matter. Things can be done to mitigate the problem, but it is total technology-nerd fantasy to think subjective human feelings can be removed… total.
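Matt’s Mars arithmetic does check out: weight is mass times surface gravity, and Mars’ surface gravity is roughly 38% of Earth’s. A quick check, using approximate gravity values:

```python
# Checking the Mars "diet": weight = mass * g, so a person's weight
# scales with surface gravity. Gravity values are approximate.
G_EARTH = 9.81  # m/s^2
G_MARS = 3.71   # m/s^2

earth_weight = 200.0  # lbs
mars_weight = earth_weight * G_MARS / G_EARTH
print(round(earth_weight - mars_weight))  # ~124 lbs "lost"
```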

  • “… truth from bullshit, and bullshit from complete-and-total-bullshit.”

    LOL, I haven’t laughed so hard in a while – good line!

    Thanks for the Google link. I’ll get to that tonight. Hopefully I’ll make it through in one piece, otherwise I may end up with air pockets in my brain.

  • The sad part about reporting is that even the WSJ plays the fast and loose game with facts.

    • Indeed.

      • Toby Lewis

        Love this article. The Google approach seems very promising.

        To be fair, the WSJ had a really strict process when I used to get sub-edited by them while working in the wider Dow Jones. Even under time pressure, their sub-editors would nail down fact after fact. Journalists hate this, as they have other things to do with their time, but there is a very strong institutional belief in getting the facts right. Perhaps they could also use an algorithmic tool like the one described in the Google paper to make this process better. As a result, they seldom publish errors.

        I enjoyed watching the film Truth, which grapples with 60 Minutes’ process of publishing and verifying a sensitive story about GW Bush. The poor journalist gets crucified, and it seems she was right. A very sad story, worth watching for all those hoping “peak bullshit” will arrive.

  • Thought-provoking (and clear) slide presentation — my compliments to the chef! I wonder how correlated, directly or indirectly, ad revenue is with general veracity on news sites. By the way, speaking of ads, I’ve started noticing ads above the comments — Disqus? It annoyingly pulls me out of the otherwise more zen Feld Thoughts experience.

  • It’s not just politics. It’s economics, it’s global warming. Very rare when you find a source that shows its biases.

  • Felix Dashevsky

    Brad, have you read Ryan Holiday’s book Trust Me, I’m Lying? It’s replete with examples of what Dave calls “synoptic failure” in his slides (in fact, it asserts that that’s how modern “pageview journalism” works).

  • In many senses, the ad model makes this way more prevalent than it would be otherwise. The more people who read or see something – regardless of what it is – the more money publishers make. It ends up leading to a bunch of content produced simply to fill up space. Publishers don’t really care if it’s accurate or not, as long as people look at it. It turns out bogus, crazy information is more entertaining than factually correct information… and the cycle continues, unfortunately.

  • Eric Warren

    You might want to take a look at some fairly serious academic work that has been done on something called “The Majority Illusion.” In effect, it says that as people pass along information they feel to be credible (or not) to their very small immediate networks, the perceived truth of that information rises astronomically, regardless of whether it is independently true. Two links: a PC Week piece bemoaning the numerous completely false articles about Google Plus being discontinued: http://bit.ly/1ODGGjv, and the MIT Technology Review article about the USC research: http://bit.ly/226BI6A. Very chewy, as you say…

    • DaveJ

      This is a very strong cognitive bias – I am fascinated when I see synoptic biases coming from very smart people.

      The best rule of thumb I have heard is that we should work twice as hard to try to verify the truth of facts that support our position as those that oppose it. Presumably, this goes for friends too. Unfortunately, they don’t like it when you question their sources…

  • Glenn Neal

    I’ve had three news reports written about events that I or a friend were intimately involved with. The stories had very little to do with what actually took place. I find it interesting that we use the word ‘press’ to describe this conveyance medium.

    • Yup – good book.

  • MerredithB

    A couple of years back, I was wondering how anyone could possibly believe some of the outrageous clickbait passed off as truth — Obama purposely infecting Texans with Ebola, for example, or the “facts” behind Ferguson. I decided to follow the example of Eli Pariser and watch the feeds of some of my more conservative friends, and understood almost immediately that algorithms changed their diet of information. Of course it was their truth.

    One of the best examples of what you’re discussing can be found in this article by Rachel Aviv, in the New Yorker. We turn to the Internet to find facts/”the truth,” but now, truth can be purchased. Do we as a society have the patience to search past the first few pages of results? I fear not. I gave a talk on this (filtered re PR, tech and racism and gender), but Aviv’s article was galvanizing. Worth reading, not only for the specifics of one man’s case, but for its implications.

    http://www.newyorker.com/magazine/2014/02/10/a-valuable-reputation
