Now that Oxford Dictionaries has named "post-truth" its 2016 Word of the Year, perhaps it is the perfect time to study the origins of data. After all, as Professor Aaron Hanlon said, "data is a big deal." If big data is about how we 'see' information, though, I wonder what a post-truth year says about our current understanding of it. Coming off this election, many argue that America is anti-intellectual. Many Americans would rather hear comments during presidential debates in words they can understand than parse the educated political arguments that past candidates brought to the podium. Policy? Foreign affairs? None of that seemed to matter this election, and neither did an ounce of fact-checking.

Professor Hanlon showed us the Ngram Viewer and its results for "data," "fact," and "truth." The graph clearly shows a sharp decline in "truth" over the years, while "data" has seemingly surged from the abyss. But what does this mean in a post-truth year? If data is "a thing given," and anyone nowadays can label anything as data without real evidence backing it up (aside from fancy credentials or buckets of corporate money), then anything can be that which is given. Data, perhaps, has become divorced from truth. After all, Professor Hanlon explained how data as we know it was born from scriptural data. These "givens" appear in Richard Bernard's Faithful Shepherd of 1607; indeed, Bernard claimed that the church needed to be plainer and use less verbose language for the "truths" of scripture. The image became the trusted source because words "can mislead us." I wonder, though, what this means in the age of Photoshop. Can we trust images? Can we trust words?

What is more, in this age of post-truth and anti-intellectualism, does anyone really read the data? Or is it something to attach to a tweet or a HuffPost blog post that will be taken as fact without a second glance? Professor Hanlon ended his lecture with four main points: (1) data has always been visual; (2) "big data" was a conceptual revolution as much as a technological one; (3) when data becomes the main form of evidence, that is revolutionary; and (4) all data is rhetorical and theory-laden. In response to his third point, he commented that we must be more discerning about which questions deserve which responses. But what implicit value judgments are made in "data"? How does one attempt objectivity? What else is implied in 'just observation'? And if it is true that "context shapes quality," who today is truly discerning the quality of the context of data? While there are plenty of websites fact-checking President-Elect Trump's comments (even real-time tweet fact-checkers), I think it is pretty clear that no one cares if he is lying. No one cares if he is racist. No one cares if he brags about sexually assaulting women. No one cares. We are in the age of post-truth. Is this actually a revolution? Are we experiencing a revolutionary moment?