Tag: science

Data is Not Fact or Truth

Are there ways to fact-check data? Aaron R. Hanlon, an English professor at Colby, raised that question in his talk on the Revolutions in Data, Big, and Little, and I suspect many of us have never thought to ask it. Hanlon opened by noting how much we take data for granted. The word comes out of the British tradition, entering English from Latin in the 1600s, when data was seen as fact and truth much the way we now view Google data. Hanlon discussed the ways we misinterpret “data,” a conversation that would truly benefit people today, in an age of technology where “data” is attainable at the click of a button.


It wasn’t real before, but it is now

Professor Keith Peterson enlightened us about Bruno Latour’s line of thinking that humans have never been modern or revolutionary. I haven’t taken a philosophy class, so I admit I had some trouble following, but I found a few of the key themes very interesting. Rather than focusing on whether humans have ever been revolutionary, which seems largely a matter of definitions, I’m going to focus on the aspect of scientific inquiry. The distinction between scientific knowledge and inquiry as a social construct versus something real, or not clearly real, struck me in particular. Something that isn’t commonly known is going to seem unnatural or bizarre. There is often no way for us to relate to a cutting-edge scientific finding, yet even when a concept seems impossible to relate to, if enough people start to believe it, that shared knowledge all of a sudden starts to seem more legitimate and accepted. Eventually something scientific, like microbes, moves into the cultural mainstream, seems ‘natural,’ and is taken for granted. The idea becomes the new way we understand the world around us.

This reminded me of a discussion I’d had with a friend a few weeks before the lecture about the current state of scientific knowledge. If we compare the way physics is taught in secondary school versus college, we see that high school physics is a watered-down Newtonian physics rather than the more “correct” knowledge behind Einstein’s work, relativity, quantum mechanics, and so on, basically modern physics. If you take PH241 at Colby, you therefore learn a lot about why much of what you learned before was wrong, or breaks down under certain conditions. Basically, our scientific knowledge is on a need-to-know basis. Science is a way to explain the world around us, and if I don’t question too much or explore too deeply, then high school physics can more simply, fully, and intuitively explain my world. Behind this simplification of science is a timeline. We still teach the physics of hundreds of years ago in secondary school even though its fundamentals don’t necessarily hold up in college or grad school. Even modern physics is up to a hundred years old. Hence we have this idea of scientific progress, where what’s newer is better (and more complex). Will everything we know today be discounted in the future? Very possibly so. As Professor Peterson said, all scientific knowledge is relatively tentative and can be updated or modified in the future, even though it may seem relatively stable now. I think it was described as moving from essence to existence to essence again, then back to existence, and so on. But the way Latour thinks humans understand this phenomenon, and the general perception of the passing of time, is as the abolition of the time before the present. Our idea of progress would then be one of a future that is unlike the past. Thus Latour sees us as detached from the past, since we separate ourselves from previous times by events (progress) rather than by the mere passing of time.
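Going back to the physics example for a moment, and as an aside of my own rather than anything from the lecture, the sense in which the older physics “breaks down” can be made concrete: the Newtonian kinetic energy is just the low-speed limit of the relativistic one,

\[
E_k = (\gamma - 1)mc^2, \qquad \gamma = \frac{1}{\sqrt{1 - v^2/c^2}},
\]

which, for speeds much smaller than the speed of light, expands to

\[
E_k \approx \frac{1}{2}mv^2 + \frac{3}{8}\frac{mv^4}{c^2} + \cdots
\]

The familiar high school formula is the first term; the correction terms only matter as v approaches c, which is why the older physics still “works” for everyday purposes even though it isn’t the full story.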


Curiosity Killed the Cat

But satisfaction brought him back.


Kerry Emanuel explored the history of climate science and listed many contributors to the development of this discipline. Throughout his talk, what really stood out to me was that curiosity is a driving principle for the advancement of climate science. I would venture so far as to say that curiosity is a driving principle for ANY science. Curiosity opens up the conceptual space for exploring ideas not available before, and allows for experimentation, both theoretical and practical.

According to Emanuel, curiosity about why the Earth’s temperature was what it was, why the old ice sheets behaved the way they did, and what determines the nature of the Earth’s surface is what guided many scientists’ experiments in climate science. This desire to know guides not only climate science but other scientific endeavors as well; it reflects an awareness of a knowledge gap and a desire and need to fill it. It is also a source of personal satisfaction and of aliveness: according to Ian McEwan, “the standard measure of how alive you are is your curiosity.”

Taking curiosity as a guiding principle in science leads down many paths that are not necessarily useful at first glance. Endeavors led by curiosity are often thought of as useless because they have no immediate practical application, and are waved off as daydreams or delusions. Yet those who follow their scientific curiosity, and dare to ask the “what if…” question (or any other question), are often the ones who make significant contributions. In these kinds of experiments, utility is not the primary purpose; the motivation to pursue them is not utilitarian.

This does not mean, however, that any inquiry based on curiosity will result in something useful. “Fooling around” with an idea does not automatically make it better, but it does offer conceptual freedom, taking the “shackles off the human mind” (Flexner, 546) and setting it free for adventure. According to Flexner, curiosity is the “outstanding characteristic of modern thinking. It is not new. It goes back to Galileo, Bacon, and to Sir Isaac Newton, and it must be absolutely unhampered” (Flexner, 545).

Emanuel made it clear that the revolution in climate science, like any revolution, has a long and intricate history. Each scientist finds bits and pieces of a theory, and those pieces are then organized systematically into a real contribution to science. All of these experiments enrich our worldview and aid in the pursuit of science and truth. “The mere fact that [experiments] bring satisfaction to an individual soul bent upon its own purification and elevation is all the justification that they need” (Flexner, 549).

References:

Flexner, Abraham. “The Usefulness of Useless Knowledge.” Harper’s Magazine, vol. 179, 1939, pp. 544-552, https://library.ias.edu/files/UsefulnessHarpers.pdf.

What Kind of Revolution?

History has told us that Charles Darwin was not the first person to suggest the theory of evolution, but he is the one we credit with the discovery. This is a controversial part of history, though we have seen the same thing happen in other fields of study. Janet Browne was quick to let us know that Darwin was modest about his accomplishments and did not seek all of the attention he received. Rather, other scientists and members of society made him into the representative of what we call the Darwinian Revolution. The Darwinian Revolution was not all Darwin and, as Browne would argue, not quite a revolution. I would argue, however, that it was a revolution, based on the long-lasting implications it has had for society.

Darwin knew that his discoveries would have uncomfortable social and political implications. This contributed to his hesitancy to share them with the rest of the scientific community, and with the rest of the world. Religion was the way to explain the unknown, and it governed everyone’s lives. Suggesting that humans were produced by some process other than creation caused widespread horror. While probably not to the degree it did back then, the suggestion still outrages many people. So while the Darwinian Revolution changed science, it also changed religion. Darwin also knew his discoveries would cause a revolution, though not in his lifetime. Even before his death, Darwin had been romanticized as a hero of modern science. His supporters were adamant about keeping his legacy alive, making sure the theory of evolution was never discredited or considered irrelevant. They were successful, considering we still learn and talk about it today. It has remained a controversial topic, however, especially in the United States.

Browne stated that the Darwinian Revolution was long because it lasted more than a few years: one hundred and fifty, to be more exact. I would disagree, however, that this length makes it any less of a revolution. As we have seen in this course, revolutions often last many years and continue to influence society after they end. The Darwinian Revolution is no exception. The changes that occurred in the scientific world were vast, and they also affected the education system. When it comes to religion, the Darwinian Revolution’s influence is still very much present. Evolution began to be taught in public schools in the United States only recently relative to its acceptance as a legitimate scientific theory, and the resistance of religious institutions and groups has not gone away, even today. Darwin has become the symbol and hero of atheists. Darwin considered himself agnostic, yet for some reason many religious people cannot separate science and religion. This is certainly not true of everyone; many people can accept both. Charting the evolution of the Darwinian Revolution, we see it has changed from a scientific revolution into a religious one. Because of this, I would argue that the aftereffects of the Darwinian Revolution still play a large role in the questions people ask about science, religion, and the intersection of the two.

The revolution of the language of “science”

When we think about why humans have the power to influence the world on a global scale, one of the most important factors we should not ignore is science. Over a long history we have acquired knowledge, made tools, and built civilizations, and none of this could have been achieved without science. However, if we think about “science” as a word, as a kind of language, we find that it, too, is revolutionary.


Is Modern Science Taken for Granted?

While Professor Cohen gave his talk questioning how revolutionary, how scientific, and how unique the Scientific Revolution was, a question popped into my mind: are we, as a human race, taking for granted the “knowledge” we think we know? In other words, are we being misled when we think we know almost all there is to know in the realm of science? How is the mindset of our predecessors in the era of the Scientific Revolution any different from the mindset we have today? For a long time, for example, people took for granted that the Earth was flat and that the sun revolved around the Earth. How do we know whether we believe something completely false that is taken for granted to be true, or whether something thought to be ridiculous is actually true?

While modern technology and science allow us to understand much more about life and the Earth than was known around the time of the Scientific Revolution, there could be knowledge yet to be discovered that now seems completely false and impossible. One such notion could be the concept of extraterrestrial life. While it has in no way been proven that aliens do not exist, there seems to be a general consensus that believing in them is a bit ridiculous. Someone who claims to have been abducted by aliens or to have seen a U.F.O. is generally considered crazy, but that may not be the case once technology improves and the vast reaches of space are explored.

My favorite part of Professor Cohen’s talk came when he discussed discoveries of the Scientific Revolution that were not all that scientific. Copernicus reasoned that the sun should be at the center of the “universe” because it was so noble, Kepler invoked the beauty of the Platonic solids to explain the organization of the planets, and Francesco Sizzi argued by analogy that there could not be more than the seven known planets. It’s fascinating to think that some of the era’s most revolutionary discoveries were first deemed true on the basis of absurd explanations. To return to the alien example, someone may have a ridiculous explanation for why they “know” alien life exists while technology is not yet at a level to prove it. Some amazing revelations may currently be ignored or put down because the believer’s reasoning is bizarre or foolish, or because the idea itself is far-fetched to our understanding.

This is not meant to be a piece obsessing over the possibility of life outside Earth; the point is that current scientific beliefs or understandings could be wrong. Seemingly ridiculous proposals could be true; our technology just may not be at a level to prove them yet. In just the past century, science and technology have developed greatly, and they will continue to do so. Many scientific disciplines encourage questioning life’s many aspects; we have not made every discovery, and some “discoveries” we have made could be wrong. Don’t take science for granted.

The Single Story of Science

This summer, I had the honor of watching Chimamanda Ngozi Adichie’s TED Talk titled “The Danger of a Single Story.” A powerful Nigerian author, Adichie explains how damaging a single narrative of any story is to society, and she draws on her own childhood experiences to make her point. Adichie’s world as a child was filled with stories about young British boys and girls, and she grew to understand the British experience through the literature that was widely available to her. Despite living in West Africa and never having seen snow, Adichie wanted to experience it the way the young British characters of the novels and children’s stories did. From a young age she had many stories about British life, but because of a lack of Nigerian (or even African) literature, British children would never have the same possibility. The British children would grow up with a singular story of what “Africa” was like, a story that completely disregarded the complexities and riches of the continent. To the British children, the daily happenings of Adichie’s life, and therefore her human worth, were lost in this larger story, even as she was discovering British weather patterns and foods.

When I had the opportunity to listen to Professor Cohen discuss the Scientific Revolution and how it wasn’t truly revolutionary, I couldn’t help but be reminded of this TED Talk because of the lack of diverse stories and thoughts in the history of science and philosophy. Cohen’s lecture was fascinating and enlightening, and two things in particular struck me and sat with me afterward: the change in the concept of what is “scientific,” and his reflection that the Scientific Revolution was dominated by pale males.

Science as we know it has not always been very “scientific.” The Ancients counted philosophy and other abstractions as science in a way that contemporary views fail to encapsulate or understand. The Scientific Revolution helped form science into the empirical, methodical, quantitative art it is today, but it created a modern conception of the term that has since made it impossible to thoroughly understand the science of the past. The definition of “science” has always been a “moving target,” in the words of Cohen, and as definitions have changed and been altered by the ruling classes, the original intent of the “science” of philosophy and the stars has disappeared and been appropriated by white men.

In reflecting on this idea that the definition of science was created by those in power, I wonder how different our world would be if women and people of color had had the chance to be part of the discourse. If they had been able to interpret old ideologies and concepts and apply them to the world around them, would things be different? What would our technological landscape look like today if the vast majority of the brains of the world had not been shut out of the process?

The dominant discourse of science is a single story: that of wealthy white men with the luxury of time to devote to study. The past has ignored, and therefore erased, the narratives of the majority of the population, who were closed off and told they were not good enough for the process. And while the Scientific Revolution certainly created new technologies and accelerated the human race forward, it failed to dramatically alter social and political life for those who didn’t hold power. It was revolutionary technology, but it failed to be revolutionary thought.


Motion in Language or Language in Motion?

“The most revolutionary part of the Scientific Revolution was that we use a metaphor of a revolution to describe it.” That conclusion was very provocative to me. As products of their time and culture, and as people aware and critical of themselves and their environment, the scientists of the Scientific Revolution started using the word revolution to explain their circumstances. “Revolution” was not only used as a metaphor to challenge past and current standards and establish a new scientific outlook; it was also used as a cyclical historical term to describe a pattern.

During the Scientific Revolution, knowledge stopped being about enlightenment and faith and started being about experiments and testing. Although the theories were abstract and mathematical in nature, they could be tested physically through experiments. Everything could be put to the test in this process of discovering new principles through empirical methods and mathematical analysis. Bacon, Galileo, and Descartes created the foundations of thinking about our thinking about the world and established a new approach to methodological inquiry. This new paradigm echoed the rise of humanism during the Renaissance, which questioned religious authority and emphasized the capacity of individual human beings to understand the world. The Scientific Revolution relied heavily on a capacity for abstract thinking and a precise use of language in order to become such a powerful period in history.

According to Chalmers Brothers and Vinay Kumar, language is a tool we cannot stop using, because we need it to use all other tools. Language does not only communicate and describe; by making distinctions, it creates, generates, and provides us access to conceptual breakthroughs. By acquiring distinctions and giving them names, we discriminate between things we didn’t see as different before. New ways of seeing things allow us to do what we could not do before. In short, language and distinctions give us access to knowledge. Once you have a distinction, you have created the conceptual space for understanding and for accessing breakthroughs. The thinkers of the Scientific Revolution realized they were living through a turning point in modern science and culture; they were aware of the distinction, and they named that distinction “revolution.” Even though the word could be used in different contexts (to roll back, to return, an overturning, an astronomical term…) and could take the form of different truths, it contributed to understanding and thinking about the world.

Centuries later, we continue to enhance and discover new meanings in the distinction made by the phrase “Scientific Revolution,” using new analytical methods applied to data. We can now see what we weren’t able to see before and distinguish deep structures and patterns in history. By reinterpreting the Scientific Revolution in terms of language, we uncover a pattern: a continuing process of change as a critical part of history.