Author: Xiaoyue Zheng

An Alternative Place of Modernism in Bruno Latour’s Scheme

Bruno Latour’s claim that we have never been modern is indeed provocative. He suggested that every object in the world, from the moment it comes into existence, falls somewhere on a spectrum between pure nature and pure society. As time progresses, the object undergoes an ongoing process of purification, during which it might oscillate between nature and society. His definition of modernism is the state of ultimate purification, in which every object has found its stable place on the spectrum. However, by this very definition, modernism can never exist, because time will not stop, nor will there ever be a moment when everything stops changing. An equally provocative implication of this train of thought is that we have never been revolutionary. In a revolution, the new emerges and the old is rejected; this very definition implies that the old and the new are two stable states. However, in Latour’s view things have always been changing over time, so the very idea of revolution cannot exist.

These two counter-intuitive claims rest on the idea that modernism is a point in time and that a revolution is a time interval. These assumptions hold in theory, but I want to propose alternative places for modernism and revolution in Latour’s scheme, so that we can see them in a slightly different light. Instead of points and intervals on the time axis, I think that modernism and revolution can also be objects. As time goes by, they too move towards points on the nature-society spectrum; in this case, both might settle on the social constructivist side. At a certain point in time, the places of the two objects stabilize, and so do their definitions. With this set of definitions, we can determine whether we are modern and whether an event is revolutionary.

A loose definition of modernism and revolution can be whatever people decide modernism and revolution to be. Take the widely disputed Scientific Revolution as an example. Under Latour’s definition, as well as the contemporary definition of science, the Scientific Revolution is a misnomer, because it was neither scientific nor revolutionary. Under the alternative definition of revolution, however, the Scientific Revolution is undoubtedly a revolution, because people refer to the historical event as the Scientific Revolution. Though this logic sounds frivolous at first, its basis is the way people perceive the world. Everything we name is a symbol of the subjective world that we observe; therefore, existence comes first. Because our perception of the world relies on subjectivity, defining A as A is ultimately correct, which makes the alternative definitions of modernism and revolution viable.

I am not denying Latour’s definition of modernism and revolution; in fact, compared with these alternative places of modernism and revolution, I think Latour’s definition is the stronger one. Nonetheless, the alternative places of the two concepts will, I hope, shed some light on how people understand the world.

The Haitian Revolution: A Glimpse into Historical Studies

The late 18th and early 19th centuries were the age of revolutions. While historians and the public acknowledge that the American Revolution of 1776 and the French Revolution of 1789 gave birth to democracy and thus shaped the modern world, the importance of the Haitian Revolution, which culminated in 1804, has somehow been overlooked. Not only was it left out of history textbooks, it also did not gain attention in the academic world until a few decades ago; in fact, for a long time the term “Haitian Revolution” was not used at all.

This week Professor Jeremy Popkin took us back in time to this largely neglected historical event. Judged by its outcome, the Haitian Revolution can be said to be the most successful of the Atlantic Revolutions. Beginning in 1791, the slave rebellion fought tenaciously until it finally expelled the French colonial government and established the independent Empire of Haiti. The revolution rebelled against colonial rule and slavery, and eventually achieved its ideals of liberty, independence, and equality. In this sense, the revolution was as successful and glorious as any other revolution, if not more so. This makes the question of why the Haitian Revolution was overlooked all the more intriguing.

In his lecture Professor Popkin offered a few possible answers. The Haitian Revolution was ignited from below by black slaves, who at the time made up the majority of the population. This was different from the other revolutions, which were led by the white middle class. The success of the Haitian Revolution also challenged the belief in black inferiority. For these reasons, its significance was overlooked by Western historians. In addition, the brutality of the Haitian Revolution, which involved extreme violence on both sides throughout, challenged the commonsensical idea that democratic revolutions are moral and noble. Though the Haitian Revolution was inspired by Enlightenment thought, and did achieve its goals in the end, the process was so brutal that it was not considered one of the important Atlantic democratic revolutions.

The academic reaction to the Haitian Revolution sheds light on historical studies in general. History is generally assumed to be an objective recounting of the past. However, the very nature of the discipline makes interpretation unavoidable, because historians must interpret limited evidence in order to reconstruct the past. Moreover, ambitious historians are not satisfied with reconstructing the past; they often try to understand why things happened the way they did, which requires even more selection and interpretation of information. In this process, they are inevitably biased. Over time, some of these biases become common sense, and later historians base their interpretations on these common-sense biases. Eventually historical study becomes an intricate web of interpretations resting upon facts and older interpretations. The whole process might be unavoidable, so perhaps the best we can do is remain critical.

Old Monuments in Modern Light

Western culture has a long history of building monuments, a tradition that traces back to ancient Greece and Rome. The tradition is so long that most people, except for tourists, usually take monuments for granted. This is of little surprise. The word monument comes from the Latin monere, “to remind”, and thus ancient monuments were built to preserve memory. However, as that particular memory faded, the original meaning of a monument was lost. At the same time, new meanings arose, a process that is common throughout history and integral to modern monuments. This week Professor Schnapp from Harvard University took us on a journey to investigate the “Bz ’18-’45” monument and the ideas behind it.

A brief recounting of the monument’s history shows how its meaning changed over time. Bz ’18-’45 is a modern monument based on the Monument to Victory, which was inaugurated under the Fascist regime. Set in Bolzano, a border city between Italian and German cultures, the Monument to Victory was built to represent the imperial power of Fascist Italy, and after its inauguration it was frequently visited by Benito Mussolini. After WW2, the monument was used in a number of ways, from an altar immediately after the war to a gathering place for protests. Moreover, the monument itself became a subject of dispute between Italian nationalists and local German-speaking residents. Amid all the controversy, Bz ’18-’45 was built. The new monument left most of the old one intact, added an LED ring to the façade, which completely altered the monument’s meaning, and converted the basement into a museum of the history of the monument and of Bolzano.

This monument is a great occasion for people to reflect on what a monument means. Apart from its original purpose of preserving memory, a monument gains additional meanings and significance over time as it is woven into the fabric of the city and of history. Gradually, the original meaning might be lost or overtaken by new meanings. This phenomenon has driven the creation of modern monuments; for example, there are monuments to nothing whatsoever and monuments that are designed to disappear. This, in a way, sheds light on the modern mentality: in an ever-changing world, people realize that nothing is constant and stop pursuing grandeur or timelessness. Instead, they start to focus on the ephemeral aspects of the world.

Another interesting point about this monument is the museum in the basement, which is filled with reproductions and photographs. This is partly because the humidity of the basement would damage real historical objects, and partly because today’s world no longer places as much significance on authentic historical objects. Instead, it focuses on the message, and even the impression.

In the modern world, monuments can hardly perform their original task of preserving memory. They change, gain new meanings, and adapt to the modern mentality.

The Context of Data

Understanding today’s world requires understanding the concept of data, especially at this critical point, just days away from the 2016 presidential election. These days we have seen far too much “data” and “polls” and “facts” and “statistics” thrown around in newspapers and online. These commonly used words are usually associated, in writers’ minds as well as readers’, with a less used word: truth. But what does truth mean, and does absolute truth exist in today’s world? In my opinion, there is a paradox around this question.

Contrary to popular belief, data has a long history in the Anglo-American tradition. The use of the word can be traced back to the 17th century. Through the Enlightenment, data came to be used in science as well as religion, still in its original meaning: that which is given. Gradually, data gained popularity as a synonym for evidence, as illustrated by the reception of Robert Hooke’s Micrographia: this book of illustrations of organisms under the microscope was taken by the public as truth. An interesting note here is that even though data had gained a wider definition, at this point it was still visual and still required context. Also, truth in this sense of data does not match our modern understanding of truth or fact.

Data later gained its modern definition, which implies huge tables of numbers and complex statistics. This development spawned many of the modern sciences that play a great role in today’s society, such as economics. More importantly, it gave rise to the concept of information. The change from concrete, visual data to abstract information was truly revolutionary; in a way, it defined the modern world. From baseball to elections, from global climate change to cutting-edge experiments in labs, from financial markets to software development: none of these could exist without data.

As these “decontextualized” data are made ever more visually impressive, they speak much louder than words ever did. Granted, the “cold” numbers do sound more convincing than the “hot” rhetoric we have seen far too much of these days. However, isn’t data also rhetorical and theory-laden? Can data really be decontextualized? Probably not. If not, then data is not independent, and so it cannot be absolute truth. Here lies the paradox: throughout the scientific era that began with the Enlightenment, the truth perceived by the public has never been the truth.

What should we make of this? The existence of absolute truth is an underlying assumption of modern society. What we can do is always look at “data” critically and recognize its dependence on context.

The Dilemma of Typological Thinking

In this week’s lecture, Professor Judy Stone talked about typological thinking in today’s society. Typological thinking is a way of looking at the world that classifies things only in terms of the types to which they belong and ignores variation among individuals. Though Darwin and subsequent evolutionary biologists revolutionized evolutionary thinking by proposing that variation plays a central role in evolution, typological thinking still persists among the general public. This, I contend, is incorrect yet inevitable. In order for our society to avoid the negative consequences associated with typological thinking, we need to look at it appropriately.

Typological thinking has a long tradition that can be traced back to Plato. From this thinking emerged the naturalists’ idea of taxonomy, which concerns the systematic classification of living organisms. The outcome is that the spectrum of living organisms becomes artificially dotted with many “perfect forms”, and all living things are, to some extent, considered imperfect representations of those forms. This approach was helpful at the beginning, when human understanding of nature was so limited that we needed it to simplify and make sense of the world. However, as more and more evidence has compelled biologists and the general public to recognize that all living things form a continuous spectrum, we can no longer hold that some forms are inherently more “perfect” than others.

While we know by now that typological thinking has no solid basis and is thus erroneous, I want to argue that this kind of thinking is inevitable. Suppose we could completely get rid of typological thinking. Then all living organisms that have ever existed in this world, living or extinct, would not be fundamentally different; there would only be variation. We could not draw a clear line between any two “species”, and consequently we could not name anything at all, which is nonsensical.

What should we make of this? I believe that the only way out of this dilemma is to be constantly aware of our own thinking process: to be aware of the intricacy of the world, of when and how we simplify things, and of the role typological thinking plays in our cognition. This awareness will be greatly aided by technological advances in genomics, which help us understand the similarities and differences among individuals at a fundamental level. As we learn never to take typological thinking for granted, hopefully we can overcome its negative consequences.

In conclusion, while we might never be able to completely get rid of typological thinking, it is possible, with effort to understand the complexity of life, to avoid its ramifications.

The Evolution of Climate Science

Given today’s endless heated discussions of global warming and climate change, it is hardly surprising that we should inform ourselves about the field of climate science before making judgments. That includes understanding how the field developed. In this series of lectures, we have already heard Dr. Gillen D’Arcy Wood’s talk on the Tambora eruption, which sparked the curiosity of European and American scientists and, in a sense, marked the beginning of the discipline. This time, we were lucky to have Dr. Kerry Emanuel from the Massachusetts Institute of Technology talk about the evolution of climate science. Having gained a basic idea of the history of climate science, I realized that the field’s development was driven largely by curiosity and by advances in other, more fundamental sciences such as physics and chemistry. This observation of the revolutions in climate science might also shed light on our general understanding of revolutions: rather than independent events, revolutions are usually integrated processes.

The public’s interest in climate science started in the early 19th century, when amateur scientists found geological phenomena that they could not explain with existing theories. Drawing on the climate data collected in the unstable first half of the 19th century, James Croll proposed a theory of long-term climate change based on variations in Earth’s orbit, now known as the theory of the Ice Ages. Though this theory seems self-evident today, it was considered a breakthrough, especially at a time when climate and astronomical data were sparse. It is also noteworthy that the scientists who contributed to the theory were driven by their own curiosity about the world.

As scientists began to realize that the temperature of the Earth could fluctuate, their curiosity led them to ask why the surface temperature of the Earth was what it was. This can be said to be the origin of the study of the greenhouse effect. The process, however, was neither simple nor independent: climate science at the time depended on discoveries made by physicists such as Jean-Baptiste Joseph Fourier, Gustav Kirchhoff, Ludwig Eduard Boltzmann, and Max Planck. Their work on radiation, together with John Tyndall’s study of the physical properties of air, finally allowed people to understand the dynamic equilibrium of Earth’s surface temperature.
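
To make this dynamic equilibrium concrete, here is a minimal back-of-the-envelope sketch (my own illustration of the standard zero-dimensional energy balance, not something presented in the lecture). Using the conventional symbols S for the solar constant, \alpha for Earth’s albedo, \sigma for the Stefan-Boltzmann constant, and T_{e} for the effective temperature, the sunlight a planet absorbs must, on average, equal the thermal radiation it emits:

\[
\frac{S}{4}(1-\alpha) = \sigma T_{e}^{4}
\quad\Longrightarrow\quad
T_{e} = \left(\frac{S(1-\alpha)}{4\sigma}\right)^{1/4} \approx 255\ \mathrm{K}.
\]

Plugging in S ≈ 1361 W/m² and α ≈ 0.3 gives an effective temperature of about 255 K, while the observed mean surface temperature is roughly 288 K; the missing 33 K or so is the greenhouse warming that Tyndall’s absorption measurements helped explain.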

Although climate science has become an increasingly independent discipline, revolutionary changes in the field have never been completely independent. Recent revolutionary methods in climate science, such as geochemistry and robotics, involve the integration of multiple disciplines.

To conclude, throughout the history of climate science, the revolutions have been driven by scientists’ curiosity and their interdisciplinary perspective.

Darwinism Today

Given today’s biology education inside and outside the classroom, it is only natural for people to connect the theory of evolution by natural selection with Charles Darwin and the Galapagos finches in a mind map; some who worked really hard in class might also remember Darwin’s study of barnacles. However, this is only the tip of the iceberg. Today’s theory of evolution is much more than Darwin’s idea; in fact, the original Darwinian theory itself, had it not been complemented by later studies, probably would not have been accepted at all. The public did not recognize or accept the theory until the development of modern molecular biology and genetics, which made it far more complex than what Darwin proposed. Therefore, using the term Darwinian Revolution to describe the development of the theory of evolution in today’s conversation might be a misnomer.

When attributing all the credit for the theory of evolution to Darwin, we often forget other naturalists who contributed just as much. It is safe to say that most people are acquainted with the Darwinian theory from their school education; it is equally safe to say that Jean-Baptiste Lamarck, Alfred Russel Wallace, Thomas Henry Huxley, and other naturalists who made great contributions to the development of the theory are often forgotten. According to Dr. Browne, much of our knowledge and vocabulary of the theory of evolution today has non-Darwinian sources. However, for various historical reasons, Darwin ended up reaping all the fame. Even though an increasingly independent academia grants us the information necessary to look at the development of the theory in an unbiased light, we are sometimes still limited by the historical and conventional understanding of the theory.

The misconception of the theory of evolution partly stems from the popularity of Darwinism. People use the term Darwinism loosely in different contexts, such as economics and sociology. These wide adoptions and adaptations of the theory in other fields are gradually folded into Darwinism, rendering the theory of evolution itself less concrete and less scientific. For example, the term “survival of the fittest” has lost its original biological meaning, which concerns not only survival but also reproductive success.

Moreover, we have to think of the theory of evolution as a dynamic theory. The study of genes and the development of molecular biology, which took place about one hundred years after Darwin, made the theory of evolution scientifically tenable. It is also possible that more rigorous research will further change the theory. For instance, the development of epigenetics, which studies the modification of gene expression rather than alteration of the genetic code itself, has led to a revival of interest in Lamarck’s theory of the inheritance of acquired characteristics and to further modification of the Darwinian theory.

It is very likely that the “Darwinian theory of evolution” will continue to change in the future, drifting further away from Darwin’s own views. Therefore, when using the term “Darwinian Revolution”, we should keep in mind that it involves much more than Charles Darwin.

Social Networks’ Role in 21st-Century Revolutions

As more than three billion people are online today, the Internet’s impact on the world is unprecedented. It has changed the way people entertain themselves, communicate with each other, and get new information. These are all subtle, long-term changes, though. The Internet, and especially social networks, has also caused intense short-term changes. The experience of Khalid Albaih, a Sudanese political cartoonist who has taken an active part in the Arab Spring, demonstrates this point. However, I think that social networks, in this sense, are double-edged. We have to look at them and their impacts critically.

First, the authenticity of news has always been an issue for social networking. Prior to the age of the Internet, the channels through which people learned about the world, such as newspapers, television, and radio, were generally controlled by elites. At best, the public’s understanding of the world was shaped by trained journalists and editors; at worst, through extensive censorship, government officials could present the biased world they wanted the public to see. Because of this paucity of information, it was practically impossible for the general public to determine the authenticity of any piece of news. This old paradigm of news has been revolutionized by social networks. Now anyone with access to the Internet can post what they witness within seconds, which makes true information available in real time. Moreover, the innate characteristics of social networks allow people all over the world to access that information. This all sounds good. However, if we look at the issue from another angle, we soon realize that authentic information can easily be drowned out by partial, and even bogus, information. Thus a paradox appears: the true information is present, but almost impossible to discern.

Second, the sheer amount of information has shortened people’s attention spans, pushing those who post online to pursue eye-catching, rabble-rousing effects. This dynamic can be helpful in revolutions. For example, there was no stronger voice than “We are all Khalid Said” during the Egyptian Revolution. Viral posts like this not only raise the public’s awareness of an issue but also help people connect with one another. While noting this, we should also be mindful of the negative effects. Today’s social networks are all about attention; radical opinions draw attention; thus radical messages are ever-present on social networks. This simple syllogism can partly explain the rise of ISIS and the popularity of Donald Trump in this year’s election.

Finally, the anonymity of the Internet fosters free speech. At its best, this allows people to be open and discuss issues that they normally would not, and thus bridges the gaps between people of different opinions and backgrounds. However, it also inevitably leads to the irresponsible and even hateful messages that crowd today’s social networks.

In conclusion, while social networks, and the Internet in general, have revolutionized the way we look at the world, we should always try to stay critical of their effects.

The Study of the Tambora Eruption Is a Revolution Itself

Mount Tambora’s eruption in 1815 was indeed revolutionary. It heavily disrupted the global weather system and thus had serious impacts on politics, the economy, society, and culture. However, not only was the eruption itself a revolution; Professor Wood’s study of the Tambora eruption is also revolutionary.

In April 1815, Indonesia’s Mount Tambora erupted. Its magnitude, coupled with its tropical location, made the eruption one of the most powerful volcanic events in recorded history. Its enormous effects could be seen all over the world: from the devastated island of Sumbawa to the melting Arctic, from the literary creations of “the year without a summer” to political instability in Europe, from the spread of cholera across the globe to famine and depression on both sides of the Atlantic.

In the past, researchers and scholars usually studied these events individually, which was entirely reasonable: it allowed researchers to concentrate on their respective areas of expertise and to delve deeper into one particular subject. At the same time, this research method might have prevented them from seeing the big picture and finding the subtle teleconnections among global events. For example, before Professor Wood, the dominant explanation for the European depression of 1816 was the large-scale demobilization of armies after the Napoleonic Wars. This conclusion is reasonable and backed by numbers. However, Professor Wood urged us to entertain the idea that the Tambora eruption caused global cooling, which led to crop failures in Europe, then market failures, and finally the depression. I cannot say whether one theory is sounder than the other, but I can say that Professor Wood definitely has a valid point.

When an expert in literature reads Lord Byron’s Darkness or Mary Shelley’s Frankenstein, when a political theorist tries to figure out the reasons behind the millennial cults of 1816, when economists study the 1816 depression, when meteorologists study the weather patterns of the Earth, when Professor Wood himself first stood on the edge of the Tambora caldera – none of them alone can connect the dots among all these seemingly unrelated events. Only when we think outside the box and step beyond the established boundaries of academia, aided by Professor Wood’s insight and the extensive data of the information age, can we fully appreciate worldwide events such as the Tambora eruption.

As the Tambora study gains acceptance among academics and the public, the merits of interdisciplinary study will be increasingly recognized and appreciated. In this sense, Professor Wood’s study is truly revolutionary. It teaches the public to see the connections between seemingly unrelated events from a broad perspective, opens a window for scholars, and creates countless opportunities for future research.

A Closer Look: What Do We Mean by the Scientific Revolution?

As science advances, human society’s view of science also evolves. In today’s world, science is more than science: once an esoteric subject of study, it has become an integral part of our culture and has shaped modern conversation. One result is that people often use concepts related to science – from gravitational waves to cloning, from the scientific revolution to the scientific method – without carefully examining their meanings. I believe, instead, that we should take a closer look at the very words and phrases we use, think about them in different contexts, and try to gain an informed understanding. So what do we mean by the scientific revolution?

The standard answer to this question is a unique event that took place in the 16th and 17th centuries, which revolutionized humanity’s understanding of the natural world, transforming it into mathematically precise, experimentally based, objective knowledge. This revolution marked the transition from the medieval world to the modern world. We have long taken the term “scientific revolution” and its standard definition for granted. However, by examining its literal meaning more closely and asking simple questions such as “how scientific is it?”, “how revolutionary is it?”, and “how unique is it?”, we can gain some insight into the historical event.

A noteworthy fact about the scientific revolution is the phrase “scientific revolution” itself. The phrase, counter-intuitively, was not coined by later historians but by those who took an active part in the scientific revolution. In their age, the words “science” and “revolution” both had meanings different from ours. For example, “revolution” came from “revolve”, which, in context, meant a turning back to the glorious Greek and Roman period. Moreover, those who started to use the word “revolution”, Bacon for example, exaggerated the leap they had made: the medieval period was not as primitive as they described. “Science”, on the other hand, was not scientific by 21st-century standards, for it still had much to do with politics, religion, superstition, and ethics.

Despite the fact that “the scientific revolution” was technically neither scientific nor revolutionary, the term has become a part of our contemporary conversation. In other words, the phrase has gained a cultural meaning in addition to its literal meaning. Therefore, we should not be too critical of its use, or too confident in our own definition of the scientific revolution. Just as the definitions of science and revolution have changed since the 17th century, they will probably continue to change in the future. Signs of this can be seen in academia’s reception of string theory: though the theory cannot be tested, which contradicts our current definition of science, it is still considered a scientific theory.

To further illustrate this phenomenon, we can look at other scientifically revolutionary non-revolution events, such as the discovery of irrational numbers in mathematics, the cell theory and the theory of evolution in biology, and the creation of the periodic table in chemistry. Though they are arguably just as revolutionary, they are rarely referred to as “revolutions” in daily conversation.

Therefore, referring to the 16th- and 17th-century event as the scientific revolution in modern conversation is scientifically inaccurate yet culturally precise. The main takeaway, however, is that we should be aware of the meanings behind the words and phrases we use.