Perks Of Copying Assignments
Every human being has some moments in their life that they would never wish to relive. For me, it was studying the 'Engineering Drawing' course in my first year. As far as I can remember, there was a red book that was considered the 'Bible of ED', and I never opened it. Consequently, I had no choice but to accept the fate of carrying out the Herculean task of copying the big ED sheets by pulling all-nighters. I was very tense about passing the course but managed to scrape by with marginal marks.
And now, a year has elapsed since I studied that horrid subject, and all I can remember from it are just four phrases: 'Front View', 'Side View', 'Top View' and 'True Object'. These were the topics covered in my first two ED classes, and after those classes I pledged not to listen to a single word related to the subject. Somewhere along the way, my low marks in ED and the sleepless nights of forging ED sheets made me a philosopher.
My recent philosophical theory is an analogy between ED and the interpretation of truth. I feel that truth is like a 3D object: the complete picture is the 'True Object', while our opinions about the truth are like the front view, side view and top view. Our version of the truth is always incomplete, and our views can never be the ultimate truth, because we only ever see one side.
All the views of a simple sphere are the same: they are all circles of the same radius, which is why you will never be asked to draw the views of a sphere in an exam. Your professor will instead ask you to draw the views of irregular, sophisticated objects. The truth is just like those objects whose three views are completely different from each other: one may look like a distorted rectangle while another looks like a triangle. Our versions of the truth may look and sound different but, in the end, when we compile all of them, we will find that our truth is just like those complex 3D objects, with their blunt edges and slanted planes.
We live in the age of post-modernism and post-modernists are expected to have an open-minded view of all the happenings of society. But in the age of social media, it is seemingly hard to have a liberal, post-modern view of everything. Somewhere along the line, rather than this expanse of readily-available information opening our minds, it is closing them.
In 2010, internet activist Eli Pariser coined the term 'filter bubble' and wrote a book about it in 2011. A filter bubble is a state of intellectual isolation that can result from personalized searches, when a website's algorithm selectively guesses what information a user would like to see based on data about that user, such as location, past click behaviour and search history. Consequently, users become secluded from information that opposes their perspectives, quarantined in their own cultural or ideological bubbles.
The choices made by these algorithms are not transparent. Common examples are Google's personalized search results and Facebook's personalized news feed. We have all seen how often Google seems to read our minds and show us exactly what we want to see. This phenomenon, seemingly harmless, is polarizing our thoughts on politics, culture and mass psychology.
Paulo Coelho wrote that "people only hear what they want to hear", and this is happening to us without our knowing. Eli Pariser put it thus: "A world constructed from the familiar is a world in which there's nothing to learn ... (since there is) invisible auto-propaganda, indoctrinating us with our own ideas." That is what is fuelling extremism in politics, society and religion. There is a touch of toxic supremacy at play when we start persuading others to believe in our thoughts.
News media have another buzzword for this: the 'echo chamber', a state in which our beliefs are amplified and reinforced by transmission and repetition inside a closed system, insulated from rebuttal. The term is a metaphor drawn from the acoustic echo chamber, a hollow room in which sounds reverberate. It can also be called 'cultural tribalism', which likewise suggests the echoing, homogenizing effect within social communities, such as on Facebook, Instagram, Twitter, Reddit and other social networks. In an echo chamber, "birds of a feather flock together": we meet only people who share our beliefs. By retreating into an echo chamber, people can seek out information that bolsters their existing opinions, which feeds their confirmation bias.
According to research conducted at the University of Pennsylvania, members of echo chambers become reliant on the sources within the chamber and highly resistant to any outside sources. When fake information circulates in these groups, people inevitably end up believing it, and this can polarize our thinking against a specific political party, religion or culture. This proved fatal in the 2016 US presidential election and has fuelled various riots in India. Such an epistemic bubble can, in principle, be popped by exposing the fake news, but this is difficult: members of echo chambers are in denial about external sources of information, and changing their minds is next to impossible.
Now that I think about it, maybe believing that ED was a difficult subject was also a kind of echo chamber. My belief was reinforced by friends who were also struggling. Who knows: if we had shed our mental blocks and put in more effort, we might have figured the subject out. Nevertheless, I am grateful for the existence of ED, since it gave rise to many such midnight musings.
-Submitted by Arghyadeep Dhar, via CollegeTime