Philosophy & academia
Despite my education or what my writing may suggest, I am not a philosopher. I have more in common with the prolific serial writers of Astounding Fiction! than I do with traditional philosophers like Immanuel Kant or Rene Descartes. I feel that I must point this out, as there is, I think, a difference between a philosopher, a scholar, and a modern academic.
A philosopher is, historically, someone who works outside of science – though science has since been extended to include topics once more at home in metaphysics – and treats subjects that are, in their time, unanswerable through measurement or through devices that could give them data. In the absence of empirical evidence to suggest one thing or another, philosophy approaches such problems from varying schools of thought and disciplines.
In the West, classical philosophy goes back to Plato, in his account of the trial of Socrates in The Apology and in The Republic; later, Aristotle would become the most prolific, if most problematic, contributor to academia. We have Aristotle to thank for the prevalent assumption that there are five senses (taste, touch, hearing, vision, smell) – an assumption treated as common sense, when sense is rarely common and what is common is rarely sensible. Most people have a sense of time, a sense of balance, and, some claim, an inborn awareness of knowing when we're being watched. A sense of shame, some of us. Pride, paranoia, power, seeing dead people (who don't know they're dead).
But Aristotle’s practicality and rigidity were necessary for the split between moral absolutists and, later, moral relativists. There was another popular field of study for Greek philosophers, however: the wisdom of Archimedes’ On Sphere-Making, and the propositions of Euclid, which include the working model for geometry as we know it. There were also materialists like Democritus and Eratosthenes, pragmatic, application-minded philosophers; Rene Descartes, for example, invented a system of coordinates (Cartesian coordinates) that is still used in city planning. These are natural philosophers, academics whose insights into one field bring about the creation of others – such as Michael Faraday, a largely self-educated visionary, and Charles Darwin, a pigeon breeder and naturalist whose studies aboard the HMS Beagle would lead him to publish On the Origin of Species in 1859, introducing the world to evolution through natural selection.
Isaac Newton, professor of mathematics at Cambridge, created calculus (which he called the method of fluxions), though it has been contested that his contemporary Leibniz – himself a philosopher – may have published his own version before Newton’s Principia Mathematica appeared. The foundation of his laws of motion, F = ma, and the outline of his theory of gravitation – many of these proposals would not be corrected until Einstein’s general theory of relativity redefined gravitation and extended it to include the behavior of time.
What is a philosopher?
A philosopher in science is someone who applies one skill set to solve a problem in another field; take the ancient Greek mathematician Eratosthenes as an example. He measured the circumference of the Earth by watching shadows: at noon on the summer solstice the sun stood directly overhead at Syene, casting no shadow, while at the same moment in Alexandria a designated shadow-caster (an obelisk) cast a shadow at a measurable angle. Knowing the distance between the two cities, and by using math and calculation, the size of the Earth was correctly (within reasonable approximation) derived from the observation of shadows and sticks.
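The arithmetic behind that derivation is short enough to sketch. The figures below – a shadow angle of roughly 7.2 degrees at Alexandria and a distance of roughly 800 km between Alexandria and Syene – are the commonly cited textbook approximations, assumed here rather than taken from the passage above; the sketch only illustrates the proportion involved.

# A minimal sketch of Eratosthenes' proportion, using commonly cited
# approximations (assumptions, not figures from this essay): a shadow angle
# of ~7.2 degrees at Alexandria and ~800 km between Alexandria and Syene.

def circumference_km(shadow_angle_deg: float, distance_km: float) -> float:
    """The arc between the two cities spans shadow_angle_deg of a full
    360-degree circle, so the circumference is distance / (angle / 360)."""
    return distance_km * 360.0 / shadow_angle_deg

print(circumference_km(7.2, 800.0))  # ~40,000 km, close to the modern figure

The whole method rests on that one proportion: the shadow angle tells you what fraction of the full circle separates the two cities, and the rest is multiplication.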
Another example of this is the solving of the riddle of the make-up and consistency of Saturn’s rings by the preeminently gifted mathematician and father of electromagnetism, James Clerk Maxwell. His publications on the electromagnetic spectrum were the final word on the behavior of light in the 19th century, and would not be seriously called into question until the development of quantum mechanics – a question addressed in turn by Nobel laureate Richard Feynman, whose QED: The Strange Theory of Light and Matter is the final word – for now. These are practical philosophers. A practical philosopher is someone who is presented with a novel problem and uses their intuition and training to come up with new ways to solve it, such as the area under a curve – which Newton’s system could accurately calculate. These philosophers are the driving force behind the development of new technology and work in fields of application, where a hypothesis, such as one in chemistry about the combination of two elements, can be proposed in the morning and, by measurement, proved correct or incorrect by lunch.
Poetics and practical philosophy
The other philosophy is the philosophy Aristotle called Poetics. Poetics is the philosophy of the armchair, philosophy that evaluates moral or metaphysical issues not subject to measurement, such as the nature of good & evil – though science has looked into the inheritance of certain characteristics common to the amygdalae of criminals convicted of murder, a smaller and less attuned antenna that doesn’t pick up on the limits imposed by fear and rationality – but the nature of evil and how it may be dealt with is not something a natural philosopher can propose an equation to test. The poetic philosopher is more a teacher of different disciplines and a student of the classicists and the subjects of interest in their work: Socrates’ skepticism, for example, is an assault on the assumption of academia’s inherent rightness by association – with institution, school, or clan.
It is a serious question and a pressing one: what lends credibility to one person’s ideas and beliefs, and what leads to the dismissal of another person’s evaluation of facts? I guess we all like to think that somewhere in our stomach, or in somebody’s stomach, there is a right answer, dammit. If the competing ideas are those of two people of equal standing and repute, how is the conflict resolved?
Subjectivity in empiricism
Well, consider the following conflict and how revealing it is: you wake up in the middle of the night, let’s say it’s in November – as it is now – and you’re cold. You decide to get out of bed to turn up the thermostat in the living room. But when you get there, you find a friend or loved one at the dials. They’re burning up, sweat beading off their forehead; it’s too hot and they can’t get to sleep, so they’ve decided to turn on the air conditioning. The thermometer reads the same for both observers: 60 degrees.
There are different ways to approach this problem. Do you let your friend get some relief from the air conditioning for a while, make yourself some cocoa and get another blanket, or do you insist that it is cold, refuse to turn the heat down, and hold out for an expert’s opinion? The temperature is 60 degrees. Both sides agree. And yet, the problem remains.
Now enlarge the issue, put it in the hands of the public, and leave it to the news to relate it to the court of public opinion, with one media outlet playing prosecution and the other defense: one source sides with the cold woman, insisting that warmth is important and that going to sleep cold can lead to a cough and a runny nose. The competitor surveys an anonymous group of an undisclosed number of participants and claims that 84% of all news stories involving percentages are pulled from their collective assholes to push a story, and that 76% of News A readers believe it is more dangerous to be too hot than to be too cold, because dehydration can lead to hallucinations and delusions, even stroke and death. The reports take shape and independent outlets take sides. Newspapers and websites run special editorials on the importance of warmth and of cool air, opinion pieces explain why being hot is good for burning calories, and Dr. Oz endorses freezing as an excellent way of strengthening your immune system. The token religious-authority piece calls the measurement itself into question and suggests that it’s flawed.
So each source of information exists solely to reinforce one point of view or the other; each is objectively true to its own independent evaluator, and each is subjectively incorrect in describing the weather as it is felt by someone else. The point isn’t to convince someone that cold is really hot or that hot is cold. How do you decide what is in need of being decided, if anything? You start by humanizing the individuals, stressing pressures and stresses unique to them, their fear and desire – all very real, all very unrelated to getting objective truth about something that, by its very nature, admits no dual definition, nor any consensus definition. You give them faults, you tear their character apart, and you do it without facts – but with questions: the best way to manufacture news is to take a declarative sentence and add a question mark, disguising the subtle lie with the trappings of inquiry:
Allegations heating up! Could drug withdrawals be to blame for Cold Woman’s inexplicable coldness?
Could meth use explain inexplicable midnight suffering?
Blam! They’re no longer people experiencing normal human emotions; they’re talking points to be pulled from the shelf from time to time to make a point, only to be put away. Not only have they lost human dimension, they have become abstract pieces in a question that has become about something else. Whether the thermometer is accurate, what led to one person being hot and the other being cold – all this noise becomes a convoluted, incestuous echo chamber, and what is invariably lost in the details are the people most affected by it. One person is hot. One person is cold. And they’re waiting on a population of disaffected, disillusioned apostates of academia to settle the point of it all. To those who are neither hot nor cold, the best thing to do is decide what will benefit them the most in exchange for their professional opinion. Now imagine this conflict is something more serious, something involving, say, hydrogen bombs, and that instead of two people waiting on the Parakeet Jury’s verdict there are millions – waiting to be told whether it is hot or cold, a sensation they cannot feel one way or the other, and one that, even if they could, would not be settled by consensus.
The reason for this – and yes, dammit, there is one – is to establish a concept in fiction: this line of thinking, of unresolved, ultimate subjectivity, in which competing responses cannot both be simultaneously correct. It is best described as the lack of resolution in definition that comes from subjectivity and alternative perspectives, extending from the largest elements of a story to the smallest, and it could be called the unresolved discontinuity – a resolved clause without a resolved thread, competing beliefs that can be boiled down to multiple perspectives debating whether 60 degrees is hot or cold.
The presumption of expertise
Experts are similarly arbitrary, despite the hypnotism of pomp and gravitas. They are commonly people of repute and influence, demonstrating understanding and the recognition of practical application in a given field: someone who has demonstrated their understanding and survived the pressure of peer review, the sorting hat for new writers that separates the fuckin’ wheat from the chaff, I’ll tell you that shit right now. So, after the peer-review process, once they have the esteem of a university or publication, how do we accept such an argument without skepticism, if all ideas are made great only by bearing the brunt of the most vicious application of Occam’s Razor? We look to the foundation of what makes a structure reasonable, by degrees, and in putting that structure into a tangible form you can see how skepticism of legitimate expertise drives an entire industry of opinion professionals.
The presumption of expertise often comes with the backing of a prestigious organization or academic community. For each expert’s idea or philosophy, the wise response to something that requires reason and evidence is the rigorous application of skepticism, as the questioning of the obviously false is the beginning of a lifelong self-education, one that gives the newly minted scholar the keys to the pragmatic process of tearing down a proposed theory from the bottom up, starting with the assumptions that underlie the actual postulation. To test the strength of an idea, you test its foundation: the principles behind the methodology used to arrive at such a hypothesis, as was the case with Darwin, for example, and later Watson and Crick, all with theories whose evidence was presumed to be discoverable in nature (and has been).
Skepticism is the crucible to which all ideas and theories are submitted, and earlier attempts at a natural explanation for the diversity of animals on Earth had been shredded to pieces by the dispassionate teeth of doubt and inquiry. Anyone confident enough in their proposal to submit it to skepticism and inquiry participates in the perpetual renewal of knowledge in academia. If an idea doesn’t seem like it works, scrutinize it to the greatest extent necessary, and try not to impose or otherwise import an unrelated understanding and apply it to a novel problem. Don’t let the presumption of title or prestige ever shortcut your natural inclination to evaluate passionately, nor let someone short-circuit your critical faculties with attempts to annul criticism in advance, such as an idea’s built-in defenses against the claims made against it.
The application of practical philosophy is observation, deduction, the commonality or abundance of supporting evidence, prediction, and the ability to explain, before their discovery, how such discoveries will be measured – as with the completion of the periodic table before all of the elements now firmly nestled into it had even been found (or, in some instances, artificially created), or the intermediate forms between animal species – such as the ancestors linking the blue whale to its closest living relatives – that have been proposed and subsequently found.
Scholarship and philosophy
A scholar of literature or history can be as rhetorically gifted and thoughtful as a philosopher, and just as instructive; but, more than anything, an academic is the messenger, the intermediary between an artist or a subject of study and the student. A natural philosopher in the literary tradition looks at common elements of human nature as represented in fiction (or nonfiction) and acts as an intermediary in much the same way, a voice of some authority turning commentary into a peripheral, an adjunct to communal learning in popular culture. It is not made right by consensus, as truth is measured not by popular appeal but, ultimately, by historical favor: the community of opinion gives way, as the event drifts closer and closer to the edge of living memory – at its least vibrant and potent in our minds – to what is, at least in the public arena, the judgment of history.
Popular literary criticism and interpretation
And history has given us unique and, well, perplexing interpretations of the art and culture that have shaped human civilization. The thing about history, you see, is that consensus is hard to come by – as much for modern political issues as it has been forever – because humans have the stubborn capacity for trusting in the benevolence of learned people who may stand to profit by misleading them.
Whatever influence a persuasive philosopher may have over the discovery, that’s only half the battle: the connection between relevance and meaning in a work of fiction is as dependent upon the reader as upon the writer, and that is what makes literary criticism the non-exact science it is. As individuals we decide what a work of art means. As a culture we develop an interpretation, and the rest of the details emerge gradually.
There are very clever and astute people out there; people who understand subtlety, subtext, and thematic elements. You know the type, stuffy and pretentious academics who sniff their own socks when no one’s looking. Ahem. In my studies of writers and writing, I’ve noticed one thing that writers hate more than anything—with the possible exception of outright plagiarism—and that is popular, enduring misinterpretation.
Pareidolia is a phenomenon born of a misfire in a person’s natural capacity for pattern recognition. Pareidolia is a type of apophenia – we begin to see patterns and meaning in randomness; perhaps project would be a better way to describe how we can find a face on Mars. This is how pareidolia happens: constellations, the imagined forms we use to connect the stars and to shape from those connections recognizable human figures and patterns. Every constellation we see is apophenia. It’s fun to come up with theories and meanings to enrich our favorite stories, and usually no one gets hurt. Let me give you an innocent example, an example which appeals to the mid-’90s child we all are. Ahem.
It is a popular theory among the gamers of my generation that The Legend of Zelda: Majora’s Mask is about the five stages of grief. It could be an example of pareidolia, yet it’s understandable. It’s an interesting look at the storytelling techniques of a uniquely modern medium. It’s harmless and has no social ramifications. But sometimes, when a full moon is out, misinterpretation can lead to terrible, terrible things indeed.
Submitted for your approval, exhibit A: the popular Beatles song Helter Skelter had, upon release, no definitive or band-confirmed meaning, and therefore what it meant was largely dependent on what we brought to it as listeners, the final verdict being the decision made by the beholder, by the interpreter. That’s one of the greater qualities of music, as it is, more so than many forms of art, an intentionally subjective consideration. For people who are not cult leaders on LSD who think they’re Jesus, like Charles Manson, it doesn’t prophesy a coming race war. Yet that interpretation culminated in the Tate-LaBianca murders in California, in August of ’69.
Exhibit B: The Catcher in the Rye is a famous coming-of-age novel by J.D. Salinger. The tale, told in the first person, is recounted by an angry, unreliable narrator named Holden Caulfield. He goes to bars, talks to hookers, and rants about posers. For reasons unknown, there have been many murders and crimes related to The Catcher in the Rye—so much so that the bewilderment surrounding this bizarre phenomenon has its own Wikipedia page—the true measure of cultural significance.
In fact—this should be noted—Mark Chapman, John Lennon’s murderer, was arrested with The Catcher in the Rye in his hands. He claimed the text of the book would serve the law in determining his reasons for the crime. John Lennon, who performed Helter Skelter, which, because of misinterpretation, led to the Tate-LaBianca murders, was murdered by Mark Chapman, himself operating under a misinterpretation of a work of art.
Objectivity in practice
Objective meaning in literature is rare, too, and this leads to less objectively true interpretations. Most of the time an interpretation is based on coincidence and correlation. A good example is reading J.R.R. Tolkien’s Lord of the Rings as an allegory for World War II. But couldn’t it be argued that the various races in The Lord of the Rings are intended to represent the different religions of the world? The masked men with the Arab chic were Arabic; the Dwarves were Hebrew, and Tolkien imported perceived characteristics (racism) to give their people a desire for hoarding gold (racism), and yet they are the stand-in for Judaism. The men could be said to be those who turned away from God, or Illuvatar – agnostics and atheists; but where, pray, are the Christians? Well, the Hobbits are obviously linked to paganism, living off the land, being in love with all the things that grow, their paganism tinged with passive pacifism, being (for the most part) content not to meddle in the affairs of men. And the elves – the wise, the immortal, the most beautiful race? They could be Tolkien’s stand-in for Christianity. Gandalf and Frodo and Bilbo get to go to the Elvish heaven with them, in the end, leaving Sam’s pagan ass at home, having to deal with Rosie.
Now, that might seem ridiculous. And it’s understandable: a parallel can be drawn between the one ring and the one race. What we see depends on the eyes we have and on what our knowledge allows us to see. If this were the work of armchair philosophers alone it wouldn’t have the same lasting, negative effect; but sometimes the very people upon whom we rely to instruct us in proper interpretation and textual criticism fail us, and the generations to follow, cultivating a culture of misunderstanding.
I was in high school when I first read Ray Bradbury’s Fahrenheit 451. To me it seemed to be the definitive manifesto for the right to freedom of speech and expression. It is interesting to note that the book was first serialized in Playboy magazine. Suck it, censors! A then spry, youthful Hugh Hefner managed to secure the serial rights to Fahrenheit 451 during an important period in American culture, a period in which real censorship was on its last leg, made of steel and ivory though it was. Books now considered classics, such as Ulysses, had been put on trial for obscenity, as Naked Lunch would be. The importance can be compared to the making of Casablanca during the actual Second World War. The book burning in Fahrenheit 451 is as iconic to English-speaking readers as are the Thought Police and Big Brother from Orwell’s Nineteen Eighty-Four. It’s not even subtle.
It’s obviously about censorship. Right, right? Right, right!
What’s it going to be then, eh?
In Bradbury’s kooky, imagined future, America has outlawed books and freedom of the press; free thought and intellectualism are treated the way Pol Pot treated them in Cambodia. Censorship is a political mob-process and it’s treated like business as usual. The people are partially to blame for this, for their blasé reaction to the suppression of basic human rights and dignity, and they, through this process, forget what it is to be free. They forget what it is to be human.
Fahrenheit 451 seemed so obvious when I first read it, obvious to the point of insult, I thought, to the reader’s intelligence. Even my English teacher Mrs. [Willnotbesued] believed in and espoused this traditional interpretation: an obvious allegory, a philosophy intended to presage the day-to-day realities of a world where the individual is defeated by state-sponsored censorship. The title is even a reference to book burning! That’s classic censorship! Right? Right?! F@#%!
As Sherlock Holmes said in The Boscombe Valley Mystery:
“There is nothing more deceptive than an obvious fact.”
So I had it wrong, as did my teacher, and, because of her, the entire class. I would later learn that nearly everyone, except the author, had it wrong. That particular interpretation is incorrect – explicitly so, according to the author. And when he attempted to set the record straight while lecturing at UCLA, Ray Bradbury was told by students that he was mistaken; the author’s interpretation of his own material was wrong. (This is not impossible.) So what did the author do that fateful day at UCLA? He was proper pissed and walked out. So what did Ray Bradbury think his book was about? Television. Television and the ancient evil from whence it came (cathode ray tubes, it would seem).
In reality Bradbury was more concerned with literature having to compete with television for primacy in the war for the imagination of the world. I guess that makes Fahrenheit 451 less Nineteen Eighty-Four and more Video Killed the Radio Star. You see, Ray believed that television would somehow lead to the shortening of the attention span. He thought that complex social and economic issues would be compressed for time and later used by powerful conglomerates to spin the truth to their benefit through mass manipulation and … oh, wait.
But none of that is as absurd as the trial of the German philosopher and noted mustachio Friedrich Nietzsche, whose book Also Sprach Zarathustra (Thus Spake Zarathustra, ‘Zarathustra’ being the German rendering of the Persian prophet the Greeks called Zoroaster) was initially published in separate, individual installments over the course of a couple of years, between 1883 and 1885, and was only collected into a single volume in 1887.
The fourth part was never published on Nietzsche’s own terms; in its first, private run, forty copies were printed, not counting copies set aside by the author for friends and family. Since then the most common version has been the portable collection; the fourth segment, the now-annotated Intermezzo, which was unfinished, was eventually published, had a limited commercial run, and went out of print.
Nietzsche died before he could finish the Intermezzo or the rearrangement into a singular work, and his sister thought: if only there were Nazi overtones in the book. And it was so.
Despite the application of allegory and its ambiguities, it is my hope, and the hope of every writer, save perhaps James Joyce, to be understood. But to make it more interesting for the reader there is ambiguity and subtlety, and authors rely on their readers to join them, to be their partners in creation. As the American abstract artist Mark Rothko once said in reference to his art:
A picture lives by companionship, expanding and quickening in the eyes of the sensitive observer. It dies by the same token. It is therefore a risky and unfeeling act to send it out into the world.
Rothko, 1956, committing Art.