From The November Letters: Academia, philosophy, and subjectivity

1

Philosophy & academia

Despite my education or what my writing may suggest, I am not a philosopher. I have more in common with the prolific serial writers of Astounding Science Fiction than I do with traditional philosophers like Immanuel Kant or René Descartes. I feel I must point this out, as there is, I think, a difference between a philosopher, a scholar, and a modern academic.

A philosopher is someone who, historically, works outside of science – though science has since been extended to include topics once more at home in metaphysics – and treats subjects that are, in their time, unanswerable through measurement or through any device that could yield data. In the absence of empirical evidence to suggest one thing or another, philosophy approaches such problems from varying schools of thought and disciplines.

In the West, Classical philosophy goes back to Plato, in his account of the trial of Socrates in the Apology and in dialogues such as The Republic; later, Aristotle would become the most prolific, if most problematic, contributor to academia. We have Aristotle to thank for the prevalent assumption that there are five senses (taste, touch, hearing, vision, smell) – an assumption thought of as common sense, when sense is rarely common and what is sensible is rarely common. Most people have a sense of time, a sense of balance, and – so it is sometimes claimed – an innate awareness of knowing when we're being watched. A sense of shame, some of us. Pride, paranoia, power, seeing dead people (who don't know they're dead).

But Aristotle’s practicality and rigidity were necessary for the later split between moral absolutists and moral relativists. There was another popular field of study for Greek philosophers, however: the wisdom of Archimedes’ On Sphere-Making and the propositions of Euclid, which include the working model for geometry as we know it. There were more materialist thinkers, like Democritus and Eratosthenes – philosophers of pragmatic application. René Descartes, for example, invented a system of coordinates (Cartesian coordinates) that is still used in city planning. These are natural philosophers, academics whose insights into one field bring about the creation of others – such as Michael Faraday, a largely self-taught visionary, and Charles Darwin, a pigeon breeder and naturalist whose studies aboard the HMS Beagle would lead him to publish On the Origin of Species in 1859, introducing the world to evolution through natural selection.

Isaac Newton, professor of mathematics at Cambridge, created calculus (which he called the method of fluxions; it has been contested that his contemporary Leibniz – himself a philosopher – arrived at it first, with publications predating Newton’s Principia Mathematica). The foundation of his laws of motion, F = ma, and the outline of the theory of gravitation – many of these proposals would not be corrected until Einstein’s general theory of relativity redefined gravitation and extended it to include the behavior of time.

What is a philosopher?

A philosopher in science is someone who – let’s use the ancient Greek mathematician Eratosthenes as an example – applies one skill set to solve a problem in another field. Take the circumference of the Earth, which Eratosthenes derived by comparing the shadows cast at noon in two cities a known distance apart: in one, the sun stood directly overhead a designated shadow-caster (an obelisk); in the other, it cast a shadow at a measurable angle. By using math and calculation, the size of the Earth was correctly (within reasonable approximation) derived from the observation of shadows and sticks.

Another example of this is the solving of the riddle behind the make-up and consistency of Saturn’s rings by the preeminently gifted mathematician and father of electromagnetism, James Clerk Maxwell. His publications on the electromagnetic spectrum were the final word on the behavior of light in the 19th century, and would not be seriously called into question until the development of quantum mechanics – whose own account of light was resolved by Nobel laureate Richard Feynman, whose QED: The Strange Theory of Light and Matter is the final word, for now. These are practical philosophers. A practical philosopher is someone who, presented with a novel problem, uses intuition and training to come up with new ways to solve it – such as the area under a curve, which Newton’s system could accurately calculate. These philosophers are the driving force behind the development of new technology and work in fields of application, where a hypothesis – say, one in chemistry about the combination of two elements – can be proposed in the morning and, by measurement, proved correct or incorrect by lunch.

Poetics and practical philosophy

The other philosophy is the philosophy Aristotle called Poetics. Poetics is the philosophy of the armchair, philosophy that evaluates moral or metaphysical issues not subject to measurement, such as the nature of good & evil. (Science has looked into inherited characteristics common to the amygdalae of criminals convicted of murder – a smaller, less attuned antenna that doesn’t pick up on the limits imposed by fear and rationality – but the nature of evil and how it may be dealt with is not something a natural philosopher can propose an equation to test.) The poetic philosopher is more of a teacher of different disciplines and a student of the classicists and the subjects of interest in their work: Socrates’ skepticism, for example, is an assault on the assumption of academia’s inherent rightness by association – with institution, school, or clan.

It is a serious and pressing question: what lends credibility to one person’s ideas and beliefs, and what leads to the dismissal of another person’s evaluation of the facts? I guess we all like to think that somewhere in our stomach, or in somebody’s stomach, there is a right answer, dammit. If the competing ideas are those of two people of equal standing and repute, how is the conflict resolved?

Subjectivity in empiricism

Well, consider the following conflict and how revealing it is: you wake up in the middle of the night – let’s say it’s in November, as it is now – and you’re cold. You decide to get out of bed to turn up the thermostat in the living room. But when you get there, you find a friend or loved one at the dials. They’re burning up, sweat beading off their forehead; it’s too hot and they can’t get to sleep, so they’ve decided to turn on the air conditioning. The thermometer reads the same for both observers: 60 degrees.

There are different ways to approach this problem. Do you let your friend get some relief from the air conditioning for a while, make yourself some cocoa and get another blanket, or do you insist that it is cold, refuse to turn the heat down, and hold out for an expert’s opinion? The temperature is 60 degrees. Both sides agree. And yet, the problem remains.

Now enlarge the issue, put it in the hands of the public, and leave it to the news to relate it to the court of public opinion, with one media outlet playing prosecution and the other defense. One source sides with the cold woman, insisting that warmth is important and that going to sleep cold can lead to a cough and a runny nose. The competitor surveys an anonymous group of an undisclosed number of participants and claims that 84% of all news stories involving percentages are pulled from their collective assholes to push a story, and that 76% of News A readers believe it is more dangerous to be too hot than too cold, because dehydration can lead to hallucinations and delusions, even stroke and death. The reports take shape and independent outlets take sides. Newspapers and websites run special editorials on the importance of warmth and cool air, and opinion pieces on why being hot is good for burning calories; Dr. Oz endorses freezing as an excellent way of strengthening your immune system. The token religious-authority piece calls the measurement itself into question and suggests that it’s flawed.

So each source of information exists solely to reinforce one point of view or the other; both are objectively true to each independent evaluator, and yet both are subjectively incorrect in describing the weather as felt by someone else. The point isn’t to convince someone that cold is really hot or that hot is cold. How do you decide what is in need of being decided, if anything? You start by humanizing the individuals, stressing pressures and stresses unique to them, their fear and desire – all very real, all very unrelated to getting at objective truth on something that is, by its very nature, incapable of a dual or consensus definition. You give them faults, you tear their character apart, and you do it without facts – but with questions. The best way to manufacture news is to take a declarative sentence and add a question mark, disguising the subtle lie with the trappings of inquiry:

Allegations heating up! Could drug withdrawals be to blame for Cold Woman’s inexplicable coldness?

Could meth use explain inexplicable midnight suffering?

Blam! They’re no longer people experiencing normal human emotions; they’re talking points to be pulled from the shelf from time to time to make a point, only to be put away again. Not only have they lost human dimension, but they have become abstract pieces in a question that is now about something else: whether the thermometer is accurate, what led to one person being hot or the other being cold. All this noise becomes a convoluted, incestuous echo chamber, and what is invariably lost in the details are the people most affected by it. One person is hot. One person is cold. And they’re waiting on a population of disaffected, disillusioned apostates of academia to settle the point of it all. To those neither hot nor cold, the best thing to do is decide what will benefit them the most in rendering their professional opinion. Now imagine this conflict is something more serious, something involving, say, hydrogen bombs, and instead of two people waiting on the Parakeet Jury’s verdict there are millions – waiting to be told whether it is hot or cold, a sensation they cannot feel one way or the other, and which, if they could feel it, would not be settled by consensus.

The reason for this – and yes, dammit, there is one – is to establish a concept in fiction: this line of thinking, of unresolved, ultimate subjectivity, in which competing responses cannot be simultaneously correct. It is best described as the lack of resolution in definition based on subjectivity and alternative perspectives, which extends from the largest elements of a story to the smallest, and could be called the unresolved discontinuity – a resolved clause without a resolved thread, born of competing beliefs that can be boiled down to multiple perspectives debating whether 60 degrees is hot or cold.

The presumption of expertise

Experts are similarly arbitrary, despite the hypnotism of pomp and gravitas, and are commonly those of repute and influence who have demonstrated understanding and the recognition of practical application in a given field: someone who has survived the pressure of peer review, the sorting hat for new writers that separates the fuckin’ wheat from the chaff, I’ll tell you that shit right now. So, after the peer-review process, once they have the esteem of a university or publication, how do we accept such an argument without skepticism, if all ideas are made great only by bearing the brunt of the most vicious application of Occam’s Razor? We look to the foundation of what makes a structure reasonable by degree; by putting the structure into a tangible format, you can see how skepticism of legitimate expertise drives an industry of opinion professionals.

The presumption of expertise often comes with the backing of a prestigious organization or academic community. For each expert’s idea or philosophy, the wise response to something that requires reason and evidence is the rigorous application of skepticism, as the questioning of the obviously false is the beginning of a lifelong self-education – one that gives the newly minted scholar the keys to the pragmatic process of tearing down a proposed theory from the bottom up, starting at the assumptions that underlie the actual postulation. To test the strength of an idea, you test its foundation: the principles behind the methodology used to arrive at the hypothesis, as was the case with Darwin, for example, and later Watson and Crick – all theories whose evidence was presumed to be discoverable in nature (and since has been).

Skepticism is the crucible through which all ideas and theories must pass, and earlier attempts at a natural explanation for the diversity of animals on Earth had been shredded to pieces by the dispassionate teeth of doubt and inquiry. Anyone confident enough in their proposal to submit it to skepticism and inquiry participates in the perpetual renewal of knowledge in academia. If an idea doesn’t seem like it works, scrutinize it to the greatest extent necessary, and try not to impose or otherwise import an unrelated understanding onto a novel problem. Don’t let the presumption of title or prestige ever shortcut your natural inclination to evaluate critically, nor let someone short-circuit your critical faculties by annulling criticism in advance – such as an idea’s built-in defenses against claims made against it.

The application of practical philosophy is observation, deduction, the commonality or abundance of supporting themes, prediction, and the ability to explain, before their discovery, how such discoveries will be measured – as with the completion of the periodic table before all of the elements now firmly nestled into it were even found (or, in some instances, artificially created), and with intermediate forms between species of animal – such as the ancestors linking the blue whale to its land-dwelling relatives – which have been proposed and subsequently found.

Scholarship and philosophy

A scholar, of literature or history, can be as rhetorically gifted and thoughtful as a philosopher, and just as instructive; but, more than anything, an academic is the messenger, the intermediary between an artist or a subject of study and the student. A natural philosopher in the literary tradition looks at common elements of human nature as represented in fiction (or nonfiction) and acts as an intermediary in much the same way: a voice of some authority turning commentary into a peripheral, an adjunct to communal learning in popular culture. It is not made right by consensus, as truth is measured not by popular appeal but, ultimately, by historical favor: the community of opinion will, as the event drifts closer and closer to the edge of living memory, at its least vibrant and potent in our minds, give way, at least in the public arena, to the judgment of history.

Popular literary criticism and interpretation

And history has given us unique and, well, perplexing interpretations of the art and culture that have shaped human civilization. The thing about history, you see, is that consensus is hard to come by – as much for modern political issues as it has been forever – because humans have the stubborn capacity of trusting in the benevolence of those of learning who may stand to profit by misleading them.

Whatever influence a persuasive philosopher may have over the discovery, that’s only half the battle: the connection between relevance and meaning in a work of fiction is as dependent upon the reader as the writer, and that is what makes literary criticism the non-exact science it is. As individuals we decide what a work of art means. As a culture we develop an interpretation, and the rest of the details emerge gradually.

There are very clever and astute people out there; people who understand subtlety, subtext, and thematic elements. You know the type: stuffy and pretentious academics who sniff their own socks when no one’s looking. Ahem. In my studies of writers and writing, I’ve noticed one thing that writers hate more than anything – with the possible exception of outright plagiarism – and that is popular, enduring misinterpretation.

Pareidolia is a phenomenon born of the overreach of our natural capacity for pattern recognition. It is a type of apophenia: we begin to see patterns and meaning in randomness – perhaps project would be a better way to describe how we can find a face on Mars. This is how pareidolia happens: constellations, the imagined forms we use to connect the stars, forming from those connections recognizable human shapes and patterns. Every constellation we see is apophenia. It’s fun to come up with theories and meanings to enrich our favorite stories, and usually no one gets hurt. Let me give you an innocent example, an example which appeals to the mid-’90s child we all are. Ahem.

It is a popular theory among the gamers of my generation that The Legend of Zelda: Majora’s Mask is about the five stages of grief. It could be an example of pareidolia, yet it’s understandable. It’s an interesting look at the storytelling techniques of a uniquely modern medium. It’s harmless and has no social ramifications. But sometimes, when a full moon is out, misinterpretation can lead to terrible, terrible things indeed.

Submitted for your approval, exhibit A: the popular Beatles song Helter Skelter had, upon release, no definitive or band-confirmed meaning, and therefore what it meant was largely dependent on what we brought to it as listeners – the final verdict being the decision made by the beholder, by the interpreter. That’s one of the greater qualities of music, as it is, more so than many forms of art, an intentionally subjective consideration. For people who are not cult leaders on LSD who think they’re Jesus, like Charles Manson, it doesn’t prophesy a coming race war. Yet that interpretation culminated in the Tate-LaBianca murders in California, August of ’69.

Exhibit B: The Catcher in the Rye is a famous coming-of-age novel by J.D. Salinger. The tale, told in the first person, is recounted by an angry, unreliable narrator named Holden Caulfield. He goes to bars, talks to hookers, and rants about phonies. For reasons unknown, there have been many murders and crimes related to The Catcher in the Rye – so much so that the bewilderment surrounding this bizarre phenomenon has its own Wikipedia page, the true measure of cultural significance.

In fact – and this should be noted – Mark David Chapman, John Lennon’s murderer, was arrested with The Catcher in the Rye in his hands. He claimed the text of the book would serve the law in determining his reasons for the crime. John Lennon, whose band recorded Helter Skelter – which, because of misinterpretation, led to the Tate-LaBianca murders – was murdered by Mark Chapman, himself operating under a misinterpretation of a work of art.

Objectivity in practice

Objective meaning in literature, too, is rare, and this leads to less objectively true interpretations. Most of the time the interpretation is based on coincidence and correlation. A good example is reading J.R.R. Tolkien’s The Lord of the Rings as an allegory for World War II. But it could be argued that the various races in The Lord of the Rings are intended to represent the different religions of the world. The masked men with the Arab chic were Arab; the Dwarves were Hebrew, and Tolkien imported perceived characteristics (racism) to give their people a desire for hoarding gold (racism), making them the stand-in for Judaism. The men could be said to be those who turned away from God, or Ilúvatar – agnostics and atheists; but where, pray, are the Christians? Well, the Hobbits are obviously linked to paganism, living off the land, being in love with all the things that grow, their paganism tinged with a passive pacifism, being (for the most part) content not to meddle in the affairs of men. And the elves – the wise, the immortal, the most beautiful race? They could be Tolkien’s stand-in for Christianity. Gandalf and Frodo and Bilbo get to go to the Elvish heaven with them, in the end, leaving Sam’s pagan ass at home, having to deal with Rosie.

Now, that might seem ridiculous, and it’s understandable: a parallel can be drawn between the one ring and the one race. What we see depends on the eyes we have and what our knowledge allows us to see. If this were only the work of armchair philosophers, it wouldn’t have the same lasting, negative effect; but sometimes the very people upon whom we rely to instruct us in proper interpretation and textual criticism fail us and the generations to follow, cultivating a culture of misunderstanding.

I was in high school when I first read Ray Bradbury’s Fahrenheit 451. To me it seemed to be the definitive manifesto for the right to freedom of speech and expression. It is interesting to note that the book was first serialized in Playboy magazine. Suck it, censors! A then spry, youthful Hugh Hefner managed to secure the serial rights to Fahrenheit 451 during an important period in American culture, a period in which real censorship was on its last leg – made of steel and ivory though it was. Books now considered classics, such as Ulysses and Naked Lunch, had been put on trial for obscenity. The importance can be compared to the making of Casablanca during the actual Second World War. The book burning in Fahrenheit 451 is as iconic to English-speaking readers as are the thought police and Big Brother from Orwell’s Nineteen Eighty-Four. It’s not even subtle.

It’s obviously about censorship. Right right? Right, right!

What’s it going to be then, eh? 

In Bradbury’s kooky, imagined future, America has outlawed books and freedom of the press; free thought and intellectualism are treated the way Pol Pot treated them in Cambodia. Censorship is a political mob-process, and it’s treated like business as usual. The people are partially to blame for this, for their blasé reaction to the suppression of basic human rights and dignity; through this process, they forget what it is to be free. They forget what it is to be human.

Fahrenheit 451 seemed so obvious when I first read it – obvious to the point of insult, I thought, to the reader’s intelligence. Even my English teacher, Mrs. [Willnotbesued], believed in and espoused this traditional interpretation: an obvious allegory, a philosophy intended to presage the day-to-day realities of a world where the individual is defeated by state-sponsored censorship. The title is even a reference to book burning! That’s classic censorship! Right? Right?! F@#%!

As Sherlock Holmes said in The Boscombe Valley Mystery:

“There is nothing more deceptive than an obvious fact.”

So I had it wrong, as did my teacher and, because of her, the entire class. I would later learn that nearly everyone, except the author, had it wrong. That particular interpretation is incorrect, and explicitly so. When he attempted to set the record straight while lecturing at UCLA, Ray Bradbury was told by students that he was mistaken – that the author’s interpretation of his own material was wrong. (This is not impossible.) So what did the author do that fateful day at UCLA? He was proper pissed and walked out. So what did Ray Bradbury think his book was about? Television. Television and the ancient evil from whence it came (cathode ray tubes, it would seem).

In reality Bradbury was more concerned with literature having to compete with television for primacy in the war for the imagination of the world. I guess that makes Fahrenheit 451 less Nineteen Eighty-Four and more Video Killed the Radio Star. You see, Ray believed that television would somehow lead to the shortening of the attention span. He thought that complex social and economic issues would be compressed for time and later used by powerful conglomerates to spin the truth to their benefit through mass manipulation and … oh, wait.

But none of that is as absurd as the trial of the German philosopher and noted mustachio Friedrich Nietzsche, whose book Also Sprach Zarathustra (Thus Spake Zarathustra, ‘Zarathustra’ being the German rendering of the Persian Zoroaster) was initially published in separate, individual installments over the course of a couple of years, between 1883 and 1885, and was only published in a single volume in 1887.

At the time of his death, the fourth part remained commercially unpublished. In its first run, forty copies were printed, not counting copies set aside by the author for friends and family. Since then, the most common version has been the portable single-volume collection; the fourth segment, the now-annotated Intermezzo, which was unfinished, was eventually published, had a limited commercial run, and went out of print.

Nietzsche died before he could finish the Intermezzo and the rearrangement into a singular work, and his sister thought: if only there were Nazi overtones in the book. And it was so.

Despite the application of allegory and its ambiguities, it is my hope, and the hope of every writer, save perhaps James Joyce, to be understood. But to make it more interesting for the reader, there is ambiguity, there is subtlety, and authors rely on their readers to join them, to be their partners in creation. As the American abstract artist Mark Rothko once said in reference to his art:

A picture lives by companionship, expanding and quickening in the eyes of the sensitive observer. It dies by the same token. It is therefore a risky, and unfeeling act, to send it out into the world.

Rothko, 1956, committing Art.


Religion, Freedom, Fear & Panic (George Orwell) – 17 March 2016

THE ART OF OPPRESSION

AS MUCH AS ART AND LANGUAGE HAVE ENRICHED our lives and culture, they can be used as means of personal advancement or attainment, and have been used as tools to subdue and keep mute an illiterate public. As the best literature and music can be liberating, there is a darker side to this, something more nefarious. George Orwell’s nightmarish vision of a future where literature does not set free the soul was as fantastical as it was grounded. Because, despite seeing its absurdity, we saw echoes of Orwell’s themes, if but vaguely, in our own lives – Big Brother is the eye that watches and judges, a figure that enforces law against thinking the wrong way. Winston gets sent to the worst hotel room in history outside of a Holiday Inn, Room 101.

Big Brother is Watching You – everyone is familiar with the popular phrase from Orwell’s most popular work, Nineteen Eighty-Four, but its interpretation has often been limited to the political. If you cast Big Brother as an abstract and put the entirety of creation under his charge, what do you get? An all-seeing figure who’s always watching, always careful to ensure the rules laid forth are observed, and waiting to punish if a tenet of the Law is broken. Big Brother, as an abstract, is more than a satire of the cult of personality which, like boy bands and iPhone releases, always seems to spring up despite all sensible people knowing how objectively terrible they are.

Nineteen Eighty-Four appeals to the same sensibility to which ‘God is watching over us’ appeals. Except, by inverting the All-Seeing Eye, by showing us the perversion of thoughtcrime, the crime of love, the arbitrary torture, it’s easy to see Orwell is telling a story on two different thematic levels, beginning with the microcosm: the singular Big Brother and the singular idea such a figure represents. And there’s a perversion of this, and it’s happening now, right in the modern, progressive world – though not through the silent, watchful judgment of one centralized authority figure: we’ve cast ourselves as indignant, flattered voyeurs in the drama of the watchful, attentive eye presiding over the most mundane of our activities, whether it is a friend who follows you on Twitter.

John Taylor Gatto’s The Seven-Lesson Schoolteacher takes a different approach to handling an authoritarian edifice, and his lessons are the bricks in the edifice of the mind’s sometimes voluntary enslavement. It is a poignant testament to the quality of individuality and a warning against subscribing to a belief system structured to control. When the information provided comes from the same body enacting the law, it is, no matter the brand of information – literature, media, radio – designed to control by fear and to recruit by a promise such lawgivers are unable to keep.

In The Seven-Lesson Schoolteacher, Gatto shows us what Dostoevsky, in Notes from Underground, called the ‘edifice of glass.’ Gatto shows the reality of totalitarianism in a distorted yet eerily similar America. To paraphrase: a centralized order must not be questioned. No possible objections, logical, sensible, or otherwise, are to be taken seriously, and those who make such objections do so to their disadvantage.

Mr. Gatto, as he wished to be called, was a schoolteacher who taught for twenty-six years, winning many awards in the process. He outlines a hidden curriculum that is unconsciously transmitted to every student in every school in the United States. These rules aren’t acknowledged, written, or made apparent but, as Mr. Gatto suggests, this is the only way students can be turned into functioning members of society – as he sees it.

What does it mean to function in a society if one has to be manipulated as a child to be able to do so? The seven universal lessons perpetuate what has done more to harm people throughout history than to help the select few it serves, and could be interpreted as a list of the pros of making war upon your own government – as Shakespeare famously questioned in his treatment of the title character in Richard II: is it ever right to overthrow a monarchy? When it is necessary for the following traits to be drilled into children in order to keep them in check, it most certainly is. I fall into another category on this position, which Leon Trotsky expressed so well in Literature and Revolution:

‘Mechanical centralism is necessarily complemented by factionalism, which is at once a malicious caricature of democracy and a potential political danger.’

Mr. Gatto’s entire structure is built on factionalism. Some of his seven universal lessons are meant to strengthen factions, to invite membership and conformity; others are intended to keep out the ‘unworthy’ – those for whom the rest of the rules were written. The seven universal rules are: confusion, class position, indifference, emotional dependency, intellectual dependency, provisional self-esteem, and an admonition against anyone who notices the slavery of a system that confuses intentionally, gives to the one side it created for itself, and addicts the rest to scraps – because class position can only exist in a society confused and emotionally dependent. You can’t hide. Big Brother is watching you. Take your soma and fall in line: this is the literature of enslavement. And the author of this material is a real man who really believes in these universal ‘laws’ of education.

Students are often taught a barrage of information, none of which is important to their lives, intended to work as an assembly line toward an end, a goal: to college, to graduate school, and finally to a job. This sort of cynical approach by a lifelong teacher is disheartening – disheartening not because of one man’s belief, but because those who rally behind his ideas of slavery are highly influential. Behind all the useless information lies the intended goal of this system: that centralized element abhorrent to Trotsky, an element that might have made Shakespeare rethink his ideas about overthrowing a monarch.

The central command structure of knowledge reaches into the deep past of Western philosophy. It’s in Plato’s The Republic, St. Augustine’s City of God, even Leviathan by Thomas Hobbes. Although it wasn’t published in his lifetime, Hobbes’ much better work, Behemoth, was forbidden by a king – a king who probably would’ve endorsed it, had he read it. Satires like Orwell’s Nineteen Eighty-Four and Huxley’s Brave New World were considered, in their time, to be ridiculous. These were not instant classics. And the writing of Nineteen Eighty-Four nearly killed George Orwell; this brings us to what gave the English their first clear vision of totalitarianism.

AN HOMAGE TO ORWELL – On the Cult of Personality and Altar of Fear

BEFORE A SOCIAL AND BIOGRAPHICAL ANALYSIS of Orwell the man, writer of Nineteen Eighty-Four and Animal Farm, I would first like to say that I believe he was at his best in his non-fiction account of the Spanish Civil War—Homage to Catalonia.

“It was a bright cold day in April, and the clocks were striking thirteen.”

Sixty years after the publication of Orwell’s most widely cited and read work, Nineteen Eighty-Four, that crystal first line sounds as natural and compelling as ever. But when you see the original manuscript, you find something else: not so much the ringing clarity, more the obsessive rewriting, in different inks, that betrays the extraordinary turmoil behind its composition.

Probably the definitive dystopian novel of the 20th century, a story that remains eternally fresh and contemporary, and whose terms ‘Big Brother,’ ‘doublethink,’ and ‘newspeak’ have become part of the everyday currency of the English lexicon, Nineteen Eighty-Four has been translated into more than 65 languages and sold millions of copies worldwide, giving George Orwell a unique place in world literature.

The circumstances surrounding the writing of Nineteen Eighty-Four make a haunting narrative that helps to explain the bleakness of Orwell’s dystopia. Here was an English writer, desperately sick, grappling alone with the demons of his imagination in a bleak Scottish outpost in the desolate aftermath of the Second World War. The idea for Nineteen Eighty-Four, alternatively, The Last Man in Europe, had been incubating in Orwell’s mind since the Spanish civil war.

His novel, which owes something to Yevgeny Zamyatin’s dystopian fiction We, probably began to take a definitive shape during 1943-44, around the time he and his wife Eileen adopted their only son, Richard. Orwell himself claimed that he was partly inspired by the meeting of the Allied leaders at the Tehran Conference of 1944. Isaac Deutscher, an Observer colleague, reported that Orwell was “convinced that Stalin, Churchill and Roosevelt consciously plotted to divide the world” at Tehran.

Orwell had worked for David Astor’s Observer since 1942, first as a book reviewer and later as a correspondent. The editor professed great admiration for Orwell’s “absolute straightforwardness, his honesty and his decency,” and would be his patron throughout the 1940s. The closeness of their friendship is crucial to the story of Nineteen Eighty-Four.

Orwell’s creative life had already benefited from his association with the Observer in the writing of Animal Farm. As the war drew to a close, the fruitful interaction of fiction and Sunday journalism would contribute to the much darker and more complex novel he had in mind after that celebrated ‘fairy tale.’ It’s clear from Observer book reviews, for example, that he was fascinated by the relationship between morality and language.

There were other influences at work. The atmosphere of random terror in the everyday life of wartime London became integral to the mood of the novel-in-progress. Worse was to follow. In March 1945, while on assignment for the Observer in Europe, Orwell received news that his wife, Eileen, had died under anesthesia during a routine operation.

Suddenly he was a widower and a single parent, eking out a threadbare life in his Islington lodgings, and working incessantly to dam the flood of remorse and grief at his wife’s premature death. In 1945, for instance, he wrote almost 110,000 words for various publications, including 15 book reviews for the Observer.

Then Astor stepped in. His family owned an estate on the remote Scottish island of Jura, next to Islay. There was a house, Barnhill, seven miles outside Ardlussa at the remote northern tip of this rocky finger of heather in the Inner Hebrides.

Initially, Astor offered it to Orwell for a holiday. Speaking to the Observer last week, Richard Blair says he believes, from family legend, Astor was taken aback by the enthusiasm of Orwell’s response.

In May 1946 Orwell, still picking up the shattered pieces of his life, took the train for the long and arduous journey to Jura. He told his friend Arthur Koestler that it was ‘almost like stocking up ship for an arctic voyage.’

It was a risky move; Orwell was not in good health. The winter of 1946-47 was one of the coldest of the century. Postwar Britain was bleak and Orwell always suffered from chest pains and other anxiety-related ailments. At least, cut off from the irritations of literary London, he was free to grapple unencumbered with the new novel. ‘Smothered under journalism,’ he told one friend, ‘I have become more and more like a sucked orange.’

Ironically, part of Orwell’s difficulties derived from the success of Animal Farm. After years of neglect and indifference the world was waking up to his genius. ‘Everyone keeps coming at me,’ he complained to Koestler, ‘wanting me to lecture, to write commissioned booklets, to join this and that, etc.–you don’t know how I pine to be free of it all and have time to think again.’

On Jura he would be liberated from these distractions. The promise of creative freedom on an island in the Hebrides, however, came with its own, unique price. Years before, in the essay Why I Write, he described the struggle to complete a book: ‘Writing a book is a horrible, exhausting struggle, like a long bout of some painful illness. One would never undertake such a thing if one were not driven by some demon whom one can neither resist or [sic] understand. For all one knows that demon is the same instinct that makes a baby squall for attention. And yet it is also true that one can write nothing readable unless one constantly struggles to efface one’s personality.’ It ends with the popular adage: ‘Good prose is like a window pane.’

From the spring of 1947 to his death in 1950 Orwell would re-enact every aspect of this struggle in the most painful way imaginable. Privately, perhaps, he relished the overlap between theory and practice. He had always thrived on self-inflicted adversity.

At first, after ‘a quite unendurable winter,’ he reveled in the isolation and wild beauty of Jura. ‘I am struggling with this book,’ he wrote to his agent, ‘which I may finish by the end of the year—at any rate I shall have broken the back by then so long as I keep well and keep off journalistic work until the autumn.’

Barnhill, overlooking the sea at the top of a potholed track, was not large, with four small bedrooms above a spacious kitchen. Life was simple, even primitive. There was no electricity. Orwell used Calor gas to cook and to heat water. Storm lanterns burned paraffin. In the evenings he also burned peat. He was still chain-smoking black shag tobacco in roll-up cigarettes: the fug in the house was cozy but not healthy. A battery radio was the only connection with the outside world.

Orwell, a gentle, unworldly sort of man, arrived with just a camp bed, a table, a couple of chairs and a few pots and pans. It was a Spartan existence but supplied the conditions under which he liked to work. He is remembered there as a spectre in the mist, a gaunt figure in oilskins.

At the end of May 1947 he told his publisher, Fred Warburg: ‘I think I must have written nearly a third of the rough draft. I have not got as far as I had hoped to do by this time because I really have been in most wretched health this year ever since about January (my chest as usual) and can’t quite shake it off.’

Mindful of his publisher’s impatience for the new novel, Orwell added: ‘Of course the rough draft is always a ghastly mess bearing little relation to the finished result, but all the same it is the main part of the job.’ Still, he pressed on, and at the end of July was predicting a completed ‘rough draft’ by October. After that, he said, he would need another six months to polish up the text for publication. This did not happen.

Part of the pleasure of life on Jura for George and his young son was the outdoor life—fishing, exploring the island, and pottering about in boats. In August, during a spell of lovely summer weather, Orwell, Avril, Richard and some friends, returning from a hike up the coast in a small motor boat, were nearly drowned in the infamous Corryvreckan whirlpool.

Richard Blair remembers being ‘bloody cold’ in the freezing water, and Orwell, whose constant coughing worried his friends, did his lungs no favors. Within two months he was seriously ill. Typically, his account to David Astor of this narrow escape was laconic, even nonchalant.

The long struggle with The Last Man in Europe continued. In late October 1947, oppressed with ‘wretched health,’ Orwell recognized that his novel was still ‘a most dreadful mess and about two-thirds of it will have to be retyped entirely.’

He was working at a feverish pace. Visitors to Barnhill recall the sound of his typewriter pounding away upstairs in his bedroom. Then, in November, tended by the faithful Avril, he collapsed with ‘inflammation of the lungs’ and told Koestler that he was ‘very ill in bed’. Just before Christmas, in a letter to an Observer colleague, he broke the news he had always dreaded. Finally he had been diagnosed with TB.

A few days later, writing to Astor from Hairmyres hospital, East Kilbride, Lanarkshire, he admitted: ‘I still feel deadly sick,’ and conceded that, when illness struck after the Corryvreckan whirlpool incident, ‘like a fool I decided not to go to a doctor – I wanted to get on with the book I was writing.’

In 1947 there was no cure for TB; doctors could only prescribe fresh air and regular diets. However, there was a new, experimental drug on the market, streptomycin. Astor arranged for a shipment to Hairmyres from the US.

Orwell’s son Richard believed his father was given excessive doses of this new drug. The side effects were horrific (throat ulcers, blisters in the mouth, hair loss, peeling skin and the disintegration of toe and fingernails), but in March 1948, after a three-month course, the TB symptoms had disappeared. ‘It’s all over now, and evidently the drug has done its stuff,’ Orwell told his publisher. ‘It’s rather like sinking the ship to get rid of the rats, but worth it if it works.’

As he prepared to leave hospital Orwell received the letter from his publisher which, in hindsight, would be another nail in the coffin. ‘It really is rather important,’ wrote Warburg to the star author, ‘from the point of view of your literary career to get it [the new novel] by the end of the year and indeed earlier if possible.’

Just when he should have been convalescing Orwell was back at Barnhill, deep into the revision of his manuscript, promising to deliver by ‘early December,’ and coping with ‘filthy weather’ on autumnal Jura. Early in October he confided to Astor: ‘I have got so used to writing in bed that I think I prefer it, though of course it’s awkward to type there. I am just struggling with the last stages of this bloody book [which is] about the possible state of affairs if the atomic war isn’t conclusive.’

This is one of Orwell’s exceedingly rare references to the theme of his book. He believed, as many writers do, that it was bad luck to discuss a work-in-progress. Later, to Anthony Powell, he described it as ‘a Utopia written in the form of a novel.’ The typing of the fair copy of The Last Man in Europe became another dimension of Orwell’s battle with his book. The more he revised his ‘unbelievably bad’ manuscript the more it became a document only he could read and interpret. It was, he told his agent, ‘extremely long, even 125,000 words.’ With characteristic candor, he noted: ‘I am not pleased with the book but I am not absolutely dissatisfied… I think it is a good idea but the execution would have been better if I had not written it under the influence of TB.’

And he was still undecided about the title: ‘I am inclined to call it NINETEEN EIGHTY-FOUR or THE LAST MAN IN EUROPE,’ he wrote, ‘but I might just possibly think of something else in the next week or two.’ By the end of October Orwell believed he was done. Now he just needed a stenographer to help make sense of it all.

It was a desperate race against time. Orwell’s health was deteriorating, the ‘unbelievably bad’ manuscript needed retyping, and the December deadline was looming. Warburg promised to help, and so did Orwell’s agent. At cross-purposes over possible typists, they somehow contrived to make a bad situation infinitely worse. Orwell, feeling beyond help, followed his ex-public schoolboy’s instincts: he would go it alone.

By mid-November, too weak to walk, he retired to bed to tackle ‘the grisly job’ of typing the book on his ‘decrepit typewriter’ by himself. Sustained by endless roll-ups, pots of coffee, strong tea and the warmth of his paraffin heater, with gales buffeting Barnhill, night and day, he struggled on. By 30 November 1948 it was virtually done.

Now Orwell, the old campaigner, protested to his agent that ‘it really wasn’t worth all this fuss. It’s merely that, as it tires me to sit upright for any length of time, I can’t type very neatly and can’t do many pages a day.’ Besides, he added, it was ‘wonderful’ what mistakes a professional typist could make, and, ‘in this book there is the difficulty that it contains a lot of neologisms.’

The typescript of George Orwell’s latest novel reached London in mid-December, as promised. Warburg recognized its qualities at once (‘amongst the most terrifying books I have ever read’) and so did his colleagues. An in-house memo noted ‘if we can’t sell 15 to 20 thousand copies we ought to be shot.’

By now Orwell had left Jura and checked into a TB sanatorium high in the Cotswolds. ‘I ought to have done this two months ago,’ he told Astor, ‘but I wanted to get that bloody book finished.’ Once again Astor stepped in to monitor his friend’s treatment but Orwell’s specialist was privately pessimistic.

As word of Nineteen Eighty-Four began to circulate, Astor’s journalistic instincts kicked in and he began to plan an Observer Profile, a significant accolade but an idea that Orwell contemplated ‘with a certain alarm.’ As spring came he was ‘having haemoptyses’ (spitting blood) and ‘feeling ghastly most of the time’ but was able to involve himself in the pre-publication rituals of the novel, registering ‘quite good notices’ with satisfaction. He joked to Astor that it wouldn’t surprise him ‘if you had to change that profile into an obituary.’

Nineteen Eighty-Four was published on 8 June 1949 (five days later in the US) and was almost universally recognized as a masterpiece, even by Winston Churchill, who told his doctor that he had read it twice. Orwell’s health continued to decline. In October 1949, in his room at University College hospital, he married Sonia Brownell, with David Astor as best man. It was a fleeting moment of happiness; he lingered into the new year of 1950. In the small hours of 21 January, George Orwell suffered a massive hemorrhage in hospital and died alone.

The news was broadcast on the BBC the next morning. Avril Blair and her nephew, still up on Jura, heard the report on the little battery radio in Barnhill. Richard Blair does not recall whether the day was bright or cold but remembers the shock of the news: his father was dead, aged 46.

David Astor arranged for Orwell’s burial in the churchyard at Sutton Courtenay, Oxfordshire. He lies there now, as Eric Blair, between HH Asquith and a local family of Gypsies.

Cont.

Why ‘1984’? 

Orwell’s title remains a mystery. Some say he was alluding to the centenary of the Fabian Society, founded in 1884. Others suggest a nod to Jack London’s novel The Iron Heel (in which a political movement comes to power in 1984), or perhaps to The Napoleon of Notting Hill, a story by one of his favorite writers, GK Chesterton, which is set in 1984.

In his edition of the Collected Works (20 volumes), Peter Davison notes that Orwell’s American publisher claimed that the title derived from reversing the date, 1948, though there’s no documentary evidence for this. Davison also argues that the date 1984 is linked to the year of Richard Blair’s birth, 1944, and notes that in the manuscript of the novel, the narrative occurs, successively, in 1980, 1982 and finally, 1984. There’s no mystery about the decision to abandon The Last Man in Europe. Orwell himself was always unsure of it. It was his publisher, Fred Warburg, who suggested that Nineteen Eighty-Four was a more commercial title.

Freedom of speech

The effect of Nineteen Eighty-Four on our cultural and linguistic landscape has not been limited to the film adaptation starring John Hurt and Richard Burton, with its Naziesque rallies and chilling soundtrack, or the earlier one with Michael Redgrave and Edmond O’Brien.

It is likely, however, that many people watching the Big Brother series on television (in the UK, let alone in Angola, Oman or Sweden, or any of the other countries whose TV networks broadcast programmes in the same format) have no idea where the title comes from or that Big Brother himself, whose role in the reality show is mostly to keep the peace between scrapping, swearing contestants like a wise uncle, is not so benign in his original incarnation. Apart from pop-culture renditions of some of the novel’s themes, aspects of its language have been leapt upon by libertarians to describe the curtailment of freedom in the real world by politicians and officials—alarmingly, nowhere and never more often than in contemporary Britain.

Orwellian

Room 101

Some hotels have refused to call a guest bedroom number 101—rather like those tower blocks that don’t have a 13th floor—thanks to the Orwellian concept of a room that contains whatever its occupant finds most impossible to endure. Like Big Brother, this has spawned a modern TV show: in this case, celebrities are invited to name the people or objects they hate most in the world.

Thought police

An accusation often levelled at authoritarian governments, or at public arenas in which ideas or speech are restricted; any conglomeration designed to bleep or blur, to remove or ‘correct’ literature, to hide and suppress ideas.

Newspeak

For Orwell, freedom of expression was not just about freedom of thought but also linguistic freedom. This term, denoting the narrow and diminishing official vocabulary, has been used ever since to denote jargon currently in vogue with those in power.

Doublethink

Hypocrisy with a twist. Rather than choosing to disregard a contradiction in your opinion, if you are doublethinking, you are deliberately forgetting that the contradiction is there. This subtlety is mostly overlooked by people using the accusation of ‘doublethink’ when trying to accuse an adversary of being hypocritical—but it is a very popular word with people who like a good debate with their beer. If I may: everything is good with beer—if you have the beer first.

IN THE BHAGAVAD-GITA, THE HINDU HOLY BOOK, we find the great archer and warrior, Arjuna, with his charioteer and avatar of Vishnu, Krishna—of questionable fame stemming from an event earlier in life, having been caught stealing butter–allegedly. They are poised between two massive armies lined up to fight one another. Arjuna looks at both sides and finds relatives, fathers and sons, ready to slaughter one another in this battle. In his confusion and anguish, he cries out for guidance. To guide him, Krishna speaks to him as the supreme God of Gods, almighty Time, and instructs him in the way of yoga.

The war, like so much of what is discussed herein, is an externalization used to illustrate the conflict inside oneself, the kind of conflict every person has when it comes to choosing, to differentiating between what is right and what is wrong. Krishna appeared before him as a beacon of light in a time of darkness. He has since appeared to millions as the same light, to lead people from eternal return (for a modern comparison, consider Groundhog Day), from what Krishna calls ‘the transient world of sorrow.’

The main thing that appealed to me about this ancient text is its pure beauty. Transience, I believe, is the major theme: the mortality of everything alive on the earth. In describing this to Arjuna, the transience of life and its luxuries, Krishna consoles and reminds Arjuna of his purpose, thereby escorting him out of darkness. What Krishna reveals cripples Arjuna, who is left shaking with fear and awe as Krishna tells him: ‘Thy tears are for those beyond tears; and are thy words words of wisdom? The wise grieve not for those who live; they grieve not for those who died. Life and death will pass away.’

By this I believe he was saying that emotional and physical states exist in finite space, unable to last forever; he reasons that life, like death, will someday pass away into another sphere of existence, beyond eternal return.

‘Because we have all been for all time, I, and thou,’ he says. ‘We all shall be for all time, forever, and forever more.’

It appears from his words that Krishna regards the human body as nothing but a vessel, a physical ship carrying the ship’s captain. When the ship is no longer afloat, the captain moves on to find another ship, only to be imprisoned again, like smoke inside a bottle, until reincarnation traps us once more inside a body in the miserable cycle of eternal return.

Krishna appears before him as all powerful Time, with, ‘…multitudes rushing into him and pouring out of him as he devours them all, destroys everything.’

Krishna says, “I am all powerful time, and I have come here to slay these men. Fight, or fight not; all these men will die.”

After the mortal body is shed, ‘As the spirit of our mortal body wanders on in childhood, and in youth and old age, the spirit moves to a new body.’ Krishna believes the evaluating mind, the mind behind the body, passes in and out of light and dark, between worlds, reliving one cycle of life and death after another without ever finding something that lasts forever, something that is forever tangible. The spirit, however, is forever to him; this is a good idea, as death is relegated to nothing but the temporary shedding of a body: ‘Interwoven in [his] creation, the spirit is beyond destruction. No one can bring an end to the everlasting spirit, or an end to something which had no beginning.’

Once someone escapes the transient world, Krishna instructs, he will dwell beyond time: though our bodies have an end in their time, the spirit remains immeasurable, immortal. With these words, Krishna tells us to carry on our noble fight, our noble struggle against the depreciating forces of life.

The highest goal for him is a goal familiar to Buddhists: asceticism. ‘From the world of senses,’ Krishna says, again beautifully illustrating transience, ‘comes fire and ice, pleasure and pain. They come and go for they are transient. Arise above them, strong soul.’

These words have encouraged and inspired millions of people, from the east of the globe to the west, every day for thousands of years; they have the quality of liberation. As the Persian poet wrote, a king wished to have a phrase that would cheer him when sad and sadden him when joyful:

This too shall pass.

The tone of the piece is intended to convey a liberating, lasting peace—an acceptance and an eagerness to dispel disillusion and ignorance, to grow closer to the laws of the world and universe, a universe that is god made manifest—this is, in essence, what is called Brahma. It is a call for people to be honorable and kind to others. I’m not a religious person. I am, however, not ignorant of what this gives to culture and the arts. From a secular perspective, The Bhagavad-Gita is one of the greatest works of literature ever produced by mankind. There is much to take away, to learn, to believe. Acceptance of the supernatural is not necessary to learn and benefit from this cultural jewel.

The Bhagavad-Gita is a beacon of light, a candle in the dark. All cultures in some form or another produce these spiritual and religious texts. The dependence on the supernatural varies, but the message is universal: good for the sake of goodness and kindness for its own sake, while it will earn you no medals or honorary titles, is what lasting peace demands. If the world worked in this way, if everyone was motivated to not only improve themselves but the world around them, a peaceful world becomes possible. In a free world, there is no need to govern, or for government. Government is a euphemism for organized, demanded control.

Confucius, the proverbial wise old man, is credited with the composition of The Analects. In it, Confucius believed himself to be nothing more than a carrier of knowledge. Nothing divine, nothing unique or supernatural, not an inventor but a curator in the museum of our artistic history. Confucius intended to ‘reinvigorate’ what is called the mandate of heaven. Although he claimed to be but a messenger, he is, nevertheless, credited with the most famous of all axioms: “Never impose on others what you would not choose for yourself.”

With great subtlety and emphasis on learning and growing, Confucius left behind a legacy that has had a lasting impact on the world for thousands of years. The Analects are not the only source for Chinese philosophy: Lao Tzu’s Tao te Ching, The Teachings and Sayings of Chuang Tzu, and the iconic I-Ching, or Book of Changes, are cultural treasures, and inherently consistent in tone and content, giving this brand of Eastern philosophy a unique consistency in an otherwise muddled, frustrated series of contradictory versions.

‘Before you embark on a journey of revenge, dig two graves.’

Lines like this are the sun, the light to the lofty and pretentious little quote-loving moth in us all.

In keeping with the generally aloof tone of Eastern philosophy, The Analects echo the Book of Changes; Confucius says, ‘The only constant is change.’

This axiom is but a small notch above pandering tautology; yet we’re still drawn to it. Quotes in this vein are uniquely popular, and for good reason. Sometimes one can, without true effort and study, get a good summary or imbibe the essence of a work of art with a cursory glimpse and a partial, sometimes non-representative quote. This quote, however, is representative and conveys a valuable message. The intention is to raise awareness, to make us more aware of ourselves and our changing moods and their relation to the seasons, the cycle of life and death, destruction and renewal. As with The Bhagavad-Gita, it is another mantra urging us to accept the inevitability of the transient, the ephemeral among what is truly immortal, or never-changing.

In the religions of independently evolving cultures, we find, over and over, a connection, a branching out across time and space; in this there is a surprising consistency in the essential message, ‘It is only he who is possessed of the most complete sincerity that can exist under heaven, who can give its full development to his nature. Able to give its full development to his own nature, he can do the same to the nature of other men.’

Confucius’s philosophy is a call to the most ambitious of our characters to look for wisdom and sincerity.

‘Hold faithfulness and sincerity as first principles. Then no friends would not be like yourself (all friends would be as loyal as yourself). If you make a mistake, do not be afraid to correct it.’

This is unique among quasi-religious texts: this is a eukaryotic idea within, what is by nature, a prokaryotic art-form.

In all the philosophies and religions produced by mankind, within each is some sort of promise, some hint of shelter from whatever storms in which we struggle—and a promised liberation, a refuge to come, a refuge for each moment needed.

A Beginner’s Guide to War Crimes

Picasso’s depiction of the bombing of Guernica during the Spanish Civil War

A Beginner’s Guide to War Crimes – a brief summary of the absolute worst humanity has to offer

What constitutes a crime during warfare? As outlined by The Geneva Conventions:

https://www.icrc.org/eng/war-and-law/treaties-customary-law/geneva-conventions/overview-geneva-conventions.htm

Radical changes in power structure tend to give way to new groups who redefine existing laws to accommodate themselves.

There are four formal conventions defined:

1 – The first Geneva Convention protects wounded and sick soldiers on land during war.
2 – The second Geneva Convention protects wounded, sick and shipwrecked military personnel at sea during war.
3 – The third Geneva Convention applies to prisoners of war.
4 – The fourth Geneva Convention affords protection to civilians, including in occupied territory.

Examples of War Crimes — list far from complete; insufficient hard liquor.

– The bombing of Guernica during the Spanish Civil War, 1937
http://www.pbs.org/treasuresofthe…/…/glevel_1/1_bombing.html —
As German air chief Hermann Goering testified at his trial after World War II: “The Spanish Civil War gave me an opportunity to put my young air force to the test, and a means for my men to gain experience.” Some of these experimental tactics were tested on that bright spring day with devastating results: the town of Guernica was entirely destroyed, with a loss of life estimated at 1,650.

– Attack on Pearl Harbor, 7 December 1941
Imperial Japanese Navy under Admiral Isoroku Yamamoto; Japanese leaders were later tried for war crimes before an international tribunal. http://militaryhistory.about.com/od/naval/p/Yamamoto.htm

– The bombings of Hiroshima and Nagasaki — 6 August and 9 August, 1945
http://www.aasc.ucla.edu/cab/200708230009.html
The real mortality of the atomic bombs that were dropped on Japan will never be known. The destruction and overwhelming chaos made orderly counting impossible. It is not unlikely that the estimates of killed and wounded in Hiroshima (150,000) and Nagasaki (75,000) are overly conservative. Revenge is a mother fucker, but a war crime nonetheless.

– Massacre at My Lai (Vietnam War) – 16 March 1968
http://law2.umkc.edu/…/proje…/ftrials/mylai/summary_rpt.html

The My Lai Massacre was the mass murder of 347 to 504 unarmed civilians in South Vietnam, conducted by U.S. soldiers from Company C of the 1st Battalion, 20th Infantry Regiment, 11th Brigade of the 23rd Infantry Division, on 16 March 1968. On March 18, 1969, the United States began a four-year carpet-bombing campaign in the skies of Cambodia, devastating the countryside and causing socio-political upheaval that eventually led to the installation of the Pol Pot regime.

Notable War Criminals:

– Idi Amin (Ugandan military dictator)
– Robert Mugabe (http://www.thestar.com/…/zimbabwe_suffers_silent_genocide_w…)
– Milan Babić – https://web.archive.org/…/indictm…/english/bab-ii031117e.htm
– Pol Pot (commander of the Khmer Rouge army in Cambodia; initiated the Cambodian genocide. He was a very, very bad man! Lived to be 72)
http://www.cambcomm.org.uk/holocaust.php
– Khan Noonien Singh (for using the Genesis Device – http://memory-alpha.wikia.com/wiki/Khan_Noonien_Singh — product of the Eugenics Wars — http://memory-alpha.wikia.com/wiki/Eugenics_Wars)

So, Why Should We Not Commit War Crimes?

IT IS VERY BAD.