I ran in Bymarka yesterday. It is Trondheim's big park, west of the city.
I ran about nine miles. But we are in Norway, so: fourteen kilometers. I met some people along the way.
A couple had a little dog, a bulldog if I remember right. Before I reached the woods, I saw the dog go to the toilet by the train tracks.
As I ran past I said "god sted!" ("good place!"). I don't know if that is correct. But we both laughed.
From the top of Bymarka I could see across Trondheim. It is picturesque... beautiful. It was Saturday, so many people were out for recreation.
I think I was on Våttakammen, but I will find out next time.
I did not bring a camera. But here is a good example.
Okay so the main thing I think is going to cause confusion above is the story about the dog.
I was on my way to Bymarka. I saw a couple walking their dog. The dog took a shit on the train tracks. I said god sted as I went by. I think this might mean Good Place! As in, good place for the dog to take a shit.
Inflection would be important here in English and I think I got it right, if not the words, in Norwegian. Most of the time in Norwegian I will get it wrong, words and inflection. Often when I get words right, inflection will be wrong. I think this might have been a rare time I got inflection right. Not too sure about words.
The above in Norwegian I wrote without help. It represents my present level of writing. Today I did some homework in Ny i Norge, the Norwegian immigrants learning book (New in Norway). This week I hope to write about that, about Eurovision, about me sticky-noting my apartment, and maybe about a poker game in Oslo!
I read nonfiction. I basically only read fiction when it is philosophically motivated and influential, a historical landmark, or when it contributes to some sort of context I want to have, like if I am going to a country in which it takes place. Therefore, I wouldn’t take my fiction commentary too seriously.
Taleb's often endearing but frequently overbearing ego (intruding in the midst of key passages, no less), his absent-minded disorganization, and his indulgent, liberal analogizing, among other avoidable flaws, are not enough to extinguish the brilliance of his original thesis. Overlap with The Black Swan is relatively (surprisingly) small. A must-read for those interested in nonlinearity talk, in the vein of Jervis's System Effects, but without the rigour and with, to steal a phrase from a Howard Zinn jacket, "a shotgun blast of revisionism." I would still start with The Black Swan, but stopping there would be self-punitive. This is not a technical work, and relating technical matters is not Taleb's forte. His forte is effulgence.
Thinking, Fast and Slow
The pitch-perfect, comprehensive introduction for laypeople. An antidote to Gladwell, but so much more. High school textbook material, in a good way, but also recommended for lakeside reading. Basically what textbooks should be, bestowing definitions, examples, and evidence for various cognitive biases, without any of the usual eye-glazing roadblocks. I would go so far as to call it a social science triumph.
Elegant, crisp, and lucid prose. Astounding in arrangement at every scale — word, sentence, paragraph, story, work — with expansive sublime commentary through inference. Instills resonant heartbreak.
Bird of Chaman, Flower of the Khyber: Riding Shotgun from Karachi to Kabul
Matt Aikins's intrepid journalism in long, narrative form. Full disclosure: I worked directly with (and under the tutelage of) Matt at the magazine he ran during his final year at Queen's University. A talented writer, he has traveled extensively in Afghanistan, Pakistan, Syria… Are those hotspots sufficient? He has interviewed warlords, crossed through dangerous checkpoints, ducked gunfire, investigated torture, exposed the murder of Afghan civilians by US forces, and spent days with Syrian bomb makers; the list goes on and on. He's written for Harper's, Rolling Stone, the Atlantic, et al. To my mind, the premier conflict journalist under forty writing in the English language.
Elements of Poker
The starting point for any aspiring poker professional looking to enact a lifelong tuning of their mental game.
Sneaking into the list is effective, interesting scholarship and well-organized, informative history, overlaid on an intriguing narrative. What puts Greenblatt a cut above is his uncanny ability to lay out in plain language the nuts and bolts of competing philosophical premises, in their context. With this power comes some responsibility, and Greenblatt errs toward hyperbole a smidgen at times. Another must for those touring the history of ideas.
He was the best living writer of English nonfiction until he wasn't. He wrote in a wide variety of formats, and Arguably is a collection that plays to his strengths. His reviews outclass his long-form work, such as the infamous and popular God Is Not Great, to the point where it is, to my mind, unfortunate that the latter is the book he is best known for.
In this collection he is the epitome of freethinker, navigating everything from animal rights to Benjamin Franklin studies.
His review of an apparently forgettable Mark Twain biography is a study in critical dissection. His prose is crafted to sever. He balances detailed consideration and pithy wit, but not delicately, because he is seemingly never in danger of being hysterical or hyperbolic, even while ravaging the banal, the lazy, the haphazard, the uncritical, and the misinformed. A unique collection whose only fault is its exclusion, despite dozens upon dozens of entries (i.e., plenty of room), of his 2006 Slate column on Slobodan Milosevic. That piece, again highlighting my own bias, contains the following favourite: "Beware of those resentful nonentities who enter politics for therapeutic reasons."
Incredible writing like this does not litter Arguably; it forms the ground on which everything else stands.
Unique, virtuoso, and perpetually intriguing.
Simply put: for prolonged stretches, it does not achieve a state of being compelling. Not a page-turner. One has a hard time identifying with any of the characters, much less the majority of those who might be roughly pigeonholed into the role of protagonist as their portion of the narrative is limelit.
Wallace is a prose artist, a classic case of, pardon the cliché, perfecting the rules in order to break them, with spectacular results. Coinage and spin, technically playful. If Wallace, as a 1,000-page novelist, were to be rendered into a basketball player, coaches would unanimously assert "he does all the little things."
There is simply no way Jest cannot be characterized as self-indulgent. But that is fine, because indulging Wallace is surrendering to something majestic. My preference to be rendered a hapless page-turner, ignorant of my surroundings, was never fully actualized, particularly in the first 400 pages, as important in my mind as any other 400.
On the other hand, for what one supposes Wallace was trying to do, he, in my present, humble estimation, executed it near-flawlessly.
The Signal and the Noise
A noble effort. The world is better for this book having been written. Nate Silver was not the hero some wanted, as far as literary capacity goes, but he is the hero they got.
The Idea of Decline in Western Civilization
A survey of history that suffers from a lack of corroboration in its character assassination of various philosophers with whom, we can infer, the author harbours philosophical disagreement. An otherwise excellent survey, certainly worth studying, drawing from a breadth that includes W. E. B. Du Bois and Nietzsche.
The filthiest book I have ever read, bar none. Its black type on white page is make-you-look-over-your-shoulder-on-planes filthy. Truly pornographic.
Crescent and Star
American Stephen Kinzer (who wrote the also recommended All the Shah’s Men) has written an informative work that suffers from narrative injections and unabashed romanticism for his adopted state.
The worst books I read in 2013
I Feel Bad About My Neck - Nora Ephron
This book humoured me less than I did it.
Bone in the Throat
Not the elsewhere-brilliant, Renaissance-flaneur-on-camera Anthony Bourdain's best. Far superior to Ephron's offering, though.
Almost two years ago I made this post, which is uncanny in its applicability to the current Quebec student protests.
A play without a hero
People might point to Quebec's emergency legislation as proof that the government is in the wrong. Sure, it is in the wrong, but not in comparison to the student protesters, only in general. The students also happen to be in the wrong.
Misguided would be a great adjective to describe both parties. Any time a Western democracy passes an emergency powers law, it makes an unwitting admission that its legislators have no idea of the history of their own polity. It is an admission of ignorance, and not just of practicality, the fact being that these laws and efforts never work to their intended effect. More important is the total ignorance it shows with regard to the history of laws of this nature. Their absence and repeal, over time, is one of the main benchmarks of progress on the road to the prosperous and flourishing society we find ourselves in today. Those in the Quebec legislature cannot argue, without betraying ignorance of the seriousness of crises past, or of the relative lack of seriousness of the student protests, that this is a crisis worthy of an emergency powers law. Moreover, there are already laws on the books. Not to point out the obvious, but, somehow, up to this point in recent history, Quebec has been a relatively orderly place.
The students are more obviously in the wrong, that is to say their wrongness is more conspicuous. They seem to realize that there is a state and that the state has interests. What they are painfully unaware of is that they are not an oppressed minority standing up for itself, but rather, an incredibly fortunate group merely protecting their interests.
Wrapping yourself in righteousness
Were these students aware that they were receiving what is essentially a corporate handout from the state, at the expense of the rest of society, and wanted to protect their subsidy from being reduced, they would draw up a public relations campaign that painted them as an oppressed, hard-working minority "up against it," as it were, by a penurious provincial government "going back" on its word. Moreover, they would paint the issue in light of its fairness, its justice, cosmic justice! It would be a brilliant way to brand your interest group and to deceive a public who may or may not know better, or who may or may not take the time to find out.
It just so happens, by way of happy accident, that the student protesters have painted themselves in this optimal public relations light: the unwashed, hardworking low men and women on the totem pole, simply fighting for what is right.
The truth is a neat little proof of the fact that, independent of whether age bestows wisdom, those lacking the former don’t possess the latter. I feel partially obliged to write this now since, at twenty-six, my ability to make such statements with any credibility is expiring.
Because the students actually believe what they proclaim on their placards and signs, they shout their slogans with conviction, not solely in the interest of advancing their interests, but also in the interest of a supposed and phantasmic cosmic justice. They've been tricked by lightswitch enlightenment; the light is on, but nobody is home. Such is the privilege of privilege: the ability to convince yourself that what has been given to you is deserved.
This article originally appeared here and I am indebted to The Mark for publishing it so I do not want this reprint to be seen as anything untoward. Really, I wanted something leading this space that was not the Hitchens tribute, since that piece, apart from being laden with obscure references, probably has the least broad appeal of anything I have ever written, which is saying quite a lot. It seems likely that a few more people will be visiting this space, hopefully spillover from the other blog I have been keeping here. If you don’t know of it, and think that I have just been not writing anything this past while, well I am sure this, this, or this, will convince you otherwise.
Anyway, an update on this piece. Recently the father, his (preferred) wife, and his son were all indicted on murder charges for the killing of the four women. This has been much publicized, so I assume most have come across the story; I mention it in the interest of full disclosure. My view, without actually having been in Canada for 15 months, is that the trial and the media coverage I have taken in seem to signal a shift of the conversation from the (often inane) "how culturally relativist should we be" back-and-forth to the "how can we address this issue in our communities" question.
Sitting in a coffee shop I overheard the barista soothe repeatedly “it is their culture” to an elderly regular. Her statement was both an explanation and an excuse; the customer could not understand why it had happened or how it was justified. Word had spread that the four women who drowned just outside Kingston last July had been murdered, by their own family, in what the National Post’s Tarek Fatah characterizes correctly as an “honour killing.” A man, his son, and his second wife had killed his first wife and their three daughters, ostensibly because one teen acted salaciously and thus shamefully. Fatah, a devout Muslim, takes the view that “the Koran does not sanction such murders, but man-made sharia law … does allow for the killing of women if they indulge in pre-marital or extra-marital consensual sex.” It is their culture. But what the barista offered as a defence should be an indictment.
In 1985, the Canadian Multiculturalism Act became law. The contradictions it contains offer a clear parallel to the mindset of those Canadians today who would offer a similar line.
(1) It is hereby declared to be the policy of the Government of Canada to
a) recognize and promote the understanding that multiculturalism reflects the cultural and racial diversity of Canadian society and acknowledges the freedom of all members of Canadian society to preserve, enhance and share their cultural heritage;
e) ensure that all individuals receive equal treatment and equal protection under the law, while respecting and valuing their diversity;
Canada is experiencing a widespread conflation between the rights and freedoms a liberal democratic state should endorse and unqualified cultural relativism. It is far from desirable for the state to have citizens preserving or sharing, much less enhancing, barbaric aspects of their cultural heritage. It should also be clear that ensuring the first portion of section (e) will oftentimes be anathema to respecting, much less valuing, diversity.
Furthermore, it is worth pointing out that a belief in basic human equality, when coupled with a desire to inhabit a society freely populated by people from all over the world, constitutes a culture itself. This kind of culture cannot, by definition, be one that welcomes all kinds of cultures into its midst.
One perverse alternative to this arrangement is cultural relativism, whereby members of one culture cannot judge those from another, simply because they “have not walked in another’s shoes.” This is too often the excuse given by Canadians to heinous actions of fellow citizens or to heinous practices accepted in distant lands.
There is no doubt that unjustified xenophobia persists in Canada. However, apologizing for the negative aspects of immigrant cultures is the furthest thing from a rational response to this societal shortcoming. And yet, it is the tactic the politically correct most often adopt. In their minds being called xenophobic themselves would surely be worse than admitting that there are inferior aspects of other cultures.
But what of those who would genitally mutilate their infant children, as if modifying their own property? What of the “traditional lifestyle” that keeps so many First Nation communities impoverished? What of Aqsa Pervez of Mississauga who was slain by her own father because of the choice she made, her choice, not to don the hijab? Surely a hurtful label can be risked in order to speak out against not just these acts, but against any culture that endorses them.
It is actually not that difficult for people from all around the world to cohabit a liberal democracy that affords them relative security, opportunities for prosperity, and the rights and freedoms an “equalist” would expect. People have much to gain from one another. You don’t need any abstract explanation for why a population diverse in geographic origin is preferable if you have eaten sashimi, samosas, or shawarma after growing up on chicken and potatoes. Cuisine suffices, to say nothing of music, fashion, literature, art, or more compelling still, the interpersonal relationships we form. Social cohesion is certainly not induced by lame government-sponsored cultural celebrations, but rather by the everyday social and economic benefits we derive from one another.
This ongoing conflation between the logic of equality and misguided political correctness is harming Canada’s chances of success. There is no hypocrisy in thinking and speaking critically of one’s own culture while doing the same of, and indeed rejecting outright, other cultures. When the freedom of women, the rights of children, or the hopes of the impoverished are at stake, let me suggest we do both.
The imaginary question from the imaginary person, who verily exists in this world, just not in my life, leads one to believe that another, less uncouth expositor could have come along and been just as well heard, just as well considered, and just as doubly well read. This is a falsity held up by Quixotism, or maybe more precisely by an inaccurate gauge of the state of the world's other minds.
There are times to argue diplomatically, to make false concessions, to implore and to retract, to self-deprecate and to parry one's own advances and one's own territorial claims; that is, if one seeks to plant the seed of an idea, or perhaps just an emotion of nostalgia, in the mind of another whose favour you personally need. This class of discussion need not always be with a woman whose bed you seek joint custody of for but tonight, but, across broad surveys, that is when it occurs most often.
To not care what anyone else thinks is the correct stance to take on one's own public expressions. Not for personal indulgence in smug erudition (while enjoyable), but because not having a stake in changing someone's mind implies one is not in the literal business of doing so, and that no gain of the financial or ecclesiastic variety is on the cusp of seizure by hoodwink. The book is already sold; the column is already filed; the cheque is already in the mail, and no one with a clipboard awaits you as you filter out of the hall angry at that daft prick. Beside the silent electronic donations to the business of literalism, which would once have announced their repeated cupidity with the familiar rattling of the collection bowl, are the mercurial hindrances of the listener's emotions, wooed by those who would study Cicero on oration not for what to spot, but for what to do. It is rare for someone to change their mind instantaneously, and when it does happen it is often in times of distress and vulnerability, an arrival at gullibility by way of fear and inducement.
When I entreat someone to teach me something, as I do from time to time, I sit with an open mind, and their thoughts pour through me like water through a sieve. I am actively ready for my mind to be changed; I know what I currently think on the subject is a hodgepodge of ignorance, a collection of common-sense natterings at best, and yet their efforts go unquaffed. It takes work for me to rehash and to revisit what has been said and why they have said it. But it is necessary, because I am unable to purchase, through husbandry or through lottery, that which Pascal peddles: to believe without believing.
The wit behind the unapologetic diatribe, unwittingly or not, ensures his toil’s harvest is fairly reaped, that being, to plant the seed of a notion, that the meritocracy of the receiving mind can then promote or fire over time through examination and scrutiny of its own volition. This is the seared impression as gift, not for purchase or for sale. Quite in contrast to all those abnormally alliterative speakers who would wish, with charlatan charisma, others into halcyon hock, impervious to the careful consideration of facts and follies. That is how the man of letters and of the hour could know he had truly succeeded, when a skeptic is defeated despite being filled by his bilious essay with the fluid of choler. To not only dispense with, but oppose the persuasive techniques of the sophists, for the sophistication of ridicule, is not so much of a challenge as an investment in one’s apologies’ integrity.
Some pieces of culture we would all agree are disdainful, and some, though we may not agree which ones, superior in nature. And so some amalgamations of many pieces of culture, one would assume by transference, are therefore disdainful and inferior in nature. The opposing opinion was something Hitch perennially derided and shamed, for it is clear that an elevation of those things we should not discriminate against — be it race, creed, party, or religion — is a defense of unsavoury tribalism. Any allegiance to these mock, blow-up-doll divisions for dummies is a type of stupidity similar to that of the bigot. For the culture one has, especially in today's West, is the culture one chooses, so that one's culture is ultimately the amalgamation of the pieces of culture one elevates by virtue of deeming them worthy of reflection. The amalgamation Hitchens enjoyed, without heed to any ideological or other prerequisite, he was also, by his virtuosity, contributing to until the end.
And so we realize that it couldn't be done another way: the asshole is reason's man. No man who chose to respect a peep of his inner milquetoast would be able to stand up for reason and sanity and criticism as much as Hitchens did. Never hiding behind shelves or peers or committees or titles or desks or blackboard and chalk, he says and he writes what he means, and he expects to be evaluated on its merits. His causticity is the reaper of the intellectually slovenly habits of so many women and men in this ongoing spectacle, our species' slow and assiduous awakening, century by century, from its self-imposed immaturity.
Think about it (or don't, as is your natural inclination). It seems to me that the only way in which humans could be considered smart is in relation either to animals or to rocks. But animals, much less rocks, are extraordinarily stupid! I watch documentaries in which evolutionary biologists have trained monkeys to perform certain cognitive tasks. They judge the animal's cognitive performance in terms of what it can do, what it can maybe understand (more on that later), not in relief of what it can't do and can't understand. They aren't judging intelligence in consideration of the fundamentally wrong decisions the monkey would make in an infinitude of situations.
Of course, failing rocks and squids, individuals can always point to other individuals to differentiate themselves as intelligent. A proof of one’s lack of stupidity this is not! If we judged human intelligence in relation to say, not being stupid, I think we would find that we are all very stupid indeed.
One main and overarching point I want to emphasize is the following. The only possible explanation for an endless variety of contemporary and historical phenomena, whether economic, sociological, political, religious, cognitive, or beyond, is that we are all stupid and always have been. Without this ingredient, a great deal of human behaviour can be explained only without anywhere approaching full satisfaction. Not to say that the we-are-all-stupid postulate completes or fulfills explanations, but it is certainly true that without it, most explanations will be incomplete.
See, intelligence can sometimes be judged through a prism of choices. On this score, humans do not do well. In fact, we almost always make the wrong choice. And worse, when we do make the right choice, it is often by accident! By the nature of having a problem with finite choices, say three, you are inevitably going to get some correct by happenstance. And even worse than that, we often make the right choices by experience; that is to say, we mime, without rhyme or reason, the apparently (but possibly not) right choices others make or have made. Let me start with an example of this last form of mimicry in intelligence's clothing.
There are different kinds of mistakes one can make. One is inferring something where there is no inference to be made; the other is not inferring something where there is. The latter makes a great deal of philosophical sense, since by making this error you will not clutter the truth propositions in your mind with false ones. You will remain effectively agnostic, yes ignorant, of inferences not yet made, but you retain the possibility of making them eventually, and, when you do, correctly. But in evolution, this Socratic approach does not obtain. To see why, consider a rustling in the bushes (the classic example). If you infer something is there, say a tiger, you will take precautions to avoid said tiger. When it is actually the wind, the cost is low: you have wasted cognition, yes, but your genitals are very much intact. However, if you were ever to make the other type of mistake, to not infer the presence of a threat when one was there in earnest, the cost is very high indeed. Your genes won't be proliferating themselves via those genitals. And that is what your genes, your design, point you in the direction of: opportunities for your genitals to acquit themselves well.
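The asymmetry in the tiger example can be made concrete with a toy expected-cost calculation. A minimal sketch, with every probability and cost invented purely for illustration:

```python
# Toy expected-cost model of the two inference errors described above.
# All numbers are invented for illustration only.

p_tiger = 0.01          # chance the rustling really is a tiger
cost_false_alarm = 1.0  # a little wasted caution when it is just the wind
cost_miss = 10_000.0    # failing to flee a real tiger: catastrophic

# Strategy A: always infer a tiger (the "paranoid" rule).
# You pay the small false-alarm cost whenever there is no tiger.
expected_cost_paranoid = (1 - p_tiger) * cost_false_alarm

# Strategy B: never infer a tiger.
# You pay the huge cost, but only on the rare occasions a tiger is real.
expected_cost_skeptic = p_tiger * cost_miss

print(expected_cost_paranoid)  # 0.99
print(expected_cost_skeptic)   # 100.0
```

Under these made-up numbers, blanket paranoia is two orders of magnitude cheaper per encounter, which is the sense in which selection favours cheap false inferences over rare fatal misses, even though the paranoid rule also litters one's beliefs with untruths.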
So we can see that making the false inference has a low initial cost, a bit of wasted memory and effort, while making no inference has the highest cost, but rarely. The problem with the false inference is that the low initial cost can create a terribly large cumulative cost, as one's set of beliefs is populated, littered if you will, with untruths and fallacies. Beliefs are, for better or worse, linked together. Members of a tribe would boil water from a river because they were under the impression that the gods of fire and water needed to be appeased before they imbibed this gift of nature. But more precisely, they boiled water because that is what the tribe had always done; they had seen the elders do it and were so instructed. The loop could be hypothetically completed when some member forsook the gods' silly boiling ceremony and unwittingly subjected himself to the bacteria that any untreated river water hosts. The wishes of the gods, and the wisdom of traditional methods, are upheld.
I submit that an extraordinarily large majority of human behaviour today takes this unthinking form, whether through mimicry, trial and error, or choosing right for the wrong reason, a reason often never thought about. In fact, even when we have no tribal influences upon a certain decision, even no influences whatsoever, our decision making becomes impaired at the precise moment we make an initial choice. This is because humans are very biased towards a choice they have made that has not resulted in disaster. As a result of our pain-avoiding tendencies, we have formed a strategy of adopting a bias towards courses of action that we know work. Not work best, mind you. Simply work. "Is the grass greener?" may be a question that pops into our mind during thoughts of copulation, but when not concerned with the distribution of our genes, we are very much inclined not even to ask, much less investigate. And our preferences then form from habit. The more you enjoy something, the more you enjoy it. A nice modern example of this is your friend (or you!) who is a total stick in the mud when it comes to what Subway sandwich they order, or how they take their coffee. Our minds actually convince us that we have a strong preference for BLTs and that straight black coffee is disgusting. Everything at Subway is actually delicious and every style of coffee is easily quaffed, but you'll never know, because your mind begins to shrink your horizons from the first sip.
There is a point at which a human becomes self-aware, and according to Sartre, there is a point at which a human becomes aware of their self-awareness. They look down and see this self-awareness, and then, depending on your pretentious French interpretation, they either begin existing, or begin existing on a higher level of consciousness. The more people I meet, the more it seems to me quite possible not only that a large percentage of humans aren't aware of their self-awareness, but that there are adults who are earnestly not self-aware in the first place. As in, they never were. The problem with this digression is its empirical boundaries, and for that reason a disturbing digression it remains. But in order to graduate from a hodgepodge of behaviours, describable at best as the collection of cognitive errors, arrived at through trial, that results in the least disastrous outcomes, to something that resembles intelligence, one would assume we would first have to become aware of what we are doing and why.
In the study of human behaviour, the list of recognized cognitive biases has soared well past one hundred. But amongst the discussions of loss aversion, patternicity, and the illusion of control, everywhere I see an underlying theme: that these are deviations from humanity's normally noble reason and infinite faculties. It is almost as if the whole zeitgeist suffers from a meta-cognitive bias itself, one of regarding cognitive biases as though they were some sort of mutilated diaspora, flung by untoward circumstances from the homeland of human behaviour, which is of course quixotic and populated by a vanguard that is always right, precise in judgment, and in possession of knowledge of the true kind. There is a simpler and more sensible model: we are all stupid.
And cognitive biases are the rule, not the exception. Our tendency to judge a dollar lost as more important than a dollar won, to see patterns where patterns don't exist, to see faces where faces don't exist, to perceive control where we have none, to see the past as predictable after the fact, to be so incompetent in our assessment of various probabilities, let alone the notion of probability itself: these (and others) are the norms that constitute the vast majority of human behaviour.
What a piece of work is a man, how noble in reason, how infinite in faculties, in form and moving how express and admirable, in action how like an angel, in apprehension how like a god! the beauty of the world, the paragon of animals
Animals! The problem with being an animal is that our brains have not developed to discern what is strictly true and guide us by that light. Rather, our brains care about getting the good news from our genitals and surviving to tell the story. And they came upon all the ways of doing this by happenstance. This collection of idiotic habits we all have and exhibit has been accrued through a long history of not being eaten by tigers that probably weren't there.
So truth seeking, or truth discovering, may be theoretically possible with our brains, since if our brain stumbled on such functionality, and it provided some with a better chance of healthy procreation, then such capacities would proliferate. But currently we are bowed by the freight that is our survival-directed cognition, a mishmash of nonsensical beliefs that often work for reasons far removed from the ones we have concocted. To err, but live nonetheless, is human.
There is a really big difference between best and most influential. So big I don’t know why one would put both adjectives together. There are two on the list (that I’ve read) that strike me as very influential, but also incoherent:
- No Logo, Naomi Klein
- The Clash of Civilizations, Samuel Huntington
I was surprised to see A Walk in the Woods by Bill Bryson on this list. It's a good book that I enjoyed, but it's neither influential nor the best. They also put Dreams from My Father on. Yeah right.
But at least they got some good books on this list: All the President's Men, The Structure of Scientific Revolutions, Guns, Germs, and Steel, and A Room of One's Own.
If you haven't read "Philosophical Underpinnings," which discusses the preface to this book, it is below, and that is where I would start. But this post should be fairly self-contained. Hopefully you enjoy it. I'm still thinking of reviewing ESPN's Grantland next, but I might do something else. In any event I don't plan on leaving so long between posts here; I've been very busy lately. I also don't plan on writing much, if anything more, on conspiracy theories or theorists, or on far-left anarchists or counter-culturalists in general. In this space I've covered a lot regarding this fairly hopeless bunch, and I would consider my criticism fairly comprehensive as of this post. I'd like to do a post or two on Hobbes coming up, and maybe on someone I've never really looked into before. Hegel could be a good candidate, the major philosopher of whom I've read the least. I would also like to do some stuff on finance/economics; I have an article/brief interview with Paul Volcker that hasn't seen the light of day, and I just finished Niall Ferguson's The Ascent of Money. Enough previews.
* * *
This psychic need to impute all evil to a lone, omnipotent source inevitably requires the conspiracist to create larger and larger metaconspiracies that sweep together seemingly unconnected power centers…. this is why… they are so fond of flowcharts… all of society’s actors can systematically be grouped into cascading hierarchies that soar upwards to a single, ultimate puppetmaster.
Jonathan Kay of the National Post, in Among the Truthers, boldly journeyed into the deep underbelly of… peaceful public demonstrations and the radicalized, highly incendiary… internet message board communities that characterize contemporary American conspiracy theory movements.
Kay focuses on the "Truthers," those who would deny the 'official 9/11 narrative' in favour of a global conspiracy. The banal venues navigated to research the work mark the gulf, hopefully ellipses worthy, between the charges Truthers levy and the phoniness with which they pursue them. This was recently brought up in a roundtable discussion held on TVO's The Agenda, not by Kay, but by a Truther, one Barrie Zwicker, making one of Kay's points: "that [Truthers] seem strangely disconnected from this, like it is some sort of debating exercise," not reflecting the horrific nature of their claims. The charge was subsequently, with histrionics, denied, but it is a disarming and salient point. "It bears mentioning," Kay writes, "that the Truth movement is entirely nonviolent. Their meetings and literature typically are suffused with exhortations to tolerance and respect. When they demonstrate publicly, they get permits, and usually follow police instructions carefully." If you really believed in a ubiquitous secret cabal managing us, the global peons, should not your activities more closely resemble the freedom fighters of Terminator than those of Milk?
Conspiracist actions are prosaic in the extreme when compared to the all-encompassing nature of the threat they face. No, not all responses to problems must be proportional to their gravity to arrive at a solution. If you want to prevent an early and tragic death, maybe all you need to do is stop smoking; if you want to save your marriage, maybe all you need to do is pick up some haphazardly placed socks. But one would expect that to fell an organizational structure that is nearly omnipotent, generally undetected, and assumedly ubiquitous, one would need more than protests, placards, websites, pamphlets, and disorganized books. YouTube videos with countless thumbs electronically upwards do not ensure your revolution doth take. Prime that entire media with a quick veneer of zeal and all manner of ineffectual activism is accounted for in their ranks. What you don't see is Truthers kidnapping congressmen and injecting them with truth serum on live webcast. You don't see them making the news by breaking into offices associated with the Federal Reserve.
There are two main possibilities that I see from this point: that they are genuine and that they are fake. I suspect one thesis doesn't apply to all. For the genuine the indictment is heavy, because the explanation for why they don't commit acts of violence in an effort to publicly unmask the New World Order or what have you is that they truly believe they already have. In other words, they think that their video documentaries and articles have sufficiently made their case (or perhaps that they are sufficiently morally absolved) and that those disbelieving are some combination of ignorant, evil, and stupid. If they thought that critical thinkers needed more proof to believe in their movement, then they shouldn't hesitate to commit acts of small and even medium-scale violence. If you are unsure of what my inference is, and just to clarify for those readers who think I'm tiptoeing, I mean to say that if you are a true Truther and you are not organizing and committing acts of violence against the state, or more precisely in your view, acts of violence against the oligarchy that runs this world, then you are a sell-out, a coward, or too stupid to understand your own claims. (Of course we see lone gunmen understanding the callous and derision-worthy implications every so often. Walking the walk doesn't graduate them from the dunce corner.)
The very observation of these people congregating in churches and school gyms to plan non-revolution and non-violence quashes the believability that they believe. It’s a glorified hobby.
For this reason I agree with New York Times reviewer Jacob Heilbrunn that Kay doesn't demonstrate America is in "dire straits" (Heilbrunn's words) and that in "concentrating so narrowly on Truthers, Kay describes them superbly, but he may exaggerate their potential influence." But for this I wouldn't actually fault the author; after all, is it not most authors' modus operandi to advocate for their topic's import? And in the larger scheme this criticism is slight; it certainly didn't prevent Heilbrunn from penning a positive review, and it isn't going to hamper my desire to do so either. Truther sympathizers might not take it too well, though, when my main criticism of a book largely filled with criticisms of them is a denigration of their influence!
In fact, Kay doesn't quite make the claim that if they continue to go unchecked we should worry that Truthers will stop being posers and break our covenant under Leviathan. Rather, near the end of the work, he warns, "The threat currently posed by modern conspiracists is not physical, but cultural."
But let us briefly return to the Truthers who would never enact violent projects towards the state because they don’t actually believe what they spout (as opposed to the honest cowards). They are the fakers and they know you know that they know it. I assume they don’t mind though, because the realization washes over them in but a moment, while their time preaching to the credulous can sustain late into the night. To that point Kay quotes Norman Cohn’s preface to the 1996 edition of Warrant for Genocide, “There exists a subterranean world where psychological fantasies disguised as ideas are churned out by crooks and half-educated fanatics for the benefit of the ignorant and the superstitious.”
And in that way conspiracism is very much a modern religion (a point Kay makes). Modern religions are labels with which to identify oneself, shopped by others and bought by those looking. They don't stop the converted from adultery or from child molestation (independently held moral convictions or natural inclinations do or do not suffice for that), and they don't affect the day-to-day routine of those not occupying the rock bottom. People go on living their lives as practical atheists, the technical term (not actually being sarcastic here, it is a technical term) for those who espouse a religious bent that implies theistic belief, but go about their lives as if god didn't exist, or at the least, was of no consideration. It was in this way that many (mostly the younger) Truthers existed before being converted. They were practical apoliticals (not technical, just invented), people who went through life enjoying what Marxists term commodity fetishism while often trying to find themselves from an identity or selfhood perspective, broadening their horizons, hanging out with friends, things like that. In other words, they were often generally enjoying life. And Trutherdom brings that enjoyment to a screeching halt, as Kay well catalogs. On his blog he posts letters from families breaking up over someone's increasing, now all-consuming belief in global conspiracy. It is as if, when they become fully self-aware of their commodity fetishism, they deign to feel guilty about it, instead of first considering whether or not it is actually a bad thing.
Truthers care what others, especially significant others, think of them because they are closer to car salesmen than drivers. They seek followers and the approval of their peers. At risk of delving too deeply too early, they, by and large, practice Sartrean bad faith: they are playing the part of the kind of person they wish others saw them as, and indeed, the kind of person they wish they were. And when they are not doing that, there are three possibilities remaining: they have made a grievous cognitive error, they are infected with a mind virus, or they are (and this isn't often the case) just plain crazy. The last option isn't often the case despite the intuitions of many, Kay's included. The author had "long assumed that abnormal theories came from abnormal minds." Sometimes they do though, let's remember.
People are always trying to explain why their life took the direction it did. Why it didn't turn out in the way they envisioned. Often conspiracists are offering not an explanation of global causation to the unwitting, but a deflection of responsibility for the cause of their current position. "In America," Kay posits in a brusquer moment, "life's losers have no one to blame but themselves." This is a tad harsh, but well taken nonetheless. Trutherdom, therefore, is often a masked victimology, a coping mechanism for those who cannot accept that there are very strong meritocratic elements running through American society and yet they remain rungs below where they would expect. The certain opposite of meritocracy is oligarchy, and the conviction that oligarchy is the true (yet unseen) structure of society equates to a universe that is not chaotic, in the steel-yourself aphoristic tradition, but rather deterministic, in the pointing-fingers-and-placing-blame tradition. Michael Shermer, the editor of Skeptic magazine and executive director of the Skeptics Society, calls this mode of thinking "agenticity": "the tendency to believe that the world is controlled by intentional agents, usually invisible, from the top down."
One thing Truthers constantly fall into the habit of doing, whether young or old, esteemed or not, is mentioning the volume of their effort. In that Agenda roundtable Barrie Zwicker implores Paikin and Kay that he has "been doing media criticism since the 1970s," and has "looked at hundreds and hundreds of television shows, books, and articles." Well congratulations! You get the volume-of-effort prize. Next time you want someone to believe something you have to say, make sure to hoe a corn field for ten hours first. I can't overstate how often I get fed this, from cab drivers telling me about their biblical studies to doorknob stoners prefacing their idiocy with "I have done a lot of research, and." "I have done a lot of research, and" is like a code phrase telling the listener to immediately leave the vicinity. "I have done a lot of research, and" is essentially the speaker acknowledging that if he hadn't prefaced the remarks that follow, they couldn't stand on their own because of how shamefully foolish they are.
The thing is that if you don't actually have anything important to say, if you don't actually make sense, and if what you do say can be invalidated upon closer inspection, it doesn't actually matter how much work you have put in. Truthers and conspiracy theorists constantly feel as if they need to justify their credentials because they often see themselves as external to a world where credentials are king, or, if they have credentials, as in the case of the older professorial types, they see credentials as crediting. But neither is the case.
Of course nepotism exists. No shit. Sure, some people with credentials get jobs they do not deserve. This is often explained by a market with a large information asymmetry between employer and employee. Hiring humanities professors in universities is a great example of this: if the department chair has the final say on hiring, she or he might not know anything about Renaissance poetry, the specialization of the prospective employee. If a member of the university administration is making the hire, she or he almost certainly knows nothing about the subject. The same can largely be said of automotive repair or travel agencies (at least in the past). Because most people can't afford frequent ten-day vacations to tropical getaways that require air travel, they will be uninformed of the various vicissitudes of the market and equilibrium prices. Because most people have no idea how their car works or what its individual parts cost, they will be at the mercy of their, usually exploitative, mechanic. But these nuances aside, Zwicker is doing nothing but huffing and puffing. And as I mentioned, his is not an isolated tactic. The lesser-educated conspiracy theorists I know or have met are constantly reminding me that "I have done a lot of research, and" or "I have looked into this a lot," or "I know this is crazy but I've been reading the Bible and it accurately predicts…"
The most painfully obvious point is quite harsh, but necessary to make: that these people don’t actually know what it means to do research.
Not because they are uneducated, and not because they lack a liberal arts education, but because they lack those skills that are sometimes imparted upon a person by a liberal arts education, namely, being rigorous. Another point: simply because you have thought critically does not mean you have arrived at the right answer. Conspiracy theorists have trouble grasping this point the same way they misapply cui bono. Thinking critically can move you from one ideology to another, just as cui bono can shift blame from someone who was blameless to someone else who is blameless, but benefited nonetheless. Kay frames this succinctly: "Assassination-related conspiracy theories, in particular, tend to emphasize the logic of cui bono — since the death of any public figure (JFK is a good example) always produces hundreds of indirect benefits." The way conspiracists use cui bono doesn't merely imply, but shows, that I took part in some secret cabal to have my mother inseminated with the demonic seed of my father, in an effort to serve my best interests of coming into being. And this, to the clear detriment of the masses.
"According to this view of history," Kay unravels this species of anti-Americanism further, "there is no such thing as an honest casus belli: Just about every conflict in the history of human civilization has been caused by a warmongering conspirator killing his own kind and blaming it on an innocent enemy." Again we see another detachment from reality the conspiracist maintains: the refusal to realize that there are bad people in the world, to be blunt, bad brown people. The point the anti-war left has habitually left unaddressed amidst reciting the truism that violence breeds violence, the conspiracist takes further.
Beyond Rooseveltian non-intervention, the conspiracist defers to Gandhian moral absolution, the stubborn refrain of those who would fully endorse nonviolence in all historical contexts and wash their hands clean of this dirty world. Justice may be the truth of the stronger, but that doesn't mean such truths are undeserved by necessity. There are, in fact, accidents of history, to which those occupying the fringes, left or right, seem oblivious. This lack of accidental history is coupled with what James Meigs calls "the myth of hypercompetence": "Even as the conspiracy theorist imagines a world-controlling cabal that is subhuman in its lack of pity, morality, honesty, and empathy, he is simultaneously awestruck by their superhuman intelligence, ambition, guile, discipline, and singularity of purpose." How else, after all, could they run their operation so undetected, were hypercompetence not involved? Kay connects these cognitive dissonances, historically and contemporarily, to anti-Semitism. He does it so well that the book is worth reading on its own for anyone with an interest in anti-Semitism. Jews are characterized in conspiracist (and fascist) literature and lore as being bloodsuckers, the basest and most defiled homunculi, yet simultaneously omniscient regarding every financial interaction in their purview while controlling every political outcome in their polity.
Yet another classic mistake conspiracists make when arguing is arguing emotionally. They speak hurriedly not because of the urgency worldwide conspiracy mandates, but because of how urgent it is to them that you not dismiss them. "Talking to Jones [a prolific conspiracist] is exhausting," Kay relates of an interview. "He spits out every sentence as if he were calling the police to report a crime in progress."
The Jones type, Kay argues, both wants to be smarter than the average Joe and probably is. He therefore genuinely believes that other people are 'sheeple.' "Once you discover reality, what is being admitted, all the crimes, and you go around to the zombie-like media and tell people to read all this stuff," Jones pleads, "they just giggle and say none of this exists, that government is good, it's upsetting, and so you try to wake people up." But sophists like Jones fall into traps where they fudge logic in favour of oratorical effect. Take the above quote. Since when would "none of this" existing imply us sheeple think that government is good? Conspiracists like Jones constantly concoct false dichotomies to press their point. If they can get you to choose between 9/11 being an inside job or some historical factoid being false, then you've already lost. The constant introduction of false dichotomies robs most people of their intuitions by using the power of suggestion to frame a question in terms that would seem ludicrous upon sober reflection. And "like all committed conspiracy theorists," the powers of a Jones sophistry extend to being "able to incorporate any new piece of information or historical development into a pre-existing framework." This ability, difficult to combat during live dialogue, isn't unique to them. What you see emerging after dissection of all conspiracist methods is that the difficulty of managing their theses and arguments has nothing to do with the power of the arguments, their logic, or the facts; rather, their power stems from the cognitive shortcuts they use, the same bag of tricks confidence artists and fake TV psychics employ.
"In this game, the conspiracist claims victory merely by scoring a single uncontested point – since, as he imagines it, every card he plays is a trump." The philosophers, historians, essayists, and political commentators who are actually worth reading and listening to take constant pains to meticulously prune their work of these shortcuts, these miniature fallacies if you will; to write in a way that should not mislead a reader into falling into a cognitive trap themselves, regardless of whether the resulting interpretation would be to the historian's benefit. Conspiracists either have no such qualms (indeed they often preach that the ends justify the means), or are so ensconced in a snug stew of cognitive biases and errors that it would be beyond them to conceptualize that some convincing arguments for their case should not be employed.
Paikin asks the armchair rebels, "how do you manage to see through this vast manipulation?" This is a crucial point, one that highlights the hubris built into youth and exhibited so well by the disaffected almost-twenty stoner. Truthers, however smart, however charismatic, will always possess a zealous vainglory for their own abilities: to rise above the fray, to not be a prole amongst bourgeois suburban sell-outs, to not be a sheep amongst a population whose epistemology takes cues from lemmings proceeding over a cliff, and certainly to not be able to keep their metaphors unmixed during their incessant soliloquies. Martyrdom often accompanies this layer of wonderful bullshit, as does the redemptive hero narrative. Michael Moore typifies this mix of inflated ego, radical politics, and self-righteous crusading. Rising to near-aphoristic heights in The Corporation, after remarking how capitalist interests will sell you the noose with which to hang them, he states without any air of irony or self-deprecation, "well, I am the noose." Again, sticks and stones aren't going to overthrow the nefarious oligarchy; spending late nights editing movies will suffice.
Kay and Paikin, fortunately, are simply not swayed by the momentum of Truthers, and moreover never flinch when baited. Zwicker’s main strategy in the roundtable is to make things personal. He wishes Kay’s book rested on ad hominem attacks upon his and other prominent conspiracists’ character (when in fairness, Zwicker is but a minor character). This is a good strategy since you can either discredit the writer by pointing out the feebleness of ad hominem arguments in general, or you can point out those ad hominems and present yourself as mature, collected, and erudite in contrast. What was funny to me about The Agenda episode was that Zwicker proceeded to present himself as increasingly whiny, blubbering, and foolish. “You characterize a great deal of people in a down-putting manner,” he says. But he doesn’t make any point about why this characterization would be inaccurate or imprecise. In fact, the entirety of his appearance on the show left me with the impression that I have been right to largely dismiss these people as a notch above (or on their way towards being) raving lunatics.
Some of the most insightful passages in the book come when Kay is describing men his age.
“Like all forms of midlife crisis, this sudden lurch into conspiracism offers middle-aged men a sense of revitalization and adventure. In some ways, in fact, it offers an even more complete escape than the proverbial mistress and sports car. For a middle-aged man who’s grown tired of life’s familiar patterns, conspiracism provides more than just fresh surroundings: It offers an entirely new reality.”
This seems right. The classic cognitive mistake humans make when falling into ideological traps is thinking emotionally, basing their 'opinions' subconsciously (or even overtly) on what they want to be true. "For all their pretensions to sophisticated truth-seeking, conspiracists often seem stuck in the suburban-basement universe of secret decoder rings and Star Wars action figures." Simply, "[they] have seen "too many movies" — particularly in the action genre." This is another masculine aspect of the phenomenon: a common, infantile coping with post-scarcity by injecting one's mind into a matrix unreality that, if it existed, would present the opportunity to prove one's manhood with a definitiveness that regularly mowing the lawn inevitably lacks.
If one is unaware that such a normative process is executing itself in a perpetual loop, creating conclusions, true and false, based on flawed method, then the odds of escape are understandably low. Kay, ever the realist, notes that the first thing humans are willing to do for their ideology is lie, to themselves and to others, most particularly, “when the course of human events doesn’t correspond with the results demanded by their ideology.” Towards the end of the book, and without drawing attention to it, he makes the humble admission that the project “has made me more self-aware when I bend the rules of logic in the service of ideology or partisanship.” More than one reviewer found this quite refreshing and I couldn’t agree more.
To his point Kay quotes Orwell on nationalism, a passage which, like many in the book, applies more widely than to the analysis of conspiracism alone, but not so widely that the thesis meanders. On the subject, Orwell, it almost goes without saying, is the master of precise lucidity:
“Every nationalist is haunted by the belief that the past can be altered. He spends part of his time in a fantasy world in which things happen as they should – in which, for example, the Spanish Armada was a success or the Russian Revolution was crushed in 1918 – and he will transfer fragments of this world to the history books whenever possible … Events which it is felt ought not to have happened are left unmentioned and ultimately denied. In 1927 Chiang Kai Shek boiled hundreds of Communists alive, and yet within 10 years he had become one of the heroes of the Left. The re-alignment of world politics has brought him into the anti-Fascist camp, and so it was felt that the boiling…didn’t count…. Some nationalists are not far from schizophrenia, living quite happily amid dreams of power and conquest which have no connection with the physical world.” To Kay, conspiracists distend the human ability that creates “Japanese historians who have averted their eyes to the rape of Nanking.” And that seems pretty reasonable.
Donning his National Post hat, Kay theorizes how conspiracists often shape their paranoid narratives "so every epic tragedy the world suffers must somehow be laid at Washington's doorstep." In the absence of "positive ethnic or religious attachments," these "failed historian[s]" inevitably gravitate to an "embrace of strident anti-Americanism." As I have argued elsewhere, both Orwell and Zinn at this point are arguably doing more harm than good by being read. The predispositions of many readers, I have to believe, lead them to misinterpret the intentions of the works and form false inferences. It doesn't seem implausible to me that in a given decade some great work will stir more well-intentioned anger than well-directed dispassion. There wouldn't be much point in handing out Leviathan to Russians in 1950. His National Post hat quickly finds the hook when Kay details "the failed historian" who, beyond false inferences, forces his inferences to serve his purposes, and "began showing up in the form of shell-shocked free-market purists (often in Tea Party garb), who could not accept that the greatest recession of our time had been sparked by the recklessness of homeowners, overleveraged banks, greedy mortgage brokers, and other private actors." Sounds like these people would really benefit from re-reading Atlas Shrugged, doesn't it?
Of course, Truthers themselves know exactly what literature to recommend, at least to those who can be saved. The first layer of psychological protection the conspiracist relies upon is the belief that the rest of the world is ignorant, simply in need of exposure to their truths. Failing that, the second layer states, those who have been exposed to no effect are stupid: "As with Marxists who accuse nonbelievers of inhabiting a "false consciousness," many Truthers see non-Truther "sheeple" as not merely misinformed, but mentally deficient in some very basic way." Yet often their adoption of the conspiracist mindset had no roots in cold rationalism: "like religious converts who suddenly see the light of Christ, a surprisingly large number of Truthers told me that they 'just knew' that 9/11 was an inside job the second they saw the towers collapse." This mentality provides the converse to the third layer's narrative: that a nonbeliever, failing being stupid or ignorant, is evil. A great deal of self-hate, vestigial stoicism, and unproductive envelope licking is spent by Truthers in "a quest to situate one's travails amid a meaningful struggle against some oppressive evil. The more oppressive the evil, the more meaningful the struggle." "We're in a crisis, a crisis as profound [as that] of the Revolution, the Civil War, the Great Depression, or WWII," claims conspiracist filmmaker Stephen Bannon. It isn't enough for the grandiose conspiracist to construct a big boogeyman; they have to state what I call the crossroads-in-history fallacy, the idea that now is somehow more important than before or after.
Whether this all is a larger coping mechanism for our "difficulty dealing with random, purposeless suffering," or for our inability to accept that we live in a godless, monistic universe with no overarching or predictable narrative ("virtually any amount of suffering can be endured if the one enduring it feels it has a purpose"), or both, life is given meaning to pitiable Truthers who would paint their kampf as one filled with courageous, reluctant, and therefore earnest, martyrdom. It seems to me that in conditions of post-scarcity, where markets operate relatively well (enough for social mobility and meritocracies to obtain) and positional goods exist as indicators of social position, for those who find themselves in a middling position yet harbour delusions of grandeur, the only logical step for their ego to take is to construct a narrative in which the society around them is actually some sort of pod-person ubiquity conspiring to limit the ascension they would otherwise deserve, and, as a result, their hand is forced to become a distributor of red pills to those still worth saving. But this construction might not be entirely truth deflection, but rather some other hardwired human failing. Miami University scholar Timothy Melley calls it "agency panic": "the conviction that one's actions are being controlled by someone else, that one has been 'constructed' by powerful external agents… This fear sometimes manifests itself in a belief that the world is full of 'programmed' or 'brainwashed' subjects, addicts, automatons, or 'mass-produced' persons."
Truthers may be very sociable and charismatic; it is perhaps what convinced them that they could write a book, that they argue so persuasively in the midst of people who tell them they really should publish something to the effect of their elocution. But when it's rolled out in cold black ink, their campy acronyms, their mixed metaphors, and their breakfast-detention-club-quality prose all reveal a rather childlike view of the great complexities adults find in this world. Kay notes in The Agenda roundtable that they talk about these things with too much distance, as if there aren't incredibly terrible repercussions if what they are saying is true. I would go a bit further. Truther writing is chock-full of false gravitas through a piling-on of converging apocalyptic forces, while the real gravity of the events they wish to dehistoricize is totally lost on them. A perfect example from Zwicker's writing: "Because the mainstream media are integral to the Industrial Military Academic Intelligence Media complex (IMAIM), the cold-blooded technicians of death face no journalistic scrutiny." These people seem so dissociated from what real life is, from the fact that people die horrible, gruesome deaths and the fact that assigning the blame for those deaths has severe consequences. It is as if the fact that World War Two was but a mere seventy years ago isn't enough to make the politics of this world incredibly vivid and the consequences all too real. That's the point I would make: that the world is insane enough without regrowing the tumorous belief systems Ockham's razor severs from the sane.
What helps to alleviate concerns of the impact of these individuals is to read their finer musings. The more they speak or write, the less sense they make. One author, a Roy Moore, warns of a “UN guard being stationed in every house.” You can tell these people have ascended beyond the archaic and pacifying propaganda of the newspaper outlets since they have no concept of how anaemic the UN has been historically or is at the moment. Another such bizarre instance that Kay quotes is a birther relating Jesus’ nativity to that of Barack Obama: “There’s no doubt about where he was born, when, and his parentage. Jesus recognized those qualifications were essential to establishing his right to his earthly throne as king.” Bear with me, “[t]hat’s because God didn’t want there to be any doubts about Jesus’ eligibility or qualifications to be the King of Kings. There’s a lesson in this story for Barack Obama. His nativity story is much less known!” If you can tease out an implication, it would be that Jesus wouldn’t have been as eligible if his place of birth hadn’t been well known. As an aside, that’s one of the more nonsensical parts of the birthers’ argument. They are either xenophobes or racists whether they are honest with themselves about it or not. If Obama had been born in Italy, I don’t see how it would matter one iota. After all, people from Italy don’t have some sort of genetic governance disease that impairs their ability to cut taxes. It is just nonsense. The only job I can think of that should actually require you to prove your place of origin is the Miss America pageant. Actually, it might be the same fallacy that leads these conspiracists to believe all Jews both know each other and have regularly scheduled (scheming) cohorts, that somehow an Austrian-born president wouldn’t have the best interests of America at heart solely because of his vagina of origin.
At some points you question the value of even giving people like this the time of day. That seems to me the main reason to be critical of Kay’s project. I wouldn’t say he credentials nutbars by acknowledging their existence, but in his unflattering portrait comes the feeling that what these people think matters. It might be the case that it doesn’t. Not at all, not in the slightest. It doesn’t matter, for example, what wild theories children hold, at least relating to geopolitics. If a child believes his allowance is paid by the laundry troll, no legislation is affected. Well, conspiracists are the children, the spoiled brats if you will, of the marketplace of ideas, spoiled by an absence of confrontation, not with their parents’ authority, but with reality. Their reality is defined by their own ‘critical thinking.’
This is the defining strength of the conspiracy theory in my view, because it is true that being critical is the path to an answer that is probably better than the one you currently possess. So conspiracy theories are arrived at by the process of self-doubt that characterizes all intellectual progress. Except it isn’t intellectual progress, it is intellect gone awry. This, along with the kernel of truth most conspiracy theories centre on, creates such a ballast against criticism that their entrenchment amongst some followers seems completely unshakable. It is one thing to have an ideology grounded in the divine word of god. Sitting around and dismissing that out of hand is at least possible. But when the ideology which grips your mind has a central tenet that it is the product of critical thinking, of questioning, of probing into one’s previously accepted beliefs, that is where the true mind virus lurks: an insemination of false serum that is spouted upon physiological prompting, all the while the spout is convinced of its philosophical validity, that it was conceived in the pure thought plane of Cartesian doubt. “The experience also has convinced me,” Kay admits, “that any effort to engage committed conspiracy theorists in reasoned debate is a waste of time. Once someone has bitten down on the red pill, it’s too late. As with any incurable disease, the best course isn’t treatment, it’s prevention.”
Often conspiracists are filling a void, Kay reasons, because “society requires some creed or overarching national project that transcends mere intellect. When the appeal of traditional religion becomes weak, darker faiths assert themselves: including not only communism, fascism, tribalism, and strident nationalism, but also more faddish intellectual pathologies such as radical identity politics, anti-Americanism, and obsessive anti-Zionism.”
For Kay to be right about this, the thesis that some humans are a plane, if not planes, above others in simple reasonableness must be defended. Because there are people who work regular jobs, who have decent families, and who hold no allegiance to a religion, a creed, a national project, a tribe, or an ideology. Their ideology is pragmatism, that is to say, no ideology at all, and their national project is being productive for themselves and for the ones they love, in order to ultimately enjoy life and feel satisfied.
“We speak of the Enlightenment in the singular. But as historian Philipp Blom emphasizes in his recent book Wicked Company, there actually were several enlightenments; each led by a man of ideas trying to put his distinct stamp on the complex philosophical ferment of the seventeenth and eighteenth centuries. Yet all of them were bound up together by what we now describe as skepticism.” In my previous post I criticized a bit of the way Kay used “Enlightenment” in his prefacing remarks. That criticism definitely should be recanted after reading the entire book, because as in the passage above he revises and clarifies his use of the term.
I suspect Kay and I will ultimately disagree on the place of scepticism. As he quotes Voltaire, “An atheist, provided he be sure of impunity so far as man is concerned, reasons and acts consistently in being dishonest, ungrateful, a slanderer, a robber, and a murderer. For if there is no God, this monster is his own god, and sacrifices to his purposes whatever he desires and whatever stands as an obstacle in his path. The most moving entreaties, the most cogent arguments have no more effect upon him than on a wolf thirsting for blood.” Hence the necessity of invention. “In modest doses, skepticism provides a shield against superstition and false dogma. But when skepticism is enshrined as a faith unto itself, skeptics often will conjure fantasies more ridiculous than the ones they debunk.” Be that as it may, the failings of many would-be skeptics do not invalidate its power. Kay worries “that most researchers seem hesitant to suggest that any view of the world — no matter how preposterous — is unambiguously wrong.” To me, this is relativism not skepticism. And total relativism we can do without. But total skepticism seems to me indispensable. “From a young age, human brains seem programmed to see design and intention behind the world around them. When asked about lions, children tell social science researchers that they exist so we can see them in the zoo. When asked why some rocks are pointy, children will respond: “so that animals won’t sit on them.” No less a thinker than Aristotle theorized that rocks fell downward so that they could take their natural place in the world.”
The things we can ultimately know with most certainty are the things that are false, not those that are true. This is the light skepticism shines into the darkness: it exposes falsity, but it cannot prove truth. Nothing can. “Unlike [modern atheists] Voltaire understood that man cannot survive on scepticism alone.” In rejoinder to this I would point out that the life of man does indeed expire. Kay has penned a great work, one that, I feel obliged to add, contains many more insights, on topics from antiracism to dystopian literature, than this quotation-heavy review shows. What effect the book will have may be unclear, but Kay can sleep safe knowing, not only that no diabolical marauder is planning, in an act endorsed by our puppet parliament, to harvest his spleen, but also that he has put forth a great and successful effort to fight the good epistemological fight.
* * *
“What is madness? To have erroneous perceptions and to reason correctly from them.” The introductory quote Kay selects from Voltaire loomed over the work. One of the better-reasoned points to take away is that conspiracists aren’t stupid by nature. They are degrees of bright, sharp, erudite, learned, clever, and perspicacious. And therein lies the rub, for to their faults we are all susceptible, in degree and in entirety. We are born into a world of stimuli charged with somehow forming true inferences, yet there are no rules or ontological safeguards preventing us from going wrong. “Our default position is just to assume that all patterns are real. This is the evolution of ‘patternicity’ or superstition,” Kay again quoting Shermer on the false positive versus false negative dichotomy.
We must guide our mind as it navigates the world’s many fallacies, the shortcomings of its design, while exposed to a bombardment of data. Beyond that we must guard our mind from ourselves, from our own desires, urges, and false intuitions. To constantly separate what is true from what we want to be true. That’s why the skeptic’s work is never done. To settle on conspiracism, Marxism, libertarianism, or liberalism is a relenting of a stoic conclusion-lessness that must be maintained in order to reason correctly, to go as deep as necessary and further, to question everything. A full grasp of falsifiability and incompleteness, we can be sure, has yet to pervade the popular mind. The temptation of declaring to oneself success, the lure of knowing in verity the answer, the very thing that at first motivates the pursuit of knowledge grows to impair, in questions of philosophy, the chances of its achievement. Some would rather lie than sleep and die with questions unresolved. But as during a long fast, eventually hunger fades. We can be content with “pretty likely but not for sure.”
Full skepticism, never to be resolved, is the only starting point, one that needs periodic revisiting during exploration, one whose forced agnosticism you need not ever leave on subjects of which you know nothing. Because we can be sure from watching Descartes’ journey that but a few steps from the distilled calm of nothingness, are the opaque and perilous shadows of knowledge, inevitably under which all stumble and err.
"Among the Truthers," Part One, Philosophical Underpinnings
Jonathan Kay is a columnist for Canada’s right-of-centre newspaper The National Post. I am sure I have read some of his pieces before, and I have seen him on television a time or two, but to be honest I don’t remember much about him before he went on TVO’s Agenda the other day to promote his new book, “Among the Truthers,” which I’ve started to absorb. That’s the first part of full disclosure.
The second is that I don’t “believe” in 9/11 conspiracy theories, one reason being that I believe “believe” is the wrong word. It isn’t a dichotomous either-or and shouldn’t be posed as such, because it gives the believers far, far too much credit. They are framing the debate from the get-go by asking a question. Without saying too much about what I think on the topic right away, I will say that I think the Humean approach to these people’s claims is best. That is to say, which is more likely: convoluted conspiracy A, which requires the participation or confounding of large organizations like the media, various government agencies, private enterprise, etc., or that people on the fringes of society are often there for a reason. They’ve got the causation backwards when they lament about society marginalizing them. They (often) think their intoxicating dissent, sure to inspire the masses, is being suppressed by the establishment, the hierarchy of power — landing them on the margins. Rather, they are on the margins, and that is why they also have the property of having absurd ‘beliefs’ they describe as dissent, and it is also why they don’t understand proof or research, thesis or synthesis, empiricism or rationalism, least of all the thing they claim to understand the most, skepticism. If they understood research, for example, they might be writing something readable or interesting or technical, not some excitable, alarmist, hyperbolic drivel.
Kay begins with a parable of true history, the “Lisbon Disaster,” a massive earthquake that caused a destructive tsunami and a large conflagration, which, when combined in simultaneity, inspired (somewhat understandably) a large number of Iberians to declare the world was ending. The event also strengthened Voltaire’s case that we do not live in the best of all possible worlds, the view he famously skewered in Candide and the view of Leibniz with whom he originally took issue. Kay sees a disturbing growth of irrationalists cum conspiracists, one greater than its historical precedents.
Like the Lisbon Earthquake, [9/11] has had far-reaching social, political, and psychological consequences that have yet to be fully absorbed or understood.
Kay cites the dismissal mainstream intelligentsia, both academia and the media, has afforded conspiracists, “truthers” if you will, as a dangerous expression of the idea that these things come and go. His book wants to prove its urgency by showing that these times are different. I think the author undermines this point a little from the beginning in listing all the times that have created demand for “a grander narrative than mere chaos, and grander villains than mere criminals and lunatics.” These are the French Revolution, the Great Depression, WWI’s conclusive treaties, the Cold War, JFK’s assassination, the Vietnam War, Watergate, and the rise of counterculture. Kay constrains some of those to their locality, but doesn’t it stand to reason that, if events that flare paranoia, generate “shrieking prophets,” and mark “when… conspiracy theorists found their followers” are so abundant throughout history that they aren’t even ebbs or flows, but rather, just the way people are? What I am saying is that while it is possible that people are more or less prone to believe conspiracy theories now than before, the fact that events have occurred, and their nature, is unlikely to be the cause of them. So simply because 9/11 was as spectacular as it was, as opposed to the less spectacular Watergate scandal, it is unlikely the visual spectacle, or the death toll, or the short-term impact, will affect a population’s proclivity to entertain wild notions. At least, as much as other things. Remember that for historians and political scientists, those interested in changes to the distribution of power, the single most important post-WWII event remains, by leaps and bounds, the fall of the Berlin Wall. Maybe it is the writers who are misplacing 9/11 because of the stunning visual spectacle that it was. Who could blame them?
One of those other things that Kay lists, which is generally a well-argued thesis, is that the internet has caused a “balkanization” of common discourse such that people of similar opinions can operate in silos, or ideological islands, reinforcing common misunderstandings and narratives. Passing around the communal cognitive Kool-Aid, if you will. I am certainly in no position to say that this is not the case, but I have proffered that it could be that the population was always this stupid (or more likely, stupider), to be blunt, and that the internet has merely offered a more reflective mirror of society than we have previously possessed. Those nutjobs you would never meet in backwater Minnesota are now posting on Facebook. But they were always there, living in their sheltered world. I think generally the people who think that the internet is causing silo effects have to answer: for whom, and compared to what? Because the assumed comparison is to a liberal arts education, where reading the great books might relieve some misconceptions, hopefully instill some critical thinking, and at the least impart some epistemological humility. But the people who are committing logical fallacies on the internet, who think watching YouTube clips is research, aren’t the people who would have become educated in the past. So their exposure to other ideas has to be greater with the internet’s invention simply by virtue of the fact that they were unlikely to have exposure to other ideas before it. And the silo effect thesis may be better explained by the fact that those we would consider rational or articulate or epistemologically self-aware used to be the generators of media, and now are still the same (or greater) in absolute numbers, but are watered down greatly by the influx of previously nonexistent web commentators and website commenters. Kay argues well in the opposite direction:
At the Web’s birth in the mid-1990s, it was imagined that these new information technologies would usher in an Enlightenment dreamworld of mutual understanding and rationalism. Instead, the opposite has happened: Rather than bring different groups into common discussion, they instead propelled radicals into their own paranoid echo chambers.
What I am suggesting is that the echo chambers have merely gotten bigger, not constrained to your trailer park or your university’s sociology lounge. But also that the chance to run into something outside the echo chamber, on the internet, has to be nonzero, because of the ease of navigation to parts unknown. Before, people who had read the great books and who debated points seriously wouldn’t be caught dead in the trailer park, never mind the sociology lounge!
Kay handles Voltaire passably, but his characterization of the Enlightenment is a little puzzling at times as he relates it to today:
But the Christian intellectual monopoly that the Enlightenment overturned at least provided society with a shared frame of reference. Moreover, it also provided a cosmic explanation for evil — the main preoccupation of the secular conspiracy theorists who have proliferated in our own age.
So we have some good ideas here. First, that a shared frame of reference has some benefits, or at least properties, that make explanations and communications easy. When everyone is approaching an issue from a different set of axioms, with little crossover, there is little hope, and no foundation, for agreement down the line. Second, Kay begins to tie the conspiracist agenda to some broader psychological desires, like the explanation of evil.
I don’t want to put words in his mouth, but it seems to me that often post-Christian and post-theist worldviews are mere continuances of the cognitive mistakes we can see Christians making; explaining the existence of evil by pointing to the sky is not the only one. I think of secularists who take morality to exist, or who vouch for, endorse, or live by the golden rule. The golden rule is essentially a distillation of a handful of the ten commandments based on wishful thinking. No reason is ever given as to why it should be a rule; its acceptance and verbal repetition is usually an emotive expression of the desire for fairness and morality in a chaotic world devoid of an existing moral framework. Locke got this right when he wrote
Promises, covenants, and oaths, which are the bonds of human society, can have no hold upon an atheist. The taking away of God, though but even in thought, dissolves all;
Most want to have it both ways: they want to reject both god and their parents’ church, but keep their peace of mind. A universe without cosmic moral repercussions, for some reason, is hard for some to accept; I submit this is a common cognitive error replicated by post-Christian ideologies.
But let’s return to the Kay passage. “Christian intellectual monopoly that the Enlightenment overturned” is a dubious phrase, mainly because the Enlightenment was a Christian project, serving Christian ends. Descartes’ entire motivation was to set Christian theism on firmer rational grounds than the French Scholastics had achieved in the centuries before him. The atheistic, deistic, or anti-theistic Enlightenment philosopher was the exception, not the rule. Kant perhaps remains the staunchest philosophical defender of Christianity today, or the provider of the most robust arguments. Hume and Hobbes were extraordinary, the former only admitting to agnosticism, the latter (likely an atheist) denying charges that he disbelieved in the Trinity. So perhaps the Enlightenment encouraged new ways to think, new methods of argument, that would later be employed, or would develop, into anti-Christian ends. But largely the movement was for Christians by Christians, and the case for God was stronger after the publication of Enlightenment texts by and large; in short, a “Christian intellectual monopoly” was well preserved in the 17th and 18th centuries.
Where Kay excels, in this portion of his work, is on the epistemic front. For whatever historical reasons we can disagree on, we have found ourselves in a place where:
Many of the most revered liberal arts scholars of the postwar era have cast doubt on the very idea that language can act as a bridge between people holding different viewpoints. Thanks to the rise of identity politics, it is imagined that words — and even facts — have no meaning independent of the emotional effect they produce on their audience: Everyone feels entitled to their own private reality. And so the idea of rationally negotiating a consensus truth about the way our world works came to be seen as not only impossible, but undesirable — a trap created by society’s privileged caste to justify their position. “There is one thing a professor can be absolutely certain of: Almost every student entering his university believes, or says he believes, that truth is relative,” wrote Allan Bloom in his 1987 book The Closing of the American Mind. “The study of history and culture teaches that all the world was mad in the past; men always thought they were right, and that led to wars, persecutions, slavery, xenophobia, and chauvinism. The point is not to correct the mistakes and really be right; rather, it is not to think you are right at all.” [Emphasis mine]
This is the zeitgeist in which the natural arrogance of youth festers into bold-faced decrees of I am entitled to my own opinion. Where being true to oneself trumps truth itself. [someone must have written that sentence before now] And it isn’t that today’s rabble have given up on finding the truth because they have been informed by Pyrrho about how difficult that project actually is, rather that they dismiss the project as arrogant as opposed to earnest, and, wrapped in their own pretensions, fancy themselves above it, above the need to research, to reflect, to grow, and to mature, before making metaphysical pronouncements that are true by fact of solipsism. Perhaps the internet has warped some intellectual development, by shoving the metaphoric mic into the face of every young person. And the mic has a green light on its handle next to a big sticker that says “Speak up!”
Kay goes on to make necessary and important acknowledgements: that conspiracies do and have existed, by any reasonable definition, and that the criticisms of media from the left (Chomsky’s corporate control thesis) and from the right (the media as partisan and elitist) have some grains of truth to them. Moreover, contemporary conspiracy theories have some grain of truth to them, if only that we do not know, and therefore cannot explain, everything. It is these concessions one would hope would start to reverse this characterization of today’s discourse:
It is not unusual for intellectuals and politicians to reject their opponents’ arguments. But it is the mark of an intellectually pathologized society that intellectuals and politicians will reject their opponents’ realities.
But that is not Kay’s aim. Instead he sets out in the book to arm those who would defend “The Age of Reason,” by informing them about a threat to “our intellectual landscape” that “is simply too important to ignore.” With that thought, the book’s preface concludes.
Should I start posting about being right? I don’t know. This one time is an experiment. Look at what some National Post person wrote about Stephen Harper’s post-election victory moves:
It’s not just that Mr. Harper decided to appoint three more unabashed partisans to the Senate. It’s not just that the Senators-to-be, Larry Smith, Fabian Manning, and Josée Verner, were rejected by Canadian voters only two weeks ago. And it’s not just that the PMO’s announcement of the appointments was seemingly timed to be as contemptuous of the public as possible — just after the new Cabinet was announced, and mere moments after the Prime Minister had completed a question-and-answer session with the media in Ottawa. It’s all of it, in one tidy package: more patronage, less respect for democracy and less accountability. He’s long since given up the pledge to only appoint “elected senators,” of course, but it takes some gumption to swallow all those principles at once.
Do look for Harper to make his most nepotistic appointments and most authoritarian executive decisions now or in the near future, while his victory is fresh and his re-election campaign is furthest away. Recommended by Machiavelli and exacted in Harper’s past when he, for example, appointed insider Michael Fortier to the Senate, this strategy is tried, tested, and true. Try not to swallow this pill too bitterly.
What Machiavelli wrote in The Prince in 1505:
Hence it is to be remarked that, in seizing a state, the usurper ought to examine closely into all those injuries which it is necessary for him to inflict, and to do them all at one stroke so as not to have to repeat them daily; and thus by not unsettling men he will be able to reassure them, and win them to himself by benefits. He who does otherwise, either from timidity or evil advice, is always compelled to keep the knife in his hand; neither can he rely on his subjects, nor can they attach themselves to him, owing to their continued and repeated wrongs. For injuries ought to be done all at one time, so that, being tasted less, they offend less; benefits ought to be given little by little, so that the flavour of them may last longer.
Apparently this Scott Stinson originally endorsed Harper pre-election, which is why his turnaround is of note. The reactionary are usually the least well-read, because they see some event as having no precedent, when the reality is that they have overestimated their qualifications to offer an opinion.
I just bought “Among the Truthers.” Expect a review soon.
Consciousness doesn’t return. It is the scarcest resource despite its abundance and despite our ability to create it. We have a supply of it that can be replaced but does not replenish. And it is for these reasons that the threshold to justify its extinguishing should be enormously high. Mistakes are egregious and irreversible.
The costs of doing business
But there is a bright side, in a way. And that is that perhaps the act is not egregious when the extinguishing is justified. That is to say, for example, it is not a big deal when a mass murderer is shot by a prison guard. The philosophical justification is two-fold, the first point being the obvious fact of solipsism. People on the chopping block aren’t you or I. The second is also tautological, an option has to be weighed against the alternatives, not “should be weighed.”
The economic argument is secondary, but let’s begin with it. The costs of keeping murderers alive behind bars are considerable when considered on a per capita basis. However, those who focus on this as a strengthener of the capital punishment argument miss the obvious point: the prison system is an egregious waste of state capital, and the low-hanging fruit does not grow on death row. No one should go to jail for getting high, no one should go to jail for soliciting a prostitute, or committing a petty theft, or a hundred other things. Release prisoners and stop imprisoning people; that’s the obvious way to cut prison costs. Reduce sentence lengths for nonviolent crimes. The alternatives argument really can’t stand on a budgetary basis given that the alternative has a simpler alternative.
Moreover, while we focus on court-ordered executions, the track record of the state is quite poor. A nonzero number of innocent consciousnesses have been, and are to this day, legally executed. The state is incompetent in matters of internet regulation and road maintenance; it cannot under any conditions be trusted with capital that has no exchange rate.
The other alternative
What, apart from the economic situation, is the difference between someone imprisoned and someone who is not? Their theoretical ability to do physical harm to others. The window for a margin of error can open outside state-guarded walls, since the nonzero chance of innocent execution has to be weighed against the chance the candidate kills others. Another caveat: in the real world guilt can be determined with much less sterility, and candidates for justified execution still have the opportunity to commit incriminating acts; indeed they do, and often. When someone not dead is likely to kill, when the executed is not likely to be a big loss, when the potentiality or the actuality of quarantine is unrealized, then the fact that in the real world nonacting is acting underlines the possibility of justified killing.
In this the state should not be trusted, in this the state should not be believed, and in this the state should not be given rope, license, or encouragement. The state has enough hubris as it is that we need not worry about it proving that not acting is acting. But the fact that it abuses its ability to take such action does not mean that all those actions taken were unjustified. Paint splatters inside and outside of the margins. Context matters. Always remember that both things can be true when you observe a dichotomous human narrative. The state has a monopoly on the legitimate use of violence, achieved illegitimately, and abused frequently. There are “bad guys” in the real world and when they are killed the world is better for it.
So yesterday Canadians partook in their forty-first general election and returned a stunning, historic result. For anyone who has pored over the returns of yesteryear, or read John Duffy’s chronicling of Canadian elections, Fights of Our Lives (now recommended), it would be hard to overstate the legacy effects of this election. Jack Layton’s legacy is secure, a remarkable turnaround for someone who was perilously close to not returning as leader of his party after each of his returns in the 2004, 2006, and 2008 elections. Then, his efforts of 19, 29, and 37 seats scraped the bottom of the NDP’s always lofty expectations. Now at 60, having pressed on through real health issues, Layton has achieved the breakthrough New Democrats would only speak about in moments of uncontrollable fervour. And while given their idealistic nature those moments weren’t exactly rare, no sober prognosticator could have detailed a victory of this magnitude in anything but sarcastic tones in the days of Taliban Jack, in the days the (now thinly) mustachioed blusterer accused Paul Martin of contributing to the deaths of homeless people.
Gilles Duceppe’s legacy is finalized. It is funny over the years how many people I respect have mentioned how they agree with so much he has to say, the sovereignty question notwithstanding. This quiet, well-spoken, and humbly thoughtful man who was so often demonized by the rest of the country is left to hopefully participate in the private or nonprofit sector, where if he so chooses, he can effect great changes. If you have actually listened to the federal debates of the past, if you have actually watched question period, and you haven’t done so through the red-coloured glasses the sovereignty movement has distributed to the rest of the country, then you would recognize Duceppe as a man of reason and of reflection. These are rare traits amongst politicians. His legacy is finalized, his cause dead. I want to use that tempting phrase “make no mistake,” before saying the separatist movement of Quebec is now over for all practical purposes. It may elect a few people, it may let its voice be heard, but it will no longer threaten the geography of Canada. The demographics of Quebec have effected a slow bleed in the BQ’s sexual potency for a long time and that extinguishing will not abate over the next twenty years, pharmaceuticals be damned.
Michael Ignatieff has been trounced. Of this, there is not too much to say. The Liberal Party of Canada was of little interest during this campaign and is of little interest now in light of its results. It will be interesting, however, to see the party’s opinion of proportional voting systems now that it is wearing Jack Layton’s sneakers to the ball. Iggy announced his resignation a few hours ago and I welcome his return to writing books filled with tautological fatuities from political philosophers and poorly crafted Old Europe allusions.
Elizabeth May’s breakthrough has finally come. It is no small achievement to be elected at a time when many marginal Green supporters undoubtedly jumped ship to the NDP bandwagon. The Green party returned considerably fewer votes in this election than in recent memory, though I do not think this a condemnation of their prospects moving forward. The pool of potential voters it can draw from in future elections is very large, well into the millions. It all depends on the continuance of voter perception of this being a protest vote. I have never conceived of it as such; after all, over 900,000 Canadians voted Green last election, when they elected no one. That’s either a political base or the world’s largest protest. The best thing about electing only one member, especially one with May’s increasing profile, is that the Greens will receive such a disproportionate amount of media space it is inevitable their message of fiscal conservatism and their focus on public policy incentives will reach the many Canadians who tap their feet to a monotonous beat.
It is funny just how bland Canadian politics is, just how reserved the rhetoric is, and just how wide the room for error is in its governance. Do not expect Stephen Harper to step outside those margins. If you are having trouble grasping this conception of things and don’t feel like reviewing the state of the rest of the world, merely watch and reflect on this video where Peter Mansbridge asks Harper, Ignatieff, and Layton to recount an error in judgment on their part and what they learned from it. The candour that results strikes me as the kind one would expect from a politician who has retired to pen his memoirs, not someone trying to be nationally elected at the end of the weekend. And in this video I think you can also see the result of a decade-long mellowing of both Stephen Harper and Jack Layton. Mellowing and maturing aren’t the first two gerunds I would imagine spring to the minds of Canadian voters today; I can only speak for myself as a long-time observer of this polity and its rhetoric. Years ago Layton and Harper would properly have been labeled Canadian political extremists, and for good reason. Yet to hear Layton pick the word “thoughtless” out of the air at his discretion to describe an oration of his past, to hear Harper select his support for the war in Iraq out of an infinitude of possibilities, should be pleasing to the brains of all who processed this footage, because for one, it was in both cases the exact wrong answer according to any slick self-disrespecting public relations “guru.”
Harper has a classic inability to be inauthentic, and for most of his time in the public light this has probably been a motivating factor for those he has rubbed the wrong way. You can tell when he is uncomfortable. When explaining himself he exudes it. You can tell when he disdains the labour of someone asking him a question, and you can tell he disdains the questioner. Karl Rove would never approve of his adversarial inclinations, of his dismissive petulance, of his glacial stare; these things above all others are the reason he was pegged as unelectable previous to his defeat of Paul Martin. This is a man most comfortable in the box at a Leafs game, least comfortable kissing babies, and whoever thought he could pull off a cowboy hat and leather vest should have their brain bisected. Is it not fantastic that Canada has routinely elected someone devoid of charisma, who enjoys his history and his privacy, and who would describe himself as a policy wonk? If ever there were a sign of the salubrity of a democracy it would be an ability (nay, a penchant) for electing to its highest office an acharismatic candidate.
Harper’s perpetual authenticity has disconcerted me many times in the past, and for those with a weakness for what psychologists term the “ick factor,” I can only imagine the discomfort his ascension has engendered. But this authenticity betrayed the mellowing and the maturing to which I previously referred in his victory speech last night, where he stated “we are intensely aware that we are and we must be the government of all Canadians, including those who did not vote for us,” where he said “of listening, of caring, of adapting, those lessons that have come as a minority government we must continue to practice as a majority government.” I was nearly dumbstruck when I heard “I think I can speak for the entire country in recognizing the determination and tenacity of Mr Layton and his remarkable campaign” met with light applause and nary a boo from the conservative rally. The man genuinely congratulated Elizabeth May.
Of course there are issues moving forward. But contempt of parliament will not be one of them. For those who cried foul at the prorogation, at the consistent flouting of parliament’s rules and spirit Harper displayed in his tenure thus far as Prime Minister: we needn’t worry about this any longer. There is no reason to prorogue parliament or break protocol when your party has a majority government. You don’t complain about an infection you cut out of your body; you complain about the wound inflicted. And in this analogy parliamentary deadlock and gross dysfunction are the infection that will cease to exist.
Perhaps the most stinging indictment of Harper’s candidacy came from The Economist, not because it was the most critical, but for what it was critical of and for the position from which it speaks:
[T]here are some serious blots on Mr Harper’s record. He is a dinosaur on climate change. He has batted away all criticisms of the Albertan tar sands, where oil extraction is an especially dirty business, and placed his faith in carbon capture and storage, an unproven and expensive technology. Even some Albertan oil bosses favour greener rules. But the biggest worry about Mr Harper is his contempt for the rules of Canadian democracy. Since the previous election he has twice prorogued parliament for disgracefully lengthy periods, the second time to avoid awkward questions about whether his officials lied to the house about the treatment of detainees in Afghanistan. He has also got rid of watchdogs whom his government found too independent and generally tried to hand over as little information as possible to the public.
These failures should continue to follow Harper around. There are some reactionary “anyone but Harper” voters out there this morning who fear a repeal of gay marriage and an infringement on the right of a woman to abort her fetus. The latter is largely a judicial issue this Harper government will have nothing to speak on. The former will never obtain. Given Canadian attitudes, any attack on same sex marriage is likely to result in riots in the streets. Mr Harper has won an extraordinarily large amount of political capital; he is not going to burn it all on the bigots and the social interventionists in his party, whom he has never fully respected anyway. Same sex marriage is an issue of tiny importance in general and to a man who seems slightly inebriated with the pleasure of guiding a country. The rancour its reexamination would generate would be so disproportionately large and all-encompassing that no one who knows Harper for the rationalist and the pragmatist he is would ever consider such a venture a possibility.
Do look for Harper to make his most nepotistic appointments and most authoritarian executive decisions now or in the near future, while his victory is fresh and his re-election campaign is furthest away. Recommended by Machiavelli and practised in Harper’s past when he, for example, appointed insider Michael Fortier to the Senate, this strategy is tried, tested, and true. Try not to swallow this pill too bitterly. Politicians are politicians after all, above customer service representatives but below various incarnations of, and synonyms for, sludge.
With those social issues off the table and the cooling of Harper’s militaristic ambitions, look forward to an unremarkable management of the economy. Look forward to the absence of separatists.
The NDP is an interesting political party, because they have to exist, because they are exactly the worst party to govern, and because they are exactly the best party to serve as a squeaky wheel in need of oiling. The NDP is like an advocacy group for good intentions that doubles as a kind of white guilt detox clinic that no one ever graduates from. They don’t understand how prices work, they are supremely naive and unrealistic, they propose fatally flawed solutions, they are comprised of many well-to-dos who know what’s best for poor people, they unconsciously and continually suggest straight-up declinist falsehoods, and when they aren’t those things, they are comprised of people who only vote for them because they feel like it’s the right thing to do. They might just be the perfect opposition party that never touches power.
I have been wondering for a while how it is possible for people to make statements like “things are getting worse, jobs are disappearing, benefits are eroding, society is going down the toilet” in stark contrast to everything that smacks them in the face every day. Part of it has to be some unawareness of, or inability to grasp, the fact that people in other places of the world, and people in the West’s own history, actually have burnt people alive. If you download Skype you can talk to anyone in the world by video for free, and if you have a tap in Canada you pay for water by the penny. But I digress. What if it were exactly this type of intellectual rube that produced the best possible performance from those with power, because of the demanding nature of their unsolicited whines and whinnies? If they advocate for good intentions well enough then maybe those pushing the levers will keep that in mind for fear of an inundation of annoying reprisals.
Most telling in analyzing this election is a line I paraphrase from Harper’s victory speech: “Canada, an island of stability and security in a troubled world.” It is easy to be bullish on a country that from the perspective of the outside world possesses no looming problems, no immediate threats, no tendrils of corruption, no grave injustices, no absurd scandals, no militaristic ambitions, no fomenting economic bubbles, and a total absence of widespread-violence whisperers. The least incompetent are in charge, the most well intentioned are in their ear, the separatists are disbanded, the insubstantial are diminished, and the greens are included. This is Canada as Candide, where the object most worthy of satire is the innocuity of its demagogues, and where the generator of the novella’s greatest irony is a population’s incapacity to consider the possibility that they live in the best of all possible worlds.
Check out this George Carlin bit where he repeats a version of the critique of mass society. His conviction is worth noting, but what makes his expression noteworthy, I think, is how clearly he states everything that a critic of mass society believes, that is, how clearly he states these fallacies.
Why does education suck? “Because the owners of this country don’t want that. I am talking about the real owners now. The big wealthy business interests…” Notice how everything starts with the real truth. The real story. The down low, what they don’t want you to know. Everything about this woolly thinking is predicated on the people who believe it being on the inside scoop of something of which the vast majority of plebeians are completely unaware. Never do they start, “as everyone knows, everyone in this society is a sucker.” Carlin is just getting warmed up,
"Politicians are put there to give you the idea that you have freedom of choice. You don’t. You have no choice. You have owners. They own you. They own everything."
If they own me then why don’t they ever tell me to do anything? How come they never cash me in? Oh right, everything is subversive. Everything is socially programmed through social reproduction. How could I forget what is silently controlling my every manufactured desire?
"They don’t want people who are smart enough to sit around a kitchen table and figure out how badly they are getting fucked by [the] system. Do you know what they want? Obedient workers. People who are just smart enough to run the machines and do the paper work and just dumb enough to passively accept all these increasingly shittier jobs with the lower pay, the longer hours, the end of overtime."
This is the fallacy of someone who has never known a working class man, a blue collar, a miner, a stiff going every day to make widgets at a Chevrolet factory. These people have the same opinion as George. They all believe they are getting fucked. Nowhere, in no place, is any appreciable part of the population acting like mindless drones, acquiescing to the political system. There are not many people this smart and this dumb, though there are plenty of both. After all, such constraints are hard to fit. Carlin makes another common confusion. The ability to perform tasks, general proficiency, is neutral with respect to critical thinking, and vice versa. Moreover, a predilection for or nurtured habit of critical thinking doesn’t guarantee you will come up with the right answer, even if you are smart. George Carlin is a case in point. I’ve met a lot of hicks in my time, and not one of them ever thought the government or “the system” was looking out for them or that they weren’t getting fucked. I’ve met plenty of blue collar factory workers and the same can be said of them. They may be hicks, they may be uneducated, but they aren’t idiots. They know politicians are crooks, they know shit goes down at lobbyist lunches. One problem with sermons like the one Carlin delivers is that they lose a ton of their punch if the premise that “the word needs to get out” is revealed as completely vacuous.
A second fallacy stated in the quotation above is the good old times fallacy. Maybe there is a better name for it, but there is no better time to live in this world than now. I mean, unless we could live further into the future. Anyone who has seriously examined the general trend in standard of living in the world, much less the United States (the country to which Carlin is exclusively referring), would never conclude things are or have been getting worse. The only country in the world, to my knowledge, that experienced negative growth in standard of living, at any point, in the past ten years was Sierra Leone. Congo, Rwanda, Sudan, these are the real outliers and the only places people concerned about a declining quality of life should worry about. Any union leader, any right wing or left wing reactionary politician, any populist or religious leader, comedian or loon operating in the United States who can seriously lament the declining quality of life in their country has a considerable inability to process information. There is, of course, the historical trend: overreaching social programs of the pre-Thatcher West have been rolled back over time as the cheque has come due. There is definitely a lag in time from when a society says the state will pay for everyone to have endless benefits and never go hungry, to when it figures out that it can’t actually afford all that. And this is what Western society experienced en masse as the 70s transitioned to the 80s, some countries more than others, with Britain being the archetype as Thatcher decommissioned the least fundamental and the most expensive branches of the welfare state. But again, if you had been paying attention, you would know that. And it is fine if you didn’t, just don’t say you have been.
The less often examined false premise that underlies a lot of Carlin-style critiques is the notion that awareness of a problem leads to its solution. Being aware of a problem and being able to effect its resolution bear no necessary connection. So even if a majority of people are ignorant sheep, the illuminating of their minds could do nothing. So again, under the thesis that most people know the world isn’t fair, or even believe in Carlin’s paranoid version of events, we wouldn’t necessarily expect to see a world different from the one we inhabit, because knowledge of a problem’s existence says nothing about the ability to effect its resolution. This makes for simpler theorizing, since we live in the world we observe. It is much harder to construct a vision of your fellow man as some combination of ignorant, stupid, and malevolent, operating in a clandestine yet all-encompassing phenomenon.
"They spend billions of dollars a year lobbying for the things they want. But I will tell you what they don’t want. They don’t want a population capable of critical thinking. They don’t want that. That is against their interest." Why it is against their interest is of course never explained. What exactly will a ton of critical thinkers be able to do if they already control everything? If the jig is up, no amount of awareness of the problem is going to change anything.
You can tell these people never read the newspaper. I always wonder what they would say about someone like Warren Buffett, the great proponent of a hefty inheritance tax (which currently does not exist), who lives modestly and enjoys bridge. What a patsy you are, I am sure they would tell me; what better example of tricking the masses than to have a front man for the exorbitantly wealthy appear as a man of the people. See how easy that was? The problem with having all the answers is that it implies you haven’t thought hard enough about it.
Almost without exception, every youtube video I watch that has over a million views has something like 15,000 likes for every 300 dislikes. The people who don’t like Kanye don’t listen to his music. When I go to a restaurant, and the girl whose food I am paying for asks for something on the menu to be modified, I have never seen them not do it. I hear when you buy some new homes you can select from your choice of 300 shades of off-white bathroom wall paint and from 3,000 different light fixtures.
Larry David and Jerry Seinfeld once reportedly had a dialogue similar to this:
Jerry: “Do you ever wish you weren’t circumcised?”
Larry: “No, I never think about it, why?”
Jerry: “I hear they are more sensitive.”
Larry: “More sensitive? I don’t want to have a heart attack.”
Not having been on both sides of that issue I can only say that the water on the other side is plenty warm. But more importantly, if this is what it is like to be owned, I can do without freedom.
LONDON - With a smile that lit up TV screens around the world, Kate Middleton married Prince William in a union that promised to revitalize the British monarchy. A million people roared their approval as the royal couple then paraded through London in an open carriage.
I take this as indisputable proof that there is a certain large minority, or perhaps a majority, of people who are irretrievably retarded. I can only hope that I live long enough to see the British monarchy disbanded and those who hold its titles thrown out into the street like refuse. They are the physical and economic embodiments of the historical inertia of anti-democracy, of anti-liberalism, of elitism, and of nepotism. I mean, this kind of bluster really should go without saying.
To all those women who fawn, who gossip over the dress, and who followed this with romantic hearts, I have this to say: you are truly stupid. We have television shows and paperback books in service stations for this kind of thing, to satisfy your guilty pleasure lust for all that is rom-com and pop glam. For satisfying that genetically encoded urge I do not fault you, in theory. But the monarchy are real people and their existence has real implications, people and implications I would happily forget about if I wasn’t constantly reminded. I’ll be delighted to hear of the institution’s funeral.
Late addition: There is a criticism of people like me that often goes like “the only people who are worse than the people who care about the royal wedding are the people who loudly complain that they don’t care to everyone they can.” I think this is a fine criticism. I don’t think it applies to me because: I care about the royal wedding. I care that it happened. I wish it didn’t happen, or that if it did, it wouldn’t matter to the economy. Or that if it did, it would be between private citizens who fancied themselves princes and princesses instead of those endorsed as non-fakers of such titles by not just the state, but by other states too.
As those who frequent this blog know I haven’t used this space to speak about gambling or about the fact that I am living in Peru right now. I will post a comprehensive blog on Peru soon, but not here. I will include the link when that happens. As for gambling this is likely the only post I am going to make concerning it.
* * *
The government that protects its citizens from itself
For Mill, the harm principle was the main criterion for determining government regulation of its citizens’ activities, with the exception being self-harm. Mill basically argued in On Liberty that the sovereign should actively try to dissuade its citizens from self-harm but that it should neither intervene nor censure such behaviour, on the grounds that self-harm doesn’t violate the harm principle. The hinge of the harm principle is the power of one person over another’s well-being.
It is my general contention that gambling falls under this self-harm exception to the harm principle. If it does harm, it is self-harm, since the individual in question is making concerted, independent, adult decisions to place at risk the money they so choose. The incidence of gambling addiction is quite low relative to the number of participants, especially when compared to the fully legal vices of tobacco and, to a lesser extent, alcohol (unless I am wrong to believe more users of tobacco per capita are addicted than boozers). So simply because it has addictive properties does not de facto make government censure necessary or desirable.
I mean the general fact is that people should be able to spend their money as they please, first and foremost. This argument has been repeated ad nauseam. But the secondary truth is that there is no better alternative. Even if people can’t be trusted to spend their own money wisely, which obviously some cannot, like young First Nations men for example, even if they can’t, the government has no more effective a capacity to limit their destructive tendencies in the vein of gambling or ingesting drugs of some form.
This is because there is a demand for self-destructive behaviour by nature. And where there is a demand there is a supply. Gambling will follow the economic model of other vice markets. If it is sanctioned it will generate huge revenues for government through taxes and the demand will be met by legitimate supply. If it is prohibited then the supply will be met in the black market, just as marijuana is now. The demand for marijuana is met exactly by its supply. In Canada, for example, anyone who ever wants pot at any time, in any amount or any quality, will be able to procure it. Besides the precautions they have to occasionally take, the availability of the product is totally unaffected by the government’s position on its legality. So the question for governments with respect to gambling is not whether it will occur or not. That they cannot control. The question is whether they want it to occur under their purview, whether they want to have some sort of beak-wetting income stream, or whether they want it to occur informally, in the back rooms, the offshore servers, and the loading dock lunch breaks.
A few years ago the US government passed the UIGEA, the Unlawful Internet Gambling Enforcement Act, which made it very difficult for banks in the US to do business with payment processors acting between them and online poker sites. No law was ever passed that said playing internet poker in the US was illegal, though states have on occasion passed state legislation to this effect. In any event, payment processing went on under false pretenses as a method of circumventing this. PartyPoker, a huge player in the US market, pulled out after the UIGEA passed, while Full Tilt Poker, PokerStars, and others remained operational, claiming to have interpreted the law as not applying to online poker, merely to online sports betting or other wagering. The basic argument is that poker is not gambling like slots because the player has control over the outcome in one and not the other. Poker is a game of skill where luck over the long term will dissipate into the ether and the expected return on a player’s investment will correlate to how well he plays mathematically. This, in a word, is true; but try explaining that to a legislator who thinks about the issue about 5 minutes every 8 months.
Yesterday we found out that the FBI and the Department of Justice did not take Pokerstars and Full Tilt’s legal interpretation of the UIGEA well. In fact, they arrested people in Las Vegas and Utah and seized assets in 14 countries. Payment processors are being shut down. Extradition orders have been issued for owners of the most high profile multi-billion dollar a year poker sites in the world. All these individuals have been indicted on counts of massive fraud amongst other egregious charges.
About 4-6 hours later the same day, both major sites announced to their US customers that they could no longer service them in real money games, account transfers, or withdrawals. Some players beat the decree, making bank-run-style withdrawals to their payment processors, the same payment processors the DoJ has indicted for operating illegally in the US and whose funds and transactions it is attempting to confiscate and seize as if they were stolen goods. Those players, eager to prevent their money being effectively eaten, may have lost it entirely if it went to a payment processor who was promptly served a warrant and had all their assets seized the same day. Moreover, the effect of the bank run on the capital of the operating sites has to be extremely damaging. Before the end of the day US players were already receiving notice that their withdrawals were being retroactively (by hours) declined and would not go to the payment processor at all. So the money would sit in their accounts, unavailable to be played with or transferred to other players’ accounts.
The implications of this are worth exploring before continuing, as they have basically, in one fell swoop, crippled the online poker economy. Player-to-player transfers occur all the time, especially in a post-UIGEA climate where a player on one site may not be able to deposit because of the state they live in. Thus they transfer money on a different site in exchange for that amount on the site they cannot deposit to, with a player who has accounts on both. The cessation of this activity is but a drop in the bucket. Many players have money in their accounts that is not theirs at all. There is an economy of backing and staking where players invest in what is referred to as a “stable” of less experienced and less wealthy players. The investor often coaches the players and provides them with capital as they hone their skills and generate a modest return for the investor while skimming some money for themselves. Now suddenly, wherever the money is between the investor and stakee, it is staying there, indefinitely. Some investors are asking for the money in real life from their stakees, and since no one saw anything like a 6 hour window before impromptu total transfer cessation, there are no clauses built into most investment contracts to deal with this type of situation.
Similarly there are services such as CardRunners, which has a roster of private coaches it matches to players in need of study and willing to pay. If a student purchases 5 hours worth of coaching from a coach, they transfer the money to CardRunners’ Full Tilt account, where it is held in escrow until notification from both parties that the coaching is complete. All that money in escrow is now frozen. With US players unable to participate in any activity relating to their accounts, those who play professionally are effectively worse than unemployed, because they could be unemployed with 80% of their total assets unavailable for the foreseeable and indefinite future. That is the nature of poker: because the tool of investment you use is money itself, players naturally keep a large portion of their total assets available for investment or use. There are thousands of US citizens who have tens of thousands (even hundreds of thousands) of dollars online who will not be able to make mortgage payments because of a sudden reversal in the liquidity of their assets.
The gist of all this is that capital interests operating legitimate casino gambling in the United States have been trying to shut these offshore operations out of the market so that they can start up their own companies in a competition-free zone, pending proper federal regulation from Congress that formally legalizes online poker. It is not as if in the long run the US government holistically sees online poker as a scourge to be exterminated. It is that the US government wants to be able to tax it (which is understandable), and those corporate interests in house who have the lobbying power to suggest the route to legitimate market creation have exercised that power maximally.
The bank run is on. Canadian and non-US players are all attempting to withdraw their assets, assets that double as their grocery money, before it is too late. The instability could easily ruin the sites themselves, which as of now are still operating outside of the US. If they survive the asset shock and games continue to run, then players will still have to cross their fingers that orders to seize the companies’ servers on the Isle of Man aren’t served. Because if the US government gains access to the Isle of Man, 90% of all internet poker traffic will cease to exist as it presently exists, which is saying something. At any time of day you can count on somewhere around 400,000 people being online and playing real money poker.
One reality is that there is no moral argument here. Poker is the responsibility of the individual, and those who think they have some high-up soapbox insight into the general dissolution of mores amongst today’s technology-driven hedonistic addicts can leave those prepackaged irrationalities for the disapproving tones of the similarly closed-minded. If you think the deleterious effect of gambling upon society has anything to do with the desirability of legal online poker moving forward in its biggest market, the United States, you should save it for the next time your sociology professor puts his hand on your shoulder.
Governments have a choice, and it is not between online poker occurring and not occurring; it is between it occurring in the open and under the cloak of darkness. They have sat idly by, building a case, as US online poker has operated in a legal gray zone for the past 5 years. What we can expect to happen is for online poker to go from a truly wild west environment to one whose owners are in bed with congressmen everywhere. What happened Friday was the first in a series of moves that amount to a straight-up coup backed by the monopoly the United States government has on the legitimate use of violence. The victims are those players who knew it could come some day, yet had no warning when everything changed in a four to six hour window, when the hypothetical manifested into handcuffs.
Let's say something about saying something that provokes other people to kill people who are innocent
So in case you haven’t heard, the Floridian pastor Terry Jones burnt some Korans on March 20th, which set off a less than innocuous reaction. Some people, at least 24, have died in Afghanistan during riots. Let’s look at what the NYT writer Roger Cohen said about it:
Perhaps [Jones would] care to explain himself to the family of Joakim Dungel, a 33-year-old Swede slaughtered at the U.N. mission in Mazar-i-Sharif by Afghans whipped into frenzy through Jones’s folly.
Here is a classic example of Western Islamapologism. We find a lunatic pastor, exercising his right to free expression and free speech, and lunatic third world denizens, murdering people. Who should be blamed in the first paragraph of this NYT article? The pastor of course! Cohen goes further, citing Karzai and Afghan imams as “enablers.” Finally, he does draw a line in the sand, for which he should be given credit, in the seventh paragraph:
it was a heinous crime against innocent people and should be denounced throughout the Islamic world, in mosques and beyond. I’m still waiting.
Why do we feel the need to blame Terry Jones for this, at all? Do I endorse his actions? No. Do I think he was in any way wrong to do what he did? Of course not. He burnt a book. Assuming he owned it, his actions are no different from those of the right-wing talk show host who defames the Koran every day across the airwaves. Since the Säuberung in Germany, circa 1933, there has been some confusion about burning books, I suppose. But make no mistake, the Nazis were what they were, what we never expected them to be, because they burnt people, not books. And moreover, the Säuberung was a public event, the burning of books by the state; Jones is a private citizen, an individual.
I remember being in Vienna and reading Freud’s diary. He wrote something to the effect of “How far we have come in civilization! Years ago they would have burned me. Now they merely burn my books.” This was of course before he was forced to flee to England because of his Jewish bloodline. There is a line that was crossed then and has been crossed now, and this time, it wasn’t by Mr. Jones.
But you say, these deaths would not have happened had he not done what he did. Does that not imply culpability? Does that not imply guilt? Bill O’Reilly, for what it’s worth, claims Jones “has blood on his hands.” Sure, these deaths would not have happened if he had not done what he did. They also would not have happened if people in Afghanistan hadn’t killed people. See the difference? The fact of the matter is, when Cohen talks about these killers needing to be enabled by Karzai and the imams, when O’Reilly (or anyone else) talks about these killers needing to be stoked by Jones, whenever the everyday third world rioter isn’t given his or her own agency, we are effectively treating them as unthinking and predictable automatons. If we really thought that every person was equal, that those in the third world deserved respect, deserved to be treated as human, we would realize that this cuts both ways, and we can’t see vicious acts as inevitable consequences of the environmental factors thrust upon a population. These people aren’t automatons; they are real people too. And if you knew anyone who participated in a riot that ended in the deaths of innocents, you would be appalled. You would call your local rotary club to have their membership revoked, and your Wednesday coffee get-together would all speak in hushed tones, the kind you use when you didn’t think they of all people could kill. Whatever their pathetic reason for doing what they did, however they justified it to themselves, would be a mere afterthought as you huddled together over crumpets hoping they didn’t make bail.
As an aside, I have seen this editorial reaction first hand. Buried in the last pages of a magazine I once wrote for, I published a condemnation of the words prohibited amongst those at my university. Words like nigger, cunt, and faggot. To my surprise, someone actually read the article and criticized it in the campus newspaper:
It’s still an unfortunately common occurrence to see the words to which Chantler so glibly refers spray-painted on the walls of homes, businesses and places of worship. Hate-based crimes continue to proliferate: recently, the kidnapping and torture of a 20-year-old black woman in West Virginia was considered by some to be a hate crime. The woman was repeatedly called a racial slur while her captors sexually abused, beat and stabbed her, her mother said. With such obvious examples of the way these words are used to hurt, it’s shocking Chantler can claim they’re no longer potent in that context. By saying words such as “nigger” and “faggot” are simply part of reality, Chantler is perpetuating their colloquial usage. These words remain completely unacceptable in so-called “real” situations.
I am perpetuating the use of words. That is certainly true. I enjoy Chris Rock and Larry David too much to see them stricken from the lexicon. But if I were the woman who was kidnapped, tortured, beaten, stabbed, and presumably raped (at the least, “sexually assaulted”), I think the words the men used while enacting this violence would be the least of my concerns.
The fact is this is just lazy writing and lazier thinking. Cohen has a deadline and O’Reilly has airtime to fill. They don’t make the time to think through the logic of every paragraph before the ink hits the page. And who is the victim? Why, the third world individual of course (as always), robbed by these pundits of their agency.
There is a way of measuring income and wealth inequality by measuring the distance between a Lorenz curve and a straight line. The straight line works like this: at any point, the percentile of the population you are at equals the percentile of wealth or income that portion receives. So the bottom 30% of the population earns 30% of the income, and the bottom 85% owns 85% of all wealth. In other words, perfect equality. The Lorenz curve is simply the representation of this concept in real world data, so if 20% of the population controls 80% of the wealth, the two lines will have some distance between them, and the amount of that distance is taken to be an indicator of the inequality in the measured population.
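The construction above can be sketched in a few lines. This is a toy illustration only (the post doesn’t specify any implementation); it builds the Lorenz curve from a list of incomes and measures the gap as twice the area between the curve and the equality line, which is the standard Gini coefficient, computed here with the trapezoid rule:

```python
# Toy sketch: Lorenz curve and Gini coefficient from a list of incomes.
# All names here are illustrative, not anything from the post.

def lorenz_points(incomes):
    """Cumulative population share vs. cumulative income share."""
    xs = sorted(incomes)
    total = sum(xs)
    n = len(xs)
    points = [(0.0, 0.0)]
    running = 0.0
    for i, x in enumerate(xs, start=1):
        running += x
        points.append((i / n, running / total))
    return points

def gini(incomes):
    """Twice the area between the equality line and the Lorenz curve,
    via the trapezoid rule: 0 is perfect equality, 1 is one person
    holding everything."""
    pts = lorenz_points(incomes)
    area_under = sum(
        (x1 - x0) * (y0 + y1) / 2
        for (x0, y0), (x1, y1) in zip(pts, pts[1:])
    )
    return 1 - 2 * area_under

# Perfect equality: the Lorenz curve IS the straight line.
print(gini([1, 1, 1, 1]))        # 0.0
# One person holds everything: the curve hugs the floor.
print(gini([0, 0, 0, 100]))      # 0.75
```

With four people, the most extreme possible Gini is 0.75 rather than 1.0; the measure only approaches 1 as the population grows.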
There are very good reasons to believe that a well-functioning society will have a great deal of inequality in income and in wealth. The reason is that wealth, as a natural phenomenon, an observed, uncontrolled milling of economic interactions, will distribute itself unevenly for a variety of reasons. I think this is intuitive to most people; we see money making money all the time, for example, while people without enough money in their bank account get charged fees by the institution in question. These two positive feedback loops are contrasted by many negative feedback loops, like a college student treating himself to as much delivery pizza as his discretionary income can afford, but not more (or less).
Going up? or spreading jam
When I last interviewed U of T prof Joseph Heath he mentioned two schools of thinking with respect to income inequality: the sufficientarians and the prioritarians. The sufficientarians think that if everyone is above a certain level of wealth and income, then all is well. If you draw the poverty line in the right spot and in the future everyone gets above it, then it won’t matter how much richer some people are, from a moral or social contract perspective. Prioritarians believe that to have social cohesion you need people partaking in the system voluntarily, and one obstacle to that participation is their perception that society is unfair, out to get them, rewarding others, etc. Coveting another’s fortune also makes for social unrest. This seems true. Regardless of whether a well-functioning society has a high level of income inequality or not, the perception of that society by its inhabitants is going to be important.
That’s one incredibly effective part of the American dream, the belief that anyone can rise from rags to riches. What better way to increase social cohesion than to have those occupying the lower rungs believe not only that it is possible for them to achieve untold wealth, but that those who have coveted fortunes are the same type of people as they are, and furthermore, that their ascension is only a matter of time and perseverance. Contrast this model with that of “Old Money” in Europe or South America. Family names and aristocracy are fixed, you are either walled in or out, and no amount of hard work will ever change the fact that you are one generation removed from being trash. Obviously the people who perpetuate this type of culture don’t care about the larger ramifications, but if they had an eye for self-preservation they might. Stability is a source of wealth and social unrest is the fuel of upheaval. But you see with America, it doesn’t actually matter if the pauper becomes the prince or not (just if he believes it possible). Especially when the “justice concerns” about inequality have no relation to the optimal improvement of the human condition.
“Justice concerns” is a conceptual phrase for the moral intuition of fairness, based on the (correctly) observed arbitrary conditions into which we are all born. Moral intuitions are based in large part on our biology, a predisposition to have ethical/ideological predilections. That is, a tendency to grow up with some belief, acted upon or not, unstated or clearly expressed, in, say, a fairness principle or the idea that intentions count. Moral intuitions that sprout from biological leanings often have nothing to do with the real world, and that may be the case with income and wealth inequality. None of this is to say we shouldn’t placate what moral intuitions we have, but more to argue that those we can modify we should, where warranted.
Self-similarity is a weird term since, when read literally, one would expect something to be similar to itself. But the term denotes something that is similar to itself at different scales, i.e. as you zoom in. The coast of Britain is the famous example, winding in and jutting out with the same kind of frequency and undulation however far the lens zooms. An unlimited number of natural phenomena behave this way, and in no way can be made to fit geometric lines of best fit. As you peer closer you will never find a straight line. What we should investigate is whether income and wealth inequality could have the same properties, and what makes them move otherwise. Does the 44th-45th percentile represent the same share of the 40th-50th percentile’s wealth as some 10% slice does of the whole? Some natural shape may be the product of a wealth-maximizing economy. That shape might be deformed by any number of factors: silo-ing phenomena via racial or class divisions, authoritarian government, minimum wage laws and social safety nets, a lack of banking regulations.
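One crude way to poke at the self-similarity question is to sample incomes from a Pareto distribution, a standard power-law model of wealth, and ask whether the concentration inside the top slice mirrors the concentration in the whole population. This is only a sketch; the exponent and sample size are arbitrary assumptions of mine, not anything from the post:

```python
import random

# Sketch of the self-similarity question: for a Pareto (power-law)
# income distribution, the share held by the top 20% of a slice is
# roughly the same at every scale. Alpha and n are illustrative.

random.seed(1)
alpha = 1.16  # roughly the exponent associated with an 80/20 split
incomes = sorted(random.paretovariate(alpha) for _ in range(100_000))

def top_share(xs, fraction=0.2):
    """Share of the total held by the top `fraction` of a sorted list."""
    cutoff = int(len(xs) * (1 - fraction))
    return sum(xs[cutoff:]) / sum(xs)

whole = top_share(incomes)                      # top 20% of everyone
top_fifth = incomes[int(len(incomes) * 0.8):]
within = top_share(top_fifth)                   # top 20% of the top 20%

# For a true power law the two shares are similar; for a "deformed"
# distribution (a safety-net floor, a silo) they drift apart.
print(f"top 20% hold {whole:.0%}; the top 20% of them hold {within:.0%}")
```

Comparing those two numbers on real income data, percentile slice against the whole, is exactly the empirical test the paragraph above is asking for.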
The reason we should be in favour of wealth maximizing, sufficientarian rather than prioritarian, for the time being, should be pretty clear. Most of the human population hasn’t escaped the poverty of prehistory. We can worry about those coveting Ferraris when everyone can afford a Ford. Moreover, there is a fundamental strength, in terms of social rest, to the wealth-maximizing model. Human anger is related to expectations far more than anything else. The uneven ascension of all implies the enriching of those on the bottom rungs, and that new-found wealth will breed satisfaction. Perhaps the prioritarians’ concerns can be assuaged by psychological devices like the American dream while societies try to navigate themselves out of poverty. It isn’t clear, a priori, that policies of wealth and income redistribution actually quell the covetousness of the poor, or indeed alleviate their poverty. But think of what they are trying to accomplish. Effectively they are trying to force a natural phenomenon, the interactions of the market, into straight lines: the unnaturally (and perfectly) even spread of numbers in a lottery draw, the imaginary aspiration of the Lorenz curve. It isn’t clear that these measures work, it isn’t clear how they work, and it isn’t clear that if they worked, they would be the best prescription for a population. We really need to be open to that possibility. Because if wealth distribution is self-similar and terribly unequal, we should have no problem with its curvature if a society can successfully scale up.
You know, the way things actually are
The most likely thing is that this is the way things actually are already, have been for a long time, and our great efforts at controlling it are drops in the ocean. Income and wealth inequality will remain; our efforts to rectify it might be better spent convincing (or deluding!) everyone that the wealth creation we are participating in has their best interests at heart. When you insist the bad-tasting medicine will cure the cold, the child is more apt to take it.
At the cost of being repetitive, I have to once again state my amazement at the aspect of human nature that allows us to mix the most rigorous skepticism and the most acute gullibility.
It seems to me that The Black Swan is to be taken quite seriously. The work details a subject matter of great (and, as the author argues, growing) importance to us: the recognition of a group of our connected cognitive failures and the effects those failures have. Awareness improves one’s ability to think about the world accurately, with all the benefits that entails. Correction mows through swathes of literature on banking, investing, risk analysis, and climate change. Interested?
The phrase of great importance to us is distinct from of recognized importance to us. This is natural, since Taleb deals with fallacies, biases, and empirical failures. If we were fully aware of these problems, or of the mistakes they presage, they would be higher on our priority list than they currently are. Moreover, becoming aware of our systemic cognitive distortions requires breaking a self-nourishing circle of ignorance: “We do not spontaneously learn that we don’t learn that we don’t learn.”
The fact that these limitations are often unknown to us relates to a larger theme Nassim deals with, that is, epistemic categories. Some of the fallacies he deals with are unknown knowns: known to us, but unknown to the people who commit them (or they wouldn’t commit them). But the main topic of the book is the unknown unknowns: namely Black Swans, those events of large impact and low, un-ascertained, and un-ascertainable probabilities. The Black Swan itself is simply an event; its epistemic categorization as an unknown unknown is our fault: “it is not an objective phenomenon…The events of September 11, 2001, were a Black Swan for the victims, but certainly not to the perpetrators.”
That there are unknown unknowns may seem unintuitive to some and old hat to others. But what can we say about such things? To Nassim, plenty, by virtue of the fact that we treat the world as if they don’t exist. We model, we assess, we ascertain, and we (as the author terms it) platonify.
People in the classroom, not having faced many true situations of decision making under uncertainty, do not realize what is important and what is not — even those who are scholars of uncertainty.
The beast in this book is not just the bell curve and the self-deceiving statistician, nor the Platonified scholar who needs theories to fool himself with. It is the drive to “focus” on what makes sense to us. [Emphasis mine]
We apply models to the things it makes sense to us to apply models to, even if it were shown to us that the models weren’t effective. After all, what else are we to do with these topics? The old joke of the drunk looking for his keys in the lamppost’s light, not because that is where he dropped them but because that is where he can see, goes beyond analogy in explicating this idea. “Black Swan logic makes what you don’t know far more relevant than what you do” [italics original]. Taleb introduces the word unknowledge for those things we don’t know, stating that not only is that set far and away larger than the set of knowledge, but it is vastly more important. We use our knowledge to form predictions of the future, when it is our unknowledge that will be making it. He mentions a writer, Umberto Eco, who keeps an antilibrary (this is an illustrative point) of books he hasn’t read. It is in a very big room.
Our ability to learn facts and only facts is scorned by the author, but not as much as how we extrapolate from facts to the general. The theory of the forms (hence platonify) has done a great deal to retard how accurately we view the world we find around us, because we naturally make things fit, when a) not everything we observe is form-fitting, and b) what is could be fit into multiple forms. A cluster of data points can have an infinite number of curves pass through every point, each curve distinct. One problem is that once someone believes it is some curve x that represents the real form of the phenomenon, they will have difficulty convincing themselves otherwise. We don’t just have theory adoption bias, we have first theory adopted bias. The idea that modeling phenomena incorrectly can lead to dangerous results seems commonly accepted. The idea that some phenomena shouldn’t be modeled at all is anathema to the most universal values of today’s académie:
Platonicity is what makes us think that we understand more than we actually do. But this does not happen everywhere. I am not saying that Platonic forms don’t exist. Models and constructions, these intellectual maps of reality, are not always wrong; they are wrong only in some specific applications. The difficulty is that a) you do not know beforehand where the map will be wrong, and b) the mistakes can lead to severe consequences. These models are like potentially helpful medicines that carry random but very severe side effects.
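The “infinite curves through the same points” claim a paragraph above can be made concrete with a toy sketch (the data points here are made up): build the interpolating polynomial through a few observations, then add a rival theory whose extra term vanishes at every observed point. The two fit the data identically and disagree everywhere else.

```python
# Toy illustration of underdetermination: two visibly different curves
# that both pass exactly through the same (made-up) data points.
xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 2.0, 0.0, 5.0]

def lagrange(x, xs, ys):
    """The unique interpolating polynomial through (xs, ys), at x."""
    total = 0.0
    for i, (xi, yi) in enumerate(zip(xs, ys)):
        term = yi
        for j, xj in enumerate(xs):
            if j != i:
                term *= (x - xj) / (xi - xj)
        total += term
    return total

def rival(x, xs, ys):
    """Same fit plus a bump (x-x0)(x-x1)... that is zero at every
    observed point, so the data cannot tell the two theories apart."""
    bump = 1.0
    for xi in xs:
        bump *= (x - xi)
    return lagrange(x, xs, ys) + bump

# Both "theories" agree perfectly on the data...
for xi, yi in zip(xs, ys):
    assert abs(lagrange(xi, xs, ys) - yi) < 1e-9
    assert abs(rival(xi, xs, ys) - yi) < 1e-9
# ...and diverge off it: at x = 4 they differ by (4)(3)(2)(1) = 24.
print(lagrange(4.0, xs, ys), rival(4.0, xs, ys))
```

Multiplying the bump by any constant gives yet another theory, which is where the “infinite number of curves” comes from.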
The quotation leading this article can be found in an unassuming footnote (of course, most are) on page 106 and refers to what I call the cognitive partition capacity. The idea is that we can create belief systems or areas of knowledge to which different rational, logical, or axiomatic structures apply. One can be the most critical and perspicacious mathematician and yet not spot a common logic gap when speaking of one’s failure with women, for example. A gap so obviously false and specious that its mathematical equivalent would never even need to be dismissed by your advanced faculties in that subject, because it would never be considered. The example Taleb uses is doctors who, “vigorously skeptical” of anecdotal results, only accept rigorous study of drug efficacy, then go on to make exactly that mistake in their personal lives or in the world of investing. Examples abound, I am sure. So to pull out the inference: when we deal with unknown knowns, those things other people are experts in, our partition capacity allows us to forgo any attempt at logic and simply hand over a blank cheque of credulity.
Where this gets us in the most trouble is in fields where experts do not exist. Sure, people may purport to be whatever they want, but some phenomena defy modeling. Those doing the purporting aren’t even charlatans, unless you begrudge them selling snake oil to themselves. The difference is they don’t know it. It is fine for a taxi cab driver to realize his stock picks are essentially arbitrary; the fund manager who believes himself a guru because of his ability to model phenomena that can’t be modeled is extremely, unwittingly, dangerous.
What does this all have to do with black swans? Well, Taleb uses the black swan as an example of a highly improbable event which is not thought to be possible by way of inconceivability (as opposed to being actively thought impossible), until of course it happens. Black Swans are further characterized by an astounding, disproportionate impact upon whatever phenomena they belong to (think of a stock market crash’s impact upon market analysis rather than the discovery of a non-white swan’s impact on Taxonomist Quarterly). So: 1) an unknown unknown which can happen with very small probability; 2) a single event which impacts its phenomenon massively, nearly unilaterally. That fund manager (later cab driver) had no idea that stock market crash X was coming, and stock market crash X had no clear cause, no clear precursor, and underlined all future activity in that stock market for the next fifteen years.
There is a problem in philosophy that relates to knowledge and to inference, most easily referred to as the problem of induction. Briefly, Hume argued that it is plausible that humans only believe billiard balls will follow certain trajectories when collided because of prior observed collisions, not because of a rational framework of physics applied to these situations. Now, this example engenders a certain cantankerousness because of our ability to predict, before a collision, the exact final locations of the colliding objects through our rational-analytic framework of such events. Hume may have used a different example had he been writing today. Or not. The common rejoinder to the point about possessing some predictive capacity is that different theories can be correctly overlaid on any set of empirical results, such that neither theory appears wrong upon close inspection and both theories will correctly predict future outcomes. Hence the creation, the invention, of rational theories is a creative camouflage for the same procedure: inferring future events from past observations and nothing more. That a theory fits past data, and is confirmed by some predicted observations thereafter, does not prove in any way whatsoever that said theory will be able to predict all observations of this type in the future. Merely that it has succeeded so far. This brings us to turkeys.
Taleb points out that the epistemological progress of a turkey is fraught with dangerous delusional inference. Every day the turkey is fed by humans. It grows fat and survives. It enjoys eating, and every day its suspicion that humans are altruistic caregivers and they feed me every day grows into empirically verified belief. Say the turkey can remember the past 99 days of its life being fed by a human. Surely the chances of being fed the next day are over 99% in the turkey’s mind. But a glutton’s delusions are shattered quickly. When the turkey’s belief was strongest, at its most confirmed, it was in the most danger. And it was too late. Eleventh-hour revision of the evidence did the turkey no good. Simply put, no empirical observation could have led the turkey to believe what was going to happen, had it based its turkey beliefs on what it had observed; not even, but especially, had it had those beliefs tested against future results.
We don’t just have theory adoption bias, we have first theory adopted bias.
The example of the turkey looms in the dark waters that are the history of investing and banking, something Taleb, as a former quant, knows all about: “We learn from repetition — at the expense of events that have not happened before…” It is not just that black swans are rare, or that they have monumental impact upon the phenomena they affect. It is that they cannot be seen coming: “(continuing)…Events that are nonrepeatable are ignored before their occurrence, and overestimated after (for a while).” Taleb is not raising his hand to tell us to re-evaluate what is currently deemed impossible, but rather to worry about those things not deemed possible. As in, those things that no one has thought of, or done any work on, or acted in any way upon. No deeming has occurred yet. Because you will find yourself unable to think of unknown unknowns, Taleb proffers simpler advice: avoid negative Black Swan arenas entirely. Black Swans needn’t be negative; they can be positive as well. What we know is where they reside, and in these places neither you nor your portfolio should dwell. Send your lottery tickets and wacky ideas there, by all means. This place Taleb dubs Extremistan. It is exactly this domain where investments often reside without investors or fund managers knowing. The main reason is that black swan events don’t come around very often, so you won’t be alerted to the huge risks as a matter of course. Instead, you will be ‘picking up pennies in front of a steamroller,’ as Taleb argues the big US banks are in the business of:
There is no way to gauge the effectiveness of their lending activity by observing it over a day, a week, a month, or… even a century! In the summer of 1982, large American banks lost close to all their past earnings (cumulatively), about everything they ever made in the history of American banking — everything. They had been lending to South and Central American countries that all defaulted at the same time — “an event of an exceptional nature.” So it took just one summer to figure out that this was a sucker’s business and that all their earnings came from a very risky game. All the while the bankers led everyone, especially themselves, into believing that they were “conservative”… The Federal Reserve bank protected them at our expense: when “conservative” bankers make profits, they get the benefits; when they are hurt, we pay the costs.
Of course everyone is in the business of predicting the next banking collapse (or housing market failure) after the event has occurred. This is a testimony to our myopia: “Note that after every event you start predicting the possibility of other outliers happening locally, that is, in the process you were just surprised by, but not elsewhere.”
The visibility of the immediate cause
Much of what took place would have been deemed completely crazy with respect to the past. Yet it did not seem that crazy after the events. This retrospective plausibility causes a discounting of the rarity and conceivability of the event. I later saw the exact same illusion of understanding in business success and the financial markets.
The illogic of pure induction intersects with the narrative fallacy, a human propensity Taleb rails against inexorably. It is, in short, to ascribe a narrative to a sequence of events that explains the causation of those events in a deterministic way. When 15 historical events are observed in the rear-view they are recounted as if 15 was bound to be the result of 1-14 because of a deterministic narrative of causation. 1-14 are explained to have caused 15 somehow inevitably, when in fact 15 was the happenstance that happened to occur out of a multitude of events which each had a nonzero probability of occurring. The key is to recognize that if (and since) the same narrative could not have explained 15b, 15c, or 15d, had they been the happenstance to happen to happen, then that narrative could not possibly describe any causal development whatsoever. Such retrospective narratives are almost always retrospective fictions. Not only do they explain the past inaccurately, but they disingenuously purport to be important guides to the future. But fiction sells, something of which Taleb is acutely aware:
It happens all the time: a cause is proposed to make you swallow the news and make matters more concrete. After a candidate’s defeat in an election, you will be supplied with the “cause” of the voters’ disgruntlement. Any conceivable cause can do. The media, however, go to great lengths to make the process “thorough” with their armies of fact-checkers. It is as if they wanted to be wrong with infinite precision (instead of accepting being approximately right)…
One day in December 2003, when Saddam Hussein was captured, Bloomberg News flashed the following headline at 13:01: U.S. Treasuries Rise; Hussein Capture May Not Curb Terrorism…Whenever there is a market move, the news media feel obligated to give the “reason.” Half an hour later, they had to issue a new headline. As these U.S. Treasury bonds fell in price (they fluctuate all day long, so there was nothing special about that), Bloomberg News had a new reason for the fall: Saddam’s capture (the same Saddam). At 13:31 they issued the next bulletin: U.S. Treasuries Fall; Hussein Capture Boosts Allure of Risky Assets.
I predict your predictions won’t be your reflections
I noticed this very thing the other day in Canadian headlines. “Canadian dollar loses ground as Egyptian unrest lures investors to safety;” “Canadian dollar surges against greenback, Mideast concerns lessen;” “Loonie up amid strong growth data; traders keep wary eye on Egyptian unrest;” “Loonie falls quarter-cent on Egyptian unrest.” These were not published very far apart. One has to ask oneself the following questions. First, aren’t the hour-to-hour fluctuations in the Canadian dollar’s strength essentially random? (They are, by the way.) Second, if not, and they are precisely affected, what are the chances some hack beat writer for the Hamilton Spectator or Canadian Press could know that the mood of the population of Egypt, of all places, was the prime mover of the loonie du jour?
Returning to Nassim’s assessment:
The problem of overcausation does not lie with the journalist, but with the public. Nobody would pay one dollar to buy a series of abstract statistics reminiscent of a boring college lecture. We want to be told stories, and there is nothing wrong with that — except that we should check more thoroughly whether the story provides consequential distortions of reality. [Newspapers] are fact-checkers, not intellect-checkers. Alas.
Besides narrative and causality, journalists and public intellectuals of the sound-bite variety do not make the world simpler. Instead, they almost invariably make it look far more complicated than it is. The next time you are asked to discuss world events, plead ignorance, and give the arguments I offered in this chapter casting doubt on the visibility of the immediate cause. You will be told that “you overanalyze,” or that “you are too complicated.” All you will be saying is that you don’t know! [Emphasis mine]
There is no point in trying to predict the probability of future catastrophes. Of course a Lehman Brothers talking head, quoting long odds in the WSJ, provides easy fodder for said thesis. But the second edition of The Black Swan, in which that attack is found, came on the heels of the company’s total collapse just days after the forecaster remarked that the previous day’s catastrophic events were a one in ten thousand year occurrence. Why can banks not properly assess how often, or with what likelihood, say, a housing market will completely fail?
Consider that the frequency of rare events cannot be estimated from empirical observation for the very reason that they are rare.
How could a bank manager do it? How few examples of total market failure does he have to build his model upon? After that, how likely is the future to operate as the past did? With less empirical data comes more reliance on theory and its associated errors: misguidedly assigned causation (the narrative fallacy), naive platonification, and don’t forget the fact that people who work for financial institutions are by and large morons. The predictive banker deals with few catastrophic data points and a million curves to choose from. It is for this incapacity that Taleb labels the big banks extreme risk takers; they are ‘picking up pennies in front of a steamroller,’ because they leave themselves incredibly vulnerable to black swans while convincing themselves and their customers that they are very austere and conservative institutions.
It is the creation of rules and the deification of Gaussian distributions and their associated elements (I may as well be quoting Taleb here) such as standard deviation, regression, and expected value that has led real world analysts so astray. Real randomness lies in the real world; the sterility of the Gaussian can be found only in a few very controlled environments, like casinos. Taleb points out that it is in casinos where actual probabilities are known: the chance of the next card being some suit, the dice settling on a combination adding to seven. These places that are associated in everyone’s mind with blind and foolish risk are the only places you can figure out your odds of winning (and more often of losing) with a pen and a pad. But your erstwhile investment banker (now out of a job) presented you with a printout assessing the risks to your portfolio beset by the real world, and you actually believed him! This is Platonicity at its most dangerous. Game theory is wonderful for solving for a Nash equilibrium at the poker table, but when applied to investing your money, it represents “the lethal fallacy of building knowledge from the world of games.”
"The Modelers’ Response"
Taleb autobiographs his travails through the corridors of finance and the cauldron of academic conferences, trying to impress his ideas on those who wield and preach the necessity and effectiveness of models. At first no one listened, something he partly attributes to strategy:
This gave me the idea of using the approach “This is where your tools work,” instead of the “This is wrong” approach I was using before. The change in style is what earned me the hugs and supply of Diet Coke and helped me get my message across.
His funny, often sarcastic commentary is interspersed amongst scathing criticism. (Tangentially, one Wall Street Journal review of his book describes his prose as “in a style that owes as much to Stephen Colbert as it does to Michel de Montaigne.” The reviewer didn’t read Montaigne, or didn’t take in Taleb’s lessons in proportion estimation, or both. This kind of remark highlights the fatuity of interpretation the work seems to have faced since publication. With its long-winded plethora of subtitles, a skimmer could easily mistake it for intentionally obfuscating polysyllabic drivel.) This talk of hugs and cola came on the heels of this passage:
The Modeler’s Response: We know all that. Nothing is perfect. The assumptions are reasonable. The assumptions don’t matter. The assumptions are conservative. You can’t prove the assumptions are wrong. We’re only doing what everybody else does. The decision-maker has to be better off with us than without us. The models aren’t totally useless. You have to do the best you can with the data. You have to make assumptions in order to make progress. You have to give the models the benefit of the doubt. Where’s the harm?
It is a flimsy paper litany that burns quickly, and Taleb moves through it without hesitation. The plausibility of the notion of being able to model real-world epidemics, real-world financial performance, real-world natural catastrophe impact, real-world climate change, real-world product sale projections, real-world insurance costs, real-world war damage, real-world economic indicators (in short, some of the most important phenomena we face in the world) undergirds the employability of the people Taleb contends are phonies. If you found out that you, or your monkey, could on balance do just as well as the average portfolio manager, would you let him siphon off the top of your hard-earned liquid assets? If governments knew that climatologists could advise but not predict, could assess but not forecast, could rationalize and provide anecdotes, but at the end of the day could provide no number (or range of numbers) better than a wild guess as to what parts per million of some atmospheric pollutant must be kept below before total feedback-loop-induced catastrophe, then why would they front them the money for their expensive machines… to measure the things that went into the equation that produced nothing of predictive value? If Taleb is right, there is no correctly calibrated Markov chain, no algorithm, no matter how precise the inputs or liberal the error margin, that can predict with any value the chances and impact of some future Spanish-flu-esque global epidemic. On these threats we are totally in the dark; they are the negative black swans.
To understand why, we need to make use of what is, if I am not mistaken, an original Talebian coinage, the Mediocristan/Extremistan dichotomy, coupled with the problem of induction.
Watch your step
Find one thousand random people and weigh them. Now introduce the largest human being on the planet to the sample. How much does his weight affect the average? Let’s say the average person weighed 75 kg; introducing a 450 kg person to the sample would change the thousand-and-one-person average from 75 kg to about 75.37 kg, and that’s the largest person on earth.
Now take one thousand random people and average their net worths. Let’s say the average is 50,000 USD. Introduce someone to the sample, “say, Bill Gates, the founder of Microsoft. Assume his net worth to be close to $80 billion.” What happens to the 50k average amongst one thousand people? It becomes roughly 79,970,000 USD amongst one thousand and one.
The author uses these elegant examples to draw the Extremistan/Mediocristan distinction. “I can state the supreme law of Mediocristan as follows: When your sample is large, no single instance will significantly change the aggregate or the total.” And on the next page, “In Extremistan inequalities are such that one single observation can disproportionately impact the aggregate, or the total.” Now we see something linking the turkey to the stock market: they both live in Extremistan.
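The arithmetic behind the two averaging examples is simple enough to check in a few lines of Python (a sketch of mine, using the 75 kg, 450 kg, $50,000, and $80 billion figures quoted above):

```python
def new_average(n, old_avg, newcomer):
    """Average after adding one newcomer to n people with a known average."""
    return (n * old_avg + newcomer) / (n + 1)

# Mediocristan: the heaviest person on earth barely moves the average weight.
weight = new_average(1000, 75, 450)
print(round(weight, 2))  # 75.37

# Extremistan: one Bill Gates swamps the average net worth entirely.
wealth = new_average(1000, 50_000, 80_000_000_000)
print(round(wealth))     # 79970030
```

One observation in Mediocristan nudges the aggregate by half a percent; one observation in Extremistan multiplies it by roughly 1,600. That asymmetry is the whole distinction.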
Truth, Rules, and Science
Science’s theories hinge on one clear criterion: that nature is somewhat uniform. Repetitious experiments will generate repetitious results, and it is likely the physical interaction of mass-energy in this universe follows some sort of rules. Now, Taleb and Hume decry both causation and rule-theorizing, under clear and prudent auspices. But there is reason to suspect that it is true that the universe has some fixed and universal rules. The reason is that we are here. If we were to imagine a universe inhabited by mass-energy, taking up some space-time, that did not behave in an orderly fashion, whose rules were unfixed and parochial, nothing would form, and if it did, it would not stay long.
Well, Hume will nix us again on this point. How, in fact, do we know that our current area of space-time will not be arbitrarily subjected to a changing of the rules? We do not. The universe could implode at any moment, at random. The thing is, though, rules have to exist to create such a scenario. A universe that actively operated without rules would be complete and constant chaos. What we should conclude, from our lack of nonexistence, is that we are operating in a universe with rules, and we will concede to Hume that this may be subject to change in the future. We cannot prove that it won’t. But this does not invalidate the notion of deriving formulas, laws, and axioms for navigating the space-time we inhabit. Formulas, which fall under the skewered rubric Taleb refers to as Platonifying, needn’t be laws written in stone, but can be accepted as Kuhnian science. The discovered existence of the immediate cause, much less its marked visibility, isn’t as important as the problem-solving capacity of the rule.
What is meant by that proper adjective is that in areas of science, especially those investigating mass-energy and space-time, scientific explanations have nothing to do with what is true, but rather with what works, what delivers results. If a tree of dependent concepts can predict outcomes of experimentation important to the scientist, outcomes the trees relied upon in the past have struggled with, then that tree should be endorsed, adopted, and explored on that merit and that merit alone. Some formulas will survive upheavals and our problem-solving capacities will improve over time. While learning of antiquated (or soon-to-be antiquated) models of the atom is a waste of memory, the methodology scientists adopt is not a waste of time. It delivers results. Far be it from us to tell them there is no light at the end of the tunnel, in the form of a ubiquitous, true, causal explanation of all things. They can believe in finality on their way to effective stem cell research. The only advantage a group of skeptical empiricist scientists has over regular scientists is an awareness of the philosophical difficulty of the truth concept. Taleb is right in speaking highly of cognitive retraining, but it is unclear if it is for everyone in every area of life.
Where Taleb’s disemboweling of Platonic and Gaussian rationalism takes full effect is not in critiquing physics (in terms of the durability of his overall thesis, my above criticism wouldn’t scratch it) but in critiquing the social sciences, especially those that use mathematics to measure risks, to estimate real-world probabilities, to extrapolate market conditions or fourth-quarter growth. In other words, he skewers, in their entirety, economics and finance. Financial forecasts are of enormous (also inestimable) impact.
One thing is apparent: Taleb is well read when it comes to the philosophical. He spatters allusions throughout the work. On page 101 he mentions two of his philosophical heroes as being Michel de Montaigne and Francis Bacon, which is fitting. Those reading with a pencil will have noticed this passage on page 26:
When I play back in my mind all the “advice” people have given me, I see that only a couple of ideas have stuck with me for life. The rest has been mere words, and I am glad that I did not heed most of it.
While unremarkable pulled out of its context, this is a near facsimile of a passage in Montaigne’s Essais, expressing the same sentiment.
It should come as no surprise then when Taleb dismisses Wittgenstein’s language problems entirely, before moving onto bigger fish.
[Hindsight] bias extends to the ascription of factors in the success of ideas and religions, to the illusion of skill in many professions, to success in artistic occupations, to the nature versus nurture debate, to mistakes in using evidence in the court of law, to illusions about the “logic” of history — and of course, most severely, in our perception of the nature of extreme events.
Taleb asserts that those “logic of history” philosophers, Hegel and Marx figuring most prominently, are hapless victims of the narrative fallacy and hindsight bias. Of course everyone is, but not everyone has a publication schedule and blank pages to be filled with Platonicity. The reason you’ve never seen a Platonic form is because they don’t exist outside the mind.
With linearities, relationships between variables are clear, crisp, and constant, therefore Platonically easy to grasp in a single sentence… If you have more money in the bank, you get more interest…If you are in a state of painful thirst, then a bottle of water increases your well-being significantly… [but] if I gave you the choice between a bottle or a cistern you would prefer the bottle — so your enjoyment declines with additional quantities…I will repeat that linear progression, a Platonic idea, is not the norm.
Historians have suffered from the narrative fallacy from the beginning. Taleb quotes the modus operandi of Herodotus: “To preserve a memory of the deeds of the Greeks and barbarians, and in particular, beyond everything else, to give a cause to their fighting one another.” When we reflect on how many historians reflect on their data, ascribing cause, creating narrative, we see how ingrained the issue really is: “the more we try to turn history into anything other than an enumeration of accounts to be enjoyed with minimal theorizing, the more we get into trouble.” Always one to quote authorities, Taleb juxtaposes Herodotus with Yogi Berra, who purportedly once said “you can observe a lot by just watching.”
The analytic theist philosopher Alvin Plantinga presents an argument for why, if evolution is true, it undermines naturalism. Naturalism has many definitions, but for the sake of this discussion we should take it as the view that there is nothing in this universe that is supernatural, that things are universally monist (as opposed to dualist), and that whether or not the rules of some phenomenon are accessible to us, they are only ever rules, and nothing more: not the products of agency, intention, or the divine. Plantinga’s argument is remarkable in its originality, in the solid postulates he formulates from, and in the unusual fact that he accepts traditional atheist criteria and traditional atheist methodology (i.e., analytic philosophy), and from those teases out the spectacular conclusion that evolution could only be true if God created it.
First we begin with man’s needs in the state of nature, which philosophers and evolutionary biologists readily agree are to eat, drink, reproduce, sleep, and avoid danger. The rub is that evolution will select for survival, and based on these criteria, we can assume man’s faculties evolved to believe things that helped us survive, not things that were necessarily true. In this world, where we evolved to have survival beliefs, not true beliefs, everything we believe is unlikely to be exactly true, unless we were lucky enough for survival efficacy to intersect with truth very precisely. So we have reason to doubt the products of our faculties, including belief in evolution and in naturalism themselves. Evolution undermines naturalism because beliefs evolve under guides that deceive, and when beliefs formed that way include naturalism itself, there is good reason to doubt their accuracy. On the other hand, Plantinga notes, if there were a guiding hand, an intelligent designer shepherding the process of evolution, he could ensure our beliefs were not overrun by the necessity to survive.
You may claim that the assumption that true beliefs aren’t optimized for survival, and hence that there is a distance between them and survival beliefs, is faulty. Plantinga argues that while truth beliefs may sometimes be survival beliefs, there is no necessity of this under a scheme of naturalism. He draws out this idea in maybe his most famous writing:
Perhaps Paul very much likes the idea of being eaten, but when he sees a tiger, always runs off looking for a better prospect, because he thinks it unlikely the tiger he sees will eat him. This will get his body parts in the right place so far as survival is concerned, without involving much by way of true belief… Or perhaps he thinks the tiger is a large, friendly, cuddly pussycat and wants to pet it; but he also believes that the best way to pet it is to run away from it… Clearly there are any number of belief-cum-desire systems that equally fit a given bit of behaviour.
We are, after all, first-theory adopters. When we first encountered a tiger, maybe we were eaten, maybe we were not. But the fact is that whatever reason the first non-eaten person had for running away from the tiger, it is likely the reason to have stuck, it is likely the reason he believes he survived, it is likely the reason he told the villagers. Totally independent of what that reason actually is. And so it is with all survival events. Whatever reason we believed for taking some action at a decision point was the reason that was going to win in the long term, because it produced physical survival. So under this scheme evolution points to our reasoning being a specious collection of gobbledygook, collected over time from dumb-luck encounters.
The reasons Plantinga is wrong may not be so obvious, but they are clear. First, our faculties are not a single entity; the whole having-reasons-for-doing-things system may not have entered our genetic coding until very late. We were mammals long before we had beliefs. So our belief generator may not generate beliefs selected for survival; rather, because we are selected for survival, we evolved to have a belief generator. Who is to say this belief generator isn’t in fact a truth generator? The main reason it is unlikely to be a truth generator is obvious: the fact that we can empirically observe a large population of people with false beliefs. So where does that leave us? Plantinga’s example appears so powerful because it presents a case in isolation. The fact is the appearance of a tiger isn’t the sole time a troglodyte’s tiger beliefs are tested. Secondly, our reasoning operates in an empirical-rational matrix, where empirical belief X is tested against empirical belief Y and rational belief A and rational belief B, and the implication of empirical belief X and rational belief B is tested against empirical belief Y, ad infinitum. When your rationally formed belief in karma is strained through the sieve of empirical observation, you are likely to either strengthen that belief or lessen it, depending on your dumb luck. The fact that karma doesn’t exist isn’t at issue here; what this shows is that beliefs don’t operate in isolation, much less do people. If the man really believed the best way to pet the tiger was to run away from it, he is unlikely to pass his genes along to the female in heat who requests he pet her. And if she is hormonally intoxicated enough to chase him down and show him what she means, then his beliefs about what petting entails might be revised. And what is the most likely (not the guaranteed, not the necessary, but the most likely) belief he forms on the definition of petting, after this type of tester?
That’s right, a true one.
Humans talk about their encounters with tigers, about why they boil water, and about why they store wood for the winter. Beliefs are interconnected and constantly affected by the inputs of reality; the harshness of empirical failure is but one of those inputs, and its faults are likely to be corrected through a meritocratic sieve. If you believe the best way to get fruit is to crouch down and pick whatever is below you, you are going to be successful in your berry-bush environment. But when migration forces you to banana land, this belief is going to make for nights filled with hunger. Perhaps your belief that the best way to get a clay pot is to grab it will help you out. The rational is tested against the empirical, the empirical is tested rationally, our ideas are tested by one another; and while a survival belief is not necessarily a true belief, a true belief is necessarily a survival one, and moreover, its very veracity confers additional survival chances, because its implications will act as a truth-spreading contagion in the rational-empirical belief matrix. The fact is that having senses that interpret sense data accurately gives you the best chance to understand your reality.
And what can we expect from this long, drawn-out process? Well, from a population not yet at t = infinity, a population with a grab bag of beliefs: some true, all mostly effective, and many in dispute. In other words, exactly the situation we find ourselves in. We see many people with many obviously false beliefs who are still nevertheless alive and getting by, precisely because they possess effective untrue beliefs. We see many people with an odd assortment of contradictory beliefs, beliefs that haven’t been tested against one another by habit or happenstance, though those who know one of the pair is false are often soliciting them to do the comparison.
Taleb writes with an innocuous eloquence. Eloquence that doesn’t protrude isn’t just the mark of a talented writer, but of one who is careful, crafty, and diligent. Taleb has put forth the type of labour that yields prose enjoyable just to read, instead of subjecting his audience to the curious navigation of a genius’ mildly filtered stream of consciousness. He doesn’t draw attention to a literary flair, flourishing without flourish. At times it will seem as if he does not possess a pretension detector, but over the course of the book it becomes a forgivable offense. The avuncular, and sometimes lame, jokes scattered liberally across the text engender an endearment for the author. He decries racism and makes jokes about the French. He scorns arrogance and at times comes across as exceedingly pompous about just exactly what his vision of the intellectual is, who qualifies and who doesn’t. Most of his allusions to the irritating nature of Diet Coke drinkers are good jokes because the reader is not instructed on how seriously to take them, and because they contain kernels of truth. But in the extolling of his virtues as a sophisticated, educated, cultured intellectual, Taleb strays a bit too far from the rules of comedy.
In a postscript essay attached to the original text as part of the second edition, Taleb names names in a very upfront and self-important way. It is understandable that his ideas and work came under attack after publication by people who, unlike me, didn’t read it twice, slowly, while marking it up. Those criticisms can either be deflected, addressed, or dismissed, and often the third option is warranted simply because people who don’t read a book shouldn’t be listened to when offering a critique of said book. But Taleb instead chooses to address seemingly every criticism and academic ignominy offered in the couple of years between editions. Ad hominem attacks he decries, yet comes perilously close to making himself. In defense of his defense, he does take the time to address mistakes he made in the first edition, at paragraph length, though of course, they are the mistakes to which he admits.
He cannot be accused too severely of the literary equivalent of liking the sound of his own voice, by virtue of the fact that many chapters in the original text begin with prefacing remarks that you should skip ahead if you don’t want to get bogged down by boring details x, y, and z.
Often nonacademic writers are messy, romantic, bold, and narrative. The Black Swan, with its spiral of subtitles and literary allusions, makes for gripping reading, but manages to dodge the majority of hazards these adjectives so often realize. The kind of idea explosion that comes with being removed from the academy is something I can sympathize with, but the distance might also be from practiced rigour and discipline. Taleb, an erudite nonacademic if one at all, has published in journals and the like, one assumes providing the mop up work required to ensure his thesis’ fitness. Not yet having read those works I can only speak to the book itself, which is clear in its expression if nonlinear in its organization. Rigour, clarity, organization, and discipline have much to say for themselves. The vast majority of trained academics have nothing to say, but at least they know how to say it. Nassim Taleb has much to say in this work and writes it in a way that is easy to swallow. This is quite the accomplishment for a message that will be for many a bitter pill.
"One should also note…how easy it is for men to be corrupted. Their characters are quickly transformed, no matter how good and well-trained they were."
— Machiavelli, Discourses
Machiavelli did not give too much credit to the character of men involved in the realpolitik in which we find ourselves. But the main thing to remember is that he is right. The quote applies to those who have gained power, of which Julian Assange is an example. But its original context refers to legitimate power: those elected to lead the Roman republic of 451 BC, who promptly transformed it into a dictatorship. While we no longer need fear dictatorship in the West, those assigned powerful positions, again and again, can be expected to wield their power as corruptly as tenable. Occasionally they will overstep this bound or simply miscalculate their chances of being caught. Whether it is appointing friends to obscure positions, voting for pork barrel projects, or forwarding contracts to certain companies, Western leaders abuse their positions in myriad creative ways. This says nothing of the amoral milieu of devilry spawned by bureaucracies and bureaucrats. The two are distinct, since there are of course self-interested bureaucrats acting in ulterior ways, and then there is the host of negative effects the public interest suffers by a bureaucracy’s very existence. In all these respects, the West currently suffers.
So the first thing that we should keep in mind when approaching the question of Wikileaks is that politicians and the states they operate within are corrupt, Machiavellian projects. The public services states provide, the private goods they foster, the security they afford: these are not natural inclinations, but rather the culmination of a centuries-long process of culling, pressing, revolting, protesting, reforming. The Western liberal-democratic state has been cajoled by its own citizens into its role as Magwitch. But a benefactor it is not to all denizens, much less non-denizens. Those who live outside the boundaries of the state do not receive its munificence unless they can help someone stateside escape the gallows, which is to say not often, for they cast no votes. It is these people Wikileaks has sought to represent. It is these people they have, in some ways, brought into Western consciousness.
When death is brought to life
The clearest example of this remains the Wikileaked breakthrough footage of a US Apache helicopter gunning down a group of Baghdad men in 2007, two of whom were Reuters reporters. The footage is truly gruesome, but so is the running radio commentary overlaying it. It is astonishing how quickly the voices move from sighting the men to claiming they are engaging a group “with RPGs and AK-47s.” US forces were never fired on during this incident, and the slaughtered were not identified prior to the execution. They shot fleeing, unarmed men, they shot rescuers, and they ultimately killed somewhere in the neighbourhood of a dozen people. The full Wikileaks release can be found here. It details how Reuters was denied its requests for the footage under the Freedom of Information Act and mentions how a 22-year-old US Army intelligence analyst, Bradley Manning, has been imprisoned in Kuwait for allegedly making the leak.
There are those (mostly defectors) who have been to Iraq with the US Army who claim this is an everyday occurrence, that US casualty figures are low-balled, often grossly so, and that the labeling of the slain as “enemy combatants” or “insurgents” is abused to such a degree that no credibility can be given to US estimates or press releases. The documents, cables, footage, and figures Wikileaks has obtained and published bolster this version of events, and the characterization of the US Army and the US government’s media efforts as nothing short of duplicitous. What has changed, since any casual observer will already have held a very low opinion of the credibility of the US war machine’s pontifications, is that this is proof akin to the work done by Woodward and Bernstein. It is indicting. It is not an a priori prediction of the scummy nature of someone like Donald Rumsfeld, based on knowledge of his office, later verified by his ideological talk-show doublespeak or his ignominious congressional testimony. Rather than intuition confirmation, it is proof of crime.
The particular Apache helicopter footage is also a testament to the dehumanized stasis of young men in war. Much more than a kilometer separates those perched in the air, wielding multimillion dollar machinery, from those wearing khakis in a dusty nondescript street. A spartan, carnal murder never occurs. It is a sordid and removed sequence with a sterility impregnated only by the gruff, disorganized, raucous conferring of those haphazardly influencing the trigger finger.
The latest, world-infamous package of leaks continues in this vein, one example being the exposure of US-contracted companies working in Afghanistan inducing traditional child rape. The Houston Press, amongst others, sifted through the leaked documents to uncover a cable that began to tell a story of Bacha Bazi:
[B]acha bazi is a pre-Islamic Afghan tradition that was banned by the Taliban. Bacha boys are eight- to 15-years-old. They put on make-up, tie bells to their feet and slip into scanty women’s clothing, and then, to the whine of a harmonium and wailing vocals, they dance seductively to smoky roomfuls of leering older men.
After the show is over, their services are auctioned off to the highest bidder, who will sometimes purchase a boy outright. And by services, we mean anal sex: The State Department has called bacha bazi a “widespread, culturally accepted form of male rape.” (While it may be culturally accepted, it violates both Sharia law and Afghan civil code.)
For Pashtuns in the South of Afghanistan, there is no shame in having a little boy lover; on the contrary, it is a matter of pride. Those who can afford the most attractive boy are the players in their world, the OG’s of places like Kandahar and Khost. On the Frontline video, ridiculously macho warrior guys brag about their young boyfriends utterly without shame.
So perhaps in the evil world of Realpolitik, in which there is apparently no moral compass US private contractors won’t smash to smithereens, it made sense for DynCorp to drug up some Pashtun police recruits and turn them loose on a bunch of little boys.
On the heels of that wonderful account, we have to return to Apache helicopters. Another Wikileaks report detailed the execution of men who were “engaged and then they came out wanting to surrender,” because the Apache crew were told that “Lawyer states they cannot surrender to aircraft.” This, under international law as well as the US rules of engagement, is untrue. You can surrender to an aircraft, and the link above details some incidents of this occurring in the very same theater of operations. In one incident, insurgents surrendered to the helicopter with troops close by to make the capture; in another, the helicopter had to wait, and did, before troops could be called to the scene. This is how it should be, because in war as understood by both the US and international law, the ability to capture is not commensurate with the ability to surrender (an argument the helicopter’s ‘lawyer’ would surely make). Unfortunately, neither surrender nor slaughter seems to be the exception or the rule. There is another Apache video, timestamped July 31st, 2004, of a man exiting a car with his hands above his head, the only vehicle on an empty stretch of road. After brief moments the ground is lit up in a barrage of fire, and when the smoke clears the man’s body lies motionless on the side of the road.
These three incidents are unfortunately mere anecdotes. The leaked Afghanistan and Iraq war logs detail the larger trend non-insiders assumed was the case: a bevy of unreported carnage, often the result of US forces operating in a way that cannot be described as precise, calculating, or cautious. Collateral damage is underreported and widespread. This makes liars out of all the generals we see giving press conferences, but should not contradict views of the character of men in war.
Canadians in the crossfire
Wikileaks also released a report that four Canadian soldiers were killed by friendly fire in Afghanistan, when the building they were fighting in was bombed by coalition forces. Part of Operation Medusa, this would not have been controversial had Canadian forces not maintained all this time that the soldiers in question had been killed by the Taliban. Rick Hillier, the brash and painful-to-watch former head of the Canadian Forces, stated on the record that the report was “erroneous,” before going on to say that he hadn’t read it. He knew it was false because he had troops on the ground at the time report to him what actually happened, but couldn’t describe to a reporter what actually happened when asked, because he “wasn’t there.” Amongst indefensible blanket statements he further opined that he didn’t want to read the report at all; it was of no interest to him. This guy really does his due diligence before standing up for dead soldiers.
I’m working on a longer post on Wikileaks. But as you can probably tell there is a lot to take in on that story. Hopefully soon.
I listen to the Adam Carolla show a lot, and the other day he was complaining about the prospect of paying 750k in taxes this year. He ranted about how the top 1% pay for 40% (though he didn’t quote that figure; it’s just a “you know what I mean”), and how no one thinks it’s unfair for high income earners to pay more. He railed against people calling into radio shows yesterday morning saying, “yeah, all the rich people with their income tax accountants searching out loopholes, they get away with everything.” While I agree it is unfair to characterize high income earners this way, and while Carolla is without any equal in radio in terms of true wit and insight, here I think he is letting the opposition frame the argument.
See, he disagrees with people who say “the rich are paying high income taxes, that is as it should be.” Therefore he disagrees with paying high taxes. But the reality is that “the rich are paying high taxes, that is how it is.” This isn’t a justice debate. It isn’t fair that Carolla pays a higher percentage of his income than the next guy just because he makes more; he earned every penny. It isn’t that the state is passing judgment on his degree of avarice. It is that in our state of affairs, to lubricate our system and state, high income earners have to contribute a lot in terms of income to keep things going. This system, through the combination of its pros and cons, generates the conditions through which he can earn his income. It could all be improved. But arguments about fairness, justice, or the degree to which you earned your money have no place in debates about the Bush tax cuts or progressive tax systems in general. The lubrication and health of the economy, and of the government infrastructure that supports it, are what matter most, and for that to continue, given how inefficient government currently is, high income earners will need to be taxed. For what it’s worth, Carolla isn’t paying anything close to 50% on his highest bracket.
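The arithmetic here is worth making concrete, since marginal and effective rates are often conflated: even someone in a top bracket pays the lower rates on the earlier slices of their income. A minimal sketch in Python, with entirely made-up bracket thresholds and rates (not any real jurisdiction’s schedule):

```python
# Hypothetical progressive tax schedule: (lower bound, marginal rate).
# These numbers are illustrative only, not any real jurisdiction's.
BRACKETS = [
    (0, 0.10),        # 10% on the first 50k
    (50_000, 0.25),   # 25% on income between 50k and 200k
    (200_000, 0.40),  # 40% on income above 200k
]

def tax_owed(income: float) -> float:
    """Apply each marginal rate only to the slice of income in its bracket."""
    owed = 0.0
    for i, (lower, rate) in enumerate(BRACKETS):
        upper = BRACKETS[i + 1][0] if i + 1 < len(BRACKETS) else float("inf")
        if income > lower:
            owed += (min(income, upper) - lower) * rate
    return owed

income = 2_000_000
owed = tax_owed(income)  # 5,000 + 37,500 + 720,000 = 762,500
print(f"effective rate: {owed / income:.1%}")  # 38.1%, below the 40% top rate
```

The gap between the effective rate and the top marginal rate grows the more of the schedule sits below the top bracket, which is why a nominal 50% top bracket does not mean handing over half of every dollar earned.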
In any event, especially in an economy like California’s, corporate taxes need to be kept as low as possible to encourage businesses to stay (something Adam speaks correctly on often). Well, all the money has to add up in the end, and it would be far wiser to slash corporate tax rates mercilessly than be merciful on those making 7 figures a year.
The Agenda had a show on neuroscience’s effect on the future of law. One contention raised was that if one could show that brain activity had been compromised in an individual, leading to inevitable, uncontrollable acts of crime (or, more generally, sociopathy), then the person diagnosed would not be guilty. Of course, this led to the assertion that since there is no free will, no one is truly guilty, and the point is moot anyway since we still have to put people in jail.
Who thinks they are more qualified on the question of whether there is free will than people who believe themselves to be smart and who do not specialize in the question? Free will is like racism, the Iraq War, global warming, and the OJ trial: everyone has an opinion on it and very few plead agnosticism. And there are two main camps; let’s call them the logicians and the superstitionists. The logicians essentially stick to one point: why did you choose chocolate over vanilla? If it was your preferences, then you couldn’t have changed that; you have no free will. If it was your choice, then what was it based on? It was not an arbitrary choice. Therefore it must have been based on something, and that something is, by a reductio, out of your control. Ergo, under any scheme you have no free will. Things are predetermined at every turn, as in a mechanistic vision of the universe.
The superstitionists essentially believe in magic (also called “dualism” amongst theodicians, whether closeted or explicit). There exists mind and there exists matter; they are separate, and yet one affects the other through some means. What those means are, we do not know. How it works is magical and/or poetic and/or divine. Therefore free will exists, because it was god-given or because you can just tell (fideism strikes again).
Let’s understand what is not true: the possibility that our brains lead us to believe we have free will neither implies its existence nor its nonexistence, nor does it preclude a defense mechanism of false consciousness. It might be true that a being without free will, who was under the impression it had free will, would be stricken with anguish or at risk of peril if it were led to believe that this was a ruse. But I suspect all the possibilities along these lines, when claimed in the contrapositive, as they often are, will inevitably be simple expressions of the arguer’s desires. It’s like those people who have just learned about the great game of politics, proving things a priori via cui bono. Who benefited from 9/11? George Bush, of course! Therefore interests prove culpability.
Anyways, these two camps need to take some water in their wine, because things aren’t so clear. The main problem with the superstitionists is that they believe in the ether: they presuppose things that do not exist in order to fit their theory. Ockham’s Razor isn’t a law of nature, but it sure makes a good case for not believing in superfluous planes of existence when they are unnecessary. The main problem with the logicians is something I’ve talked about before here: extrapolating the whole from the parts. There is a phenomenon that exists in systems (as broadly as we can define them) called “emergent properties”: the system has a property that the parts do not, and vice versa. Molecules and their locations are changing chaotically, yet often the substance they constitute is stable and steadfast.
Think about this: at what point does a thermometer become a thermometer?
You might be thinking of when the glass comes together, or when the mercury is poured, or when the hash marks are made. The fact of the matter is that a thermometer becomes a thermometer when it is a thermometer and not a moment before. The parts, apart, cannot tell you whether to wear flip-flops. So it is with a computer switch: it has no processing power until it can generate a 1 or a 0. So it is with life. The molecules that make up a lifeform are not alive until they are, and are no longer when they aren’t. And when they aren’t alive, they are simply molecules. Somehow, it is from the unison of these molecules, through some systematic combination, that an emergent property, well, emerges, to grip the whole that the parts create. And that property, in the case of a living organism, is the property of “being alive,” of “living.” And so goes consciousness: it is from the recipe that makes up our bodies that consciousness emerges. It is a property of the whole but not of the parts. That the parts are deterministic, and that the world is deterministic, does not imply that a whole made of deterministic parts is deterministic. Free will could easily be an emergent property of the flavour of consciousness that is human. And it doesn’t have to be quite the free will historically defined by Christendom. But it can be much closer to the free will of the superstitionists than to the thesis of the logicians. All I know is what has been disconfirmed and what has not yet been disconfirmed. And of free will I can say it is a member of the latter.
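Emergence of this kind is easy to see in code. Below is a toy sketch (my illustration, not something from the text) using Conway’s Game of Life: each cell obeys only local birth-and-survival rules, yet the five-cell “glider” is a property of the whole grid: it travels across the board, something no individual cell does or knows about.

```python
from collections import Counter

def step(live: set) -> set:
    """One Game of Life generation over a set of live (x, y) cells.

    A dead cell with exactly 3 live neighbours is born; a live cell
    with 2 or 3 live neighbours survives; every other cell dies.
    """
    counts = Counter(
        (x + dx, y + dy)
        for (x, y) in live
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}

glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
cells = glider
for _ in range(4):
    cells = step(cells)

# After 4 generations the same shape reappears, shifted diagonally by (1, 1):
print(cells == {(x + 1, y + 1) for (x, y) in glider})  # True
```

No cell moves; cells only switch on and off. “The glider moves” is a true statement about the whole that is not a statement about any part, which is the sense in which a property can belong to the system and not to its components.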
And there is the possibility that our brains are so programmed to see the world deterministically, since that provides us with the best coping mechanism for reality, that conceiving of a non-deterministic reality is, given how our brains function, either very difficult or totally impossible. I suspect the former. But whereof we cannot speak, we must pass over in silence; and in the case that our investigations lead to this conclusion and nowhere else, nothing but agnosticism is available.
Descartes’ famous proof of existence, cogito ergo sum, I think therefore I am, was built upon, and in some ways critiqued, by Sartre in Being and Nothingness. One essential idea from Sartre was that you don’t begin to exist by thinking, but rather by becoming aware that you are thinking. We classify sentient beings as those who are self-aware; the common example is being able to identify oneself in a mirror. But Sartre argued that you don’t become truly conscious by being that kind of self-aware, but by knowing you are thinking, by looking down and seeing your self-awareness in action. It’s unclear how old one is when this happens. If we are stringent enough, it might be the case that some adult humans never have such a cognitive experience. But I would guess it is a stage in most everyone’s life. There is another stage, I think it is just a moment in time, when an awareness that this reality you’ve stumbled into will end manifests itself. That moment when death goes from mere hypothetical to impending reality, and that realization, is inexpressibly terrifying. A final, total, and unceasing cessation of consciousness. Given our biological drive to live, thrive, and survive, I don’t think it’s terribly unhealthy to have a negative reaction to such thoughts. But there is nothing in nature that says we would necessarily be best served by living out our days. Of those who die young, by their own hands, it is often said that they are selfish or naive. On the latter I would certainly agree, and on the former I would say: what possession should you be selfish of before your own life? But what’s unclear is whether they are wrong to do it. It is the fact that we do not control our birth, that we did not wish to come to be, coupled with the fact that we are unlikely to control our death, that makes suicide compelling as an unusual level of control exerted by the self over reality. If we weren’t going to die, killing oneself would lose a great deal of its appeal for many.
If we weren’t going to die, we might have time to fix all of the mistakes we can’t live with. I don’t think that is a paradox, for what it’s worth. What isn’t clear is that they were wrong to do it. It is amazing how, as we grow older, what we assumed was true in the past comes to seem immature and ludicrous. Age has a lot to be said for itself. But no one can speak to the benefits of death, and those who follow texts or ideologies that claim knowledge in this area are, without exception, some combination of deluded, short-sighted, conceited, desperate, and stupid. That’s why I will never have respect for a Catholic priest, for example. On a deep existential level they are liars. And they don’t just lie to themselves in order to sleep at night; they lie to the vulnerable and the credulous. What isn’t clear is whether they were wrong to do it. Because sometimes when you risk being wrong, you are at risk of being right.
Hedges inevitably relies on Harris’s and Hitchens’s support for the invasion of Iraq in 2003 to make the leap to the above paragraphs. But Dawkins, for one, was an ardent opponent of pretty much any Bush policy, especially the war. Dawkins focuses his atheistic efforts on US and UK education, separation of church and state, the evolution vs. intelligent design argument, and Randi-style religious-huckster debunking. He doesn’t actually have all that much to say about global politics. And while he does have a lot to say about the philosophy of faith, his contributions are the ones one might expect from a biologist. What am I getting at? First, these four are just people, not representations of a larger trend. Second, to say they are, and to attack them, is pretty useless as a criticism of modern atheism. When it comes to philosophical depth, Dawkins is a straw man. When it comes to geopolitics, Harris is a straw man. Hitchens doesn’t go about making a lot of scientific claims, but I’m sure he would be well criticized if he did. What does any of this have to do with atheism as religion? A great amount of Hedges’s book relies on the jump from “some influential atheists are wrong about an assortment of things” to “there is a great emerging threat to western civilization.”
One of Hedges’s central claims is that, building on Enlightenment perfectionist values, modern atheists justify extreme violence, enacting an ideological commitment to ridding the world of theist opposition. He doesn’t spend much time on the inference, but rather on the premise that such values must uphold the perfectibility of man. Over and over again he recites the folly of those who would believe in the moral progress of man, and of those who would view man as anything but the perpetual reenactment of the stories of Cain and Abel and the fall. A typical example occurs on pages 28-29:
"[A] Those who teach that religion is evil and that science and reason will save us are as deluded as those who believe in angels and demons…. [B] Science and human reason, like institutional religion, have delivered as much suffering as comfort…. [C] The story of the fall in the Garden of Eden is a warning about the danger posed by blind faith in the power of human knowledge. [D] The figure who delivers knowledge to Adam and Eve is the source of evil — the devil. [E] Knowledge brings with it benefits, including self-awareness and power, but it also tempts us to play God. [F] To act on this temptation, to worship our own capacities, lures us into Utopian projects. [G] The Biblical story of the fall conveys fundamental truths about Freedom, guilt, our relation to nature and mortality…. [H] Those who created the Greek myths, the Vedas, the Upanishads, as well as the Bible, were trying to explain human beings to human beings. [I] We carry on a never-ending struggle with "the evil that I would not that I do.""
"The agenda of the new atheists, however, is disturbing. These atheists embrace a belief system as intolerant, chauvinistic and bigoted as that of religious fundamentalists. They propose a route to collective salvation based on the moral advancement of the human species through science and reason…."
He goes on “The utopian dream of a perfect society and a perfect human being, the idea that we are moving toward collective salvation, is one of the most dangerous legacies of the Christian faith and the Enlightenment.”
Most of Hedges’s criticisms of Hitchens take the form of throwaways. He doesn’t concentrate on him, maybe accidentally, maybe for good reason. The fact of the matter is that Hitchens, on matters of ideological bent, is nearly beyond reproach. The last thing someone who holds such a large swathe of opinions could be accused of is being ideological; if you want to accuse Hitchens of something, accuse him of being wrong. Hedges succeeds in doing the latter, but his efforts at the former hurt his credibility. On Dawkins’ idea of brights, for instance, Hitchens immediately (and in writing) chastised the biologist’s naivete. (Dawkins wanted to create a sort of international rotary club uniting agnostics, secularists, atheists, and other infidels under the banner of ‘Brights’ — sounds ideologically promising, doesn’t it?) This neatly illustrates Hitchens’s lack of belief in a united atheist front: he dissented from the would-be figurehead on the very matter of having a united atheist front.
Never do Hitchens et al. say that technology, reason, and science are going to lead to a utopian future. But Hedges presses on with that assumed:
"There is nothing in human nature or human history to support the idea that we are morally advancing as a species or that we will overcome the flaws of human nature. We progress technologically and scientifically, but not morally. We use the newest instruments of technological and scientific progress to create more efficient forms of killing, repression, economic exploitation and to accelerate environmental degradation. There is a good and a bad side to human progress.  We are not advancing toward a glorious utopia."
Let’s break down Hedges’s paragraph above. That there is a good and a bad side to human progress is obviously true, and I have never heard one person say otherwise. That we are not advancing toward a glorious utopia is also obviously true, and the only people I have heard say otherwise are lunatics like Ray Kurzweil or loony people who believe in heaven and salvation. Of those I have nothing else to say. But the claim that we use the newest instruments of progress to create more efficient forms of killing is just not true. It ignores the facts. It is a fact that the number of deaths inflicted upon humans, by other humans, on a per-capita basis, has gone down dramatically over time, up to and including this century. It is our stomach for violence that has perhaps gone up; at least, that is one explanation offered by Steven Pinker, amongst a bevy of good others. It is also a fact that Hedges subscribes to a Marxist version of economic exploitation that is so far from reality that I can only dismiss it out of hand. Hedges alludes to this ‘economic exploitation’ constantly throughout the book, but never actually makes anything that resembles an economic argument.
Because the rest of the passage is either trivially true or simply false, let’s focus on the claims about moral progress. The main thing is this: if you believe the Bible or similar religious texts that preach the sinful, irredeemable, guilt-ridden nature of man, of course you will never believe in moral progress (as in [G]). But let me suggest this: those things Hedges would deem moral, such as not harming others and helping those who need help, are affected not just by one’s genes but by one’s environment. And the plasticity of our brains allows sufficient room for our era to train our behaviour towards the more moral. No genetic evolution necessary. Even the most cynical Hobbesian will recognize that scarcity is a great source of violent death, and no one can deny that scarcity has decreased. Hedges also makes a classic big-brain mistake (as in [H]): he spins morality yarns out of religious texts that are at best parables, and more often silly tales. Any story that involves people dying violently can be retold as a morality play from which you should learn how to live, if someone is smart enough and wants to work hard enough to do it. Even if the story is a collection of garbled scripts written by nomadic Semitic tribes steeped in ignorant mysticism, praying for a good winter.
For the sake of sadism, let’s visit another Hedges passage:
"The myth of collective moral progress feeds the aggressive instincts Freud feared. If we see ourselves as the culmination of a long, historical process toward perfectibility, rather than a tragic reflection of what went before, then we are likely to think the ends justify the means." Hedges’s favourite device is hyperbole, and this false dichotomy is a good example of it, since we don’t have to see ourselves as part of a historical process toward perfectibility, but rather as part of a process toward historically low toaster-oven prices. The quality of life on earth is getting better; the choice he offers is a misdirection, since it is not an either-or question.
"Religious institutions, however, should be separated from the religious values imparted to me by religious figures, including my father. Most of these men and women ran afoul of their own religious authorities. Religion, real religion, involved fighting for justice, standing up for the voiceless and the weak, reaching out in acts of kindness and compassion to the stranger and the outcast, living a life of simplicity, cultivating empathy and defying the powerful." Hedges makes a pretty classic mistake: naming his father in his description of how he arrived at being right. As with Kant’s soft spot for theism, one should take such remarks with a healthy dose of skepticism. Actually, mentioning that your father or mother taught you something you now purport to others to be true should be treated like the rule about writing what you think is true in all caps: it implies you don’t know what the fuck you are talking about. What’s worse about people who write in all caps is that it signals that the writer perceives what they are spelling out as deeply profound, alarming, and/or shockingly novel, and that they cannot conceive of a reality in which, if they were just listened to, others would not see things their way. Well, that’s the impression I get from Hedges’s offering; it would have been easier for everyone had it just been printed in all caps, so that we would know in advance not to read it.
That was the previous title of Chris Hedges’s When Atheism Becomes Religion. Before we go further, let me say that there are a lot of things I don’t like about this book, so the following shouldn’t be construed as a review or a critique. It is definitely a criticism, one I doubt is objective, even though it feels to me like it’s rooted in rationalism. You’ve never heard of this book, you say? It is weird that it didn’t get much sustained publicity for a Pulitzer Prize-winning journalist (war correspondent), but at the same time Hedges remains a literary force, churning out dreck every year.
* * *
To begin with titles is both logical and illustrative. As it happens, both of Hedges’s are neither. The original title scores great marks on the inflammatory scale (something I wholeheartedly endorse) but says nothing by way of exposition, because nowhere does Hedges allege that atheists don’t actually exist. I don’t think it is presumptuous to assume Hedges might argue the well-worn points that atheists don’t believe what they say they believe, but this never comes to fruition. Instead, the only way to interpret the original title as reflective of the contents of the book is to say Hedges doesn’t believe in atheists the way a mother believes in her son’s ability to make the soccer team. A weak thesis indeed (easy to prove!), and quite irrelevant.
In any event, the second title implies that atheism needs some additional ingredients to become religion (the when). The book makes very little mention of this point, focusing on specific atheists as fundamentalists. Moreover, the point that atheism can become religion, or more specifically, be part of a religious ideology, is historically and conceptually obvious. That it isn’t ipso facto, however, is important. While Hedges doesn’t take pains to explain this, he does state “After all, there is nothing intrinsically moral about being a believer or a nonbeliever.”
The objects of Hedges’s criticism are the supposedly iconic atheists of our time: Richard Dawkins, Christopher Hitchens, Sam Harris, and Daniel Dennett. Well, let me just say that the condition of being an icon is in the eye of the believer, beholder, and disbeliever alike. These four, while they have their fans, and in Dawkins’s case their followers, represent neither a united front nor atheists at large. That’s really problem number one with Hedges’s whole premise. To attack Dawkins on a topic is to praise Hitchens, and vice versa, since the two disagree on a lot. Moreover, even if you secure a video-game-level evisceration of your foes, you still can’t guarantee your thesis obtains, since they don’t actually represent many more than themselves. And even if you present a damning study of an emerging strain of the modern atheist by linking three metaphorically bespectacled members of the academy to the urbane, unadulterated man of letters, you certainly don’t justify the precipitous descent into hyperbole Hedges undertakes in the opening chapter as he describes the threat these book authors and public speakers pose to civilization itself:
They believe the ends, no matter how barbaric, justify the means. Utopian ideologues, armed with the technology and mechanisms of industrial slaughter, have killed tens of millions of people over the last century. They ask us to inflict suffering and death in the name of virtue and truth. The recent crop of atheists, in the end, offer us a new version of an old and dangerous faith. It is one we have seen before. It is one we must fight.
I wrote this article a while ago and couldn’t find it online. So here it is. Now that I have unlimited space I may expand it in the future as it was condensed for publication.
* * *
Michael Buckley is accurate when he describes the historical use of the word atheist as “long on rhetoric and polemics, but short on the precision.”[i] Atheist and atheism have undergone a meandering transformation over time, originating with the Greek atheos: ‘a,’ meaning without, and ‘theos,’ god. This was later carried into Latin as atheoi by Cicero.
Many in ancient Greece and Rome were labeled atheists. The Roman triumvirate of Cicero (1st century BC), Sextus Empiricus, and Claudius Aelianus (both late 2nd to early 3rd century AD) compiled lists of atheists of the day and of the past. Men like Protagoras (5th century BC), of “man is the measure of all things” fame, often found themselves on such lists, though today we would identify him as an agnostic. In Concerning the Gods he writes, “Concerning the gods, I have no means of knowing whether they exist or not or of what sort they may be, because of the obscurity of the subject, and the brevity of human life.”[ii] While not atheistic, this, to say the least, was a highly heretical statement.
Perhaps most deserving to be on such a list was Diagoras of Melos (also 5th century BC), who earned himself a reward for his murder or capture, advertised on a bronze tablet atop the acropolis. Diagoras had certainly been impious, burning effigies of the gods for many to see. But because of scarce and conflicting sources, scholars are in disharmony as to whether he was an atheist in today’s sense, or whether he only fulfilled the Greek connotation of ‘ungodliness.’ He certainly earned repute.
To the Greeks someone could do many things to be an atheist: they could adopt an atomist metaphysics (as Epicurus), a sceptical epistemology (as Protagoras), or they could simply deny that the gods were good. Famously, Plato’s Apology chronicles the trial of Socrates (399 BC), whose prosecutor, Meletus, accused him of being “a complete atheist”[iii] even though Socrates’ theism (if unorthodox for the time) is clear. Buckley, in At the Origins of Modern Atheism, explains the relativistic use of the term: “One man’s theism proved to be his indictor’s atheism, the incarnation of impiety.”[iv] This usage was not necessarily an affront to reason: atheist did not describe one’s analytical position – not yet – it was an epithet.
Atheist often meant one who did not honour the city’s god. Hence the early Christians were called atheists by the Romans, and with some accuracy, the condemnation was returned. Christians are after all atheists with respect to Mars, and pagans towards Jehovah. The Christian Justin Martyr used the word in this manner in his First Apology, written in the second century AD in Rome: “Thus we are even called atheists. We do confess ourselves atheists before those whom you regard as gods, but not with respect to the Most True God.”[v] After the fall of the Roman empire, atheist would not be used for over a millennium.
In the late sixteenth century, the word atheism came to English, but its re-emergence was first as athéisme in French, in 1587. Atheist had appeared slightly earlier, in 1571, a trend that would continue in history. When deist was conceived for example, deism did not follow until much later. In the centuries preceding, words such as infidel, heretic, and blasphemer had filled the role of religious accusation. Similar words predate atheism, though their usage was scant, and they were often failed coinages, like atheonism’s appearance in 1534. Atheoi resurfaced in Italian in 1568, an indication of the growing interest in rediscovering the past.
Lord Burleigh, writing in 1572’s Discourse on the Present State of England, saw atheists everywhere: “The realm is divided into three parties, the Papists, the Atheists, and the Protestants.”[vi] There are two technical notions of atheism in use in scholarship today that help to interpret remarks such as this. Negative atheism is simply the absence of belief in god, while positive atheism is a conscious, affirmed disbelief in god’s existence. Given the time’s ambiguous usage, it is hard to know what proportion, if any, of Burleigh’s Atheists were positive atheists.
Atheist honoured its epithetic pedigree, and was used primarily to slander one’s theological opponent. When fellow theists disagreed (as during the Reformation), one would argue that the other’s position reduced to atheism. Such charges would be leveled endlessly at Thomas Hobbes, even centuries after his death, as a result of his 1651 publication, Leviathan. Hobbes took pains to defend himself: “the words Atheism, Impiety and the like, are words of the greatest defamation possible,”[vii] he writes in An Answer to Bishop Bramhall, his principal adversary. On an atheist’s trustworthiness, Hobbes expressed the view later associated with Locke: “no punishment preordained by Law, can be too great for such an insolence; because there is no living in a Commonwealth with men, to whose oaths we cannot reasonably give credit.”[viii]
But Hobbes’ book did incite atheism, an atheism that was romanticized by a few highborn. Daniel Scargill, in 1669, “expressly affirmed that I gloried to be an Hobbist and an Atheist.”[ix] He described how Hobbes’ materialism was discussed and endorsed in private circulation amongst a society of rich Englishmen. But Scargill’s words betrayed his interest, for in atheism he saw ‘glory,’ which did not last; he later recanted and was received back into the lord’s flock.
Atheist began to be used differently soon after its introduction into English. There was widespread debate, which continued into the late 18th century, as to whether one could actually be an atheist. Thomas Curteis wrote in Dissertation on the extreme folly and danger of infidelity (London, 1725), “’tis very questionable, Whether there ever was any such Monster in Nature, as a serious, close-thinking or speculative Atheist: who liv’d and died so.”[x] Thomas Broughton’s 1737 work, Bibliotheca historico-sacra, echoed similar sentiments: “there is room to doubt, whether there ever have been thinking men, who have actually reasoned themselves into a disbelief of a Deity.”[xi] Indeed, for two centuries the history of atheism in Europe revolved not around the existence of a god, but around the existence of an atheist!
While atheist was used to defame others, it is apparent that it was seldom meant literally, since the same people who hurled it expressed doubt that anyone could actually be one. As the existence debate persisted, an important wrinkle made its entrance: theologians began not only expressing disbelief that one could be an atheist, but condemning atheism itself. The English preacher John Balguy delivered a series of twenty-two sermons in 1749, one of which was innocently titled The folly and wretchedness of an atheistic inclination:
Of all the false Doctrines, and foolish Opinions, which ever infest the Mind of Men, nothing can possibly equal to that of Atheism; which is such a monstrous Contradiction to all Evidence, to all Powers of Understanding, and the Dictates of common Sense, that it may be well questioned whether any Man can really fall into it by a deliberate Use of his Judgement.[xii]
The word was no longer simply an insult; it referred to a position: the denial of god’s existence. When it was doubted that one could be an atheist, many thought they were writing of the squared circle. When atheism began to be persecuted in writing, it became clear, by implication, that there were people who were atheists. The condemners of atheism never acknowledged this, but as the historian of British atheism David Berman points out, something has to exist in order for it to be properly persecuted.
Toward the end of the 18th century, men began adopting the word atheist as a means of self-identification, something inconceivable in the centuries before. The first written avowal of positive atheism came, as one would expect, pseudonymously, in the pamphlet Answer to Dr Priestley’s Letters to a Philosophical Unbeliever:
…as to the question whether there is such an existent Being as an atheist, to put that out of all manner of doubt, I do declare upon my honour that I am one. Be it therefore for the future remembered, that in London in the kingdom of England, in the year of our Lord one thousand seven hundred and eighty-one, a man publicly declared himself to be an atheist.[xiii]
While it was signed by one William Hammon, most believe it to be the work of a physician, Matthew Turner, though Berman argues for co-authorship, positing Hammon as a real person. Nevertheless, the search was over. What had been suspected in Hobbes, decried in Spinoza, and inferred in d’Holbach was confirmed to exist in writing, with a healthy absence of equivocation. Atheist had undergone a transformation: from profanity, to hypothetical, to corporeality.
There are words in use today that have a similar structure of development, once derision directed towards a minority, evolving into a term of self-identification and even pride. It is in this way that atheist’s meaning came into the possession of those whom it now describes.
[i] Buckley, Michael J. At the Origins of Modern Atheism. Yale University Press. 1987.
[ii] Bremmer, Jan N. Atheism in Antiquity. In Martin, Michael (Ed.) The Cambridge Companion to Atheism. Cambridge University Press. 2007.
[iii] Plato. Apology. In The Trial and Death of Socrates: Four Dialogues. Shane Weller (Ed.) Benjamin Jowett translation. Dover Publications. 1992.
[iv] Buckley, Michael J. At the Origins of Modern Atheism. Yale University Press. 1987.
[xiii] Turner, Matthew (and/or Hammon, William). Answer to Dr. Priestley’s Letters to a Philosophical Unbeliever. The Cambridge History of Eighteenth Century Philosophy, Vol. II. Knud Haakonssen (Ed.) Cambridge University Press. 2006.
Well, the Democrats gave up some ground in the latest American election, and much of the commentary leading up to, and in the wake of, these results framed them as a referendum on Obama’s performance. I think this framing has some merit, in that a portion of voters don’t distinguish candidates from parties, and another portion give some of their vote’s consideration to parties over candidates. In both cases Obama’s performance would figure in the voting for Congressional office.
I was telling someone last week how it is funny that, leading up to his election, I was one of the few Obama endorsers who claimed he was the lesser evil (as opposed to the extinguisher of evil). I remember writing something like: the Obama fanatics would suffer widespread whiplash while reading the newspaper in the aftermath of his election and the inevitable disillusion-inducing headlines that would come with it. Yet I would still vote for him because the possibility of Sarah Palin as VP alone was enough to wholly dismiss the McCain-Palin ticket. In any event, the really remarkable thing is that, amongst people today, I’m one of the few who is quite pleased with what Obama’s done since he’s been in office. On almost every issue he’s performed fairly well, and when viewed in a comparative presidential light, he’s performed enormously well. I think one thing we can derive from this is the fact that most people my age are incredibly stupid. Perhaps unforgivably stupid. I’d go as far as to say most people from my generation, even under 30, especially (but not limited to) those who are formally educated, are naive, epistemologically unscrupulous, irrational popinjays. That’s not to fully endorse the wisdom accrued by the aged, but surely they have a better batting average with regard to episodic idiocy, at least until the annuated become superannuated.
I don’t agree with Obama’s moves with respect to Afghanistan, Pakistan, or Iran. I don’t think he’s pressed hard enough on Israeli-Palestinian negotiations when he had the chance to play hardball and get results. I think Gitmo could be closed by now. I don’t think ending don’t ask don’t tell is a good thing (since people, like gays, should be disincentivized to enter US military service). But on stem cell research, pro-choice 3rd world funding, broad economic decisions, anti-proliferation work, general world diplomacy, health care reform, the scale-down in Iraq, UN involvement, and a small litany of other topics, I think compared to other politicians (low standards in mind) Obama has wildly outperformed any reasonable expectations. When you have reasonable expectations and those reasonable expectations happen to be low expectations you will be satisfied often.
Check out this Progressive article that claims that “democracy” is drowning in campaign ads. I think there are two general theses that forward the idea that mass society is gullible and susceptible to the power of advertising. The first is based on a historical fear engendered by the rise of the Third Reich. It goes something like this: the German people at the turn of the century were some of the most cultured on earth, yet somehow they were ‘convinced’ to become terrible barbarians and commit horrific atrocities. The only explanation for this mass coercion of society is the effectiveness of propaganda upon an unwitting population when coupled with the hopelessness most felt in fighting back (Schindler syndrome). Thus a great number of convincible people were convinced, and the rest went along out of fear. We are susceptible to the intoxication of propaganda, and people go along with the crowd.
As an aside, the reason this thesis falls flat is that its assumptions are quite tenuous. The Germanic Christians were never that cultured, not to the point of something obvious, say, like semitism (is that the opposite of antisemitism?). Not everyone needed convincing, and many who did didn’t need that much convincing, to go along with the nationalistic fervor that Hitler represented. The example is used again and again: the guards at the gas chamber, what were they to do, as individuals? This point can be debated, but consider this: when the ordinary German soldier took his rifle to defenseless civilians, in the city streets, or fresh off the train in some concentration camp, sometimes he enjoyed it. You don’t need an elaborate propaganda theory when you posit that human behaviour wasn’t debauched to any great degree, or that the capacity for such behaviour needed no debauching at all.
The second thesis is simply this: everyone but me and a few select individuals is a stupid, gullible twit. In this case, some superior group of leftists, rightists, libertarians, religious, conspiracy theorists, anarchists, what have you, are convinced they are privy to some form of enlightenment that others are either a) incapable of comprehending or b) unlucky enough not yet to have been exposed to. These work in conjunction, of course, since when anyone they think falls into category b doesn’t accept their wild propositions and gross generalizations, they automatically get shifted into category a, or failing that, a1, which states “oh, they just don’t have an open mind at this point in their life.” The rule of thumb is: everyone is a dupe but me. Somehow these people go unaffected by advertising, while the rest of society is irredeemably corrupted.
Getting back to the article in question: to no great surprise, the author thinks a different emphasis should be placed on the candidate she would have win:
"One of the main attributes of all the political ads is ceaseless repetition. No matter how well the candidates argue in the debates—and Feingold’s debate performances against opponent Ron Johnson are probably the best of any candidate in the nation—the same old talking points get recycled over and over."
And how clear, exactly, was his victory in the debates anyway? In summary, she talks about some nuances of her candidate’s positions that aren’t exposed in negative advertising:
"But these are details that don’t come up in the ad war that substitutes for a real political debate in this election year."
What exactly would make political debate real for this person, I wonder, besides the public’s conviction to elect the candidate she views as the clear better choice? That is in fact likely the only criterion she uses to gauge whether the masses are ‘on track’ or ‘being duped,’ whether the country is ‘turning around’ or ‘in trouble.’ This kind of political thinking burns many calories and gives an air of punditry to those who spend little of their time thinking about something like actual public policy. It’s not that they aren’t being productive with their time, it’s that what they are producing isn’t productive. And there is no more counterproductive thing than trying to convince your readership of the credulity of everyone in the world outside of their sphere and yours.
There exists contemporary confusion regarding the place of free expression and free speech in a free society. A free society need not be defined as a market-based liberal democracy with a charter of rights and freedoms, but that certainly is a very good outline for a free society. An outline which Canada, for example, has filled in quite well, especially in comparison to other states. Many of the assertions and arguments in the following essay will assume a liberal democracy, rights and freedoms ensured by the state, and a market economy.
Freedom of Thought
Individuals have ultimate freedom in thought. Insofar as their capacities limit them, a person is able to entertain any notion whatsoever without any immediate, and certainly no necessary, consequences to the rest of the world. Moreover, in thought there is ultimate privacy (barring a belief in something like god, of course). No one can access one’s thoughts and no one can silence them. And in fact, no individual’s private thoughts can have any immediate, necessary, or physical effect on the well-being or freedom of another. Private thoughts have no cost and do no harm, yet provide unimpeachable freedom to the thinker. The same cannot be said of speech.
Freedom of Listener
For there to be speech, there must be a listener. Words spoken alone are heard only by the foliage, and are therefore effectively vocalized thinking. When more than one party is involved in an interaction in a free society, one rule governs the interventional jurisdiction of the state: harm. The freedom of an individual is so constrained as to prohibit actions that cause harm to others, by any means, including the impairment of those others’ individual freedom. Therefore, since speech requires a listener, and since freedom is restricted when harm is caused, freedom of speech can be restricted, by the law, by the state, when harm to others is at stake. If speech constrains the freedom of others the state can legitimately restrict it; if speech harms others directly the state can legitimately intervene. Ultimate freedom still resides in the speaker in this way: they can think whatever they want to themselves.
Freedom of Speaker
Where something is being said is not what is being said. Free speech and free expression are used interchangeably, but the former, when taken literally, is what is easiest to see being constrained. One cannot walk behind someone for days on end shouting loudly; this is harassment. One cannot lie to a jury of one’s peers; this is perjury. One cannot preach the gospel in a crowded restaurant of private citizens; this is public nuisance. In each case harm is being done: to the liberty of the harassed, to the justice of the accused, to the peace of mind of the patrons. Young men are always confused when an orator of great influence enters their community, and they are disdained by the community for shouting at this person of interest in front of an audience of hundreds. They are disdained because they don’t have a right to blight everyone’s experience in the interest of having their facile opinions heard above everyone else’s. In these ways, the freedom to physically speak should be constrained.
In contrast, the freedom to express, the content of speech, should not be constrained whatsoever. Whether speaking in a hall, opining in a newspaper, broadcasting across radio or television, writing in a book or elsewhere; the content of speech cannot be judged by the state or the law. Where someone is speaking matters, what they are saying does not, insofar as it concerns the state’s intervention in accordance with the prevention of harm.
The fact of the matter is that being heard takes two. The measurable difference from speech to expression is the leisure with which the listener is exposed. Where the listener can find the information at their leisure, through their own choice, the speaker should have the most freedom; conversely, where the listener has little or no choice but to listen, the speaker should have little freedom. You have a right to speak, but not to be listened to. Therefore, in the mediums of books, pamphlets, articles, periodicals, television, or radio, the speaker should have ultimate freedom to write, produce, or express whatever they desire. In public spaces, however, one cannot simply shout through a loudspeaker as if it were one’s right to injure the ears of passersby, whatever the message. When the listener seeks out the speaker (as in books, as in radio, as in television, as in the internet) the speaker has endless freedom; when the speaker seeks out the listener (as in protests, as in solicitations, as in mandatory education) the speaker’s freedom ends.
Concerning tortured images
When it comes to pornography, torture, violence, and various sexually explicit taboos, expression, publication, and documentation are not the issue. Torture of individuals is an issue, photography of innocents is an issue, defiling sexual acts is an issue. Whether the state reprimands these things is what is at issue. If the camera failed to take the picture and the scene it would have documented was illegal, then the scene is equally illegal had the camera taken the photo. Similarly, if the scene is legal when not filmed then it is legal when filmed. Child pornography is proof something illegal has occurred, and that illegal act should be punished, not the act that documented it. The willful starvation of a dog is a crime, for example; the presentation of it as art is simply the documentation of a crime. It makes the work of a prosecutor easy; it doesn’t change the charges that should be pressed.
Benefits of an unencumbered Speaker
A society where the content of one’s thoughts is free to be expressed is a society of great transparency for the listeners. When a marketplace of ideas takes root, those members of a free society who subscribe to rational pyrrhonism benefit greatly in the availability of possible belief propositions. Inevitably, gradually, unevenly, better ideas replace worse ones. Moreover, the existence of a marketplace of ideas allows the prevalence of identifiable ideas in a society. The unencumbered speaker identifies themselves. If one is unable to express their deep-seated hatred of a particular ethnic group, for instance, or religious cult (popular or niche), it is the listener who suffers, not the speaker. The listeners have a right to know what world they live in when there are members of society willing to elucidate the topic for them. The listeners have the right to know there are those who would execute members of an ethnic group, and why they would do it, when there are people willing to divulge these statements. For when bigotry, par exemple, is willing to be spoken, and willing to be heard, but the osculation of lips to ears is disallowed by the state, what suffers but the accuracy with which the listeners view the world, and the ability of the listeners to refute, to repudiate, to censure, to dispel, the misgivings of bigots? Censorship harms censuring. And only with the unrestricted expression of ideas can bad ideas be identified, can good ideas be tested, and can the private thoughts of everyone be disseminated as they will.
To that point, and related to the freedom of the listener: offense is not harm. It is by no means clear that giving offense through expression is productive or harmful, whether through the looking glass of the state or of a philosopher who would be king. The fact of the matter is that taking offense merely offends the current sensibilities of a listener; whether those are the most desirable sensibilities for that listener to hold is up to them to decide. Being offended is a matter of taste, not principle. And since taking or giving offense concerns the content of expression, not the physicality of speech, the state has no say in its definition.
The Incompetence of the State
So what of the case where the censored are denied an opportunity to speak the truth to eager listeners? Of the content of expression, of the content of speech, the state cannot say anything. For who is the state to say the bigot is wrong? That is the purview of the listener. The state is neither qualified, nor, were it qualified, competent enough to rule on the quality of the content of any expression. It is only qualified to rule on the place where speech is made, and on when speech in such places causes harm. Whether expressions of ethnic bigotry, of occult devotion, of conspiratorial romance, of historical fabrication, of snake-oil salesmanship, of violent reprisal, of homicidal incitement, the state cannot judge the merits of this speech’s content, and therefore it should be allowed, and neither discouraged nor encouraged, in the interests of the exposure of the truth, whether that be through the self-evident or explanatory powers of these speakers or through the embarrassing absence of coherence in their expositions.
Despite all the power bestowed upon the state, and the necessity of these endowments, the state is far from competent in its responsibilities and errands. This understanding should be built into everything it does. It is a twofold incompetence. The state is, by its nature and structure, incapable of the kind of relentless pursuit of achievement a single-minded individual or a driven commercial enterprise might manage, and by its structure it is also incapable of having the proper incentives to engender the type of behaviour we wish the state to partake in. The second kind of ingrained and natural incompetence lies with the agents of the state, from the police officer to the bureaucrat, the politician to the social servant, the arbiter to the prison guard. These are not a free society’s ‘best and brightest.’ Often they are the dimmest and worst of the productive members of society. If they are members of the cream of the crop, they are the film that defines its edges. And about that little can be done, besides restricting their power. In censorship the state therefore should have as little say as possible, both on grounds of its own incentives and interests being corrupting, and on grounds of the pure lack of rational calculation and acumen in the executors of policy.
In policing the population, in forming the laws, in litigating disputes, and in enforcing the rules, the men and women who form these arms of government simply aren’t infallibly lucid or in possession of an alacritous wit. They cannot be entrusted with the comprehension of the subtlety necessary for optimal free speech in a free society.
Protection of the Market
In copyright we see the protection of the market and its generation of prosperity, innovation, and quality of life. When a company pours a hundred million dollars into isolating a gene for a specific disease, in order to manufacture a cure, that work and investment must be rewarded. For if the company is not allowed reimbursements based on its work being used, is not allowed to profit as the government-sanctioned provider of treatments for that disease based on the gene it isolated, is not compensated in some grand sustained manner if the content of its work is released for use by other companies in the market, then why would it launch such a project in the first place? Why would a company, of its own volition, without any order from above or initiative rooted in moral fervor, pursue an end that will lead to the uncompensated loss of one hundred million dollars? And when no company is willing to invest vast sums of money into innovative treatment of disease, or innovation in general, when will the future come? The fact of the matter is that the government is the estranged steward of the market. It is necessary to make the rules of the game, to build the arena, and to occasionally referee the action. But it doesn’t choose the winners, care who shows up to the game, or what the final tallies are. And those conditions are equally necessary (as the stadium, the rules, and their enforcement) to encouraging the greatest number of the most driven participants in the market. And when the market is suffused with motivated participants, and when their potential rewards are uncapped, that is when society benefits the most from their playing the game.
Does this mean that said company should be allowed to sit on their findings and not produce? No: they pursued an item in the public domain, the genetic code of humans, in order to treat a disease. And through their successful investment and discovery, they, ipso facto, won the contract, the right, to peddle its cure for profit. If they become negligent in that contract, if they fail to bring wares to the market square, then the state can repossess the contract. But no one else should be allowed to speak freely of the secret they discovered. Freely in the sense that, if others do speak of it, compensation is headed that company’s way, just as, if the company is relieved of its contract regarding the disease, the new standard bearer will owe it the cost of making the cure possible.
The Necessity of the State
For what purpose should we entrust a monopoly on the legitimate use of violence to the incompetent, lumbering, self-serving, overarching organization better known as the state? Simply put: it is necessary, and better than the alternative. The state creates the conditions for a free society; it is necessary but not sufficient. The absence of the state incurs an absence of possible freedom. For in anarchy there is freedom to act but not freedom from certain conditions, such as fear and scarcity, much less the luxuries of something like free expression. And so too with the creation of the market is the state necessary; and from the market comes relief from scarcity, and from the market come the very mediums by which one can express oneself.
Freedom of Assembly
Freedom to assemble is not commensurate with freedom of speech or expression. Assembly in private spaces is of no concern to the state in and of itself; only illegal activity in private spaces, with assembly or not, is of concern to the state. Assembly in public spaces, on the contrary, is in the jurisdiction of the state in this way: if it encroaches on the well-being of the greater public, then the state has a right to intervene. The problem, of course, is that the state is both incompetent at assessing whether sufficient encroachment has occurred and has its own interests warping what it will view as encroachment. To this there are few solutions. But what is not given is the right of groups within a society to blockade the streets of a city at the expense of everyone else. There are some mitigating platitudes. First, the establishment of “free speech zones” during times of high social tension amongst groups is not an authoritarian idea if properly executed. It is not authoritarian to stipulate, and enforce, the fact that you do not have a right to shout wherever you please. However, the more appropriate step is “no free speech zones,” say, in the vicinity of hospitals, courtrooms in session, elementary and secondary schools, and private residences. Parliaments and legislatures, by contrast, should by nature be designed for protesters to inhabit their lawns.
Therein lies the relationship of speech and of expression to freedom in a society. To optimize freedom, speech and expression need to be taken in nuance, in detail, and in balance with the rights and freedoms, with the benefits and sacrifices, individuals and groups have and make in order to generate and maintain a prosperous, democratic, secure, and free society.
Having just finished Chuck Klosterman’s Eating the Dinosaur, here is a review:
On the back cover of Klosterman’s collection of essays there is a witty Q&A with the book’s would-be purchaser, promising larger themes but nothing concrete. In this, the author delivers. One can appreciate in Klosterman a polished technical ability for exposition, expression, and exasperation. The book reads like a conversation with someone really interesting and well informed on topics which interest you but which you haven’t yet looked into. That is to say, Nirvana and Weezer, ABBA and Garth Brooks, Ralph Sampson and the NFL.
One thing Klosterman risks early and often is creating a thesis where none should exist. Call it the no-accident syndrome that so many writers, particularly cultural critics, fall into. They forget that there are such things as accidents of history, and that not every cultural moment of significance signifies something larger about the culture in which it partakes. Klosterman expounds, in no small detail, on Weezer’s lyrical literalism and its connection to the death of irony in post-Obama America. While frontman Rivers Cuomo’s non-ironic lyrics may represent some profound meaning and change in his world, the world that created those lyrics, the idea that they represent something, changes or otherwise, in the wider world seems entirely fantastical. There are coincidences, accidents of history, and anecdotes, but people who write about sports, or music, or television programs inevitably have to place events in a larger narrative that doesn’t exist. This is fine in this way: if they don’t take themselves seriously, their thesis-creation can be taken for what it is, insightful analysis of the art in question with some creative interpretation. Nonfiction writing as entertainment. In this, Klosterman excels.
And in getting things right, he doesn’t slouch either. On going back in time Klosterman concludes: “People who want to travel through time are both (a) unhappy and (b) unwilling to compromise anything about who they are. They would rather change every element of society except themselves.” Sometimes the book reads as a confessional, as Klosterman uses his self-analysis as a device to both explore the themes of the book and get the reader on side. The sentence following the quotation above: “This is how I feel.” This would all be well and good were getting the readers on side put to a good purpose.
Klosterman doesn’t use a traditional chapter system in breaking up the book; at least, I don’t think he does. There are essays separated by Q&As concocted by the former music journalist, but within those essays exists a paint-by-numbers system: 1, 2A, 2B, etc. Whether the device was supposed to confuse or clarify linkage for the reader is unclear to me. I didn’t pay it any mind; I just read left to right, the pages in order. I assume that’s how the book was intended to be consumed, and I found things quite well organized that way.
Organization and themes aside, the author typifies the phrase “quotable,” with such coinages as “Every fight is unique, but wars are always the same;” “I see a zebra, and I know what it is. But you know what I can’t see? How zebras look to a zebra. And that, I realize, is what matters most;” and
Cinema verite literally translates as “cinema of truth.” Herzog, of course, hates cinema verite, claiming it’s “devoid of verite.” In 1999, he wrote a ten-point manifesto titled “The Minnesota Declaration,” probably the only document in film history that attacks cinema verite techniques while complimenting Jesse Ventura. His essential point was that cinema verite provides “the accountant’s truth” and that cinema verite auteurs are like tourists. Keeping this in mind, I think it would be very interesting to see a Herzog movie about an accountant on vacation.
Back to getting readers on side. With this achievement comes some responsibility, in the sense that it is one thing to wax ironic about quotidian encounters; you can be off-beat, funny, and gratuitously cynical without much repercussion, but if you suddenly levy a heavy thesis in the last chapter of your book, you had better be ready to back it up. This is exactly the surprise twist that Dinosaur takes, and the backup never shows up to the shootout.
Just pages before, Klosterman quotes an incredibly insightful, and existentially important, point by someone named Slavoj Zizek. And it was incredibly insightful to select the quote itself. Klosterman is bright, and knowledgeable, but when it comes to serious philosophizing, his lack of rigour, not in presentation but in deep contemplation, becomes painfully obvious. Not out of nowhere, but certainly unexpectedly, we find the author sympathizing with the anti-technological views of one Ted Kaczynski, better known as the Unabomber. This in itself isn’t discrediting; it’s the fact that what he writes is a good insight into the mind of people confused by the world: “I believe all technology has a positive short-term effect and a negative long-term impact, and — on balance — the exponential upsurge of technology’s social import has been detrimental to the human experience.” That’s Klosterman, not Ted, for what it’s worth. In a lot of ways Klosterman is like his good friend, the wildly successful sports writer Bill Simmons.
They are both genuinely charismatic writers, and veritable repositories of popular culture factoids, but they aren’t intellectuals, and sometimes they get confused about this point. Klosterman knows a lot of things, but shouldn’t tell us what to think of them holistically. He knows the name of Martin Heidegger, and knows he is supposed to be some sort of fascist, so when he name-drops the phenomenological Jew-estranging philosopher on page 249, I suspect he thinks he will get away with giving readers who have heard the name of Martin Heidegger, and know he is supposed to be some sort of fascist, the impression that he knows what he is talking about. But Klosterman betrays himself as simply an entertainer by virtue of not having the discipline to assess reality, and to do the homework that requires. You actually have to read Martin Heidegger to even name-drop him, because if you don’t, you risk betraying your ignorance about his place in the intellectual scheme of things. In the homework department, Heidegger spent years in the Black Forest of Germany writing Being and Time. I suspect Klosterman wrote the last chapter of Dinosaur in one angst-filled weekend.
To keep things brief Klosterman writes: “Technology is bad for civilization. We are living in a manner that is unnatural. We are latently enslaved by our own ingenuity, and we have unknowingly constructed a simulated world.” His mistakes about the general susceptibility to disingenuous simulacrum of the average human are so obviously high art, New York music critic hipster, that I wonder if the whole last chapter is farce. The author never actually says why the state of affairs, such as it is, is unnatural, or why that’s a bad thing (since after all, in the state of nature, the average lifespan of man is 25 and the average death circumstances of man are violently violent). On pronouncements he is voluble and on explanations he is short. The ultimate result is well worded whining. Whining can be endearing, when done by those who would court Annie Hall, but when purporting vapid declinism, it simply spoils good books.
In my second year at my second university I lived in a porous fire hazard which, in the deep of winter, had a maximum internal temperature of 14 degrees Celsius. Doors closed, radiators cranked, double socks. There was at least one thing I looked forward to doing every day, sometimes twice a day, in this place, and that thing was showering. The bathroom could fit two people if one stood in the shower stall, and that was perfect, because by the time the hot water ran out, the small area was a veritable steam room, and the feeling had returned to the tips of my fingers. The only problem was getting back to my room; I would often burst through the door billowing steam in my wake, bolt through the kitchen and down the hall to dive under my bed sheets.
I didn’t like a lot of things about the place I lived in my second year at my second university; in fact, I didn’t like a lot of things about everything then. But I always enjoyed showering. And I enjoyed it unmitigated, independently of whatever had occurred that day, week, or month. My enjoyment of it was the same each time, and each time my experience was more or less the same. I liked my fingers warming up. I liked my body warming up. I liked how clean my body and hair felt. I liked how I had muscle memory for the bathroom. With my eyes closed I could drop trou, pull the curtain, and turn the knobs to the exact place within a 64th of an inch, all in a smooth half second. I knew exactly how much hot water I was working with, and how to gradually squeeze the knobs to get every last drop. It’s not the only shower in my life I’ve become symbiotic with, a master of the dials. I’ve always enjoyed the quiet privacy of the shower, even now, and I live alone. It’s always been that way, with wherever I lived, with whatever shower was there.
There is more than one sentence in Chuck Klosterman’s book, Eating the Dinosaur, but you will have to take my word for it: “But whenever people talk about their personal bouts of depression in the abstract, there are two obstructions I hear more than any other: The possibility that one’s life is not important, and the mundane predictability of day-to-day existence.” There is a lot to say about the first ‘obstruction,’ but this exposition regards the second.
Fight Club. A concept supposed to represent the lost innocence of a generation, the nihilism of today’s (or 1999’s) young adult. I imagine lots of people, when they were the age I was in my second year at my second university, quoted from the movie thusly:
"Man, I see in Fight Club the strongest and smartest men who’ve ever lived. I see all this potential, and I see it squandered. God damn it, an entire generation pumping gas, waiting tables – slaves with white collars. Advertising has us chasing cars and clothes, working jobs we hate so we can buy shit we don’t need. We’re the middle children of history, man. No purpose or place. We have no Great War. No Great Depression. Our great war is a spiritual war. Our great depression is our lives. We’ve all been raised on television to believe that one day we’d all be millionaires, and movie gods, and rock stars, but we won’t. We’re slowly learning that fact. And we’re very, very pissed off."
I’ve often said that delivery pizza is the greatest thing in the world. You know those people who still think capitalism is a system that imposes homogeneity on the masses? Have they ever thought deeply about delivery pizza? (Never mind the fact that these pervasive, authoritarian, homogenizing forces don’t seem to have any effect on them, on their sparkling uniqueness.) Tonight I clicked on my computer and got a medium pizza with sausage, pineapple, red peppers, and black olives delivered to my house in 25 minutes. At midnight. And I could have gotten almost anything. I could have ordered a pizza with ten toppings. No Pizza Pizza employee ever questions my topping choices; they just deliver exactly what I want, and the variety they offer is staggering, especially considering the ingredients come from around the world and are at my door in 25 minutes. Pizza Pizza makes some suggestions about what I might want to order, and often I make a calculation based on the affordability of their offers. After all, pepperoni is good too. Never do I feel tricked. Never do I feel exploited. Never do I feel like homogeneity is being imposed — they don’t force people to put mushrooms on their pizza. My desire to eat is not being manufactured. And the delivery guys are nice. We smile at each other and talk about the weather, or how the landlord at my apartment building never changed the buzzer to my name.
The quote in Fight Club almost insinuates that not having a great war, or great depression, is something to complain about. If you spent enough time listening to people who actually lived through the Great Depression, who actually fought in the Great War, you would realize not just how awful it was, but how awful they could have potentially become as a result of it. I’ve often wondered what human society would look like during utopia, or more exactly, the best possible world. I’m inclined to believe people would complain, a lot, because it is very, very difficult to assess how optimal things are when the best you can get is so far from perfect.
There is that saying, “you have to appreciate the little things in life.” Often it is said as if this has something to do with humility or contemporary stoicism or being chaste. But it doesn’t have to do with any of these things; it has to do with viewing the world accurately. Not with having perspective, mind you, because that word is pretentious. It is age-independent that we enjoy being clean, being safe, being warm, and having quietude, just as it is age-independent that hand-held half-melted cheese on dough with sausage and pineapple is delectable. But it isn’t age-independent that we have the luxury of showering every day, of ordering delivery pizza twice a month, or of speaking to any of our friends or family on a whim, whether electronically or on the phone.
The other day I was taking a bus back from Montreal, and the bus had wireless, and I have recently acquired an iPod touch. And you know what I downloaded? The latest Adam Carolla podcast, the last two episodes of Pardon the Interruption, and David Hume’s Dialogues Concerning Natural Religion. All free. The wireless was free, the podcasts were free, the book was free. And the information was coming from outer space. And like I often say to my friends, I said to myself then: we are living in the future. Louis CK speaks so well to this point that I’ve posted a video of him below. You might level the criticism that I am merely dispelling the contemporary technological version of the today-is-mundane thesis. Well, I enjoy the sun shining on my face too, and I enjoy it whether it rained or shined the day before.
This may or may not be utopia. But it certainly isn’t dystopia. And if you truly believe that capitalism, or the hours you are forced to work if you want to get paid, is getting you down, you should meditate on Andy Warhol’s observation that the President of the United States can’t get a better can of Coca-Cola than the bum sleeping under the bridge. What’s important about my listening to my favourite podcasts and reading Hume on the bus ride is that it was Pareto efficient, that is to say, I gained pleasure at the expense of no one and nothing. It is almost as if this better experience, this better life, sprang from the ether.
So we can see in both Klosterman’s experience and the Fight Club member’s thesis that two things are the matter: having no meaning in life and “the mundane predictability of day-to-day existence.” Of the former much has been said and has to be said, because it is an important question. On the latter let me suggest this: it is a red herring.
"The 69-year-old David Johnston was sworn in Friday as Canada’s 28th Governor General, taking over from the charismatic and wildly popular Michaelle Jean for a five-year term.
But from the scene-stealing kisses blown by the eight-year-old granddaughter at his side to the frequent moments of genuine affection between Johnston and his wife, it was clear that Canada’s new head of state sees the job as a family affair.
"I see my role as a bridge in bringing people of all backgrounds and ages together to create a smart and caring nation, a nation that will inspire not just Canadians but the entire world," he said in the speech that aimed to set out his vision."
Well, I’m not too sure what to say, apart from that this guy seems incapable of writing a speech or talking in a non-inane manner. We should definitely get rid of the Governor General position in any event. Who is going to rule on the dissolution of parliament? Let the Speaker do it. If you don’t like that, have an appointment process for someone out of public life, who actually doesn’t have any other duties. Who is going to represent Canada? I don’t know, the Prime Minister maybe, or maybe all the diplomats we already have. And don’t forget the Lieutenant Governors; these are the most useless people imaginable.
Monarchists of some stripes say probably the dumbest thing one could say: “But the Prime Minister will think he is the king; we need the Queen and her representative to keep their heads from getting swollen.” This notion is almost incalculably dumb. I don’t know if you have noticed how Prime Ministers carry themselves, but they give off the impression that they hold the highest office in the land. And yet somehow, the law, the media, the parliament, the voters, the bureaucrats, the lobbyists, other world leaders, market forces, environmental catastrophes, and a hundred other things manage to keep them from running buck wild in this country. Nothing changes when the Governor General stops existing; it’s like Ockham’s Razor, but for useless people instead of propositions.
I read a blog last week that argued Eminem was the best rapper right now, in part, because his lyrics were “powerful and meaningful.” The following could be interpreted as racist: it didn’t surprise me the writer was a well off non black man.
If you are like me, and other people think you are white enough to tell black jokes around, you’ll also have undoubtedly heard people say they like all music except rap and country, or some variation of that. The other one is that they either dislike rap, or like a particularly small segment of rap songs, “because they aren’t all about my hoes and my cash and my ride, bling and violence.” It’s funny when these people check to make sure there are no black people around before saying so. The songs they like, supposedly, talk about something deeper, have social commentary, or carry personal meaning. These people are pretty stupid in my opinion, and more accurately, they are pretentiously stupid.
There is one very good reason that “meaningful” should never enter into the discussion of music’s merits: music isn’t full of abstract meaning. It doesn’t convey deep concepts. I’ve never heard a song whose lyrics really explained anything to me. With many songs I liked growing up, I couldn’t even understand what the singer was saying.
One can see why people, particularly educated white people, think their music is meaningful: it is a vague litany of emotion-provoking lyrics sung emotively on top of exciting, climaxing swells of harmony and sound. Open-ended lyrics are the key to the perception of sublimity. Because of the distribution of music to each individual radio, computer, and car, and the personal nature of the experience, people make the music they hear their own; they relate it to their personal lives, or simply to their emotional state.
That’s all well and good. Coldplay has made a ton of money on this incoherent, but ultimately therapeutic, response. Make no mistake, I would never argue that people should not enjoy something they enjoy. And music should be enjoyed. But what it should not be judged or classified by is its meaningfulness, its implicit deeper meaning. Maybe “Green Eyes” is meaningful to you, and maybe you lost your virginity to “Yellow,” but a philosopher in Chris Martin that doesn’t make. It doesn’t imply he knows anything about, say, free trade versus fair trade. I suspect a ton of people who ‘find meaning’ in all the guitar-centric angst-expression people are listening to are the kind of people who get agitated when people use the word ‘metaphysical’ in a sentence and turn their brains off.
See, the places to actually find deeper meaning are elsewhere: places where people actually explain what they mean by what they are saying. This happens in all kinds of places: on the radio, in podcasts, magazines, books, and even on television (though not many people watch TVO). Here is an incredibly pretentious statement (speaking of pretension): you don’t need to go to Coldplay for deeper meaning in your life if you’ve gotten through Plato, Montaigne, and Camus. Or Shakespeare. And therefore Coldplay won’t inhabit the section of your cultural experience where you look to rectify how lost you are in the world outside of your brain. Music becomes what it is, part of play. Something to dance to, something to run to, something to kiss to, something to blast in the car while happily bobbing your head.
There isn’t a lot of meaning to be found in rap. You know, there are things to say about the culture that produced rap, what it tells us about urban culture, drug dealers, etc. Just like studying Coldplay can tell us about people who don’t understand Woody Allen movies. But that’s a separate type of meaning, namely, the kind unintended by the artist. I almost forgot another point: most musicians don’t have anything intelligent to say at all anyways when you get them in a medium in which they have the opportunity to say something meaningful. This shouldn’t surprise anyone. Because they have a talent for transforming their inner emotive state into sound, or simply excellent and well-worn muscle memory on something that makes noise, you should listen to them about the socio-economic implications of China’s newest Tibet policy? Or they know something about god?
You know though, one would think it difficult to create a never-ending supply of ways to express the thought “I get money.” And yet there are new rap songs coming out all the time, with new rhymes, new wordplay, and commonly new words. It is creative. It is hard to say you get paid and have all the chicken heads after your belt in a funny way when it has been said over and over again. Music is candy, and it would be a mistake to try to dissect the meaning of candy. Much less take deep meaning from its consumption! It tastes good and too much of it is bad for you.
I’m going to repeat this in case you only read it once and/or were put off by the preceding pretension-filled paragraphs. The places to actually find deeper meaning are elsewhere: places where people actually explain what they mean by what they say. Try it on. If you can get over looking up words like metaphysical on dictionary.com every time you see a word you don’t know, you’ll find the conceptual clarity intoxicating.
I’ve started to read Chuck Klosterman’s Eating the Dinosaur.
So far, there have been sentences like this:
"If given a choice between interviewing someone or talking to them "for real," I prefer the former; I don’t like having the social limitations of tact imposed upon my day-to-day interactions and I don’t enjoy talking to most people more than once or twice in my lifetime."
“It’s fascinating and stupid to watch adults destroy things on purpose…. People wreck guitars to illustrate how important guitars are supposed to be, aggressively reminding us that these are the machines that kill fascists. Sadly, this axiom has proven to be mostly inaccurate; according to the most recent edition of the World Book Almanac, the number of fascists killed in guitar-related assassinations continues to hover near zero.”
My favourite so far might be this:
"Yet there’s something perverse about high-profile public altruism: it always feels like the individual is trying to purchase "good person" status with money they could never spend on themselves anyways. Oprah is doing something good, but not necessarily for the motive of goodness. And the motive matters."
So far he’s talked a lot about how silly Kurt Cobain was in his ideas and actions, and why people answer an interviewer’s questions. One thing is clear: he can really write, and he more or less knows it. But no pretension comes through; it’s more of a laid-back witticism that also takes the form of trying to entertain the reader (in a good way). In other words, he is trying to write for the enjoyment of his audience, both in the words he chooses and in the exploration of the topics he is discussing.
There is a pretty great rap song by Drake, featuring Lil Wayne, Kanye West, and Eminem, called Forever, that has at least one suspicious lyric. Kanye says at one point, “trade my Grammy plaques just to have my granny back.”
Wait, what? Shouldn’t it be obvious that he would trade Grammys for his grandmother’s life? I don’t know if it’s just a total lack of perspective, or the fact that, as an ardent Jesus freak, Kanye probably thinks his grandmother is waiting for him on the other side, and thus undervalues having had her here and having her here henceforth. Anyways, he does kind of redeem himself with the line “You’d think I ran the world like Michelle’s husband.” Lil Wayne and Eminem also kill their verses. Enjoy:
There is a lot of stupidity on the internet, not just bad ideas, but racism, trolling, and general idiocy. Some believe this is a result of the medium itself, and it is true that anonymity shields some who say what they otherwise might not. But what if the internet simply allows the general populace to reveal itself, as it is, to a wider audience? Half the people are dumber than the median intelligence, but what might be disturbing us about the internet is how low that median is. You know, in order for someone to even take a standardized test, or an intelligence test in the first place, they are more likely to be smart or educated. There has been no event in our civilization’s history in the past two thousand years that suggests we are civilized now whereas before we were not. It’s an ongoing process. I don’t know what that fallacy is called (I’m sure someone already named it), but something like the modern-times-are-now fallacy fits. And so the immaturity, the lack of lucid thought, the prejudice, the emotional hysteria, these things we see reflected in the comment pages on the internet (not to mention the nonsensical articles themselves) aren’t necessarily some distortion of society, bringing out the worst in people because of the medium. It might just be that people are that base.
It’s kind of like why I argue racists shouldn’t be expelled from liberal arts universities for so-called ‘hate crimes.’ Some US university expelled three students for hanging a racist banner from their residence windows. First, I can see some administrative moron saying “there is no place for people like that at [insert name of upstanding institution].” On the contrary, isn’t said upstanding institution exactly the place they should be? If it’s a model of how to train young upstanding citizens and all, won’t 18-year-old racists turn out to be 28-year-old racists if expelled, and 28-year-old valued members of society if put through the rigour of your transformational four-year liberal arts degree? Moreover, the university is a place of free speech. These types of speech acts, hateful as they are, should be allowed to take place. They should not even be discouraged. Why? Because if the university in question is actually a place of free speech, a marketplace of ideas if you will, then the product the racists are peddling won’t sell in a market that offers better products. In other words, students, teachers, and administrators, if they are so smart and articulate, will speak out against that type of speech act. One problem, of course, is that administrators, for one, aren’t smart or articulate, nearly ever. Especially administrators at ‘prestigious’ places.
Anyways, students should be allowed to make hateful and racist speech acts at a university with disciplinary immunity, because it is their right, because such speech will be spoken against and lose in the marketplace of ideas, and because, as a benefit, it reveals the true composition of the university’s populace (or society in general). I’d rather have racists speak up so I know who they are and how many of them there are, instead of having them be silent and discriminate secretly as best they can. And as far as slippery slope arguments go, don’t pretend the law can’t tell the difference between hanging a banner that says “Fuck Niggers” and harassing or assaulting someone. Allowing the former doesn’t impair the ability to arrest people for the latter.
We want people to open their mouths and remove all doubt that they are morons of one variety or another. It allows us to more accurately view the world.
There is a lot made of the fact that genetically (and therefore anatomically, etc.) we are basically identical to the Homo sapiens sapiens that spent eons evolving for the rigours of a hunter-gatherer nomadic existence. But it is pretty unclear what this implies, and in lieu of a comprehensive empirical understanding of our genes, their uses, and their history, speculation runs rampant.
Many people posit or state explicitly that our civilization’s supposed willingness to risk total ecological catastrophe is based on this genetic pre-programming. We are after the short fix, after all, and have no concept of the long term. These people have no idea what they are talking about for many reasons, but the clear one is this: the actions of individuals do not by necessity mirror the actions of a group of individuals of that type. There are plenty of ways to think of this in a homespun way, namely that the whole is greater than the sum of the parts. But more generally, it is that there exists such a thing as system effects. The trajectory of each water molecule is chaotic, yet the water sits unmoving in a cup. In a well defined free market, individuals pursuing their own interests create mutual benefit.
In other words, the behaviour of all humans in a group will not by necessity yield a group of humans that typify the values associated with that behaviour.
What might be in the best long term interest of a group of humans is to have all individual humans concern themselves with their short term interests. I say might because there is no necessity; it might be the case, but so might the opposite. It might be that our long term interests are best served by the vast majority of individuals devoting most of their attention to their selfish short term, with an eye on their selfish medium term and their family’s or group’s medium and long term, while a small minority of foresighted individuals pay some extra attention to the entire group’s long term.
There are no universal rules that dictate what the best mixture is in terms of who is looking out for whom, and over what length of time in the future.
Anyways, one consequence is clear: our genetic predisposition for short term solutions doesn’t necessarily doom us. Beyond that, speculation can persist.
But what strikes me is our ability to train away deficits and become cognitively adapted to our environments. How the rationalism of the prefrontal cortex can feed on itself to govern decisions previously dictated by rash emotionalism. But overcoming being an animal isn’t always easy. It is relatively easy to remind yourself not to get upset at some base, carnal event. It is hard to remember to remind yourself.
Why don’t I always put my keys in the same place when I get into my apartment? Not just because of the structure of the event: I am getting home from something tiring and I am incentivized to sit or lie down as soon as possible. But also because I forget to remember to put them in the same place. Remembering to remember something is very difficult. I think that’s the reason why I still have conversations with people where I get angry. It’s not that I’ve never kept my cool before, realized how well that went, and remembered it. It’s something else.
Does the fact that we were hunter-gatherer nomads undergird my capacity to do that? I suspect so, but I could be very wrong.
My opinion on the “ground zero” Mosque is pretty easy to come by. I’m not in favour of any new churches, synagogues, or mosques, so I’m not in favour of this particular new one. But since it should be legal for a church to be built without extreme location restrictions then it should be legal for a mosque to be built without the same location restrictions.
What can we learn from this hoopla? Well, the rule of a dichotomy not implying a good side is definitely in effect. Both the people who want the mosque built and those protesting it can be wrong. The people who want it built, as Tarek Fatah argues, know it means something more than they claim it does. The people who don’t want it built certainly have their share of xenophobic racists amongst them, and religious ideologues vis-à-vis my god is better than your god.
That being said the converse is also kind of true. Those who defend the mosque being built are right to say that there should be no legal restrictions on such an act, in a Mill-esque constitutional pluralistic democracy at least, and those who are against it are right to say that there was a strong causation, not just correlation, between the fact that the hijackers were Muslim and the fact that they guided planes into buildings.
As a side note, the funniest development was the attention-grabbing by some Christian priests in terms of burning the Koran. How many Korans can we burn!? The Koran salesman just loves hearing that talk. It is unintuitive to some that the fact that we slaughter and eat so many cattle actually ensures the survival of the species, or that when we don’t recycle paper and use paper to write on we ensure the maintenance of new forests (since very few ancient forests are actually in peril these days), because industry has a strong interest in maintaining its business. In short, burning Korans after purchase will create more Korans, because it is the act of acquiring Korans that shapes their population, not what you do with them afterwards. The symbolism of the act was never really on their side either.
You agree with Hobbes that the default state is abject poverty and go on to say you are uneasy with the perverse results of the capitalist system. But earlier, you write about how much tax you pay on a car and at first are outraged but then realize you would pay that amount on one large purchase or a number of smaller ones. You say the correct response is not outrage, but Zen-like tranquility. So, if you understand all this about the state of nature and the results of capitalism, should you really feel uneasy? Why not Zen-like tranquility?
I’m not that uneasy.
So are you establishing your leftist credentials with that kind of talk?
(Laughs) I have a wistful desire for economic arrangements that involve less adversarialism and less cultivation of people’s more antisocial impulses. It would be nice for the sake of the common good if we didn’t have to rely so heavily on competition. Particularly being so culturally close to the United States; American culture cultivates many antisocial attitudes and behaviours and relies heavily on the market to channel that into socially beneficial activities.
I am not cut out for business, for adversarial relationships. I don’t enjoy engaging in straight up market transactions, negotiating price on things and so forth. I would be much happier if we had an economic system that relied less on that. At the same time, I have an appreciation for the fundamental virtues of capitalism.
I really think, for the left, you have to have a clear blueprint for what you’re planning as an alternative to capitalism. Any sort of blueprint that someone presents is going to have the market as a central coordination mechanism, so you may have a kind of market socialism or welfare capitalism. Anyone who thinks seriously about the question realizes that you are going to have to have the market as a mechanism because you have to have prices, because you have to know about scarcity.
There are a lot of people who aren’t serious about the question then. I run into them all the time.
Yeah, I agree.
You get two versions of it. Almost no one is willing to come right out and say central planning. Typically people will say some kind of democratic community-based corporate cooperative kind of thing. That’s just b.s. You aren’t going to figure out the price of tires, much less the price of everything in your economy with a community of associated producers. Those plans are typically put forward by people who have no theoretical understanding, but also no real life experience with trying to get things done.
I read this book by Michael Albert called Parecon that has this idea of a participatory economy. Often, the people who start with the community of associated producers drift to stuff like Parecon. The view is actually a version of market socialism but it relies upon an auctioneer instead of a market. There is this idea that you can simulate the market with an auctioneer. A lot of participatory and democratic economics stuff, when you look at it a bit more carefully, is relying on an auctioneer and so is, in fact, planning on simulating markets instead of having straight-up central planning. That’s the only way anything could ever possibly work—you drift out of democratic decision making, which is just a non-starter. The auction stuff has a huge number of problems as well. There is a hundred-year literature on that question.
So when it comes to these alternative proposals, do relative wealth and inequality even matter past a certain lower bound of material well-being? Leftists are always telling me that money isn’t everything.
I actually think it does. In political philosophy, there is a split between so-called sufficientarian and egalitarian views. A lot of people are sufficientarians: namely, they think that once everybody is above a certain threshold, everything is fine.
But you have justice concerns.
Yeah, I’m more sympathetic to what’s called prioritarian: concerned about efficiency but also concerned about equality. You tolerate a certain amount of inequality because it makes everyone better off. You have a preference for more egalitarian over less egalitarian distributions. Personally, I think that equality as a norm is basically about the minimization of social conflict. What gives you a fundamental philosophical argument for equality is that when you have equality, no one wants to switch places with anyone else. If you have a proposal ‘hey how about we do it this way’ and no one wants to switch places, then you have disarmed one source of social objection. Anyone with kids knows this: as soon as you have an inegalitarian distribution, someone says “how come that person gets this?”
That seems like pragmatic reasoning.
It’s social-contract reasoning. The difference principle is a way of reconciling inequality with people’s intuitions – at least that is one way of interpreting the difference principle. The background view for me is social contract, which is that social order relies on people voluntarily playing along. Coercion plays some role. For the most part, institutions break down unless the majority of people voluntarily comply. You have to have a broad-based willingness from people to play by the rules of the game.
For example, taxes are almost entirely voluntary. There is coercion and enforcement at the limit, but if the population en masse stopped paying taxes, there is nothing the state could do. People have to be willing to play along, and in order to get that, you have to get agreement. A norm like Pareto efficiency generates agreement. If everyone is better off, then what’s the problem, right?
The same happens with equality: if no one wants to switch places, then nobody has a reason to vote against it. I think efficiency and equality are important norms. They are important for generating agreement and compliance, regardless of how wealthy people are. Because people are envious—people are always looking over their shoulder. While not wanting to say that equality is just a product of envy, the fact is that people have a lot of concern about their relative position, and you will get social conflict arising out of extreme inequality, regardless of how well off people are. As proven by the fact that in our own society, we still complain about things, despite being wealthy beyond the dreams of avarice.
There is a statistic I find astounding: that in the twentieth-century, more goods and services were produced than in all previous human history combined. That one generation that lived over the course of the twentieth century
Was pretty opulent. (laughs)
In the west or whatever, they consumed more than all humans in all parts of history … and yet they were still unhappy! (laughs)
Some of them were very unhappy.
In the book I came across the following terms: “moral sensibilities,” “moral intuitions,” “moral concerns,” “deeply felt moral convictions,” “moral qualms,” “moral reasoning,” “moral principles,” “morality,” and “social norms more generally.” These terms were all used in sort of the same way.
When I talk about moral qualms, it is to suggest, you know, what string is being plucked with people. The reason they get upset about something is that it violates a moral sensibility.
I have a quotation in a footnote by James Q. Wilson, where he says there is a sort of pathological tendency whereby people will ignore some kind of pragmatic consideration because of a moral qualm, even when the moral issue at stake is very small and the pragmatic question is very large. For example, let’s talk about electricity supply and subsidization. There is a moral concern about the poor not being able to pay their electricity bill and a pragmatic concern about generating price distortions throughout the entire economy that systematically distorts every decision made by consumers, that systematically distorts industrial production, and that generates massive environmental problems. You have this massive pragmatic problem generated by fiddling with that price and by comparison a relatively small moral issue that could easily be addressed by giving people a tax rebate or something and yet people will hang onto that moral issue and ignore the pragmatic one.
Look at the NDP in BC coming out against harmonization of PST and GST, which is insane. The attitude is “GST is regressive.” Too much of the left has adopted this mantra that VATs [value added taxes] have to be opposed because consumption taxes are regressive, despite the fact that all European welfare states rely upon VATs, usually on the order of 17, 18, 20 per cent. The left should be uniformly supporting VATs. Furthermore, having two parallel tax systems, a provincial and a federal, is insane, in terms of the inefficiencies. Harmonization is absolutely a good thing. Yet the same response always, “it’s regressive.”
Campbell probably should have offered to have a combined tax at a lower rate. The reason they aren’t doing that is that they wanted to provide subsidies to the poor to counteract the regressive tax. So there is literally no basis for opposition to that policy. Then people say, “Ah, it’s regressive.”
It is slightly misleading to say there is a moral question and a pragmatic question. The pragmatic question is a moral question: it has to do with the impact on people who are further removed from the situation. Often, it is a matter of not going with that first moral response but looking at the big picture and at all the moral considerations.
You write that the book doesn’t have a happy ending. I was confused by that. I agree that it is going to be very difficult to dispel these economic fallacies. But then again, as your first book [The Efficient Society: Why Canada is as Close to Utopia as it Gets] tells us, apparently we live in as close to utopia as it gets. How could it not be a happy ending with everything we have, despite people being so oblivious?
That was partly a response to criticism of The Rebel Sell. The commonly heard refrain was “I like this book but it doesn’t tell us how to solve the problem of consumerism.” We added a postscript to the Canadian edition saying “That’s because we don’t think you can.” The Rebel Sell showed consumerism is not some kind of ideology, or invented by capitalism, but is a reflection of the fundamental competitiveness of the human situation. I thought that would make it obvious you can’t solve the problem, but a lot of people still took us to task for not having promoted a solution. We have become habituated to this idea that nonfiction books should propose solutions, and I wanted to deflate that expectation.
I often hate that section of books. You read a book that has really astute critical analysis of this serious social problem. Then they say “what can we do about it” and it is like “get involved, write to your congressman.” (Laughs)
Yeah, here is a list of websites. (Laughs)
American ones are particularly bad, just like milquetoast Democratic Party kind of stuff. Vote Democrat. Do the best you can do.
Or they will have some abstract cultural remedy like “we need to change the culture that tells young people it is ok to…” Yeah, how are you going to do that? If anyone knew how to change the culture, they would have done it a long time ago.
Also, it is collaborative. You have to have a lot of people appreciate how big the problems are, working collaboratively to hammer out solutions. A lot of economics teaches you what is not going to work. Thinking about what will work requires creativity. Then economics is a form of mental discipline, whereby you come up with a creative idea about how to solve a problem and then you look at it more carefully, following through all the consequences, working out what the equilibrium will be.
For example, when you have some kind of a blueprint for an economy’s fundamental transformation, you have to ask, “How are you going to deal with black markets?” This Parecon stuff, it’s all about the ideal community and how it is to be organized. Michael Albert, who is the architect for it, has obviously never thought about black markets and exactly what he is going to do to prevent them from developing. He assumes everyone is going to be on board.
Is it going to be in equilibrium? Will you be generating incentives that will give everyone reason to comply or are people going to have an incentive to go around the rules? For example, if you propose any kind of replacement for the market, you have to wonder whether markets are going to re-emerge spontaneously. The Soviet Union had tons of black markets. How are you going to deal with that? Are you going to stamp those out with the police? Because that doesn’t work.
And it is not coercive at all.
There is this kind of creative process for thinking up ideas and then thinking through the consequences of them, then thinking about perturbations of those. If things don’t go according to plan, where is it going to go?
I often think appreciating how hard the problems are is the best you can get.
“I’d venture to say that pretty much everything that the average person thinks he or she knows about how the economy works is wrong,” Joseph Heath posits near the beginning of Filthy Lucre: Economics for People who Hate Capitalism.
If you were expecting a dry lecture from a stuffy economics professor, you will be pleased to find out that Heath, co-author of the cultural myth-busting The Rebel Sell, doesn’t have a degree in economics. In fact, he admits to not even remembering how to do calculus. Best of all, Heath isn’t an ideologue; he’s a philosophy professor and a frequent dispeller of fallacies. In Filthy, he unravels six economic fallacies held dear by the right and six held dear by the left (the main audience, as the subtitle indicates).
There is a promotional quote on the front of Tim Harford’s popular 2006 book The Undercover Economist which claims “Reading this book is like spending an ordinary day wearing X-ray goggles.” Filthy offers harsh moral criticism of Harford, as well as the authors of the lauded Freakonomics, Levitt and Dubner, before moving on to carefully dismantle the intellectual positions of the left. Readers might just wake up the next day to find the goggles were actually of the beer variety.
The work’s overwhelming strength is the clarity of Heath’s explanations. Those laymen scared of economics – who say they just don’t “get it” – will find themselves empowered and refreshed. In addition, said lucidity presents some unconventional claims: that money-grubbing corporations are a kind of co-operative; that government-run health care is more efficient than private health care, all other things being equal, because of mandatory universal membership; and that “The best feature of capitalism is that competition forces private firms to act exactly as they would if they were governed directly by the decisions of socialist planners.” Whether you want to begin, or restart, your economic education, Filthy Lucre, written in eloquent and often humorous prose, is an ideal starting point. I caught up with Heath, most fittingly, at a Second Cup in Toronto.
It seemed to me that Filthy was a pretty logical continuation from Rebel Sell.
Yes. The Rebel Sell was a long applied version of the kind of concerns I had in Filthy Lucre. You start with an economic fallacy from which all kinds of beliefs get spun. In The Rebel Sell, it is the overproduction fallacy: the idea that we have to manufacture desires, create homogeneous tastes, that consumerism is a system of homogeneity. All that cultural theory of the countercultural left rides on the back of an economic fallacy. It shows how one error of economic reasoning –– a pretty basic error –– led to a huge subdivision of academic inquiry. Often I say to people, that’s just one of them; I try to categorize the four or five others on the left. A lot of people who reviewed it liked the fact that it was even-handed, but that wasn’t the intention.
That’s how I sell it to my leftist friends.
It is aimed at the left; the first half was to show that economics is not just right-wing ideology. When I was an undergrad, that’s the impression I got. The idea behind presenting the right-wing fallacies is to butter people up but also show there are a set of principles that cut both ways. A lot of cherished right-wing ideas haven’t survived scrutiny from twentieth-century economics. Everybody has to take some water in their wine before they start thinking systematically about the economic consequences of their ideas. The core of the book is to show the economic fallacies people on the left commit. The Rebel Sell was one version of that.
You argue “each sector, every market, every public policy proposal” has to be judged on its own merits. Given the adverse selection of our politicians, how are they supposed to handle such a nuanced task?
The point was that there is no general presumption in favour of the market a priori. You really have to look at the details of everything. Take introducing market mechanisms and competition to the education system, for example. I’m in the business of providing education. I know how severe the information asymmetries are there, how opaque education is to students, and to their parents even more so.
In Canada, it is particularly weird because you have public universities taking out full-page ads, competing with each other. That shouldn’t happen. It is just a waste of money on advertising. I did my PhD at Northwestern, a school in the States with a ton of money but not top-tier status.
They have a good lacrosse team though.
And they had a good football team one year. When the football team got into the Rose Bowl, the school’s national ranking went from 17 to 11. The football team’s success improved the applicant pool and the applicant pool’s SAT scores, etc. You know, university administrators aren’t stupid: they know how to juice their stats. They had just doubled the amount of money spent on grounds maintenance. It’s a beautiful campus. When parents drop Johnny off at college, they don’t attend the lectures and they certainly aren’t in a position to evaluate the quality of what’s being taught. But they can see the ivy going up the side of the walls. Competition has this capacity to go off the rails and make things worse. There is no question that, in the absence of competition, you get a certain sluggishness –– no one is particularly motivated to improve quality. If you start a poorly structured competition, you could motivate people to go out and do things that are useless. You generate perverse incentives that siphon money out of what is worthwhile. So you overcome sluggishness but at the expense of…
Exactly, you are pouring money into something like the peacock’s tail, a totally symbolic display. You get that all the time in education. The point is there is no abstract presumption the market is going to be better than government. There are some powerful reasons for thinking the market will often be better. But you have to make the case. You really have to get down to the nitty-gritty and explain how the competition is going to be structured in such a way as to generate beneficial outcomes. But that’s not the job of politicians: that’s the job of the civil service.
In general, should we favour empirical experimental economics and save the theoretical rationalism for later?
Experimental versus theoretical economics is a separate issue. In this case, you are still working theoretically. You have to have a model.
I know someone who does auction design. New Jersey wants to privatize their electricity supply: they have to create an auction for how the clearing house for electricity is going to work. They hire game theorists to come and design an auction. Now, they are working with a highly idealized model of rational actors. But if the auctions are well designed, they work the way they are supposed to. That’s a totally rationalistic model. Before you privatize, someone has to come in and show you in detail how that competition is going to work. If you can’t show you can have a well-designed auction, then nothing can be gained, and you shouldn’t privatize.
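To make Heath’s point about idealized mechanisms concrete, here is a minimal sketch (my illustration, not anything from the interview, with made-up bidders and numbers) of the textbook sealed-bid second-price, or Vickrey, auction: the design game theorists reach for precisely because, under the rational-actor model, bidding your true value is a dominant strategy, so a well-designed auction “works the way it is supposed to.”

```python
def vickrey_auction(bids):
    """Highest bidder wins but pays only the second-highest bid."""
    # Rank bidders from highest to lowest bid.
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner = ranked[0][0]
    price = ranked[1][1]  # winner pays the runner-up's bid, not their own
    return winner, price

# Hypothetical bidders reporting their true valuations.
winner, price = vickrey_auction({"A": 10, "B": 7, "C": 4})
# A wins and pays 7. Shading A's bid anywhere above 7 changes nothing,
# and bidding below 7 only risks losing a profitable win — which is why
# truthful bidding is a dominant strategy in this design.
```

Real electricity-market auctions are far more elaborate, but the logic is the same: the mechanism’s properties are proven inside the idealized model before anyone commits to privatization.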
Last time we talked, you said you were indifferent between a carbon tax and an emissions trading system. When I was reading the book, I thought ‘well he is definitely in favour of a carbon tax now.’ Then I read your interview in the Walrus, where you seemed resigned that the solution will have to be technological.
Right but you aren’t going to get the technological solution without a carbon tax.
There is a great paper on this issue by Philippe Menanteau called “Prices versus Quantities.” The whole decision between a tax and cap and trade is just whether you want to affect quantity by changing price or affect price by changing quantity. Your typical supply and demand graph is price versus quantity. The easiest way to understand cap and trade versus a tax is whether you want to work on the x axis or the y axis. The policies are interchangeable at that level.
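The “x axis versus y axis” point can be sketched with a toy calculation (my numbers and my assumed linear demand curve, not anything from the interview): a tax fixes the price and lets the emitted quantity adjust along the curve, while a cap fixes the quantity and lets the permit price adjust. Targeting the same point on the curve either way yields the same price–quantity pair.

```python
# Assumed demand for emissions: q = a - b*p (purely illustrative numbers).
a, b = 100.0, 2.0

def quantity_under_tax(tax):
    """Fix the price (y axis) via a tax; read off the quantity demanded."""
    return a - b * tax

def price_under_cap(cap):
    """Fix the quantity (x axis) via a cap; read off the implied permit price."""
    return (a - cap) / b

tax = 20.0
q = quantity_under_tax(tax)   # 100 - 2*20 = 60 units emitted under the tax
p = price_under_cap(q)        # (100 - 60)/2 = 20 — a cap of 60 reproduces
                              # exactly the tax's price: same point, two axes
```

In this frictionless model the two instruments coincide; the differences Heath goes on to discuss are all about implementation, such as whether permits get given away.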
In terms of implementation, I gravitate towards the tax because politicians have shown they will take the permits in cap and trade and give them away. That’s what they are doing in the States. That creates economic distortions. Existing industry gets a free pass; new industry has to pay to acquire permits. It defeats the purpose of ultimately making people pay for carbon emissions. A tax does that in a straightforward way.
Taxes can be used badly as well. Sarkozy in France is, underneath the European cap and trade system, introducing a carbon tax that is just a fuel tax. Whereas what they need to do is literally tax carbon. That means taxing things like cement, not just gas at the pumps.
I don’t think you can solve the carbon emission problem by having a global regime of pricing carbon, because the collective action problems are insuperable. You need to make brown power sufficiently expensive that green power becomes competitive. Then you will get technological innovation. You have to remember that gas is cheaper than water, than coffee. The experience that brought this home to me was that I had a battery-operated weed-whacker and it was a piece of junk. Big battery, eighteen-volt weed-whacker, and the thing barely works. You get about fifteen minutes of juice out of it and, when it dies, it kind of caresses the weeds. So one day my wife got pissed off, went to Canadian Tire and bought a seventy-dollar gas-powered weed-whacker, and
Now you are whacking weeds!
With this tiny gas engine, little gas tank, you have power to spare, man. It’s an unbelievable difference. It must have had six or seven times the power from a tiny amount of gas. That’s why it’s hard to compete against fossil fuels. There is so much damn energy in this stuff and you dig it out of the ground.
Almost all of our sources of energy are nineteenth-century technology: electricity, fuel, internal combustion engine, even solar cells. The only genuinely new twentieth-century technology was nuclear. So compare technological innovation in say, computers, to energy and the twentieth century was absolutely stagnant in the area of energy. Now why is that? Well one very plausible explanation is that you simply can’t compete with fossil fuels.
It is too good.
So cheap, so much energy, why would you bother trying to invent any other source of energy? That is still the case, even though oil is more expensive. A carbon tax makes it a little bit easier to compete. As oil gets more expensive, you have a motivation for technological innovation. A bolt of lightning has enough energy to power a city for a week; there is plenty of solar radiation bombarding the planet. People are ingenious—it’s just that there has been no incentive for technological innovation. When oil gets up to US$200 a barrel permanently, then there is going to be a massive incentive for innovation.
In chapter six [“Personal Responsibility”] you write “This ‘blame it on the circumstances’ response has become almost an automatic reflex for the left. Indeed, much of what it means to acquire a ‘progressive’ education seems to involve learning how to take any sort of self-defeating, irresponsible, offensive, or just plain antisocial behaviour and explain it away as the effect of poverty, sexism, racism, or some shortfall in the achieved level of social justice.” After that, you say there is a kernel of truth in those explanations. So if we can’t dismiss them outright, how do we have a dialogue with those ideologically committed to them?
There are a lot of cases of this. Take something like teenage pregnancy.
The impulse on the left is to say that it is a product of disadvantageous circumstances and that therefore the way to address the problem of teenage pregnancy is to address the disadvantageous circumstances. There is an element of just wishful thinking in that. Anyone who looks at it dispassionately, it seems to me, can see clearly that there are cultural factors at play there and choices being made. There is an aspect of the conservative critique of the so-called culture of poverty that is clearly correct. I think that in Canada we can say that more without getting into trouble, because elements of the culture of poverty are absent in Canada. When you look at the States as a foreigner, you see there are cultural factors that are reproducing poverty in certain areas. In the States, you just get pilloried for even suggesting that, because it is considered a conservative view. And if you think there is anything to this culture of poverty stuff, then you are just blaming the victim.
The problem is there is a feedback relationship between disadvantageous circumstances and the culture of poverty. It may be the case that this culture evolved from disadvantageous circumstances thirty years ago or whatever, but there are many ways in which the cultural pattern makes it impossible to address the circumstances without addressing the cultural pattern as well. You can’t just give people money. It may be because of racism, but you can’t get rid of the racism as long as you have a pattern of fifteen-year-olds getting pregnant, two thirds of kids being born out of wedlock, that kind of thing, which generates negative social attitudes and is economically totally catastrophic. You can’t address the underlying issues, assuming they are underlying, without addressing the cultural issues as well.
There is an element of the conservative view which, as an explanation of where the problem comes from, is in my mind self-evidently correct. It seems to me you just have to have the nerve to say: yes, the explanation is correct; the proposed remedy is wrong. That’s what I try to show in the chapter: a lot of conservative views that blame the poor are correct as explanations. The poor are agents just like you and me, and they make choices, and often they make very, very bad choices that generate a self-perpetuating trap of poverty. It is not entirely forces beyond their control.
The left needs to have the nerve to say that the explanations provided by conservatives have a great deal of truth. But the proposed remedy isn’t really a remedy—it’s just piling on. Conservatives want what amounts to kicking people when they’re down.
The programs I admire –– Oportunidades in Mexico, Bolsa Família in Brazil –– what is attractive about them is that they restructure people’s incentives temporally. Your kids should go to the doctor, to school, and so on. Everybody understands that. The problem is that the benefits come ten years down the road. Rather than making the downside more catastrophic, which is what conservatives typically propose, you need to bring the incentives into the present: you get an incentive at the end of the month for sending your kid to school. That is an example of accepting a conservative explanation while leaving it open what you are going to do about it. The left can propose more effective, more humanitarian remedies.
That’s the title of Susan Jacoby’s latest article on Big Questions Online. It is quite good, but then again, I would say that, having written what I already have.
Jacoby uses the example of Ayaan Hirsi Ali, and a good example she is. But there is some room for criticism there. Jacoby repudiates those who would claim that Hirsi Ali and Salman Rushdie have overstated the threats to their lives that their work has engendered, and thus overstated the threat that multiculturalism poses to liberal democracy and the shroud of western enlightenment values that surrounds it. I think that repudiation is well founded, but it isn’t my impression that there is still a ton of debate over the dire nature of either individual’s affront to Islam. And while the current work of Rushdie (his latest novel received copious acclaim) and Hirsi Ali is of import, they have become overused as examples go, when new stories of their kind are happening all the time. After Rushdie published The Satanic Verses, there were people who said he deserved it, or that he shouldn’t have provoked Islam, just as people said that the Danish cartoons shouldn’t have criticized Islam. Well, I don’t hear those people speaking up much nowadays, since it’s pretty obvious they are essentially endorsing violent reprisals to fair criticism of totalitarian ideologies in the name of getting along. It’s the non-domestic version of telling a woman to know her place lest her husband ‘be forced’ to beat her with his belt until one of her ribs cracks. Why did she have to speak up, after all, when things could have gone smoothly?
This isn’t the same as my example of Florent Lemacon challenging the pirates of Somalia who dared defy him in fulfilling his dream. People are supposed to be able, in a free, first-world, western liberal democracy, to express in writing, speech, or film criticism of any variety and on any topic. We don’t even need to get into the nitty-gritty of free speech restrictions on that point. Most states ensure that right to the fullest. Moreover, if there is any group worthy of criticism, it is a group that promotes and endorses the removal of a young girl’s labia and clitoris through whatever barbaric means possible. Anyway, I don’t hear many people defending the rioters anymore. Whether it is against the tenets of Islam to portray the prophet Muhammad has no bearing on whether one should be allowed to do it. Muhammad didn’t write the laws on the books of any society I have ever lived in.
A more en vogue way to defend multiculturalism is quite pernicious, but requires the same lack of critical reasoning. Jacoby details an instance:
When my friend saw one of her favorite young Afghan-American women — a high school senior — weeping in the dining room, she asked what was wrong. “Oh, madam professor,” the girl replied, “my father has arranged for me to meet my future husband. He is 40 years old, and the wedding will take place in six months. I wanted so much to go to college, and this will not be permitted.”
My friend replied gently, “You know, Yasmin, you don’t have to marry anyone in this country because your parents say so. There are organizations to help girls like you think these things through. There are college scholarships. I can give you the names of people to talk to.” Another resident of this community sharply reproved my friend, saying, “We have no right to interfere with her culture, her religion, her family,”
Wrong. This type of “interference” — telling a troubled young woman that she has choices other than an arranged marriage — is exactly what a true liberal ought to be doing. The idea that someone should ignore the tears of a 17-year-old who says she is being pushed to give up her education is utterly perverse.
Perverse indeed. You know, every white middle class person I know who would take the side of non-interference should realize just how awful and conceited and truly racist it is to think of yourself as so separate from people outside of your race, class, and background that you are obliged to follow the Prime Directive. After all, you inhabit the same world as them.
I remember the feeling when I learned that a girl I had gone to middle school and a portion of high school with, Dalal, had moved to the Middle East because of her arranged marriage to a much, much older man. It was a sinking feeling of remorse, and I was quick to say to my informants (much closer friends of Dalal than I had been) just how terrible a state of affairs it was. “He’s greasy,” they giggled. But when it came down to really responding to my concerns, they said things like, “he’s not so bad.” That’s not really the point, and they weren’t really defending the practice. They were just comforting themselves. Sadly, this isn’t the only anecdote in my life of a friend of mine being a victim of an arranged marriage, and I don’t have that many friends. So if you want to argue that these things aren’t a pervasive problem, you should probably be volunteering for the away team more often; not every nondescript actor who beams down gets killed in the very next scene.
We have no right to interfere with her culture, her religion, her family. These are probably the same people who would say something like “you never talk about religion or politics.” People who honestly believe that, or just say it, are simply not adults. Adults should be able to talk about anything. It might be very hard for some adults to talk to other adults about emotional things, about personal matters. But when it comes to conceptual matters of right and wrong, nothing is sacred, and anyone who claims otherwise is reading too much Emily Post. I don’t subscribe to a version of morality that obliges people to do the right thing because it is the right thing to do, but if you happen to, how could you argue that not upsetting people is more important than chipping away at barbarism? This lengthy detour brings us to Jacoby’s, and Hirsi Ali’s, next point:
"I do not agree with everything Hirsi Ali has to say — about Islam or the United States — but I strongly agree with the essential point she makes in Nomad:
Here is something I have learned the hard way, but which a lot of well-meaning people in the West have a hard time accepting: All human beings are equal, but all cultures and religions are not. A culture that celebrates femininity and considers women to be the masters of their own lives is better than a culture that mutilates girls’ genitals and confines them behind walls and veils or flogs and stones them for falling in love… . The culture of the Western Enlightenment is better. (italics in the original)
That’s not to say it doesn’t have its problems or shortcomings. It is only to endorse it when compared to the alternatives. And the alternatives are not abstract either. They’re staring us right in the face. Like how the life expectancy in Sierra Leone actually went down from 1984 to 1994. Like how people are beheaded in Saudi Arabia and women who are gang-raped are scolded for wandering. This is not to say the west is free of obscene cultural practices that need to be (and are being) criticized. Like how Roman Catholic priests are allowed to sodomize young boys with impunity, or how parents are allowed to take a blade to the genitals of infant boys, or how people in some states in America are put to death by the state. Actually, while these kinds of things happen in our society, they aren’t endorsed by the culture of the Western Enlightenment. At the very least, they aren’t endorsed by John Stuart Mill.
As far as children go, one U.S. ruling got it right:
In Prince v. Massachusetts (1944), the Supreme Court upheld the conviction of a Jehovah’s Witness for violating state labor laws by requiring children to distribute religious literature at night. The Court declared: “The right to practice religion freely does not include liberty to expose the community or child to communicable diseases, or the latter to ill health or death… . Parents may be free to become martyrs themselves. But it does not follow [that] they are free … to make martyrs of their children.”
You see, the reason parents don’t have the right to circumcise their sons is the same reason they don’t have the right to deny their sons a blood transfusion. Children are no longer the property of their parents. They may be in an Abrahamic world, but Abraham was willing to kill his son, now wasn’t he? I’ll leave the comprehensive dismemberment of male circumcision (since the immorality of female circumcision is so widely accepted) to a full article which I will post on this blog, but for now it suffices to say that parents don’t have a right to mutilate their sons’ genitals, and that if someone wants to have their own genitals mutilated, they can decide that under their own legal power when they are 15.
Jacoby deftly unravels another common argument, that of religion’s sphere of influence:
[F]oundations and academia, often assert that religiously sanctioned violence against women and other human rights violations are matters of “tribe and culture, not religion.” But what is more central than religion to most of the world’s cultures?
This is exactly right. The prime motivator of disentangling religion from factors affecting barbaric practices is the protection of religion, not the understanding of factors affecting barbaric practices. Culture, tribe, and religion reinforce one another in many of these cases. Were the Irish tribes historically patriarchal? Did their religion prescribe patriarchy? Was Irish culture in favour of patriarchy? Which word exactly is the reason Irish men of the 19th century found it widely acceptable to beat their wives? It isn’t an either-or between a culture of patriarchy and a religion that accedes to women as chattel.
That’s not to say that the west doesn’t have problems of domestic violence now. Again, we can be self-critical and outwardly critical; there is no hypocrisy there. It isn’t to say that the feminism that has infected the west hasn’t had its unintended or undesirable or irrational consequences, like the discrimination men so often face in the eyes of divorce laws. But then again, women can vote and be doctors in Canada, can’t they? And I think it’s safe to say that that is unambiguously and unequivocally, that is to say objectively, better than the alternative world where they cannot. But who would say that it isn’t, or that it isn’t their place to say? When put in such stark terms as women having the right to vote or be doctors, no reasonable multiculturalist amongst us would admit agnosticism on account of cultural relativism. The point Jacoby leaves out in the article is that the people who often say it isn’t our place to interfere in others’ cultures don’t understand the consequences of trying very hard not to appear racist. They haven’t thought it through because they are being misguided by some sort of social intuition, learned or ingrained, but certainly not properly examined. As Hirsi Ali described in the above quote, they are “well-meaning.”
And not all multiculturalists have the same ‘ick factor’ preferences. Canadian Jews, for example, might ick at the thought of labia being cleaved from the body of a preteen African but not at the thought of a hapless infant having his foreskin severed with stainless steel and surgical precision. Such is the incoherence one finds oneself in if one hasn’t uniformly applied the things learned in On Liberty to one’s own intuitions. But this also speaks to critical mitigation. Not all defenders of multiculturalism, not all ‘liberals’ (a term Jacoby uses), are complete ideologues on the issue, which is good, since convincing them otherwise will not be impossible. Like I said above, their commitment to it is often based on not wanting to brook arguments at Thanksgiving à la Ms. Post. But while she might have been a great conversationalist, as any woman of high society should be, an unwavering commitment to small talk (and small ideas) gets old during an evening not interrupted by healthy consensual sex.
The blog has a new look that I am unsure about. Also, I have cleaned up some old entries with spell check, deleted some bad entries, and ‘tagged’ each entry for the type of content it discusses. This should make perusing the archive a bit easier. I’m hopefully going to start Niall Ferguson’s “The Ascent of Money” soon, and will be writing a review of it. I’ll also post the first part of my second interview with Joseph Heath soon.
The NYT recently published a very good article on how language shapes our brain. As I previously ranted here, it’s extremely vain to believe that speakers of one language can’t understand certain concepts because of the language they speak. The article even uses an example I did, to point out the stupidity of such a claim: “Do English speakers who have never heard the German word Schadenfreude find it difficult to understand the concept of relishing someone else’s misfortune?”
The article does some nice detailing on how this intellectual “fad” came about, through its primary proponent, Benjamin Whorf. An example position: “Whorf announced, Native American languages impose on their speakers a picture of reality that is totally different from ours, so their speakers would simply not be able to understand some of our most basic concepts, like the flow of time.”
So this is obviously false, and the article does a great job of pointing out what Whorf might have been seeing, but misinterpreting.
“Some 50 years ago, the renowned linguist Roman Jakobson pointed out a crucial fact about differences between languages in a pithy maxim: ‘Languages differ essentially in what they must convey and not in what they may convey.’ This maxim offers us the key to unlocking the real force of the mother tongue: if different languages influence our minds in different ways, this is not because of what our language allows us to think but rather because of what it habitually obliges us to think about.” (italics original)
The article cites some interesting examples and experiments to attest to this. In conclusion, the author writes:
"For many years, our mother tongue was claimed to be a “prison house” that constrained our capacity to reason. Once it turned out that there was no evidence for such claims, this was taken as proof that people of all cultures think in fundamentally the same way. But surely it is a mistake to overestimate the importance of abstract reasoning in our lives. After all, how many daily decisions do we make on the basis of deductive logic compared with those guided by gut feeling, intuition, emotions, impulse or practical skills? The habits of mind that our culture has instilled in us from infancy shape our orientation to the world and our emotional responses to the objects we encounter, and their consequences probably go far beyond what has been experimentally demonstrated so far; they may also have a marked impact on our beliefs, values and ideologies. We may not know as yet how to measure these consequences directly or how to assess their contribution to cultural or political misunderstandings. But as a first step toward understanding one another, we can do better than pretending we all think the same."
But this is kind of the point. No one pretends that we all think the same. Also, language isn’t necessarily the main influential factor. Genetic predisposition to stupidity, for example, might have a case. I suspect that while the abstract conception of language, culture, upbringing, and religion as shaping factors, but not limiting factors, upon our cognitive styles is fundamentally correct, it is unlikely that empirical experiments will yield much in generalized rules or understanding. It’s all too messy. I predict that even the most ingenious experiments in this vein will yield at most explanatory anecdotes. But that won’t dissuade some from seeing a larger pattern that isn’t there.
This book is not about authenticity, but rather, what people think is authentic. Potter begins with what turns out to be a perfectly chosen parable, in that it encapsulates so much of what his thesis concerns. He recounts the story of a husband and wife who, with their three-year-old son, uproot their life in France in order to live a more authentic life. How, you ask? Why, by traveling to Zanzibar in a restored sailboat “into which they had poured their life savings.” Florent Lemaçon told a newspaper before they left that “We have got rid of the television and everything that seemed superfluous to concentrate on what is essential.” Fate would have it that their route took them to the coast of Somalia, well known by just about everyone as the most pirate-infested waters of the 21st century. Warned by a French frigate to turn back because of the clear danger, the Lemaçons blogged “The danger is there and has indeed become greater over the past months, but the ocean is vast… The pirates must not be allowed to destroy our dream.”
Funnily enough, Florent Lemaçon was shot to death in a gun battle between Somali pirates (who had taken the family hostage) and French rescue forces. Potter points out that their rejection of society is entirely cliché. I’ll defer to him for the rest of the explanation:
“Florent Lemaçon is not the first person to get himself killed while searching for a leaner and less complicated mode of existence. But there is something especially pathetic and pointless about this case, even discounting Florent and Chloe’s outrageous decision to bring their young son on such a trip. Civilization has its drawbacks, but if there is one unambiguous good that it provides it is safety, security, and the rule of law… Only someone in the grip of a seriously misguided ideological quest could imagine that taking his family through the Gulf of Aden is a more “essential” form of existence, or a reasonable and virtuous alternative to the life of a well-paid professional in contemporary France.”
That brings me back to the blog: “The pirates must not be allowed to destroy our dream.” I paused over this a couple of times, trying to make some sense of it. What is so curious, I think, is that the pirates aren’t out to destroy any dreams. In fact, “the pirates” don’t give a fuck about Florent Lemaçon’s intentions at all. Is it not the case that this man had something ‘to prove?’ When people have things to prove they so often think the world is out to get them, it seems to me. They personify society; they give society, corporations, governments, personal agency. Not to mention pirates. Florent was obviously alienated by the society in which he participated, but it’s very easy to be alienated by society when you personify it. When it is out to get you. Sure enough, this was part of Florent’s reasons for leaving: “We don’t want our child to receive the sort of education that the government is concocting for us” (italics mine).
I laugh when people who are driving yell (in the car) at other drivers. Other drivers can’t hear you. Just like Somali pirates aren’t keeping tabs on your blog. The reasonable way to approach these kinds of things is to treat the outside world as automatons that you a) have to navigate around (in the case of driving) or b) take as unalterable fact (in the case of pirates that are going to kidnap you). Motivations aren’t important. Defining yourself as a good driver, and defining another driver as an asshole, isn’t important. It is really stupid actually, because your goal should be safe driving, and if it isn’t then you probably have an ego problem, something to prove. Similarly managing to shoot the gap through the Gulf of Aden and avoiding capture isn’t akin to the triumph of one man’s will over the will of pirates to destroy his dream. It is akin to one man sitting in a room, by himself, and playing a round of Russian roulette, and happening to survive. One of the things that is so pointless and pathetic (Potter’s adjectives) about the story is the fact that pirates represent the lawlessness and anarchy found in the state of nature, the exact thing Florent was shepherding his family towards. So even if getting to Zanzibar safely was about a triumph of his will over the will of the pirates (which it so obviously isn’t) it would be about surviving the perils of the state of nature after rejecting the security of society, something he didn’t set out to do. He sought the supposed comforts of the state of nature.
A quote not in the book, but in this article, that I found interesting was “We want to flee the consumer society and its routine.” I don’t know if Potter refrained from including such obvious stupidity because of the resemblance it bears to the kind of stupidity skewered in The Rebel Sell. But Florent certainly can’t be faulted for being vague in how exactly he hadn’t thought things through. Potter teases out of these symptoms the disease afflicting the cognition of the Florents of the world in one of the sagest passages in the book:
“One widely accepted view is that it is impossible to build an authentic personal identity out of the cheap building blocks of consumer goods…When it comes to personal fulfillment, many of us subscribe to the idea that the self is an act of artistic creation, and living a meaningful creative life is impossible within the confines of the modern world…. Yet too often for comfort, the search for the authentic is itself twisted into just another selling point or marketing strategy, and once we appreciate the full implications of this, there is a real danger that cynicism will quickly set in: everyone is working an angle, everyone is looking to make a buck. Once we start down this path, it isn’t long before we reach the same conclusion as Florent and Chloe Lemaçon – society is corrupt, commerce is alienating, and the whole system should be abandoned, if not completely destroyed. We can tie ourselves in knots over this, but the fact is, the relationship between the stuff we buy and who we are, and the broader relationship among consumer culture, artistic vision, and the authentic self, is fraught with bad arguments and bad faith, and the usual themes and oppositions (between genuine needs and false wants, or between the shallowness of a branded identity and the depths of the true self) are too crude to be helpful.” (italics mine).
Potter couldn’t be more right. He doesn’t define the Sartrean term bad faith, but its placement is perfect. In a famous example Sartre describes a typical waiter’s interactions with others and the world. Waiters aren’t usually themselves; they play the part of being a waiter. But haven’t your best experiences with waiters and waitresses been when your interaction was genuine, when it felt unscripted? The thing is, a waiter who didn’t put on their usual act could be having a bad day and could be a genuine prick. It is appreciated when waiters are courteous and appeasing to their customers — the thing is the kind of bad faith imposed by society isn’t entirely polluting — it can be rather civilizing. Anyways, when people define themselves by their rejection of society or modernity, or what they don’t like about the world, they are indeed letting modernity or society define them. They end up looking like idiots, like the guy pictured in the post below this one (check it out). I’m fairly certain in this moment that this person would describe themselves in terms of their rejection of the very thing they state shouldn’t define them. Like the sign says, it is a mistake. It is bad faith to claim you don’t define yourself by the very thing you do, especially if deep down, you believe it.
And even more importantly, there isn’t an alternative to be found. When asked, people can’t even define what they mean by authentic living. That’s because “Authenticity is one of those motherhood words — like community, family, natural, and organic — that are only ever used in their positive sense, as terms of approbation, and that tend to be rhetorical trump cards.” If you keep looking for authenticity you are never going to find it, and if you keep looking for your authentic self, you will become something undesirable. Potter has a great line in this radio interview where he crisply states the delusions at the heart of today’s search for authenticity:
"When it comes to the search for the authentic self, about being true to yourself and your own values and your own spiritual yearnings, you won’t find anybody out there who says it’s a bad thing to be fulfilled or to actualize yourself. Except, there is nothing that says that who you really are deep down is a good person. The idea that we are all good people deep down is a really old idea in our society. The idea is that we are all basically good and we are deformed by civilization, we are deformed by our social relationships, and if we can get rid of all this, you know, quit your job, you’re ultimately a good person. But there is no evidence that that is true, right?"
The kind of authenticity people are looking for isn’t out there to be found, much less invented, produced, or consumed. In so stating Potter stands up for modernity in a very clear, precise, unpretentious, and unromantic way. It is the best we’ve ever had and we have to deal with it. Those people who use the crutch of rhetorical trump cards just aren’t very happy people. I don’t think Potter tries to impress that people should appreciate what they have simply because life is and has been far worse for others. But he’s no anti-natalist either. It is what it is. And however you deal with it, inventing false dichotomies between keeping it real, between being authentic, and being a fake sell-out, is highly counterproductive, ill-advised, and just plain irrational.
I did quickly become concerned that the book’s thesis would taste like the diet version of The Rebel Sell. That is, the desire to distinguish oneself from conformists (who constitute the majority of society) manifests itself in the consumption (and the requisite production) of conspicuous products, a phenomenon that nurtures new markets (and therefore the market itself), and that, in a coincidence of inconvenience, the purchasers of these differentiating products profess a hatred for the very market that provided their distinction and the homogeneity it supposedly engenders, a market in which they partake ostensibly unwittingly, undermining any revolutionary anti-market cultural cum political-economic rhetoric they happen to think themselves informed or angry enough to blurt out. Anyways, Potter does revisit these talking points since he proposes that authenticity or perceived authenticity is often a positional good, therefore subject to the rules of conspicuous consumption.
I did enjoy the return to Rebel Sell themes. One of the better examples used was of ‘locavores’ – the people and portmanteau we all could do without. The locavores embody the idea that the pursuit of the authentic is “just a disguised status competition.” How local is local enough? Well, shortly after the fame of the 100-mile diet, people began the 25-mile diet, and so forth, until the only way you could possibly be authentic and guilt-free about your environmental footprint is when you were eating exclusively from your own bean and sprout garden that you fertilized with your own shit. How much of this is hyperbole? Potter details the growing movement of people across North America installing dirt floors in their homes.
The book does establish itself apart from The Rebel Sell when it is all said and done. The line isn’t altogether socio-economic this time, as the book takes a fairly existential turn. The subtitle after all is How we get lost finding ourselves.
Potter identifies the search for authenticity with a rejection of many of the prominent facets of modernity, such as secularism, liberal democracy, and the market economy, that have replaced previous sources of meaning, like religion, or aristocracy. It is a repetition of a common religious fallacy, that of a former golden age, or the fall of man: “The quasi-biblical jargon of authenticity, with its language of separation and distance, of lost unity, wholeness, and harmony, is so much a part of our moral shorthand that we don’t always notice that we’ve slipped into what is essentially a religious way of thinking. The ease with which we talk about our alienation from nature, … [hearkens] back to our ongoing sense that we are fallen people….[N]ow we make do with things such as Oprah’s Book Club, which offers a thoroughly modern form of spirituality that is a fluid mix of pop-psychoanalysis, self-help, sentimentality, emotionalism, nostalgia, and yuppie consumerism.” Ideologies are comprehensive explanations, and a consequence of their dismissal is the obverse of their appeal, a dearth of comforting explanations.
To that point, one thing I enjoyed less than I thought I would was the explicit atheistic conception of the world Potter states as fact. This supposition is integral to the force of his argument vis-à-vis what we should do in the dénouement of what he aptly calls the disenchantment of the modern world. It isn’t that I don’t take atheism as a fact of the world; I do, with a degree of certainty that some find frightening. Rather, I think a book which implies atheism as a fact of the world, without stating it, is preferable. Those who take it as fact will infer it easily. Those ardent theists reading will not be tempted to dismiss the book’s contents entirely. But most importantly, the enormous swathe of the population who are practical atheists, but who still claim religious denominations on the census, don’t need the confrontation. The gentle everyday nudges reality provides them with have been working well for as long as I’ve been alive to observe.
If anything Potter is capable of, it is of eviscerating the disingenuous behaviour of those who strive to keep it real, from the cult of organic food, to the church of Oprah. That’s one of the benefits of having a keyboard with a caustic setting: if you succeed in arguing your point, the victim of your criticism appears thoroughly discredited. And in this Potter succeeds with who he accurately portrays as the father of our society’s nostalgia for a past that never existed, Jean-Jacques Rousseau. As opposed to some of the other historical figures of philosophy that make cameos in The Authenticity Hoax, Rousseau is given a fair and vigorous shake. The citizen of Geneva isn’t made out to be a total crank: the nuances behind some of his writings are respected, the overzealous interpretations of today’s common sentiments are pointed out, and the phrase “the noble savage,” misattributed as his literal coinage (that ahistorical beast Mr. Lemacon wished to become), is put in a proper context. But overall Rousseau and his exciting prose don’t emerge from the text with an air of prescience or erudition. Rousseau emerges looking like the West’s first incarnation of the pitiful Lester Burnham. But Potter makes sure one doesn’t mistake his prose for an eloquent Enlightenment-era precursor to The Unabomber’s Manifesto; it is instead a precursor to “transcendentalist writers such as Henry David Thoreau and Ralph Waldo Emerson.” Rousseau wouldn’t have endorsed Lemacon, since “the primitivist view of Rousseau’s ambition is mistaken: instead of looking for some sort of modernity-free sanctuary out somewhere in the world or in our distant past, he proposed that we look inward and find our authentic self by attending to our most basic, spontaneous, and powerful feelings and emotions.
In this view, the authentic person is someone who is in touch with their deepest feelings.” The Hoax plucks Rousseau out of the fire from which totally irrational declinists such as Prince Charles, the 2012 Armageddon crowd, James Howard Kunstler, and other anti-civilization Lemaconic loons are not saved, placing him in a more sympathetic, but still scalding, frying pan of philosophical critique. Rousseau’s prescribed search within ourselves for authenticity has all the trappings Potter mentions in the portion of the radio interview quoted above, but he goes on:
"The truth doesn’t really matter, the actual real truth out there, the authentic is what reflects my own inner aspirations or my own sense of who I am. That’s not some objective thing. The authentic is what’s true for me, what speaks to my own truth, as opposed to what is true out there. Once you start equating what is true with what is true for me — and that’s the pursuit of the authentic — it can actually legitimize a great deal of antisocial or psychologically crazy behaviours."
As far as those left in the blaze of Potter’s ire for idiocy, it suffices to say it isn’t a fair fight. But quite enjoyable to read, in the way the Romans filled the Colosseum to watch lions eat people. I was reminded of something Joseph Heath told me when I asked him about his (and Potter’s) devastating criticisms of Naomi Klein. He said something to the effect of “you know these people need reminding that to be a public intellectual you have to actually think hard. Like, more work is required of you.” And if declinists like Kunstler ever read a critic like Potter that message will certainly be clear. Strangely enough, it seems like David Suzuki has taken some of the book’s criticism to half-heart.
Not every single off The Hoax is a hit. Potter is prone to over-intellectualize. It isn’t that the writing is drab, just the opposite. Potter has taken pains to be acerbic and funny and pop referential. And he is. Rather, it is twofold: he squeezes topics into his thesis and he explains tangential phenomena that need no explanation. Of the former, I found one particularly implausible instance, where Potter is discussing the authentic ‘aura’ of modern art: “(This is the reason why art galleries are like churches, with the works curated like holy relics: the point is to preserve their aura.)” Really? I thought the point of curating works reverentially was to not scratch the paint. In another instance, while etching a portrait of our overly fake world, Potter slips in “They have lost faith in a political culture that, [details Bill Clinton and George Bush’s failings].” Potter is talking about the result of a poll of Americans, not his own view of course, but it is a very dubious idea that Americans have recently lost their faith in their political culture, mainly because there has never been a good time in the history of just about any society to have faith in its political culture, and it’s unclear to me if there was ever a time when even a plurality of average citizens of a polity had faith in its politics. In Canada at least, we seem to be capable of both being largely skeptical of the motivations and abilities of our political class, while not taking our concerns to the streets, an indication of a general satisfaction with the state of affairs. And when people do riot in the street, they are generally seen as deplorable ruffians by most people over the age of 25. It isn’t clear that low voter turnout, for example, is a sign of the proles’ disquiet regarding the political culture, since voter free-riding and apathy could in fact be well-placed faith in the citizens who do vote and the results they’ve been producing.
There is a very insidious result of the latter, explaining tangential phenomena unnecessarily. People love to experience things where everything just clicks. We are addicted to epiphany, and those experiencing lapses of scruple, or who are simply unscrupulous to begin with, won’t care to inspect those epiphanies of the specious variety, particularly when they are throwaways. One such throwaway was Potter’s sentence: “That is why an inordinate number of fiction writers took up journalism after 9/11.” The only real explanation given was that “when the times turn serious, people are not likely to turn to novelists for guidance.” It doesn’t matter if these mini-theses are right or not. They are literally beside the point. Books like The Hoax explain, but explaining extraneous topics takes on disproportionate risks, endangering the thesis’ overall credibility and focus by getting less vigilant readers’ heads nodding and more vigilant readers’ eyebrows up. It’s the expository equivalent of getting a cheap laugh.
The acknowledgments have Potter admitting that the book “took much longer to write than it should have.” Well, if Potter so chooses, the softcover will present the opportunity to extend that phrase to include “but we still didn’t take long enough.” Observe page 252: “nostalgia for the past unity spurs an period of exterminating violence.” Maybe this is typo nitpicking. Check out page 169: “Consider the case of former, and now disgraced, U.S. senator John Edwards. ….Edwards is notorious for being inordinately proud of his hair. So much so that when he ran for vice-president in 2004 as Al Gore’s running mate…” How no one noticed that, least of all Potter himself, escapes me.
I took a little issue with the title, when I thought about giving the book to some credulous people I know. The cult of conspiratorial worldviews is a serious problem amongst the truly apathetic and ignorant, and the use of the word hoax may just lead to gross misinterpretation. It makes for a catchy title, one which I suspect Potter may not have had complete control over. In any event, there is a danger in it, where those who would prefer a bad theory to no theory at all may blindly adopt some sort of solipsistic metaphysics based on a hurried misreading that Potter argues everything is fake in this world.
This brings us to the problem with straddling the line between popular work and philosophical tract. It may be that you lose both wagers hedging your bets. I often wonder how laymen read the PoliSci 101 introduction to Hobbes (and Locke’s rebuttal) that inevitably crops up in this kind of book. Because I sometimes groan, knowing I am about to read things that have been impressed upon me a dozen times. It is kind of like when reading a popular science book and the author has to define prime numbers. You, the reader, understand that it is necessary for the audience, covering the bases, but that doesn’t mean you enjoy reading something you wish would move along. But I also wonder, given the general disservice to Hobbes a two-sentence introduction (on page 36 of The Authenticity Hoax) will almost certainly afford, if it is of any value whatsoever to someone who has never heard of him before. I have trouble remembering the names of characters during a movie that I am currently watching. Afterwards I refer to one character as Marley and to another as Jennifer Aniston. And what do I get out of these movies? Hopefully a third date. That is to say, not a deeper understanding of anything.
Despite the listed shortcomings, Potter achieves much of what he sets out to do. While some of the middle chapters lack the punch and lucidity of the first and final ones, this is probably a testament to Potter’s ability to look at the big picture effectively. What is most important, in my view, is that his diagnoses are right. That fact alone makes this book worth reading. I won’t fully ruin the happy or unhappy ending contained in the final two chapters, but I can assure the reader, they are sure to genuinely satisfy even the most voracious intellect’s hunger.
There is a good interview with Wikileaks founder Julian Assange done over tea with the Economist. In it he is quite ambivalent in offering a defense of his actions. He shouldn’t be. But what is clear is that he is politically motivated. He should drop the veneer of letting information be free (in fact he claims “information wants to be free”) and state explicitly that he is biased. He has stated explicitly that he is against the losing war that is Afghanistan. No apologies should be made of that. It’s really easy to be impartial if you are publishing an exposé on milkmen’s bottles, whose manufacturing technology left traces of toxins that seeped into people’s milk. There aren’t any milkmen anymore and no one is benefitting from the technology. It’s hard to be impartial about an ongoing war. It’s so hard that claiming any form of impartiality is self-delusional, dishonest, and enormously cognitively inefficient.
"Our goal is to achieve political reforms," says Assange. He also speaks of a harm analyzation process, or something to that effect, whereby Wikileaks assesses, in advance, what effects their releases will have on the world. He basically contradicts himself in the same sentence, saying "We believe prima facie true information does good, but true information can also do harm, so we strip it of the harm and release it." The Economist worries about who is regulating or watching Wikileaks. No one should be. No one needs to be. The United States military has something against Wikileaks, not because it is unregulated, but because it is an actor acting against its war interests.
It’s pretty clear that Assange is in too deep to see how he is being perceived. He comes off as testy, ornery, self-righteous. Yet he does good work. Most leaks regarding Afghanistan and Iraq help solidify a picture non-insiders assume was the case: there is a lot of unreported carnage, and the rules of engagement that the U.S. observes are by no means cautious, calculating, or precise. Collateral damage is underreported and widespread. Young men with guns are butchers; there is no surprise in that.
Information wants nothing
There are two ways to take the turn of phrase information wants to be free. One is that information has desires. That’s obviously false. The second is that the saying is a parable: that information, over time, will become public knowledge, because of the gossipy nature of humanity, the diffuse information technology of our society, and the irreversible process of information becoming public knowledge. Once it is out there, it cannot be removed. So it may seem that information becomes free over time and that therefore it acts of its own accord, but really it’s just a function of the structure of its habitat. As for true information doing good, there is nothing to say that it is the case, and there isn’t any need to explore that line of reasoning. The main problem is that creating generalized rules of being like that is kind of like pseudoscience. There is no reason to think philosophical ideas follow laws akin to the (assumed) regularity of the laws of physics. And more importantly, there is no way to prove it were it the case, empirically, rationally, or practically. Therefore, when some piece of information is found, the algorithm should go: is it true? Does it do good? Each piece of information will have its own answers, and I suspect no universal law like true information does good will emerge. But the point is that we shouldn’t even be thinking that way.
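The case-by-case procedure just described could be sketched as a trivial decision function. This is purely a hypothetical illustration of the argument, not anything Wikileaks actually runs: the predicates `is_true` and `does_good` are assumed stand-ins for human judgment supplied per item, which is exactly the point, since no universal rule ("true information does good") is baked in anywhere.

```python
def should_release(item, is_true, does_good):
    """Judge one piece of information on its own merits.

    is_true and does_good are judgment calls passed in per item;
    the function encodes no general law about true information.
    """
    if not is_true(item):
        return False        # falsehoods fail at the first question
    return does_good(item)  # truth alone is not sufficient


# Each item gets its own answers; no pattern is assumed across items.
print(should_release("doc A", lambda i: True, lambda i: True))   # True
print(should_release("doc B", lambda i: True, lambda i: False))  # False
```

The design choice mirrors the text: rather than a constant (`release = is_true`), the second question is asked fresh every time, so nothing resembling a natural law ever emerges from the procedure itself.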
Conspiracy theorists think that the information they have isn’t free, isn’t in the public domain. But it is, and the cream hasn’t risen to the top. Why is that? Not because true information does or does not want to be free. Rather, it’s because false information and stupid information are not useful to the inhabitants of reality. There is a type of meritocracy, and it’s a natural consequence of the structure of the habitat of information in our world. No natural law.
Heidegger spends a good deal of time delineating the different types of objects and how they represent themselves in the world. It seems to me, at base, that if we don’t grant inanimate objects agency, then a great deal of epistemological mysticism unravels itself. “This is this,” De Niro resolutely snarls in The Deer Hunter. Here is a question that makes no sense: “For what reason would phenomena represent themselves to be not what they are?” This implies agency. Phenomena have no agency. They have no reason to not represent what they are because they do not use reason. A phenomenon has a representation of itself once there exists a perceiver, but there is no reason, rule, or law that gives an incentive for the representation to betray the phenomenon it represents.
Phenomena don’t have representations without a perceiver. To that point, phenomena don’t know if they have representations or not, or when they do. So any distortion of representation, that is, perceiving a phenomenon as something it is not, lies in the senses. Distortions are a total function of the senses’ shortcomings, not of a phenomenon’s ability or want to represent something it is not. Finally, we can and have made great strides in understanding how and when and why our senses deceive us. The problem is to a great degree solvable, since the world has no choice but to represent what it is.
Surely you can’t be certain
The point Descartes tries to make when he constructs his malicious demon example is that we cannot be certain our senses do not betray us. But in order for his example to work there must be agency (i.e. the demon, with its powers and intentions) behind the ruse. The example doesn’t claim representations are by nature liars. Our senses may not be able to interpret all representations as what they represent; in fact, we know they cannot. But in order for representations to be indecipherable in principle, some agency must exist to tend to the smokescreen. So if we don’t believe in contrived things that we have no reason to believe exist, our glasses should work.
Certainty isn’t what is important in the demon example. Relevance is what is important. If something serves no explanatory benefit, if the entire phenomenal world would be the same, or less complicated, without it, and we have no reason to believe this something exists, then there is no reason to pay it any mind. It isn’t a relevant thing to postulate.
Now we can really see the vanity of the solipsist. There is no reason to think the world is lying; it has no agency. Just as information does not want anything, representations aren’t out to fool you. It’s nearly certain there are other minds, but more importantly, acting or believing as if there are not is a clear indication of an extremely inflated ego or an utter lack of self-examination.
It’s a cognitive quirk of ours to suppose agency in inanimate objects, from the lock that won’t co-operate to the universe itself.
The Mark published this wet noodle of a piece titled “Islamophobia’s Coming-Out Party.” Amongst other things, it argues that Little Mosque on the Prairie’s existence as a government-sponsored and little-watched television program shows Canada’s natural superiority over the United States when it comes to values, rights, and progressivism. There is no more prominent ideology in Canada than anti-Americanism. And increasingly prominent is inclusionary multiculturalism. Well, as I’ve argued before, not all cultures should be welcome here.
Also, there is a logical disconnect between (1) there are bigots in North America who have an unfounded distrust of and prejudice against all Muslims, and (2) therefore the Ground Zero mosque is a good idea. (1) is true, but it doesn’t imply anything normative, except perhaps that opinions should be formed on a rational basis.
Meanwhile, Tarek Fatah and Raheel Raza penned this provocative article in the Ottawa Citizen, arguing that “Muslims know the Ground Zero mosque is meant to be a deliberate provocation.”
As per usual, I’ll side with the Muslim voice on this one. Tarek’s Muslim voice, that is.