education, evolution, Lighter Things, running, science, Uncategorized

Full Esteem Ahead

This morning, as I was making my way through my email, I caught a short news item in Academica Top Ten about a school in Calgary discontinuing awards and competitions based on the work of Alfie Kohn, an author who writes about child behaviour and parenting. The theory is that, “awards eventually lose their lustre to students who get them while often hurting the self esteem and pride of those who don’t get a certificate.” In essence, if I understand the idea correctly, when someone excels, rewarding them makes them complacent, and if they fail to excel, they suffer loss of self-esteem and pride; therefore, competition shouldn’t take place at all.

With all due respect, that’s total crap, in my opinion.

As usual, George Carlin said it best:

Now, all of this stupid nonsense that children have been so crippled by has grown out of something called the “self-esteem movement.” The self-esteem movement began around 1970, and I’m happy to say it has been a complete failure. Studies have repeatedly shown that having high self-esteem does not improve grades, does not improve career achievement, it does not even lower the use of alcohol, and most certainly does not reduce the incidence of violence of any sort, because as it turns out, extremely aggressive, violent people think very highly of themselves. Imagine that; sociopaths have high self-esteem. Who’da thunk? – From “Life is Worth Losing” (2006)

The self-esteem movement has led to such ridiculous acts as not having winners or losers in games – everyone is special! The problem arises once we realize that if everyone is special, then nobody is. To me, that sounds like a psychological theory created by wimpy scientists who always lost at sports and secretly harbored a grudge for many years until they could start influencing educational policy. Admittedly, this is somewhat of a generalization; I apologize to any athletic scientists out there (all three of you).

What bothers me about the self-esteem movement comes down to two key things: I have learned more from my failures than my successes; and some people are better at some things than others – that’s the nature of humanity.

The reason my Dad was an absolutely brilliant parent (whether he actually knew it or not) was that when the time came to give me advice, he shared his own experiences with me, then gave me the freedom to make whatever decision I felt was right. He trusted me enough to know I’d make the right decision, or if I didn’t, that I’d learn from having made the wrong one. It was the freedom to decide, and in truth, the freedom to fail, that made his guidance so valuable. Honestly, I’ve screwed up more times than I care to admit, including one massive failure in the field of marriage; even that experience has value if I manage to walk away having learned from it and changed my behaviour to try to prevent the same mistake from happening again. I certainly don’t rule out getting married again, I just won’t go about it in the same way. I learned, and at the risk of using a sickening cliche: I grew.

Of course, failure in this context isn’t the same as failure in sports, so let me use another example: I started running just over a year ago, and during that time, I’ve run several races – mostly 5K, but I have one 10K under my belt as well, and I consider that to be my maximum racing distance. Just my personal choice. I don’t enter these races with any expectation of winning – I generally finish about halfway through the pack for my age group, and I do keep track of my time for some races, with an eye to improving and getting faster as I do more of them. There are clearly winners of these races, and there are the rest of us who do not and will never win, but that doesn’t take away from my enjoyment and pride at having participated. Realistically, I am competing with myself – striving for a personal best. This is the first time in my life that I have undertaken athletic activity in any sustained way, with the exception of doing some fencing a few years ago (which I also thoroughly enjoyed and failed to excel at), as during my childhood I was incredibly uncoordinated and generally somewhat overweight, so any such endeavor was doomed to fail. However, despite the lack of a ‘Mr. Congeniality’ medal at the end of the soccer game or whatever, I survived and became a (somewhat) functional adult. In fact, I think I benefited more from losing than I would have from winning.

Losing with grace is sometimes more difficult than winning with it. It teaches perspective: yes, there are people better than you at certain things, and that’s not unexpected given that little thing we call evolution. Human beings come in many varieties, and some would be better adapted to chasing down a gazelle than others; when that was the means of survival, the ones who couldn’t were less likely to procreate. Now, however, there are a greater number of options as to how you win bread or bring home bacon or what have you – those people who kick the hell out of the ball may not be those who are good at manufacturing or selling said balls, or speculating on how many balls will get kicked; that’s the skill and talent the rest of us have, and some people can be scary good at all of these. Losing is learning about yourself and how you relate to the world; in that way, ‘losing’ is in no way the same as ‘failing’ – losing is not crossing the finish line first, but failing is not turning that experience into a positive lesson.

The difficulty that arises when someone is literally unable to fail because they are cocooned in a bubble wrap of ‘self-esteem’ is that they don’t learn the coping skills necessary to make loss meaningful, because they don’t experience it. The child who is unable to lose becomes an adolescent and an adult who is baffled by the fact that they cannot realistically expect to be rewarded for everything they do – and yet they do expect it. I have been fortunate not to experience it myself, but I have friends who teach in post-secondary education who constantly encounter students with huge egos and unbelievable feelings of entitlement – self-esteem has, for these individuals, whooshed right past ‘confidence’ and careened madly into ‘arrogance’. In the same way I can’t ask someone to hand me a zarf if they have no idea what it is, they are unable to recognize the true value of failing and trying again because they’ve never had to.


A zarf. Don’t say I never learned you nothin’.

Eliminating the possibility of losing also takes away the ability to potentially excel, and both of these are critical in the development of a real, whole, genuine and, might I add, empathetic adult. We have been far too focused on not making kids sad and not nearly focused enough on the long-term effects of character-building experiences on the adults they eventually become. We take away the ability to stand out, and we actively deny the process that, in the broadest sense, makes us human in the first place.

Anyway, I’ve never been a parent, and the odds against my becoming one grow greater every day, so I can only speak from my own experience and knowledge – which makes me a lot more like my Dad than I ever realized, now that I think about it.

And that’s rather awesome, actually. Thanks, Dad, for letting me make mistakes. I think I’m better for it.

entertainment, media, science, Self-righteous asshole, Things We Should Know, willful blindness to absurd extremes

The Vast Difference Between Balanced and Irresponsible


If you’ve been reading my little posts for a while, you’ll know how I decry irresponsible journalism. On the flip side, I am an admirer of good journalism – writing that informs about legitimate debate and shows signs of painstaking research and fair examination of both sides of an issue.

Once again, Time Magazine shows us the difference between legitimate balance and useless filler. Jenny McCarthy is the subject of an article in which she claims to have ‘cured’ her autistic son. Oddly enough, Time manages to both justify (poorly) their reasons for running the story and indicate why it shouldn’t have run in the first place:

To McCarthy’s opponents, from the public-health officials at the Centers for Disease Control and Prevention (CDC) to the pediatricians of the American Academy of Pediatrics, this makes McCarthy much worse than a crank: she’s a menace to public health.

So, the recognized scientific authorities on this topic disagree with the half-wit actress and former Playboy model? It’s not hard to decide who carries more authority in this case.

They ask why so many mothers are reluctant to vaccinate their children based on McCarthy’s insistence that vaccines are dangerous. That’s an easy one: they are idiots. Easily led, scientifically illiterate idiots. Articles like this one will only make the problem worse, thereby giving Time the opportunity to exploit the resulting catastrophic rates of preventable illness somewhere down the road.

In responsible scientific journalism, debate is important – two sides, both interpreting evidence they have gathered, but who have reached different conclusions. This is the essence of constructive debate, and is reflective of how science is built – good scientists always accept the possibility that they may be wrong. All of it is based on evidence, however. To juxtapose the results of hundreds of scientific studies with the beliefs of a third-rate actress is clearly wrong. The arguments are not coming from the same basis of assumptions – one is systematic, the other emotional. The evidence is clear: vaccines do not cause autism. Even the single study cited by vaccine panic-mongers is an admitted falsification, and is therefore invalid. Preponderance of scientific evidence vs. fake science and anecdotal belief. Which should you choose?

Yet, Time insists on perpetuating the myth by giving McCarthy a venue to create more risk to children.

Let’s use a crude (very crude) analogy to demonstrate: I hereby deny the existence of Australia (no offence, just the first thing that came to mind). I’ve never seen it, except on maps. Well, what is my motivation to trust the representatives of the mapmaking industry? They have a vested interest in maintaining the illusion that Australia exists. How else would they maintain sales of maps of certain portions of the Southern Hemisphere? Basically, I’ve never been there, never experienced the country directly, so I don’t believe it. What about all the people who have been there? In the pocket of Big Cartography. All the pictures? Faked – probably New Zealand or clever photoshops.

So, within a logical (although deeply flawed) framework of belief, for which I could probably gain support from at least a fringe portion of the billions of people who have also never visited there, I have made my case. I demand equal time in Time to defend my views, because only then will they have fairly presented all sides.

In a word, no. Time, and other venues, need to wake up to the fact that not all views are equal. Despite the insistence of politically correct postmodern apologists for the validity of all ways of knowing, some beliefs are demonstrably, objectively and irrefutably wrong. Jenny McCarthy’s views on the link between autism and vaccines are among them.

I know, my position is impossibly naive and not reflective of the competitive world of infotainment that news has become, but I can still hope for better. I can hope for someone, somewhere, to wake up and realize how irresponsible this type of reporting is. Somewhere in the world is an editor who can stand up and say, “No more”.

Except in Australia, of course.

creationism, education, evolution, favourite person, science, Sites of Interest, Things We Should Know

“Ardi”: No Longer Missing

My friends, it is a good day when the ludicrous ramblings of demagogues are undone by the spirit of scientific exploration. I’d like to introduce your friend and mine, “Ardi”:


Ardi is a specimen of Ardipithecus ramidus, which walked the Earth more than a million years before Lucy, who herself lived about 3.2 million years ago. She is, in simplest terms, the common ancestor of humans and apes, or, even more importantly, the ‘missing link’ that was a supposed weakness of evolutionary theory.

From the online National Geographic article:

Announced at joint press conferences in Washington, D.C., and Addis Ababa, Ethiopia, the analysis of the Ardipithecus ramidus bones will be published in a collection of papers tomorrow in a special edition of the journal Science, along with an avalanche of supporting materials published online.

“This find is far more important than Lucy,” said Alan Walker, a paleontologist from Pennsylvania State University who was not part of the research. “It shows that the last common ancestor with chimps didn’t look like a chimp, or a human, or some funny thing in between.”

This is an historic find with broad significance to the study of human biology and history. To its credit, the New York Times does lead its Science section with this, but it deserves a lot more attention.

Strike another of the unscientific arguments against evolution. Take that, Kirk Cameron!

atheism, censorship, christians, creationism, culture, evolution, religion, religious right, science, Things We Should Know

Evangelicals: Growing Pains (In the Ass)

Former Growing Pains ‘star’ and current delusional paranoid evangelical xtian Kirk Cameron has indicated that he plans to distribute to U.S. universities 100,000 copies of Charles Darwin’s On the Origin of Species, with a new 50-page foreword, to subvert the 150th anniversary of the publication of the iconic science text on November 22nd, 2009 – “Darwin Day”.

The following from The Huffington Post:

Cameron explains that this “very special” edition of the “Origin of Species” will include an introduction explaining “Adolf Hitler’s undeniable connection” to the theory of evolution, and highlighting “Darwin’s racism” and “his disdain for women.” Cameron’s edition also exposes the “many hoaxes” of evolutionary theory, while presenting a “balanced view of Creationism.”

From the untalented hack’s own mouth:

A clever response from another YouTube user:

There is no limit to the ways I can object to this, and to how offended it makes me. To suggest that someone has the right to potentially alter the text of a seminal work (as suggested here) is offensive. There is no requirement to be fair in the discussion of established scientific fact – there are no alternate explanations. What discussion happens in the field of evolutionary science concerns the processes within the general theory, not whether the basic theory is true. There is no internal conflict as to the truth of the statement “Organisms evolve and adapt to their environments”; the questions of how it happens in specific instances are the subject of discussion. There is not, nor will there ever be, a requirement for ‘fairness’ or for providing time for alternate explanations, unless those explanations are derived from the same methodology. Otherwise, you are comparing scientific apples to schizophrenic oranges.

I propose to give away, for free, 100,000 copies of the Revised Edition of the Bible, which includes extensive references to historical, archaeological and physical scientific records to disprove the assertions of that book phrase-by-phrase. Hey, it’s only fair, right?

Science has no comment on religion (other than in behavioural terms), and religion should not try to usurp the expertise of science. Rehashing the tired cliches about Hitler and evolution (let’s have a chat with the American originator of eugenics, Charles Davenport, before drawing conclusions – Hitler couldn’t have enacted the idea without an American’s help – nice going), Darwin’s supposed racism (he was, by all accounts, fairly tolerant – at least as much as an Englishman of his time could be) and his misogyny (same notation) is idiotic, and will not convince anyone that evolution is not a scientific fact. You are, if you’ll excuse the phrase, preaching to the choir – the only purpose of this farce is to reinforce the religious views of those who already believe.

The evidence – ALL the evidence: physical, archaeological, biological, geological, etc., etc., adds up to ‘proven’, no matter how uncomfortable an untalented former teen idol is with the concept.

Way to go, Mr. D.

Fuck you, Kirk Cameron.

No matter what you do, Kirk, Charles Darwin will always be more famous than you. Deal with it.

culture, entertainment, general silliness, Lighter Things, science

I Hope You Put a Stake in Him First

Dead child molester and incredible melting man Michael Jackson was buried yesterday, in a private ceremony, attended by 250 or so of his closest friends. Who all happen to be celebrities, and who for the most part appear to be out of work and in need of the camera time. How touching. If, of course, by ‘touching’ I mean ‘fucking disgusting’. It’s over already, let it go. He was somewhat talented nearly 40 years ago when he was black, and male, and, well, alive.

Quick question: who passed away on June 25 and brought joy to millions and made an indelible mark on his home country and the culture of the world?

Give up?

No, not MJ.

Jacques-Yves Cousteau, who passed away June 25, 1997.


Au revoir, you magnificent wet French bastard.

I mean, come on: you invent the fucking aqualung, allowing mankind to freely explore the most mysterious and fascinating areas of our planet (I was going to say ‘breathtaking’, but even I have limits), and launch a tradition of French-accented comedic narrations, and you get no love?

You learn to walk backwards while looking like you’re walking forwards, call it a dance step, and you’re an icon? Fuck.


At least now I have an excuse for wanting to shoot him in the head

Credit, and admiration, where it’s due, people. At least there promises to be more entertaining times ahead for MJ, when they steal his body and hold it for ransom. Seriously, I think it’s gonna happen. It was good enough for Charlie Chaplin, who at least limited his habit to underage girls, so let it work on the self-proclaimed King of Pop.

atheism, christians, culture, religion, science, willful blindness to absurd extremes

Fish Story

So, Stanley Fish has felt the need to reply to the flood of reader responses to his blog post about religion, as I commented on here. His post begins thus:

According to recent surveys, somewhere between 79 and 92 percent of Americans believe in God. But if the responses to my column on Terry Eagleton’s “Faith, Reason and Revolution” constitute a representative sample, 95 percent of Times readers don’t.

Need I say, kudos to New York Times readers. He is gracious enough to indicate that the responses to his original writing were not just name-calling, but contained actual arguments to back up their objections. I will give credit where it is due: he did not have to acknowledge that his readers are actually intelligent, but he did. From there, however, Fish careens off the rails and into a mud hole of obscurity and postmodern blather.

Any long-time readers of this iteration of Blevkog and of its predecessor will recall that I have somewhat of a… problem with the idea central to postmodern thought: all forms of knowledge and understanding are valid, ergo none are. (I direct you to Francis Wheen’s Idiot Proof for a fuller discussion of the matter – I highly recommend it.)

His main example is a dispute over the authorship of a poem – people, as he puts it, presuppose who the author may be, and see evidence to support it within the text. He extrapolates this to scientific endeavor, with people assuming they will see no evidence of god, so they see none. Conversely, those who accept god will see evidence of his works everywhere. I need not point out to rational readers that this is a classic example of a false analogy, a rationalization built on a faulty premise. The point, to adapt his example, is not to establish which author is responsible for a particular work, but to establish that a human being wrote it, and that it did not spring from supernatural causes. The assumption of one author or another excludes its composition by ghosts or aliens; therefore it is a rational process. The problem arises when you ignore the central tenet of Occam’s razor and add layers of explanation that require additional assumptions. If I assume Shakespeare wrote a sonnet attributed to him, there are rational and generally accepted ways to analyze the text for clues to alternative attribution. I cannot, however, assume that it was written by a poet in the 30th century who travelled back through time and assumed Shakespeare’s identity when composing the work. It is accepting the limits of reasonable, and indeed possible, explanations that makes the poem analogy work as a metaphor for science.

As the whole piece is built on a faulty premise, Fish must struggle to reconcile the weakness of his analogy with his inability to refute his readers:

To bring all this abstraction back to the arguments made by my readers, there is no such thing as “common observation” or simply reporting the facts. To be sure, there is observation and observation can indeed serve to support or challenge hypotheses. But the act of observing can itself only take place within hypotheses (about the way the world is) that cannot be observation’s objects because it is within them that observation and reasoning occur.

While those hypotheses are powerfully shaping of what can be seen, they themselves cannot be seen as long as we are operating within them; and if they do become visible and available for noticing, it will be because other hypotheses have slipped into their place and are now shaping perception, as it were, behind the curtain.

By the same analysis, simple reporting is never simple and common observation is an achievement of history and tradition, not the result of just having eyes. And while there surely are facts, there are no facts (at least not ones we as human beings have access to) that simply declare themselves to the chainless minds Hitchens promises us if we will only cast aside the blinders of religion.

The fact that this makes his argument invalid as well seems to escape him.

History and tradition govern the operation of science as well – methods are tested and developed, theories and hypotheses are tested and re-tested, we learn from the past. What we should not do is allow the superstitions of the past to overwhelm our reason. If, as Fish asserts, religions are as valid a way of knowing the world as any other, are we to accept the resurgence of human sacrifices to assure bountiful harvests, or to make the sun rise as necessary for our survival? If not, are we to assume that some religions do in fact have a premium over others in their interpretations of reality? Who decides? Is Fish actually preaching the supremacy of his own religion over others, and, if so, doesn’t that violate one of the fundamental ideals of religion (or in fact does it just represent the bias inherent in religious thought)?

We assume the relative importance of things around us, that is true, but that reflects these individual parts’ varied utility to us as we proceed through our daily life – everything around us is simply not as important as everything else. While the objects in our homes are probably of greater importance to us than the average, they are still relatively more or less important depending on context – the TV becomes less important as the pipes burst (at least, unless it’s the playoffs – remember, there are priorities). The same applies to facts, particularly about the natural world – they exist, and they do present themselves, but we do have to focus on them and apply the scientific method to discover them in a way that is transparent and rational, and, perhaps most importantly, can be re-discovered by others following the same method. ‘Facts’ in a scientific sense do not come onstage and dance for us unless they pass the auditions, thank you, to borrow from Fish’s artistic metaphor. The assumptions that make science what it is – precision, testing and re-testing, openness to criticism and sharing of knowledge – have nothing in common with the blind faith required by religion; religious ideas fail as knowledge because they require no rigor, only a decision to consciously omit rigor in our thought.

Tradition is, and always will be, an insufficient argument to establish anything – scientific methods have come down to us over time, ’tis true, but only because they have proven to be consistent and reliable. Other traditions, like the aforementioned human sacrifices, or prohibitions against women voting or owning property, simply have no utility in an advanced society, in principle or in practice.

Science and religion do not compare to one another, and they are not compatible with one another, no matter what anyone says – you do either your work or your faith an injustice if you try to have it both ways. Fish tries, and fails, to denigrate rationalism by indicating that religion is immune from the criticism that it relies on faith. The point is meaningless, and ill-considered for a university professor – it is comparable to saying that I must not be a white male because I have never said that I am (I am, and I have said so, repeatedly – but you take my point).

Perhaps the most telling portion of Fish’s post is the last line, which is simultaneously the most arrogant and ignorant comment I have ever seen, made all the worse because he refers to himself:

I refer you to a piece by syndicated columnist Paul Campos, which begins by asking, “Why is Stanley Fish so much smarter than Richard Dawkins?” Darned if I know.

The article would have been more informative if it consisted of just that statement. We would then know clearly what Stanley Fish’s thesis was: I’m smarter than anyone else.


Lighter Things, media, science

What’s In A Name?

The National Aeronautics and Space Administration (NASA) has been holding a contest to name a new module of the International Space Station.

Second place: Serenity.

First place, with 230,539 votes, over 40,000 more than the second place entry:



Congratulations, it’s a space station.

This just made my day.