Monday, September 28, 2015

How to Talk About Ahmed "Clock Boy" Mohamed Without Sounding Like a Fool

When we're talking about people writing for an outfit like Breitbart, coming across as an idiot on this type of matter may be part of the job description, but for the rest of us....

Credit where it’s due to comedian Bill Maher and the HBO show Real Time, which has become one of the few mass media outlets telling the truth about “Clock Boy” Ahmed Mohamed, who was removed from his Irving, Texas school when he brought to the school, unannounced, a device that resembled a bomb.

If Maher is correct, let's skip ahead to his conclusion:

Maher has repeatedly said that he believes Mohamed should not have been arrested.

That's the gist of the controversy -- that Mohamed was arrested. Had he merely been sent to the principal's office, we would never have heard of him.

So is that the story? A young teenager was needlessly arrested for having a clock, with Breitbart's right-wingers joining with Maher to deplore the stupidity of his arrest? Of course not. What the author actually approves of is Maher's claim that the teenager was not as bright as the initial stories suggested, and Maher's anti-Muslim rhetoric. So let's take those issues in order:

Maher used a zinger to shut down the notion that the boy removing the back of a piece of consumer electronics and showing it to people makes Mohamed any kind of inventor, saying “This is like pouring milk on a bowl of Cheerios and claiming you invented cereal.”...

It’s the second time that Maher has featured the Clock Kid as a topic of discussion. On a previous episode of Real Time, billionaire Mark Cuban revealed that when he spoke on the phone with Mohamed and asked him questions, he could hear his sister whispering answers to him.

Tee, hee, hee, Maher really put that fourteen-year-old boy in his place. But here's the problem: the boy wasn't arrested for plagiarism, for cheating on his homework, for patent infringement.... Even if we presuppose that he can't even tie his shoes by himself or speak in coherent sentences, that in no way stands as an excuse for his treatment by the school or by the police. It's irrelevant.

When liberal Ron Reagan, Jr. attempted to claim that the device that Mohamed brought to school didn’t resemble a bomb, Maher quickly defused his argument, urging him, “Try taking that through airport security.”

And, as Reagan correctly pointed out, it would not have been a problem -- because you're actually allowed to take clocks, and electronic components, onto airplanes. The boy would have placed his pencil case on the scanner belt, the person operating the scanner might have flagged the item for further review, and upon further review it would have been determined to be a clock. They might also have swabbed the case to test for any residue of explosives, only to again confirm... clock.

While clocks can be used as timing devices for explosives, clocks are present in many things that people routinely take onto airplanes. Cell phones, computers, tablets, ebook readers, wristwatches, travel alarm clocks.... It's really not alarming -- even if it's an alarm clock -- unless there's some indication that it's actually going to be used in association with an explosive device. And no, looking like your memory of the excessively complicated explosive device created by the prop department of a James Bond movie does not translate into its being anything more than a clock.

On that episode Maher also noted that so many young Muslim men have “blown a lot of shit up around the world.”

The boy's religion has obviously factored into discussion of the case, and appears to be the leading factor in Maher's knee-jerk reaction to the case. It's a factor that is emphasized and amplified in the right-wing media, where you can read conspiratorial tales of how the boy's older siblings or father are activists of one sort or another, and how the whole thing was a deviously clever set-up of the school.

Okay.... so we have a kid who's actually stupid, and does nothing more than disassemble off-the-shelf clocks and put them into his pencil case, but at the same time who is so clever that he tricked the police into arresting him, and tricked the right-wing media machine into engaging in anti-Muslim demagoguery? I think there's a bit of tension between those two positions, but leaving that aside for the moment, proponents of the Muslim angle have a bigger problem:

The school does not report that it suspected that the boy's device was a bomb, real or fake, because of his religion. The school does not suggest that their knowledge of the boy's family played any role in its decision to treat the possession of the clock as a criminal matter, and to call in the police. The police don't claim that they knew the boy's religion, or that they suspected that the clock was something other than a clock because of his religion.

The commonality seems to be that the teacher, school officials and police officers who made the stupid decision to treat this as a criminal matter believed that anything that looks like the timer from a bomb in a James Bond movie has to be a bomb -- be it an actual bomb or a fake bomb. That's not a matter of the boy's being the most brilliant inventor on the planet or dumb as a rock; it's not a matter of the boy being Muslim, Christian, Hindu or atheist. It's a matter of the school administration and police acting foolishly and needlessly arresting a boy for his possession of a clock.

The actual story is this: A boy brought a clock to school, whether disassembled or home-made; a teacher was concerned by its appearance; the school overreacted and brought in the police; and the police overreacted by making an arrest. Once you subtract the anti-Muslim rhetoric, you can move straight to what seems to be obvious even to a Breitbart writer: "Mohamed should not have been arrested". Your choice to inject more into the story may tell us something about you, but it's otherwise irrelevant.

Wednesday, September 23, 2015

One of the Prices of Bigoted Demagoguery

Professional Islamophobe Pam Geller brings her trademark (lack of) insight to the case of Ahmed Mohamed, the boy who was arrested for bringing a homemade clock to school:

"If you ever see a Muslim with a suspicious object, remember the lesson of Ahmed Mohamed: to say something would be 'racism,'" she wrote. "That could end up being the epitaph of America and the free world."

A rational mind might observe that the biggest difference between a homemade clock and a homemade bomb is that the latter involves explosives. Without actual explosives, you have no bomb. Without make-believe explosives, you have no make-believe bomb. But more than that, the reason why the boy's clock looks like a bomb to people like Geller, and the reason any circuit board is going to look like a bomb to the equally addled Frank Gaffney, is that by all appearances everything they know about electronics and bomb-making comes from watching movies.

If we're going to embrace hysterics and suggest that any person in possession of something that looks like it could be used as a trigger device for a bomb should be arrested, whether or not they possess real, fake or imaginary explosives, we can start by arresting any person found in possession of a cellular phone. Meanwhile, if Geller is truly concerned that arresting kids for possessing homemade clocks is going to prevent the arrest of actual criminals, I suggest that she get herself a copy of "The Boy Who Cried Wolf", read it, then take a long look at herself in the mirror.

Sunday, September 20, 2015

Does Jon Snow Become Azor Ahai?

If you're a Game of Thrones fan, you know that one of the subplots involves Melisandre's belief that Stannis Baratheon is the reincarnation of the legendary hero, Azor Ahai. The books emphasize that subplot to a greater extent than the series. With the death of Stannis, it's pretty clear that he's not Azor Ahai, and... who else is left but Jon Snow?

So perhaps in some sort of parallel to Daenerys's survival of her husband's funeral pyre, we're headed toward the very dead Jon Snow, arms crossed on his chest clutching his sword, set ablaze... and coming back alive not as "Jon Snow" per se, but as Azor Ahai.

The evidence for this theory is a combination of the showrunners' statements that Jon Snow is dead -- statements that would turn out to be exceptionally manipulative -- and the fact that Kit Harington has been repeatedly seen around the Game of Thrones set, including being observed participating in a massive battle sequence. Let's be honest: if he's in a massive battle, it's not going to be a flashback.

It's not entirely clear what it would mean for Jon Snow to become Azor Ahai, save for it being difficult to believe that it won't come across as corny. Also, as popular as the show is, I don't think that it's necessary to manipulate the audience to create buzz.

Thursday, August 20, 2015

Trigger Warnings on College Campuses

The Atlantic has published an article that correctly laments the rising use of "trigger warnings" by colleges, but which focuses on an unsupported thesis that there is something materially different about "kids these days" rather than examining why college administrators are sensitive to demands for those warnings. Don't get me wrong -- there are differences between generations, and in some ways today's college students are a lot more sensitive about wrongs than were past generations. For example, today's college students appear to be much more sensitive to and much less tolerant of racism, as compared to past generations.

A couple of decades ago, when I was in law school, my criminal law professor used sexual assault in order to illustrate some of the complexities of the law. A law professor can pose a hypothetical example of a sexual assault in which the victim sincerely believes that a sexual assault occurred, while the accused (reasonably or unreasonably) sincerely believes that everything that occurred was consensual. Such a hypothetical case allows you to examine questions of the degree to which intent should factor into criminal prosecutions, charging decisions and sentencing. Should two equivalent acts receive the same punishment, even though one offender was acting deliberately and the other was clueless? To what degree does deterring the future acts of others justify prosecuting an offender who may not have realized that he was committing a crime? Hypotheticals can also extend into cases in which consent is obtained through fraud or deception, or by a mistake of perception.

Some of the students in my section made strenuous objection to the use of sexual assault -- not to any specific example, but to its being mentioned at all -- within the classroom. The objections did not change the manner in which the class was taught, but did inspire school administrators to meet with the students who raised the objections to try to explain what the professor was hoping to accomplish and to resolve their concerns. At that time I was told by another professor that the atmosphere for discussing sexual assault cases had changed significantly, and that some professors had already stopped using sexual assault hypotheticals in the classroom despite the difficulty of formulating hypotheticals that would as clearly illustrate the legal principles they were trying to teach. The Atlantic article opens with the example of "law students asking... professors at Harvard not to teach rape law", as if it is a new development. It is not.

When I hear of demands for "trigger warnings", my response is not that students are somehow different from back when I was in college. It may be that there has been some shift in the number of students raising objections -- something that I have not seen documented -- but the primary difference appears to be that school administrators have changed how they respond to student objections. Forty years ago, the response probably would have amounted to, "It's college, and you're going to be uncomfortable at times. Get over it." When I was a student the response was more gentle, but with a similar outcome. Today, college administrators do seem much more likely to ask that a professor add "trigger warnings" or change something about a course in order to avoid making the objecting students uncomfortable.

Why would administrators change their approach?

I think the answer lies not with the modern generation of students, but with the modern approach to the funding of college education. When college education was more affordable, and colleges were less dependent upon squeezing every last tuition dollar out of a student, colleges could more easily treat their students as students. As states have chosen to reduce their support for public colleges, colleges have increasingly had to fight for every dollar. Part of that process has involved making life a lot easier, and a lot more comfortable, for students. Part of that process involves catering to a vocal minority that, if not appeased, is likely to create negative publicity for a college or take its tuition dollars elsewhere.

The authors write,
The press has typically described these developments as a resurgence of political correctness. That’s partly right, although there are important differences between what’s happening now and what happened in the 1980s and ’90s. That movement sought to restrict speech (specifically hate speech aimed at marginalized groups), but it also challenged the literary, philosophical, and historical canon, seeking to widen it by including more-diverse perspectives. The current movement is largely about emotional well-being.
I think, here, that the authors are looking at two different forms of response by college administrators. There was no ambiguity in the classroom instruction -- sexual assault was presented as a bad thing, empathy was extended to the victim even when the hypothetical posited an offender who did not realize that he had committed a crime, and nobody questioned why "'No' means 'no'" was sensible policy. The students who objected to any reference to sexual assault in our criminal law class were not interested in widening the discussion or including additional viewpoints. Had our professor been less sensitive, perhaps an administrator would have advised him as to how to introduce the subject in a more sensitive manner, but the key difference seems to be that the administrators were not receptive to demands that material be removed from classroom instruction on the basis that it created discomfort for some of the students.

While the authors of the article describe the efforts in our society to make life safer for children, I think that they overstate their conclusions. They write, "children born after 1980 — the Millennials — got a consistent message from adults: life is dangerous, but adults will do everything in their power to protect you from harm, not just from strangers but from one another as well." I disagree that the underlying message is "Life is dangerous" but, more than that, it's difficult to see that as a distinction from prior generations. Do the authors believe that in past generations, parents were teaching kids that "life is safe", even when that was patently untrue? Did the generation that grew up a century ago -- spending their early childhood in an age before polio vaccination, when infant mortality was common, born around the time of World War I, living through the Great Depression, and seeing the country go through World War II (perhaps fighting in that war) -- grow up under the impression that life was safe? I somehow doubt it.

Perhaps the difference is that parents of past generations of kids told them, "Life is dangerous, but you're on your own, kid -- you can't count on me to keep you safe"? No, that doesn't seem plausible, either.

I think the authors stray off the rails when they suggest that political partisanship may be a significant contributing factor, based on "survey data going back to the 1970s". The authors are certainly aware that past generations have also experienced times of fierce political partisanship. Our country even once had a civil war. I also have little sense that students arriving on campus are more politically aware than those of past generations. So when the authors suggest that "students arriving on campus today might be more desirous of protection and more hostile toward ideological opponents than in generations past", it seems fair to point out that they're engaged in conjecture, and that they haven't adequately supported a theory that even they phrase as conjecture. Similarly, when describing the advent of social media, the authors write,
These first true “social-media natives” may be different from members of previous generations in how they go about sharing their moral judgments and supporting one another in moral campaigns and conflicts.
Or... they may not be any different.

The authors note that faculty members may be concerned about being attacked on social media, something that might explain why faculty members and administrators are perhaps hypersensitive to certain student complaints, but that's not an observation about the students.

The authors state,
We do not mean to imply simple causation, but rates of mental illness in young adults have been rising, both on campus and off, in recent decades. Some portion of the increase is surely due to better diagnosis and greater willingness to seek help, but most experts seem to agree that some portion of the trend is real.
If they're not suggesting "simple causation", what form of causation do they in fact intend to imply? What changes in diagnosis rates have occurred, and for which mental illnesses? What portion of the increase for any given mental illness do the authors believe might be associated with calls for "trigger warnings"? When they speak of "experts", what are the qualifications of the experts whose views they have examined and deemed relevant? What does it mean to "seem to agree", as opposed to expressing actual agreement? It's really easy to make a nebulous, speculative assertion in support of an argument, but if you want it to carry weight you have to provide some amount of substance.

The authors suggest that changes in the interpretation of federal civil rights law that occurred in 2013 might play a role in the changes, but that argument seems weak on a number of fronts. First, the trends they are describing started long before 2013. Second, the average student is completely unaware of those changes. Third, to the extent that administrators are responding by being hypersensitive not only to situations implicated by the actual changes, but to situations well beyond their scope, that suggests a problem with the administrators and not the students. To the extent that interpretations of the law by federal agencies are making it more difficult to teach effectively on campus, those agencies should reconsider or clarify their interpretations, but it's not the fault of either students or federal agencies if college administrators impose policies that extend far beyond what the law requires.

The article continues into what I see as some rather odd armchair psychology, such as the suggestion that triggering material could benefit sensitive students by desensitizing them to a trauma. Even if we assume that the people demanding trigger warnings are doing so because of their own sensitivities, as opposed to those of others, there is a difference between a sensitivity and a phobia -- and no psychologist in his right mind is going to suggest that random encounters with material that triggers a phobic reaction or PTSD are a proper substitute for a professionally conducted process intended to desensitize an individual. In fact, you're apt to learn that such an approach could worsen the phobia.

They share a couple of examples of "catastrophizing" that are about college administrators:
Catastrophizing rhetoric about physical danger is employed by campus administrators more commonly than you might think—sometimes, it seems, with cynical ends in mind. For instance, last year administrators at Bergen Community College, in New Jersey, suspended Francis Schmidt, a professor, after he posted a picture of his daughter on his Google+ account. The photo showed her in a yoga pose, wearing a T-shirt that read I will take what is mine with fire & blood, a quote from the HBO show Game of Thrones. Schmidt had filed a grievance against the school about two months earlier after being passed over for a sabbatical. The quote was interpreted as a threat by a campus administrator, who received a notification after Schmidt posted the picture; it had been sent, automatically, to a whole group of contacts. According to Schmidt, a Bergen security official present at a subsequent meeting between administrators and Schmidt thought the word fire could refer to AK-47s.

Then there is the eight-year legal saga at Valdosta State University, in Georgia, where a student was expelled for protesting the construction of a parking garage by posting an allegedly “threatening” collage on Facebook. The coll[e]ge described the proposed structure as a “memorial” parking garage—a joke referring to a claim by the university president that the garage would be part of his legacy. The president interpreted the collage as a threat against his life.
I don't think that the dean who complained about Francis Schmidt was identified, but I doubt that the dean was a Millennial. The president of Valdosta State University appears to have been in his late sixties at the time of the incident, so it's safe to say that he's not a Millennial either. Those actions may not represent the best of what we hope to impart to college students, but they appear to support the argument that the primary source of the problem lies with college administrators as opposed to students.

The article also references an incident in which an instructor's joke was misinterpreted by a student and reported as a threat, resulting in the instructor's suspension. It is no surprise that college administrators are sensitive to potential violence on campus, and it does not appear to be in question that the response in that case was an overreaction. It's difficult to see how the incident has anything to do with students being too sensitive, as opposed to there being an understandable concern (even if magnified by the spotlight effect) about violence on campus.

My preferred approach would not be seen as very sensitive to those who are concerned about potential triggering, or who are inclined to reinvent innocuous statements as "microaggressions": I would like to see colleges instruct incoming freshmen that they can expect their ideas to be challenged as they come into contact with other students, professors, and materials that reflect opinions and perspectives different from their own, and that the risk of being at times upset or offended is inherent to the learning process. I would trust professors who were planning to use particularly disturbing or potentially offensive materials to warn their students; for the most part it's difficult to think of scenarios where the nature of the material that will be presented in a course is not foreshadowed by the subject matter. Students who are hypersensitive would be free to explore their options within that framework, or to consider other colleges. The small but influential population of students who want not only to protect themselves, but also to protect others from any form of actual or potential offense, would be on notice that the administration will focus on incidents of actual bias and discrimination, but won't be tying itself in knots trying to ensure that nobody is ever offended.

Will that happen? Perhaps at a college that is not dependent upon accepting most or all of the students who apply. But as long as colleges are in the position of having to treat students like customers, I don't expect that this sort of trend will reverse itself -- and I find myself in full agreement with the authors' underlying position that this trend isn't good for anybody.

Tuesday, August 11, 2015

The Other Side of the So-Called Trophy Culture

A recent Real Sports broadcast, summarized here, declared that our nation has a problem with a "trophy culture" that coddles kids by handing out trophies for merely showing up for games, or perhaps even simply for signing up for a team whether or not they ever come to a practice. At the conclusion, Bryant Gumbel said that he didn't see it as a big deal, only to be told by the correspondent that it was somehow emblematic of a larger social problem. I find myself far more sympathetic to Gumbel's position.

If you've ever seen very young children play a team sport, like soccer, you might be reminded of kittens chasing a string. No matter how many times you tell them to hold to their positions, most and sometimes all of the kids will simply chase the ball no matter where it is on the field. At that age you will find some kids who are skilled beyond their years, but for the most part I don't see much benefit in treating a game as if it's a meaningful contest. Let the kids have fun, don't worry about the score, and focus on building their interest and skills for future years.

As kids get a bit older, and they divide into groups of better-skilled and lesser-skilled players, a new question arises: Are you going to give every child some time on the field, or are you going to instead focus on getting the win? When the focus is on the win, stories of bad conduct by adults abound -- but don't take it from me; here's Real Sports on the issue:

Real Sports found a person by the name of Ashley Merryman to play the part of the scold against trophies. My first reaction to her statements was, "Kids aren't that stupid" -- or, to put it another way, kids know the difference between getting a trophy for participation and getting a trophy for their performance. I wondered whether anybody had researched the issue, and discovered that somebody had -- as explained by none other than Ashley Merryman herself:
By age 4 or 5, children aren’t fooled by all the trophies. They are surprisingly accurate in identifying who excels and who struggles. Those who are outperformed know it and give up, while those who do well feel cheated when they aren’t recognized for their accomplishments. They, too, may give up.
So the issue really isn't that kids are fooled into thinking that their performance exceeds their actual skill set by getting participation trophies -- they know the difference. Having undermined her own argument, Merryman tries to turn it around by arguing that top performers might "give up" if they see other kids get meaningless trophies -- to which I respond, "That's nonsense". When I look at youth sports these days, I see kids performing at the highest levels that I've ever seen, far beyond the performance of the same age cohort when I was a kid. What evidence does Merryman offer to convince me not to believe my lying eyes? That would be... nothing.

At this point the case against participation trophies would seem to be that, past the age of five or six, the kids see them for what they are, and it costs a lot of money to hand out hundreds or thousands of meaningless trophies. Merryman can see that, but just can't stop herself from catastrophizing: "We have to stop letting the Trophy-Industrial Complex run our children’s lives." That would be an argument that's supported by the facts, while extrapolating that participation trophies are emblematic of the ruination of our society, the decline and fall of our civilization... not so much.

Saturday, July 04, 2015

Sorry, Game of Thrones Fans: Jon Snow is Dead

Was that clear enough? Let me say it again, just to be clear: Jon Snow is dead.

Please, no more need for recycling the same old theories about why he is alive. That ground has been thoroughly covered. That horse is as dead as Jon Snow.

Of course there actually are some decent arguments for why Jon Snow might somehow be resurrected. That possibility seemed quite plausible in the books, with Melisandre's presence at the Wall providing a means of bringing Snow back from the dead -- and when Melisandre made her retreat back to the Wall as Stannis's army fell, it seemed fair to ask why else she would be there but to save Jon Snow. The books also resurrect (book spoiler warning) Catelyn Stark. The books introduce a character, Coldhands, who seems to be a different form of living dead from the other creatures found north of the Wall.

In both the show and books Gregor Clegane, The Mountain, is brought back as some sort of zombie. Beric Dondarrion is resurrected several times before (in the books) passing his power on to Catelyn Stark. There would have also been the possibility of glamouring, as we know that Melisandre can make one character appear to be another. The books also emphasize wargs, with some theorizing that Jon Snow could have warged into his direwolf before his body died (never mind that there's no obvious way back to human form, and that we've been told that remaining too long in an animal host will cause you to lose your human character).

Also, it seems like an incredible waste of a very interesting character to kill Jon Snow at this point in the story. The relationship you form with a television character is different from the one you form with a character on the printed page, and the loss of Snow seems all the more stark (no pun intended) in the series. Why spend all of that time and energy building him into an interesting character only to kill him off, with nobody else of similar charisma to fill his shoes? Why drop conspicuous hints about his "real" parentage if he plays no role in the end game? From a literary standpoint, why kill off principal point-of-view characters who could play a central role in the conclusion, while keeping alive others who can't possibly be on the winning team? Will people, readers or viewers, care about events at the Wall without Jon Snow?

At the same time there's one consistency to Martin's writing: If you're a noble, virtuous character who puts the well-being of others ahead of your own, you end up dead. Do you really think that Snow's body will be hidden and frozen for a year, so that he can be resurrected in Season 7? (Jon Snow on Ice.... Is this a Disney production?) From the standpoint of good storytelling, as questionable as it may be to kill Jon Snow at this point in the narrative, a secret, frozen Jonsicle would seem to be worse.

I suspect that the future story anticipates that readers won't care about the Wall post-Jon Snow, and thus won't be surprised or offended when the army of the dead defeats the remaining members of the Night's Watch (and any Wildlings who might have inexplicably continued to support their efforts in the wake of Snow's death), creating the context for a major conflict between the advancing White Walkers and the dragons of Daenerys. (That story line would seem a bit... predictable. We'll see.) Also, particularly if Jon Snow has the royal blood that many plausibly believe used to run through his veins, we can expect Melisandre to have some interesting visions in the flames of his funeral pyre.

The primary evidence for Jon Snow's death comes from the show and (if it needs to be said) not the as-of-yet unfinished, unavailable sixth book. The show did not renew Kit Harington's contract for Season 6. It's one thing to give a character like Bran a year off, without much concern for whether the part will have to be recast in a future season -- children and adolescents change a lot, and few would be surprised to discover that Bran looks different after a year of communing with ancient trees; he'll look different even if they keep the same actor.

While daytime soap operas of old used to switch out adult characters without much concern for appearance, and Game of Thrones has done the same with at least one small part (Gregor Clegane) and with a smaller role before it became larger (Tommen Baratheon), I don't think that audiences will accept a different actor as Jon Snow. He could come back among the undead, perhaps as a hooded Coldhands-type character, but he would both be dead and be a very different character -- what would be the point? And if the show wanted to be sure that Kit Harington would be available to play Jon Snow in Season 7 or beyond, then given the difficult production schedule it is highly unlikely that they would risk letting him take other jobs that would prevent his participation.

The show understandably shares Martin's affinity for killing off characters -- but sometimes the reasons for a death seem different, such as to simplify story lines and perhaps to control the show's budget. The death of Ser Barristan Selmy (a character whose story line died a slow on-screen death even before the showrunners made it final), the death of Mance Rayder with no glamouring, moving up the death of Shireen Baratheon and the defeat of Stannis Baratheon, not bringing back Catelyn Stark, killing Myrcella Baratheon (who at this point in the books was only missing an ear)....

The showrunners could easily have carried Jon Snow's character forward into season six, perhaps drawing on the chapters dealing with his management of the Wildlings or delaying the death of Stannis Baratheon while having Jon Snow announce the rescue mission that precipitated his death on the printed page. But they instead chose to bump Snow off during the last episode of the season, saving themselves a big chunk of change to apply to hiring other actors, building sets and producing special effects.

If in light of all of that you still think Snow is coming back, what do you think the dialogue would be upon his return in season seven? "Hey, folks, I'm back. Did I miss anything? Ooooh... are those dragons?"

So stop watching the length of Kit Harington's hair -- it's going to get longer and shorter over the coming year because (assuming an occasional haircut) that's what hair generally does. Stop arguing that if you turn your head sideways and squint, you can see proof of Snow's survival because his eyes almost imperceptibly change color during his death scene. "We shall never see his like again, and now his watch is ended."

Wednesday, May 20, 2015

The Reynolds "Charity" Empire in Decline

A few years ago I wrote a post entitled, "Is The Breast Cancer Society a Worthy Charity", to which the answer was "No". The comments to that thread are extensive, and include a defense of the organization from Kristina Hixson, which avoided answering any of the tough questions or giving an honest explanation of the organization's operations. She went so far as to post a series of fake endorsements to the thread, trying to bury valid criticism behind fictitious praise.

Oh yes, and she went on to marry the man who ran that "charity", James T. Reynolds II.

Over time, the Reynolds family of "charities" started to receive press scrutiny. The Tampa Bay Times published an article, "Intricate family connections bind several of America's worst charities". It opens,
Carol Smith still gets angry when she remembers the box that arrived by mail for her dying husband.

Cancer Fund of America sent it when he was diagnosed with lung cancer six years ago.

Smith had called the charity for help. "It was filled with paper plates, cups, napkins and kids' toys," the 67-year-old Knoxville, Tenn., resident said.

"My husband looked like somebody slapped him in the face. "I just threw it in the trash."
The story continues,
In the past three years alone, Cancer Fund and its associated charities raised $110 million. The charities paid more than $75 million of that to solicitors. Cancer Fund ranks second on the Times/CIR list of America's worst charities. (Florida's Kids Wish Network placed first.)

Salaries in 2011 topped $8 million — 13 times more than patients received in cash. Nearly $1 million went to Reynolds family members.

The network's programs are overstated at best. Some have been fabricated.
The Federal Government has finally managed to partially shut down the Reynolds empire:
In reality, officials say, millions of dollars raised by four “sham charities” [Cancer Fund of America, Cancer Support Services, Children’s Cancer Fund of America and the Breast Cancer Society] lined the pockets of the groups’ founders and their family members, paying for cars, luxury cruises, and all-expense paid trips to Disney World for charity board members.

The 148-page fraud lawsuit accuses the charities of ripping off donors nationwide to the tune of $187 million from 2008 to 2012 in a scheme one federal official called “egregious” and “appalling.”...

Among the allegations is that [Reynolds' ex-wife, Rose] Perkins gave 10% across-the-board bonuses twice a year to employees [of the Children’s Cancer Fund of America], regardless of performance, and was allowed to set her own salary and bonuses up to a limit without the approval of board members. In 2010, when donations to the Breast Cancer Society were declining, Reynolds II’s salary ballooned from $257,642 to $370,951, according to the complaint.
What can a grifter do, but grift? Even having been shut down, the Breast Cancer Society promises to come back to leech off the good intentions of people who want to help cancer survivors:
The silver lining in all of this is that the organization has the ability to continue operating our most valued and popular program, the Hope Supply. Our Board will work tirelessly to maintain the Hope Supply program services that have benefitted our many patients for years – initially under the TBCS banner as it transitions under a different organization – all with the goal of seamlessly providing services to you. I take solace in the fact that this wonderful program has the chance to continue operating.
There is a note of honesty, "I have loved leading TBCS...." Why wouldn't James love working in a job that paid him royally for performing little work, despite his indifference to the needs of the people his charity was supposed to help? It's a gravy train he's eager to re-board, so watch out for his next "charity", coming soon to a list of the nation's worst charities near you.

If you want a good measure of James Reynolds II's character, watch him on video.

Tuesday, February 10, 2015

Addressing the Causes of Substance Abuse vs. Addiction

I saw Johann Hari interviewed on Real Time the other day, and what he essentially offered during the interview was a version of the essay published here, in which he argues that the real cause of addiction is the addict's environment, not the nature of the addictive substance itself:
This gives us an insight that goes much deeper than the need to understand addicts. Professor Peter Cohen argues that human beings have a deep need to bond and form connections. It's how we get our satisfaction. If we can't connect with each other, we will connect with anything we can find -- the whirr of a roulette wheel or the prick of a syringe. He says we should stop talking about 'addiction' altogether, and instead call it 'bonding.' A heroin addict has bonded with heroin because she couldn't bond as fully with anything else.

So the opposite of addiction is not sobriety. It is human connection.
Before I get into the obvious faults of Hari's theory, there is some merit to his position within the larger realm of substance abuse. Many people go through periods of their life in which they rely too heavily upon alcohol, or engage in the recreational use of illicit substances or prescription medications, perhaps to the point that their lives seem to be coming apart at the seams, but are subsequently able to scale back or stop that behavior on their own. Their substance abuse may be largely situational, and when the situation changes so does the appeal of drugs or alcohol.

The problem that Hari's theory does not address is why certain individuals are not able to stop using drugs or alcohol without -- and sometimes even with -- significant intervention. Why, if it's human connection that matters, do some individuals continue to use drugs even as their actions alienate every single person who is trying to connect with them or help them? Hari's theory might explain in part how dealing with addiction can seem like a game of whack-a-mole -- how the successful cessation of the use of one substance, such as alcohol, might be associated with the onset of the use of a different substance or a behavioral disorder. But his theory does not explain why addicts have different drugs of choice, or why rates of successful recovery can differ dramatically between substances.

Hari brings up behavioral addictions,
It was explained to me -- you can become addicted to gambling, and nobody thinks you inject a pack of cards into your veins. You can have all the addiction, and none of the chemical hooks. I went to a Gamblers' Anonymous meeting in Las Vegas (with the permission of everyone present, who knew I was there to observe) and they were as plainly addicted as the cocaine and heroin addicts I have known in my life. Yet there are no chemical hooks on a craps table.
Except, of course, there are. People do get a biochemical reward from gambling. Were that not the case, people would get nothing out of gambling -- there would be no thrill, just boredom associated with an overall loss of money -- and gambling would have no appeal. As it turns out, there is evidence "that the opioid systems in the brains of pathological gamblers may be different, affecting their control, motivation, emotion, and responses to pain and stress."

Problem gamblers appear to have an issue that is similar to that of some problem drinkers: "it seems that pathological gamblers just don't get the same feeling of euphoria as do healthy volunteers". As counter-intuitive as it may seem at first blush, a rapid response to intoxicants is an evolutionary defense against over-consumption. Broadly speaking, when you need to consume more of a substance to get the same thrill, you are at increased risk of addiction.

Hari engages in the dangerous practice of predicating his entire theory on a study of rats. Rats, he tells us, will deal with isolation and boredom by using drugs, but when given many exciting alternatives to drug use they largely choose life's other pleasures over drugs. While, yes, that does suggest that environment can affect rates of drug use, it tells us nothing about why two people who enjoy pretty much the same environment can have extremely different levels of interest in intoxication.

If you attend open AA meetings, those that welcome all members of the public, you will likely soon hear an addict describe his or her first experience with alcohol or drugs. You will very likely hear many speak of their extreme euphoria, their eagerness to repeat the experience, the steps they took to increase their access to their drug of choice and their frequency of use. While Hari would have us believe that in each case there was something missing in their lives -- some level of connection with others -- and some of those accounts do suggest such a lack of connection, his argument nonetheless hits a stumbling block: Why do other people with similar or worse environments or levels of isolation try the same substance yet avoid a similar outcome? From another angle,
Time magazine reported using heroin was "as common as chewing gum" among U.S. soldiers [during the Vietnam War], and there is solid evidence to back this up: some 20 percent of U.S. soldiers had become addicted to heroin there, according to a study published in the Archives of General Psychiatry. Many people were understandably terrified; they believed a huge number of addicts were about to head home when the war ended.

But in fact some 95 percent of the addicted soldiers -- according to the same study -- simply stopped. Very few had rehab. They shifted from a terrifying cage back to a pleasant one, so didn't want the drug any more.
The thing is, every single veteran had a new, much more pleasant post-war "cage" -- so why did 5% remain heroin-addicted? Similarly,
If you get run over today and you break your hip, you will probably be given diamorphine, the medical name for heroin. In the hospital around you, there will be plenty of people also given heroin for long periods, for pain relief. The heroin you will get from the doctor will have a much higher purity and potency than the heroin being used by street-addicts, who have to buy from criminals who adulterate it. So if the old theory of addiction is right -- it's the drugs that cause it; they make your body need them -- then it's obvious what should happen. Loads of people should leave the hospital and try to score smack on the streets to meet their habit.

But here's the strange thing: It virtually never happens.
If by that Hari means that most people who are administered powerful opiates during hospitalization don't subsequently become heroin addicts, he's correct. But if he means to suggest that large numbers of addicts don't have their addictions start with their taking properly prescribed pain medications, he's wrong. Most patients will come out of surgery, deal with their inadequate post-hospitalization pain control, recover, and go on with their normal lives. Some will suffer a bit more during their recovery but again go on with their normal lives. Some will actively drug-seek, displaying behaviors consistent with substance abuse and addiction.

If Hari's theory were accurate, we should be able to easily define who is likely to become addicted and who is not. We could simply perform a survey of that person's life, their connections, their stressors and the like, and that should give us an excellent idea of who is likely to have a substance abuse problem and who is not. The problem is, you cannot predict substance abuse or addiction in that manner. You may find overall trends and risk factors, such as a family history of substance abuse, a childhood pain condition that was not properly managed, a history of being the victim of child abuse, and the like. Yes, some predictors do suggest a behavioral component to addiction -- which is what you would expect from something that is in large part a behavioral health problem. But other predictors are not behavioral. Why should it be a risk factor to you if relatives who you have never met, or who were never in a position to model addictive behavior to you, had substance abuse problems?

It's important to recall, also, that not everybody has the same reaction to the same substance. Alcohol triggers different physiological reactions in different people. Some people have little ability to metabolize alcohol, and within their communities rates of alcoholism are very high. Some people flush upon consumption of alcohol. Some become nauseous. Some quickly become tipsy, even with modest alcohol consumption. Others can consume large quantities of alcohol without displaying strong signs of intoxication. Similar things can be said of opiates -- if your reaction to opiates includes feeling itchy all over your body, feeling nauseous, experiencing severe constipation, or feeling confused and anxious, the odds are much lower that you're going to want to repeat the experience than if your principal memory is of euphoria.

These differences in reaction are biochemical, not behavioral. It reasonably follows that some of the differences in why people become addicted to drugs or alcohol, why people prefer one substance over another, and why some people have much greater difficulty establishing and maintaining sobriety, are biochemical. Yes, you may need to address psychological and environmental issues in order to help the addict achieve a stable recovery, but simply changing the addict's environment will not cure the addiction.

Hari suggests that the history of nicotine patches supports his theory,
Everyone agrees cigarette smoking is one of the most addictive processes around. The chemical hooks in tobacco come from a drug inside it called nicotine. So when nicotine patches were developed in the early 1990s, there was a huge surge of optimism -- cigarette smokers could get all of their chemical hooks, without the other filthy (and deadly) effects of cigarette smoking. They would be freed.

But the Office of the Surgeon General has found that just 17.7 percent of cigarette smokers are able to stop using nicotine patches. That's not nothing. If the chemicals drive 17.7 percent of addiction, as this shows, that's still millions of lives ruined globally. But what it reveals again is that the story we have been taught about The Cause of Addiction lying with chemical hooks is, in fact, real, but only a minor part of a much bigger picture.
Hari makes three fundamental mistakes in his comparison. First, he presupposes that the use of a nicotine patch is evidence that a smoker wants to quit. In fact, many smokers who attempt to quit are doing so not because they want to do so, but because they are under social pressure to stop smoking. Some people are afraid to quit smoking, for example because they fear weight gain. Second, he presupposes that establishing a baseline level of nicotine will remove any biochemical incentive for a smoker to smoke. The steady baseline certainly can help control cravings, but it is not going to provide the spike of nicotine exposure to which a smoker is accustomed. Hari is apparently referring to Treating Tobacco Use and Dependence, U.S. Department of Health and Human Services, June 2000, summarized here on page 491. Third, the abstinence rate for the study was premised upon six months of abstinence, so we're not merely talking about how well smokers abstained during their twelve weeks on nicotine patches, but during a period of months after they stopped using the patch. It's interesting to see that a nicotine nasal spray resulted in a 30.5% abstention rate over the same period, as did bupropion -- a medication that delivers no nicotine at all. If biochemistry weren't a big part of the story, the results should have been the same no matter whether the smoker received a placebo, a particular administration of nicotine, or bupropion.

Fundamentally, as with any addiction, no treatment program or assistive medication is going to work over the long-run unless the addict wants to stop using his drug of choice. Medications and treatment can provide a window of opportunity during which the addict can establish a period of abstinence and have an opportunity to consider a future both with and without his substance of choice, but unless the addict is sufficiently motivated to stop the addict will relapse. For that matter, many addicts who truly want to stop will still have problems with relapse, whether due to a momentary lapse in judgment, the strength of their cravings, or a combination of factors.

At the end of the day, yes, it makes sense for a recovering addict to improve his environment -- to address factors, internal and external, that contribute to addiction and could contribute to relapse. But to ignore the biochemical side of addiction -- the predispositions that some people have to the use and abuse of certain chemical substances, and the difficulty that addicts of all backgrounds experience when trying to establish and maintain sobriety -- by suggesting that it could all be fixed with warm feelings, love songs and group hugs is to turn a blind eye to the leading factors in addiction.
Loving an addict is really hard. When I looked at the addicts I love, it was always tempting to follow the tough love advice doled out by reality shows like Intervention -- tell the addict to shape up, or cut them off. Their message is that an addict who won't stop should be shunned. It's the logic of the drug war, imported into our private lives. But in fact, I learned, that will only deepen their addiction -- and you may lose them altogether. I came home determined to tie the addicts in my life closer to me than ever -- to let them know I love them unconditionally, whether they stop, or whether they can't.
I'm not one to point to a show like Intervention and argue that it's a model for addiction treatment. The purpose of an intervention is to inspire an unwilling drug-addicted person to go into residential treatment. Contrary to what Hari suggests, the message is not (or at least should not be) that "an addict who won't stop should be shunned" but is instead that the family has the right to draw boundaries and to state that, if the addict chooses to continue down the road to ruin, they will have to limit their role in the addict's life in order to protect themselves and their own mental health. Sometimes it takes a dose of that sort of reality to get the addict to go into treatment. Sure, others will reject the attempted intervention, but it's facile to suggest that it is a failed intervention that causes addicts to "deepen their addiction" -- addiction is a progressive disease and thus, absent some limiting factor, gets worse over time. Many addicts describe the fear of loss of family, the embarrassment of an arrest or jail sentence, and the like as the very thing that inspired them to finally work toward recovery.

What Hari describes as his ultimate take-away, "to let [the addicts in my life] know I love them unconditionally, whether they stop, or whether they can't", is a basic teaching of programs like Al-Anon, under the name of "detachment with love". Hari may not like some of the implications of that approach -- the idea of telling an addict who calls you hysterically in the middle of the night, saying that he was picked up by the police and needs to be bailed out, that he'll have to wait until morning, or that he'll have to face the natural consequence of his decisions and find a way to bail himself out -- but allowing an addict to face those natural consequences is not an indication that you don't love them. It's a means of protecting yourself, of avoiding the anger and resentment that get in the way of love, and of allowing them to experience the negative consequences that they bring upon themselves such that they might decide that it's finally time to give sobriety an honest chance -- whether through inpatient treatment, an intensive outpatient program (IOP), counseling or peer support, with or without assistive medication. When the addict reaches the point of wanting to recover, you can start implementing the structure and changes that Hari correctly associates with improving the chances of long-term sobriety. But no, when you're dealing with populations of addicts, you cannot simply work to improve their emotional environment and expect it to be a miracle cure.

Wednesday, February 04, 2015

Evangelical Christianity, Homosexuality and the Deeply Flawed "Tale of Two Bobs"

The other day I came across a blog post by Rod Dreher, in which he embraces a parable somebody wrote a couple of years ago about two neighbors, both named Bob, who get along even though the author assumes that they're not supposed to.

The opening of the parable could be this,
There once were two neighbors, both named Bob. One is a neo-Nazi, the other is Jewish. They've lived next to one another in a duplex for several years, and have been good neighbors: getting one another's mail when the other travels, hauling each other's garbage cans to and from the curb, and have occasionally had a cookout together. They are friends, but they've never really had a discussion about their differences.
Or this,
There once were two neighbors, both named Bob. One is a KKK member, the other is in an interracial marriage. They've lived next to one another in a duplex for several years, and have been good neighbors: getting one another's mail when the other travels, hauling each other's garbage cans to and from the curb, and have occasionally had a cookout together. They are friends, but they've never really had a discussion about their differences.
Or this,
There once were two neighbors, both named Bob. One is an evangelical Christian, the other is gay and agnostic. They've lived next to one another in a duplex for several years, and have been good neighbors: getting one another's mail when the other travels, hauling each other's garbage cans to and from the curb, and have occasionally had a cookout together. They are friends, but they've never really had a discussion about their differences.


The narrative continues,
One day, during March Madness, a stiff gust of wind knocked a tree limb into their power lines, and they found themselves without electricity, five minutes before the U of L game. They wandered out onto their respective porches and decided to go to a nearby pizzeria to watch the game.

Somewhere before the end of the game, this conversation began:
Bob 1: Isn’t it surprising that we've become friends?

Bob 2: What do you mean?

1: Well, one of us has a [swastika / KKK emblem / rainbow sticker], and the other has a [Magen David / pro-diversity sticker / fish emblem]. According to most folks, we shouldn't get along.

2: Yeah, I'll admit it's crossed my mind once or twice. Does it bother you?

1: Does what bother me?

2: Well, that I am who I am?

1: Hmmm… I don't know how to answer that. Does it bother you that I am the way that I am?
The narrative continues,
Bob 2 scratches his chin, waits a moment.
2: I suppose there are two answers to that question. One is no, not at all. We've been good friends. You took my dog to the vet when it got into a fight with a possum. You share my hatred of the University of Kentucky. What's not to like? On the other hand, I think you've committed your life to something that's toxic to our culture, and to yourself, and I wish for your sake, my sake, and the world's that you believed something different. So no. And also, I worry about you.
Bob 1 leans back a little, grinning.
2: Did I offend you?

1: No, not at all. In fact, I would probably give the same answer about you, though I'd phrase it a little differently.

2: How so?

1: Well first of all, I’d talk about your barbecue skills, and I’d admit that I like your smelly dog. Second, I’d say that I think who you are and who I am is more complex than beliefs and commitments… but I think that's true for myself too.

2: You don't think you chose to be that way?

1: Did you?

2: I guess I did and I didn't. Or maybe, I didn’t then I did. It was something I didn’t want, but eventually I had to admit it.

1: I guess I didn't and then I did.

2: That's a better way of putting it.

1: For both of us.

2: For both of us.

1: So all this simmers in the background while we see one another, day by day.

2: Yep.

1: But we just keep on being neighbors and sharing the occasional pizza.

2: Yep. Breathing the same air, trying to figure out how to get along.
The game got heated for a few moments and they drifted away from the conversation. Soon, it started up again.
1: Let me ask you something.

2: Shoot.

1: You're saying that you didn't choose to be the way you are, but then you did.

2: Yeah. It was a journey. I didn't want to believe it, but eventually, it became undeniable, and I had to accept it inwardly, and then I had to accept it outwardly.

1: How did your family react?

2: Well, they're more sympathetic to you than me… It wasn't easy. It still isn't. I get snarky comments occasionally, especially during election seasons.

1: Oh yeah… the worst.

2: The worst. Let me ask you something now.

1: Okay.

2: Has it caused trouble for you? Like, at work or anything?

1: Well, sometimes. Some folks just think it's awful, and you have to win them over by just being an ordinary person.

2: Because they think you're a monster?

1: Because they think you're a monster.

2: That's familiar.

1: Yep.
The game ends, the two walk back home, and their friendship resumes. Conversations return to this topic, and both try to convince the other of their errors… But thus far, not much has changed. They remain good friends and good neighbors.
The author argues,
This parable is meant to do two things. First, it’s sort of a Rorschach test. Which of the Bobs is a Christian, and which one is gay? In a culture that remains hostile to the LGBT community at one end of the spectrum, and at the other end, hostile to Christians who hold traditional beliefs, we will find folks like both Bobs: their social experiences are almost interchangeable.
Even within the context of "Which of the Bobs is a Christian, and which one is gay", the exchange is strange and contrived. When you recognize that, perhaps with a slight adjustment for time and place, the exchange fits just as easily into contexts in which one person's views would be unacceptable by broadly held contemporary standards, the parable falls apart as a highly strained false equivalence. There is a difference between disliking somebody because of their beliefs, particularly when those beliefs cast you as destined for Hell or inherently inferior, and disliking somebody over an aspect of their being that they cannot change -- such as their heritage, or their (or their spouse's, or their children's) skin color.

If you want to reduce it to a parable about mutual acceptance, to make it a song and dance number for a Rodgers & Hammerstein musical, you don't need to bring religion or status into the discussion. The farmer and the cowman can be friends. You say tomato, and I say to-mah-to. You say goodbye, and I say hello. The exchange actually works better if you treat the disagreement as being over a triviality. Consider Dr. Seuss's story of the star-bellied Sneetches, creatures identical in all respects save for the presence of stars on their bellies, who come to realize the absurdity of using that distinction as the basis of a claim of superiority. That form of the narrative can still serve as an analogy for much more serious, real-world bigotry and discrimination, but without the need for a false analogy.

Secondly, I think this conversation is very real and true to life. It’s a conversation that I’ve had in one form or another with many friends over the years. I’ve also had conversations that were much less friendly. But the context here is, I think, the key: being neighborly, being a friend, creates space for conversations that are hard. And while that probably won’t resolve the growing public tension over these issues, it might help us to live at peace with our neighbors, and that is, in some ways, far more important.
Except the conversation is not real and is not true to life. I'm not going to rule out the possibility, for example, that a member of the Westboro Baptist Church gets along with his gay neighbor, but this is not the conversation such a person would be at all likely to have with that neighbor. Also, the author starts from the preconception that gay people "shouldn't get along" with evangelical Christians, and vice versa. While some evangelical Christian churches and movements do preach intolerance, that's not a prerequisite to being an evangelical Christian. And while a gay person might not like getting the stink eye from somebody who is intolerant of his relationships, there's absolutely no reason to presuppose that being gay predisposes you not to "get along" with an evangelical Christian. For goodness' sake, you can be both an evangelical Christian and gay.

The parable seems to recognize the inherent weakness of trying to analogize the condemnation of a group of people based on status -- something they cannot change -- to criticism of people based upon their beliefs, even sincerely held religious beliefs. The Bobs are posited as describing, in interchangeable terms, the one's realization that he is gay and the other's embrace of a form of evangelical Christianity that regards homosexuality as a mortal sin,
2: You don't think you chose to be that way?

1: Did you?

2: I guess I did and I didn't. Or maybe, I didn’t then I did. It was something I didn’t want, but eventually I had to admit it.

1: I guess I didn't and then I did.

2: That's a better way of putting it.

1: For both of us.

2: For both of us.
The problem here is that "gay Bob" would be describing a process by which he recognized and accepted his homosexuality despite strong social pressure not to be gay. Accepting the fact that you are gay is not a "choice" as posited by the narrative. In contrast, if a person in fact struggles with whether to join a particular religious or social movement, and struggles with those portions of its beliefs that teach intolerance of others, their ultimate decision to remain within the movement and to embrace those beliefs comes as the result of an actual choice. Under the interchangeable narrative, "Christian Bob" describes himself as coming from a family that holds different views than his own, and is accepting of gay people ("they're more sympathetic to you than me"). While "Christian Bob" may believe that his religion dictates his attitudes toward gay people, under the narrative he chose the path that led to those beliefs.

Some try to draw a fine line between homosexual thoughts and homosexual practices -- the conception being that if a gay person doesn't accept his homosexuality, or if he does accept it but represses any action on his desires, he is somehow elevated above a homosexual person who involves himself in a gay relationship. Under that thesis as it plays out in the real world, you're asking homosexual people either to live a lie, usually at the expense of another person (their heterosexual spouse), or to openly state that they are homosexual and then to live a life of chastity. Even if the latter path were realistic, many evangelical communities would not be welcoming to such an individual. We can debate the extent to which that's the result of the teachings of their church, the result of larger social views, or some combination thereof, but it's a reality. There's a vast difference between not excluding a gay parishioner and welcoming them into your church as a full and equal member.

To the extent that the narrative reminds evangelical Christians of the teaching that you can love the sinner while hating the sin, that you can be accepting of others without compromising your Christian values, that you can be neighborly even toward people whose lifestyles you find to be sinful, great. The preconception of the narrative, that evangelical Christians "shouldn't get along" with gay people, is not necessary -- you can be a devout Christian without hating anybody. Why does a contrary impression exist? Not only because of the antics of groups like the Westboro Baptist Church ("God Hates Fags"), but because of attitudes like those acknowledged here,
I can't look my gay brother in the eye anymore and say "I love the sinner but hate the sin." I can't keep drawing circles in the sand.

I thought I just needed to try harder. Maybe I needed to focus more on loving the sinner, and less on protesting the sin. But even if I was able to fully live up to that "ideal," I'd still be wrong. I'd still be viewing him as something other, something different.

Not human. Not friend. Not Christian. Not brother.

Sinner.

And despite all my theological disclaimers about how I'm just as much a sinner too, it's not the same. We don't use that phrase for everybody else. Only them. Only "the gays." That's the only place where we make "sinner" the all-encompassing identity....
The author clearly felt immense pressure within his religious community to reject homosexuals. He also speaks of how, upon reflection, he can continue to hold his religious beliefs without joining in with that type of condemnation of his literal and figurative brothers. The author of the "Bobs" narrative asserts,
Christians make space for others all the time; neighbors who are adulterers or gluttons, alcoholics or tax cheats. We have family members who are liars and Christians – at their best – love these folks because they know that they are no different but for the grace of God. And so, Bob can make space for Bob even while he lovingly extends the offer of grace in Jesus Christ. That offer includes a call to repent of Bob’s sins, and that’s a tough pill to swallow.
Save for the contrived assertion that "Gay Bob" is agnostic, "Gay Bob" could have been a Christian who attends a church that is accepting of his homosexuality. I doubt that the same sort of emphasis on "the offer of grace in Jesus Christ" or repentance of sins would be asserted if this were "Evangelical Bob and Presbyterian Bob", yet save for the author's contrivance "Gay Bob" could be a devout Presbyterian, perhaps even a minister.
But the truth is that the other Bob wants to convert Christian Bob too – not to being gay, of course, but to his own worldview.
As the "Two Bobs" narrative unfolds, there's no reason to believe that to be the case. That is, with "Christian Bob" being able to be friends with his gay neighbor, there's little more that "Gay Bob" could hope to accomplish -- and no reason to believe that "Gay Bob" would be particularly interested in trying to push "Christian Bob" into making further concessions. After all, if most or all evangelicals were as neighborly, the author would have felt no need to write his parable.

Dreher's take-away from the parable was this:
Cosper’s point is that Bob 1 can be the gay agnostic, or the traditional Christian, and the same moral would apply. If you can’t see how either one could play either role in the conversation, perhaps you need to work on your empathy.
For reasons I've already outlined, and which should be readily apparent from the applicability of the parable to other contexts in which it becomes instantly uncomfortable, Dreher's first take-away fails due to the narrative's reliance upon a false equivalence.

The argument for empathy -- for mutual empathy -- is more interesting. While the narrative flounders when it attempts to draw a parallel between immutable aspects of a person and their social or religious beliefs, there is no question but that people can be friends with evangelical Christians without sharing or endorsing their beliefs. Sure, just as political discussions are off the table at a lot of family Thanksgiving dinners, there may be discussions that don't occur in the interest of good neighborly relations, but that's part of how we get along with others who don't fully share our views.

The false analogy makes the argument for empathy a bit awkward -- I'm hard pressed to think of any gay person I've ever known who held the sort of blanket views of evangelical Christians that the author seems to believe are prevalent -- but certainly, there's room for neighbors with different social, political and religious views to find common ground. (Nonetheless, if "Christian Bob" is marching with the Westboro Baptist Church or is actively protesting gay marriage and lobbying politicians for a ban on employee benefits for same-sex partners, he needs to take responsibility for the fact that his actions make it much less likely that he will find common ground with his gay neighbor.)

Wednesday, January 14, 2015

Divining the Meaning of an Election

Michael Gerson has written a column in which he accuses political parties of... I guess it's of not sharing his personal beliefs about what is signified by the outcome of an election. Should we find it surprising that politicians characterize the political climate as being consistent with their own, and their party's, political agenda even when the facts might suggest otherwise? When G.W. Bush was pushing Social Security privatization as part of the supposed mandate from his 2004 re-election, despite its not having been an election issue, Gerson was still working for him as a speechwriter. If Gerson wants to pen an interesting column on what it means to have a mandate, the collapse of that effort should provide plenty of material.

Gerson's commentary is largely inspired by the recent midterm election,
The GOP is feeling the momentum of its best congressional performance since the New Deal, and Senate Republicans are enjoying the pleasing weight of committee gavels in their hands. Elected Republicans generally believe that [President] Obama was humbled by voters and should act like it — that he should make concessions commensurate to his losses, as President Clinton did following his 1994 midterm defeat.

Obama, in contrast, seems to view the November outcome as his final liberation from a dirty political game characterized by complete Republican bad faith. He finds no repudiation in the verdict of an unrepresentative, midterm electorate. And he is no longer required to pretend that he cares about the political fate of the 4th District of Podunk. His reaction to the election has been to seek new avenues of executive action as an alternative to congressional dysfunction. So far, he has been politically rewarded.
My initial reaction to this split of opinion is pretty simple: The midterm election involved the House of Representatives and the Senate. Neither party disputes the obvious consequence of that election -- the Republicans took control of the Senate. However, the President did not stand for re-election and, as much as his political opponents might want to point to their electoral successes in a different branch of government as a reason why the President should abandon his own political agenda, that's not the way our system of government is constructed. We don't have a parliamentary system, where the party with the most seats gets to form a government with the party head becoming Prime Minister.

Gerson argues,
This type of polarization seems more psychological than ideological. Obama and congressional Republicans are inhabiting alternative political realities, with no overlap in which compromise might take root.
Although ideology comes into play, the word for which Gerson should have been searching is "political". Contrary to Gerson's suggestion, "Obama and congressional Republicans" are not "inhabiting alternative political realities" -- they are seeking to advance their own political agendas within the constraints of our political system. Gerson proceeds to explain that "The meaning of elections... is almost always contested" and that "Election outcomes are not self-interpreting" -- well, no kidding.
As to the 2014 election: "It may well be," [political scientist] Frances Lee told me, "that no single conventional wisdom will ever emerge. . . . Faced with ambiguity, people tend to believe what they want to believe. When people are surrounded by social networks that also want to believe the same thing, their views will harden further."
Cognitive bias 101... which, of course, has absolutely no relevance to how the President and Congressional Republicans interpret or respond to the election.

Gerson opines,
The parties do not view themselves as losers, even when they lose. The 2012 election should have demonstrated to Republicans (among other lessons) that they need a seriously revised outreach to minorities, women and working-class voters. The 2014 election should have demonstrated to Democrats (among other lessons) that a reputation for unreconstructed liberalism seriously limits their geographic appeal.
That, of course, is abject nonsense. If the lesson of the 2012 election is supposedly that Republicans "need a seriously revised outreach to minorities, women and working-class voters", a lesson the Republicans most certainly did not internalize, then the lesson of the 2014 election would be that the Republican Party does not need any such revised outreach. I'm reminded of how some commentators, speaking on climate change, confuse weather and climate -- it's the overall climate that requires the Republican Party to evolve. The big picture. The next twenty years. The result of a specific election is a data point, not a trend line.

When it comes to the President, it would be helpful if Gerson provided us with his conception of what it means to be an "unreconstructed liberal". The term is bandied about in right-wing circles, but with little attention to meaning or consistency. It often seems to be used to describe somebody who adheres to far-left liberal positions. If that's what Gerson perceives in Obama's legislative history and his present political goals, to put it mildly, he's out of touch with reality. To the extent that Gerson is applying a dictionary definition of "unreconstructed", attempting to suggest that the President is advancing a liberal agenda that has fallen out of favor or become unpopular, it's an odd argument. One of the reasons we have representative governments, and one of the reasons we elect officials for terms of years, is to insulate the political process from popular whims and prejudices. Further, such a definition would mean that Gerson is looking at opinion polls, not the result of the 2014 election and certainly not the results of prior elections.

Gerson's focus on geographic appeal is interesting, given that he presents geography as a problem for the Democrats but not for his own party. While it's not surprising that a Republican like Gerson would suggest that the Democrats abandon their platform in favor of one closer to that of his own party, it's not clear that doing so would actually do much to change the political map in the red states. What it would do is alienate blue state voters from the party, something the Republicans would no doubt appreciate but which would be entirely counter-productive for the Democratic Party itself. Gerson cannot have failed to notice the clear red state, blue state divide in the 2014 election, yet he shows no concern that the Republicans should disavow their platform in order to woo more blue state voters. Under this interpretation of his statement, Gerson's suggestion to the Democrats is either a form of preaching to the Republican choir or the sort of advice you give in the hope of handicapping an opponent who heeds it.

Gerson concludes,
Both parties could gain electoral advantages by realistically addressing their weaknesses, which would also open up the possibility of legislative progress. But everyone, unfortunately, seems to like what they see in the mirror.
Except... not so much. To the extent that Gerson correctly identifies trends within the population, he could make the argument that both parties need to focus on that long-term picture. Within that context it makes sense for the Republicans to pass a bipartisan immigration reform bill -- like the one that the Senate passed last year, but which the Republicans would not even allow to come up for a vote in the House. Instead the House is serving up a mess of a bill, unlikely even to gain Senate approval, that seems fairly characterized as throwing red meat to the anti-immigrant factions of the Republican base.

Gerson might argue that the GOP is proving his point, that it needs to pass something along the lines of the bipartisan Senate bill to help ensure the party's successful future. But even accepting that as true, the problem would be that the Republican Party, like Gerson, is focused on data points as opposed to trends. They're out to win the next election, not to lose that election for the sake of potentially positioning themselves to dominate politics a decade or more into the future. It's the President who has his eye on that future and, even if Gerson chooses to characterize his immigration policy as "unreconstructed liberalism", as something that should be abandoned, through the President's action the contrast between the Republican position and the Democratic position is made stark. Obama is taking the long view.

It's worth noting that Gerson is also playing the "pox on both your houses" game, in which he depicts both the Democrats (through Obama) and the Republicans as equally at fault for legislative gridlock. The Republicans have come to the political realization that when a Democrat is in the White House, their party benefits from gridlock. The Senate immigration bill represents the sort of bipartisanship that Gerson would have us believe we need (even as he suggests that the weaker reforms the President enacted through executive action represent some form of liberal extremism) -- and House Republicans killed the bill. Right now there is no chance that the Republicans will offer the President a reasonable immigration reform bill, let alone one that could fairly be characterized as bipartisan. There's similarly no chance that they will offer a reasonable healthcare reform bill (perhaps instead passing a score of "ObamaCare repeal" bills to add to the pile of their prior failed attempts) or a reasonable bill to address carbon emissions.... Where's the opportunity for the President to do anything but stand up for his core beliefs and do his best to advance the long-term interests of his party? It's not an issue of the President's liking what he sees in the mirror -- it's a matter of his being sufficiently politically literate to read the handwriting on the wall.

Monday, January 05, 2015

Hey, Firefox: Don't Mess With My Preferences

I know that Mozilla/Firefox ended its contract with Google and entered into a new search contract with Yahoo!, but that doesn't mean I want them to mess with my settings and make Yahoo! my default search engine.
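For anyone who would rather pin the setting down than re-fix it after every update, a user.js file in your Firefox profile folder will re-apply whatever preferences it contains each time the browser starts. A minimal sketch follows -- and note that the preference names are the ones I understand Firefox builds of this era to use, so confirm them against about:config on your own version before relying on them:

// user.js -- lives in the Firefox profile folder.
// Preferences set here are re-applied at every startup, overriding
// whatever an update (or anything else) wrote into prefs.js.
// Pref names assumed from current builds; verify them in about:config.
user_pref("browser.search.defaultenginename", "Google");
user_pref("browser.search.selectedEngine", "Google");

The trade-off is that any search engine change you later make through the browser's own settings will be silently reverted the next time Firefox starts, so remove those lines if you ever change your mind.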