Monday, September 28, 2015

How to Talk About Ahmed "Clock Boy" Mohamed Without Sounding Like a Fool

When we're talking about people writing for an outfit like Breitbart, coming across as an idiot on this type of matter may be part of the job description, but for the rest of us....

Credit where it’s due to comedian Bill Maher and HBO show Real Time, which has become one of the few mass media outlets telling the truth about “Clock Boy” Ahmed Mohamed, who was removed from his Irving, Texas school when he brought to school, unannounced, a device that resembled a bomb.

If Maher is correct, let's skip ahead to his conclusion:

Maher has repeatedly said that he believes Mohamed should not have been arrested.

That's the gist of the controversy -- that Mohamed was arrested. Had he merely been sent to the principal's office, we would never have heard of him.

So is that the story? A young teenager was needlessly arrested for having a clock, with Breitbart's right-wingers joining with Maher to deplore the stupidity of his arrest? Of course not. What the author actually approves of is Maher's claim that the teenager was not as bright as the initial stories suggested, along with Maher's anti-Muslim rhetoric. So let's take those issues in order:

Maher used a zinger to shut down the notion that the boy removing the back of a piece of consumer electronics and showing it to people makes Mohamed any kind of inventor, saying “This is like pouring milk on a bowl of Cheerios and claiming you invented cereal.”...

It’s the second time that Maher has featured the Clock Kid as a topic of discussion. On a previous episode of Real Time, billionaire Mark Cuban revealed that when he spoke on the phone with Mohamed and asked him questions, he could hear his sister whispering answers to him.

Tee, hee, hee, Maher really put that fourteen-year-old boy in his place. But here's the problem: the boy wasn't arrested for plagiarism, for cheating on his homework, for patent infringement.... Even if we presuppose that he can't even tie his shoes by himself or speak in coherent sentences, that in no way stands as an excuse for his treatment by the school or by the police. It's irrelevant.

When liberal Ron Reagan, Jr. attempted to claim that the device that Mohamed brought to school didn’t resemble a bomb, Maher quickly defused his argument, urging him, “Try taking that through airport security.”

And, as Reagan correctly pointed out, it would not have been a problem -- because you're actually allowed to take clocks, and electronic components, onto airplanes. The boy would have placed his pencil case on the scanner belt, the person operating the scanner might have flagged the item for further review, and upon further review it would have been determined to be a clock. They might also have swabbed the case to test for any residue of explosives, only to again confirm... clock.

While clocks can be used as timing devices for explosives, clocks are present in many things that people routinely take onto airplanes. Cell phones, computers, tablets, ebook readers, wristwatches, travel alarm clocks.... It's really not alarming -- even if it's an alarm clock -- unless there's some indication that it's actually going to be used in association with an explosive device. And no, looking like your memory of the excessively complicated explosive device created by the prop department of a James Bond movie does not translate into its being anything more than a clock.

On that episode Maher also noted that so many young Muslim men have “blown a lot of shit up around the world.”

The boy's religion has obviously factored into discussion of the case, and appears to be the leading factor in Maher's knee-jerk reaction to the case. It's a factor that is emphasized and amplified in the right-wing media, where you can read conspiratorial tales of how the boy's older siblings or father are activists of one sort or another, and how the whole thing was a deviously clever set-up of the school.

Okay.... so we have a kid who's actually stupid, and does nothing more than disassemble off-the-shelf clocks and put them into his pencil case, but at the same time who is so clever that he tricked the police into arresting him, and tricked the right-wing media machine into engaging in anti-Muslim demagoguery? I think there's a bit of tension between those two positions, but leaving that aside for the moment, proponents of the Muslim angle have a bigger problem:

The school does not report that it suspected that the boy's device was a bomb, real or fake, because of his religion. The school does not suggest that their knowledge of the boy's family played any role in its decision to treat the possession of the clock as a criminal matter, and to call in the police. The police don't claim that they knew the boy's religion, or that they suspected that the clock was something other than a clock because of his religion.

The commonality seems to be that the teacher, school officials and police officers who made the stupid decision to treat this as a criminal matter believed that anything that looks like the timer from a bomb in a James Bond movie has to be a bomb -- be it an actual bomb or a fake bomb. That's not a matter of the boy's being the most brilliant inventor on the planet or dumb as a rock; it's not a matter of the boy being Muslim, Christian, Hindu or atheist. It's a matter of the school administration and police acting foolishly and needlessly arresting a boy for his possession of a clock.

The actual story is this: A boy brought a clock to school, whether disassembled or home-made, a teacher was concerned by its appearance, the school overreacted and brought in the police, and the police overreacted by making an arrest. Once you subtract the anti-Muslim rhetoric, you can move straight forward into what seems to be obvious even to a Breitbart writer, "Mohamed should not have been arrested". Your choice to inject more into the story may tell us something about you, but it's otherwise irrelevant.

Wednesday, September 23, 2015

One of the Prices of Bigoted Demagoguery

Professional Islamophobe Pam Geller brings her trademark (lack of) insight to the case of Ahmed Mohamed, the boy who was arrested for bringing a homemade clock to school:

"If you ever see a Muslim with a suspicious object, remember the lesson of Ahmed Mohamed: to say something would be 'racism,'" she wrote. "That could end up being the epitaph of America and the free world."

A rational mind might observe that the biggest difference between a homemade clock and a homemade bomb is that the latter involves explosives. Without actual explosives, you have no bomb. Without make-believe explosives, you have no make-believe bomb. But more than that, the reason why the boy's clock looks like a bomb to people like Geller, and the reason any circuit board is going to look like a bomb to the equally addled Frank Gaffney, is because by all appearances everything they know about electronics and bomb-making comes from watching movies.

If we're going to embrace hysterics and suggest that any person in possession of something that looks like it could be used as a trigger device for a bomb should be arrested, whether or not they possess real, fake or imaginary explosives, we can start by arresting any person found in possession of a cellular phone. Meanwhile, if Geller is truly concerned that arresting kids for possessing homemade clocks is going to prevent the arrest of actual criminals, I suggest that she get herself a copy of "The Boy Who Cried Wolf", read it, then take a long look at herself in the mirror.

Sunday, September 20, 2015

Does Jon Snow Become Azor Ahai?

If you're a Game of Thrones fan, you know that one of the subplots involves Melisandre's belief that Stannis Baratheon is the reincarnation of the legendary hero, Azor Ahai. The books emphasize that subplot to a greater extent than the series. With the death of Stannis, it's pretty clear that he's not Azor Ahai, and... who else is left but Jon Snow?

So perhaps in some sort of parallel to Daenerys's survival of her husband's funeral pyre, we're headed toward the very dead Jon Snow, arms crossed on his chest clutching his sword, set ablaze... and coming back alive not as "Jon Snow" per se, but as Azor Ahai.

The evidence for this theory would be a combination of what would turn out to be exceptionally manipulative statements by the showrunners that Jon Snow is dead, and the fact that Kit Harington has been repeatedly seen around the Game of Thrones set, including observation of his participation in a massive battle sequence. Let's be honest: if he's in a massive battle, it's not going to be a flashback.

It's not entirely clear what it would mean for Jon Snow to become Azor Ahai, save for it being difficult to believe that it won't come across as corny. Also, as popular as the show is, I don't think that it's necessary to manipulate the audience to create buzz.

Thursday, August 20, 2015

Trigger Warnings on College Campuses

The Atlantic has published an article that correctly laments the rising use of "trigger warnings" by colleges, but which focuses on an unsupported thesis that there is something materially different about "kids these days" rather than examining why college administrators are sensitive to demands for those warnings. Don't get me wrong -- there are differences between generations, and in some ways today's college students are a lot more sensitive about wrongs than were past generations. For example, today's college students appear to be much more sensitive to and much less tolerant of racism, as compared to past generations.

A couple of decades ago, when I was in law school, my criminal law professor used sexual assault in order to illustrate some of the complexities of the law. A law professor can pose a hypothetical example of a sexual assault in which the victim sincerely believes that a sexual assault occurred, while the accused (reasonably or unreasonably) sincerely believes that everything that occurred was consensual. Such a hypothetical case allows you to examine questions of the degree to which intent should factor into criminal prosecutions, charging decisions and sentencing. Should two equivalent acts receive the same punishment, even though one offender was acting deliberately and the other was clueless? To what degree does deterring the future acts of others justify prosecuting an offender who may not have realized that he was committing a crime? Hypotheticals can also extend into cases in which consent is obtained through fraud or deception, or by a mistake of perception.

Some of the students in my section made strenuous objection to the use of sexual assault -- not to any specific example, but to its being mentioned at all -- within the classroom. The objections did not change the manner in which the class was taught, but did inspire school administrators to meet with the students who raised the objections to try to explain what the professor was hoping to accomplish and to resolve their concerns. At that time I was told by another professor that the atmosphere for discussing sexual assault cases had changed significantly, and that some professors had already stopped using sexual assault hypotheticals in the classroom despite the difficulty of formulating hypotheticals that would as clearly illustrate the legal principles they were trying to teach. The Atlantic article opens with the example of "law students asking... professors at Harvard not to teach rape law", as if it is a new development. It is not.

When I hear of demands for "trigger warnings", my response is not that students are somehow different from back when I was in college. It may be that there has been some shift in the number of students raising objections -- something that I have not seen documented -- but the primary difference appears to be that school administrators have changed how they respond to student objections. Forty years ago, the response probably would have amounted to, "It's college, and you're going to be uncomfortable at times. Get over it." When I was a student the response was more gentle, but with a similar outcome. Today, college administrators do seem much more likely to ask that a professor add "trigger warnings" or change something about a course in order to avoid making the objecting students uncomfortable.

Why would administrators change their approach?

I think the answer lies not with the modern generation of students, but with the modern approach to the funding of college education. When college education was more affordable, and colleges were less dependent upon squeezing every last tuition dollar out of a student, colleges could more easily treat their students as students. As states have chosen to reduce their support for public colleges, colleges have increasingly had to fight for every dollar. Part of that process has involved making life a lot easier, and a lot more comfortable, for students. Part of that process involves catering to a vocal minority that, if not appeased, is likely to create negative publicity for a college or take its tuition dollars elsewhere.

The authors write,
The press has typically described these developments as a resurgence of political correctness. That’s partly right, although there are important differences between what’s happening now and what happened in the 1980s and ’90s. That movement sought to restrict speech (specifically hate speech aimed at marginalized groups), but it also challenged the literary, philosophical, and historical canon, seeking to widen it by including more-diverse perspectives. The current movement is largely about emotional well-being.
I think, here, that the authors are looking at two different forms of response by college administrators. There was no ambiguity in the classroom instruction -- sexual assault was presented as a bad thing, empathy was extended to the victim even when the hypothetical posited an offender who did not realize that he had committed a crime, and nobody questioned why "'No' means 'no'" was sensible policy. The students who objected to any reference to sexual assault in our criminal law class were not interested in widening the discussion or including additional viewpoints. Had our professor been less sensitive, perhaps an administrator would have advised him as to how to introduce the subject in a more sensitive manner, but the key difference seems to be that the administrators were not receptive to demands that material be removed from classroom instruction on the basis that it created discomfort for some of the students.

While the authors of the article describe the efforts in our society to make life safer for children, I think that they overstate their conclusions. They write, "children born after 1980 — the Millennials — got a consistent message from adults: life is dangerous, but adults will do everything in their power to protect you from harm, not just from strangers but from one another as well." I disagree that the underlying message is "Life is dangerous" but, more than that, it's difficult to see that as a distinction from prior generations. Do the authors believe that in past generations, parents were teaching kids that "life is safe", even when that was patently untrue? Was the generation that grew up a century ago, spending their early childhood in an age before polio vaccination and when infant mortality was common, who experienced the Great Depression, who were born around the time of World War I and saw the country go through World War II (perhaps fighting in that war), under the impression that life was safe? I somehow doubt it.

Perhaps the difference is that parents of past generations of kids told them, "Life is dangerous, but you're on your own, kid -- you can't count on me to keep you safe"? No, that doesn't seem plausible, either.

I think the authors stray off the rails when they suggest that political partisanship may be a significant contributing factor, based on "survey data going back to the 1970s". The authors are certainly aware that past generations have also experienced times of fierce political partisanship. Our country even once had a civil war. I also have little sense that students arriving on campus are more politically aware than those of past generations. So when the authors suggest that "students arriving on campus today might be more desirous of protection and more hostile toward ideological opponents than in generations past", it seems fair to point out that they're engaged in conjecture, and that they haven't adequately supported a theory that even they phrase as conjecture. Similarly, when describing the advent of social media, the authors write,
These first true “social-media natives” may be different from members of previous generations in how they go about sharing their moral judgments and supporting one another in moral campaigns and conflicts.
Or... they may not be any different.

The authors note that faculty members may be concerned about being attacked on social media, something that might explain why faculty members and administrators are perhaps hypersensitive to certain student complaints, but that's not an observation about the students.

The authors state,
We do not mean to imply simple causation, but rates of mental illness in young adults have been rising, both on campus and off, in recent decades. Some portion of the increase is surely due to better diagnosis and greater willingness to seek help, but most experts seem to agree that some portion of the trend is real.
If they're not suggesting "simple causation", what form of causation do they in fact intend to imply? What changes in diagnosis rates have occurred, and for what mental illnesses? Which mental illnesses are now more frequently diagnosed, and what portion of the increase for any given mental illness do the authors believe might be associated with calls for "trigger warnings"? When they speak of "experts", what are the qualifications of the experts whose views they have examined and deemed relevant? What does it mean to "seem to agree", as opposed to expressing actual agreement? It's really easy to make a nebulous, speculative assertion in support of an argument, but if you want it to carry weight you have to provide some amount of substance.

The authors suggest that changes in the interpretation of federal civil rights law that occurred in 2013 might play a role in the changes, but that argument seems weak on a number of fronts. First, the trends they are describing started long before 2013. Second, the average student is completely unaware of those changes. Third, to the extent that administrators are responding by being hypersensitive not only to situations implicated by the actual changes, but to situations well beyond their scope, that suggests a problem with the administrators and not the students. To the extent that interpretations of the law by federal agencies are making it more difficult to teach effectively on campus, those organizations should reconsider or clarify their interpretations, but it's not the fault of either students or federal agencies if college administrators impose policies that extend far beyond what the law requires.

The article continues into what I see as some rather odd armchair psychology, such as the suggestion that triggering material could benefit sensitive students by desensitizing them to a trauma. Even if we assume that the people demanding trigger warnings are doing so because of their own sensitivities, as opposed to those of others, there is a difference between a sensitivity and a phobia -- and no psychologist in his right mind is going to suggest that random encounters with material that triggers a phobic reaction or PTSD are a proper alternative to a professionally conducted process intended to desensitize an individual. In fact, you're apt to learn that such an approach could worsen the phobia.

They share a couple of examples of "catastrophizing" that are about college administrators:
Catastrophizing rhetoric about physical danger is employed by campus administrators more commonly than you might think—sometimes, it seems, with cynical ends in mind. For instance, last year administrators at Bergen Community College, in New Jersey, suspended Francis Schmidt, a professor, after he posted a picture of his daughter on his Google+ account. The photo showed her in a yoga pose, wearing a T-shirt that read I will take what is mine with fire & blood, a quote from the HBO show Game of Thrones. Schmidt had filed a grievance against the school about two months earlier after being passed over for a sabbatical. The quote was interpreted as a threat by a campus administrator, who received a notification after Schmidt posted the picture; it had been sent, automatically, to a whole group of contacts. According to Schmidt, a Bergen security official present at a subsequent meeting between administrators and Schmidt thought the word fire could refer to AK-47s.

Then there is the eight-year legal saga at Valdosta State University, in Georgia, where a student was expelled for protesting the construction of a parking garage by posting an allegedly “threatening” collage on Facebook. The coll[e]ge described the proposed structure as a “memorial” parking garage—a joke referring to a claim by the university president that the garage would be part of his legacy. The president interpreted the collage as a threat against his life.
I don't think that the dean who complained about Francis Schmidt was identified, but I doubt that the dean was a 'Millennial'. The president of Valdosta State University appears to have been in his late sixties at the time of the incident, so it's safe to say that he's not a millennial. Those actions may not represent the best of what we hope to impart to college students, but they appear to support the argument that the primary source of the problem lies with college administrators as opposed to students.

The article also references an incident in which an instructor's joke was misinterpreted by a student and reported as a threat, resulting in the instructor's suspension. It is no surprise that college administrators are sensitive to potential violence on campus, and does not appear to be in question that the response in that case was an overreaction. It's difficult to see how the incident has anything to do with students being too sensitive, as opposed to there being an understandable concern (even if magnified by the spotlight effect) about violence on campus.

My preferred approach would not be seen as very sensitive to those who are concerned about potential triggering, or who are inclined to reinvent innocuous statements as "microaggressions": I would like to see colleges instruct incoming freshmen that they can expect their ideas to be challenged as they come into contact with other students, professors, and materials that reflect opinions and perspectives different from their own, and that the risk of being at times upset or offended is inherent to the learning process. I would trust professors who were planning to use particularly disturbing or potentially offensive materials to warn their students; for the most part it's difficult to think of scenarios where the nature of the material that will be presented in a course is not foreshadowed by the subject matter. Students who are hypersensitive would be free to explore their options within that framework, or to consider other colleges. The small but influential population of students who want not only to protect themselves, but to protect others from any form of actual or potential offense, would be on notice that the administration will focus on incidents of actual bias and discrimination, but won't be tying itself in knots trying to ensure that nobody is ever offended.

Will that happen? Perhaps at a college that is not dependent upon accepting most or all of the students who apply. But as long as colleges are in the position of having to treat students like customers, I don't expect that this sort of trend will reverse itself -- and I find myself in full agreement with the authors' underlying position that this trend isn't good for anybody.

Tuesday, August 11, 2015

The Other Side of the So-Called Trophy Culture

A recent Real Sports broadcast, summarized here, declared that our nation has a problem with a "trophy culture" that coddles kids by handing out trophies for merely showing up for games, or perhaps even simply for signing up for a team whether or not they even come to a practice. At the conclusion, Bryant Gumbel expressed that he didn't see it as a big deal, and was told by the correspondent that it was somehow emblematic of a larger social problem. I find myself far more sympathetic to Gumbel's position.

If you've ever seen very young children play a team sport, like soccer, you might be reminded of kittens chasing a string. No matter how many times you tell them to hold to their positions, most and sometimes all of the kids will simply chase the ball no matter where it is on the field. At that age you will find some kids who are skilled beyond their years, but for the most part I don't see much benefit in treating a game as if it's a meaningful contest. Let the kids have fun, don't worry about the score, and focus on building their interest and skills for future years.

As kids get a bit older, and they divide into groups of better-skilled and lesser-skilled players, a new question arises: Are you going to give every child some time on the field, or are you going to instead focus on getting the win? When the focus is on the win, stories of bad conduct by adults proliferate -- but don't take it from me: here's Real Sports on the issue:

Real Sports found a person by the name of Ashley Merryman to play the part of the scold against trophies. My first reaction to her statements was, "Kids aren't that stupid," or, to put it another way, kids know the difference between getting a trophy for participation, as compared to getting a trophy for their performance. I wondered whether anybody had researched the issue, and it turns out that somebody had -- as explained by none other than Ashley Merryman herself:
By age 4 or 5, children aren’t fooled by all the trophies. They are surprisingly accurate in identifying who excels and who struggles. Those who are outperformed know it and give up, while those who do well feel cheated when they aren’t recognized for their accomplishments. They, too, may give up.
So the issue really isn't that kids are fooled into thinking that their performance exceeds their actual skill set by getting participation trophies -- they know the difference. Having undermined her own argument, Merryman tries to turn it around by arguing that top performers might "give up" if they see other kids get meaningless trophies -- to which I respond, "That's nonsense". When I look at youth sports these days, I see kids performing at the highest levels that I've ever seen, far beyond the performance of the same age cohort when I was a kid. What evidence does Merryman offer to convince me not to believe my lying eyes? That would be... nothing.

At this point the case against participation trophies would seem to be that, past the age of five or six, the kids see them for what they are, and it costs a lot of money to hand out hundreds or thousands of meaningless trophies. Merryman can see that, but just can't stop herself from catastrophizing: "We have to stop letting the Trophy-Industrial Complex run our children’s lives." That would be an argument that's supported by the facts, while extrapolating that participation trophies are emblematic of the ruination of our society, the decline and fall of our civilization... not so much.

Saturday, July 04, 2015

Sorry, Game of Thrones Fans: Jon Snow is Dead

Was that clear enough? Let me say it again, just to be clear: Jon Snow is dead.

Please, no more need for recycling the same old theories about why he is alive. That ground has been thoroughly covered. That horse is as dead as Jon Snow.

Of course there actually are some decent arguments for why Jon Snow might somehow be resurrected. That possibility seemed quite plausible in the books, with Melisandre's presence at the Wall providing for a means of bringing Snow back from the dead -- and when Melisandre made her retreat back to the Wall as Stannis's army fell it seemed fair to ask why else she would be there but to save Jon Snow. The books also resurrect -- book spoiler warning -- Catelyn Stark. The books introduce a character, Coldhands, who seems to be a different form of living dead from the other creatures found north of the Wall.

In both the show and books Gregor Clegane, The Mountain, is brought back as some sort of zombie. Beric Dondarrion is resurrected several times before (in the books) passing his power on to Catelyn Stark. There would have also been the possibility of glamouring, as we know that Melisandre can make one character appear to be another. The books also emphasize wargs, with some theorizing that Jon Snow could have warged into his direwolf before his body died (never mind that there's no obvious way back to human form, and that we've been told that remaining too long in an animal host will cause you to lose your human character).

Also, it seems like an incredible waste of a very interesting character to kill Jon Snow at this point in the story. The relationship you form with a television character is different from that you form with a character on the printed page, and the loss of Snow seems all the more stark (no pun intended) in the series. Why spend all of that time and energy building him into an interesting character only to kill him off, with nobody else of similar charisma to fill his shoes? Why drop conspicuous hints about his "real" parentage if he plays no role in the end game? From a literary standpoint, why kill off the principal point of view characters who could play a central role in the conclusion, while keeping alive others who can't possibly be on the winning team? Will people, readers or viewers, care about events at the Wall without Jon Snow?

At the same time there's one consistency to Martin's writing: If you're a noble, virtuous character who puts the well-being of others ahead of your own, you end up dead. Do you really think that Snow's body will be hidden and frozen for a year, so that he can be resurrected in Season 7? (Jon Snow on Ice.... Is this a Disney production?) From the standpoint of good storytelling, as questionable as it may be to kill Jon Snow at this point in the narrative, a secret, frozen Jonsicle would seem to be worse.

I suspect that the future story anticipates that readers won't care about the Wall post-Jon Snow, and thus won't be surprised or offended when the army of the dead defeats the remaining members of the Night's Watch (and any Wildlings who might have inexplicably continued to support their efforts in the wake of Snow's death), creating the context for a major conflict between the advancing White Walkers and the dragons of Daenerys. (That story line would seem a bit... predictable. We'll see.) Also, particularly if Jon Snow has the royal blood that many plausibly believe used to run through his veins, we can expect Melisandre to have some interesting visions in the flames of his funeral pyre.

The primary evidence for Jon Snow's death comes from the show (and, if it needs to be said, not from the as-of-yet unfinished, unavailable sixth book). The show did not renew Kit Harington's contract for Season 6. It's one thing to give a character like Bran a year off, without much concern for whether the part will have to be recast in a future season -- children and adolescents change a lot, and few would be surprised to discover that Bran looks different after a year of communing with ancient trees; he'll look different even if they keep the same actor.

While daytime soap operas of old used to switch out adult characters without much concern for appearance, and Game of Thrones has done the same with at least one small part (Gregor Clegane) and with a smaller role before it became larger (Tommen Baratheon), I don't think that audiences will accept a different actor as Jon Snow. He could come back among the undead, perhaps as a hooded Coldhands-type character, but he would both be dead and be a very different character -- what would be the point? And if the show wanted to be sure that Kit Harington would be available to play Jon Snow in Season 7 or beyond, it is highly unlikely, given the difficult production schedule, that they would risk letting him take other jobs that could prevent his participation.

The show understandably shares Martin's affinity for killing off characters -- but sometimes the reasons for a death seem different, such as simplifying story lines or perhaps controlling the show's budget: the death of Ser Barristan Selmy (a character whose story line died a slow on-screen death even before the showrunners made it final), the death of Mance Rayder with no glamouring, the accelerated death of Shireen Baratheon and defeat of Stannis Baratheon, the decision not to bring back Catelyn Stark, the killing of Myrcella Baratheon (who at this point in the books was only missing an ear)....

The showrunners could easily have carried Jon Snow's character forward into season six, perhaps drawing on the chapters dealing with his management of the Wildlings or delaying the death of Stannis Baratheon while having Jon Snow announce the rescue mission that precipitated his death on the printed page. But they instead chose to bump Snow off during the last episode of the season, saving themselves a big chunk of change to apply to hiring other actors, building sets and producing special effects.

If in light of all of that you still think Snow is coming back, what do you think the dialogue would be upon his return in season seven? "Hey, folks, I'm back. Did I miss anything? Ooooh... are those dragons?"

So stop watching the length of Kit Harington's hair -- it's going to get longer and shorter over the coming year because (assuming an occasional haircut) that's what hair generally does. Stop arguing that if you turn your head sideways and squint, you can see proof of Snow's survival because his eyes almost imperceptibly change color during his death scene. "We shall never see his like again, and now his watch is ended."

Wednesday, May 20, 2015

The Reynolds "Charity" Empire in Decline

A few years ago I wrote a post entitled, "Is The Breast Cancer Society a Worthy Charity?", to which the answer was "No". The comments to that thread are extensive, and include a defense of the organization from Kristina Hixson, which avoided answering any of the tough questions or giving an honest explanation of the organization's operations. She went so far as to post a series of fake endorsements to the thread, trying to bury valid criticism behind fictitious praise.

Oh yes, and she went on to marry the man who ran that "charity", James T. Reynolds II.

Over time, the Reynolds family of "charities" started to receive press scrutiny. The Tampa Bay Times published an article, "Intricate family connections bind several of America's worst charities". It opens:
Carol Smith still gets angry when she remembers the box that arrived by mail for her dying husband.

Cancer Fund of America sent it when he was diagnosed with lung cancer six years ago.

Smith had called the charity for help. "It was filled with paper plates, cups, napkins and kids' toys," the 67-year-old Knoxville, Tenn., resident said.

"My husband looked like somebody slapped him in the face. "I just threw it in the trash."
The story continues:
In the past three years alone, Cancer Fund and its associated charities raised $110 million. The charities paid more than $75 million of that to solicitors. Cancer Fund ranks second on the Times/CIR list of America's worst charities. (Florida's Kids Wish Network placed first.)

Salaries in 2011 topped $8 million — 13 times more than patients received in cash. Nearly $1 million went to Reynolds family members.

The network's programs are overstated at best. Some have been fabricated.
The Federal Government has finally managed to partially shut down the Reynolds empire:
In reality, officials say, millions of dollars raised by four “sham charities” [Cancer Fund of America, Cancer Support Services, Children’s Cancer Fund of America and the Breast Cancer Society] lined the pockets of the groups’ founders and their family members, paying for cars, luxury cruises, and all-expense paid trips to Disney World for charity board members.

The 148-page fraud lawsuit accuses the charities of ripping off donors nationwide to the tune of $187 million from 2008 to 2012 in a scheme one federal official called “egregious” and “appalling.”...

Among the allegations is that [Reynolds' ex-wife, Rose] Perkins gave 10% across-the-board bonuses twice a year to employees [of the Children’s Cancer Fund of America], regardless of performance, and was allowed to set her own salary and bonuses up to a limit without the approval of board members. In 2010, when donations to the Breast Cancer Society were declining, Reynolds II’s salary ballooned from $257,642 to $370,951, according to the complaint.
What can a grifter do but grift? Even having been shut down, the Breast Cancer Society promises to come back to leech off the good intentions of people who want to help cancer survivors:
The silver lining in all of this is that the organization has the ability to continue operating our most valued and popular program, the Hope Supply. Our Board will work tirelessly to maintain the Hope Supply program services that have benefitted our many patients for years – initially under the TBCS banner as it transitions under a different organization – all with the goal of seamlessly providing services to you. I take solace in the fact that this wonderful program has the chance to continue operating.
There is a note of honesty: "I have loved leading TBCS...." Why wouldn't James love working in a job that paid him royally for performing little work, despite his indifference to the needs of the people his charity was supposed to help? It's a gravy train he's eager to re-board, so watch out for his next "charity", coming soon to a list of the nation's worst charities near you.

If you want a good measure of James Reynolds II's character, watch him on video.