Saturday, November 30, 2013

Sorry, No, Pseudo-Conscription is Still a Bad Idea

Dana Milbank has jumped onto the bandwagon of those who think that the cure for "kids these days" is some form of conscription.
As I make my rounds each day in the capital, chronicling our leaders’ plentiful foibles, failings, screw-ups, inanities, outrages and overall dysfunction, I’m often asked if there’s anything that could clean up the mess....

But one change, over time, could reverse the problems that have built up over the past few decades: We should mandate military service for all Americans, men and women alike, when they turn 18. The idea is radical, unlikely and impractical — but it just might work.
But even if we assumed, incorrectly, that the military wanted universal conscription, it's not even slightly realistic to have every 18-year-old serve in the military. While Milbank refers to Switzerland, in which "sons of bankers and farmers alike do basic training for several months and then are recalled to service for brief periods", none of the proponents of conscription are suggesting a similar approach. They instead prefer a period of a year or several years in which the young person is forced to participate in some sort of nebulous national work program that may (but probably won't) involve military service.

What we would end up with, at best, is what Milbank describes later in his editorial, based on his notion that "the structure is less important than the service itself",
My former colleague Tom Ricks proposes bringing back the draft in the United States but allowing for a civilian national service option — teaching, providing day care and the like — for those who don’t want to join the military.
Ricks' proposal was bad as well, but at least he attempted to explain how it might function. The notion is that we are going to fix the nation's problems by taking people away from their academic studies or jobs for a year or two, and compelling them to work in daycare centers or equivalent vocations? Seriously? And note the disdain for teaching as a profession - it's presented as something a random high school graduate can do, and roughly equivalent to working in a daycare center.

Milbank agrees with my past assessment that the cost of such a program would be huge. "Staggering" might be a better word. But he insists,
But so would the benefits: overcoming growing social inequality without redistributing wealth; making future leaders, unlike today’s “chicken hawks,” disinclined to send troops into combat without good reason; putting young Americans to work and giving them job and technology skills; and, above all, giving these young Americans a shared sense of patriotism and service to the country.
There is little reason to believe that Milbank's draft would overcome growing income inequality. The type of job skills that marginal high school graduates (or drop-outs) could develop through such a program would likely leave them qualified for low-paying jobs. Even for those who serve in the military, if you consider the difficulty that many veterans presently have finding employment, why does Milbank believe that those who are conscripted and serve for less time will fare better?

Without redistributing wealth? Sorry, but a program that takes a "huge" amount of government revenue and applies it to a year or two of national service would certainly be "redistributing wealth". There's also more than money at stake. For 18-year-olds who plan to attend college, you're delaying their entry into the workforce by at least a year (assuming the program is only a year long). For 18-year-olds who are employed, you're costing them their jobs. For 17-year-olds who would otherwise have job prospects, you're ensuring that employers won't consider them for anything more than temporary positions. That is, there's a tremendous opportunity cost imposed on the young people who are conscripted into the program - perhaps not wealth redistribution in the classic sense, but nonetheless imposing a genuine financial harm on those drafted into the program.

I'm not clear on why Milbank believes that this service, even if we pretend it could all be military service, will result in fewer military mobilizations. Milbank references "chicken hawks", the term applied to people like Dick Cheney who fastidiously avoided service during their youth but had no compunction about entangling the U.S. in wars. But where's the evidence that veterans, once in office, are any less hawkish than non-veterans? Veterans got us into the Korean War and Vietnam War, the first Iraq War, and any number of lesser conflicts. John McCain is a veteran, yet he's one of the most hawkish members of the Senate.

As for "putting young Americans to work and giving them job and technology skills", how would that work? Yes, the military involves a lot of job and technology skills, but often not the sort of skills that fit well with the modern civilian workplace. It's not clear how many of Milbank's conscripts would achieve similar skill sets, as their terms of service would be shorter. Beyond that, Milbank mentions... teaching and daycare. A young adult can already work in a daycare center straight out of high school, and in some cases before they even graduate. While it might be possible to create a small, focused program that allowed high school graduates to develop cutting edge skills, the sort of blunderbuss approach Milbank favors all but ensures that most participants will gain very few skills that would benefit them in a subsequent job, save perhaps at the bottom end of the job market.

Oh yes, and "above all", giving "young Americans a shared sense of patriotism and service to the country". I can't help but think of my uncle who, unlike many of the chicken hawks to whom Milbank alludes, insisted upon joining the military and upon combat duty despite a physical condition that would have allowed him to either avoid service or bide his time behind a desk. If anything, his patriotism was abated by his experiences, and the schisms within his unit made it anything but the sort of happy melting pot that Milbank seems to envision. Milbank's vision seems oddly in line with what you see in Vietnam, where conscripts may find themselves in a civilian uniform, working behind the front desk of an army-owned hotel... or cleaning the rooms. But there's little reason to believe that conscripts assigned to random, menial employment would feel much of a connection with those outside of the daycare center where they work, or that those who obtained more prestigious assignments or ranks would view them as equal. And last I checked, Vietnam's government was not one I envied.

Milbank later adds, "Gun-rights groups would cheer an armed citizenry", but where does that even come from? Milbank cannot seem to maintain a consistent thesis as to whether the conscripts would be performing military service, or whether they would be working in daycare centers. Perhaps he imagines that all conscripts will go through military basic training before being shipped off to work in daycare centers? It's hard to tell what he has in mind.

Milbank notes, "an article published by the libertarian Cato Institute argued that compulsory service 'can be a pillar of freedom,'" but fails to note the inherent tension between "libertarianism" and conscription, or for that matter between freedom and conscription. A libertarian might endorse bringing real meaning to the "unorganized militia", with the government providing broad opportunity for citizens to avail themselves of military training within that context, but let's not pretend that conscription is a libertarian ideal or that a libertarian with his head screwed on correctly would confuse it with "freedom".

I'm not particularly concerned that yet another pundit has endorsed a type of service he personally eschewed as a cure for the nation's ills, although I remain amused by arguments that boil down to, "The best way for young people to develop a set of values similar to my own is by being conscripted into a type of program that I, personally, avoided." There's no chance that universal conscription will become law. However, I do think that much of the hand-wringing about kids these days, and about how to provide better opportunities for young people who have more than their fair share of obstacles to overcome, could be channeled into a voluntary service program, a "bridge year" or two in which participants could be matched with suitable peer groups, and dispatched to communities where they could perform productive work and develop genuine job and leadership skills. But I guess that sort of idea isn't as fun to kick around. Besides, while conscription is something the government would have to impose, for somebody who is positioned to actually generate the necessary money and attention, proposing a "bridge year" program might invite the response, "Great idea, what are you doing to bring it to life?"

Don't Confuse High Stakes Testing With High Expectations

Frank Bruni recently expressed concern, in the usual hackneyed terms, that kids these days are "coddled". Some sports give trophies for participation, or end games early when the difference in score reaches a defined threshold. A middle school near Boston is concerned that kids' feelings might get hurt if they find out that they weren't invited to parties. Some kids get stressed by tests. Bruni complains that "Many kids at all grade levels are Bubble-Wrapped in a culture that praises effort nearly as much as it does accomplishment." As anybody, including Bruni, should know, people like Bruni have been writing this sort of column for generations.

All of Bruni's complaints are to set a context for his criticism of people who object to the high stakes standardized testing model imposed upon the nation's schools. Bruni conflates high stakes standardized tests with "tougher instruction [that should] not be rejected simply because it makes children feel inadequate, and that the impulse to coddle kids not eclipse the imperative to challenge them." While Bruni insists that Common Core is "a laudable set of guidelines that emphasize analytical thinking over rote memorization", even he admits that "[i]n instances its implementation has been flawed, and its accompanying emphasis on testing certainly warrants debate." Yet here he is, calling those who want to engage in the debate paranoiacs and whiners.
Then there’s the outcry, equally reflective of the times, from adults who assert that kids aren’t enjoying school as much; feel a level of stress that they shouldn’t have to; are being judged too narrowly; and doubt their own mettle.

Aren’t aspects of school supposed to be relatively mirthless? Isn’t stress an acceptable byproduct of reaching higher and digging deeper? Aren’t certain fixed judgments inevitable? And isn’t mettle established through hard work?
I don't mind at all the notion that school should be challenging. But what Bruni is overlooking is how standardized testing has displaced a lot of traditional classroom teaching and learning, or that the insistence that children master skills at earlier ages is not necessarily consistent with the students' cognitive development. After pushing more and more traditional first grade material into kindergarten, we're now hearing proposals to raise the age for kindergarten enrollment. If you end up with a kindergarten full of kids who, under the former system, would largely have been in first grade, what are you actually accomplishing?

Here's something it shouldn't take very long to figure out: When you tell a teacher, "Your ranking as a teacher, your ability to keep your job and the amount you are paid depends on how your students do on a series of standardized tests," the odds are that the teacher is going to devote a great deal of effort and classroom time to improving student performance on the test. Bruni ridicules a parent's complaint that as a result of that sort of focus on testing, his eight-year-old's class was left with "no room for imagination or play". Does Bruni not understand that children can be challenged academically, yet be encouraged in their imagination? Does Bruni not understand that children need breaks in their lessons during the course of a school day? That children can learn from play activities? It would seem not.

Bruni references David Coleman, "one of the principal architects of the Common Core", as asserting that he favors self-esteem, but wants to "redefine self-esteem as something achieved through hard work". It's not that self-esteem cannot be derived from hard work, but that's not really what Coleman is talking about. In the schoolyard, self-esteem is on the whole negatively correlated with academic performance. Bruni's ridicule of parents who are concerned about their children's feelings is, in a sense, more relevant than Coleman's goal, because Bruni's approach does not involve somehow changing human nature. When Coleman talks about how students "will not enjoy every step of it" but "if it takes them somewhere big and real, they’ll discover a satisfaction that redeems the sweat", he seems to be talking about the end of a very long process. If you don't find a way to let kids learn on an incremental basis that their hard work will be rewarded, you're not going to create an effective learning environment for most kids.

Bruni also references Marc Tucker of the National Center on Education and the Economy, who has stated, "[w]hile American parents are pulling their kids out of tests because the results make the kids feel bad, parents in other countries are looking at the results and asking themselves how they can help their children do better." But that's not the actual issue. Although certain factions of school reformers like to point to nations that obsess over test scores, holding them out as a model for the nation, it's very clear that we don't have the sort of culture that will cause us to follow the lead of South Korea, with kids leaving school to head over to private academies where they spend additional hours being prepped for tests, and we don't really want to follow the lead of nations that produce kids who are very good at taking tests but not much good at thinking outside the margins of a carefully darkened oval.

If you want good public schools, you don't need to do much. You need to make the profession of teaching sufficiently well respected and remunerated that you attract above average students into the profession, you want to make the task of classroom teaching rewarding, and you want parents who will reinforce the need for their kids to attend school, study and do their homework, behave in class, and achieve academically. When you do that, you don't need to obsess over test scores - you can use standardized tests in their traditional manner, to assess individual and group performance with an eye toward improvement, and with no need for teachers to "teach to the test", because the goal is to obtain an accurate assessment as opposed to an artificially inflated score that reflects intensive teaching to the test at the expense of a rich classroom experience.

Ah, but high-stakes standardized testing is so much easier for school administrators and politicians, the ones who have positioned themselves to get prizes for "participation" - a large, steady paycheck with no consequences at all for the failure of schools, teachers or students. And it's so much easier to point to a computer-generated list of scores and pretend that you have objectively evaluated a teacher or school than it is to work hard.

Monday, November 25, 2013

How New York City Views the World

I kid Fred Wilson, but I did see an element of the NYC-centered worldview in his comment,
A smartphone can get you a ride but a car can't get you a date.
I think it's safe to say that Mr. Wilson can afford a car that would be very helpful to his dating life, were he not happily married.

But heck, as removed from the singles scene as I am, maybe we are moving into an era where "Hey, babe, check out the productivity apps on my Galaxy" is an effective pickup line.

Wednesday, November 13, 2013

Reactions to the "Obamacare" Website

I am not particularly sympathetic to the finger-pointing by the contractors, that HHS should have left more time for testing the website before it went live, because they knew that there was a "drop dead" date by which the website had to go live. When I heard one of the contractors testify that their company didn't warn HHS of the need for a longer testing period, ostensibly because it wasn't their job, I had to roll my eyes. If you think it will take a month or two to integrate your work into the website, and you know that the integration must be complete and tested inside of two weeks, it's your job to speak up - and to do so as soon as you realize that there's a problem.

I tried to use the website on day one. The site was clearly overwhelmed. My reaction to that? "I'll try again later." Yes, it would have been nice to get through the initial registration and set up an account, and it would have been nice had HHS anticipated the massive number of people who would try out the site when it went live, but this sort of thing happens.

What I didn't anticipate, when I went back to use the site, was an experience that suggested not only that the programmers didn't care about server load, but that they built elements of the website that seemed to frequently and unnecessarily load the server. Oh, I'm sure lots of stuff is going on in the background, but when you're simply entering your personal information... why? And why so inefficiently? If the website is overwhelmed, it would make more sense to collect the information without doing all of the back-end data crunching and, when the basic information was collected, tell applicants, "It will take approximately X hours to process your application. We will notify you by email when your application has been processed. If you would like a text message, please enter your email address or cell phone number below."
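The collect-now, process-later approach suggested above could be sketched roughly as follows. This is purely illustrative - every name here (`enqueue_application`, `process_eligibility`, and so on) is hypothetical, and has nothing to do with how the actual site was built:

```python
import queue
import threading
import time

# Sketch of "collect now, process later": the web tier only stores the
# applicant's raw answers and queues them; a background worker does the
# expensive eligibility crunching off the request path and sends a
# notification when it finishes. All names are hypothetical.

application_queue = queue.Queue()

def enqueue_application(applicant_data, notify_email):
    """Called by the web tier: cheap, returns immediately."""
    application_queue.put((applicant_data, notify_email))
    # A very rough wait estimate based on the current backlog.
    return f"Approximately {application_queue.qsize()} application(s) in the queue ahead of you."

def process_eligibility(data):
    time.sleep(0.01)  # stand-in for the real back-end data crunching

def send_notification(email):
    print(f"Notified {email}: your application has been processed.")

def worker():
    """Background worker: drains the queue independently of web traffic."""
    while True:
        applicant_data, notify_email = application_queue.get()
        process_eligibility(applicant_data)
        send_notification(notify_email)
        application_queue.task_done()

threading.Thread(target=worker, daemon=True).start()
```

The point of the design is that the slow work never blocks the form submission, so a surge of applicants fills a queue rather than overwhelming the pages people are actually interacting with.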

When I went back to the site, I was able to complete the registration process, but received server error messages telling me to log back in later three times over that relatively short process. To the site's credit, I only lost data one time. One minor annoyance was having to enter the same information several times, with no ability to simply click a "same as last time"-type option to pull in the data already entered. Another was with the editing process. You have to enter SSNs for people who will be part of your application. For security reasons, the SSNs are obfuscated, with the last six numbers replaced by asterisks, when you edit the personal information for any person who is part of your application. But if you don't delete those asterisks and re-enter the SSN you will get an error message. That's the sort of inattention to detail that can make a website less pleasant to use - I wonder what percentage of applicants think that the asterisks reflect the website's retention of the data, such that it doesn't have to be re-entered, only to get that error message. If you have to enter the SSN anyway, don't populate the field with asterisks. Leave it blank, perhaps with an explanation, "For security reasons your SSN is not displayed on this page. Please re-enter the number before you proceed."
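The masked-SSN annoyance described above boils down to pre-populating a field with a value that the site's own validation will reject. A minimal sketch of the problem and the suggested blank-field alternative (hypothetical code, not the actual healthcare.gov implementation; the mask format is assumed):

```python
import re

# If the edit form is pre-populated with a masked value like "123-**-****",
# submitting it unchanged fails validation - the user must notice the
# asterisks, delete them, and retype the full number.

SSN_PATTERN = re.compile(r"^\d{3}-\d{2}-\d{4}$")

def validate_ssn(field_value):
    """Reject anything that isn't a complete, unmasked SSN."""
    return bool(SSN_PATTERN.match(field_value))

def build_ssn_field(stored_ssn, prefill_mask=True):
    """Two ways to render the edit form's SSN field."""
    if prefill_mask:
        # What the site did: a masked placeholder that will fail
        # validation if submitted as-is.
        return stored_ssn[:3] + "-**-****"
    # The alternative suggested above: leave the field blank, with an
    # explanation that the SSN is hidden for security and must be re-entered.
    return ""
```

Either approach is defensible on security grounds; the difference is that the blank field tells the user what to do, while the masked placeholder invites the exact error the site then complains about.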

Another oddity is the navigation of the various steps of the application process. The site displays the steps you must take, and those you have not yet completed, but there's no "click here to continue" type prompt. You have to guess where to click. It's not that it's difficult to guess, but I've heard from a person who I would have expected to figure it out, and he was stymied.

When available plans are displayed, you can compare plans. You can select as many as you want to compare, but the comparison page only shows three plans at a time. The comparison page is decent, with the plan broken down into areas of coverage with subheadings for the elements of coverage within a given area. The problem is, if you choose the option to delete the plan in the first column, those subheadings go away, making it more difficult to compare plans. They do not reappear even if you go to the next page of plans selected for comparison - for the subheadings to reappear you need to restart the comparison process.

Finally, when selecting a plan I received a large warning that the plan did not include dental coverage for minors. It did. The problem suggests that the data about each plan and its components is stored in redundant fields, since, if the plan can properly display the coverage it provides, there is no reason why the verification algorithm should get it wrong.
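That kind of mismatch is what you'd expect if the warning is driven by a summary flag stored separately from the coverage list the page displays. A hypothetical sketch of the bug, and of the single-source-of-truth fix (the data and field names are invented for illustration):

```python
# If a plan record stores both the displayed coverage list AND a separate
# summary flag, the two can drift out of sync; deriving the warning from
# the same data used for display makes the mismatch impossible.

plan = {
    "name": "Example Silver Plan",
    "coverage": ["hospitalization", "prescriptions", "pediatric dental"],
    # Redundant summary field, out of sync with the list above:
    "has_pediatric_dental": False,
}

def dental_warning_from_flag(p):
    """Buggy approach: trust the redundant flag."""
    return not p["has_pediatric_dental"]

def dental_warning_from_coverage(p):
    """Single-source approach: derive the warning from the displayed data."""
    return "pediatric dental" not in p["coverage"]
```

With the redundant flag, the warning fires even though the plan page correctly lists pediatric dental coverage; computed from the coverage list itself, the warning and the display can never disagree.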

Mistakes like these aren't just indicative of limited testing by HHS. They are indicative of limited testing by the contractors who developed the UI for the website, and more than that they suggest to me that the programmers were largely indifferent to the user experience. The delays in processing data suggest that programmers were largely, perhaps, completely, indifferent to server load.

I used the online chat service to verify that I could rely upon the plan description despite the warning message. Response time was prompt and the person providing support was professional and efficient.

If I were the programmer responsible for any of the problems on this site, I wouldn't be pointing fingers. I would be apologizing and redoubling my efforts to fix it. With Republican demagoguery on the law and now on the website, it's easy to point fingers but really - based upon the types of problems and issues I experienced, the programmers bear the lion's share of responsibility for the problems with the site they programmed.

If you want to browse basic pricing information for the plans offered, but don't want to register yet, unofficial information is available courtesy of Stephen P. Morse.

Monday, November 11, 2013

A Fourth Year of Law School?

Cardozo Law Professor Edward Zelinsky is taking some well-deserved ribbing for his suggestion that law school should be expanded to a four year program. Critics of his piece have noted the tension between his argument that law school should be longer and should be more affordable,
The most serious argument against a fourth year of law school is the additional cost it would entail. Legal education is already too expensive. Adding a fourth year would impart even greater urgency to task of controlling the expense of law school, just as there is currently great urgency to the task of controlling the costs of undergraduate education.
As if it makes sense to increase the cost and duration of law school by a third, and only then address the urgent need to make law school more affordable. But I also take issue with this claim,
An ancillary benefit of a fourth year of legal education would, in the short run, be a reduction in the supply of law school graduates.
Let's say that starting in 2014, every new law school student were enrolled in a four-year program instead of a three-year program. In 2017, very few law school students would graduate. In 2018, assuming prospective law school students didn't turn away from the additional burden in droves, the market would return to "business as usual". So there would be a whopping single year in which a shortage of law school graduates would significantly affect the legal job market, which may help lawyers who graduated a year, perhaps two years, before, but would be of little help to anybody who graduated earlier and of no help to anybody who graduates later. Still, not to damn his point with faint praise, it's not the weakest of his arguments.

Thursday, October 24, 2013

Microsoft... Still Doesn't Get It

Microsoft executive Frank Shaw's attempt to poke Apple has gained some attention,
Note: If you are the TL;DR type, let me cut to the chase. Surface and Surface 2 both include Office, the world’s most popular, most powerful productivity software for free and are priced below both the iPad 2 and iPad Air respectively. Making Apple’s decision to build the price of their less popular and less powerful iWork into their tablets not a very big (or very good) deal.

Since we launched the Surface line of tablets last year, one of the themes we’ve consistently used to talk about them is that they are a terrific blend of productivity and entertainment in one lightweight, affordable package. In fact, we’re confident that they offer the best combination of those capabilities available on the market today.
Wait a second... then why did Shaw just say this?
I have to say, I’m really excited for a 1080p Lumia with a third column on my start screen so I can keep a close eye on more people, more news, more stuff.
That is, if Microsoft's tablets are the best combination of productivity and entertainment on the market, why is Shaw excited about buying a Nokia, even if Microsoft is in the process of acquiring that company? Shaw leaves me with the impression that his blog post is less about touting his company's great product than it is an attempt to promote Windows tablets, generally, by taking digs at Apple. Microsoft was very late to recognize the market for the tablet computer, and continues to withhold Office from competing platforms as it has scrambled to develop its own tablets. On top of that, part of the reason that Microsoft's tablets are "affordable" is that they aren't selling, and as a result prices have been slashed... now twice. I don't want to diminish the Surface as a product, and I suspect that it would have been more successful had it been released two years earlier, but as with the Zune there's a significant price for coming late to market with a product that doesn't capture the imagination of your market.

What strikes me most about the piece is how the Surface is, in essence, touted as a laptop. Microsoft, we're told, is the expert in how real people work. Real people want keyboards, trackpads, multiple windows open on their displays, a full version of Office. (Real people also apparently want an obnoxious, in-your-face tile interface shoved in their faces when they boot up a Windows 8 computer, and want touch screens on their portable computers.) And yet real people demonstrate Microsoft's skill in assessing their needs by buying Apple and Android tablets in huge numbers, while largely ignoring the Surface.

If you're typing or editing large documents, are creating spreadsheets, or working with other complex documents, you probably do want a keyboard and mouse or trackpad, but... you probably already have a desktop computer, a portable computer, or both upon which to perform those tasks. If you have a notebook computer that runs Windows, what's the advantage of toting around a Surface tablet with a keyboard cover when you can simply use your computer? The power of Surface has given Windows tablets about 5% of the market, which is enough to keep your toes in the water. Apple has about 5% of the global PC market, so in a sense Microsoft is in good company.

Shaw declares that by offering so much productivity Microsoft is leading the market (from behind)....
And so it’s not surprising that we see other folks now talking about how much “work” you can get done on their devices. Adding watered down productivity apps. Bolting on aftermarket input devices. All in an effort to convince people that their entertainment devices are really work machines.

In that spirit, Apple announced yesterday that they were dropping their fees on their “iWork” suite of apps. Now, since iWork has never gotten much traction, and was already priced like an afterthought, it’s hardly that surprising or significant a move. And it doesn’t change the fact that it’s much harder to get work done on a device that lacks precision input and a desktop for true side-by-side multitasking.
I think Shaw is onto something when he describes how Microsoft is in touch with what people want, if we define "people" as the population that is already predisposed to buy a Surface. The problem is, he is touting solutions that have absolutely nothing to do with how most people use tablet computers. The tablet is largely a product for consumption of media and entertainment, not for productivity. To the extent that you can add on productivity, about 90% of tablet users are going to find all of the power they need (and perhaps more) in the free apps that Apple is offering, and those apps will get better over time.

Perhaps what Shaw is displaying is discomfort at seeing his company's business model increasingly threatened by free software. Sure, the competing software may be less powerful than the Office suite, but... free, and good enough for a significant majority of users. It's part of an expectation Microsoft helped create when it launched its browser war against Netscape, and even before that with its controversial bundling practices: the idea that you pay for your computer hardware, and that the software you need for basic functions (an ever-expanding category) should be free. Your Surface tablet runs Windows 8 and Office, but will upgrades to either be free? I doubt that's what Microsoft has in mind.

Apple seems to be taking the position that software is a commodity product that is best used to sell hardware, and by expanding the sphere of what its customers get for free - and how well its products play together - they want to keep customers in the Apple ecosystem. I can see merit in Microsoft's vision of the future, with people having full capacity to do whatever it is that they want to do on whatever device they have with them, but I'm not sure that the vision is compatible with Microsoft's business model - at least not in the mass market. If expensive software upgrades are required for any product running Windows, that cost will quickly undermine Microsoft's claim that its products are more affordable than those that offer free upgrades. Apple's vision of the future seems to be to allow users of its products to transition from one device to another, phone, tablet, computer, Apple TV, while having each device know exactly where you left off on the other. Continue your movie from where you paused, continue editing your document from where you stopped.... That vision seems to be more viable, and is unquestionably consistent with Apple's business model and - despite Mr. Shaw's claims - seems to be more consistent with how people in the mass market are using their devices.
So, when I see Apple drop the price of their struggling, lightweight productivity apps, I don’t see a shot across our bow, I see an attempt to play catch up.
Whereas I see Apple as obviating the need for 90+% of its customers to ever purchase "Office for iOS", should Microsoft ever muster enough courage to release such a product.
I think they, like others, are waking up to the fact that we’ve built a better solution for people everywhere, who are getting things done from anywhere, and who don’t have hard lines between their personal and professional lives. People who want a single, simple, affordable device with the power and flexibility to enhance and support their whole day. :)
I admit it. If I had to choose between doing my professional work, or even blogging, on a tablet or a notebook computer, I would pick the notebook computer as the "single, simple, affordable device with the power and flexibility to enhance and support their whole day". But I don't have to choose, and thus can use my smartphone or tablet for the functions they provide extremely well - basic communication, media consumption, web browsing, simple games, demonstrations, and online reference materials - and switch to my notebook computer (or go to my desk) for more complex tasks. To look at it another way, the fact that I have a Swiss Army knife and thus can sometimes avoid using a more specialized tool doesn't mean I'm going to throw away my saws, knives, screwdrivers and scissors. If your vision of a typical tablet user is somebody typing away on a keyboard using a fully featured Windows OS, you're not looking at how people interact with their tablets.

Wednesday, October 23, 2013

Ed Rogers Is Terrified That Obamacare Will Succeed

Ed Rogers, one of the Washington Post's political demagogues blogging in the PostPartisan (get it?) section of their website, blathers about Obamacare:
Since I’m always admonishing others to admit the obvious, I will now make an admission of my own: I’m rooting for Obamacare to fail. And I encourage others to do the same.
If you're familiar with Rogers or his history, that's really all that he needed to say. It would stand as a naked admission of his political bias, and his preference to cause harm to the millions of people who will benefit - and are already benefiting - from the PPACA if it gives advantage to the Republican Party. And if he left it at that, it would be fair to point out that people of Rogers' ilk used to go ballistic over criticism of George W. Bush, pretending that any criticism constituted a near-treasonous (or... should I say treasonish) wish for the nation to fail. But... he can't stop there.
I do not hope the uninsured stay uninsured, but Obamacare is not the solution.
You know what would have been a good follow-up to that claim? A proposed solution. Do you think we got one? Get real. The reason Rogers and his ilk are jumping on the "Glitchgate" bandwagon is because they don't have any ideas.
Obamacare is a harmful policy that will be bad for the country if it is forced upon Americans and the American economy.
You know what would have been a good follow-up to that hyperbolic claim? Any evidence that it is connected to reality. Do you think we got one? Yeah, same answer.
Why would anyone hope it limps into existence and settles like a cancer on the U.S. health-care system?
Wow... people having health insurance is like cancer. Who would have thunk.
The president asked his critics to stop rooting for its failure, but I, for one, refuse to do so.
And there he circles back to my first point, but again he finds himself unable to stop.
Also, the failure of Obamacare would do a lot to expose the false promises of big government programs.
Let's see... the three biggest ticket items are Medicare, a highly popular and effective health insurance program, Social Security, a highly popular and effective program to provide retirement benefits and disability insurance, and the military. Which of these does Rogers believe will be proved to be a "false promise" by a government program that, rather than following the successful lead of Medicare, compels people to buy insurance from private companies?
President Obama has been and always will be an unapologetic promoter of classic liberal activism.
Do you have the first clue what "classic liberal activism" means? If it were a little less polite, it might be a meaningless phrase Rogers picked up from Rush Limbaugh.... Does Hannity use that phrase? Seriously - when I search for the phrase "classic liberal activism" in quotes, Google returns a whopping six results, three of which are the same screed from WorldNetDaily, Joseph Farah's gift to reactionary trolls who want to play journalist. Perhaps that's Rogers' source?
Liberals think that Washington knows best and that increasing citizens’ dependency on government is a goal in and of itself.
When the facts fail you, there's nothing like a hollow man argument to save the day. Which liberals hold that belief, Ed? Can you name even one? Didn't think so.
Like all Republicans, I believe we need smaller government.
Straight from the hollow man to the false generalization. History tells us that Republican Presidents like to talk about smaller government, while nonetheless significantly expanding the size, reach and cost of government. Going back to 1982, which presidents have presided over the slowest growth in annualized federal spending? Clinton and Obama, by a significant margin. If Republicans believe in smaller government, why do they have such a difficult time voting for politicians who reduce the size of government, implementing policies to reduce the size of government, or applauding the presidents who actually walk that walk? (Which again takes us back to my first point).
Obamacare is the opposite of a smaller, less obtrusive government.
No, as Rogers would figure out if he were to actually think about the issues, it is not. Millions of people are trying to sign up for health insurance through the exchanges because they want insurance. Millions of people went to the exchange site the day it opened for that very reason. Rogers may sit around with his wealthy Republican peers, sniveling about how ordinary people don't really want health insurance, but the fact is that millions of people like the provisions of the PPACA that are already in effect (such as allowing kids to stay on their parents' health insurance until the age of 26) and are eager to finally get health insurance that they can afford or that will cover their pre-existing conditions. In terms of being "smaller", quite obviously the program is designed to minimize the public role. The opposite of that would be single payer.
The failure of Obamacare would discourage and hopefully deter those who think a bigger, more domineering U.S. government is the answer to our problems.
Actually, the lesson would be that we should do the sensible thing, and follow the lead of pretty much any other western democracy - all of which have national health insurance programs of one sort or another that are both popular and largely successful. It is the adherence to "free market" principles that keeps this nation's health insurance costs so high, while providing the average American with less care than they would be able to obtain under a less costly "socialized" model.
And most important, the horrors of this debacle and the collapse of Obamacare would have a chilling effect on politicians who want to promote big government solutions.
And there you get to the real problem - Rogers is cheering for Obamacare to fail because his bladder trembles at the thought that it might succeed. And then governments might do crazy things, like adequately funding public education and fixing roads and bridges.

Monday, October 21, 2013

Senator Jim Inhofe, Dishonest, Irresponsible Demagogue

I see that Jim Inhofe is lying about ObamaCare.
Sen. Jim Inhofe (R-OK) told the radio station WABC on Sunday he may not have been granted his recent emergency quadruple bypass heart surgery if he was insured under Obamacare, BuzzFeed reported.

"You are talking to someone right now who probably wouldn’t be here if we had socialized medicine in America," he said on host Aaron Klein's radio show, referring to Obamacare.

Inhofe, 79, told Klein he discovered he needed immediate heart surgery for clogged arteries after going in for a routine colonoscopy, and that had he been in a country with “socialized medicine like Obama is trying to impose upon America,” the operation might have been unavailable.
As even a half-informed half-wit would know, coronary bypass surgery is performed as a matter of routine in nations with socialized medicine. But more than that, such a person would likely also realize that, despite irresponsible demagoguery from people like Inhofe, Obamacare involves the purchase of insurance from private insurance companies. Surely Inhofe can still generate enough sparks between his neurons to understand the difference between buying insurance from a private company, even if pursuant to a mandate, and "socialized medicine."

Inhofe has two taxpayer-funded insurance policies. The first is the insurance the taxpayers provide to him through his employment as a U.S. Senator. Second, he is the beneficiary of a truly socialized health insurance plan, the extremely popular program called Medicare. Nothing about his coverage is going to materially change on January 1. He would still have his two taxpayer-funded policies, and with coordination of coverage it was likely the socialized Medicare program that paid for his recent surgery and that would pay for it were his need to arise next year.

You know what? Let's toss ObamaCare out the window, and give everybody exactly what Jim Inhofe enjoys - government paid health insurance on top of government-paid Medicare coverage. Deal?

Monday, October 14, 2013

The (Non-)Political (Non-)Persecution of JPMorgan Chase

The Washington Post tells us what some conspiracy theorists have to say about the prosecution of JPMorgan Chase for the London Whale fiasco and for the sins of its acquired companies leading up to the financial crisis:
We’re less impressed by the more backward-looking attack on JPMorgan for allegedly misleading investors about the quality of securities it marketed before the crash. Mr. Dimon reportedly is facing a demand for $11 billion in fines and other payments to settle the case, under threat of a Justice Department criminal investigation. Yet roughly 70 percent of the securities at issue were concocted not by JPMorgan but by two institutions, Bear Stearns and Washington Mutual, that it acquired in 2008. Among the investors supposedly ripped off were the sophisticated government-sponsored enterprises known as Fannie Mae and Freddie Mac. As was inevitable, some say the case is payback for Mr. Dimon’s criticism of Obama administration policy.
The editorial continues,
We don’t take that view; nor do we pity JPMorgan, which is still a lucrative business despite its legal woes and which purchased the institutions for their valuable assets mixed in with their massive liabilities. When it bought them, it bought their legal issues, too — known and unknown.
There's an obvious tension between the Editorial Board's whine, "70 percent of the securities at issue were concocted not by JPMorgan but by two institutions, Bear Stearns and Washington Mutual, that it acquired in 2008" and their subsequent statement, "When [JPMorgan Chase] bought them, it bought their legal issues, too — known and unknown". If they're responsible, then the percentage doesn't matter. Even if you buy into the subsequent "poor little rich boy" argument, "then-Treasury Secretary Henry M. Paulson Jr. told Mr. Dimon that doing so would help the country by stemming market panic. That gives the case a certain 'no-good-deed-goes-unpunished' quality", the companies' misconduct was not exactly a state secret at that point and potential liabilities were factored into the fire sale prices. Robert Pozen offers another version of the "poor little rich boy" stance, that admits as much,
If JPMorgan had purchased Bear Stearns under "normal" circumstances, JPMorgan's shareholders would have been a reasonable target of the lawsuit. Typically, if one corporation (call it A Corp.) buys another (call it T Corp.), A assumes all of T's former liabilities-its bonds, pension obligations, and, yes, its legal liabilities.

The transfer of legal liability relies on the logic that A could have performed due diligence prior to acquiring T, and reduced its offer price to account for any potential legal liability. Thus, the expected cost of future lawsuits flows through to T's shareholders, as it should in the normal case.

But JPMorgan's acquisition of Bear Stearns was different. JPMorgan purchased Bear Stearns at the behest of top federal officials-who needed JPMorgan to quickly announce a deal in order to quell a potential financial panic. Furthermore, the offer price was effectively set by these federal officials. There was no opportunity for JPMorgan to learn about Bear Stearns' legal liability, nor to adjust its offer price accordingly. Indeed, JP Morgan initially walked away from the acquisition because it did not have enough time for due diligence.

Thus, punishing JPMorgan's shareholders does nothing to align incentives-it merely punishes shareholders for acts in which they are blameless. Even worse, this fine discourages companies from engaging in "white knight" acquisitions at the request of federal regulators. In the future, company executives will demand broad guarantees against losses from the government before taking over any troubled institutions.
The short answer to that is that, while I feel some sympathy for a company that plays white knight and ends up with a worse deal than it anticipated, nobody forced JPMorgan Chase to say "yes". Pozen admits that they understood the risk and chose to go forward with the purchase anyway. As for there being a fixed price and no room to negotiate, clearly there were negotiations - JPMorgan Chase was free to walk away, did so, reconsidered, and came back to close the deal. Further, the facts belie the notion that the government presented JPMorgan Chase with a "take it or leave it" price - they had initially agreed to pay $2/share, and it was the threats of Bear Stearns shareholders to fight the sale that inspired them to raise their offer to $10/share. The total selling price was less than the value of Bear Stearns' Manhattan headquarters - you can't look at the negotiations or the purchase price without recognizing that JPMorgan Chase knew it was taking on potentially massive liabilities.

As for future "white knights" being afraid to step forward, let's be honest: JPMorgan Chase acted because it saw a business opportunity. Pozen and the Washington Post Editorial Board assume that JPMorgan Chase was unaware of the possible downside. I don't attribute that level of incompetence to its negotiators, and have little doubt but that they carefully considered outcomes far worse than the present proposed fines when they agreed to buy Bear Stearns at a stock valuation of 7.5% of its 52 week high.

But more than that, if the Board sincerely does not endorse the view of the conspiracy theorists, the unidentified "some" who "say the case is payback for Mr. Dimon’s criticism of Obama administration policy", why did Fred Hiatt and his crew choose to title their editorial, "JPMorgan Chase’s political persecution"? Was Hiatt trying to mislead readers into believing that the Post endorses the stance of the conspiracy theorists? Does he understand what the word "persecution" means? And why are they suggesting that holding JPMorgan Chase responsible for misconduct for which they acknowledge it to have assumed liability imperils the government's reputation for impartiality? How would it be impartial for the government to treat JPMorgan Chase more favorably than it would treat another, similarly situated company?

Alas, these are the types of questions that Fred Hiatt's Editorial Board can never seem to find space to answer.

Sunday, October 13, 2013

Pat Buchanan's Fantasy of... His Own Private Idaho?

Pat Buchanan, after describing how nations that had been part of the USSR reclaimed their national identities, and in some cases balkanized afterward, and carrying on for a bit about various secessionist movements in Europe, conflates the secession of a region to create a new, independent nation state with the redrawing of political boundaries within a nation state.
What are the forces pulling nations apart? Ethnicity, culture, history and language—but now also economics. And separatist and secessionist movements are cropping up here in the United States.

While many Red State Americans are moving away from Blue State America, seeking kindred souls among whom to live, those who love where they live but not those who rule them are seeking to secede.
Buchanan appears to be conceding that the primary driving forces behind these secessionist movements, foreign and domestic, are ethnicity and culture. In the context of ending a civil war between ethnic groups with the bloody partitioning of a country, it's not really surprising that ethnicity and culture play a role in that division - it's to be expected. But there are no similar crises within the United States. Buchanan may be correct that the citizens of the regions he describes later in his editorial are concerned about protecting their ethnicity and culture, but perhaps the real problem is that they're too resistant to getting on board with the proverbial "great American melting pot".
The five counties of Western Maryland—Garrett, Allegheny, Washington, Frederick, and Carroll, which have more in common with West Virginia and wish to be rid of Baltimore and free of Annapolis, are talking secession.
I found Ilya Somin's "vote with your feet" concept to be a bit ridiculous, but I have to say Buchanan's idea is far more precious. Why move across a nearby state line when you can instead move the state line? Buchanan has given no apparent thought to the permanence of such an arrangement. Does he anticipate that states will trade counties like kids trade baseball cards? (Perhaps I should say Yu-Gi-Oh? Kids, it seems, have their own cultural secessionist movements every decade or so.) Perhaps he imagines a context in which any number of people who claim to be disgruntled with their state government can split off and form their own state? Could I be my own state, and perhaps serve as both senators?

Buchanan has something dead wrong in his earlier secessionist argument, suggesting that economics are now driving European secessionist movements. To the contrary, economics tend to hold back peaceful secession. When you live in a small, rural area you gain considerable advantage by being associated with a larger, more economically developed state. Such regions usually receive massive subsidies from their states, sometimes direct, and often in the form of government enterprises that exist only by virtue of state funding. Buchanan states,
Folks on the Upper Peninsula of Michigan, bordered by Wisconsin and the Great Lakes, which is connected to lower Michigan by a bridge, have long dreamed of a separate state called Superior. The UP has little in common with Lansing and nothing with Detroit.
I'm not sure what Buchanan means by "The UP has little in common with Lansing", or if he even knows what he means. Or Detroit, for that matter, unless it's as basic as "skin color". Economically, many parts of the UP are quite comparable to Detroit. Baraga County in the Upper Peninsula has an 18.3% unemployment rate - comparable to Detroit's. Dependence upon public assistance is high. What keeps the UP's unemployment rate from being even higher? Six of the UP's fifteen counties are home to state penitentiaries, with the good-paying jobs that go along with them. The UP also benefits from the state's promotion of tourism, its leading industry. A separate state of "Superior" would have to pay its own way, which may be part of why the most feverish part of the "long dream" of which Buchanan speaks broke in the 1970s. Also, given how easily Democratic Senator Debbie Stabenow carried the UP, perhaps the political culture is not as removed from the rest of the state as Buchanan imagines. I wonder if Buchanan even knows about the Mackinac Bridge?

In any event, even if Buchanan's worst case scenario unfolds, and we have another G.W. Bush-type President or continued incompetent House leadership by the likes of John Boehner, such that "another Great Recession hits or our elites dragoon us into another imperial war", and we "hear more of such talk", so what? It's a pipe dream. Nothing is going to come of it. I guess it's a bit more peaceful than the violent, genuinely secessionist fantasies of some of Buchanan's more extreme peers, but let's face it: We're not going to allow small regions to form their own states, we're not going to give five sparsely populated counties of Maryland or nine similarly rural counties of Colorado statehood, along with a minimum of two U.S. Senators and three Members of Congress. And beyond the fantasy, counties that enjoy being heavily subsidized by their urban peers tend to wake up at some point to the reality of what their tax bills and public services would look like if they actually carried their own weight.

Calling People "Low Information Voters" Doesn't Validate Decentralization or Privatization

In a recent article in Forbes, Ilya Somin bangs one of his favorite drums, that of the "low information voter". Somin complains that many voters don't understand the background to the government shutdown, or how much government spending goes to Medicare, Social Security and foreign aid, then lectures, "This kind of basic knowledge may not be all voters need to know to form intelligent opinions. But it’s hard to do so without it." He complains further that even informed voters can misunderstand the evidence, and that people with conflicting political beliefs can interpret the same evidence as being consistent with their own beliefs. Somin, in essence, blames this all on laziness:
It is easy to blame ignorance on stupidity or on the media. But basic information about most political issues is readily available in the media and online. It isn’t hard to find out what Obamacare insurance exchanges are. The problem is that most people don’t take the time and effort to do so. That is not because they are stupid, but because there is so little incentive to acquire political information.
I'll take issue with both of Somin's assertions. First, there is a concerted effort by parts of the media, the so-called news entertainment industry, to mislead its audiences into believing utter nonsense. If you're marginally informed about the issues, you can't turn on Fox News or drive time talk radio without figuring that out. Whether or not people should trust that type of news source isn't really the point, as many people do trust those sources and end up being misinformed and distracted from the actual issues.

Further, the mainstream media has an unfortunate tendency to turn every policy debate into a horse race, taking the proverbial "view from nowhere" while airing guests and interviews that they know to be objectively false or misleading. It seems that people like the argument more than they like to be educated, and that conflicts and contests generate more readers and viewers than a dispassionate explanation of the facts and how one side of the issue has a weaker case or is just plain wrong. A case in point would be "death panels", an outright lie about the PPACA (Obamacare) that was concocted, refined and perpetuated by the right-wing media. The mainstream media could have snuffed out that absurdity at the outset, but instead treated it as if it was a valid claim and let partisans lie on the air about the meaning and effect of perfectly reasonable provisions of the PPACA. Even though those provisions were dropped as a result, to the detriment of elderly patients, many people remain convinced that the PPACA creates "death panels". The fault for that lies almost entirely with the media. Another example would be John Boehner's statements, including his widely distributed editorial, about the government shut-down. If you believe what he says you'll be less informed than when you began trying to follow the issue - but the media is not going to help you understand that, because they want Boehner to submit editorials or make appearances on their shows.

More than that, though, why does Somin believe people should have to dig for information in order to be informed, when they are reading the newspaper (or virtual equivalent) or watching the news and are being assured by the talking heads that they're getting enough information to be informed? Not everybody has the luxury to be a law professor, with many hours in an office to explore areas of esoteric interest, read the Internet, and discuss the issues of the day with informed colleagues. Most people work jobs that do not afford them similar luxuries, and it's perfectly reasonable for them to believe that the time they spend catching up with the news before heading to work, in the car, or after putting the kids to bed is sufficient to keep them informed. It's not what Somin deems being "rationally ignorant", with people knowing that they are informed but choosing to spend time on other things. It's a matter of there being only so many hours in the day, and the fact that the less you know the less aware you are of how much you need to learn.

Let's recall also that the concept of democracy does not anticipate that every voter will be informed on every issue, or even most issues. The Founding Fathers included a number of checks on popular democracy out of concern that the masses might make bad choices - hence the non-elected Senate and the Electoral College. Inherent in the popular vote is the notion of the wisdom of the crowd - the idea that the larger body of voters will collectively exercise a level of insight and wisdom that significantly exceeds that of the typical individual voter. That's not an excuse for failing to try to better educate the public about the issues of the day, but if you accept that concept it's much less worrisome that people wish that spending for foreign aid were only 1% of government spending... its actual level... instead of 10% or more.

From that rather dubious perch, Somin launches himself into a non sequitur:
The problem of ignorance is exacerbated by the enormous size and scope of government. Government spending accounts for well over one third of GDP, and government also regulates nearly every aspect of life. Even if voters devoted far more time to following political issues, they would still be ignorant about most government policies.... But we can help alleviate the problem by limiting and decentralizing government. When people “vote with their feet” in the private sector or in choosing what state or local government they want to live in, they have much better incentives to acquire information and use it rationally than when they vote at the ballot box.
That argument strikes me as utter nonsense. The federal government gets an incredible amount of media attention. The more local government becomes, the less media attention it typically receives. In many communities there's no meaningful media coverage of local government. While there have been corruption scandals at the federal level, in recent decades they have typically involved a small number of individuals with little impact on the overall functioning of government. The problem of corruption can be significantly more pronounced at the state level, as evidenced by Illinois, and can be appalling at the local level, as evidenced by Kwame Kilpatrick's Detroit. It is absurd to pretend that it is easier for voters to stay informed in relation to matters handled by a panoply of state and local governments, than it is to stay informed in relation to the actions of the federal government. Were that not true, one would reasonably expect that corruption and misconduct would improve at the local level, not significantly worsen.

Also, it's a rather arrogant conceit to suggest that if people don't like where they live, it's simple for them to sell their homes, uproot their families, get new jobs, and relocate to a community more to their liking. While there is no question but that when things get really bad a growing percentage of people will "vote with their feet", the significant number of people who stay behind reflect the reality that for most people it's not simple or easy, and for many it's an unaffordable luxury. More than that, there's more involved in choosing where you live than whether or not you like the state or local government, and moving to a different location that has a government more aligned with your beliefs may mean sacrificing social, employment and cultural opportunities that are important to your quality of life, as well as giving up easy access to top hospitals or similar local amenities.
Most of us spend far more time and effort acquiring information when we choose what car or TV to buy, than when we choose the president of the United States. The person deciding which car to buy knows that their decision is likely to make a real difference to the outcome. The same goes for a person deciding where to live in a federal system.
Perhaps he's speaking for himself, but I disagree with Somin's initial assertion. Most people give considerable thought to the president, and whether or not they will vote for or against an incumbent president. One might argue that much of that thought is partisan, and thus "doesn't really count", but that would sidestep Somin's claim - which is tantamount to arguing that people start thinking about their next car or television the moment they bring a new one home. I think people who vote hope that their vote makes a difference. A real difference in the outcome? Very few people who move to a new community do so with Somin's unrealistic notion that they will somehow influence local government simply by leaving one community or by joining another - I don't believe that I've ever met such a person. Moving to a different community based upon an assumption that it will be materially better than the place you live because of your perception of the political ideology of the local government seems like a fool's errand.

Somin presents no evidence, nor really any argument, that localized government would be cheaper, better, more transparent, less corrupt, more efficient... that it would provide any actual benefit to the public. He does not address the complexity of the modern nation state, nor the fact that some of the things of which he complains (e.g., ignorance of spending levels for foreign aid, Social Security and Medicare) are going to remain at the national level. He may have a pipe dream where programs like Social Security and Medicare become state-run programs, but the added complexity of such an approach should be obvious, as should be the impairment of people's ability to "vote with their feet" when they have to worry about how benefits are coordinated between states. But if you leave the big ticket items at the national level - military spending, Social Security and Medicare, along with matters that the federal government alone can reasonably address, such as managing foreign relations, regulating interstate commerce, providing the FDA, FTC, SEC, EPA, FEMA, USCIS, ICE, USPTO and the like, Somin's approach isn't actually going to simplify choices for voters at the federal level. It will instead add to the complexity of their other votes while doing next to nothing to simplify the federal government.

From there, Somin doubles down on his non sequiturs:
By reducing the size of government, we can enable more choices to be made in the private sector, where people have better incentives to become informed.
It seems that Somin imagines a future involving a lot more than localization of government, with government either outsourcing various tasks to the private sector or people being left without government support in certain areas of their lives and having to turn instead to private sector solutions. Somin is employed by a major university, and likely chooses his health insurance from a number of heavily subsidized plans offered by his employer - something akin to a health insurance exchange under the PPACA, but with his receiving a large subsidy for which he would be ineligible under that law. It's unlikely that Somin has ever tried to purchase health insurance on the individual market, let alone insure his family in that manner. It is unlikely that he has dealt with insurance companies' arbitrary denial of coverage, or trying to get individual coverage with a pre-existing medical condition. If he had, he wouldn't adhere to the fantasy that "more choices" result in a better informed consumer and even if all he cares about are incentives he would be aware of how difficult it is to actually compare individual health plans - even those offered by a single insurance company.

Were Somin to research the issue he would learn that having too many options can be paralyzing - people get confused, don't know how to draw meaningful distinctions, and may end up making their choices based upon arbitrarily selected, sometimes incorrect, information and assumptions. Somin may fret for hours or days about what TV to buy or what car to purchase, but at the end of the day his life will be little affected by whether he chooses a Camry or an Accord, a Vizio or a Samsung. If Somin believes his own conceit that voters are too lazy to educate themselves, his is a recipe for making that situation worse.

Thursday, October 03, 2013

Rep. Randy Neugebauer - Fool or Fraud

Via TPM,
Rep. Randy Neugebauer (R-TX) got into a heated exchange with a National Parks Service Ranger at the World War II Memorial over the closure of the park because of the government shutdown.

Neugebauer, one of a number of Republicans who have tried to use the closed memorial to bash the Obama administration and Democrats on the shutdown, confronted the ranger while surrounded by a crowd of onlookers.

Neugebauer asked the Ranger how she could turn World War II veterans away.

"How do you look at them and…deny them access?" the congressman asked.

"It's difficult," she responded.

"Well, it should be difficult," Neugebauer snapped.

"It is difficult," the Ranger said. "I'm sorry sir."

"The Park Service should be ashamed of themselves," Neugebauer said.
Yeah... the Park Service should be ashamed that Rep. Neugebauer and his Republican peers are throwing a childish tantrum instead of doing their job.

Medical Malpractice Litigation Really Isn't All That Special

One of the common threads you see in the criticism of medical malpractice litigation from the medical community is that medicine is special - it's so complex that it's just not reasonable to trust malpractice litigation to a jury of laypersons. We are told that we need "health courts", an undefined concept that generally appears to translate into "courts where people like me get to decide what is or is not malpractice." And all of that is fine, insofar as it goes, except...

There's actually nothing special about malpractice litigation. Many incredibly complex cases are resolved within the jury system. An argument could be made that all technologically complex cases should be removed from juries, but that's really an argument for the abolition of the jury system itself - and disregards a long and involved history of how we came to have juries in the first place, why we protect the right to trial by jury in the Constitution, and why we have perpetuated that system into the modern era. The fact is, not all malpractice cases are complex. You're supposed to remove the patient's diseased left kidney, and you instead remove the patient's healthy right kidney? That's not difficult to understand. And sometimes what one might assume to be simple cases, let's say a car accident, can end up involving experts in engineering, accident reconstruction, economic damages.... Let's not forget, also, that juries are called upon to decide incredibly complex cases involving business disputes, environmental contamination, intellectual property, antitrust.... There's simply nothing about the complexity of medical malpractice litigation that meaningfully distinguishes it from other types of litigation.

Dr. Robert Centor argues,
We need special health courts. The jury process induces lawyers to couch their words, use sophistry, and work hard to present part of the story. This is clearly true for both the defendant and the plaintiff legal teams. If we had special health courts, then we could have a nuanced discussion of all the details of patient care. A jury trial leads lawyers to focus on details and try to “make mountains out of mole hills”.
I've tried to engage Dr. Centor in the distant past about what he means by a "special health court", to no avail. Pretty clearly, his concept of the "health court" would be a non-adversarial court system, and would not involve a jury. Perhaps he's thinking of something akin to a coroner's inquest (of the type that does not use a jury), or some sort of inquisitorial tribunal system. One can only guess.

It's not at all clear, though, why he believes such a system could be created and yet be completely non-adversarial. It's difficult to imagine a doctor, accused of malpractice, beaming with delight at the thought of having his case tried in a "health court" as opposed to a regular trial court, or being content with lawyers who argue, "We're not actually representing you - our duty is to help the court find the truth, even if that means you're found to have been culpably negligent." All you have to do is look at the effort that doctors and the insurance industry have invested in keeping confidential the results of peer review in malpractice cases to recognize that the medical profession is not interested in placing all of the facts on the table.

We presently have trials that don't involve juries - bench trials - and although the conduct of the trial can be affected by the absence of a jury, even in medical malpractice cases the litigation remains adversarial. There are pros and cons to the adversarial system, certainly, but on the whole parties like to be represented by advocates who zealously represent their interests.

Dr. Centor, himself, highlights a significant flaw with his apparent notion of a "health court", in that no "health court judge" or tribunal will be expert in every area of medicine:
Some physicians will testify in cases about which they really understand little. Reading the depositions of some other physicians saddened me.
I will continue my personal philosophy of only accepting to testify in malpractice cases for which I believe I have clear expertise. Over the years I have accepted less than 10% of offered cases. I had testified once previously approximately 25 years ago, and had been deposed once in another case. But generally, I avoid malpractice cases because I do not consider myself qualified.
If you have judges, even specially trained judges, presiding over a trial where the proposed medical experts are testifying outside of their area of expertise, how is that an improvement over the current system? The judges can't be assumed to have a better understanding of medicine than the experts testifying in court, and Dr. Centor argues that many of those experts aren't qualified. If litigants seek out Dr. Centor due to his credentials and expertise, yet he deems himself qualified to serve as an expert in only 10% of those cases, by Dr. Centor's measure what are the odds that a randomly selected health court judge is going to be qualified to hear an assigned case?

For all of the kvetching, every comprehensive effort to review the outcome of medical malpractice cases suggests the same thing: Malpractice lawyers are very selective about the cases they take, when they take cases that turn out to be weak it's almost always because they lack the information necessary to fully assess the case and cannot get that information without filing a lawsuit, when there's ambiguity in a case juries tend to side with doctors, and to the extent that error occurs it usually favors the doctors. As is quite typical, even when arguing that the case shows a "need" for a health court, Dr. Centor argues that the jury came to the correct verdict. His concession reduces his argument to this: "The system worked, so let's replace it with something I can't define that I think will be more fair."

I'm not going to dispute this:
The psychological impact of these charges on the defendants was palpable. These hard working, conscientious defendants had years of having these charges hanging over their heads. They did nothing wrong. That really does not matter in jury trial.
The distinction between professional negligence litigation and standard negligence litigation is that somebody is pointing their finger at you and claiming, "You weren't competent in this case, and your incompetence caused somebody to suffer an injury." One of my law school professors liked to edify his students by explaining legal practice in very blunt terms. One of his declarations was, "You will all commit malpractice." The fact is, everybody makes mistakes - the big question being, how you respond when you make a mistake. Most mistakes can be fixed and, if you detect your error or omission quickly enough, harm can be minimized or avoided.

When you look at why patients look for malpractice lawyers or bring malpractice lawsuits, you find that bedside manner is a huge factor. When something went wrong, was the doctor helpful? If the doctor made a mistake, did he apologize? Rick Boothman, chief risk officer for the University of Michigan Health System, has advocated for years that doctors and hospitals change their approach to litigation, and has documented that an approach of disclosure, apology and cooperation reduces the number and cost of malpractice claims. No need to reinvent the system. It's an approach more institutions and doctors should take.

A comment on Dr. Centor's blog suggests, as an argument for "health courts", "Attorneys will drag out a weak case in hopes of a settlement getting something instead of nothing." The commenter confuses the exception with the rule. The principal reason that malpractice litigation drags on as long as it does is not due to plaintiff's lawyers. It's due to the successful lobbying by the medical malpractice insurance industry for measures that increase the cost of litigation for a plaintiff and prolong the litigation process. By imposing up-front costs and delays, small but meritorious malpractice cases are squeezed out of the system. The longer a case drags on, the more likely it is that a seriously injured malpractice victim will settle for less than the case is worth. As Mr. Boothman indicates, plaintiff's lawyers are happy to work collaboratively with a doctor or hospital to arrive at an early settlement. No plaintiff's attorney wants to invest $50,000 to $100,000 or more in taking a case to the point of trial (and yes, malpractice litigation is extremely expensive) if they can settle it quickly. The exceptional cases involve the late disclosure by the defendant of information that undermines the plaintiff's case, where the plaintiff's lawyer then angles for a modest settlement to try to avoid taking a loss on the case, or where the plaintiff's lawyer isn't competent to litigate malpractice cases in the first place.

A last point on "health courts": Let's assume an efficient health court system that accurately distinguishes actual malpractice cases from maloccurrence that results from non-culpable negligence, outside factors or bad luck. I very much doubt that health insurance companies would support the implementation of such a health court system. Why? Because right now, only a very small percentage of valid medical malpractice cases are prosecuted. The estimate is usually around 12%. The rest of the cases involve patients whose cases are too small to litigate under the present system, patients who lack the capacity or understanding to pursue a malpractice case, patients who dread the thought of litigation and, perhaps most importantly, patients who like their doctors. If you create a sufficiently painless system, with efficient, low-cost resolution of malpractice claims, inspiring a significant percentage of that majority to pursue their valid claims, the amount paid out to settle claims could increase substantially. Even if Dr. Centor believed that such a system would be better than the status quo, the malpractice insurance industry would fight its implementation, tooth and nail.

Tuesday, September 24, 2013

Where We Could Really Use the "Next Steve Jobs"

A lot of people focus on the smartphone market, and complain with each new Apple product that... Steve Jobs would have done something different, or better, or both. Steve Jobs brought something unusual to Apple, specifically a willingness to make huge gambles on theoretical technology, and to release products that could turn out to be failures. Apple seems to have become exceedingly cautious, but I'm not sure that is so much the result of a change in the company's philosophy as it is a change in consumer expectations. The iPhone 4 antenna issue, and the Apple Maps brouhaha, suggest that consumers want nothing less than perfect and, rather than launching risky products that might inspire a mixed reaction or turn out to be the next Newton (or Zune), caution has spread across the industry.

The real story behind the focus on portable electronics is not so much that a life-changing innovation is just around the corner. It's much more that there is profit in the upper end of the market, the mass market having already been commoditized. Smartphone advances reflect the importance of competition as, even though Apple sees the rise and fall of Nokia as a cautionary tale, history suggests that product development in a commoditized market tends to be slow. Most companies see little to no point in spending hundreds of millions of dollars to marginally improve a product that will likely sell at the same price point as before. That's the sort of context in which a short-sighted CEO of a company like Hewlett-Packard might decide that it no longer makes sense to fund research that is not directly aimed at turning a profit, or why a similarly short-sighted company's products might go from excellent to "good enough" in order to increase margins by decreasing production costs. (Am I talking about the same company?)

One might argue that televisions have seen marked advances in technology despite being a largely commoditized market, but that has been driven in no small part by the introduction of HDTV and the money poured into the development of new displays for computer users and commercial settings. Even in that context, major players like Panasonic have a very difficult time turning a profit, and the pool of companies that produce television displays and sets is not expanding.

One area that has seen a surprising lack of innovation is the desktop computer market. That's in part because it's a tough nut to crack - computers do pretty much what we want them to do, there are no obvious ways to dramatically improve the user interface, and the technologies for interacting with computers other than through a mouse and keyboard tend to focus on niche users or turn out to be largely impractical. It may be that one day we'll have displays and "no touch" gesture controls as shown in the film, "Minority Report", but that's not on the horizon. Basically, the desktop computer market seems a lot like the television market. To the extent that incremental improvements are seen, they're in no small part the result of R&D in the mobile marketplace. The biggest "innovation" we've seen in a desktop operating system was Microsoft's annoying, clumsy interference with the user experience by putting a "smart tile" display between the computer user and the desktop - that is, they tried to make the desktop experience more like mobile, never mind whether that makes sense. Apple has made similar, albeit less in-your-face changes to its desktop operating system, with its Launchpad and App store, but they're really not part of the ordinary desktop experience.

Somebody commented to me recently that Apple seemed to be "giving up" on the competition for desktop computers. I responded that they're chasing money and market share, and that right now they can find both in the mobile space while there is little incentive to try to claw out a greater market share in the desktop market. The cost of significantly expanding their desktop presence would be significant, and there's really not much money to be made in that market. Were Apple to start producing $300 - $600 portable computers it might find a market, but it would have to make the quality cuts that are readily apparent in computers in that price range, potentially costing it brand loyalty over the long run in the same manner that the low quality Apple products of the Sculley era damaged Apple's reputation and competitiveness. Why mass produce low-cost computers that have to be sold at tiny margins and that would likely have an impaired user experience, when you can continue to sell $1000+ computers that people enjoy using, and sell millions of highly profitable iPads to the sub-$1,000 market?

Really, though, the desktop industry needs to be woken from its complacency, much in the manner that Google and Apple rebooted then-stagnant browser development with Chrome and Safari. The problem being, you need a company that either sees a long-term gain in developing new technology at a significant short-term cost, the way Xerox PARC laid the foundation for the computer mouse and windows-driven displays, or invests because it doesn't want to be indentured to a competitor's product. And if you take the HP Labs / Xerox PARC approach, you also need a visionary who can see how a new idea can be improved and put into widespread use - after all, it was Apple, not Xerox, that turned the mouse and menu/windows-driven interface from an impractical lab-based demo into the desktop standard.

The manner in which the world, and Apple, has changed is perhaps best illustrated by today's quiet announcement that the iMac has been updated. You can go to the Apple Store and buy one today - but the new version isn't even flagged as "new". A secondary illustration comes from the Mac Pro, the high-end computer Apple develops for the professional market, which is soon to be released in an innovative new case. But that's innovation in the same sense as the Mac Mini was an innovation - great design and packaging, but nothing you couldn't have accomplished in a traditional mini tower case. Apple did promote the redesigned Mac Pro, some months back, but when will it actually come to market? Later this year. There's no sense of urgency, as there is in the highly competitive mobile marketplace.

An argument can be made that when a technology reaches a certain point of maturity, all new developments will be incremental. Perhaps the keyboard and mouse-driven desktop computer are pretty much it - and unless the entire concept is reinvented (much as the iPhone reinvented the smartphone market) this is it. People seem disappointed when the new "state of the art" smartphone looks like the old one - as if there's a great deal you can do to differentiate the hardware of a typical smartphone in ways that are obvious or exciting. Even in that market, unless a new, disruptive technology comes along the biggest future changes will come through software. In fifteen years, today's typical smartphone and tablet apps are likely to look about as sophisticated as Pong. But still, it would be nice to have a sense that somebody out there - somebody positioned to disrupt the market - was looking at "impractical, unworkable" new ideas from a different angle, and asking, "What if...."

Tuesday, September 10, 2013

The Not-So-Cheap, but Colorful, iPhone

With Apple releasing a "cheap" iPhone that's $100 less expensive than its flagship model, and with the price gap being even smaller in China, the Wall Street Journal opines that the new iPhone may not be cheap enough for China.
With the cheaper phone, Apple will no doubt gain sales and market share. But it will still fail to reach the majority of city dwellers, according to a projection from the Wall Street Journal based on income distribution data from research firm CEIC Data.

The projection, developed in consultation with analysts, assumes that a working, urban family would be willing to spend, at most, half of its total monthly income on a single smartphone. Working on that assumption, around 260 million Chinese urban residents could potentially be willing to buy the iPhone 5c. That means cheaper iPhone effectively doubles Apple’s addressable market from the 125 million who would be willing to shell out for the more expensive handset.
My guess? Apple will sell iPhone 5c's as fast as it can make them, rendering moot the idea that it would sell more at a lower price. You can only sell your product as fast as you can make it. The new design is visually striking, and I think that's about more than just giving people a variety of colors to choose from. I think Apple designed the 5c with the goal of letting its customers in nations like China telegraph to their peers, "I can afford an iPhone." Given how status-conscious and luxury brand-focused Chinese consumers are reported to be, that's no small consideration.
It also means that an Apple phone is still too pricey to appeal to roughly 430 million people, or 62% of the country’s urban population.
So... only 263 million prospective customers, who happen to be the more affluent members of Chinese society. Apple sold 31.2 million iPhones last quarter, worldwide.
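As a sanity check, the WSJ's quoted figures do hang together arithmetically. The numbers below are the article's rounded estimates, which is presumably why the addressable-market figure appears as both "around 260 million" and "263 million":

```python
# Sanity check of the WSJ projection, using the article's rounded figures.
flagship_market = 125_000_000   # urban Chinese who could afford the flagship iPhone
iphone_5c_market = 260_000_000  # urban Chinese who could afford the iPhone 5c
priced_out = 430_000_000        # urban Chinese who could afford neither

# "effectively doubles Apple's addressable market"
print(iphone_5c_market / flagship_market)  # ~2.1

# "62% of the country's urban population" remains priced out
urban_population = iphone_5c_market + priced_out
print(round(priced_out / urban_population * 100))  # 62
```

With roughly 690 million urban residents implied, the 430 million priced out is indeed about 62%, and 260 million is roughly double 125 million, so the article's claims are internally consistent.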

I appreciate the article's assertion that most Chinese consumers are looking for a phone that costs less than half of the price of a new iPhone, with many wanting to pay less than a quarter. But that's not a market that Apple is presently willing to serve, nor would it make sense for Apple to abandon its traditional business model and to start producing iPhones that could be sold at that price level. Will that give Android an advantage among bargain hunters? Yes, as will the array of larger-screened Android phones. But you don't make profits by selling low-quality merchandise at razor-thin margins, which is why Apple and Samsung are the only mobile phone manufacturers who are presently earning a significant profit from phone sales.

Smartphones will eventually become commoditized, a process that is accelerating with the release of attractive Android phones from several of Samsung's competitors. Apple won't be able to sustain its margins forever, although it is positioned to be the last man standing. The trick then becomes, how to leverage your platform into continued, significant profits. Apple has a number of advantages in that respect, including the fact that it has been careful to maintain backward compatibility in its devices. With most iPhones running the latest version of iOS, and with every iPhone built to Apple's standards, third party manufacturers can develop products and services that connect with the iPhone much more easily than they can with Android devices. It can only help Apple if, while Android remains the operating system of choice for bargain hunters, it holds its position as a phone of choice for affluent consumers around the world.

Even as the WSJ article suggests that Apple needs to make cheaper phones, it acknowledges the problem with that position:
Ma Tao, who owns a shop on the second-story of the electronics mall, echoed concerns that have already been voiced by some analysts: that the new phone would lead to a short-term spike in sales, but that it would erode Apple’s reputation as a maker of luxury high-end phones in the long run.
Meanwhile, the phone vendor they interviewed suggests that Samsung sales in China will suffer as customers opt instead for a considerably cheaper, Chinese "equivalent". It may turn out that Chinese consumers opt for the flagship iPhone, in silver, graphite or tacky gold, and those products are also designed to telegraph, "This is an iPhone". I'm taking a "wait and see" position on whether keeping the price "that close" will turn out to be a mistake, but I'm suspecting that for now it's a good move to maintain luxury pricing and to appeal to brand-conscious Chinese consumers, giving them a discount but also a brightly colored excuse to argue, "I picked this one because it's my favorite color, not because I wanted to save money".

I think that the article glosses over one of Apple's significant problems in the Chinese market - the size of its displays. The concept of a phone that you can operate with one hand is great, and Apple should continue to offer the standard-sized iPhone. But if you've ever squinted at a small screen, consider what it would be like to read Chinese or Thai characters on an iPhone screen, or to enter text in an Asian script. Then consider the population that wants a bigger screen because it's better for videos and games. Or because they only want one device, and are attracted to the phablet.

If I were Apple... and I admit an uncanny knack for being incorrect in my Apple-related predictions... I would be thinking about releasing a larger-format iPhone no later than a year from now, and ideally in the spring.

"My Client Did Nothing Wrong"

Oh, boy...
Monday afternoon, Shellie Zimmerman called Lake Mary authorities to her parents' home, saying her estranged husband was threatening her and her father with a gun. Days earlier, she had filed for divorce... She later changed her story. According to police, Shellie Zimmerman and her father now say they never saw a gun, and no gun was found. Although CBS affiliate WKMG reports that Zimmerman's attorney, Mark O'Mara, said Zimmerman had a gun holstered to his body.

Shellie Zimmerman has said she won't press charges, but police say video of the alleged dispute on her damaged iPad could play into whether authorities file charges.... In her 911 call, Shellie Zimmerman said: "He then accosted my father then took my iPad out of my hands. He then smashed it and cut it with a pocketknife, and there is a Lake Mary city worker across the street that I believe saw all of it."....

Mark O'Mara, who served as Zimmerman's attorney in his murder trial in the death of Florida teen Trayvon Martin, said his client did nothing wrong in Monday's incident.
By "nothing wrong", does he mean "nothing criminal"? Let me guess... the iPad was severely depressed and, as much as Zimmerman tried to keep it in his wife's hands, he was unable to stop it from taking a suicidal tumble.1 Also, if O'Mara was telling the truth and Zimmerman was carrying a gun, where did he and his wife seemingly conceal it while waiting for the police to arrive - and why?

I suspect that the "defense fund" gravy train is slowly going off of its tracks....
1. Or maybe the iPad was wearing a hoodie and he reacted reflexively.

Monday, September 09, 2013

Dennis Ross: A "No" Vote on Syria Makes a Military Strike on Iran Both Less Likely and Almost Inevitable

Dennis Ross, the man who helped engineer the catastrophic failure of President Clinton's last-ditch efforts to secure an Israel-Palestine peace deal, was given space in the Washington Post to bring the same special brand of help to the crisis in Syria. His central thesis is that if Congress blocks the President from bombing Syria, it becomes more likely that military action will be taken against Iran. He thus urges Congress to abandon its concerns over whether the bombing would accomplish anything constructive, whether it would suck the U.S. into a protracted conflict, and whether U.S. interests are even implicated by events in Syria, and to authorize military action.

After telling Congress to ignore the slippery slope argument, in which military action in Syria "suck[s] us into a conflict that we cannot win", Ross offers a slippery slope of his own:
President Obama and Secretary Kerry have pointed out that there will be a great cost to international norms that prohibit the use of terror weapons such as chemical weapons. And surely they are right that if Bashar al-Assad can gas his own people and elicit only harsh words but no punitive action, he will use the weapons again. The price in Syria and the potential for spillover in the region are certain to be high. Additionally, other rogue actors may also draw the conclusion that chemical weapons are not only usable but that there are no circumstances, no outrages, no genocidal actions that would trigger a meaningful reaction from the so-called civilized world.
I think that's utter nonsense. First, in relation to Syria, Assad has repeatedly denied using chemical weapons, and the few nations that back him are hanging their hats on that denial. If Assad were to now announce, "Hah! Now that Congress has voted, I admit it - it was me!", it would be extremely difficult for him to avoid a Security Council resolution authorizing use of force. Putin may be a stubborn man, but after making repeated denials about Assad's involvement in the attack, it is difficult to believe he would have much interest in protecting Assad by vetoing the resolution. Second, were Assad to engage in additional attacks, any plausible deniability that remains from the prior incident would evaporate. It is difficult to believe that Congress would vote against military action, and even more difficult to believe that the President would sit and wait for approval from Congress if he believed it would not be immediately forthcoming. I don't think Ross believes what he suggests.

Second, in relation to the rest of the world, Ross is pushing the nonsensical idea that despots are eager to use chemical weapons against their own populations, but are held back by fear of military reprisal by the world's powers. With a whopping two significant uses of chemical weapons since World War II, the first (and more significant) use being the repeated use of chemical weapons by Iraq during the Iran-Iraq war, and the second being this attack by Assad, it's difficult to see why Ross would imagine that despots are in fear of reprisal. If Syria is not attacked, that would be in keeping with the precedent seen in Iraq. Also, if despots are so eager to use chemical weapons, why weren't they emboldened by Hussein's use of those weapons? Perhaps Ross seriously believes that despots around the world would respond to a military strike by thinking, "Dang, just as I was about to add chemical weapons to my use of torture, imprisonment without trial, summary execution, military strikes, collective punishments, internment camps, starvation, and occasional genocide, the world has made clear that I can't use them. Now I have no choice but to stick with my tried and true methods of perpetuating my regime," but again I doubt that he believes a word of what he said.

After mocking those who emphasize that military strikes should be used only against "threats that are immediate and directly affect us" as regarding other concerns as "abstractions", Ross introduces his argument about Iran:
Leaving aside the argument that when the threats become immediate, we will be far more likely to have to use our military in a bigger way and under worse conditions, there is another argument to consider: should opponents block authorization and should the president then feel he cannot employ military strikes against Syria, this will almost certainly guarantee that there will be no diplomatic outcome to our conflict with Iran over its nuclear weapons.
Note his use of language: "...over its nuclear weapons." Iran has no nuclear weapons.
I say this for two reasons. First, Iran’s President Rouhani, who continues to send signals that he wants to make a deal on the nuclear program, will inevitably be weakened once it becomes clear that the U.S. cannot use force against Syria. At that point, paradoxically, the hard-liners in the Iranian Revolutionary Guard Corps and around the Supreme Leader will be able to claim that there is only an economic cost to pursuing nuclear weapons but no military danger. Their argument will be: Once Iran has nuclear weapons, it will build its leverage in the region; its deterrent will be enhanced; and, most importantly, the rest of the world will see that sanctions have failed, and that it is time to come to terms with Iran.
First, President Rouhani has been in office for, what, a month? Yet we are to believe that it is his voice of reason that keeps Iran from developing nuclear weapons? Does Ross want us to believe, then, that Mahmoud Ahmadinejad urged similar restraint and that it was his calming influence that prevented Iran from developing nuclear weapons during his eight years in office? That Ayatollah Ali Khamenei is going to suddenly declare, "I know I've been arguing for years that nuclear weapons are a sin, are forbidden in Islam, and should be eliminated from the planet, but I've changed my mind!"

Ross then argues, without explanation, that "Under those circumstances, the sanctions will wither". Why would he believe that? I don't think he does believe that and, once again, he feels no need to substantiate the claim. He instead insists that if the U.S. is blocked from using force in Syria, a different country involving different facts and different weapons, Iran will decide it will never be attacked and acquire nuclear weapons. He then contradicts himself,
Israel, however, is not prepared to accept such an eventuality, and that is the second reason that not authorizing strikes against Syria will likely result in the use of force against Iran. Indeed, Israel will feel that it has no reason to wait, no reason to give diplomacy a chance and no reason to believe that the United States will take care of the problem.
Needless to say, if a "no" vote on striking Syria makes a military strike against Iran more likely, the government of Iran is not going to conclude that it will never be attacked and forge ahead, full speed, with its nuclear weapons program.

Does Ross believe that Iran has never heard of Israel? That Iran is unaware of Israel's military power? That Iran is unaware of Israel's position on other nations developing nuclear weapons? That it somehow passed beneath Iran's notice that Israel has launched military strikes against nuclear facilities in both Iraq and Syria when it decided those nations were too close to developing nuclear weapons? Apparently so, given his expressed certitude on how Iran would react to a "no" vote in Congress on the use of force in Syria.

Ross makes a cryptic statement,
Ironically, if these opponent [sic] succeed, they may prevent a conflict that President Obama has been determined to keep limited and has the means to do so.
Is Ross speaking about Congressional opponents of a military strike, the only people otherwise described in Ross's editorial as "opponents"? If so, and their "no" vote prevents a conflict that the President is trying to limit, wouldn't that be a good thing? Or is it that Ross thinks the President is wrong to want to keep that conflict limited? Based upon his next statement,
After all, even after Israel acted militarily to enforce its red line and prevent Syria’s transfer of advanced weapons to Hezbollah in Lebanon, Assad, Iran and Hezbollah have been careful to avoid responding. They have little interest in provoking Israeli attacks that would weaken Syrian forces and make them vulnerable to the opposition.
Perhaps Ross means to suggest that if no strike is made against Syria, Hamas and Hezbollah will continue to show restraint rather than trying to widen the conflict? Does Ross believe that those factions would respond to a U.S. strike by engaging in acts that would invite a massive U.S. or Israeli military response? If Ross seriously believes that Hamas and Hezbollah will launch attacks in Lebanon, Israel and perhaps around the world in response to a strike on Assad, isn't that a solid argument for restraint? Ross undermines any such suggestion, admitting, "the Syrian and Iranian interest in an escalation with the United States is also limited". He continues,
Can the same be said if Israel feels that it has no choice but to attack the Iranian nuclear infrastructure? Maybe the Iranians will seek to keep that conflict limited; maybe they won’t. Maybe an Israeli strike against the Iranian nuclear program will not inevitably involve the United States, but maybe it will — and maybe it should.
Maybe Ross will attempt to do more than toss out mutually inconsistent alternative scenarios about what the future might hold; maybe he won't. Maybe Ross will provide a coherent, logical argument; maybe he won't - but maybe he should.

To summarize... If the U.S. doesn't strike Syria, Iran will conclude that the U.S. won't strike its nuclear facilities, demolishing the credibility of the new Iranian president who has spent a full month advocating a diplomatic resolution of the controversy over Iran's nuclear program, and inspiring Iran's Supreme Leader to abandon his stated views on Islamic law and seek to immediately acquire nuclear weapons. This will happen even though Israel, a nation with a decades-long history of using espionage and military strikes (and is rumored to have used assassinations) to stop other nations in the region from developing nuclear weapons, will almost certainly strike Iran's nuclear facilities if it senses that Iran has not abandoned its nuclear weapons program. This will result in a regional conflict that may or may not involve the United States, but that Ross apparently believes should involve the United States.

And this argument is supposed to... convince Members of Congress to support a strike?