So Just What did You Expect, Again?

Via a “share” from a Facebook friend’s page, we have this gem over at AlterNet.org.  I will observe that this is not the first article at that site that my friend has shared, on FB or otherwise.  I find the self-consciously cute name they’ve chosen for their site to be more than a bit ironic.  You see, it’s a play on the word “alternate,” from which we are to deduce “alternative,” from which we are to conclude that this site is purveying news and opinion that’s somehow edgy, “alternative,” out there, or otherwise not just one more dead-fish organization going with the flow of the stream.  Except it is; what you’ll find there is pretty standard left-extremist claptrap.  Like, for example, the linked article.  We are all racists now, it seems.  As evidence for “subconscious racial bias” arising from “the most enduring, corrosive racial stereotype in America: the black-as-criminal mindset,” we have the observation:

“The archetype is so prevalent that the majority of whites and African Americans agreed with the statement “blacks are aggressive or violent” in a national survey.  In support of these findings, other research indicates that the public generally associates violent street crime with African Americans. Other nationwide research has shown that the public perceives that blacks are involved in a greater percentage of violent crime than official statistics indicate they actually are.”

There are two links in the just-quoted text at AlterNet.org, both to an article from 2007 over at the Journal of Contemporary Criminal Justice.  For the first link, no specific numbers are given in the linked JCCJ article, only the characterization that a “clear majority” of both groups agreed.  As to the lattermost assertion, I could not find in the linked article any figures on the public perception of the percentage or prevalence of black crime relative to the actual proportion of all crimes committed by blacks.  What I did find are some specific numbers on all crime committed by blacks relative to their statistical share of the gross population (remember that this article is now seven years old, and the research it’s based on presumably older still, so these numbers may well be out to lunch in one or more respects by now).  From the JCCJ:

Blacks are indeed involved in a disproportional amount of crime in general and violent crime in particular.  In fact, for violent crimes such as robbery and homicide, there have been times when Blacks were arrested in absolute numbers that surpassed those of Whites.  In more recent years, however, although Blacks did not surpass the actual number of Whites in nationwide arrests, their presence in these statistics has been greater than their representation in the general public. For example, although Blacks compose approximately 13% of the U.S. population, in 2002 they accounted for 38% of arrests for violent crimes and nearly 30% of arrests for property crimes. Juvenile arrest statistics indicate that during the same year, Black youth accounted for approximately 43% of arrests for violent crimes and 27% of arrests for property crimes. Researchers have suggested that crime committed by African Americans may be especially salient not only because it exceeds what would be expected based on the racial composition of the country but also perhaps because the violent crimes that tend to be most fearsome are the ones that are most disproportionately perpetrated by Black males.

Let’s see.  Black males constitute roughly 6.5% of the total population (half of 13% is female, right?).  Given that the overwhelming proportion of crime in general, and physically violent crime in particular, is committed by males, period, we can assume that somewhere north of 33% of violent crimes were accounted for by that 6.5%.  Which is also to say that black males (and the JCCJ article is predominantly about black males) account for violent crimes at over five times what you would expect if crime statistics were evenly distributed across all demographic groups (male/female, age, ethnicity, origin, etc.), and almost five times the rate for property crimes.  Remember that crime in general and violent crime in particular are not age-neutral; they skew strongly towards youth.  The numbers for black youth are even more alarming.  I haven’t seen an age pyramid for black youth, but since birth rates trend negatively with increasing wealth, I’m going to assume that black youth accounts for something north of 13% of all youth.  Let’s assume 18% of all youth is black, making 9% of all youth both male and black.  That 9% of youth accounts for violent youthful offenses at a rate 4.78 times their “statistical expectation,” and exactly triple the rate for property crimes.
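
Just to make that arithmetic explicit, here is a minimal back-of-the-envelope sketch in Python.  The population shares are the assumptions I just stated, not census figures:

```python
# Back-of-the-envelope over-representation ratios. The population shares are
# the assumptions stated above (my guesses), not census figures.

def over_representation(arrest_share, population_share):
    """Ratio of a group's share of arrests to its share of the population."""
    return arrest_share / population_share

black_male_share = 0.065       # assumed: half of the ~13% black population is male
print(over_representation(0.33, black_male_share))   # violent crime: ~5.1x
print(over_representation(0.30, black_male_share))   # property crime: ~4.6x

black_male_youth_share = 0.09  # assumed: 18% of youth black, half of those male
print(over_representation(0.43, black_male_youth_share))  # violent: ~4.78x
print(over_representation(0.27, black_male_youth_share))  # property: exactly 3x
```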

Thus, purely as a matter of statistical reality, the popular perception that blacks are more prone to commit violent crimes (or even property crimes) coincides with the observable data.  I wish the JCCJ article had broken some of the numbers out in greater detail (alas! there are no hyperlinks in it).  I’d be interested to know what that “percentage of crimes committed by” figure looks like when you add to the question the qualifier “as experienced by members of specific groups.”  Thus, what percentage of violent crimes committed against blacks are committed by other blacks?  And the same question for whites (and East Asians, and South Asians, and Aboriginal Americans, for that matter).  From everything I’ve ever read, the answer to the first question (what share of the violence experienced by blacks comes at the hands of other blacks) approaches depressingly close to 100%.  Small wonder that blacks might perceive each other to be prone to violence, when almost all the violence they experience is in fact at the hands of their own ethnic group.  And in fact the overwhelming percentage of black violent crime in general, from everything I’ve heard, is directed at other blacks.  Meaning that you’d expect white victims of black crime to account for a good deal less than that 38%.  This would, again, match everything I’ve ever read, namely that all ethnic groups experience violence principally from members of their own group.  It wouldn’t surprise me at all if white victims accounted for something less than 13% of black crime.  So why might whites in general entertain that non-statistically-valid perception (as to themselves only)?  Unfortunately I don’t have hard numbers, but my understanding is that to the extent whites experience violence from outside their own group, that violence comes nearly exclusively from blacks.  As a matter of logic it doesn’t make a whole lot of sense to extrapolate from that data point.  If I’m getting robbed at gun-point, or if a friend of mine has been robbed, what difference does it make what color skin the perp had?  But humans aren’t always the most logical creatures.  As a child growing up on the playground I was probably bitten by more humans than dogs, if you were to go back and count.  But the one bite of my life I still remember is when the German shepherd sank her teeth into my butt, way back in the early 1970s.

You won’t find much cogitation in that AlterNet.org article along the lines of the preceding paragraph.  What you will find are outright misrepresentations along the lines of, “Remember Zimmerman’s false syllogism?  A few blacks committed burglary, Trayvon was black, therefore Trayvon was a criminal.”  No.  What came out at the trial is that the housing development where Zimmerman lived had been the subject of multiple break-ins, at least some of which George Zimmerman had observed, and at a minimum those which he’d observed had been committed by black males (my understanding is that to the extent the race and sex of the other perps were known at all, it was black and male).  Zimmerman observed Martin (you remember him; he was the one who was trying to splatter George Zimmerman’s brains onto the sidewalk) wandering in the rain, pausing and looking into the windows of housing units.  Martin may have been lost or disoriented, or just curious as to what sort of people lived in the place he was visiting.  But from Zimmerman’s perspective it looked like someone casing the joint.  And that’s how he reported it.  Contrary to the impression created by the 911 transcript fraudulently edited by the news networks, it wasn’t Zimmerman who brought up Martin’s skin color.  He didn’t mention skin color until he was specifically asked about it.  The “syllogism” claimed is simply bullshit.

In what she no doubt prides herself on as her demonstration, our author starts with the usual recital of America’s foundations in slavery, and the post-slavery history of violence against blacks committed by whites, in the form of lynching.  [A couple of observations are here in order.  For starts, given the explicitly racist practice of most law enforcement until the 1960s, you have to assume that for most of American history the vast majority of violent crime against blacks, committed by anyone, never made it into the official numbers.  Blacks were treated as sub-human, so who cared if they were robbed, beaten, murdered, stabbed, raped, etc.?  A lynching gets attention; knifing someone in a bar fight over a woman, not so much.  Secondly, given how geographically concentrated the black population was until post-1910, you have to assume that black-on-white crime was vanishingly rare.]

As Gentle Reader might suppose, there are pretty detailed data on lynchings by year, and in fact by race as well.  Here’s a tabulation maintained by the Tuskegee Institute, for 1882 through 1968.  Not that it matters a hill of beans for this discussion, but you could have won some money off me betting that the number of white victims would have exceeded the number of black victims for any year at all . . . and yet for the first four years that’s exactly what happened.  Look at the total for both races for the 83 years from 1882 until the last recorded, in 1964: 4,742, of whom 3,445 or 72.649% were black.  To put some perspective on it:  That’s only about 500 more total victims than the number of race-unknown homicide offenders in 2010 alone (links to FBI and Census Bureau data below), and it’s less than the number of white offenders for 2010 and less than the number of black offenders for 2010.  It’s not quite 36.5% of the homicide victims for 2010 alone.  To put some even more distressing perspective on that:  Across the entire Reconstruction and Jim Crow eras, not quite 73% of all lynching victims were black; in 2010 alone, 49.78% of all murder victims were black.  Even after a good 15 or more years of dropping violent crime statistics, we’ve got a problem that’s two-thirds as exclusively black as lynching was.  Someone remind me again why this isn’t getting more play in the lamestream media.
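
The comparisons in that paragraph reduce to a handful of divisions; a quick sketch, using the Tuskegee and FBI totals as cited:

```python
# Lynching-era totals (Tuskegee) against 2010 homicide victims (FBI), as cited above.
lynch_total, lynch_black = 4742, 3445
print(lynch_black / lynch_total)          # 0.7265 -> ~72.65% of lynching victims were black

victims_2010, black_victims_2010 = 12996, 6470
print(black_victims_2010 / victims_2010)  # 0.4978 -> 49.78% of 2010 murder victims were black

print(lynch_total / victims_2010)         # 0.3649 -> 83 years of lynchings ~36.5% of 2010's victims
print((black_victims_2010 / victims_2010) / (lynch_black / lynch_total))
# ~0.685 -> a problem roughly two-thirds as "exclusively black" as lynching was
```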

I think it’s pretty safe to assume that the number of black lynchers was zero, so you’ve got to attribute all of those victims, white and black, to white perps exclusively.  But how many “offenders” were there?  I don’t think it’s appropriate to ask just how many people were actually involved in making the noose, tying the victim, looping the rope over the tree or lamp post, or whatever.  I think you have to attribute some moral guilt to at least some of those who showed up, even if only out of curiosity.  I have no idea whether anyone has ever even attempted to figure out how many people attended these things.  How many of them drew a large crowd for a small town?  How many of them were just a couple or ten people in the dark of a night?  How many drew a crowd of thousands, as several well-known lynchings did?  So let’s just assume an “average crowd” of 750.  Gentle Reader is reminded how few places in the South had other than minuscule populations during the years when most of the lynchings occurred (out of the 4,742 shown, 2,359 or 49.7469% had occurred by 1896, and 3,179 or 67.03922% had occurred by 1903 — whatever else it was, lynching as a widespread problem was overwhelmingly concentrated in the pre-World War I South, even though other states also knew it; for example, the lynching that prompted the poem “Strange Fruit” went down in Marion, Indiana, and even Minnesota can show at least one, of some circus hands).  So I don’t think 750 people is an unreasonably small number.  Applying that across 4,742 lynchings produces 3,556,500 “offenders,” and that’s if you consider all attendees equally guilty.  Now let’s ask ourselves how many tens of millions of people were living in the South during those 83 years.  I suppose a statistician could cipher that out, but I’d be amazed if the number were any less than two hundred million.  The 1900 census data show 18,975,665 people living in the eleven states that had seceded, out of a total population of 76,212,168; that’s 24.89847% of the gross.  In 1900 there were 115 lynchings.  Even if you assume an average crowd of 1,000 per (and I think that’s a grotesque over-estimate), and even if you attribute all lynchings to those states, that gets you to 115,000 people, or slightly over six-tenths of one percent of those eleven states’ population.  And yet we have tripe (on AlterNet.org, no less), such as the bilge I defenestrated here, in which the entire South is lumped into a single, seething, bloodthirsty mass.  Remind me again why this ahistorical bullshit is considered insightful analysis, and yet it’s conclusive evidence of racism! when popular perceptions of the prevalence of violence match observable statistics.
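
For what it’s worth, the guesswork in that paragraph runs like this (every crowd figure is, as stated, pure assumption):

```python
# Crowd-size guesswork made explicit; the average-crowd figures are assumptions, not data.
lynchings_total = 4742
assumed_avg_crowd = 750
print(lynchings_total * assumed_avg_crowd)   # 3,556,500 "offenders" if every attendee counts

south_1900, us_1900 = 18_975_665, 76_212_168
print(south_1900 / us_1900)                  # 0.2490 -> ~24.9% of the gross population

lynchings_1900, assumed_big_crowd = 115, 1000
print(lynchings_1900 * assumed_big_crowd / south_1900)
# ~0.0061 -> ~0.6% of those eleven states' population, even on generous assumptions
```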

Just as an exercise, I spent some time looking for data on homicides, age, race, and total population.  I also looked for data on mass killings (most things I’ve run across define a “mass killing” as one with more than three victims in a single event (e.g. Oklahoma City) or in a closely-related sequence of killings (e.g. Virginia Tech)).  I also tried to tie the data I found to the same year, since things can change radically and very quickly.  Remember that 2007 data above?  Well, the one thing that has been happening in the past six years is that violent crime of all kinds and, so far as I know, across all groups, has plummeted.  So even if a particular group X is “more disposed to violent crime than statistically predicted,” over the last six years it has become significantly less disposed to it.  In order not to spend more time than I have, I confined what I was looking at to homicide, since it’s the hardest to conceal and the most likely to be pretty fully reported.  I settled on the year 2010 so I could use the 2010 census data, available here.  The FBI has homicide data, both as to victims and perps, by age and race, here.  Finally, I couldn’t seem to find “official” numbers on mass killings, but Mother Jones has a tabulation on “US Mass Shootings, 1982-2012.”

A couple of cautionary notes about the data.  The Census Bureau reports, for racial self-identification, not only single-race responses but multiple-race responses.  The distinction can be significant when you’re talking about a gross population, as of Census Day 2010, of 308,745,538.  A total of 38,929,319 self-identified as black only.  That’s 12.60887% of the gross population.  But 42,020,743 self-identified as black-alone-or-in-combination.  That’s 13.610% of the population.  Unfortunately I could not find age distribution data for the 3,091,424 who self-identified as black-and-something-else, so I had to apply the percentage distributions of the age brackets reported to the larger number.  That requires some assumptions about birth patterns for which I have no support in the data I could find.  Secondly, there is a large difference in the FBI’s data between homicide victims (12,996) and homicide offenders (15,094).  That’s logical, though, because killing someone is a sufficiently egregious act that for a not insignificant number of perps it’s not something they’ve got the guts to do alone.  Additionally, while the data are pretty comprehensive on the sex, race, and age of the victims (e.g., out of the 12,996, only 152 or just over 1% are shown as age-unknown), you’ve got to bear in mind that out of the 15,094 perps, 4,224 are shown as race-unknown; that’s 27.98% of the total.  There is enormous room for conclusions to move.  Just by way of extreme example, if you attribute all those unknowns to whites, you get 60.11% of homicides done by whites; if you attribute 13.61% of them (575) to blacks, you get 6,345 done by blacks, or 42.04% of the total.  In addition to Mother Jones’s data being non-verified (although they’ve got zero reason to understate any of it, with their known political/policy affinities), it’s only mass shootings, which is of course a subset of mass killings.  So it’s not complete (see Oklahoma City and its 168 dead); on the other hand, it’s jolly hard to kill more than one person with a knife, baseball bat, or claw hammer (by the way, although not relevant to our current discussion, Gentle Reader ought also bear in mind that blunt instruments are used to kill more people each year in the U.S. than rifles of all kinds), so any discrepancies are unlikely to be very large.
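
Because that 27.98% race-unknown block is the biggest wild card in the offender data, here’s a small sketch of just how far it can move the conclusions.  The counts are the FBI figures as cited; the white offender total of 4,849 comes from the same FBI table, per the next paragraphs:

```python
# Sensitivity of the 2010 offender shares to the race-unknown block (FBI figures as cited).
offenders_total = 15094
white, black, unknown = 4849, 5770, 4224   # white total per the FBI table, cited below

# Extreme case: every race-unknown offender counted as white.
print((white + unknown) / offenders_total)      # 0.6011 -> 60.11% "white"

# Proportional case: 13.61% of the unknowns (the black population share) counted as black.
extra_black = round(unknown * 0.1361)           # ~575
print((black + extra_black) / offenders_total)  # 0.4204 -> ~42.04% "black"
```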

With all that in front of us, let’s look at the data.  First, the raw numbers.  Of the 15,094 homicide perps, 5,770 were black (more ominously, of the 12,996 victims, 6,470 were black, a catastrophic 49.78%, and for the age brackets between 17 and 39, blacks made up over 50% of the victims in every stinkin’ one of them), which is 38.23% of the total.  Whites, by the way, accounted for 32.13% (please to remember the race-unknowns, Best Beloved).  The overwhelming majority of all perps for whom sex is known were male (e.g., of the 2,546 perps aged 20-24, 2,315 were male, or 90.93%; the divide hovers around 90% male for every single age bracket).  So our first conclusion stares us in the face:  If you want to be afraid of someone killing you, be afraid of a generic male.  If you want to assume that someone is violent, assume it’s a male.
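
The raw shares just recited, as one more quick sketch:

```python
# Raw 2010 shares (FBI figures as cited above).
offenders_total, victims_total = 15094, 12996
print(5770 / offenders_total)   # 0.3823 -> blacks 38.23% of known offenders
print(4849 / offenders_total)   # 0.3213 -> whites 32.13% (race-unknowns excluded)
print(6470 / victims_total)     # 0.4978 -> blacks 49.78% of victims
print(2315 / 2546)              # 0.9093 -> ~90.9% of the 20-24 offender bracket was male
```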

But everyone (except perhaps the people who write for, and read, AlterNet.org) realizes that homicide is not evenly distributed across age, either.  For both white and black, it’s massively concentrated in the ages 15-40.  Let’s look:  For whites, 66.23% of their total (4,849) is accounted for in the 17 to 39 age brackets; for blacks the number is even higher: 72.65%.  Second conclusion:  If you’re going to be frightened of a putative murderer, don’t imagine him with a whole lot of gray hair.  By the way, it appears that whites remain violent to a later age than blacks.  The last black age bracket showing more than 100 perps is 50-54 (129); whites keep killing a full decade longer, until the 60-64 bracket (112).

We’re discussing a perception issue here, and whatever we perceive to be the level of violence associated with either race (high, medium, low), that perception, to the extent it is grounded in reality at all, is going to be grounded in the data for males ages 15-40 of both groups.  So let’s see how that shakes out.  Applying the age bracket proportions for black-only to the black-alone-or-in-combination figure produces a total of 7,670,023 males in the 15-39 age range, which works out to be 2.48425% of the 308,745,538 gross population.  Now let’s compare that to the percentage of homicide offenders in the 17-39 age range (the FBI’s next lower age bracket is 13-16, and among blacks they account for only 265 of the 15,094, so I feel comfortable ignoring them here).  The black 17-39 age range accounts for 4,192 of the 15,094 homicide offenders, or 27.77% of the total.  Let’s juxtapose that even closer:  2.48% of the population is accounting for 27.77% of the killers, more than ten times their “statistical expectancy.”

In the interest of comparison, using the same extrapolation of age brackets for white-only to white-alone-or-in-combination produces 37,210,162 white males age 15-39, or 12.052% of the gross population.  White homicide offenders in the 17-39 age range account for 3,212 of the 15,094 offenders, or 21.280% of the total.  Again the side-by-side:  12.05% of the population is accounting for 21.28% of the killers, not quite twice their “statistical expectancy.”
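
Here is the whole extrapolation in one place, for both groups.  The census and FBI totals are as cited; carrying the single-race age distributions over to the alone-or-in-combination totals is the assumption flagged above:

```python
# Population share vs. offender share, 2010, per the extrapolation described above.
gross_pop = 308_745_538
offenders_total = 15094

groups = {
    # name: (males aged 15-39 after extrapolation, offenders aged 17-39)
    "black": (7_670_023, 4_192),
    "white": (37_210_162, 3_212),
}

for name, (males_15_39, offenders_17_39) in groups.items():
    pop_share = males_15_39 / gross_pop
    off_share = offenders_17_39 / offenders_total
    print(f"{name}: {pop_share:.2%} of population, "
          f"{off_share:.2%} of offenders, "
          f"{off_share / pop_share:.1f}x statistical expectancy")

# black: 2.48% of population, 27.77% of offenders, 11.2x statistical expectancy
# white: 12.05% of population, 21.28% of offenders, 1.8x statistical expectancy
```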

So as to both racial groups, their young males are statistically over-represented among killers, with the degree of over-representation being five to six times higher among blacks.  A further point of commonality is that within that 17-39 range, the bulk of the killers are concentrated in the 17-29 range, and the deadliest single bracket is 20-24.  For both races you’ve still got guys committing murder into their 30s, but they’ve started to taper off (most likely because they’ve been caught and are in the criminal justice system somewhere).  A point of distinction, however, is how much of each group’s alone-or-in-combination population total is represented by males in those high-risk brackets.  Among blacks, 1.852% is represented by males ages 18-19; for whites the figure is 1.381%, a full 25.4% less.  For 20-year-olds the numbers are 0.8943% and 0.6887% respectively, 22.99% less for white males.  In that deadliest, 20-24 bracket, the percentages are 3.95632% for blacks and 3.31933% for whites, a drop of 16.1%.  In other words, in 2010 a greater proportion of blacks were concentrated in the age and sex group most likely to become homicide offenders.  The black population is both younger and, in the highest-risk group, more heavily male.  That’s going to skew the numbers somewhat.

And at this point we run out of the purely numeric and shade into the concept of the “reasonable.”  Is it unreasonable, when two groups are both prone to excessive behavior on some scale, but one group is five times further out on that scale, that popular perception — unscientific as it always will be — will still reflect that?  Would it be unreasonable for someone to conclude that, all else being equal, blacks make better athletes, when the four data points are the proportions of black males versus white males in the NBA and the NFL?  You can debate all damned day long about why it should be so, but to argue that it’s not so is just damned foolish.

But Miss AlterNet.org isn’t arguing that.  She’s not arguing that blacks overall or black males in particular are not statistically more prone to acts of violence than whites.  She’s not impugning the numbers; she’s impugning the perceptions.  She’s arguing that because the “most horrific” crimes are committed by whites, and we (as a country) don’t perceive whites in general to be disproportionately violent, that’s evidence that we’re all racists.  Let’s tee up Mother Jones, bearing in mind my caveats above.  Looking broadly at their spreadsheet, it rapidly becomes apparent that mass shootings are (i) predominantly a white phenomenon, (ii) overwhelmingly a white male phenomenon, and (iii) by and large a crazy white male phenomenon.  But let’s look at just the numbers, ma’am.  In 2010, out of 12,996 homicide victims, we’ve got . . . 9 killed in a mass shooting.  That’s not quite seven one-hundredths of one percent of the total.  I went back and added up all the mass shooting fatalities (Mother Jones gives numbers of wounded as well, by the way) since 1993, added in the 168 of Oklahoma City (but excluded the 3,000+ of September 11) and came up with . . . 588.  Thus, if you go back a full 21 years, you get 4.5245% of one year’s total homicides in the form of crazy white males shooting the place up or blowing up entire buildings.
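
And the mass-shooting fractions, computed out (Mother Jones and FBI figures as cited, plus Oklahoma City’s 168):

```python
# Mass-shooting deaths as a fraction of homicides (figures as cited above).
homicides_2010 = 12996
print(9 / homicides_2010)        # ~0.00069 -> not quite 0.07% of 2010's victims

deaths_since_1993 = 588          # MJ shooting fatalities since 1993 + Oklahoma City's 168
print(deaths_since_1993 / homicides_2010)
# ~0.0452 -> 21 years of mass killings = ~4.52% of ONE year's homicides
```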

Let’s go back to our observations about reasonableness.  Sandy Koufax was one of the all-time greats.  Hall of Famer.  Is it reasonable to conclude from his success that Jews make great athletes?  Or how about Croatians?  There are some very good Croatian basketball players, including some who play successfully in the NBA, and they routinely field outstanding teams in international competition.  So we know they make some damned fine basketball players there.  But how reasonable is the conclusion that “Croatians make great athletes” relative to the conclusion “blacks make great athletes,” based solely upon the data point of how many of each are playing in the NBA?  Let’s see . . . the NBA is . . . gosh . . . I don’t know (Wikipedia.org to the rescue: according to them, in 2011 the NBA was 78% black and 17% white), really heavily black.  In fact, five times the white proportion works out to 85%, not much more than the actual black proportion of 78%.  Huh.

Gentle Reader will tax me with a false equivalence.  The make-up of the NBA and the perception of crime statistics are not the same thing.  Well, yes and no.  Where they both are similar is that both (i) are highly unrepresentative snap-shots of very large population groups, and yet (ii) are highly visible markers which are flung in our faces remorselessly, and further (iii) represent the extreme point on their respective behavioral spectrum.  To illustrate the first point:  There were 7,670,023 black males in 2010 in the 15-39 age range.  Out of most of that number (17-39) they produced all of 4,192 known homicide offenders, and even if you attribute every one of the 4,224 race-unknowns to black males ages 17-39, you get every bit of 8,416 offenders, or 0.10973% of the total in that age range.  Barely a tenth of a percent of all black males in that age range turned out to be killers that year.  Meaning nearly ninety-nine and nine-tenths percent didn’t.  Bearing in mind that even in the 24-hour news cycle there’s exactly X amount of information that can be put out, how reasonable is it to expect that news of a homicide is not going to get pretty good billing?  Although I’ve not crunched the numbers on other crimes of violence or property crimes, is it reasonable or unreasonable to expect that an armed robbery is going to be covered rather than an apartment that got broken into and a computer and some stereo equipment got stolen?  Finally, if killing is the ultimate crime, you must recognize that playing in the NBA is the ultimate in basketball athleticism.

So while it is entirely statistically defensible to state the conclusion “blacks are more likely to be killers than whites,” it’s not only statistically not supportable but morally reprehensible to conclude “blacks are likely to be killers” because neither group is very likely at all to be killers.  Neither.  But remember Mlle AlterNet.org isn’t about facts; she’s about perceptions.  If most Americans get their information, to the extent they get any, from television, and if television portrays only the most egregious events as “news,” and if any particular group X is in fact, undeniable, count-it-up-and-do-the-math fact, vastly disproportionately over-represented in any particular egregious behavior, precisely how is it that you expect such perceptions not to be awakened?  If all I’m shown is X with a smattering of Y thrown in, upon what basis do you conclude that I’m wicked for concluding, “X”?

Yet Mlle AlterNet.org wants me to be a bigot for thinking, “You know, maybe X.”  This passes for thinking nowadays, it seems.  I notice that she is identified as a “legal analyst.”  Good for her, because she’s a lousy statistical analyst.

Remind me how it Worked out Last Time

That a highly controversial, polarizing Middle Eastern head of state came to Germany and all the protesters turned out.  Prime Minister Erdogan is coming to speak in Cologne — Köln to the natives.  According to the FAZ, the protesters are already assembling from all over Europe.

It was Berlin, June, 1967, and the Shah of Iran was coming to town.  Granted, he was only going to the opera — Mozart’s Die Zauberflöte — but hey! he was an American ally and puppet.  Berlin, which has somewhat prided itself on civil disobedience ever since the latter days of the Kaiser’s reign, turned out in force.  Actually, when one says “Berlin,” one must bear in mind that back in those days the population of Berlin contained an enormous element of disaffected youth from all over the rest of Germany.  Because of its four-power occupied status (I’m going from memory of what I heard from my German friends 25+ years ago), if you were a male resident in Berlin you weren’t subject to the draft.  And apparently even student residence was sufficient to get you out.  Which means that Berlin university students skewed even more strongly left than university students typically do.

The demonstrations turned ugly, and fast.  I’ve never found a book-length treatment of that night, although I’m sure such exist.  Knowing what I do about how that place worked and to some extent still works, I’m quite confident there was a great deal of provocation among the demonstrators, in that they would have been liberally sprinkled with plants, mostly from the communist East, whose sole mission was to see to it that the demonstrators got well out of control.

On the other side you had the police.  Something to understand about Germany at this time is that large numbers of the senior leadership in all public agencies had . . . ummmm . . . not exactly pristine consciences, when it came to what they were doing for . . . oh, say . . . the years 1933 to 1945.  Oh sure, they’d got their “de-nazification” certification, but to an alarming extent those were simply fraudulent.  How that process worked, at least in the Foreign Office, is laid out pretty thoroughly in Das Amt und die Vergangenheit, the government-commissioned study of the office before, during, and after the Nazi era.  Let’s just say that there was a lively industry among former willing participants, fellow-travelers, and opportunists, in which each would vouch for the other’s anti-Nazi bona fides.  And a lot — a lot — of people whose fingerprints were all over files detailing close cooperation with the SS, the SD, and the Gestapo in occupied and allied countries, in identifying Jews and Jewish assets as well as in leaning on host-country officialdom to get in the boat and row on implementing the Endlösung, got their “Persilschein” (referring to a popular European laundry detergent, Persil, famed for its whitening powers).  I have no reason, no reason at all, to suppose that the police would have been any different, especially since the police had been even more tightly integrated into the apparatus of horror.  Let’s just say that it’s a safe working assumption that the police on the street that night were anything but disappointed that the commies wanted to mix it up and maybe crack some skulls.  For some of their senior officials it might well have awakened fond memories of the Kapp Putsch, or of the glory days when the Sturmabteilung went about breaking up communist rallies and smashing Jewish shop windows.

As Lincoln observed in his Second Inaugural, “And the war came.”

On the streets the night of June 2 was a student named Benno Ohnesorg (ironically, his last name translates to “without worry”).  He was married, his wife expecting their first child, and this was his very first political demonstration (or so we’re told; it doesn’t really matter).  Also on the streets that night was a plain-clothes police officer, Karl-Heinz Kurras.  In the courtyard of a building he shot Ohnesorg, who died before he could be treated at a hospital.  At the time Kurras was cleared (of course he was; all his fellow officers swore up and down on it, didn’t they?).

Except that Kurras wasn’t just any old beat cop.  He was also an agent of the Stasi, the principal East German surveillance and terror ministry.  He was also a long-time member of the SED, the official East German political party.  That didn’t come out until years later.  Also not coming out until years later was that the June 2, 1967, demonstrations weren’t Kurras’s first rodeo.  Turns out he’d been spying for the Soviets during the 1961 Checkpoint Charlie stand-off (English language link, this time).

The BBC calls it “the shot that changed Germany.”  And boy did it ever.  Among other young Germans radicalized by the events was a certain Gudrun Ensslin, who became one of the leaders of the Rote Armee Fraktion, the RAF, or as perhaps more widely known in the Anglosphere, the Baader-Meinhof Gang (somewhat inaccurately; Ulrike Meinhof had long been marginalized, by among others Ensslin, well before the German Autumn of 1977).  October, 1977 saw the suicides of the senior leadership in prison, but by then the organization had morphed into a second-generation, even more violent, operation.  And they kept it up for years afterward, with bombings, assassinations, kidnappings, and so forth, only formally dissolving in April, 1998.

By way of postscript:  By 2012 new investigations (Kurras is still alive) had cast serious doubt on the story told by Kurras and his colleagues (English-language link).  That story was that the officer was attacked by knife-wielding demonstrators and fired in self-defense.  Apparently that story can’t be squared with what is now known of the remaining physical, photographic, and documentary evidence.

Post-communist review of Stasi files does not reveal, it seems, that Kurras was acting on positive orders.  And after the shooting the Stasi broke off contact with him (well of course they would; their asset had to be considered a watched man, by the left if not by the authorities).  On the other hand, the Stasi recruited its agents very carefully, watched them like a hawk (counterintelligence), and generally spent a great deal of effort to ensure that they did things, and only those things, consistent with command from above.  And Kurras had joined the Stasi in 1955, so by June, 1967 he’d been on the payroll for some twelve years.  Even apart from his 1961 services to the Soviets, he was no rookie.

The promised demonstrations against Erdogan are supposed to be peaceful.  I suppose we’ll just have to wait and see.

Time / Out of Time

Among the harder tasks a father has is figuring out what in the world to buy his young children for their birthdays.  I mean, huh?  Mommy it is who tends to know what Small Child is hankering after; it’s Mommy whom Small Child nags and whines at about That One Special Thing.  Daddy, who’s doing well enough to remember birthdays in the first place, notices predilections only to the extent that they generate small pieces of things for him to step on as he walks across the living room floor at night without the light on.

I was thus tickled to enjoy an afflatus the other night while cooking supper for my boys.  The youngest is mustard-keen on military history in general, and the Civil War in particular.  Last summer, in lieu of flying out to visit his cousins (normally that trip is by a wide margin the high point of my boys’ entire year), he decided he wanted to go to the 150th anniversary of Gettysburg.  So we did:  Nine days, eight nights in a tent, 2,512.8 miles in a non-air-conditioned minivan, six states, five battlefields (in order: Antietam, Harper’s Ferry, Gettysburg, New Market, and Appomattox Court House), two museums, a national parkway (Blue Ridge), and a mountain (Mt. Mitchell).  And two store-bought meals the whole time.  He never once complained about being hot, tired, thirsty, hungry, or bored.  He’d turned seven less than three weeks before we left.

So there I am cooking, and I popped the CD soundtrack from Ken Burns’s The Civil War into the player.  My youngest loves that music as well, and has been known to put it on very quietly to fall asleep to on more than one occasion.  And then I had my afflatus:  While we have, somewhere, a 20-plus year-old copy of the series on VHS, it’s been about six years since we’ve had a player capable of playing them without eating the tapes.  Five minutes on Amazon.com’s mobile phone app and the commemorative DVD set is on its way to my front door, expected delivery Thursday.  Annual anxiety over picking birthday present: solved.

But that prompted some thoughts.  For starts, that Amazon.com mobile phone app makes impulse buying childishly simple.  I seldom use it but when I do it’s for something I already know I want, and every time I’m struck by how easy it is.  But secondly and more to the point, if I had to get in my van and go dragging all over hell and half of Georgia looking in bricks-and-mortar stores for those DVDs, new or used, I’d never get it done.  Between work, grocery shopping, after-hours client meetings, cooking, laundry, dishes, homework, and chasing the boys to bed at 8:20 p.m., by the time I’ve got time to think about looking for Stuff, all the boots-on-the-ground retailers have gone home to chase their own children to bed.  The wife’s not in much better shape: she takes the boys to school on her way to work, and she’s the one who drags them to such after-school things as they have going on.

I know we’re not alone.  Our children aren’t in travel sports leagues; they don’t have music lessons or recitals, or (God forbid) dance, or those other things which will suck all the oxygen out of parents’ existences.  But I know full many parents who have all that on their plates and more.  Make our hypothetical parent a single parent and now you’ve really got problems making it all come together.

I’d be fascinated to look at Amazon.com’s sales data.  I’d like to see when, by time of day, they sell their products.  I’d wager a small sum that the bulk of their weekday sales of specifically children’s items occurs after 7:30 p.m., measured by the customer’s location.  In fact, it wouldn’t surprise me either to find out that even non-identifiably children’s items are skewed towards the evening hours.  So much of the debate we hear about Amazon’s business model focuses on how it “deprives” state and local government of sales tax revenue, and how unfair that is to bricks-and-mortar stores.  But what if what’s driving Amazon’s success is not any perceived price differential but the time factor?  Where I live, if I wanted to buy in person something like that DVD set that I spent all of 3.5 minutes ordering last night, inclusive of trying to remember my account password, I’d get to drive somewhere between 45 minutes and a full hour just to get to the stores which might potentially have it in stock.  And then I’d get to hoof all over at least several of those stores, because I am perfectly comfortable that no bricks-and-mortar operator can afford to keep commemorative editions of 20-year-old documentaries on the shelf on the off-chance that someone’s going to toddle by and take them off his hands.  And at the end of the expedition, pissed off from the traffic and looking for a place to park, with four or more hours blown away, a half-tank of gas into the bargain (and at $60-plus to top off an 18-gallon tank, that’s a cost I have to add to the product), and with a further hour-plus drive home staring at me (remember I’m going to start with the stores closest to where I live), I’m most likely still to have to order the damned things after all.

Given what I perceive to be a trend (dare I use the expression “remorseless”?) towards ever-increasing demands on parents’ time, what does my hypothetical shopping trip above have to say about Amazon’s business model’s long-term viability relative to its competition, or at least that competition which does not deal in bulk, gotta-have-it-tonight supplies?  I know Amazon now sells groceries and whatnot, but unless you’re a doomsday prepper or Super Organized Beyond all Reason, are you really going to buy your laundry detergent, pasta, toilet paper, and canned soup from Amazon.com?  On the other hand, if I’m running a store that deals in things that aren’t immediate-need items, that are non-run-of-the-mill items (other than hand-fabricated things like fashion accessories and whatnot), I think I have to see every travel soccer league as a threat to my livelihood.  Because every one of the out-of-town tournaments is just that much less time my customer has to do business with me.  Every two-hour Thursday evening practice is three or more hours less that my customer has to swing by my store.  An hour’s tutoring three afternoons a week, and that’s so many shopping expeditions scuppered.

Am I the Only one Seeing a Pattern Here?

Via a link at Althouse, I stumbled this morning across a 2006 interview transcript from an NPR broadcast.

The interview subject was the fellow who was publishing a biography of Upton Sinclair.  Most Americans (at one time) knew him as the author of The Jungle, his 1906 exposé novel of the Chicago meat-packing industry.  Whatever its purely literary merits (and they seem to have been patchy enough), it was enormously effective in getting America stirred up about what was on its plate.  Literally.  Sinclair was disappointed because, as a socialist (he was in fact hired to write the book as a socialist tome, not a public-health pot-boiler), the parts of the book he was least interested in got the most public attention.  We’ve all heard how the book was instrumental in prompting introduction of a federally-mandated inspection regime, which generations of high school teachers have solemnly informed us was fought tooth-and-nail by “the industry.”  Except it wasn’t, at least not by the large operators.  Inspection regimes are fixed costs.  Large operators can spread those fixed costs over larger production, so the price-per-final-product is less.  Small operators have to recapture that cost over a smaller number of products, with a correspondingly larger price increase.  The desired result, from the big boys’ perspective, is that their competition gets priced out of the market and new market entrants face a large barrier to successful entry.  And so it proved to be.  Whether meat inspection is a good thing or bad can be debated.  But what is interesting in retrospect is the extent to which Sinclair may have gilded the lily on the hygienic conditions in the industry.
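
The fixed-cost point is easy to see with toy numbers (purely hypothetical, for illustration only):

```python
# Purely hypothetical numbers: a fixed compliance cost falls hardest on small producers.
fixed_inspection_cost = 100_000       # assumed annual cost per plant, for illustration

big_packer_units = 10_000_000         # a large operator's annual production (assumed)
small_packer_units = 100_000          # a small operator's annual production (assumed)

print(fixed_inspection_cost / big_packer_units)    # $0.01 added per unit
print(fixed_inspection_cost / small_packer_units)  # $1.00 added per unit -- 100x the burden
```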

So Sinclair, the socialist, had a track record of service to Larger Truths.  In 1927 two Italian immigrants, Sacco and Vanzetti, were executed for murder.  Ever since we’ve been told by all our well-meaning teachers how they were just two innocents, framed up because they were (i) immigrants; (ii) Italian; and, (iii) avowed anarchists.  So obviously the fix was in, wasn’t it?  That’s the premise that Sinclair took with him when he went to write his novel, Boston, about the case.  Sinclair’s later biographer thinks he was pretty fair in presenting the case and the evidence.  Notice how Mr. Biographer words his statement:  “I think he was fair in his representation of the evidence and the case.”  The evidence and the case are not the whole story.  Do remember, please, that trials, especially criminal trials, are highly artificial proceedings.  That’s intentionally so; giving effective meaning to the presumption of innocence requires it.  Anyone who expects “the truth” necessarily to come out in a trial is a gull who deserves to blow $4,800 on penis-enlargement surgery which goes wrong.

And as it turns out, Sacco and Vanzetti were guilty as hell, and their lawyer presented a fraudulent defense.  Let’s hear it from Sinclair himself, as related in a letter from 1929:  “Alone in a hotel room with Fred [the defense attorney], I begged him to tell me the full truth. He then told me that the men were guilty and he told me in every detail how he had framed a set of alibis for them.”  Did he disclose that?  Well no, no he didn’t.  At least he had the common decency to admit, privately, where his duty lay:  “I face the most difficult ethical problem of my life.”  And how did he resolve that “most difficult ethical problem”?  Well, being the good lefty, he went out and served that good ol’ Larger Truth.

From his biographer:  “I think he felt that the climate of opinion and the representation of their foreignness, they were Italian, and their political beliefs, which were anarchism, had almost condemned them out of hand before they had a chance at a fair trial. . . .  Even if the men were guilty, he felt that the larger context of the world in which they were living rendered their guilt perhaps less important than it might have been otherwise.”  Ummm.  Fair trial?  No, they did not have a fair trial.  They got to put on a fraudulent defense.  Their lawyer lied to the jury.  A “fair trial” does not mean “the defense wins.”  And somehow their guilt was “less important” because they were anarchists?  No, it was even more important precisely because they were anarchists.  Recall that in the 1920s there was tremendous debate going on about the fundamental nature of all these (later revealed as monstrous) political movements which had welled to the surface of post-Great War Western society and were tearing European societies to pieces.  The lefties here assured us that all of us troglodyte Americans were just too hard on those folks.  They only wanted Justice for the Common Man; they were for Peace (sort of like how our left-extremists nowadays keep proclaiming the Religion of Peace, and refusing to call outfits like Boko Haram what they are: bloodthirsty terrorists, even when pressed to do so by their own colleagues in government).  In the Sacco and Vanzetti case America got to see what these thugs were really all about.  So it was critically important that the correct verdict be reached, precisely because it ripped the mask off.  And as it turns out, notwithstanding they were lied to, the jury got it right.

Then come the Rosenbergs, Julius and Ethel.  Rivers of tears were shed for those poor innocents, done to death by a bunch of red-baiters.  Except that Julius definitely was, and Ethel may well have been, guilty as sin.

Fast forward to the Chambers and Hiss ruckus.  For decades the left extremists swore up and down that Alger Hiss was simon-pure and no more than the victim of a witch hunt.  Except he wasn’t.  He was guilty as sin.

And then we come to Tailgunner Joe, a distasteful person by any measure, and a drunk, and a mountebank.  When Eisenhower, whom General Marshall had made (it was Marshall who vaulted Ike from Lt. Colonel to Brigadier General in a matter of months; it was Marshall who tapped him to command TORCH; it was Marshall who handed him OVERLORD, even though he dearly wanted it for himself (Marshall had never commanded troops in battle, and he knew this would be his last chance) and even knowing that the commander of the invasion could easily have the presidency, if he wanted it), stood on a podium and listened, in silence, as McCarthy slandered Marshall as a traitor, Truman so lost respect for Eisenhower that he would never thereafter speak his name in public.  In Plain Speaking he refers to “that fellow who followed me.”

McCarthy famously brandished his list of however many hundred people who were communist infiltrators.  No one ever saw any such list, of course, and it’s undeniable that the 1950s Red Scare tarnished many people, ruined their careers even.  On the other hand, since publication of the VENONA files (Wikipedia has a list of American names appearing in the decrypts; some of them are breath-taking, and that’s not even a complete list: more are known), it’s likewise undeniable that McCarthy was dead-on right about the degree to which senior government positions had been infiltrated by the Soviet Union.  Alger Hiss was just one of many.  Hollywood still moans about its black-listed performers, producers, and others.  On the other hand the Soviet Union in fact did make a concerted effort to subvert American popular culture.  Their most effective and lasting capture, still loyally defending his blood-soaked master decades after the facts were known, was Pete Seeger (on whom more here, from his former extremely close friend, Ron Radosh).

I could go on.  I could trot out the new left-extremist notion of “fake but true” (which fits under the rubric of “dialectics”).  I could observe that the closest that Hollywood’s got to the Katyn massacres is a tangential reference in “Enigma.”  But why go on?

The common thread in all of those is that to the left, facts just do not matter.  What must be served is the Higher Truth, or what today’s left-extremists call the “narrative.”  It’s what was at the heart of Journolist, the news-manipulation cabal run from The Washington Post and whose mission was to elect Dear Leader.  I cannot and so do not deny that there are those on the Right for whom inconvenient facts get deep-sixed.  I am unaware, though, of any Rightist philosophy in which the air-brushing of history is not merely engaged in on an organized basis but is formally an approved method.  Where is the right-wing Saul Alinsky, after all?

I suppose I really ought to give up on one of my favorite expressions:  You can’t make this stuff up.  You most certainly can, and we’ve got an entire chunk of the American political spectrum that regularly does.  Because that’s what its doctrine tells it to do.  Gentle Reader might study on that.

Food (and Indigestion) for Thought

Yesterday evening I attended a presentation by an analyst from the George C. Marshall Foundation.  They’re the outfit that was (of course) named after General of the Army George C. Marshall — to date the only professional military officer to receive, deservedly, the Nobel Peace Prize — and whose purpose, in addition to preserving the documentary legacy of the man, his times, and his activities, is to perpetuate Marshall’s legacy of magnanimity, cooperation, and commitment to the practicalities of creating those domestic and international structures and systems which form the framework upon which peace can be built.

If this sounds a bit unusual for an outfit that is not only named for a life-long soldier, but to this day is headquartered at a military college (the Virginia Military Institute), you really ought to read a bit more about Marshall.  For an officer who was scrupulously non-political (at least in his dealings with his civilian masters in FDR’s White House and in Congress), he was acutely sensitive to the fundamental political nature of the American military.  Again, that’s not a contradiction.  FDR famously addressed everyone by his first name.  These days that’s become fashionable because it’s considered egalitarian; perhaps it is, when everyone calls everyone by his first name.  But of course no one called FDR “Franklin”; his assumption of the prerogative was therefore diminishing to the addressee.  It’s a gentler form of the same method vulgarly practiced by LBJ in appearing naked in front of men he wished to intimidate.  In any event, FDR tried that business on with Marshall, who replied, “It’s ‘General Marshall,’ Mr. President.”  Congress recognized in him someone so straightforward that he could appear before a committee, explain what he needed, and be accepted at his word.  Mostly.  Once, a particular senator from Missouri who headed an eponymous committee to investigate fraud, waste, and abuse in the war effort got to poking around in areas that weren’t exactly public.  Marshall got wind of it and put the word out that Senator Truman was simply not to be told certain things.  But it was Marshall who realized, and was greatly concerned about, the disruptions to American civil society that threatened from a long war.  He understood that an American wartime military must be a political expression of its society.  This directionality of the relationship was in contrast to, for example, the Soviet Union or Germany, in which civil society (to the extent they even had any left) was an adjunct to, and formed by, the military.  It was Marshall who looked Winston Churchill in the face and told him, with respect to some cock-eyed proposal to invade Rhodes, “Not one American soldier is going to die on that goddam beach.”  And finally, it was Marshall who put his credibility behind the effort to re-build the societies destroyed by the war, in a way that hadn’t been tried after the first go-round.

Truman it was who described Marshall as “the great one” of his era.  When you look at his breadth of comprehension and his iron-clad character it’s hard to disagree much with that statement.

In any event, the topic of yesterday’s presentation was the Ukrainian situation and its implications for Europe and Europe’s relationship with the U.S.  The presenter is a German lawyer with a Ph.D. from Harvard, and extensive experience as a reporter/analyst not only in Europe but also in central Africa.  She was in Rwanda in 1994, within weeks after the genocide.  And so forth.  Very impressive C.V., all in all.  She’s now based in the foundation’s Berlin office.

Her take on the situation is that the Ukraine represents the gravest crisis for the West since the break-up of the Soviet Union in the early 1990s.  Putin is trying to re-establish, not the Soviet Union, but rather the Soviet sphere of influence.  That effort is bound to de-stabilize not only the countries targeted (especially Belarus, Moldova, and the Ukraine) but also Russia itself.  This is principally because, as she phrased it, other than a pile of cash, Russia’s not got any of the things needed to make the program work over time.  Once the cash is gone, and it will go (she didn’t mention the fracking revolution, but that technology may be the deadliest threat to Putin, even more so than any nuclear deterrent), they’ve got nothing.  Their demographics are headed for societal implosion.  Their education system is awful.  Their economy is awful.  Their healthcare system is awful.  Their transportation system is awful.  Over everything lies the suffocating blanket of corruption.  And so on and so forth.  For the long haul — she predicted “a generation” of turmoil in Eastern Europe — she was pretty sanguine.  She didn’t seem to think military action likely.  I wish I could join her in her optimism.  When someone is playing against long odds, as Putin is, the only way he wins the game is on a long shot.  With each gamble that doesn’t pan out, his objective motivation to double down increases, because the aggregate odds against him increase with each lost bet.  There’s a reason, after all, why the losing phases of Germany’s and Japan’s wars got so vicious.

Another of the threads of her presentation, and of her responses to some specific questions afterwards, was the current state of the German-American relationship.  Once more, she had a fairly positive take on the connections at the policy-maker level, although she was pretty up-front that the NSA spying revelations had badly shaken people in Berlin.  She also shared something that I hadn’t thought of.  She allowed that a very great deal of “public” comment in newspapers and other mass media, including specifically the internet, is and is known to be bought-and-paid-for trolling.  Propaganda, in other words.  Beyond citing her connections inside German media she didn’t describe how this is known.  It certainly is possible; George Soros and his fellow left-extremists maintain several operations here in the U.S. that monitor various public-forum communications and regularly flood the waves, so to speak, with astroturf outrage.  The Occupy “movement” was little more than astroturf in the streets.  So it can be done.

One thing she also mentioned, and which got me to thinking (difficult, I know), was her observation that for many years America has been a foil for the streak of Romantic idealism that is so strong in German culture and politics.  Years ago while studying in Germany I took a lecture course in American colonial history.  The professor’s particular specialty was colonial New England history.  It was fascinating to see an outsider’s take on one’s own world.  One of the points he made, several times during the course, was the extent to which Puritan idealistic sensibilities still inform American society and especially its politics.  So when our presenter yesterday evening mentioned the repulsive aspects of the German view of America (as opposed to its simultaneous attractive aspects) as being rooted specifically in German idealism, the thought struck me that what you’ve got is competing idealistic sensibilities, and I wondered to what extent their incongruity traces back to the distinctions in the religious traditions that gave rise to them (Pietism on the one hand and Puritanism on the other).  I wonder if anyone’s ever looked at it from that angle, and if so what their conclusions were.  Sort of like neighboring families who’ve been picking at each other so long no one even remembers what it all started about, it would be amusing to tease out whether we’re grousing over two religious traditions that go back over 300 years.

I just wish I could feel as confident in the long-term future as she seems to.  My boys are 12, 10, and 8.  That “generation of turmoil” our presenter sees on the horizon will consume their childhoods and young adulthoods.  And it may consume them, depending on how badly the parties miscalculate.

What a Difference a Word can Make

The headline of this article over at Inside Higher Ed, “The Last Acceptable Prejudice?” has drawn ire, agreement, and counter-example in the comments.

The sequence of events that prompted the article seems, honestly, to be more than a bit of a tempest in a teapot.  Someone saw a student traipsing about without shoes, and (not to the student’s face) described the appearance as “hillbilly” for that reason.  Cue the sensitivity brigades.  For starts, other than the location of the school where it appears to have happened (University of North Georgia), I’m not sure exactly why “hillbilly” was the first description to pop into the mind of this particular person.  I mean, genuine hillbillies are almost by definition extraordinarily rare around college campuses.  In contrast, you can’t swing a cat even on the most Podunk campus without hitting what a cousin of mine (who’s lived in San Francisco for decades now) terms “stinky-foot hippy chicks” and their male equivalents.  If the sight had even registered with me, oblivious as I tend to be, my reaction would most likely have been, “Oh, another granola.  Look out you don’t slip on an organic banana peel.”  And my reaction would have been that because that’s by a wide margin the most statistically likely correct explanation for why someone who’s got enough resources to attend college and thus shoe him/herself properly would nonetheless appear unshod in public.

Perhaps because this is the U. of N. Ga. they’re more than usually sensitive to accusations of “y’all are just a bunch of rednecks up there,” sort of like the black sergeant in “A Soldier’s Story” tearing a strip off the musically-gifted, slow-witted buck private for playing “that guitar-pickin’, sittin’-around-the-shack music” (highly recommend the movie, by the way).  Whatever.

I’ve lived a good chunk of my life outside my native South.  While doing so I never attempted to hide my antecedents.  To my cost.  So I know for a fact that anti-Southern bigotry is both very real and something that people elsewhere feel perfectly comfortable not only expressing to one’s face and in public, but openly acting on in their personal decision trees.

The commenters to this article do have valid points, though.  There are a lot of other groups that come in for their share of chaffing.  Who’s not seen on a sit-com at some point a gag about Jewish mothers and chicken noodle soup?  Or Roman Catholic priests and prelates (although preachers in general are fair game, as are politicians and lawyers)?  Or fundamentalist Christians of pretty much any stripe (pay attention, though: you see them portrayed as Southerners, as a general rule, and not just fundamentalist Christians, almost as if the Christianity thing were simply an attribute of the Southern stereotype)?  Even homosexuals are not infrequently portrayed in pop culture in terms of what can only be described as stereotype behavior or appearance, and only a lunatic is going to argue that homosexuality is still looked down upon in those circles these days.

All that having been said: a “stereotype” is not necessarily a specifically hostile prejudice; it’s just a mental cartoon we form for ourselves, and whether we make it something hateful, or humorous, or admiring (East Asian brilliance at math and the sciences, anyone?) is largely up to us individually.  So while it may be “acceptable” to play to those other stereotypes, or even to poke fun through the medium of them, my own personal impression is that trashing Southerners and the South is not just acceptable, but fashionable, in a way that poking fun at the Jewish mother for whom chicken noodle soup is the universal specific simply is not (or at least not in pop culture, itself an imperfect mirror of our society).  It’s sort of like a ritual of introduction, by the observance of which one asserts his initiation into The Enlightened.  And you seldom see it done in any other than an explicitly vicious spirit.

Thought experiment:  It’s simply not imaginable, nowadays, to think of someone “who knows better” asking a black acquaintance where he prefers to buy his fried chicken.  In contrast I’d wager a small sum that most Southerners who’ve lived outside the South and not bothered to hide their background have had that sort of “someone who knows better” ask them something about indoor plumbing, or shoes, or in-breeding.  And do it in a manner which proclaims that, “No, I’m not saying this as a joke; I’m saying this to make sure you understand these are my assumptions about you and where you grew up.”  I use, by the way, the expression “someone who knows better” because I don’t think you can draw proper inferences from how tacky people behave.  That’s what they are: tacky; that’s how they act.  So I use that expression to refer to someone who at least claims some degree of refinement, of broad outlook and accepting disposition.

Just my two cents.  Do I know for a fact that these attitudes have cost me personally, in the form of refused employment, among other things?  I sure do.  Do I harbor a grudge about it?  Not really.  They weren’t obliged to like me as I was and am.  I could have tried to suppress who and what I am, and I chose not to make the effort.  I have to accept those costs.  Being an introvert helps, of course.  But I’d be less than honest to claim that it doesn’t rankle even a tiny bit.

Well, if You State all Their Assertions in the Same Sentence

. . . You arrive at Iowahawk’s formulation:  Colleges are hotbeds of rape and racism that everyone should attend.

Of the two sets of accusations, the one that doesn’t really concern me is the “racism!” screech.  If Dear Leader and his fawning acolytes in the lamestream media have accomplished one single useful thing in the past six years, it’s having so cheapened the “racism!” ejaculation that pretty much everyone now recognizes it as meaningless.  When everything is racist, then nothing is.  If you want to see genuine “racism” in action, you can watch what’s going on in the Ukraine between ethnic Russians and ethnic Ukrainians.

The development that concerns me more is the system of kangaroo courts that are even now being set up on campuses across the country, all under ukase from the Holder DOJ.  For those who haven’t been following it, the federal government is now mandating, more or less openly, that colleges address accusations of rape on campus not through careful preservation of crime scenes and other physical evidence until the police (you know, those folks who not infrequently have entire teams of people with specialized training in investigating sexual crimes) get there, but rather through a system of “discipline” that seems designed to do little more than make college administrators (and federal bureaucrats) feel good about themselves.

In truth, these panels and how they operate are easily recognized by anyone who has read his Solzhenitsyn.  They’re neither more nor less than the Cheka’s revolutionary tribunals or the OSO administrative sentencing system (most people sent to GuLAG were sentenced by OSO, and not by others of the organs).  From the linked article over at the Foundation for Individual Rights in Education (F.I.R.E.; if you’re looking for a worthy object for your charitable giving, you could do a very great deal worse than these folks):

“Foremost among the demands since 2011 is that colleges use the ‘preponderance of the evidence’ standard of proof for adjudicating sexual misconduct accusations — a 50.01 percent likelihood standard that is our nation’s lowest. (In real courts, rape must be proved ‘beyond a reasonable doubt,’ a 98-99 percent likelihood standard.)

This low standard is then used in a disciplinary procedure where students nearly always lack lawyers, no legally trained judge oversees the process, testimony is not under oath, hearsay is freely considered, relevant evidence or even proper notice of the charges may not be given to both parties, students may be forced to incriminate themselves, and whatever ‘jury’ is empaneled may not be of one’s peers.

The task force report from Tuesday actually encourages colleges to make this situation worse. Perhaps recognizing that college hearings are delivering shoddy justice, the task force speaks highly of moving to a ‘single investigator’ model that would entirely dispense with niceties like ‘hearings’ or ‘the ability to face one’s accuser’ by appointing one administrator to act as detective, judge, and jury for campus crimes.”

And that’s just the lousy deal for the guy wrongly accused.  Not mentioned but nearly as objectionable is that the college’s ham-fisted treatment of the case may well irretrievably compromise what otherwise might be a successful criminal prosecution of a genuine rapist.  Remember that state universities are state agencies, their actions can be attributed to the state, and to the extent their functionaries are delegated police powers, you raise all manner of constitutional concerns about how they conduct themselves.  Those constitutional violations — and they will occur, and be legion (hell, colleges nowadays can’t even get the First Amendment right, what with stunts like disciplining students for passing out . . . copies of the Constitution) — are going to create legally cognizable problems for the actual law enforcement agencies when they actually do catch someone who actually has committed a rape that they otherwise could actually prove up beyond a reasonable doubt.  In short, they’ll manage to kick the rapist out of school, but he’ll still be on the street, looking for his next victim.

None of that matters, though, does it?  Because our administrators can pat themselves on the back and loudly proclaim how tough they are on sexual misbehavior.  And that’s what matters, that educrats feel good about themselves.  That next victim, when he finds her?  She’s just collateral damage, and besides, she may not even be a student.

As I think I’ve mentioned before, I have three boys.  The oldest is seven years from college (assuming he goes).  Given the half-life of stupid ideas, it’s more or less a certainty that these lynch-mob Chekist systems are still going to be going strong when my boys go to college.  I’d like them to be able to enjoy the experience without having to adopt the survival habits of the zeks.  But this system may as well have been purposely designed for abuse, if not outright extortion.  Remember we’re dealing with the Laws of Very Large Numbers.  How many tens of millions of college students are there at any given time?  Now that a majority of them are female, how many millions of female college students does that work out to be?  By that time it will have been impressed on the female student body over the course of years that if you want to get rid of a male you don’t particularly care for (whether for personal or political reasons, or just because you can, because you’re looking for a scalp) all you have to do is engineer a bogus accusation of sexual assault and you will not only have blown up his college attendance, but you will have ruined his life (job interviewer: Why did you change colleges?  job candidate: Errmmmm, ahem, I, uh, just decided to.  interviewer: I see.).
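To put rough figures on that Very Large Numbers point, here’s a toy expected-value calculation.  Every number in it is an assumption of mine for the sake of illustration; none comes from the linked article or from any study:

```python
# Toy expected-value calculation; every input below is assumed, not sourced.
college_students = 20_000_000   # assume ~20 million enrolled at any given time
female_share = 0.55             # assume a 55% female student body
bogus_rate = 0.0001             # assume 1 in 10,000 files a bogus charge per year

potential_accusers = college_students * female_share
expected_bogus_cases = potential_accusers * bogus_rate

print(f"{potential_accusers:,.0f} potential accusers")        # 11,000,000
print(f"{expected_bogus_cases:,.0f} bogus accusations/year")  # 1,100
```

Even at an assumed one-in-ten-thousand rate, a tribunal system with no evidentiary safeguards would be processing a thousand-plus manufactured cases every year; move that assumed rate a single decimal place and it’s ten thousand.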

Any system that is set up to be easily abused will be abused.  It doesn’t matter if you’re talking about tax loopholes, government benefits, military supply contracts, absentee voting, political police-state enforcement, or sexual conduct enforcement on college campuses.  When you spread that sort of opportunity before a sufficiently large population, abuse will occur and it will tend to become systemic as the abusers are seen to profit from it (Hayek’s chapter on “Why the Worst Get on Top” in The Road to Serfdom is a good illustration of the phenomenon in a different context).

I feel as though it’s 1937, and I’m watching my boys get ready to fill out their applications to join the Komsomol (the leadership of which was shot, several times over, during the purges, and huge numbers of whose members fetched up in the camps).

How to Ruin an Otherwise Valid Point

Let’s say you have an argument to make about Issue X.  And let’s say there’s a great deal of merit to what you have to say.  Perhaps there is some part of your argument that reasonable people could disagree on in good faith, but you’re firmly convinced that your position on that much is valid, and as to the rest of it, you can prove it up to anyone’s reasonable standard of validity.

And then you pull some nickel-assed stunt like citing some alleged “study” in support of your argument, a “study” that you don’t provide a link to, and that is purportedly done by an outfit for which a Google search turns up nothing at all directly related to that organization.

This is what some website doing business under the name “National Report” has done.  Under the breathless headline “New Study Reveals 89% of Nation’s Food Stamps Squandered On Junk Food,” they report that some group identified as “Malbeck Data Institute” has released a study involving alleged interviews with “over 100,000 men and women who are currently accepting SNAP assistance” (as reported at Conservative Frontline).  Neither site provides a link to any report of a study.  Google searches on “Malbeck Data Institute,” “Malbeck Data,” and “Malbeck Institute” produce nothing that directs you to any website or other location where any such study results are available for public inspection, or even to a website under any variant of those names.  We are therefore to understand that an institute which has the wherewithal to interview “over 100,000” individual respondents does not have an online presence, does not publish its research online itself, and does not do so with some reputable online resource like the Social Science Research Network.  A search there for a recent publication on the subject of “food stamps” also produces nothing along the lines of this alleged study, although there are articles addressing the subjects of food stamp fraud, recipients’ purchasing decisions, and so forth.  [Note (10 May 14):  I started this post yesterday after seeing a link to that article on Instapundit.  Apparently I wasn’t the only person who went checking around for this alleged study’s bona fides.]

Not content with the carnival-huckster headline, the author over at National Report favors us with lines like this:  “However, judging by what these individuals are choosing to purchase, it is evident that the majority of those who receive benefits are criminally milking the system for all it’s worth.”  This piece is not presented as an opinion piece, by the way.  Most of it is in fact presented as a write-up of what they allege to be their own, informal cross-check done through the simple method of watching what people at a Wal-Mart were buying with food stamps one day.

“Criminally milking the system,” though?  I didn’t see anything in that article, anywhere, to suggest that anyone purchased or attempted to purchase a single item not legally permitted to be bought with food stamps.  News flash:  If the program rules permit it, it isn’t “criminal.”  If it’s not illegal, it’s not even necessarily abusive.  I’m not aware that buying junk food somehow increases the amount of money you get to put on your SNAP card, so if the emaciated drug addict they mention was loading up on candy bars, then that’s just that much less money he had to buy something that would (as my mother used to say) “stick to his ribs.”  Another smiley-faced Wal-Mart customer mentioned, the 29-year-old mother of six (!), who disclaimed knowledge of who the fathers of four of them were (!!), is castigated for buying “microwavable entrees.”  Well, so what?  There are a great number of perfectly wholesome family-sized microwave dishes out there.  I’m not willing to conclude without more that this woman’s dietary choices were as flawed as her bedroom habits.

And this is where I get my butt chapped.  You see, there is tremendous waste in the SNAP program.  It shouldn’t be the case that you can buy candy and soft drinks and junk food with SNAP.  A few months ago I ran across a link to an article on Appalachia (I thought I’d linked to it in a blog post, but apparently I didn’t), and specifically on its gray market.  This article named names and places, by the way.  One of the phenomena described was how, on the days that everyone’s SNAP card gets credited, you can see people pushing shopping carts through the grocery stores loaded entirely with soft drinks.  As in hundreds of cans of soft drinks.  And nothing else.  What’s going on is that they’ll buy a case of soft drinks for $X, then turn around and sell it back to the store (or another store, assuming the case hasn’t been opened) for $0.50 on the dollar (the article in fact describes how some people will stock-pile soft drinks at home and use them as quasi-currency among themselves).  The store then repeats the process, and the SNAP recipient now has a pocket full of cash to go and spend on whatever else.  In Appalachia that all too frequently works out to mean meth and Oxycontin.  Notice, however, how the store is a critical player in this fraud.  There are many fewer stores to audit than SNAP recipients.  What does that suggest about where the vulnerable link in this scheme is?
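To see why the store, and not the recipient, is the weak link, walk through one cycle of the scheme with made-up dollar figures (mine, not the article’s):

```python
# One cycle of the soda buy-back scheme; all dollar figures are hypothetical.
snap_purchase = 100.00    # recipient buys $100 of soda on the SNAP card;
                          # the store is reimbursed the full $100 by the program
buyback_rate = 0.50       # store then buys the soda back at fifty cents
                          # on the dollar

cash_to_recipient = snap_purchase * buyback_rate   # $50 of walking-around money
store_take = snap_purchase - cash_to_recipient     # store keeps $50 of the
                                                   # reimbursement AND has the
                                                   # soda back on the shelf,
                                                   # ready for the next cycle

print(f"Recipient: ${cash_to_recipient:.2f} in cash")
print(f"Store:     ${store_take:.2f} per cycle, inventory restored")
```

The recipient converts his benefits to cash once per card-load; the store clears its cut on every cycle, from every participating cardholder.  That is exactly why auditing the handful of stores beats chasing the thousands of recipients.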

Given the miracles of modern bar coding of absolutely everything under the sun that is sold at retail, it would be childishly simple to control very tightly for nutrition and quality everything that SNAP recipients buy.  Want your product to be eligible for purchase with SNAP?  Fine, you must put a bar code on each container sold separately at retail, and you must apply to HHS for that container of that product to be white-listed.  HHS then updates its master white-list monthly or so, and thus if our mother-of-six trots up and plops down the jumbo-sized pork rinds, the cash register spits it back out.  But that would of course make the cashiers’ jobs harder when mother-of-six looks at him and lies, “I didn’t know you couldn’t buy these things with food stamps.”  At which point he grabs the tub of butter she also bought and directs her attention to the tiny SNAP logo printed right beside its bar code.  “You see this logo, ma’am?  Show me that logo on that bag of pork rinds.  Everybody’s stuff you can pay for with your card has that logo on it.  If it ain’t got the logo you can’t buy it with food stamps.”
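A register-side check along those lines would be nearly trivial to build.  Here is a minimal sketch; the UPCs, the product names, and the notion of a flat monthly HHS file are all my own assumptions for illustration, not anything HHS actually publishes:

```python
# Minimal sketch of a SNAP white-list check at the register.
# UPCs and products are invented; assume the dict is refreshed monthly
# from a hypothetical HHS master white-list.
SNAP_WHITELIST = {
    "012345678905": "Butter, 1 lb tub",
    "036000291452": "Whole wheat bread, 24 oz loaf",
    "041196910184": "Frozen vegetable entree, family size",
}

def ring_up(upc: str, description: str, snap_tender: bool) -> bool:
    """Accept the item, or spit it back out if SNAP can't pay for it."""
    if snap_tender and upc not in SNAP_WHITELIST:
        print(f"DECLINED for SNAP: {description} (no logo, no sale)")
        return False
    print(f"Accepted: {description}")
    return True

ring_up("099999999990", "Jumbo pork rinds", snap_tender=True)   # spat back out
ring_up("012345678905", "Butter, 1 lb tub", snap_tender=True)   # rings up fine
```

All of the heavy lifting lives in maintaining the white-list, which is precisely the registration process just described; the point-of-sale side is a dictionary lookup.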

But, Gentle Reader objects, that would put unconscionable burdens on the manufacturers.  No it wouldn’t.  They’re asking the American taxpayer to buy, at his expense, their products for someone else to eat or drink.  Color me Scrooge, but I’m just not seeing that as an imposition beyond the pale.

Gentle Reader further objects to some government agency “dictating what poor people buy.”  That’s not what is proposed.  For starts, thousands of products of all sorts would be registered by their producers, and if you tell me I may select from among 17,500 potential items, but may not buy 4,750 others, you’re just going to have to pardon me for declining to think of that as dictating what I must buy.

Secondly, the application-for-approval process can be used in a secondary role to increase the quality of what poor people are eating.  For instance:  Go to your favorite deli (or refrigerator case) and look at the bologna, or cooked ham, or turkey, or whatever.  Somewhere on there it will state how much of that product, by weight, is . . . water.  In a lot of instances you’ll find that you’re paying $11.99 a pound for something that’s upwards of 30% water.  In Germany, by contrast, until recently (you can thank the EU-slugs for changing it) you could, by law, put only two classes of ingredients into processed meat products:  meat and spices.  Period.  Or how about breakfast cereals?  Want the taxpayer to subsidize your customer’s purchase of your product?  Fine; just don’t put more than X% sugar or high-fructose corn syrup in it.  And so forth.  It’ll still be plenty sweet, but the sugar and corn industries won’t be getting a massive double subsidy out of the bargain (their production is already highly subsidized), and maybe the poor won’t be snookered into developing diabetes by age 45.

Will that increase the cost to the SNAP recipient of what he’s buying?  Yes; good food tends, overall, to be more expensive than cheap food (largely, no doubt, because cheap food products typically rely on heavily-subsidized ingredients like sugar and corn syrup; look at the top five ingredients in the junk foods sold in your local store, and then compare them with the comparable ingredients listing on the better-quality products).  On the other hand, junk foods, by their metabolic effects, tend to make your body crave them all the more the more you eat.  Better-quality foods do that less.  So while our hypothetical SNAP recipient is “paying” (read: we’re paying for him) more for food, he’s getting a more lasting appetite satisfaction from it.  So in the long run he’ll need to eat less of it, and will go longer without feeling hungry.

What would be the net effects of all this on the food-intake needs and desires of SNAP recipients, both in their own terms and relative to the benefits they’re eligible to receive?  Can’t say, beyond the fact that they would be eating better overall.  And if the net effect is still an unacceptable overall price increase, because by hypothesis these things are going to be paid for electronically and will be linked to a computer database, HHS can negotiate price breaks with producers and/or retailers.  Remember it’s their customer who is being subsidized, and therefore their bottom line that’s being subsidized.  It’s no different from the exclusion of interest on municipal bonds from the bondholder’s gross income under § 103 of the Internal Revenue Code.  That is point-blank a subsidy for state and local government borrowers (they can borrow at significantly lower rates because their lenders won’t have to pay taxes on the interest), and Congress sure as hell is entitled to place such restrictions on the use of those borrowed funds as are necessary to ensure that the subsidy is not being abused.

Here I’ll also confess to something of sympathy with mother-of-six (if she exists).  I do about 70% of the cooking in our household, meaning I cook for myself and my three boys.  The wife won’t eat what I cook, by and large, so I gave up on that years ago.  If there are leftovers I’ll offer them but there’s a limit to how many different things I can cook for one meal.  When I cook for my boys, they get a meat-and-two minimum, and more typically a meat-and-three (daddy usually eats much more simply).  And then I do the dishes.  I also do a good bit of the laundry, and the overwhelming majority of the grocery shopping (when I go I bring back meat, vegetables, fruits, and primary ingredients; when the wife goes she brings back candied breakfast cereals and junk food, mostly).  And I work six days a week.  So I know what it means to bust ass and still try to put a more-or-less healthy meal on the table.  It’s not easy.  But it can be done.

You see?  I managed to make all of the above suggestions without once using words like “criminally” or “abused” or “lay-about” or “parasites” or “dead-beats” or similar expressions, or citing to some non-existent study to “prove” my points.  But over at the National Report and Conservative Frontline they’ve got to go that extra mile.  Given how fragile trustworthiness is in a universe like the internet, I can’t say that I could ever again trust something from their sites.  Pity.

[Update (12 May 14):  In reply to M. Simon’s question (thanks for commenting, by the way) as to whether “this post” was based on real studies or bogus ones, I’m assuming he’s referring to my post and not the posts I linked to.  I wish I’d remembered to bookmark that article on Appalachia I referred to, but I didn’t.  It was, however, in a “reputable” publication.  I can’t recall whether it was The Atlantic, or Bloomberg, or some other, but it was in a publication with some reputational stake in not just making stuff up.  As to overall observable purchasing patterns, I refer not only to what I’ve observed over the years myself, but also to several decades’ acquaintance with people involved in retail food, all the way from cash register jockeys to store owners.  They all have the same sets of comments, many of which boil down to, “You wouldn’t believe what gets bought with food stamps!”

As to the presence of processed sugar and high-fructose corn syrup in the national diet, by odd coincidence at lunch today I saw a physician from New York getting interviewed on Fox News on exactly this point.  He quoted numbers:  600,000 food products sold in America, and 80% of them contain “added sugar,” generally in the form of processed cane sugar or high-fructose corn syrup.  He held up a vial of what he represented was the sugar contained in one regular 12-oz. soft drink; it was a pretty thick test tube.  He then explained why high-fructose corn syrup is so insidious.  Apparently it suppresses release (he used the expression “shuts down”) of the hormone that tells your body you’re full and can stop eating.  And they showed side-by-side brain scans of the effect of sugar versus cocaine.

This doctor feller attributed the corn syrup’s popularity with food manufacturers to its price relative to cane sugar.  And there’s a tie-in to M. Simon’s comment here as well.  Cane sugar is extortionately expensive in the U.S. because of ridiculously high tariffs on imported sugar.  I can’t recall the source any more, but once upon a time I saw a figure of roughly a factor of five (or thereabouts; it’s been years since I saw that figure) as the cost increase that’s passed along to the American eater just in order to make domestic production pay.  And notwithstanding that cane sugar is not a “natural” crop in our part of North America (in the sense of maize or wheat, both of which will grow just jim dandy in most of the continent), pay it does.  To give an illustration of just how high up these ties go and how lucrative they are for the welfare recipient:  Apparently the person whom then-President Clinton was talking to on the phone while a now-famous intern was pleasuring him was one of the principals in the leading domestic sugar producer.  Not that “ordinary” processed cane sugar is healthy by any stretch, but this particular piece of corporate welfare is not only massively increasing the cost of living to Americans at large, but it’s also indirectly contributing to significant increases in the incidence of morbid obesity.]

[Update (19 May 14):  While checking the weather for the next few days over at The Weather Channel, I ran into this link on the subject of added sugar in breakfast cereal.  They’ve got a slide show of a group of cereals, each of which is at least 50% sugar by weight.  The winner is 88% sugar.  Plop a bowl of this in front of Junior and almost nine-tenths of what your child is shoveling into his face is processed sugar.  One pattern which struck me is how many of the cereals on this list are puffed-wheat products.  I remember having un-sugared puffed wheat cereal as a child, and it tasted like Styrofoam.  I also remember having un-sugared rice puff cereal, and it tasted that way, only more so.  Regular corn flakes aren’t exactly packed with flavorful sensations either.  So why so many wheat puffs and not rice?  Why only one frosted flake product?  Maybe rice puff cereal has finally been moved over to the section with the monofilament tape, corrugated cardboard boxes, and other packaging products where it belongs?  In any event, if this Hall o’ Shame won’t put you off your feed, it ought to.]

[Update (15 Dec 14):  And for more on the subject of fructose’s effects on the body’s ability to recognize when it has taken on board enough fuel, we have this report.]

Because it Worked out so Well for General Motors

. . . When management airily assumed 8-9% annual rates of return on investment to fund its benefit obligations.  Excuse me, that’s Old General Motors, the one that soaked up several billions in outright taxpayers’ money (and was stolen from its creditors to be handed to the UAW in payoff for its electoral support), as well as about $16 billion worth of tax subsidy created by rifle-shot in the tax code (fuller details here).

Mayor De Blasio has presented his first city budget to the New York City council.  In true leftist fashion, he “balances” it by grinding his seed-corn, specifically reserves left from Bloomberg’s tenure.  I don’t carry a brief for li’l Nanny Bloomberg, but you have to give some sort of respect to a mayor who can squire a city through the upheavals of the September 11 aftermath, the implosion of the industry whose epicenter it is (the financial services industry), as well as five-plus years of general economy-wide decline and stagnation . . . and leave his successor a surplus at the end of the day.

I know that De Blasio is too “progressive” (he used the word something like five times in his presentation) to look back for reactionary purposes like seeing how his notions have played out for others who tried them.  He really ought, I suggest, to ponder the lessons of the Holodomor.  When Stalin announced compulsory collectivization, the peasants did the only thing they could to get at least some benefit from their generations’ toil.  They slaughtered and ate their livestock.  Then came the requisitioning commissions, and they took everything, leaving nothing even to plant for the next season.  How’d that work out?  Read about it, if you have the stomach, here.  Or here.

Also in true leftist fashion, he cranks up spending by 6% while “paying” for it from fantastical assumptions about unknown future revenues and unspecified, unenforceable “promises” from the city’s unions to cut healthcare spending — in the future, of course — by $3.4 billion.  Without any premium increases passed on to the rank-and-file.  This is in a world of “Affordable” Care Act plans the uniform feature of which is they cost fabulously more than what they’ve (compulsorily) replaced, because they’re mandated to cover a smorgasbord of benefits that earlier plans typically didn’t.  Like maternity care for 63-year-old males.  We are told not to worry, though, because if the unions don’t voluntarily comply with that pie-in-the-sky $3.4 billion promise, the cuts are going to happen forcibly.  Actually, the article’s paraphrase of De Blasio’s promise to respect them in the morning is “the city reserved the right to enforce some of the terms.”  Some; get it?

Left unmentioned is how they’re going to fit any of the “Affordable” Care Act’s Cadillac-plan tax burden into that $3.4 billion savings.  Dear Leader can utter executive orders all day long, but unless Congress actually chops that provision from the statute, eventually a large number of those union plans are going to get popped, and hard.  At which point they’ll discover something that the rest of us have long since figured out:  Taxes like that work out to be dead-weight losses.

The provision of the budget that really makes my head spin, however, is the bit about the hand-outs to unions (only the teachers are specifically mentioned, but there may be more).  They’re going to get — pay attention closely — retroactive pay increases.  That’s right; their contracts said they’d get paid $X.  They got paid $X.  Their contracts had expired, and they continued to get paid $X.  But now, after the fact and for no additional performance of any nature, they’re going to get paid $X+Y.  Of course, the teachers union vigorously supported Comrade De Blasio in his campaigns.  But This is Not a Payoff of Money for Votes, you understand?  No!!  Pay no attention to the man behind the curtain.

And while the teachers are going to get their money up front — while the Bloomberg surplus lasts, at least — “Much of the cost of retroactive pay for city teachers would not be paid until the last years of Mr. de Blasio’s theoretical second term.”  Hand the money over now; figure out where it’s coming from eight years from now.  Because we’ve got “more ‘accurate’ forecasting,” you see, so we know what the world, national, and local economies are going to be doing eight years from now.  Eight.  Years.  From.  Today.  You heard it there first, people; the city government of New York City is officially basing its long-range financial commitments on possession of a crystal ball.

On a final note, there’s a line in there about adding in $20 million for “student aid” programs at City University of New York.  For those who don’t recall, CUNY’s original mandate was to provide free or very-low-cost quality higher education to the city’s less-well-off.  For years it did pretty much exactly that.  Oh sure, it’s had its moments of comedy, such as Leonard Jeffries, but by and large it did what it was supposed to, and for many students did an outstanding job.  Those days appear to be ending, if they’re not already over.

Now CUNY is morphing into a comfy slush fund for sinecures, place-men, and political payoffs.  Recently, former Enron advisor and populist mountebank Paul Krugman got hired by CUNY to . . . well, that’s the point.  For his entire first year he’s been hired to do pretty much nothing.  Thereafter, he’s obliged to do only nearly nothing.  And for this he’s getting a base salary of $225,000 per year (with summers off, thereby increasing the annualized lick to an even $300,000), plus $10,000 for “expenses.”  So that’s $235,000 (plus payroll taxes, plus other benefits) cash out the door, each year.  Which means that over 1% of that $20 million in “student aid” is actually going to one man.  Who has been hired to do as close to nothing as you can imagine.

This is progress, folks, with a vengeance.

Department of Everything Old is New Again

Yesterday in Vienna the results of a survey study were published.  Those polled were Austrians over age 15.  They were asked their opinions about a number of things, including You Know What.

First, the good news.  Eighty-five percent agreed with the statement “democracy is the best form of government.”  Remember that number: 85%.  Thirty percent agreed with the proposition that the national socialist era (in Austria, at least) brought “only bad” things; another 31% agreed with the position that it brought “mostly bad” things.  Those two groups strongly correlated with whether the particular respondent had a “Matura” (the equivalent of the German Abitur, which is a level of academic challenge and achievement most Americans aren’t exposed to until their junior year in college, if then), and with whether the respondent had an overall optimistic view of his economic future.  The further good news is that the combined 61% who saw either primarily or exclusively bad things in the 1938-45 years represents an increase from 51% in 2005.  So in nine years we’ve seen a ten-point rise (a 19.6% relative increase) in the proportion of People Who Get It.

But, lest one get too congratulatory, 36% of the respondents agreed that the Nazi era brought “both good and bad” with it (the write-up doesn’t make clear whether the survey included questions to tease out the responsive question, “For whom?”).  I mean, I can partly understand at least the ethnic Germans figuring that, since the Anschluß ousted a government that was scarcely democratic or representative, and in fact was first cousin to the authoritarian state to the north, all they did was trade one thug for another.  On the other hand, it’s not as though Austria was poised for war in March, 1938, or that its military had been given instructions similar to those received (with blanched face and sweaty palms) by the German high command in November, 1937.  And it’s not as though pre-Hitlerian Austria was already rounding up and persecuting its Jews.

What’s alarming is that 3% of the respondents agreed that the national socialist era brought “primarily good” to Austria.  I guess all you can do is observe that there’s one in every crowd, and in fact, it seems, at the rate of 3 per 100.

More disturbingly, 56% agreed that it is time to “end the discussion of the Second World War and the Holocaust.”  Yeah, because talking too much about a monstrous crime in which your society played a leading role makes it so much less likely that someone else will go goose-stepping down your path.  American chattel slavery ended 150 years ago next spring.  Scholars are still parsing through the surviving records and evidence and still finding new facets to explore, new insights to gain, new lessons with resonance for human relationships in the 21st Century.  The twelve years of national socialism left incomparably greater documentary residue, and the Last Pertinent Question on the war and its implications for humanity isn’t likely to be asked or answered in my lifetime.  But hey! Austria’s Got Talent! or whatever crap they watch over there.

You can to some degree write off that 56%.  Half the human population is of below-average intelligence (that’s not invidious; it’s statistics).  It’s not reasonable to expect that lower half of the curve to have the imagination to suspect the vast scope of the unexplored that remains out there in any field of contemplation as complex as what went down from 1933-45, and in fact the years preceding it and following.  While it sounds callous, you can write them off because there’s no reason to suppose they’ve been listening to the discussion in the first place.

The genuinely alarming data point from this survey is the number — 29% — who agreed that what Austria needs is “a strong Leader who does not need to worry about parliaments and elections.”  Oh dear.

For starts, don’t think that 29% figure is small enough to ignore.  The Nazis themselves in Germany only topped out at 43.9% in their last election (05 March 1933), and that was after they’d taken power, after the Reichstag fire, after arresting most of the socialist and communist party leadership, and after loosing the Sturmabteilung in its tens of thousands on the streets.

Secondly it gives an idea of how high a proportion of the population (i) seeks its salvation in government action, and (ii) views that action as itself a normative positive value.  As Jonah Goldberg points out in Liberal Fascism, one thing the fascistic parties of Europe (and their leftist sympathizers in America) all shared in common is an express faith in the value of action, forceful action, action that brooks no delay for deliberation.  “Bold, persistent experimentation” (FDR), anyone?

This 29% number suggests that a large proportion of one’s fellows has not contemplated how much easier it is to do harm than good, how much easier it is to un-do good than harm, and finally, how susceptible to the law of unintended consequences governmental action is.  When Calvin Coolidge’s father was elected to the Vermont legislature, his son, by then a Massachusetts state senator (I’ve slept since I read this, and I don’t think he’d been elected governor yet), wrote him a note.  It was much, much more important, Calvin wrote his father, to thwart bad legislation than it was to pass good.  Calvin Got It.  Wanting a “strong leader” who can “cut through the red tape” and “get things done” without all that pesky give-and-take, all that empty vaporing debate, is strong evidence that one is dealing with someone who simply has not attended to the world around him very carefully.  [Ironically it was Coolidge and Dawes, grinding through the federal budgets line by line, who actually in the literal sense eliminated the use of the red tape that had been used to bind government documents.  That anecdote is in Amity Shlaes’s recent biography of Coolidge.]

Finally, 29% thinking what one needs is a strong leader who need not bother with legislatures and elections, while 85% think democracy is the best form of government, suggests that a sizable proportion of the Austrian population is politically schizophrenic.  Do the arithmetic: 29% plus 85% comes to 114%, so at a minimum 14% of respondents agreed with both propositions.  Guys:  You cannot square those two positions into any relationship other than diametric opposition.  Holding those two thoughts simultaneously and consistently is not possible.

You have to wonder whether the survey designers shoved in questions which, together or in a single question, restated the guts of the Ermächtigungsgesetz (translation here) and then asked for the agree/disagree response.  I wonder how many, relative to 29%, would have agreed with the proposition that what Austria needs is legislation granting the country’s Leader the power to do those certain specific things which the Reichstag granted Hitler in 1933.