Wednesday, November 22, 2006

 

What matters most . . .

___________________________________________________________________

My mom left us four years ago yesterday. When I hear this song, I think of her:

Broken windows and empty hallways,
a pale dead moon in a sky streaked with grey.
Human kindness is overflowing,
and I think it's gonna rain today.

Scarecrows dressed in the latest styles,
the frozen smiles to chase love away.
Human kindness is overflowing,
and I think it's gonna rain today.

Lonely, lonely.
Tin can at my feet,
I think I'll kick it down the street.
That's the way to treat a friend.



Bright before me the signs implore me:
Help the needy and show them the way.
Human kindness is overflowing,
and I think it's gonna rain today.

Lonely, so lonely.
Tin can at my feet,
I think I'll kick it down the street.
That's the way to treat a friend.

Bright before me the signs implore me:
Help the needy and show them the way.
Human kindness is overflowing,
and I think it's gonna rain today.



"I Think It's Going to Rain Today," Bette Midler

Saturday, November 18, 2006

 

Sad, Michigan plays proudly, loses at Columbus

___________________________________________________________________

Michigan outscored the Buckeyes 25-14 in the second half, holding Smith to 50 passing yards and a 1-for-9 start over those final two quarters.

Two plays beat Michigan, both sad short-yardage plays in which Michigan misplaced its safeties and allowed the kind of touchdown runs it had not allowed all season.



No excuses: we were out-coached in the first half, and we outplayed them in the second half.

An odd, but sad, memory of Bo, who lost national championship shots in 1970, 1972, 1977, 1979, 1985, and 1989 by one-score margins in the biggest game of the season. Sad. And though I don't want to see a rematch, I think that if the polls mean anything and the BCS system means anything, there is no question that these are clearly the two best teams in the country. No one can seriously question that.

No one.

Chicago
18 November 2006

Friday, November 17, 2006

 

Bo.... Bo.... Bo.... Our hearts bleed maize and blue for you tonight

___________________________________________________________________

The team, the team, the team! You can hear him still saying it. The team.

Schembechler became Michigan's 13th head coach after the 1968 season, succeeding Bump Elliott. At Michigan, Schembechler became one of college football's greatest coaches. He won a school-record 194 games, lost only 48, and tied five, for a winning percentage of .796. His teams never posted a losing season. In Big Ten Conference play, he had a record of 143-24-3, a winning percentage of .850. In his first twelve seasons, he won 84% of his games.


His Michigan teams won or shared 13 Big Ten titles and made 10 Rose Bowl appearances. Schembechler led the Wolverines to a total of 17 bowl games in 21 years, placing him ninth in all-time bowl appearances. He was voted national coach of the year in 1969 by both the American Football Coaches Association and the Football Writers Association of America.

Schembechler's greatest victory came in his first season, when he led the Wolverines to an upset victory over a standout Ohio State team coached by his old mentor, Woody Hayes. Hayes' Buckeyes dominated the series during the late 1950s and for most of the 1960s as Michigan fielded a number of uncharacteristically mediocre teams. Hayes' 1968 team made it clear how far Michigan had fallen behind its traditional rival, when the Wolverines lost 50-14. At the end of the game, Hayes decided to pursue a two-point conversion rather than a simple kick for an extra point. Legend has it that when Hayes was asked why he "went for two," he responded "Because I couldn't go for three." The embarrassment of that outcome set the stage for the 1969 rematch.

In 1969, the Buckeyes came into the game as defending national champions and 17-point favorites with the top ranking in the country and a 22-game winning streak. Many observers regarded Hayes' 1969 squad, which included five first-team all-Americans, among the best college teams of all time. But Schembechler's 7-2 Wolverines dominated what Hayes later admitted was his best team, 24-12. In a single afternoon, Schembechler and his charges resurrected Michigan's grand but moribund football tradition and returned it to college football's elite — a perch it has maintained ever since. Both Schembechler and Hayes — personal friends until Hayes' death in 1987 — agreed it was Hayes' best team and Schembechler's biggest victory. Many consider Michigan's win over Ohio State in 1969 one of the greatest upsets in college football history and the most significant win for a Michigan team ever.

After that game, the Wolverines and Buckeyes proceeded to engage in a fierce "Ten Year War" that elevated the already storied Michigan-Ohio State rivalry into perhaps college football's greatest annual grudge match. For ten years the two dominated the Big Ten, splitting ten conference titles between them and finishing second eight times. After a decade of memorable on-field stratagems, sideline antics, and locker room psychological ploys, the two coaches came out almost dead even, Schembechler holding a slim 5-4-1 advantage.

Date of birth: April 1, 1929
Place of birth: Barberton, Ohio, USA
Date of death: November 17, 2006
Place of death: Southfield, Michigan, USA
Sport: Football
Title: Head coach
Overall record: 234-65-8
Awards: 1969 Paul "Bear" Bryant Award
Championships: 13 Big Ten titles

For some men, a very few, their essence cannot be captured by a description or discussion of their accomplishments. That is true of great men.

That was Bo.

Chicago
17 November 2006

Thursday, November 16, 2006

 

Small in stature, he championed small is beautiful: Greatest living economist and liberal dies at 94

__________________________________________

He championed the idea that freedom cannot exist without free markets, an enduring idea that changed the complexion of societies. A Keynesian after World War II, he became a liberal populist and defined economic thought as it came to prevail over the last quarter-century.
_________________________

This remembrance from the New York Times today:

Milton Friedman, a Leading Economist, Dies at 94

By HOLCOMB B. NOBLE
Published: November 16, 2006
Milton Friedman, the grandmaster of conservative economic theory in the postwar era and a prime force in the movement of nations toward lesser government and greater reliance on free markets and individual responsibility, died today. He was 94 years old.

Conservative and liberal colleagues alike viewed Mr. Friedman as one of the 20th century’s leading economic scholars, on a par with giants like John Maynard Keynes, Joseph A. Schumpeter and Paul Samuelson.

Flying the flag of economic conservatism, Mr. Friedman led the postwar challenge to the hallowed theories of Lord Keynes, the British economist who maintained that governments had a duty to help capitalistic economies through periods of recession and to prevent boom times from exploding into high inflation.


In Professor Friedman’s view, government had the opposite obligation: to keep its hands off the economy, to let the free market do its work. He was a spiritual heir to Adam Smith, the 18th-century founder of the science of economics and proponent of laissez-faire: that government governs best which governs least.

The only economic lever that Mr. Friedman would allow government to use was the one that controlled the supply of money — a monetarist view that had gone out of favor when he embraced it in the 1950s. He went on to record a signal achievement, predicting the unprecedented combination of rising unemployment and rising inflation that came to be called stagflation. His work earned him the Nobel Memorial Prize in Economic Science in 1976.

Rarely, his colleagues said, did anyone have such impact on both his own profession and on government. Though he never served officially in the halls of power, he was always around them, as an adviser and theorist. In time, his influence was felt around the world.

“His thinking has so permeated modern macroeconomics that the worst pitfall in reading him today is to fail to appreciate the originality and even revolutionary character of his ideas,” said Ben S. Bernanke, now chairman of the Federal Reserve, in a speech honoring Mr. Friedman in 2003.

Professor Friedman was also a leading force in the rise of the “Chicago School” of economics, a conservative group within the department of economics at the University of Chicago. He and his colleagues became a counterforce to their liberal counterparts at the Massachusetts Institute of Technology and Harvard, influencing close to a dozen American winners of the Nobel prize in economics.

It was not only Mr. Friedman’s anti-statist and free-market views that held sway over his colleagues. There was also his willingness to create a place where independent thinkers could be encouraged to take unconventional stands as long as they were prepared to do battle to support them.

“Most economics departments are like country clubs,” said James J. Heckman, a Chicago faculty member and Nobel laureate who earned his doctorate at Princeton. “But at Chicago you are only as good as your last paper.”

Alan Greenspan, the former Federal Reserve chairman, said of Mr. Friedman in an interview Tuesday: “From a longer-term point of view, it’s his academic achievements which will have lasting import. But I would not dismiss the profound impact he has already had on the American public’s view.”

Mr. Greenspan said that Mr. Friedman came along at an opportune time. The Keynesian consensus among economists, which had worked well from the 1930s, could not explain the stagflation of the 1970s, he said.

But he also said Mr. Friedman had made a broader political argument, which is at the heart of his classic book “Capitalism and Freedom”: that you have to have economic freedom in order to have political freedom.

As a libertarian, Mr. Friedman advocated legalizing drugs and generally opposed public education and the state’s power to license doctors, automobile drivers and others. He was criticized for those views, but he stood by them, arguing that prohibiting, regulating or licensing human behavior either does not work or creates inefficient bureaucracies.

Mr. Friedman insisted that unimpeded private competition produced better results than government systems. “Try talking French with someone who studied it in public school,” he argued, “then with a Berlitz graduate.”

Once, when accused of going overboard in his anti-statism, he said, “In every generation, there’s got to be somebody who goes the whole way, and that’s why I believe as I do.”

In the long period of prosperity after World War II, when Keynesian economics was riding high in the West, Mr. Friedman alone warned of trouble ahead, asserting that policies based on Keynesian theory were part of the problem.

Even as he was being dismissed as an economic “flat earther,” he predicted in the 1960s that the end of the boom was at hand. Expect unemployment to grow, he said, and inflation to rise, at the same time. The prediction was borne out in the 1970s. Paul Samuelson labeled the phenomenon “stagflation.”

Mr. Friedman’s analysis and prediction were regarded as a stunning intellectual accomplishment and contributed to his earning the Nobel prize for his monetary theories. He was also cited for his analyses of consumer savings and of the causes of the Great Depression; he blamed government in large part for it, saying government had bungled early chances for recovery. His prestige and that of the Chicago school soared.

Government leaders like President Ronald Reagan and Prime Minister Margaret Thatcher of Britain were heavily influenced by his views. So was the quietly building opposition to Communism within the East Bloc, including intellectuals like Vaclav Klaus, who later became prime minister of the Czech Republic.

As the end of the century approached, Professor Friedman said events had made his views seem only more valid than when he had first formed them. One event was the fall of socialism and Communism, which the economist Friedrich A. Hayek had predicted in 1944 in “The Road to Serfdom.” In an introduction to the 50th-anniversary edition of the book, Professor Friedman wrote that it was now clear that “progress could be achieved only in an order in which government activity is limited primarily to establishing the framework within which individuals are free to pursue their own objectives.”

“The free market is the only mechanism that has ever been discovered for achieving participatory democracy,” he said.

Professor Friedman was acknowledged to be a brilliant statistician and logician. To his critics, however, he sometimes pushed his data too far. To them, the debate over the advantages or disadvantages of an unregulated free market was far from over.

Milton Friedman was born in Brooklyn on July 31, 1912, one of four children, and the only son, of Jeno S. Friedman and Sarah Landau Friedman. His parents worked briefly in New York sweatshops, then moved their family to Rahway, N.J., where they opened a clothing store.

Mr. Friedman’s father died in his son’s senior year at Rahway High School. Young Milton later waited on tables and clerked in stores to supplement a scholarship he had earned at Rutgers University. He entered Rutgers in 1929, the year the stock market crashed and the Depression began.

Mr. Friedman attributed his success to “accidents”: the immigration of his teenage parents from Czechoslovakia, enabling him to be an American and not the citizen of a Soviet-bloc state; the skill of a high-school geometry teacher who showed him a connection between Keats’s “Ode on a Grecian Urn” and the Pythagorean theorem, allowing him to see the beauty in the mathematical truth that the sum of the squares of the sides of a right triangle equals the square of the hypotenuse; the receipt of a scholarship that enabled him to attend Rutgers and there have Arthur F. Burns and Homer Jones as teachers.

He said Mr. Burns, who later became chairman of the Federal Reserve Board, instilled in him a passion for scientific integrity and accuracy in economics; Mr. Jones, who was teaching at Rutgers while pursuing a doctorate at the University of Chicago, interested him in monetary policy and a graduate school career at Chicago.

In his first economic-theory class at Chicago, he was the beneficiary of another accident — the fact that his last name began with an “F.” The class was seated alphabetically, and he was placed next to Rose Director, a master’s-degree candidate from Portland, Ore. That seating arrangement shaped his whole life, he said. He married Ms. Director six years later. And she, after becoming an important economist in her own right, helped Mr. Friedman form his ideas and maintain his intellectual rigor.

After he became something of a celebrity, Mr. Friedman said, many people became reluctant to challenge him directly. “They can’t come right out and say something stinks,” he said. “Rose can.”

In 1998, he and his wife published a memoir, “Two Lucky People” (University of Chicago Press).

His wife survives him, along with a son, David Friedman, and a daughter, Janet Martel.

That fateful university class also introduced him to Jacob Viner, regarded as a great theorist and historian of economic thought. Professor Viner convinced Mr. Friedman that economic theory need not be a mere set of disjointed propositions but rather could be developed into a logical and coherent prescription for action.

Mr. Friedman won a fellowship to do his doctoral work at Columbia, where the emphasis was on statistics and empirical evidence. He studied there with Simon Kuznets, another American Nobel laureate. The two turned Mr. Friedman’s thesis into a book, “Income from Independent Professional Practice.” It was the first of more than a dozen books that Mr. Friedman wrote alone or with others.

It was also the first of many “Friedman controversies.” One finding of the book was that the American Medical Association exerted monopolistic pressure on the incomes of doctors; as a result, the authors said, patients were unable to reap the benefits of lower fees from any real price competition among doctors. The A.M.A., after obtaining a galley copy of the book, challenged that conclusion and forced the publisher to delay publication. But the authors did not budge. The book was eventually published, unchanged.

During the first two years of World War II, Mr. Friedman was an economist in the Treasury Department’s division of taxation. “Rose has never forgiven me for the part I played in devising and developing withholding for the income tax,” he said. “There is no doubt that it would not have been possible to collect the amount of taxes imposed during World War II without withholding taxes at the source.”

“But it is also true,” he went on, “that the existence of withholding has made it possible for taxes to be higher after the war than they otherwise could have been. So I have a good deal of sympathy for the view that, however necessary withholding may have been for wartime purposes, its existence has had some negative effects in the postwar period.”

After the war, he returned to the University of Chicago, becoming a full professor in 1948 and commencing his campaign against Keynesian economics. Robert M. Solow, of M.I.T., a Nobel laureate who often disagreed with Mr. Friedman, called him one of “the greatest debaters of all time.” But his wisecracking style could infuriate opponents like the British economist Joan Robinson, who called him a “paper tiger.”

Mr. Samuelson, also of M.I.T., who was not above wisecracking himself, had a standard line in his economics classes that always brought down the house: “Just because Milton Friedman says it doesn’t mean that it’s necessarily untrue.”

But Professor Samuelson said he never joked in class unless he was serious — that his friend and intellectual opponent was, in fact, often right when at first he sounded wrong.

Mr. Friedman’s opposition to rent control after World War II, for example, incurred the wrath of many colleagues. They took it as an unpatriotic criticism of economic policies that had been successful in helping the nation mobilize for war. Later, Mr. Samuelson said, “probably 98 percent of them would agree that he was right.”

In the early 1950s, Mr. Friedman started flogging a “decomposing horse,” as Mrs. Thatcher’s chief economic adviser, Alan Walters, later put it. The horse that most economists thought long dead was the monetarist theory that the supply of money in circulation and readily accessible in banks was the dominant force — or in Mr. Friedman’s view, the only force — that should be used in shaping the economy.

In the 1963 book “A Monetary History of the United States, 1867-1960,” which he wrote with Anna Jacobson Schwartz, Mr. Friedman compiled statistics to buttress his theory that recessions had been preceded by declines in the money supply. The same was true of the Great Depression, he found, in attributing it to Federal Reserve bungling. And it was an oversupply, he argued, that caused inflation.

In the late 1960s, Mr. Friedman used his knowledge of empirical evidence and statistics to calculate that Keynesian government programs had the effect of constantly increasing the money supply, a practice that over time was seriously inflationary.

Paul Krugman, a Princeton University economist and New York Times columnist, said Mr. Friedman then managed “one of the decisive intellectual achievements of postwar economics,” predicting the unprecedented combination of rising unemployment and rising inflation that later came to be called stagflation.

In this regard, his Nobel Prize cited his contribution to the now famous concept “the natural rate of unemployment.” Under this thesis, the unemployment rate cannot be driven below a certain level without provoking an acceleration in the inflation rate. Price inflation was linked to wage inflation, and wage inflation depended on the inflationary expectations of employers and workers in their bargaining.

A spiral developed. Wages and prices rose until expectations came into line with reality, usually at the natural rate of unemployment. Once that rate is achieved, any attempt to drive down unemployment through expansionary government policies is inflationary, according to Mr. Friedman’s thesis, which he unveiled in a speech to the American Economic Association in 1968.

For years economists have tried to pinpoint the elusive natural rate, without much success, particularly in recent years.

Mr. Friedman, the iconoclast, was right on the big economic issue of that time — inflation. And his prescription — to have the governors of the Federal Reserve System keep the money supply growing steadily without big fluctuations — figured in the thinking of economic policy makers around the world in the 1980s.

Mr. Friedman also pursued his attack on Keynesianism in a more general way. He warned that a government allowed to regulate the economy could not be trusted to keep its hands off individual liberties.

He had first been exposed to this line of attack through his association with Mr. Hayek, who was predicting the failure of Communism and “collectivist orthodoxy” in the early 1940s in his book “The Road to Serfdom.” In an introduction to a 1971 German edition, Professor Friedman called the book “a revelation particularly to the young men and women who had been in the armed forces during the war.”

“Their recent experience had enhanced their appreciation of the value and meaning of individual freedom,” he wrote.

In 1962, Mr. Friedman took on President John F. Kennedy’s popular inaugural exhortation: “Ask not what your country can do for you. Ask what you can do for your country.” In an introduction to “Capitalism and Freedom,” a collection of his writings and lectures, he said President Kennedy had got it wrong: You should ask neither.

“What your country can do for you,” Mr. Friedman said, implies that the government is the patron, the citizen the ward; and “what you can do for your country” assumes that the government is the master, the citizen the servant. Rather, he said, you should ask, “What I and my compatriots can do through government to help discharge our individual responsibilities, to achieve our several goals and purposes, and above all protect our freedom.”

It was not that Mr. Friedman believed in no government. He is credited with devising the negative income tax, which in a modern variant — the earned income tax credit — increases the incomes of the working poor. He also argued that government should give the poor vouchers to attend the private schools he thought superior to public ones.

In forums he would spar over the role of government with his more liberal adversaries, including John Kenneth Galbraith, who was also a longtime friend (and who died in May 2006). The two would often share a stage, presenting a study in contrasts as much visual as intellectual: Mr. Friedman stood 5 feet 3; Mr. Galbraith, 6 feet 8. Though he had helped ignite the conservative rebellion after World War II, together with intellectuals like Russell Kirk, William F. Buckley Jr. and Ayn Rand, Mr. Friedman had little or no influence on the administrations of Presidents Dwight D. Eisenhower, Kennedy, Lyndon B. Johnson and Richard M. Nixon. President Nixon, in fact, once described himself as a Keynesian.

It was a frustrating period for Mr. Friedman. He said that during the Nixon years the talk was still of urban crises solvable only by government programs that he was convinced would make things worse, or of environmental problems produced by “rapacious businessmen who were expected to discharge their social responsibility instead of simply operating their enterprises to make the most profit.”

But then, after the 1970s stagflation, with Keynesian tools seemingly broken or outmoded, and with Ronald Reagan headed for the White House, Mr. Friedman’s hour arrived. His power and influence were acknowledged and celebrated in Washington.

With his wife, Rose Director Friedman, in 1978 he brought out a best-selling general-interest book, “Free to Choose,” and went on an 18-month tour, from Hong Kong to Ottumwa, Iowa, preaching that government regulation and interference in the free market were the stifling bane of modern society. The tour became the subject and Mr. Friedman the star of a 10-part series on public television in 1980.

In 1983, having retired from teaching, he became a senior fellow at the Hoover Institution at Stanford University.

The economic expansion in the 1980s resulted from the Reagan Administration’s lowered tax rates and deregulation, Professor Friedman said. But then the tide turned again. The expansion, he argued, was halted by President Bush’s “reverse-Reaganomics” tax increase.

What was worse, by the mid-1980s, as the finance and banking industries began undergoing upheavals and money began shifting unpredictably, Mr. Friedman’s own monetarist predictions — of what would happen to the economy and inflation as a result of specific increases in the money supply — failed to hold up. Confidence in his monetarism theory waned.

Professor Robert Solow of M.I.T., a Nobel laureate himself, and other liberal economists continued to raise questions about Mr. Friedman’s theories: Did not President Reagan, and by extension Professor Friedman, they asked, revert to Keynesianism once in power?

“The boom that lasted from 1982 to 1990 was engineered by the Reagan administration in a straightforward Keynesian way by rising spending and lowered taxes, a classic case of an expansionary budget deficit,” Mr. Solow said. “In fairness to Milton, however, it should be said that one of the reasons for his wanting a tax reduction was to force the spending cuts that he presumed would follow.”

Professor Samuelson said that “Milton Friedman thought of himself as a man of science but was in fact more full of passion than he knew.”

Mr. Friedman remained the guiding light to American conservatives. It was he, for example, who provided the economic theory behind such “prescriptions for action,” as his one-time professor, Jacob Viner put it, as the landslide Republican victory in the off-year Congressional elections of 1994.

By then the 5-foot-3-inch, 130-pound Professor Friedman had grown into a giant of economics abroad as well. Mr. Friedman was sharply criticized for his role in providing intellectual guidance on economic matters to the military regime in Chile that engineered a coup in the early 1970s against the democratically elected president, Salvador Allende. But, for Mr. Friedman, that was just a bump in the road.

In Vietnam, whose constitution was amended in 1986 to guarantee the rights of private property, the writings of Mr. Friedman were circulated at the highest levels of government. “Privatize,” he told Chinese scholars at a meeting in Shanghai’s Fudan University, as he told those in Moscow and elsewhere in Eastern Europe: “Speed the conversion of state-run enterprises to private ownership.” They did.

Mr. Friedman had long since ceased to be called a flat-earther by anyone. “What was really so important about him,” said W. Allen Wallis, a former classmate and later faculty colleague at the University of Chicago, “was his tremendous basic intelligence, his ingenuity, perseverance, his way of getting to the bottom of things — of looking at them in a new way that turned out to be right.”

Louis Uchitelle and Edmund L. Andrews contributed reporting.



Let us say a prayer for him, who taught us so much.

Chicago
16 November 2006

 

Words of humility, courage, and kindness, to strive to live by . . .

___________________________________________________________________

From the eulogy of Senator Robert F. Kennedy:

"'Some believe there is nothing one man or one woman can do against the enormous array of the world's ills. Yet many of the world's great movements, of thought and action, have flowed from the work of a single man. A young monk began the Protestant reformation; a young general extended an empire from Macedonia to the borders of the earth; a young woman reclaimed the territory of France; and it was a young Italian explorer who discovered the New World, and the 32 year-old Thomas Jefferson who [pro]claimed that "all men are created equal."

These men moved the world, and so can we all. Few will have the greatness to bend history itself, but each of us can work to change a small portion of events, and in the total of all those acts will be written the history of this generation. It is from numberless diverse acts of courage and belief that human history is shaped. Each time a man stands up for an ideal, or acts to improve the lot of others, or strikes out against injustice, he sends forth a tiny ripple of hope, and crossing each other from a million different centers of energy and daring, those ripples build a current that can sweep down the mightiest walls of oppression and resistance.

Few are willing to brave the disapproval of their fellows, the censure of their colleagues, the wrath of their society. Moral courage is a rarer commodity than bravery in battle or great intelligence. Yet it is the one essential, vital quality for those who seek to change a world that yields most painfully to change. And I believe that in this generation those with the courage to enter the moral conflict will find themselves with companions in every corner of the globe.

For the fortunate among us, there is the temptation to follow the easy and familiar paths of personal ambition and financial success so grandly spread before those who enjoy the privilege of education. But that is not the road history has marked out for us. Like it or not, we live in times of danger and uncertainty. But they are also more open to the creative energy of men than any other time in history. All of us will ultimately be judged, and as the years pass we will surely judge ourselves on the effort we have contributed to building a new world society and the extent to which our ideals and goals have shaped that event.

The future does not belong to those who are content with today, apathetic toward common problems and their fellow man alike, timid and fearful in the face of new ideas and bold projects. Rather it will belong to those who can blend vision, reason and courage in a personal commitment to the ideals and great enterprises of American Society.* Our future may lie beyond our vision, but it is not completely beyond our control. It is the shaping impulse of America that neither fate nor nature nor the irresistible tides of history, but the work of our own hands, matched to reason and principle, that will determine our destiny. There is pride in that, even arrogance, but there is also experience and truth. In any event, it is the only way we can live.'

That is the way he lived. That is what he leaves us.

My brother need not be idealized, or enlarged in death beyond what he was in life, to be remembered simply as a good and decent man, who saw wrong and tried to right it, saw suffering and tried to heal it, saw war and tried to stop it.

Those of us who loved him and who take him to his rest today, pray that what he was to us and what he wished for others will some day come to pass for all the world.

As he said many times, in many parts of this nation, to those he touched and who sought to touch him:

'Some men see things as they are and say why.
I dream things that never were and say why not.'"


Ted Kennedy on his brother Bobby, 8 June 1968.

Chicago
November 16, 2006

 

Time and Chance Happeneth to Us All

___________________________________________________________________

To his Coy Mistress

by Andrew Marvell


Had we but world enough, and time,
This coyness, lady, were no crime.
We would sit down and think which way
To walk, and pass our long love's day;
Thou by the Indian Ganges' side
Shouldst rubies find; I by the tide
Of Humber would complain. I would
Love you ten years before the Flood;
And you should, if you please, refuse
Till the conversion of the Jews.
My vegetable love should grow
Vaster than empires, and more slow.
An hundred years should go to praise
Thine eyes, and on thy forehead gaze;
Two hundred to adore each breast,
But thirty thousand to the rest;
An age at least to every part,
And the last age should show your heart.
For, lady, you deserve this state,
Nor would I love at lower rate.

But at my back I always hear
Time's winged chariot hurrying near;
And yonder all before us lie
Deserts of vast eternity.
Thy beauty shall no more be found,
Nor, in thy marble vault, shall sound
My echoing song; then worms shall try
That long preserv'd virginity,
And your quaint honour turn to dust,
And into ashes all my lust.
The grave's a fine and private place,
But none I think do there embrace.

Now therefore, while the youthful hue
Sits on thy skin like morning dew,
And while thy willing soul transpires
At every pore with instant fires,
Now let us sport us while we may;
And now, like am'rous birds of prey,
Rather at once our time devour,
Than languish in his slow-chapp'd power.
Let us roll all our strength, and all
Our sweetness, up into one ball;
And tear our pleasures with rough strife
Thorough the iron gates of life.
Thus, though we cannot make our sun
Stand still, yet we will make him run.

Tuesday, November 14, 2006

 

On Friends

___________________________________________________________________

I seldom discuss my philosophical or religious views, with anyone really. They tend to be private. So to discuss friends, the subject of which falls into that category, is rare, but seems appropriate.

Moving around a lot as a kid, I always had a new set of friends, and I made an effort to keep up with them. Friends I had when I was 13 remain my friends today; and at 16; and at 18; and so on. I enjoy deep friendships with interesting people, and I have been lucky to have known many interesting people, a subset of whom have been very good friends. They have included everyone from a feed and grain dealer in Iowa to a Supreme Court reporter and a federal appeals court judge. Through them runs a common strain of ambition, but more important, amongst those to whom I feel closest, is always the notion that principle prevails over personal interest.

You choose friends, and yet family is there in the end. Both are an important part of community. The sad thing about friendship is that, as we get older, those intense and wonderful experiences we had when young with our friends give way to more family relationships, which have their own power. It is a sadness in the sense of Shakespeare's line, parting is such sweet sorrow.

My friends have always been an eclectic group of people. Ha, ha. If I were a college admissions director, my friends would look like a diverse entering class. From conservative Orthodox Jews to atheist socialists, but all good people.

I think that while I can argue well, I shy away from it because I feel the damage it does to someone's psyche when you are a friend yet disagree. Odd to be highly principled and yet not to express those views. And friends have taught me that the personal is the political. If not in our everyday lives, when?

My friends, however, have always been, in some sense, my heroes. I admire them. I try to be friendly with everyone; it is the right thing to do. But I cannot be good friends with someone whose values and principles I do not admire, even if they do not overlap with my own. Indeed, my two best friends for most of my adult life have very, very different political views, such that we have long since intentionally called moratoriums on any such conversations beyond passing ones - though we are each very interested in politics in our own way.

Friends bring out the better angels of our natures. I think that is it, at bottom.

Wilmette, IL
November 15, 2006

Sunday, November 12, 2006

 

Ideology without Issues or Iraq

___________________________________________________________________

The elections were about Iraq. We all know this. Frustration with W. has been extant for ages, or so it now feels, as he holds the dubious distinction of being the most unpopular president other than Nixon during Watergate. W. has accomplished this tremendous feat in times of economic surfeit, without major political scandal, without major inflation or deflation, and without deep recession or depression. The country, all in all, is in the best shape it may ever have been in economically, and yet the voters voted "no confidence" on Tuesday.

On the surface, this points to Iraq, no doubt. And that issue may not go away any time soon. Most likely, the moral course, having "broken it," is to act like we own it, to invoke Powell's Pottery Barn metaphor. Tony Blair speaks eloquently on this subject. He is obviously correct. Yes, obviously. The reasons for the war are immaterial. The question now is how to help those poor Iraqis who want the best for their country and likely are not the type of folk to pick up an RPG or an AK-47.

But politics may prevail. Republicans in Congress and those with Presidential ambitions must see the obvious simplicity in "just saying get out," whatever the normative content of that statement. This, I think, is a likely result, and a sad one. I was opposed to the war, for, as most wars go, many people die. Now that there is a war, however, it seems we owe it to those whose lives we have disrupted to leave them in peace - a moral commitment.

This is a digression from my main concern here. Those who watch say, rightly, that the Republican party has not been the same since it won the Cold War. That was the great motivating force of my youth, the moral imperative, the defeat of communism and the ideas it represented. That victory is so complete that most do not even think of communism other than as an anachronism. Even social democrats are largely discredited, at least in American politics.


Likewise, Democratic victories have rendered their campaigns toothless. It is not that they lack candidates, though it may be true that momentous times bring out the best in us, and these are not momentous times, so that they lack candidates of true stature. (Isn't that true of almost all Republican candidates since Reagan, though?) It is that the Democrats won the Civil Rights struggle; they implemented the Great Society; and economic growth has made a very large part of the country well-off. Indeed, some problems may in the long term stem from our current and recent economic successes (more on this on another occasion). Regardless, Democratic policy victories have made electoral strategies hard on them. Of course, one might ask: why then worry? We've won.

The closest issue the Republicans have, and not one that resonates, is abortion. The closest issue the Democrats have, and not one that resonates, is universal health care coverage. Set against the great issues of prior stages in American history - the Cold War, the Great Society, the Civil Rights Movement, Abolition and Slavery, Suffrage - these issues pale in comparison, like a 21st Century abstinence movement. They are not enduring or a fundamental part of who we are as a people -- not constitutive, in truth. They are not so defining of the character and direction of our society, and they do not capture the attention of any broad segment of it. This leaves both parties toothless. American politics is played not just between the 40s, as the football metaphor goes, but between the 48s. In a very narrow range. And it is therefore not surprising that, when voters are used to that narrow range of debate, any momentary or short-term jolting issue will sway substantial numbers of voters. Of course, it is not clear that the frustration with W. will lead to broader shifts. A poll released today showed that Americans overwhelmingly favor raising the minimum wage, including 48 percent of Republicans. This can only be a sign, I think it is fair to suppose, that, to paraphrase a famous Latin phrase in the law, frustration with one thing means frustration with all things for which W. stands. Even though the political landscape may shift, I doubt that anything other than superficial issues will remain the focus of public debate. We are fortunate not to live in "interesting times."

How will this play out in the long run? Over the last 225 years, parties have morphed; some have completely fallen away while others have grown up. I suspect, however, that we will not know until some defining issue confronts us again. Let us hope it is not world calamity, such as global warming, or blight leading to vast starvation (the reserves of world food are relatively small). It is little wonder that the center, as Peter Hart describes it, is tired of the wings of each party carping about issues that are not defining. And the corollary is that we should not feel a need to falsely devise issues which are not of the same moment as Civil Rights or the Cold War, but should enjoy this relatively brief, as the history of the world goes, reprise of peace and prosperity.

Leave well enough alone.

Saturday, November 11, 2006

 

Maize and Blue

___________________________________________________________________

Michigan has not been 11-0 since 1997. Michigan and Ohio State have not both been unbeaten since 1973, when they played to a famous 10-10 tie in Ann Arbor. No two teams from the same conference have both been 10-0 or better since 1935. Is this going to be a game or what?

The only blemish on the game is that ABC has moved it to 3:30 p.m. EST from the venerable noon starting time that has been the rule for decades.

Ohio State leads the nation in scoring defense; Michigan leads the nation in rush defense and time of possession. Jim "the Emperor" Tressel will play mind games. Lloyd "Used" Carr will be classy as always.

Steve Breaston chose the next-to-last game of his regular-season career to show that he could catch the ball. It may be that Mario Manningham will be a decoy. If Henne could make it through a progression of three receivers with accuracy, Michigan would have to be the dominant favorite. Unfortunately, Henne -- though very accurate -- lacks the ability to go much beyond his first look. Look for Ohio State to exploit that.


Texas [!], Auburn, Louisville, and California [!!] -- all top 10 teams with fewer than two losses -- lost today. Florida came within a finger of losing. Will there be a re-match? No. Michigan doesn't want a re-match. It wants and needs to show that it can dominate this game on defense. Only that will define a successful season for them. Much like their 24-12 victory in the 1969 game that was Bo's first match-up against Woody. That is the definition of a classic, for Michigan fans. Defensive domination of Troy Smith. We shall see.

The Horseshoe. 164 hours, 20 minutes.

 

Forty Acres and a Mule

___________________________________________________________________

A few summers ago, the Supreme Court handed down its 5-4 decision in the Michigan Law School affirmative action case. A law school classmate was, and still is, serving as dean. I had not followed the case closely, but I was somewhat taken aback to read the opinion. Knowing two members of the faculty, I dashed off angry e-mails. I found it offensive not that they were using affirmative action simpliciter, but that the University had sought to cover up its practice by claiming that there was no quota. In fact, in each year that was the subject of the suit, the percentage of incoming minority students matched exactly the percentage of applicants in that racial group. My former classmate acknowledged, though not sheepishly, that they had stated during the suit that there was "some attention paid to numbers" during the admissions process.

The citizens of the state have now sought to undo all that, leaving no apparent room for the recruitment of deserving students from troubled Detroit, where many of my classmates as an undergrad at Michigan had grown up, but at the same time rightly reacting to the arrogance of the University's leadership.

The more important point to be made, however, is that the case, and Michigan's use of affirmative action, is so clearly symbolic. The real issues of race in our society stem from stagnant class poverty and a lack of upward mobility; violence in the inner city; failed systems to protect families and children. Having prosecuted in Washington, DC during formative years, I came to be appalled at the tragedy that was half the nation's capital -- more than half, in land mass and population. Yet it has changed little since I left my job, except that the murder rate has roughly halved.

40 acres and a mule.

Sunday, June 04, 2006

 

Dub-ya drowning

___________________________________________________________________

Polls for the better part of 9 months have had W. at record lows, if one excludes Nixon from the record books. (It's the reverse-Roger Maris asterisk.)

What happens if the Democrats re-take the House? Frank, Dingell, and others like Conyers become committee chairs. Pelosi becomes the Speaker. The subpoena power and the power to hold hearings switch hands. The torture issue; prisoners at Gitmo; Abu Ghraib; Iraq policy in general; the NSA wiretap practices; the house-cleaning at the CIA; the reliance on faulty WMD intelligence: these and many other topics become obvious subjects of hearings. Some involve possible questions of criminal activity, and you can be assured that the Democrats will be looking for any way to run W. out or, more strategically, to keep him around but embarrass him completely, with claims of Presidential criminality.

How did we get here? Good question, if I do say so myself (which I do). We got here because W. has to be the least charismatic President since before memory runneth. (My memory ceases at Hoover, whom I have not seen in talking newsreels, though in later years he seemed a gentle man and had something of a renaissance after 1945.) Even Nixon had his own likeability, in a sort of twisted way. Certainly no one ever thought he was stupid (an idjut, in the vernacular), and he still receives kudos from many quarters for his foreign policy insights. Which leads to electoral politics lesson number one: never elect a President without charisma. When the going gets tough, they have no reserve power with which to recover. The Democrats were actually done a favor, this thesis goes, when Dukakis was beaten soundly by 41. Even losing Presidents -- Ford, Carter, LBJ -- have had their fans for life because they had some basic likeability. I could go on, but I am leading to the next question.

Where do we go from here? OK, let's leave aside the misery of the next two years with W. (who, when his mother suggested he shouldn't run, should have listened). What will happen in '08? The first question is Hillary. Can the Democrats win with her? An interesting poll was run by some progressive left bloggers who appeared on Russert's book show this weekend. Their thesis, in part, was that the Democrats have suffered from Beltway-itis. As evidence, their poll showed that only 2% of their readership wanted Hillary. Russ Feingold won amongst the outside-the-Beltway crowd. Unfortunately for the Democrats, organization often counts for a great deal more than merit. But Hillary at 2% amongst their own seems a surprising figure.

Before we get to Gore, who is doing his best to look good, I must say, let's consider that the Democrats have not won in a landslide since 1964. The Republicans have had several true landslides and won more than 400 electoral votes in numerous elections during that time period. So it may be that mere political inertia, despite W.'s best attempts to kill the Republican party, will keep any Democrat out of office.

Gore is looking better than in the past. But I have never understood why a Bill Bradley, or a Russ Feingold, intelligent Democrats with leadership qualities and some charisma in their own way, were not first choices amongst the party faithful. They have heft and personality. I suppose one would have to be, as the authors on Russert's show suggest, inside the Democratic Beltway bubble to understand truly. The Democrats, however, appear to have some real choices, if they eschew the obvious. Was Bill Clinton obvious?

Finally, I think the Democrats have to accept two realities of their politics. They are no longer coalition builders. Large parts of the party have been left out as many Catholic and many Southern voters in the coalition have been excommunicated. That is a huge electoral reality. I do not think it is simply a matter of rebuilding an infrastructure, as Russert's guests suggest. Absent a huge Republican gaffe (which W. has done his best to deliver up on a silver platter in the form of Iraq policy), they also must accept the reality that they are the victims of their own successes. They won the arguments on the social welfare state to a large extent, and they won the civil rights battles, all ending in the 1960s. Their issues are no longer bread-and-butter issues that affect large numbers of Americans. Health care comes the closest, and it is becoming a greater issue in part because of its effect on the health care American business can provide to its workers as those costs increase. But that issue will take time to mature, and it is likely that Republicans -- having long ago forsaken any type of principled opposition to entitlement spending -- will shift their stances to slowly permit Democratic policy to prevail, but in ways that do not upset Republican control of the political branches. (Even W. saw the handwriting on the wall after getting pummeled in '00 over prescription drugs and winning only in the Electoral College.)

W. is still drowning. Can the Democrats replace him? And how do we undo the mess that is Iraq now?

Sunday, May 21, 2006

 

The NSA: Not my big brother, please.

___________________________________________________________________

For nearly eight years, I worked as a federal prosecutor. About two-thirds of this time was spent doing narcotics work. An important part of many narcotics investigations is wiretaps and pen-register analyses. Wiretaps are just what they sound like: the FBI (usually) gets a court order to listen in on and record a phone line suspected of use by criminals engaged in criminal activity. Title III warrants, as they are known, require a great deal more than ordinary search and seizure warrants. Among other things, the Deputy Attorney General must sign off, while ordinary warrants require only the signature of your garden-variety Assistant United States Attorney. Listening must also be "minimized," and there are strict renewal procedures and time limitations.

Pen registers are a less invasive method of checking phone lines. They sound much like what the NSA does, or did, or what some phone carriers may have done for it, sort of, if they are to be believed in their carefully worded denials.

Pen registers are obtained under the same portions of the criminal code that contain Title III authority. A pen register, as it is referred to, is a court order that requires a phone company to monitor and turn over to the government all numbers calling and called by a certain phone line. The standard for obtaining a pen register is much lower than probable cause, but doing so does require the approval of a federal magistrate or judge, as a warrant does.

The fear with the NSA project is not its current uses, but its potential abuses -- the slippery slope in a big brother world. (Well, some object to having their phone line data turned over to the government now, and I understand why, though the volume of data creates only a small probability that any one of us will be singled out.) And the slippery slope apparently -- unlike our pen register -- does not have anyone from another branch of government at the top, checking against uses that might slide down the slope too fast, so to speak. Indeed, in the case of Title III warrants and pen registers, the other two branches of government are involved in two different ways: Congress created the scheme, and the courts must pass judgment on the Executive's request. The Administration ran through neither of these checks, creating, as they say, no balance.

Liberty is an odd thing. Almost any individual exercise of it can be successfully attacked on utilitarian grounds, except that most of these attacks fail when one applies some version of the Categorical Imperative: what if it were always done this way?

Obviously the Administration did not want to seek approval for fear of tipping off its methods, making them, it says, easier to evade -- as though terrorists weren't already aware that use of phones might be subject to various forms of eavesdropping. It is easy to understand why, in the month after 9/11, the Administration didn't ask. Everyone was scared. As with many other things -- like the continued detention of prisoners at Gitmo, or the non-trial of Jose Padilla and his lack of legal representation until even Rehnquist could not stand the Administration's chutzpah -- it is inexplicable why the Administration continues to take these positions years later, as though we were in some pitched battle of war, and as though, except in the most extreme and obvious cases, that were enough.

But this stubbornness is not unique to this issue. It would be nice if this Bush could replace his own stubbornness with the word his father so often used, prudence. Between expediency and principle.

Thursday, April 27, 2006

 

Islam and Anti-Intellectualism

___________________________________________________________________

I am in the process of reading a short introduction to Islam. It includes a brief biography of Mohammed; a discussion of Muslim history; Islamic law; Islamic theology; and other subjects. It covers Arabic contributions to science and mathematics but, in a chapter about Islam in the modern world, also discusses a notable limitation on Islamic thought and progress.

Apparently, the traditional Islamic view is that the revelations of Mohammed represent the last, best word of God. There has therefore been a great reluctance, and antipathy, toward learning Western ideas and adopting Western discoveries. This, for example, became a very real issue in India after the British began to govern. To put it in shorthand, Hindus accepted the Western educational system and sent their children to the schools; Muslims rejected it. This led to the insinuation of Hindus into positions in the administration of British rule, to the exclusion of the Muslims.

At some point, a Muslim leader in India began to encourage Muslims to adopt Western scholasticism, but he suffered a great deal of criticism and attacks.

I have not read other accounts, nor finished the one I am reading. It is, however, a very interesting account of the underpinnings of what we perceive as fundamentalism and opposition to progress in the Western sense -- what bin Laden might describe, in part, as Western materialism.

I acknowledge that my account here is far from complete, but I also note that I find this notion offensive. If there is anything that God has given man, assuming the existence of God, it is the power of reason. And from reason follows so much of what we know as the modern world, in so many fields.

Here is how it starts:



THE CHALLENGE OF THE MODERN WORLD

It remains to say something about the great changes which have taken place in the world in the present century, especially the last half-century, and about how these affect Islam and Muslim-Christian relations. To understand these changes, however, we must go further back.

1. THE ADVANCE OF EUROPE AND THE WEST

Until about 1700 the Ottoman empire and the Western European Countries were roughly equal in military power. Even before then, however, the Europeans had been developing in ways which the Muslims were unable to follow. From the Muslims the Western Europeans had learned methods of improving sailing ships, and this ultimately enabled them to produce ships capable of crossing the Atlantic and other oceans. This led to the discovery of America by Columbus in 1492, but more important for European-Muslim relations was the discovery by Vasco da Gama in 1498 of the route round the Cape of Good Hope to the East Indies. This led to a great growth in trade by this route - trade that was entirely in the hands of the Europeans. For reasons that are not clear, the Muslims of India and Indonesia did not send their ships in the opposite direction, despite their long tradition of seafaring.



More later.

Sunday, April 16, 2006

 

What can be counted . . .

___________________________________________________________________

Is our collective ethos that what can be counted -- hours, dollars, material things -- matter, and what cannot -- love, happiness, truth, family -- doesn't?

 

Tops of the 20th Century: The Scientists

___________________________________________________________________

Top ten lists are always fun. An errant thought led me to wonder who were the most influential people of the 20th century. There are many ways of looking at the question, but I think, for my money, the top four are almost trivially easy to pick: Pope John Paul II, Winston Churchill, Ronald Reagan, and Albert Einstein, not necessarily in that order. Their achievements and personal presence magnified each other, and their achievements endured. Karol Wojtyla served as pontiff for over a quarter of a century; Einstein won the Nobel Prize and produced a series of scientific discoveries that changed the way we see the world in a way that had not happened since Newton; Reagan had trouble gaining the Republican nomination in 1976 against an unelected president who had pardoned a crook, yet twelve years later he had steadied the world through the collapse of communism, relegating its ideas to the trash heap of history; and Churchill influenced British politics for about four decades and was a lone but sturdy voice of hope both during the war (1940) and after (the iron curtain).

The rest of the list has to be more difficult to ascertain. There are the "worst dressed" ... the megalomaniacal who happened to come into power and held it ruthlessly or, worse, projected their evil: Tojo, Hitler, Stalin, Mao, Pol Pot. There are those who worked change through the force of their moral authority: Gandhi, Martin Luther King, Tutu, Nelson Mandela. There are political leaders of importance -- Kennedy, Roosevelt, Woodrow Wilson, Ho Chi Minh, Marshall (of the Marshall Plan), and, we must not forget, Thatcher and Mikhail Gorbachev -- all of whom moved history in significant ways through policies that made a sharp break from the past or made one possible. The artists, like Pablo Picasso and Jackson Pollock, provided beauty in a way we had not known before. And the scientists: Marie Curie, Lise Meitner, Schroedinger, Jonas Salk, and Alexander Fleming, who discovered penicillin. Inventors like Edison and Ford.

Who makes up your list? Scientists and medical researchers operate in relative obscurity but have had, especially in the 20th century, the greatest impact on the human condition. Three generations before me, in the 1920s, two of my great-grandparents died, while my grandparents were still teenagers, of diseases now cured or effectively so: one died of diabetes before insulin (once diabetes set in, average life expectancy prior to insulin was about 18 months), and the other died of an infection from a tooth abscess -- yes, a toothache -- before penicillin, developing lockjaw, a condition virtually unknown today.

The shame is that I do not even know the names of the wonderful scientists, working in obscurity, who have done so much to expand our food supply, making famine today a matter of the inability to obtain food rather than of any lack of food itself.

Washington, DC

Saturday, April 15, 2006

 

Understanding Islam

___________________________________________________________________

Raised Lutheran, and attending church and Sunday school every week until I was 18, I have a good understanding of Christian culture. I know its flavors, especially the differences between Protestants and Catholics, and I have some sense of its place in European history, both from Sunday school and from courses in history. With somewhat less certainty, I have an understanding of Jewish culture. The Old Testament overlap and friends have given me at least an outline understanding.

But I must admit I have no real understanding of Islam and of Muslim values and thought, let alone Islam's history and interaction with cultures from Indonesia to Morocco. Virtually none. I know there is a Mecca, that there once was a caliph, and I have heard of the Ottoman empire, but I lack other such basic understandings -- such as how Islam migrated to the Indian subcontinent and beyond, where Mohammed was born, or how Islam moved, centuries ago, into what once was Yugoslavia.

I ordered several books on Amazon.com to fill in some knowledge, including some on history and culture and an English translation of the Koran.

The main goal for me is to gain an understanding of Islam's ethical and moral principles. Some questions, I suspect, are not easily answered -- for example, tolerance. Not so long ago, Christians burned "heretics" at the stake. Muslims may have had similar practices, but that obviously does not immediately distinguish the religions. Similarly, al Qaeda attacks Western materialism, but so does the Pope. Closer examination is required. But it is hard even to pose the right questions without at least a deeper understanding. Why, for example, have women attained a place in Western culture, and is that at all related to religious views (but see priests in the Catholic Church)? And why do women appear to lack the same place in Muslim culture?

More after reading more. Understanding begins with listening, or its equivalent.

Sunday, April 09, 2006

 

Defining Populism

___________________________________________________________________

Stan Greenberg wrote a book last year, The Two Americas: Our Current Political Deadlock and How to Break It, in which he described the polarization of American politics. His polling showed that approximately thirty percent of the country is hard-right, thirty percent hard-left, and the forty percent in the center wish that the sixty percent dominating the debate would stop debating so stridently. It shows up in the Senate with judicial nominations. It shows up in other ways, but the divisions are always characterized with words like "ideological" or "culture war" or "religion" or "Blue state/Red state."

By and large, the divisions do not break down along class or socio-economic lines. The lines are depressing to all but those who are fighting.

I want to start a political debate along different lines, reframing American democracy in terms of a long-standing category that drove elections from Andrew Jackson on: populism. The first step is to define what is meant by American populism, perhaps a unique feature of this polity. The last step will be to ask whether populism is dead.

Who?

Who have been the populists in American politics? Andrew Jackson; William Jennings Bryan; Jimmy Carter; Robert LaFollette; Ronald Reagan; Harry Truman; Huey Long; Teddy Roosevelt; Barry Goldwater.


What?

What has characterized populists? The question could be put, what do populists believe? But it seems that both Jimmy Carter and Ronald Reagan ran as populists, at least Carter in 1976 and Reagan in 1980, yet the two shared few beliefs in common. One could argue that Carter lost his electoral appeal when he stopped being a populist, of the people, and started being a preacher, above the people. Remember the malaise speech? Not only was it a depressing message, it was a preachy message -- as was so much of what took him away from the winning smile and the fresh, outside-Washington approach that won the election for him in 1976.

Obviously populists are characterized by an appeal to a distrust of power -- Teddy Roosevelt as a trust buster, for example; or Andrew Jackson as an opponent of a national bank; or Ronald Reagan as a cowboy riding into Washington to bring a fresh perspective to the politicians inside the beltway. Anti-elitist, perhaps; anti-intellectual, probably. One wants to suggest that populists seek mass appeal, but any politician in a democracy seeks mass appeal. Does a populist seek out some lowest common denominator and rely on a large plurality, while politicians who define themselves in other ways build coalitions that are more fragmented?

So let us start by saying that populists are those who distrust power and who seek to gain power by attacking those who hold it -- financial, political, bureaucratic, judicial ... power. We may need to expand this definition, but let us start there. It is easy to see how, in a country built upon the principle that government must be limited and cabined, populism would have a natural appeal -- how it would remain an ever-present undercurrent of popular sentiment so long as the basic values upon which the country was founded maintained their currency.

Our definition, however, still seems limited. It also raises an important question: why do so few American politicians present themselves as populists, whether or not they would use the word to define themselves? That, given Greenberg's unassailable analysis, is the important question.

To be continued.

Friday, April 07, 2006

 

Little Wonder They Wanted to Discredit Wilson

___________________________________________________________________

This article is a very interesting read, for the facts:

The Recorder
December 9, 2005
News, Pg. 4, Vol. 129, No. 238

Pardon Libby Now

By Lawrence J. Siskind

Within a week of the indictment of I. Lewis Libby, Vice President Dick Cheney's ex-chief of staff, Democrats demanded that President Bush pledge in advance not to pardon him. So much for the presumption of innocence. But the Democrats had a point. It is worth considering now -- before Libby is tried and before he is financially ruined by lawyer fees -- whether a pardon is appropriate.

President Bush should respond to the Democrats' effrontery in invoking the pardon issue at this early stage by seeing and raising them with even greater effrontery. He should pardon Libby. Now.

The power of the president to pardon emerges from Article II of the Constitution. The exercise of this power is virtually unchecked and frequently controversial. Democrats have reason for concern over the White House's exercise of the power. After all, the last time they held the power, President Clinton scandalized the nation by waiting until the final hours of his presidency to issue 140 pardons and 36 commutations. Among the many questionable beneficiaries were Marc Rich and Pincus Green, who had been indicted for tax evasion, fraud, racketeering and selling oil to Iran while Americans were being held hostage. They had fled the country rather than answer the charges. Their pardons followed large contributions to the Clinton Library by relatives and friends, including one $450,000 contribution by Rich's ex-wife. Clinton's last-minute pardons were also controversial because many were the result of lobbying efforts by Roger Clinton, Hugh Rodham and Tony Rodham, who traded on their close family ties to the White House to earn substantial fees interceding for organized crime figures.

But unlike the Clinton pardons, there would be no political or financial benefit to George Bush from pardoning Libby. On the contrary, such a pardon would generate a political firestorm. Yet a pardon would be the right thing to do.

First, the indictment is another example of bootstrap prosecution, where a prosecutor, having failed to find wrongdoing, ends up indicting someone for lying during the failed investigation. Second, Libby did not "out" Valerie Plame. He could not have done so, because she hadn't been "in" since 1997. Third, and most importantly, Libby faces charges not because of criminal activity, but because he got caught in the middle of a fight between the White House and the CIA over control of foreign policy. Libby ought to be pardoned even if the White House pays a political price for doing so. For it is important to establish, especially in a time of war, where ultimate control over foreign policy rests.

The 1982 Agent Identities Protection Act makes it a crime to disclose the identity of a covert CIA employee. The possibility of an unlawful disclosure was the reason the CIA demanded an investigation, following a July 14, 2003 Robert Novak column that identified Valerie Plame as a CIA employee. Eight days earlier, Plame's husband, Joseph Wilson, had published a New York Times op-ed piece accusing the administration of dishonesty in its handling of information related to Saddam Hussein's attempt to purchase uranium in Africa.

The Libby indictment makes no mention of the 1982 act. And for good reason. The act requires that the person whose identity is disclosed actually be covert (which requires a foreign assignment within five years of the revelation), that the government take "affirmative measures" to conceal the person's identity, and that the revealer know that the government is taking those measures.

Regardless of CIA job titles and classifications, Valerie Plame was not "covert" in any real sense of the word. She has had no recent foreign assignments. On the contrary, she has been commuting to and from her desk job at CIA headquarters in Langley, Va., since 1997. Anyone with any interest in the matter, certainly including any foreign intelligence service, could have discovered her CIA affiliation merely by following her daily routine.

Nor has the CIA taken "affirmative measures" to conceal her identity. The CIA did not object when her husband published his op-ed piece, nor when he followed up with a round of newspaper and television interviews.

Now Wilson has as much of a constitutional right to promote himself as Paris Hilton does. But when he did so, and when the CIA let him do so, they both knew that his frenetic self-glorification would generate interest in who he was, how he came to be selected for the mission, and what, if any, grounds existed to view his conclusions skeptically. The CIA sent Wilson to Africa at the suggestion of his wife. By sending him on the mission, and then allowing him free rein to promote himself, the CIA virtually assured that the public would learn of his wife's connection to the CIA and his mission.

Instead of invoking the 1982 Agent Identities Protection Act, the Libby indictment is based on 18 U.S.C. sections 1001, 1503, and 1623, which make it a crime to obstruct justice or make false statements.

When the indictment was announced, the New York Times devoted its entire front page above the fold to the story. The San Francisco Chronicle did them one better, devoting its entire front page, above and below the fold. Readers of those newspapers might have thought the Republic was tottering. But readers of the actual 22-page indictment would have found it astonishingly insipid.

Libby's "crimes" involve alleged contradictions between his testimony during the investigation and his actual conversations with members of the press. Here are the contradictions in all their criminal horror, as cited in the indictment.

Libby testified to the grand jury that when Tim Russert of NBC News asked him whether he knew that Wilson's wife worked for the CIA, Libby "was surprised to hear that." (Par. 32.a) The truth, according to the indictment, was that Russert did not ask him whether he knew that Wilson's wife worked for the CIA. Instead, Libby already knew that she worked for the CIA. (Par. 33)

Libby testified that he advised Matthew Cooper of Time Magazine and Judith Miller of the New York Times that he had heard other reporters saying that Wilson's wife worked for the CIA, but that he (Libby) did not know whether this was true. (Par. 32.b and c) The truth, according to the indictment, was that Libby did not tell Cooper and Miller that he did not know whether Wilson's wife worked for the CIA. Rather, Libby confirmed without qualification that he had heard that she did. (Par. 33)

This is what the special prosecutor has been investigating for two years? If this were a civil case for fraud, Libby could demur to the complaint, and the demurrer would probably be sustained without leave to amend. For the prosecutor could not credibly claim that anyone detrimentally relied on Libby's apparent efforts to "spin."

The Libby indictment follows the path of the Martha Stewart and Bill Clinton prosecutions. In both cases, investigators failed to find evidence of indictable offenses. But in the course of the investigations, Stewart and Clinton both lied. Liberals were outraged when the multimillion-dollar Whitewater investigation led to an indictment over Clinton's manipulation of an intern for oral sex. Libby may have manipulated Judith Miller. Neither form of manipulation was criminal.

Justice Ruth Bader Ginsburg, in her concurring opinion in Brogan v. United States, cautioned that the same statute underlying the Libby indictment could be abused. "[A]n overzealous prosecutor or investigator -- aware that a person has committed some suspicious acts, but unable to make a criminal case -- will create a crime by surprising the suspect, asking about those acts, and receiving a false denial." At this stage, we do not know whether Libby's version of events, or the reporters' version, is correct. Certainly, there are reasons to trust Libby. After all, none of the reporters would talk to the investigators until Libby released them from their journalist vow of silence. If Libby feared they would contradict him, why would he have released them to talk? But whether Libby's or the reporters' version is correct -- or whether this was just a case of busy people remembering the same years-old conversation in different ways -- this remains a bootstrap indictment. Justice Ginsburg's warning still rings true.

Public outrage over the episode has been fueled by the impression that the indictment was part of an administration effort to uncover, and so endanger, a female covert CIA agent, in retaliation for her husband's criticism of its policies. As seen, the indictment is not based on any such wrongdoing. But was retaliation in the air, even if the prosecutor decided he could not make a case for it against Libby?

Bluntly put, Wilson's critique was a pack of lies. Whatever the truth might be behind the indictment, Libby had every right to defend the administration against a mendacious critique by disclosing the nepotism underlying its author's selection for the Niger mission.

In his op-ed piece, Wilson claimed that the mission originated with Vice President Cheney. In his round of interviews and in his instant memoir, Wilson consistently denied that his wife played any role. "Valerie had nothing to do with the matter," Wilson wrote.

But the vice president's office has denied foreknowledge of the mission, and no evidence has surfaced to contradict it. On the contrary, all the evidence has contradicted Wilson. In July 2004, the bipartisan Senate Intelligence Committee released a report stating that a CIA official had told the Senate committee that Plame "offered up" Wilson's name for the mission. She followed up her lobbying efforts with a Feb. 12, 2002 memo to a deputy chief in the CIA's Directorate of Operations citing her husband's qualifications. Wilson's wife, not the vice president, was the source of his assignment.

If Valerie Plame were sincerely worried about her supposed covert status, why would she lobby for her husband to undertake the mission, and then allow him to broadcast his role by publishing a controversial piece in the New York Times, granting interviews to other newspapers and appearing on talk shows?

In his op-ed piece, Wilson portrays himself as embarking on the mission with an open mind, interested only in the truth. But his wife told committee staffers that when she relayed the CIA's instructions to her husband, she said "there's this crazy report" about a purported deal for Niger to sell uranium to Iraq. From the outset, there was little doubt as to what the CIA and Wilson's wife expected he would find.

The Senate report also found that Wilson provided misleading information to the Washington Post. In June of 2004, he told the paper that he had concluded the uranium rumors were based on documents -- supposed sales agreements between Niger and Iraq -- that were obvious forgeries because "the dates were wrong and the names were wrong." But those documents were not in American hands until eight months after Wilson's trip.

According to the Senate report: "Committee staff asked how the former ambassador could have come to the conclusion that the 'dates were wrong and the names were wrong' when he had never seen the CIA reports and had no knowledge of what names and dates were in the reports." Wilson then admitted to the committee that he may have "misspoken" to reporters.

Wilson's op-ed piece misrepresented his actual findings. He wrote: "I spent the next eight days drinking sweet mint tea and meeting with dozens of people: current government officials, former government officials, people associated with the country's uranium business. It did not take long to conclude that it was highly doubtful that any such transaction had ever taken place."

But in fact, Wilson's reports to the CIA actually strengthened the evidence of Iraq's possible attempts to buy uranium in Niger. According to the Senate report, Wilson, when debriefed by the CIA, said that in June 1999 a businessman approached the former prime minister of Niger, asking him to meet with an Iraqi delegation to discuss "expanding commercial relations" between Niger and Iraq. The former prime minister understood the meeting to mean they wanted to discuss yellowcake sales. But he let the matter drop because of UN sanctions. Wilson also told his CIA debriefers that Iraq tried to buy 400 tons of uranium in 1998.

Why would Wilson have lied about his reported findings in his op-ed piece? In May 2003, shortly before he published it, Wilson had joined the Kerry presidential campaign as a foreign policy adviser. In October, when Wilson's publicity campaign was at its peak, the Kerry campaign registered the domain name for Wilson's www.RestoreHonesty.com Web site.

But ties between Wilson and the Kerry campaign cooled in July 2004, when the Senate Intelligence Committee released its report. The report concluded: "The former ambassador, either by design or through ignorance, gave the American people and, for that matter, the world a version of events that was inaccurate, unsubstantiated and misleading." In the wake of the report, Wilson resigned from the Kerry campaign, which quietly removed his Web site.

The op-ed piece was a partisan political attack. In politics, as in litigation, it is perfectly acceptable to discredit your adversary by revealing bias. If Libby told reporters about Wilson's wife's connection to the CIA (as the indictment claims), instead of the other way around (as Libby claims), he was quite justified in doing so.

Even though the indictment does not accuse Libby of "outing" Valerie Plame, even though selection of Joseph Wilson for the African mission and his subsequent press campaign all smack of nepotism and dishonesty, we are still left with the allegation that Libby misled the grand jury and FBI investigators. If the indictment is false, Libby should be vindicated at trial. If the indictment is true, why should the president pardon a man for lying, even if the lie occurred in response to a dishonest political attack?

The answer goes beyond the self-promotion, mendacity and rank political interests of the players on the other side of this beltway drama. It involves an institutional conflict.

From the buildup to the war to the present day, there has been a steady flow of leaks about the administration's Iraq policy. Every leak has sought to weaken the administration's case. On May 31, the New York Times carried a leaked story about a CIA-run airline. On Nov. 5, the Washington Post carried a leaked story about the existence of CIA-run prison camps throughout Europe. Both leaks may endanger real covert CIA personnel who are really working undercover. They may also endanger the governments and citizens of the states where these airlines and camps have been established. As London and Madrid know, the world can be a dangerous place for allies of the United States in the war on Islamic fascism.

Yet the CIA has done nothing about these leaks. The fact that its own people have been imperiled has not led it to demand an investigation.

At the same time, the CIA has permitted the publication of reports critical of administration policy. The Wilson op-ed was only one such publication. That same year, the CIA allowed the former head of its Osama bin Laden unit, Michael Scheuer, to publish a book entitled "Imperial Hubris: Why the West is Losing the War on Terror." The author criticized the 2003 Iraq invasion and accused the administration of pandering to Israel. (Wilson has voiced similar anti-Israel sentiments.) With all these leaks and authorized releases of CIA information, Robert Novak's July 2003 column stands out as the only report to arouse the CIA's concern.

Not coincidentally, Novak's column is the only report to discredit an administration critic and to bolster the administration's policies.

The CIA, as an institution, has never agreed with the White House on Iraq policy, and has consistently tried to sabotage that policy. This problem transcends politics. Sooner or later a Democrat will be president again. Democrats too will suffer if executive branch agencies feel they can carry out internecine warfare with the White House. Presidents of both parties once felt compelled to truckle to FBI Director J. Edgar Hoover for fear that he would use his secret stash of information against them. The spectacle of the White House incapable of controlling other executive branch agencies is dangerous for the country.

President Bush is in a fight over his wartime policies. His adversaries have shown themselves willing to resort to lies to undermine those policies. In wartime, commanders look out for their troops. Libby is not a hero. He is merely a Washington insider who played by the rules as he found them. Libby is in trouble now, not because he endangered a CIA agent, but because he stood up for the administration, like a good and loyal soldier, in a shadowy beltway turf battle. The president should stand up for Libby.

Thursday, April 06, 2006

 

Has Fitzgerald Failed?

___________________________________________________________________

Today it was revealed that the President had authorized Scooter Libby to discuss classified National Intelligence Estimates with reporters. The subject of the NIE was Iraqi attempts to obtain enrichable uranium. Libby testified to this before the grand jury.

Much of the hubbub this revelation is causing focuses on whether the President can be trusted if he ordered the leak of classified information. But last I checked, it is the President who has the ultimate authority to classify or to declassify information. And that is the way we should want it. Bureaucratic machinery should not stand in the way of direct Presidential determinations that declassification will serve the national interest. There is a caveat -- what about the names of undercover agents, covered by a statute that makes it a crime to disclose them, the very statute involved in the Plame case? I beg off on that question here, even though my instinct is that the President could, if the statute is read carefully and fairly, short-circuit any questions about its application -- prospectively only, of course -- simply by deciding to declassify an agent's position. I also recognize that the issue could raise important separation of powers questions, pitting the President's ability to conduct foreign policy and wage war against Congress' authority to circumscribe the manner in which the President does so. For example, no one would question Congress' authority to prevent the use of torture as a means of waging war, absent truly extreme circumstances. As I said, however, this is a digression.

The more serious question raised by the disclosure concerns why Libby is being prosecuted at all. Having been a federal prosecutor for 8 years, I learned that the most valuable quality of any good prosecutor is how he or she exercises judgment -- too much power is wielded over people's lives if that power is misused. The same is true in spades for a special or independent counsel, because the normal resource constraints that often cabin the exercise of that power are absent. That absence leaves a kind of void where there should be competing cases to limit a prosecutor's zealousness in pursuing any one matter or set of charges. There are normally other, important fish to fry.

So what about Fitzgerald’s judgment, so far presented as impeccable with a very American, middle-class aura about it? Should Libby have been indicted, with all it will cost him regardless of outcome?

Let's suppose for argument's sake that the President did not expressly authorize the leaking of Plame's name, and further that Libby did -- at least in his FBI interview and then in his first grand jury appearance -- tell a story about his knowledge of Plame that appears inconsistent with the timing of his disclosures of her identity to reporters. It seems clear now, and probably was never seriously in doubt, that Libby had authority to defend the President with at least some very sensitive classified information.

The argument has to be, then, that had the President expressly authorized the Plame leak, the President would have been violating the criminal statute that seeks to prevent the outing of CIA operatives. Libby, however, was not prosecuted on that basis, and, if the statute otherwise applied to Plame's position -- an issue never fully discussed by Fitzgerald -- then it is hard to understand why Libby is not being prosecuted under it. It must not apply for some reason, perhaps because of Fitzgerald's understanding of the level of specific intent required to establish the crime (i.e., does it require the intent to do the acts that would knowingly result in "outing," or does it require the intent to actually "out"?).

This all strongly suggests that there was not enough to establish that crime. So why is Libby being prosecuted for a lie he didn't really need to tell? The venerable canon of criminal construction, no harm, no foul, would seem to apply. On this version, he was trying to protect Cheney or Bush -- stupidly, but nonetheless only stupidly.

Which raises the problem with independent counsel all over again. We ended that practice because too many of the cases seemed to end in perjury prosecutions, when almost none of the work of a normal prosecutor involves prosecuting perjury or obstruction. That is not to say such charges can't be warranted, but it does raise -- and raised in everyone's mind -- the most serious questions about how these prosecutors were, perhaps in good faith, driven to exercise that awful power, prosecutorial discretion, which if used unwisely can unjustly ruin lives.

Fitzgerald needs to tell us where the there is. Is Scooter being prosecuted for being, in essence, stupid, or perhaps ill-advised about the risk to the President and Vice President? Should he be prosecuted when it appears clear that the President and Vice President gave him the authority to discuss other, closely related, highly classified information on the same subject? Even if the elements of the offense can technically be established, isn't this really a technical perjury prosecution with no real punch behind it?

Fitzgerald has some questions to answer.

Wednesday, April 05, 2006

 

Take my salary, please??

___________________________________________________________________

Recent news articles have discussed a boost by some firms in the salaries they pay first-year lawyers. The going rate has apparently leaped from $125,000 to $145,000. Not all firms have gone along, but the ritual increase that occurs every five or six years has sparked articles similar to those that appeared when salaries jumped to $125,000 in January 2000 -- now six years ago -- an increase spawned by the need for newly minted deal lawyers in Silicon Valley. The Wall Street Journal published an editorial whose theme was that new lawyers are being hurt by this salary increase. The thesis is that the higher salaries carry with them greater work demands.

I have always been suspicious, at least after age 5 or 6, of paternalistic suggestions that giving me more money would actually make me worse off. In fact, I don't think I've ever believed such a claim. Let's examine what is really going on.

True enough, young lawyers are taking it in the briefs. But not because their salaries are outpacing any reasonable measure of what they should be. Normally law firm salaries jump in large increments after several years of stickiness at one level. This is not surprising given the structure of law firms and the bargains they make with new lawyers. In an article in the American Economic Review, circa 1986, Dennis Carlton of the University of Chicago discussed his research on price stickiness and found that prices are most likely to grow on slow and stable paths in industries where purchasers and suppliers have long-term relationships. By contrast, first-year associates quickly become second-year associates, and so on, and law firms are bargaining with a whole new set of suppliers each year. Not surprisingly, the jumps in wages occur when critical demand shortages creep into the market. It happened as I graduated from law school because of merger activity and the competition for talent between law firms and investment banks. It happened in 2000 because of the need for (young, low-level) lawyers in Silicon Valley to do the many deals taking place, and to replace more senior lawyers who increasingly had more lucrative or interesting opportunities in the boom.

Now for a review of the actual statistics. When I graduated from law school in 1986, first-year lawyers were receiving $57,000 in Washington, D.C. Between 1986 and 2005, GDP grew by 180% in nominal dollars. For those young lawyers in the same job (assuming no productivity changes in their jobs) to keep pace in purchasing power with the rest of the economy, they would have to be paid over $159,000 -- more than 25% above the old going rate and still about 10% above what the new "high" salaries have become for first-year lawyers. There are other ways of looking at the data, but they all point to the same conclusion.
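
For the arithmetic-minded, here is a minimal sketch of that back-of-the-envelope comparison, using only the figures quoted above (a $57,000 starting salary in 1986, 180% nominal GDP growth through 2005, and going rates of $125,000 and $145,000); the variable names and the Python framing are mine, not anything from the original articles.

    # Back-of-the-envelope check of the salary comparison in the paragraph above.
    base_1986 = 57000          # first-year salary in Washington, D.C., in 1986
    gdp_growth = 1.80          # nominal GDP growth of 180%, i.e., a factor of 2.8

    gdp_indexed = base_1986 * (1 + gdp_growth)   # what the salary would be had it tracked GDP
    old_rate = 125000                            # going rate before the latest jump
    new_rate = 145000                            # the new "high" salary

    print(round(gdp_indexed))                    # ~159,600
    print(round(gdp_indexed / old_rate - 1, 2))  # ~0.28, i.e., more than 25% above $125,000
    print(round(gdp_indexed / new_rate - 1, 2))  # ~0.10, i.e., about 10% above $145,000

The same GDP-indexed baseline is the comparison behind the tuition figures a couple of paragraphs below.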

The premise of the Wall Street Journal editorial is that the higher salaries mean more expectations and more work for new lawyers. I am tempted to guffaw here, but my pity for the new lawyers stops me from doing so. The fact is that new lawyers are not even catching up to where they should be, and, assuming their productivity hasn't changed, they should be working the same as in 1986. Law firm partners, aided by people writing editorials in places like the Journal, have done a wonderful job of selling young lawyers on the idea that these new "high" salaries should carry with them an extra expectation of work. But the new lawyers are just keeping pace, and for most of the last six years they had actually fallen further and further behind, all the while hearing that they must work harder to justify these "outrageous" salaries. Quite a racket for the law firm partners, and very clever marketing on their part, though I suspect most are so oblivious to the relevant economic data, at least for the relevant periods of comparison, that they may actually believe what they say -- which is why new lawyers are taking it in the shorts.

A couple of other relevant mileposts may be of interest. Tuition over that time, as measured at my law school, is about 29% higher than it would be had it grown at the rate of GDP. This suggests that new lawyers are getting it from both ends: lower pay and higher educational costs. Fortunately, the total cost of education -- tuition, room, and board -- has not risen quite as precipitously; it is only about 5 percent more than it would be had it risen at the same rate as GDP.

It's unfortunate that young lawyers do not know these facts. And we could discuss how the facts are less relevant for lawyers 2-5 years out of law school.

But The Wall Street Journal, of all places, ought to apply a more discerning eye to claims made about numbers -- claims unaccompanied by numerical analysis of even slight depth -- when it aids law firms in duping young lawyers into believing that they have it so much better than their predecessors. Not so. The trend, even without considering the increased work burdens, is in the opposite direction.

If there is any silver lining, albeit of the schadenfreude variety, it is that the partners who manage the law firms are probably shopping similarly off-base analyses to their own partners, as different factions within firms become winners and losers, at a time when at many firms the partnership take is by any reasonable measure out of hand.

 

Equality's Impact on the Marketplace

___________________________________________________________________

In his famous book, The Economics of Discrimination, Gary Becker explains how we all pay for discrimination -- as economists would put it, in terms of reduced consumer welfare. How does this work? Jane, a black woman in the South in 1952, happens to have a brilliant mind for biochemistry. She would be a wonderful research doctor in a specialty field of her choosing. But Jim Crow excludes Jane from the best universities and thereafter from the jobs in those same universities. We all know how Jane is harmed. Becker's insight is that we are all harmed -- we "pay" for discrimination if we allow it to keep Jane in a job where she cannot be fully productive, because we would otherwise reap some consumer surplus from her productive mind. And it is a dead-weight loss. Breaking down barriers in job markets, like other forms of free competition in the face of state or quasi-state control of markets, thereby makes us all better off.
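
To make the point concrete, here is a toy numeric sketch -- my own made-up numbers, not Becker's model -- of where the dead-weight loss comes from: the gap between what Jane could produce in the work she is qualified for and what she produces in the job discrimination confines her to is value that no one captures.

    # Toy illustration of the dead-weight loss from discrimination (hypothetical numbers).
    value_as_researcher = 200.0   # annual value (arbitrary units) Jane could create as a research doctor
    value_in_barred_job = 60.0    # annual value in the job Jim Crow leaves open to her

    loss_per_worker = value_as_researcher - value_in_barred_job   # surplus no one receives

    # The loss scales with every worker kept out of the work she or he is best suited for.
    workers_excluded = 10000
    print(loss_per_worker)                      # 140.0 units per worker per year
    print(loss_per_worker * workers_excluded)   # 1,400,000.0 units per year across the excluded group

None of that forgone surplus is transferred to anyone else; it simply never gets produced, which is what makes it a dead-weight loss rather than a mere redistribution.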

This insight is an important one, but it is not the only impact. An open but seldom discussed secret of the job market is that the mass entry of women at all levels has depressed real wages through increased competition. Many two-income families see this when a couple feels that both need to work to maintain the standard of living their parents had on one income. Competition has consequences, not all of which favor a social structure amenable to giving children maximum attention during their pre-teen, and perhaps teen, years. Becker's dissertation adviser would say, "there is no such thing as a free lunch," and it's very unlikely that mom will be cooking it.

There have been other not-so-hidden consequences. When my mother and her sister were going to college in the mid-1950s, a professional woman became a secretary, a teacher, or a nurse. My mom had opted for nursing prior to her marriage. My aunt chose teaching, in elementary education. While my aunt had a wonderful and highly successful career, it is likely that, born 25 years later, she would have become something else, and my mother very likely a doctor (her grades were strong). They were in some sense losing out. The flip side is that many gained from the extraordinary ability (in economic terms, surplus productivity) of women whose careers were channeled toward areas like teaching and nursing. The current nursing shortage is evidence of that phenomenon. Many of those qualified to get through the college courses required for nursing could handle the coursework required to become doctors. The prior social caste system led to over-qualified nurses and teachers, from whom patients and schoolchildren benefitted.

The point, of course, is not to shy away from the impact these social reforms have had on the market, but to understand them for what they are so that we can take the next steps. Understanding why there is a nursing shortage, for example, makes it easier to see how to change the model of medical care, or of paying nurses, to reduce the loss of consumer surplus caused by the likely decline in talent in the nursing profession.

A friend raises the possibility that similar forces are at work in reducing the quality and depth of the nation's university professors. That interesting subject will be for another day.
