This past Sunday, Clark Hoyt, the public editor of the New York Times, wrote a column chronicling the "especially embarrassing correction" the Times had to make after publishing an appraisal of the late Walter Cronkite, the legendary CBS news anchor. The appraisal contained seven separate errors, which no doubt would have caused anguish to Mr. Cronkite, a person who built his career on precision and detail in his reporting.
The cause, Mr. Hoyt said, was inadequate editing of a piece written in haste by a "television critic with a history of errors." Errors can happen, even with a phalanx of editors who should catch them. People make mistakes. Throw in ever-present deadline pressures, poor communication and perhaps even recent newsroom cutbacks throughout the country and the error potential rises dramatically. The Times and the critic, Alessandra Stanley, both acknowledged responsibility for the shortcomings.
Two things trouble me about the incident. The first is that a number of editors (as Mr. Hoyt explains it) either missed the errors or in some instances compounded them. Once again, deadlines were partially blamed, but this type of piece should have been written, fact-checked and edited well before Mr. Cronkite's death. That it was not points to poor news management, which to me ranks equally in dishonor with the errors themselves.
What also bothers me is the seeming disconnect in Mr. Hoyt's description of Ms. Stanley. He initially notes her "history of errors," pointing out that she made so many errors in 2005 that the Times assigned her a single copy editor to check her facts. Then, a few paragraphs later, he says that she is "a prolific writer much admired by editors for the intellectual heft of her coverage of television." No disrespect intended to Ms. Stanley, but I have a difficult time reconciling "intellectual heft" with "history of errors," particularly for a journalist. That's like saying "I admire Dr. Stephen Hawking's intellectual heft in physics, but we have to keep telling him that two plus two is four, not five."
The Times has added an editor in the obituary department and Ms. Stanley is once again assigned special editing help. That's the way it is with journalism in the 21st century.
Note: I have edited and fact-checked the above piece. I hope it is error-free.
With all the health care reform talk, I thought I'd add a personal perspective.
My mother, who turned 90 this year, always saved everything when it came to official paperwork (like many others of her generation). So I wasn't much surprised when she handed me what was clearly a decades-old piece of paper. "Take a look at this," she said, offering me a faded pink document.
It was the Member's Copy of a Blue Cross Statement of Account. It wasn't until I straightened out the creases and peered more intently that I saw the paper was a breakdown of the hospital charges for my birth -- March 19, 1952 (you do the math).
For Mom's stay in Ward 323, the hospital charged $53 for a seven-day stay -- three days at $7 per and the remaining four at $8 a day. I don't think moms get to stay that long with no complications, do they? Next we add in $10 for the delivery room charges and $7 for my board -- I didn't take up much space, I guess, so I got the $1 a day rate. Throw in $2.25 for drugs (today's paper dispensing cup costs more, I think) and $5 in lab costs and you get the grand total of $77.25 for my entrance into this world.
Now at the time, Blue Cross had a flat-rate maternity benefit in my parents' policy of $9.50 a day for 10 days maximum. That made for a benefit of $66.50 -- leaving my folks to come up with $10.75 if they wanted to take me home.
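The bill's arithmetic is easy to check. A quick sketch (the line items are the ones from the statement described above; the code is just a tally):

```python
# Line items from the 1952 Blue Cross Statement of Account described above
room = 3 * 7.00 + 4 * 8.00   # seven-day stay in Ward 323: $21 + $32 = $53
delivery = 10.00             # delivery room charge
board = 7 * 1.00             # newborn's board at the $1-a-day rate
drugs = 2.25
lab = 5.00

total = room + delivery + board + drugs + lab   # $77.25
benefit = 7 * 9.50                              # flat maternity benefit, $9.50/day
out_of_pocket = total - benefit                 # what my folks had to come up with

print(f"Total: ${total:.2f}  Benefit: ${benefit:.2f}  Owed: ${out_of_pocket:.2f}")
```

Run it and the numbers come out exactly as the pink slip says: $77.25 billed, $66.50 covered, $10.75 owed.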
I realize you have to factor in inflation, but using the Inflation Calculator provided by the Federal government's Bureau of Labor Statistics (no pun intended) tells me that $77.25 translates to $628.77 in 2009. I think hospital maternity charges today are a whole lot higher than that.
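The BLS calculator is, at bottom, a Consumer Price Index ratio: multiply the old amount by the new year's CPI over the old year's CPI. A rough sketch, using approximate annual-average CPI-U values that are my own assumption (the article only supplies the $77.25 and the $628.77 result):

```python
# Approximate annual-average CPI-U values (assumed for illustration;
# the BLS calculator uses the official series)
CPI_1952 = 26.5
CPI_2009 = 214.5

bill_1952 = 77.25
bill_2009 = bill_1952 * CPI_2009 / CPI_1952

print(f"${bill_1952:.2f} in 1952 is roughly ${bill_2009:.2f} in 2009 dollars")
```

With these approximate index values the answer lands in the low $600s, in line with the calculator's $628.77.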
So why have health care costs risen so much more than the general rate of inflation? Technology? Labor? Malpractice Insurance? Maybe it's paperwork. A New England Journal of Medicine study showed that U.S. health care paperwork cost almost $300 billion in 1999, and a Harvard/Public Citizen report noted that the U.S. health care bureaucracy in 2003 cost nearly $400 billion! What's it cost today -- half a trillion?
I wouldn't want to go back to the medical technology, salaries or other circumstances of the health care industry of 1952, but if we went back to the paperwork system of that era, maybe we'd save a little money?
Handwritten at the bottom of my mother's Statement of Account were the words: "Paid 3/26/52" along with a signature. I think my mother kept the receipt as proof of payment to keep somebody from the hospital from showing up at her door demanding the return of the "goods" (me).
I'm officially worth just short of $11 -- or $78 if you take out the reimbursement. To Mom -- and Dad (10 years in heaven this fall) -- thanks for coming up with the cash. I hope I've been worth the investment.
The New York Times recently ran an article about a visit made by Chrysler executives to a Fiat plant in Tychy, Poland. Their purpose: to learn how to build small cars profitably.
It's one more sign of fading American influence in the global marketplace. Where once this country led and taught, now it follows and tries to learn from others with greater knowledge. How did it get this way?
For every industry, there's a different story, but the auto industry seems to have arrived at this juncture through a combination of hubris, bad product management, misreading of consumer expectations and faulty labor policy.
Hubris? "Americans want big American cars," said the industry in the early 1970s, as the first gas crisis sent prices soaring and consumers scurrying to find models that could go farther before pulling into another seemingly endless gas station line. The domestic auto industry disparaged the small-car offerings of overseas makers: "Americans don't want to ride in kiddie cars with their knees in their ears."
Then something happened while those Americans were riding around in those kiddie cars with their knees in their ears. Not only did those cars spend more time on the road and less time in the gas lines, but they spent more time in their owners' driveways than in the repair shop. So when it came time to buy another car, those Americans opted more and more for imports. Better gas mileage, better reliability, better fit-and-finish made for happier owners -- and return customers.
The domestic automakers' share of the American marketplace began shrinking -- in the mid-'60s, it was around 95%. By 1986, a little over 10 years after the first gas crisis, it was about 75%, which held roughly stable until 1995. Since then, the share picked up speed -- downhill. In 2007, it was barely 50%. The domestic automakers responded to their shrinking market share with a series of forgettable models and little appreciable improvement in quality. They still had the "slap more chrome on it and they'll buy it" mindset, seemingly unaware that the U.S. auto consumer was looking for quality and reliability along with stylishness. They were finding it in "foreign" cars.
Now that the millennials are car buyers, the "buy American" marketing strategy, whether explicit or implicit, falls on deaf ears. To my students, "country-of-origin" means little if anything. Honda, Ford, Toyota, Chevy, Chrysler, Subaru -- they're all "cars" -- and besides, Hondas, Toyotas and Subarus are made in the U.S., anyway.
The domestic automakers claim to be burdened by higher labor costs and legacy health care obligations. That's certainly true, and the state of the general economy pushed two of the used-to-be "Big Three" into bankruptcy; but they're taking steps, along with the UAW, to bring those costs more in line with the reality of a global marketplace. Now if they would only do the same with the cars they make.
Did the Chrysler executives and engineers learn anything on their trip to Tychy? I guess we'll see -- down the road.
Note: after buying Ford/Mercury cars for 30 years, I crossed the line in 2007 to a Subaru Outback.