Thursday, August 6, 2009
The cause, Mr. Hoyt said, was inadequate editing of a piece written in haste by a "television critic with a history of errors." Errors can happen, even with a phalanx of editors who should catch them. People make mistakes. Throw in ever-present deadline pressures, poor communication and perhaps even recent newsroom cutbacks throughout the country and the error potential rises dramatically. The Times and the critic, Alessandra Stanley, both acknowledged responsibility for the shortcomings.
Two things trouble me about the incident. The first is that a number of editors (as Mr. Hoyt explains it) either missed the errors or in some instances compounded them. Once again, deadlines were partially blamed, but this type of piece should have been written, fact-checked and edited well before Mr. Cronkite's death. That it was not points to poor news management, which to me ranks equally in dishonor with the errors themselves.
What also bothers me is the seeming disconnect in Mr. Hoyt's description of Ms. Stanley. He initially notes her "history of errors," pointing out that she made so many errors in 2005 that the Times assigned her a single copy editor to check her facts. Then, a few paragraphs later, he says that she is "a prolific writer much admired by editors for the intellectual heft of her coverage of television." No disrespect intended to Ms. Stanley, but I have a difficult time reconciling "intellectual heft" with "history of errors," particularly for a journalist. That's like saying "I admire Dr. Stephen Hawking's intellectual heft in physics, but we have to keep telling him that two plus two is four, not five."
The Times has added an editor in the obituary department and Ms. Stanley is once again assigned special editing help. That's the way it is with journalism in the 21st century.
Note: I have edited and fact-checked the above piece. I hope it is error-free.
Wednesday, August 5, 2009
My mother, who turned 90 this year, always saved everything when it came to official paperwork (like many others of her generation). So I wasn't much surprised when she handed me what was clearly a decades-old piece of paper. "Take a look at this," she said, offering me a faded pink document.
It was the Member's Copy of a Blue Cross Statement of Account. It wasn't until I straightened out the creases and peered more intently that I saw the paper was a breakdown of the hospital charges for my birth -- March 19, 1952 (you do the math).
For Mom's stay in Ward 323, the hospital charged $53 for a seven-day stay -- three days at $7 per and the remaining four at $8 a day. I don't think moms get to stay that long with no complications, do they? Next we add in $10 for the delivery room charges and $7 for my board -- I didn't take up much space, I guess, so I got the $1 a day rate. Throw in $2.25 for drugs (today's paper dispensing cup costs more, I think) and $5 in lab costs and you get the grand total of $77.25 for my entrance into this world.
Now at the time, Blue Cross had a flat-rate maternity benefit in my parents' policy of $9.50 a day for 10 days maximum. That made for a benefit of $66.50 -- leaving my folks to come up with $10.75 if they wanted to take me home.
I realize you have to factor in inflation, but using the Inflation Calculator provided by the Federal government's Bureau of Labor Statistics (no pun intended) tells me that $77.25 translates to $628.77 in 2009. I think hospital maternity charges today are a whole lot higher than that.
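For the curious, the bill's arithmetic can be checked in a few lines. A quick sketch -- the inflation multiplier here is simply the one implied by the post's $77.25 → $628.77 figure from the BLS calculator, not an official constant:

```python
# Recompute the 1952 hospital bill and its 2009 equivalent.

ward = 3 * 7 + 4 * 8       # Mom's 7-day stay: 3 days at $7, 4 days at $8
delivery = 10              # delivery room charge
board = 7                  # baby's board: $1 a day for 7 days
drugs = 2.25
lab = 5
total = ward + delivery + board + drugs + lab       # $77.25

# Blue Cross flat-rate maternity benefit: $9.50/day, 10-day maximum
benefit = min(7, 10) * 9.50                          # $66.50
out_of_pocket = total - benefit                      # $10.75

# Implied 1952 -> 2009 inflation factor from the post's own figures
cpi_multiplier = 628.77 / 77.25                      # roughly 8.14x
total_2009 = total * cpi_multiplier

print(total, out_of_pocket, round(total_2009, 2))
```

The numbers in the post all check out: $77.25 total, $10.75 out of pocket, about $628.77 in 2009 dollars.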
So why have health care costs risen so much more than the general rate of inflation? Technology? Labor? Malpractice Insurance? Maybe it's paperwork. A New England Journal of Medicine study showed that U.S. health care paperwork cost almost $300 billion in 1999, and a Harvard/Public Citizen report noted that the U.S. health care bureaucracy in 2003 cost nearly $400 billion! What's it cost today -- half a trillion?
I wouldn't want to go back to the medical technology, salaries or other circumstances of the health care industry of 1952, but if we went back to the paperwork system of that era, maybe we'd save a little money?
Handwritten at the bottom of my mother's Statement of Account were the words: "Paid 3/26/52" along with a signature. I think my mother kept the receipt as proof of payment to keep somebody from the hospital from showing up at her door demanding the return of the "goods" (me).
I'm officially worth just short of $11 -- or $77.25 if you take out the reimbursement. To Mom -- and Dad (10 years in heaven this fall) -- thanks for coming up with the cash. I hope I've been worth the investment.
It's one more sign of fading American influence in the global marketplace. Where once this country led and taught, now it follows and tries to learn from others with greater knowledge. How did it get this way?
For every industry, there's a different story, but for the auto industry, it would seem they arrived at this juncture through a combination of hubris, bad product management, misunderstanding consumer expectations and faulty labor policy.
Hubris? "Americans want big American cars," said the industry in the early 1970s, as the first gas crisis sent prices soaring and consumers scurrying to find models that could go farther before pulling into another seemingly endless gas station line. The domestic auto industry disparaged the small-car offerings of overseas makers: "Americans don't want to ride in kiddie cars with their knees in their ears."
Then something happened while those Americans were riding around in those kiddie cars with their knees in their ears. Not only did those cars spend more time on the road and less time in the gas lines, but they spent more time in their owner's driveway than in the repair shop. So when it came time to buy another car, those Americans opted more and more for imports. Better gas mileage, better reliability, better fit-and-finish made for happier owners -- and return customers.
The domestic automakers' share of the American marketplace began shrinking -- in the mid-1960s, it was around 95%. By 1986, a little over 10 years after the first gas crisis, it was about 75%, where it held roughly stable until 1995. Since then, the share picked up speed -- downhill. In 2007, it was barely 50%. The domestic automakers responded to their shrinking market share with a series of forgettable models and little appreciable improvement in quality. They still had the "slap more chrome on it and they'll buy it" mindset, seemingly unaware that the U.S. auto consumer was looking for quality and reliability along with stylishness. They were finding it in "foreign" cars.
Now that the millennials are car buyers, the "buy American" marketing strategy, whether explicit or implicit, falls on deaf ears. To my students, "country-of-origin" means little if anything. Honda, Ford, Toyota, Chevy, Chrysler, Subaru -- they're all "cars" -- and besides, Hondas, Toyotas and Subarus are made in the U.S., anyway.
The domestic automakers claim to be burdened by higher labor costs and legacy health care obligations. While that's certainly true, and the state of the general economy pushed two of the used-to-be "Big Three" into bankruptcy, they're taking steps along with the UAW to bring those costs more in line with the reality of a global marketplace. Now if they would only do the same with the cars they make.
Did the Chrysler executives and engineers learn anything on their trip to Tychy? I guess we'll see -- down the road.
Note: after buying Ford/Mercury cars for 30 years, I crossed the line in 2007 to a Subaru Outback.
Sunday, April 5, 2009
Yesterday's New York Times carried a story on what it termed one of "the audacious proposals" in President Obama's budget – a plan to cut farm subsidies that would save nearly $10 billion over a decade. Unfortunately, the plan set off "a huge alarm in the powerful farm lobby." Among those clanging the loudest "No!" were Mr. Obama's fellow Democrats in farm states, such as Sen. Kent Conrad of North Dakota and Rep. John M. Spratt, Jr. of South Carolina.
Mr. Obama's vision of new politics, outlined in The Audacity of Hope, has once again collided with the realities of old politics, where vested interests plow under efforts to weed out government spending programs that most people agree have outgrown the fences. There are certainly elements of merit in Mr. Obama's plan, but even farm subsidy critics agree this was an overreach.
The Times article further noted that in this Congress "farm subsidy limits never got off the ground." I'd prefer another description: the farm lobby planted them so deeply they'll never see the sunlight needed for them to sprout. It just goes to show that everybody wants reform, but don't touch their sacred cow. Or corn. Or beans. Or wheat. Or whatever. It's a variation of NIMBY – everyone says we need more drug treatment facilities, or homeless shelters or power grids – just Not In My Backyard. There will be more such collisions. When it comes to budget reform, everyone agrees, sacrifices are critical, but NAME – Not At My Expense.
Full disclosure: my mother's cousins and the family of an ad industry friend both run dairy farms.
Thursday, March 12, 2009
Daily newspapers, once vital strands in the fabric of American family and community life, have been unraveling, a decline accelerated in the past year by a withered economy.
I'm a journalism graduate and spent a brief time as a reporter for an Eastern Pennsylvania daily before going into advertising and public relations. My late father was a typesetter (the old "hot metal" type) for several papers, including the Stars and Stripes in World War II. So I confess a sentimental attachment to the institution of a daily newspaper. But the Norman Rockwell-styled image of someone sitting down with a newspaper is as faded and yellowed as old newsprint.
What's brought daily newspapers to the brink of oblivion? Some may point to short-sighted management, inflexible labor or other escalating costs of production and distribution (sounds like the auto industry). The biggest factor? The world just changed, and that's been going on a long time.
Even before the rise of the Internet, newspapers were declining. People no longer had the time to spend with their daily companion. First radio (which began its news operations by reading stories from the printed paper) and then television compressed news into ever-shrinking segments. News became just a few morsels the busy person could nibble on while buzzing from task to task. Whatever evening dailies were left by the 1970s quickly became morning papers or folded altogether -- reflecting long commutes, two-income households and other trends.
Then came the digital revolution.
With even more to do and less time to get it done, people turned to the Internet for news. Historically, a newspaper replenished its circulation base as younger persons entered the marketplace. No longer. In my Advertising Copywriting class, I have 18 bright folks who represent tomorrow's leaders. Of the group, I think there are two who read a printed newspaper with any trace of regularity. I'm surprised it's that many.
Newspapers tried to adapt their printed product to the digital age, with less-than-stunning success. Many of them looked uninviting at best. As communications strategist David Henderson observed earlier this year, the Washington Post online "has not changed much in appearance in the last five or six years," further noting that the Post online "cannot figure out what it wants to be." The New York Times experimented with a concept called "Times Select," where they made their news available free but charged for certain content, such as columnists. They eventually scrapped Times Select.
Advertising revenue, the printed paper's lifeblood, has been hard to capture online. Even when the paper rounds up a set of regular advertisers for on-page advertising, people using the Firefox browser teamed with the AdBlock add-on don't see a single ad. The Wall Street Journal operates online via subscription, but there have been discussions on opening up more of its content since its acquisition by News Corp. People expect things on the Internet to be free and a successful online business model for newspapers -- and many other traditional businesses -- has been elusive.
Even an old newspaper fan like me has largely left the ranks of daily print newspaper readers. A quick flip-through is all I can spare for the local paper, with a pause at the obituaries -- a sure sign of aging. Then it's online whenever I have a few moments; although I may do it in little bursts, I may consume more news than before. With the Internet, I can scan the Telegraph from London, sample Haaretz from Jerusalem, listen to the BBC and even catch a story on Al-Jazeera.
So where do printed newspapers go from here? In a video-driven, 24/7 news cycle world, even the most incisive reporting in a printed paper is old by the time it reaches its readers. What's in the paper that you can't find online? Forget national or international news. Pictures? Too static; watch the video online. Help Wanted and other classifieds? Nope. Event calendars? Plenty of Internet sites for those, even at the local level. You can even look up obituaries online. The one exception might be some types of local news -- if you haven't already caught it on your mobile phone or that 20th century relic, television.
It may be time for print newspapers to go; but there's always sorrow for a death in the family.
Tuesday, February 24, 2009
John Stumpf, president and CEO, immediately castigated AP for a misleading news story, claiming the events were not junkets for highly paid executives, but "recognition events" for front-line employees such as tellers, personal bankers, technology specialists, credit analysts and other "team members." The company at first defended its event schedule, but faced with a torrent of criticism from Congress, the blogosphere and the general public, eventually canceled its employee recognition events for the balance of this year.
On February 8, Wells Fargo took out a full-page ad in the New York Times. In the ad, entitled "The Value of Team Member Recognition," Mr. Stumpf took a whiny tone in blaming the media for the event cancellations, saying that for many employees, it is "the only time in their lives that they're publicly recognized and thanked for a job well done." He went on to say that those employees and the hospitality industry workers were the real losers in this media-inspired outcry. He concluded, "Since we aren't thanking our award winners in person this year, we'll have to do it this way" -- that is, through the ad.
Subsequently, Times columnist Maureen Dowd characterized the Wells Fargo effort as an "inadvertently hilarious full-page ad...to whinge (sic) about the junkets to Las Vegas and elsewhere it was forced to cancel because of public outrage." As to Mr. Stumpf's claim that employee recognition events "energized employees," Ms. Dowd responded, "In this economy, simply having a job should energize them." She speculated that the ad, which may have cost the bank $200,000, might serve as a partial bailout for the newspaper industry.
Two thoughts come to mind. First, Mr. Stumpf should realize it's the perception of these events stacked next to the bailout money that counts. Second, why take out an ad that may have cost $200,000 ostensibly to thank employees? Couldn't Wells Fargo have been more creative? Maybe staging a video or even a virtual event, making use of some 21st Century technology? OK, it isn't Vegas, but with new means of connecting with people, you can do some neat things. The respected communications strategist David Henderson observed, "For any organization to buy a full-page in the NY Times reveals a lack of how people communicate in today's world." I agree. Think a little harder, guys -- maybe even some old-fashioned personal communications as well?
I believe bank (and other industry) employees should be recognized, particularly those that don't get the big bonuses. But this year, just having that job may have to do.
(Full disclosures: 1. My wife worked as a front-line bank employee for 14 years in retail credit (not at Wells Fargo). 2. I've worked with clients in the financial services industry for 30 years. 3. My home mortgage is with Wells Fargo, having originated at PNC and passing through WAMU on its way there. Hope they don't call the loan.)
Thanks to David Henderson for his insight. You'll find him at www.davidhenderson.com.
Saturday, February 21, 2009
I know I'm a lonely voice when it comes to pointing out incorrect word usage in the media, and most people probably find it tiring. Proper use of the English language is rapidly becoming a "who cares" issue, but I will soldier on. The latest example comes from the halls of power in Washington, D.C. -- specifically, from White House Press Secretary Robert Gibbs. When asked to comment on the controversy surrounding Illinois Senator Roland Burris, Gibbs replied, in part, that Senator Burris should "take some time this weekend to either correct what has been said and certainly think of what lays in his future." (Quoted from the official White House transcript of the February 20, 2009 Press Briefing; emphasis mine.)
If Mr. Gibbs – generally acknowledged as a skilled professional – wasn't lying down on the job on this one, he'd know that he should have said "what lies in his future." Perhaps the news organizations that quoted him, including the Los Angeles Times and the Baltimore Sun, might have used the (sic) convention to denote that they weren't responsible for the mistake. (I won't mention that, in addition, Gibbs should not have used "either" with "and" in that phrase – proper usage would have been "both." Oops, I mentioned it.)
President Obama is greatly concerned about education; maybe he should start with his own Press Secretary's office.
Friday, February 20, 2009
The first was of my late Uncle Mike, who died on his 92nd birthday in February 2006. Mike was a Pontiac man -- one of those many brand-loyal customers that the domestic auto makers had for so long. I have an old photograph of him posed jauntily with one foot on the front bumper of his Pontiac. The photo's undated, but it appears to be an early 1950s vintage. He would never consider another car make; his last car was a maroon 1992 Bonneville.
One of the biggest regrets of his last years was that several strokes had rendered him unable to drive. He still had the mental wherewithal for it; his mind was sharp (and highly opinionated) virtually up to the day he died.
I found in Mike's effects a receipt for a new 1950 Pontiac coupe from a local dealer. Complete with undercoating, it cost just under $2,000. (According to the Bureau of Labor Statistics, that amount translates to about $17,500 in 2009 dollars.)
He favored the Bonneville, a wide-track "heavy" design. In a brochure for its 1970 models (also among Mike's possessions), Pontiac claimed that its decision to name the car "after a gruesome stretch of salt" came from the "brand-new, 455-cubic-inch, 360-hp V-8." Mike did a lot of driving to work on construction sites and he wanted power and comfort; it would be just a few years before gas mileage would become a concern. He had a Pontiac Parisienne at one time. The Parisienne was the model name of the Bonneville in Canada, but was also sold in the U.S. for a short time in the 1980s. I thought that the Parisienne name was a bit too elegant for Mike's tough-guy pipefitter image; he went back to the Salt Car.
Pontiac also built its fame on "muscle cars," like the GTO, which brought back a more personal memory. It's one of the two cars I've been in (as a passenger) that was going 100 miles an hour. The other was a Road Runner. Gee, I'm glad I'm still here!
My final impression of Pontiac is emblematic of the brand's downfall. A few years ago, I was waiting at a stoplight near a local Pontiac dealership when I glanced over at the lot. I saw what I first thought was one of those "gag" cars -- it reminded me of Chevy Chase's Family Truckster wagon in Vacation -- right down to the pea-green color. It wasn't until I saw another of these grotesque creations on the street that I realized it was a Pontiac Aztek -- an ill-fated SUV-crossover-whatever that earned a place on many "ugliest car" lists. It had the honor, according to the Times article, of earning the top spot in an ugly car listing by the Daily Telegraph of Britain. It's a long way from Pontiac's 1980s marketing slogan: We Build Excitement.
So Pontiac fades to insignificance. I can hear Uncle Mike sighing.
Thursday, February 12, 2009
Much as I'd like to "common on," it's another case of sloppy editing that relied on a spell checker program's substitution because it didn't recognize the colloquialism "c'mon." Doesn't anyone read the final copy before posting? I suppose not -- or worse, someone read it and found nothing wrong.
I emailed Reader Editor Brent Jones, listed as the contact to report "corrections and clarifications." So far, no response from Mr. Jones. C'mon, Brent, admit it; it's wrong.
Friday, February 6, 2009
Here's a quick follow up to my post of yesterday about incorrect word usage. In an email yesterday to supporters, Family Research Council President Tony Perkins said, "On the campaign trail, Obama insisted that groups who seek government grants couldn't disqualify an applicant for a social service role if their beliefs are incompatible with the organization's tenants" (emphasis mine). The message might resonate a bit more if he said "tenets," which Merriam-Webster says is "a principle, belief, or doctrine generally held to be true; especially: one held in common by members of an organization, movement, or profession."
Then again, maybe he didn't want to take the chance on getting "principle" right. As Casey Stengel once said, referring to the 1962 New York Mets, the ultimate sports metaphor for futility until this season's Detroit Lions, "Doesn't anybody here know how to play this game?"
Casey, you may be right.
Update: In reviewing today's news, I came across a MediaPost Online Media Daily story headlined "Brick-And-Mortar Retailers Loosing Search Battle." I guess they're implying that traditional retailers aren't tight with their customers, rather than losing ground by not paying attention to search marketing.
Thursday, February 5, 2009
I know correct spelling and usage is a long-lost art, but it's getting out of hand.
Last week, I read a post on MediaPost’s Raw blog on trends in social media that concluded with the observation that “all media is becoming social, which seems to demonstrate the threat and opportunity to well healed media firms.” I’m hoping those media firms aren’t too sick to realize that it should be “well-heeled,” as in prosperous.
Today, I came across an item on the Public Relations Society of America (PRSA) web site on the ethics of Twitter usage. I was only too happy to learn that when certain White House reporters tweet during briefings, the result can be that “…by the time the rest of the members of the press core file their stories, the news is already dated.” Perhaps the core of the White House press corps might want to know that. I hereby request that a larger portion of my PRSA dues be devoted to more careful web editing.
At the top of my list, however, is the evaluation copy of the Thomson/Wadsworth textbook “Creative Strategy in Advertising,” 9th edition I received last year. The back cover blurb began “Focusing on the fundamental principal that good advertising always starts with an understanding of people…” My evaluation would be that a fundamental principle of good textbook writing is proper usage and the editors should be sent to the principal. And we expect our students to get it right?
I know, I know. It’s a world of spontaneous communication, powered by tweets and status updates. (Forget emails; that’s so 20th Century.) There’s no time to be right; just blurt it out. I may be disorganized (just ask my wife), but I still think there’s some value in getting your words right.
Back to work; I’m not well-heeled enough to ignore the principle that he who does not work does not eat. Just ask the press corps.
P.S. To their credit, the web editors at PRSA corrected the “press core” reference within minutes of my comment. Hats off!