
Think Pieces

Michael's thoughts on the auto industry, its products, and/or this website.

Seven Serious Problems with Consumer Reports

-- and how TrueDelta will avoid them

For decades, Consumer Reports has been the best source of vehicle reliability information. But even the best is not good enough. In at least seven ways, Consumer Reports' data collection methods or modes of presentation mislead or underinform consumers.

In each case, TrueDelta is doing things very differently.

1. "Serious problems"

Consumer Reports' ratings are based on the number of "serious problems" reported by its members. They never precisely define the term. Instead, Consumer Reports' survey form leaves it up to each respondent to decide which problems are serious enough to report--an invitation to bias.

In contrast, TrueDelta will report measures like "times in the shop" and "days in the shop." These mean what they seem to mean. If a vehicle is in the shop for something other than routine maintenance or an excluded wear item (listed on TrueDelta's survey form), that's serious enough.

2. Relative ratings

Consumer Reports rates each model relative to the average vehicle. As a result, the absolute number of problems a vehicle will experience remains unclear. Does an "above average" vehicle "never break"? Is a "below average" vehicle "always in the shop"?

In the absence of hard numbers, people tend to assume that the best vehicles are better than they are and that the worst vehicles are worse than they are. I once had a vigorous discussion with the owner of a Japanese SUV. As proof of his vehicle's superior reliability, he noted that its brand had been the highest-rated in Consumer Reports' 2005 auto issue. His brand's nearly new cars had had eight "serious problems" per hundred vehicles. While this was less than half the eighteen problems per hundred for nearly new domestic-brand vehicles, the absolute difference was just one-tenth of a serious problem per car.

This did not--and does not--strike me as anything to get wound up over. The real problem: few people, when glancing through the magazine, think about the absolute numbers behind the relative ratings.
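
The arithmetic above can be checked in a few lines. This sketch uses only the per-hundred figures quoted from the 2005 auto issue:

```python
# Serious problems per hundred nearly new vehicles,
# as reported in Consumer Reports' 2005 auto issue
japanese_brand = 8
domestic_brand = 18

# The absolute difference for an individual car
diff_per_car = (domestic_brand - japanese_brand) / 100
print(diff_per_car)  # 0.1 serious problems per car
```

A gap that looks dramatic in relative terms ("less than half the problems!") works out to one extra problem per ten cars.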

In contrast, TrueDelta will clearly report absolute ratings.

3. Ranges

Consumer Reports rates models on a five-point scale from "much worse than average" to "much better than average" using their well-known red and black "blobs." In 2005 (when I first wrote this), more than half of domestic models earned an "average" rating, while many Hondas and Toyotas earned a "better than average" rating. (With the average itself improving, "much better than average" ratings have become rarer.)

"Average" means within twenty percent of the average, so 80 to 120 on an index with 100 being average. "Better than average" ranges from 121 to 145. So if one vehicle is "average" and another is "better than average," the difference between them can range anywhere from a single point--totally insignificant--to 65 points--very significant. The red and black blobs appear simple to understand, but they conceal far more than they convey. As a result, many readers of the magazine understand far less than they think they do.
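
A quick sketch makes the point concrete. The band boundaries are the ones quoted above; the function itself is hypothetical, not Consumer Reports' actual method:

```python
def rating(index):
    """Map a problem index (100 = average) to a CR-style rating band,
    using the band boundaries quoted above."""
    if 80 <= index <= 120:
        return "average"
    if 121 <= index <= 145:
        return "better than average"
    return "other"

# Two cars a single point apart fall into different bands...
print(rating(120), "|", rating(121))
# ...while two cars a full 65 points apart get the very same pair of bands
print(rating(80), "|", rating(145))
```

Both pairs look identical on the page--one blob apart--even though one gap is trivial and the other is large.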

In contrast, TrueDelta will clearly report the absolute differences between vehicles. For example, analysis of the data might find that one vehicle over the first five years of ownership will take 2.3 extra trips to the shop, for a total of 3.6 extra days.

4. Only averages

Vehicle reliability has been steadily improving. Even the average eight-year-old domestic brand model was reported (on page 17 of the 2005 auto issue) to have fewer than one-and-a-half "serious problems" per year. Yet many people would avoid such a car because they fear it will have "lots of problems."

While perceptions are distorted by Consumer Reports' emphasis on relative ratings, another factor is involved: people are afraid of getting an unusually troublesome vehicle. Even if the average is the same for two models, the chances of getting a lemon could be far higher for one than the other. People might fear that even as the average rate of problems for domestic vehicles comes down, the odds of getting a lemon remain uncomfortably high.

Based on Consumer Reports' reported results, there's no way to know one way or the other, as they only report averages. To my knowledge, they have never discussed the odds of getting an unusually good or bad example of a particular model.
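
A simple simulation shows why averages alone aren't enough. The two models below are entirely hypothetical--the point is only that identical averages can hide very different lemon odds:

```python
import random

random.seed(0)

def lemon_share(simulate_car, trials=100_000, threshold=5):
    """Share of simulated cars with at least `threshold` problems."""
    return sum(simulate_car() >= threshold for _ in range(trials)) / trials

# Model A: every car faces the same small risks (mean: 1 problem per car)
def model_a():
    return sum(random.random() < 0.1 for _ in range(10))

# Model B: the same mean of 1 problem per car,
# but one car in ten is a bad build with many likely problems
def model_b():
    if random.random() < 0.1:
        return sum(random.random() < 0.5 for _ in range(20))
    return 0

print(lemon_share(model_a))  # well under 1% of cars are "lemons"
print(lemon_share(model_b))  # roughly 10% of cars are "lemons"
```

Both models average one problem per car, so a ratings chart built on averages would score them identically--yet your odds of a bad one differ enormously.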

In contrast, TrueDelta will report the odds of getting a lemon and the odds of getting a perfect car (in addition to reporting the average number of trips to the shop and days in the shop).

5. Survey (in)frequency

Consumer Reports sends out an annual survey asking people to report problems that occurred during the entire previous year. This is too long a period to expect people to accurately remember what happened.

In contrast, TrueDelta sends a monthly email asking people to report trips to the shop that occurred the previous month. In most cases participants will still only have to fill out one or two brief surveys a year, so the effort will be the same or less. But respondents' recall will be much more accurate.

6. Stale information

Consumer Reports mails out surveys each spring, then first reports the results the following November. As a result, when a new vehicle is introduced in the fall its reliability isn't reported until over a year later. This is a long time to wait for someone interested in a hot new design; by the time its reliability is known it will no longer be hot.

In a related issue, the vehicles reported on aren't as old as Consumer Reports suggests. For example, while "three-year-old vehicles" are, on average, three years old at the time the auto issue appears, they were only about two years old when the problems were reported, and only about one year old at the beginning of the period being reported upon.

In contrast, TrueDelta updates its stats quarterly, and will first report reliability as soon as four months after a new vehicle reaches dealers. On average, TrueDelta's results are over ten months "fresher" than those of Consumer Reports.

7. Fossilization

The last serious problem at least partially explains the others: Consumer Reports, once an innovator, has ceased to innovate. They have been asking questions and reporting results much the same way for decades. No surprise, really, as they've had no serious competition and have been subjected to very little outside evaluation.

Want better vehicle reliability information? Participate in TrueDelta's research and help make it happen.

Thanks for reading.

Michael Karesh, TrueDelta

First posted: September 5, 2005
Last updated: February 24, 2007