You saw it in the papers, or you heard it on the news: a new list of America's fattest and fittest cities says Chicago is America's fattest. Lots of surprises on the list, too. Quite provocative, and I've already heard talk radio hosts discussing it.
But none of them ask a simple question: how was the survey compiled?
(Chicago) catapulted to the highest spot in 2006 because it has the survey's worst workout environment, said Men's Fitness editor Neal Boulton. The ranking "is not really based upon what a person does at the dinner table," Boulton said. "It's not about going around counting the number of overweight people. It's about a call to arms to mayors and governors to provide public health initiatives that will stem the tide of obesity."
In other words, they kinda made it up. Oh, there are some statistics, but they threw in some whoppers that let them use the "survey" to punish politically incorrect city governments.
Now, there are things in the rankings that should raise red flags among the nation's newspaper editors and TV and radio news assignment editors. First, the moves of certain cities (L.A. from 21st to 3rd fattest, Long Beach from 20th to 7th, Philadelphia from 2nd all the way to 23rd) in a single year ought to raise suspicion, because there's no way, on any objective basis, that swings that wild would be possible.
And a look at the methodology ought to raise more red flags. It starts out in a vaguely promising manner:
The 50 largest U.S. cities were selected using the most recent United States Census Bureau statistics available at the time of the survey, which was conducted from August 2005 through October 2005. Cities were assessed in 17 weighted categories, using data specific to each city, except when data was available only for a metropolitan statistical area or for a state. (When no data was available, a neutral score was assigned.)
OK, where did the data come from? Let's look:
Gyms/Sporting Goods: Composite score, equally weighing (a) total number of clubs, gyms and fitness studios ranked per 100,000 population, from YellowPages.com; and (b) total number of sporting-goods retailers ranked per 100,000 population, from YellowPages.com.
Since when is the number of sporting goods stores in a market an indicator of health? Was the Northeast suddenly less healthy when the Herman's World of Sporting Goods chain went belly-up several years ago? Did L.A. get unhealthy when Oshman's started to disappear, and did its health come back when Sportmart entered the field? Besides, anyone else notice how inaccurate the Yellow Pages listings are? My local SBC and Verizon books seem to be missing some major retail operations in every category.
Nutrition: Composite score, equally weighing (a) average frequency of fruit and vegetable consumption (percent of people who consume five or more servings per day) in state-level data from the Centers for Disease Control and Prevention's Behavioral Risk Factor Surveillance System; and (b) total number of health-food stores ranked per 100,000 population, from YellowPages.com.
Again with the retail! We have a new Whole Foods seemingly in every town; does that make us healthier or trendier? (And how, exactly, does the CDC survey measure who eats five servings a day?)
And so on, through junk food (again, number of outlets per 100,000), alcohol (number of taverns per 100,000), TV watching (wouldn't this be pretty much the same nationwide? Doesn't, say, "Desperate Housewives" pull the same basic share in most markets?), climate (wouldn't this be irrelevant if people can WORK OUT INDOORS?), and on and on, until you hit paydirt: the real way they ranked the cities:
Mayor & City Leadership (new): Composite score, weighing (a) mayoral participation in, or promotion of, public fitness events; (b) position reporting to mayor responsible for antiobesity programs or citywide fitness initiatives (sometimes called "fitness czar"); (c) current citywide antiobesity or fitness initiatives; (d) mayor's personal example and exercise habits.
So a city that wastes money on a public relations campaign and a patronage job for some friend-of-the-mayor is healthier than one that spends it on education or police. Oh, and if the mayor's out of shape, that's another demerit. (By that token, L.A. has a relatively fit mayor and Arnold in the State House; that ought to make it a winner.) And:
Obesity-Related Legislation (new): Points were awarded for (a) state "snack taxes"; (b) state-based nutrition and physical-activity programs; and (c) participation in the federal Steps to a Healthier U.S. program. Points were subtracted for states that have laws limiting liability for purveyors of junk food. Data reported by the Trust for America's Health.
If your city is passing punitive social-engineering legislation financially punishing you for having the occasional potato-chip craving, it's fitter. If your state quite reasonably has limited liability for snack food makers with the understanding that IF YOU EAT JUNK FOOD, YOU KNOW THE RISKS AND YOU SHOULDN'T BE ABLE TO BLAME SOMEONE ELSE FOR YOUR SPARE TIRE, your city is not fit.
So, basically, this is advocacy disguised as "research." And the editors eat it up, because, well, what's the harm? And there isn't much harm, unless you think that the news ought to be a little more, you know, factual.
But this kind of thing goes on all the time: put out a list, and you'll be quoted in the news as if you're Gallup or Nielsen. That "Boring Institute" guy was able to con every newspaper and wire service into reporting his annual "most boring celebrity" list as if there were really an institute of scholars researching what's boring. (He's a PR guy, naturally.) Mr. Blackwell made himself into a national celebrity without ever quite having to do much other than release a list every year. And then there's that "other" talk radio trade magazine that annually puts out a list of what it claims are the 100 most influential talk radio hosts in America, only you've never heard of half of them and several seem to be, well, advertisers. And it's cited uncritically in bios and articles as a major honor, and the editor/publisher is often quoted as an expert in talk radio. The basis for the list? None. It's really one guy's opinion, no voting involved. Totally subjective, and he admits it; the only qualification is that you have to, er, have a show. And that's how people with literally no ratings, no following, and no influence make the list (while some who DO have ratings and influence are missing).
And that's my mistake: I really AM an expert on talk radio, and I don't get quoted much because I don't have a list. I should make a list. One press release and I'll be in every paper in America. We should all make lists. It really IS that easy.