In today’s New York Times there’s an article entitled “Who Says Maths Has To Be Boring?” that contains an interesting reference to the US National Assessment of Adult Literacy:
“Only 18 percent of American adults can calculate how much a carpet will cost if they know the size of the room and the per-yard price of the carpet, according to a federal survey.”
That so many people get this wrong is frightening, and the site has lots of other fascinating examples of people struggling with charts, bus timetables, telephone books, sports, etc.
However, I wasn’t very surprised.
I believe most people implementing analytic systems underestimate the extent to which many people are not equipped to make the best use of data, no matter how well it’s delivered.
As part of my job I often give talks on how important it is to foster effective information cultures in organizations – not just provide more technology. To ensure analytics projects are successful, it’s important to ensure all employees have the skills they need to understand and use information effectively. Human analysis and decision-making is the most important “technology” in the analytics architecture. Or, as a quip often attributed to Picasso puts it: “Computers are useless – they can only give you answers.”
Giving people the latest-and-greatest-technology business intelligence system and suddenly expecting them to achieve great business insights is like giving somebody the easiest-to-use, most sophisticated pencil ever and expecting them to turn into Picasso.
Data Quality Problems of the NAAL
One fun thing that came out of my research: at one point, I found a question that only 14% of adults answered correctly.
But as I tried to answer it myself, I found myself struggling, too! Here’s the question:
Refer to the article “Butcher captures Iditarod Sled Dog Race” on page two of the newspaper to answer the following question. Calculate how much less time in hours, minutes, and seconds Susan Butcher took to complete the race this year than in 1986.
This image isn’t really legible, but the site makes it clear that the people taking the survey had a proper newspaper article to read – and it provides a transcript of the handout:
Butcher captures Iditarod Sled Dog Race
Greeted by sirens, banners and shouts, Susan Butcher stormed into Nome, Alaska yesterday in subzero weather and record time to rack up her fourth victory in the past five years in the 1,158-mile Iditarod Trail Sled Dog Race.
Butcher was running about two hours ahead of defending champion Joe Runyan in a reversal of roles from the 1989 finish. Runyan won that race with a 65-minute cushion over Butcher.
Butcher, who had to drop 1 of her veteran dogs, credited unknowns among her team the win.
“This team has been absolutely incredible,” she said “I never had a team go as strong as this.”
Her official time in the marathon from Anchorage to Nome was 11 days, 1 hour, 53 minutes and 23 seconds. The previous record set by Butcher in 1989 was 11 days, 15 hours, 6 minutes and 2 seconds.
The correct answer would be “we don’t know” because the article doesn’t have any information about 1986, only 1989.
I first wondered if the question had a typo, but a search for the original source of the content revealed that the typo is actually in the article transcript — the previous record by Butcher was indeed set in 1986, not 1989. Thus the correct answer is:
Correct Answer: 13 hours, 12 minutes, 39 seconds (11 days, 15 hours, 6 minutes, 2 seconds – 11 days, 1 hour, 53 minutes, 23 seconds)
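The subtraction is easy to get wrong by hand because of the borrowing across seconds, minutes, and hours. As a quick sanity check (not part of the survey materials, of course), here’s a short Python sketch using the standard library’s timedelta; the variable names are my own, and the figures come straight from the article transcript:

```python
from datetime import timedelta

# Butcher's official times, as given in the article transcript
time_this_year = timedelta(days=11, hours=1, minutes=53, seconds=23)
record_1986 = timedelta(days=11, hours=15, minutes=6, seconds=2)

# How much faster was she this year?
diff = record_1986 - time_this_year

# Break the difference down into hours, minutes, seconds
hours, remainder = divmod(int(diff.total_seconds()), 3600)
minutes, seconds = divmod(remainder, 60)
print(f"{hours} hours, {minutes} minutes, {seconds} seconds")
# → 13 hours, 12 minutes, 39 seconds
```

Letting timedelta do the borrowing confirms the official answer of 13 hours, 12 minutes, 39 seconds.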
Another part of the survey uses the same materials but asks a slightly different question:
Calculate about how many times per day Susan Butcher traveled during this year’s race.
The answer is given as:
Correct Answer: Any of the following: 1) 105 or 106 miles (1158/11); 2) 100 miles (1100/11); 3) 109 miles (1200/11)
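The accepted answers are all rough divisions of the race distance by the race duration. A quick sketch of the same arithmetic (again, my own illustration, using the figures from the article):

```python
# Race distance and approximate duration, from the article transcript
total_miles = 1158
days = 11  # her time was just over 11 days

# Exact figures give about 105 miles per day; the survey also
# accepted rounder estimates such as 1100/11 = 100 or 1200/11 = 109
miles_per_day = total_miles / days
print(round(miles_per_day))
# → 105
```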
Again, it’s clear that there’s a bad typo, and the question was supposed to ask how many miles she traveled.
Isn’t there a wonderful irony in the official National Assessment of Adult Literacy survey including such obvious errors? (The flawed materials appear several different times on the site.)
And these results were reported in 2010 and are still uncorrected; what does that say about “web literacy” in the digital age?!