Project Literacy

When Emergency Alerts Are Written at a College Reading Level, Who Gets Left Behind?

by Ashley Hennefer

October 7, 2015
Warnings about Joaquin were not written in the plainest language. Tweet via Twitter user Lisa Eastcoast.

On October 1, New Yorkers with smartphones were jolted by a collective ping, accompanied by a text message from the National Weather Service warning that the high winds of Hurricane Joaquin were imminent. Though Joaquin has since veered off toward Bermuda as a tropical storm, high surf and historic levels of tidal flooding have devastated the Carolinas, with more weather alerts popping up daily.

Though every community faces a different natural threat—wildfires, tornadoes, earthquakes, hurricanes, and more—the goal of all emergency managers (those who coordinate responses to disasters) is to reach and prepare as many people as possible, as quickly and efficiently as they can, since there’s never much time to act when a natural disaster strikes.

The aftermath of the flooding in North Charleston, South Carolina caused by over 15 inches of rainfall resulting from Hurricane Joaquin. Photo by Ryan Johnson, courtesy Flickr user North Charleston (cc).

Not so long ago, emergency managers could rely on audible warnings—the kind that break into broadcast news on radio and television, alerting the public about imminent danger. But people just aren’t tuning into traditional media the way they used to. So to do their jobs, emergency managers must rely on technology like text messaging, emails, website posts, and social media to reach members of their community during a crisis.

Overall, such options seem to have improved safety. Anyone can enroll in automated alert systems, where they’ll be notified via text or email if an environmental threat is imminent. And some smartphones are automatically enrolled in such systems. Yet such strategies often make major assumptions about the literacy levels of the communities they serve—which means that the people receiving these messages may not be able to take action to get to safety.

Thomas Phelan, researcher and professor of communication at Hamilton College, recently noticed a literacy gap between the abundance of text-based emergency notifications and the people for whom the messages were intended. Phelan’s research focuses on emergency management and risk communication, and he writes frequently on the topic. He also teaches emergency managers at FEMA’s Emergency Management Institute.

“[I’m concerned with] the gap in readability levels of the intended warnings, so that emergency managers can respond and prepare, and save lives and property,” says Phelan.

Although emergency management strategy is largely text-based, Phelan says that literacy research in this discipline is very new, and there aren’t many researchers looking critically at the way populations are being informed about emergencies. It’s also a multifaceted area to study; Phelan cites researchers like J.M. Novak and P. Biskup, who found a similar discrepancy in the way the public was being notified about food-borne illnesses.

Screenshot via Massachusetts Alerts, a free public safety alerting app.

The literacy Phelan refers to is threefold: readability, numeracy, and computer-based problem solving. One of his first studies on this particular issue focused on readability.

“I started with a review of 40 websites of emergency management agencies, and I did a readability level on the entry paragraphs of that website, because that’s what I figured [people] would see first,” he says. “If they couldn’t read the first page, they wouldn’t go past that.”

Phelan uses the Flesch-Kincaid readability test, which evaluates the reading level of a passage. The test consists of two parts: the Flesch Reading Ease score, which measures how easy a passage is to read, and the Flesch-Kincaid Grade Level, which estimates the school grade needed to understand it. Each is calculated with a formula; for the reading-ease score, the higher the number, the more readable the passage. A score in the 90-100 range, for instance, indicates text a sixth grader could read.

The test is built into software like Microsoft Office. Using it, Phelan discovered that most messages sent by emergency managers were written at a college reading level. But there are ways to improve readability, one of which is to focus on word choice. Phelan says this can mean the difference between “evacuate” and “leave.”
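
To make the scoring concrete, here is a minimal Python sketch of the two formulas. This is not Phelan’s own tool: the sample alert texts are hypothetical, and the vowel-group syllable counter is a rough heuristic (commercial software counts syllables more carefully).

import re

def count_syllables(word):
    # Rough heuristic: strip a trailing silent 'e', then count
    # groups of consecutive vowels. Real tools use dictionaries.
    word = word.lower()
    if word.endswith("e") and len(word) > 2:
        word = word[:-1]
    return max(1, len(re.findall(r"[aeiouy]+", word)))

def readability(text):
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    n_words = max(1, len(words))
    syllables = sum(count_syllables(w) for w in words)
    words_per_sentence = n_words / sentences
    syllables_per_word = syllables / n_words
    # Flesch Reading Ease: higher means easier to read.
    ease = 206.835 - 1.015 * words_per_sentence - 84.6 * syllables_per_word
    # Flesch-Kincaid Grade Level: approximate U.S. school grade required.
    grade = 0.39 * words_per_sentence + 11.8 * syllables_per_word - 15.59
    return ease, grade

# Two hypothetical alerts that say roughly the same thing:
for alert in ("Evacuate the designated inundation zone immediately.",
              "Leave the flood area now."):
    ease, grade = readability(alert)
    print(f"{alert!r}: ease={ease:.0f}, grade={grade:.1f}")

Because both formulas hinge on sentence length and syllables per word, the shorter, plainer alert scores dramatically easier to read, which is exactly the kind of difference Phelan points to when he contrasts “evacuate” with “leave.”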

Numeracy plays a role, too, since numbers are frequently used to quantify the severity of weather warnings. The National Hurricane Center, for instance, visualizes storm patterns and wind speed, but that data is only meaningful to someone who can interpret the numbers displayed on the map, where varying degrees of hurricane threat are indicated by numerical ranges.

Interpreting a storm map also bridges numeracy and computer-based problem solving. Computer literacy requires that a person have the ability and know-how to access the emergency response website for their county, and it extends to how people use their mobile devices. Phelan says that swiping away alerts is a go-to response for people who don’t understand the severity of a disaster, and not knowing how to take a screenshot means that the vital information they’re receiving is lost.

One option might be to teach crisis communication starting in elementary school. Another is to change the way emergency managers write for a broader audience. Phelan says that many emergency managers assume their written materials are accessible; to show them otherwise, he calculates the readability of passages from their district websites during his teaching sessions.

But “there are great complexities” in this type of research, says Phelan, since every community varies. His next project, for which he hopes to collaborate with other researchers, is to randomly select groups of jurisdictions across the country and measure the literacy levels of the intended audience.

Phelan is adamant that making emergency response messaging clearer is not about “dumbing it down,” which he says is a common criticism of efforts to make information more accessible. It’s also important to consider the needs of people who speak English as a second language. Accessibility aids literacy, says Phelan. “When you lower a reading level so people can read it, you’re increasing their learning.”

We think words mean power, and so should you. Through Project Literacy, GOOD and Pearson are building partnerships for a more literate future. Follow the #ProjectLiteracy hashtag and visit good.is or projectliteracy.com to tell us your stories, help us ask the right questions, and take action in your community.
