Most kids with a bad grade on their report card are happy to avoid notice. But when rivers in the Chesapeake Bay region fail to make the grade, they often become front-page news, and river advocates couldn't be happier.

Ecological report cards are not new to the region. The Chesapeake Bay Foundation began issuing an annual "State of the Bay" report in 1998, and began including letter grades in 2004. The University of Maryland Center for Environmental Science and the U.S. National Oceanic and Atmospheric Administration's Chesapeake Bay Office started issuing an annual report card for the Bay in 2006.

Both reap media attention when released.

Increasingly, watershed organizations are discovering that report cards and score cards are a powerful tool for snagging media attention and public awareness at the local level as well.

"Depending on how they are produced, they can be very effective mechanisms for getting the public interested and involved," said Joe Rieger, watershed restoration manager for the Elizabeth River Project.

The Elizabeth River Project, James River Association, and Lynnhaven River Now have issued report-card style evaluations for waterways in Virginia. The Magothy River Association, Chester River Association, Patuxent Riverkeeper, South River Federation, and Potomac Conservancy have done the same in Maryland.

Some rate their rivers with letter grades, others with numbers. The Elizabeth River Project uses labels such as "degraded," "marginal," and "good."

The simple format is a departure from detailed status reports that are often too technical for the general public. Grades and scores make good sound bites. They also send a clear message about the overall state of the river.

In 2007, the Potomac Conservancy gave the river a D-plus, based mostly on poor land use practices in the watershed.

Communications director Anne Sundermann said the report drew good press coverage, including several editorials. "It really brought the issues to the forefront," she said. "Now people say to me all the time, 'You gave the river a D-plus.' It's a shame the grade was so low, but it really sticks in people's heads."

The South River Federation published its second scorecard in 2008, rating different aspects of the river on a scale of one to 10, and presented its findings to the Anne Arundel County Council.

"We found a tremendous amount of interest in the scorecard results," said executive director Matt Berres. "It brings lots of attention to the condition of the river and creates an opportunity to rally public support."

The first Patuxent River report card, released earlier this year, gave the river a D-minus, a grade that helped to recruit volunteers for a new citizen monitoring program.

But while watershed groups are enthusiastic about report cards, some are also wary of the pitfalls.

Patuxent Riverkeeper Fred Tutman is among those who like the report card format but worry about the quality of information it conveys. "A report card can be a trite way of putting things, but people respond to it. They want to know how the river measures up," he said. "On the other hand, we are concerned that it can be overly trite, and not give enough information to make better decisions."

The same simplicity that captures public attention can mask variations in the river. The upper reaches of the Patuxent, for example, are in better shape than the lower section. The Patuxent report card points this out by grading both regions individually, with a C-minus and F, respectively. But these details can be overshadowed by public interest in the river's overall grade.

An improvement in any given year, which might be driven by factors such as weather, can also distract from a long-term downward trend.

"The overall trend is down, whether or not you slap a B, C or D on it," Tutman said. "That's far more distressing than what a report card can tell us in any one year."

Report card data often come from several sources, including citizen monitoring programs, government agencies and universities. A team of professionals analyzes the data and draws conclusions. That's the tricky part.

The challenge lies in condensing dense scientific data into a grade, score or descriptive phrase. What makes nutrients earn a score of seven? When do bacteria levels slip from "good" to "marginal"? Should letter grades have pluses and minuses? And what does a C mean to the general public: relief or alarm?

"A C or D really doesn't tell people enough," Tutman said. "Maybe we should tell them, if it's below a C, don't fish. If it's below a B, don't swim in it."

Then there's the question of consistency. Does a C for a Virginia river mean the same thing as a C in Maryland, or even for two rivers in the same state?

To keep these problems at bay, some watershed organizations take great care in how they collect their data and present it to the public.

The Chesapeake Bay Trust recently funded a pilot project to add technical strength to the process. A $23,000 grant was awarded to the University of Maryland Center for Environmental Science, which in turn provided guidance to the Patuxent Riverkeeper and Chester River Association.

"Part of the intent was to develop a fairly straightforward method for doing report cards so that not everyone is recreating the wheel, with variations on what's being measured, the quality of the data and the interpretation of the data," said Allen Hance, director of the trust.

They also worked to keep the message local. "Report cards draw attention to people's own back yards, watershed by watershed," Hance said. "That sense of ownership is really important because for some, the Bay is a distant place. We see it as an important extension of the traditional, more Bay-focused report cards."

Robert Parks, executive director of the Chester River Association, said the report card confirmed the group's priorities and provided new insight: using a larger data set allowed the group to compare progress on the Chester with other rivers and the Bay as a whole.

UMCES scientist William Dennison, who advised the groups in the Trust project, said the ability to draw such comparisons helps to make report cards a powerful tool. Dennison first worked on ecosystem report cards for the coastal bays of Australia in 1999 and is a lead scientist in UMCES' annual report card on the Chesapeake Bay.

The Australian report cards began as a "best guess," but became a rallying point to deliver timely, integrated information to citizens and decision makers. Over the years, Dennison's team transformed the report cards from a simple public relations tool to a report fueled by rigorous scientific data.

Communities and watershed groups even set goals to improve their grades. A falling grade cost at least one public official his job. People were motivated by peer pressure and accountability. "When we started to see the sewage plumes shrinking, that was a really empowering moment," Dennison said. "The community began to realize that they could control their destiny."

A report card that triggers meaningful action depends on selecting and gathering good data. Dennison recommends an annual report that reflects changing conditions in the river. "Some report cards tend to come and go, or measure things that aren't likely to change," he said. "People get weary of things that don't respond."

Good data, on the other hand, directs attention away from bad grades and toward solutions. "If you have good data and a sound process, then some of the angst goes away and the focus is on how to fix the problems," he said.

Dennison uses multiple measures, or indicators, to avoid an oversimplified message. Their combined average becomes the overall grade for the river.
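The averaging step Dennison describes can be sketched in a few lines. This is only an illustration: the indicator names, scores and grade cutoffs below are hypothetical, not the actual UMCES scoring scheme.

```python
# Hypothetical sketch of a report-card roll-up: average several indicator
# scores (each 0-100) and map the result to a letter grade.
# Names, values and cutoffs are illustrative, not UMCES's real method.

def overall_grade(scores):
    """Average indicator scores and map the average to a letter grade."""
    avg = sum(scores.values()) / len(scores)
    for cutoff, letter in [(80, "A"), (60, "B"), (40, "C"), (20, "D")]:
        if avg >= cutoff:
            return letter
    return "F"

indicators = {
    "dissolved_oxygen": 55,
    "water_clarity": 30,
    "chlorophyll": 45,
    "aquatic_grasses": 25,
}

print(overall_grade(indicators))  # average 38.75 -> "D"
```

Because every indicator carries equal weight in this sketch, one very poor measure (such as aquatic grasses here) can pull an otherwise middling river down a full letter grade, which is part of why indicator selection matters so much.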

He suggests that groups consider not only what to measure, but when to take those measurements, and why.

Six water quality indicators were used for the Patuxent and the estuarine parts of the Chester: dissolved oxygen, water clarity, chlorophyll, aquatic grasses, phytoplankton and the benthic community. A different set of indicators tracked water quality in the Chester's creeks: ammonium, nitrate, phosphate, dissolved oxygen and turbidity.

Choosing the indicators is as important as measuring them well. "Deciding what you measure is a very important step in the self-realization of what you care about as an organization and society," Dennison said. "It forces self-reflection." As a set, they should track progress toward a specific vision for the river.

The Elizabeth River Project added indicators to its 2008 State of the River to address local citizens' concerns about natural resources like oysters and fish, and about bacteria levels that affect human health.

The South River Federation includes oysters, bacteria and underwater grasses among its 10 indicators, but stopped tracking impervious surfaces because they were difficult to monitor on an annual basis. It added chlorophyll because it signals algae blooms and because regular chlorophyll data are available from the state.

Topics like land use, education and enforcement have also made their way into some report cards.

Lynnhaven River Now issues a report card that includes media attention, educational programs, school participation and public involvement. The group sets goals for each category and tracks them on an annual basis.

Dennison believes these indicators have a place in report cards because they help to depict the river's context and progress toward restoration. The science behind them, though, hasn't received the same amount of attention as ecological indicators.

"We should offer that challenge to social or socioeconomic scientists," Dennison said. "It would be very exciting to do some of those indicators and put them on par with the ecological ones."

Some river groups are aligning their cleanup strategies directly to report card findings. The Potomac Conservancy, for example, issued a Potomac Agenda alongside its report card to pair solutions with the problems. The Elizabeth River Project is revising its action plan to address the 2008 State of the River report and to highlight the steps toward a swimmable, fishable river.

Dennison believes that these responses show the power of a localized message. "The more localized the report card, the more impact we feel it can have," he said.