Using the Richmond Fed’s way to measure student success

Kevin Walthers, superintendent/president of Allan Hancock College, congratulates a recent graduate. The California college is using a Richmond Fed measurement to more accurately gauge the success of its students. (Photo: AHC)

After seeing a presentation last fall by the Federal Reserve Bank of Richmond on its new measure of community college completion, Kevin Walthers decided to apply it to the rural California college he leads.

Using the Richmond Fed methodology, Allan Hancock College calculated a 41% success rate for students who enrolled in the 2020-21 academic year and completed by 2025. Compare that to the standard federal metric — compiled by the Integrated Postsecondary Education Data System (IPEDS) — which yielded a rate of just 24% for the college for the same period.

“We were really pleased with how it came out,” said Walthers, who has served as the college’s superintendent/president since 2013. “What we are able to do is take a look, at the beginning of the semester, and share with our faculty and staff the progress we’re making. And show them the data, and show that when we look at these data, we’re doing pretty well.”

New way vs. old way

The main differences between the two systems: IPEDS counts only full-time students who enter in the fall semester, counts a transfer to a four-year institution as a success only if the student earns a credential first, and measures success rates three years out. The Richmond Fed, which surveys colleges and also visits dozens of campuses, doesn't restrict by full-time status or entry term, measures out four years, and counts students who earn a degree or certificate, transfer to a four-year school, or are still enrolled having completed at least 30 units.
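The two definitions can be sketched as filters over the same cohort. The sketch below is purely illustrative, not the actual IPEDS or Richmond Fed methodology: the field names are invented for the example, and it omits the differing measurement windows (three years for IPEDS, four for the Richmond Fed) to keep the comparison to inclusion rules.

```python
from dataclasses import dataclass

@dataclass
class Student:
    full_time: bool          # enrolled full-time (an IPEDS tracking requirement)
    fall_entry: bool         # entered in the fall term (an IPEDS tracking requirement)
    earned_credential: bool  # completed a degree or certificate
    transferred: bool        # moved on to a four-year institution
    still_enrolled: bool     # still taking classes at the two-year college
    units_completed: int     # cumulative units earned

def ipeds_style_rate(cohort):
    """IPEDS-style: only full-time, fall-entry students are tracked, and a
    transfer counts as success only when a credential was earned first."""
    tracked = [s for s in cohort if s.full_time and s.fall_entry]
    if not tracked:
        return 0.0
    successes = [s for s in tracked if s.earned_credential]
    return len(successes) / len(tracked)

def richmond_fed_style_rate(cohort):
    """Richmond Fed-style: every entrant is tracked, and success includes a
    credential, any transfer, or continued enrollment with 30+ units."""
    successes = [s for s in cohort
                 if s.earned_credential
                 or s.transferred
                 or (s.still_enrolled and s.units_completed >= 30)]
    return len(successes) / len(cohort)
```

A part-time student who transfers without a credential, for instance, is invisible to the IPEDS-style count but registers as a success under the broader definition, which is why the two rates diverge.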

“That [IPEDS] system doesn’t reflect the work we do,” Walthers said. The Richmond Fed is “trying to do state-level data, which is fantastic, and they want to get states going. But my thought was, ‘We’ve got a team that can crank this out ourselves.’ So we decided that we would take a look and see what the data would look like for our college.”

In 2021, the Richmond Fed began developing its new measure to more accurately gauge how well all community college students were doing. It started a pilot in 2022 with 10 colleges in its service area, and scaled it quickly, expanding to 63 in 2023 and 121 in 2024. Its latest survey gathered data from 189 two-year colleges across 10 states.

Each year’s survey has shown students performing significantly better than IPEDS indicated. In 2025, the Richmond Fed Success Rate averaged 49.8%, compared to the 33.8% calculated by IPEDS.

A stronger narrative

The Richmond Fed-inspired data have started conversations within Allan Hancock and with community organizations such as the local chamber of commerce, Walthers said.

“We can go back into the community and say, ‘This is the return on investment.’ … Because I don’t think it’s unreasonable for local leaders, state leaders, even federal leaders to say, ‘We’re giving you $100 million a year. What are you doing with it?’ And we should be able to say, ‘We’re building our workforce,’” Walthers said recently in an interview on CC Voice, the podcast of the American Association of Community Colleges.

The data also allow college officials to dig deeper into their students’ experiences to better understand trouble spots. For example, community colleges have known for years that full-time students are more likely to complete, but Allan Hancock found that while the overall cohort that began in 2021-22 saw a 41% success rate, that rate jumped to 54% among those who were full-time at least one semester.

“This gives you tangible evidence that the students are investing the time, so that leads them to a completion outcome,” Walthers said. “I would guess that most of those students who show up in the full-time batch, it’s more than one semester, but that’s the minimum.”

The college has a Promise program for local high schoolers that requires they go full-time, he added, which the school already knew was boosting graduation rates.

Another discovery for the college through the Richmond Fed method: the gaps between Latino and white students, who collectively comprise the overwhelming majority of students at Allan Hancock, are minimal — 40% of Latino students vs. 44% of white students achieved some form of completion.

“So that’s a positive outcome,” Walthers said. “It shows that some of the things that we put in place — our student equity plans and the programs that we’ve targeted to try to increase the achievement and close those gaps — have been really paying dividends.”

Broadly speaking, in addition to presenting an encouraging narrative to administrators, faculty and staff, the data tee up the opportunity to discuss how the college can do better on all fronts, Walthers said. 

“And then we can talk about the things we’re doing to try to improve those numbers — the things that we should be thinking about implementing in order to make those success outcomes even better,” he said.

Going forward, Walthers expects the newly calculated numbers to come into play as the college begins work on the next iteration of its educational master plan, and he suspects they will figure into a variety of other conversations.

“I would love to start tying this to our class completion and course completion metrics, things like that,” he said. “Especially among our faculty on the instructional side, and our staff who are on the student services side, having those conversations where they can start to look at their own processes and … think maybe a little differently about, ‘What I do that impacts completion, and how do I make it so the students are better served?’”

Walthers encourages leaders at other community colleges to calculate their own results and share them widely, in a slide deck like the one he and his staff have prepared.

“The visualization is a powerful tool,” he said. “It takes a little bit of work to get it set up, but once you’ve gotten it set up, it’s just updating it each year, and keeping track of those trends.”

Leaning on the data folks

Walthers also emphasizes the importance of having a good data team. He said he drew inspiration from the West Virginia Community & Technical College System, where he previously worked, and which had a “really strong institutional research shop.” Despite being a small campus, Allan Hancock has invested in three full-time researchers and a director in the institutional effectiveness office.

“Having an institutional research team that’s doing more than just filling out reports for the federal government, or reports for the accreditation agency, or sending surveys to campus — it’s like, let’s think outside the box and try to do some meaningful analysis the faculty and staff can use to make better decisions,” Walthers said. “I think this is a good example of them pulling that kind of data together.”

About the Author

Ed Finkel
Ed Finkel is an education writer based in Illinois.