California community colleges are under pressure to improve outcomes for their students, but college officials complain that the data being used to measure progress are faulty.
Two years ago, the state board that oversees California’s 114 community colleges and its new chancellor, Eloy Ortiz Oakley, launched an ambitious effort to improve student outcomes, especially by increasing the numbers of associate degrees and transfers to four-year colleges.
Dubbed “Vision for Success,” the statewide effort was something of a departure because day-to-day management of the colleges is in the hands of 72 semi-autonomous college districts.
“It’s very impressive obviously to see this document done and the need that we have statewide to be more aspirational, to have goals, to have strategies of core commitments,” state board member Joseph J. Bielanski said as the plan was adopted. “The question to me is how does this get rolled out to the 72 districts so that the 72 districts are invested in this?”
Another board member, Scott Budnick, was more upbeat. “I want something big and huge and something to strive for and this is it.”
Fast forward two years.
There has been some improvement in degrees and transfers, Oakley told the board last month, but “while there is some progress, it is not acceptable progress.”
The goals include a 20 percent increase in degrees or some other professional credentials and a 35 percent gain in transfers to four-year colleges, but degree awards increased by less than one percent in 2017-18, he said.
As Oakley and his staff ramp up pressure on local colleges to meet the plan’s ambitious goals, a sharp-elbowed squabble has developed over the data used to chart progress, or lack thereof.
In the wake of Oakley’s “disappointing” report, local college officials are complaining that his data are faulty because numbers from different databases are being combined in ways that don’t reflect reality.
One March 28 email from a member of the Community College “Research and Planning Group,” Erik Cooper of Sierra College, complained that the methodology being used “has led to hundreds of emails from dozens of researchers, almost as many behind the scenes emails, numerous phone calls and some pretty bent feelings and frustrations that are … eroding confidence.”
Cooper, speaking for many other officials, added, “Chancellor Oakley in several recent publications has noted that the (community colleges) haven’t made progress on his Vision for Success goals. Colleges are being shamed … we are being asked what’s going on and how to improve … repeatedly walking back data and being held accountable for something that is out of our control. We have to have confidence that this work is accurate and reliable.”
“The math doesn’t make sense,” Marybeth Buechner of Los Rios Community College District said in another of many critical emails, adding, “The current … procedure gives us a graduation percent that is so low as to not really make sense.”
In response to the complaints, state officials insist that even though the data are being calculated in a way that differs from traditional methods, they are still accurate enough to show trend lines.
“All I can say is that if you set the goals on the baseline and we continue to measure it the same way over time, the trend should make sense even if the baseline number seems low at the onset,” Stacy Fisher, one of the state planners, told others on the research committee.
That may be true, but if Oakley and his staff need the cooperation of local college officials to make the program a success, using untrustworthy data has just the opposite effect. Or as one college president put it to me in a private email, “Even though the data are ill-defined and unreliable, colleges are being scolded for not making progress.”