The numbers were too good. Test scores trending upward. Grant performance reports showing robust outcomes. On paper, the Adult Education department at Danville Area Community College in Illinois appeared to be doing exactly what federally funded adult education programs are supposed to do: lifting up low-income learners and moving the needle on workforce readiness.
Except, according to college officials, much of it wasn't real.
In October 2025, DACC discovered what it describes as “a suspected coordinated system of misappropriation of Adult Education funds and falsification of test scores and grant performance reports” spanning three years, from 2022 to 2025. Following an employment hearing last week involving two college employees, DACC went public Monday with details of what President Randall Fletcher and his executive leadership team unearthed over a six-month internal investigation. The findings include alleged fraud, alleged theft, and alleged manipulation of the very data used to justify continued public investment in the program.
The Illinois Community College Board has since announced it will conduct an on-site comprehensive monitoring visit and investigation, beginning immediately.
The Danville case is, at first glance, a local story about institutional misconduct. But on closer inspection, it may point to a larger problem within the adult education ecosystem nationwide, at precisely the moment when more colleges are being asked to deliver results without adequate guidance or oversight.
Adult education programs — GED preparation, English language learning, workforce readiness, basic skills instruction — operate at the intersection of public need and public accountability. They are funded primarily through the Workforce Innovation and Opportunity Act, or WIOA, the federal law that governs most adult education and job training investments. Under WIOA, states receive federal formula grants and distribute funds to local providers, including community colleges like DACC, based largely on enrollment figures and performance outcomes.
That last part matters enormously. Performance-based funding is not just an accountability mechanism — it is the engine that drives resource allocation. Institutions report how many learners advanced educational functioning levels, how many obtained high school equivalencies, how many entered employment or postsecondary education. Those numbers feed into state reports that feed into federal reports that feed into future funding decisions.
When those numbers are falsified — as DACC alleges happened on its campus — the damage runs in multiple directions at once. Real learners who may have been counted but never served are left without support. Other institutions competing honestly for limited funds are disadvantaged. And policymakers making decisions about where to invest next are working from corrupted data.
But researchers who study higher education finance argue that the fraud at DACC, as egregious as it is, did not emerge from a vacuum. It emerged from a funding architecture that may be structurally prone to producing the very distortions it is designed to detect.
Dr. Frank Fernandez, an associate professor of educational leadership and policy analysis at the University of Wisconsin–Madison, has spent years studying how states fund community colleges. He does not mince words about what the evidence shows.
“There is a strong body of research that shows that performance-based funding does not improve outcomes,” Fernandez said. “Conversely, it creates unintended consequences. It's particularly egregious to falsify data, but there is evidence that performance-based funding can create incentives for colleges to exclude students who need the most help.”

“If a college is trying to show that it can have better outcomes, it's much easier to exclude students who might not be able to do well than to admit them knowing that they're less likely to succeed by the end of the program,” he added.
That insight reframes the DACC case in an important way. The alleged fraud — falsifying scores, inflating performance reports — is a criminal manifestation of a pressure that operates more subtly across the adult education system every day. Programs that depend on outcome metrics to secure future funding face a constant incentive to manage those metrics, whether by manipulation, by selective enrollment, or by quietly counseling out learners who are unlikely to move the numbers in the right direction.
Fernandez's research, including work published through the Partnership for College Completion and the State Higher Education Executive Officers Association, has documented how performance-based funding models can actually widen equity gaps at the institutional level, producing outcomes that are the opposite of what policymakers intend. His scholarship found that when state funding flows disproportionately to colleges in whiter counties, it negatively influences Black and Hispanic student attainment. Conversely, when states direct resources toward colleges serving larger Black and Hispanic populations, credential attainment improves across multiple racial groups. The lesson is not subtle: money directed equitably, not money tied to performance metrics, moves the needle on outcomes.
“I agree that WIOA is important and can help colleges serve populations that need attention,” Fernandez said, “but we should be focusing on promoting adequacy-based rather than performance-based funding formulas.”
The timing of the DACC scandal could not be more consequential. Federal investment in adult education has been a bipartisan priority in recent years, with workforce development framed as a national competitiveness issue. Proposals circulating in Congress and at the U.S. Department of Education have contemplated significant new investments in adult learners, a population that, advocates note, represents tens of millions of Americans without a high school credential or with limited English proficiency.
More money flowing into adult education is, by most measures, a good thing. But it also creates conditions under which fraud, if left undetected, can scale. The DACC case allegedly involved three years of falsified reports before the college itself discovered the suspected scheme. That is three years during which the Illinois Community College Board — the very body now conducting an investigation — received numbers it had no immediate reason to question.
The question is not whether ICCB was negligent. By most accounts, the board acted appropriately once it was notified. The question is whether the monitoring architecture that governs adult education funding — at the federal, state, and local levels — is designed to catch this kind of misconduct before it metastasizes.
In many states, the answer is: probably not.
Experts say that the DACC case presents an opportunity — not simply to assign blame, but to ask what a genuinely accountable adult education system would require.
At minimum, it would involve independent verification of performance data. States already have mechanisms to audit institutions, but audits are periodic and resource intensive. Some states have moved toward requiring third-party proctoring of standardized assessments used to measure educational functioning, a step that makes score falsification significantly harder. Broader adoption of such requirements, particularly for programs receiving above-threshold federal funding, would reduce the temptation and opportunity for manipulation.
It would also involve more robust separation of duties at the local program level, ensuring that the administrators who supervise instruction are not the same administrators who compile and submit performance data. The DACC investigation focused on administrators who both “supervised this academic program and managed the grant funds,” a dual role that created the conditions under which an alleged coordinated scheme could operate without detection.
At the federal level, the Department of Education's Office of Career, Technical, and Adult Education — which oversees WIOA Adult Education formula grants — has historically taken a light-touch approach to monitoring, relying on states to conduct oversight of local providers. That model may warrant reconsideration. If Congress does move to expand adult education funding, some argue that it should consider pairing that investment with strengthened monitoring infrastructure at the state and federal levels, including dedicated resources for data validation and programmatic auditing.
And if researchers like Fernandez are right, Congress should go further still, reconsidering not just how adult education is monitored, but how it is funded. An adequacy-based model, one that directs resources toward institutions serving the students with the greatest need rather than rewarding institutions that can demonstrate the best outcomes, would not eliminate fraud. But it could remove one of the most powerful structural incentives that makes distorting data feel, to some administrators under pressure, like a rational response to an irrational system.