Today, University of Colorado faculty member Kevin Welner has an op-ed in the Orlando Sentinel criticizing the Office of Program Policy Analysis and Government Accountability (OPPAGA) report on the state’s corporate tax-credit voucher program and its fiscal impact. Welner argues that the claim of net savings to the state “is based on smoke and mirrors … [and] concocts its numbers out of thin air.” The key point: OPPAGA needed either data or an assumption about the proportion of voucher users who would have attended private schools anyway. It gathered no data and simply assumed that only 10% of the students would have either paid tuition or received scholarships from the private schools. Based on that assumption, OPPAGA claimed that the state saved about $1.50 for every $1 it gave away (in the form of tax credits to corporations participating in the program).
Welner’s point is important: the argument about fiscal impact depends on how many of the participating students represent a genuine increase in poor students attending private schools, as opposed to transfers from public schools. As the OPPAGA report notes, if only 60% of participating students are transfers who would not have attended private school without the program, then there is no net fiscal benefit to the state, assuming the rest of the model is correct. Since the number of participating students doubled in the past three years, it should have been simple to ask whether the new students (who were otherwise eligible to attend Florida public schools in the prior years) were transfers from public schools. There are fewer than 1,000 participating private schools in Florida, the schools have to track students individually for audit purposes, and OPPAGA had to contact them for another part of the report anyway. Is the failure to gather the crucial data a matter of flawed research, or is it the result of an explicit directive by politicians? (More on that other part later.)
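The break-even arithmetic implied by the report's figures can be sketched in a few lines. The cost ratio below is not a number the report states directly; it is inferred from the report's own claims (a 90% transfer rate yielding $1.50 in savings per $1 of credits), so treat it as an illustrative assumption.

```python
# Sketch of the break-even arithmetic implied by OPPAGA's figures.
# Net savings per $1 of tax credits =
#   (fraction of voucher users who are genuine transfers from public schools)
#   x (state per-pupil cost / scholarship cost).
# The cost ratio is inferred from the report's claim that a 90% transfer
# rate yields $1.50 saved per $1 of credits; it is an assumption for
# illustration, not a figure stated in the report.

COST_RATIO = 1.5 / 0.9  # inferred state savings per transfer, per $1 of credit

def savings_per_dollar(transfer_fraction, cost_ratio=COST_RATIO):
    """Dollars the state saves for each $1 of tax credits it grants."""
    return transfer_fraction * cost_ratio

# OPPAGA's assumption: only 10% would have gone private anyway (90% transfers)
print(round(savings_per_dollar(0.9), 2))  # 1.5 -- the report's claimed savings
# Break-even: at a 60% transfer rate the state merely recoups its credits
print(round(savings_per_dollar(0.6), 2))  # 1.0
```

This is why the untested 10% assumption carries the whole conclusion: move the transfer rate from 90% down to 60% and the claimed savings vanish entirely.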
Then there’s the question about the rest of the model. As one correspondent with the St. Petersburg Times noted, the report contains no analysis of the difference between fixed and variable costs. Some part of the cost of public education scales with enrollment, and some of it does not. The fixed costs of running a school are the reason districts with shrinking enrollments close schools: it doesn’t make sense to run a school for 50 students when the school was built for 500. If the beneficiaries of tax-credit vouchers are concentrated in a district, that district’s fixed costs do not decline; instead, the fixed cost per remaining student rises. Since more than half of the recipients in Florida last year lived in three districts (Miami-Dade, Orange, and Duval counties), those districts had to bear a disproportionate share of those fixed costs to serve the remaining students. Given the relative size of the voucher program and the number of students (21,000), even the most affected district (Miami-Dade) would have seen only marginal impacts. But the issue is missing from the fiscal-impact analysis, whose conclusions are distorted as a result. For similar reasons, a comprehensive fiscal-impact analysis would need to address the local costs of providing special education and other services that are not covered by the state, or cases where the local district loses grant or categorical funding from the federal government. Where the federal government provides significant aid, a gain by the state may be balanced by a loss of federal funding. Essentially, the analysis is a simplified (and data-thin) calculation of the impact on the state government, not the impact on total revenues and services.
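The fixed-cost point can be illustrated with toy numbers (all of them hypothetical, not drawn from the report): when voucher students leave, the district's fixed costs stay constant, so the fixed cost borne per remaining student goes up even though total spending on buildings and administration has not changed.

```python
# Toy illustration of the fixed-cost problem (all dollar figures hypothetical).
# A district's fixed costs (buildings, administration) do not shrink when a
# few students leave on vouchers, so fixed cost per remaining student rises.

def fixed_cost_per_student(total_fixed_cost, enrollment):
    """Fixed cost borne per student at a given enrollment level."""
    return total_fixed_cost / enrollment

FIXED = 2_000_000  # hypothetical fixed cost of running one school, in dollars

before = fixed_cost_per_student(FIXED, 500)  # school built for 500 students
after = fixed_cost_per_student(FIXED, 450)   # 50 students leave on vouchers

print(f"per-student fixed cost rises from ${before:,.0f} to ${after:,.0f}")
# The total fixed cost the district must cover has not changed at all.
```

The same division explains why the burden is concentrated: a district that loses many voucher students spreads an unchanged fixed cost over a noticeably smaller denominator, while a district that loses a handful barely notices.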
The big picture is important: the corporate tax-credit voucher program has been expanding rapidly, and as actual state expenses looked likely to bump up against the $88 million ceiling, the legislature agreed to increase the ceiling and asked for the fiscal-impact report in the same bill. Because of the report’s construction, I expect lawmakers who want to increase the ceiling further to argue for it as a matter of saving the state money. We still don’t know the facts, though, either about the fiscal impact or about how students are doing in the program.
And it’s that last issue that was in the report but has gone unreported in Florida’s newspapers. Part of the required report addressed how to induce participating private schools to have their students take the state assessment given to public-school students. The answer from private schools that received public funding for some of their students was essentially, “We don’t want to participate, and you can’t make us.” The private schools contacted by OPPAGA told staff that they did not think the FCAT was an appropriate measure of what their students learned, that having their students take the FCAT would cause them to distort their curriculum, and that the norm-referenced tests they already used would be sufficient. Now, where have I heard these arguments before?