District officials canceled summer school this year, saying they had data that showed summer school didn't really help kids get farther ahead academically. In fact, sometimes kids were worse off after going to summer school, they said.
That seemed hard to believe.
I met with Fran Wilson, the chief academic officer, and asked to see the data.
She handed me three slides from a PowerPoint presentation.
One was a bar graph showing the percentage of kids in each grade who scored at grade level, below grade level, or severely below grade level on a literacy assessment before summer school, compared with the percentages at each level at the beginning of the following school year, after summer school.
(EOY = end of the year, meaning how they scored on the DIBELS literacy assessment at the end of, say, kindergarten. BOY = beginning of the year, meaning how they scored at the beginning of, say, first grade, after going to summer school. Green means students are at grade level; yellow means they need intervention; red means they need intensive intervention.)
As you can see, there are not even any numbers on any of the bars. Wilson actually pulled out a piece of paper so that we could try to compare the bars visually.
The point she was making to me was that there were not huge gains in the percentage of students in the green category. And in some grade levels, it seemed there might even have been a slight decrease in that category.
Here's the second page she shared with me. This reflects the same sort of data (scores on the literacy assessment at the end of one school year, compared to the beginning of the next), but for students who did not attend summer school.
So here's the third page she gave me, which summarizes the conclusions that the district drew from those two bar graphs. (The highlighting is from Wilson, not me.)
Well, surely there was no way the district had made a $4 million decision based on three PowerPoint slides. I asked for a copy of the full report.
What the district sent me was a 12-slide PowerPoint presentation. That, apparently, was the full report. Take a look:
It seemed odd that the district would spend $4 million a year on a summer school program and only just recently go to the trouble of evaluating it.
Through the course of my reporting, I discovered that the district had, in fact, hired the University at Buffalo to evaluate the summer school program.
The district spent more than $100,000 on those studies.
I asked the district for copies of those reports; I was told that central office staffers could not find them -- meaning the district paid more than $100,000 for evaluations that apparently somehow vanished at the critical moment when a $4 million decision was being made.
While the district could not locate copies of the reports, UB could. The good people over at the university sent copies to the district, which then sent them to me.
Let's just say the UB reports were a bit more comprehensive than the district's 12-slide PowerPoint.
And full of standard deviations, T-scores and all those other pesky little things that, as I recall from an education stats course I took, are considered kind of important when you're doing a statistical analysis.
Here's UB's evaluation of the 2007 summer school program:
And UB's evaluation of the 2009 summer school program:
Wilson apparently read those reports after UB sent copies of them to the district, following my request. After she read them, she said, "In looking at the reports, none of them makes an overt argument for or against summer school."
Take a look at Page 3 of the UB evaluation of the 2009 program. Here are some excerpts:
Does giving students more instructional time through a summer extended learning opportunity improve student reading scores?
- YES! Overall, the majority of students who participated in ELOP improved their reading scores over the course of the summer as students saw an average gain of 3.6 percentile points from June to the ELOP posttest.
Does participation in a summer extended learning opportunity improve student reading scores from June to the following September?
- YES! Students who participate in ELOP show gains in their reading scores from the end of the school year to the beginning of the next school year.
Does participation in a summer extended learning opportunity improve student NYS ELA and math scores?
- YES! Larger percentages of students participating in ELOP improve both their ELA and math scores than do students who do not participate in ELOP.
But don't rely on a few excerpts. You can read all three reports for yourself and draw your own conclusions.
- Mary Pasciak