
Learning Loss in Tennessee

Wednesday, October 7, 2020
Posted by: Bethany Bowman

FOR IMMEDIATE RELEASE

 

LEARNING LOSS IN TENNESSEE

 

Mark Twain once wrote: “A lie can travel halfway around the world while the truth is putting on its shoes.” Government judges its public policy success on whether a service is delivered efficiently, not on whether its policies solve the actual problems. Often the taxpayers are oblivious to the game. A good comparison is the classic con game, Three-card Monte. In that ruse, a shill pretends to conspire with the mark to cheat the dealer, while actually conspiring with the dealer to cheat the mark. The mark has no chance whatsoever of winning, at any point in the game.

That deceptive game is what crossed my mind when Commissioner Penny Schwinn released some surprising statistics that garnered national attention. The Commissioner stated, “Preliminary data projects an estimated 50% decrease in proficiency rates in 3rd-grade reading and a projected 65% decrease in proficiency in math.” We, along with many others, have questioned those statistics and believe the state lacked the substantiation to back up the released learning loss claims. Regardless, these projections should have been shared with state legislators and school districts before her national media release. The motivation and timing of the data release looked suspect to policymakers and stakeholders across the state.

While COVID-19 could very well cause learning loss, these statistics and predictions about student proficiency could have a damaging effect on all the challenging work our hard-working educators are currently engaged in. Everybody has the same questions: Where did the data used to formulate these projections come from? What was the sample size? Are the data reliable? Are they valid? Reliability relates to the consistency and accuracy of the data; reliability problems in education often arise when researchers overstate the importance of data drawn from too small or too restricted a sample. Validity refers to the essential truthfulness of a piece of data: does it actually measure or reflect what is claimed? Were the projections based on performance by Tennessee students from the current academic year? Overzealous data mining can seriously harm confidence in public education and create privacy concerns if individual data is compromised. Policymakers and stakeholders will need to ask these and other questions.

Here is what we know: The department did not identify which data source unmistakably supported its claim. Its responses on the source ranged from national studies performed by NWEA and CREDO, to district data, to data from an optional assessment given at the start of the 2020-2021 school year. As addressed below, none of these sources substantiate the Commissioner's claim.

National Studies

First, neither the NWEA study nor the CREDO study cited projects losses in proficiency rates; both state losses in terms of standard deviations. CREDO's study used historical TNREADY data to predict what students would have scored on TNREADY in spring 2020 without COVID interruptions. The researchers then standardized those scores and applied a set of assumptions about school closures and learning loss to predict losses in terms of standard deviations. They never translate their findings into proficiency rates. In the recently released full study, CREDO uses a formula to translate its losses into the number of instructional days lost. This type of translation is appealing because it is easy to interpret, but research highlights its many weaknesses, and many researchers recommend avoiding the approach at all costs.
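For readers unfamiliar with this kind of conversion, here is a minimal sketch of how it typically works. The growth norms and day counts are placeholder values chosen for illustration, not parameters drawn from the CREDO or NWEA studies; the point is that the "days of learning lost" figure depends almost entirely on the annual-growth assumption plugged in, which is one reason researchers caution against it.

```python
# Illustrative only: the growth norms and day counts below are placeholders,
# not parameters from the CREDO or NWEA studies. They show how a loss
# expressed in standard deviations is typically converted into
# "instructional days lost," and how sensitive the result is to the
# assumed annual growth rate.

def sd_loss_to_days(loss_sd, annual_growth_sd, school_days=180):
    """Convert an effect size (in SD) to days of instruction, assuming
    learning accrues linearly over the school year."""
    return loss_sd / annual_growth_sd * school_days

# The same 0.1 SD loss implies very different "days lost" depending on the
# growth norm assumed for a given grade and subject:
for growth in (0.25, 0.31, 0.40, 0.60):
    print(f"annual growth {growth:.2f} SD -> "
          f"{sd_loss_to_days(0.1, growth):.0f} days lost")
```

Note that one of these illustrative growth norms happens to turn a 0.1 standard-deviation loss into roughly 58 days, while the others turn the very same loss into anywhere from a month to more than three months of instruction; small changes in that single assumption move the headline figure by weeks.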

The assumptions used in the CREDO study are worth examining. For example, did all students lose the same amount due to school closures (i.e., unfinished learning)? The study predicts the same loss, 0.1 standard deviations or 58 days, for every student based upon early school closures. The learning-loss deduction is taken from studies of the summer slide, and that loss differs by prior achievement: higher-achieving students lose less than their lower-performing peers, who often lack access to the same resources during school breaks. In fact, in some grades, high-performing students show gains. Unfortunately, Tennessee proficiency rates are low, and the kids who tend to gain during school closures are overwhelmingly the already-proficient students. For both studies, the researchers assume three months of the school year remained on school calendars when schools closed, which was not true for most Tennessee districts. Finally, the CREDO study "assumes that schooling effectively stopped for the year in mid-March." If the state believes this is a valid assumption, what does that say about its own initiatives in spring 2020? More importantly, what does it imply about all the hard work educators put in last spring?

Checkpoints

Second, the department's statement was specific to 3rd graders. The Commissioner cited the state's optional checkpoint test as one potential source for her statement. The department claims 40,000 checkpoints have now been completed, but those counts are tests, not students. The department never shared how many third graders took the math checkpoint and how many took the ELA checkpoint. It shared a list of districts where at least one checkpoint took place, but not a list of the districts that used one of the 3rd-grade checkpoints.

Third, the checkpoints are optional, which means the students who took them are a self-selected, non-representative sample. The department has released no information about the demographics or prior achievement of the students who participated in a checkpoint test. Without a representative set of students, the checkpoints' results cannot predict the state's performance.
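To see why self-selection matters, consider the toy simulation below. The population size, proficiency rate, and opt-in probabilities are all hypothetical; it simply shows that when the students who take an optional test differ systematically from those who do not, the observed rate can miss the true statewide rate by a wide margin.

```python
import random

# Hypothetical numbers only: a statewide population in which 30% of students
# are proficient, and proficient students are twice as likely to sit an
# optional assessment as non-proficient students.
random.seed(0)
population = [1] * 30_000 + [0] * 70_000   # 1 = proficient, 0 = not

opt_in_rate = {1: 0.20, 0: 0.10}           # self-selection differs by group
sample = [s for s in population if random.random() < opt_in_rate[s]]

true_rate = sum(population) / len(population)
observed_rate = sum(sample) / len(sample)
print(f"true statewide proficiency: {true_rate:.0%}")
print(f"proficiency among opt-in test takers: {observed_rate:.0%}")
```

Flip the opt-in rates and the estimate is biased downward instead of upward; without knowing who took the checkpoints, there is no way to tell which way, or how far, the results are skewed.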

Fourth, and more importantly, the state's checkpoints are not validated for learning loss. According to the Tennessee Department of Education's own publicly available document, "Checkpoint is not predictive of, or comparable to, summative TCAP results." Thus, they are not validated to predict student proficiency rates.

Finally, the checkpoints are off grade level: a fourth grader takes a checkpoint aligned to 3rd-grade standards. Administering off-grade-level assessments reflects a remediation mindset and is strongly discouraged by experts. Most educational experts recommend diving immediately into grade-level standards and using formative processes to understand gaps in prior knowledge. The department recently stated that these checkpoints continue to take place. It is October, two months into the school year for many districts; students need to be learning grade-level standards.

Commissioner Schwinn may be correct that there is learning loss, but her statements and statistics are unsubstantiated. The department has shared no study or evidence to support the claims of proficiency declines, and her statements on proficiency loss are not valid on their face because the research base does not support them. In this case, the government was inefficient and may have created more problems for public education. And if this learning loss strategy was Three-card Monte, we all lost. The story has gone national, and we may have lost the ability to shape a more constructive strategy for learning loss and other ongoing education issues.
 

 

#####

 

 

JC Bowman is the Executive Director of Professional Educators of Tennessee, a non-partisan teacher association headquartered in Nashville, Tennessee. Permission to reprint in whole or in part is hereby granted, provided that the author and the association are properly cited. For more information on this subject or any education issue, please contact Professional Educators of Tennessee. To schedule an interview, please call 1-800-471-4867 ext. 100.
