Teachers are more than an algorithm

Value-added modeling (VAM) uses students’ performance on prior standardized tests to predict their academic growth in the current year; a teacher is then judged by how far their students’ actual scores depart from that prediction. In many places the model doesn’t account for factors that have a big impact on students, like poverty and school funding; only standardized test scores enter the calculation.
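
To make the objection concrete, here is a minimal sketch of the idea behind a value-added calculation, written in Python with made-up numbers. It is not any district's actual formula, only an illustration of the structure: predict this year's score from last year's, then credit (or blame) each teacher for the leftover gap.

    # Minimal sketch of the VAM idea, not any district's actual formula.
    # Data below are invented for illustration only.
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical data: prior-year score, teacher ID, current-year score
    prior = rng.normal(70, 10, size=300)
    teacher = rng.integers(0, 10, size=300)
    current = 0.8 * prior + 15 + rng.normal(0, 8, size=300)  # noisy growth

    # Step 1: predict current scores from prior scores alone (simple regression)
    slope, intercept = np.polyfit(prior, current, deg=1)
    predicted = slope * prior + intercept

    # Step 2: attribute each student's gap between actual and predicted
    # scores to the teacher, and average it per teacher
    residual = current - predicted
    value_added = {t: residual[teacher == t].mean() for t in np.unique(teacher)}

    # Note what never enters the model: poverty, funding, class size,
    # attendance. Anything that systematically shifts the residuals gets
    # credited to, or blamed on, the teacher.
    print(sorted(value_added.items(), key=lambda kv: kv[1], reverse=True))

The point of the sketch is what is missing from it: nothing outside test scores affects the prediction, so every outside influence lands in the "value added" number.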
 
Leading research groups, including the American Statistical Association, the Rand Corp. and the Economic Policy Institute, have questioned the use of VAM in high-stakes teacher evaluation, finding that it can be an inaccurate and unstable measure of teacher performance when used on its own. So why are districts, legislators and even the U.S. Department of Education still pushing it as a silver bullet for education?