Leadership Matters - August 2013

(Continued from page 4)

responses were in the “Strongly Agree” category, the end result was a school being labeled “Neutral” or “Weak” in a category. We are told that is probably the result of scoring on a bell curve, which inherently creates “winners” and “losers.” Regardless, the results are undermined and become invalid when “Agree” ends up being counted as a negative response, because the respondent clearly was agreeing with a statement without knowing that “Strongly Agree” was the only answer that would count as a positive response.

Some of the survey questions are unclear or set up for failure. For example, questions about parental visits to classrooms simply do not apply in many cases, especially in high schools. For that matter, given today’s safety concerns, do we even want large numbers of parents in school buildings when many schools already are struggling with security? Some schools, like Pittsfield High School, operate on a Block 8 schedule, meaning classes meet every other day. How is a math question whose best response is “We do this most every day” valid for a class that doesn’t even meet every day?

Similarly, as Dr. Chris Clark of Zion-Benton Township High School District 126 mentioned in her letter to you, how can the principal of a very large school be expected to know what is going on in every classroom? The question refers only to the principal and fails to mention any other administrator who might be the principal’s designee, such as an assistant principal or a department or division chair. How are teachers supposed to answer that question when other administrators have visited their classrooms regularly?

Those are three general areas of concern that remain even after the issue of comparing all Illinois schools to CPS schools is set aside. We believe this survey ultimately could be a useful tool, but not as it is currently constructed and administered. The premature release of this first-year data very likely will paint a distorted picture of many schools and become one more weight placed on school districts and administrators, who will be left to try to explain the flaws in the survey when all the media and the general public will focus on is the summative conclusions.

Why rush to release data that may not be reliable, especially in the form of comparative summative ratings? As I said earlier, a more prudent approach would be to sit down with stakeholders, review all of these issues with an eye toward improving the survey, and use this year’s results to start forming a benchmark. If these first-year results must be made public for some reason, then we believe they should be reported as the percentages for each response (for example, 25 percent “Strongly Agree,” 70 percent “Agree,” 3 percent “Disagree,” and 2 percent “Strongly Disagree”). It is simply too early to try to use comparative data and make a summative evaluation based on norms that are not yet well established.

In summation, due to the inaccuracies stemming from the invalidity and unreliability of the survey results, the consequences of releasing information from this survey as it currently exists could be very hurtful and harmful to school districts. We do not believe that is the goal of ISBE.

Thank you for your consideration of this letter. As a statewide association, IASA stands ready to assist you in a good-faith effort to review and improve the 5Essentials Survey.

Sincerely,

Brent Clark, Ph.D., Executive Director
Illinois Association of School Administrators
