District 211 PSAE Appeal

 
   

High School District 211 Superintendent Roger Thornton and Superintendent-Elect Nancy Robb delivered the District's appeal of its Prairie State Achievement Examination scores to Illinois State Superintendent Chris Koch on Tuesday, September 30, 2008.

On October 2, 2008, High School District 211 amended its September 30, 2008 appeal of its PSAE results with the addition of five exhibits (Exhibits 16-20), a text explanation of each new exhibit, and an amended Exhibit List/Description.

If you wish to review exhibit materials, please visit the G. A. McElroy Administration Center or the Principal's office at any District 211 high school.

 

Letter to Illinois State Board of Education
and State Superintendent Koch

Amended letter to ISBE and Superintendent Koch

Amended Exhibit List/Description

APPEAL OF PSAE RESULTS

Township High School District 211
Palatine, Illinois

 

Historical Perspective of the Appeal

Township High School District 211’s history includes the National Blue Ribbon of Excellence Award for each of its five (5) high schools.  Palatine High School received the award on two occasions.  A system-driven curriculum that includes end-of-course examinations has been present in the District for many years.  The District’s feeder districts, Schaumburg District 54 and Palatine District 15, are noted for educational excellence, and each has achieved awards of excellence in its own right.

In 2004, Township High School District 211 began significant initiatives to increase student achievement.  The goal was not focused solely on achieving Adequate Yearly Progress.  Rather, the District accelerated learning for students who were not succeeding through increased rigor in required core subjects and an increased mathematics requirement, coupled with an Incoming Freshman Academy summer class (for students who were not projected to meet PSAE standards based on their EXPLORE scores).

The State Board of Education may recall our request to permit seniors who did not meet standards on the PSAE as juniors to take the PSAE in their senior year.  We were willing then, and we remain willing, to do all that is possible to make certain that all students graduate ready for higher education and/or the world of work.

In 2007, our perspective on the PSAE as a meaningful and reliable indicator of student achievement changed.  The catalyst of our changed view was the PSAE scores for our District and the entire State of Illinois.  For no explainable reason, scores for the graduating class of 2008 (Spring 2007 PSAE test takers) dropped by 4%.  District 211, along with several other school districts, met with ISBE staff and ACT representatives.  In December 2007, District 211 leaders presented a request to the Illinois State Board of Education that a third-party review of the 2007 scores be completed and that District 211 representatives be permitted to work with the reviewers to express the concerns of high school districts.  Though a commitment was made in the public Board meeting to honor our request and to involve us in the discussions, no such involvement was permitted.

The third-party review, completed by HumRRO in February 2008, confirmed our concerns but could not resolve the shortcomings inherent in the PSAE as a reliable measure of student achievement relative to the State’s defined standards (See Exhibit 1).  Page v of the report included the following statement:

“It is possible that some methodological or random error might have contributed significantly to the apparent gain from 2004 to 2005.  Then, when the 2006 administration is equated to 2005 and the 2007 administration to 2004 using common item equating, that same methodological or random equating error would result in an apparent decrease in mean scores. (emphasis added)  HumRRO has no means of investigating the likelihood of this possibility from the data provided, but the overall data patterns indicate that this is a plausible explanation for the decline.”

 

Basis of the Appeal

We formally make our appeal based on the following statistical and substantive reasons, consistent with the defined appeals process: 

  1. PSAE is a combination of norm-referenced examinations that are scored separately, calibrated separately and then jointly, and reported jointly.
  2. PSAE reports have reflected inaccurate and dysfunctional scoring calculations so that students with ACT scores high enough to enter high caliber universities are reported as failing PSAE.
  3. PSAE reports reflect unexplainable variances that create and impose an unreliable and inaccurate basis for evaluating student, school and district progress toward AYP.
  4. PSAE is composed of norm-referenced tests that are reported in a norm-referenced context wherein a mean score serves to determine passing levels.  Individual and group student achievement is compared to that of other students.  Reported scores should instead reflect the level to which individuals and schools achieved Illinois learning standards.
  5. PSAE scores discriminate against disadvantaged students by comparing their achievement to advantaged students rather than reporting the level to which all students achieved standards.
  6. PSAE discriminates against Illinois high school students and schools.  Under the norm-referenced, mean-score-based PSAE reporting structure, it is impossible for high school students and schools to achieve AYP in future years, when increasing percentages of students must meet standards.  However, under the ISAT, a criterion-referenced test mandated by Illinois for elementary and junior high school students, AYP is possible for the very schools from which our high school students come.

 

Examining the Differences and the Different Impacts of Reported Scores of Criterion-Referenced vs Norm-Referenced Examinations

The HumRRO report corroborates much of the basis of our appeal.  Please note the bold-type phrase in the previously cited quotation from the report, where the writers of this appeal added emphasis.  The referenced phrase is “decrease in mean scores.”  Therein lies the issue: the ACT and WorkKeys, though set forth by the State of Illinois and the State Board of Education as the means to assess student achievement of Illinois standards for 11th-graders, use student score norms drawn from significantly different segments of students across the United States to establish mean scores for the purposes of reporting student performance.  It is critical that ISBE Board Members and staff pause to reflect on this issue.

PSAE is composed of examinations whose results are reported in a norm-referenced context, thereby comparing students’ achievement to the achievement of other students, not to standards.  Norm-referenced results will always reflect half of the students above the mean and half of the students below the mean.

Those who observe student accountability assessments across the United States have analyzed Illinois’ student assessments.  The National Center for Education Statistics (See Exhibit 2) lists Illinois’ ISAT as a “criterion-referenced” assessment type.  It lists the Prairie State Achievement Examination as a “hybrid” assessment type.  It is important to note that it describes West Virginia’s tests, the ACT PLAN and the ACT EXPLORE, as a “norm-referenced” assessment type.  Illinois recognizes the connection among the ACT EXPLORE, the ACT PLAN, and the ACT test by funding all three examinations, enabling high school students to predict their ACT score through the EXPLORE and PLAN experience in the eighth or ninth grade and the 10th grade, respectively.  EXPLORE and PLAN are norm-referenced tests and can only be predictors of ACT scores if the ACT is also norm-referenced.

The National Center for Education Statistics further defines the two types of tests with the following delineations:

Criterion-referenced:  A standardized test that is aligned with a state’s academic standards and thus intended primarily to measure students’ performance with respect to those standards rather than to the performance of their peers nationally.

Norm-referenced:  A standardized test designed primarily to compare the performance of students with that of their peers nationally.  Such tests do not generally measure how students perform in relation to a state’s own academic standards.

A separate publication by FairTest (See Exhibit 3), the National Center for Fair and Open Testing, includes the following:

“Criterion-referenced tests (CRTs) are intended to measure how well a person has learned a specific body of knowledge and skills.  The multiple-choice test most people take to get a driver’s license and the on-the-road driving test are both examples of criterion-referenced tests.  As on most other CRTs, it is possible for everyone to earn a passing score if they know about driving rules and if they drive reasonably well.

In contrast, norm-referenced tests (NRTs) are made to compare test takers to each other.  On an NRT driving test, test takers would be compared as to who knew most or least about driving rules or who drove better or worse.  Scores would be reported as a percentile rank with half scoring above and half below the mid-point (see NRT fact sheet).”

On page two of the publication, the following admonition is included:

“Sometimes one kind of a test is used for two purposes at the same time.  In addition to ranking test takers in relation to a national sample of students, a NRT might be used to decide if students have learned the content they were taught.  A CRT might be used to assess mastery and to rank students and schools on their scores.  In many states, students have to pass either an NRT or a CRT to obtain a diploma or be promoted.  This is a serious misuse of tests.  Because schools serving wealthier students usually score higher than other schools, ranking often just compares schools based on community wealth.  This practice offers no real help for schools to improve.

NRTs are designed to sort and rank students “on the curve,” not to see if they met a standard or criterion.  Therefore, NRTs should not be used to assess whether students have met standards.”

The ACT student information publication clearly describes the ACT as a norm-referenced examination.  “THE 1973-74 ACT ASSESSMENT STUDENT’S BOOKLET” (See Exhibit 4) states:

On page 4:
“Your ACT standard scores are like a yardstick.  They measure you, but don’t tell whether you are “short or tall” until your measurements are compared with others.
To check your performance with the performance of other students, ACT standard scores are converted to percentile ranks.
A percentile rank is an estimate of how you compare to a particular group.  That is, it tells you what percentage of students in a given group scored lower than you.”
On page 5:
“Information on your ACT Student Profile Report compares your ability scores with scores of other students who wrote the ACT assessment at your high school, other college-bound students in your state, and a national sample of college-bound students.  Your percentile ranks in each of these groups are provided on your score report.”
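
The percentile-rank reporting described in the booklet can be made concrete with a short sketch.  This is our illustration only, written to the definition quoted above (“what percentage of students in a given group scored lower than you”); the scores are invented, and ACT’s actual procedures are more involved.

```python
def percentile_rank(score, group_scores):
    # Percentage of students in the group who scored lower than `score`,
    # per the definition quoted from the ACT booklet above.
    lower = sum(1 for s in group_scores if s < score)
    return 100.0 * lower / len(group_scores)

# Invented ACT composite scores for a hypothetical comparison group.
group = [14, 16, 17, 19, 19, 20, 21, 22, 23, 25, 27, 29]
print(percentile_rank(21, group))  # 50.0 -- half of this group scored lower
```

Note that the rank says nothing about whether any standard was met; the same score of 21 would earn a different rank against a different comparison group, which is the heart of the norm-referenced objection raised in this appeal.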

The more recent “2007/2008 USING YOUR ACT RESULTS” publication from ACT (See Exhibit 5) includes the following statements:

On page 5:
“Understanding and using your ACT scores
You’ll get the most out of your ACT scores if you—

    • Compare them to the scores of other students
    • Match them to the requirements of your preferred college
    • Link them to specific strengths and weaknesses in your own skills
    • Compare them with your grades

See How Your Scores Compare
Your ranks on the multiple-choice tests tell you the percentage of recent high school graduates who took the ACT and received scores that are the same or lower than yours.  Your ranks for the Combined English/Writing subscore are based on the subgroup of students who chose to take the Writing Test during recent administrations and who scored at or below your scores.”

The contrasting impact of criterion-referenced tests and norm-referenced tests can be readily observed through Illinois ISAT and PSAE data.  The Chicago Tribune noted the stark differences in its Thursday, September 18, 2008, Metro Section 2 article (See Exhibit 6).

 

Prior ISBE Reviews of PSAE

The McCabe, Miller, Lange presentation to the Illinois State Board of Education (See Exhibit 7), entitled “Debunking the Myths of the PSAE,” noted the differences as well.  Though appearing to prove that the PSAE measures Illinois state standards, the report unintentionally proves that the PSAE is not an appropriate instrument for reporting student achievement of state standards.  We quote from the report:


Bullet 3, page 14 of the report:

  • “To scores of other students using ACT’s 3-year rolling user norms (norm referenced reporting based on relative standard.)”

     
Bullet 3, page 17 of the report:

  • “The forms are equated back to an anchor form to ensure that a score of 21 on the new form represents the same level of achievement as a score of 21 on forms used in the past.”

Bullet 3, page 21 of the report:

  • “These 3-year rolling user norms may change each year.  The norms do assist with interpreting scores (i.e. how a student is achieving compared to other students.)”
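
The common-item equating quoted above (page 17 of the report) can be sketched in simplified form.  The mean-sigma linear method below is a textbook illustration of common-item equating, not ACT’s actual proprietary procedure; all numbers are invented for the example.

```python
import statistics

def linear_equate(score_new_form, anchor_new, anchor_old):
    # Mean-sigma linear equating: map a score earned on a new form onto the
    # old form's scale using statistics from a shared set of anchor items.
    # A simplified textbook method, not ACT's actual procedure.
    m_new, s_new = statistics.mean(anchor_new), statistics.stdev(anchor_new)
    m_old, s_old = statistics.mean(anchor_old), statistics.stdev(anchor_old)
    return m_old + (s_old / s_new) * (score_new_form - m_new)

# Invented results on the common anchor items from two administrations.
anchor_old = [18, 20, 21, 22, 24]  # anchor items as scored with the old form
anchor_new = [17, 19, 20, 21, 23]  # the same items as scored with the new form
print(linear_equate(20, anchor_new, anchor_old))  # 21.0 -- a 20 on the new form maps to a 21 on the old scale
```

The relevance to this appeal is that any error in the anchor-item statistics propagates through every equated score, which is precisely the “methodological or random equating error” HumRRO identified as a plausible explanation for the 2007 decline.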

We accept the good intent of the authors of the report to give substantial support to a state mandate (the PSAE) that was then and remains under attack.  However, the report’s authors missed a very important environmental change: the assessment environment has changed since 1999, when the PSAE became a reality in Illinois as a compromise among legislators, the business community, and educators.  In 1999, a norm-referenced test of national significance deemed to predict university readiness, coupled with an examination that theoretically assessed workplace readiness, made sense, at least to those making the decisions to put the PSAE in place.

The test’s shortcomings were evident but seemingly ignored, even then.  Only a portion of students in most states take the ACT.  Illinois, Colorado, and Michigan, at 98%, 100%, and 100%, respectively, led the way in 2008 in the percentage of high school graduates tested.  Maine tested 9%, New Jersey tested 13%, and California and Massachusetts each tested 17%.  Nationally, 43% of seniors took the ACT at least once.

 

Summarizing the Impact on Illinois High School Students and Schools:  Why Does It Matter?

What is the impact of the variance described above between criterion-referenced results and norm-referenced results?  If ACT scores were reported on a criterion-referenced basis, there would be no impact.  The report would demonstrate which students achieved which standards, and the report would reflect progress or the lack thereof.  The same is true of WorkKeys.

Reported on its norm-referenced basis, the “Average ACT Scores by State: 2008 ACT-Tested Graduates” report (See Exhibit 8) compares all Illinois students to test-taking pools of varying depths of creaming across the country.  When compared to Colorado and Michigan, Illinois’s composite score of 20.7 betters their composite scores of 20.5 and 19.6, respectively.  But none of the scores has meaning in terms of how many Illinois students met Illinois standards.  The scores merely compare the entire range of graduates in three states, including Illinois, with the sometimes very limited, and likely highest-performing, graduates from other states.

“ACT News” (See Exhibit 9), in its Facts about the ACT, confirms that comparison of students is the basis of score reporting when it answers the question:

“What is the source of the 2008 ACT national average?  The scores of all ACT-tested 2008 high school graduates (more than 1.4 million students)”

Combining norm-referenced ACT reports with norm-referenced WorkKeys reports does not result in a usable or valid PSAE report for AYP purposes.  Thus HumRRO, ISBE’s selected third-party reviewer, found the 2007 issues with PSAE scores unexplainable, even though the number of students scoring in the top three levels of WorkKeys declined by 10%, describing the drop as “a dramatic and anomalous change.”  Had the results been reported on a criterion-referenced basis, as is the case with the ISAT, an item analysis would have pointed to a solution.

Equally troubling is the fact that, annually, Illinois students achieve high scores on the ACT, in a norm-referenced report, only to learn that they did not pass the PSAE.  For example, 440 District 211 students scored a 19 or higher on the ACT during the 2007 PSAE administration (See Exhibit 10) but did not pass the PSAE.  Some of those students received scores in the mid-to-high 20s.  In the 2008 administration (See Exhibits 11 and 12), similar injustices occurred when 438 District 211 students scored a 19 or higher but failed the PSAE.

Annually, approximately 90% of incoming District 211 freshmen have met ISAT standards.  The criteria of the ISAT, a criterion-referenced examination, are clear, and the results report the percentage of students who have achieved standards.  The clear and constant conflict between ISAT scores, from a criterion-referenced test, and EXPLORE scores, from a norm-referenced test, is evidenced in the scores of District 211 incoming freshmen of the graduating Class of 2011 (See Exhibit 13).  The ISAT reports student performance in terms of mastery of standards.  EXPLORE, a precursor to the ACT, reports student performance in terms of how students compare to other students.  The historical trend of EXPLORE, PLAN, and ACT scores among District 211 students demonstrates acceptable progress through the 11th grade (See Exhibit 14).

PSAE results are reported in a norm-referenced format, and while claims are made that Illinois standards are measured, no separate Illinois ACT exists.  ACT results are reported as they always have been: by comparing students with other students.  That process seemingly met the goals of state policy-makers in 1999.  The examination was cemented in place by the passage of “No Child Left Behind” in 2001 and is held in place by the threat of the withholding of federal funds from the State of Illinois.  It is imperative that the Illinois State Board of Education acknowledge, and permit its staff to acknowledge, that the PSAE does not report Illinois students’ achievement of Illinois standards.  Rather, in a general sense, the PSAE, through the ACT and WorkKeys, compares the performance of each and every Illinois student with the performance of an array of different proportions of creamed students across the United States and then reports the results in a “how you compare to others” format.  By definition, no more than about one-half of Illinois students should be expected to be above the norm, given that all Illinois students take the PSAE while far fewer than half of the students in most states take the exam.

But PSAE scores are also irrelevant.  Illinois students, many of whom fail the PSAE despite ACT scores of 19 or higher as demonstrated earlier, are prepared to gain entry, and do gain entry, to Illinois public universities (See Exhibit 15).  In fact, many of those who fail the PSAE achieve ACT scores that qualify them for entry into prestigious universities across the United States.  It is our position that an exam that annually fails students who, on a component of that very exam, score high enough to gain entry into highly ranked universities is just plain wrong.  The test does not specifically test achievement of Illinois standards.  The reports do not reflect the degree to which a student, or all students, achieved Illinois standards.  And the reported scores hold no value for either universities or employers.

Finally, we suggest that the PSAE is discriminatory toward students of poverty and disadvantage simply because the examination is reported on a norm-referenced basis.  It is possible to accelerate learning for disadvantaged students; we have evidence of that success.  Acceleration, added rigor, advanced courses, more time in school, and mentoring by caring and dedicated teachers and school leaders all have an impact.

However, at the same time disadvantaged students are progressing, so are those whose privileged life circumstances gave them a head start and continue to afford them the opportunity to excel.  In the end, disadvantaged students, though often learning at high levels and achieving standards, remain below the mean in a norm-referenced comparison and thus fail the PSAE examination, handicapped solely because of their starting place.

 

Move Illinois High Schools to a Testing Model Similar to ISAT

Under the ISAT and similar criterion-referenced examinations, results are not reported as a comparison to what others are achieving.  Rather, the focus is how each student, and the students in each school, achieve with regard to standards.  Under the PSAE, the focus is how each student and each school achieves compared to all others who took the tests, with no direct link to Illinois standards.

The PSAE, as a norm-referenced examination, appears to measure student achievement of Illinois learning standards only tangentially, with results reported in a nationwide norm-referenced format.  The results give no clue as to which Illinois students have met specific Illinois standards.  Most students taking the ACT throughout the United States are not accountable for Illinois standards because they are not residents of Illinois and do not attend Illinois schools.

For school accountability purposes, this difference is the difference between success and failure for all Illinois high schools.  With norm-referenced results, achieving 100% proficiency for all students is impossible.  Achieving 90%, 80%, or 70% proficiency statewide is impossible by definition.  The Illinois State Board of Education must accept the reality of the definition of norm-referenced results: half of the students will be above the mean and half of the students will be below the mean.  Illinois school accountability requirements and No Child Left Behind requirements cannot be met in a norm-referenced reporting system.
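
A small simulation, with invented scores, illustrates the arithmetic behind this point: if “meeting standards” is tied to the cohort mean, uniform statewide improvement cannot raise the passing rate.  The mean-based cutoff below is a simplified stand-in for norm-referenced reporting, not the PSAE’s actual formula (which, as noted elsewhere in this appeal, has not been disclosed).

```python
import statistics

def pct_meeting_mean_cutoff(scores):
    # Share of students reported as "meeting standards" when the cutoff is
    # the cohort mean -- a simplified stand-in for norm-referenced reporting.
    cutoff = statistics.mean(scores)
    return 100.0 * sum(s >= cutoff for s in scores) / len(scores)

cohort = [14, 16, 18, 19, 20, 21, 22, 23, 25, 27]  # invented scores
improved = [s + 3 for s in cohort]                 # every student gains 3 points
print(pct_meeting_mean_cutoff(cohort))    # 50.0
print(pct_meeting_mean_cutoff(improved))  # 50.0 -- unchanged despite real gains
```

Against accountability targets that rise every year, a pass rate pinned near 50% by construction guarantees eventual failure for every school, which is the impossibility described above.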

Further, it is our observation and position that the Illinois State Board of Education and its staff are fully aware of the problems with the PSAE but are unable, due to the financial penalties that the federal government would impose on the State of Illinois, to enact the needed changes.  In the meantime, high schools across Illinois are forced to divert significant teacher and administrator time away from the classroom and the school to create school improvement plans solely on the basis of the school’s standing as prompted by its PSAE scores.  For the students, teachers, school leaders, and communities that care deeply about their schools, we urge the Illinois State Board of Education to move with haste to initiate a review of our appeal and to enact appropriate remedies in the immediate future.  As a State, we can do better by our students and our schools.  We must.

 

Requested Action

We petition the Illinois State Board of Education to:

  1. Set aside the ACT/WorkKeys-based PSAE and replace it with a criterion-referenced examination similar to the ISAT.
  2. Remove all sanctions, and precursors to sanctions, imposed on Illinois high schools on the basis of PSAE test scores until a criterion-referenced examination is developed that specifically measures student achievement of Illinois standards and reports that performance for individual students, schools, and school districts.

 

Amended Text Explanation (amended 10-2-08)

 

Township High School District 211 Amended Appeal of PSAE Scores

 

Additional Amended Text of October 2, 2008.

Exhibit 16

The issues of concern regarding PSAE scores are exemplified in the 2002 study of the relationship between ISAT and PSAE scores (Exhibit 16).  The anticipated cross-validation coefficients of the tests were .76 for reading and .84 for mathematics.  In actual performance, the PSAE scores, compared to the PSAE scores predicted from ISAT results, had a reliability factor of only 62% in reading and 70% in mathematics.  Since a coefficient of 1 signifies perfect alignment, Illinois has known for six years that the alignment of the ISAT and the PSAE earns a score that would merit a grade of “F” in most high school mathematics classes.  On what basis is this acceptable to the State Board of Education as it grades its high schools’ performance?
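
For readers unfamiliar with these coefficients, the sketch below shows how a cross-validation coefficient is computed as a Pearson correlation between predicted and actual scores.  The data are invented for illustration and do not come from the Exhibit 16 study.

```python
import statistics

def pearson_r(xs, ys):
    # Pearson correlation between predicted and actual scores;
    # 1.0 would signify the perfect alignment referenced above.
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Invented example: PSAE scores predicted from ISAT results vs. actual PSAE scores.
predicted = [150, 155, 160, 165, 170, 175, 180]
actual    = [148, 160, 154, 170, 166, 171, 185]
print(pearson_r(predicted, actual))  # ~0.915 with these invented numbers
```

The further the coefficient falls below 1, the more a student’s actual PSAE score can diverge from the score the ISAT predicted, which is the misalignment the 2002 study documented.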

Results from the PSAE serve as the basis on which schools are required to remove approximately thirty teachers from our classrooms for multiple days to prepare school improvement plans, with the threat of annually increasing penalties made nearly certain by the required ascending scale of proficiency, which increases 7.5% per school year.  Given the 2002 study, District 211 augments its appeal of the scores on the basis of the unacceptable correlation between the ISAT and the PSAE.
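
The escalation referenced above compounds quickly, as a minimal sketch shows.  The 7.5-point annual increase comes from the text; the 62.5% starting target for 2008 is our assumption for illustration, since the appeal itself does not state the base figure.

```python
# Illustrative AYP target schedule: 7.5-point annual increases (from the text
# above) applied to an assumed 62.5% target in 2008 (our assumption).
target = 62.5
for year in range(2008, 2015):
    print(year, f"{min(target, 100.0):.1f}%")
    target += 7.5
```

Under a norm-referenced report in which roughly half of all test takers must, by definition, fall below the mean, every school eventually fails a schedule like this regardless of how much its students actually learn.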

Exhibit 17

The August 26, 2008, communication from the State Superintendent set forth clear goals as determined by the State Board of Education. The first goal relates directly to PSAE:

“Every student will demonstrate academic achievement and be prepared for success after high school.”

Though adequate time has not elapsed for the State Board of Education to achieve this goal, progress toward the goal can be demonstrated by hearing and seriously considering the District 211 appeal of PSAE scores.  Several factors are central to that hearing and consideration:

  1. Preparation for success after high school begins in Kindergarten or before and continues through the senior year. 
  2. The serious and amply evident disconnect among elementary, junior high, and high school state test results, as reflected by ISAT, EXPLORE, PLAN, and PSAE scores, demonstrates either significantly different standards, or significantly different levels of accurate measurement, or both, relative to student achievement of Illinois standards.
  3. The up-to-the-present acceptance of intolerable variances of correlation between the ISAT and the PSAE, along with similar variances between the ISAT and EXPLORE as demonstrated by Exhibit 13, should give state policymakers pause.
  4. An accurate measurement of success for Illinois high schools is not possible under the present testing structure; to the degree that the PSAE is the specified measure of success, it stands in direct conflict with the State Board of Education’s stated goal.

 

Exhibit 18

The State Superintendent’s comments regarding the scoring of the PSAE in the September 30, 2008, Weekly Message are troubling.  The District 211 PSAE appeal was delivered to ISBE at approximately 9:30 AM on the 30th.  Perhaps by pure coincidence, both in timing and in substance, the comments reflect the position taken by certain leaders of ISBE in our previous meetings rather than the position and comments of Dr. Koch and the State Board of Education.  Regardless, the comments again deflect the focus from the real issue.  Consider, please, our request for the following information:

  1. Comparing ACT and WorkKeys questions with ISAT questions, to what degree does each directly assess student achievement of Illinois standards?
  2. Comparing ISAT and PSAE reporting structures, which provides a direct report of each student’s achievement on Illinois standards?
  3. How many PSAE items are borrowed from ACT and WorkKeys?
  4. How many items on the PSAE are not part of the ACT or WorkKeys, and how are they embedded into the ACT and WorkKeys tests?
  5. Do Illinois juniors take a different version of the ACT than students in other states?  If so, how are those items created and how are Illinois students’ ACT scores calculated?  
  6. Are Illinois juniors given additional time to complete the PSAE questions that are not part of the ACT or is the ACT testing time the same for our students as it is for students in other states?
  7. Though it is stated that “there is no limit on the percentage of students that can meet standards,” transparency is absent in regard to how the results are reported.  Reference Exhibits 10, 11, and 12.  How can students achieving a 19, 20, 21, 22, 23, 24, 25, 26, 27, or 28 on the ACT be judged not to meet standards when, per Exhibit 15, they could be admitted to numerous Illinois state universities?  The formula for calculating meets or exceeds expectations on the PSAE was hidden even from the State Board of Education’s own third-party reviewer (Exhibit 1).

 

Exhibit 19

This letter from the Illinois State Board of Education, dated March 14, 2008, is either inaccurate or impeaches the assertions made in Exhibit 18.  We quote:

“For example, for Math, the normalized score on the 60-item ACT Math test and the normalized score on the 30-item WorkKeys Applied Math test, each counted for half of the overall score, giving greater weight to the WorkKeys items.

Starting with the 2008 test, each of the 90 items will contribute equally to the overall score.  A psychometric process was used to equate the 2007 results from the old methodology to the new methodology.  This minimizes the impact on the percentage of students meeting standards.”

Several concerns are present with this exhibit.  They are:

  1. No mention is made of any variances from the normal ACT or WorkKeys.
  2. Constant reference to the “normalized score” gives credence to concerns about norm-referenced results.  In fact, it is our position that the term confirms norm-referenced results.
  3. Our concern #7, as stated in regard to Exhibit 18, is redoubled here.  The references to normalized scores cause us to believe that the assertion that there is no limit on the percentage of students that can meet standards is misleading when compared to what a similar statement would mean in regard to the ISAT.
  4. The only PSAE related questions mentioned in this exhibit that are not part of the normal ACT or WorkKeys relate to science.
  5. Why would a psychometric process be used to cause results to remain similar to the results gained from the previous significantly different scoring value?  Did the psychometric process mask the change that should have been apparent as a result of the changed individual item values?

 

Exhibit 20

This exhibit confirms that all 60 items on the ACT count toward the PSAE score, and that all 33 WorkKeys items count as well.  We cite the note at the bottom of the page:

“Note: The mathematics portion of the PSAE is a combination of the ACT Assessment Mathematics component and the WorkKeys Applied Mathematics Assessment component.”

District 211 again asserts that either Exhibit 20 is inaccurate or it impeaches the statements made in Exhibit 18. 

District 211 amends and affirms its appeal of the PSAE results based on the issues raised in the original filing and in these amended questions and concerns.