SPF: School Performance Fiction


When I first considered writing about the SPF – School Performance Framework – I thought it would be a straightforward endeavor. The headline was obvious to me: 13 organizations on one side of the education divide collaborated with one organization from the other side and one individual (that would be me) to ask the Denver Public Schools Board of Education to “revise the SPF criteria and thresholds so district staff, parents, and the general public have a clearer understanding of the definition of a quality school.” The October 21, 2014 letter went on to ask the district to “reflect upon the purpose and design of the SPF in order to guide the district toward greater improvement.” Remember, this is a district that has seen a widening achievement gap and minimal yearly progress while pretending “reforms” are working, using data-gathering instruments like Median Growth Percentiles (MGP) to hide the truth. As I conducted more research, this seemingly simple venture turned into a multi-faceted investigation. And not surprisingly, my suspicions have been confirmed: the School Performance Framework is in reality the School Performance FICTION.

In its letter the SPF Collaborative, as I will refer to the organizations and me, identified three specific criticisms of the current SPF:

  • Academic status expectations are too low, especially for elementary schools.
  • “High performance” is inconsistent within schools.
  • Growth overshadows proficiency.

All of us believe the DPS SPF (who can stand all these acronyms?) is faulty, leading to confusion and misinformation.

An interesting sidebar to this latest quest for clarity involves historical perspective and an old-fashioned paper trail. I can provide both because: 1) I was a DPS Board member in 2008 when the SPF was born; 2) I myself am sort of historic (old) and have institutional memory; and 3) because I still find it easier to work from hard copies, I have most of the documentation from the initial SPF presentations. Holding on to paper can be a bad thing, because I can get into the weeds quickly (and my office is a mess). But in this case holding on to paper has been a good thing, because when I visited the DPS SPF website, little history was provided.

A brief history from DPS School Performance Framework Binder, April 2008

Purpose: “It (SPF) was created as a means of accrediting our schools as required by the Colorado Department of Education, but also as a way to give the district, our schools and the larger community the information they need to make critical, instructional decisions that will have the greatest impact on improving student achievement. In addition, the SPF will also eventually be used to inform both teacher and principal compensation systems.” (SB-191, the state’s teacher evaluation law, is one of the most punitive such bills in the country.)

In the beginning, schools were evaluated on six indicators, each carrying its own “weight”:

  • Student Progress over Time or Growth
  • Student Achievement Level or Status
  • College and Career Readiness (high schools only)
  • Student Engagement and Satisfaction
  • School Demand
  • Parent and Community Engagement

Since this original rubric, one more educational indicator has been added, for high schools only:

  •  Improvement in College and Career Readiness over Time – added in 2011

Please note that from the very beginning, growth has consistently outweighed status (proficiency) – by almost 3:1 for elementary and middle schools and 2:1 for high schools. Some early accountability measures have changed or even gone away – the School Accountability Report (SAR), Adequate Yearly Progress (AYP), and CSAP. They have been replaced with new educationese: Post-Secondary Readiness, College and Career Readiness, TCAP, PARCC and CMAS. Also note the dramatic decline in the weight for status that has occurred in the ensuing years in the high school scorecard. Growth and “Post-Secondary Readiness Growth” have somehow taken precedence over old-fashioned proficiency. The SPF Collaborative has criticized the District for this heavy emphasis on growth because it showcases Blue (Distinguished) and Green (Meets Expectations) schools that are not really Distinguished or Meeting Expectations: schools whose proficiency percentages are actually declining have still been deemed Blue or Green. In fact, DPS recognized 41 schools this fall for growth gains; 17 of the 41 actually LOST proficiency. Finally, historically and presently, the bottom three indicators have accounted for approximately 10-12% of overall scores.


2008                                       Elementary      Middle      High

Growth                                        61.8%         62.8%      60.3%

Status                                        32.5%         31.4%      30.2%

College and Career Readiness                                             4.0%


Later revision:                            Elementary      Middle      High

Growth                                        66.2%         65.3%      36.6%

Status                                        23.6%         23.8%      13.4%

Post-Secondary Readiness Growth                                         26.7%

Post-Secondary Readiness Status                                         17.9%


Most recent revision:                      Elementary      Middle      High

Growth                                        69.4%         68.7%      48.8%

Status                                        19.1%         19.0%      10.9%

Post-Secondary Readiness Growth                                         16.4%

Post-Secondary Readiness Status                                         17.6%

With the post-secondary readiness measures added for high school, the totals for growth and status hold fairly true to the 2-to-1 ratio. And over the years the emphasis on growth has increased while the emphasis on status has declined. It strikes me as highly suspect that the District has resorted to reassigning the weights of this measuring instrument, as if trying to show “reforms” are successful. Let us not forget that the SPF was originally designed as an honest attempt to assess just how Denver schools were doing, as well as a way to help accelerate improvements in student achievement. Much of the data has been massaged, leaving questions about the SPF’s overall accuracy and truthfulness. While much more nuanced in its language, the SPF Collaborative feels much the same.
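The growth-versus-status tilt can be read straight off the weights in the tables above. A quick sketch computes the implied growth-to-status ratios; since my copies do not label the two later scorecards with years, I simply call them "revision 1" and "revision 2" here (for high school, I fold the post-secondary readiness growth and status weights into the growth and status totals, respectively):

```python
# Growth and status weights (in percent) as quoted in the scorecards above.
# For high school in the later revisions, post-secondary readiness growth is
# added to growth and post-secondary readiness status to status.
weights = {
    "2008":       {"Elementary": (61.8, 32.5), "Middle": (62.8, 31.4),
                   "High": (60.3, 30.2)},
    "revision 1": {"Elementary": (66.2, 23.6), "Middle": (65.3, 23.8),
                   "High": (36.6 + 26.7, 13.4 + 17.9)},
    "revision 2": {"Elementary": (69.4, 19.1), "Middle": (68.7, 19.0),
                   "High": (48.8 + 16.4, 10.9 + 17.6)},
}

for year, levels in weights.items():
    for level, (growth, status) in levels.items():
        # Ratio of growth weight to status weight, e.g. 3.6 means 3.6:1.
        print(f"{year:11s} {level:10s} growth:status = {growth / status:.1f}:1")
```

Running this shows the drift the Collaborative objects to: the elementary growth-to-status ratio climbs from roughly 1.9:1 in 2008 to about 3.6:1 in the most recent scorecard.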

After calculating the total points a school receives, the school is then “graded” according to a stoplight:

  • Blue – Distinguished
  • Green – Meets Expectations
  • Yellow – Accredited on Watch
  • Orange – Accredited on Priority Watch (added in the last few years)
  • Red – Accredited on Probation
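To see how a growth-heavy weighting can produce the “false positive” the Collaborative describes, here is a deliberately simplified sketch of a weighted-points-to-stoplight scheme. The cut scores and the example school's numbers are my own invented illustrations, not DPS figures:

```python
# HYPOTHETICAL cut scores (percent of possible points); DPS's actual
# thresholds are not published in this post.
BANDS = [
    (80, "Blue – Distinguished"),
    (60, "Green – Meets Expectations"),
    (40, "Yellow – Accredited on Watch"),
    (25, "Orange – Accredited on Priority Watch"),
    (0,  "Red – Accredited on Probation"),
]

def spf_rating(scores, weights):
    """scores: percent of possible points earned per indicator (0-100).
    weights: indicator weights; the total score is the weighted average."""
    total = sum(scores[k] * weights[k] for k in weights) / sum(weights.values())
    for cut, rating in BANDS:
        if total >= cut:
            return total, rating

# An invented school with strong growth but very weak proficiency (status):
weights = {"growth": 66.2, "status": 23.6, "other": 10.2}
scores = {"growth": 85, "status": 20, "other": 50}
total, rating = spf_rating(scores, weights)
print(f"{total:.1f} points -> {rating}")
```

With growth weighted at 66.2%, this school earns a total around 66 points and lands in the Green band despite scoring only 20% of its possible proficiency points, which is exactly the pattern described in the letter.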

In February 2014, the Denver Public Schools published a white paper on the SPF. While acknowledging that too much emphasis has been placed on growth, the changes it proposed were minimal and still overemphasize growth: the growth-to-status ratio would go from 3:1 to 3:2 for elementary schools and to 2:1 for middle schools, while high schools would remain at 2:1. These changes would go into effect in 2015-16. I do not know how this will be possible, since TCAP is gone and totally new standardized tests – CMAS and PARCC – will be in play in Colorado. We shall see.

Back to the SPF according to the 2008 binder.

Within each of the six or seven primary indicators there are many sub-indicators. Seven main indicators lead to 46 sub-categories, which in turn lead to 58 sub-sub-categories. You get the picture. This is a confusing, consumer-unfriendly tool for parents to navigate. And why is there so little consistency across the district’s high-performing schools? There is something drastically wrong with a measuring instrument that celebrates proficiency declines and is inconsistent in its definition of a high-quality school.

I know I am in the weeds but the following is important because the signatories of the October 21 letter asked the Board to revise these criteria so “district staff, parents, and the general public have a clearer understanding of the definition of a quality school…It is now time to reflect upon the purpose and design of the SPF in order to guide the district toward greater improvement.” Let me revisit the three criticisms from the collaborative in more detail:

 Academic status expectations are too low, especially for elementary schools.

“Third graders reading at grade level across the district… should be the primary goal for Denver. It should follow that green elementary schools, which are by definition meeting expectations, should have 80% of their students at grade level.” Three elementary schools on this year’s SPF scored “DID NOT MEET” on the status indicator, having earned 3%, 20% and 27% of possible proficiency points. Yet these schools were rated as “MEETS EXPECTATIONS.” But as I asked in my MGP post: whose expectations are they meeting? Certainly not mine, nor, as it turns out, those of the 13 organizations that signed this letter.

 “High performance” is inconsistent within schools.

“It is critical to better define success for those communities that continue to be at the losing end of the widening achievement gap. At ‘high performing schools’ the current performance by low-income and minority students – and progress in closing the achievement gap – is unacceptable. This performance should not be defined as meeting expectations.”

Across the District, “Distinguished” and “Meets Expectations” should have the same meaning and reflect the same proficiency levels. Anything else feeds the fiction of the SPF and widens the achievement gap.

As education activist and retired DPS teacher Mary T. Sam says,

“We (in the Far Northeast) are aware that when you rank a southeast or Stapleton area school as blue that has proficiencies in the eighties and nineties, and then also rank a Far Northeast school as blue that has proficiencies in the fifties to low sixties, you have no expectations for our black and brown kids.”


 Growth overshadows proficiency

“In too strongly weighting academic growth relative to academic proficiency, the current School Performance Framework provides a false positive about what is a good school…The system’s signal that they are green is likely to slow, not increase growth, as it will lead to complacency with the status quo….Students need to be in schools that actually produce learning – as measured by proficiency metrics.” (my emphasis)

No need to be redundant.


So, how much has this School Performance Fiction cost Denver taxpayers? Even with my institutional memory and my stacks of paper, I have not been able to pinpoint the initial rollout cost. I recall some of the initial work was done in-house, but how much was, and how much was contracted out, I could not determine. I have heard guesstimates of half a million dollars at the outset. However, going forward, there is documentation.

In March 2012 the Board of Education approved a five-year contract with Colorado-based RevGen for almost $3 million. From the DPS Finance and Audit Committee meeting:

“The total cost for the 5 year contract is $2,838,580, or an average of $567,716/year and will come from the General Funds. This includes annual metric and reporting changes, delivery of all related reports, and support to public and schools. RevGen will deliver the solution in it’s entirety with minimal agreed upon support from ARE and DoTS.”


RevGen’s Mission: Profit from Knowledge:

“Today’s Business is data rich. RevGen Partners harvests and transforms your data into knowledge so that it can be directed into action. We dig, sift, extract and apply. With newfound knowledge, a company can increase productivity, performance and profitability.” This kind of talk is creepy: it is the jargon of business and of education “reform” – data-mining, harvesting, the bottom line – and it has no place in public education, in a “business” dealing with human beings.

If tweaking and delivering reports is worth over half a million dollars a year, how much, honestly, did the initial research and development cost, even if it was done in-house? And has the “investment” been worth the “returns”? Not so far!

The letter from the 14 of us did not propose solutions. We want to engage with the Board and staff “to best improve performance and accountability in the district…” I, personally, would add engaging with honesty. There will be no improvement until we have honest conversations.

“It is critical that DPS not complicate the message to families that “high performing schools” are actually that – high performing, rather than simply on a path toward high performance. Some green schools are on a strong path to proficiency while others are on a path to proficiency but will never get there. Students need to be in schools that actually produce learning – as measured by proficiency metrics.” (my emphasis)

The SPF Collaborative has asked for a response from the Board of Education by the end of November 2014.   We all await an answer which will determine if this Board is ready to turn the Fiction into an authentic Framework.

8 thoughts on “SPF: School Performance Fiction”

  1. Thanks, Jeannie K. As usual, you are on point. One of the reasons I have fought so hard against the SPF is that by not clearly showing the stakeholders the real achievement (or lack of achievement) of our students there is no incentive to fix the problems. I simply want DPS to use this information to drive its policies around remediation and, dare I say it–retention.


    • An interesting take on the untruthfulness, Mary Sam. I had not thought of that. If everything looks rosy, until it is bright red (failing), there is no reason to address troublesome issues and then try to fix them.


  2. As a parent who has been forced to participate in “choice,” I see the SPF rating as a marketing tool which the District uses to steer parents toward certain schools and scare them away from others. Enrollment at Morey is falling, and its demotion from green to yellow to orange will ensure that the trend continues. When enrollment drops far enough, the District will feel justified in proclaiming that parents are not choosing Morey and that it is forced to close the school or replace it with another program (could that be a charter?). Smiley was our neighborhood school and would have been our choice. DPS misleads with slogans guaranteeing a “great school in every neighborhood” while eliminating neighborhood schools.


  3. Two things, Margaret: 1) Self-fulfilling prophecy, as you clearly point out (I had not thought about the marketing piece, which is clearly in play), and 2) “a great school in every neighborhood” is doublespeak for charters coming in or “innovation” schools taking over. DPS, with its administration-heavy Office of School Reform and Innovation (OSRI), has been unable to come up with one solution for any neighborhood. The so-called “innovation” schools have really been about “working conditions,” as Mr. Boasberg clearly said in the winter of 2009 – in other words, protections for workers. Both models currently in DPS – innovation and charters – have no protected workers.


      • I don’t know if you’ll see this since it’s several posts back, but I wanted to update you on Morey. We received a letter from DPS yesterday because they have put Morey into “priority improvement” status beginning next fall. As Morey is now an underperforming Title 1 school, we have the “option” to choose a higher performing school. Since my son is a seventh grader (and we believe he’ll be able to finish 8th grade at Morey), we’re exercising our right NOT to choose a different school. However, I’m sure many will leave and few 5th graders will choose Morey. My son has already started referring to it as DSST – Morey. Do they do this just to disperse the students who are not proficient on the tests into enough other schools so that the test results look less dismal for a couple of years? Is it an attempt to undo the segregation that has been allowed to happen under “choice”? I don’t understand how the district believes this approach can help Morey, but maybe they have no intention of helping Morey. The impression I got from them in the meetings following Smiley’s closure seemed to be “if you chose a failing school you deserve a failing school and we can’t help you.” Thank you for keeping a spotlight on our schools.


  4. Jeannie –

    Excellent stuff, as always. There is so much to discuss here, but I’ll try to keep it brief. 🙂

    Trying to measure school performance with one aggregated number is pointless. It’s like walking into your doctor’s office and saying, “Tell me how healthy I am on a scale of 1-100.” Different metrics make sense in different contexts and for different schools. This is why it is so critical that we have experienced educators in leadership positions – they should be the ones with the skill and experience to evaluate our schools and teachers. Instead we end up with a bunch of formulas (based ~90% on test scores, as you point out) that spit out some numbers that the powers that be then use as evidence of “progress.”

    It’s ridiculous that DPS Administration and some members of the BOE think they can change student outcomes simply by putting emphasis on different metrics. Do they really think that they can close the achievement gap by putting more weight on the SPF metrics for minorities, ELL, etc? Sadly, I guess the answer is yes. (as presented on March 24th, 2014 to the BOE)

    Hope all is well.

