The purpose, the audience, the outcome
By Allan Tellis, Chief Writer, Denver
In our contemporary era, the collection and analysis of data play an important role in how we come to understand the world around us. Like other public institutions, educational institutions have increasingly integrated data into processes of change and analysis. By using school data as a guiding light, many schools and school districts hope to produce a more accurate picture of how schools are functioning and how they can be improved. According to many community members, however, current school and district use of educational data does not answer their questions about what is happening in schools.
Data: A tool for what?
It is tempting to suggest that through the analysis of education outcomes we can derive meaning about what schools are
doing and what they need, but that may not be the case. One parent in Denver Public Schools (DPS) realized that despite their desire to understand the school system via a data collection framework, there is no way for data collection alone to tell the full story. As they put it, “regardless of what you’re looking at, data gets you 80% there but it will not cross the finish line and that has been a really hard truth that I’ve had to accept as someone who loves data. I think it’s a tool, the desire for efficiency is great but if a task gets done then the task gets done, it doesn’t really matter how efficient you are.” Throughout the conversations, it seemed important that data, especially quantitative data, be seen as an incomplete instrument: potentially helpful for solving problems but not a complete solution. For education performance data to inform school improvement, they must answer the right questions, including those that come from the community. Answering the community’s questions may require expanding conventional definitions of what counts as relevant data, as well as rethinking how that information is disseminated.
When educators and community members are all asking different questions about schools, the information presented in response may not address the full array of concerns. One community member described feeling that schools have a very narrow definition of what counts as usable data, one that leaves out valuable qualitative data. It seems clear that numerical information about school outcomes should be combined with information about personal experiences in schools; leaving out individuals’ stories impoverishes the resources decision-makers can draw on when deciding on next steps. As one community member put it, “data can streamline stories and perspectives; like, if 10 students in a survey all have really similar experiences, data can bring them together. And, like, reporting that data can tell us this is a pattern and this is systemic.”
A narrow focus on quantitative data harms in two ways: it minimizes people’s experiences, and it minimizes the potential severity of unintended consequences of public school practices. A Cherry Creek School District (CCSD) alum suggested that oftentimes in pursuit of reputational clout, school districts, especially those that are considered well-performing, have a vested interest in framing quantitative findings in ways that support the district’s standing. Importantly, this alum believes that including context can offer a remedy to data manipulation and denial of people’s experiences. They put it this way: “there are plenty of quantitative data points that could tell you that the system of education is built to criminalize Black students, that the amount of teachers are predominantly white and they hold a bias against Black and brown students. The thing is people don’t want to believe it and…that’s why qualitative data can be very powerful…it connects to emotion. I think it connects to the human experience, and it’s a lot harder to deny. In that sense, qualitative data is really important in ways that quantitative data simply can’t be.” This alum is asking a different question than many school and district leaders are asking.
Process matters as much as outcomes
Since the general public is often given outcome information as data, it becomes increasingly important that the information presented to the public includes the context needed to make sense of an issue. And as metro area school districts use metrics that fail to richly capture individuals’ experiences in school, it is evident that these data are not particularly meaningful when abstracted from important contextual factors. One cannot, for instance, understand student attendance rates as data without understanding what students experience when making their way to school each morning–unless such questions are insignificant to decision-makers. More likely, an effort to efficiently craft solutions reduces educational data to a set of numbers that depersonalize. As one community member put it, “how do you quantify that, how do you quantify getting on a bus at 5 am… our values and morals and dignity are far beyond who we are within a statistic, but I think sometimes it can be hard to explain that.”
The reduction of experiences solely to statistics can also cause problems for educators trying to understand what is happening in their schools.
For instance, an educator noted that presenting teachers with a plethora of statistics related to student and school performance is not particularly useful as they try to understand what problems exist and how they can best support students going forward. As he put it, “data can be both numbers and words. I think you need both. You need the numbers but you also need the words, like the humanization that goes along with it. Being a teacher for the last couple of years as a new teacher, there are so many statistics… What is that number, what does that number do and what does it tell me? You have to have that written part so it can’t be just the number, it has to be the qualitative part as well.”
A school leader noted that pairing quantitative data with additional context helps community members develop a more accurate and vivid impression of what is happening in a school. Given that for many community members educational data should facilitate accountability, it is important that school information provides direct responses to educators’ and community members’ questions. One school leader noted that context-free data can allow incomplete perceptions of a school to proliferate. As she put it, with “public data you have to be thinking about the unintended consequences and the context and where do these schools have control and where do they not. Where are they making choices and where are they not for their kids.” She also noted that in the service of transparency, it is important to make sure those engaging with the data are aware of where the data they are seeing come from.
How and what data communicate
Providing contextual framing along with the outcomes and other data is especially important as sometimes different processes for collecting and analyzing data can produce contradictory conclusions. One community member described it this way: “we experienced systems warfare for years between the state system, the Denver system, and then the internal data that told us what our kids were doing every single day. For example, on the state [rating system] we were yellow but if you recall DPS had a matrix approach, they never did just one year. So if we had a good year, and we had any bad years coupled in front or behind it, it would hide all the growth the kids made. So while one system is telling us ‘hey, you’re doing what you need to be doing and you’re on the right path’, the other was telling you red and that one had more power. And none of them really told our story.” Having a consistent understanding of what the data are suggesting is important not only for school leaders who are accountable to these frameworks but also for other members of the community trying to understand how well a school is functioning.
One parent of a student in DPS voiced concerns that given the existing methods of consolidating educational data via dashboards, it can be easy to present a school as thriving when it is failing many of its most marginalized students. They described this dynamic as the creation of ‘fake green schools.’ Due to the types of data informing a school’s rating, some dashboards can begin to reflect an evaluation of the students instead of the services and accomplishments of a school.
Community members noted that educational data are often misused, as agendas derived from data may not always function in service of the questions and experiences of families, educators, and students. Nearly ubiquitously, community members in these conversations asserted that when data are separated from context and used to make decisions, they become a weapon rather than a tool to facilitate positive school change. A school leader tasked with reimagining data in schools noted that the data she and most of the school’s parents are most interested in reflect the growth of students, not necessarily where they find themselves in comparison to other students within the district. As she put it, “it was all about how to hold a school accountable for things in many ways we have no control over… these systems are not telling the true story which is how are we growing our kids and are we growing them at a fast enough rate? Because context doesn’t matter on these, you’re just seen as a red school in status and I don’t think that tells the kids’ story, and it’s not data that’s actually very useful, it just tells you how they walked in the door.” Many suggested increased transparency as a strategy to ensure that educational data do not function as a weapon to punish schools and individuals that are ‘performing poorly.’ Knowing how and why the data are being collected, and understanding the strategy for making use of the data, creates an atmosphere where the analysis and collection of data are ethical and useful in the pursuit of creating better schools.
Often in an effort to ensure that educational data have been appropriately collected and rigorously analyzed, those conducting and presenting the research end up producing something that is laden with academic jargon and complicated charts–answering altogether different questions than those asked by community members. For instance, one mother noted it is important to keep in mind that not all parties interested in educational data are familiar with certain academic terms and conventions. One community member even suggested that educational data ought to be communicated in language that can be understood at around a middle-school reading level. Along with the reduction in academic jargon, community members suggested that simple graphics are key to ensuring educational data is intelligible to those who can benefit from its findings.
Along with understandable graphics, breaking data out into discrete categories can also help make educational data more digestible.
As one mother of a DPS student noted, “things that also help is if whatever is presented is not rolled up into a final score. Putting a label on the school and saying this school is in the top 20 percent or whatever you want to say – that is not fair and it does not allow people to see the school’s strengths and weaknesses.” Another community member noted that the current framework lends itself to that type of lack of intelligibility, especially for parents. As one school leader noted, “I don’t even know how many of our families knew or even cared what color we were on the SPF. I do think it matters when you’re recruiting because that’s the tool that everybody gives parents to know whether there’s a quality school or not, and I don’t know, but I think that status is not the driver for our families. I would say their growth, language development, and the actual lived experience of the student feels like the most important criteria for our families.”
Data at their best offer insights into the performance of schools in categories that matter to community members. Not only does this allow for a more accurate picture, but it can also help educational institutions be held accountable for how their decisions impact the communities in which they operate. However, without considering the community’s questions and values, decisions based on school data cannot lead to educational justice.