On the Uses and Meaning of Data

by David B. Cohen

NOVEMBER 21, 2012

Get ready – we know the data is coming!  Common Core Assessments are going to change everything, right?  But for teachers and schools on the receiving end of greater volumes of information, what does all that data really mean?  Not surprisingly, it depends on whom you ask.

CalTURN’s Kate McKenna addresses conference attendees (photo by the author).

Last week, I attended a labor-management collaboration conference in Santa Monica, California, organized by WestEd and the California Teachers Union Reform Network (CalTURN).  I was invited to offer a short presentation on the second day of the conference, and then to participate on a panel for a short discussion of three presentations, including mine.  The first two presentations focused on the good work that can be accomplished when labor, i.e., unions, and district management are able to collaborate in a productive, ongoing, trusting atmosphere.  The presenters included teams from Farmington, Michigan, and from Green Dot Charter Schools in Los Angeles.  The audience mainly consisted of districts’ labor-management teams; from what I could gather, these are people who are already among the converted – representing districts that have decided to try this type of work, or that perhaps already have some successes in that regard.  For example, Poway Unified and San Juan Unified are two districts whose peer evaluation programs have successfully pioneered improvements in teacher evaluation practices, and those programs came about through the development of trusting professional relationships between unions and districts.

My part in the program was to share some of the work done by Accomplished California Teachers, especially on the topic of teacher evaluation, and then to offer some thoughts and reflections that might inform the collaborative work of district and union leaders later that morning.  As I listened to the first two presentations, I found myself editing my prepared remarks and PowerPoint slides on the fly, to avoid needless repetition of the presenters ahead of me.  However, one section of my presentation that I held on to concerned the misuse of value-added measurement (VAM) in teacher evaluation.  I used some graphs showing New York’s failure, so far, to produce useful VAM data for teacher evaluation (graphs that have also appeared on this blog in the past).  I also reminded my audience, as I’ve reminded every audience I’ve ever spoken to or written for, that the nation’s leading professional associations for educational research all agree that VAM is not yet useful for teacher evaluation, and that, more fundamentally, a test validated to measure student learning requires a separate validation process if it is to be used as a measure of teaching effectiveness.  (See this post for details.)  I urged administrators in the audience to defend quality teaching by taking a stand, as more than 1,500 New York principals have, against the use of student test scores in evaluations.

During the panel discussion follow-up, the original presenters were joined by Marco Petruzzi, CEO of Green Dot.  Prior to this event, I knew little beyond the facts that Green Dot is a large charter management organization running a number of high schools in Los Angeles, and that perceptions of their record tend to break down along predictable lines in the education reform and charter school debates.  With friends, family members, current and former colleagues and mentors involved in charter schools, I’ve tried to remain open-minded about the teaching and learning going on in charter schools as a sector, and certainly would caution people about lumping them all together.  In short, I wish existing charter schools and their teachers and students well.  I don’t necessarily want to see charter schools expanding rapidly, as there are many legitimate concerns about management and oversight, and the negative impact that charters can have on existing neighborhood schools.  At the same time, I do not clamor for the closing of charter schools already up and running, as I take no delight in the idea of major, unwanted disruptions in anyone’s education or professional life.  Furthermore, I was impressed by the team from Green Dot, presenting the careful collaborative process they went through – union teachers and school leaders – to craft a teacher evaluation program that by and large reflects a solid understanding of what teachers need, what motivates teachers, and how we improve.

However, at one point in the panel, the moderator, Pat Dolan, commented about the ever-increasing amounts of data headed towards schools and districts, and suggested we’d better be ready to make use of it all.  I jumped into the conversation and cautioned against accepting data as passive recipients.  “We are not just an audience” for the data, I suggested.  We must retain the perspective, judgement, and professional prerogative to look at data critically and decide what it means, or doesn’t mean, and how it might be read, understood, used, or benignly set aside.  My off-the-cuff remarks suggested that some of the data is frankly incomprehensible to many practitioners anyway – or rather, what’s been done to the raw data before it’s presented as meaningful is, for many of us, an incomprehensible process.

At that point, Petruzzi felt the need to respond to me directly, and called me out for an alarmist or hypercritical attitude towards data.  I think he even said “dangerous” – in that my attitude suggested a rejection of potentially rational, appropriate, and productive uses of data.  He said that my narrow selection of graphs ignored the fact that more years of data, combined with other indicators of teacher quality, dramatically reduce the risk of misidentifying poor quality teaching.  And he made it known that he does understand the data, statistics, and formulae.  Honestly, I felt no sting in his comments, though some in the audience apparently thought I should have.  In fact, I welcomed the exchange, because if we don’t find disagreement, we haven’t pushed the boundaries much in our discussions.  I want to learn more, and I want to be effective as an advocate to many audiences.  I could take constructive criticism here, and try to avoid coming off as data-phobic or data-hostile in the future.  I know I wear my bias on my sleeve in such venues, and I would love to retire the phrase “data-driven” from educational policy discussions.  (“Data-informed” – perhaps.  For thoughtful critiques of “data-driven” teaching, see also: Robert Pondiscio; Esther Quintero; James Boutin.)
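Petruzzi’s point about additional years of data deserves a closer look, because it has a real statistical basis – and a real limit.  As a rough illustration (the numbers here are my assumptions, not Green Dot’s or anyone’s actual figures), the Spearman-Brown formula estimates the reliability of an average of $n$ years of a measure whose single-year reliability is $r$:

$$ r_n = \frac{n\,r}{1 + (n-1)\,r} $$

If a single year of value-added estimates has a reliability around $r = 0.4$, in the general range researchers have reported for such estimates, then three years of data yield $r_3 = \frac{3(0.4)}{1 + 2(0.4)} = \frac{1.2}{1.8} \approx 0.67$.  That is better, but still well short of the precision we would demand of a measure used in high-stakes personnel decisions – more years help, but they do not make an unreliable measure trustworthy.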

There wasn’t time in the panel discussion for me to respond to Petruzzi for the audience, but I approached him afterwards and we had a lively exchange of views.  I tried to clarify that I have no animus against data as a concept, but wished to steer people away from the passive acceptance of data delivered from afar and on high.  For example, we might get reams of data from Common Core Assessments, but the data may not indicate the appropriateness of the standards for the students we’re teaching, the quality or clarity of the standards, the validity of the assessments, the alignment of the assessments with instruction, or a whole host of other information we need to know before discerning what the data offer us.  I gave the example that my school district changed its math curriculum and saw a significant jump in test scores districtwide the following year.  I do think that’s significant and potentially useful information; however, I do not automatically assume that the increase reflects better teaching or learning, nor would I assume conversely that a hypothetical drop in scores demonstrated poorer quality teaching or even an inferior math program.

California’s existing standardized tests barely cover the standards for an English language arts course (original image by the author).

In my own subject area, English Language Arts, I find almost no value in standardized testing data.  I pointed out to Petruzzi that these once-a-year tests barely attempt to address the range of standards for my subject, and the standards they do cover, they cover quite poorly.  In fact, if the tests are supposed to help us improve our teaching, one would think we’d be able to look at the tests and our students’ answers.  But of course, the tests were not designed to help us with our teaching, nor are the state’s test security protocols designed to help teachers or students.

Furthermore, whatever value-added formula is applied cannot adequately control for all of the factors that affect student performance.  How could it?  The people who construct the formulas are not capable of identifying all the factors, nor do they have any defensible means of calculating the interplay of those factors – either conceptually, or for any given school.
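To make that objection concrete, here is a generic sketch of the kind of model at issue – a deliberate simplification for illustration, not any particular state’s or vendor’s actual formula:

$$ y_{it} = \lambda\, y_{i,t-1} + X_{it}\beta + \theta_{j(i,t)} + \varepsilon_{it} $$

Here $y_{it}$ is student $i$’s test score in year $t$, $y_{i,t-1}$ is the prior-year score, $X_{it}$ is whatever set of student and classroom characteristics the modeler chose (and was able) to measure, $\theta_{j(i,t)}$ is the estimated “effect” of the student’s teacher $j$, and $\varepsilon_{it}$ is everything else.  The trouble is exactly what I describe above: any factor left out of $X_{it}$ does not vanish – its influence gets absorbed into the teacher effect and the error term, and no one can enumerate all the factors or their interplay.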

Petruzzi responded that Green Dot teachers have embraced value-added measurement and the use of test scores in their evaluations – which does nothing to address the substance of my critique.  Still, in that case, I can at least approve of the process: teachers were partners in the design of their evaluations.  But I heartily disagree with that part of the decision, and I say the same thing to Green Dot’s teachers that I’d say to anyone else in that position: this approach has been a dramatic failure in a number of school systems, and it violates some basic principles of educational measurement.  On what basis do you assert that your measures and methods will evade the problems encountered elsewhere, and meet the standards for educational measurement and assessment?

I told Petruzzi that I wondered why top-performing public school systems, private schools, and international education systems haven’t rushed to embrace this methodology.  Petruzzi responded in what seemed to me a tangent, telling me how much more competition there is to become a teacher in Finland or Italy.  I took that to mean that since American teachers are, on average, less talented or less qualified, we need more aggressive measures to remove bad teachers using data.  I responded that such an approach doesn’t strike me as a way to attract more teachers to the profession, increase competition, and raise the bar.

There was more to the exchange, and I welcome Mr. Petruzzi or others who heard our follow-up to add comments below.  But in the interim, I’ve been wondering how we come to have such different perspectives on what works for teachers.  If Petruzzi is right that multiple measures will overcome the effects of flawed value-added data, then why don’t we just dispense with the value-added data and refine the measures and methods that produce more timely, useful, valid, and robust information to improve teaching?  Why do some people cling to the promise of data so fervently?

A glance at Mr. Petruzzi’s bio (link above) shows an interesting mix of backgrounds in science, business, and organizational management and consulting.  I wouldn’t pretend to know in detail what it takes to succeed in those fields, and I sincerely congratulate him for being able to transfer that background to the management of a school system.  However, I see no information to suggest Petruzzi has worked in a school or classroom, and I offer that without that experience, there’s some room for additional learning.  I’m not suggesting that just because I have decades of classroom experience with thousands of students, I’m right.  But I do have a vital perspective that Mr. Petruzzi is unable to bring to any examination of existing systems, research, and standards.  Those combined perspectives make it clear to most teachers and principals I know that current models of value-added measurement using existing state assessments should not be used to evaluate teacher effectiveness.

After nearly 20 years of teaching, plus years of studying education through coursework, conferences, conversations, school visits, and constant professional reading, I’ve observed that thriving schools are not mysterious.  They tend to have four elements in place: expectations, relationships, access, and support.  An intense focus on data is not an essential part of the formula unless we apply the broadest definition of data: facts and statistics collected together for reference or analysis (New Oxford American Dictionary).

Can data help schools and teachers improve?  Absolutely.  The right data can provide essential insights to guide school management and teachers.  But it takes contextual awareness and professional judgement to apply critical thinking to any set of data and see what use can be made of it – if any.  The movement currently afoot asks educators to pass along flawed data from poor measurement tools we’re not allowed to scrutinize, to defer to unsound assumptions about teaching and learning, to stand by while distant statisticians manipulate the data using shoddy methodology, and then to accept at face value that the results sent back to us have utility and meaning for our professional practices.  Sorry, folks, I can’t let that slide.

Teachers and administrators, take data down off the altar and stop genuflecting.  The data is supposed to serve us, not vice versa, and when it doesn’t serve us well, reject it.

Comment by Rebecca Dow on December 12, 2012 at 3:36pm

Yes!  It makes me crazy to see schools spending all this money and torturing teachers for data – in our district the teachers collect the data, but have no say in what information they collect, or why they are collecting it.  “Data” is meaningless when no one has figured out what problem you are trying to solve.

The focus needs to be on quality teaching. Spot on.
