Doug Reeves Would Still Write Our SIP Plan . . .


Reeves, D. B. (2010). Transforming professional development into student results. Alexandria, VA: ASCD.

Those of you who know me (this blog’s entire readership?) know I got to meet Doug Reeves a couple weeks ago! In honor of this fanboy moment, I have altered the original plan for this month’s post in order to write about another Doug Reeves book, Transforming Professional Development into Student Results (click here to read my earlier post on my favorite Doug Reeves book: Finding Your Leadership Focus: What Matters Most For Student Results).

This book came out when I was two years into my dissertation, which addressed concerns that are central to this book, so when I first read it I found it very affirming. It is a really quick read, as the last 30 pages are the results and rubric from a study establishing a model for Planning, Implementation and Monitoring (PIM) of professional development. In the 110 pages which precede these appendices, however, Reeves provides a pretty scathing indictment of how most systems attempt to deploy professional learning programs, a summary of research on how to create high-impact professional learning, and an action plan on how to sustain it. The PIM rubric in the appendix provides the capstone to the book: an outstanding tool to assess how close your school is to best practice in its design and implementation of professional development programs.

Reeves exhibits his characteristic style of summarizing research from other giants in the field (e.g., Tom Guskey and Michael Fullan) interspersed with fun illustrative vignettes from scientific research, history, and fictional scenarios. Throughout, he argues that “High-impact professional learning has three essential characteristics: (1) a focus on student learning, (2) rigorous measurement of adult decisions, and (3) a focus on people and practices, not programs” (p.21).

Reeves exhorts leaders to develop a short list of topics (suggesting several that are high leverage in accelerating student learning) around which to provide deep, sustained professional development activities. He then calls for administrators and teacher leaders to remain disciplined.

First, leaders remain fixated on the fact that student achievement is the criterion for evaluating teaching, the curriculum, and assessment strategies. This is the opposite of consumer-driven professional learning, in which teaching professionals select courses and conferences from catalogs. With relentless regularity, focused leaders ask the question “Is it working to improve student learning?” Every other leadership decision that they make must be seen through the lens of the effect on student learning. (p.70)

Reeves then charges leaders to design implementation plans for the professional learning that don’t just bring in some guru to do a one-day lecture on a topic. The best schools follow up the guru with the expectation of sustained “deliberate practice,” the components of which include “performance that is focused on a particular element of the task, expert coaching, feedback, careful and accurate self-assessment, and—this is the key—the opportunity to apply feedback immediately for improved performance” (p.66).

Referencing the “myth of linearity” I discussed in my previous post, Reeves presents research demonstrating that professional development impacts student learning only after extensive “deliberate practice” takes place on the part of teachers (30-100 hours over 6-12 months! (p.67)). He acknowledges that these kinds of sustained implementations can seem impossible to those of us in traditional school districts with limited time dedicated to professional learning. However, Reeves challenges us to consider what would be possible if we chose to radically harness the admittedly limited time that we do have.

Although this sort of commitment may sound overwhelming in a time of tight budgets and crammed schedules, trade-offs are possible. What would be the effect on professional learning if you combined the traditional opening-of-school inspirational speech, four district-level staff development days, and 18 biweekly staff meetings—perhaps 48 hours of professional learning—and focused all of them on improved literacy instruction? While your immediate thoughts might migrate to all of the content that teachers would miss by forgoing those workshops and meetings, weigh that loss against the power of focus on a single area of improved teaching. To make the comparison more dramatic, stop for a moment and evaluate the effect on learning of the school opening, the one-day workshops, and the staff meetings of last year. What aspects of that content are you applying? What would you have missed by being absent those days? If you were to decide in the months ahead to substitute high-impact learning for meetings, assemblies, and workshops, you may decide that you are not giving up very much after all. (p.67)

Reeves argues that professional development programs should not only be evaluated based on student scores on the next test after the professional development program takes place, but also on the adult decisions that resulted (or did not result) in sustained deliberate practice on the part of teachers. Do we as administrators and teacher leaders hold ourselves accountable to implementing professional development deeply and not get distracted by “If It’s Tuesday, It Must Be Brain Research” syndrome (p.48)?

How can we assess ourselves in relation to these high standards? Well, Reeves provides a tool. In Appendix A, Reeves cites a major study that “makes the essential link by showing that effective assessment of adult learning processes is directly related to improved student learning.” In Appendix B, he then provides the assessment instrument used in the study, the “Planning, Implementation, and Monitoring” (PIM) rubric, currently used in more than 2,000 school plans in the United States and Canada.

It includes the following elements: comprehensive needs assessment, inquiry process, specific goals, measurable goals, achievable goals, relevant goals, timely goals, targeted research-based strategies, master plan design, professional development focus, professional development implementation, parental involvement strategies, monitoring plan, monitoring frequency, and evaluation cycle. For each of these elements, the planning, implementation, and monitoring process was assessed on a three-point scale. (p.96)

As a school administrator, I can attest that one glance at the PIM rubric is a pretty humbling experience. It sets a high bar. Here’s a sample descriptor from the “professional development implementation” domain. Before you read it, please note that this describes a school that achieves a TWO on the three-point scale. This isn’t even the top score!

A majority of key initiatives described in action steps are supported by specific professional development and research-based strategies. Professional development support is evident. Examples include time, patient and persistent coaching, mentoring linked with initiatives, and multiple opportunities for training or retraining to support teachers. In a majority of professional development action steps, consideration of adult learning needs and change processes is clearly evident and reflected in time, strategies, and resources (limited initiatives, focused professional development, integrated planning, related support structures, etc.) to sustain growth over time. (p.97)

In a perfect school, Doug Reeves would still write our SIP Plan because it would include a short list of professional development topics, carefully chosen to support student learning, implemented through many hours of deliberate practice. The evaluation of the professional development would consist of ongoing assessment not only of student results, but of the decisions made by administrators and teacher leaders to sustain implementation.

Next on In A Perfect School . . . Mike Schmoker Would Do Our Fidelity Checks . . .