I didn’t meet my reading goal (and is that okay?)

2016 has come to a close. Like any year, there were events to celebrate along with a few experiences we may not care to reminisce over. One event that falls somewhere in the middle for me is the fact that I didn’t achieve my reading goal.

For the past two years, I have set a goal for number of books to read from January to December. In 2015 I not only met my goal but surpassed it (50/53). This past year I decided to up the ante – more is better, right? – and set a goal for 60. I ended up reading 55 books this year. Not too shabby, considering my recent move and a new job.


Goodreads, the online community where I and many other bibliophiles post said goals, seems indifferent to this fact. “Better luck in 2017!” is all the feedback Goodreads offers. I can live with that. The site focuses more on all of the books I did read, covers facing out, along with the number of pages read and related statistics.

I guess I could have pushed through in December and quickly devoured some titles just to meet my goal. They may not have been what I necessarily wanted to read though. Also, I could have thrown in a few more books that my wife and I listened to with our kids while driving. But to be honest, I was half listening and didn’t feel like I could count it.

I’m glad that I didn’t get caught up in meeting arbitrary goals. If that had been the case, I may have passed on longer, more complex works of fiction such as All the Light We Cannot See by Anthony Doerr. It’s fiction, yes, but it also helped me deepen my understanding of what it means to live in a nation that does not share your beliefs. If I had worried too much about meeting a reading goal, I might not have reread, and reread again, Last Stop on Market Street by Matt de la Peña. It still floors me how many ideas and perspectives a reader can glean from such a short text. If I had worried too much about meeting my reading goal, I may have avoided reading reference books about writing, such as Write What Matters by Tom Romano and A Writer’s Guide to Persistence by Jordan Rosenfeld. These are not texts you plow through. Yet I come back to these resources for information and inspiration.

If I were teaching in the classroom again, I think I would adopt a Goodreads approach to independent reading. Students would still be expected to set some type of goal based on number of books. But it would not be the sole focus of independent reading. We would look at different data about their reading lives, including:

  • Variety of genres explored
  • Complexity of texts from fall to spring
  • Favorite authors, titles and series based on ratings and reviews
  • Classmates whose reading habits influenced their reading lives
  • Books on their to-read list
  • How they feel about reading in general

This data seems a lot more important than the number of books read. I do believe volume in reading is important. But what leads someone to read? We still confuse reading goals, such as number of books read, with purpose. The purpose of a reading goal is to make a more concerted effort to read more and to read daily. The idea is that through habitual reading, we will discover new titles, authors and genres that we come to enjoy and find valuable in our lives. I think about how I got hooked on reading: in 3rd grade, our teacher read aloud Tales of a Fourth Grade Nothing by Judy Blume. No reading goal, amount of guided reading or immersion into a commercial program did that for me.

As teachers take stock with their students during the school year regarding reading goals, I sincerely hope they look beyond mere numbers and work with their students to understand them as readers. Data that only measures quantity and disregards quality tells us very little about who our students are and who they might become as readers.


Suggestion for Further Reading: No AR, No Big Deal by Brandon Blom

Learning Management Systems: Who are they for?

A learning management system, or “LMS,” is defined as “a digital learning system” that “manages all of the aspects of the learning process” (Amit K, 2016). A teacher can use an LMS for a variety of classroom functions, including communicating the learning objectives, organizing the learning timelines, telling the learners exactly what they need to learn and when, delivering the content straight to the learners, streamlining communications between instructor(s) and learners, and providing ongoing resources.

An LMS can also help the learner track their own progress, identifying what they have learned already and what they need to learn (Amit K). There are many options for learners to share their representations of their understandings within an LMS, including video, audio, images and text. In addition, discussion boards and assessment tools are available for teachers and students in most systems.

This definition and description of your typical LMS leads to an important question: Who is the learning management system for?

If an LMS is for the teacher, then I think they will find the previously listed features to be of great benefit to their practice. As an example, no longer do they have to collect papers, lug them home and grade them by hand. Now, students can submit their work electronically through the LMS. The teacher can assess learning online. The excuse “My dog ate my homework” ceases to exist. Google Classroom, Schoology and Edmodo fall into this category.

Also, teachers can use LMS tools to create quizzes that serve as a formative assessment of the lesson presented that day. Data is immediately available regarding who understands the content and who needs further support. This quick turnaround can help a teacher be more responsive to students’ academic needs. There are obvious benefits for a teacher who elects to use an LMS for these reasons.
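To make the quick-turnaround idea concrete, here is a minimal sketch of how exported quiz results might be summarized to flag students who need further support. It is not tied to any particular LMS; the data format and the 70% mastery cutoff are assumptions for the sake of illustration.

```python
# Hypothetical quiz results: 1 = correct answer, 0 = incorrect.
quiz_results = [
    {"student": "Avery",  "answers": [1, 1, 0, 1]},
    {"student": "Blake",  "answers": [0, 0, 1, 0]},
    {"student": "Carmen", "answers": [1, 1, 1, 1]},
]

def needs_support(results, mastery=0.7):
    """Return names of students scoring below the mastery threshold."""
    flagged = []
    for r in results:
        score = sum(r["answers"]) / len(r["answers"])
        if score < mastery:
            flagged.append(r["student"])
    return flagged

print(needs_support(quiz_results))  # → ['Blake']
```

A teacher could glance at a summary like this the same afternoon and regroup students for reteaching the next day, which is the responsiveness described above.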

If, on the other hand, an LMS is for the students, then we have a bit more work to do. With a teacher-centric LMS, not much really changes regarding how a classroom operates. The teacher assigns content and activities, the students complete it, and the teacher assesses. The adage “old wine in new bottles” might apply here.

With students in mind when integrating an LMS in school, the whole idea of instruction has to shift. We are now exploring concepts such as personalized learning, which “puts students in charge of selecting their projects and setting their pace” (Singer, 2016), and connected learning, which ties together students’ interests, peer networks and school accomplishments (Ito et al., 2013). In this scenario, it is not the students who need to make a shift but the teachers. Examples of more student-centered LMSs include Epiphany Learning and Project Foundry.

The role that teachers have traditionally filled looks very different from what a more student-centered, digitally enhanced learning environment might resemble. I don’t believe either focus – the teacher or the student – is an ineffective approach for using a learning management system. The benefits in each scenario are promising. Yet we know that the more ownership students have over the learning experience, the greater the likelihood of achievement gains and engagement in school.


Amit K, S. (2016). Choosing the Right Learning Management System: Factors and Elements. eLearning Industry. Available: https://elearningindustry.com/choosing-right-learning-management-system-factors-elements

Ito, M., Gutiérrez, K., Livingstone, S., Penuel, B., Rhodes, J., Salen, K., Schor, J., Sefton-Green, J., & Watkins, S.C. (2013). Connected Learning: An Agenda for Research and Design. Media and Learning Research Hub. Whitepaper, available: http://dmlhub.net/publications/connected-learning-agenda-for-research-and-design/

Singer, N., Isaac, M. (2016). Facebook Helps Develop Software That Puts Students in Charge of Their Lesson Plans. The New York Times. Available: http://nyti.ms/2b3LNzv

Data-Driven or Data-Informed? Thoughts on trust and evaluation in education

Data-informed or data-driven? This is a question I have wrestled with as a school administrator for some time. What I have found is that the usefulness of student data to inform instruction and accountability rests on the level of trust that exists within the school walls.

First, there is trust in the data itself. Are the results of these assessment tools reliable (consistent across administrations over time and across students) and valid (accurately measuring the student learning they claim to measure)? These are good initial inquiries, but they should only be a starting point.

Security of student information should also be a priority when electing to house student data with third parties. Questions I have started asking vendors that develop modern assessment tools include “Where do you house our student data?”, “What do you do with this data beyond allowing us to organize and analyze it?”, and “Who owns the student data?”. In a commentary for The New York Times, Julia Angwin highlights situations in which the algorithms used to make “data-driven decisions” regarding the probability of recidivism in the criminal justice system were too often biased in their results (2016). Could a similar situation happen in education? Relying merely on the output a computer program produces leads one to question the validity and reliability of this type of data-driven decision making.

A second issue regarding trust in schools related to data is how student learning results are being used as a tool to evaluate teachers and principals. All educators are rightfully skeptical when accountability systems ask for student learning results to be counted toward their performance ratings and, in some cases, level of pay and future employment with an organization.

This is not to suggest that student assessment data should be off the table when conversations occur regarding the effectiveness of a teacher and his or her impact on their students’ learning. The challenge, though, is ensuring that there is a clear correlation between the teacher’s actions and student learning. One model for data-driven decision making “provides a social and technical system to help schools link summative achievement test data with the kinds of formative data that helps teachers improve student learning across schools” (Halverson et al., 2007, p. 162). Using a systematic approach like this, in which educators are expected to work together using multiple assessments to make instructional decisions, can simultaneously hold educators collectively accountable while ensuring that students are receiving better teaching.

Unfortunately, this is not the reality in many schools. Administrators too often adhere to the “data-driven” mentality with a literal and absolute mindset. Specifically, if something cannot be quantified, such as teacher observations and noncognitive information, school leaders may dismiss these results as less valuable than what a more quantitative tool might offer. Professional trust can tank in these situations.

That is why it is critical that professional development plans provide training to build assessment literacy with every teacher. A faculty should be well versed in the differences between formative and summative assessments and between informal and formal measurements, in deciding which data points are more reliable than others, and in how to triangulate data in order to analyze results and make a more informed decision regarding student learning.
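As a rough illustration of what triangulation might look like in practice, here is a small sketch. The measure names and the simple majority rule are my own illustrative assumptions, not a formal model of assessment literacy:

```python
# A hedged sketch of "triangulating" assessment data: reach a conclusion
# only when a majority of independent measures point the same way.

def triangulate(measures):
    """Given {measure_name: True/False for growth observed},
    return a conclusion only when a majority of measures agree."""
    positives = sum(1 for v in measures.values() if v)
    if positives * 2 > len(measures):
        return "evidence of growth"
    if positives * 2 < len(measures):
        return "no clear growth"
    return "inconclusive - gather more data"

data = {
    "running_record": True,     # formal, quantitative
    "conference_notes": True,   # informal, qualitative
    "student_survey": False,    # self-reported
}
print(triangulate(data))  # → evidence of growth
```

The point is not the code itself but the habit it encodes: no single data point, quantitative or otherwise, settles the question on its own.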

Since analytics requires data analysis, institutions will need to invest in effective training to produce skilled analytics staff. Obtaining or developing skilled staff may present the largest barrier and the greatest cost to any academic analytics initiative (Baer & Campbell, 2012).

Building this assessment literacy can result in a level of trust in oneself as a professional to make informed instructional decisions on behalf of kids. If a faculty can ensure that the data they are using is a) valid and reliable, b) used to improve student learning and instructional practice, and c) drawn wisely from multiple sources, then I am all for data-driven decision making as a model for school improvement. Trust will rise and student achievement may follow. If not, an unfortunate outcome might be the data cart coming before the pedagogical horse.


Angwin, J. (2016). Make Algorithms Accountable. The New York Times. Available: http://www.nytimes.com/2016/08/01/opinion/make-algorithms-accountable.html?_r=0

Baer, L.L. & Campbell, J. (2012). From Metrics to Analytics, Reporting to Action: Analytics’ Role in Changing the Learning Environment. Educause. Available: https://net.educause.edu/ir/library/pdf/pub72034.pdf

Halverson, R., Gregg, J., Prichett, R., & Thomas, C. (2007). The New Instructional Leadership: Creating Data-Driven Instructional Systems in Schools. Journal of School Leadership, 17, pp. 159-194.

This is a reaction paper I wrote for a graduate course I am currently taking (Technology and School Leadership). Feel free to respond in the comments to extend this thinking.

Action Research: Professional Learning from Within

By becoming question-askers and problem-solvers, students and teachers work together to construct curriculum from their context and lived experiences.

– Nancy Fichtman Dana

Over 20 teachers recently celebrated their learning as part of their work with an action research course. They presented their findings to over 50 colleagues, friends, and family members at a local convention center. I was really impressed with how the teachers saw data as a critical part of their research. Organizing and analyzing student assessment results was viewed as a necessary part of their practice, instead of simply a district expectation.

Equally impressive was how some of the teachers shared data that suggested their interventions did not have an impact on student learning. One teacher, who explored student-driven learning in her middle school, shared survey results that revealed little growth in her students’ dispositions toward school. What the teacher found out was she had not provided her students the necessary amount of ownership during class.

Another teacher did find some positive results from her research on the benefits of reflection during readers workshop. Students wrote in response journals and engaged in authentic literature circles to unpack their thinking about the books they were reading. At the end of the school year, the teacher was starting to observe her students leading their own literature conversations with enthusiasm. This teacher is excited about having some of these same students in 2016-2017, as she is looping up. “I am really looking forward to seeing how these kids grow within the next year.”

A third teacher shared her findings on whether teaching students to speak and listen would increase their reading comprehension and their love for literacy. One of her data points – student surveys – was not favorable toward this intervention. Yet her other two pieces of data (anecdotal evidence, volume of reading) showed positive gains. Therefore, she made a professional judgment that her students did grow as readers and thinkers. This teacher is also reflecting on the usefulness of this survey for next year.

In these three examples, I couldn’t help but notice some unique outcomes of this action research course:

  • Teachers were proudly sharing their failures.

With the first teacher, who focused on student-driven learning, she developed a greater understanding of her practice than would probably have been possible in a more traditional professional learning experience. She learned what not to do. This teacher is stripping away less effective methods in favor of something better. And the reason she is able to do this is because she had a true professional learning community that allowed her to take risks and celebrate her discoveries.

  • Teachers didn’t want the learning to end.

This goes beyond the teacher who expressed her excitement in looping with her current students next year. Several participants in this action research course have asked if they could take it again. The main reason: They felt like they just found the question they really wanted to explore. It took them most of the school year to find it.

  • Teachers became more assessment literate.

The term “triangulation” was never referenced with the teacher who focused on conversations to build reading comprehension and engagement. Yet that is what she did when she felt one set of data did not corroborate the other results and her own professional judgment. Almost all of the staff who participated in action research had 3-5 data points to help make an informed conclusion about the impact of their instruction.

I also learned a few things about myself as an administrator:

  • It is not the professional development I offer for staff that makes the biggest difference – it is the conditions I create that allow teachers to explore their interests and take risks as innovative practitioners.
  • My role is often alongside the professionals instead of in front of them, even learning with them when possible. For example, we brought in two professors from UW-Madison to lead this course. The best decision I made was recognizing that I was not the expert, and I needed to seek out those who were.
  • Principals have to be so careful about providing feedback, as we often haven’t built up enough trust, we can make false assumptions about what we are observing, and/or we do not allow teachers to discover better practices on their own terms.

In a world of standards and SMART goals, it is frowned upon when teachers don’t meet the mark regarding student outcomes. The assumption in these situations is that the teacher failed to provide effective instruction. However, the fault in this logic is that learning is not always a linear process. We work with people, dynamic and unpredictable beings who need a more personalized approach for real learning. Facilitating and engaging in action research has helped me realize this.

From Idea to Iteration: Honoring the Process of Learning #IPDX16

One of my favorite books to read aloud, to staff and students, is What Do You Do With an Idea? by Kobi Yamada and Mae Besom (Compendium, 2014). According to the summary posted on Barnes and Noble:

This is the story of one brilliant idea and the child who helps to bring it into the world. As the child’s confidence grows, so does the idea itself. And then, one day, something amazing happens.

This is a story for anyone, at any age, who’s ever had an idea that seemed a little too big, too odd, too difficult. It’s a story to inspire you to welcome that idea, to give it some space to grow, and to see what happens next. Because your idea isn’t going anywhere. In fact, it’s just getting started.


It’s been a year and a half since I published my first book on digital portfolios for students. In the time between then and now, my beliefs regarding the smart use of technology to provide authentic, connected assessment for students to showcase their understanding and skills have largely stayed the same. I continue to reference this resource in my workshops, such as the one I facilitated today at AcceleratED.

The consistency in the concept that learners require access, purpose, and audience for this type of learning to take place gives credibility to what I’ve shared today and in the past. This is what I knew at the time:

Source: Digital Student Portfolios: A Whole School Approach to Connected Learning and Continuous Assessment (eBook, 2014)

The visual was designed to locate access as the cornerstone for all of the other work we might engage students in with regard to digital assessment. The purpose of the learning task and the audience for this work would envelop the access students require to share their learning in ways that best meet their needs and preferences.

My thinking has not changed in these three tenets of engagement with digital assessment. However, I am wondering if this visual is the only representation for this framework. As I was flying over the Rockies from Denver on my way to Portland for the excellent AcceleratED experience, a new visual coalesced.


This graphic was not rendered with the same production quality as the previous one, but the difference is hopefully clear. By providing students with multiple ways to represent their learning (audio, video, image, text), we can help them feel more successful as well as better inform the teacher about the next steps (purpose) in their learning journey. Motivation increases when there is an authentic audience viewing student learning, namely their family, through digital tools such as FreshGrade (www.freshgrade.com). One tenet of engagement informs the other, which informs the other, and back again. Kind of like learning! 🙂

In my subsequent experience as a school principal visiting classrooms regularly since writing this digital resource, I have found that the digital portfolio assessment process is as much a cycle as a framework. Was I wrong in my initial thinking? I don’t think so. It was my paradigm at the time, and I think the premise still holds true. What I’ve realized since then is that what I imagine as a mental model doesn’t necessarily translate to reality. As a lifelong learner, I’ve received a lot of feedback from other educators and explored different perspectives on this topic. The more I learn, the more questions I have.


The main message from What Do You Do With an Idea? is that when we share something new and possibly innovative to the world, it is hard to predict where the idea might lead. Others start to own it, put their personal stamp on it, and eventually make it their own. This is okay. I have given digital portfolio assessment “some space to grow, and to see what happens next.” It wasn’t my original idea anyway. The initial framework has evolved due to other educators’ perspectives and from my own reflections. Who am I to stop these continuous iterations? I look forward to what the framework might look like in 2017.



How Can I Rethink Reading Logs with High Schoolers?

This post is actually a lengthy reply I left for a reader, who asked me the question via comments in a post I published a year and a half ago. So great to see how what we share online impacts other schools!

Hi Francisco. I appreciate your honest question. I’m not experienced with high school, but I have some thoughts. My initial suggestion is to get your students on Goodreads (https://www.goodreads.com/about/how_it_works). If you are not familiar with Goodreads, it is a social media tool for readers. They can use their Facebook accounts to create an account within Goodreads. Readers can rate and review books, read what others are reading, and have suggestions sent to them based on their past interests (https://www.goodreads.com/recommendations). Students can also make “to-read” lists, selecting what books they want to read next, which all readers should have anyway.

Maybe have them take the Goodreads Book Challenge (https://www.goodreads.com/challenges/), where they select a number of books they plan to read for the calendar year. They can then see their progress as time goes on. They can also recommend books to peers through Goodreads as long as they are “friends”. In addition, the students can download the book titles they’ve read so far into a spreadsheet to share with you periodically. They could also use this list as a way to reflect about their reading, such as what genres they prefer and who has been influential in their reading lives.
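The spreadsheet idea above could even become a small reflection activity in itself. As a sketch: the column names below mirror the headers in a Goodreads CSV export (“Title”, “Author”, “My Rating”), but treat the exact format, and the sample data, as assumptions for illustration.

```python
import csv
from collections import Counter
from io import StringIO

# Hypothetical excerpt of a student's exported reading list.
sample_export = """Title,Author,My Rating
The Crossover,Kwame Alexander,5
Booked,Kwame Alexander,4
The Giver,Lois Lowry,5
"""

rows = list(csv.DictReader(StringIO(sample_export)))

books_read = len(rows)
favorite_authors = Counter(r["Author"] for r in rows).most_common(2)
avg_rating = sum(int(r["My Rating"]) for r in rows) / books_read

print(books_read)        # → 3
print(favorite_authors)  # → [('Kwame Alexander', 2), ('Lois Lowry', 1)]
```

A student looking at output like this has concrete material for the reflection questions mentioned earlier: which genres and authors dominate their list, and who has influenced their reading life.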

I also like the “groups” function of Goodreads, which is an online community around a topic, favorite author, or a genre. Discussion boards can be created within a group. Goodreads is very mobile friendly, so they can use their smartphones and tablets for this purpose at school. One more idea: As students build a substantial list of books they’ve read, they can start creating libraries around the categories of books they have been reading.

If there are privacy or sharing concerns from families or administration, you could also have students use Google Docs to keep track of their reading and thinking, though it is not as authentic. As for strategy work with high schoolers, older students have shown that they can teach themselves strategies when they are motivated to read – that is, when they can pick their texts and talk about them with friends. Our job as teachers at this level is to make students aware of the strategies they are using, which can then lead into future instruction with more complex texts they will need to read closely, today and in the future.

As I stated, I do not have a lot of background in adolescent literacy, but reading enough of the research tells me that older students’ reading instruction should be as authentic and relevant as possible. Your students may continue to use Goodreads as they get older, which also helps them build a positive digital footprint for the future. Using a social media tool would allow your students to continue their conversations with peers beyond the school day. They will be doing exactly what you ask of them with less of the griping, because they won’t see it as school work.

Good luck!


Data Poor

Have you heard of “DRIP”? It stands for “Data Rich, Information Poor.” The phrase conveys the idea that we have all of this data in schools but cannot organize it in a way that gives us the information to make responsive instructional decisions.


But what if this is not the case? What if we are actually data poor? When we consider only quantitative measures during collaboration, such as test scores and interim assessments, we miss out on a lot of information we can glean from more qualitative, formative assessments. These might include surveys, images, audio, journaling, and student discussions.

In this post for MiddleWeb, I profile two teachers in my district who have leveraged technology to better inform their instruction and student learning. The videos and web-based products the students and teachers develop are captured as close to the learning as possible. The results are dynamic, authentic, and minimally processed.

In tomorrow night’s All Things PLC Twitter chat (follow #atplc), we will pose questions to dig more deeply into what data means in the modern classroom. There are too many ways for learners to show what they know to ignore the potential of connected learning and continuous assessment. Join us at 8 P.M. CST for this discussion.