Paper or Digital Notes?

At the core of every blog post or article I have read about note-taking is a persuasive pitch for either paper or technology. What should be discussed instead is where and how students are being taught to take notes effectively, regardless of the tool they choose. For example, if students are writing down everything they hear instead of summarizing key points, it doesn’t matter whether they’re doing it digitally or on paper; they shouldn’t be doing it, period. Programs such as AVID that teach students how to take notes properly are a great place to start.

Having said that, digital notes offer distinct advantages that paper can’t match. They can easily be shared with classmates and teachers when created in a platform such as Google Docs, they can be searched to quickly access information, and they can easily be refined as students learn more about whatever they are studying. I can’t tell you how many times my daughter had to redo all of her notes because she wanted to reorganize and refine them; this was a highly debated subject in our household 🙂 I always told her it would have been so much easier had she not used paper.

I also hear about research claiming that writing notes on paper makes the information stick better in our brains, but I am skeptical of the research I have seen to date; in my opinion, it’s certainly biased. I’d love feedback on this topic, as I’ll admit I am also biased in this area because I can’t stand paper clutter. If anyone has any research that seems reasonably objective about note-taking, I’d love to hear about it.


Questions districts should be asking, from an MTSS and implementation perspective

This spring I was introduced to Clay Cook’s work at the U of M, which you can check out here. Several things he presented interested me. First, he is a strong proponent of school-based mental health for students. This last legislative session, thanks to our superintendent and the other intermediate superintendents, the Legislature passed a bill with funding so that intermediate districts (and other cooperatives) in Minnesota could pilot new and innovative school-linked mental health models. This legislation may open the door to the “adjacent possible” that we’ve been trying to open for quite a while!

The second thing that really got my attention was Dr. Cook’s perspective on how often school districts fail to comprehensively examine the initiatives and practices implemented in their districts, many of which are not evidence-based or have not been implemented with fidelity. That’s not to say districts can’t try new things that aren’t yet well researched, but doing so must be intentionally thought through and then properly evaluated over time.

One of the resources shared in his presentation was the National Implementation Research Network, and as I perused their many resources, I stumbled upon the Hexagon Tool, which we have been using for a few weeks now to evaluate and select which initiatives or projects we should invest our time in. So far this tool has really helped our leadership team comprehensively think through the many factors that must be considered when making “go/no-go” decisions on initiatives that impact students and staff district-wide.

The third thing Dr. Cook touched on was the science of implementation, at which many districts fail. I believe these failures stem from federal, state, and local constraints, in addition to a lack of funding. I also think we have suffered from chasing the next shiny thing in education. Regardless of the cause, implementation science can certainly help districts to 1) determine what practices and supports to choose, 2) provide frameworks that significantly increase the odds of successful implementation, and 3) provide evaluation frameworks to ensure that the implemented practices are producing positive outcomes for students.

Here is a set of questions I believe every district could benefit from asking. I look forward to feedback and other ideas:

  • What systems of support are in place for your students?
  • How do you know they’re the right supports for your students, and are they evidence-based?
  • Has your district successfully disseminated the practices and supports to instructional and operational staff? How do you know?
  • How are you evaluating your initiatives to ensure they’re positively impacting student outcomes?
  • What’s preventing you from successfully implementing supports your students need?


What is the True Cost of Email?

How much does email contribute to your stress levels at work and home? Is email helping us be more effective? I’m guessing most of us haven’t spent much time pondering its effects on our lives. The research I’ve found confirms that we should be very concerned about email’s effects on us personally and professionally, as well as its total impact on the organizations we work within.

So what does the research I found say?

  • Email does, in fact, increase our stress levels.
  • The average person spends over 25% of their workweek reading and responding to email.
  • Less than 50% of emails deserve our attention.
  • Many employee hours are wasted checking email, which disrupts our ability to stay focused on the tasks that matter most.

So what can we do about email and its impact on our personal and professional well-being? Possible considerations for employees and organizations to proactively address email overload are:

  • Schedule daily time slots to check emails, rather than constantly or sporadically throughout the day; this will help you to stay focused and remain present for things that matter.
  • Ensure your organization has clear employee expectations about how often voicemail and email should be checked.
  • Promote email best practices and provide training for an organization’s leadership team, and then all employees. Many things shouldn’t be done via email, and training and accountability are how you change email culture.
  • Adopt a staff newsletter approach to be more strategic about what needs to be communicated, rather than flurries of emails that pile up quickly and probably don’t even get read.
  • If more than a couple of emails are required to effectively address an issue, or it’s a touchy subject, schedule a meeting and take the time to build an agenda with meaningful outcomes beforehand.

These are my initial ideas on how to address the email epidemic. I’d love to hear from anyone who has ideas about how we can reduce the clutter in our inboxes and become more effective, efficient, and healthy. Next time you start composing an email, remember that it is a one-way communication tool; if you’re hoping to fully engage with someone or a team, consider a different way to communicate.


Organize Your Life With Google Keep


Do you need a tool to help organize your busy life, or are you looking for a simple way to take notes on your Android, iPhone, or laptop? Google Keep should be the first thing you look at. Why? Well, for starters, it’s free, and most people I talk to seem to be looking for simplicity anyway. So why not start with a freebie and see if it meets your needs?

Google Keep is a great tool that I use personally and professionally, and one I would also recommend for students. You can color-code notes, categorize them, easily make checklists, and share notes with collaborators, and it all syncs across your devices, so you always have access to your notes via your Google account.

If you are struggling to get organized, are ready to move beyond a hodgepodge of paper and digital notes, and are looking to become more efficient at prioritizing your tasks, I highly recommend taking a serious look at Google Keep.

Stay organized my friends!

Testing Considerations for the Minnesota Legislature, from Testing and Assessment Expert, Dr. Ben Silberglitt

When I took on the duties of coordinating research, evaluation, and assessment at the St. Croix River Education District (SCRED), fresh out of graduate school, one of the first tasks I was confronted with was helping our consortium of small, rural, Minnesota districts make sense of these new tests (the Minnesota Comprehensive Assessments, a.k.a. MCAs) and this brand new legislation (No Child Left Behind). As part of our work, I suggested that we lead groups of teachers through a “deep dive” into the assessment, dissecting the test by the types of questions asked, how much weight was given to each strand area, and matching that up with the scope and sequence of their curriculum.

I was working with a group of 3rd-grade teachers on the MCA math, reviewing how the test included a significant number of questions on predicting the outcomes of spinners, dice, and other “games.” Through this review, we discovered that those activities were well-matched to the scope of their curriculum, but two years prior, the teachers had purposefully moved the units covering these topics to the end of the school year, after testing was over.

They had felt at the time like these topics were less rigorous (more “fun”) and would not be as good at preparing students for the MCA. Needless to say, they promptly moved these units to immediately before the test, and the following year third-grade math test scores jumped considerably.

The students’ overall learning didn’t change. The quality of their education was just as high. The socioeconomic status of the students arriving into the district hadn’t shifted. But suddenly, this school moved from near the middle of the pack to near the top of the state, in 3rd-grade math. From an accountability perspective, they were rock stars. What lessons are to be learned from this work, which was playing out in schools across the country?

As the adage goes, “what gets measured gets done,” and in this case, we have an example of a system adapting to the measurement pressures placed upon it. What’s broken about this system is not the act of measurement itself, but the extreme overfocus on summative, accountability-focused assessments that don’t provide timely information to educators.

The legislature recently commissioned a report on our state’s MCAs, and, to paraphrase, it found that we spend a lot of money on these tests, but no one finds them useful. The irony is thick here: you might have said the same thing about this commissioned report. I’ve been working with districts across the state for the past 15 years on topics related to data and testing. I frequently ask large groups at presentations, “How many of you find the MCA useful to help you plan instruction?” I haven’t seen a hand raised yet. I think the collective response by educators to this legislative report has been, “We already knew that!”

My hope is that our state legislature will make informed decisions and listen to what educators already know: Assessments have value, and each has a purpose, but no assessment meets all purposes. The reason the MCA isn’t valuable to educators is that its only purpose is accountability. Accountability isn’t a bad thing, but it isn’t everything.

The cognitive dissonance of policymakers contributed to bad assessment policy: if we spent $50 million on a test, surely it must be amazing. Our state education department leaders, under pressure to make these tests as valuable as policymakers believed they were, then made the fatal mistake of trying to make the MCAs a “silver bullet” – useful for every possible purpose. (It’s a formative, it’s a summative, it’s a Supertest!) Now we see the backlash, and assessment as a whole is at risk as the pendulum swings back.

What we need is policy that understands the need to balance different types of formative assessment (screening, diagnostic, and progress monitoring assessment) with the need for summative accountability and outcomes assessments. We aren’t overtesting our students; we are overtesting for just one purpose, and not providing enough support to districts for other purposes.

We need a wider range of assessments, and our resources (both money and time) should be redistributed across these purposes. This redistribution requires more flexibility for schools to implement research-based assessments that match their needs, and more support to help schools make these choices, as well as support for using these assessments effectively once they are in place. We can do all of this, but our staff continues to be hamstrung by outdated state-mandated assessments, pushed onto districts who are left to pay the extensive “hidden” costs of administration, coordination, professional development, and lost instructional time.

It’s time for a more sensible approach to testing in our state. Minnesota can continue its reputation as a trailblazer in education, by doing what’s right for kids, by supporting a balanced approach to assessment – one that focuses on the needs of educators, giving us the tools we need to do what we do best.


The State of Standardized Testing in Minnesota

Many educators have concerns about standardized testing, and this week the Office of the Legislative Auditor released its evaluation report on standardized student testing. The report confirms what many of us knew all along. Key findings in the report included:

  • MDE spent $19.2 million on standardized tests in FY 2016. Federal sources contributed over one-third of the funding.
  • State-required tests strain the resources of many school districts and charter schools. MDE does not systematically measure the local costs and impacts of state testing requirements.
  • The use of test scores at the local level varies widely, with many teachers and principals not feeling prepared to interpret the testing data.
  • Most school districts and charter schools administer other standardized tests and find their locally adopted tests more useful.

The full report can be accessed here.

I have two questions:

  1. How can we justify paying $19.2 million for standardized testing that has little to no return on investment when we have students that struggle to access education due to lack of transportation, food, housing, and the social services they desperately need?
  2. Why are we spending $19.2 million on testing that doesn’t improve outcomes for students?


What Do You Believe?

“One person with a belief is equal to a force of ninety-nine who have only interest.”
John Stuart Mill, philosopher

Here are a few questions I believe we forget to ask ourselves on a regular basis: What do you believe about education? What is its purpose, and how do your beliefs impact your students, directly or indirectly? If we don’t examine our beliefs and biases, we run the risk of continuing to do what we have always done, which will prevent us from making the changes necessary to provide learning experiences that prepare students for a future that doesn’t exist yet.

For example, if students are merely taught to remember facts, how will they respond when they encounter something they have never seen before? Whether you’re a teacher or an educational support professional working directly with students, it is imperative to model positive responses to adversity. One of the things I strongly believe is that, as leaders of learning, we should be comfortable diving into unfamiliar territory so we can model the learning process for students. After all, students will need to learn how to learn on their own if they are to succeed in the future.

If I hadn’t taken the time to examine my beliefs about education, teaching, learning, mindset, etc., I might never have come to the realization that teachers need to be learners willing to model their learning with their students. As Simon Sinek suggests, we should start with the “why,” and while I agree with this 100%, I also believe we need to ask ourselves what we believe and why we believe it.