Assessment · 6 min read

How to Use Data in Teaching Without Drowning in Spreadsheets

Data-driven instruction is one of those mandates that arrives from administration without enough guidance on how to actually do it. Teachers end up with binders full of assessment results they don't know what to do with, or spend Sunday afternoons in spreadsheets that produce no actionable information.

The goal of data in teaching is not data collection. It is better instructional decisions. Here is how to get there without making data analysis a second job.

Start With a Question, Not a Dataset

The most common data mistake is collecting data first and trying to figure out what it says later. This produces noise instead of signal.

Start with the instructional question: What specifically am I trying to know? "Did students master the skill I taught last week?" is a clear question. "How are students doing in reading?" is not.

When you start with a specific question, you can design an assessment that answers it, collect only the data you need, and analyze it quickly because you know what you are looking for.

The Three Questions Data Should Answer

Every assessment data analysis should answer at least one of three questions:

Did the class as a whole learn this? — If the class average on an assessment is low (below 70%), the skill likely needs to be retaught to the whole class. If it is high, you can move on.

Which specific students need additional support? — Students who scored significantly below the class average on a specific skill need targeted follow-up, not just more of the same instruction.

What specifically did students not understand? — An item-level analysis tells you which questions most students missed. That tells you which specific concept or skill broke down — information you need to reteach effectively.

Three questions. Most teachers only look at the first one.
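If you keep scores in a simple list or spreadsheet export, the first two questions reduce to a few lines of arithmetic. Here is a minimal Python sketch using the article's 70% bar for whole-class reteaching; the 15-point "significantly below average" margin and the function names are illustrative assumptions, not a prescribed method:

```python
def class_needs_reteach(scores, threshold=70):
    """Question 1: did the class as a whole learn this?
    Uses the article's 70% bar; returns (needs_reteach, class_average)."""
    average = sum(scores) / len(scores)
    return average < threshold, average

def students_needing_support(scores_by_student, margin=15):
    """Question 2: which students scored well below the class average?
    The 15-point margin is an illustrative choice, not from the article."""
    average = sum(scores_by_student.values()) / len(scores_by_student)
    return [name for name, score in scores_by_student.items()
            if score < average - margin]

# Example with made-up scores:
scores = {"Ana": 92, "Ben": 88, "Cara": 55, "Dev": 81}
needs_reteach, avg = class_needs_reteach(list(scores.values()))
# avg is 79, so the class as a whole is fine, but Cara needs a small-group pull.
```

The point is not the code itself but that both checks are mechanical; once the question is specific, the analysis takes seconds.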

Use Exit Tickets as Your Daily Data Source

Standardized assessments generate useful data, but they arrive too infrequently to drive daily instructional decisions. Exit tickets — two or three questions at the end of a lesson — generate daily data that you can act on tomorrow.

Exit ticket analysis should take five minutes, not an hour. Sort student responses into three stacks: got it, almost got it, didn't get it. The proportion in each stack tells you how to start tomorrow's class.

If 80% got it, do a five-minute review for the other 20% while the rest move on to the next step.


If 50% got it, reteach to the whole class — but differently than yesterday.

If 20% got it, stop. The lesson didn't work. Rethink the approach before continuing.
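The three-pile sort above is simple enough to express as a decision rule. A sketch, assuming the cutoffs between the article's example points (80%, 50%, 20%) sit at 80% and 50%; the exact boundaries are a judgment call, not a formula:

```python
def tomorrows_move(got_it, almost, didnt):
    """Turn the three exit-ticket pile counts into tomorrow's opening move.
    The 80%/50% cutoffs are assumed from the article's examples."""
    total = got_it + almost + didnt
    mastery = got_it / total
    if mastery >= 0.8:
        return "brief review for the small group; the rest move on"
    if mastery >= 0.5:
        return "reteach the whole class, with a different approach"
    return "stop and rethink the lesson before continuing"

# Example: 24 of 30 students got it -> quick review, then move on.
print(tomorrows_move(24, 4, 2))
```

In practice the piles on your desk tell you the same thing at a glance; the rule just makes the thresholds explicit.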

Item Analysis: Know Which Questions They Missed

When you look at a class assessment, don't just look at overall scores. Look at individual questions. Which questions did most students miss? That's your reteaching priority.

If question 7 had a 30% pass rate while question 4, which assessed a different skill, had a 90% pass rate, you know exactly which skill students understand and which they don't. Reteach the concept assessed by question 7, not the whole unit.

Item analysis sounds technical, but it is just counting. Which questions had low pass rates? What skill or concept did those questions assess? That's your next lesson's focus.
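Because it really is just counting, item analysis fits in a few lines. A minimal sketch, assuming each student's responses are recorded as question-to-correct flags; the 50% cutoff for "low pass rate" is an illustrative assumption:

```python
def item_pass_rates(responses):
    """responses: list of per-student dicts mapping question -> correct (bool).
    Returns each question's pass rate across the class."""
    questions = responses[0].keys()
    return {q: sum(r[q] for r in responses) / len(responses)
            for q in questions}

def reteach_priorities(responses, cutoff=0.5):
    """Questions most students missed, worst first.
    The 50% cutoff is an assumed example, not from the article."""
    rates = item_pass_rates(responses)
    return sorted((q for q, rate in rates.items() if rate < cutoff),
                  key=rates.get)

# Example with four students and two questions:
responses = [
    {"q1": True,  "q2": False},
    {"q1": True,  "q2": False},
    {"q1": False, "q2": False},
    {"q1": True,  "q2": True},
]
# q1 passes at 75%, q2 at 25% -> q2's concept is the reteaching priority.
```

Whatever skill the low-rate questions assess becomes the next lesson's focus, exactly as described above.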

Create a Simple Tracking System

You don't need a complex spreadsheet. A simple class roster with skill columns and a color-code system (green = mastered, yellow = approaching, red = needs reteaching) gives you a visual snapshot of where each student is on each skill.

This tracking serves three purposes: identifying students who need support, informing grouping for targeted instruction, and documenting growth over time. It also gives you something concrete to look at during parent conferences or IEP meetings.

The tracking is only useful if it is updated and consulted. A spreadsheet that is filled in at the start of the year and never opened again is not data-driven instruction.
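If you prefer a script to a hand-colored roster, the same color code is easy to generate. A sketch, assuming scores out of 100 and illustrative band cutoffs at 80 and 60 (the article names the colors but not the cutoffs):

```python
def skill_color(score):
    """Map a mastery score to the article's color code.
    The 80/60 band cutoffs are assumed, not prescribed."""
    if score >= 80:
        return "green"   # mastered
    if score >= 60:
        return "yellow"  # approaching
    return "red"         # needs reteaching

def roster_snapshot(gradebook):
    """gradebook: {student: {skill: score}} -> {student: {skill: color}}."""
    return {student: {skill: skill_color(score)
                      for skill, score in skills.items()}
            for student, skills in gradebook.items()}

# Example: one student, two skills.
snapshot = roster_snapshot({"Ana": {"fractions": 85, "decimals": 55}})
# {"Ana": {"fractions": "green", "decimals": "red"}}
```

The output is the same visual snapshot as the roster: one glance tells you who needs a small group on which skill.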

Translate Data Into Specific Actions

Data becomes instructional only when it produces a specific action: "I will reteach X to the whole class tomorrow," "I will pull a small group on Y on Thursday," "I will move on because 85% of students demonstrated mastery." Without a specific action, the data was collected for nothing.

Build the translation step into your data review habit. After analyzing any assessment, write one or two sentences about what you will do differently in the next lesson based on what you found. This is the whole point.

LessonDraft helps you build formative assessment into your lesson plans from the start — exit tickets, comprehension checks, and discussion prompts that generate usable data without requiring separate assessment design.

Your Next Step

Before your next lesson, write one specific question your exit ticket will answer. Design the exit ticket to answer only that question. After class, sort the responses in five minutes and write one instructional decision for tomorrow. Repeat weekly until it is habit.

Frequently Asked Questions

How do you have time to analyze data every day?
The key is simplicity. An exit ticket that asks two questions, analyzed by sorting into three piles, takes five minutes. The analysis doesn't need to be sophisticated to be useful — it needs to answer the question 'what do I do tomorrow?' A five-minute daily data practice produces better instruction than a two-hour monthly data meeting that generates insights no one acts on. Start with a simple daily practice and resist the urge to make it more complex than necessary.
What do you do when data shows students haven't learned something you've already moved on from?
Go back. The compulsion to keep moving through curriculum regardless of student mastery is one of the most counterproductive habits in teaching. Skills that are not mastered do not disappear from students' needs — they become gaps that compound. A five-minute reteach of a foundational skill is a much better investment than continuing to build on a foundation that doesn't exist. When data shows a gap, schedule targeted reteaching as quickly as possible rather than hoping students will catch up on their own.
What is the difference between formative and summative assessment data?
Formative assessment data is collected during the learning process to inform instruction — exit tickets, quick checks, observation notes, draft feedback. It tells you where students are now so you can adjust. Summative assessment data is collected at the end of a learning unit to evaluate what students learned — unit tests, final projects, end-of-year assessments. Both types matter. Formative data drives daily and weekly instructional decisions. Summative data informs unit design and long-term planning. The most common mistake is using summative data as if it were formative — receiving unit test results and then moving on rather than using them to understand what needs to be addressed in the next unit.


Create assessments in seconds, not hours

Generate quizzes, exit tickets, and formative assessments aligned to your standards. Multiple formats, instant results.

No signup needed to try. Free account unlocks 15 generations/month.