Using Feedback To Boost Student Performance In 11 Steps

By Stefan · April 28, 2025

We’ve all been there—spending hours writing detailed comments, only to see students glance at the grade, shrug, and then repeat the same mistakes next time. It’s maddening. Like you’re explaining something important to a brick wall.

Here’s the thing, though: feedback doesn’t have to be ignored. In my experience, the difference isn’t “more feedback” or “better writing.” It’s feedback that’s clear, timely, and built for action.

Below are 11 steps I’ve used (and tweaked) to make student feedback actually get used—without turning grading into your full-time job.

Key Takeaways

  • Make feedback specific and targeted so students know exactly what to change (and how).
  • Aim to return feedback fast—ideally within a week—so it’s still connected to the learning.
  • Emphasize progress and growth, not just missed standards, so students stay motivated.
  • Mix feedback formats (written, audio/video, conferences, peer review) to reach more students.
  • Use a simple structure like Situation-Behavior-Impact to reduce vague “do better” comments.
  • Limit feedback to two or three priorities per assignment so students can actually act on it.
  • Build in reflection (even short prompts) so students process your feedback, not just read it.
  • Track whether students improve after feedback using a simple measurement plan.
  • End every feedback cycle with actionable next steps students can complete immediately.

Ready to Create Your Course?

Try our AI-powered course creator and design engaging courses effortlessly!

Start Your Course Today

1. Provide Specific Feedback (Not “Nice Job”)

Have you ever gotten comments that basically say “nice job” or “needs improvement” and you’re left thinking… okay, but what do I do now?

Students feel that exact confusion too.

The best feedback is specific. It tells them what worked (and what didn’t) and gives a reason tied to the assignment criteria.

For example, instead of “This essay lacks clarity,” I’d write:

“I had trouble following your main point in the second paragraph. Can you clarify your claim there, and add one concrete example (a fact or scenario) to support it?”

That’s actionable. It also signals you read with purpose.

One framework I use a lot is the “feedback sandwich,” but I don’t use it blindly. It works best when the “improvement” part is still concrete. If you just sandwich vague comments, students still won’t know what to change.

Quick example (strong vs. weak):

  • Weak: “Good effort. Needs improvement. Keep it up!”
  • Stronger: “Your introduction grabbed my attention and clearly previewed your topic. In paragraph two, your evidence doesn’t connect back to your claim—add a sentence that explains how the example supports your argument. You’re on the right track; your conclusion already ties things together.”

2. Give Timely Feedback (So It’s Still Useful)

Waiting days (or a week or two) for feedback is one of the fastest ways to make it irrelevant. By the time students see your comments, they've moved on, and often they've made no revisions at all.

In my experience, feedback hits hardest when students get it within a week. If that’s impossible for every assignment, then at least return something faster: an outline score, a rubric check, or quick notes on the first draft.

Here’s a workflow that’s worked for me when grading piles up:

  • Split the task: students submit an outline or draft early.
  • Grade in passes: pass 1 checks criteria A & B (quick scan), pass 2 checks criteria C & D (deeper read).
  • Return partial feedback: “Here’s what to fix first” rather than full line edits for everything.

And yes, the point isn’t just speed. It’s timing—feedback needs to arrive while students still have the mental energy to apply it.

3. Focus on Student Progress (Growth Beats Shame)

When students get negative feedback, it can land like a label: “I’m just not good at this.” I’ve seen it happen in every subject area, honestly.

So I try to shift the conversation from “standards missed” to “progress made.” Not in a fluffy way—more like, what did you improve? and what’s the next step?

Instead of:

“You didn’t meet the requirements.”

I’ll write something like:

“Your paragraph structure improved since the first draft. Next, let’s sharpen your thesis so the rest of the essay has a tighter target.”

That framing helps students treat feedback as part of a learning cycle, not a final verdict.

One practical thing I do is include a “progress callout” at the top of feedback. It’s usually one sentence based on something they already did better.

Example progress callouts:

  • “You used clearer transitions this time, and your ideas flow more logically.”
  • “Your evidence is more relevant than in your last submission—nice improvement.”
  • “You’re asking better questions in your analysis; now we’ll make your claims more specific.”

If you want extra inspiration beyond writing, it can help to look at research on student engagement and how belonging affects persistence. For one example, the NSSE Research Blog shares ways institutions use engagement and belonging data to improve student support—useful context if you’re trying to make feedback part of a bigger retention effort.

4. Use Different Types of Feedback (Mix It Up)

Feedback isn’t just red-ink comments. It’s communication. And different students respond to different formats.

Here are the types I rotate depending on the assignment:

  • Written comments: best for specific rubric criteria and short explanations.
  • Audio/video feedback: great for tone, modeling, and “walkthrough” moments.
  • Conferences/check-ins: useful when students are stuck or when you need to clarify expectations.
  • Peer feedback: works well with structure and examples (more on that below).

About video: I’m a fan, but only when it’s short. A 2–4 minute Loom-style walkthrough is usually enough to explain two things—what you did well and what to change next.

Simple video structure (that students actually follow):

  • 0:00–0:30: “Here are the 2 strengths I noticed…”
  • 0:30–2:30: “Here’s what to fix first…” with one concrete example
  • 2:30–4:00: “Your next step is…” (tell them exactly what to do)

If you do peer feedback activities, don’t just say “be constructive.” Give students a prompt set, a rubric slice, and a reminder that they’re responding to criteria—not judging the person.

5. Try the SBI Feedback Model (Situation → Behavior → Impact)

The SBI model is simple for a reason: it forces specificity.

Situation: what was happening?

Behavior: what did the student do?

Impact: what did it cause / what effect did it have on the reader or audience?

Here’s what it looks like in real feedback:

“In your presentation, you started with a broad claim about the topic. You didn’t give a concrete example until halfway through. The impact is that I wasn’t sure what evidence you’d be using, so it took longer to follow your argument.”

What I like about SBI is that it turns vague feedback into a mini-case study. Students can “replay” what happened and connect the behavior to the result.

One limitation: SBI can get wordy if you try to cover everything. I only use SBI for the two things I most want students to change next.

6. Structure Peer Feedback (So It’s Not Useless)

Peer feedback fails when students think it’s just opinions. You know the vibe: “It’s good!” “Needs work!” “You got this!”

To avoid that, I always structure peer feedback with three things:

  • Clear questions tied to rubric criteria
  • Rating guidance (what counts as “meets,” “approaches,” etc.)
  • Examples of constructive criticism

Peer Feedback Worksheet (copy/paste for your class)

  • 1) What’s the strongest part?
    - Quote or point to one sentence/section you think works well:
    “The strongest part is…”
  • 2) What’s unclear or missing?
    - What did you struggle to understand?
    “I got confused when…”
    - What would make it clearer?
    “To improve, consider…”
  • 3) Evidence / support check (if applicable)
    - Is there evidence that supports the main claim?
    - If not, what type of evidence is missing?
    “The support I’d add is…”
  • 4) Next-step action
    - What’s one revision the writer should do first?
    “Your first revision should be…”
  • 5) Quick rating (circle one)
    - Clarity: Meets / Approaches / Needs work
    - Criteria alignment: Meets / Approaches / Needs work
    - Specificity of feedback you gave: Strong / Okay / Weak

Worked example: strong vs. weak peer feedback

  • Weak peer feedback: “Your essay needs work. Try harder.”
  • Strong peer feedback: “In your second paragraph, your claim is broad, but the example comes later. I think you’ll improve clarity if you add one sentence after the claim that explains what the example will prove. Then move the evidence right after that sentence.”

7. Avoid Overloading Students with Feedback (Two or Three Priorities)

Overwhelming feedback is usually unintentional. Teachers write a lot because they care. But students read it like a to-do list they can’t finish.

So I try to choose the top 2–3 priorities for each submission. Everything else gets shorter notes like “address this in revision” or “see rubric row 2.”

Here’s how I prioritize in practice:

  • First: the biggest impact on understanding (clarity, organization, argument, accuracy)
  • Second: the criterion that directly affects the assignment goal
  • Third: a smaller improvement that’s easy to fix next

If you do only one thing: don’t give students 12 fixes. Give them 2–3 fixes they can actually complete.

8. Keep Feedback Task-Specific (Tell Them What to Do on This Assignment)

Generic advice like “work harder next time” is basically useless. It doesn’t tell students what to change in this task.

Instead, tie feedback directly to the assignment:

Generic: “Your essay structure needs work.”

Task-specific: “Next time, use transitions between paragraphs that explain the relationship (for example: ‘This matters because…’). Also include one piece of evidence in paragraph two to support your claim.”

Students don’t need motivation as much as they need clarity. Task-specific feedback is clarity.

9. Encourage Student Reflection on Feedback (Make Them Respond)

If students read your feedback and then never do anything with it, feedback becomes wasted effort.

Reflection doesn’t have to be long. It just has to be required and connected to action.

Here are a few reflection prompts I’ve used:

  • “In your own words, what are the 2 changes you’ll make next time?”
  • “Which comment helped you most, and why?”
  • “What’s one question you still have about the feedback?”
  • “Copy the rubric row where you improved most. What did you do differently?”

You can collect this as a short paragraph, a quick form submission, or a 3-minute in-class discussion.

In my experience, even a tiny reflection step increases follow-through because students have to process what you wrote—not just skim it.

10. Measure the Impact of Feedback (A Simple Plan)

It’s totally fair to wonder: “How do I know my feedback is working?”

You don’t need fancy research software. You just need a consistent check.

Here’s a measurement plan you can run each grading cycle:

  • Track 1–2 rubric criteria over time.
    Example: clarity of claim and quality of evidence. Compare scores from Assignment 1 to Assignment 2.
  • Use a quick student feedback survey (2 minutes).
    Ask 3 questions on a 1–5 scale:
    - “My feedback helped me know what to change.”
    - “I understood the next steps.”
    - “I used the feedback in my revision.”
  • Look for revision evidence.
    If you allow revisions, check whether students actually changed the targeted areas.
  • Do a quick “common errors” review.
    After grading, list the top 3 repeated issues. Did your feedback address them in a way that reduced repeats?

Example of how results can change your process:

  • Cycle 1: Students score low on “evidence explanation.” Survey shows confusion about what “analysis” means.
  • Change: You add one model paragraph and rewrite feedback to include a specific “because…” sentence students must use.
  • Cycle 2: Survey improves (students report better understanding), and rubric scores for “analysis” rise.

That’s the loop: feedback → student action → evidence of improvement → adjust your approach.

If you want a broader view on why data-informed improvement matters in education, you can explore the Higher Learning Commission site for general guidance on continuous improvement and assessment practices. (I’m not going to pretend there’s one universal “future requirement” that applies to every school, but the core idea—using evidence to improve outcomes—is widely emphasized.)

11. Create Actionable Next Steps for Students (Before They Leave the Page)

Students don’t act on feedback that feels like commentary. They act when you give them a next step that’s specific enough to complete.

I treat the last part of feedback like instructions for a revision. If I can’t imagine a student finishing the task after reading my comments, then it’s not actionable enough.

Here are before/after examples I actually prefer:

Example 1: Argument clarity

Before: “Your argument needs work.”

After: “Rewrite your thesis as one sentence that includes (1) your claim and (2) the reason it matters. Then check each body paragraph: does it directly support that thesis?”

Example 2: Paragraph structure

Before: “Improve your structure.”

After: “For each paragraph, add a topic sentence that states the point of the paragraph. Then include one piece of evidence and one sentence explaining how the evidence supports your claim.”

Example 3: Source use (if applicable)

Before: “Use better sources.”

After: “Replace one source with a peer-reviewed article (or a credible primary source). Add a sentence after the quote that explains why that source is trustworthy and how it supports your point.”

If you want feedback to stick even more, I recommend building these steps into your teaching cycle. For instance, after you grade, do a quick “revision workshop” where students practice the exact fix you highlighted.

And if you’re planning your lessons, it helps to think about feedback like part of the workflow: draft → targeted comments → student revision → check again. That’s where the improvement happens.

FAQs

How often should students receive feedback?

In general, feedback works best when it’s regular and returned soon after students complete the task. If you can’t do it for every assignment, aim for at least one early checkpoint (like an outline or draft) plus feedback on the final submission so students have something to revise while it’s still relevant.

What is the SBI feedback model?

SBI stands for Situation, Behavior, and Impact. You describe the situation, explain the behavior you observed, and then connect it to the impact (how it affected understanding, engagement, or outcomes). It’s a great structure when you want feedback to be clear, objective, and actionable.

How can I get students to actually act on my feedback?

Use specific suggestions and then ask a question that pushes students to respond. Prompts like “What will you change next time?” or “Which comment helped most and why?” encourage students to process your feedback and choose what to do next, instead of just reading it.

Why should I use different types of feedback?

Different students learn differently, and different tasks call for different feedback formats. Mixing written comments, audio/video, peer feedback, and brief conferences can keep students engaged and improve understanding—especially when you match the format to what students need to change.
