Case study · 10 May 2026 · 6 min read

How I marked 30 Year 12 Ancient History essays in 14 minutes

Cheryl · Head Teacher Administration, NSW. Built MarkMate.

I'm Cheryl. I'm Head Teacher Administration at a NSW secondary school, and I built MarkMate. Last week I batch-uploaded my Year 12 Ancient History cohort, 30 essays on Hatshepsut's religious propaganda, into MarkMate. Marking time: 14 minutes.

The same cohort marked by hand last term took me roughly 6 hours.

I'm going to walk through the actual workflow, including the bits that took longer than 14 minutes (rubric upload, sanity checks, follow-up planning), so you can decide whether the headline number is the relevant one for your situation.

What "marking" actually means here

When teachers say "I marked the essays in 14 minutes," there's a category error to watch for. Marking includes:

  1. Reading the response
  2. Scoring against the rubric
  3. Writing student-facing feedback
  4. Logging the result
  5. Preparing follow-up teaching adjustments

MarkMate handles 1, 2, and 3. The 14-minute number is the time from clicking Upload to having 30 marked, annotated, rubric-scored essays with feedback ready to send.

What it doesn't include:

  • Setting up the assignment (rubric upload, marking criteria, etc.). About 4 minutes the first time, 30 seconds for subsequent assessments because you can clone.
  • My own sanity check on each result. I scanned every one, which took another 35 minutes.
  • The cohort summary review and the calls I needed to make about three specific students. About 25 minutes.

So the honest end-to-end number for this assessment was around 80 minutes. That still beats 6 hours by a wide margin, but it's not 14 minutes from start to finish. If anyone tells you marking dropped from 6 hours to 14 minutes, ask them what their setup and override time looked like.
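The time arithmetic above can be sketched in a few lines. All figures are the rough estimates quoted in this post, not measurements:

```python
# Rough end-to-end time arithmetic for this assessment.
# All values are the approximate figures from the post.
hand_marking_minutes = 6 * 60  # last term's by-hand marking of the same cohort

markmate_minutes = {
    "setup (rubric upload, criteria check)": 4,
    "batch marking (upload to feedback)": 14,
    "sanity check of every result": 35,
    "cohort summary review and follow-ups": 25,
}

total = sum(markmate_minutes.values())  # ~78 minutes, "around 80" in the post
reduction = 1 - total / hand_marking_minutes

print(f"{total} min vs {hand_marking_minutes} min: {reduction:.0%} reduction")
# prints "78 min vs 360 min: 78% reduction"
```

Which is where the "roughly 80%" figure later in this post comes from: it's the full 78-minute workflow against 6 hours, not the 14-minute batch run alone.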

The setup

The task was an extended response from the Personalities in Their Times option. The question:

Assess the role of religious propaganda in legitimising Hatshepsut's reign as pharaoh.

The rubric had four marking criteria mapped to NESA Stage 6 outcomes: argument and analysis, source integration, historical knowledge, and structure and expression. Each criterion was scored against a six-band scale.

Setup steps:

  1. Uploaded the task notification PDF (the one I'd given the students).
  2. Confirmed the rubric criteria MarkMate extracted matched what I'd written. They did, but I always check.
  3. Set the assessment type to extended response.
  4. Generated a class link in case I wanted students to self-check before submission. I didn't use it for this assessment, but I would for future drafts.

Total setup: about 4 minutes.

The batch upload

I had 30 PDFs from Google Classroom, named by student. I selected them all and dropped them into MarkMate's batch upload.

The progress bar showed each essay being processed. The first finished in around 25 seconds, the last in around 14 minutes. I had a coffee. I didn't watch the progress bar.

When it finished I had:

  • 30 essays each marked against four rubric criteria with band scores
  • Each essay annotated in green (strengths) and orange (improvements)
  • Click-to-expand explanations on every annotation
  • A class summary showing band distribution and common issues
  • An AI detection report for each student, with four risk categories

The sanity check

I'd promised myself I wouldn't trust MarkMate blindly. So I checked every essay's rubric scores against my own sense of where each student sat, based on their work in class.

Three observations:

Observation 1: The bands matched my expectations on 27 of the 30 essays. Same band on every criterion. No drift.

Observation 2: Two essays were marked one band lower than I'd have given. Both were borderline cases I'd have edged up on the basis that the student had improved across the term. MarkMate doesn't know that context. I overrode the band on both.

Observation 3: One essay was marked one band higher than I'd have given. I read it more carefully. MarkMate had picked up source integration that I'd missed because the student had used Manetho in an unexpected place. I didn't override; I let MarkMate's mark stand.

So my override rate was 2 out of 30, and one of those was a genuine catch on MarkMate's part.

What MarkMate didn't do

The cohort summary flagged that 11 students had weak conclusions. That's useful, but it's not yet teaching. I still had to plan the lesson where I show them what a strong conclusion looks like, with worked examples from the better essays in the cohort.

The AI detection report flagged two essays as Medium risk on language patterns. I read those carefully myself. One was a high-effort piece from a student whose voice has lifted across the term, which is exactly what AI detection sometimes misreads. The other was worth a follow-up conversation with the student, which I had. MarkMate didn't make the call; I did.

Three essays needed me to add context in the feedback. One student's argument was technically sound but didn't engage with the directive verb at the right level. MarkMate flagged the issue but its suggested feedback was generic. I rewrote the feedback in 30 seconds because I knew the student.

What you'd want to know if you were considering this

If you're a teacher reading this and thinking about whether to try MarkMate, here's what I'd want to know in your shoes.

Is the time saving real? Yes, with the caveat above. Including setup and sanity checks, it's roughly an 80% time reduction on a class set, not 100%. That's still significant.

Is the marking quality acceptable? For my class, on this assessment, yes. Two genuine overrides and one catch I'd missed. That's a similar error rate to what I'd accept from a co-marker.

Will it replace your judgement? No, and don't use it that way. The cohort summary and the AI detection are inputs. The decisions about each student's grade, follow-up, and feedback adjustments are yours.

Will the kids notice? They'll notice that the feedback is detailed and rubric-aligned. They won't know whether you wrote it line by line or whether MarkMate did the bulk and you edited. Most will assume you wrote it. (One of mine asked, and I told her the truth: I used a tool, then I checked it, then I added the bits the tool missed.)

Try it on your next class set

The demo on the homepage lets you paste up to 500 words of student writing and see what MarkMate does with it. No sign-up required.

If you'd rather try a real class set, the free plan covers two assignments per month with up to 10 student checks. That's enough to test the workflow above on a small Year 7 or 8 task before you commit to using it on senior cohorts.