Vulnerability Management · Remediation · Patchly Validate

The Spreadsheet Is Not Your Remediation Program

7 min read

Frank writes about vulnerability management, patch operations, and Microsoft-native security workflows.


I’ve seen this workflow at dozens of organizations over 15 years, and it’s remarkably consistent. The pen test report arrives. Someone exports the findings into a spreadsheet. Columns get added: owner, priority, due date, status. Rows get assigned. Status fields start as “Open” and, over the following weeks, gradually migrate to “In Progress” and then “Resolved.”

Six months later, the next assessment runs. A significant portion of those “Resolved” findings are still present. The fix didn’t hold, or it was applied to the wrong scope, or the person who updated the status genuinely believed it was fixed but never verified. The spreadsheet said the problem was solved. The environment said otherwise.

This is the remediation verification gap. It persists not because the tools don’t exist, but because the process between finding and fixing is still, in most organizations, a manual status-tracking exercise with no verification loop.

How the spreadsheet becomes the program

It usually starts reasonably enough. The pen test report is a PDF. Findings need to be assigned to the people who can fix them, and a spreadsheet is the lowest-friction way to do that: familiar tool, no procurement process, no integration work, and everyone can already access it. It’s the fastest way to turn findings into assignments.

The failure starts immediately, even if you don’t feel it for months.

Ownership goes stale. The person assigned to a finding in March may have changed roles, left the company, or been reassigned by June. Nobody updates the owner column because nobody is looking at it until the next assessment forces the question.

Status is self-reported. When someone marks a finding as “Resolved,” what they typically mean is “I believe I addressed this.” Not “I retested the specific vulnerability and confirmed it’s no longer exploitable.” The difference matters enormously, but the spreadsheet can’t distinguish between the two. Both look like a green cell.

Scope errors compound. A finding might say “exposed admin panel on port 8443.” The remediation team blocks external access to 8443 on the production server. They mark it resolved. But the same admin panel is also exposed on the staging server that wasn’t in the original pen test scope. The spreadsheet says fixed. The exposure persists.

Evidence doesn’t exist. When audit time arrives, the spreadsheet becomes the evidence: a document with “Resolved” beside each finding and a date someone typed in. That’s not evidence of remediation. It’s evidence that someone updated a spreadsheet. A competent auditor will treat those as very different kinds of evidence.

Why self-reported remediation no longer holds up

This matters because organizations are being asked for stronger evidence that remediation actually happened and held. A spreadsheet with “Resolved” in the status column is an assertion. A timestamped comparison showing that a finding was present in one assessment and absent in the next is evidence.

That distinction is where audit friction, insurer questions, and leadership skepticism tend to appear.

I covered the broader verification problem in Finding Vulnerabilities Is Easy. Proving You Fixed Them Is the Hard Part – the spreadsheet is the operational mechanism through which that gap perpetuates itself.

The failure modes are predictable

Spreadsheet-based remediation tracking doesn’t fail randomly. It fails in the same predictable ways, across almost every organization that relies on it.

Handoff failure. The pen test is conducted by an external firm, the report is received by the security team, and the findings are exported to IT operations. At each handoff, context is lost. The operations team sees “Critical: SQL injection on login form” without the full reproduction steps, the affected URL, or the context for why this particular finding is the priority. They fix what they think is the problem. Nobody verifies.

Recurrence blindness. Without structured comparison across assessments, there’s no way to see which findings keep coming back. A misconfiguration that gets remediated and then reintroduced during the next deployment cycle will show up as “new” in the next assessment rather than “recurring.” The spreadsheet has no memory. Every assessment exists in isolation.

Artificial confidence. Organizations often report high remediation rates to leadership based on spreadsheet tracking. The verified remediation rate is frequently lower, not because teams are dishonest, but because self-reported status is not the same thing as tested closure. When the only data you have is self-reported, your metrics reflect what people believe happened, not what actually happened.

SLA gaming. When remediation SLAs are measured by when someone updates a status field, rather than when a vulnerability is verified closed, teams optimize for the metric rather than the outcome. A finding gets marked “Resolved” within the SLA window because someone applied a configuration change. Whether that change actually fixed the problem – and whether it stayed fixed – is a question the spreadsheet never asks.

What replaces the spreadsheet

The answer isn’t a better spreadsheet. It’s a fundamentally different approach to tracking remediation – one built on verification rather than assertion.

Automated retesting. When a finding is reported as remediated, the specific vulnerability should be retested – not the entire engagement, just the targeted finding. If the pen test found an exposed admin panel on port 8443, verification means scanning that port and confirming it’s no longer accessible. This should be fast, scoped, and automated where possible.
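
To make that concrete, here is a minimal sketch of a targeted retest in Python, assuming the finding is an exposed service on a known host and port. The host name and field names are illustrative, not from any real engagement, and a real check would also confirm what is answering on the port, not just that something is:

```python
# A minimal, targeted retest for a single network-exposure finding.
# This is a sketch, not Patchly's implementation; the host name and
# the record's field names are illustrative assumptions.
import socket
from datetime import datetime, timezone

def port_is_reachable(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

def retest_finding(host: str, port: int) -> dict:
    """Retest one specific finding and return a timestamped result."""
    return {
        "target": f"{host}:{port}",
        "still_exposed": port_is_reachable(host, port),
        "retested_at": datetime.now(timezone.utc).isoformat(),
    }

# Example: retest_finding("admin.example.com", 8443)
```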

Structured diff. Every assessment should be compared against the previous baseline, with findings categorized as new, resolved, persistent, or changed. This is the mechanism that turns remediation tracking from a status-update exercise into a measurement system. Without a diff, you can’t see whether your program is actually improving or just cycling through the same problems.
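
A sketch of that comparison, assuming each assessment can be represented as a mapping from a stable finding key (say, rule ID plus affected asset) to its details; the structure is an assumption for illustration, not any real scanner’s export format:

```python
# Sketch of a baseline diff. Assumes two assessments keyed by a stable
# finding identifier; the categories mirror the ones described above.
def diff_assessments(baseline: dict, current: dict) -> dict:
    """Categorize findings as new, resolved, persistent, or changed."""
    diff = {"new": [], "resolved": [], "persistent": [], "changed": []}
    for key, details in current.items():
        if key not in baseline:
            diff["new"].append(key)
        elif details == baseline[key]:
            diff["persistent"].append(key)
        else:
            diff["changed"].append(key)  # same finding, different details (e.g. scope)
    diff["resolved"] = [key for key in baseline if key not in current]
    return diff
```

The stable key is the design decision that matters most: if identifiers shift between assessments, a persistent finding shows up as one resolved plus one new, and recurrence blindness creeps back in.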

Timestamped evidence. The output of the diff is the audit trail. Finding X was present in scan A on January 15. Finding X was absent in scan B on March 12. That’s a verifiable, timestamped record of remediation – the kind of evidence auditors want, insurers ask for, and boards need to have confidence the security program is working.
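
In data terms, the evidence for that finding might look something like the record below. The field names are hypothetical; the point is that the status rests on two dated observations rather than a typed-in cell:

```python
# Hypothetical evidence record emitted by a diff; field names are
# illustrative. The status is backed by two timestamped observations,
# not by someone updating a spreadsheet.
evidence = {
    "finding": "exposed-admin-panel:prod-web-01:8443",
    "present_in": {"scan": "A", "observed": "January 15"},
    "absent_in": {"scan": "B", "observed": "March 12"},
    "status": "resolved",
    "verified_by": "targeted retest",
}
```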

This is the operational gap a remediation platform has to solve. In Patchly, that logic sits in the Scan Diff Engine: each assessment is compared against the prior baseline, findings are categorized by status change, and remediation is confirmed through retesting rather than a human-updated field. The reports pull directly from the diff data, so the executive summary tells a real story: here’s where you were, here’s where you are, here’s what improved, here’s what didn’t.

The reporting conversation changes

The downstream effect of moving from spreadsheet tracking to verified remediation is that the reporting conversation with leadership changes entirely.

Instead of: “We had 47 findings. 38 are marked as resolved. Our remediation rate is 81%.”

You get: “We had 47 findings in the January baseline. 32 are confirmed resolved through retesting. 11 persist. 4 changed scope. 3 are new since the last assessment. Our verified remediation rate is 68%, up from 54% last quarter.”

The second version is a harder number. It’s also a more honest one. It tells leadership whether the program is actually improving, not just whether status fields are getting updated.
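
Tying this back to the diff sketch earlier, the verified number falls directly out of the categorized data. A sketch, using the hypothetical figures above:

```python
def verified_remediation_rate(diff: dict, baseline_total: int) -> float:
    """Confirmed-resolved findings as a fraction of the baseline."""
    return len(diff["resolved"]) / baseline_total

# With the figures above: 32 retested-and-confirmed closures out of a
# 47-finding baseline gives 32 / 47 ≈ 0.68, versus the 38 / 47 ≈ 0.81
# the spreadsheet's self-reported status fields would have claimed.
```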

The spreadsheet was never designed to carry this weight. It’s time to stop asking it to.


Related reading: Finding Vulnerabilities Is Easy. Proving You Fixed Them Is the Hard Part. | The Case for Continuous Penetration Testing

See what verified remediation evidence looks like in practice. Download a sample Patchly Validate report or book a demo.
