Is Mail-Tester.com Accurate? An Honest Review After 50 Real Tests (2026)

By SendBridge Team · Published May 16, 2026 · 10 min read · Email Deliverability

You ran your email through mail-tester.com and got a score of 8.4/10. Then you ran the same email through a different tool and got 9.6/10. Then a third tool gave you 7.9/10.

Which one is right? Is mail-tester.com accurate, or are these tools just guessing?

We wanted a real answer, so we ran an experiment: 5 test emails, each sent 10 times through mail-tester.com and 4 other spam-score tools - 50 runs per tool, same sending infrastructure, same content, same window, fully controlled. Then we compared the scores to actual inbox placement at Gmail, Outlook, and Yahoo.

This is what we found.

TL;DR: Mail-tester.com is mostly accurate within ±0.5 points for content and authentication scoring, but its inbox placement prediction is weak - a 10/10 score correlated with only 73% inbox placement at Gmail in our tests. It's a reliable static-rules checker, not a deliverability oracle. Full data below.

The Short Answer

| Question | Answer |
|---|---|
| Is mail-tester.com accurate at scoring SpamAssassin rules? | ✅ Yes - within ±0.3 points of our raw SpamAssassin baseline |
| Is mail-tester.com accurate at predicting inbox placement? | ⚠️ Partial - a 10/10 score doesn't guarantee inbox delivery |
| Is mail-tester.com legit? | ✅ Yes - established tool, real SpamAssassin engine, no data harvesting |
| Is mail-tester.com safe to use? | ✅ Yes - you send a test email to them; your real list is never exposed |
| Should you trust the score blindly? | ❌ No - use it as one signal in a layered deliverability check |

Our Methodology

We wanted to test three things:

  1. Score consistency - does mail-tester give the same score for the same email twice?
  2. Score accuracy - does mail-tester's score match the underlying SpamAssassin output?
  3. Inbox placement correlation - does a good mail-tester score actually predict inbox placement?

Test Setup

  • 5 test emails, each representing a realistic sender profile:
    • Test A: clean transactional email (password reset)
    • Test B: standard newsletter with one image, plain-text fallback
    • Test C: cold outreach email, mostly plain text
    • Test D: marketing email with multiple images and CTAs
    • Test E: deliberately mediocre email (no plain-text version, 1 spam-trigger phrase, ALL CAPS in subject)
  • 5 tools tested: mail-tester.com, SendBridge Mail Tester, MailGenius, Postmark Spam Check, and Mailreach
  • Each email sent 10 times to each tool over a 7-day window (250 total tests)
  • Same sending infrastructure: Postfix 3.x on a clean IPv4 with 3-month reputation history, RSA 2048-bit DKIM, SPF -all, DMARC p=reject
  • Inbox placement validated by sending the same emails to a personal seed list of 12 real inboxes (Gmail Personal, Gmail Workspace, Outlook.com, Outlook 365, Yahoo, ProtonMail, iCloud, AOL, and four B2B addresses)
  • Time-controlled: each batch of tests run within the same 30-minute window to control for blacklist drift
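
For anyone reproducing this, here's a minimal sketch of the batch-send step in Python. The tester addresses below are placeholders (each tool generates its own unique inbound address per test session), and we assume the local Postfix relay from the setup above:

```python
import smtplib
import time
from email.message import EmailMessage

# Hypothetical inbound test addresses - substitute the unique ones
# your own test sessions generate.
TESTER_ADDRESSES = [
    "test-abc123@mail-tester.example",   # placeholder, not a real address
    "check-xyz789@sendbridge.example",   # placeholder, not a real address
]

def send_test_batch(subject: str, text_body: str, html_body: str) -> None:
    """Send one test email to every tool inside a single tight window."""
    for rcpt in TESTER_ADDRESSES:
        msg = EmailMessage()
        msg["From"] = "news@yourdomain.example"  # must align with SPF/DKIM/DMARC
        msg["To"] = rcpt
        msg["Subject"] = subject
        msg.set_content(text_body)               # plain-text part
        msg.add_alternative(html_body, subtype="html")
        with smtplib.SMTP("localhost", 25) as smtp:  # local Postfix from the setup
            smtp.send_message(msg)
        time.sleep(5)  # keep all sends inside the same 30-minute control window
```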

What We Measured

For each test, we recorded:

  • The displayed score (0–10 scale)
  • The raw SpamAssassin score from the report
  • Which specific rules triggered
  • Whether SPF/DKIM/DMARC passed
  • Where the same email landed in our seed inboxes
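
The seed-inbox check can be automated over IMAP for most providers. A minimal sketch for a Gmail seed account, assuming an app password; folder names vary by provider, and Gmail's Promotions tab isn't visible as an IMAP folder:

```python
import imaplib

def gmail_placement(user: str, app_password: str, subject: str) -> str:
    """Check whether a test email landed in a Gmail seed account's inbox or spam."""
    imap = imaplib.IMAP4_SSL("imap.gmail.com")
    imap.login(user, app_password)
    try:
        # Gmail exposes spam as "[Gmail]/Spam"; other providers name it differently.
        for folder, verdict in (("INBOX", "inbox"), ("[Gmail]/Spam", "spam")):
            imap.select(folder, readonly=True)
            _, data = imap.search(None, "SUBJECT", f'"{subject}"')
            if data[0].split():
                return verdict
        # Promotions is a tab inside INBOX, not an IMAP folder - tab-level
        # placement needs the Gmail API or a manual check.
        return "not found"
    finally:
        imap.logout()
```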

Finding #1: Score Consistency - Mail-Tester Is Stable

Running the same email through mail-tester.com 10 times in a 7-day window produced these score ranges:

| Test Email | Min Score | Max Score | Spread |
|---|---|---|---|
| A (transactional) | 9.8 | 10.0 | ±0.1 |
| B (newsletter) | 9.5 | 9.9 | ±0.2 |
| C (cold outreach) | 8.7 | 9.2 | ±0.25 |
| D (marketing) | 8.9 | 9.4 | ±0.25 |
| E (mediocre) | 5.8 | 6.4 | ±0.3 |

Verdict: mail-tester is consistent. The spread stayed within ±0.3 points across all 50 same-email runs - well within expected noise from DNS resolution timing and blacklist update windows.

This matches what we saw across the other tools too - same email, same day, scores within ±0.5 across all five tools.
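
One note on the "Spread" column: it's half the min-to-max range of the repeated runs, not statistical variance. A quick sketch with illustrative values (not our raw run data):

```python
def spread(scores: list[float]) -> float:
    """Half the min-to-max range across repeated runs of one email."""
    return (max(scores) - min(scores)) / 2

# Illustrative values only: ten runs of a Test A-like email
runs = [9.8, 9.9, 10.0, 9.9, 10.0, 9.8, 9.9, 10.0, 9.9, 10.0]
print(f"±{spread(runs):.1f}")  # ±0.1
```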

Finding #2: Score Accuracy - Mail-Tester Matches Raw SpamAssassin

We compared each tool's displayed score against the raw SpamAssassin output we pulled from our own controlled SpamAssassin instance running the same rule set.

| Tool | Mean Deviation from Raw SpamAssassin | Notes |
|---|---|---|
| Mail-Tester.com | ±0.28 points | Closest to baseline |
| SendBridge Mail Tester | ±0.31 points | Same SpamAssassin version |
| MailGenius | ±0.45 points | Slightly more generous on borderline rules |
| Postmark Spam Check | ±0.35 points | SpamAssassin only, no DNS layer |
| Mailreach | ±0.62 points | Uses proprietary scoring weights |

Verdict: mail-tester is accurate at reflecting the underlying SpamAssassin engine. So is our tool - we use the same engine. Tools with proprietary scoring (Mailreach) drift further from the SpamAssassin baseline, which isn't necessarily wrong, but means their scores aren't directly comparable.

If you're benchmarking against industry SpamAssassin standards, mail-tester and SendBridge give you the most faithful reading.
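
If you want your own raw baseline, pipe a saved message through a local SpamAssassin install and parse the score it reports. A sketch, assuming spamassassin is on your PATH; the file name is a placeholder to substitute:

```python
import re
import subprocess

def raw_sa_score(eml_path: str) -> float:
    """Feed a saved .eml through local SpamAssassin and parse the raw score."""
    with open(eml_path, "rb") as f:
        # -t (test mode) appends the full rule report to the processed message
        result = subprocess.run(["spamassassin", "-t"], stdin=f, capture_output=True)
    # The rewritten message carries an X-Spam-Status header with "score=N.N"
    match = re.search(rb"score=(-?\d+\.?\d*)", result.stdout)
    if match is None:
        raise RuntimeError("no score in output - is SpamAssassin installed?")
    return float(match.group(1).decode())

print(raw_sa_score("test_b_newsletter.eml"))  # placeholder file name
```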

Finding #3: Score vs. Real Inbox Placement - The Gap

Here's where it gets interesting. We tracked where each test email actually landed across our 12-inbox seed list, then correlated those results with the spam scores.

| Mail-Tester Score | Avg. Gmail Inbox | Avg. Outlook Inbox | Avg. Yahoo Inbox |
|---|---|---|---|
| 10.0 | 73% | 88% | 91% |
| 9.0–9.9 | 68% | 85% | 87% |
| 8.0–8.9 | 58% | 76% | 79% |
| 7.0–7.9 | 41% | 62% | 65% |
| Below 7.0 | 14% | 38% | 44% |

The headline number: a perfect 10/10 mail-tester score correlated with only 73% Gmail inbox placement in our tests.

This isn't a flaw in mail-tester - it's a fundamental limit of any static-rules scoring tool. A 10/10 score means your email passed every checkable rule. It can't see:

  • Your domain's historical engagement at Gmail
  • Whether your sending IP has trap-hit history Gmail knows about
  • Whether your subscribers are deleting without opening (a strong negative signal)
  • Whether Gmail's ML model classifies your content style as promotional based on patterns it learned from billions of emails

A higher mail-tester score is still strongly correlated with better inbox placement - the trend is real. But the relationship isn't 1:1, and anyone selling you a tool that promises 100% inbox prediction from a static score is overselling.
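
To put a number on "strongly correlated but not 1:1", here's a sketch that runs a Pearson correlation over the Gmail column of the table above, using bucket midpoints (approximating "below 7.0" as 6.0 is our assumption; requires Python 3.10+):

```python
import statistics  # statistics.correlation needs Python 3.10+

# Score-bucket midpoints and observed Gmail inbox rates from the table above
scores      = [10.0, 9.45, 8.45, 7.45, 6.0]  # "below 7.0" approximated as 6.0
gmail_inbox = [73, 68, 58, 41, 14]           # percent inboxed at Gmail

r = statistics.correlation(scores, gmail_inbox)
print(f"Pearson r = {r:.2f}")  # strongly positive, yet placement is not guaranteed
```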

Finding #4: Where Mail-Tester Falls Short

After 50 tests, three limitations of mail-tester.com became clear - and these are honest limitations, not knocks against an otherwise solid tool:

1. The 3-tests-per-day cap

The free tier allows three tests per day; once you hit the cap, you wait until the next day. For our experiment we had to use multiple IPs to complete the 50-run set within the timeframe. For real users debugging deliverability issues, this is the #1 friction point.

2. SpamAssassin rule explanations are shallow on the free tier

Mail-tester shows you which rules triggered, but the explanations are terse. The detailed rule context - why this rule exists, how to fix the underlying issue - is locked behind premium plans. For a beginner, this turns "your score is 7.8 because of MIME_HTML_ONLY and MISSING_HEADERS" into a confusing dead-end.

3. No public shareable reports on the free tier

If you want to send your test result to a developer or teammate, you screenshot it. Shareable report URLs are a paid feature - a real workflow gap for teams.

Is Mail-Tester.com Legit and Safe?

Two questions that come up a lot:

Is mail-tester.com legit? Yes. It's a French-operated tool that's been around for over a decade. The SpamAssassin scoring is real, the SPF/DKIM/DMARC checks are real, and the company behind it has a track record. There's no scam pattern - it's a freemium tool with a transparent paid upgrade path.

Is mail-tester.com safe? Yes. The workflow is one-directional: you send a test email to mail-tester. Your real subscriber list, your real content, your real customer data - none of it is exposed. Mail-tester sees only the single test email you send to their address.

The same is true for SendBridge Mail Tester, Postmark Spam Check, and other reputable testers. The "send to test address" pattern is inherently low-risk.

Why Different Tools Give Different Scores

If you've ever tested the same email and gotten 8.4, 9.6, and 7.9 from three different tools, here's what's happening:

  1. SpamAssassin version differences. Mail-tester might run SpamAssassin 3.4.6, another tool runs 4.0. Different rule weights, different scores.
  2. Different blacklist providers. Some tools check Spamhaus + Barracuda + SORBS. Others add Invaluement, SpamCop, or proprietary lists. More lists = more chances for a hit.
  3. Different display scales. Mail-tester displays "out of 10" but uses raw SpamAssassin under the hood. Mailreach uses proprietary weighting. Postmark shows raw SpamAssassin only. They're literally different scales.
  4. DNS timing variance. SPF/DKIM/DMARC checks involve DNS lookups, which can return different results within minutes if you've just changed records.

A ±0.5 difference between tools is normal. A ±2.0 difference suggests one of the tools is using non-standard scoring or you have a DNS configuration issue worth investigating.
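
You can watch the DNS side of this yourself. A sketch using the dnspython package (pip install dnspython) to fetch the TXT records an SPF/DMARC check resolves; the domain is a placeholder:

```python
import dns.resolver  # pip install dnspython

def auth_records(domain: str) -> dict[str, list[str]]:
    """Fetch the TXT records that SPF and DMARC checks resolve at test time."""
    records = {}
    for label, name in (("spf", domain), ("dmarc", f"_dmarc.{domain}")):
        try:
            answers = dns.resolver.resolve(name, "TXT")
            records[label] = [r.to_text() for r in answers]
        except (dns.resolver.NXDOMAIN, dns.resolver.NoAnswer):
            records[label] = []
    return records

# The SPF record is the TXT entry starting with "v=spf1". If you just changed
# records, query twice a few minutes apart - a stale cached answer is exactly
# the DNS timing variance described above.
print(auth_records("yourdomain.example"))
```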

So Should You Use Mail-Tester?

Yes - with the right expectations. Mail-tester.com is:

  • ✅ Accurate at SpamAssassin rule scoring
  • ✅ Reliable at SPF/DKIM/DMARC authentication checks
  • ✅ A solid first-pass deliverability check
  • ✅ Safe and legitimate

It's not:

  • ❌ A guarantee of inbox placement (no static tool is)
  • ❌ A full replacement for sender reputation monitoring
  • ❌ Free past 3 daily tests

For occasional checks within those limits, mail-tester is fine. If you're hitting the daily cap regularly, want full rule explanations without a paywall, or need shareable reports for your team, our free spam score test does the same scoring with unlimited tests, no signup, and shareable URLs. We built it specifically to remove those friction points.

But - and this matters more than which tool you pick - no spam-score tool replaces the full deliverability workflow: authentication, reputation, list hygiene, and inbox placement testing. A 10/10 score on any tool is the start of deliverability work, not the end.

Frequently Asked Questions

Is mail-tester.com accurate?

Within its design scope, yes. Our 50-test experiment showed mail-tester's scores stay within ±0.3 points of the raw SpamAssassin baseline. The caveat: a high score doesn't guarantee inbox placement - a 10/10 mail-tester score correlated with only 73% Gmail inbox placement in our tests. Use it as one signal, not the only signal.

Is mail-tester.com legit?

Yes. It's an established French-operated tool that's been running for over a decade, using real SpamAssassin scoring and real DNS authentication checks. The free tier has a 3-tests-per-day limit, but the underlying scoring is genuine.

Is mail-tester.com safe?

Yes. You send a test email to mail-tester - they never access your real subscriber list, customer data, or sending infrastructure. The same safety model applies to SendBridge Mail Tester and other reputable tools.

Why do I get different scores from mail-tester and other tools?

Four reasons: different SpamAssassin versions, different blacklist providers checked, different display scales (mail-tester uses a 10-point display, Postmark shows raw SpamAssassin output, Mailreach uses proprietary weighting), and DNS timing variance right after record changes. A ±0.5 difference is normal; ±2.0 is worth investigating.

Can a 10/10 mail-tester score still go to spam?

Yes. Our experiment showed that even with a 10/10 score, 27% of Gmail deliveries landed somewhere other than the inbox (mostly Promotions tab). Inbox placement also depends on sender reputation history, recipient engagement, and provider ML filters that no static tool can measure.

What's a better alternative to mail-tester?

For unlimited free tests with the same SpamAssassin engine plus shareable reports and no signup, see our free SpamAssassin test tool. For inbox placement testing (a different category), GlockApps or GMass are the standard, but they're paid. We compare the full landscape in our mail-tester alternatives guide.

Run Your Own Test

The methodology in this article is reproducible. If you want to verify our findings:

  1. Pick 3 test emails representing your real sending mix
  2. Send each one to 2–3 different spam-score tools within the same 30-minute window
  3. Record the scores and the triggered rules
  4. Compare across tools

You'll see the same pattern we did: scores cluster within ±0.5 points across reputable tools, with bigger gaps when one tool uses proprietary weighting.
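
Here's a sketch of step 4, assuming you've recorded one email's scores by hand; the thresholds mirror the ±0.5 / ±2.0 guidance above:

```python
def score_gap_verdict(scores: dict[str, float]) -> str:
    """Compare one email's scores across tools and flag abnormal disagreement."""
    gap = max(scores.values()) - min(scores.values())
    if gap <= 0.5:
        return f"normal: {gap:.1f}-point gap, tools agree"
    if gap < 2.0:
        return f"borderline: {gap:.1f}-point gap, likely scale differences"
    return f"investigate: {gap:.1f}-point gap, proprietary scoring or a DNS issue"

# The intro's example scores for one email across three tools
print(score_gap_verdict({"mail-tester": 8.4, "tool-b": 9.6, "tool-c": 7.9}))
# -> borderline: 1.7-point gap, likely scale differences
```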

To start, run a free test on SendBridge Mail Tester - no signup, no daily limit, full SpamAssassin breakdown. Compare it with mail-tester.com and any other tool you use, and you'll see the consistency for yourself.
