
Your IT Guy's Phishing Test Is Probably Making Things Worse


By Dan J, Founder of PhishPlease

If your employees dread your phishing simulations, resent the person running them, or hide their clicks instead of reporting them, your testing program is actively making your company less safe. Not less trained. Less safe. And the gap between those two things is where breaches live.

Here's a pattern I've seen play out so many times I could narrate it in my sleep like a nature documentary. A company decides they need to do something about phishing, usually because their insurer asked or someone read a headline about a breach that made their arsehole clench. Someone in IT finds a simulation tool, crafts the most devious phishing email they can possibly think of, sends it to the whole company on a Monday morning with the giddy energy of a kid who just lit a firework in a bin, and then watches the results roll in. Fourteen people clicked. IT sends a company-wide email naming who failed. Now half the company resents IT, the other half is too paralyzed to open anything, and nobody is actually any better at spotting phishing than they were before this whole circus started.

The 5 ways most phishing tests go wrong

1. Starting with an impossibly hard simulation

You wouldn't put someone who's never driven before into a Formula 1 car at Silverstone and then act bewildered when they crash it into the barrier. But that's what most companies do with their first phishing test. They send the hardest possible simulation cold to a team that's never had training, then panic when 40% click. KnowBe4's 2025 benchmarking report found the global average baseline click rate before any training is 33.1%. One in three employees everywhere will click before training. Blaming your team for that is like pushing someone off a boat and then writing them up for not being able to swim.

The fix: Start moderate. Detectable red flags, not insultingly obvious. Increase difficulty gradually over months.

2. Testing without any prior communication

Sending a phishing simulation to employees who've never been told what phishing is, or that simulations are a thing at your company, is like slapping someone across the face and handing them a leaflet about conflict resolution. Research from ETH Zurich at the NDSS 2025 symposium found deceptive simulation tactics cause significant backlash and long-term trust damage. Resentful employees don't report suspicious emails to IT. They quietly delete them and add another item to their mental list of reasons to update their CV. That's not a security culture. That's a workplace mutiny incubating in slow motion.

The fix: Tell your team simulations are part of the security program before you start. They don't need to know when or what. But they should know why.

3. Using it as a gotcha

Public leaderboards of who clicked. Results tied to performance reviews. Managers pulling people into meetings about their "security awareness deficiency" like it's a disciplinary hearing for a crime they didn't know was illegal. All of it creates a culture where someone's first thought after clicking a suspicious link isn't "I should tell IT" but "shit, I need to bury this deeper than a mob informant in the Nevada desert."

In security, that instinct is catastrophic. The difference between "reported within 5 minutes" and "discovered 2 weeks later when the attacker has been camped in your email like a squatter who's redecorated and changed the locks" is the difference between a nosebleed and internal bleeding: one you can deal with; the other kills you while you're still walking around thinking you're fine. Harvard research across 5,400 employees found punitive measures showed no statistically significant improvement. Punishment teaches people to hide mistakes with the frantic desperation of a teenager clearing their browser history.

The fix: Keep individual results private. Celebrate team-level trends, not individual failures.

4. Simulations that are laughably fake

Emails from "Amaz0n" written in grammar that suggests someone threw a dictionary into a blender and hit puree. Your team spots them instantly, click rate comes back at 2%, and leadership declares the company "phishing-resilient" with the misplaced swagger of a bloke who's beaten his 8-year-old at chess and now reckons he could take on Magnus Carlsen.

Real phishing in 2026 is AI-generated. Verizon's DBIR noted AI-written malicious emails have doubled in two years. These emails are grammatically flawless and personalized. Training against cartoon fakes and expecting your team to catch AI phishing is like preparing for a cage fight by wrestling an inflatable clown.

The fix: Use templates based on real attack patterns. If a simulation wouldn't make you pause for at least a second, it's too easy.

5. Running simulations once or twice a year

Annual phishing tests are compliance theater, a pantomime performance designed to make an audit trail look good while doing approximately as much for your actual security as a chocolate padlock on a bank vault. KnowBe4's data shows click rates drop 40% within 90 days of consistent training, and 86% after 12 months. But that requires ongoing exposure. Once a year doesn't create behavior change. Skills you don't practice rot away, like food at the back of a fridge everyone pretends isn't there until something starts to smell.

The fix: Monthly simulations. Vary templates, difficulty, and attack type. Consistency beats complexity.

The metric everyone gets wrong

Most companies measure success by click rate alone. But click rate is only half the picture.

The metric that actually tells you whether your security culture is working is report rate. When someone gets a suspicious email, they can click it (bad), ignore it (the cybersecurity equivalent of hearing a strange noise at 3am and pulling the covers over your head), or report it. Ignoring means the email is still sitting in everyone else's inbox, ticking away like a landmine that your whole team walks past every morning. Reporting means IT can warn everyone, block the sender, and contain damage before it spreads through the company like a rumour at a school reunion.

Hoxhunt's 2025 data found that before training, only 34% of users report simulated phishing. After 12 months, that climbs to 74%. A team with a 15% click rate and a 5% report rate is in worse shape than one with the same click rate and a 40% report rate. The second team is catching threats and screaming for help. The first is quietly bleeding out like a dinner party where the host is clearly having a breakdown but nobody wants to say anything.

If your program doesn't measure report rate, you're optimizing for the wrong thing. You're teaching people to be afraid of clicking instead of fast at reporting. And fear makes people hide mistakes with the panicked energy of someone flushing evidence during a police raid, which is the last thing you want when an attacker is elbow-deep in your company's email.
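To make the two metrics concrete, here's a minimal sketch in Python of how click rate and report rate fall out of the same simulation results. The data, field names, and team sizes are entirely hypothetical, invented for illustration, not taken from PhishPlease or any vendor's reporting:

```python
from dataclasses import dataclass

@dataclass
class SimulationResult:
    """Outcome for one employee in one phishing simulation."""
    clicked: bool    # followed the simulated phishing link
    reported: bool   # flagged the email to IT

def click_rate(results: list[SimulationResult]) -> float:
    # Fraction of recipients who clicked (bools sum as 0/1)
    return sum(r.clicked for r in results) / len(results)

def report_rate(results: list[SimulationResult]) -> float:
    # Fraction of recipients who reported the email
    return sum(r.reported for r in results) / len(results)

# Two hypothetical 100-person teams with identical click rates:
team_a = [SimulationResult(clicked=i < 15, reported=i < 5) for i in range(100)]
team_b = [SimulationResult(clicked=i < 15, reported=i < 40) for i in range(100)]

assert click_rate(team_a) == click_rate(team_b) == 0.15
print(report_rate(team_a))  # 0.05 -- quietly bleeding out
print(report_rate(team_b))  # 0.4  -- catching threats and asking for help
```

Measured on click rate alone, the two teams look identical; the report rate is the only number that distinguishes them.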

What to do instead

The loop is simple: Simulate, Catch, Teach, Repeat. Send realistic simulations. When someone clicks, immediately show them what they missed and give them a 2-minute lesson. Track click rates AND report rates. Celebrate improvement. Do it monthly.

The culture is simpler: Train, don't shame. Communicate that simulations exist because phishing is how most breaches start. Reward reporting. Make it clear that clicking a simulation and immediately reporting it is a win, because that "report it fast" reflex is exactly what saves companies in a real attack.

After 3 years at Duo watching how enterprise companies handle security, I can tell you the ones with the strongest posture aren't the ones with the lowest click rates. They're the ones where employees feel comfortable saying "I think I just clicked something dodgy" without fear. Because in a real breach, the 30 seconds between "I clicked something bad" and "IT knows about it" is worth more than a year of flawless simulation scores gathering dust in a compliance folder.

That's why I built PhishPlease the way I did. Monthly simulations, immediate feedback, 2-minute lessons, no gotcha bullshit. We're the good bad guys. We trip your people up so real attackers can't, and we do it without making your team want to key your car in the car park.

The bar for small business security is so low it's basically a tripping hazard in hell. If you're running simulations, even imperfectly, you're ahead. But the gap between "going through the motions" and "actually changing behavior" is the gap between a phishing program built on fear and one built on practice. Close it, and you've addressed the vulnerability behind 60% of breaches. That door is either reinforced with steel bolts or propped open with a neon WELCOME sign, a red carpet, and a complimentary drink on arrival.

Ready to test your team?

Send your first phishing simulation in under 2 hours.

Start free trial

14-day free trial · No credit card required