Friday, November 22, 2024

Googler calls for fire drill-like overhaul of phishing tests

A Google security bigwig has had enough of federally mandated phishing tests, saying they make colleagues hate IT teams for no added benefit.

Matt Linton leads Google’s security response and incident management division. Tasked with rolling out phishing exercises every year, he believes tests should be replaced by the cybersecurity equivalent of a fire drill.

Today’s phishing tests more closely resemble the fire drills of old – evacuation exercises sprung on a building’s occupants with no warning, with individuals later blamed for their failures.

Since then, more safety features have been fitted to buildings. Linton cited wider doors with push-bar exits, as well as fire sprinklers, as examples of innovations that improved a building’s fire safety. None of these was implemented to improve how individual occupants respond to drills, but together they increased survival rates, and fire drills are now better-planned, well-announced procedures.

Readers, you can probably see where he’s going with this. The parallels between those early fire drills and modern-day phishing exercises are clear: in both cases, the burden of responsibility falls more on the individual than on the infrastructure around them.

Despite anti-phishing controls being baked into security products and email clients, research points to phishing attacks increasing. Zscaler’s latest annual phishing report found a 58 percent rise in phishing attacks over the past 12 months, a surge driven by cybercriminals’ wider adoption of AI.

The Federal Risk and Authorization Management Program (FedRAMP) is one of the US government programs that sets cybersecurity standards. Google maintains FedRAMP compliance and does so, in part, by running phishing tests that follow its guidance, which still claims users “are the last line of defense and should be tested.”

Linton argues that there is value in providing staff with phishing training, but achieving a 100 percent success rate “is a likely impossible task.”

“Phishing and Social Engineering aren’t going away as attack techniques,” he blogged. “As long as humans are fallible and social creatures, attackers will have ways to manipulate the human factor. 

“The more effective approach to both risks is a focused pursuit of secure-by-default systems in the long term, and a focus on investment in engineering defenses such as unphishable credentials – like passkeys – and implementing multi-party approval for sensitive security contexts throughout production systems.”

It’s thanks to investments in architectural defenses like these, we’re told, that Google hasn’t had to seriously worry about password phishing in nearly a decade.
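
Linton’s post doesn’t spell out an implementation, but to make the multi-party approval idea concrete, here is a minimal, purely illustrative sketch in Python. The ApprovalRequest class, its field names, and the two-approver threshold are assumptions made for this example, not Google’s actual tooling.

```python
# Illustrative sketch of multi-party approval for a sensitive action.
# Not Google's implementation: the class, field names, and the
# two-approver threshold are assumptions made for this example.
from dataclasses import dataclass, field


@dataclass
class ApprovalRequest:
    action: str                  # e.g. "disable MFA for a service account"
    requester: str               # person asking to perform the action
    required_approvals: int = 2  # assumed policy: two independent approvers
    approvers: set = field(default_factory=set)

    def approve(self, approver: str) -> None:
        # A phished requester approving their own request gains nothing.
        if approver == self.requester:
            raise ValueError("requester cannot approve their own request")
        self.approvers.add(approver)

    def is_authorized(self) -> bool:
        return len(self.approvers) >= self.required_approvals


# The sensitive action only proceeds once enough distinct people sign off.
req = ApprovalRequest(action="disable MFA for a service account", requester="alice")
req.approve("bob")
print(req.is_authorized())   # False - still needs a second approver
req.approve("carol")
print(req.is_authorized())   # True
```

The design point mirrors Linton’s argument: a single phished employee can’t authorize the sensitive change on their own, so the control doesn’t hinge on every individual spotting every lure.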

The problem with current tests, and possible alternatives

The main argument against current phishing tests is “there is no evidence that the tests result in fewer incidences of successful phishing campaigns,” said Linton.

Some tests, such as those mandated by FedRAMP, require organizations to reduce or eliminate existing controls to maximize the perceived impact of a failed test. That opens up a litany of issues: test subjects get a false sense of the real risks, and the allowlists put in place for the exercise aren’t always removed afterwards, leaving them open to abuse by attackers.

There’s also the increased load placed on incident responders and on those tasked with triaging reports sent to threat detection teams, all while staff are left feeling unnecessarily deceived, Linton said. And he’s not alone.

Guidance from the UK’s NCSC, for example, concurs with many of the points raised by the Googler, saying such tests erode trust between staff and security teams, and that there is a host of reasons why a user might click a link in a phishing test.

Certain personality traits may make an individual more likely to click a link, for example, while situational variables, such as a particularly stressful workload at the time a test lands, can unfavorably skew results.

Organizations should instead create a positive cybersecurity culture in which employees feel comfortable reporting phishing incidents, the NCSC says. In that sense, staff can act as a valuable early warning system.

Linton’s idea of how these tests could be improved goes back to the notion of fire drills evolving into what they are today.

Rather than being delivered with deception, exercises should make it plain that they are a test, in the same way that apartment and office blocks plaster posters around every corner in the weeks before a drill is carried out. The messages should identify themselves as a test and spell out the benefits to the recipient.

Linton’s proposed alternative is considerably different from the tests office workers have become accustomed to over the years.

In addition, the NCSC says a multi-layered approach should be taken to mitigating phishing attacks in the workplace:

  1. Make it difficult for attackers to reach your users

  2. Help users identify and report suspected phishing emails

  3. Protect your organization from the effects of ‘successful’ phishing emails

  4. Respond quickly to incidents

“Educating employees about alerting security teams of attacks in progress remains a valuable and essential addition to a holistic security posture,” Linton said. “However, there’s no need to make this adversarial, and we don’t gain anything by ‘catching’ people ‘failing’ at the task. 

“Let’s stop engaging in the same old failed protections and follow the lead of more mature industries, such as fire protection, which has faced these problems before and already settled on a balanced approach.” ®
