An unidentified individual uses a laptop computer in Bryant Park in New York City last March. (Photo by Cindy Ord/Getty Images)
Endpoint detection and response systems often serve as a frontline defense, collecting and storing telemetry from dispersed employee devices and using it to detect malicious activities or behaviors. However, a recent experiment by academic researchers at the University of Piraeus in Greece indicates they are far from a silver bullet when it comes to protecting your organization.
For the experiment, the researchers attempted to emulate the tools and behaviors of Advanced Persistent Threat actors, using scripted attacks involving spearphishing and various malware delivery techniques. They also leveraged popular tools like Cobalt Strike for lateral movement and modeled their threat activity using frameworks like MITRE ATT&CK. They tested 11 of the most popular EDR systems on the market, seeking to answer four core questions:
- Can the system detect “common” APT attack methods?
- Where are the blind spots in detection?
- What type of data does it rely on to generate alerts?
- Can you reduce the level of overall noise in the telemetry?
There are some limitations to the research. It can’t account for differences in tool customization, the sophistication of the human team using it, and other layers of enterprise security (like firewalls or antivirus programs) that may catch or prevent the same attacks. Nevertheless, the researchers believe that “we should expect that a baseline security when opting in for all possible security measures should be more or less the same across most EDRs.”
“Moreover, one would expect that, even if the EDR failed to block an attack, it should have at least logged the actions so that one can later process it,” wrote authors George Karantzas and Constantinos Patsakis. “However, our experiments show that often this is not the case.”
The team tested its attacks against 11 EDR products from Kaspersky, CrowdStrike, Carbon Black, ESET, F-Secure, McAfee, SentinelOne, Sophos, Symantec, Trend Micro and Windows Defender. Some performed better or worse than others, but the overall failure rate was high. Of the 20 attacks the team launched, half were successful and did not generate an alert.
“It is rather alarming that none of the EDRs managed to detect all of the attacks,” the study concludes. “More precisely, 10 attacks were completely successful… and no alert was issued; three attacks were successful, yet they issued a low significance alert; one attack was not successful, yet it did not issue an alert; and six attacks were detected and correctly reported by the EDRs.”
The researchers also found numerous ways to leverage their access to attack and degrade the ability of these tools to process the necessary telemetry.
“The heart of most EDRs lies in the kernel itself as they utilize mini-filter drivers to control file system operations and callbacks in general to intercept activities, such as process creation and loading of modules. As attackers, once high integrity is achieved, one may effectively attack the EDRs in various ways [to further evade detection rules],” they wrote.
SC Media has reached out to the EDR vendors mentioned for comment on the study’s conclusions and will update this story with any responses received.
The findings underscore the gap between the marketing-driven security promises made around EDR and the limitations of any one security tool. The market for endpoint detection and response systems is estimated at approximately $13.7 billion and is expected to grow to as much as $23 billion by 2027 as more organizations shift toward more relaxed remote or bring-your-own-device work policies.
Allie Mellen, an analyst at Forrester who evaluates EDR systems and other security tools, told SC Media last month that “incident responders love using EDR technology to detect and respond to threats” but that “ultimately, there are other sources of telemetry that they use both for detection and then also for deeper investigation, like the network.”
Nick Landers, director of research at penetration testing company NetSPI, told SC Media that it's rare for one team or company to even have access to such a wide range of EDR systems, and that any research testing and comparing different products in the EDR market is valuable in and of itself.
He said the results outlined in the study largely mirror his experience with customers, and that many advanced threat actors generally rely on two strategies for evading detection by EDR systems: using novel tactics that can frustrate heuristic or data-driven detection, and "not making noise in general" by understanding what telemetry EDR systems collect and measure.
“I think the ones we see that are the most effective are ones where the attacker understands the data [the EDR system is] collecting and keeps generation of that data low,” he said.
However, Landers said his main takeaway from the study is not necessarily that EDR products are shoddy or not worth the cost (though he again lamented the lack of access that independent third parties typically have to test such systems), but rather a “more constructive” reinforcement of the need for multiple layers of security to ensure any one tool or process doesn’t become a single point of failure.
“I think looking at the minutiae and finger-pointing and trying to identify specific products and their specific failings is a fault that belongs to everyone in the industry,” he said. “But [EDR systems] are valuable tools and while I might not agree with their strategy or their marketing or cost or licensing model or availability, I think they do contribute to a defense in depth strategy and that’s ultimately what we should all be striving for.”