
Netacea launches Trust Layer for AI agent web traffic

Wed, 25th Mar 2026

Netacea has launched Trust Layer for enterprises managing automated web traffic. The product is aimed at organisations dealing with a growing mix of AI agents and other non-human online activity.

The new server-side layer is designed to help businesses identify, govern and control different forms of automated traffic before they reach an application. This includes AI agents, crawlers, scrapers, partner automation and malicious bots, which together make up a growing and increasingly varied share of internet activity.

With more than half of web traffic now generated by non-human sources, companies are rethinking how they assess digital interactions. Rather than treating all automation as hostile, many are having to distinguish between useful, ambiguous and harmful traffic.

The product builds on Netacea's existing work in server-side web traffic classification, which focuses on separating legitimate automated interactions from malicious activity and applying controls without blocking genuine users.

That approach is particularly relevant for large eCommerce groups, financial services firms, media companies and other businesses with substantial online operations. In these environments, automated traffic can affect fraud rates, account security, data collection, abuse levels and the reliability of digital signals used for business decisions.

By classifying traffic before it reaches the application layer, organisations can make more precise access decisions. Trusted automation can be allowed through, traffic with unclear trust status can receive greater scrutiny, and unwanted or malicious activity can be mitigated earlier in the process.
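In practice, that classify-then-decide pattern resembles a server-side gate that assigns each request a trust tier and maps the tier to an action before the application sees it. The sketch below is illustrative only: the type names, signals and thresholds are invented for this example and do not reflect Netacea's actual API or model.

```typescript
// A minimal sketch of pre-application traffic classification.
// All names and thresholds here are hypothetical.

type TrafficClass = "trusted_automation" | "unclear" | "malicious" | "human";
type Action = "allow" | "challenge" | "block";

interface RequestSignals {
  userAgent: string;
  declaredAgentId?: string;   // e.g. a verified agent or partner identity
  behaviouralScore: number;   // 0 (benign) .. 1 (hostile), from upstream scoring
}

// Hypothetical server-side classifier: runs before the request reaches the app.
function classify(signals: RequestSignals): TrafficClass {
  if (signals.declaredAgentId && signals.behaviouralScore < 0.3) {
    return "trusted_automation";       // known partner or verified agent
  }
  if (signals.behaviouralScore > 0.8) {
    return "malicious";                // strong hostile behavioural signal
  }
  if (/bot|crawler|agent/i.test(signals.userAgent)) {
    return "unclear";                  // automated, but intent unknown
  }
  return "human";
}

// Map each trust tier to an access decision, mitigating bad traffic early.
function decide(cls: TrafficClass): Action {
  switch (cls) {
    case "trusted_automation":
    case "human":
      return "allow";
    case "unclear":
      return "challenge";              // extra scrutiny, e.g. rate limiting
    case "malicious":
      return "block";
  }
}

// Example: an unidentified crawler gets challenged rather than served.
const action = decide(classify({
  userAgent: "AcmeCrawler/2.1",
  behaviouralScore: 0.5,
}));
console.log(action); // "challenge"
```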

Traffic shift

The launch reflects a broader industry debate over how to manage the rise of machine-driven interactions on the web. Traditional bot management has focused on stopping malicious automation, but the spread of AI tools and software agents has created a wider spectrum of automated actors with different purposes and levels of risk.

For many online businesses, the challenge is no longer limited to blocking bad bots. It now includes understanding which automated systems are acting on behalf of customers, partners or third parties, and which are scraping data, distorting usage metrics, or attempting fraud and account takeover.

Netacea said its system analyses billions of behavioural signals in real time to assess these interactions. The aim is to give organisations a clearer picture of who, or what, is engaging with their digital services, while helping reduce fraud, scraping, abuse and skewed data.
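The article does not detail how those signals are combined, but the general idea of behavioural scoring can be illustrated with a toy example: several per-source measurements are reduced to a single hostility score that feeds the kind of tiered decision shown earlier. The signal names and weights below are placeholders, not Netacea's model.

```typescript
// A toy illustration of turning behavioural signals into one trust score.
// Signals and weights are invented for illustration.

interface BehaviouralSignals {
  requestsPerMinute: number;   // request velocity from one source
  pathEntropy: number;         // 0..1, how unpredictable the browsing path is
  failedLogins: number;        // recent failed authentication attempts
}

// Higher score = more likely hostile. Weights are arbitrary placeholders.
function hostilityScore(s: BehaviouralSignals): number {
  const velocity = Math.min(s.requestsPerMinute / 600, 1); // cap at 10 req/s
  const logins = Math.min(s.failedLogins / 20, 1);
  return 0.4 * velocity + 0.2 * (1 - s.pathEntropy) + 0.4 * logins;
}

console.log(hostilityScore({
  requestsPerMinute: 900,
  pathEntropy: 0.1,
  failedLogins: 15,
})); // 0.4*1 + 0.2*0.9 + 0.4*0.75 = 0.88, likely flagged
```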

Framework support

The Trust Layer is supported by Netacea's threat intelligence and by BLADE, the Business Logic Attack Definition Framework. Netacea created BLADE with input from industry specialists and later donated it to OWASP as a framework for understanding automated activity and business logic abuse.

A shared framework matters because automated abuse often exploits the rules and workflows of digital services rather than technical weaknesses alone. A scalper bot that races through a checkout flow, for instance, uses the site exactly as designed yet still causes commercial harm. That can make such attacks harder to define and defend against using older security models built mainly around network or infrastructure threats.

Andy Still, Chief Technology Officer and Co-Founder at Netacea, said the core issue for customers has remained the same even as online traffic has evolved.

"At its core, this is still the problem Netacea has always solved, understanding who, or what, is interacting with a digital service, and helping enterprises decide what to trust, what to challenge, and what to control," said Andy Still, Chief Technology Officer and Co-Founder, Netacea.

He added that the web itself has changed as a wider range of automated entities interacts with business systems for different reasons.

"What has changed is the nature of the web. Enterprises are now seeing a broader mix of automated actors, including AI agents, crawlers, scrapers, and malicious automation, all interacting with systems in different ways and for different purposes. Because Netacea has been classifying web traffic server-side for years, we are well placed to help customers extend that same visibility and governance into the agentic era," Still said.