Cloud-based security provider Incapsula revealed in an interview with Siliconvalleywatcher.com today that 51% of web traffic comes from automated software programs, much of it potentially threatening. Only 49% of web visitors are actual humans.
Incapsula collected the data from a sample of 1,000 websites enrolled in its service.
Their research indicates the following:
- 5% are hacking tools searching for unpatched or newly disclosed vulnerabilities in websites.
- 5% are scrapers.
- 2% are automated comment spammers.
- 19% are from “spies” collecting competitive intelligence.
- 20% are from search engines — which is non-human traffic, but benign.
- 49% consists of humans browsing the web.
Incapsula co-founder Marc Gaffan said, “Few people realise how much of their traffic is non-human, and that much of it is potentially harmful. Because we have thousands of web sites as customers, we spot exploits way ahead of others and we can then block them for all our customers.”