34 arrested as Menlo Security’s State of the Web 2016 report confirms “there is no such thing as a safe site”


The cyber threat landscape and website risk have increased in 2016, with nearly half of the top one million websites at risk, according to Menlo Security’s State of the Web 2016 report released this week. In addition, the report found that attackers are young and savvy, and are getting their hands on exploit tools that are readily available, easy to deploy, and highly effective and lucrative in their impact.

As if to support the report, between 5 and 9 December, Europol and law enforcement authorities from Australia, Belgium, France, Hungary, Lithuania, the Netherlands, Norway, Portugal, Romania, Spain, Sweden, the United Kingdom and the United States carried out a coordinated action targeting users of Distributed Denial of Service (DDoS) attack tools, leading to 34 arrests, with a further 101 suspects interviewed and cautioned. Suspects in the EU and beyond were mainly young adults under the age of 20. The individuals arrested are alleged to have paid for ‘stresser’ and ‘booter’ services to launch DDoS attacks, which flood websites and web servers with massive amounts of traffic, leaving them inaccessible to users. These tools are part of the criminal ‘DDoS for hire’ economy, in which attackers can pay to have an attack aimed at targets of their choosing.

This is the state of the Internet as we move into 2017. The combination of widespread software vulnerabilities, pervasive exploit kits and throngs of new attackers has created a perfect storm. Traditional security products are failing to stop web-based and email attacks because simply classifying sites as ‘good or bad’ no longer works: we don’t know what’s going to be bad tomorrow, according to Greg Maudsley, Senior Director of Product Marketing at Menlo Security. Phishing attacks can now use legitimate URLs: attackers can compromise a legitimate site and create ‘drive-by attacks’, or embed a spoofed link within a legitimate website, meaning there are no obvious anomalies in the URL for anti-phishing techniques to pick up.

The report’s website review methodology involved building a Chrome-based browser farm to review websites ‘through the eyes of a normal human’: visiting each site, fingerprinting the software it runs, and correlating that against CVEs (Common Vulnerabilities and Exposures) and against the site’s reputation as a known-bad or previously compromised site within the last 12 months. Ticking any of these boxes classed the site as ‘risky’. News and media websites were considered the highest risk. Maudsley explained why: “They are all competing to get and keep users’ attention, so they’re aiming for the newest and richest content and keeping it as ‘sticky’ as possible. To do that they are serving up more and more active content, and therefore more background content, and it is the background sites that are beyond the control of the main sites.” The most important finding: last year 30% of the web was classed as ‘risky’, but with each online request triggering up to 25 additional requests in the background, it is the factors associated with those background sites that lead this report to conclude that 46% of the top one million sites are at risk. In other words, there is no such thing as a safe site.
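The fingerprint-and-correlate step can be illustrated with a minimal sketch (this is not Menlo’s actual tooling): read the version string a server advertises and check it against a list of versions with known CVEs. The function name and the vulnerable-version list here are illustrative assumptions; a real check would correlate against the full CVE database and the site’s compromise history.

```python
# Illustrative sketch of fingerprinting a site's server software and flagging
# known-vulnerable versions. The list below contains only the two versions the
# report names; a real tool would consult the full CVE database.
KNOWN_VULNERABLE = {
    "nginx/1.8.0",        # named in the report as commonly seen and vulnerable
    "microsoft-iis/7.5",  # named in the report as commonly seen and vulnerable
}

def classify_server(headers: dict) -> str:
    """Return 'risky' if the self-reported Server header matches a
    known-vulnerable version, otherwise 'unknown'."""
    server = headers.get("Server", "").strip().lower()
    return "risky" if server in KNOWN_VULNERABLE else "unknown"

# Example: a response advertising IIS 7.5 would be flagged.
print(classify_server({"Server": "Microsoft-IIS/7.5"}))  # risky
```

Note the check relies on the server truthfully advertising its version; servers that strip or spoof the `Server` header would need deeper content fingerprinting, which is why the report’s browser farm renders pages rather than just reading headers.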

The statistics also showed a correlation with the type of software used. Microsoft-IIS 7.5, for example, was the second most common vulnerable software seen in the Menlo Security report, and is currently running on over 50,000 of the top one million websites. Compromising a web server running this software is a simple matter of using exploit kits readily available on the Internet to achieve total system compromise. Today, within minutes, any motivated attacker can exert full control over a primary or background site and deliver ransomware to unsuspecting visitors. The label ‘risky’ does not mean these sites are necessarily delivering malware; rather, it describes the risk of their being exploited by attackers. And as the Europol arrests confirm, there is no shortage of attackers: they are spread across the globe and they have all the tools they need.

2016 Highlights:

  • News and media sites were the riskiest overall, with 50% of sites classified as risky
  • Business and Economy sites came in this year as having the most recent threat history
  • Unintentional background requests, which deliver content to the user, outnumber intentional user requests by a ratio of 25:1
  • Nginx version 1.8.0 and Microsoft-IIS version 7.5 were the most commonly used vulnerable software versions
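The 25:1 ratio of background to intentional requests comes from the sub-resources a single page pulls in. A minimal sketch of how such requests could be counted (an illustration, not the report’s method) is to parse a page’s HTML and tally the scripts, images, stylesheets and iframes it references, noting which point at third-party hosts; the hostnames and sample page below are made up for the example.

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class BackgroundRequestCounter(HTMLParser):
    """Count sub-resource references (scripts, images, stylesheets, iframes)
    that loading a single page would trigger in the background."""
    RESOURCE_ATTRS = {"script": "src", "img": "src", "iframe": "src", "link": "href"}

    def __init__(self, page_host: str):
        super().__init__()
        self.page_host = page_host  # host of the page the user intended to visit
        self.total = 0              # all background requests
        self.third_party = 0        # those going to other hosts

    def handle_starttag(self, tag, attrs):
        attr_name = self.RESOURCE_ATTRS.get(tag)
        if not attr_name:
            return
        url = dict(attrs).get(attr_name)
        if not url:
            return
        self.total += 1
        host = urlparse(url).netloc
        if host and host != self.page_host:
            self.third_party += 1

# Hypothetical news page pulling in a CDN stylesheet, an ad script and a logo.
html_doc = """
<html><head>
  <link rel="stylesheet" href="https://cdn.example.net/site.css">
  <script src="https://ads.example.org/track.js"></script>
</head><body>
  <img src="/logo.png">
</body></html>
"""
counter = BackgroundRequestCounter("news.example.com")
counter.feed(html_doc)
print(counter.total, counter.third_party)  # 3 2
```

Even this tiny page triggers three background requests, two of them to hosts the main site does not control, which is exactly the exposure the report attributes to background content.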

The full report can be accessed here

By Chris Cubbage, Executive Editor
