SES London Session Recap – Auditing Paid Click and Click Fraud

Guest entry from Marie Howell, Bruce Clay Europe.

Marie will be digging through her notes and posting recaps on some of her favourite sessions from the London Search Engine Strategies conference.

Jeff Rohrs was the expert moderator and the 'host with the most', introducing this advanced-track session with flair, charisma and passion. In his introduction, he referenced his highly articulate and comprehensive paper 'The Sausage Manifesto' as a starting point for the session. Judging by the attendees' frantic scribbling, traffic to that excellent document must have spiked dramatically later in the day.

Jeff introduced Jon Myers, who discussed what click fraud and click auditing are and, as it was an advanced session, recommended sites like Search Engine Watch and NeverblueAds for more information. He discussed how Google has a four-stage process to remove invalid clicks, and how users can view the results of this process within the reports available in their AdWords account.

A variety of companies deal with click auditing, and as advertisers become more aware of the potential for click fraud, the number of companies investigating it is only increasing. Jon stated that it is imperative to look at the log files and at the second-tier URL to try to understand the click and conversion data. He said to look out in particular for repeating IP addresses/domains, where visitors are coming from (e.g. Asia), click spikes, etc. He told the audience that it is possible to refine this process with technology and to drill down further into the data by looking at factors like geolocation, whether the end user has used an open proxy or onion routers, whether the DNS record has a valid IP address, etc. The technology would marry this automatically with client conversion data.
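Jon's basic log-file checks – repeated IP addresses and sudden click spikes – boil down to simple counting. A minimal sketch of the idea; the sample records, field layout and thresholds are all assumptions for illustration, not any auditing vendor's actual rules:

```python
from collections import Counter

# Hypothetical pre-parsed click records: (ip, country, minute_of_day).
# A real audit would parse these out of raw web server log files.
clicks = [
    ("203.0.113.7", "GB", 0), ("203.0.113.7", "GB", 0),
    ("203.0.113.7", "GB", 1), ("198.51.100.2", "US", 0),
    ("192.0.2.9", "CN", 2),
]

REPEAT_THRESHOLD = 3   # flag IPs with this many clicks or more
SPIKE_THRESHOLD = 4    # flag minutes with this many clicks or more

# Repeating IP addresses: count clicks per IP and flag heavy repeaters.
by_ip = Counter(ip for ip, _, _ in clicks)
repeat_ips = {ip for ip, n in by_ip.items() if n >= REPEAT_THRESHOLD}

# Click spikes: count clicks per minute and flag abnormal bursts.
by_minute = Counter(minute for _, _, minute in clicks)
spike_minutes = {m for m, n in by_minute.items() if n >= SPIKE_THRESHOLD}

print(repeat_ips)      # {'203.0.113.7'}
print(spike_minutes)   # set() — no minute reaches the spike threshold here
```

The same counting pattern extends to the other signals Jon mentioned (geolocation, proxies), and the flagged records would then be married up with conversion data, as he described.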

Jon added that the search engines are committed to solving this, before concluding that fraud detection will need to evolve and continue to develop as new products, such as mobile search, come into play.

Dr. Stephen Turner of ClickTracks discussed distinguishing poorly performing ads from click fraud, and encouraged recognising false positives before the microphone passed to Shuman Ghosemajumder, who added 'a bit of colour' to the previous perspectives.

Shuman outlined the two types of click fraud:

  1. Competitors, trying to hurt other advertisers.
  2. Affiliates, generating false clicks on the ads running on their own sites.

He continued by saying that this might happen via click farms, 'pay to click' scams and software (clickbots or botnets – networks of machines used to create false traffic). Shuman stated categorically that Google detects fraud regardless of its type. He showed how Invalid Click Reports display the number of clicks being marked as invalid, and said that Google aspires, naturally, to identify and filter out click fraud as soon as it happens.

Shuman described Google's 'shield' that protects the advertiser. He said that the search engines consider a high false-positive rate acceptable: Google will err on the side of treating a click as fraud and not charging for it, even though it may actually be genuine. Google casts the net wide, apparently, to catch as many invalid clicks as possible, and is quite happy not to charge for false positives.

Shuman discussed how real-time filters come into play for a number of situations, including double clicks. Offline analysis continues alongside them, and although there is a period during which the advertiser has been affected (potentially with their ads not showing), a credit is issued back to the advertiser as soon as the problem is detected. Shuman also covered reactive investigations – those initiated by advertisers from the logs they submit to Google. He said this is a valuable feedback mechanism, although Google says it accounts for a negligible percentage of the overall detected fraud.
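To make the double-click case concrete, here is a toy sketch of what a real-time filter of that kind might look like. The actual rules the engines use are not public; the two-second window, record layout and sample data are all assumptions for illustration:

```python
def filter_double_clicks(clicks, window_seconds=2.0):
    """Drop repeat clicks from the same (ip, ad) within a short window.

    clicks: iterable of (ip, ad_id, timestamp_seconds) tuples.
    Returns the list of clicks that would actually be billed.
    """
    last_billed = {}  # (ip, ad_id) -> timestamp of last billed click
    billed = []
    for ip, ad_id, ts in sorted(clicks, key=lambda c: c[2]):
        key = (ip, ad_id)
        if key in last_billed and ts - last_billed[key] < window_seconds:
            continue  # treated as a double click: filtered, not billed
        last_billed[key] = ts
        billed.append((ip, ad_id, ts))
    return billed

clicks = [
    ("203.0.113.7", "ad42", 10.0),
    ("203.0.113.7", "ad42", 10.4),   # double click: filtered out
    ("203.0.113.7", "ad42", 15.0),   # far enough apart: billed
    ("198.51.100.2", "ad42", 10.1),  # different visitor: billed
]
print(len(filter_double_clicks(clicks)))  # 3
```

Anything a real-time filter like this misses would then be caught by the offline analysis and credited back, as Shuman described.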

Shuman also advocated using Google's auto-tagging, which allows advertisers to see which clicks are being recorded by Google and which are not. This reconciles the number of clicks in the advertiser's logs with the number of clicks billed.
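That reconciliation is essentially set arithmetic over click identifiers. A hedged sketch, assuming you have already extracted the auto-tagging click IDs (the parameter appended to your landing-page URLs) from both your own logs and your billing report – the IDs below are made up, and real reports need real parsing:

```python
# Click IDs seen in your own web logs vs. those on the bill.
log_click_ids = {"abc1", "abc2", "abc3", "abc4"}
billed_click_ids = {"abc1", "abc2", "abc4"}

# Logged but not charged — likely filtered by Google as invalid.
unbilled = log_click_ids - billed_click_ids

# Charged but never seen in your logs — worth investigating.
unlogged = billed_click_ids - log_click_ids

print(sorted(unbilled))  # ['abc3']
print(sorted(unlogged))  # []
```

Any IDs in the `unlogged` bucket would be exactly the sort of discrepancy Shuman invited advertisers to raise with Google.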

Shuman concluded by saying that Google wants to hear from advertisers who believe that click fraud hasn't been detected, and that it is investing in its click quality team to improve customer experience and ensure a good ROI. So, the floodgates are officially open. Google's investment means that there should be sufficient staff to help address advertisers' concerns – in theory.

Ever conscious of the audience's needs, Greg Boser set the stage by summarising the history of interaction between the search engines and the advertisers who were trying to report click fraud. He even included a reference to the clickbot wars and competitive sabotage that were rife in 2002–03. When AdSense first came out, fraud detection became noticeably more effective, and the fraudsters' use of dirty proxies was pretty much stamped out. Greg stressed how vital analytics is in assessing potential click fraud, recommending that advertisers track clicks – especially those from anonymous proxies – and check log files, cookies, etc.

Controversially, Greg (based upon extensive experience and a really tight handle on this issue) doesn't believe the suggested 38 percent click fraud rates. He believes this figure is inflated by the click auditing companies in their efforts to capitalise on this side of the industry. A superb speaker, Greg rounded off his presentation with a few pearls of wisdom and thoughts for the future, such as recommending Jeff Rohrs' 'Sausage Manifesto', calling it the 'greatest document', and encouraging people to track contextual advertising in addition to search. He concluded by saying that he believes strongly that 'it is the responsibility of the search engines and not the advertisers' to monitor and prove click fraud, and that detailed log files from the search engines should be made available to advertisers to facilitate reconciliation with the site log files.

The Q & A saw the eloquent Greg come head to head with Shuman based on Greg’s vast experience on the history of click fraud management by the search engines, plus the value of contextual advertising. A highly interesting session – the star of which was Greg – with some real nuggets of information magnanimously shared with the audience.

Lisa Barone is a writer, content marketer & VP of strategy at Overit Media. She's also a very active Twitterer, much to the dismay of the rest of the world.




