With Google constantly revising the algorithms that drive rankings on its results pages, staying on top of what’s new is an ongoing task for SEOs. Fortunately, the accuracy and sophistication of analytics tools such as Raven, Google Analytics and Screaming Frog have risen in step, and they have played a large part in the continued relevance of SEO. One recent change, however, threatens to severely affect our ability to analyse website traffic accurately.

It began in early March of this year, when Google started encrypting searches for users who were logged into their Google accounts. This made it impossible to ascertain which keywords had brought those logged-in users to their landing pages, an important part of the search engine optimisation process. From that point on, whenever you examined the list of search terms that sent browsers to a particular page, the top result would almost always read ‘(not provided)’.

At this point, all was not lost. It stood to reason that the remaining data (i.e. from users who weren’t signed into their Google accounts) was still a reliable representation of the overall keyword rankings for a landing page: removing a representative cross-section of data does not change the proportions of the whole. You could even work out a reasonably accurate estimate of the total volume for a particular search term using percentages.

However, ripples of fear have spread through the SEO world as the amount of keyword referral data available has steadily decreased. Numerous online media agencies have reported increasingly high volumes of affected search traffic, some seeing averages as high as 20%. The problem continues with Mozilla, which, valuing privacy, has chosen to encrypt all natural searches made on the latest version of Firefox by default.
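The percentage-based estimate mentioned above can be sketched in a few lines. This is a minimal illustration, not part of any analytics tool: it assumes the ‘(not provided)’ visits hide a representative sample of searchers, so scaling the visible count by the hidden share approximates the true total. The function name and figures are hypothetical.

```python
# Estimate a keyword's total visits by scaling the visible (unencrypted)
# count up by the share of organic traffic hidden behind '(not provided)'.
# Assumes hidden searchers behave like visible ones (a simplification).

def estimate_total_visits(visible_visits, not_provided_share):
    """Scale visible keyword visits up to an estimated true total.

    visible_visits: visits attributed to the keyword in analytics.
    not_provided_share: fraction (0-1) of all organic visits reported
    as '(not provided)'.
    """
    if not 0 <= not_provided_share < 1:
        raise ValueError("not_provided_share must be in [0, 1)")
    return visible_visits / (1 - not_provided_share)

# Example: 800 visible visits while 20% of organic traffic is hidden
# implies roughly 1,000 total visits for that keyword.
print(round(estimate_total_visits(800, 0.20)))
```

The estimate degrades as the hidden share grows, which is exactly why the trend towards default encryption worries the industry.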
With signs that Google’s Chrome will follow suit, and rumours of Microsoft making the same change to IE, the big worry for SEOs is that every browser will eventually adopt this policy, rendering accurate keyword data analysis impossible.

So, will this mean the death of the search engine optimisation industry? It certainly looks likely that in the near future we won’t have access to visit and conversion data at the keyword level. This will make our jobs harder, but as canny individuals we still have plenty of methods for drawing the conclusions we need. We still have access to a gold mine of valuable information, including:

- rankings;
- pages receiving SEO traffic;
- knowledge of which pages rank for which keywords;
- conversions from visits landing on those pages.

In addition, we will still be able to use Webmaster Tools to see impressions, clicks and average position for bucketed keywords.

The SEO industry will continue to carry out the same processes and tasks as before, though some, particularly those relating to keyword analysis, may take a little longer. One thing that does seem very likely is a further entwining of SEO and PPC. It is highly improbable that the data needed for effective PPC optimisation will be blocked, so it will become increasingly important to bring lessons from PPC into our SEO campaigns.

With all of this information available, the SEO industry will still be able to run effectively. One issue that is sure to become more of a challenge, though, and will make our jobs quite a lot trickier, is accurately separating the performance of brand and non-brand traffic. Although, as we always have in the past, I’m sure we’ll find a way.
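To show how the page-level information listed above can stand in for missing keyword referrals, here is a minimal sketch. Everything in it is hypothetical, the data structures, page paths, keywords and numbers are placeholders, and the even split of a page’s conversions across its ranking keywords is a deliberately crude attribution assumption, not an industry-standard method.

```python
# Infer approximate keyword-level performance from page-level data:
# we still know which pages rank for which keywords, and how many
# conversions each page receives, so we can spread each page's
# conversions across the keywords that rank for it.

# keyword -> (landing page, current ranking position); placeholder data.
rankings = {
    "blue widgets": ("/widgets/blue", 3),
    "cheap widgets": ("/widgets/blue", 7),
    "widget repairs": ("/services/repairs", 2),
}

# landing page -> conversions observed in analytics; placeholder data.
page_conversions = {
    "/widgets/blue": 40,
    "/services/repairs": 15,
}

def conversions_by_keyword(rankings, page_conversions):
    """Split each page's conversions evenly across the keywords it
    ranks for -- a crude proxy when keyword referrals are hidden."""
    # Count how many keywords point at each page.
    keywords_per_page = {}
    for page, _position in rankings.values():
        keywords_per_page[page] = keywords_per_page.get(page, 0) + 1

    estimates = {}
    for keyword, (page, _position) in rankings.items():
        estimates[keyword] = (
            page_conversions.get(page, 0) / keywords_per_page[page]
        )
    return estimates

for keyword, conversions in conversions_by_keyword(rankings, page_conversions).items():
    print(f"{keyword}: ~{conversions:.0f} conversions")
```

A real implementation would weight the split by something better than an even share, for example by ranking position or by click-through estimates from Webmaster Tools, but the principle is the same: page-level data plus ranking knowledge can approximate what keyword referrals used to tell us directly.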