At the beginning of this year, Google changed its algorithm to filter out duplicate content and attempt to identify and credit the original author. This was an effort to fight spam websites that gain search engine ranking by copying good-quality content and bumping the original page down in search results.
This update changed how duplicate content is displayed in search results. Multiple copies of the same content on different sites used to receive equal credit. Now Googlebot compares more than 200 signals to determine which copy belongs to the original author, and that copy receives all of the credit.
Websites hijacked by a loophole in Google's algorithm
An Australian search engine optimizer made an interesting discovery that allowed him to exploit a loophole in the new algorithm. After performing only three tests on three different websites, he was able to hijack their search results by creating a duplicate page of their content, essentially stealing their online identity.
This loophole is not technically a violation or a bug in Google's algorithm; however, I expect it to be exploited and treated as a bug in the near future.
How to defend your website traffic from being hijacked
Web traffic can be compared to gold in our society, since it can be a driving force in the success of a business, whether that business has a physical location, an internet presence, or both. A few things you can do to deter this type of activity are to manage certain meta tags within your website's code and make sure it is optimized for search engines.
- Canonicalization
Adding link rel="canonical" to your web pages will tell Google the source of the content (see the markup sketch after this list).
- Authorship
Create a Google+ page and set up Google+ authorship for your domain or content.
- Internal Links
Use absolute URLs in your internal links instead of relative URLs. If any sites copy your content with a bot, the links will still point back to your domain instead of theirs.
- Monitor your content
Add branded keywords (i.e. fishpunt) to Google Alerts and it will send you a report of new and existing content on the web that contains that specific term. Be sure to use advanced search operators to keep non-relevant content from appearing.
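Here is a minimal markup sketch of the first three items. The fishpunt.com URLs and the Google+ profile ID are placeholder values for illustration, not taken from any real site:

```html
<head>
  <!-- Canonicalization: declare this URL as the original source of the content -->
  <link rel="canonical" href="http://www.fishpunt.com/articles/defend-your-traffic" />

  <!-- Authorship: point to the author's Google+ profile (placeholder ID) -->
  <link rel="author" href="https://plus.google.com/112345678901234567890" />
</head>
<body>
  <!-- Internal links: an absolute URL keeps pointing at your domain even if the page is copied -->
  <a href="http://www.fishpunt.com/contact">Contact us</a>
  <!-- rather than a relative link like <a href="/contact">Contact us</a> -->
</body>
```

For the Google Alerts step, quoting the exact term and excluding your own domain (for example, "fishpunt" -site:fishpunt.com) can help keep irrelevant results out of the report.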
Exploiting Webmaster Tools to research your competition
The webmaster tools available from Google and Bing give a detailed view of how these search engines see a specific domain and its internal pages. It is good practice to use multiple tools to make sure you have all the details needed to boost any website marketing strategy.
Why is this so important? There are many SEO tools available, such as Raven Tools, Market Samurai, and others, that compile research based on the data each search engine provides. Their biggest pitfall, however, is that they are simply compiling information that is already publicly available through webmaster tools, analytics, web traffic logs, and other tracking scripts.
The exploit is in the duplicated content
In concept and execution, Google's new algorithm update seemed to work out well: spam sites started moving down in the results and original authors began moving up in rank. The flaw is that Googlebot gives competitor information to both the content owner and the hijacker through webmaster tools, since it believes that both documents are the same entity.
Here's an example:
If red.com wants to see the competitor information for a page on blue.com, it would create a subdomain called blue.red.com containing a complete copy of the blue.com page, mirroring the page URL, content, images, and HTML code.
When Googlebot sees the duplicated page, it has to weigh its 200-plus signals to decide which page has more authority and then credits the one it believes belongs to the original author. If the hijacker's copy is chosen as the original, it will replace the originally authored page in the search results.
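This is also where the canonical tag from the defensive list above earns its keep. As a rough sketch, assuming the hijacker mirrors the HTML verbatim, the copied page still carries a canonical tag pointing back at the original domain (blue.com is a placeholder here, matching the example above):

```html
<!-- On the original page, e.g. http://blue.com/widgets -->
<link rel="canonical" href="http://blue.com/widgets" />

<!-- A verbatim mirror at blue.red.com/widgets would include the same tag,
     still pointing at blue.com, so the copy identifies the original source
     itself (unless the hijacker strips or rewrites the tag). -->
```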
In conclusion
My advice is to stay away from exploiting this loophole with ranked domains. Even though it is not technically a violation, using this technique can result in a penalty if Google decides to flag it as an issue. I foresee several reports of hijacked website rankings over the next few months as people push the limits to see how far they can manipulate search results.