What is cloaking?
Cloaking is an SEO technique in which the content shown to human users differs from the content presented to search engine crawlers (also called spiders or bots).
In the SEO industry, cloaking is considered a deceptive, black-hat practice in which web servers are specifically programmed to serve different content in order to improve rankings for certain keywords.
This technique violates search engine guidelines and should not be used, as it tricks and manipulates search engines to boost web traffic and rankings.
Who is responsible for cloaking?
Some product-selling agencies hire attackers to perform SEO cloaking. The attackers attempt to break into your website and, if they succeed, hide links within your web pages.
Through your website, they then promote and rank their own pages in search engines. You might wonder why they don’t perform SEO cloaking on their own websites: they know that cloaking can get a website blacklisted and penalized.
Types of cloaking
There are five types of cloaking, namely:
- IP-based cloaking: This is the most commonly used cloaking technique. The website owner uses the reverse DNS facility in the hosting control panel (such as cPanel) to identify a visitor’s IP address and redirect them from an already ranked web page to the desired page. The redirect itself is typically set up with ‘.htaccess’ rules.
- User-agent cloaking: A user agent is a program (a software agent) that acts on behalf of its user; a web browser is the most common example. With every request, the browser sends a User-Agent string that identifies it to the server. Cloaked content is served when the user-agent string identifies the requester as a crawler.
- HTTP_REFERER cloaking: In this method, the requester’s HTTP_REFERER header is checked, and a cloaked or uncloaked version of the page is served accordingly.
- HTTP Accept-Language header cloaking: This technique checks the user’s HTTP Accept-Language header before serving content. If the header suggests the request comes from a search engine, cloaked content is served.
- JavaScript cloaking: With this technique, different versions of the content are served depending on whether JavaScript is enabled. Because crawlers have historically lacked JavaScript support, the cloaked version is what search engines see.
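The user-agent check described in the list above can be sketched in a few lines of server-side code. This is a minimal illustration only, not a real cloaking script; the function names and the crawler signature list are assumptions for the example:

```python
# Illustrative sketch of user-agent cloaking. The signature list and
# page contents below are hypothetical examples.
CRAWLER_SIGNATURES = ("googlebot", "bingbot", "slurp", "duckduckbot")

def is_crawler(user_agent: str) -> bool:
    """Return True if the User-Agent string matches a known crawler."""
    ua = user_agent.lower()
    return any(sig in ua for sig in CRAWLER_SIGNATURES)

def choose_page(user_agent: str) -> str:
    """Serve a keyword-stuffed page to crawlers, the real page to users."""
    if is_crawler(user_agent):
        return "<html>keyword-rich content for bots</html>"
    return "<html>actual page shown to visitors</html>"
```

Naive checks like this are easily defeated: search engines also crawl with ordinary browser user agents and verify their crawlers’ IP addresses via reverse DNS, which is one reason cloaked sites are eventually caught.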
Common ways to implement cloaking
Below are some standard methods used to implement cloaking on websites:
- Invisible or hidden text: This is achieved by adding text in the same colour as the background so that it is not visible to the human eye.
- HTML-rich websites: SEO guidelines recommend that a web page’s “text-to-HTML ratio” be as high as possible, meaning the page should contain more visible text (content) than HTML markup. A short article or post results in a low text-to-HTML ratio, so instead of redesigning their websites to meet the guidelines, some people resort to cloaking.
- Flash-based websites: Flash is well known to be discouraged by SEO guidelines, yet some websites are built in Flash or rely heavily on it and cannot easily avoid it. Rather than rebuilding the site and rewriting everything in plain HTML, the owners create content-rich pages for search engine crawlers and serve the Flash pages to visitors.
- Image gallery websites: An image gallery website contains more images than actual text content. Search engine bots cannot read the text inside those images, however relevant it may be, so webmasters add hidden keywords and content to camouflage their pages and boost rankings.
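The text-to-HTML ratio mentioned above can be estimated with a short script. This is a minimal sketch using only the standard library; the exact formula varies between SEO tools, so this version (visible-text length divided by total markup length) is an assumption for illustration:

```python
# Sketch: estimate a page's text-to-HTML ratio with the standard library.
# The formula (visible text length / total HTML length) is an assumption;
# real SEO tools may measure this differently.
from html.parser import HTMLParser

class _TextExtractor(HTMLParser):
    """Collects the visible text nodes of an HTML document."""
    def __init__(self):
        super().__init__()
        self.chunks = []

    def handle_data(self, data):
        self.chunks.append(data)

def text_to_html_ratio(html: str) -> float:
    """Return visible-text length divided by total markup length (0.0-1.0)."""
    if not html:
        return 0.0
    parser = _TextExtractor()
    parser.feed(html)
    text = "".join(parser.chunks).strip()
    return len(text) / len(html)
```

A page that is mostly markup or images scores near 0, while a content-heavy page scores higher; the legitimate fix for a low ratio is adding real content, not cloaking.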
Google’s response to cloaking
Google may penalize or blacklist a website for violating Google’s Webmaster Guidelines. This results in a loss of rankings and of organic traffic for the website.
What action should you take if cloaking is found on your website?
Since long-term use of cloaking can get your website blacklisted, you must act quickly. If you have coding knowledge, you can remove the injected spam and secure your site yourself; if you don’t know how, you should hire an SEO specialist for assistance.
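One quick self-check (a rough sketch, not a full audit) is to fetch your own page twice, once with a browser User-Agent and once with a crawler User-Agent, and compare the two responses. The function names and the 0.9 similarity threshold below are assumptions for illustration:

```python
# Sketch: detect possible cloaking on your own site by comparing the HTML
# served to a browser User-Agent vs. a crawler User-Agent.
# The 0.9 similarity threshold is an arbitrary assumption.
import difflib
import urllib.request

BROWSER_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"
CRAWLER_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

def fetch(url: str, user_agent: str) -> str:
    """Download a page while presenting the given User-Agent string."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode("utf-8", errors="replace")

def similarity(html_a: str, html_b: str) -> float:
    """Ratio in [0, 1]; low values suggest different content was served."""
    return difflib.SequenceMatcher(None, html_a, html_b).ratio()

def looks_cloaked(url: str, threshold: float = 0.9) -> bool:
    """Flag the URL if the two responses differ more than the threshold allows."""
    return similarity(fetch(url, BROWSER_UA), fetch(url, CRAWLER_UA)) < threshold
```

Dynamic pages (ads, timestamps, session tokens) legitimately differ between fetches, so treat a low score as a prompt to inspect the page source, not as proof of cloaking.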
Cloaking is not tolerated by any major search engine. Search engine algorithms are smarter than such tricks and are frequently updated to detect and discourage cloaking and other techniques that create a negative user experience. Sooner or later, the search engines detect the cloaking and ban the websites concerned.