Cloaking in black-hat SEO: is it worth using? What is cloaking, and how do you cloak in different traffic sources? Do you need to hide from an advertiser or an affiliate program?



Cloaking is a semi-legitimate method of search engine optimization. Its essence is that users and search engine robots are shown two different versions of the same page in response to the same request.


Translated from English, the term means “to mask” or “to cover,” which reflects the purpose of this way of presenting information on the web. It is difficult to write good text that also fits every keyword, so optimizers create two versions of a page: one that users can easily read, and one stuffed with all the keywords for the robots.

The obvious advantage of this technique is that the site quickly reaches high positions in the search results, and everyone gets what they want: the robots get an optimized page, and the user gets readable text without spam and verbal garbage.

Cloaking is also often used in paid advertising. The same link is made to open different content for the advertising moderators and for the target audience. This is done to fit the required text to the rules of the advertising network: with just one link, two different sites are shown.

Don’t think that cloaking and doorway pages are the same thing. The difference is that cloaking does not redirect the user to another page.

Why is cloaking needed?

Cloaking is used not only to deceive search engines, but also to make working with sites simpler and more convenient. An example of black cloaking would be pages that rank for popular keywords but actually contain ads or links that are not relevant to user queries.

There are also more harmless reasons for using cloaking:

  • Protecting content from theft. Code that the owner wants to keep from being copied is not shown to users.
  • Serving the site in the required language, based on the visitor's browser settings (see the sketch after this list).
  • Recognizing the user's location by IP address.
  • Preserving page design built with techniques that search robots handle poorly when indexing. A version with the same structure and content is offered to search engines in the form that is most favorable for them.
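As an illustration of the harmless cases, here is a minimal sketch of language selection from the visitor's browser settings. It assumes a Flask application; the route, the page texts, and the language list are hypothetical.

```python
# A minimal sketch of "white" cloaking: choosing the page language from the
# Accept-Language header the browser sends. Flask, the route, and the page
# texts are illustrative assumptions, not part of the original article.
from flask import Flask, request

app = Flask(__name__)

PAGES = {
    "en": "<h1>Welcome</h1>",
    "ru": "<h1>Добро пожаловать</h1>",
}

@app.route("/")
def home():
    # best_match() picks the closest supported language from the header,
    # e.g. "Accept-Language: ru-RU,ru;q=0.9,en;q=0.8" resolves to "ru".
    lang = request.accept_languages.best_match(list(PAGES)) or "en"
    return PAGES[lang]
```

The same request-inspection pattern underlies the black-hat variants described below; only the criterion and the intent differ.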

How cloaking works

Creating duplicate pages requires not only programming knowledge, but also the ability to optimize text for a search engine. You also need information about the robots' IP addresses or User-Agent strings.

Cloaking is implemented by scripts running on the web server. They receive the request and determine where it came from: their task is to find out whether a robot or a user is calling and to show the appropriate version of the page. The source of the request is determined by its IP address or User-Agent.

Using the User-Agent

This is the name of the method that checks the User-Agent data of a request on the server. The script looks up the name sent with the request in its database of known search robot names. If the request carries the name of such a robot, the optimized page is shown; if the name is not in the list, the user version of the page is served.
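A minimal sketch of such a check, assuming an illustrative (and deliberately short) list of bot-name substrings; real lists are much longer, and the file names are placeholders.

```python
# A sketch of User-Agent based cloaking as described above. The bot-name
# substrings and file names are illustrative assumptions.
KNOWN_BOTS = ("Googlebot", "YandexBot", "bingbot")

def page_for(user_agent: str) -> str:
    """Return which page version to serve for a given User-Agent header."""
    if any(bot.lower() in user_agent.lower() for bot in KNOWN_BOTS):
        return "optimized_page.html"   # keyword-stuffed version for robots
    return "user_page.html"            # readable version for visitors

# A real Googlebot header contains the substring "Googlebot":
print(page_for("Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"))
```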

This technique is cheap and effective, but it has a number of disadvantages:

  • It is easily detected even at the user level: it is enough to use a program that sends a fake User-Agent name and request the version intended for robots.
  • A search engine can change its robot's name to one that is not in the script's database and will then be shown the page created for people.

Use of IP addresses

This method works much like the User-Agent method, but is considered the most effective. Its essence is recognizing the IP address, which is far harder to fake. Every user and robot has its own individual address, and the script compares the visitor's IP against its database of search engine addresses. Based on this check, each side gets its own page: one for the user, another for the robot.
With a database of these IP addresses you can fool not only the robot but also real people: search engine employees who sometimes check sites manually will be shown the same page as the spider.
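A minimal sketch of the IP check, assuming a hypothetical short list of crawler network ranges; in practice such lists are much larger and must be kept up to date.

```python
# A sketch of IP-based cloaking as described above. The network ranges and
# file names are illustrative assumptions, not an authoritative crawler list.
import ipaddress

SEARCH_ENGINE_NETS = [
    ipaddress.ip_network("66.249.64.0/19"),  # example range attributed to crawlers
    ipaddress.ip_network("77.88.0.0/18"),
]

def page_for_ip(remote_addr: str) -> str:
    ip = ipaddress.ip_address(remote_addr)
    if any(ip in net for net in SEARCH_ENGINE_NETS):
        return "optimized_page.html"   # version for the search robot
    return "user_page.html"            # version for ordinary visitors

print(page_for_ip("66.249.66.1"))  # falls inside the first range, so the robot page is chosen
```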

The most reliable way to avoid detection when using cloaking is to combine both methods: the scripts check both the User-Agent data and the IP address of each request.

Search engines against cloaking

Almost all search engines oppose such methods. They consider that cloaking, like spam, clutters their indexes and interferes with normal search operation, and when such sites are detected, penalties are applied to them.

Yandex fights this method of promotion with pessimization: the site loses positions in the search results for certain queries. In this way the search engine tries to keep resources with useful content in the TOP 10.

Other search engines besides Yandex also fight cloaking and suppress it in different ways. It often happens that the page versions differ not only in how well the text is optimized, but in the content itself: the robot sees text with keywords, while the user sees advertisements and links unrelated to the query.

But let us remind you once again that cloaking is not always evil. For example, Google selects the version of the site’s home page based on the user’s region and language.

If you intend to use cloaking or similar methods, remember that this can be grounds for blocking the site. But if you can prove that you are using the technique for the benefit of users, search engines may show leniency.

Cloaking (from the English “cloaking”, to hide) is one of the black-hat methods of website optimization. Its main purpose is to show the search engine one version of a page's HTML code and the visitor another. As a result, we get two versions of the same page at the same URL: one aimed at the visitor, the other at the search robot.

You may be asking what this is for. The answer is quite simple: to maximize the site's earnings from affiliate programs.

For example, if a page contains a lot of advertising in prominent places, the chances that a user will click on it are high. But promoting a page that contains only advertising would be pointless, since the search engine knows what is on the page, and the likelihood of such a page reaching the top is negligible.

A page perfectly optimized for the search engine is another matter: no advertising, only good content tailored to the keywords. Such a site will rank highly in the results. However, this method of deception is very dangerous: if the search engine finds out, the site will at least lose positions and, most likely, will be thrown out of the index entirely.

How to do cloaking

A few words about how cloaking is done. It relies on a special script that distinguishes users from robots; the split is based on either the User-Agent or the IP address.

  • The User-Agent of every search engine is publicly known, but this method is clearly unreliable, because a robot can disguise itself as an ordinary user, and anyone can just as easily pose as a robot (see the sketch after this list).
  • An IP address is a more reliable criterion, but it is not ideal either. Firstly, assessors can access your site through a proxy server or in some other way and see its real content. Secondly, the IP addresses of search engines change constantly, and keeping the list up to date costs real money.
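To illustrate why the User-Agent criterion is so fragile, here is a minimal sketch of checking a site from the outside: fetch the same URL with a normal browser User-Agent and with a bot-like one, then compare the responses. The URL is a placeholder.

```python
# A sketch of detecting User-Agent cloaking from the outside. The URL is a
# placeholder assumption; any page can be checked the same way.
import urllib.request

URL = "https://example.com/"
USER_AGENTS = {
    "user": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "bot": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
}

def fetch(user_agent: str) -> bytes:
    req = urllib.request.Request(URL, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req) as resp:
        return resp.read()

if fetch(USER_AGENTS["user"]) != fetch(USER_AGENTS["bot"]):
    print("The page differs for users and robots - likely User-Agent cloaking.")
```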

Despite all the dirty ways cloaking is used, it also has its advantages. Let's look at how it can be used for good.

Pros of cloaking

  • Serving content in the required encoding depending on browser settings;
  • Improving the design by replacing poorly indexed elements such as AJAX or JavaScript widgets. The user sees the content in a convenient form, while the search engine easily indexes the same content in a simpler form.

Despite these advantages, cloaking is usually used not for the user's convenience but to increase traffic and, ultimately, earn money. I am not a supporter of black-hat SEO, so I do not recommend cloaking. But if your competitors are guilty of it, I advise you to report them through the feedback forms of Yandex and Google.

Cloaking (from the English cloak: to mask, to cover) is one of the methods of black-hat SEO in which the search robot and the user are shown different versions of the same page.

Readable texts are difficult to optimize for every keyword, so webmasters develop two versions of a site's pages: one for the user and one for the robot.

Cloaking is somewhat similar to the doorway method, but it does not use automatic or manual redirection of the user to the desired page, which also reduces the likelihood of your optimized page being stolen by competitors. Creating page copies, however, is painstaking work: it requires not only the basics of programming but also data such as the IP addresses or User-Agent strings of robots.

Cloaking, like doorways, can be divided into:

  1. Black: prohibited by search engines. The user is shown text that does not correspond to the query, while the robot is shown optimized text material to improve ranking.
  2. Grey. Sometimes the content on a site can also be printed or displayed as a plain-text version, so the same material exists in two or more variants. Links to such copies from other sites may cause them to be indexed and classified as non-unique. To prevent this, a redirect is placed on the secondary URLs, passing the link juice to the original article (see the sketch after this list). This method does not harm users and does not lead to filters.
  3. White: legitimate cloaking. Sites adapt or redirect users to make the site easier to use; this is how geotargeting works.
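A minimal sketch of the grey variant from point 2, assuming a Flask application: the print copy of an article stays available to people, while robots following links to it are permanently redirected to the canonical URL. The route, the bot markers, and the URL scheme are hypothetical.

```python
# A sketch of "grey" cloaking for duplicate (print) versions of an article.
# Flask, the route, and the bot-name markers are illustrative assumptions.
from flask import Flask, redirect, request

app = Flask(__name__)

BOT_MARKERS = ("Googlebot", "YandexBot", "bingbot")

@app.route("/articles/<slug>/print")
def print_version(slug):
    ua = request.headers.get("User-Agent", "")
    if any(marker.lower() in ua.lower() for marker in BOT_MARKERS):
        # A 301 passes the link weight of the copy to the original article.
        return redirect(f"/articles/{slug}", code=301)
    return f"Printable version of the article '{slug}'"
```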

How to create?

A user can be distinguished from a search robot by IP address or User-Agent. The robot is automatically shown an optimized page, while the user is shown the standard site content. The following approaches are used when developing such sites:

  1. User-Agent: the easiest way to cloak. The webmaster inspects the User-Agent data, which contains the name of the search robot; one of the Yandex crawlers, for example, identifies itself as Yandex/1.01.001 (compatible; Win16; I). Knowing the robots' names, you can write a function that compares visitors' User-Agent strings against them and shows each side the required content. It is easily detected: using special tools, it is enough to visit the site under a robot's name and receive the "corrected" page. Most often competitors detect cloaking this way in order to report the fraudulent site to the search engine, which then punishes the violators.
  2. IP address: the most effective cloaking method. Search robots are identified by their IP addresses, which are quite difficult to falsify. Users are shown the standard site, with its pages selected from a special database; such a scheme is hard to recognize even by a manual check. However, this method requires access to a database of addresses.
  3. Combined cloaking: a combination of User-Agent and IP address checks. This method gives the best results, but it is the most labor-intensive (see the sketch below).
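A minimal sketch of the combined check from point 3: a request is treated as a search robot only if both its User-Agent and its IP address match. The bot names, network ranges, and file names are illustrative assumptions.

```python
# A sketch of combined cloaking: both the User-Agent and the IP must match.
# The names, ranges, and file names are illustrative assumptions.
import ipaddress

KNOWN_BOTS = ("Googlebot", "YandexBot", "bingbot")
SEARCH_ENGINE_NETS = [
    ipaddress.ip_network("66.249.64.0/19"),
    ipaddress.ip_network("77.88.0.0/18"),
]

def is_search_robot(user_agent: str, remote_addr: str) -> bool:
    ua_match = any(bot.lower() in user_agent.lower() for bot in KNOWN_BOTS)
    ip_match = any(ipaddress.ip_address(remote_addr) in net for net in SEARCH_ENGINE_NETS)
    return ua_match and ip_match

def choose_page(user_agent: str, remote_addr: str) -> str:
    return "optimized_page.html" if is_search_robot(user_agent, remote_addr) else "user_page.html"
```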

Cloaking and search engines

Almost all search engines treat cloaking as a violation and suppress it in every possible way. Sometimes the page versions differ not just in how well the content is optimized but in the content itself: a robot may see keyword-rich text while the user sees advertisements. As mentioned above, though, not all cloaking is a scam.

Cloaking is used by Amazon.com (it shows products depending on previously viewed pages), Yelp.com, some Google services, NYTimes.com (it prompts registration only after 5 clicks), ComputerWorld.com (users see ads while the robot sees plain HTML text), and Forbes.com (to reach the home page, you have to watch one commercial).