Cloaking in black-hat SEO: what it is, why it is used, what it means in traffic arbitrage, and whether it is worth using

Cloaking is a black-hat SEO promotion technique in which search robots and ordinary Internet users are served different versions of the same page for the same request.

The English word cloak means “mask”, “mantle”, or “cover”, which captures the essence of the technique quite well.

Writing high-quality text that contains every keyword needed for successful promotion, while also satisfying users who want readable and useful information, is very difficult. Webmasters therefore resort to creating two different versions of a page:

  1. A readable version, for users.
  2. A version tailored to keywords, for robots.

The main advantage of the method is its effectiveness: the project quickly climbs to leading positions in search results, and everyone is happy. Readers do not have to strain their eyes over spam and unnaturally inserted keywords, while bots receive an optimized document.

Important! Many people confuse cloaking with doorway pages, although the two differ significantly. A doorway, once the user clicks the link, redirects him to another resource that is irrelevant to his query, whereas with cloaking the required page is shown at the original address.

Besides deceiving search engines, cloaking is also used to make websites simpler and more comfortable to use. Black cloaking is when pages are optimized for the most popular keywords but actually contain links or advertisements that have nothing to do with the needs of the target audience.

But there are also more, so to speak, legitimate reasons to use cloaking:

  • Protecting materials from theft. For example, the site owner may want to hide code from users so that nobody can copy it.
  • Preserving page designs built with techniques that bots handle poorly when indexing. The version for search engines keeps the same structure and content but presents it in the form most convenient for them.
  • Displaying the resource in the required language, based on the browser settings (a minimal sketch follows this list).
  • Determining a visitor's location from the IP address.
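
As an illustration of the language point, here is a minimal sketch of serving content according to browser settings. It assumes a Flask application; the route and the greetings dictionary are hypothetical.

```python
# Minimal sketch: pick the response language from the browser's
# Accept-Language header. Assumes Flask; the content is hypothetical.
from flask import Flask, request

app = Flask(__name__)

GREETINGS = {"en": "Welcome!", "de": "Willkommen!", "fr": "Bienvenue !"}

@app.route("/")
def index():
    # best_match() returns the supported language the browser prefers most
    lang = request.accept_languages.best_match(list(GREETINGS)) or "en"
    return GREETINGS[lang]
```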

What is cloaking in traffic arbitrage

In traffic arbitrage, cloaking is often used to redirect the target audience to affiliate programs and other sites for financial gain. Content is split so that ordinary visitors see one version of a page while advertising moderators see another. This makes it possible to present text that complies with the advertising network's rules: one link serves two different documents.

How cloaking works

To create several versions of the same web page, you need to be not only a good programmer but also well versed in search engine optimization. You also need data on crawler User-agent strings or IP addresses.

The “masking” itself is carried out by scripts that run directly on the server. When a request comes in, the script has to identify its source and decide which version of the page to return: the user version or the search version. Without an IP address or User-agent, there is no way to tell who made the request.

How does User-agent work?

  1. The request arrives carrying a name in its User-agent header.
  2. The script looks the name up in its database of search robots.
  3. If the name is in the database, the server returns the corresponding page, optimized for search engines.
  4. If the name is not recognized, the page for ordinary Internet users is sent.
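
The four steps above translate almost directly into code. A minimal sketch, in which the bot names and file paths are illustrative assumptions:

```python
# Sketch of the User-agent check: serve the optimized page when the
# request's User-Agent contains a known crawler name (steps 2-3),
# otherwise the human version (step 4). Names and paths are placeholders.
KNOWN_BOTS = ("Googlebot", "YandexBot", "Bingbot")

def select_page(user_agent: str) -> str:
    if any(bot in user_agent for bot in KNOWN_BOTS):
        with open("optimized_for_bots.html") as f:   # keyword-rich version
            return f.read()
    with open("readable_for_users.html") as f:       # normal version
        return f.read()
```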

Despite the effectiveness of the technique, it has its drawbacks:

  • The name can be faked. Even an ordinary user, having installed a special program, can present himself under the name of a search robot and open the keyword-spammed page.
  • Bot names change from time to time, and if a robot whose name is not yet in the script's database visits the site, it will be shown the page intended for the live audience.

How do IP addresses work?

This technique works on the same principle as the User-agent check, but it is more reliable. It identifies the IP address of whoever made the request; IP addresses are very hard to fake, since each robot and user has a unique one. The script checks the visitor's IP address against its list of search engine addresses and then returns a page according to the source of the request: bot or human.
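
A minimal sketch of the same decision keyed on the IP address instead of the User-agent; the network ranges below are placeholders, not verified crawler addresses:

```python
# Sketch of the IP variant: compare the requester's address against a
# stored list of crawler networks. The ranges are illustrative only.
import ipaddress

BOT_NETWORKS = [ipaddress.ip_network(n)
                for n in ("192.0.2.0/24", "198.51.100.0/24")]

def is_search_bot(ip: str) -> bool:
    addr = ipaddress.ip_address(ip)
    return any(addr in net for net in BOT_NETWORKS)

def select_page_by_ip(ip: str) -> str:
    return "optimized page" if is_search_bot(ip) else "page for visitors"
```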

With a database of IP addresses, a webmaster can deceive not only bots but also search engine staff who check sites by hand from time to time.

On a note. If you want the best possible protection against being exposed by search engines when using cloaking, it is recommended to combine both methods, IP and User-agent. The scripts then check both parameters on every request, as in the sketch below.
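
Combining the two checks might look like this; the helpers are the hypothetical ones from the previous sketches:

```python
# Combined check: treat the visitor as a crawler only when BOTH the
# User-Agent and the IP address point to a search engine.
def choose_version(ip: str, user_agent: str) -> str:
    ua_says_bot = any(bot in user_agent for bot in KNOWN_BOTS)
    if ua_says_bot and is_search_bot(ip):
        return "optimized page"    # both parameters agree: robot
    return "page for visitors"     # otherwise assume a human
```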

How do search engines view cloaking?

As we mentioned at the very beginning, cloaking is a black-hat optimization method: almost all search engines oppose such techniques and treat them as spam that clogs their indexes and degrades the quality of search. When they detect sites using duplicate content to push pages quickly to the top of the results, they apply severe penalties.

Yandex, for example, immediately resorts to pessimization, after which the resource abruptly drops off the first pages of search results. This is how the search engine ensures that the leading positions hold only resources with high-quality, relevant, and useful content. Such preventive measures are not unique to Yandex.

The worst abuses occur when the page tailored for search engines differs from the original not only in the usefulness of its content but in its entire content and structure. The robot sees text packed with keywords, while the user receives, in response to his query, useless advertising and off-topic links.

How do ad networks view cloaking?

If you are engaged in traffic arbitrage, using cloaking exposes you to a ban of your advertising network accounts with no right of restoration. This method of deception is prohibited, and advertising systems are constantly improving their means of detecting it.

Conclusion

If you decide to use cloaking or similar methods, keep in mind that such techniques must be applied wisely; otherwise your account may be blocked by the advertising network. We strongly advise against using cloaking on sites to manipulate search results, since it is very difficult to pull a site out from under a filter imposed for this kind of deception.

Cloaking (from the English cloak: mask, cover) is one of the black-hat SEO methods, in which the search robot and the user are shown different versions of the same page.

Readable texts are difficult to optimize for every keyword, so webmasters develop two versions of the site's pages: one for the user and one for the robot.

Cloaking is somewhat similar to the doorway method, but it does not use automatic or manual redirection of the user to another page, which also reduces the likelihood of competitors stealing the optimized page. Creating copies of pages, however, is painstaking work: it requires not only the basics of programming but also data such as the robots' IP addresses or User-agent strings.

Cloaking, like doorways, can be divided into:

  1. Black: illegitimate. The user is shown text that does not match his query, while the robot is shown optimized text material to raise the ranking.
  2. Grey. Content on a site can sometimes be printed or displayed in plain text form, so the same material exists in two or more versions. Links to such copies from other sites can cause them to be indexed and classified as non-unique. To prevent this, a redirect is placed on the secondary links, passing the link juice to the original article. This method does not harm users and does not attract filters.
  3. White: legal cloaking. Websites redirect users to make the site easier to use. This is how geotargeting works (a sketch follows this list).
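
As an example of white cloaking, here is a geotargeting sketch. It assumes MaxMind's third-party geoip2 package and a downloaded GeoLite2-Country.mmdb database; the URL mapping is hypothetical.

```python
# Sketch of geotargeting: choose a regional URL from the visitor's IP.
import geoip2.database
import geoip2.errors

reader = geoip2.database.Reader("GeoLite2-Country.mmdb")

def regional_url(ip: str) -> str:
    try:
        country = reader.country(ip).country.iso_code  # e.g. "DE"
    except geoip2.errors.AddressNotFoundError:
        country = None
    return {"DE": "/de/", "FR": "/fr/"}.get(country, "/en/")
```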

How to create?

A user can be distinguished from a search robot by the IP address and the User Agent. The robot is automatically shown an optimized page, while the user is shown the standard site content. The following approaches are used when building such sites:

  1. User Agent checking is the easiest way to cloak. The webmaster inspects the User Agent data, which contains the search robot's name; one of the Yandex crawlers, for instance, identifies itself as Yandex/1.01.001 (compatible; Win16; I). Knowing the robots' names, you can write a function that compares visitors' User Agent strings against them and shows each side the required content. The scheme is easily detected: using a suitable program, anyone can visit the site under a robot's name and receive the “corrected” page. This is most often how competitors uncover cloaking in order to report the fraudulent site to the search engine, which then punishes the violators (see the sketch after this list).
  2. An IP address is the most effective basis for cloaking: search robots are identified by their IP addresses, which are quite difficult to falsify. Users can be shown the standard site by selecting pages according to a special database of addresses, and a robot cannot expose such a scheme automatically. This method, however, requires access to an address database.
  3. Combined cloaking checks both the User Agent and the IP address. It shows the best results but is the most labor-intensive.
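
The detection described in point 1 might look like this in practice: fetch the same URL twice with different User-Agent headers and compare the responses. The sketch assumes the third-party requests package; the URL is a placeholder.

```python
# Sketch of detecting cloaking from outside: request the page as a
# browser and as a crawler, then compare the two responses.
import requests

URL = "https://example.com/page"  # placeholder

as_browser = requests.get(URL, headers={"User-Agent": "Mozilla/5.0"}).text
as_crawler = requests.get(URL, headers={
    "User-Agent": "Mozilla/5.0 (compatible; Googlebot/2.1; "
                  "+http://www.google.com/bot.html)"
}).text

if as_browser != as_crawler:
    print("Different content for browser and crawler: possible cloaking")
```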

Cloaking and search engines

Almost all search engines consider cloaking illegitimate and suppress it in every possible way. Sometimes the page versions differ not only in the quality of content optimization but in being totally different from the original: a robot may see text material with keywords while a user sees advertisements. As mentioned above, though, not all cloaking is fraud.

Cloaking is used by Amazon.com (shows products depending on previously opened pages), Yelp.com, some Google services, NYTimes.com (registration is required after five clicks), ComputerWorld.com (users see advertising while the robot sees plain HTML text), and Forbes.com (you must watch a commercial to reach the main page).

Cloaking (from the English “cloaking”: to hide) is one of the black-hat methods of website search engine optimization. Its main point is to show the search engine one HTML code for a page and the visitor another. As a result, we end up with two versions of the same site at the same URL: one aimed at the visitor, the other at the search robot.

You may be asking what this is for. The answer is quite simple: to maximize earnings from the site's affiliate programs.

For example, if a page carries a lot of advertising in prominent positions, the chances that a user will click on it are very high. But promoting a page that contains only advertising would be foolish: the search engine knows what the page contains, and the likelihood of such a page reaching the top is negligible.

A page perfectly optimized for a search engine is another matter: no advertising, only good content tailored to the keywords. Such a site will rank highly in search results. However, this method of deception is very dangerous: if the search engine finds out, the site will at least lose positions and will most likely be thrown out of the index entirely.

How to do cloaking

A few words about how cloaking is done. It relies on a special script that distinguishes users from robots. The decision is based on either the User-Agent or the IP address.

  • The User-Agent of every search engine is publicly known, but this method is clearly unreliable, because a robot can present itself to the site as an ordinary user.
  • An IP address is a more dependable signal, but not an ideal one. First, assessors can reach your site through a proxy server or by other means and see its real content. Second, search engine IP addresses change constantly, and keeping that information up to date costs real money (one known workaround is sketched below).
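
The workaround in question is the verification procedure search engines themselves document: a reverse DNS lookup of the IP, a check of the resulting domain, then a confirming forward lookup. A minimal sketch using only the standard library:

```python
# Sketch of reverse-DNS verification of a crawler's IP: the hostname
# must belong to the search engine's domain, and a forward lookup of
# that hostname must return the original IP.
import socket

def verify_googlebot(ip: str) -> bool:
    try:
        host = socket.gethostbyaddr(ip)[0]        # e.g. crawl-....googlebot.com
        if not host.endswith((".googlebot.com", ".google.com")):
            return False
        return socket.gethostbyname(host) == ip   # forward lookup must match
    except OSError:                               # DNS lookup failed
        return False
```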

Despite all the dirty methods of using cloaking, it also has its advantages. Let's look at how it can be used for good.

Pros of cloaking

  • Output content in the required encoding depending on browser settings;
  • Improving the design by replacing poorly indexed elements, such as AJAX or JavaScript widgets. The user sees the content in a convenient form, and the search engine easily indexes the same content in a simpler form.

Despite the advantages, cloaking is usually employed not for the user's convenience but to inflate traffic and ultimately earn money. I am not a supporter of black-hat SEO, so I do not recommend cloaking. But if your competitors are guilty of it, I advise you to report them through the feedback forms in Yandex and Google.


Cloaking is a semi-legal search engine optimization method. Its essence is that users and search robots see two different versions of the same page for the same request.

In English the term means “mask, cover”, which explains the purpose of this way of presenting information on the web. It is difficult to produce good text that matches all the keywords, so optimizers create two versions of the site's pages: one that users can easily read, and one that contains all the keywords for the robots.

The obvious advantage of this technique is that the site quickly reaches high positions in the search results, and everyone gets what they want: the robots get an optimized page, and the user gets readable text without spam and verbal garbage.

Cloaking is often used in traffic arbitrage. Content is split so that advertising moderators and the target audience see different versions. This is done to fit the text to the advertising network's rules: a single link shows two different sites.

Don’t think that cloaking and doorways are the same thing. The difference is that cloaking does not redirect the user to another page.

Why is cloaking needed?

Cloaking is used not only to deceive search engines but also to make working with websites easier and more convenient. An example of black cloaking would be pages that rank for top keywords but actually contain ads or links irrelevant to users' queries.

There are also more harmless reasons for using cloaking:

  • Protect content from theft. Code that the owner guards against copying is not visible to users.
  • Serve websites in the required language, based on the browser settings.
  • Recognize the user's location by IP address.
  • Preserve page design built with methods that search robots handle poorly when indexing pages. The version created for search engines, identical in structure and content, is offered in the form that suits them best.

How cloaking works

Creating copy pages requires not only programming knowledge but also the ability to optimize text for a search engine. You also need data on the crawlers' IP addresses or User-agent strings.

Cloaking is carried out by scripts running on the web server. A script receives the request and determines its source: its task is to find out who is calling, a robot or a user, and to show the appropriate version of the page. The parameter used to determine the source of the request is the IP address or the User-agent.

Using User-agent

This is the name of the method that checks the User-agent data of a request on the server. The robot's name is taken from the request, and the script looks it up in its database. If the name matches a known robot, the optimized page is shown; if the name is not on the list, the page for users is displayed.

This is an affordable and effective technique, but it has a number of disadvantages:

  • It is easily detected, even at the user level. It is enough to install a special program and, under a robot's fake name, access the version meant for robots.
  • If a search engine gives its robot a name that is not yet in the script's database, the robot will be served the page created for people.

Use of IP addresses

This method works much like the User-agent check but is considered the most effective. Its essence is recognizing the IP address, which is very hard to fake: every user and robot has its own individual address. The script checks the visitor's IP against its search engine data, and after this check each side gets its own page: the user one version, the robot another.
With a database of these IP addresses, you can fool not only a robot but also real people: search engine staff who occasionally check sites by hand will see the same page as the spider.

The most reliable way to avoid detection when using cloaking is to combine both methods at once: the scripts are tasked with checking both the User Agent data and the IP address of each request.

Search engines against cloaking

Almost all search engines oppose such methods of work. They hold that cloaking, like spam, clogs databases and interferes with the normal operation of search. When such sites are recognized, penalties are applied to them.

Yandex fights this method of popularizing sites with pessimization: the site loses positions in the results for certain search queries. In this way the search engine strives to keep only resources with useful content in the TOP 10.

Other search engines besides Yandex also fight cloaking and stop it in various ways. It often happens that the page versions differ not only in the quality of the optimized text but in being completely different from the original: the robot looks at text with keywords, while the user sees advertisements and links unrelated to the query.

But let us remind you once again that cloaking is not always evil. For example, Google selects the version of the site’s home page based on the user’s region and language.

If you intend to use cloaking or similar methods, remember that this can be grounds for blocking the site. But if you can prove that you are using the technique for the benefit of users, you will be shown leniency.