Google's new algorithm, KBT: how does Google search work? On the changes in Google's search algorithms

Technologies

Google search works based on an algorithm. Computers process data, looking for signals that a particular site is more or less interesting and relevant to a given query. PageRank plays a large part in this process: it is how Google evaluates external links and the text associated with them. But Google also uses many other signals to determine page relevance; otherwise rankings would rest on PageRank alone.

Incoming links to various web pages are rated, and taking those links into account gives fairly good results. That is what made Google unique back in 1999, when search results were less of an endless game and involved less human intervention.

But today all that has changed. Google's algorithm is increasingly subject to manipulation by motivated people and companies fighting over it. It has become an "arms race", and Google is losing.

Another recent event does not play in the search engine's favor: changes have appeared in the Google algorithm aimed at combating some of the most annoying Internet polluters, such as "content farms". The "arms race" continues.

Google insists it is not directly using the recently launched feature that lets users block irrelevant results in the Chrome browser. However, judging by how the algorithm works, Google's new solution behaves much like those user blocks.

“If you take the top few dozen sites that Chrome users blocked most often, 84 percent of them are downgraded in the new algorithm,” the company says.

A strange statement, however. After all, this creates a huge incentive to manipulate blocking: forget content farms, just hire thousands of people to block competitors' sites. Another side of the "arms race".

Google, like the wizard of Oz, hides behind its powerful and mysterious search algorithm. If good search turns out to be as simple as users clicking on web pages, all the magic could go away.

And if spam can somehow be filtered out more effectively using demographic or geographic indicators, or even users' personal data, things could get quite unpleasant for Google, especially if Google has no access to that data while Facebook does.

One of the most interesting experiments happening right now concerns the possibility of searching based on links you like on Facebook. Search results are determined by what your friends liked on Facebook, and this data is stored for a very long time.

The social network is used by more than five hundred million people, half of whom check their profile every day. These people spend 700 billion minutes on the site and share thirty billion pieces of content. They share links, and clicking the "like" button is a vote for that content. Overall, the search experiment with Blekko built on this data is quite interesting.

Now imagine what Google could do with all this data, and you will understand why social media is so important to it now. Not to "kill" Facebook, but to try to "neutralize" the threat so that the next big leap in search engine evolution does not happen without Google. Where Google tends to fall short is commerce and travel, which is why a friend's recommendation of a hotel in Paris, for example, can be a valuable addition to search results.

It may take more than a year before Facebook gets seriously interested in search. But there is so much advertising revenue in this business that it cannot be ignored forever. This, in fact, is what should scare Google the most.

We all know firsthand about the existing algorithms of the Yandex and Google search engines. It is to comply with their "constantly updated" rules that optimizers keep racking their brains over new ways to get to the TOP of search results. Among the latest innovations that site owners have felt from the search engines are requirements for the mobile-friendliness of Internet resources and the demotion of sites that buy links carelessly. Which algorithms introduced into search so far have significantly influenced site rankings? In fact, not all optimizers know which technologies were created, when, and why, in order to give each site its fairest possible position in search and to clear the results of "junk". We will look at the history of the creation and development of search algorithms in this article.

Yandex: types of algorithms from conception to today

The algorithms were not all created in one day, and each of them has gone through many stages of refinement and transformation. Most Yandex algorithms are named after cities. Each has its own operating principles, points of interaction, and unique functional features that harmoniously complement one another. Below we will look at which algorithms Yandex has and how they affect sites.

In addition to the information about search algorithms, I also suggest reading tips on creating high-quality SEO content suitable for both Google and Yandex.

Magadan

The Magadan algorithm recognizes abbreviations and matches nouns with verbs (different parts of speech of the same word). It was first launched in test mode in April 2008, and the second, permanent version was released in May of the same year.

Peculiarities

"Magadan" provides the user who wrote the abbreviation with websites and transcripts. For example, if in search bar entered the request of the Ministry of Internal Affairs, then in addition to sites with such a keyword, the list will also contain those who do not have an abbreviation, but have the decoding “Ministry of Internal Affairs”. Transliteration recognition gave users the opportunity not to think in what language to write names correctly, for example, Mercedes or Mercedes. In addition to all this, Yandex included almost a billion foreign sites in the indexing list. Recognition of parts of speech and recognition of them as equivalent search queries allowed sites with different key phrases to be included in one search. That is, now, for the keyword “website optimization”, sites with the phrase “optimize website” are also displayed in the search results.

results

After the launch of the Magadan algorithm, life became harder, mainly for low-authority sites. In the rankings, the positions of low-traffic and young resources for relevant queries dropped, while authoritative ones, even with low-quality content, moved up to the first places thanks to morphology and keyword dilution. Due to the inclusion of transliteration, foreign resources also entered the TOP of the Runet. That is, an optimized text on a topic could end up on the second page only because a more visited site on the same topic, or a similar foreign one, supposedly existed. Because of this, competition for low-frequency keywords and foreign phrases increased sharply. Advertising also became more expensive: rates went up, because previously sites competed only on one specific query, whereas now they also compete with "colleagues" using morphological variants, transliteration, and words converted into another part of speech.

Nakhodka

The “Nakhodka” algorithm brought an expanded thesaurus and careful handling of stop words. It entered the ring immediately after Magadan and has ranked the main search results since September 2008.

Peculiarities

This was an innovative approach to machine learning: ranking became clearer and more correct. The expanded dictionary of word connections and the attention to stop words in the Nakhodka algorithm greatly influenced the search results. For example, a query such as “SEO optimization” was now associated with related keys, commercial sites were diluted with information portals in the results, expanded snippets with answers appeared in the list, and Wikipedia was displayed in a special way.

results

Commercial sites placed greater emphasis on sales queries, as competition for informational, non-specific phrases increased several times over. In turn, information platforms were able to expand their monetization through recommendation pages and affiliate programs. Top information sites promoted for commercial queries began selling links to order. Competition became tougher.

Arzamas

Algorithm "Arzamas" - lexical statistics of search queries was introduced and a geographical reference of the site was created. The first version of "Arzamas" (April 2009) without geo-dependence was released immediately into the main search results, and "Arzamas 2" with a classifier for linking the site to the region was announced in August 2009.

Peculiarities

Resolving homonyms made life easier for users: the phrase “American pie” now returned only movie-themed sites, without dessert recipes, as could have happened before. Regional binding was a breakthrough, pushing key phrases with a city name added several positions down. Now the user could simply type the word “restaurants” and see in the top results only sites from the city he was in. If you remember, earlier you would have had to enter a more specific phrase, for example “restaurants in St. Petersburg”, otherwise Yandex could return the response “refine the query - too many options were found”. Geo-independent keywords returned only sites relevant to the query, from any region, without the binding.

results

Hooray! Sites from small regions finally stopped competing with the big cities. Reaching the TOP in your own region became much easier. It was during this period that the “regional promotion” service appeared. The Arzamas algorithm allowed small companies to develop faster in their area, but the catch remained: Yandex could not determine the geolocation of every site. And, as you can imagine, resources without a regional binding were left, to put it mildly, in a not very pleasant place. Reviewing an application for geo-dependence could take several months, and young sites without traffic and link mass (there was a TIC threshold) could not even submit a request to be assigned a region. A double-edged sword.

Snezhinsk

The Snezhinsk algorithm strengthened geo-dependence and refined the matching of queries to search results using the Matrixnet machine learning technology. The announcement took place in November 2009, and the improved model, named “Konakovo”, went into operation in December of the same year.

Peculiarities

Search results became more closely matched to the queries entered. Geolocation binding now plays a special role: commercial sites that the Snezhinsk algorithm could not associate with a region dropped out of the search results. Keywords not tied to a location were identified with information resources. The complex architecture for calculating relevance greatly complicated life for optimizers, who noticed that the slightest change in one of the indicators instantly changed a site's position in the results.
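MatrixNet itself is proprietary, so as a rough illustration of machine-learned ranking in general (an assumption-laden sketch, not Yandex's implementation), the code below trains a gradient-boosted model on hand-labeled relevance scores over a few made-up features and uses it to order candidate pages.

```python
# A toy learning-to-rank sketch: train a regressor to predict a relevance
# label from ranking features, then sort candidate pages by its prediction.
# Feature values and labels are invented; this is not MatrixNet.
from sklearn.ensemble import GradientBoostingRegressor

# Features per (query, page) pair: [keyword_frequency, link_count, is_regional]
X_train = [
    [0.12, 150, 1],
    [0.30,  10, 0],
    [0.05, 400, 1],
    [0.45,   2, 0],
]
y_train = [0.9, 0.3, 0.8, 0.1]  # hand-labeled relevance scores

model = GradientBoostingRegressor(n_estimators=50, max_depth=3)
model.fit(X_train, y_train)

candidates = {"site-a.example": [0.10, 200, 1], "site-b.example": [0.40, 5, 0]}
ranked = sorted(candidates,
                key=lambda url: model.predict([candidates[url]])[0],
                reverse=True)
print(ranked)  # pages ordered by predicted relevance
```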

results

At the time it was noted that buying external links for young sites influenced the performance of these new resources too sluggishly compared with a similar purchase for a site that had long been on the Internet market. New methods of determining how well content matched search queries removed sites whose texts were oversaturated with key phrases from the results. A new era of quality text began, in which there had to be moderation in everything; without it, a site could simply fall under sanctions for spam. Commercial resources panicked, because reaching the TOP on geo-independent keywords (which were the highest-frequency ones) became almost impossible. In this regard, a post appeared on the Yandex blog saying that, ideally, they would like to see on the first pages commercial organizations that do not write beautifully but do their job well, but that this would require teaching the algorithms to evaluate the quality of the services offered. Since at that moment this proved an impossible task, the reputation of commercial Internet resources, both online and offline, played a key role in the search results.

Obninsk

The “Obninsk” algorithm improved ranking, expanded the geographic base of Internet sites, and reduced the impact of artificial SEO links on site performance. It launched in September 2010.

Peculiarities

The popularity of buying link mass fell, and the concept of a “link explosion” appeared, which everyone was now afraid of. Competitors could harm one another by misleading the algorithm: buying a huge number of links from “bad sources” pointing at a “colleague”. After that, the competitor dropped out of the search results and could not get back in for a long time. Geo-sensitive words were added more often to different pages of commercial sites to draw the robot's attention to the region they work with.

results

Commercial sites now take more care of their reputation, which is good news, although many still resorted to dirty methods (artificially inflating traffic and buying reviews). After the release of the Obninsk algorithm, buying "eternal" links and articles became more popular; ordinary purchased links no longer influenced ranking as much as before, and if the source of a backlink fell under sanctions, a chain reaction could follow. High-quality SEO texts became a mandatory attribute of any resource. A young site with unique and properly optimized content could reach the TOP.

Krasnodar

Algorithm "Krasnodar" - implementation of the "Spectrum" technology to dilute search results, expand snippets and index social networks. The launch took place in December 2010.

Peculiarities

The “Spectrum” technology was created to classify queries into categories and was applied when a non-specific key phrase was entered. “Krasnodar” diversified the results, offering such a user a broader range of options. For example, for the phrase “photo of Moscow”, the search showed not only general cityscapes but also photographs grouped into categories such as “attractions”, “maps”, and “restaurants”. The emphasis shifted to unique names (of sites, models, products): specifics began to stand out. Rich snippets made it possible to show users an organization's contacts and other details right in the search results.

results

The ranking of commercial sites changed significantly; special attention is paid to details (product cards, separating the short description from the full one). The VK social network began to be indexed, and member profiles became visible directly in search results. Posts on forums could rank first if they answered the user's question more thoroughly than other sites.

Reykjavik

The “Reykjavik” algorithm introduced personalization of search results and added “Wizards” technology for displaying preliminary answers to a query. The formula for input hints was improved. The algorithm launched in August 2011.

Peculiarities

The motto of personalized search results is “every user gets his own results”. The system for remembering searchers' interests worked through cookies, so if a user's queries were more often related, for example, to foreign resources, those resources would appear among the leaders of the results next time. Hints in the search bar are updated every hour, expanding the possibilities of specific searches. Competition for high-frequency queries grew with incredible force.

results

Reputable news sites reach the TOP more often thanks to their expanded semantic core (a huge number of different low-frequency key queries). The growth in the number of pages targeting specific search queries on information sites began to play a major role after the release of the Reykjavik algorithm. Every site tried to get into the user's bookmarks in order to become part of the personalization system; for this, RSS subscriptions and pop-up banner hints prompting users to bookmark the site were used. Internet resources began to pay more attention to an individual approach rather than pressuring the masses.

Kaliningrad

The “Kaliningrad” algorithm brought global personalization of search and of the search bar, with a focus on behavioral factors. The launch of Kaliningrad in December 2012 significantly increased the cost of SEO services.

Peculiarities

Users' interests turned the search results upside down: site owners who previously did not care about visitor comfort began to lose traffic at lightning speed. Yandex now divided interests into short-term and long-term, updating its “spy” databases once a day. This meant that today and tomorrow, for the same query, the same user could be shown completely different results. A user who was previously interested in travel now sees taxi services when typing the word “taxi”, while someone who constantly watches films receives everything about the comedy “Taxi” in the results. In the search bar of everyone “hungry for information”, hints based on previous interests are now displayed in the first positions.
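As a toy illustration of the short-term/long-term interest split described above (an invented sketch, not Yandex's actual system), the code below keeps a decaying interest profile and re-ranks candidate results with it; the categories and weights are made up.

```python
# A toy personalization sketch: long-term interests decay slowly, short-term
# interests reflect the current session, and both re-rank candidate results.
# Categories, weights and the decay factor are invented for illustration.
from collections import defaultdict

class InterestProfile:
    def __init__(self, decay: float = 0.9):
        self.decay = decay
        self.long_term = defaultdict(float)   # updated once a day
        self.short_term = defaultdict(float)  # current session

    def register_click(self, category: str) -> None:
        self.short_term[category] += 1.0

    def end_of_day(self) -> None:
        # Fold the session into long-term interests, then let old ones fade.
        for category, weight in self.short_term.items():
            self.long_term[category] += weight
        for category in self.long_term:
            self.long_term[category] *= self.decay
        self.short_term.clear()

    def rerank(self, results: list[tuple[str, str]]) -> list[str]:
        # results: (url, category); boost pages matching the user's interests.
        def score(item):
            _, category = item
            return 2 * self.short_term[category] + self.long_term[category]
        return [url for url, _ in sorted(results, key=score, reverse=True)]

profile = InterestProfile()
profile.register_click("travel")
print(profile.rerank([("taxi-service.example", "services"),
                      ("taxi-movie.example", "films"),
                      ("cheap-flights.example", "travel")]))
```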

results

Optimizers began adopting more and more ways to retain users: usability and design improved, and content became more diverse and of higher quality. On exit, windows like “Are you sure you want to leave the page?” could pop up, with the sad face of some creature staring at the user. Well-thought-out page linking and an always accessible menu improved user activity metrics, which raised sites' positions in the results. Sites that were unclear to a broad range of Internet users were first simply demoted and then ended up hanging at the end of the list of suggested results.

Dublin

The "Dublin" algorithm improved personalization by identifying users' current goals. This modernized version of "Kaliningrad" was released in May 2013.

Peculiarities

The technology includes a function for tracking users' changing interests. That is, if two completely different sets of interests appear over a certain period of time, the algorithm will prefer the more recent one and shape the search results accordingly.

results

For websites, practically nothing has changed. The struggle continues not just for traffic, but for improving behavioral indicators. Old website layouts are starting to be abandoned because it’s easier to make a new one than to try to fix something on the old one. The supply of website template services is increasing, and competition for convenient and beautiful web resource layouts is beginning.

Islands

The “Islands” algorithm introduced technology for displaying interactive blocks in search results, letting the user interact with a site directly on the Yandex search page. The algorithm launched in July 2013, along with a call for webmasters to actively support the beta version and use the templates for creating interactive “islands”. The technology is currently being tested behind closed doors.

Peculiarities

Now, when searching for information that can be answered right on the results page, the user is offered “islands”: forms and other elements that can be used without visiting the site. For example, suppose you are looking for a specific movie or restaurant. For the movie, blocks will be displayed in the results and to the right of them with the movie poster, title, cast, showtimes in your city, and a ticket purchase form. For the restaurant, its photo, address, telephone numbers, and a table reservation form will be shown.

results

At first nothing significant changed in the ranking of sites. The only noticeable thing was the appearance of web resources with interactive blocks in first place and to the right of the results. If the number of sites taking part in the beta test had been significant, they could have displaced regular sites thanks to their attractiveness and catchiness for users. SEOs are thinking about improving the visibility of their content in search results by adding more photos, videos, ratings, and reviews. Life is better for online stores: correctly configured product cards can make an excellent interactive “island”.

Minusinsk

The “Minusinsk” algorithm: when SEO links purchased to distort ranking results are identified, a filter is applied to the site, significantly hurting its positions. “Minusinsk” was announced in April 2015 and came fully into force in May of the same year.

Peculiarities

Before the release of Minusinsk, back in 2014, Yandex disabled the influence of SEO links for many commercial keys in Moscow as a test and analyzed the results. The outcome was predictable: purchased link mass was still being used, but to the search engine it was spam. The release of “Minusinsk” marked the day when site owners had to clean up their link profiles and redirect the budgets spent on link promotion into improving the quality of their Internet resources.

results

“Reputable” sites that had reached the TOP thanks to bulk link purchases flew off the first pages, and some received sanctions for violating the rules. High-quality young sites that do not rely on backlinks suddenly found themselves in the TOP 10. Websites caught in the crossfire that did not want to wait long created new sites, moving the content over and leaving a stub on the old domain, or resorted to tricks with redirects. After about three months, a hole was found in the algorithm that allowed the filter to be lifted almost instantly.

Usability and content are beginning to be improved en masse. Links are purchased with even greater care, and control over backlinks becomes one of the functional responsibilities of the optimizer.

According to today's data, buying links ineptly can earn a filter for as few as 100 links. But if the link mass is properly diluted, you can safely buy thousands of links, just like in the good old days. In essence, link budgets have grown significantly because of this very dilution, which is achieved with crowd links and mentions.

Vladivostok

The “Vladivostok” algorithm introduced into search a check of a site's full compatibility with mobile devices. The project fully launched in February 2016.

Peculiarities

Yandex took another step toward mobile users: the Vladivostok algorithm was developed especially for them. Now, to rank better in mobile search, a site must meet mobile accessibility requirements. To get ahead of competitors in the results, an Internet resource must display correctly on any web device, including tablets and smartphones. "Vladivostok" checks for the absence of Java and Flash plugins, adaptation of content to the screen resolution (text fitting the width of the display), readability of the text, and the ability to comfortably tap links and buttons.
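As a rough illustration of such checks (not Yandex's actual validator), the sketch below fetches a page and applies two of the heuristics mentioned above: the presence of a viewport meta tag and the absence of Flash/Java embeds. The URL is a placeholder.

```python
# A toy mobile-friendliness check: fetch a page and apply a few of the
# heuristics described above. Not Yandex's actual validator.
import re
import requests

def check_mobile_friendly(url: str) -> dict:
    html = requests.get(url, timeout=10).text.lower()
    return {
        # A viewport meta tag suggests the layout adapts to screen width.
        "has_viewport_meta": 'name="viewport"' in html,
        # Flash/Java embeds are not supported on most mobile devices.
        "uses_flash_or_java": bool(re.search(r"<(embed|object|applet)\b", html)),
    }

report = check_mobile_friendly("https://example.com")
print(report)
```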

results

By the time the Vladivostok algorithm launched, only 18% of sites turned out to be mobile-friendly; the rest had to quickly get rid of the "heavy" elements on their pages that were not displayed or that prevented content from rendering correctly on smartphones and tablets. The main factor influencing a site's ranking in mobile results is the behavior of the mobile user, at least for now. After all, there are not that many perfectly mobile-friendly sites, so free places in the results are taken by those able to provide the user with the most comfortable conditions, even if not completely. Sites not adapted to mobile devices are not thrown out of mobile search, but are simply ranked lower than those that have achieved better results in improving the experience for smartphone users. At the moment, the most popular type of layout ordered is adaptive, not a separate mobile version, as one might think. Sites that pass all the algorithm's requirements receive the maximum mobile traffic in their niche.

Google: history of creation and development of algorithms

Google's algorithms and filters are still not fully understood by Russian-speaking optimizers. Google has always considered it important to hide the details of its ranking methods, explaining that "decent" sites have nothing to fear, and "dishonest" ones are better off not knowing what awaits them. As a result, legends are still told about Google's algorithms, and a lot of information was obtained only after questions were put to support when a site sagged in the results. Google has had so many minor improvements that it is impossible to count them, and when asked what exactly had changed, the foreign search engine simply remained silent. Let's consider the main algorithms that significantly influenced the positions of sites.

Caffeine

The “Caffeine” algorithm: the first page of results can contain several pages of the same site for a brand query, and a preview option appeared. The launch took place in June 2010.

Peculiarities

Company websites are highlighted when searching by brand. A “magnifying glass” for previews appears next to the result line. Brand keywords give a positive growth trend to the positions of the Internet resource as a whole. The PageRank index was updated, with PR increasing on well-known and frequently visited sites.

results

SEOs began to pay more attention to website branding, including color schemes, logos, and names. Brand keywords made a site's pages stand out in a special way in the search, and when a visitor went from such a phrase to the main page, the site's positions in the results grew (if the resource was not already a leader). SEO optimizers began buying more links to increase citation. It was almost impossible for young and little-known brands to break into the TOP of the results.

Panda

The Panda algorithm is a technology for checking a website for the quality and usefulness of its content, taking many SEO factors into account. Sites using "black hat" SEO are excluded from the results. Panda was announced in February 2011.

Peculiarities

“Panda” went out into the search and cleaned it of rubbish. That is exactly what could be said after many sites irrelevant to their key queries disappeared from Google's results. The algorithm pays attention to: keyword spamming and uneven keyword use, uniqueness of content, regularity of publications and updates, and user activity and interaction with the site. A visitor scrolling a page to the bottom at reading speed was counted as a positive factor.

results

After Panda was switched on, a huge number of sites fell under sanctions from the Google search engine, and at first everyone thought this was due to participation in link pyramids and the purchase of link mass. In the end, SEO optimizers tested the algorithm and analyzed its impact. The conclusion from these experiments was that Panda does check the quality of a site in terms of its value to visitors. Internet resources stopped copy-pasting and took up copywriting. Behavioral factors were improved by restructuring sites into more convenient forms, and linking within articles using special highlights became an important part of optimization. The popularity of SEO as a service skyrocketed. It was noticed that sites that did not comply with Panda's rules disappeared from the search very quickly.

Page Layout

The Page Layout algorithm is a technology for combating search spam that calculates the ratio of useful content to spam on a site's pages. It launched in January 2012 and was updated through 2014.

Peculiarities

“Page Layout” was created after numerous user complaints about unscrupulous site owners whose pages had very little relevant content, or where the required data was hard to reach and sometimes absent altogether. The algorithm calculated the percentage of relevant content and spam on a page for an incoming query. Sanctions were imposed on sites that did not meet the requirements, and such sites were removed from the search. Violations also included a page header stuffed with advertising, where viewing the actual text required scrolling to the second screen.

results

Sites that were too spammy with advertising fell from their positions, even though the content on the pages was moderately optimized for keywords. Pages that were not relevant to queries were demoted in search results. But there were not so many sites that blatantly did not follow the rules and did not worry about the comfort of visitors. After three updates to the algorithm, the approximate number of resources that fell under the filter turned out to be no more than 3%.

Venice

The “Venice” algorithm georeferences the site to a specific region, taking into account the presence of city names on the site pages. Launched in February 2012.

Peculiarities

“Venice” required webmasters to have an “About us” page on their sites indicating a location address, without regard to the fact that a company might not have a physical location at all. In the content, the algorithm looked for city names in order to show the page for the region mentioned in it. Schema markup (for example, generated with schema-creator.org) began to be used to tell the search robot a site's geographic location.
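For illustration, here is a minimal sketch (with an invented organization name and address) that generates the kind of geographic structured data described above as JSON-LD, which can then be embedded in a page.

```python
# A minimal sketch: emit LocalBusiness structured data (JSON-LD) so that a
# search robot can associate the site with a specific city and address.
# The organization name and address below are invented for the example.
import json

local_business = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Cafe",
    "address": {
        "@type": "PostalAddress",
        "addressLocality": "Saint Petersburg",
        "streetAddress": "1 Nevsky Prospekt",
    },
    "telephone": "+7-812-000-00-00",
}

# Paste the printed block into the page inside a
# <script type="application/ld+json"> tag.
print(json.dumps(local_business, ensure_ascii=False, indent=2))
```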

results

Sites appeared in search results only for those regions that they mentioned on their pages, geo-independent queries aside. Optimizers actively include geo-sensitive keywords and try to add microdata. The content of each page is personalized for a specific city or for the region as a whole. Localized link building began to be used actively to raise positions in the chosen region.

Penguin

The Penguin algorithm is a smart technology for determining the weight of sites and the quality of backlinks, a system for correcting inflated measures of the authority of Internet resources. It was launched into search in April 2012.

Peculiarities

“Penguin” is aimed at the war on purchased backlinks, that is, the unnatural, artificial accumulation of site authority. The algorithm builds its base of significant resources based on the quality of backlinks. The motivation for launching Penguin was the rise of link optimizers, when any link to a web resource carried equal weight and raised the site in the results. In addition, ordinary social network profiles began to rank in search on a par with standard Internet resources, which further popularized promoting ordinary sites with social signals. Along with these capabilities, the system began to combat irrelevant insertions of search queries into keywords and domain names.

results

Penguin demoted many sites in the results for unnatural growth of backlinks and for content irrelevant to users' queries. The importance of catalogs and link-selling sites quickly dropped to a minimum, while authoritative resources (news sites, thematic and near-thematic sites) grew before our eyes. With the introduction of the Penguin algorithm, PR was recalculated for almost all public sites. The popularity of mass backlink buying dropped sharply. Sites began matching key phrases to page content as closely as possible. A “relevance mania” began. Installing social buttons on pages as modules became widespread, thanks to the rapid indexing of social network accounts in search.

Pirate

The “Pirate” algorithm is a technology for responding to user complaints and identifying cases of copyright infringement. The system was launched in August 2012.

Peculiarities

“Pirate” accepted complaints from authors about violation of their copyright by site owners. Besides texts and images, the main blow fell on sites with video content that hosted camcorder copies of films from cinemas. Descriptions and reviews of the videos were also filtered: copy-pasting was now forbidden under pain of sanctions. If a site accumulated a large number of complaints about violations, it was thrown out of the search results.

results

Based on the results of the first month of operation of Google's Pirate, millions of video files that violated the rights of copyright holders were blocked from viewing on almost all sites, including video hosting sites and online cinemas. Websites with only pirated content were sanctioned and dropped from searches. The massive cleanup of “stolen” content is still ongoing.

HummingBird

The “Hummingbird” algorithm introduced technology for understanding the user when queries do not match exact occurrences. The system for “identifying exact desires” was launched in September 2013.

Peculiarities

Now the user did not have to rephrase the query to find the needed information more precisely. The “Hummingbird” algorithm made it possible not to search by direct exact occurrences but to return results from a “deciphered wishes” database. For example, a user typed the phrase “places to relax” into the search bar, and Hummingbird ranked sites with information about sanatoriums, hotels, spa centers, swimming pools, and clubs. That is, the algorithm matched a standard database against human phrases describing what people want. This understanding layer changed the search results significantly.

results

With the help of Hummingbird, SEO optimizers were able to expand their semantic cores and bring more users to their sites through morphological keys. Site ranking was refined, because now not only direct occurrences of key phrases and text-relevant queries are taken into account, but also the topical wishes of users. The concept of LSI copywriting appeared: writing text with latent semantic indexing in mind. That is, articles were now written not only with inserted keywords, but also with as many synonyms and near-thematic phrases as possible.
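As a rough illustration of the latent-semantic idea behind LSI copywriting (a generic sketch, not Google's implementation), the code below builds a TF-IDF matrix over a few toy documents and reduces it with truncated SVD, so that texts using related but non-identical wording end up close together.

```python
# A toy latent-semantic-indexing (LSI) sketch: TF-IDF + truncated SVD.
# The documents and query are invented; this is not Google's Hummingbird.
from sklearn.decomposition import TruncatedSVD
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

docs = [
    "hotel with a spa and swimming pool, a good place for rest and relaxation",
    "sanatorium offering rest, relaxation and recreation for the whole family",
    "buy winter tires and car parts online",
]

vectorizer = TfidfVectorizer(stop_words="english")
tfidf = vectorizer.fit_transform(docs)

svd = TruncatedSVD(n_components=2)       # project documents into a small "topic" space
doc_topics = svd.fit_transform(tfidf)

query_topics = svd.transform(vectorizer.transform(["recreation"]))

# The hotel document scores high even though the word "recreation" never
# appears in it: it shares "rest" and "relaxation" with the sanatorium text.
print(cosine_similarity(query_topics, doc_topics))
```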

Pigeon

The “Pigeon” algorithm is a system for localizing users and linking search results to their location. The technology was launched in July 2014.

Peculiarities

The user's location now plays a key role in delivering results. Organic search became all about geolocation. Linking sites to Google Maps took on a special role. For a query, the algorithm now first looks for the sites closest in location or in targeted content, then moves outward from the visitor. Organic results changed significantly.

results

Local sites quickly rose in the rankings and received local traffic. Internet platforms without geo-dependence fell in position. The struggle for each city began again, and the number of cases grew where identical sites with slightly reworded content and links to different areas appeared. Until accurate information about the rollout of the “Pigeon” algorithm in Russian-language search became available, many webmasters thought they were under Penguin sanctions.

Mobile-Friendly

The Mobile-Friendly algorithm implemented technology for checking sites' adaptability to mobile devices. The system was launched in April 2015 and earned nicknames online such as "Mobile Armageddon" (mobilegeddon) and "Mobile Apocalypse" (mobilepocalypse, mobocalypse, mopocalypse).

Peculiarities

Mobile-Friendly opened a new era for mobile users, recommending that SEOs urgently provide a comfortable experience for mobile visitors on their sites. The adaptability of a site to mobile devices became one of the most important indicators of how much site owners care about their visitors. Non-responsive web platforms had to fix their shortcomings quickly: get rid of plugins not supported on tablets and smartphones, adjust text size to different screen resolutions, and remove modules that prevent visitors with small screens from moving around the site. Some simply created a separate mobile version of their Internet resource.

results

Resources prepared in advance for this turn of events received special prominence among other sites in the results, and traffic from various non-desktop devices to such websites grew by more than 25%. Completely non-responsive sites were demoted in mobile search. The focus on mobility had its effect: heavy scripts on resources were minimized, and advertising and pages naturally began to load faster, given that most tablet/smartphone users are on mobile Internet, which is several times slower than a standard connection.

Summary

That's all

Now you know how search has developed over the years, both for ordinary users and for the sites that took the hits. Each of the search algorithms described above is updated periodically. But that does not mean optimizers and webmasters should be afraid of anything (unless, of course, you use black hat SEO), though it is still worth keeping an eye on things so you don't unexpectedly sag in the search because of the next new filter.


Introduction

Google's algorithms are the basis of the Google search engine. Created by Larry Page and Sergey Brin, Google is now capable of searching documents in two hundred languages and processing data in common formats (Microsoft Word, PDF, Excel, etc.). In this article, we recall the main stages in the development of the Google algorithms created to rank web pages in Google search results.

Google algorithms: history of development

1998. The Google search engine was founded this year. At the same time, PR (PageRank) was created, an algorithm based on the transfer of link mass and built around two main parameters (a rough sketch of the computation follows the list below).

  1. The more links lead to a particular page, the higher its PageRank and the higher the place it occupies in the search results.
  2. The higher the PageRank of the linking pages, the greater the weight transferred by their links.
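A minimal sketch of the calculation (simplified power iteration with a damping factor; the tiny link graph is invented for the example):

```python
# A toy PageRank computation by power iteration. The link graph is invented.
links = {
    "a.example": ["b.example", "c.example"],
    "b.example": ["c.example"],
    "c.example": ["a.example"],
}

def pagerank(links, damping=0.85, iterations=50):
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1 - damping) / len(pages) for p in pages}
        for page, outgoing in links.items():
            share = rank[page] / len(outgoing)  # weight passed on depends on the donor's own rank
            for target in outgoing:
                new_rank[target] += damping * share
        rank = new_rank
    return rank

print(pagerank(links))  # pages with more (and better-ranked) incoming links score higher
```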

The official creator of PR is Larry Page, and the owner of the patent for this invention is Stanford University.

An interesting fact: many people believe that PageRank means "rank of a page". In fact, the phrase means "Page's rank"; the creator named the invention after himself. Subsequently, many search engines adopted the idea of PageRank, developing their own analogues of the tool.

Beginning of the century

2000. Google presents a new development to the general public: an algorithm called Hilltop, which makes it possible to calculate the PR level more accurately. The algorithm takes into account the geography and freshness of a document. After this, Google begins warning webmasters not to leave links on suspicious websites and "link dumps".

2001. This year Google registers a patent for the Hilltop algorithm. At this point the search engine starts separating results for commercial and non-commercial queries.

2003. On November 15, 2003, the new Florida algorithm launches, removing from the results or demoting pages whose content is oversaturated with key phrases and words. On that day SEO experts realized that the new search algorithm lowers page rankings for the following reasons:

  • non-unique content or content with low uniqueness;
  • high content of keywords in headings and texts;
  • unnatural links (purchased).

2005. For the first time, Google specialists try to personalize search results, taking a particular user's previous queries as the basis.

2006. The company launches an improved search based on an algorithm called Orion, the brainchild of a student from Israel. The search engine can now find web pages matching the subject of a query even when they do not contain its keywords.

From this point on, Google begins to “clarify” the user’s queries, offering options that are most often searched for with a given word. For example, a user types the word “circle” in the search bar. The system offers him such query options as “song circle”, “circle for bathing newborns”, etc.

2007. This year is marked by the launch of the new Austin algorithm. The new product is able to take into account the trust level of a particular web resource and to lower resources with a low trust score in the results. As a result, websites that had not yet earned trust dropped in the search results.

New for 2009

2009. Google developers introduce a new feature into the search engine: the Caffeine algorithm. Google's previous algorithms no longer matched the increased production capacity. Thanks to Caffeine, the search engine begins to index sites much more often.

Building the search results page became many times faster. Caffeine did not greatly affect the relevance formula, but the following change became noticeable:

Continuous indexing of the entire World Wide Web allowed Google's search results to become much more dynamic and change throughout the day.

Second decade of our century

2011. Google's algorithms get their own "garbage collector": the launch of an algorithm called Panda, the first serious search cleaner. The new algorithm "cleanses" the results of "bad" sites:

  • satellites,
  • doorways,
  • sites whose content consists only of advertising and links,
  • sites with low uniqueness of content.

The creators of the improved algorithm, Matt Cutts and Amit Singhal, note that their new creation takes the following points into account:

  • Percentage indicator of the uniqueness of the content of a particular page and on the resource as a whole;
  • The level of template content, that is, the similarity of texts written for different keywords;
  • The presence of stylistic, grammatical and spelling errors;
  • Relevance of advertising presented on the resource to the topic of posted texts
  • Compliance of the content of the tags and meta tags of the web page with its content;
  • The degree of saturation of the posted texts with keywords;
  • Level of quality of outgoing and incoming links;
  • Internet user behavior (duration of the visit, number of pages viewed, bounces, and returns to the resource).

Today we can confidently say that almost all modern search engines take these factors into account, especially the behavioral ones. The more interesting a site's content is to the user, the more time he spends on the resource, and the higher the site ranks on the results page.

Hummingbird

2013. In October, Google's algorithms were joined by the newest one, Hummingbird. The innovation is that it can understand even the hidden meaning of queries. For example, if you enter "buy something near home", Hummingbird understands that you mean offline stores.

Among online stores, it will choose only those whose sites provide the most detailed information about delivery terms and their advantages. In addition, the Hummingbird algorithm favors long, detailed queries. At the same time, for queries where Google has nothing to "interpret", the results did not change.

And one more important point: non-unique, low-uniqueness, and generated content no longer works.

In conclusion, it is worth noting that the Russian version of Google is one of the most convenient search engines to work with.

In Russia, Google does not use most of its “punitive” sanctions. Thanks to this, website promotion for this system in Russia is much easier than for other search engines.

Penguin and Panda, what's next?

October 4, 2013: the Penguin 2.1 algorithm was released

As before, the Google search engine pays attention to suspicious sites and anchor lists. This update affected sites that had previously been sanctioned by the algorithm, and touched about 1% of queries.

May 19, 2014: the Panda 4.0 update was released

The most serious update of this search algorithm so far, affecting 7.5% of search queries.

August 24, 2014: the Pigeon algorithm

The algorithm focused on geo-dependent queries. Now, on receiving a geo-dependent query, the Google search engine returns the most informative local results for the user.

Have you ever wondered how Google's technology works? Of course, building the SERP involves a complex algorithm based on many variables. But can the operating principle of the world's most popular search engine still be explained in simple terms?

To understand the depths of the most complex search engine algorithm, take a look at this extremely useful infographic from quicksprout.

Google's ubiquitous search spiders crawl web pages and then follow the links found on those pages. This crawling approach allows Google to index more than 100 million gigabytes of information.
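As a toy illustration of that crawl-and-follow-links loop (a minimal sketch, not Google's crawler; the seed URL and page limit are placeholders):

```python
# A toy breadth-first web crawler: fetch a page, collect its links, repeat.
# Not Google's crawler; the seed URL and page limit are placeholders.
from collections import deque
from urllib.parse import urljoin

import requests
from bs4 import BeautifulSoup

def crawl(seed: str, max_pages: int = 10) -> set[str]:
    seen, queue = {seed}, deque([seed])
    while queue and len(seen) < max_pages:
        url = queue.popleft()
        try:
            html = requests.get(url, timeout=10).text
        except requests.RequestException:
            continue                      # skip unreachable pages
        for tag in BeautifulSoup(html, "html.parser").find_all("a", href=True):
            link = urljoin(url, tag["href"])
            if link.startswith("http") and link not in seen:
                seen.add(link)
                queue.append(link)        # follow links found on this page
    return seen

print(crawl("https://example.com"))
```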

In order for users to receive better search results, Google is constantly creating and improving programs. Here is a short list of the main areas in which ongoing work is being carried out:

  • spellchecking;
  • autocomplete;
  • search by synonyms;
  • general understanding of queries;
  • live Google search;
  • search patterns.

When you type a query, the algorithm evaluates each page with questions such as:

  • How many times does the key expression appear on the page?
  • Does the keyword occur in the title or URL?
  • Does the page contain synonyms for the key expression?
  • What is the page's Google PageRank?

All this happens in ⅛ of a second.
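A minimal sketch of how such signals might be combined into a single relevance score (the weights and the page data are invented for illustration; this is not Google's formula):

```python
# A toy relevance score combining a few of the signals listed above.
# The weights are invented; this is not Google's ranking formula.
def relevance_score(page: dict, keyword: str) -> float:
    text = page["text"].lower()
    keyword = keyword.lower()
    score = 0.0
    score += 1.0 * text.count(keyword)                        # keyword frequency in the body
    score += 5.0 if keyword in page["title"].lower() else 0.0  # keyword in the title
    score += 3.0 if keyword in page["url"].lower() else 0.0    # keyword in the URL
    score += 2.0 * page["pagerank"]                            # link-based authority
    return score

page = {
    "url": "https://example.com/coffee-guide",
    "title": "Coffee brewing guide",
    "text": "A guide to brewing coffee at home. Coffee grind size matters.",
    "pagerank": 4.2,
}
print(relevance_score(page, "coffee"))
```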

The so-called Knowledge Graph is a technology that produces results based on a global database of real people, places, and the connections between them. For example, in addition to dry facts about Leo Tolstoy's biography, you will get a wealth of interesting content (text/photo/video) related to this figure.

Snippets also add convenience to the results: small pieces of information that let you understand whether a page matches your query quickly, without having to follow the link.

Other Google tools to make your search easier:

1. Voice search

No comments needed here. Just click the microphone icon and tell the search engine what you want to find. But be careful: it may answer. :)

2. Images

Shows thumbnail images based on search results.

Google search allows you to ask specific questions and get quick answers.

Google also owns the second most popular search engine in the world, which you all know very well: YouTube.

Conclusion

What made the Google search engine so successful? Despite the many complex processes that happen unnoticed by the user, Google is a simple and convenient search engine from a usability standpoint. Whatever question you ask, it will return relevant results within 1/8 of a second.


In recent years, search results have changed dramatically compared with the early and mid-2000s. It is nice to see new changes aimed at increasing user comfort. And to fully appreciate how much work has gone into the search engine, we decided to collect 34 key events in the history of Google's development.

Distant past

March 1998. Google gets off to a successful start with Larry Page and Sergey Brin. At the same time, the first website ranking algorithm, PageRank, is released.
PageRank is a value that corresponds to the importance of a page. The calculations are quite simple: the more links lead to a page, the higher its PageRank. It is important to note that the weight transferred by a donor page directly depends on the links to the donor itself. By the way, PageRank is an algorithm that has been working for almost 20 years, albeit in a greatly modified form.

  • year 2000. Launching Google Toolbar, an application that is a toolbar for working with the search engine. It was around this time that the first conversations about SEO began.
  • December 2000 - Hilltop. This is a kind of addition to PageRank. The bottom line is that links from sites relevant to the topic have increased significantly in weight.
  • year 2001. Google has begun separating search results into commercial and non-commercial queries.
  • February 2003 - Boston. The update is named after the city in which the conference took place. The main change: monthly updates to the search database were introduced.
  • April 2003 - Cassandra. The first algorithm to shake up optimizers, because Google began punishing the use of hidden text and links. From that moment on, SEO, from Google's point of view, began to turn black.
  • May 2003 - Dominic. From that moment on, Google began to evaluate backlinks to the site differently. It was after Dominic that the Freshbot and Deepcrawler robots first appeared.
  • June 2003 - Esmeralda. Simply the last of the monthly SERP updates. No major changes were noticed with the release of Esmeralda.
  • July 2003 - Fritz. The main change: since the release of Fritz, Google began to update daily!
  • November 2003 - Florida. One of the truly important updates, since after Florida a significant share of crudely over-optimized sites were pessimized. Florida struck a blow against resources oversaturated with keywords, stolen content, and purchased link profiles. From late 2003, SEO changed and became more demanding.
  • January 2004 - Austin. Austin Update is coming to Florida's aid. The essence of the update is to tighten the fight against spammed and invisible texts. The requirement for text relevance is increasing.
  • February 2004 - Brandy. Google began to respond to keyword synonyms. Optimized texts have become more diverse, which undoubtedly improves the search results. The attention paid to thematic content of anchor text has increased. Keyword analysis has changed.

Time for a change

  • February 2005 - Allegra. The changes brought by Allegra remain something of a mystery. The suspected change: separate punishment for purchased links. SEO continues to turn black.
  • May 2005 - Bourbon. Bourbon began to impose sanctions on sites with duplicate content, lowering their rankings.
  • September 2005 - Gilligan. Gilligan did not bring any major changes, at least none that were noticeable, but it deserves a mention for the historical record, so here it is.
  • October 2005 - Google Maps + Local. Updates whose purpose was to encourage businesses to update their contact information on the maps.
  • October 2005 - Jagger. Continued the fight against link manipulation. This time Google's targets were link farms and sites exchanging backlinks out of the goodness of their hearts, and once again dubious link profiles came under scrutiny.
  • December 2005 - Big Daddy. Google proposed using canonicalization, that is, choosing the most suitable page from among potential duplicates, and using 301 and 302 redirects. Those were the significant changes.
  • November 2006 - Supplemental Index. Expanding Google's capabilities. The search engine began to process more documents. For some reason, it was this year that serious changes were noticed in the positions of sites.
  • June 2007 - Buffy. Minor updates that are not worth your attention. And not a hint of vampires.
  • April 2008 - Dewey. It appeared like an earthquake, shook the sites’ positions and that’s it. No one understood why.
  • August 2008 - Google Suggest. Now Google has started to show drop-down search tips when entering a query into the search bar.
  • February 2009 - Vince. According to Matt Cutts, nothing much happened. But inquisitive optimizers concluded that the resources of major brands gained a strong advantage.
  • August 2009 - Caffeine Preview. Google kindly notified optimizers of significant changes in the search architecture, thanks to which documents would from then on be indexed much faster. In the same year, search results began to be updated in real time.
  • May 2010 - May Day. This update reduced the amount of traffic from long, verbose queries.
  • June 2010 - Caffeine. The previously announced Caffeine has been released. The algorithm allowed more pages to be indexed at a faster rate. Now new documents get into the index faster.

Google Updates: Today

  • February 23, 2011 - Panda. On Defender of the Fatherland Day, the Panda algorithm was released, a kind of defender of search results against garbage, with an emphasis on internal factors. Panda is a large-scale, well-known, and much-discussed algorithm that has been upgraded many times, weakening and then growing stronger again. Its main task is to remove from the results all "low-quality sites", such as doorways and sites created solely for advertising, while keeping the most worthy ones.
  • January 3, 2012 - Page Layout. An add-on for Panda that analyzes the ease of viewing documents. The goal is to get rid of intrusive advertising that takes up most of the user's screen.
  • February 2012 - Venice. An update that now takes into account the user's regional location. The goal is to show sites that are closest to the user.
  • April 24, 2012 - Penguin. The second sensational algorithm. The goal is the same as Panda's, but the emphasis is on external factors. It took control of everything related to a resource's link profile. It punishes sites by lowering the whole site in the results or removing it altogether.
  • September 2012 - Exact Match Domain (EMD). A rare but necessary filter. The idea is to demote low-quality sites optimized for a single query that matches the domain name. For example, the site kupi-kartoshku.ru, where every page is optimized for the query "buy potatoes", is very likely to fall under this filter.
  • August 2013 - Hummingbird. A complex algorithm. Hummingbird's task is to recognize the meaning of complex multi-word queries and select the most relevant documents. Factors can include the hidden meaning of the query, the way it is worded, your location, and so on.
  • August 2014 - Pigeon. Continuing the bird theme, Pigeon aims to show, for geo-dependent queries, the companies (sites) located closest to you. For example, if you want to order Chinese noodles, Pigeon will find the nearest establishment that can satisfy your need. Also in August 2014, it was announced that Google gives a slight ranking boost to sites using the secure https protocol.
  • April 21, 2015 - Mobile-Friendly. An algorithm that gives preference to individual site pages adapted to mobile device screens: no scaling or horizontal scrolling required; in a word, convenient to read on a phone or tablet. Compliance can be checked in Google Webmaster Tools.

To be continued…

Over the past 5 years, Google's algorithms have undergone a lot of changes. The level of difficulty in manipulating search results ranges between “very difficult” and “impossible.” Therefore, the only thing left is to be a responsible optimizer. Google has repeatedly mentioned that content remains the most important ranking factor and always will be. It turns out that the best way out of this situation is thematic traffic and promotion with very high-quality links. I'm glad that I'm optimizing at this time. I am sure that many new surprises await us in 2016.