NIX Solutions: Yandex Introduces New Neural Network Architecture for Ranking Web Pages

At YaC 2020, Yandex engineers presented a transformer, a new neural network architecture for ranking web pages. Thanks to it, Yandex Search has become much better at assessing the semantic relationship between user queries and the content of documents on the web. The quality improved so much that, according to SearchEngines, this is the most significant event for search in the past 10 years (since the launch of MatrixNet).

According to Yandex, Palekh and Korolev together influenced search quality less than the new transformer-based model. Moreover, although thousands of factors are computed during ranking, turning them all off and leaving only the new model would reduce ranking quality by only 4-5% on the main offline metric.

The new text analysis technology is called YATI. It uses a new generation of neural networks: transformers, the general name for the popular architecture that underlies modern approaches to text analysis. Yandex has developed its own implementation of transformers, so YATI stands for Yet Another Transformer with Improvements.

“Although the architecture of transformer neural networks has been known for a long time, and their use for NLP problems gained immense popularity after the appearance of BERT in 2018, introducing a transformer into a modern search engine is impossible without engineering ingenuity and a large number of original technological improvements in training and runtime. That is why we named our technology YATI, Yet Another Transformer (with Improvements), which, we think, reflects its essence well. It really is ‘another transformer’, architecturally similar to other models, but unique in that it is able to work and be useful in search, the most complex Yandex service,” the company said.

NIX Solutions specifies that in Search, YATI matches the meaning of queries against web documents. It can work not only with short texts, such as queries or article titles, but also with long documents. It has an “attention mechanism” that allows it to highlight the most significant fragments of a text. Finally, it pays attention to word order and takes context into account, since how words affect each other often determines the meaning of an entire phrase (for example, when searching for tickets from one city to another).
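The two properties mentioned above can be illustrated with a minimal sketch. The toy vocabulary, random embeddings, and simplified single-head self-attention below are purely illustrative assumptions, not Yandex's actual model: attention weights show how each token attends to every other token, and adding a positional signal is what makes the pooled representation sensitive to word order (tickets from A to B vs. from B to A).

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X):
    # Simplified single-head scaled dot-product self-attention
    # (identity query/key/value projections, for illustration only).
    d = X.shape[-1]
    scores = X @ X.T / np.sqrt(d)
    weights = softmax(scores, axis=-1)  # row i: how token i attends to all tokens
    return weights @ X, weights

rng = np.random.default_rng(0)
vocab = {"tickets": 0, "moscow": 1, "to": 2, "kazan": 3}  # toy vocabulary
emb = rng.normal(size=(len(vocab), 8))                    # random toy embeddings

def encode(tokens, use_positions=True):
    X = emb[[vocab[t] for t in tokens]]
    if use_positions:
        # Toy positional signal; real transformers use sinusoidal or
        # learned positional encodings.
        X = X + np.arange(len(tokens))[:, None] * 0.1
    out, _ = self_attention(X)
    return out.mean(axis=0)  # mean-pooled "sentence" vector

a = encode(["tickets", "moscow", "to", "kazan"])
b = encode(["tickets", "kazan", "to", "moscow"])
# With positional information the two word orders get different
# representations; without it, mean-pooled attention cannot tell them apart.
print(np.allclose(a, b))
```

Without the positional signal, mean-pooled self-attention is permutation-invariant, so both phrases would collapse to the same vector; the positional term is what lets the model distinguish a ticket's origin from its destination.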