Yandex open-sources a language model with 100 billion parameters
Yandex has created and published an open version of its generative language model YaLM with 100 billion parameters, according to a press release received by the N + 1 editorial office. It is the largest Russian-language model, and the largest of those whose code and weights are publicly available (they are published on GitHub). Besides Russian, it also supports English.

In 2017, researchers at Google presented the Transformer neural-network architecture, which makes heavy use of the attention mechanism that lets the algorithm focus on the important parts of a text. This architecture led to the rapid development of models...
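The attention mechanism mentioned above can be illustrated with a minimal sketch of scaled dot-product attention, the core operation in the Transformer paper. This is an illustrative toy example, not code from YaLM; the function names and toy data are our own.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax: subtract the max before exponentiating
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)      # similarity of each query to each key
    weights = softmax(scores, axis=-1)   # per-token weights over all tokens, rows sum to 1
    return weights @ V                   # weighted mix of value vectors

# Toy example: a "sentence" of 3 tokens with 4-dimensional embeddings
rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 4))
K = rng.normal(size=(3, 4))
V = rng.normal(size=(3, 4))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # each token gets a new 4-dimensional representation
```

The attention weights tell the model how strongly each token should "look at" every other token, which is what allows it to focus on the important parts of the text.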