English | Russian |
classify work into pre-set categories | классифицировать задания по предопределённым категориям (TechRepublic, 2018; Alex_Odeychuk) |
generative pre-trained transformer | обученный генерации текста трансформер (MichaelBurov) |
generative pre-trained transformer | генеративный предобученный трансформер (Alex_Odeychuk) |
generative pre-trained transformer | предварительно обученный генеративный преобразователь (Generative pre-trained transformers (GPT) are a family of language models, generally trained on a large corpus of text data, that generate human-like text. They are built from several blocks of the transformer architecture and can be fine-tuned for various natural language processing tasks such as text generation, language translation, and text classification. The "pre-training" in the name refers to the initial training on a large text corpus, during which the model learns to predict the next word in a passage; this gives the model a solid foundation for performing well on downstream tasks with limited amounts of task-specific data (see the sketch after this table). According to the researchers, LLMs demonstrate "state-of-the-art capabilities" in translation quality assessment at the system level. However, they emphasized that only GPT-3.5 and larger models are capable of achieving state-of-the-art accuracy when compared with human judgments. Those findings provide "a first glimpse into the usefulness of pre-trained, generative large language models for quality assessment of translations," they said. wikipedia.org; Alexander Demidov) |
generative pre-trained transformer | трансформер, обученный для генерации текста (MichaelBurov) |
generative pre-training | генеративное предобучение (Alex_Odeychuk) |
pre-defined constraint | предопределённое ограничение (Alex_Odeychuk) |
pre-trained | предварительно обученный |
pre-trained | предобученный |
pre-translated data | предварительно переведённые данные (Alex_Odeychuk) |
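The Wikipedia comment in the "generative pre-trained transformer" entry above describes pre-training as the model learning to predict the next word in a passage. Below is a minimal Python sketch of that objective only, not of GPT itself: it uses simple word-pair counts rather than transformer blocks and gradient training, and the corpus string is an illustrative assumption.

```python
# Toy illustration of the next-word-prediction objective mentioned in the
# GPT entry above. Real GPT models learn this with stacked transformer
# blocks trained on a large corpus; here we merely count word pairs.
# The corpus below is an assumed example, not real training data.
from collections import Counter, defaultdict

corpus = "the model learns to predict the next word in a passage".split()

# "Training": record how often each word is followed by each other word.
next_word_counts = defaultdict(Counter)
for current, following in zip(corpus, corpus[1:]):
    next_word_counts[current][following] += 1

def predict_next(word: str) -> str:
    """Return the continuation seen most often after `word` in the corpus."""
    counts = next_word_counts.get(word)
    return counts.most_common(1)[0][0] if counts else "<unk>"

print(predict_next("predict"))  # -> "the"
```

The point of the sketch is the shape of the task: given preceding context, emit the most likely next token. A transformer replaces the frequency table with learned parameters and conditions on the whole passage rather than a single preceding word.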