
GopherCite: Teaching language models to support answers with verified quotes

DeepMind published a series of papers about large language models (LLMs) last year, including an analysis of Gopher, our large language model. Language modelling technology, which is also currently being developed by several other labs and companies, promises to strengthen many applications, from search engines to a new wave of chatbot-like conversational assistants and beyond.…


An empirical analysis of compute-optimal large language model training

In the last few years, a focus in language modelling has been on improving performance through increasing the number of parameters in transformer-based models. This approach has led to impressive results and state-of-the-art performance across many natural language processing tasks. We also pursued this line of research at DeepMind and recently showcased Gopher, a 280-billion…


DeepMind’s latest research at ICLR 2022

Working toward greater generalisability in artificial intelligence

Today, conference season is kicking off with the Tenth International Conference on Learning Representations (ICLR 2022), running virtually from 25-29 April 2022. Participants from around the world are gathering to share their cutting-edge work in representational learning, from advancing the state of the art in artificial intelligence to…
