Google’s new trillion-parameter AI language model is almost 6 times bigger than GPT-3

A trio of researchers from the Google Brain team recently unveiled the next big thing in AI language models: a massive transformer system with one trillion parameters. The next biggest model out there, as far as we’re aware, is OpenAI’s GPT-3, which uses a measly 175 billion parameters.

Background: Language models are capable of performing a variety of functions, but perhaps the most popular is the generation of novel text. For example, you can talk to a “philosopher AI” language model that’ll attempt to answer any question you ask it (with numerous notable exceptions).
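The “almost 6 times bigger” figure is easy to sanity-check; a quick back-of-the-envelope calculation in Python (using the parameter counts quoted above):

```python
# Compare the two parameter counts cited in the article.
switch_params = 1_000_000_000_000  # Google's model: one trillion parameters
gpt3_params = 175_000_000_000      # OpenAI's GPT-3: 175 billion parameters

ratio = switch_params / gpt3_params
print(f"{ratio:.1f}x")  # prints "5.7x", i.e. "almost 6 times bigger"
```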

This story continues at The Next Web
