Developer(s) | EleutherAI |
---|---|
Initial release | June 9, 2021 |
Type | Language model |
License | Open-source |
GPT-J is an open-source artificial intelligence language model developed by EleutherAI.[1] It largely follows the GPT-2 architecture; the main difference is the so-called parallel decoder: instead of applying the feed-forward multilayer perceptron after the masked multi-head attention, the two are computed in parallel and their outputs summed, which achieves higher throughput in distributed training.[2]
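The parallel-decoder idea can be sketched as follows. This is a minimal NumPy illustration, not GPT-J's actual implementation: `attn` and `mlp` are stand-ins for the real attention and feed-forward sublayers, and the function names are hypothetical.

```python
import numpy as np

def layer_norm(x, eps=1e-5):
    # normalize over the feature (last) axis
    mu = x.mean(-1, keepdims=True)
    var = x.var(-1, keepdims=True)
    return (x - mu) / np.sqrt(var + eps)

def sequential_block(x, attn, mlp):
    # GPT-2-style block: the MLP runs on the output of attention,
    # so the two sublayers must execute one after the other
    x = x + attn(layer_norm(x))
    x = x + mlp(layer_norm(x))
    return x

def parallel_block(x, attn, mlp):
    # GPT-J-style block: attention and the MLP both read the same
    # normalized input and their outputs are summed, so the two
    # sublayers can be computed concurrently across devices
    h = layer_norm(x)
    return x + attn(h) + mlp(h)
```

Both variants map a `(sequence, features)` array to an array of the same shape; the parallel form simply removes the data dependency between the attention and MLP sublayers.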
GPT-J performs very similarly to similarly sized versions of OpenAI's GPT-3 on various zero-shot downstream tasks, and can even outperform them on code generation tasks.[3] The newest version, GPT-J-6B, is a language model trained on a dataset called The Pile,[4] an open-source, 825-gigabyte language-modelling dataset split into 22 smaller datasets.[5]
Unlike ChatGPT, GPT-J does not function as a chatbot out of the box, only as a text predictor.[6] In March 2023, Databricks released Dolly, an Apache-licensed, instruction-following model based on GPT-J and fine-tuned on the Stanford Alpaca dataset.[7]