Cerebras-GPT, a family of 7 GPT models from 111M to 13B parameters trained using the Chinchilla formula (Open Source) - PrO_RaZe Bookmarks #880