7B OpenLLaMA model that has been trained with 200 billion tokens on the RedPajama dataset #1054