cron@feddit.org to Technology@lemmy.world • OpenAI, Google, Anthropic admit they can’t scale up their chatbots any further (English)
2 months ago
It’s absurd that some of the larger LLMs now use hundreds of billions of parameters (e.g. Llama 3.1 with 405B).
This doesn’t really seem like a smart use of resources if you need several of the largest GPUs available just to run a single conversation.
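A rough back-of-the-envelope sketch of what that means in practice, just for the weights (KV cache and activations come on top); the 80 GB card size and the bytes-per-parameter figures below are my own assumptions, not from the article:

```python
# Estimate memory needed just to hold the weights of a 405B-parameter model,
# and how many 80 GB GPUs (H100/A100 class) that would take.
params = 405e9  # Llama 3.1 405B

for precision, bytes_per_param in [("fp16/bf16", 2), ("int8", 1), ("int4", 0.5)]:
    weights_gb = params * bytes_per_param / 1e9
    gpus_needed = weights_gb / 80  # assumed 80 GB per card
    print(f"{precision}: ~{weights_gb:.0f} GB of weights -> ~{gpus_needed:.1f} x 80 GB GPUs")
```

Even aggressively quantized, that’s still multiple top-end cards for one model instance.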