Post AT7efClWAKxiGLXTuq by Muffinlord@retro.pizza
(DIR) Post #AT7YLoZRIWrrzrGC7k by arstechnica@geeknews.chat
2023-02-28T00:01:09Z
0 likes, 0 repeats
ICYMI: Meta recently announced a new AI-powered large language model that it claims can outperform OpenAI's GPT-3 model despite being "10x smaller."
https://arstechnica.com/information-technology/2023/02/chatgpt-on-your-pc-meta-unveils-new-ai-model-that-can-run-on-a-single-gpu/?utm_brand=ars#bot
Original tweet: https://nitter.it/arstechnica/status/1630282066490261506
(DIR) Post #AT7YlLmcx10YFAbZxI by MatthewToad42@climatejustice.social
2023-02-28T00:05:40Z
0 likes, 0 repeats
@arstechnica DeepMind's previous attempt was 7x smaller. 10x smaller and able to run on a commodity GPU is pretty impressive. The energy used by AI matters, although this may lead to more widespread use.
As regards releasing the weights / training data ... Note that detoxifying ChatGPT (so far) has required a large, fairly expensive training set mostly produced in the English-speaking developing world. The initial output was horrendous. The current public access is just a way to harvest more data.
(DIR) Post #AT7efClWAKxiGLXTuq by Muffinlord@retro.pizza
2023-02-28T01:11:43Z
0 likes, 0 repeats
@arstechnica i bet this thing writes absolutely dogshit porn