Post B2ebjK7iAcEtjdlQIa by immobile@poa.st
(DIR) Post #B2ebjHJebl0f1oKPzs by ThatCrazyDude@noauthority.social
2026-01-25T06:57:05Z
0 likes, 0 repeats
Anyone running LLMs and image generation locally around here? Asking because I can pick up a GPU at a pretty good discount, but I'm not exactly sure if it's good for that kind of stuff. This is the card: https://www.msi.com/Graphics-Card/GeForce-RTX-5060-Ti-16G-INSPIRE-2X-OC
(DIR) Post #B2ebjIcpjvwh5aJFWy by immobile@poa.st
2026-01-25T08:22:38.048626Z
0 likes, 0 repeats
@ThatCrazyDude Two important factors:
- Nvidia, checked
- Amount of memory: the more the better, aim for 32 GB if you're dreaming of image generation (said by one who used 8 GB :cryblood: )
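A minimal sketch for checking that the card is actually visible and how much VRAM it reports, assuming a Python environment with a CUDA-enabled PyTorch build:

# Checks that the NVIDIA driver and CUDA runtime see the card and prints
# how much VRAM it reports. Assumes a PyTorch build with CUDA support.
import torch

if torch.cuda.is_available():
    props = torch.cuda.get_device_properties(0)
    print(f"GPU:  {props.name}")
    print(f"VRAM: {props.total_memory / 1024**3:.1f} GiB")
else:
    print("No CUDA device visible")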
(DIR) Post #B2ebjJQok5vbabhARM by silas@noauthority.social
2026-01-25T10:02:43Z
0 likes, 0 repeats
@immobile @ThatCrazyDude unironically the best way for consumers to run ai models at home right now is getting a mac. Like a mac studio that you can configure with up to 512 gb of shared memory. You are not going to buy a gpu with 32 or 48 or 64 GB or even more vram. And if the model doesn't fit in vram it's going to be slow. That's the main limiting factor for these models. Having the model entirely fit
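A rough back-of-envelope sketch of why fitting the weights matters; it counts only the weights and ignores KV cache, activations, and framework overhead, and the model sizes are just examples:

def weights_gib(params_billion: float, bits_per_param: float) -> float:
    # Weight memory = parameter count * bits per parameter, converted to GiB.
    return params_billion * 1e9 * bits_per_param / 8 / 1024**3

for bits, label in [(16, "fp16"), (8, "int8"), (4, "4-bit")]:
    print(f"7B  @ {label}: {weights_gib(7, bits):6.1f} GiB")
    print(f"70B @ {label}: {weights_gib(70, bits):6.1f} GiB")

At fp16 even a 7B model is around 13 GiB of weights alone, so a 16 GB card is already leaning on quantization, and anything much larger has to spill into system RAM or unified memory.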
(DIR) Post #B2ebjK7iAcEtjdlQIa by immobile@poa.st
2026-01-25T10:14:25.375738Z
0 likes, 0 repeats
@silas @ThatCrazyDude Unironically, your suggestion is somewhat out of Pat's budget, I guess. (Prices in €)
(DIR) Post #B2ebjKmTj2qhm4pyqG by ThatCrazyDude@noauthority.social
2026-01-25T10:51:49Z
0 likes, 0 repeats
@silas @immobile yeah, definitely. 400 euros is the most I'm willing to pay, so it will be a GPU, not some hippie ass mac. Although I might invest in a big ass ram stick and then run the whole thing from a ram drive.
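If the model ends up bigger than the card, one common compromise is keeping only part of it in VRAM and serving the rest from system RAM. A sketch assuming the llama-cpp-python package and a quantized GGUF file; the model path and layer count are placeholders:

# Loads a quantized model with only some layers offloaded to the GPU;
# the remaining layers run from system RAM. Assumes llama-cpp-python.
from llama_cpp import Llama

llm = Llama(
    model_path="/models/example-7b-q4_k_m.gguf",  # placeholder path
    n_gpu_layers=24,  # layers kept in VRAM; the rest stay in system RAM
    n_ctx=4096,
)
out = llm("Q: Does a quantized 7B fit in 16 GB of VRAM? A:", max_tokens=32)
print(out["choices"][0]["text"])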