 (DIR) Post #AzJIgiT6Wogh3sAs2y by kaia@brotka.st
       2025-10-17T19:49:45.096597Z
       
       1 like, 0 repeats
       
       I thought about getting a 3090 to pair with my 4090 to double the VRAM :NianShake: can I do that?
       
 (DIR) Post #AzJIjhOCzphAaGJ4bI by twinspin6@outerheaven.club
       2025-10-17T19:50:17.179481Z
       
       1 like, 0 repeats
       
       @kaia that's a lot of power
       
 (DIR) Post #AzKpIgYPUBAJorlU92 by can@haz.pink
       2025-10-17T19:56:18Z
       
       1 like, 0 repeats
       
       @kaia for AI? Sure. You can offload different layers to different GPUs
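
       That kind of split is what e.g. the Hugging Face accelerate stack can
       do automatically. A minimal sketch, assuming transformers + accelerate
       are installed; the model name is a hypothetical placeholder:

         # Sketch: spread one model's layers across a 4090 (cuda:0) and a 3090 (cuda:1).
         from transformers import AutoModelForCausalLM, AutoTokenizer

         model_id = "some-org/some-large-model"  # placeholder, not a real repo
         model = AutoModelForCausalLM.from_pretrained(
             model_id,
             device_map="auto",                    # let accelerate assign layers per device
             max_memory={0: "22GiB", 1: "22GiB"},  # cap each 24GB card, leave headroom
             torch_dtype="auto",
         )
         tok = AutoTokenizer.from_pretrained(model_id)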
       
 (DIR) Post #AzKpKKMuwQZxalAkOO by vonxylofon@witter.cz
       2025-10-18T13:16:29Z
       
       0 likes, 0 repeats
       
       @kaia What for?
       
 (DIR) Post #AzKpKLZ0UxqNIXpusK by kaia@brotka.st
       2025-10-18T13:30:02.853334Z
       
       0 likes, 0 repeats
       
       @vonxylofon loading LLM > 24GB
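
       Rough arithmetic for where 24GB runs out, assuming fp16 weights at
       2 bytes per parameter (KV cache and activations not counted):

         # Back-of-envelope weight memory, fp16 = 2 bytes/param.
         def weight_gib(params_billion, bytes_per_param=2):
             return params_billion * 1e9 * bytes_per_param / 2**30

         print(weight_gib(13))  # ~24.2 GiB -- already past a single 24GB card
         print(weight_gib(34))  # ~63.3 GiB -- needs both cards and/or quantization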
       
 (DIR) Post #AzKpTnrMfCidfAvqqW by p@raru.re
       2025-10-17T20:03:37Z
       
       1 like, 0 repeats
       
       @kaia depends on what you want to do, they're independent devices
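
       They do enumerate as separate CUDA devices; a quick check with
       PyTorch, assuming both cards are installed:

         # List the GPUs PyTorch sees; a 4090 + 3090 show up as two devices.
         import torch

         for i in range(torch.cuda.device_count()):
             print(i, torch.cuda.get_device_name(i))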
       
 (DIR) Post #AzKvaCfruqdg5QzRdA by vonxylofon@witter.cz
       2025-10-18T14:40:07Z
       
       1 like, 0 repeats
       
       @kaia That might work, IDK. Not recommended for gaming is all I know.