 (DIR) Post #Acn0eItMek4SokBFQm by xeroforhire@noagendasocial.com
       2023-12-14T04:07:46Z
       
       0 likes, 0 repeats
       
       New AI is ridiculous https://twitter.com/BrianRoemmele/status/1734813918676066454?t=AB1gUfKd3DSPeGZvP799lw&s=19
       
 (DIR) Post #Acn0eJu6tSGHxLXMES by Iamthebammbamm@noagendasocial.com
       2023-12-14T04:10:09Z
       
       0 likes, 0 repeats
       
       @xeroforhire saying "in a year you'll have this on your computer with no internet connection" shows he knows nothing
       
 (DIR) Post #Acn0eKj9pf5wVfQ7nc by Fox@noagendasocial.com
       2023-12-14T04:50:44Z
       
       0 likes, 0 repeats
       
       @Iamthebammbamm @xeroforhire Yeah... Somehow, I seriously doubt we're all gonna have this available on a computer or phone of any kind... The absolutely staggering amount of compute required to generate images is gonna make a phone plow through a battery in no time... Not everyone even has access to AI-worthy hardware, and CPU inference is not yet there with the current software. Even small text-generating models without any images are severely limited in capability... What a Dumbass...
       
 (DIR) Post #Acn3EICosuKV1SZLMG by preordained@noagendasocial.com
       2023-12-14T05:19:39Z
       
       0 likes, 0 repeats
       
       @Fox @Iamthebammbamm @xeroforhire Eh...the learning is what requires all the compute, the model that produces the results is just a big file. I hope you didn't think when you were talking to ChatGPT or Grok it was "computing" your results in real time.
       
 (DIR) Post #AcnhEzxAEFS4ry8aIq by Iamthebammbamm@noagendasocial.com
       2023-12-14T12:47:59Z
       
       0 likes, 0 repeats
       
       @Fox @xeroforhire CPU inference will never go anywhere until CPUs have hundreds/thousands of cores
       
 (DIR) Post #AcnwnLzcI7aMNl7n3g by Fox@noagendasocial.com
       2023-12-14T15:42:16Z
       
       1 like, 0 repeats
       
       @Iamthebammbamm @xeroforhire Well... That's not exactly accurate. It works for small text-generation models (i.e. ~7B); the catch is that it's not fast. But it DOES work. Just make sure you've got 32+ GB of RAM. How do I know? I'm doing it. But yeah, anything else would be dumb, go get a half-decent GPU. 😉
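
       [Editor's note: a rough back-of-envelope sketch of why the "32+ GB of RAM" figure above comes up for a ~7B model. This assumes weight storage dominates memory use and ignores KV cache and runtime overhead; the bits-per-weight values are the common float32/float16/4-bit-quantized cases, not anything claimed in the thread.]

       ```python
       def model_ram_gb(params_billion: float, bits_per_weight: float) -> float:
           """Approximate RAM (in GB) needed just to hold a model's weights.

           params_billion  -- parameter count in billions (e.g. 7 for a ~7B model)
           bits_per_weight -- storage precision (32 = float32, 16 = float16, 4 = 4-bit quant)
           """
           bytes_total = params_billion * 1e9 * bits_per_weight / 8
           return bytes_total / 1e9

       # A ~7B model:
       print(model_ram_gb(7, 32))  # 28.0 GB at full float32
       print(model_ram_gb(7, 16))  # 14.0 GB at float16
       print(model_ram_gb(7, 4))   #  3.5 GB at 4-bit quantization
       ```

       So unquantized weights alone can eat most of a 32 GB machine, while aggressive quantization is what makes CPU inference of small models practical at all.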
       
 (DIR) Post #Aco3HIT0Z47jrKX8dM by Iamthebammbamm@noagendasocial.com
       2023-12-14T16:54:55Z
       
       0 likes, 0 repeats
       
       @Fox @xeroforhire lol, that's exactly what I mean by it doesn't work
       
 (DIR) Post #Aconr5SF18jBooB9Ky by RobbieT@noagendasocial.com
       2023-12-15T00:21:30Z
       
       0 likes, 0 repeats
       
       @Iamthebammbamm @Fox @xeroforhire do you are telling me in the future I can have AI Alex Jones tell me the local weather forecast, and holiday recipes? This is going to be awesome!
       
 (DIR) Post #Aconr6f2X2YlYnAsvQ by Iamthebammbamm@noagendasocial.com
       2023-12-15T01:03:56Z
       
       0 likes, 0 repeats
       
       @RobbieT @Fox @xeroforhire
       
 (DIR) Post #Aconr7hYfAAUmtMPUO by Fox@noagendasocial.com
       2023-12-15T01:36:48Z
       
       0 likes, 0 repeats
       
       @Iamthebammbamm @RobbieT @xeroforhire Do can and see is at the and of with for this and that while do some for and. Also yes, screaming Alex Jones yelling about the weather, then interjecting gay frogs into the news report would be so damn funny.
       
 (DIR) Post #Acop3ZMDlTqZzQsY5o by Iamthebammbamm@noagendasocial.com
       2023-12-15T01:50:16Z
       
       0 likes, 0 repeats
       
       @Fox @RobbieT @xeroforhire there is an AI that mimics different people and characters, but I wasn't too impressed with it