Post Ay2QFKh3Jbb60PpxTM by safiuddinkhan@toot.io
(DIR) Post #Ay2QFF4wBt8cakxwrw by safiuddinkhan@toot.io
2025-09-09T17:18:53Z
1 likes, 0 repeats
OK, so LLMs are basically next-word predictors at heart. The context they "remember" is just the chat history, which is attached to every input request. Since the model sees all the tokens at the same time, it doesn't matter that each request is stateless — but this gives the user the illusion that LLMs are smart and can remember the previous context.
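A minimal sketch of the idea above: the model itself is stateless, so the client re-sends the whole transcript on every turn. `fake_model` is a stand-in, not a real predictor.

```python
# The model has no memory; the only "memory" is the history list the
# client keeps and re-attaches to every request.

def fake_model(prompt: str) -> str:
    """Stand-in for an LLM: just reports how much context it was given."""
    return f"(reply after seeing {len(prompt)} characters of context)"

history = []  # client-side transcript — the model never stores this

def send(user_message: str) -> str:
    history.append(f'Me: "{user_message}"')
    prompt = "\n".join(history)  # the full transcript goes in every time
    reply = fake_model(prompt)
    history.append(f"ChatGPT: {reply}")
    return reply

send("Hello How are you")
send("What are you up to?")
# The second request's prompt contains the first exchange verbatim,
# which is why the model appears to "remember" it.
```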
(DIR) Post #Ay2QFKh3Jbb60PpxTM by safiuddinkhan@toot.io
2025-09-09T17:19:06Z
0 likes, 0 repeats
If, for example, I say to ChatGPT:

Me: "Hello How are you"
ChatGPT: "I am fine"
Me: "What are you up to?"

this goes to ChatGPT as:

['Me: "Hello How are you"', 'ChatGPT: "I am fine"', 'Me: "What are you up to?"']

What the LLM will do, at its core, is predict the next word.
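To make "predict the next word" concrete, here is a toy sketch where a tiny hard-coded bigram table stands in for the trained model — generation is nothing but repeated next-word prediction:

```python
# Toy bigram "model": maps each word to the word most likely to follow it.
# A real LLM does the same thing, but with probabilities over a whole
# vocabulary, conditioned on the entire transcript, not just one word.
bigrams = {
    "I": "am",
    "am": "fine",
    "fine": ".",
}

def predict_next(word: str) -> str:
    return bigrams.get(word, ".")  # fall back to ending the sentence

def generate(seed: str, max_words: int = 5) -> str:
    """Generate text by repeatedly asking for the next word."""
    words = [seed]
    for _ in range(max_words):
        nxt = predict_next(words[-1])
        words.append(nxt)
        if nxt == ".":
            break
    return " ".join(words)

generate("I")  # → "I am fine ."
```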
(DIR) Post #Ay2QFPhwfiHZYdIF3g by safiuddinkhan@toot.io
2025-09-09T17:19:17Z
0 likes, 0 repeats
To be more efficient, the previous chat is sometimes compressed into a more compact form rather than sent as raw text, but that is basically all that happens at its core. In the simplest sense, it creates a profile of the chat, which is then attached to every request. It creates a profile of you as well, though providers deny they do anything like that.
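A hedged sketch of that compression step: once the transcript gets long, older turns are replaced by a short summary (the "profile" of the chat) and only recent turns are kept verbatim. `summarize` here is a placeholder — real systems typically ask the LLM itself to write the summary.

```python
# Sketch of history compression: summarize old turns, keep recent ones raw.

def summarize(turns: list[str]) -> str:
    # Placeholder: a real implementation would call the model here
    # to produce a natural-language summary of the older turns.
    return f"[Summary of {len(turns)} earlier messages]"

def build_prompt(history: list[str], keep_recent: int = 2) -> str:
    """Build the prompt sent with each request."""
    if len(history) <= keep_recent:
        return "\n".join(history)
    older, recent = history[:-keep_recent], history[-keep_recent:]
    return "\n".join([summarize(older), *recent])

history = [f"turn {i}" for i in range(6)]
build_prompt(history)
# Only the summary plus the last two turns are sent to the model,
# so the prompt stays small no matter how long the chat gets.
```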