Posts by glaforge@uwyn.net
 (DIR) Post #AQ5sOm8OcmhqCRPoDg by glaforge@uwyn.net
       2022-11-29T11:00:09Z
       
       0 likes, 0 repeats
       
        @bortzmeyer Tim's article still mixes everything up a bit, though. There are obviously always algorithms, and at different levels. However, the idea (the algorithm) of a chronological timeline (with support for reply threads, etc.) avoids two things:
        - the echo chamber some people lock themselves into (you only see what confirms your own views)
        - infinite scrolling (you can no longer stop scrolling for fear of missing something)
       
 (DIR) Post #AQ5wy7WDVglSTQfNlA by glaforge@uwyn.net
       2022-11-29T11:51:36Z
       
       0 likes, 0 repeats
       
        @bortzmeyer Another aspect is the choice not to support "quote tweets" like on Twitter. It adds friction when someone wants to mock a tweet by pointing a finger at it.
       
 (DIR) Post #AQUqxBThYsEzgtetCC by glaforge@uwyn.net
       2022-12-11T11:48:16Z
       
        1 like, 2 repeats
       
        Still #caturday? Isn't it #caturdayeverday? What would be the subtitle for this one? Hey, can I climb in the #xmas tree? 🎄
       
 (DIR) Post #ASUrcWo0n22RetQnSK by glaforge@uwyn.net
       2023-02-09T08:00:40Z
       
       0 likes, 0 repeats
       
       @simon How did you do that?
       
 (DIR) Post #ASUuXkHZYWKBs2fXqC by glaforge@uwyn.net
       2023-02-09T08:34:40Z
       
       0 likes, 0 repeats
       
        @simon neat, thanks for the link and explanation. I guess the gibberish output is because it'd need an even bigger corpus of text?
       
 (DIR) Post #ASUvS5O4HA03XhZYPo by glaforge@uwyn.net
       2023-02-09T08:45:00Z
       
       0 likes, 0 repeats
       
        @simon Makes sense. Does NanoGPT start from scratch, or does it have some existing pre-trained weights to get started?
       
 (DIR) Post #ASVxmROgXykofBefAW by glaforge@uwyn.net
       2023-02-09T20:45:41Z
       
       0 likes, 0 repeats
       
        @simon Just stumbled upon this article that replicates #GPT with #numpy and which says that they loaded existing GPT-2 model weights. So perhaps with such pre-training, the output would be better on your corpus afterwards? I really think this is interesting, for example, for open source projects, as they could help developers and guide them through using the project. Really promising stuff.
       
 (DIR) Post #ASVyfmGqdo7pKRNLnM by glaforge@uwyn.net
       2023-02-09T20:55:41Z
       
       0 likes, 0 repeats
       
        @simon Well, intuitively, it's like being asked to write a poem without knowing the language. You need to know the language before you can write the poem, right? I'm gonna read the link, thanks.
       
 (DIR) Post #ASW3KNK2hfO7drU3Rg by glaforge@uwyn.net
       2023-02-09T20:56:40Z
       
       0 likes, 0 repeats
       
       @simon Your blog! I think I've read it already 😉
       
 (DIR) Post #ASW3KNwKPK0rYbOd7Y by glaforge@uwyn.net
       2023-02-09T20:58:06Z
       
       0 likes, 0 repeats
       
        @simon I think Supabase went with embeddings here: https://supabase.com/blog/chatgpt-supabase-docs
       
 (DIR) Post #ASoU8J2X7rqqcZvvEG by glaforge@uwyn.net
       2023-02-18T19:10:25Z
       
       0 likes, 0 repeats
       
       @simon wrong link?
       
 (DIR) Post #ATsosGPtc4y3fUhkwK by glaforge@uwyn.net
       2023-03-22T07:52:27Z
       
        0 likes, 1 repeat
       
        Did you know about #curl's --json flag? When you work with JSON data, curl can take care of the JSON content-type/accept headers for you:

        curl --json '{"msg": "hello"}' https://example.com

        https://glaforge.dev/posts/2023/03/22/curl-s-json-flag/
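        As a rough sketch of what the shorthand expands to (assuming curl 7.82.0 or later, where --json was introduced, and using example.com as a placeholder URL): --json sends its argument as the request body and sets both JSON headers for you.

```shell
# `curl --json DATA` (curl >= 7.82.0) is roughly shorthand for sending DATA
# as the request body while setting both JSON headers explicitly.
payload='{"msg": "hello"}'
content_type='Content-Type: application/json'
accept='Accept: application/json'

# Long-form equivalent of: curl --json "$payload" https://example.com
echo curl -d "$payload" -H "$content_type" -H "$accept" https://example.com
```

        The echo just prints the equivalent long-form command; drop it to actually send the request.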
       
 (DIR) Post #AWU0lXJz8UFe3QzqHw by glaforge@uwyn.net
       2023-06-08T13:39:31Z
       
       0 likes, 0 repeats
       
        @bortzmeyer I'm ready, I've got the list: https://www.isouard-avocat.com/publications/adverbes-multiplicatifs
       
 (DIR) Post #AWU9NGiG8KlHBJqyo4 by glaforge@uwyn.net
       2023-06-08T15:16:13Z
       
       0 likes, 0 repeats
       
        @bortzmeyer alea jacta est... My Latin stops there, a priori. A minima, etc.