Post ASujmUZ0MfSYCoV8iG by basil@hci.social
(DIR) Post #ASue0uatmTwrGcsbhI by simon@fedi.simonwillison.net
2023-02-21T18:31:56Z
1 like, 1 repeat
I keep seeing people argue that prompt engineering is a bug, not a feature, and will soon be made obsolete by future AI advances. I very much disagree: https://simonwillison.net/2023/Feb/21/in-defense-of-prompt-engineering/
(DIR) Post #ASueSwwGenjMeCFDIe by kgoldsholl@social.vivaldi.net
2023-02-21T18:36:49Z
0 likes, 0 repeats
@simon they will use whatever is cheapest to implement, because the real goal is to waste so much of the customer's time that they hang up.
(DIR) Post #ASuhWNOL3wGRQ7hSWu by pganssle@qoto.org
2023-02-21T19:10:51Z
0 likes, 0 repeats
@simon In the limit of “the LLM is as smart as the smartest human”, you would still expect to need to give it the right kind of context and information to do what you want. Communication skills are super useful on both sides of the table. Though I suppose somewhere along the way smarter LLMs will proactively recognize ambiguities and ask for clarification, which will lower the degree to which you need to be good at asking it to do something.
(DIR) Post #ASuiz3Gx2mqhNFTwXo by baclace@sigmoid.social
2023-02-21T19:27:24Z
0 likes, 0 repeats
@simon The most likely scenario is a split between developers (prompt engineering) and users (no prompt engineering).
(DIR) Post #ASujmUZ0MfSYCoV8iG by basil@hci.social
2023-02-21T19:36:24Z
0 likes, 0 repeats
@simon just to be a bit more nuanced, I think part of the “it’s a bug” argument is referring to the majority of users not needing to explicitly prompt engineer day-to-day when interacting with an LLM-based system. The deep skill and care needed to construct and parameterise a (hidden?) prompt seems likely to persist. Does that seem fair?
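[Editor's note: as an illustration of the "hidden, parameterised prompt" basil describes, here is a minimal Python sketch. The template wording, function name, and parameters are assumptions for illustration only, not anything from the thread; the point is that the careful prompt engineering is done once by developers, while end users only ever fill in the question slot.]

    # Hypothetical hidden prompt template; wording and parameters are illustrative.
    HIDDEN_PROMPT = """\
    You are a support assistant for {product_name}.
    Answer in at most {max_sentences} sentences.
    If the answer is not in the context, say you don't know.

    Context:
    {context}

    Question: {question}
    """


    def build_prompt(question: str, context: str,
                     product_name: str = "ExampleApp",
                     max_sentences: int = 3) -> str:
        """Fill the hidden template with the user's question and retrieved context."""
        return HIDDEN_PROMPT.format(
            product_name=product_name,
            max_sentences=max_sentences,
            context=context,
            question=question,
        )


    if __name__ == "__main__":
        # The end user only types the question; everything above is the
        # developers' prompt engineering, invisible to them.
        print(build_prompt(
            question="How do I reset my password?",
            context="Passwords can be reset from Settings > Account > Security.",
        ))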
(DIR) Post #ASukuSsUcEbzFSyxM0 by simon@fedi.simonwillison.net
2023-02-21T19:49:17Z
0 likes, 0 repeats
@basil I always assume people are talking more about experts and developers than regular users when they use that term, because of the presence of the word "engineer".
(DIR) Post #ASv7OGuBtMqRnMOWY4 by geoffreylitt@mastodon.social
2023-02-22T00:00:52Z
0 likes, 0 repeats
@simon I wonder if the biological sciences provide one analogy for how the field will develop. Combination of science, craft, and art, unexplained nondeterministic results, weird tricks passed down by apprenticeship…
(DIR) Post #ASvA8Omn7kMZSw3YRc by b3n@g0v.social
2023-02-22T00:31:38Z
0 likes, 0 repeats
@simon That always sounded wrong to me. LLMs getting “better” can only be about nuances with respect to the (constructed) context they are in. Of course you can just throw a question at them, but if they are better at nuances, their power lies in reacting to nuances in the user prompts.
(DIR) Post #ASvTEqT50uywBdckC0 by MaxPower@hachyderm.io
2023-02-22T04:05:41Z
0 likes, 0 repeats
Arguing that prompt engineering will disappear seems like arguing that feature engineering will disappear from ML pipeline work because of neural networks or 'auto-ai'. Hasn't happened yet...