With the rise of LLMs and prompt-to-output generative AI, prompts are becoming the center of attention. Coming up with the right prompt is becoming a skill of its own, and there is a plethora of courses on offer to teach prompt engineering.
I’ve even come across some internet sources claiming that a few colleges are going as far as offering an undergraduate degree in prompt engineering itself.
While there is genuine value in knowing how to come up with the right prompt, I don’t think this skill will grow into a full ‘engineering’ discipline that warrants a degree.
There are good reasons why I’m saying so:
- It is too generic to become a specialization.
- Prompts that work well for one LLM may not work for another.
- The whole point of building LLMs is to make them understand language as well as a human being, or at least closely enough that an ordinary person can’t tell the difference. As they improve, it will become much easier to interact with them and get the desired output without having to master a particular style of prompting.
I think prompt engineering will emerge as a skill rather than a degree, much like MS Excel. The majority of people will be fine with the basics, but there will certainly be some experts who can do far more with it, just as is the case with Excel.
As a takeaway, I’d encourage you to understand how prompts are translated into outputs, rather than just memorizing new ways to phrase them.
Once you understand those fundamentals, you can play around with prompts as you wish and come up with as many as you need.
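To make that a little more concrete, here is a minimal sketch in Python (assuming the `tiktoken` library is installed; the prompt text is just an example) of the very first step in that translation: your prompt is split into integer token IDs, and the model only ever predicts the next token given those IDs.

```python
# A minimal sketch, assuming tiktoken is installed (pip install tiktoken).
# It shows the first step of how a prompt is "translated": the text is
# broken into tokens (integer IDs) before the model ever sees it.
import tiktoken

# Tokenizer used by several recent OpenAI models.
encoding = tiktoken.get_encoding("cl100k_base")

prompt = "Summarize this article in three bullet points."
token_ids = encoding.encode(prompt)

print(token_ids)                    # a list of integer token IDs
print(encoding.decode(token_ids))   # decoding recovers the original prompt
print(len(token_ids), "tokens")     # token counts drive context limits and billing
```

Seeing that a prompt is just a sequence of tokens makes it clearer why small wording changes can shift the output, and why the model has no hidden understanding of your intent beyond the text you give it.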