Like the sister WaPo article hyping up this job title, also involving Anthropic (https://archive.is/pvwlx), this is still a tiny market, and it looks red-hot only because of Anthropic's job posting. It's like watching a penny stock spike when there's one buyer and one seller.

In addition, I can't fathom paying $300k for skills short of a Ph.D. or self-directed research. The job is really about finding emergent behavior like chain-of-thought prompting, not just crafting prompts to coax out the right response: https://arxiv.org/abs/2201.11903 That is probably what prompt engineering should be about, not tweaking wording for a particular output.
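For concreteness, here's a minimal sketch of what chain-of-thought prompting from that paper looks like in practice. It's plain Python with no API calls; `llm_complete` is a hypothetical placeholder for whatever completion endpoint you'd use, and the exemplar is adapted from the paper:

```python
# Chain-of-thought prompting (Wei et al., arXiv:2201.11903): instead of asking
# for the answer directly, the few-shot exemplar includes the intermediate
# reasoning, which the model then imitates on the new question.

COT_EXEMPLAR = """\
Q: Roger has 5 tennis balls. He buys 2 more cans of tennis balls.
Each can has 3 tennis balls. How many tennis balls does he have now?
A: Roger started with 5 balls. 2 cans of 3 tennis balls each is 6 tennis balls.
5 + 6 = 11. The answer is 11.
"""

def build_cot_prompt(question: str) -> str:
    """Prepend a worked, step-by-step exemplar so the model 'shows its work'."""
    return f"{COT_EXEMPLAR}\nQ: {question}\nA:"

def llm_complete(prompt: str) -> str:
    # Hypothetical placeholder -- swap in your provider's completion call.
    raise NotImplementedError

if __name__ == "__main__":
    print(build_cot_prompt(
        "The cafeteria had 23 apples. If they used 20 to make lunch "
        "and bought 6 more, how many apples do they have?"
    ))
```

The interesting part is discovering that including the exemplar's reasoning steps changes model behavior at all; the string formatting itself is trivial.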

You can also get a sense of freelance market rates here: https://promptbase.com/hire, although activity and review volume there are quite low. Almost anyone can learn to do what these folks do, especially in Stable Diffusion land with all of the web-UI plugins.

"Prompt engineering" is a made up thing. There's not some magic William Gibson prompt cowboy who through some innate intuition can master prompts the way nobody else can.

It's a deterministic model (the randomness comes from sampling at decode time). There's plenty of interesting explainability work, or however you want to frame it, to be done understanding the connection between different prompts and outputs. But it's clearly the province of real ML researchers who have experience in the area and will approach it scientifically. $335k for someone good at that is not unusual at all; they have lots of options.

Prompt engineering is more about prompt chaining, which is actively being researched, so it is a thing: https://github.com/hwchase17/langchain
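For anyone who hasn't used it, here's a minimal sketch of what chaining looks like with that library, roughly the quickstart example from the hwchase17-era API. The imports and class names have been reshuffled in later releases, so treat this as approximate; it also assumes an OpenAI key in your environment:

```python
from langchain.llms import OpenAI
from langchain.prompts import PromptTemplate
from langchain.chains import LLMChain, SimpleSequentialChain

llm = OpenAI(temperature=0.7)

# Step 1: generate a company name for a product.
name_chain = LLMChain(
    llm=llm,
    prompt=PromptTemplate(
        input_variables=["product"],
        template="What is a good name for a company that makes {product}?",
    ),
)

# Step 2: feed that name into a second prompt that writes a slogan.
slogan_chain = LLMChain(
    llm=llm,
    prompt=PromptTemplate(
        input_variables=["company_name"],
        template="Write a catchy slogan for a company called {company_name}.",
    ),
)

# SimpleSequentialChain pipes the output of one chain into the next.
chain = SimpleSequentialChain(chains=[name_chain, slogan_chain], verbose=True)
print(chain.run("colorful socks"))
```

The "engineering" here is deciding how to decompose a task into steps and wire the outputs together, which is a lot closer to real work than fiddling with adjectives in a single prompt.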