Apart from things like the LangChain and Hugging Face blogs, does anyone have any good resources for the latest tools and techniques for LLM prompt engineering? I need to bone up this week.
📝 Prompt engineering techniques with Azure OpenAI - Azure OpenAI Service
Learn about the options for how to use prompt engineering with GPT-3, GPT-35-Turbo, and GPT-4 models
Prompt Engineering, also known as In-Context Prompting, refers to methods for communicating with an LLM to steer its behavior toward desired outcomes without updating the model weights. It is an empirical science, and the effect of prompt engineering methods can vary a lot among models, thus requiring heavy experimentation and heuristics. This post focuses only on prompt engineering for autoregressive language models, so nothing on Cloze tests, image generation, or multimodal models.
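As a concrete illustration of in-context prompting, here is a minimal sketch of few-shot prompting with the OpenAI Python client; the model name, the example labels, and the `classify_sentiment` helper are illustrative assumptions, not something prescribed by the post above.

```python
# Minimal few-shot (in-context) prompting sketch.
# Assumes the `openai` v1 Python package and an OPENAI_API_KEY in the environment;
# the model name and the examples below are illustrative only.
from openai import OpenAI

client = OpenAI()

# The "in-context" part: a handful of worked examples placed directly in the
# prompt steer the model's behavior without any weight updates.
FEW_SHOT_EXAMPLES = [
    ("The battery died after two days.", "negative"),
    ("Setup took thirty seconds and it just worked.", "positive"),
]

def classify_sentiment(text: str) -> str:
    """Hypothetical helper: label sentiment using only in-context examples."""
    messages = [{"role": "system",
                 "content": "You label product reviews as 'positive' or 'negative'."}]
    for review, label in FEW_SHOT_EXAMPLES:
        messages.append({"role": "user", "content": review})
        messages.append({"role": "assistant", "content": label})
    messages.append({"role": "user", "content": text})

    response = client.chat.completions.create(
        model="gpt-4o-mini",   # assumption: any chat-completion model works here
        messages=messages,
        temperature=0,
    )
    return response.choices[0].message.content.strip()

if __name__ == "__main__":
    print(classify_sentiment("Arrived broken and support never replied."))
```

Swapping or reordering the in-context examples is often enough to change the output, which is why the post stresses heavy experimentation.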
📝 A guide to prompting AI (for what it is worth)
A little bit of magic, but mostly just practice
📝 Building LLM applications for production
[Hacker News discussion, LinkedIn discussion, Twitter thread]
📝 Re-implementing LangChain in 100 lines of code
LangChain has become a tremendously popular toolkit for building a wide range of LLM-powered applications, including chat, Q&A, and document search. In this blog post I re-implement some of the novel LangChain functionality as a learning exercise, looking at the low-level prompts it uses to create these higher-level capabilities.
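To give a flavour of what such a re-implementation involves, here is a minimal, self-contained sketch of a LangChain-style prompt template plus chain; the class names and the `fake_llm` stub are illustrative assumptions, not the blog post's actual code or LangChain's API.

```python
# A toy, LangChain-flavoured chain: a prompt template filled with variables,
# passed to an LLM callable, whose output could feed the next step.
# `fake_llm` is a stand-in so the sketch runs without any API key.
from dataclasses import dataclass
from typing import Callable

@dataclass
class PromptTemplate:
    template: str

    def format(self, **kwargs: str) -> str:
        return self.template.format(**kwargs)

@dataclass
class LLMChain:
    llm: Callable[[str], str]
    prompt: PromptTemplate

    def run(self, **kwargs: str) -> str:
        # The "chain" is just: fill the template, then call the model.
        return self.llm(self.prompt.format(**kwargs))

def fake_llm(prompt: str) -> str:
    # Stand-in for a real completion call; echoes the prompt it received.
    return f"[model answer to: {prompt!r}]"

qa_prompt = PromptTemplate(
    template="Answer the question using only the context.\n"
             "Context: {context}\nQuestion: {question}\nAnswer:"
)

chain = LLMChain(llm=fake_llm, prompt=qa_prompt)
print(chain.run(context="LangChain wraps prompts and LLM calls.",
                question="What does LangChain wrap?"))
```

The real library layers retrieval, memory, and agent loops on top; the point of the post is that those higher-level capabilities ultimately come down to carefully constructed low-level prompts like this one.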