March 13, 2024

Prompt Fundamentals

Prompting an LLM

For software engineers looking to get useful results from language models, understanding how prompts are constructed is crucial. The quality of the output hinges on how much relevant detail you embed in the prompt: the direct instruction or question aimed at the model, plus any supplementary context, inputs, or examples. These components guide the model towards generating higher-quality outputs.

Let's dive into an elementary example to illustrate prompt structuring:

Prompt Example:

  • Prompt: The sky is
  • Output: blue.

Basic Structure for Prompting in LLM Interfaces

When interacting with chat models such as Mistral or Llama 2, a prompt is organized as messages with three roles: system, user, and assistant. The system message is optional and defines the assistant's behavior; the user message carries the actual query. For clarity, our examples use only a user message to query the Mistral 7B model unless stated otherwise. The assistant message represents the model's reply, and supplying one yourself lets you show the model an example of the output you're aiming for, which can improve its performance.
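To make the roles concrete, here is a minimal sketch of the message structure in Python. The role/content dictionary layout follows the convention shared by most chat APIs (including Mistral's), but the system-message wording and the example pair are purely illustrative; check your provider's documentation for the exact request schema.

  # Sketch of the system / user / assistant message structure for a chat model.
  messages = [
      # system (optional): defines how the assistant should behave.
      {"role": "system", "content": "You are a concise assistant for software engineers."},
      # user + assistant (optional pair): an example of the output style you want back.
      {"role": "user", "content": "Complete the sentence: The grass is"},
      {"role": "assistant", "content": "The grass is a soft carpet of green."},
      # user: the actual query sent to the model.
      {"role": "user", "content": "Complete the sentence: The sky is"},
  ]

The final user message is what the model answers; everything before it shapes how it answers.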

The simple prompt example demonstrates how the model generates a contextually appropriate sequence of tokens. However, the output may not always align with your specific objectives, underscoring the importance of detailed context or instructions to guide the model towards your desired outcome. This necessity forms the basis of prompt engineering.

To refine our approach, consider the following enhanced prompt:

Improved Prompt Example:

  • Prompt: Complete the sentence: The sky is
  • Output: The sky is a vast expanse of blue, dotted with fluffy white clouds during the day and a canvas of twinkling stars at night. It is a breathtaking sight that inspires awe and wonder in people around the world.

This refined prompt directs the model to complete the sentence, yielding a more precise and contextually rich response. Such strategic prompt design, aimed at eliciting specific behaviors or answers from the model, epitomizes prompt engineering.

The above examples merely scratch the surface of what's feasible with current LLMs, which are capable of executing a wide array of complex tasks, from summarizing texts and solving mathematical problems to generating code.

Prompt Formatting

The initial example provided serves as a foundational guide to constructing prompts for interacting with language models. In the realm of software development, prompts can be categorized broadly into two types: queries and commands.

  • Query Format: How do you implement a binary search in Python?
  • Command Format: Write a function for a binary search in Python.

For tasks centered on question answering (QA), the structure typically follows the format used in many QA datasets:

  • Q: How do you implement a binary search in Python?
  • A:

This approach is known as zero-shot prompting, where the model is directly queried for an answer without any prior examples or contextual information related to the task. Advanced language models are equipped to handle zero-shot prompts, with their effectiveness varying based on the task's complexity and the model's training specifics.

Here's a straightforward example tailored for software developers:

Prompt Example:

  • Prompt: Explain the concept of recursion in computer science.

In the context of newer models, the explicit "Q:" prefix can often be omitted, as these models are adept at recognizing a question based on the prompt's structure alone. Thus, a more streamlined version might look like this:

Simplified Prompt:

  • Explain the concept of recursion in computer science.
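As a rough sketch of how a zero-shot prompt like this might be sent in practice, the snippet below posts a single user message to a chat-completions endpoint. The URL, model identifier, and response shape are assumptions based on Mistral's OpenAI-compatible API; adjust them for your provider.

  import os
  import requests

  # Zero-shot prompt: just the question, no examples.
  prompt = "Explain the concept of recursion in computer science."

  # Assumed endpoint, model identifier, and response layout; verify against
  # your provider's documentation before relying on them.
  response = requests.post(
      "https://api.mistral.ai/v1/chat/completions",
      headers={"Authorization": f"Bearer {os.environ['MISTRAL_API_KEY']}"},
      json={
          "model": "open-mistral-7b",
          "messages": [{"role": "user", "content": prompt}],
      },
      timeout=30,
  )
  print(response.json()["choices"][0]["message"]["content"])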

To enhance the effectiveness of prompts, especially in complex or nuanced tasks, the few-shot prompting technique is invaluable. This involves providing the model with a series of examples or exemplars. For software development tasks, a few-shot prompt could be structured as follows:

  • What is a class in object-oriented programming?
  • A class is a blueprint for creating objects, providing initial values for state (member variables) and implementations of behavior (member functions or methods).
  • What is inheritance in programming?
  • Inheritance is a mechanism wherein a new class is derived from an existing class.
  • What is polymorphism in computer science?
  • Polymorphism is the ability of objects of different classes to respond to the same message/function call in different ways.
  • What is encapsulation?

In QA format, this translates to:

  • Q: What is a class in object-oriented programming?
  • A: A class is a blueprint for creating objects, providing initial values for state (member variables) and implementations of behavior (member functions or methods).
  • Q: What is inheritance in programming?
  • A: Inheritance is a mechanism wherein a new class is derived from an existing class.
  • Q: What is polymorphism in computer science?
  • A: Polymorphism is the ability of objects of different classes to respond to the same message/function call in different ways.
  • Q: What is encapsulation?
  • A:
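One way to turn examples like these into a single prompt string is to stack the solved Q/A pairs ahead of the new question and leave the final answer blank for the model to fill in. The helper below is a simple sketch of that idea, not a prescribed format.

  # Build a few-shot QA prompt by placing solved exemplars before the new question.
  def build_few_shot_prompt(examples, question):
      """examples: list of (question, answer) pairs shown to the model as exemplars."""
      lines = []
      for q, a in examples:
          lines.append(f"Q: {q}")
          lines.append(f"A: {a}")
      lines.append(f"Q: {question}")
      lines.append("A:")  # left open for the model to complete
      return "\n".join(lines)

  examples = [
      ("What is a class in object-oriented programming?",
       "A class is a blueprint for creating objects, providing initial values for state "
       "(member variables) and implementations of behavior (member functions or methods)."),
      ("What is inheritance in programming?",
       "Inheritance is a mechanism wherein a new class is derived from an existing class."),
      ("What is polymorphism in computer science?",
       "Polymorphism is the ability of objects of different classes to respond to the same "
       "message/function call in different ways."),
  ]

  print(build_few_shot_prompt(examples, "What is encapsulation?"))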

The QA format is not obligatory; the right format depends on the nature of the task at hand. For instance, in a code classification task, you might structure your prompt as follows:

Prompt for Code Classification:

  • "def quicksort(arr):" // Sorting Algorithm
  • "def authenticate_user(user):" // Authentication Function
  • "class Vehicle:" // Class Definition
  • "def calculate_area(radius):" //

Expected Output:

  • Geometry Function
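The same pattern works here: the labeled snippets act as exemplars, and the final line is left open for the model to label. As a plain-string sketch in Python (snippets and labels taken from the example above):

  # Few-shot code-classification prompt: labeled snippets, then one unlabeled snippet.
  classification_prompt = (
      '"def quicksort(arr):" // Sorting Algorithm\n'
      '"def authenticate_user(user):" // Authentication Function\n'
      '"class Vehicle:" // Class Definition\n'
      '"def calculate_area(radius):" //'  # the model is expected to answer "Geometry Function"
  )
  print(classification_prompt)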

Few-shot prompts enable language models to engage in in-context learning, adapting to new tasks based on a limited set of examples. This guide will delve deeper into the nuances of zero-shot and few-shot prompting, particularly their applications in software development, in the sections that follow.
