AI development with standardised prompt engineering

Pranav K
3 min read · Aug 28, 2024


In the rapidly evolving world of AI, the creation and management of prompts for large language models (LLMs) have become essential.

Microsoft’s Prompty is a tool designed to standardise and streamline this process, enhancing the reusability and portability of prompts and thereby accelerating the development cycle. Its integration with Visual Studio Code and support for orchestrators such as LangChain add to its utility, allowing developers to manage and execute prompts efficiently.

Some of the benefits include:

  • Standardisation: Ensures consistent prompt creation and management via the Prompty specification format across different AI projects, fostering uniformity and collaboration.
  • Testability in isolation: Prompts can be tested independently (e.g. with the VS Code Prompty extension), allowing developers to debug and optimise them without affecting the rest of the system.
  • Embedded observability: Prompty includes detailed logging and tracing, enabling developers to monitor prompt performance and easily identify issues.

An example of Prompty specification format:

#sample.prompty
---
name: SamplePrompt
description: A prompt that uses context to ground an incoming question
authors:
  - Pranav
model:
  api: chat
  configuration:
    type: openai
    name: ${env:OPENAI_MODEL_NAME}
    api_key: ${env:OPENAI_API_KEY}
    model: ${env:OPENAI_MODEL_NAME}
  parameters:
    max_tokens: 3000
    temperature: 0.7
  response: all

sample:
  firstName: John
  context: >
    The Alpine Explorer Tent boasts a detachable divider for privacy,
    numerous mesh windows and adjustable vents for ventilation, and
    a waterproof design. It even has a built-in gear loft for storing
    your outdoor essentials. In short, it's a blend of privacy, comfort,
    and convenience, making it your second home in the heart of nature!
  question: What can you tell me about your tents?
---

system:
You are an AI assistant who helps people find information. As the assistant,
you answer questions briefly, succinctly, and in a personable manner using
markdown and even add some personal flair with appropriate emojis.

# Customer
You are helping {{firstName}} to find answers to their questions.
Use their name to address them in your responses. Your response should follow the format below:
Hi <customer_name>, thank you for your query. <answer content>

# Context
Use the following context to provide a more personalized response to {{firstName}}:
{{context}}

user:
{{question}}
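The placeholders such as {{firstName}} and {{context}} in the body above use Jinja2-style template syntax. As a rough illustration of how the sample values get substituted into the prompt text, here is a minimal sketch using the jinja2 library directly (an assumption for demonstration only, not Prompty's internal code):

```python
from jinja2 import Template

# A fragment of the .prompty body (the part after the YAML front matter)
# expressed as a plain Jinja2 template for illustration.
prompt_template = Template(
    "You are helping {{firstName}} to find answers to their questions.\n"
    "Use the following context:\n"
    "{{context}}"
)

# Substitute the sample values from the front matter's `sample:` section.
rendered = prompt_template.render(
    firstName="John",
    context="The Alpine Explorer Tent boasts a detachable divider for privacy.",
)
print(rendered)
```

When Prompty executes the file, the same kind of substitution happens automatically for every placeholder before the messages are sent to the model.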

Using the Prompty specification file from Python would look like this:

#python
import os

from promptflow.core import OpenAIModelConfiguration, Prompty
from promptflow.tracing import start_trace

# For observability
start_trace()

configuration = OpenAIModelConfiguration(
    model=os.getenv("OPENAI_MODEL_NAME"),
    api_key=os.getenv("OPENAI_API_KEY"),
)

# You can also override the model parameters specified in the prompty file
override_model = {
    "configuration": configuration,
    "parameters": {"max_tokens": 512, "temperature": 0.7},
}

# Load the prompty file as a flow
flow = Prompty.load(source="sample.prompty", model=override_model)

# Execute the flow as a function
result = flow(
    question="What is the capital of France?",
    firstName="Pranav",
    context="France is in Europe",
)
print(result)

# Output
# Hi Pranav, thank you for your query. The capital of France is Paris! 🇫🇷✨
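Both the `${env:...}` references in the spec file and the `os.getenv` calls above depend on environment variables being set before the script runs. A small fail-fast check like the following (an illustrative helper, not part of Prompty itself) can make missing configuration obvious up front:

```python
import os

# Variables referenced by sample.prompty and the Python snippet above.
REQUIRED_ENV_VARS = ["OPENAI_API_KEY", "OPENAI_MODEL_NAME"]


def check_env(required=REQUIRED_ENV_VARS):
    """Raise a clear error listing any required variables that are unset."""
    missing = [name for name in required if not os.getenv(name)]
    if missing:
        raise RuntimeError(
            f"Missing environment variables: {', '.join(missing)}"
        )
    return True
```

Calling `check_env()` at the top of the script turns a cryptic authentication failure into an immediate, readable error message.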

Conclusion: Prompty is a powerful tool that addresses the growing need for standardisation in AI development. By providing a unified format for prompt creation and execution, and by integrating seamlessly with existing tools, Prompty enhances both the efficiency and reliability of AI projects. Currently only OpenAI and Azure OpenAI models are supported, but given the benefits above, I hope this kind of specification becomes a standard across other LLM providers as well.
