Commit 4553ce5

Merge pull request #202 from BethanyJep/main
Update docs: observability & specification
2 parents c63eb6b + 0f692ca

24 files changed: +468 -493 lines
@@ -0,0 +1,19 @@
+---
+title: Code Guidelines
+authors:
+  - bethanyjep
+  - nitya
+  - sethjuarez
+date: 2025-03-11
+tags:
+  - contributing
+  - documentation
+index: 2
+---
+
+## Code contribution guidelines
+
+..to be updated
+
+---
+[Want to Contribute To the Project?](/docs/contributing/)
@@ -0,0 +1,36 @@
+---
+title: Docs Guidelines
+authors:
+  - bethanyjep
+  - nitya
+date: 2024-06-10
+tags:
+  - contribution
+  - documentation
+index: 3
+---
+
+## Documentation guidelines
+For contributing to the Prompty documentation:
+
+1. Local Setup
+   - Our documentation uses MDX (Markdown + JSX) and is built with a static site generator.
+   - Install dependencies with `npm install` in the `web` directory.
+   - Use Node.js 18+ for best compatibility.
+
+2. Making Changes
+   - Documentation files are located in the `docs` directory.
+   - Follow the existing folder structure and naming conventions.
+   - Use the `.mdx` extension for all documentation files.
+
+3. Previewing Changes
+   - Start the local development server with `npm run dev` from the `web` directory.
+   - View your changes at [http://localhost:3000](http://localhost:3000).
+   - The site will hot-reload as you edit files.
+
+4. Building Documentation
+   - Test a production build with `npm run build` followed by `npm run start`.
+   - Check for any build errors or warnings before submitting your PR.
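The four steps above boil down to a short terminal session (a sketch assuming the docs site lives in the `web` directory, as noted above):

```shell
# from the repository root; the docs site lives in `web`
cd web
npm install      # one-time dependency install (Node.js 18+)
npm run dev      # hot-reloading dev server at http://localhost:3000

# before opening a PR, verify a production build
npm run build
npm run start    # serve the built site locally
```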
+
+---
+[Want to Contribute To the Project?](/docs/contributing/) - _Updated Guidance Coming Soon_.

web/docs/contributing/page.mdx
+12 -3

@@ -4,7 +4,7 @@ index: 5
 authors:
   - bethanyjep
   - nitya
-date: 2024-10-09
+date: 2025-03-10
 tags:
   - documentation
   - contributing
@@ -23,7 +23,16 @@ Feedback can come in several forms:
 - Let us know if you found errors or ambiguity in the documentation
 - Report bugs or inconsistent behavior seen with Prompty tools and usage

-The easiest way to give us feedback is by [filing an issue](https://github.com/microsoft/prompty/issues/new?template=Blank+issue). **Please check previously logged issues (open and closed) to make sure the topic or bug has not already been raised.** If it does exist, weigh in on that discussion thread to add any additional context of value.
+The easiest way to give us feedback is by [filing an issue](https://github.com/microsoft/prompty/issues/new). **Please check previously logged issues (open and closed) to make sure the topic or bug has not already been raised.** If it does exist, weigh in on that discussion thread to add any additional context of value.

 ## Contributor guidelines
-The repository contains both the code and the documentation for the project. Each requires a different set of tools and processes to build and preview outcomes. We hope to document these soon - so check back for **contributor guidelines** that will cover the requirements in more detail.
+The repository contains both the code and the documentation for the project. Each requires a different set of tools and processes to build and preview outcomes.
+
+## Pull Request Guidelines
+When submitting a pull request (PR) to the Prompty repository, please follow these guidelines:
+
+- Fork the repository: always fork the repository to your own account before making your modifications.
+- Separate pull requests (PRs):
+  - Submit each type of change in its own pull request. For example, bug fixes and documentation updates should go in separate PRs.
+  - Typo fixes and minor documentation updates can be combined into a single PR where appropriate.
+- Handle merge conflicts: if your pull request shows merge conflicts, update your local main branch to mirror the upstream repository, then rebase your branch on top of it.
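The fork-and-PR flow above can be sketched as shell commands (the fork URL and branch name here are illustrative):

```shell
# 1. Fork microsoft/prompty on GitHub, then clone your fork
git clone https://github.com/<your-username>/prompty.git
cd prompty
git remote add upstream https://github.com/microsoft/prompty.git

# 2. One branch per type of change (separate PRs for fixes vs. docs)
git checkout -b docs/fix-observability-typos

# 3. If the PR shows merge conflicts: sync main with upstream, then rebase
git fetch upstream
git checkout main
git merge --ff-only upstream/main
git checkout docs/fix-observability-typos
git rebase main
```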

web/docs/getting-started/debugging-prompty/page.mdx
+2 -2

@@ -3,7 +3,7 @@ title: Debugging Prompty
 authors:
   - bethanyjep
   - nitya
-date: 2024-10-22
+date: 2025-03-06
 tags:
   - getting-started
   - documentation
@@ -24,7 +24,7 @@ For observability in Prompty, we will use the tracer to visualize and debug the
 - Analyze the trace output to debug and fix the bug

-## 2. Understandfing Observability in Prompty
+## 2. Understanding Observability in Prompty

 Observability refers to the ability to monitor and understand the behavior of a system based on the data it produces, such as logs, metrics, and traces. It is important because it gives you insight into your LLM workflows and provides a way to debug your prompt inputs and outputs.

web/docs/guides/prompty-extension/page.mdx
+43 -1

@@ -11,7 +11,49 @@ tags:
 index: 3
 ---

-TODO: How does Prompty runtime work for no-code execution in VS Code
+## The Prompty VS Code Extension
+Run Prompty files directly in VS Code. The Visual Studio Code extension offers an intuitive prompt playground inside VS Code to streamline the prompt engineering process. You can find the Prompty extension in the Visual Studio Code Marketplace.
+
+Download the [VS Code extension here](https://marketplace.visualstudio.com/items?itemName=ms-toolsai.prompty).
+
+## VS Code Extension Features
+### Quickly Create
+Quickly create a basic prompty file by right-clicking in the VS Code explorer and selecting "New Prompty."
+
+![Quick Create](image-2.png)
+
+### Preview
+Preview a prompty file much like markdown, with dynamic template rendering as you type, so you can see the exact prompt that will be sent to the model.
+
+![Preview](readme_preview.png)
+
+### Define and Switch Model Configurations
+* Define your model configurations directly in VS Code.
+* Quickly switch between different model configurations.
+
+![Define Configuration](image-5.png)
+
+![Switch Model Configuration](switchModelConfiguration.png)
+* Use VS Code settings to define model configurations at:
+  * User level, for use across different prompty files.
+  * Workspace level, to share with team members via Git.
+
+![Model Configuration Settings](modelConfigurationSettings.png)
+
+* We strongly encourage using Azure Active Directory authentication for enhanced security. Leave the `api_key` field empty to trigger AAD authentication.
+* OpenAI is also supported. You can store the key in VS Code settings or use `${env:xxx}` to read the API key from an environment variable.
+* You can put environment variables in a `.env` file located in the same folder as the prompty file or in the workspace root folder.
+* Alternatively, you can set the key as a system environment variable (via the Control Panel, zsh, or bash); follow [OpenAI's guide](https://help.openai.com/en/articles/5112595-best-practices-for-api-key-safety) for key safety, and restart VS Code to load the new values.
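For instance, assuming your prompty file reads the key via `${env:OPENAI_API_KEY}` (the variable name is illustrative), either setup might look like:

```shell
# Option A: a .env file next to the .prompty file (or at the workspace root)
echo 'OPENAI_API_KEY=<your-key>' >> .env

# Option B: an exported shell variable; restart VS Code afterwards so it
# picks up the new value
export OPENAI_API_KEY=<your-key>
```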
+
+### Quick Run
+Hit **F5** or click the **Run** button at the top. There are two output windows:
+* **Prompty Output** shows a concise view.
+
+![Prompty Output](image-3.png)
+
+* **Prompty Output (Verbose)** shows the detailed requests sent and received.
+
+![Prompty Output (Verbose)](image-8.png)

 ---
 [Want to Contribute To the Project?](/docs/contributing/) - _Updated Guidance Coming Soon_.

web/docs/guides/prompty-observability/page.mdx
+112 -3

@@ -4,16 +4,125 @@ authors:
 - bethanyjep
 - nitya
 - sethjuarez
-date: 2024-10-22
+date: 2025-03-06
 tags:
 - tutorials
 - runtime
 index: 4
 ---

-Get started with Observability at [debugging Prompty](/docs/getting-started/debugging-prompty), check out additional documentation at [Observability](https://github.com/microsoft/prompty/tree/main/runtime/prompty#using-tracing-in-prompty).
+## Using Tracing in Prompty
+
+Prompty supports tracing to help you understand the execution of your prompts. This functionality is customizable and can be used to trace the execution of your prompts in a way that makes sense to you. Prompty has two tracers built in: `console_tracer`, which writes the trace to the console, and `PromptyTracer`, which writes the trace to a JSON file. You can also create your own tracer by providing your own hook.
+
+```python
+import prompty
+# import the invoker for your model provider
+import prompty.azure
+from prompty.tracer import trace, Tracer, console_tracer, PromptyTracer
+
+# add console tracer
+Tracer.add("console", console_tracer)
+
+# add PromptyTracer
+json_tracer = PromptyTracer(output_dir="path/to/output")
+Tracer.add("PromptyTracer", json_tracer.tracer)
+
+# execute the prompt
+response = prompty.execute("path/to/prompty/file")
+
+print(response)
+```
+
+You can also bring your own tracer by writing your own tracing hook. The `console_tracer` is the simplest example of a tracer; it writes the trace to the console. This is what it looks like:
+
+```python
+@contextlib.contextmanager
+def console_tracer(name: str) -> Iterator[Callable[[str, Any], None]]:
+    try:
+        print(f"Starting {name}")
+        yield lambda key, value: print(f"{key}:\n{json.dumps(value, indent=4)}")
+    finally:
+        print(f"Ending {name}")
+```
+
+It uses a context manager to define the start and end of the trace so you can do whatever setup and teardown you need. The `yield` statement returns a function that you can use to write the trace. The `console_tracer` writes the trace to the console using the `print` function.
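As a concrete illustration of this hook contract, here is a minimal sketch of a custom tracer (a hypothetical `memory_tracer` that collects traces in a list; the manual `with` usage at the end only simulates what the prompty runtime would do for you):

```python
import contextlib
import json
from typing import Any, Callable, Iterator

# hypothetical in-memory tracer: same contract as console_tracer, but it
# appends each completed trace to a list instead of printing it
traces: list[dict] = []

@contextlib.contextmanager
def memory_tracer(name: str) -> Iterator[Callable[[str, Any], None]]:
    record: dict = {"name": name}
    try:
        # the yielded callable records one key/value pair into the trace
        yield lambda key, value: record.update({key: value})
    finally:
        # teardown: the finished record is stored once the block exits
        traces.append(record)

# driving the hook by hand to show the flow (normally the runtime enters
# the context; you would register it with Tracer.add("memory", memory_tracer))
with memory_tracer("example") as t:
    t("inputs", {"question": "hi"})

print(json.dumps(traces))
# -> [{"name": "example", "inputs": {"question": "hi"}}]
```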
+The `PromptyTracer` is a more complex example of a tracer that manages its internal state using a full class. Here's a simplified class-based example (`SimplePromptyTracer`) that writes each function trace to a JSON file:
+
+```python
+class SimplePromptyTracer:
+    def __init__(self, output_dir: str):
+        self.output_dir = output_dir
+
+    @contextlib.contextmanager
+    def tracer(self, name: str) -> Iterator[Callable[[str, Any], None]]:
+        trace = {}
+        try:
+            yield lambda key, value: trace.update({key: value})
+        finally:
+            with open(os.path.join(self.output_dir, f"{name}.json"), "w") as f:
+                json.dump(trace, f, indent=4)
+```
+
+The tracing mechanism is supported for all of the prompty runtime internals and can be used to trace the execution of the prompt along with all of the parameters. There is also a `@trace` decorator that can be used to trace the execution of any function external to the runtime. This is provided as a facility to trace the execution of the prompt and whatever supporting code you have.
+
+```python
+import prompty
+# import the invoker for your model provider
+import prompty.azure
+from prompty.tracer import trace, Tracer, PromptyTracer
+
+json_tracer = PromptyTracer(output_dir="path/to/output")
+Tracer.add("PromptyTracer", json_tracer.tracer)
+
+@trace
+def get_customer(customerId):
+    return {"id": customerId, "firstName": "Sally", "lastName": "Davis"}
+
+@trace
+def get_response(customerId, question, prompt):
+    customer = get_customer(customerId)
+
+    result = prompty.execute(
+        prompt,
+        inputs={"question": question, "customer": customer},
+    )
+    return {"question": question, "answer": result}
+```
+
+In this case, whenever this code is executed, a `.tracy` file will be created in the `path/to/output` directory. It will contain the trace of the execution of the `get_response` function, the execution of the `get_customer` function, and the prompty internals that generated the response.
+
+## OpenTelemetry Tracing
+You can add OpenTelemetry tracing to your application using the same hook mechanism. In your application, you might create something like `trace_span` to trace the execution of your prompts:
+
+```python
+from opentelemetry import trace as oteltrace
+
+_tracer = "prompty"
+
+@contextlib.contextmanager
+def trace_span(name: str):
+    tracer = oteltrace.get_tracer(_tracer)
+    with tracer.start_as_current_span(name) as span:
+        yield lambda key, value: span.set_attribute(
+            key, json.dumps(value).replace("\n", "")
+        )
+
+# add this hook to the prompty runtime
+Tracer.add("OpenTelemetry", trace_span)
+```
+
+This will produce spans during the execution of the prompt that can be sent to an OpenTelemetry collector for further analysis.
+
+To get started with Observability, refer to [debugging Prompty](/docs/getting-started/debugging-prompty) in the Getting Started section.


-TODO: Explain how to trace Prompty execution from input to response
 ---
 [Want to Contribute To the Project?](/docs/contributing/) - _Updated Guidance Coming Soon_.

web/docs/guides/prompty-runtime/page.mdx
+19 -1

@@ -11,7 +11,25 @@ tags:
 index: 2
 ---

-TODO: Explain how runtimes work and how to build a custom runtime
+## Using the Prompty Runtime
+The Python runtime is a simple way to run your prompts in Python. The runtime is available as a Python package and can be installed using pip. Depending on the type of prompt you are running, you may need to install additional dependencies. The runtime is designed to be extensible and can be customized to fit your needs.
+
+```bash
+pip install "prompty[azure]"
+```
+
+Simple usage example:
+
+```python
+import prompty
+# import the invoker for your model provider
+import prompty.azure
+
+# execute the prompt
+response = prompty.execute("path/to/prompty/file")
+
+print(response)
+```

 ---
 [Want to Contribute To the Project?](/docs/contributing/) - _Updated Guidance Coming Soon_.
