web/docs/contributing/page.mdx (+12 −3)

@@ -4,7 +4,7 @@ index: 5
 authors:
   - bethanyjep
   - nitya
-date: 2024-10-09
+date: 2025-03-10
 tags:
   - documentation
   - contributing
@@ -23,7 +23,16 @@ Feedback can come in several forms:
 - Let us know if you found errors or ambiguity in the documentation
 - Report bugs or inconsistent behavior seen with Prompty tools and usage

-The easiest way to give us feedback is by [filing an issue](https://github.com/microsoft/prompty/issues/new?template=Blank+issue). **Please check previously logged issues (open and closed) to make sure the topic or bug has not already been raised.** If it does exist, weigh in on that discussion thread to add any additional context of value.
+The easiest way to give us feedback is by [filing an issue](https://github.com/microsoft/prompty/issues/new). **Please check previously logged issues (open and closed) to make sure the topic or bug has not already been raised.** If one does exist, weigh in on that discussion thread to add any additional context of value.

 ## Contributor guidelines

The repository contains both the code and the documentation for the project. Each requires a different set of tools and processes to build and preview outcomes.

## Pull Request Guidelines

When submitting a pull request (PR) to the Prompty repository, please follow these guidelines:

- Fork the repository: always fork the repository to your own account before making your modifications.
- Separate pull requests (PRs):
  - Submit each type of change in its own pull request. For example, bug fixes and documentation updates should be submitted in separate PRs.
  - Typo fixes and minor documentation updates can be combined into a single PR where appropriate.
- Handle merge conflicts: if your pull request shows merge conflicts, update your local main branch to mirror the main repository before making your modifications.
web/docs/getting-started/debugging-prompty/page.mdx (+2 −2)

@@ -3,7 +3,7 @@ title: Debugging Prompty
 authors:
   - bethanyjep
   - nitya
-date: 2024-10-22
+date: 2025-03-06
 tags:
   - getting-started
   - documentation
@@ -24,7 +24,7 @@ For observability in Prompty, we will use the tracer to visualize and debug the
 - Analyze the trace output to debug and fix the bug

-## 2. Understandfing Observability in Prompty
+## 2. Understanding Observability in Prompty

 Observability refers to the ability to monitor and understand the behavior of a system based on the data it produces, such as logs, metrics, and traces. It is important as it gives you insight into your LLM workflows and provides a way to debug your prompt inputs and outputs.
web/docs/guides/prompty-extension/page.mdx (+43 −1)

@@ -11,7 +11,49 @@ tags:
 index: 3
 ---

-TODO: How does Prompty runtime work for no-code execution in VS Code
## The Prompty VS Code Extension

Run Prompty files directly in VS Code. This Visual Studio Code extension offers an intuitive prompt playground within VS Code to streamline the prompt engineering process. You can find the Prompty extension in the Visual Studio Code Marketplace.

Download the [VS Code extension here](https://marketplace.visualstudio.com/items?itemName=ms-toolsai.prompty).

## VS Code Extension Features

### Quickly Create

Quickly create a basic prompty by right-clicking in the VS Code explorer and selecting "New Prompty."

### Preview

Preview prompty files similar to markdown, with dynamic template rendering while typing, so you can see the prompt that will be sent to the model.

### Define and Switch Model Configurations

* Define your model configurations directly in VS Code.
* Quickly switch between different model configurations.
* Use VS Code settings to define model configurations at:
  * User level, for use across different prompty files.
  * Workspace level, to share with team members via Git.
* We strongly encourage using Azure Active Directory authentication for enhanced security. Leave the `api_key` empty to trigger AAD auth.
* OpenAI is also supported. You can store the key in VS Code settings or use `${env:xxx}` to read the API key from an environment variable.
  * You can put environment variables in a `.env` file, either in the same folder as the prompty file or in the workspace root folder.
  * Alternatively, you can set the key as a system environment variable (via Control Panel, zsh, or bash); follow [OpenAI's Guide](https://help.openai.com/en/articles/5112595-best-practices-for-api-key-safety) for key safety, and restart VS Code to load the new values.
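For reference, these settings correspond to the model `configuration` block in a prompty file's frontmatter. A hypothetical Azure OpenAI example (the endpoint variable and deployment name are placeholders, not values from this documentation) that reads the endpoint from an environment variable and omits `api_key` so AAD auth is used:

```yaml
model:
  api: chat
  configuration:
    type: azure_openai
    azure_endpoint: ${env:AZURE_OPENAI_ENDPOINT}
    azure_deployment: gpt-4o
    # api_key omitted: leaving it empty triggers Azure Active Directory auth
```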
### Quick Run

Hit **F5** or click the **Run** button at the top. There are two output windows:

* **Prompty Output** shows a concise view.
* **Prompty Output (Verbose)** shows the detailed requests sent and received.

---
[Want to Contribute To the Project?](/docs/contributing/) - _Updated Guidance Coming Soon_.
web/docs/guides/prompty-observability/page.mdx (+112 −3)

@@ -4,16 +4,125 @@ authors:
   - bethanyjep
   - nitya
   - sethjuarez
-date: 2024-10-22
+date: 2025-03-06
 tags:
   - tutorials
   - runtime
 index: 4
 ---

-Get started with Observability at [debugging Prompty](/docs/getting-started/debugging-prompty), check out additional documentation at [Observability](https://github.com/microsoft/prompty/tree/main/runtime/prompty#using-tracing-in-prompty).
## Using Tracing in Prompty

Prompty supports tracing to help you understand the execution of your prompts. This functionality is customizable and can be used to trace the execution of your prompts in a way that makes sense to you. Prompty has two default tracers built in: `console_tracer` and `PromptyTracer`. The `console_tracer` writes the trace to the console, and the `PromptyTracer` writes the trace to a JSON file. You can also create your own tracer by creating your own hook.

```python
import prompty
# import invoker
import prompty.azure
from prompty.tracer import trace, Tracer, console_tracer, PromptyTracer

# add the console tracer
Tracer.add("console", console_tracer)

# add the JSON tracer (writes each function trace to a JSON file)
json_tracer = PromptyTracer(output_dir="path/to/output")
Tracer.add("PromptyTracer", json_tracer.tracer)

# execute the prompty file with tracing enabled
response = prompty.execute("path/to/prompty/file")
print(response)
```
You can also bring your own tracer by creating your own tracing hook. The `console_tracer` is the simplest example of a tracer. It uses a context manager to define the start and end of the trace, so you can do whatever setup and teardown you need; the `yield` statement returns a function that you can use to write the trace. The `console_tracer` writes each entry to the console using the `print` function.
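The hook contract itself is plain Python: a context manager that yields a "write" function. A minimal, self-contained sketch of this pattern (independent of the Prompty package; it collects entries in a list so the behavior is easy to inspect):

```python
import contextlib

records = []

@contextlib.contextmanager
def list_tracer(name: str):
    # setup: runs when the traced frame starts
    print(f"start: {name}")
    # yield the function the runtime calls to record key/value pairs
    yield lambda key, value: records.append((name, key, value))
    # teardown: runs when the traced frame exits
    print(f"end: {name}")

# how a runtime would drive the hook (simplified)
with list_tracer("get_response") as add_trace:
    add_trace("inputs", {"question": "What is Prompty?"})

print(records)
```

The same shape works for the console tracer: replace the `append` with a `print` and the setup/teardown with whatever bookkeeping you need.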
The `PromptyTracer` is a more complex example of a tracer that manages its internal state using a full class. Here's a simplified sketch of the class-based approach, writing each function trace to a JSON file:

```python
import contextlib
import json
import os

class SimplePromptyTracer:
    def __init__(self, output_dir: str):
        self.output_dir = output_dir

    @contextlib.contextmanager
    def tracer(self, name: str):
        trace = {}
        yield lambda key, value: trace.update({key: value})
        with open(os.path.join(self.output_dir, f"{name}.json"), "w") as f:
            json.dump(trace, f, indent=4)
```
The tracing mechanism is supported for all of the Prompty runtime internals and can be used to trace the execution of the prompt along with all of the parameters. There is also a `@trace` decorator that can be used to trace the execution of any function external to the runtime. This is provided as a facility to trace the execution of the prompt and whatever supporting code you have.

```python
import prompty
# import invoker
import prompty.azure
from prompty.tracer import trace, Tracer, PromptyTracer

json_tracer = PromptyTracer(output_dir="path/to/output")
Tracer.add("PromptyTracer", json_tracer.tracer)

@trace
def get_customer(customerId):
    # supporting code that is traced alongside the prompty internals
    return {"id": customerId}

@trace
def get_response(customerId, prompt):
    customer = get_customer(customerId)
    return prompty.execute(prompt, inputs={"customer": customer})
```
In this case, whenever this code is executed, a `.tracy` file will be created in the `path/to/output` directory. This file will contain the trace of the execution of the `get_response` function, the execution of the `get_customer` function, and the prompty internals that generated the response.
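Stripped of the runtime, the core mechanics of such a decorator can be sketched in plain Python (the names here are illustrative, not the Prompty implementation):

```python
import functools

traces = []

def trace(func):
    """Record the name, arguments, and result of each call to the wrapped function."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        result = func(*args, **kwargs)
        traces.append({"name": func.__name__, "args": args, "result": result})
        return result
    return wrapper

@trace
def get_customer(customerId):
    return {"id": customerId}

get_customer(42)
print(traces)
```

The real decorator records into whichever tracer hooks are registered, but the wrap-call-record shape is the same.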
## OpenTelemetry Tracing

You can add OpenTelemetry tracing to your application using the same hook mechanism. In your application, you might create something like `trace_span` to trace the execution of your prompts:
```python
import contextlib
import json

from opentelemetry import trace as oteltrace
from prompty.tracer import Tracer

_tracer = "prompty"

@contextlib.contextmanager
def trace_span(name: str):
    tracer = oteltrace.get_tracer(_tracer)
    with tracer.start_as_current_span(name) as span:
        yield lambda key, value: span.set_attribute(
            key, json.dumps(value).replace("\n", "")
        )

# adding this hook to the prompty runtime
Tracer.add("OpenTelemetry", trace_span)
```
This will produce spans during the execution of the prompt that can be sent to an OpenTelemetry collector for further analysis.

To get started with observability, refer to [debugging Prompty](/docs/getting-started/debugging-prompty) in the Getting Started section.
-TODO: Explain how to trace Prompty execution from input to response

---
[Want to Contribute To the Project?](/docs/contributing/) - _Updated Guidance Coming Soon_.
web/docs/guides/prompty-runtime/page.mdx (+19 −1)

@@ -11,7 +11,25 @@ tags:
 index: 2
 ---

-TODO: Explain how runtimes work and how to build a custom runtime

## Using this Prompty Runtime

The Python runtime is a simple way to run your prompts in Python. The runtime is available as a Python package and can be installed using pip. Depending on the type of prompt you are running, you may need to install additional dependencies. The runtime is designed to be extensible and can be customized to fit your needs.