OpenAI agents docs #99

Open: wants to merge 3 commits into base: main

monitoring/introduction.mdx (3 changes: 1 addition & 2 deletions)

@@ -6,8 +6,7 @@ description: "Detect hallucinations and regressions in the quality of your LLMs"
One of the key features of Traceloop is the ability to monitor the quality of your LLM outputs. It helps you to detect hallucinations and regressions in the quality of your models and prompts.

To start monitoring your LLM outputs, make sure you installed OpenLLMetry and configured it to send data to Traceloop. If you haven't done that yet, you can follow the instructions in the [Getting Started](/openllmetry/getting-started) guide.
- Next, if you're not using a framework like LangChain or LlamaIndex, [make sure to annotate workflows and tasks](/openllmetry/tracing/decorators).
-
+ Next, if you're not using a [supported LLM framework](/openllmetry/tracing/supported#frameworks), [make sure to annotate workflows and tasks](/openllmetry/tracing/annotations).
You can then define any of the following [monitors](https://app.traceloop.com/monitors/prd) to track the quality of your LLM outputs.

<Frame>
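
For orientation, a minimal Python sketch of the prerequisite setup this page assumes, using the OpenLLMetry Python SDK (the app name is illustrative):

```python
# Minimal setup sketch for the paragraph above (install: pip install traceloop-sdk).
from traceloop.sdk import Traceloop

# Initializes OpenLLMetry and starts exporting traces to Traceloop.
# The API key is typically supplied via the TRACELOOP_API_KEY environment
# variable, as described in the Getting Started guide.
Traceloop.init(app_name="my_llm_app")
```

Once traces arrive with annotated workflows and tasks, the monitors defined in the Traceloop UI have named units of work to attach to.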

openllmetry/getting-started-nextjs.mdx (2 changes: 1 addition & 1 deletion)

@@ -175,7 +175,7 @@ Assume you have a function that renders a prompt and calls an LLM, simply wrap i
We also have compatible Typescript decorators for class methods which are more convenient.

<Tip>
- If you're using an LLM framework like Haystack, Langchain or LlamaIndex -
+ If you're using a [supported LLM framework](/openllmetry/tracing/supported#frameworks) -
we'll do that for you. No need to add any annotations to your code.
</Tip>

openllmetry/getting-started-python.mdx (2 changes: 1 addition & 1 deletion)

@@ -58,7 +58,7 @@ Assume you have a function that renders a prompt and calls an LLM, simply add `@
</Warning>

<Tip>
- If you're using an LLM framework like Haystack, Langchain or LlamaIndex -
+ If you're using a [supported LLM framework](/openllmetry/tracing/supported#frameworks) -
we'll do that for you. No need to add any annotations to your code.
</Tip>
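
For reference, a minimal sketch of the pattern this getting-started page describes, assuming the `workflow` decorator from the Python SDK and an illustrative OpenAI call:

```python
from openai import OpenAI
from traceloop.sdk import Traceloop
from traceloop.sdk.decorators import workflow

Traceloop.init(app_name="joke_app")
client = OpenAI()

# Annotating the function groups the prompt rendering and the LLM call
# under a single named workflow span in the trace.
@workflow(name="suggest_joke")
def suggest_joke(topic: str) -> str:
    prompt = f"Tell a short joke about {topic}."
    completion = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": prompt}],
    )
    return completion.choices[0].message.content


print(suggest_joke("observability"))
```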

openllmetry/getting-started-ts.mdx (2 changes: 1 addition & 1 deletion)

@@ -73,7 +73,7 @@ Assume you have a function that renders a prompt and calls an LLM, simply wrap i
We also have compatible Typescript decorators for class methods which are more convenient.

<Tip>
- If you're using an LLM framework like Haystack, Langchain or LlamaIndex -
+ If you're using a [supported LLM framework](/openllmetry/tracing/supported#frameworks) -
we'll do that for you. No need to add any annotations to your code.
</Tip>

openllmetry/introduction.mdx (2 changes: 1 addition & 1 deletion)

@@ -12,7 +12,7 @@ Tracing is done in a non-intrusive way, built on top of OpenTelemetry.
You can choose to export the traces to Traceloop, or to your existing observability stack.

<Tip>
- You can use OpenLLMetry whether you use a framework like LangChain, or
+ You can use OpenLLMetry whether you use a [supported LLM framework](/openllmetry/tracing/supported#frameworks), or
directly interact with a foundation model API.
</Tip>
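
Because tracing is built on OpenTelemetry, traces can go to an existing observability stack instead of Traceloop. A sketch of that, assuming the SDK's `init` accepts a standard OpenTelemetry span exporter (check the SDK docs for the exact configuration surface):

```python
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from traceloop.sdk import Traceloop

# Route spans to any OTLP/HTTP-compatible backend, e.g. a local
# OpenTelemetry Collector, instead of the Traceloop platform.
Traceloop.init(
    app_name="my_llm_app",
    exporter=OTLPSpanExporter(endpoint="http://localhost:4318/v1/traces"),
)
```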

openllmetry/tracing/annotations.mdx (2 changes: 1 addition & 1 deletion)

@@ -11,7 +11,7 @@ description: "Enrich your traces by annotating chains and workflows in your app"
Traceloop SDK supports several ways to annotate workflows, tasks, agents and tools in your code to get a more complete picture of your app structure.

<Tip>
- If you're using a framework like Langchain, Haystack or LlamaIndex - no need
+ If you're using a [supported LLM framework](/openllmetry/tracing/supported#frameworks) - no need
to do anything! OpenLLMetry will automatically detect the framework and
annotate your traces.
</Tip>
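
To illustrate the annotations this page covers, a small sketch using the Python SDK decorators (function names and nesting are illustrative; the `agent` and `tool` decorators follow the same pattern):

```python
from traceloop.sdk import Traceloop
from traceloop.sdk.decorators import task, workflow

Traceloop.init(app_name="summarizer")

@task(name="render_prompt")
def render_prompt(topic: str) -> str:
    return f"Write a one-line summary about {topic}."

@task(name="call_llm")
def call_llm(prompt: str) -> str:
    # Call your model of choice here; a constant stands in for the response.
    return f"(model response to: {prompt})"

# The workflow span groups the task spans beneath it in the resulting trace.
@workflow(name="summarize_topic")
def summarize_topic(topic: str) -> str:
    return call_llm(render_prompt(topic))
```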

openllmetry/tracing/supported.mdx (1 change: 1 addition & 0 deletions)

@@ -50,3 +50,4 @@ In the meantime, you can still use OpenLLMetry to report the [LLM and vector DB
| [Haystack by deepset](https://haystack.deepset.ai/) | ✅ | ❌ |
| [Langchain](https://www.langchain.com/) | ✅ | ✅ |
| [LlamaIndex](https://www.llamaindex.ai/) | ✅ | ✅ |
+ | [OpenAI Agents](https://github.com/openai/openai-agents-python) | ✅ | ❌ |
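
Given the OpenAI Agents row added above, a hedged sketch of what tracing an agent run might look like, assuming the standard `openai-agents` API and that initializing the SDK is enough to pick up the framework, as with the other supported frameworks:

```python
from agents import Agent, Runner
from traceloop.sdk import Traceloop

# Assumption: initializing OpenLLMetry auto-instruments the agent run,
# so no manual workflow/task annotations are needed.
Traceloop.init(app_name="agents_demo")

agent = Agent(
    name="Haiku bot",
    instructions="You respond only in haikus.",
)

result = Runner.run_sync(agent, "Write a haiku about tracing.")
print(result.final_output)
```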