The following sections help you set up and use tracing, monitoring, and observability features:
LangSmith works with many frameworks and providers. Browse the available integrations to connect your stack, including OpenAI, Anthropic, CrewAI, Vercel AI SDK, Pydantic AI, and more.

Set up tracing

Configure tracing with basic options, framework integrations, or advanced settings for full control.
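As a minimal sketch of the basic option, tracing is typically enabled through environment variables before running your application. The variable names below follow LangSmith's documented configuration; verify them against the setup guide for your SDK version, and note that the project name shown is a placeholder:

```shell
# Enable tracing and authenticate with LangSmith
export LANGSMITH_TRACING=true
export LANGSMITH_API_KEY="<your-api-key>"

# Optional: group traces under a named project instead of "default"
export LANGSMITH_PROJECT="my-app"
```

With these set, supported SDKs and framework integrations pick up the configuration automatically; no code changes are required for the basic setup.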

View traces

Access and manage traces via UI or API with filtering, exporting, sharing, and comparison tools.

Monitor performance

Create dashboards and set alerts to track performance and get notified when issues arise.

Configure automations

Use rules, webhooks, and online evaluations to streamline observability workflows.

Collect feedback

Gather and manage annotations on outputs using queues and inline annotation.

Trace a RAG app

Follow a step-by-step tutorial to trace a Retrieval-Augmented Generation application from start to finish.
For terminology definitions and core concepts, refer to Observability concepts.
Use Polly, LangSmith’s AI assistant, to analyze traces and get AI-powered insights into your application’s performance.
To set up a LangSmith instance, visit the Platform setup section and choose between cloud, hybrid, and self-hosted deployments. All options include observability, evaluation, prompt engineering, and deployment.