Commit 6551d57

Merge branch 'main' into langchain

2 parents 8581e9e + 53920e3

155 files changed: +10321 −825 lines


.cz.toml

Lines changed: 1 addition & 1 deletion

@@ -4,7 +4,7 @@ tag_format = "v$version"
 version_scheme = "pep440"
 major_version_zero = true
 update_changelog_on_bump = true
-version = "0.49.1"
+version = "0.49.7"
 version_files = [
     "packages/opentelemetry-instrumentation-mcp/pyproject.toml:^version",
     "packages/opentelemetry-instrumentation-mcp/opentelemetry/instrumentation/mcp/version.py",
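The hunk above is the version bump that commitizen writes on release: the `version` field in `.cz.toml` changes, and the same substitution is applied to every file matched by the patterns under `version_files`. A minimal sketch of that kind of rewrite — the `bump_version` helper here is hypothetical, not commitizen's actual implementation:

```python
import re

def bump_version(text: str, new_version: str) -> str:
    """Rewrite a `version = "..."` assignment to the new version,
    the way a commitizen-style bump updates its version_files."""
    return re.sub(
        r'(version\s*=\s*")[^"]+(")',
        rf"\g<1>{new_version}\g<2>",
        text,
    )

config = 'version = "0.49.1"'
print(bump_version(config, "0.49.7"))  # version = "0.49.7"
```

Commitizen applies this across every listed file in one commit, which is why a single bump touches both `.cz.toml` and each package's `version.py`.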

.github/workflows/ci.yml

Lines changed: 1 addition & 0 deletions

@@ -74,6 +74,7 @@ jobs:
      - run: npm cache clean --force || true
      - run: npx nx affected -t install --with dev
      - run: npx nx affected -t lint --parallel=3
+     - run: npx nx affected -t type-check --parallel=3

  build-packages:
    name: Build Packages

CHANGELOG.md

Lines changed: 45 additions & 0 deletions

@@ -1,3 +1,48 @@
+## v0.49.7 (2025-12-08)
+
+### Fix
+
+- **exp**: Add a real agent example (#3507)
+- **evals**: Add agent evaluators to made by traceloop (#3505)
+- **exp**: Add made by traceloop evaluators (#3503)
+- **traceloop-sdk**: Fixes gRPC exporter initialisation with insecure OTLP (#3481)
+
+## v0.49.6 (2025-12-01)
+
+### Fix
+
+- **agno**: add streaming support for Agent.run() and Agent.arun() (#3483)
+
+## v0.49.5 (2025-11-27)
+
+### Fix
+
+- **openai**: responses instrumentation broken traces for async streaming (#3475)
+- **mcp**: remove faulty logic of trying to deduce HTTP errors (#3477)
+
+## v0.49.4 (2025-11-27)
+
+### Fix
+
+- **exp**: Add run in github experiment (#3459)
+
+## v0.49.3 (2025-11-26)
+
+### Fix
+
+- **openai**: recognize NOT_GIVEN and Omit (#3473)
+- **dataset**: add support for file cells in datasets with upload and external URL linking capabilities (#3462)
+- **openai**: report request attributes in responses API instrumentation (#3471)
+- **sdk**: crewai tracing provider conflict (#3470)
+- **sdk**: watsonx warning on initialization (#3469)
+- **traceloop-sdk**: add type-checking support with mypy (#3463)
+
+## v0.49.2 (2025-11-25)
+
+### Fix
+
+- **sdk**: remove posthog (#3466)
+
 ## v0.49.1 (2025-11-24)

 ### Fix
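One changelog entry above, "**traceloop-sdk**: Fixes gRPC exporter initialisation with insecure OTLP (#3481)", concerns choosing between a TLS and a plaintext gRPC channel for the OTLP exporter. As an illustration only, a common way to make that choice is from the endpoint's URL scheme — the `grpc_insecure` helper below is hypothetical and not the SDK's actual code:

```python
from urllib.parse import urlparse

def grpc_insecure(endpoint: str) -> bool:
    """Decide whether a gRPC OTLP exporter should open an insecure
    (plaintext) channel, based on the endpoint scheme.
    Hypothetical helper, not the traceloop-sdk's actual logic."""
    scheme = urlparse(endpoint).scheme
    # Plain http (or a bare host:port with no scheme) -> plaintext channel;
    # https -> TLS.
    return scheme != "https"

print(grpc_insecure("http://localhost:4317"))      # True
print(grpc_insecure("https://api.traceloop.com"))  # False
```

Getting this decision wrong typically surfaces as a failed TLS handshake against a plaintext collector port, which is the kind of initialisation failure a fix like #3481 would address.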

README.md

Lines changed: 20 additions & 20 deletions

@@ -117,53 +117,53 @@ See [our docs](https://traceloop.com/docs/openllmetry/integrations/exporting) fo

 OpenLLMetry can instrument everything that [OpenTelemetry already instruments](https://github.com/open-telemetry/opentelemetry-python-contrib/tree/main/instrumentation) - so things like your DB, API calls, and more. On top of that, we built a set of custom extensions that instrument things like your calls to OpenAI or Anthropic, or your Vector DB like Chroma, Pinecone, Qdrant or Weaviate.

-- [OpenAI / Azure OpenAI](https://openai.com/)
+- [Aleph Alpha](https://www.aleph-alpha.com/)
 - [Anthropic](https://www.anthropic.com/)
-- [Cohere](https://cohere.com/)
-- [Ollama](https://ollama.com/)
-- [Mistral AI](https://mistral.ai/)
-- [HuggingFace](https://huggingface.co/)
 - [Bedrock (AWS)](https://aws.amazon.com/bedrock/)
-- [SageMaker (AWS)](https://aws.amazon.com/sagemaker/)
-- [Replicate](https://replicate.com/)
-- [Vertex AI (GCP)](https://cloud.google.com/vertex-ai)
+- [Cohere](https://cohere.com/)
 - [Google Generative AI (Gemini)](https://ai.google/)
+- [Groq](https://groq.com/)
+- [HuggingFace](https://huggingface.co/)
 - [IBM Watsonx AI](https://www.ibm.com/watsonx)
+- [Mistral AI](https://mistral.ai/)
+- [Ollama](https://ollama.com/)
+- [OpenAI / Azure OpenAI](https://openai.com/)
+- [Replicate](https://replicate.com/)
+- [SageMaker (AWS)](https://aws.amazon.com/sagemaker/)
 - [Together AI](https://together.xyz/)
-- [Aleph Alpha](https://www.aleph-alpha.com/)
-- [Groq](https://groq.com/)
+- [Vertex AI (GCP)](https://cloud.google.com/vertex-ai)
 - [WRITER](https://writer.com/)

 ### Vector DBs

 - [Chroma](https://www.trychroma.com/)
+- [LanceDB](https://lancedb.com/)
+- [Marqo](https://marqo.ai/)
+- [Milvus](https://milvus.io/)
 - [Pinecone](https://www.pinecone.io/)
 - [Qdrant](https://qdrant.tech/)
 - [Weaviate](https://weaviate.io/)
-- [Milvus](https://milvus.io/)
-- [Marqo](https://marqo.ai/)
-- [LanceDB](https://lancedb.com/)

 ### Frameworks

+- [Agno](https://github.com/agno-agi/agno)
+- [AWS Strands](https://strandsagents.com/) (built-in OTEL support)
+- [CrewAI](https://docs.crewai.com/introduction)
+- [Haystack](https://haystack.deepset.ai/integrations/traceloop)
 - [LangChain](https://python.langchain.com/docs/introduction/)
+- [Langflow](https://docs.langflow.org/)
 - [LangGraph](https://langchain-ai.github.io/langgraph/concepts/why-langgraph/)
-- [LlamaIndex](https://docs.llamaindex.ai/en/stable/module_guides/observability/observability.html#openllmetry)
-- [Haystack](https://haystack.deepset.ai/integrations/traceloop)
 - [LiteLLM](https://docs.litellm.ai/docs/observability/opentelemetry_integration)
-- [CrewAI](https://docs.crewai.com/introduction)
+- [LlamaIndex](https://docs.llamaindex.ai/en/stable/module_guides/observability/observability.html#openllmetry)
 - [OpenAI Agents](https://openai.github.io/openai-agents-python/)
-- [Langflow](https://docs.langflow.org/)

 ### Protocol

 - [MCP](https://modelcontextprotocol.io/)

 ## 🔎 Telemetry

-The SDK provided with OpenLLMetry (not the instrumentations) contains a telemetry feature that collects **anonymous** usage information.
-
-You can opt out of telemetry by setting the `TRACELOOP_TELEMETRY` environment variable to `FALSE`, or passing `telemetry_enabled=False` to the `Traceloop.init()` function.
+We no longer log or collect any telemetry in the SDK or in the instrumentations. Make sure to bump to v0.49.2 and above.

 ### Why we collect telemetry