sec: relax fastapi upper bound + floor-pin tornado (CVE-2025-62727, CVE-2026-31958)#289
scale-ballen wants to merge 13 commits into main
Conversation
Remove the `fastapi<0.116` constraint so consumers can resolve `fastapi>=0.130`, which dropped the `starlette<0.47` upper bound, enabling `starlette>=0.49.1` (fixes CVE-2025-62727). Add a `tornado>=6.5.5` floor to fix CVE-2026-31958.

uv.lock: fastapi 0.115.14 → 0.135.2, starlette 0.46.2 → 1.0.0, tornado 6.5.2 → 6.5.5

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
All `*Params` models require an `agent: Agent` field; `SendEventParams` uses `event` (not `message`) and `SendMessageParams` uses `content` (not `message`). Test fixtures were written against an older schema and have been failing since the repo was initialized.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
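The corrected field shapes can be sketched with simplified stand-ins (plain dataclasses here; the real models are pydantic, and `Agent`'s fields are an assumption):

```python
from dataclasses import dataclass

@dataclass
class Agent:
    name: str  # illustrative; the real Agent model has more fields

@dataclass
class SendEventParams:
    agent: Agent   # required on all *Params models
    event: str     # `event`, not `message`

@dataclass
class SendMessageParams:
    agent: Agent
    content: str   # `content`, not `message`

params = SendMessageParams(agent=Agent(name="demo"), content="hello")
```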
starlette 1.0.0 introduced a regression in BaseHTTPMiddleware that broke streaming responses (StreamingResponse through call_next). The tutorial integration test (test-00_sync/010_multiturn) reproduced this as SendMessageResponse.result=None on message/send calls.

Add an explicit `starlette>=0.49.1,<1.0.0` pin:
- `>=0.49.1` preserves the CVE-2025-62727 fix
- `<1.0.0` keeps BaseHTTPMiddleware streaming behaviour stable

fastapi 0.135.2 requires starlette>=0.46.0 (no upper bound), which 0.52.1 satisfies. uv.lock: starlette 1.0.0 → 0.52.1.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
BaseHTTPMiddleware.call_next() silently buffers or drops StreamingResponse bodies in several starlette versions. Since message_send_wrapper always returns an AsyncGenerator (wrapped in StreamingResponse), the Agentex server proxy received result=null for every message/send call through that path.

Replace RequestIDMiddleware with a pure ASGI implementation that sets the request-ID context variable and passes scope/receive/send through unmodified, never touching the response body. This unblocks all message/send tutorial integration tests (010_multiturn, 020_streaming, 030_langgraph).

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
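A minimal sketch of such a pure ASGI middleware (the contextvar name and header handling are assumptions, not the exact SDK code):

```python
import uuid
from contextvars import ContextVar

ctx_var_request_id: ContextVar[str] = ContextVar("request_id", default="")

class RequestIDMiddleware:
    """Pure ASGI middleware: records the request ID in a context variable
    and forwards scope/receive/send untouched, so streaming bodies are
    never buffered or re-wrapped."""

    def __init__(self, app):
        self.app = app

    async def __call__(self, scope, receive, send):
        if scope["type"] == "http":
            headers = dict(scope.get("headers", []))
            raw = headers.get(b"x-request-id", b"")
            request_id = raw.decode("latin-1") if raw else uuid.uuid4().hex
            ctx_var_request_id.set(request_id)
        # No call_next(), no response wrapping: the body streams through as-is.
        await self.app(scope, receive, send)
```

Because the middleware never intercepts `send`, a `StreamingResponse` produced downstream reaches the client chunk by chunk exactly as the handler emitted it.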
In `src/agentex/lib/sdk/fastacp/base/base_acp_server.py` (line 62):

```python
if scope["type"] == "http":
    headers = dict(scope.get("headers", []))
    raw_request_id = headers.get(b"x-request-id", b"")
    request_id = raw_request_id.decode() if raw_request_id else uuid.uuid4().hex
```
decode() can raise UnicodeDecodeError on malformed headers
raw_request_id.decode() uses UTF-8 by default. If a client (or an upstream proxy) sends a non-UTF-8 byte sequence in the x-request-id header, this will raise an unhandled UnicodeDecodeError that propagates through the ASGI stack, causing a 500 for the request.
The HTTP/1.1 spec (RFC 7230) specifies that header field values are ISO-8859-1 / Latin-1, so decode('latin-1') is both spec-correct and will never raise an exception (every byte sequence is valid Latin-1).
```suggestion
request_id = raw_request_id.decode("latin-1") if raw_request_id else uuid.uuid4().hex
```
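A quick standalone check of the two decodings (demonstration only, not repo code):

```python
# Any byte sequence decodes under latin-1; invalid UTF-8 raises instead.
raw = b"\xff\xfeid-123"   # not valid UTF-8

try:
    raw.decode()           # default UTF-8
    utf8_ok = True
except UnicodeDecodeError:
    utf8_ok = False

assert utf8_ok is False
assert raw.decode("latin-1") == "\xff\xfeid-123"  # never raises
```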
The <1.0.0 ceiling was added to avoid a BaseHTTPMiddleware + StreamingResponse regression in starlette 1.0.0. Since RequestIDMiddleware is now a pure ASGI middleware with no call_next() involvement, that regression no longer applies. Removing the cap lets consumers reach any starlette 1.x CVE fix without needing an explicit bump here.

uv.lock re-resolves starlette 0.52.1 → 1.0.0.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
Tutorial agents were failing with `ModuleNotFoundError: No module named 'opentelemetry.sdk'` because `opentelemetry-sdk` was an accidental transitive dep of `ddtrace` in the old resolution graph. Removing the starlette<1.0.0 cap changed uv's resolution for tutorial `uv run --with` environments, exposing the missing explicit dependency. Declaring the `[opentelemetry]` extra on temporalio makes the dependency explicit so all environments resolve it correctly. Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
Four tutorial test files imported `uuid` via:

```python
from agentex.lib.sdk.fastacp.base.base_acp_server import uuid
```

This triggered the full fastacp import chain at test collection time:

```
fastacp → types.fastacp → core.clients.temporal.utils
→ temporalio.contrib.openai_agents → opentelemetry.sdk
```

The test runner installs agentex-sdk from PyPI (not the built wheel), so the published version's temporalio lacks the [opentelemetry] extra, causing ModuleNotFoundError at test collection time.

Fix: use `import uuid` from the standard library directly.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
Using the opentelemetry extra on the temporalio specifier caused uv to resolve a different dependency graph in tutorial environments (`uv run --with wheel`), which broke the `_JSONTypeConverterUnhandled` import used by scale-gp at worker startup time.

Listing opentelemetry-api and opentelemetry-sdk as explicit direct dependencies keeps the same packages in the wheel while letting temporalio resolve independently in tutorial envs, matching the pre-existing resolution that scale-gp expects.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
This private symbol was used only as a return type annotation. In some resolved versions of temporalio (within the >=1.18.2,<2 range), this private name is not exported, causing an ImportError at worker startup time in tutorial environments. The sentinel value JSONTypeConverter.Unhandled (public API) is what the method actually returns; the annotation is simplified to Any. Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
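The shape of the change can be sketched with a stand-in sentinel (names simplified; the real public sentinel is `JSONTypeConverter.Unhandled` from temporalio):

```python
from typing import Any

class _Sentinel:
    """Stand-in for temporalio's public JSONTypeConverter.Unhandled sentinel."""

UNHANDLED = _Sentinel()

# Before: -> Union[Any, _JSONTypeConverterUnhandled]  (private name; not
# exported in every temporalio release in the >=1.18.2,<2 range)
# After: annotate with Any and return the public sentinel value.
def to_typed_value(hint: type, value: object) -> Any:
    if hint is int:
        return int(value)   # a type this converter knows how to handle
    return UNHANDLED        # defer to the next converter in the chain
```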
- 010_multiturn/acp.py: append the current user message to input_list before Runner.run(); without this, Runner.run() was called with an empty list (history excludes the current turn), causing an OpenAI API error for missing input
- 020_streaming/acp.py: same fix for Runner.run_streamed(); also add `return` after yielding the no-API-key error message so the generator does not fall through
- 030_langgraph/graph.py: change MODEL_NAME from "gpt-5" (invalid) to "gpt-4o"; remove the unsupported `reasoning` kwarg from the ChatOpenAI constructor

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
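The first bullet's fix amounts to building the runner input from history plus the current turn; a hypothetical sketch (helper name and message shape are assumptions):

```python
def build_input_list(history: list[dict], user_message: str) -> list[dict]:
    """History excludes the current turn, so the new user message must be
    appended before the runner is invoked."""
    input_list = list(history)  # copy: don't mutate the stored history
    input_list.append({"role": "user", "content": user_message})
    return input_list

# Without the append, a first-turn call would pass an empty list,
# which the OpenAI API rejects as missing input.
```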
- run_agent_test.sh: pass the built wheel to pytest so tests use the local SDK instead of the PyPI version (fixes send_message NDJSON pydantic errors)
- agents.py: rewrite send_message (sync + async) to use streaming internally and handle NDJSON responses from the FastACP server
- agent_rpc_response.py: make SendMessageStreamResponse.result Optional to handle null result in streaming done events
- hooks.py: fix execute_activity_method → execute_activity for function-based Temporal activities (fixes tool_request not appearing)
- Tutorial handlers: add missing return after no-API-key sorry messages to prevent fall-through to LLM calls (010_multiturn, 020_streaming, 030_tracing, 040_other_sdks, 010_agent_chat, 050_agent_chat_guardrails)
- 010_agent_chat/workflow.py: fix gpt-5 → gpt-4o, remove invalid reasoning params (only valid for o-series models)
- 080_human_in_the_loop/workflow.py: guard span.output access against None
- test_agent.py (010_multiturn): add sleep for async state init race

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
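The NDJSON handling in the rewritten `send_message` can be sketched as follows (function name and frame shape are illustrative assumptions, not the SDK's actual client code):

```python
import json
from typing import Iterable, Optional

def last_result_frame(lines: Iterable[str]) -> Optional[dict]:
    """Scan newline-delimited JSON frames from a streaming message/send
    response and keep the last frame carrying a non-null result."""
    final = None
    for line in lines:
        line = line.strip()
        if not line:          # skip keep-alive blank lines
            continue
        frame = json.loads(line)
        if frame.get("result") is not None:
            final = frame
    return final
```

With this shape, intermediate frames with `"result": null` (the case that previously broke validation) are simply skipped rather than surfaced to the caller.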
Summary
- Remove the `fastapi<0.116` upper bound so consumers can resolve `fastapi>=0.130`, which dropped starlette's `<0.47.0` cap, enabling `starlette>=0.49.1` (fixes CVE-2025-62727, HIGH)
- Add a `tornado>=6.5.5` explicit floor pin to fix CVE-2026-31958 (HIGH)

Vulnerability Context
Root cause chain for CVE-2025-62727
Buildtime Evidence
pyproject.toml diff:
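A sketch of what the pyproject.toml change presumably looks like, given the pins described above (the surrounding lines and any pre-existing version floors are assumptions):

```diff
 dependencies = [
-    "fastapi<0.116",
+    "fastapi",
+    "starlette>=0.49.1",
+    "tornado>=6.5.5",
 ]
```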
uv.lock resolution after fix:
Runtime Evidence
Import verification (all APIs used by BaseACPServer):
Version checks:
Test Results
Pre-existing failures confirmed identical on `main` before this change (pydantic validation errors in test fixtures unrelated to fastapi/starlette/tornado).

Downstream Impact
After this SDK version is published, consumers (e.g. `context-distiller` in `sgp-solutions`) can upgrade their `agentex-sdk` pin to get patched starlette + tornado without any code changes.

🤖 Generated with Claude Code
Greptile Summary
This PR addresses two HIGH-severity CVEs (`CVE-2025-62727` in starlette and `CVE-2026-31958` in tornado) by relaxing the `fastapi` upper bound, adding explicit `starlette>=0.49.1` and `tornado>=6.5.5` floor pins, and bumping the lock to `fastapi 0.135.2` / `starlette 1.0.0` / `tornado 6.5.5`. Alongside the dependency fixes it ships several correctness improvements: a pure-ASGI `RequestIDMiddleware` to stop `BaseHTTPMiddleware` from swallowing `StreamingResponse` bodies, streaming-aware `send_message` client logic, missing `return` guards in several tutorial handlers, and removal of the private `_JSONTypeConverterUnhandled` type from the Temporal worker.

Key changes:
- `starlette>=0.49.1` and `tornado>=6.5.5` floor pins; the lock resolves to `starlette 1.0.0` (no `<1.0.0` cap present).
- `RequestIDMiddleware` rewrite: a pure ASGI implementation removes the `BaseHTTPMiddleware` streaming-body-loss bug.
- `send_message` streaming client: both sync and async paths now consume responses through `with_streaming_response` context managers, correctly handling newline-delimited JSON from the server.
- Missing `return` statements added after early error responses in several `handle_event_send`/`handle_message_send` handlers; `uuid` is now imported directly rather than re-exported from `base_acp_server`.
- `execute_activity_method` → `execute_activity`; the private `_JSONTypeConverterUnhandled` union type is dropped.
- `opentelemetry-api` and `opentelemetry-sdk` added as mandatory core dependencies; this adds ~6 MB to every consumer's install footprint without an opt-in mechanism, so it is worth a follow-up to consider whether an optional extra would be preferable.
- `run_agent_test.sh`: `$pytest_cmd` is invoked unquoted; safe on CI today (wheel paths have no spaces) but fragile, see inline comment.

Confidence Score: 4/5
The `opentelemetry-api`/`opentelemetry-sdk` additions deserve a follow-up decision about opt-in extras for consumers who don't use OTel tracing.

Important Files Changed
- pyproject.toml: removes the `fastapi<0.116` upper bound, pins `starlette>=0.49.1` and `tornado>=6.5.5` for CVE fixes; also adds `opentelemetry-api` and `opentelemetry-sdk` as mandatory core deps (unlocked to consumers without opt-in).
- base_acp_server.py: replaces `BaseHTTPMiddleware` with a pure ASGI `RequestIDMiddleware` to avoid StreamingResponse body buffering; clean implementation using raw ASGI `scope`/`receive`/`send` types.
- agents.py: `send_message` now consumes responses via a streaming context manager to handle SSE and plain-JSON server responses; fallback collects `parent_task_message` chunks from the stream.
- core.clients.temporal.utils: drops the private `_JSONTypeConverterUnhandled` type; updates the `to_typed_value` return hint to `Any`, avoiding a brittle dependency on an internal Temporal SDK symbol.
- hooks.py: changes `workflow.execute_activity_method` calls to `workflow.execute_activity` in line with the Temporal Python SDK public API.
- run_agent_test.sh: runs tests against the built wheel via `uv run --with`; `$pytest_cmd` is stored as a plain string and invoked unquoted, which will word-split if the wheel path contains spaces.
- Test fixtures: updated to the current schema (required `agent` object added, `message` replaced with the `event`/`content` structure).
- agent_rpc_response.py: makes `SendMessageStreamResponse.result` optional (`Optional[TaskMessageUpdate] = None`) to handle intermediate or empty streaming frames without validation failures.

Sequence Diagram
```mermaid
sequenceDiagram
    participant Client as SDK Client<br/>(AgentsResource)
    participant Proxy as Agentex Proxy
    participant Server as BaseACPServer<br/>(FastAPI + ASGI)
    participant Middleware as RequestIDMiddleware<br/>(Pure ASGI)
    participant Handler as Agent Handler
    Client->>Proxy: POST /api (message/send) via streaming_response ctx mgr
    Proxy->>Server: forward HTTP request
    Server->>Middleware: ASGI scope/receive/send
    Middleware->>Middleware: extract x-request-id header<br/>set ctx_var_request_id
    Middleware->>Server: pass through (no buffering)
    Server->>Handler: await handler(params)
    alt handler returns AsyncGenerator (streaming)
        Handler-->>Server: AsyncGenerator of TaskMessageUpdate chunks
        Server->>Server: _handle_streaming_response()
        loop each chunk
            Server-->>Proxy: newline-delimited JSON-RPC frame
            Proxy-->>Client: stream line
            Client->>Client: collect parent_task_message
        end
    else handler returns plain result
        Handler-->>Server: list[TaskMessage]
        Server-->>Proxy: JSONRPCResponse (single JSON)
        Proxy-->>Client: single response line
        Client->>Client: SendMessageResponse.model_validate(chunk) → return early
    end
    Client-->>Client: return SendMessageResponse(result=task_messages)
```
Reviews (13). Last reviewed commit: "fix: use built SDK wheel for tutorial te..."