diff --git a/.git-blame-ignore-revs b/.git-blame-ignore-revs index e69de29bb..57cba171f 100644 --- a/.git-blame-ignore-revs +++ b/.git-blame-ignore-revs @@ -0,0 +1,2 @@ +# Applied 120 line-length rule to all files: https://github.com/modelcontextprotocol/python-sdk/pull/856 +543961968c0634e93d919d509cce23a1d6a56c21 diff --git a/.github/ISSUE_TEMPLATE/bug.yaml b/.github/ISSUE_TEMPLATE/bug.yaml new file mode 100644 index 000000000..e52277a2a --- /dev/null +++ b/.github/ISSUE_TEMPLATE/bug.yaml @@ -0,0 +1,55 @@ +name: šŸ› MCP Python SDK Bug +description: Report a bug or unexpected behavior in the MCP Python SDK +labels: ["need confirmation"] + +body: + - type: markdown + attributes: + value: Thank you for contributing to the MCP Python SDK! ✊ + + - type: checkboxes + id: checks + attributes: + label: Initial Checks + description: Just making sure you're using the latest version of MCP Python SDK. + options: + - label: I confirm that I'm using the latest version of MCP Python SDK + required: true + - label: I confirm that I searched for my issue in https://github.com/modelcontextprotocol/python-sdk/issues before opening this issue + required: true + + - type: textarea + id: description + attributes: + label: Description + description: | + Please explain what you're seeing and what you would expect to see. + + Please provide as much detail as possible to make understanding and solving your problem as quick as possible. šŸ™ + validations: + required: true + + - type: textarea + id: example + attributes: + label: Example Code + description: > + If applicable, please add a self-contained, + [minimal, reproducible, example](https://stackoverflow.com/help/minimal-reproducible-example) + demonstrating the bug. + + placeholder: | + from mcp.server.fastmcp import FastMCP + + ... + render: Python + + - type: textarea + id: version + attributes: + label: Python & MCP Python SDK + description: | + Which version of Python and MCP Python SDK are you using? + render: Text + validations: + required: true diff --git a/.github/ISSUE_TEMPLATE/bug_report.md b/.github/ISSUE_TEMPLATE/bug_report.md deleted file mode 100644 index dd84ea782..000000000 --- a/.github/ISSUE_TEMPLATE/bug_report.md +++ /dev/null @@ -1,38 +0,0 @@ ---- -name: Bug report -about: Create a report to help us improve -title: '' -labels: '' -assignees: '' - ---- - -**Describe the bug** -A clear and concise description of what the bug is. - -**To Reproduce** -Steps to reproduce the behavior: -1. Go to '...' -2. Click on '....' -3. Scroll down to '....' -4. See error - -**Expected behavior** -A clear and concise description of what you expected to happen. - -**Screenshots** -If applicable, add screenshots to help explain your problem. - -**Desktop (please complete the following information):** - - OS: [e.g. iOS] - - Browser [e.g. chrome, safari] - - Version [e.g. 22] - -**Smartphone (please complete the following information):** - - Device: [e.g. iPhone6] - - OS: [e.g. iOS8.1] - - Browser [e.g. stock browser, safari] - - Version [e.g. 22] - -**Additional context** -Add any other context about the problem here. 
diff --git a/.github/ISSUE_TEMPLATE/config.yaml b/.github/ISSUE_TEMPLATE/config.yaml new file mode 100644 index 000000000..0086358db --- /dev/null +++ b/.github/ISSUE_TEMPLATE/config.yaml @@ -0,0 +1 @@ +blank_issues_enabled: true diff --git a/.github/ISSUE_TEMPLATE/feature-request.yaml b/.github/ISSUE_TEMPLATE/feature-request.yaml new file mode 100644 index 000000000..bec9b77b1 --- /dev/null +++ b/.github/ISSUE_TEMPLATE/feature-request.yaml @@ -0,0 +1,29 @@ +name: šŸš€ MCP Python SDK Feature Request +description: "Suggest a new feature for the MCP Python SDK" +labels: ["feature request"] + +body: + - type: markdown + attributes: + value: Thank you for contributing to the MCP Python SDK! ✊ + + - type: textarea + id: description + attributes: + label: Description + description: | + Please give as much detail as possible about the feature you would like to suggest. šŸ™ + + You might like to add: + * A demo of how code might look when using the feature + * Your use case(s) for the feature + * Reference to other projects that have a similar feature + validations: + required: true + + - type: textarea + id: references + attributes: + label: References + description: | + Please add any links or references that might help us understand your feature request better. šŸ“š diff --git a/.github/ISSUE_TEMPLATE/feature_request.md b/.github/ISSUE_TEMPLATE/feature_request.md deleted file mode 100644 index bbcbbe7d6..000000000 --- a/.github/ISSUE_TEMPLATE/feature_request.md +++ /dev/null @@ -1,20 +0,0 @@ ---- -name: Feature request -about: Suggest an idea for this project -title: '' -labels: '' -assignees: '' - ---- - -**Is your feature request related to a problem? Please describe.** -A clear and concise description of what the problem is. Ex. I'm always frustrated when [...] - -**Describe the solution you'd like** -A clear and concise description of what you want to happen. - -**Describe alternatives you've considered** -A clear and concise description of any alternative solutions or features you've considered. - -**Additional context** -Add any other context or screenshots about the feature request here. diff --git a/.github/ISSUE_TEMPLATE/question.yaml b/.github/ISSUE_TEMPLATE/question.yaml new file mode 100644 index 000000000..87a7894f1 --- /dev/null +++ b/.github/ISSUE_TEMPLATE/question.yaml @@ -0,0 +1,33 @@ +name: ā“ MCP Python SDK Question +description: "Ask a question about the MCP Python SDK" +labels: ["question"] + +body: + - type: markdown + attributes: + value: Thank you for reaching out to the MCP Python SDK community! We're here to help! šŸ¤ + + - type: textarea + id: question + attributes: + label: Question + description: | + Please provide as much detail as possible about your question. 
šŸ™ + + You might like to include: + * Code snippets showing what you've tried + * Error messages you're encountering (if any) + * Expected vs actual behavior + * Your use case and what you're trying to achieve + validations: + required: true + + - type: textarea + id: context + attributes: + label: Additional Context + description: | + Please provide any additional context that might help us better understand your question, such as: + * Your MCP Python SDK version + * Your Python version + * Relevant configuration or environment details šŸ“ diff --git a/.github/workflows/check-lock.yml b/.github/workflows/check-lock.yml deleted file mode 100644 index 805b0f3cc..000000000 --- a/.github/workflows/check-lock.yml +++ /dev/null @@ -1,25 +0,0 @@ -name: Check uv.lock - -on: - pull_request: - paths: - - "pyproject.toml" - - "uv.lock" - push: - paths: - - "pyproject.toml" - - "uv.lock" - -jobs: - check-lock: - runs-on: ubuntu-latest - steps: - - uses: actions/checkout@v4 - - - name: Install uv - run: | - curl -LsSf https://astral.sh/uv/install.sh | sh - echo "$HOME/.cargo/bin" >> $GITHUB_PATH - - - name: Check uv.lock is up to date - run: uv lock --check diff --git a/.github/workflows/publish-docs-manually.yml b/.github/workflows/publish-docs-manually.yml index e1c3954b1..5447dd391 100644 --- a/.github/workflows/publish-docs-manually.yml +++ b/.github/workflows/publish-docs-manually.yml @@ -19,6 +19,7 @@ jobs: uses: astral-sh/setup-uv@v3 with: enable-cache: true + version: 0.7.2 - run: echo "cache_id=$(date --utc '+%V')" >> $GITHUB_ENV - uses: actions/cache@v4 diff --git a/.github/workflows/publish-pypi.yml b/.github/workflows/publish-pypi.yml index 17edd0f3c..2f03b45cd 100644 --- a/.github/workflows/publish-pypi.yml +++ b/.github/workflows/publish-pypi.yml @@ -16,6 +16,7 @@ jobs: uses: astral-sh/setup-uv@v3 with: enable-cache: true + version: 0.7.2 - name: Set up Python 3.12 run: uv python install 3.12 @@ -67,6 +68,7 @@ jobs: uses: astral-sh/setup-uv@v3 with: enable-cache: true + version: 0.7.2 - run: echo "cache_id=$(date --utc '+%V')" >> $GITHUB_ENV - uses: actions/cache@v4 diff --git a/.github/workflows/shared.yml b/.github/workflows/shared.yml index 03c36a691..499871ca1 100644 --- a/.github/workflows/shared.yml +++ b/.github/workflows/shared.yml @@ -4,43 +4,32 @@ on: workflow_call: jobs: - format: + pre-commit: runs-on: ubuntu-latest steps: - uses: actions/checkout@v4 - - name: Install uv - uses: astral-sh/setup-uv@v3 + - uses: astral-sh/setup-uv@v5 with: enable-cache: true + version: 0.7.2 - - name: Install the project - run: uv sync --frozen --all-extras --dev --python 3.12 - - - name: Run ruff format check - run: uv run --no-sync ruff check . 
- - typecheck: - runs-on: ubuntu-latest - steps: - - uses: actions/checkout@v4 + - name: Install dependencies + run: uv sync --frozen --all-extras --python 3.10 - - name: Install uv - uses: astral-sh/setup-uv@v3 + - uses: pre-commit/action@v3.0.0 with: - enable-cache: true - - - name: Install the project - run: uv sync --frozen --all-extras --dev --python 3.12 - - - name: Run pyright - run: uv run --no-sync pyright + extra_args: --all-files --verbose + env: + SKIP: no-commit-to-branch test: - runs-on: ubuntu-latest + runs-on: ${{ matrix.os }} + timeout-minutes: 10 strategy: matrix: python-version: ["3.10", "3.11", "3.12", "3.13"] + os: [ubuntu-latest, windows-latest] steps: - uses: actions/checkout@v4 @@ -49,9 +38,11 @@ jobs: uses: astral-sh/setup-uv@v3 with: enable-cache: true + version: 0.7.2 - name: Install the project - run: uv sync --frozen --all-extras --dev --python ${{ matrix.python-version }} + run: uv sync --frozen --all-extras --python ${{ matrix.python-version }} - name: Run pytest run: uv run --no-sync pytest + continue-on-error: true diff --git a/.gitignore b/.gitignore index fa269235e..e9fdca176 100644 --- a/.gitignore +++ b/.gitignore @@ -166,4 +166,5 @@ cython_debug/ # vscode .vscode/ +.windsurfrules **/CLAUDE.local.md diff --git a/CLAUDE.md b/CLAUDE.md index e95b75cd5..619f3bb44 100644 --- a/CLAUDE.md +++ b/CLAUDE.md @@ -19,7 +19,7 @@ This document contains critical information about working with this codebase. Fo - Line length: 88 chars maximum 3. Testing Requirements - - Framework: `uv run pytest` + - Framework: `uv run --frozen pytest` - Async testing: use anyio, not asyncio - Coverage: test edge cases and errors - New features require tests @@ -54,9 +54,9 @@ This document contains critical information about working with this codebase. Fo ## Code Formatting 1. Ruff - - Format: `uv run ruff format .` - - Check: `uv run ruff check .` - - Fix: `uv run ruff check . --fix` + - Format: `uv run --frozen ruff format .` + - Check: `uv run --frozen ruff check .` + - Fix: `uv run --frozen ruff check . --fix` - Critical issues: - Line length (88 chars) - Import sorting (I001) @@ -67,7 +67,7 @@ This document contains critical information about working with this codebase. Fo - Imports: split into multiple lines 2. Type Checking - - Tool: `uv run pyright` + - Tool: `uv run --frozen pyright` - Requirements: - Explicit None checks for Optional - Type narrowing for strings @@ -104,6 +104,10 @@ This document contains critical information about working with this codebase. Fo - Add None checks - Narrow string types - Match existing patterns + - Pytest: + - If the tests aren't finding the anyio pytest mark, try adding PYTEST_DISABLE_PLUGIN_AUTOLOAD="" + to the start of the pytest run command eg: + `PYTEST_DISABLE_PLUGIN_AUTOLOAD="" uv run --frozen pytest` 3. 
Best Practices - Check git status before commits diff --git a/README.md b/README.md index 0ca039ae3..8a009108b 100644 --- a/README.md +++ b/README.md @@ -30,6 +30,9 @@ - [Prompts](#prompts) - [Images](#images) - [Context](#context) + - [Completions](#completions) + - [Elicitation](#elicitation) + - [Authentication](#authentication) - [Running Your Server](#running-your-server) - [Development Mode](#development-mode) - [Claude Desktop Integration](#claude-desktop-integration) @@ -66,14 +69,14 @@ The Model Context Protocol allows applications to provide context for LLMs in a - Build MCP clients that can connect to any MCP server - Create MCP servers that expose resources, prompts and tools -- Use standard transports like stdio and SSE +- Use standard transports like stdio, SSE, and Streamable HTTP - Handle all MCP protocol messages and lifecycle events ## Installation ### Adding MCP to your python project -We recommend using [uv](https://docs.astral.sh/uv/) to manage your Python projects. +We recommend using [uv](https://docs.astral.sh/uv/) to manage your Python projects. If you haven't created a uv-managed project yet, create one: @@ -160,7 +163,7 @@ from dataclasses import dataclass from fake_database import Database # Replace with your actual DB type -from mcp.server.fastmcp import Context, FastMCP +from mcp.server.fastmcp import FastMCP # Create a named server mcp = FastMCP("My App") @@ -192,9 +195,10 @@ mcp = FastMCP("My App", lifespan=app_lifespan) # Access type-safe lifespan context in tools @mcp.tool() -def query_db(ctx: Context) -> str: +def query_db() -> str: """Tool that uses initialized resources""" - db = ctx.request_context.lifespan_context.db + ctx = mcp.get_context() + db = ctx.request_context.lifespan_context["db"] return db.query() ``` @@ -208,13 +212,13 @@ from mcp.server.fastmcp import FastMCP mcp = FastMCP("My App") -@mcp.resource("config://app") +@mcp.resource("config://app", title="Application Configuration") def get_config() -> str: """Static configuration data""" return "App configuration here" -@mcp.resource("users://{user_id}/profile") +@mcp.resource("users://{user_id}/profile", title="User Profile") def get_user_profile(user_id: str) -> str: """Dynamic user data""" return f"Profile data for user {user_id}" @@ -231,13 +235,13 @@ from mcp.server.fastmcp import FastMCP mcp = FastMCP("My App") -@mcp.tool() +@mcp.tool(title="BMI Calculator") def calculate_bmi(weight_kg: float, height_m: float) -> float: """Calculate BMI given weight in kg and height in meters""" return weight_kg / (height_m**2) -@mcp.tool() +@mcp.tool(title="Weather Fetcher") async def fetch_weather(city: str) -> str: """Fetch current weather for a city""" async with httpx.AsyncClient() as client: @@ -256,12 +260,12 @@ from mcp.server.fastmcp.prompts import base mcp = FastMCP("My App") -@mcp.prompt() +@mcp.prompt(title="Code Review") def review_code(code: str) -> str: return f"Please review this code:\n\n{code}" -@mcp.prompt() +@mcp.prompt(title="Debug Assistant") def debug_error(error: str) -> list[base.Message]: return [ base.UserMessage("I'm seeing this error:"), @@ -309,6 +313,153 @@ async def long_task(files: list[str], ctx: Context) -> str: return "Processing complete" ``` +### Completions + +MCP supports providing completion suggestions for prompt arguments and resource template parameters. 
With the context parameter, servers can provide completions based on previously resolved values: + +Client usage: +```python +from mcp.client.session import ClientSession +from mcp.types import ResourceTemplateReference + + +async def use_completion(session: ClientSession): + # Complete without context + result = await session.complete( + ref=ResourceTemplateReference( + type="ref/resource", uri="github://repos/{owner}/{repo}" + ), + argument={"name": "owner", "value": "model"}, + ) + + # Complete with context - repo suggestions based on owner + result = await session.complete( + ref=ResourceTemplateReference( + type="ref/resource", uri="github://repos/{owner}/{repo}" + ), + argument={"name": "repo", "value": "test"}, + context_arguments={"owner": "modelcontextprotocol"}, + ) +``` + +Server implementation: +```python +from mcp.server import Server +from mcp.types import ( + Completion, + CompletionArgument, + CompletionContext, + PromptReference, + ResourceTemplateReference, +) + +server = Server("example-server") + + +@server.completion() +async def handle_completion( + ref: PromptReference | ResourceTemplateReference, + argument: CompletionArgument, + context: CompletionContext | None, +) -> Completion | None: + if isinstance(ref, ResourceTemplateReference): + if ref.uri == "github://repos/{owner}/{repo}" and argument.name == "repo": + # Use context to provide owner-specific repos + if context and context.arguments: + owner = context.arguments.get("owner") + if owner == "modelcontextprotocol": + repos = ["python-sdk", "typescript-sdk", "specification"] + # Filter based on partial input + filtered = [r for r in repos if r.startswith(argument.value)] + return Completion(values=filtered) + return None +``` +### Elicitation + +Request additional information from users during tool execution: + +```python +from mcp.server.fastmcp import FastMCP, Context +from mcp.server.elicitation import ( + AcceptedElicitation, + DeclinedElicitation, + CancelledElicitation, +) +from pydantic import BaseModel, Field + +mcp = FastMCP("Booking System") + + +@mcp.tool() +async def book_table(date: str, party_size: int, ctx: Context) -> str: + """Book a table with confirmation""" + + # Schema must only contain primitive types (str, int, float, bool) + class ConfirmBooking(BaseModel): + confirm: bool = Field(description="Confirm booking?") + notes: str = Field(default="", description="Special requests") + + result = await ctx.elicit( + message=f"Confirm booking for {party_size} on {date}?", schema=ConfirmBooking + ) + + match result: + case AcceptedElicitation(data=data): + if data.confirm: + return f"Booked! Notes: {data.notes or 'None'}" + return "Booking cancelled" + case DeclinedElicitation(): + return "Booking declined" + case CancelledElicitation(): + return "Booking cancelled" +``` + +The `elicit()` method returns an `ElicitationResult` with: +- `action`: "accept", "decline", or "cancel" +- `data`: The validated response (only when accepted) +- `validation_error`: Any validation error message + +### Authentication + +Authentication can be used by servers that want to expose tools accessing protected resources. + +`mcp.server.auth` implements OAuth 2.1 resource server functionality, where MCP servers act as Resource Servers (RS) that validate tokens issued by separate Authorization Servers (AS). This follows the [MCP authorization specification](https://modelcontextprotocol.io/specification/2025-06-18/basic/authorization) and implements RFC 9728 (Protected Resource Metadata) for AS discovery. 
+ +MCP servers can use authentication by providing an implementation of the `TokenVerifier` protocol: + +```python +from mcp import FastMCP +from mcp.server.auth.provider import TokenVerifier, TokenInfo +from mcp.server.auth.settings import AuthSettings + + +class MyTokenVerifier(TokenVerifier): + # Implement token validation logic (typically via token introspection) + async def verify_token(self, token: str) -> TokenInfo: + # Verify with your authorization server + ... + + +mcp = FastMCP( + "My App", + token_verifier=MyTokenVerifier(), + auth=AuthSettings( + issuer_url="https://auth.example.com", + resource_server_url="http://localhost:3001", + required_scopes=["mcp:read", "mcp:write"], + ), +) +``` + +For a complete example with separate Authorization Server and Resource Server implementations, see [`examples/servers/simple-auth/`](examples/servers/simple-auth/). + +**Architecture:** +- **Authorization Server (AS)**: Handles OAuth flows, user authentication, and token issuance +- **Resource Server (RS)**: Your MCP server that validates tokens and serves protected resources +- **Client**: Discovers AS through RFC 9728, obtains tokens, and uses them with the MCP server + +See [TokenVerifier](src/mcp/server/auth/provider.py) for more details on implementing token validation. + ## Running Your Server ### Development Mode @@ -360,8 +511,92 @@ python server.py mcp run server.py ``` +Note that `mcp run` or `mcp dev` only supports server using FastMCP and not the low-level server variant. + +### Streamable HTTP Transport + +> **Note**: Streamable HTTP transport is superseding SSE transport for production deployments. + +```python +from mcp.server.fastmcp import FastMCP + +# Stateful server (maintains session state) +mcp = FastMCP("StatefulServer") + +# Stateless server (no session persistence) +mcp = FastMCP("StatelessServer", stateless_http=True) + +# Stateless server (no session persistence, no sse stream with supported client) +mcp = FastMCP("StatelessServer", stateless_http=True, json_response=True) + +# Run server with streamable_http transport +mcp.run(transport="streamable-http") +``` + +You can mount multiple FastMCP servers in a FastAPI application: + +```python +# echo.py +from mcp.server.fastmcp import FastMCP + +mcp = FastMCP(name="EchoServer", stateless_http=True) + + +@mcp.tool(description="A simple echo tool") +def echo(message: str) -> str: + return f"Echo: {message}" +``` + +```python +# math.py +from mcp.server.fastmcp import FastMCP + +mcp = FastMCP(name="MathServer", stateless_http=True) + + +@mcp.tool(description="A simple add tool") +def add_two(n: int) -> int: + return n + 2 +``` + +```python +# main.py +import contextlib +from fastapi import FastAPI +from mcp.echo import echo +from mcp.math import math + + +# Create a combined lifespan to manage both session managers +@contextlib.asynccontextmanager +async def lifespan(app: FastAPI): + async with contextlib.AsyncExitStack() as stack: + await stack.enter_async_context(echo.mcp.session_manager.run()) + await stack.enter_async_context(math.mcp.session_manager.run()) + yield + + +app = FastAPI(lifespan=lifespan) +app.mount("/echo", echo.mcp.streamable_http_app()) +app.mount("/math", math.mcp.streamable_http_app()) +``` + +For low level server with Streamable HTTP implementations, see: +- Stateful server: [`examples/servers/simple-streamablehttp/`](examples/servers/simple-streamablehttp/) +- Stateless server: [`examples/servers/simple-streamablehttp-stateless/`](examples/servers/simple-streamablehttp-stateless/) + +The 
streamable HTTP transport supports: +- Stateful and stateless operation modes +- Resumability with event stores +- JSON or SSE response formats +- Better scalability for multi-node deployments + ### Mounting to an Existing ASGI Server +> **Note**: SSE transport is being superseded by [Streamable HTTP transport](https://modelcontextprotocol.io/specification/2025-03-26/basic/transports#streamable-http). + +By default, SSE servers are mounted at `/sse` and Streamable HTTP servers are mounted at `/mcp`. You can customize these paths using the methods described below. + You can mount the SSE server to an existing ASGI server using the `sse_app` method. This allows you to integrate the SSE server with other ASGI applications. ```python @@ -383,6 +618,43 @@ app = Starlette( app.router.routes.append(Host('mcp.acme.corp', app=mcp.sse_app())) ``` +When mounting multiple MCP servers under different paths, you can configure the mount path in several ways: + +```python +from starlette.applications import Starlette +from starlette.routing import Mount +from mcp.server.fastmcp import FastMCP + +# Create multiple MCP servers +github_mcp = FastMCP("GitHub API") +browser_mcp = FastMCP("Browser") +curl_mcp = FastMCP("Curl") +search_mcp = FastMCP("Search") + +# Method 1: Configure mount paths via settings (recommended for persistent configuration) +github_mcp.settings.mount_path = "/github" +browser_mcp.settings.mount_path = "/browser" + +# Method 2: Pass mount path directly to sse_app (preferred for ad-hoc mounting) +# This approach doesn't modify the server's settings permanently + +# Create Starlette app with multiple mounted servers +app = Starlette( + routes=[ + # Using settings-based configuration + Mount("/github", app=github_mcp.sse_app()), + Mount("/browser", app=browser_mcp.sse_app()), + # Using direct mount path parameter + Mount("/curl", app=curl_mcp.sse_app("/curl")), + Mount("/search", app=search_mcp.sse_app("/search")), + ] +) + +# Method 3: For direct execution, you can also pass the mount path to run() +if __name__ == "__main__": + search_mcp.run(transport="sse", mount_path="/search") +``` + For more information on mounting applications in Starlette, see the [Starlette documentation](https://www.starlette.io/routing/#submounting-routes). ## Examples @@ -555,9 +827,11 @@ if __name__ == "__main__": asyncio.run(run()) ``` +Caution: The `mcp run` and `mcp dev` tool doesn't support low-level server. 
+ ### Writing MCP Clients -The SDK provides a high-level client interface for connecting to MCP servers: +The SDK provides a high-level client interface for connecting to MCP servers using various [transports](https://modelcontextprotocol.io/specification/2025-03-26/basic/transports): ```python from mcp import ClientSession, StdioServerParameters, types @@ -621,6 +895,118 @@ if __name__ == "__main__": asyncio.run(run()) ``` +Clients can also connect using [Streamable HTTP transport](https://modelcontextprotocol.io/specification/2025-03-26/basic/transports#streamable-http): + +```python +from mcp.client.streamable_http import streamablehttp_client +from mcp import ClientSession + + +async def main(): + # Connect to a streamable HTTP server + async with streamablehttp_client("example/mcp") as ( + read_stream, + write_stream, + _, + ): + # Create a session using the client streams + async with ClientSession(read_stream, write_stream) as session: + # Initialize the connection + await session.initialize() + # Call a tool + tool_result = await session.call_tool("echo", {"message": "hello"}) +``` + +### Client Display Utilities + +When building MCP clients, the SDK provides utilities to help display human-readable names for tools, resources, and prompts: + +```python +from mcp.shared.metadata_utils import get_display_name +from mcp.client.session import ClientSession + + +async def display_tools(session: ClientSession): + """Display available tools with human-readable names""" + tools_response = await session.list_tools() + + for tool in tools_response.tools: + # get_display_name() returns the title if available, otherwise the name + display_name = get_display_name(tool) + print(f"Tool: {display_name}") + if tool.description: + print(f" {tool.description}") + + +async def display_resources(session: ClientSession): + """Display available resources with human-readable names""" + resources_response = await session.list_resources() + + for resource in resources_response.resources: + display_name = get_display_name(resource) + print(f"Resource: {display_name} ({resource.uri})") +``` + +The `get_display_name()` function implements the proper precedence rules for displaying names: +- For tools: `title` > `annotations.title` > `name` +- For other objects: `title` > `name` + +This ensures your client UI shows the most user-friendly names that servers provide. 
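+The same helper can be applied to prompts as well. A minimal sketch (the `display_prompts` name is illustrative, assuming an initialized session):
+
+```python
+from mcp.shared.metadata_utils import get_display_name
+from mcp.client.session import ClientSession
+
+
+async def display_prompts(session: ClientSession):
+    """Display available prompts with human-readable names"""
+    prompts_response = await session.list_prompts()
+
+    for prompt in prompts_response.prompts:
+        # Falls back to the prompt name when no title is provided
+        display_name = get_display_name(prompt)
+        print(f"Prompt: {display_name}")
+```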
+ +### OAuth Authentication for Clients + +The SDK includes [authorization support](https://modelcontextprotocol.io/specification/2025-03-26/basic/authorization) for connecting to protected MCP servers: + +```python +from mcp.client.auth import OAuthClientProvider, TokenStorage +from mcp.client.session import ClientSession +from mcp.client.streamable_http import streamablehttp_client +from mcp.shared.auth import OAuthClientInformationFull, OAuthClientMetadata, OAuthToken + + +class CustomTokenStorage(TokenStorage): + """Simple in-memory token storage implementation.""" + + async def get_tokens(self) -> OAuthToken | None: + pass + + async def set_tokens(self, tokens: OAuthToken) -> None: + pass + + async def get_client_info(self) -> OAuthClientInformationFull | None: + pass + + async def set_client_info(self, client_info: OAuthClientInformationFull) -> None: + pass + + +async def main(): + # Set up OAuth authentication + oauth_auth = OAuthClientProvider( + server_url="https://api.example.com", + client_metadata=OAuthClientMetadata( + client_name="My Client", + redirect_uris=["http://localhost:3000/callback"], + grant_types=["authorization_code", "refresh_token"], + response_types=["code"], + ), + storage=CustomTokenStorage(), + redirect_handler=lambda url: print(f"Visit: {url}"), + callback_handler=lambda: ("auth_code", None), + ) + + # Use with streamable HTTP client + async with streamablehttp_client( + "https://api.example.com/mcp", auth=oauth_auth + ) as (read, write, _): + async with ClientSession(read, write) as session: + await session.initialize() + # Authenticated session ready +``` + +For a complete working example, see [`examples/clients/simple-auth-client/`](examples/clients/simple-auth-client/). + + ### MCP Primitives The MCP protocol defines three core primitives that servers can implement: diff --git a/examples/clients/simple-auth-client/README.md b/examples/clients/simple-auth-client/README.md new file mode 100644 index 000000000..cf6050b1c --- /dev/null +++ b/examples/clients/simple-auth-client/README.md @@ -0,0 +1,74 @@ +# Simple Auth Client Example + +A demonstration of how to use the MCP Python SDK with OAuth authentication over streamable HTTP or SSE transport. + +## Features + +- OAuth 2.0 authentication with PKCE +- Support for both StreamableHTTP and SSE transports +- Interactive command-line interface + +## Installation + +```bash +cd examples/clients/simple-auth-client +uv sync --reinstall +``` + +## Usage + +### 1. Start an MCP server with OAuth support + +```bash +# Example with mcp-simple-auth +cd path/to/mcp-simple-auth +uv run mcp-simple-auth --transport streamable-http --port 3001 +``` + +### 2. Run the client + +```bash +uv run mcp-simple-auth-client + +# Or with custom server URL +MCP_SERVER_PORT=3001 uv run mcp-simple-auth-client + +# Use SSE transport +MCP_TRANSPORT_TYPE=sse uv run mcp-simple-auth-client +``` + +### 3. Complete OAuth flow + +The client will open your browser for authentication. After completing OAuth, you can use commands: + +- `list` - List available tools +- `call [args]` - Call a tool with optional JSON arguments +- `quit` - Exit + +## Example + +``` +šŸ” Simple MCP Auth Client +Connecting to: http://localhost:3001 + +Please visit the following URL to authorize the application: +http://localhost:3001/authorize?response_type=code&client_id=... + +āœ… Connected to MCP server at http://localhost:3001 + +mcp> list +šŸ“‹ Available tools: +1. 
echo - Echo back the input text + +mcp> call echo {"text": "Hello, world!"} +šŸ”§ Tool 'echo' result: +Hello, world! + +mcp> quit +šŸ‘‹ Goodbye! +``` + +## Configuration + +- `MCP_SERVER_PORT` - Server URL (default: 8000) +- `MCP_TRANSPORT_TYPE` - Transport type: `streamable_http` (default) or `sse` diff --git a/examples/clients/simple-auth-client/mcp_simple_auth_client/__init__.py b/examples/clients/simple-auth-client/mcp_simple_auth_client/__init__.py new file mode 100644 index 000000000..06eb1f29d --- /dev/null +++ b/examples/clients/simple-auth-client/mcp_simple_auth_client/__init__.py @@ -0,0 +1 @@ +"""Simple OAuth client for MCP simple-auth server.""" diff --git a/examples/clients/simple-auth-client/mcp_simple_auth_client/main.py b/examples/clients/simple-auth-client/mcp_simple_auth_client/main.py new file mode 100644 index 000000000..6354f2026 --- /dev/null +++ b/examples/clients/simple-auth-client/mcp_simple_auth_client/main.py @@ -0,0 +1,363 @@ +#!/usr/bin/env python3 +""" +Simple MCP client example with OAuth authentication support. + +This client connects to an MCP server using streamable HTTP transport with OAuth. + +""" + +import asyncio +import os +import threading +import time +import webbrowser +from datetime import timedelta +from http.server import BaseHTTPRequestHandler, HTTPServer +from typing import Any +from urllib.parse import parse_qs, urlparse + +from mcp.client.auth import OAuthClientProvider, TokenStorage +from mcp.client.session import ClientSession +from mcp.client.sse import sse_client +from mcp.client.streamable_http import streamablehttp_client +from mcp.shared.auth import OAuthClientInformationFull, OAuthClientMetadata, OAuthToken + + +class InMemoryTokenStorage(TokenStorage): + """Simple in-memory token storage implementation.""" + + def __init__(self): + self._tokens: OAuthToken | None = None + self._client_info: OAuthClientInformationFull | None = None + + async def get_tokens(self) -> OAuthToken | None: + return self._tokens + + async def set_tokens(self, tokens: OAuthToken) -> None: + self._tokens = tokens + + async def get_client_info(self) -> OAuthClientInformationFull | None: + return self._client_info + + async def set_client_info(self, client_info: OAuthClientInformationFull) -> None: + self._client_info = client_info + + +class CallbackHandler(BaseHTTPRequestHandler): + """Simple HTTP handler to capture OAuth callback.""" + + def __init__(self, request, client_address, server, callback_data): + """Initialize with callback data storage.""" + self.callback_data = callback_data + super().__init__(request, client_address, server) + + def do_GET(self): + """Handle GET request from OAuth redirect.""" + parsed = urlparse(self.path) + query_params = parse_qs(parsed.query) + + if "code" in query_params: + self.callback_data["authorization_code"] = query_params["code"][0] + self.callback_data["state"] = query_params.get("state", [None])[0] + self.send_response(200) + self.send_header("Content-type", "text/html") + self.end_headers() + self.wfile.write(b""" + + +

<html> + <body> + <h1>Authorization Successful!</h1> + <p>You can close this window and return to the terminal.</p> + </body> + </html> + """) + elif "error" in query_params: + self.callback_data["error"] = query_params["error"][0] + self.send_response(400) + self.send_header("Content-type", "text/html") + self.end_headers() + self.wfile.write( + f""" + <html> + <body>

<h1>Authorization Failed</h1> + <p>Error: {query_params['error'][0]}</p> + <p>You can close this window and return to the terminal.</p> + </body> + </html>
+ + + """.encode() + ) + else: + self.send_response(404) + self.end_headers() + + def log_message(self, format, *args): + """Suppress default logging.""" + pass + + +class CallbackServer: + """Simple server to handle OAuth callbacks.""" + + def __init__(self, port=3000): + self.port = port + self.server = None + self.thread = None + self.callback_data = {"authorization_code": None, "state": None, "error": None} + + def _create_handler_with_data(self): + """Create a handler class with access to callback data.""" + callback_data = self.callback_data + + class DataCallbackHandler(CallbackHandler): + def __init__(self, request, client_address, server): + super().__init__(request, client_address, server, callback_data) + + return DataCallbackHandler + + def start(self): + """Start the callback server in a background thread.""" + handler_class = self._create_handler_with_data() + self.server = HTTPServer(("localhost", self.port), handler_class) + self.thread = threading.Thread(target=self.server.serve_forever, daemon=True) + self.thread.start() + print(f"šŸ–„ļø Started callback server on http://localhost:{self.port}") + + def stop(self): + """Stop the callback server.""" + if self.server: + self.server.shutdown() + self.server.server_close() + if self.thread: + self.thread.join(timeout=1) + + def wait_for_callback(self, timeout=300): + """Wait for OAuth callback with timeout.""" + start_time = time.time() + while time.time() - start_time < timeout: + if self.callback_data["authorization_code"]: + return self.callback_data["authorization_code"] + elif self.callback_data["error"]: + raise Exception(f"OAuth error: {self.callback_data['error']}") + time.sleep(0.1) + raise Exception("Timeout waiting for OAuth callback") + + def get_state(self): + """Get the received state parameter.""" + return self.callback_data["state"] + + +class SimpleAuthClient: + """Simple MCP client with auth support.""" + + def __init__(self, server_url: str, transport_type: str = "streamable_http"): + self.server_url = server_url + self.transport_type = transport_type + self.session: ClientSession | None = None + + async def connect(self): + """Connect to the MCP server.""" + print(f"šŸ”— Attempting to connect to {self.server_url}...") + + try: + callback_server = CallbackServer(port=3030) + callback_server.start() + + async def callback_handler() -> tuple[str, str | None]: + """Wait for OAuth callback and return auth code and state.""" + print("ā³ Waiting for authorization callback...") + try: + auth_code = callback_server.wait_for_callback(timeout=300) + return auth_code, callback_server.get_state() + finally: + callback_server.stop() + + client_metadata_dict = { + "client_name": "Simple Auth Client", + "redirect_uris": ["http://localhost:3030/callback"], + "grant_types": ["authorization_code", "refresh_token"], + "response_types": ["code"], + "token_endpoint_auth_method": "client_secret_post", + } + + async def _default_redirect_handler(authorization_url: str) -> None: + """Default redirect handler that opens the URL in a browser.""" + print(f"Opening browser for authorization: {authorization_url}") + webbrowser.open(authorization_url) + + # Create OAuth authentication handler using the new interface + oauth_auth = OAuthClientProvider( + server_url=self.server_url.replace("/mcp", ""), + client_metadata=OAuthClientMetadata.model_validate( + client_metadata_dict + ), + storage=InMemoryTokenStorage(), + redirect_handler=_default_redirect_handler, + callback_handler=callback_handler, + ) + + # Create transport with auth 
handler based on transport type + if self.transport_type == "sse": + print("šŸ“” Opening SSE transport connection with auth...") + async with sse_client( + url=self.server_url, + auth=oauth_auth, + timeout=60, + ) as (read_stream, write_stream): + await self._run_session(read_stream, write_stream, None) + else: + print("šŸ“” Opening StreamableHTTP transport connection with auth...") + async with streamablehttp_client( + url=self.server_url, + auth=oauth_auth, + timeout=timedelta(seconds=60), + ) as (read_stream, write_stream, get_session_id): + await self._run_session(read_stream, write_stream, get_session_id) + + except Exception as e: + print(f"āŒ Failed to connect: {e}") + import traceback + + traceback.print_exc() + + async def _run_session(self, read_stream, write_stream, get_session_id): + """Run the MCP session with the given streams.""" + print("šŸ¤ Initializing MCP session...") + async with ClientSession(read_stream, write_stream) as session: + self.session = session + print("⚔ Starting session initialization...") + await session.initialize() + print("✨ Session initialization complete!") + + print(f"\nāœ… Connected to MCP server at {self.server_url}") + if get_session_id: + session_id = get_session_id() + if session_id: + print(f"Session ID: {session_id}") + + # Run interactive loop + await self.interactive_loop() + + async def list_tools(self): + """List available tools from the server.""" + if not self.session: + print("āŒ Not connected to server") + return + + try: + result = await self.session.list_tools() + if hasattr(result, "tools") and result.tools: + print("\nšŸ“‹ Available tools:") + for i, tool in enumerate(result.tools, 1): + print(f"{i}. {tool.name}") + if tool.description: + print(f" Description: {tool.description}") + print() + else: + print("No tools available") + except Exception as e: + print(f"āŒ Failed to list tools: {e}") + + async def call_tool(self, tool_name: str, arguments: dict[str, Any] | None = None): + """Call a specific tool.""" + if not self.session: + print("āŒ Not connected to server") + return + + try: + result = await self.session.call_tool(tool_name, arguments or {}) + print(f"\nšŸ”§ Tool '{tool_name}' result:") + if hasattr(result, "content"): + for content in result.content: + if content.type == "text": + print(content.text) + else: + print(content) + else: + print(result) + except Exception as e: + print(f"āŒ Failed to call tool '{tool_name}': {e}") + + async def interactive_loop(self): + """Run interactive command loop.""" + print("\nšŸŽÆ Interactive MCP Client") + print("Commands:") + print(" list - List available tools") + print(" call [args] - Call a tool") + print(" quit - Exit the client") + print() + + while True: + try: + command = input("mcp> ").strip() + + if not command: + continue + + if command == "quit": + break + + elif command == "list": + await self.list_tools() + + elif command.startswith("call "): + parts = command.split(maxsplit=2) + tool_name = parts[1] if len(parts) > 1 else "" + + if not tool_name: + print("āŒ Please specify a tool name") + continue + + # Parse arguments (simple JSON-like format) + arguments = {} + if len(parts) > 2: + import json + + try: + arguments = json.loads(parts[2]) + except json.JSONDecodeError: + print("āŒ Invalid arguments format (expected JSON)") + continue + + await self.call_tool(tool_name, arguments) + + else: + print( + "āŒ Unknown command. 
Try 'list', 'call ', or 'quit'" + ) + + except KeyboardInterrupt: + print("\n\nšŸ‘‹ Goodbye!") + break + except EOFError: + break + + +async def main(): + """Main entry point.""" + # Default server URL - can be overridden with environment variable + # Most MCP streamable HTTP servers use /mcp as the endpoint + server_url = os.getenv("MCP_SERVER_PORT", 8000) + transport_type = os.getenv("MCP_TRANSPORT_TYPE", "streamable_http") + server_url = ( + f"http://localhost:{server_url}/mcp" + if transport_type == "streamable_http" + else f"http://localhost:{server_url}/sse" + ) + + print("šŸš€ Simple MCP Auth Client") + print(f"Connecting to: {server_url}") + print(f"Transport type: {transport_type}") + + # Start connection flow - OAuth will be handled automatically + client = SimpleAuthClient(server_url, transport_type) + await client.connect() + + +def cli(): + """CLI entry point for uv script.""" + asyncio.run(main()) + + +if __name__ == "__main__": + cli() diff --git a/examples/clients/simple-auth-client/pyproject.toml b/examples/clients/simple-auth-client/pyproject.toml new file mode 100644 index 000000000..5ae7c6b9d --- /dev/null +++ b/examples/clients/simple-auth-client/pyproject.toml @@ -0,0 +1,52 @@ +[project] +name = "mcp-simple-auth-client" +version = "0.1.0" +description = "A simple OAuth client for the MCP simple-auth server" +readme = "README.md" +requires-python = ">=3.10" +authors = [{ name = "Anthropic" }] +keywords = ["mcp", "oauth", "client", "auth"] +license = { text = "MIT" } +classifiers = [ + "Development Status :: 4 - Beta", + "Intended Audience :: Developers", + "License :: OSI Approved :: MIT License", + "Programming Language :: Python :: 3", + "Programming Language :: Python :: 3.10", +] +dependencies = [ + "click>=8.0.0", + "mcp>=1.0.0", +] + +[project.scripts] +mcp-simple-auth-client = "mcp_simple_auth_client.main:cli" + +[build-system] +requires = ["hatchling"] +build-backend = "hatchling.build" + +[tool.hatch.build.targets.wheel] +packages = ["mcp_simple_auth_client"] + +[tool.pyright] +include = ["mcp_simple_auth_client"] +venvPath = "." 
+venv = ".venv" + +[tool.ruff.lint] +select = ["E", "F", "I"] +ignore = [] + +[tool.ruff] +line-length = 88 +target-version = "py310" + +[tool.uv] +dev-dependencies = ["pyright>=1.1.379", "pytest>=8.3.3", "ruff>=0.6.9"] + +[tool.uv.sources] +mcp = { path = "../../../" } + +[[tool.uv.index]] +url = "https://pypi.org/simple" diff --git a/examples/clients/simple-auth-client/uv.lock b/examples/clients/simple-auth-client/uv.lock new file mode 100644 index 000000000..a62447fcb --- /dev/null +++ b/examples/clients/simple-auth-client/uv.lock @@ -0,0 +1,535 @@ +version = 1 +requires-python = ">=3.10" + +[[package]] +name = "annotated-types" +version = "0.7.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/ee/67/531ea369ba64dcff5ec9c3402f9f51bf748cec26dde048a2f973a4eea7f5/annotated_types-0.7.0.tar.gz", hash = "sha256:aff07c09a53a08bc8cfccb9c85b05f1aa9a2a6f23728d790723543408344ce89", size = 16081 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/78/b6/6307fbef88d9b5ee7421e68d78a9f162e0da4900bc5f5793f6d3d0e34fb8/annotated_types-0.7.0-py3-none-any.whl", hash = "sha256:1f02e8b43a8fbbc3f3e0d4f0f4bfc8131bcb4eebe8849b8e5c773f3a1c582a53", size = 13643 }, +] + +[[package]] +name = "anyio" +version = "4.9.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "exceptiongroup", marker = "python_full_version < '3.11'" }, + { name = "idna" }, + { name = "sniffio" }, + { name = "typing-extensions", marker = "python_full_version < '3.13'" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/95/7d/4c1bd541d4dffa1b52bd83fb8527089e097a106fc90b467a7313b105f840/anyio-4.9.0.tar.gz", hash = "sha256:673c0c244e15788651a4ff38710fea9675823028a6f08a5eda409e0c9840a028", size = 190949 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/a1/ee/48ca1a7c89ffec8b6a0c5d02b89c305671d5ffd8d3c94acf8b8c408575bb/anyio-4.9.0-py3-none-any.whl", hash = "sha256:9f76d541cad6e36af7beb62e978876f3b41e3e04f2c1fbf0884604c0a9c4d93c", size = 100916 }, +] + +[[package]] +name = "certifi" +version = "2025.4.26" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/e8/9e/c05b3920a3b7d20d3d3310465f50348e5b3694f4f88c6daf736eef3024c4/certifi-2025.4.26.tar.gz", hash = "sha256:0a816057ea3cdefcef70270d2c515e4506bbc954f417fa5ade2021213bb8f0c6", size = 160705 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/4a/7e/3db2bd1b1f9e95f7cddca6d6e75e2f2bd9f51b1246e546d88addca0106bd/certifi-2025.4.26-py3-none-any.whl", hash = "sha256:30350364dfe371162649852c63336a15c70c6510c2ad5015b21c2345311805f3", size = 159618 }, +] + +[[package]] +name = "click" +version = "8.2.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "colorama", marker = "sys_platform == 'win32'" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/cd/0f/62ca20172d4f87d93cf89665fbaedcd560ac48b465bd1d92bfc7ea6b0a41/click-8.2.0.tar.gz", hash = "sha256:f5452aeddd9988eefa20f90f05ab66f17fce1ee2a36907fd30b05bbb5953814d", size = 235857 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/a2/58/1f37bf81e3c689cc74ffa42102fa8915b59085f54a6e4a80bc6265c0f6bf/click-8.2.0-py3-none-any.whl", hash = "sha256:6b303f0b2aa85f1cb4e5303078fadcbcd4e476f114fab9b5007005711839325c", size = 102156 }, +] + +[[package]] +name = "colorama" +version = "0.4.6" +source = { registry = "https://pypi.org/simple" } +sdist = { url = 
"https://files.pythonhosted.org/packages/d8/53/6f443c9a4a8358a93a6792e2acffb9d9d5cb0a5cfd8802644b7b1c9a02e4/colorama-0.4.6.tar.gz", hash = "sha256:08695f5cb7ed6e0531a20572697297273c47b8cae5a63ffc6d6ed5c201be6e44", size = 27697 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/d1/d6/3965ed04c63042e047cb6a3e6ed1a63a35087b6a609aa3a15ed8ac56c221/colorama-0.4.6-py2.py3-none-any.whl", hash = "sha256:4f1d9991f5acc0ca119f9d443620b77f9d6b33703e51011c16baf57afb285fc6", size = 25335 }, +] + +[[package]] +name = "exceptiongroup" +version = "1.3.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "typing-extensions", marker = "python_full_version < '3.13'" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/0b/9f/a65090624ecf468cdca03533906e7c69ed7588582240cfe7cc9e770b50eb/exceptiongroup-1.3.0.tar.gz", hash = "sha256:b241f5885f560bc56a59ee63ca4c6a8bfa46ae4ad651af316d4e81817bb9fd88", size = 29749 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/36/f4/c6e662dade71f56cd2f3735141b265c3c79293c109549c1e6933b0651ffc/exceptiongroup-1.3.0-py3-none-any.whl", hash = "sha256:4d111e6e0c13d0644cad6ddaa7ed0261a0b36971f6d23e7ec9b4b9097da78a10", size = 16674 }, +] + +[[package]] +name = "h11" +version = "0.16.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/01/ee/02a2c011bdab74c6fb3c75474d40b3052059d95df7e73351460c8588d963/h11-0.16.0.tar.gz", hash = "sha256:4e35b956cf45792e4caa5885e69fba00bdbc6ffafbfa020300e549b208ee5ff1", size = 101250 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/04/4b/29cac41a4d98d144bf5f6d33995617b185d14b22401f75ca86f384e87ff1/h11-0.16.0-py3-none-any.whl", hash = "sha256:63cf8bbe7522de3bf65932fda1d9c2772064ffb3dae62d55932da54b31cb6c86", size = 37515 }, +] + +[[package]] +name = "httpcore" +version = "1.0.9" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "certifi" }, + { name = "h11" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/06/94/82699a10bca87a5556c9c59b5963f2d039dbd239f25bc2a63907a05a14cb/httpcore-1.0.9.tar.gz", hash = "sha256:6e34463af53fd2ab5d807f399a9b45ea31c3dfa2276f15a2c3f00afff6e176e8", size = 85484 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/7e/f5/f66802a942d491edb555dd61e3a9961140fd64c90bce1eafd741609d334d/httpcore-1.0.9-py3-none-any.whl", hash = "sha256:2d400746a40668fc9dec9810239072b40b4484b640a8c38fd654a024c7a1bf55", size = 78784 }, +] + +[[package]] +name = "httpx" +version = "0.28.1" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "anyio" }, + { name = "certifi" }, + { name = "httpcore" }, + { name = "idna" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/b1/df/48c586a5fe32a0f01324ee087459e112ebb7224f646c0b5023f5e79e9956/httpx-0.28.1.tar.gz", hash = "sha256:75e98c5f16b0f35b567856f597f06ff2270a374470a5c2392242528e3e3e42fc", size = 141406 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/2a/39/e50c7c3a983047577ee07d2a9e53faf5a69493943ec3f6a384bdc792deb2/httpx-0.28.1-py3-none-any.whl", hash = "sha256:d909fcccc110f8c7faf814ca82a9a4d816bc5a6dbfea25d6591d6985b8ba59ad", size = 73517 }, +] + +[[package]] +name = "httpx-sse" +version = "0.4.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/4c/60/8f4281fa9bbf3c8034fd54c0e7412e66edbab6bc74c4996bd616f8d0406e/httpx-sse-0.4.0.tar.gz", hash = 
"sha256:1e81a3a3070ce322add1d3529ed42eb5f70817f45ed6ec915ab753f961139721", size = 12624 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/e1/9b/a181f281f65d776426002f330c31849b86b31fc9d848db62e16f03ff739f/httpx_sse-0.4.0-py3-none-any.whl", hash = "sha256:f329af6eae57eaa2bdfd962b42524764af68075ea87370a2de920af5341e318f", size = 7819 }, +] + +[[package]] +name = "idna" +version = "3.10" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/f1/70/7703c29685631f5a7590aa73f1f1d3fa9a380e654b86af429e0934a32f7d/idna-3.10.tar.gz", hash = "sha256:12f65c9b470abda6dc35cf8e63cc574b1c52b11df2c86030af0ac09b01b13ea9", size = 190490 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/76/c6/c88e154df9c4e1a2a66ccf0005a88dfb2650c1dffb6f5ce603dfbd452ce3/idna-3.10-py3-none-any.whl", hash = "sha256:946d195a0d259cbba61165e88e65941f16e9b36ea6ddb97f00452bae8b1287d3", size = 70442 }, +] + +[[package]] +name = "iniconfig" +version = "2.1.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/f2/97/ebf4da567aa6827c909642694d71c9fcf53e5b504f2d96afea02718862f3/iniconfig-2.1.0.tar.gz", hash = "sha256:3abbd2e30b36733fee78f9c7f7308f2d0050e88f0087fd25c2645f63c773e1c7", size = 4793 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/2c/e1/e6716421ea10d38022b952c159d5161ca1193197fb744506875fbb87ea7b/iniconfig-2.1.0-py3-none-any.whl", hash = "sha256:9deba5723312380e77435581c6bf4935c94cbfab9b1ed33ef8d238ea168eb760", size = 6050 }, +] + +[[package]] +name = "mcp" +source = { directory = "../../../" } +dependencies = [ + { name = "anyio" }, + { name = "httpx" }, + { name = "httpx-sse" }, + { name = "pydantic" }, + { name = "pydantic-settings" }, + { name = "python-multipart" }, + { name = "sse-starlette" }, + { name = "starlette" }, + { name = "uvicorn", marker = "sys_platform != 'emscripten'" }, +] + +[package.metadata] +requires-dist = [ + { name = "anyio", specifier = ">=4.5" }, + { name = "httpx", specifier = ">=0.27" }, + { name = "httpx-sse", specifier = ">=0.4" }, + { name = "pydantic", specifier = ">=2.7.2,<3.0.0" }, + { name = "pydantic-settings", specifier = ">=2.5.2" }, + { name = "python-dotenv", marker = "extra == 'cli'", specifier = ">=1.0.0" }, + { name = "python-multipart", specifier = ">=0.0.9" }, + { name = "rich", marker = "extra == 'rich'", specifier = ">=13.9.4" }, + { name = "sse-starlette", specifier = ">=1.6.1" }, + { name = "starlette", specifier = ">=0.27" }, + { name = "typer", marker = "extra == 'cli'", specifier = ">=0.12.4" }, + { name = "uvicorn", marker = "sys_platform != 'emscripten'", specifier = ">=0.23.1" }, + { name = "websockets", marker = "extra == 'ws'", specifier = ">=15.0.1" }, +] + +[package.metadata.requires-dev] +dev = [ + { name = "pyright", specifier = ">=1.1.391" }, + { name = "pytest", specifier = ">=8.3.4" }, + { name = "pytest-examples", specifier = ">=0.0.14" }, + { name = "pytest-flakefinder", specifier = ">=1.1.0" }, + { name = "pytest-pretty", specifier = ">=1.2.0" }, + { name = "pytest-xdist", specifier = ">=3.6.1" }, + { name = "ruff", specifier = ">=0.8.5" }, + { name = "trio", specifier = ">=0.26.2" }, +] +docs = [ + { name = "mkdocs", specifier = ">=1.6.1" }, + { name = "mkdocs-glightbox", specifier = ">=0.4.0" }, + { name = "mkdocs-material", extras = ["imaging"], specifier = ">=9.5.45" }, + { name = "mkdocstrings-python", specifier = ">=1.12.2" }, +] + +[[package]] +name = "mcp-simple-auth-client" +version = "0.1.0" 
+source = { editable = "." } +dependencies = [ + { name = "click" }, + { name = "mcp" }, +] + +[package.dev-dependencies] +dev = [ + { name = "pyright" }, + { name = "pytest" }, + { name = "ruff" }, +] + +[package.metadata] +requires-dist = [ + { name = "click", specifier = ">=8.0.0" }, + { name = "mcp", directory = "../../../" }, +] + +[package.metadata.requires-dev] +dev = [ + { name = "pyright", specifier = ">=1.1.379" }, + { name = "pytest", specifier = ">=8.3.3" }, + { name = "ruff", specifier = ">=0.6.9" }, +] + +[[package]] +name = "nodeenv" +version = "1.9.1" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/43/16/fc88b08840de0e0a72a2f9d8c6bae36be573e475a6326ae854bcc549fc45/nodeenv-1.9.1.tar.gz", hash = "sha256:6ec12890a2dab7946721edbfbcd91f3319c6ccc9aec47be7c7e6b7011ee6645f", size = 47437 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/d2/1d/1b658dbd2b9fa9c4c9f32accbfc0205d532c8c6194dc0f2a4c0428e7128a/nodeenv-1.9.1-py2.py3-none-any.whl", hash = "sha256:ba11c9782d29c27c70ffbdda2d7415098754709be8a7056d79a737cd901155c9", size = 22314 }, +] + +[[package]] +name = "packaging" +version = "25.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/a1/d4/1fc4078c65507b51b96ca8f8c3ba19e6a61c8253c72794544580a7b6c24d/packaging-25.0.tar.gz", hash = "sha256:d443872c98d677bf60f6a1f2f8c1cb748e8fe762d2bf9d3148b5599295b0fc4f", size = 165727 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/20/12/38679034af332785aac8774540895e234f4d07f7545804097de4b666afd8/packaging-25.0-py3-none-any.whl", hash = "sha256:29572ef2b1f17581046b3a2227d5c611fb25ec70ca1ba8554b24b0e69331a484", size = 66469 }, +] + +[[package]] +name = "pluggy" +version = "1.6.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/f9/e2/3e91f31a7d2b083fe6ef3fa267035b518369d9511ffab804f839851d2779/pluggy-1.6.0.tar.gz", hash = "sha256:7dcc130b76258d33b90f61b658791dede3486c3e6bfb003ee5c9bfb396dd22f3", size = 69412 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/54/20/4d324d65cc6d9205fabedc306948156824eb9f0ee1633355a8f7ec5c66bf/pluggy-1.6.0-py3-none-any.whl", hash = "sha256:e920276dd6813095e9377c0bc5566d94c932c33b27a3e3945d8389c374dd4746", size = 20538 }, +] + +[[package]] +name = "pydantic" +version = "2.11.4" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "annotated-types" }, + { name = "pydantic-core" }, + { name = "typing-extensions" }, + { name = "typing-inspection" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/77/ab/5250d56ad03884ab5efd07f734203943c8a8ab40d551e208af81d0257bf2/pydantic-2.11.4.tar.gz", hash = "sha256:32738d19d63a226a52eed76645a98ee07c1f410ee41d93b4afbfa85ed8111c2d", size = 786540 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/e7/12/46b65f3534d099349e38ef6ec98b1a5a81f42536d17e0ba382c28c67ba67/pydantic-2.11.4-py3-none-any.whl", hash = "sha256:d9615eaa9ac5a063471da949c8fc16376a84afb5024688b3ff885693506764eb", size = 443900 }, +] + +[[package]] +name = "pydantic-core" +version = "2.33.2" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "typing-extensions" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/ad/88/5f2260bdfae97aabf98f1778d43f69574390ad787afb646292a638c923d4/pydantic_core-2.33.2.tar.gz", hash = "sha256:7cb8bc3605c29176e1b105350d2e6474142d7c1bd1d9327c4a9bdb46bf827acc", size = 435195 } 
+wheels = [ + { url = "https://files.pythonhosted.org/packages/e5/92/b31726561b5dae176c2d2c2dc43a9c5bfba5d32f96f8b4c0a600dd492447/pydantic_core-2.33.2-cp310-cp310-macosx_10_12_x86_64.whl", hash = "sha256:2b3d326aaef0c0399d9afffeb6367d5e26ddc24d351dbc9c636840ac355dc5d8", size = 2028817 }, + { url = "https://files.pythonhosted.org/packages/a3/44/3f0b95fafdaca04a483c4e685fe437c6891001bf3ce8b2fded82b9ea3aa1/pydantic_core-2.33.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:0e5b2671f05ba48b94cb90ce55d8bdcaaedb8ba00cc5359f6810fc918713983d", size = 1861357 }, + { url = "https://files.pythonhosted.org/packages/30/97/e8f13b55766234caae05372826e8e4b3b96e7b248be3157f53237682e43c/pydantic_core-2.33.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0069c9acc3f3981b9ff4cdfaf088e98d83440a4c7ea1bc07460af3d4dc22e72d", size = 1898011 }, + { url = "https://files.pythonhosted.org/packages/9b/a3/99c48cf7bafc991cc3ee66fd544c0aae8dc907b752f1dad2d79b1b5a471f/pydantic_core-2.33.2-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:d53b22f2032c42eaaf025f7c40c2e3b94568ae077a606f006d206a463bc69572", size = 1982730 }, + { url = "https://files.pythonhosted.org/packages/de/8e/a5b882ec4307010a840fb8b58bd9bf65d1840c92eae7534c7441709bf54b/pydantic_core-2.33.2-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:0405262705a123b7ce9f0b92f123334d67b70fd1f20a9372b907ce1080c7ba02", size = 2136178 }, + { url = "https://files.pythonhosted.org/packages/e4/bb/71e35fc3ed05af6834e890edb75968e2802fe98778971ab5cba20a162315/pydantic_core-2.33.2-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:4b25d91e288e2c4e0662b8038a28c6a07eaac3e196cfc4ff69de4ea3db992a1b", size = 2736462 }, + { url = "https://files.pythonhosted.org/packages/31/0d/c8f7593e6bc7066289bbc366f2235701dcbebcd1ff0ef8e64f6f239fb47d/pydantic_core-2.33.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6bdfe4b3789761f3bcb4b1ddf33355a71079858958e3a552f16d5af19768fef2", size = 2005652 }, + { url = "https://files.pythonhosted.org/packages/d2/7a/996d8bd75f3eda405e3dd219ff5ff0a283cd8e34add39d8ef9157e722867/pydantic_core-2.33.2-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:efec8db3266b76ef9607c2c4c419bdb06bf335ae433b80816089ea7585816f6a", size = 2113306 }, + { url = "https://files.pythonhosted.org/packages/ff/84/daf2a6fb2db40ffda6578a7e8c5a6e9c8affb251a05c233ae37098118788/pydantic_core-2.33.2-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:031c57d67ca86902726e0fae2214ce6770bbe2f710dc33063187a68744a5ecac", size = 2073720 }, + { url = "https://files.pythonhosted.org/packages/77/fb/2258da019f4825128445ae79456a5499c032b55849dbd5bed78c95ccf163/pydantic_core-2.33.2-cp310-cp310-musllinux_1_1_armv7l.whl", hash = "sha256:f8de619080e944347f5f20de29a975c2d815d9ddd8be9b9b7268e2e3ef68605a", size = 2244915 }, + { url = "https://files.pythonhosted.org/packages/d8/7a/925ff73756031289468326e355b6fa8316960d0d65f8b5d6b3a3e7866de7/pydantic_core-2.33.2-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:73662edf539e72a9440129f231ed3757faab89630d291b784ca99237fb94db2b", size = 2241884 }, + { url = "https://files.pythonhosted.org/packages/0b/b0/249ee6d2646f1cdadcb813805fe76265745c4010cf20a8eba7b0e639d9b2/pydantic_core-2.33.2-cp310-cp310-win32.whl", hash = "sha256:0a39979dcbb70998b0e505fb1556a1d550a0781463ce84ebf915ba293ccb7e22", size = 1910496 }, + { url = 
"https://files.pythonhosted.org/packages/66/ff/172ba8f12a42d4b552917aa65d1f2328990d3ccfc01d5b7c943ec084299f/pydantic_core-2.33.2-cp310-cp310-win_amd64.whl", hash = "sha256:b0379a2b24882fef529ec3b4987cb5d003b9cda32256024e6fe1586ac45fc640", size = 1955019 }, + { url = "https://files.pythonhosted.org/packages/3f/8d/71db63483d518cbbf290261a1fc2839d17ff89fce7089e08cad07ccfce67/pydantic_core-2.33.2-cp311-cp311-macosx_10_12_x86_64.whl", hash = "sha256:4c5b0a576fb381edd6d27f0a85915c6daf2f8138dc5c267a57c08a62900758c7", size = 2028584 }, + { url = "https://files.pythonhosted.org/packages/24/2f/3cfa7244ae292dd850989f328722d2aef313f74ffc471184dc509e1e4e5a/pydantic_core-2.33.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:e799c050df38a639db758c617ec771fd8fb7a5f8eaaa4b27b101f266b216a246", size = 1855071 }, + { url = "https://files.pythonhosted.org/packages/b3/d3/4ae42d33f5e3f50dd467761304be2fa0a9417fbf09735bc2cce003480f2a/pydantic_core-2.33.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:dc46a01bf8d62f227d5ecee74178ffc448ff4e5197c756331f71efcc66dc980f", size = 1897823 }, + { url = "https://files.pythonhosted.org/packages/f4/f3/aa5976e8352b7695ff808599794b1fba2a9ae2ee954a3426855935799488/pydantic_core-2.33.2-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:a144d4f717285c6d9234a66778059f33a89096dfb9b39117663fd8413d582dcc", size = 1983792 }, + { url = "https://files.pythonhosted.org/packages/d5/7a/cda9b5a23c552037717f2b2a5257e9b2bfe45e687386df9591eff7b46d28/pydantic_core-2.33.2-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:73cf6373c21bc80b2e0dc88444f41ae60b2f070ed02095754eb5a01df12256de", size = 2136338 }, + { url = "https://files.pythonhosted.org/packages/2b/9f/b8f9ec8dd1417eb9da784e91e1667d58a2a4a7b7b34cf4af765ef663a7e5/pydantic_core-2.33.2-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:3dc625f4aa79713512d1976fe9f0bc99f706a9dee21dfd1810b4bbbf228d0e8a", size = 2730998 }, + { url = "https://files.pythonhosted.org/packages/47/bc/cd720e078576bdb8255d5032c5d63ee5c0bf4b7173dd955185a1d658c456/pydantic_core-2.33.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:881b21b5549499972441da4758d662aeea93f1923f953e9cbaff14b8b9565aef", size = 2003200 }, + { url = "https://files.pythonhosted.org/packages/ca/22/3602b895ee2cd29d11a2b349372446ae9727c32e78a94b3d588a40fdf187/pydantic_core-2.33.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:bdc25f3681f7b78572699569514036afe3c243bc3059d3942624e936ec93450e", size = 2113890 }, + { url = "https://files.pythonhosted.org/packages/ff/e6/e3c5908c03cf00d629eb38393a98fccc38ee0ce8ecce32f69fc7d7b558a7/pydantic_core-2.33.2-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:fe5b32187cbc0c862ee201ad66c30cf218e5ed468ec8dc1cf49dec66e160cc4d", size = 2073359 }, + { url = "https://files.pythonhosted.org/packages/12/e7/6a36a07c59ebefc8777d1ffdaf5ae71b06b21952582e4b07eba88a421c79/pydantic_core-2.33.2-cp311-cp311-musllinux_1_1_armv7l.whl", hash = "sha256:bc7aee6f634a6f4a95676fcb5d6559a2c2a390330098dba5e5a5f28a2e4ada30", size = 2245883 }, + { url = "https://files.pythonhosted.org/packages/16/3f/59b3187aaa6cc0c1e6616e8045b284de2b6a87b027cce2ffcea073adf1d2/pydantic_core-2.33.2-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:235f45e5dbcccf6bd99f9f472858849f73d11120d76ea8707115415f8e5ebebf", size = 2241074 }, + { url = 
"https://files.pythonhosted.org/packages/e0/ed/55532bb88f674d5d8f67ab121a2a13c385df382de2a1677f30ad385f7438/pydantic_core-2.33.2-cp311-cp311-win32.whl", hash = "sha256:6368900c2d3ef09b69cb0b913f9f8263b03786e5b2a387706c5afb66800efd51", size = 1910538 }, + { url = "https://files.pythonhosted.org/packages/fe/1b/25b7cccd4519c0b23c2dd636ad39d381abf113085ce4f7bec2b0dc755eb1/pydantic_core-2.33.2-cp311-cp311-win_amd64.whl", hash = "sha256:1e063337ef9e9820c77acc768546325ebe04ee38b08703244c1309cccc4f1bab", size = 1952909 }, + { url = "https://files.pythonhosted.org/packages/49/a9/d809358e49126438055884c4366a1f6227f0f84f635a9014e2deb9b9de54/pydantic_core-2.33.2-cp311-cp311-win_arm64.whl", hash = "sha256:6b99022f1d19bc32a4c2a0d544fc9a76e3be90f0b3f4af413f87d38749300e65", size = 1897786 }, + { url = "https://files.pythonhosted.org/packages/18/8a/2b41c97f554ec8c71f2a8a5f85cb56a8b0956addfe8b0efb5b3d77e8bdc3/pydantic_core-2.33.2-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:a7ec89dc587667f22b6a0b6579c249fca9026ce7c333fc142ba42411fa243cdc", size = 2009000 }, + { url = "https://files.pythonhosted.org/packages/a1/02/6224312aacb3c8ecbaa959897af57181fb6cf3a3d7917fd44d0f2917e6f2/pydantic_core-2.33.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:3c6db6e52c6d70aa0d00d45cdb9b40f0433b96380071ea80b09277dba021ddf7", size = 1847996 }, + { url = "https://files.pythonhosted.org/packages/d6/46/6dcdf084a523dbe0a0be59d054734b86a981726f221f4562aed313dbcb49/pydantic_core-2.33.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4e61206137cbc65e6d5256e1166f88331d3b6238e082d9f74613b9b765fb9025", size = 1880957 }, + { url = "https://files.pythonhosted.org/packages/ec/6b/1ec2c03837ac00886ba8160ce041ce4e325b41d06a034adbef11339ae422/pydantic_core-2.33.2-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:eb8c529b2819c37140eb51b914153063d27ed88e3bdc31b71198a198e921e011", size = 1964199 }, + { url = "https://files.pythonhosted.org/packages/2d/1d/6bf34d6adb9debd9136bd197ca72642203ce9aaaa85cfcbfcf20f9696e83/pydantic_core-2.33.2-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:c52b02ad8b4e2cf14ca7b3d918f3eb0ee91e63b3167c32591e57c4317e134f8f", size = 2120296 }, + { url = "https://files.pythonhosted.org/packages/e0/94/2bd0aaf5a591e974b32a9f7123f16637776c304471a0ab33cf263cf5591a/pydantic_core-2.33.2-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:96081f1605125ba0855dfda83f6f3df5ec90c61195421ba72223de35ccfb2f88", size = 2676109 }, + { url = "https://files.pythonhosted.org/packages/f9/41/4b043778cf9c4285d59742281a769eac371b9e47e35f98ad321349cc5d61/pydantic_core-2.33.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8f57a69461af2a5fa6e6bbd7a5f60d3b7e6cebb687f55106933188e79ad155c1", size = 2002028 }, + { url = "https://files.pythonhosted.org/packages/cb/d5/7bb781bf2748ce3d03af04d5c969fa1308880e1dca35a9bd94e1a96a922e/pydantic_core-2.33.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:572c7e6c8bb4774d2ac88929e3d1f12bc45714ae5ee6d9a788a9fb35e60bb04b", size = 2100044 }, + { url = "https://files.pythonhosted.org/packages/fe/36/def5e53e1eb0ad896785702a5bbfd25eed546cdcf4087ad285021a90ed53/pydantic_core-2.33.2-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:db4b41f9bd95fbe5acd76d89920336ba96f03e149097365afe1cb092fceb89a1", size = 2058881 }, + { url = 
"https://files.pythonhosted.org/packages/01/6c/57f8d70b2ee57fc3dc8b9610315949837fa8c11d86927b9bb044f8705419/pydantic_core-2.33.2-cp312-cp312-musllinux_1_1_armv7l.whl", hash = "sha256:fa854f5cf7e33842a892e5c73f45327760bc7bc516339fda888c75ae60edaeb6", size = 2227034 }, + { url = "https://files.pythonhosted.org/packages/27/b9/9c17f0396a82b3d5cbea4c24d742083422639e7bb1d5bf600e12cb176a13/pydantic_core-2.33.2-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:5f483cfb75ff703095c59e365360cb73e00185e01aaea067cd19acffd2ab20ea", size = 2234187 }, + { url = "https://files.pythonhosted.org/packages/b0/6a/adf5734ffd52bf86d865093ad70b2ce543415e0e356f6cacabbc0d9ad910/pydantic_core-2.33.2-cp312-cp312-win32.whl", hash = "sha256:9cb1da0f5a471435a7bc7e439b8a728e8b61e59784b2af70d7c169f8dd8ae290", size = 1892628 }, + { url = "https://files.pythonhosted.org/packages/43/e4/5479fecb3606c1368d496a825d8411e126133c41224c1e7238be58b87d7e/pydantic_core-2.33.2-cp312-cp312-win_amd64.whl", hash = "sha256:f941635f2a3d96b2973e867144fde513665c87f13fe0e193c158ac51bfaaa7b2", size = 1955866 }, + { url = "https://files.pythonhosted.org/packages/0d/24/8b11e8b3e2be9dd82df4b11408a67c61bb4dc4f8e11b5b0fc888b38118b5/pydantic_core-2.33.2-cp312-cp312-win_arm64.whl", hash = "sha256:cca3868ddfaccfbc4bfb1d608e2ccaaebe0ae628e1416aeb9c4d88c001bb45ab", size = 1888894 }, + { url = "https://files.pythonhosted.org/packages/46/8c/99040727b41f56616573a28771b1bfa08a3d3fe74d3d513f01251f79f172/pydantic_core-2.33.2-cp313-cp313-macosx_10_12_x86_64.whl", hash = "sha256:1082dd3e2d7109ad8b7da48e1d4710c8d06c253cbc4a27c1cff4fbcaa97a9e3f", size = 2015688 }, + { url = "https://files.pythonhosted.org/packages/3a/cc/5999d1eb705a6cefc31f0b4a90e9f7fc400539b1a1030529700cc1b51838/pydantic_core-2.33.2-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:f517ca031dfc037a9c07e748cefd8d96235088b83b4f4ba8939105d20fa1dcd6", size = 1844808 }, + { url = "https://files.pythonhosted.org/packages/6f/5e/a0a7b8885c98889a18b6e376f344da1ef323d270b44edf8174d6bce4d622/pydantic_core-2.33.2-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0a9f2c9dd19656823cb8250b0724ee9c60a82f3cdf68a080979d13092a3b0fef", size = 1885580 }, + { url = "https://files.pythonhosted.org/packages/3b/2a/953581f343c7d11a304581156618c3f592435523dd9d79865903272c256a/pydantic_core-2.33.2-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:2b0a451c263b01acebe51895bfb0e1cc842a5c666efe06cdf13846c7418caa9a", size = 1973859 }, + { url = "https://files.pythonhosted.org/packages/e6/55/f1a813904771c03a3f97f676c62cca0c0a4138654107c1b61f19c644868b/pydantic_core-2.33.2-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1ea40a64d23faa25e62a70ad163571c0b342b8bf66d5fa612ac0dec4f069d916", size = 2120810 }, + { url = "https://files.pythonhosted.org/packages/aa/c3/053389835a996e18853ba107a63caae0b9deb4a276c6b472931ea9ae6e48/pydantic_core-2.33.2-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:0fb2d542b4d66f9470e8065c5469ec676978d625a8b7a363f07d9a501a9cb36a", size = 2676498 }, + { url = "https://files.pythonhosted.org/packages/eb/3c/f4abd740877a35abade05e437245b192f9d0ffb48bbbbd708df33d3cda37/pydantic_core-2.33.2-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9fdac5d6ffa1b5a83bca06ffe7583f5576555e6c8b3a91fbd25ea7780f825f7d", size = 2000611 }, + { url = 
"https://files.pythonhosted.org/packages/59/a7/63ef2fed1837d1121a894d0ce88439fe3e3b3e48c7543b2a4479eb99c2bd/pydantic_core-2.33.2-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:04a1a413977ab517154eebb2d326da71638271477d6ad87a769102f7c2488c56", size = 2107924 }, + { url = "https://files.pythonhosted.org/packages/04/8f/2551964ef045669801675f1cfc3b0d74147f4901c3ffa42be2ddb1f0efc4/pydantic_core-2.33.2-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:c8e7af2f4e0194c22b5b37205bfb293d166a7344a5b0d0eaccebc376546d77d5", size = 2063196 }, + { url = "https://files.pythonhosted.org/packages/26/bd/d9602777e77fc6dbb0c7db9ad356e9a985825547dce5ad1d30ee04903918/pydantic_core-2.33.2-cp313-cp313-musllinux_1_1_armv7l.whl", hash = "sha256:5c92edd15cd58b3c2d34873597a1e20f13094f59cf88068adb18947df5455b4e", size = 2236389 }, + { url = "https://files.pythonhosted.org/packages/42/db/0e950daa7e2230423ab342ae918a794964b053bec24ba8af013fc7c94846/pydantic_core-2.33.2-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:65132b7b4a1c0beded5e057324b7e16e10910c106d43675d9bd87d4f38dde162", size = 2239223 }, + { url = "https://files.pythonhosted.org/packages/58/4d/4f937099c545a8a17eb52cb67fe0447fd9a373b348ccfa9a87f141eeb00f/pydantic_core-2.33.2-cp313-cp313-win32.whl", hash = "sha256:52fb90784e0a242bb96ec53f42196a17278855b0f31ac7c3cc6f5c1ec4811849", size = 1900473 }, + { url = "https://files.pythonhosted.org/packages/a0/75/4a0a9bac998d78d889def5e4ef2b065acba8cae8c93696906c3a91f310ca/pydantic_core-2.33.2-cp313-cp313-win_amd64.whl", hash = "sha256:c083a3bdd5a93dfe480f1125926afcdbf2917ae714bdb80b36d34318b2bec5d9", size = 1955269 }, + { url = "https://files.pythonhosted.org/packages/f9/86/1beda0576969592f1497b4ce8e7bc8cbdf614c352426271b1b10d5f0aa64/pydantic_core-2.33.2-cp313-cp313-win_arm64.whl", hash = "sha256:e80b087132752f6b3d714f041ccf74403799d3b23a72722ea2e6ba2e892555b9", size = 1893921 }, + { url = "https://files.pythonhosted.org/packages/a4/7d/e09391c2eebeab681df2b74bfe6c43422fffede8dc74187b2b0bf6fd7571/pydantic_core-2.33.2-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:61c18fba8e5e9db3ab908620af374db0ac1baa69f0f32df4f61ae23f15e586ac", size = 1806162 }, + { url = "https://files.pythonhosted.org/packages/f1/3d/847b6b1fed9f8ed3bb95a9ad04fbd0b212e832d4f0f50ff4d9ee5a9f15cf/pydantic_core-2.33.2-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:95237e53bb015f67b63c91af7518a62a8660376a6a0db19b89acc77a4d6199f5", size = 1981560 }, + { url = "https://files.pythonhosted.org/packages/6f/9a/e73262f6c6656262b5fdd723ad90f518f579b7bc8622e43a942eec53c938/pydantic_core-2.33.2-cp313-cp313t-win_amd64.whl", hash = "sha256:c2fc0a768ef76c15ab9238afa6da7f69895bb5d1ee83aeea2e3509af4472d0b9", size = 1935777 }, + { url = "https://files.pythonhosted.org/packages/30/68/373d55e58b7e83ce371691f6eaa7175e3a24b956c44628eb25d7da007917/pydantic_core-2.33.2-pp310-pypy310_pp73-macosx_10_12_x86_64.whl", hash = "sha256:5c4aa4e82353f65e548c476b37e64189783aa5384903bfea4f41580f255fddfa", size = 2023982 }, + { url = "https://files.pythonhosted.org/packages/a4/16/145f54ac08c96a63d8ed6442f9dec17b2773d19920b627b18d4f10a061ea/pydantic_core-2.33.2-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:d946c8bf0d5c24bf4fe333af284c59a19358aa3ec18cb3dc4370080da1e8ad29", size = 1858412 }, + { url = "https://files.pythonhosted.org/packages/41/b1/c6dc6c3e2de4516c0bb2c46f6a373b91b5660312342a0cf5826e38ad82fa/pydantic_core-2.33.2-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = 
"sha256:87b31b6846e361ef83fedb187bb5b4372d0da3f7e28d85415efa92d6125d6e6d", size = 1892749 }, + { url = "https://files.pythonhosted.org/packages/12/73/8cd57e20afba760b21b742106f9dbdfa6697f1570b189c7457a1af4cd8a0/pydantic_core-2.33.2-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:aa9d91b338f2df0508606f7009fde642391425189bba6d8c653afd80fd6bb64e", size = 2067527 }, + { url = "https://files.pythonhosted.org/packages/e3/d5/0bb5d988cc019b3cba4a78f2d4b3854427fc47ee8ec8e9eaabf787da239c/pydantic_core-2.33.2-pp310-pypy310_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:2058a32994f1fde4ca0480ab9d1e75a0e8c87c22b53a3ae66554f9af78f2fe8c", size = 2108225 }, + { url = "https://files.pythonhosted.org/packages/f1/c5/00c02d1571913d496aabf146106ad8239dc132485ee22efe08085084ff7c/pydantic_core-2.33.2-pp310-pypy310_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:0e03262ab796d986f978f79c943fc5f620381be7287148b8010b4097f79a39ec", size = 2069490 }, + { url = "https://files.pythonhosted.org/packages/22/a8/dccc38768274d3ed3a59b5d06f59ccb845778687652daa71df0cab4040d7/pydantic_core-2.33.2-pp310-pypy310_pp73-musllinux_1_1_armv7l.whl", hash = "sha256:1a8695a8d00c73e50bff9dfda4d540b7dee29ff9b8053e38380426a85ef10052", size = 2237525 }, + { url = "https://files.pythonhosted.org/packages/d4/e7/4f98c0b125dda7cf7ccd14ba936218397b44f50a56dd8c16a3091df116c3/pydantic_core-2.33.2-pp310-pypy310_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:fa754d1850735a0b0e03bcffd9d4b4343eb417e47196e4485d9cca326073a42c", size = 2238446 }, + { url = "https://files.pythonhosted.org/packages/ce/91/2ec36480fdb0b783cd9ef6795753c1dea13882f2e68e73bce76ae8c21e6a/pydantic_core-2.33.2-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:a11c8d26a50bfab49002947d3d237abe4d9e4b5bdc8846a63537b6488e197808", size = 2066678 }, + { url = "https://files.pythonhosted.org/packages/7b/27/d4ae6487d73948d6f20dddcd94be4ea43e74349b56eba82e9bdee2d7494c/pydantic_core-2.33.2-pp311-pypy311_pp73-macosx_10_12_x86_64.whl", hash = "sha256:dd14041875d09cc0f9308e37a6f8b65f5585cf2598a53aa0123df8b129d481f8", size = 2025200 }, + { url = "https://files.pythonhosted.org/packages/f1/b8/b3cb95375f05d33801024079b9392a5ab45267a63400bf1866e7ce0f0de4/pydantic_core-2.33.2-pp311-pypy311_pp73-macosx_11_0_arm64.whl", hash = "sha256:d87c561733f66531dced0da6e864f44ebf89a8fba55f31407b00c2f7f9449593", size = 1859123 }, + { url = "https://files.pythonhosted.org/packages/05/bc/0d0b5adeda59a261cd30a1235a445bf55c7e46ae44aea28f7bd6ed46e091/pydantic_core-2.33.2-pp311-pypy311_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2f82865531efd18d6e07a04a17331af02cb7a651583c418df8266f17a63c6612", size = 1892852 }, + { url = "https://files.pythonhosted.org/packages/3e/11/d37bdebbda2e449cb3f519f6ce950927b56d62f0b84fd9cb9e372a26a3d5/pydantic_core-2.33.2-pp311-pypy311_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2bfb5112df54209d820d7bf9317c7a6c9025ea52e49f46b6a2060104bba37de7", size = 2067484 }, + { url = "https://files.pythonhosted.org/packages/8c/55/1f95f0a05ce72ecb02a8a8a1c3be0579bbc29b1d5ab68f1378b7bebc5057/pydantic_core-2.33.2-pp311-pypy311_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:64632ff9d614e5eecfb495796ad51b0ed98c453e447a76bcbeeb69615079fc7e", size = 2108896 }, + { url = "https://files.pythonhosted.org/packages/53/89/2b2de6c81fa131f423246a9109d7b2a375e83968ad0800d6e57d0574629b/pydantic_core-2.33.2-pp311-pypy311_pp73-musllinux_1_1_aarch64.whl", hash = 
"sha256:f889f7a40498cc077332c7ab6b4608d296d852182211787d4f3ee377aaae66e8", size = 2069475 }, + { url = "https://files.pythonhosted.org/packages/b8/e9/1f7efbe20d0b2b10f6718944b5d8ece9152390904f29a78e68d4e7961159/pydantic_core-2.33.2-pp311-pypy311_pp73-musllinux_1_1_armv7l.whl", hash = "sha256:de4b83bb311557e439b9e186f733f6c645b9417c84e2eb8203f3f820a4b988bf", size = 2239013 }, + { url = "https://files.pythonhosted.org/packages/3c/b2/5309c905a93811524a49b4e031e9851a6b00ff0fb668794472ea7746b448/pydantic_core-2.33.2-pp311-pypy311_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:82f68293f055f51b51ea42fafc74b6aad03e70e191799430b90c13d643059ebb", size = 2238715 }, + { url = "https://files.pythonhosted.org/packages/32/56/8a7ca5d2cd2cda1d245d34b1c9a942920a718082ae8e54e5f3e5a58b7add/pydantic_core-2.33.2-pp311-pypy311_pp73-win_amd64.whl", hash = "sha256:329467cecfb529c925cf2bbd4d60d2c509bc2fb52a20c1045bf09bb70971a9c1", size = 2066757 }, +] + +[[package]] +name = "pydantic-settings" +version = "2.9.1" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "pydantic" }, + { name = "python-dotenv" }, + { name = "typing-inspection" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/67/1d/42628a2c33e93f8e9acbde0d5d735fa0850f3e6a2f8cb1eb6c40b9a732ac/pydantic_settings-2.9.1.tar.gz", hash = "sha256:c509bf79d27563add44e8446233359004ed85066cd096d8b510f715e6ef5d268", size = 163234 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/b6/5f/d6d641b490fd3ec2c4c13b4244d68deea3a1b970a97be64f34fb5504ff72/pydantic_settings-2.9.1-py3-none-any.whl", hash = "sha256:59b4f431b1defb26fe620c71a7d3968a710d719f5f4cdbbdb7926edeb770f6ef", size = 44356 }, +] + +[[package]] +name = "pyright" +version = "1.1.400" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "nodeenv" }, + { name = "typing-extensions" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/6c/cb/c306618a02d0ee8aed5fb8d0fe0ecfed0dbf075f71468f03a30b5f4e1fe0/pyright-1.1.400.tar.gz", hash = "sha256:b8a3ba40481aa47ba08ffb3228e821d22f7d391f83609211335858bf05686bdb", size = 3846546 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/c8/a5/5d285e4932cf149c90e3c425610c5efaea005475d5f96f1bfdb452956c62/pyright-1.1.400-py3-none-any.whl", hash = "sha256:c80d04f98b5a4358ad3a35e241dbf2a408eee33a40779df365644f8054d2517e", size = 5563460 }, +] + +[[package]] +name = "pytest" +version = "8.3.5" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "colorama", marker = "sys_platform == 'win32'" }, + { name = "exceptiongroup", marker = "python_full_version < '3.11'" }, + { name = "iniconfig" }, + { name = "packaging" }, + { name = "pluggy" }, + { name = "tomli", marker = "python_full_version < '3.11'" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/ae/3c/c9d525a414d506893f0cd8a8d0de7706446213181570cdbd766691164e40/pytest-8.3.5.tar.gz", hash = "sha256:f4efe70cc14e511565ac476b57c279e12a855b11f48f212af1080ef2263d3845", size = 1450891 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/30/3d/64ad57c803f1fa1e963a7946b6e0fea4a70df53c1a7fed304586539c2bac/pytest-8.3.5-py3-none-any.whl", hash = "sha256:c69214aa47deac29fad6c2a4f590b9c4a9fdb16a403176fe154b79c0b4d4d820", size = 343634 }, +] + +[[package]] +name = "python-dotenv" +version = "1.1.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = 
"https://files.pythonhosted.org/packages/88/2c/7bb1416c5620485aa793f2de31d3df393d3686aa8a8506d11e10e13c5baf/python_dotenv-1.1.0.tar.gz", hash = "sha256:41f90bc6f5f177fb41f53e87666db362025010eb28f60a01c9143bfa33a2b2d5", size = 39920 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/1e/18/98a99ad95133c6a6e2005fe89faedf294a748bd5dc803008059409ac9b1e/python_dotenv-1.1.0-py3-none-any.whl", hash = "sha256:d7c01d9e2293916c18baf562d95698754b0dbbb5e74d457c45d4f6561fb9d55d", size = 20256 }, +] + +[[package]] +name = "python-multipart" +version = "0.0.20" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/f3/87/f44d7c9f274c7ee665a29b885ec97089ec5dc034c7f3fafa03da9e39a09e/python_multipart-0.0.20.tar.gz", hash = "sha256:8dd0cab45b8e23064ae09147625994d090fa46f5b0d1e13af944c331a7fa9d13", size = 37158 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/45/58/38b5afbc1a800eeea951b9285d3912613f2603bdf897a4ab0f4bd7f405fc/python_multipart-0.0.20-py3-none-any.whl", hash = "sha256:8a62d3a8335e06589fe01f2a3e178cdcc632f3fbe0d492ad9ee0ec35aab1f104", size = 24546 }, +] + +[[package]] +name = "ruff" +version = "0.11.10" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/e8/4c/4a3c5a97faaae6b428b336dcca81d03ad04779f8072c267ad2bd860126bf/ruff-0.11.10.tar.gz", hash = "sha256:d522fb204b4959909ecac47da02830daec102eeb100fb50ea9554818d47a5fa6", size = 4165632 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/2f/9f/596c628f8824a2ce4cd12b0f0b4c0629a62dfffc5d0f742c19a1d71be108/ruff-0.11.10-py3-none-linux_armv6l.whl", hash = "sha256:859a7bfa7bc8888abbea31ef8a2b411714e6a80f0d173c2a82f9041ed6b50f58", size = 10316243 }, + { url = "https://files.pythonhosted.org/packages/3c/38/c1e0b77ab58b426f8c332c1d1d3432d9fc9a9ea622806e208220cb133c9e/ruff-0.11.10-py3-none-macosx_10_12_x86_64.whl", hash = "sha256:968220a57e09ea5e4fd48ed1c646419961a0570727c7e069842edd018ee8afed", size = 11083636 }, + { url = "https://files.pythonhosted.org/packages/23/41/b75e15961d6047d7fe1b13886e56e8413be8467a4e1be0a07f3b303cd65a/ruff-0.11.10-py3-none-macosx_11_0_arm64.whl", hash = "sha256:1067245bad978e7aa7b22f67113ecc6eb241dca0d9b696144256c3a879663bca", size = 10441624 }, + { url = "https://files.pythonhosted.org/packages/b6/2c/e396b6703f131406db1811ea3d746f29d91b41bbd43ad572fea30da1435d/ruff-0.11.10-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f4854fd09c7aed5b1590e996a81aeff0c9ff51378b084eb5a0b9cd9518e6cff2", size = 10624358 }, + { url = "https://files.pythonhosted.org/packages/bd/8c/ee6cca8bdaf0f9a3704796022851a33cd37d1340bceaf4f6e991eb164e2e/ruff-0.11.10-py3-none-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:8b4564e9f99168c0f9195a0fd5fa5928004b33b377137f978055e40008a082c5", size = 10176850 }, + { url = "https://files.pythonhosted.org/packages/e9/ce/4e27e131a434321b3b7c66512c3ee7505b446eb1c8a80777c023f7e876e6/ruff-0.11.10-py3-none-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:5b6a9cc5b62c03cc1fea0044ed8576379dbaf751d5503d718c973d5418483641", size = 11759787 }, + { url = "https://files.pythonhosted.org/packages/58/de/1e2e77fc72adc7cf5b5123fd04a59ed329651d3eab9825674a9e640b100b/ruff-0.11.10-py3-none-manylinux_2_17_ppc64.manylinux2014_ppc64.whl", hash = "sha256:607ecbb6f03e44c9e0a93aedacb17b4eb4f3563d00e8b474298a201622677947", size = 12430479 }, + { url = 
"https://files.pythonhosted.org/packages/07/ed/af0f2340f33b70d50121628ef175523cc4c37619e98d98748c85764c8d88/ruff-0.11.10-py3-none-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:7b3a522fa389402cd2137df9ddefe848f727250535c70dafa840badffb56b7a4", size = 11919760 }, + { url = "https://files.pythonhosted.org/packages/24/09/d7b3d3226d535cb89234390f418d10e00a157b6c4a06dfbe723e9322cb7d/ruff-0.11.10-py3-none-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:2f071b0deed7e9245d5820dac235cbdd4ef99d7b12ff04c330a241ad3534319f", size = 14041747 }, + { url = "https://files.pythonhosted.org/packages/62/b3/a63b4e91850e3f47f78795e6630ee9266cb6963de8f0191600289c2bb8f4/ruff-0.11.10-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4a60e3a0a617eafba1f2e4186d827759d65348fa53708ca547e384db28406a0b", size = 11550657 }, + { url = "https://files.pythonhosted.org/packages/46/63/a4f95c241d79402ccdbdb1d823d156c89fbb36ebfc4289dce092e6c0aa8f/ruff-0.11.10-py3-none-musllinux_1_2_aarch64.whl", hash = "sha256:da8ec977eaa4b7bf75470fb575bea2cb41a0e07c7ea9d5a0a97d13dbca697bf2", size = 10489671 }, + { url = "https://files.pythonhosted.org/packages/6a/9b/c2238bfebf1e473495659c523d50b1685258b6345d5ab0b418ca3f010cd7/ruff-0.11.10-py3-none-musllinux_1_2_armv7l.whl", hash = "sha256:ddf8967e08227d1bd95cc0851ef80d2ad9c7c0c5aab1eba31db49cf0a7b99523", size = 10160135 }, + { url = "https://files.pythonhosted.org/packages/ba/ef/ba7251dd15206688dbfba7d413c0312e94df3b31b08f5d695580b755a899/ruff-0.11.10-py3-none-musllinux_1_2_i686.whl", hash = "sha256:5a94acf798a82db188f6f36575d80609072b032105d114b0f98661e1679c9125", size = 11170179 }, + { url = "https://files.pythonhosted.org/packages/73/9f/5c336717293203ba275dbfa2ea16e49b29a9fd9a0ea8b6febfc17e133577/ruff-0.11.10-py3-none-musllinux_1_2_x86_64.whl", hash = "sha256:3afead355f1d16d95630df28d4ba17fb2cb9c8dfac8d21ced14984121f639bad", size = 11626021 }, + { url = "https://files.pythonhosted.org/packages/d9/2b/162fa86d2639076667c9aa59196c020dc6d7023ac8f342416c2f5ec4bda0/ruff-0.11.10-py3-none-win32.whl", hash = "sha256:dc061a98d32a97211af7e7f3fa1d4ca2fcf919fb96c28f39551f35fc55bdbc19", size = 10494958 }, + { url = "https://files.pythonhosted.org/packages/24/f3/66643d8f32f50a4b0d09a4832b7d919145ee2b944d43e604fbd7c144d175/ruff-0.11.10-py3-none-win_amd64.whl", hash = "sha256:5cc725fbb4d25b0f185cb42df07ab6b76c4489b4bfb740a175f3a59c70e8a224", size = 11650285 }, + { url = "https://files.pythonhosted.org/packages/95/3a/2e8704d19f376c799748ff9cb041225c1d59f3e7711bc5596c8cfdc24925/ruff-0.11.10-py3-none-win_arm64.whl", hash = "sha256:ef69637b35fb8b210743926778d0e45e1bffa850a7c61e428c6b971549b5f5d1", size = 10765278 }, +] + +[[package]] +name = "sniffio" +version = "1.3.1" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/a2/87/a6771e1546d97e7e041b6ae58d80074f81b7d5121207425c964ddf5cfdbd/sniffio-1.3.1.tar.gz", hash = "sha256:f4324edc670a0f49750a81b895f35c3adb843cca46f0530f79fc1babb23789dc", size = 20372 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/e9/44/75a9c9421471a6c4805dbf2356f7c181a29c1879239abab1ea2cc8f38b40/sniffio-1.3.1-py3-none-any.whl", hash = "sha256:2f6da418d1f1e0fddd844478f41680e794e6051915791a034ff65e5f100525a2", size = 10235 }, +] + +[[package]] +name = "sse-starlette" +version = "2.3.5" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "anyio" }, + { name = "starlette" }, +] +sdist = { url = 
"https://files.pythonhosted.org/packages/10/5f/28f45b1ff14bee871bacafd0a97213f7ec70e389939a80c60c0fb72a9fc9/sse_starlette-2.3.5.tar.gz", hash = "sha256:228357b6e42dcc73a427990e2b4a03c023e2495ecee82e14f07ba15077e334b2", size = 17511 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/c8/48/3e49cf0f64961656402c0023edbc51844fe17afe53ab50e958a6dbbbd499/sse_starlette-2.3.5-py3-none-any.whl", hash = "sha256:251708539a335570f10eaaa21d1848a10c42ee6dc3a9cf37ef42266cdb1c52a8", size = 10233 }, +] + +[[package]] +name = "starlette" +version = "0.46.2" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "anyio" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/ce/20/08dfcd9c983f6a6f4a1000d934b9e6d626cff8d2eeb77a89a68eef20a2b7/starlette-0.46.2.tar.gz", hash = "sha256:7f7361f34eed179294600af672f565727419830b54b7b084efe44bb82d2fccd5", size = 2580846 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/8b/0c/9d30a4ebeb6db2b25a841afbb80f6ef9a854fc3b41be131d249a977b4959/starlette-0.46.2-py3-none-any.whl", hash = "sha256:595633ce89f8ffa71a015caed34a5b2dc1c0cdb3f0f1fbd1e69339cf2abeec35", size = 72037 }, +] + +[[package]] +name = "tomli" +version = "2.2.1" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/18/87/302344fed471e44a87289cf4967697d07e532f2421fdaf868a303cbae4ff/tomli-2.2.1.tar.gz", hash = "sha256:cd45e1dc79c835ce60f7404ec8119f2eb06d38b1deba146f07ced3bbc44505ff", size = 17175 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/43/ca/75707e6efa2b37c77dadb324ae7d9571cb424e61ea73fad7c56c2d14527f/tomli-2.2.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:678e4fa69e4575eb77d103de3df8a895e1591b48e740211bd1067378c69e8249", size = 131077 }, + { url = "https://files.pythonhosted.org/packages/c7/16/51ae563a8615d472fdbffc43a3f3d46588c264ac4f024f63f01283becfbb/tomli-2.2.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:023aa114dd824ade0100497eb2318602af309e5a55595f76b626d6d9f3b7b0a6", size = 123429 }, + { url = "https://files.pythonhosted.org/packages/f1/dd/4f6cd1e7b160041db83c694abc78e100473c15d54620083dbd5aae7b990e/tomli-2.2.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ece47d672db52ac607a3d9599a9d48dcb2f2f735c6c2d1f34130085bb12b112a", size = 226067 }, + { url = "https://files.pythonhosted.org/packages/a9/6b/c54ede5dc70d648cc6361eaf429304b02f2871a345bbdd51e993d6cdf550/tomli-2.2.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6972ca9c9cc9f0acaa56a8ca1ff51e7af152a9f87fb64623e31d5c83700080ee", size = 236030 }, + { url = "https://files.pythonhosted.org/packages/1f/47/999514fa49cfaf7a92c805a86c3c43f4215621855d151b61c602abb38091/tomli-2.2.1-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:c954d2250168d28797dd4e3ac5cf812a406cd5a92674ee4c8f123c889786aa8e", size = 240898 }, + { url = "https://files.pythonhosted.org/packages/73/41/0a01279a7ae09ee1573b423318e7934674ce06eb33f50936655071d81a24/tomli-2.2.1-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:8dd28b3e155b80f4d54beb40a441d366adcfe740969820caf156c019fb5c7ec4", size = 229894 }, + { url = "https://files.pythonhosted.org/packages/55/18/5d8bc5b0a0362311ce4d18830a5d28943667599a60d20118074ea1b01bb7/tomli-2.2.1-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:e59e304978767a54663af13c07b3d1af22ddee3bb2fb0618ca1593e4f593a106", size = 245319 }, + { url = 
"https://files.pythonhosted.org/packages/92/a3/7ade0576d17f3cdf5ff44d61390d4b3febb8a9fc2b480c75c47ea048c646/tomli-2.2.1-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:33580bccab0338d00994d7f16f4c4ec25b776af3ffaac1ed74e0b3fc95e885a8", size = 238273 }, + { url = "https://files.pythonhosted.org/packages/72/6f/fa64ef058ac1446a1e51110c375339b3ec6be245af9d14c87c4a6412dd32/tomli-2.2.1-cp311-cp311-win32.whl", hash = "sha256:465af0e0875402f1d226519c9904f37254b3045fc5084697cefb9bdde1ff99ff", size = 98310 }, + { url = "https://files.pythonhosted.org/packages/6a/1c/4a2dcde4a51b81be3530565e92eda625d94dafb46dbeb15069df4caffc34/tomli-2.2.1-cp311-cp311-win_amd64.whl", hash = "sha256:2d0f2fdd22b02c6d81637a3c95f8cd77f995846af7414c5c4b8d0545afa1bc4b", size = 108309 }, + { url = "https://files.pythonhosted.org/packages/52/e1/f8af4c2fcde17500422858155aeb0d7e93477a0d59a98e56cbfe75070fd0/tomli-2.2.1-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:4a8f6e44de52d5e6c657c9fe83b562f5f4256d8ebbfe4ff922c495620a7f6cea", size = 132762 }, + { url = "https://files.pythonhosted.org/packages/03/b8/152c68bb84fc00396b83e7bbddd5ec0bd3dd409db4195e2a9b3e398ad2e3/tomli-2.2.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:8d57ca8095a641b8237d5b079147646153d22552f1c637fd3ba7f4b0b29167a8", size = 123453 }, + { url = "https://files.pythonhosted.org/packages/c8/d6/fc9267af9166f79ac528ff7e8c55c8181ded34eb4b0e93daa767b8841573/tomli-2.2.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4e340144ad7ae1533cb897d406382b4b6fede8890a03738ff1683af800d54192", size = 233486 }, + { url = "https://files.pythonhosted.org/packages/5c/51/51c3f2884d7bab89af25f678447ea7d297b53b5a3b5730a7cb2ef6069f07/tomli-2.2.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:db2b95f9de79181805df90bedc5a5ab4c165e6ec3fe99f970d0e302f384ad222", size = 242349 }, + { url = "https://files.pythonhosted.org/packages/ab/df/bfa89627d13a5cc22402e441e8a931ef2108403db390ff3345c05253935e/tomli-2.2.1-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:40741994320b232529c802f8bc86da4e1aa9f413db394617b9a256ae0f9a7f77", size = 252159 }, + { url = "https://files.pythonhosted.org/packages/9e/6e/fa2b916dced65763a5168c6ccb91066f7639bdc88b48adda990db10c8c0b/tomli-2.2.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:400e720fe168c0f8521520190686ef8ef033fb19fc493da09779e592861b78c6", size = 237243 }, + { url = "https://files.pythonhosted.org/packages/b4/04/885d3b1f650e1153cbb93a6a9782c58a972b94ea4483ae4ac5cedd5e4a09/tomli-2.2.1-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:02abe224de6ae62c19f090f68da4e27b10af2b93213d36cf44e6e1c5abd19fdd", size = 259645 }, + { url = "https://files.pythonhosted.org/packages/9c/de/6b432d66e986e501586da298e28ebeefd3edc2c780f3ad73d22566034239/tomli-2.2.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:b82ebccc8c8a36f2094e969560a1b836758481f3dc360ce9a3277c65f374285e", size = 244584 }, + { url = "https://files.pythonhosted.org/packages/1c/9a/47c0449b98e6e7d1be6cbac02f93dd79003234ddc4aaab6ba07a9a7482e2/tomli-2.2.1-cp312-cp312-win32.whl", hash = "sha256:889f80ef92701b9dbb224e49ec87c645ce5df3fa2cc548664eb8a25e03127a98", size = 98875 }, + { url = "https://files.pythonhosted.org/packages/ef/60/9b9638f081c6f1261e2688bd487625cd1e660d0a85bd469e91d8db969734/tomli-2.2.1-cp312-cp312-win_amd64.whl", hash = "sha256:7fc04e92e1d624a4a63c76474610238576942d6b8950a2d7f908a340494e67e4", size = 109418 }, + { url = 
"https://files.pythonhosted.org/packages/04/90/2ee5f2e0362cb8a0b6499dc44f4d7d48f8fff06d28ba46e6f1eaa61a1388/tomli-2.2.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:f4039b9cbc3048b2416cc57ab3bda989a6fcf9b36cf8937f01a6e731b64f80d7", size = 132708 }, + { url = "https://files.pythonhosted.org/packages/c0/ec/46b4108816de6b385141f082ba99e315501ccd0a2ea23db4a100dd3990ea/tomli-2.2.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:286f0ca2ffeeb5b9bd4fcc8d6c330534323ec51b2f52da063b11c502da16f30c", size = 123582 }, + { url = "https://files.pythonhosted.org/packages/a0/bd/b470466d0137b37b68d24556c38a0cc819e8febe392d5b199dcd7f578365/tomli-2.2.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a92ef1a44547e894e2a17d24e7557a5e85a9e1d0048b0b5e7541f76c5032cb13", size = 232543 }, + { url = "https://files.pythonhosted.org/packages/d9/e5/82e80ff3b751373f7cead2815bcbe2d51c895b3c990686741a8e56ec42ab/tomli-2.2.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9316dc65bed1684c9a98ee68759ceaed29d229e985297003e494aa825ebb0281", size = 241691 }, + { url = "https://files.pythonhosted.org/packages/05/7e/2a110bc2713557d6a1bfb06af23dd01e7dde52b6ee7dadc589868f9abfac/tomli-2.2.1-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:e85e99945e688e32d5a35c1ff38ed0b3f41f43fad8df0bdf79f72b2ba7bc5272", size = 251170 }, + { url = "https://files.pythonhosted.org/packages/64/7b/22d713946efe00e0adbcdfd6d1aa119ae03fd0b60ebed51ebb3fa9f5a2e5/tomli-2.2.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:ac065718db92ca818f8d6141b5f66369833d4a80a9d74435a268c52bdfa73140", size = 236530 }, + { url = "https://files.pythonhosted.org/packages/38/31/3a76f67da4b0cf37b742ca76beaf819dca0ebef26d78fc794a576e08accf/tomli-2.2.1-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:d920f33822747519673ee656a4b6ac33e382eca9d331c87770faa3eef562aeb2", size = 258666 }, + { url = "https://files.pythonhosted.org/packages/07/10/5af1293da642aded87e8a988753945d0cf7e00a9452d3911dd3bb354c9e2/tomli-2.2.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:a198f10c4d1b1375d7687bc25294306e551bf1abfa4eace6650070a5c1ae2744", size = 243954 }, + { url = "https://files.pythonhosted.org/packages/5b/b9/1ed31d167be802da0fc95020d04cd27b7d7065cc6fbefdd2f9186f60d7bd/tomli-2.2.1-cp313-cp313-win32.whl", hash = "sha256:d3f5614314d758649ab2ab3a62d4f2004c825922f9e370b29416484086b264ec", size = 98724 }, + { url = "https://files.pythonhosted.org/packages/c7/32/b0963458706accd9afcfeb867c0f9175a741bf7b19cd424230714d722198/tomli-2.2.1-cp313-cp313-win_amd64.whl", hash = "sha256:a38aa0308e754b0e3c67e344754dff64999ff9b513e691d0e786265c93583c69", size = 109383 }, + { url = "https://files.pythonhosted.org/packages/6e/c2/61d3e0f47e2b74ef40a68b9e6ad5984f6241a942f7cd3bbfbdbd03861ea9/tomli-2.2.1-py3-none-any.whl", hash = "sha256:cb55c73c5f4408779d0cf3eef9f762b9c9f147a77de7b258bef0a5628adc85cc", size = 14257 }, +] + +[[package]] +name = "typing-extensions" +version = "4.13.2" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/f6/37/23083fcd6e35492953e8d2aaaa68b860eb422b34627b13f2ce3eb6106061/typing_extensions-4.13.2.tar.gz", hash = "sha256:e6c81219bd689f51865d9e372991c540bda33a0379d5573cddb9a3a23f7caaef", size = 106967 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/8b/54/b1ae86c0973cc6f0210b53d508ca3641fb6d0c56823f288d108bc7ab3cc8/typing_extensions-4.13.2-py3-none-any.whl", hash = 
"sha256:a439e7c04b49fec3e5d3e2beaa21755cadbbdc391694e28ccdd36ca4a1408f8c", size = 45806 }, +] + +[[package]] +name = "typing-inspection" +version = "0.4.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "typing-extensions" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/82/5c/e6082df02e215b846b4b8c0b887a64d7d08ffaba30605502639d44c06b82/typing_inspection-0.4.0.tar.gz", hash = "sha256:9765c87de36671694a67904bf2c96e395be9c6439bb6c87b5142569dcdd65122", size = 76222 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/31/08/aa4fdfb71f7de5176385bd9e90852eaf6b5d622735020ad600f2bab54385/typing_inspection-0.4.0-py3-none-any.whl", hash = "sha256:50e72559fcd2a6367a19f7a7e610e6afcb9fac940c650290eed893d61386832f", size = 14125 }, +] + +[[package]] +name = "uvicorn" +version = "0.34.2" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "click" }, + { name = "h11" }, + { name = "typing-extensions", marker = "python_full_version < '3.11'" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/a6/ae/9bbb19b9e1c450cf9ecaef06463e40234d98d95bf572fab11b4f19ae5ded/uvicorn-0.34.2.tar.gz", hash = "sha256:0e929828f6186353a80b58ea719861d2629d766293b6d19baf086ba31d4f3328", size = 76815 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/b1/4b/4cef6ce21a2aaca9d852a6e84ef4f135d99fcd74fa75105e2fc0c8308acd/uvicorn-0.34.2-py3-none-any.whl", hash = "sha256:deb49af569084536d269fe0a6d67e3754f104cf03aba7c11c40f01aadf33c403", size = 62483 }, +] diff --git a/examples/clients/simple-chatbot/README.MD b/examples/clients/simple-chatbot/README.MD index 683e4f3f5..22996d962 100644 --- a/examples/clients/simple-chatbot/README.MD +++ b/examples/clients/simple-chatbot/README.MD @@ -25,6 +25,7 @@ This example demonstrates how to integrate the Model Context Protocol (MCP) into ```plaintext LLM_API_KEY=your_api_key_here ``` + **Note:** The current implementation is configured to use the Groq API endpoint (`https://api.groq.com/openai/v1/chat/completions`) with the `llama-3.2-90b-vision-preview` model. If you plan to use a different LLM provider, you'll need to modify the `LLMClient` class in `main.py` to use the appropriate endpoint URL and model parameters. 3. 
**Configure servers:** diff --git a/examples/clients/simple-chatbot/mcp_simple_chatbot/main.py b/examples/clients/simple-chatbot/mcp_simple_chatbot/main.py index a06e593bd..b97b85080 100644 --- a/examples/clients/simple-chatbot/mcp_simple_chatbot/main.py +++ b/examples/clients/simple-chatbot/mcp_simple_chatbot/main.py @@ -123,7 +123,7 @@ async def list_tools(self) -> list[Any]: for item in tools_response: if isinstance(item, tuple) and item[0] == "tools": tools.extend( - Tool(tool.name, tool.description, tool.inputSchema) + Tool(tool.name, tool.description, tool.inputSchema, tool.title) for tool in item[1] ) @@ -189,9 +189,14 @@ class Tool: """Represents a tool with its properties and formatting.""" def __init__( - self, name: str, description: str, input_schema: dict[str, Any] + self, + name: str, + description: str, + input_schema: dict[str, Any], + title: str | None = None, ) -> None: self.name: str = name + self.title: str | None = title self.description: str = description self.input_schema: dict[str, Any] = input_schema @@ -211,13 +216,20 @@ def format_for_llm(self) -> str: arg_desc += " (required)" args_desc.append(arg_desc) - return f""" -Tool: {self.name} -Description: {self.description} + # Build the formatted output with title as a separate field + output = f"Tool: {self.name}\n" + + # Add human-readable title if available + if self.title: + output += f"User-readable title: {self.title}\n" + + output += f"""Description: {self.description} Arguments: {chr(10).join(args_desc)} """ + return output + class LLMClient: """Manages communication with the LLM provider.""" @@ -245,7 +257,7 @@ def get_response(self, messages: list[dict[str, str]]) -> str: } payload = { "messages": messages, - "model": "llama-3.2-90b-vision-preview", + "model": "meta-llama/llama-4-scout-17b-16e-instruct", "temperature": 0.7, "max_tokens": 4096, "top_p": 1, @@ -284,12 +296,9 @@ def __init__(self, servers: list[Server], llm_client: LLMClient) -> None: async def cleanup_servers(self) -> None: """Clean up all servers properly.""" - cleanup_tasks = [ - asyncio.create_task(server.cleanup()) for server in self.servers - ] - if cleanup_tasks: + for server in reversed(self.servers): try: - await asyncio.gather(*cleanup_tasks, return_exceptions=True) + await server.cleanup() except Exception as e: logging.warning(f"Warning during final cleanup: {e}") @@ -323,8 +332,7 @@ async def process_llm_response(self, llm_response: str) -> str: total = result["total"] percentage = (progress / total) * 100 logging.info( - f"Progress: {progress}/{total} " - f"({percentage:.1f}%)" + f"Progress: {progress}/{total} ({percentage:.1f}%)" ) return f"Tool execution result: {result}" diff --git a/examples/fastmcp/memory.py b/examples/fastmcp/memory.py index dbc890815..0f97babf1 100644 --- a/examples/fastmcp/memory.py +++ b/examples/fastmcp/memory.py @@ -47,18 +47,14 @@ DB_DSN = "postgresql://postgres:postgres@localhost:54320/memory_db" # reset memory with rm ~/.fastmcp/{USER}/memory/* -PROFILE_DIR = ( - Path.home() / ".fastmcp" / os.environ.get("USER", "anon") / "memory" -).resolve() +PROFILE_DIR = (Path.home() / ".fastmcp" / os.environ.get("USER", "anon") / "memory").resolve() PROFILE_DIR.mkdir(parents=True, exist_ok=True) def cosine_similarity(a: list[float], b: list[float]) -> float: a_array = np.array(a, dtype=np.float64) b_array = np.array(b, dtype=np.float64) - return np.dot(a_array, b_array) / ( - np.linalg.norm(a_array) * np.linalg.norm(b_array) - ) + return np.dot(a_array, b_array) / (np.linalg.norm(a_array) * 
np.linalg.norm(b_array)) async def do_ai[T]( @@ -97,9 +93,7 @@ class MemoryNode(BaseModel): summary: str = "" importance: float = 1.0 access_count: int = 0 - timestamp: float = Field( - default_factory=lambda: datetime.now(timezone.utc).timestamp() - ) + timestamp: float = Field(default_factory=lambda: datetime.now(timezone.utc).timestamp()) embedding: list[float] @classmethod @@ -152,9 +146,7 @@ async def merge_with(self, other: Self, deps: Deps): self.importance += other.importance self.access_count += other.access_count self.embedding = [(a + b) / 2 for a, b in zip(self.embedding, other.embedding)] - self.summary = await do_ai( - self.content, "Summarize the following text concisely.", str, deps - ) + self.summary = await do_ai(self.content, "Summarize the following text concisely.", str, deps) await self.save(deps) # Delete the merged node from the database if other.id is not None: @@ -221,9 +213,7 @@ async def find_similar_memories(embedding: list[float], deps: Deps) -> list[Memo async def update_importance(user_embedding: list[float], deps: Deps): async with deps.pool.acquire() as conn: - rows = await conn.fetch( - "SELECT id, importance, access_count, embedding FROM memories" - ) + rows = await conn.fetch("SELECT id, importance, access_count, embedding FROM memories") for row in rows: memory_embedding = row["embedding"] similarity = cosine_similarity(user_embedding, memory_embedding) @@ -273,9 +263,7 @@ async def display_memory_tree(deps: Deps) -> str: ) result = "" for row in rows: - effective_importance = row["importance"] * ( - 1 + math.log(row["access_count"] + 1) - ) + effective_importance = row["importance"] * (1 + math.log(row["access_count"] + 1)) summary = row["summary"] or row["content"] result += f"- {summary} (Importance: {effective_importance:.2f})\n" return result @@ -283,15 +271,11 @@ async def display_memory_tree(deps: Deps) -> str: @mcp.tool() async def remember( - contents: list[str] = Field( - description="List of observations or memories to store" - ), + contents: list[str] = Field(description="List of observations or memories to store"), ): deps = Deps(openai=AsyncOpenAI(), pool=await get_db_pool()) try: - return "\n".join( - await asyncio.gather(*[add_memory(content, deps) for content in contents]) - ) + return "\n".join(await asyncio.gather(*[add_memory(content, deps) for content in contents])) finally: await deps.pool.close() @@ -305,9 +289,7 @@ async def read_profile() -> str: async def initialize_database(): - pool = await asyncpg.create_pool( - "postgresql://postgres:postgres@localhost:54320/postgres" - ) + pool = await asyncpg.create_pool("postgresql://postgres:postgres@localhost:54320/postgres") try: async with pool.acquire() as conn: await conn.execute(""" diff --git a/examples/fastmcp/text_me.py b/examples/fastmcp/text_me.py index 8053c6cc5..2434dcddd 100644 --- a/examples/fastmcp/text_me.py +++ b/examples/fastmcp/text_me.py @@ -28,15 +28,11 @@ class SurgeSettings(BaseSettings): - model_config: SettingsConfigDict = SettingsConfigDict( - env_prefix="SURGE_", env_file=".env" - ) + model_config: SettingsConfigDict = SettingsConfigDict(env_prefix="SURGE_", env_file=".env") api_key: str account_id: str - my_phone_number: Annotated[ - str, BeforeValidator(lambda v: "+" + v if not v.startswith("+") else v) - ] + my_phone_number: Annotated[str, BeforeValidator(lambda v: "+" + v if not v.startswith("+") else v)] my_first_name: str my_last_name: str diff --git a/examples/fastmcp/unicode_example.py b/examples/fastmcp/unicode_example.py index a69f586a5..94ef628bb 
100644 --- a/examples/fastmcp/unicode_example.py +++ b/examples/fastmcp/unicode_example.py @@ -8,10 +8,7 @@ mcp = FastMCP() -@mcp.tool( - description="🌟 A tool that uses various Unicode characters in its description: " - "Ć” Ć© Ć­ ó Ćŗ Ʊ 漢字 šŸŽ‰" -) +@mcp.tool(description="🌟 A tool that uses various Unicode characters in its description: " "Ć” Ć© Ć­ ó Ćŗ Ʊ 漢字 šŸŽ‰") def hello_unicode(name: str = "äø–ē•Œ", greeting: str = "Ā”Hola") -> str: """ A simple tool that demonstrates Unicode handling in: diff --git a/examples/servers/simple-auth/README.md b/examples/servers/simple-auth/README.md new file mode 100644 index 000000000..2c21143c8 --- /dev/null +++ b/examples/servers/simple-auth/README.md @@ -0,0 +1,143 @@ +# MCP OAuth Authentication Demo + +This example demonstrates OAuth 2.0 authentication with the Model Context Protocol using **separate Authorization Server (AS) and Resource Server (RS)** to comply with the new RFC 9728 specification. + +--- + +## Setup Requirements + +**Create a GitHub OAuth App:** +- Go to GitHub Settings > Developer settings > OAuth Apps > New OAuth App +- **Authorization callback URL:** `http://localhost:9000/github/callback` +- Note down your **Client ID** and **Client Secret** + +**Set environment variables:** +```bash +export MCP_GITHUB_CLIENT_ID="your_client_id_here" +export MCP_GITHUB_CLIENT_SECRET="your_client_secret_here" +``` + +--- + +## Running the Servers + +### Step 1: Start Authorization Server + +```bash +# Navigate to the simple-auth directory +cd examples/servers/simple-auth + +# Start Authorization Server on port 9000 +uv run mcp-simple-auth-as --port=9000 +``` + +**What it provides:** +- OAuth 2.0 flows (registration, authorization, token exchange) +- GitHub OAuth integration for user authentication +- Token introspection endpoint for Resource Servers (`/introspect`) +- User data proxy endpoint (`/github/user`) + +--- + +### Step 2: Start Resource Server (MCP Server) + +```bash +# In another terminal, navigate to the simple-auth directory +cd examples/servers/simple-auth + +# Start Resource Server on port 8001, connected to Authorization Server +uv run mcp-simple-auth-rs --port=8001 --auth-server=http://localhost:9000 --transport=streamable-http + +# With RFC 8707 strict resource validation (recommended for production) +uv run mcp-simple-auth-rs --port=8001 --auth-server=http://localhost:9000 --transport=streamable-http --oauth-strict + +``` + + +### Step 3: Test with Client + +```bash +cd examples/clients/simple-auth-client +# Start client with streamable HTTP +MCP_SERVER_PORT=8001 MCP_TRANSPORT_TYPE=streamable_http uv run mcp-simple-auth-client +``` + + +## How It Works + +### RFC 9728 Discovery + +**Client → Resource Server:** +```bash +curl http://localhost:8001/.well-known/oauth-protected-resource +``` +```json +{ + "resource": "http://localhost:8001", + "authorization_servers": ["http://localhost:9000"] +} +``` + +**Client → Authorization Server:** +```bash +curl http://localhost:9000/.well-known/oauth-authorization-server +``` +```json +{ + "issuer": "http://localhost:9000", + "authorization_endpoint": "http://localhost:9000/authorize", + "token_endpoint": "http://localhost:9000/token" +} +``` + +## Legacy MCP Server as Authorization Server (Backwards Compatibility) + +For backwards compatibility with older MCP implementations, a legacy server is provided that acts as an Authorization Server (following the old spec where MCP servers could optionally provide OAuth): + +### Running the Legacy Server + +```bash +# Start legacy authorization 
server on port 8002 +uv run mcp-simple-auth-legacy --port=8002 +``` + +**Differences from the new architecture:** +- **MCP server acts as AS:** The MCP server itself provides OAuth endpoints (old spec behavior) +- **No separate RS:** The server handles both authentication and MCP tools +- **Local token validation:** Tokens are validated internally without introspection +- **No RFC 9728 support:** Does not provide `/.well-known/oauth-protected-resource` +- **Direct OAuth discovery:** OAuth metadata is at the MCP server's URL + +### Testing with Legacy Server + +```bash +# Test with client (will automatically fall back to legacy discovery) +cd examples/clients/simple-auth-client +MCP_SERVER_PORT=8002 MCP_TRANSPORT_TYPE=streamable_http uv run mcp-simple-auth-client +``` + +The client will: +1. Try RFC 9728 discovery at `/.well-known/oauth-protected-resource` (404 on legacy server) +2. Fall back to direct OAuth discovery at `/.well-known/oauth-authorization-server` +3. Complete authentication with the MCP server acting as its own AS + +This ensures existing MCP servers (which could optionally act as Authorization Servers under the old spec) continue to work while the ecosystem transitions to the new architecture where MCP servers are Resource Servers only. + +## Manual Testing + +### Test Discovery +```bash +# Test Resource Server discovery endpoint (new architecture) +curl -v http://localhost:8001/.well-known/oauth-protected-resource + +# Test Authorization Server metadata +curl -v http://localhost:9000/.well-known/oauth-authorization-server +``` + +### Test Token Introspection +```bash +# After getting a token through OAuth flow: +curl -X POST http://localhost:9000/introspect \ + -H "Content-Type: application/x-www-form-urlencoded" \ + -d "token=your_access_token" +``` diff --git a/examples/servers/simple-auth/mcp_simple_auth/__init__.py b/examples/servers/simple-auth/mcp_simple_auth/__init__.py new file mode 100644 index 000000000..3e12b3183 --- /dev/null +++ b/examples/servers/simple-auth/mcp_simple_auth/__init__.py @@ -0,0 +1 @@ +"""Simple MCP server with GitHub OAuth authentication.""" diff --git a/examples/servers/simple-auth/mcp_simple_auth/__main__.py b/examples/servers/simple-auth/mcp_simple_auth/__main__.py new file mode 100644 index 000000000..2365ff5a1 --- /dev/null +++ b/examples/servers/simple-auth/mcp_simple_auth/__main__.py @@ -0,0 +1,7 @@ +"""Main entry point for simple MCP server with GitHub OAuth authentication.""" + +import sys + +from mcp_simple_auth.server import main + +sys.exit(main()) # type: ignore[call-arg] diff --git a/examples/servers/simple-auth/mcp_simple_auth/auth_server.py b/examples/servers/simple-auth/mcp_simple_auth/auth_server.py new file mode 100644 index 000000000..2594f81d6 --- /dev/null +++ b/examples/servers/simple-auth/mcp_simple_auth/auth_server.py @@ -0,0 +1,234 @@ +""" +Authorization Server for MCP Split Demo. + +This server handles OAuth flows, client registration, and token issuance. +Can be replaced with enterprise authorization servers like Auth0, Entra ID, etc. + +NOTE: this is a simplified example for demonstration purposes. +This is not a production-ready implementation. 
+ +Usage: + python -m mcp_simple_auth.auth_server --port=9000 +""" + +import asyncio +import logging +import time + +import click +from pydantic import AnyHttpUrl, BaseModel +from starlette.applications import Starlette +from starlette.exceptions import HTTPException +from starlette.requests import Request +from starlette.responses import JSONResponse, RedirectResponse, Response +from starlette.routing import Route +from uvicorn import Config, Server + +from mcp.server.auth.routes import cors_middleware, create_auth_routes +from mcp.server.auth.settings import AuthSettings, ClientRegistrationOptions + +from .github_oauth_provider import GitHubOAuthProvider, GitHubOAuthSettings + +logger = logging.getLogger(__name__) + + +class AuthServerSettings(BaseModel): + """Settings for the Authorization Server.""" + + # Server settings + host: str = "localhost" + port: int = 9000 + server_url: AnyHttpUrl = AnyHttpUrl("http://localhost:9000") + github_callback_path: str = "http://localhost:9000/github/callback" + + +class GitHubProxyAuthProvider(GitHubOAuthProvider): + """ + Authorization Server provider that proxies GitHub OAuth. + + This provider: + 1. Issues MCP tokens after GitHub authentication + 2. Stores token state for introspection by Resource Servers + 3. Maps MCP tokens to GitHub tokens for API access + """ + + def __init__(self, github_settings: GitHubOAuthSettings, github_callback_path: str): + super().__init__(github_settings, github_callback_path) + + +def create_authorization_server(server_settings: AuthServerSettings, github_settings: GitHubOAuthSettings) -> Starlette: + """Create the Authorization Server application.""" + oauth_provider = GitHubProxyAuthProvider(github_settings, server_settings.github_callback_path) + + auth_settings = AuthSettings( + issuer_url=server_settings.server_url, + client_registration_options=ClientRegistrationOptions( + enabled=True, + valid_scopes=[github_settings.mcp_scope], + default_scopes=[github_settings.mcp_scope], + ), + required_scopes=[github_settings.mcp_scope], + resource_server_url=None, + ) + + # Create OAuth routes + routes = create_auth_routes( + provider=oauth_provider, + issuer_url=auth_settings.issuer_url, + service_documentation_url=auth_settings.service_documentation_url, + client_registration_options=auth_settings.client_registration_options, + revocation_options=auth_settings.revocation_options, + ) + + # Add GitHub callback route + async def github_callback_handler(request: Request) -> Response: + """Handle GitHub OAuth callback.""" + code = request.query_params.get("code") + state = request.query_params.get("state") + + if not code or not state: + raise HTTPException(400, "Missing code or state parameter") + + redirect_uri = await oauth_provider.handle_github_callback(code, state) + return RedirectResponse(url=redirect_uri, status_code=302) + + routes.append(Route("/github/callback", endpoint=github_callback_handler, methods=["GET"])) + + # Add token introspection endpoint (RFC 7662) for Resource Servers + async def introspect_handler(request: Request) -> Response: + """ + Token introspection endpoint for Resource Servers. + + Resource Servers call this endpoint to validate tokens without + needing direct access to token storage. 
+ """ + form = await request.form() + token = form.get("token") + if not token or not isinstance(token, str): + return JSONResponse({"active": False}, status_code=400) + + # Look up token in provider + access_token = await oauth_provider.load_access_token(token) + if not access_token: + return JSONResponse({"active": False}) + + # Return token info for Resource Server + return JSONResponse( + { + "active": True, + "client_id": access_token.client_id, + "scope": " ".join(access_token.scopes), + "exp": access_token.expires_at, + "iat": int(time.time()), + "token_type": "Bearer", + "aud": access_token.resource, # RFC 8707 audience claim + } + ) + + routes.append( + Route( + "/introspect", + endpoint=cors_middleware(introspect_handler, ["POST", "OPTIONS"]), + methods=["POST", "OPTIONS"], + ) + ) + + # Add GitHub user info endpoint (for Resource Server to fetch user data) + async def github_user_handler(request: Request) -> Response: + """ + Proxy endpoint to get GitHub user info using stored GitHub tokens. + + Resource Servers call this with MCP tokens to get GitHub user data + without exposing GitHub tokens to clients. + """ + # Extract Bearer token + auth_header = request.headers.get("authorization", "") + if not auth_header.startswith("Bearer "): + return JSONResponse({"error": "unauthorized"}, status_code=401) + + mcp_token = auth_header[7:] + + # Get GitHub user info using the provider method + user_info = await oauth_provider.get_github_user_info(mcp_token) + return JSONResponse(user_info) + + routes.append( + Route( + "/github/user", + endpoint=cors_middleware(github_user_handler, ["GET", "OPTIONS"]), + methods=["GET", "OPTIONS"], + ) + ) + + return Starlette(routes=routes) + + +async def run_server(server_settings: AuthServerSettings, github_settings: GitHubOAuthSettings): + """Run the Authorization Server.""" + auth_server = create_authorization_server(server_settings, github_settings) + + config = Config( + auth_server, + host=server_settings.host, + port=server_settings.port, + log_level="info", + ) + server = Server(config) + + logger.info("=" * 80) + logger.info("MCP AUTHORIZATION SERVER") + logger.info("=" * 80) + logger.info(f"Server URL: {server_settings.server_url}") + logger.info("Endpoints:") + logger.info(f" - OAuth Metadata: {server_settings.server_url}/.well-known/oauth-authorization-server") + logger.info(f" - Client Registration: {server_settings.server_url}/register") + logger.info(f" - Authorization: {server_settings.server_url}/authorize") + logger.info(f" - Token Exchange: {server_settings.server_url}/token") + logger.info(f" - Token Introspection: {server_settings.server_url}/introspect") + logger.info(f" - GitHub Callback: {server_settings.server_url}/github/callback") + logger.info(f" - GitHub User Proxy: {server_settings.server_url}/github/user") + logger.info("") + logger.info("Resource Servers should use /introspect to validate tokens") + logger.info("Configure GitHub App callback URL: " + server_settings.github_callback_path) + logger.info("=" * 80) + + await server.serve() + + +@click.command() +@click.option("--port", default=9000, help="Port to listen on") +def main(port: int) -> int: + """ + Run the MCP Authorization Server. + + This server handles OAuth flows and can be used by multiple Resource Servers. 
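For reference, the call a Resource Server makes against this endpoint is just a form-encoded POST; the `IntrospectionTokenVerifier` added later in this change wraps the same request with extra validation. A minimal sketch:

```python
# Minimal RFC 7662 introspection call against the /introspect route defined above.
# Assumes the Authorization Server is running on localhost:9000.
import asyncio

import httpx


async def introspect(token: str) -> dict:
    async with httpx.AsyncClient() as client:
        response = await client.post(
            "http://localhost:9000/introspect",
            data={"token": token},  # form-encoded, per RFC 7662
        )
        response.raise_for_status()
        info = response.json()
        if not info.get("active", False):
            raise ValueError("token is not active")
        # Active tokens also carry client_id, scope, exp and the RFC 8707 'aud' claim.
        return info


# asyncio.run(introspect("<access token obtained through the OAuth flow>"))
```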
+ + Environment variables needed: + - MCP_GITHUB_CLIENT_ID: GitHub OAuth Client ID + - MCP_GITHUB_CLIENT_SECRET: GitHub OAuth Client Secret + """ + logging.basicConfig(level=logging.INFO) + + # Load GitHub settings from environment variables + github_settings = GitHubOAuthSettings() + + # Validate required fields + if not github_settings.github_client_id or not github_settings.github_client_secret: + raise ValueError("GitHub credentials not provided") + + # Create server settings + host = "localhost" + server_url = f"http://{host}:{port}" + server_settings = AuthServerSettings( + host=host, + port=port, + server_url=AnyHttpUrl(server_url), + github_callback_path=f"{server_url}/github/callback", + ) + + asyncio.run(run_server(server_settings, github_settings)) + return 0 + + +if __name__ == "__main__": + main() # type: ignore[call-arg] diff --git a/examples/servers/simple-auth/mcp_simple_auth/github_oauth_provider.py b/examples/servers/simple-auth/mcp_simple_auth/github_oauth_provider.py new file mode 100644 index 000000000..c64db96b7 --- /dev/null +++ b/examples/servers/simple-auth/mcp_simple_auth/github_oauth_provider.py @@ -0,0 +1,266 @@ +""" +Shared GitHub OAuth provider for MCP servers. + +This module contains the common GitHub OAuth functionality used by both +the standalone authorization server and the legacy combined server. + +NOTE: this is a simplified example for demonstration purposes. +This is not a production-ready implementation. + +""" + +import logging +import secrets +import time +from typing import Any + +from pydantic import AnyHttpUrl +from pydantic_settings import BaseSettings, SettingsConfigDict +from starlette.exceptions import HTTPException + +from mcp.server.auth.provider import ( + AccessToken, + AuthorizationCode, + AuthorizationParams, + OAuthAuthorizationServerProvider, + RefreshToken, + construct_redirect_uri, +) +from mcp.shared._httpx_utils import create_mcp_http_client +from mcp.shared.auth import OAuthClientInformationFull, OAuthToken + +logger = logging.getLogger(__name__) + + +class GitHubOAuthSettings(BaseSettings): + """Common GitHub OAuth settings.""" + + model_config = SettingsConfigDict(env_prefix="MCP_") + + # GitHub OAuth settings - MUST be provided via environment variables + github_client_id: str | None = None + github_client_secret: str | None = None + + # GitHub OAuth URLs + github_auth_url: str = "https://github.com/login/oauth/authorize" + github_token_url: str = "https://github.com/login/oauth/access_token" + + mcp_scope: str = "user" + github_scope: str = "read:user" + + +class GitHubOAuthProvider(OAuthAuthorizationServerProvider): + """ + OAuth provider that uses GitHub as the identity provider. + + This provider handles the OAuth flow by: + 1. Redirecting users to GitHub for authentication + 2. Exchanging GitHub tokens for MCP tokens + 3. 
Maintaining token mappings for API access + """ + + def __init__(self, settings: GitHubOAuthSettings, github_callback_url: str): + self.settings = settings + self.github_callback_url = github_callback_url + self.clients: dict[str, OAuthClientInformationFull] = {} + self.auth_codes: dict[str, AuthorizationCode] = {} + self.tokens: dict[str, AccessToken] = {} + self.state_mapping: dict[str, dict[str, str | None]] = {} + # Maps MCP tokens to GitHub tokens + self.token_mapping: dict[str, str] = {} + + async def get_client(self, client_id: str) -> OAuthClientInformationFull | None: + """Get OAuth client information.""" + return self.clients.get(client_id) + + async def register_client(self, client_info: OAuthClientInformationFull): + """Register a new OAuth client.""" + self.clients[client_info.client_id] = client_info + + async def authorize(self, client: OAuthClientInformationFull, params: AuthorizationParams) -> str: + """Generate an authorization URL for GitHub OAuth flow.""" + state = params.state or secrets.token_hex(16) + + # Store state mapping for callback + self.state_mapping[state] = { + "redirect_uri": str(params.redirect_uri), + "code_challenge": params.code_challenge, + "redirect_uri_provided_explicitly": str(params.redirect_uri_provided_explicitly), + "client_id": client.client_id, + "resource": params.resource, # RFC 8707 + } + + # Build GitHub authorization URL + auth_url = ( + f"{self.settings.github_auth_url}" + f"?client_id={self.settings.github_client_id}" + f"&redirect_uri={self.github_callback_url}" + f"&scope={self.settings.github_scope}" + f"&state={state}" + ) + + return auth_url + + async def handle_github_callback(self, code: str, state: str) -> str: + """Handle GitHub OAuth callback and return redirect URI.""" + state_data = self.state_mapping.get(state) + if not state_data: + raise HTTPException(400, "Invalid state parameter") + + redirect_uri = state_data["redirect_uri"] + code_challenge = state_data["code_challenge"] + redirect_uri_provided_explicitly = state_data["redirect_uri_provided_explicitly"] == "True" + client_id = state_data["client_id"] + resource = state_data.get("resource") # RFC 8707 + + # These are required values from our own state mapping + assert redirect_uri is not None + assert code_challenge is not None + assert client_id is not None + + # Exchange code for token with GitHub + async with create_mcp_http_client() as client: + response = await client.post( + self.settings.github_token_url, + data={ + "client_id": self.settings.github_client_id, + "client_secret": self.settings.github_client_secret, + "code": code, + "redirect_uri": self.github_callback_url, + }, + headers={"Accept": "application/json"}, + ) + + if response.status_code != 200: + raise HTTPException(400, "Failed to exchange code for token") + + data = response.json() + + if "error" in data: + raise HTTPException(400, data.get("error_description", data["error"])) + + github_token = data["access_token"] + + # Create MCP authorization code + new_code = f"mcp_{secrets.token_hex(16)}" + auth_code = AuthorizationCode( + code=new_code, + client_id=client_id, + redirect_uri=AnyHttpUrl(redirect_uri), + redirect_uri_provided_explicitly=redirect_uri_provided_explicitly, + expires_at=time.time() + 300, + scopes=[self.settings.mcp_scope], + code_challenge=code_challenge, + resource=resource, # RFC 8707 + ) + self.auth_codes[new_code] = auth_code + + # Store GitHub token with MCP client_id + self.tokens[github_token] = AccessToken( + token=github_token, + client_id=client_id, + 
scopes=[self.settings.github_scope], + expires_at=None, + ) + + del self.state_mapping[state] + return construct_redirect_uri(redirect_uri, code=new_code, state=state) + + async def load_authorization_code( + self, client: OAuthClientInformationFull, authorization_code: str + ) -> AuthorizationCode | None: + """Load an authorization code.""" + return self.auth_codes.get(authorization_code) + + async def exchange_authorization_code( + self, client: OAuthClientInformationFull, authorization_code: AuthorizationCode + ) -> OAuthToken: + """Exchange authorization code for tokens.""" + if authorization_code.code not in self.auth_codes: + raise ValueError("Invalid authorization code") + + # Generate MCP access token + mcp_token = f"mcp_{secrets.token_hex(32)}" + + # Store MCP token + self.tokens[mcp_token] = AccessToken( + token=mcp_token, + client_id=client.client_id, + scopes=authorization_code.scopes, + expires_at=int(time.time()) + 3600, + resource=authorization_code.resource, # RFC 8707 + ) + + # Find GitHub token for this client + github_token = next( + ( + token + for token, data in self.tokens.items() + if (token.startswith("ghu_") or token.startswith("gho_")) and data.client_id == client.client_id + ), + None, + ) + + # Store mapping between MCP token and GitHub token + if github_token: + self.token_mapping[mcp_token] = github_token + + del self.auth_codes[authorization_code.code] + + return OAuthToken( + access_token=mcp_token, + token_type="Bearer", + expires_in=3600, + scope=" ".join(authorization_code.scopes), + ) + + async def load_access_token(self, token: str) -> AccessToken | None: + """Load and validate an access token.""" + access_token = self.tokens.get(token) + if not access_token: + return None + + # Check if expired + if access_token.expires_at and access_token.expires_at < time.time(): + del self.tokens[token] + return None + + return access_token + + async def load_refresh_token(self, client: OAuthClientInformationFull, refresh_token: str) -> RefreshToken | None: + """Load a refresh token - not supported in this example.""" + return None + + async def exchange_refresh_token( + self, + client: OAuthClientInformationFull, + refresh_token: RefreshToken, + scopes: list[str], + ) -> OAuthToken: + """Exchange refresh token - not supported in this example.""" + raise NotImplementedError("Refresh tokens not supported") + + async def revoke_token(self, token: str, token_type_hint: str | None = None) -> None: + """Revoke a token.""" + if token in self.tokens: + del self.tokens[token] + + async def get_github_user_info(self, mcp_token: str) -> dict[str, Any]: + """Get GitHub user info using MCP token.""" + github_token = self.token_mapping.get(mcp_token) + if not github_token: + raise ValueError("No GitHub token found for MCP token") + + async with create_mcp_http_client() as client: + response = await client.get( + "https://api.github.com/user", + headers={ + "Authorization": f"Bearer {github_token}", + "Accept": "application/vnd.github.v3+json", + }, + ) + + if response.status_code != 200: + raise ValueError(f"GitHub API error: {response.status_code}") + + return response.json() diff --git a/examples/servers/simple-auth/mcp_simple_auth/legacy_as_server.py b/examples/servers/simple-auth/mcp_simple_auth/legacy_as_server.py new file mode 100644 index 000000000..0725ef9ed --- /dev/null +++ b/examples/servers/simple-auth/mcp_simple_auth/legacy_as_server.py @@ -0,0 +1,152 @@ +""" +Legacy Combined Authorization Server + Resource Server for MCP. 
+ +This server implements the old spec where MCP servers could act as both AS and RS. +Used for backwards compatibility testing with the new split AS/RS architecture. + +NOTE: this is a simplified example for demonstration purposes. +This is not a production-ready implementation. + + +Usage: + python -m mcp_simple_auth.legacy_as_server --port=8002 +""" + +import logging +from typing import Any, Literal + +import click +from pydantic import AnyHttpUrl, BaseModel +from starlette.exceptions import HTTPException +from starlette.requests import Request +from starlette.responses import RedirectResponse, Response + +from mcp.server.auth.middleware.auth_context import get_access_token +from mcp.server.auth.settings import AuthSettings, ClientRegistrationOptions +from mcp.server.fastmcp.server import FastMCP + +from .github_oauth_provider import GitHubOAuthProvider, GitHubOAuthSettings + +logger = logging.getLogger(__name__) + + +class ServerSettings(BaseModel): + """Settings for the simple GitHub MCP server.""" + + # Server settings + host: str = "localhost" + port: int = 8000 + server_url: AnyHttpUrl = AnyHttpUrl("http://localhost:8000") + github_callback_path: str = "http://localhost:8000/github/callback" + + +class SimpleGitHubOAuthProvider(GitHubOAuthProvider): + """GitHub OAuth provider for legacy MCP server.""" + + def __init__(self, github_settings: GitHubOAuthSettings, github_callback_path: str): + super().__init__(github_settings, github_callback_path) + + +def create_simple_mcp_server(server_settings: ServerSettings, github_settings: GitHubOAuthSettings) -> FastMCP: + """Create a simple FastMCP server with GitHub OAuth.""" + oauth_provider = SimpleGitHubOAuthProvider(github_settings, server_settings.github_callback_path) + + auth_settings = AuthSettings( + issuer_url=server_settings.server_url, + client_registration_options=ClientRegistrationOptions( + enabled=True, + valid_scopes=[github_settings.mcp_scope], + default_scopes=[github_settings.mcp_scope], + ), + required_scopes=[github_settings.mcp_scope], + # No resource_server_url parameter in legacy mode + resource_server_url=None, + ) + + app = FastMCP( + name="Simple GitHub MCP Server", + instructions="A simple MCP server with GitHub OAuth authentication", + auth_server_provider=oauth_provider, + host=server_settings.host, + port=server_settings.port, + debug=True, + auth=auth_settings, + ) + + @app.custom_route("/github/callback", methods=["GET"]) + async def github_callback_handler(request: Request) -> Response: + """Handle GitHub OAuth callback.""" + code = request.query_params.get("code") + state = request.query_params.get("state") + + if not code or not state: + raise HTTPException(400, "Missing code or state parameter") + + redirect_uri = await oauth_provider.handle_github_callback(code, state) + return RedirectResponse(status_code=302, url=redirect_uri) + + def get_github_token() -> str: + """Get the GitHub token for the authenticated user.""" + access_token = get_access_token() + if not access_token: + raise ValueError("Not authenticated") + + # Get GitHub token from mapping + github_token = oauth_provider.token_mapping.get(access_token.token) + + if not github_token: + raise ValueError("No GitHub token found for user") + + return github_token + + @app.tool() + async def get_user_profile() -> dict[str, Any]: + """Get the authenticated user's GitHub profile information. + + This is the only tool in our simple example. It requires the 'user' scope. 
+ """ + access_token = get_access_token() + if not access_token: + raise ValueError("Not authenticated") + + return await oauth_provider.get_github_user_info(access_token.token) + + return app + + +@click.command() +@click.option("--port", default=8000, help="Port to listen on") +@click.option( + "--transport", + default="streamable-http", + type=click.Choice(["sse", "streamable-http"]), + help="Transport protocol to use ('sse' or 'streamable-http')", +) +def main(port: int, transport: Literal["sse", "streamable-http"]) -> int: + """Run the simple GitHub MCP server.""" + logging.basicConfig(level=logging.INFO) + + # Load GitHub settings from environment variables + github_settings = GitHubOAuthSettings() + + # Validate required fields + if not github_settings.github_client_id or not github_settings.github_client_secret: + raise ValueError("GitHub credentials not provided") + # Create server settings + host = "localhost" + server_url = f"http://{host}:{port}" + server_settings = ServerSettings( + host=host, + port=port, + server_url=AnyHttpUrl(server_url), + github_callback_path=f"{server_url}/github/callback", + ) + + mcp_server = create_simple_mcp_server(server_settings, github_settings) + logger.info(f"Starting server with {transport} transport") + mcp_server.run(transport=transport) + return 0 + + +if __name__ == "__main__": + main() # type: ignore[call-arg] diff --git a/examples/servers/simple-auth/mcp_simple_auth/server.py b/examples/servers/simple-auth/mcp_simple_auth/server.py new file mode 100644 index 000000000..898ee7837 --- /dev/null +++ b/examples/servers/simple-auth/mcp_simple_auth/server.py @@ -0,0 +1,226 @@ +""" +MCP Resource Server with Token Introspection. + +This server validates tokens via Authorization Server introspection and serves MCP resources. +Demonstrates RFC 9728 Protected Resource Metadata for AS/RS separation. + +Usage: + python -m mcp_simple_auth.server --port=8001 --auth-server=http://localhost:9000 +""" + +import logging +from typing import Any, Literal + +import click +import httpx +from pydantic import AnyHttpUrl +from pydantic_settings import BaseSettings, SettingsConfigDict + +from mcp.server.auth.middleware.auth_context import get_access_token +from mcp.server.auth.settings import AuthSettings +from mcp.server.fastmcp.server import FastMCP + +from .token_verifier import IntrospectionTokenVerifier + +logger = logging.getLogger(__name__) + + +class ResourceServerSettings(BaseSettings): + """Settings for the MCP Resource Server.""" + + model_config = SettingsConfigDict(env_prefix="MCP_RESOURCE_") + + # Server settings + host: str = "localhost" + port: int = 8001 + server_url: AnyHttpUrl = AnyHttpUrl("http://localhost:8001") + + # Authorization Server settings + auth_server_url: AnyHttpUrl = AnyHttpUrl("http://localhost:9000") + auth_server_introspection_endpoint: str = "http://localhost:9000/introspect" + auth_server_github_user_endpoint: str = "http://localhost:9000/github/user" + + # MCP settings + mcp_scope: str = "user" + + # RFC 8707 resource validation + oauth_strict: bool = False + + def __init__(self, **data): + """Initialize settings with values from environment variables.""" + super().__init__(**data) + + +def create_resource_server(settings: ResourceServerSettings) -> FastMCP: + """ + Create MCP Resource Server with token introspection. + + This server: + 1. Provides protected resource metadata (RFC 9728) + 2. Validates tokens via Authorization Server introspection + 3. 
Serves MCP tools and resources + """ + # Create token verifier for introspection with RFC 8707 resource validation + token_verifier = IntrospectionTokenVerifier( + introspection_endpoint=settings.auth_server_introspection_endpoint, + server_url=str(settings.server_url), + validate_resource=settings.oauth_strict, # Only validate when --oauth-strict is set + ) + + # Create FastMCP server as a Resource Server + app = FastMCP( + name="MCP Resource Server", + instructions="Resource Server that validates tokens via Authorization Server introspection", + host=settings.host, + port=settings.port, + debug=True, + # Auth configuration for RS mode + token_verifier=token_verifier, + auth=AuthSettings( + issuer_url=settings.auth_server_url, + required_scopes=[settings.mcp_scope], + resource_server_url=settings.server_url, + ), + ) + + async def get_github_user_data() -> dict[str, Any]: + """ + Get GitHub user data via Authorization Server proxy endpoint. + + This avoids exposing GitHub tokens to the Resource Server. + The Authorization Server handles the GitHub API call and returns the data. + """ + access_token = get_access_token() + if not access_token: + raise ValueError("Not authenticated") + + # Call Authorization Server's GitHub proxy endpoint + async with httpx.AsyncClient() as client: + response = await client.get( + settings.auth_server_github_user_endpoint, + headers={ + "Authorization": f"Bearer {access_token.token}", + }, + ) + + if response.status_code != 200: + raise ValueError(f"GitHub user data fetch failed: {response.status_code} - {response.text}") + + return response.json() + + @app.tool() + async def get_user_profile() -> dict[str, Any]: + """ + Get the authenticated user's GitHub profile information. + + This tool requires the 'user' scope and demonstrates how Resource Servers + can access user data without directly handling GitHub tokens. + """ + return await get_github_user_data() + + @app.tool() + async def get_user_info() -> dict[str, Any]: + """ + Get information about the currently authenticated user. + + Returns token and scope information from the Resource Server's perspective. + """ + access_token = get_access_token() + if not access_token: + raise ValueError("Not authenticated") + + return { + "authenticated": True, + "client_id": access_token.client_id, + "scopes": access_token.scopes, + "token_expires_at": access_token.expires_at, + "token_type": "Bearer", + "resource_server": str(settings.server_url), + "authorization_server": str(settings.auth_server_url), + } + + return app + + +@click.command() +@click.option("--port", default=8001, help="Port to listen on") +@click.option("--auth-server", default="http://localhost:9000", help="Authorization Server URL") +@click.option( + "--transport", + default="streamable-http", + type=click.Choice(["sse", "streamable-http"]), + help="Transport protocol to use ('sse' or 'streamable-http')", +) +@click.option( + "--oauth-strict", + is_flag=True, + help="Enable RFC 8707 resource validation", +) +def main(port: int, auth_server: str, transport: Literal["sse", "streamable-http"], oauth_strict: bool) -> int: + """ + Run the MCP Resource Server. + + This server: + - Provides RFC 9728 Protected Resource Metadata + - Validates tokens via Authorization Server introspection + - Serves MCP tools requiring authentication + + Must be used with a running Authorization Server. 
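The verifier wired in above can also be exercised on its own, which is handy when debugging the introspection flow; a minimal sketch, assuming a token already issued by the Authorization Server:

```python
# Standalone use of the introspection verifier, with the same constructor arguments
# as in create_resource_server() above. The token value is a placeholder.
import asyncio

from mcp_simple_auth.token_verifier import IntrospectionTokenVerifier


async def check(token: str) -> None:
    verifier = IntrospectionTokenVerifier(
        introspection_endpoint="http://localhost:9000/introspect",
        server_url="http://localhost:8001",
        validate_resource=True,  # equivalent to running the server with --oauth-strict
    )
    access_token = await verifier.verify_token(token)
    if access_token is None:
        print("token rejected (inactive, expired, or bound to another resource)")
    else:
        print(f"client={access_token.client_id} scopes={access_token.scopes}")


# asyncio.run(check("<access token issued by the Authorization Server>"))
```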
+ """ + logging.basicConfig(level=logging.INFO) + + try: + # Parse auth server URL + auth_server_url = AnyHttpUrl(auth_server) + + # Create settings + host = "localhost" + server_url = f"http://{host}:{port}" + settings = ResourceServerSettings( + host=host, + port=port, + server_url=AnyHttpUrl(server_url), + auth_server_url=auth_server_url, + auth_server_introspection_endpoint=f"{auth_server}/introspect", + auth_server_github_user_endpoint=f"{auth_server}/github/user", + oauth_strict=oauth_strict, + ) + except ValueError as e: + logger.error(f"Configuration error: {e}") + logger.error("Make sure to provide a valid Authorization Server URL") + return 1 + + try: + mcp_server = create_resource_server(settings) + + logger.info("=" * 80) + logger.info("šŸ“¦ MCP RESOURCE SERVER") + logger.info("=" * 80) + logger.info(f"🌐 Server URL: {settings.server_url}") + logger.info(f"šŸ”‘ Authorization Server: {settings.auth_server_url}") + logger.info("šŸ“‹ Endpoints:") + logger.info(f" ā”Œā”€ Protected Resource Metadata: {settings.server_url}/.well-known/oauth-protected-resource") + mcp_path = "sse" if transport == "sse" else "mcp" + logger.info(f" ā”œā”€ MCP Protocol: {settings.server_url}/{mcp_path}") + logger.info(f" └─ Token Introspection: {settings.auth_server_introspection_endpoint}") + logger.info("") + logger.info("šŸ› ļø Available Tools:") + logger.info(" ā”œā”€ get_user_profile() - Get GitHub user profile") + logger.info(" └─ get_user_info() - Get authentication status") + logger.info("") + logger.info("šŸ” Tokens validated via Authorization Server introspection") + logger.info("šŸ“± Clients discover Authorization Server via Protected Resource Metadata") + logger.info("=" * 80) + + # Run the server - this should block and keep running + mcp_server.run(transport=transport) + logger.info("Server stopped") + return 0 + except Exception as e: + logger.error(f"Server error: {e}") + logger.exception("Exception details:") + return 1 + + +if __name__ == "__main__": + main() # type: ignore[call-arg] diff --git a/examples/servers/simple-auth/mcp_simple_auth/token_verifier.py b/examples/servers/simple-auth/mcp_simple_auth/token_verifier.py new file mode 100644 index 000000000..de3140238 --- /dev/null +++ b/examples/servers/simple-auth/mcp_simple_auth/token_verifier.py @@ -0,0 +1,105 @@ +"""Example token verifier implementation using OAuth 2.0 Token Introspection (RFC 7662).""" + +import logging + +from mcp.server.auth.provider import AccessToken, TokenVerifier +from mcp.shared.auth_utils import check_resource_allowed, resource_url_from_server_url + +logger = logging.getLogger(__name__) + + +class IntrospectionTokenVerifier(TokenVerifier): + """Example token verifier that uses OAuth 2.0 Token Introspection (RFC 7662). + + This is a simple example implementation for demonstration purposes. 
+ Production implementations should consider: + - Connection pooling and reuse + - More sophisticated error handling + - Rate limiting and retry logic + - Comprehensive configuration options + """ + + def __init__( + self, + introspection_endpoint: str, + server_url: str, + validate_resource: bool = False, + ): + self.introspection_endpoint = introspection_endpoint + self.server_url = server_url + self.validate_resource = validate_resource + self.resource_url = resource_url_from_server_url(server_url) + + async def verify_token(self, token: str) -> AccessToken | None: + """Verify token via introspection endpoint.""" + import httpx + + # Validate URL to prevent SSRF attacks + if not self.introspection_endpoint.startswith(("https://", "http://localhost", "http://127.0.0.1")): + logger.warning(f"Rejecting introspection endpoint with unsafe scheme: {self.introspection_endpoint}") + return None + + # Configure secure HTTP client + timeout = httpx.Timeout(10.0, connect=5.0) + limits = httpx.Limits(max_connections=10, max_keepalive_connections=5) + + async with httpx.AsyncClient( + timeout=timeout, + limits=limits, + verify=True, # Enforce SSL verification + ) as client: + try: + response = await client.post( + self.introspection_endpoint, + data={"token": token}, + headers={"Content-Type": "application/x-www-form-urlencoded"}, + ) + + if response.status_code != 200: + logger.debug(f"Token introspection returned status {response.status_code}") + return None + + data = response.json() + if not data.get("active", False): + return None + + # RFC 8707 resource validation (only when --oauth-strict is set) + if self.validate_resource and not self._validate_resource(data): + logger.warning(f"Token resource validation failed. Expected: {self.resource_url}") + return None + + return AccessToken( + token=token, + client_id=data.get("client_id", "unknown"), + scopes=data.get("scope", "").split() if data.get("scope") else [], + expires_at=data.get("exp"), + resource=data.get("aud"), # Include resource in token + ) + except Exception as e: + logger.warning(f"Token introspection failed: {e}") + return None + + def _validate_resource(self, token_data: dict) -> bool: + """Validate token was issued for this resource server.""" + if not self.server_url or not self.resource_url: + return False # Fail if strict validation requested but URLs missing + + # Check 'aud' claim first (standard JWT audience) + aud = token_data.get("aud") + if isinstance(aud, list): + for audience in aud: + if self._is_valid_resource(audience): + return True + return False + elif aud: + return self._is_valid_resource(aud) + + # No resource binding - invalid per RFC 8707 + return False + + def _is_valid_resource(self, resource: str) -> bool: + """Check if resource matches this server using hierarchical matching.""" + if not self.resource_url: + return False + + return check_resource_allowed(requested_resource=self.resource_url, configured_resource=resource) diff --git a/examples/servers/simple-auth/pyproject.toml b/examples/servers/simple-auth/pyproject.toml new file mode 100644 index 000000000..86ca098b0 --- /dev/null +++ b/examples/servers/simple-auth/pyproject.toml @@ -0,0 +1,33 @@ +[project] +name = "mcp-simple-auth" +version = "0.1.0" +description = "A simple MCP server demonstrating OAuth authentication" +readme = "README.md" +requires-python = ">=3.10" +authors = [{ name = "Anthropic, PBC." 
}] +license = { text = "MIT" } +dependencies = [ + "anyio>=4.5", + "click>=8.1.0", + "httpx>=0.27", + "mcp", + "pydantic>=2.0", + "pydantic-settings>=2.5.2", + "sse-starlette>=1.6.1", + "uvicorn>=0.23.1; sys_platform != 'emscripten'", +] + +[project.scripts] +mcp-simple-auth-rs = "mcp_simple_auth.server:main" +mcp-simple-auth-as = "mcp_simple_auth.auth_server:main" +mcp-simple-auth-legacy = "mcp_simple_auth.legacy_as_server:main" + +[build-system] +requires = ["hatchling"] +build-backend = "hatchling.build" + +[tool.hatch.build.targets.wheel] +packages = ["mcp_simple_auth"] + +[tool.uv] +dev-dependencies = ["pyright>=1.1.391", "pytest>=8.3.4", "ruff>=0.8.5"] diff --git a/examples/servers/simple-prompt/mcp_simple_prompt/__main__.py b/examples/servers/simple-prompt/mcp_simple_prompt/__main__.py index 8b345fa2e..e7ef16530 100644 --- a/examples/servers/simple-prompt/mcp_simple_prompt/__main__.py +++ b/examples/servers/simple-prompt/mcp_simple_prompt/__main__.py @@ -2,4 +2,4 @@ from .server import main -sys.exit(main()) +sys.exit(main()) # type: ignore[call-arg] diff --git a/examples/servers/simple-prompt/mcp_simple_prompt/server.py b/examples/servers/simple-prompt/mcp_simple_prompt/server.py index 0552f2770..b562cc932 100644 --- a/examples/servers/simple-prompt/mcp_simple_prompt/server.py +++ b/examples/servers/simple-prompt/mcp_simple_prompt/server.py @@ -53,6 +53,7 @@ async def list_prompts() -> list[types.Prompt]: return [ types.Prompt( name="simple", + title="Simple Assistant Prompt", description="A simple prompt that can take optional context and topic " "arguments", arguments=[ @@ -90,6 +91,7 @@ async def get_prompt( if transport == "sse": from mcp.server.sse import SseServerTransport from starlette.applications import Starlette + from starlette.responses import Response from starlette.routing import Mount, Route sse = SseServerTransport("/messages/") @@ -101,6 +103,7 @@ async def handle_sse(request): await app.run( streams[0], streams[1], app.create_initialization_options() ) + return Response() starlette_app = Starlette( debug=True, @@ -112,7 +115,7 @@ async def handle_sse(request): import uvicorn - uvicorn.run(starlette_app, host="0.0.0.0", port=port) + uvicorn.run(starlette_app, host="127.0.0.1", port=port) else: from mcp.server.stdio import stdio_server diff --git a/examples/servers/simple-resource/mcp_simple_resource/__main__.py b/examples/servers/simple-resource/mcp_simple_resource/__main__.py index 8b345fa2e..e7ef16530 100644 --- a/examples/servers/simple-resource/mcp_simple_resource/__main__.py +++ b/examples/servers/simple-resource/mcp_simple_resource/__main__.py @@ -2,4 +2,4 @@ from .server import main -sys.exit(main()) +sys.exit(main()) # type: ignore[call-arg] diff --git a/examples/servers/simple-resource/mcp_simple_resource/server.py b/examples/servers/simple-resource/mcp_simple_resource/server.py index 0ec1d926a..cef29b851 100644 --- a/examples/servers/simple-resource/mcp_simple_resource/server.py +++ b/examples/servers/simple-resource/mcp_simple_resource/server.py @@ -2,12 +2,21 @@ import click import mcp.types as types from mcp.server.lowlevel import Server -from pydantic import FileUrl +from pydantic import AnyUrl, FileUrl SAMPLE_RESOURCES = { - "greeting": "Hello! This is a sample text resource.", - "help": "This server provides a few sample text resources for testing.", - "about": "This is the simple-resource MCP server implementation.", + "greeting": { + "content": "Hello! 
This is a sample text resource.", + "title": "Welcome Message", + }, + "help": { + "content": "This server provides a few sample text resources for testing.", + "title": "Help Documentation", + }, + "about": { + "content": "This is the simple-resource MCP server implementation.", + "title": "About This Server", + }, } @@ -28,6 +37,7 @@ async def list_resources() -> list[types.Resource]: types.Resource( uri=FileUrl(f"file:///{name}.txt"), name=name, + title=SAMPLE_RESOURCES[name]["title"], description=f"A sample text resource named {name}", mimeType="text/plain", ) @@ -35,17 +45,20 @@ async def list_resources() -> list[types.Resource]: ] @app.read_resource() - async def read_resource(uri: FileUrl) -> str | bytes: + async def read_resource(uri: AnyUrl) -> str | bytes: + if uri.path is None: + raise ValueError(f"Invalid resource path: {uri}") name = uri.path.replace(".txt", "").lstrip("/") if name not in SAMPLE_RESOURCES: raise ValueError(f"Unknown resource: {uri}") - return SAMPLE_RESOURCES[name] + return SAMPLE_RESOURCES[name]["content"] if transport == "sse": from mcp.server.sse import SseServerTransport from starlette.applications import Starlette + from starlette.responses import Response from starlette.routing import Mount, Route sse = SseServerTransport("/messages/") @@ -57,18 +70,19 @@ async def handle_sse(request): await app.run( streams[0], streams[1], app.create_initialization_options() ) + return Response() starlette_app = Starlette( debug=True, routes=[ - Route("/sse", endpoint=handle_sse), + Route("/sse", endpoint=handle_sse, methods=["GET"]), Mount("/messages/", app=sse.handle_post_message), ], ) import uvicorn - uvicorn.run(starlette_app, host="0.0.0.0", port=port) + uvicorn.run(starlette_app, host="127.0.0.1", port=port) else: from mcp.server.stdio import stdio_server diff --git a/examples/servers/simple-streamablehttp-stateless/README.md b/examples/servers/simple-streamablehttp-stateless/README.md new file mode 100644 index 000000000..2abb60614 --- /dev/null +++ b/examples/servers/simple-streamablehttp-stateless/README.md @@ -0,0 +1,41 @@ +# MCP Simple StreamableHttp Stateless Server Example + +A stateless MCP server example demonstrating the StreamableHttp transport without maintaining session state. This example is ideal for understanding how to deploy MCP servers in multi-node environments where requests can be routed to any instance. + +## Features + +- Uses the StreamableHTTP transport in stateless mode (mcp_session_id=None) +- Each request creates a new ephemeral connection +- No session state maintained between requests +- Task lifecycle scoped to individual requests +- Suitable for deployment in multi-node environments + + +## Usage + +Start the server: + +```bash +# Using default port 3000 +uv run mcp-simple-streamablehttp-stateless + +# Using custom port +uv run mcp-simple-streamablehttp-stateless --port 3000 + +# Custom logging level +uv run mcp-simple-streamablehttp-stateless --log-level DEBUG + +# Enable JSON responses instead of SSE streams +uv run mcp-simple-streamablehttp-stateless --json-response +``` + +The server exposes a tool named "start-notification-stream" that accepts three arguments: + +- `interval`: Time between notifications in seconds (e.g., 1.0) +- `count`: Number of notifications to send (e.g., 5) +- `caller`: Identifier string for the caller + + +## Client + +You can connect to this server using an HTTP client. 
For now, only the TypeScript SDK has streamable HTTP client examples, or you can use [Inspector](https://github.com/modelcontextprotocol/inspector) for testing. \ No newline at end of file diff --git a/examples/servers/simple-streamablehttp-stateless/mcp_simple_streamablehttp_stateless/__init__.py b/examples/servers/simple-streamablehttp-stateless/mcp_simple_streamablehttp_stateless/__init__.py new file mode 100644 index 000000000..e69de29bb diff --git a/examples/servers/simple-streamablehttp-stateless/mcp_simple_streamablehttp_stateless/__main__.py b/examples/servers/simple-streamablehttp-stateless/mcp_simple_streamablehttp_stateless/__main__.py new file mode 100644 index 000000000..1664737e3 --- /dev/null +++ b/examples/servers/simple-streamablehttp-stateless/mcp_simple_streamablehttp_stateless/__main__.py @@ -0,0 +1,7 @@ +from .server import main + +if __name__ == "__main__": + # Click will handle CLI arguments + import sys + + sys.exit(main()) # type: ignore[call-arg] diff --git a/examples/servers/simple-streamablehttp-stateless/mcp_simple_streamablehttp_stateless/server.py b/examples/servers/simple-streamablehttp-stateless/mcp_simple_streamablehttp_stateless/server.py new file mode 100644 index 000000000..6a9ff9364 --- /dev/null +++ b/examples/servers/simple-streamablehttp-stateless/mcp_simple_streamablehttp_stateless/server.py @@ -0,0 +1,139 @@ +import contextlib +import logging +from collections.abc import AsyncIterator + +import anyio +import click +import mcp.types as types +from mcp.server.lowlevel import Server +from mcp.server.streamable_http_manager import StreamableHTTPSessionManager +from starlette.applications import Starlette +from starlette.routing import Mount +from starlette.types import Receive, Scope, Send + +logger = logging.getLogger(__name__) + + +@click.command() +@click.option("--port", default=3000, help="Port to listen on for HTTP") +@click.option( + "--log-level", + default="INFO", + help="Logging level (DEBUG, INFO, WARNING, ERROR, CRITICAL)", +) +@click.option( + "--json-response", + is_flag=True, + default=False, + help="Enable JSON responses instead of SSE streams", +) +def main( + port: int, + log_level: str, + json_response: bool, +) -> int: + # Configure logging + logging.basicConfig( + level=getattr(logging, log_level.upper()), + format="%(asctime)s - %(name)s - %(levelname)s - %(message)s", + ) + + app = Server("mcp-streamable-http-stateless-demo") + + @app.call_tool() + async def call_tool(name: str, arguments: dict) -> list[types.ContentBlock]: + ctx = app.request_context + interval = arguments.get("interval", 1.0) + count = arguments.get("count", 5) + caller = arguments.get("caller", "unknown") + + # Send the specified number of notifications with the given interval + for i in range(count): + await ctx.session.send_log_message( + level="info", + data=f"Notification {i+1}/{count} from caller: {caller}", + logger="notification_stream", + related_request_id=ctx.request_id, + ) + if i < count - 1: # Don't wait after the last notification + await anyio.sleep(interval) + + return [ + types.TextContent( + type="text", + text=( + f"Sent {count} notifications with {interval}s interval" + f" for caller: {caller}" + ), + ) + ] + + @app.list_tools() + async def list_tools() -> list[types.Tool]: + return [ + types.Tool( + name="start-notification-stream", + description=( + "Sends a stream of notifications with configurable count" + " and interval" + ), + inputSchema={ + "type": "object", + "required": ["interval", "count", "caller"], + "properties": { + 
"interval": { + "type": "number", + "description": "Interval between notifications in seconds", + }, + "count": { + "type": "number", + "description": "Number of notifications to send", + }, + "caller": { + "type": "string", + "description": ( + "Identifier of the caller to include in notifications" + ), + }, + }, + }, + ) + ] + + # Create the session manager with true stateless mode + session_manager = StreamableHTTPSessionManager( + app=app, + event_store=None, + json_response=json_response, + stateless=True, + ) + + async def handle_streamable_http( + scope: Scope, receive: Receive, send: Send + ) -> None: + await session_manager.handle_request(scope, receive, send) + + @contextlib.asynccontextmanager + async def lifespan(app: Starlette) -> AsyncIterator[None]: + """Context manager for session manager.""" + async with session_manager.run(): + logger.info("Application started with StreamableHTTP session manager!") + try: + yield + finally: + logger.info("Application shutting down...") + + # Create an ASGI application using the transport + starlette_app = Starlette( + debug=True, + routes=[ + Mount("/mcp", app=handle_streamable_http), + ], + lifespan=lifespan, + ) + + import uvicorn + + uvicorn.run(starlette_app, host="127.0.0.1", port=port) + + return 0 diff --git a/examples/servers/simple-streamablehttp-stateless/pyproject.toml b/examples/servers/simple-streamablehttp-stateless/pyproject.toml new file mode 100644 index 000000000..d2b089451 --- /dev/null +++ b/examples/servers/simple-streamablehttp-stateless/pyproject.toml @@ -0,0 +1,36 @@ +[project] +name = "mcp-simple-streamablehttp-stateless" +version = "0.1.0" +description = "A simple MCP server exposing a StreamableHttp transport in stateless mode" +readme = "README.md" +requires-python = ">=3.10" +authors = [{ name = "Anthropic, PBC." }] +keywords = ["mcp", "llm", "automation", "web", "fetch", "http", "streamable", "stateless"] +license = { text = "MIT" } +dependencies = ["anyio>=4.5", "click>=8.1.0", "httpx>=0.27", "mcp", "starlette", "uvicorn"] + +[project.scripts] +mcp-simple-streamablehttp-stateless = "mcp_simple_streamablehttp_stateless.server:main" + +[build-system] +requires = ["hatchling"] +build-backend = "hatchling.build" + +[tool.hatch.build.targets.wheel] +packages = ["mcp_simple_streamablehttp_stateless"] + +[tool.pyright] +include = ["mcp_simple_streamablehttp_stateless"] +venvPath = "." +venv = ".venv" + +[tool.ruff.lint] +select = ["E", "F", "I"] +ignore = [] + +[tool.ruff] +line-length = 88 +target-version = "py310" + +[tool.uv] +dev-dependencies = ["pyright>=1.1.378", "pytest>=8.3.3", "ruff>=0.6.9"] \ No newline at end of file diff --git a/examples/servers/simple-streamablehttp/README.md b/examples/servers/simple-streamablehttp/README.md new file mode 100644 index 000000000..f850b7286 --- /dev/null +++ b/examples/servers/simple-streamablehttp/README.md @@ -0,0 +1,55 @@ +# MCP Simple StreamableHttp Server Example + +A simple MCP server example demonstrating the StreamableHttp transport, which enables HTTP-based communication with MCP servers using streaming. 
+ +## Features + +- Uses the StreamableHTTP transport for server-client communication +- Supports REST API operations (POST, GET, DELETE) for `/mcp` endpoint +- Task management with anyio task groups +- Ability to send multiple notifications over time to the client +- Proper resource cleanup and lifespan management +- Resumability support via InMemoryEventStore + +## Usage + +Start the server on the default or custom port: + +```bash + +# Using custom port +uv run mcp-simple-streamablehttp --port 3000 + +# Custom logging level +uv run mcp-simple-streamablehttp --log-level DEBUG + +# Enable JSON responses instead of SSE streams +uv run mcp-simple-streamablehttp --json-response +``` + +The server exposes a tool named "start-notification-stream" that accepts three arguments: + +- `interval`: Time between notifications in seconds (e.g., 1.0) +- `count`: Number of notifications to send (e.g., 5) +- `caller`: Identifier string for the caller + +## Resumability Support + +This server includes resumability support through the InMemoryEventStore. This enables clients to: + +- Reconnect to the server after a disconnection +- Resume event streaming from where they left off using the Last-Event-ID header + + +The server will: +- Generate unique event IDs for each SSE message +- Store events in memory for later replay +- Replay missed events when a client reconnects with a Last-Event-ID header + +Note: The InMemoryEventStore is designed for demonstration purposes only. For production use, consider implementing a persistent storage solution. + + + +## Client + +You can connect to this server using an HTTP client, for now only Typescript SDK has streamable HTTP client examples or you can use [Inspector](https://github.com/modelcontextprotocol/inspector) \ No newline at end of file diff --git a/examples/servers/simple-streamablehttp/mcp_simple_streamablehttp/__init__.py b/examples/servers/simple-streamablehttp/mcp_simple_streamablehttp/__init__.py new file mode 100644 index 000000000..e69de29bb diff --git a/examples/servers/simple-streamablehttp/mcp_simple_streamablehttp/__main__.py b/examples/servers/simple-streamablehttp/mcp_simple_streamablehttp/__main__.py new file mode 100644 index 000000000..21862e45f --- /dev/null +++ b/examples/servers/simple-streamablehttp/mcp_simple_streamablehttp/__main__.py @@ -0,0 +1,4 @@ +from .server import main + +if __name__ == "__main__": + main() # type: ignore[call-arg] diff --git a/examples/servers/simple-streamablehttp/mcp_simple_streamablehttp/event_store.py b/examples/servers/simple-streamablehttp/mcp_simple_streamablehttp/event_store.py new file mode 100644 index 000000000..28c58149f --- /dev/null +++ b/examples/servers/simple-streamablehttp/mcp_simple_streamablehttp/event_store.py @@ -0,0 +1,105 @@ +""" +In-memory event store for demonstrating resumability functionality. + +This is a simple implementation intended for examples and testing, +not for production use where a persistent storage solution would be more appropriate. +""" + +import logging +from collections import deque +from dataclasses import dataclass +from uuid import uuid4 + +from mcp.server.streamable_http import ( + EventCallback, + EventId, + EventMessage, + EventStore, + StreamId, +) +from mcp.types import JSONRPCMessage + +logger = logging.getLogger(__name__) + + +@dataclass +class EventEntry: + """ + Represents an event entry in the event store. 
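To make the store's contract concrete, a rough sketch of exercising it directly (assumed usage; the JSON-RPC message construction is only illustrative):

```python
# Store a couple of events on one stream, then replay everything after the first one.
# Assumes this example package is importable; the message content is a throwaway example.
import asyncio

from mcp.server.streamable_http import EventMessage
from mcp.types import JSONRPCMessage, JSONRPCRequest

from mcp_simple_streamablehttp.event_store import InMemoryEventStore


async def demo() -> None:
    store = InMemoryEventStore(max_events_per_stream=100)
    msg = JSONRPCMessage(JSONRPCRequest(jsonrpc="2.0", id=1, method="ping"))

    first = await store.store_event("stream-a", msg)
    await store.store_event("stream-a", msg)

    async def send(event: EventMessage) -> None:
        # In the transport, this callback re-emits the stored SSE event to the client.
        print("replaying event", event.event_id)

    # Replays only the events stored after `first` on the same stream.
    stream_id = await store.replay_events_after(first, send)
    print("resumed stream:", stream_id)


asyncio.run(demo())
```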
+ """ + + event_id: EventId + stream_id: StreamId + message: JSONRPCMessage + + +class InMemoryEventStore(EventStore): + """ + Simple in-memory implementation of the EventStore interface for resumability. + This is primarily intended for examples and testing, not for production use + where a persistent storage solution would be more appropriate. + + This implementation keeps only the last N events per stream for memory efficiency. + """ + + def __init__(self, max_events_per_stream: int = 100): + """Initialize the event store. + + Args: + max_events_per_stream: Maximum number of events to keep per stream + """ + self.max_events_per_stream = max_events_per_stream + # for maintaining last N events per stream + self.streams: dict[StreamId, deque[EventEntry]] = {} + # event_id -> EventEntry for quick lookup + self.event_index: dict[EventId, EventEntry] = {} + + async def store_event( + self, stream_id: StreamId, message: JSONRPCMessage + ) -> EventId: + """Stores an event with a generated event ID.""" + event_id = str(uuid4()) + event_entry = EventEntry( + event_id=event_id, stream_id=stream_id, message=message + ) + + # Get or create deque for this stream + if stream_id not in self.streams: + self.streams[stream_id] = deque(maxlen=self.max_events_per_stream) + + # If deque is full, the oldest event will be automatically removed + # We need to remove it from the event_index as well + if len(self.streams[stream_id]) == self.max_events_per_stream: + oldest_event = self.streams[stream_id][0] + self.event_index.pop(oldest_event.event_id, None) + + # Add new event + self.streams[stream_id].append(event_entry) + self.event_index[event_id] = event_entry + + return event_id + + async def replay_events_after( + self, + last_event_id: EventId, + send_callback: EventCallback, + ) -> StreamId | None: + """Replays events that occurred after the specified event ID.""" + if last_event_id not in self.event_index: + logger.warning(f"Event ID {last_event_id} not found in store") + return None + + # Get the stream and find events after the last one + last_event = self.event_index[last_event_id] + stream_id = last_event.stream_id + stream_events = self.streams.get(last_event.stream_id, deque()) + + # Events in deque are already in chronological order + found_last = False + for event in stream_events: + if found_last: + await send_callback(EventMessage(event.message, event.event_id)) + elif event.event_id == last_event_id: + found_last = True + + return stream_id diff --git a/examples/servers/simple-streamablehttp/mcp_simple_streamablehttp/server.py b/examples/servers/simple-streamablehttp/mcp_simple_streamablehttp/server.py new file mode 100644 index 000000000..85eb1369f --- /dev/null +++ b/examples/servers/simple-streamablehttp/mcp_simple_streamablehttp/server.py @@ -0,0 +1,167 @@ +import contextlib +import logging +from collections.abc import AsyncIterator + +import anyio +import click +import mcp.types as types +from mcp.server.lowlevel import Server +from mcp.server.streamable_http_manager import StreamableHTTPSessionManager +from pydantic import AnyUrl +from starlette.applications import Starlette +from starlette.routing import Mount +from starlette.types import Receive, Scope, Send + +from .event_store import InMemoryEventStore + +# Configure logging +logger = logging.getLogger(__name__) + + +@click.command() +@click.option("--port", default=3000, help="Port to listen on for HTTP") +@click.option( + "--log-level", + default="INFO", + help="Logging level (DEBUG, INFO, WARNING, ERROR, CRITICAL)", +) 
+@click.option( + "--json-response", + is_flag=True, + default=False, + help="Enable JSON responses instead of SSE streams", +) +def main( + port: int, + log_level: str, + json_response: bool, +) -> int: + # Configure logging + logging.basicConfig( + level=getattr(logging, log_level.upper()), + format="%(asctime)s - %(name)s - %(levelname)s - %(message)s", + ) + + app = Server("mcp-streamable-http-demo") + + @app.call_tool() + async def call_tool(name: str, arguments: dict) -> list[types.ContentBlock]: + ctx = app.request_context + interval = arguments.get("interval", 1.0) + count = arguments.get("count", 5) + caller = arguments.get("caller", "unknown") + + # Send the specified number of notifications with the given interval + for i in range(count): + # Include more detailed message for resumability demonstration + notification_msg = ( + f"[{i+1}/{count}] Event from '{caller}' - " + f"Use Last-Event-ID to resume if disconnected" + ) + await ctx.session.send_log_message( + level="info", + data=notification_msg, + logger="notification_stream", + # Associates this notification with the original request + # Ensures notifications are sent to the correct response stream + # Without this, notifications will either go to: + # - a standalone SSE stream (if GET request is supported) + # - nowhere (if GET request isn't supported) + related_request_id=ctx.request_id, + ) + logger.debug(f"Sent notification {i+1}/{count} for caller: {caller}") + if i < count - 1: # Don't wait after the last notification + await anyio.sleep(interval) + + # This will send a resource notificaiton though standalone SSE + # established by GET request + await ctx.session.send_resource_updated(uri=AnyUrl("http:///test_resource")) + return [ + types.TextContent( + type="text", + text=( + f"Sent {count} notifications with {interval}s interval" + f" for caller: {caller}" + ), + ) + ] + + @app.list_tools() + async def list_tools() -> list[types.Tool]: + return [ + types.Tool( + name="start-notification-stream", + description=( + "Sends a stream of notifications with configurable count" + " and interval" + ), + inputSchema={ + "type": "object", + "required": ["interval", "count", "caller"], + "properties": { + "interval": { + "type": "number", + "description": "Interval between notifications in seconds", + }, + "count": { + "type": "number", + "description": "Number of notifications to send", + }, + "caller": { + "type": "string", + "description": ( + "Identifier of the caller to include in notifications" + ), + }, + }, + }, + ) + ] + + # Create event store for resumability + # The InMemoryEventStore enables resumability support for StreamableHTTP transport. + # It stores SSE events with unique IDs, allowing clients to: + # 1. Receive event IDs for each SSE message + # 2. Resume streams by sending Last-Event-ID in GET requests + # 3. Replay missed events after reconnection + # Note: This in-memory implementation is for demonstration ONLY. + # For production, use a persistent storage solution. 
+ event_store = InMemoryEventStore() + + # Create the session manager with our app and event store + session_manager = StreamableHTTPSessionManager( + app=app, + event_store=event_store, # Enable resumability + json_response=json_response, + ) + + # ASGI handler for streamable HTTP connections + async def handle_streamable_http( + scope: Scope, receive: Receive, send: Send + ) -> None: + await session_manager.handle_request(scope, receive, send) + + @contextlib.asynccontextmanager + async def lifespan(app: Starlette) -> AsyncIterator[None]: + """Context manager for managing session manager lifecycle.""" + async with session_manager.run(): + logger.info("Application started with StreamableHTTP session manager!") + try: + yield + finally: + logger.info("Application shutting down...") + + # Create an ASGI application using the transport + starlette_app = Starlette( + debug=True, + routes=[ + Mount("/mcp", app=handle_streamable_http), + ], + lifespan=lifespan, + ) + + import uvicorn + + uvicorn.run(starlette_app, host="127.0.0.1", port=port) + + return 0 diff --git a/examples/servers/simple-streamablehttp/pyproject.toml b/examples/servers/simple-streamablehttp/pyproject.toml new file mode 100644 index 000000000..c35887d1f --- /dev/null +++ b/examples/servers/simple-streamablehttp/pyproject.toml @@ -0,0 +1,36 @@ +[project] +name = "mcp-simple-streamablehttp" +version = "0.1.0" +description = "A simple MCP server exposing a StreamableHttp transport for testing" +readme = "README.md" +requires-python = ">=3.10" +authors = [{ name = "Anthropic, PBC." }] +keywords = ["mcp", "llm", "automation", "web", "fetch", "http", "streamable"] +license = { text = "MIT" } +dependencies = ["anyio>=4.5", "click>=8.1.0", "httpx>=0.27", "mcp", "starlette", "uvicorn"] + +[project.scripts] +mcp-simple-streamablehttp = "mcp_simple_streamablehttp.server:main" + +[build-system] +requires = ["hatchling"] +build-backend = "hatchling.build" + +[tool.hatch.build.targets.wheel] +packages = ["mcp_simple_streamablehttp"] + +[tool.pyright] +include = ["mcp_simple_streamablehttp"] +venvPath = "." 
+venv = ".venv" + +[tool.ruff.lint] +select = ["E", "F", "I"] +ignore = [] + +[tool.ruff] +line-length = 88 +target-version = "py310" + +[tool.uv] +dev-dependencies = ["pyright>=1.1.378", "pytest>=8.3.3", "ruff>=0.6.9"] \ No newline at end of file diff --git a/examples/servers/simple-tool/mcp_simple_tool/__main__.py b/examples/servers/simple-tool/mcp_simple_tool/__main__.py index 8b345fa2e..e7ef16530 100644 --- a/examples/servers/simple-tool/mcp_simple_tool/__main__.py +++ b/examples/servers/simple-tool/mcp_simple_tool/__main__.py @@ -2,4 +2,4 @@ from .server import main -sys.exit(main()) +sys.exit(main()) # type: ignore[call-arg] diff --git a/examples/servers/simple-tool/mcp_simple_tool/server.py b/examples/servers/simple-tool/mcp_simple_tool/server.py index 3eace52ea..bf3683c9e 100644 --- a/examples/servers/simple-tool/mcp_simple_tool/server.py +++ b/examples/servers/simple-tool/mcp_simple_tool/server.py @@ -1,17 +1,17 @@ import anyio import click -import httpx import mcp.types as types from mcp.server.lowlevel import Server +from mcp.shared._httpx_utils import create_mcp_http_client async def fetch_website( url: str, -) -> list[types.TextContent | types.ImageContent | types.EmbeddedResource]: +) -> list[types.ContentBlock]: headers = { "User-Agent": "MCP Test Server (github.com/modelcontextprotocol/python-sdk)" } - async with httpx.AsyncClient(follow_redirects=True, headers=headers) as client: + async with create_mcp_http_client(headers=headers) as client: response = await client.get(url) response.raise_for_status() return [types.TextContent(type="text", text=response.text)] @@ -29,9 +29,7 @@ def main(port: int, transport: str) -> int: app = Server("mcp-website-fetcher") @app.call_tool() - async def fetch_tool( - name: str, arguments: dict - ) -> list[types.TextContent | types.ImageContent | types.EmbeddedResource]: + async def fetch_tool(name: str, arguments: dict) -> list[types.ContentBlock]: if name != "fetch": raise ValueError(f"Unknown tool: {name}") if "url" not in arguments: @@ -43,6 +41,7 @@ async def list_tools() -> list[types.Tool]: return [ types.Tool( name="fetch", + title="Website Fetcher", description="Fetches a website and returns its content", inputSchema={ "type": "object", @@ -60,6 +59,7 @@ async def list_tools() -> list[types.Tool]: if transport == "sse": from mcp.server.sse import SseServerTransport from starlette.applications import Starlette + from starlette.responses import Response from starlette.routing import Mount, Route sse = SseServerTransport("/messages/") @@ -71,18 +71,19 @@ async def handle_sse(request): await app.run( streams[0], streams[1], app.create_initialization_options() ) + return Response() starlette_app = Starlette( debug=True, routes=[ - Route("/sse", endpoint=handle_sse), + Route("/sse", endpoint=handle_sse, methods=["GET"]), Mount("/messages/", app=sse.handle_post_message), ], ) import uvicorn - uvicorn.run(starlette_app, host="0.0.0.0", port=port) + uvicorn.run(starlette_app, host="127.0.0.1", port=port) else: from mcp.server.stdio import stdio_server diff --git a/pyproject.toml b/pyproject.toml index 1aaf15593..9ad50ab58 100644 --- a/pyproject.toml +++ b/pyproject.toml @@ -27,6 +27,7 @@ dependencies = [ "httpx-sse>=0.4", "pydantic>=2.7.2,<3.0.0", "starlette>=0.27", + "python-multipart>=0.0.9", "sse-starlette>=1.6.1", "pydantic-settings>=2.5.2", "uvicorn>=0.23.1; sys_platform != 'emscripten'", @@ -43,6 +44,7 @@ mcp = "mcp.cli:app [cli]" [tool.uv] resolution = "lowest-direct" default-groups = ["dev", "docs"] +required-version = ">=0.7.2" 
[dependency-groups] dev = [ @@ -53,6 +55,8 @@ dev = [ "pytest-flakefinder>=1.1.0", "pytest-xdist>=3.6.1", "pytest-examples>=0.0.14", + "pytest-pretty>=1.2.0", + "inline-snapshot>=0.23.0", ] docs = [ "mkdocs>=1.6.1", @@ -61,7 +65,6 @@ docs = [ "mkdocstrings-python>=1.12.2", ] - [build-system] requires = ["hatchling", "uv-dynamic-versioning"] build-backend = "hatchling.build" @@ -83,7 +86,7 @@ Issues = "https://github.com/modelcontextprotocol/python-sdk/issues" packages = ["src/mcp"] [tool.pyright] -include = ["src/mcp", "tests"] +include = ["src/mcp", "tests", "examples/servers"] venvPath = "." venv = ".venv" strict = ["src/mcp/**/*.py"] @@ -93,7 +96,7 @@ select = ["C4", "E", "F", "I", "PERF", "UP"] ignore = ["PERF203"] [tool.ruff] -line-length = 88 +line-length = 120 target-version = "py310" [tool.ruff.lint.per-file-ignores] @@ -107,7 +110,13 @@ members = ["examples/servers/*"] mcp = { workspace = true } [tool.pytest.ini_options] +log_cli = true xfail_strict = true +addopts = """ + --color=yes + --capture=fd + --numprocesses auto +""" filterwarnings = [ "error", # This should be fixed on Uvicorn's side. diff --git a/src/mcp/__init__.py b/src/mcp/__init__.py index 0d3c372ce..e93b95c90 100644 --- a/src/mcp/__init__.py +++ b/src/mcp/__init__.py @@ -1,4 +1,5 @@ from .client.session import ClientSession +from .client.session_group import ClientSessionGroup from .client.stdio import StdioServerParameters, stdio_client from .server.session import ServerSession from .server.stdio import stdio_server @@ -63,6 +64,7 @@ "ClientRequest", "ClientResult", "ClientSession", + "ClientSessionGroup", "CreateMessageRequest", "CreateMessageResult", "ErrorData", diff --git a/src/mcp/cli/claude.py b/src/mcp/cli/claude.py index 5a0ce0ab4..e6eab2851 100644 --- a/src/mcp/cli/claude.py +++ b/src/mcp/cli/claude.py @@ -2,6 +2,7 @@ import json import os +import shutil import sys from pathlib import Path from typing import Any @@ -20,9 +21,7 @@ def get_claude_config_path() -> Path | None: elif sys.platform == "darwin": path = Path(Path.home(), "Library", "Application Support", "Claude") elif sys.platform.startswith("linux"): - path = Path( - os.environ.get("XDG_CONFIG_HOME", Path.home() / ".config"), "Claude" - ) + path = Path(os.environ.get("XDG_CONFIG_HOME", Path.home() / ".config"), "Claude") else: return None @@ -31,6 +30,17 @@ def get_claude_config_path() -> Path | None: return None +def get_uv_path() -> str: + """Get the full path to the uv executable.""" + uv_path = shutil.which("uv") + if not uv_path: + logger.error( + "uv executable not found in PATH, falling back to 'uv'. " "Please ensure uv is installed and in your PATH" + ) + return "uv" # Fall back to just "uv" if not found + return uv_path + + def update_claude_config( file_spec: str, server_name: str, @@ -54,6 +64,7 @@ def update_claude_config( Claude Desktop may not be installed or properly set up. """ config_dir = get_claude_config_path() + uv_path = get_uv_path() if not config_dir: raise RuntimeError( "Claude Desktop config directory not found. 
Please ensure Claude Desktop" @@ -80,10 +91,7 @@ def update_claude_config( config["mcpServers"] = {} # Always preserve existing env vars and merge with new ones - if ( - server_name in config["mcpServers"] - and "env" in config["mcpServers"][server_name] - ): + if server_name in config["mcpServers"] and "env" in config["mcpServers"][server_name]: existing_env = config["mcpServers"][server_name]["env"] if env_vars: # New vars take precedence over existing ones @@ -117,7 +125,7 @@ def update_claude_config( # Add fastmcp run command args.extend(["mcp", "run", file_spec]) - server_config: dict[str, Any] = {"command": "uv", "args": args} + server_config: dict[str, Any] = {"command": uv_path, "args": args} # Add environment variables if specified if env_vars: diff --git a/src/mcp/cli/cli.py b/src/mcp/cli/cli.py index cb0830600..69e2921f1 100644 --- a/src/mcp/cli/cli.py +++ b/src/mcp/cli/cli.py @@ -6,7 +6,10 @@ import subprocess import sys from pathlib import Path -from typing import Annotated +from typing import Annotated, Any + +from mcp.server import FastMCP +from mcp.server import Server as LowLevelServer try: import typer @@ -42,9 +45,7 @@ def _get_npx_command(): # Try both npx.cmd and npx.exe on Windows for cmd in ["npx.cmd", "npx.exe", "npx"]: try: - subprocess.run( - [cmd, "--version"], check=True, capture_output=True, shell=True - ) + subprocess.run([cmd, "--version"], check=True, capture_output=True, shell=True) return cmd except subprocess.CalledProcessError: continue @@ -55,9 +56,7 @@ def _get_npx_command(): def _parse_env_var(env_var: str) -> tuple[str, str]: """Parse environment variable string in format KEY=VALUE.""" if "=" not in env_var: - logger.error( - f"Invalid environment variable format: {env_var}. Must be KEY=VALUE" - ) + logger.error(f"Invalid environment variable format: {env_var}. Must be KEY=VALUE") sys.exit(1) key, value = env_var.split("=", 1) return key.strip(), value.strip() @@ -141,17 +140,41 @@ def _import_server(file: Path, server_object: str | None = None): module = importlib.util.module_from_spec(spec) spec.loader.exec_module(module) + def _check_server_object(server_object: Any, object_name: str): + """Helper function to check that the server object is supported + + Args: + server_object: The server object to check. + + Returns: + True if it's supported. + """ + if not isinstance(server_object, FastMCP): + logger.error(f"The server object {object_name} is of type " f"{type(server_object)} (expecting {FastMCP}).") + if isinstance(server_object, LowLevelServer): + logger.warning( + "Note that only FastMCP server is supported. Low level " "Server class is not yet supported." + ) + return False + return True + # If no object specified, try common server names if not server_object: # Look for the most common server object names for name in ["mcp", "server", "app"]: if hasattr(module, name): + if not _check_server_object(getattr(module, name), f"{file}:{name}"): + logger.error(f"Ignoring object '{file}:{name}' as it's not a valid " "server object") + continue return getattr(module, name) logger.error( f"No server object found in {file}. Please either:\n" "1. Use a standard variable name (mcp, server, or app)\n" - "2. Specify the object name with file:object syntax", + "2. Specify the object name with file:object syntax" + "3. 
If the server creates the FastMCP object within main() " + " or another function, refactor the FastMCP object to be a " + " global variable named mcp, server, or app.", extra={"file": str(file)}, ) sys.exit(1) @@ -179,6 +202,9 @@ def _import_server(file: Path, server_object: str | None = None): ) sys.exit(1) + if not _check_server_object(server, server_object): + sys.exit(1) + return server @@ -243,8 +269,7 @@ def dev( npx_cmd = _get_npx_command() if not npx_cmd: logger.error( - "npx not found. Please ensure Node.js and npm are properly installed " - "and added to your system PATH." + "npx not found. Please ensure Node.js and npm are properly installed " "and added to your system PATH." ) sys.exit(1) @@ -346,8 +371,7 @@ def install( typer.Option( "--name", "-n", - help="Custom name for the server (defaults to server's name attribute or" - " file name)", + help="Custom name for the server (defaults to server's name attribute or" " file name)", ), ] = None, with_editable: Annotated[ @@ -421,8 +445,7 @@ def install( name = server.name except (ImportError, ModuleNotFoundError) as e: logger.debug( - "Could not import server (likely missing dependencies), using file" - " name", + "Could not import server (likely missing dependencies), using file" " name", extra={"error": str(e)}, ) name = file.stem @@ -440,11 +463,7 @@ def install( if env_file: if dotenv: try: - env_dict |= { - k: v - for k, v in dotenv.dotenv_values(env_file).items() - if v is not None - } + env_dict |= {k: v for k, v in dotenv.dotenv_values(env_file).items() if v is not None} except Exception as e: logger.error(f"Failed to load .env file: {e}") sys.exit(1) diff --git a/src/mcp/client/__main__.py b/src/mcp/client/__main__.py index 84e15bd56..2efe05d53 100644 --- a/src/mcp/client/__main__.py +++ b/src/mcp/client/__main__.py @@ -11,8 +11,8 @@ from mcp.client.session import ClientSession from mcp.client.sse import sse_client from mcp.client.stdio import StdioServerParameters, stdio_client +from mcp.shared.message import SessionMessage from mcp.shared.session import RequestResponder -from mcp.types import JSONRPCMessage if not sys.warnoptions: import warnings @@ -24,9 +24,7 @@ async def message_handler( - message: RequestResponder[types.ServerRequest, types.ClientResult] - | types.ServerNotification - | Exception, + message: RequestResponder[types.ServerRequest, types.ClientResult] | types.ServerNotification | Exception, ) -> None: if isinstance(message, Exception): logger.error("Error: %s", message) @@ -36,8 +34,8 @@ async def message_handler( async def run_session( - read_stream: MemoryObjectReceiveStream[JSONRPCMessage | Exception], - write_stream: MemoryObjectSendStream[JSONRPCMessage], + read_stream: MemoryObjectReceiveStream[SessionMessage | Exception], + write_stream: MemoryObjectSendStream[SessionMessage], client_info: types.Implementation | None = None, ): async with ClientSession( @@ -60,9 +58,7 @@ async def main(command_or_url: str, args: list[str], env: list[tuple[str, str]]) await run_session(*streams) else: # Use stdio client for commands - server_parameters = StdioServerParameters( - command=command_or_url, args=args, env=env_dict - ) + server_parameters = StdioServerParameters(command=command_or_url, args=args, env=env_dict) async with stdio_client(server_parameters) as streams: await run_session(*streams) diff --git a/src/mcp/client/auth.py b/src/mcp/client/auth.py new file mode 100644 index 000000000..c174385ea --- /dev/null +++ b/src/mcp/client/auth.py @@ -0,0 +1,488 @@ +""" +OAuth2 Authentication implementation 
for HTTPX. + +Implements authorization code flow with PKCE and automatic token refresh. +""" + +import base64 +import hashlib +import logging +import secrets +import string +import time +from collections.abc import AsyncGenerator, Awaitable, Callable +from dataclasses import dataclass, field +from typing import Protocol +from urllib.parse import urlencode, urljoin, urlparse + +import anyio +import httpx +from pydantic import BaseModel, Field, ValidationError + +from mcp.client.streamable_http import MCP_PROTOCOL_VERSION +from mcp.shared.auth import ( + OAuthClientInformationFull, + OAuthClientMetadata, + OAuthMetadata, + OAuthToken, + ProtectedResourceMetadata, +) +from mcp.shared.auth_utils import check_resource_allowed, resource_url_from_server_url +from mcp.types import LATEST_PROTOCOL_VERSION + +logger = logging.getLogger(__name__) + + +class OAuthFlowError(Exception): + """Base exception for OAuth flow errors.""" + + +class OAuthTokenError(OAuthFlowError): + """Raised when token operations fail.""" + + +class OAuthRegistrationError(OAuthFlowError): + """Raised when client registration fails.""" + + +class PKCEParameters(BaseModel): + """PKCE (Proof Key for Code Exchange) parameters.""" + + code_verifier: str = Field(..., min_length=43, max_length=128) + code_challenge: str = Field(..., min_length=43, max_length=128) + + @classmethod + def generate(cls) -> "PKCEParameters": + """Generate new PKCE parameters.""" + code_verifier = "".join(secrets.choice(string.ascii_letters + string.digits + "-._~") for _ in range(128)) + digest = hashlib.sha256(code_verifier.encode()).digest() + code_challenge = base64.urlsafe_b64encode(digest).decode().rstrip("=") + return cls(code_verifier=code_verifier, code_challenge=code_challenge) + + +class TokenStorage(Protocol): + """Protocol for token storage implementations.""" + + async def get_tokens(self) -> OAuthToken | None: + """Get stored tokens.""" + ... + + async def set_tokens(self, tokens: OAuthToken) -> None: + """Store tokens.""" + ... + + async def get_client_info(self) -> OAuthClientInformationFull | None: + """Get stored client information.""" + ... + + async def set_client_info(self, client_info: OAuthClientInformationFull) -> None: + """Store client information.""" + ... 
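For illustration, a minimal in-memory implementation of the TokenStorage protocol above might look like the sketch below. This is illustrative only and not part of this patch; the class name InMemoryTokenStorage is an assumption, while the method signatures and the OAuthToken / OAuthClientInformationFull types come from the protocol and imports shown in this file.

from mcp.shared.auth import OAuthClientInformationFull, OAuthToken


class InMemoryTokenStorage:
    """Keeps tokens and client info in process memory; suitable for tests or simple CLIs."""

    def __init__(self) -> None:
        self._tokens: OAuthToken | None = None
        self._client_info: OAuthClientInformationFull | None = None

    async def get_tokens(self) -> OAuthToken | None:
        return self._tokens

    async def set_tokens(self, tokens: OAuthToken) -> None:
        self._tokens = tokens

    async def get_client_info(self) -> OAuthClientInformationFull | None:
        return self._client_info

    async def set_client_info(self, client_info: OAuthClientInformationFull) -> None:
        self._client_info = client_info

An instance of a storage class like this would be passed as the storage argument of OAuthClientProvider, defined further down in this file.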
+ + +@dataclass +class OAuthContext: + """OAuth flow context.""" + + server_url: str + client_metadata: OAuthClientMetadata + storage: TokenStorage + redirect_handler: Callable[[str], Awaitable[None]] + callback_handler: Callable[[], Awaitable[tuple[str, str | None]]] + timeout: float = 300.0 + + # Discovered metadata + protected_resource_metadata: ProtectedResourceMetadata | None = None + oauth_metadata: OAuthMetadata | None = None + auth_server_url: str | None = None + + # Client registration + client_info: OAuthClientInformationFull | None = None + + # Token management + current_tokens: OAuthToken | None = None + token_expiry_time: float | None = None + + # State + lock: anyio.Lock = field(default_factory=anyio.Lock) + + def get_authorization_base_url(self, server_url: str) -> str: + """Extract base URL by removing path component.""" + parsed = urlparse(server_url) + return f"{parsed.scheme}://{parsed.netloc}" + + def update_token_expiry(self, token: OAuthToken) -> None: + """Update token expiry time.""" + if token.expires_in: + self.token_expiry_time = time.time() + token.expires_in + else: + self.token_expiry_time = None + + def is_token_valid(self) -> bool: + """Check if current token is valid.""" + return bool( + self.current_tokens + and self.current_tokens.access_token + and (not self.token_expiry_time or time.time() <= self.token_expiry_time) + ) + + def can_refresh_token(self) -> bool: + """Check if token can be refreshed.""" + return bool(self.current_tokens and self.current_tokens.refresh_token and self.client_info) + + def clear_tokens(self) -> None: + """Clear current tokens.""" + self.current_tokens = None + self.token_expiry_time = None + + def get_resource_url(self) -> str: + """Get resource URL for RFC 8707. + + Uses PRM resource if it's a valid parent, otherwise uses canonical server URL. + """ + resource = resource_url_from_server_url(self.server_url) + + # If PRM provides a resource that's a valid parent, use it + if self.protected_resource_metadata and self.protected_resource_metadata.resource: + prm_resource = str(self.protected_resource_metadata.resource) + if check_resource_allowed(requested_resource=resource, configured_resource=prm_resource): + resource = prm_resource + + return resource + + +class OAuthClientProvider(httpx.Auth): + """ + OAuth2 authentication for httpx. + Handles OAuth flow with automatic client registration and token storage. 
+ """ + + requires_response_body = True + + def __init__( + self, + server_url: str, + client_metadata: OAuthClientMetadata, + storage: TokenStorage, + redirect_handler: Callable[[str], Awaitable[None]], + callback_handler: Callable[[], Awaitable[tuple[str, str | None]]], + timeout: float = 300.0, + ): + """Initialize OAuth2 authentication.""" + self.context = OAuthContext( + server_url=server_url, + client_metadata=client_metadata, + storage=storage, + redirect_handler=redirect_handler, + callback_handler=callback_handler, + timeout=timeout, + ) + self._initialized = False + + async def _discover_protected_resource(self) -> httpx.Request: + """Build discovery request for protected resource metadata.""" + auth_base_url = self.context.get_authorization_base_url(self.context.server_url) + url = urljoin(auth_base_url, "/.well-known/oauth-protected-resource") + return httpx.Request("GET", url, headers={MCP_PROTOCOL_VERSION: LATEST_PROTOCOL_VERSION}) + + async def _handle_protected_resource_response(self, response: httpx.Response) -> None: + """Handle discovery response.""" + if response.status_code == 200: + try: + content = await response.aread() + metadata = ProtectedResourceMetadata.model_validate_json(content) + self.context.protected_resource_metadata = metadata + if metadata.authorization_servers: + self.context.auth_server_url = str(metadata.authorization_servers[0]) + except ValidationError: + pass + + async def _discover_oauth_metadata(self) -> httpx.Request: + """Build OAuth metadata discovery request.""" + if self.context.auth_server_url: + base_url = self.context.get_authorization_base_url(self.context.auth_server_url) + else: + base_url = self.context.get_authorization_base_url(self.context.server_url) + + url = urljoin(base_url, "/.well-known/oauth-authorization-server") + return httpx.Request("GET", url, headers={MCP_PROTOCOL_VERSION: LATEST_PROTOCOL_VERSION}) + + async def _handle_oauth_metadata_response(self, response: httpx.Response) -> None: + """Handle OAuth metadata response.""" + if response.status_code == 200: + try: + content = await response.aread() + metadata = OAuthMetadata.model_validate_json(content) + self.context.oauth_metadata = metadata + # Apply default scope if none specified + if self.context.client_metadata.scope is None and metadata.scopes_supported is not None: + self.context.client_metadata.scope = " ".join(metadata.scopes_supported) + except ValidationError: + pass + + async def _register_client(self) -> httpx.Request | None: + """Build registration request or skip if already registered.""" + if self.context.client_info: + return None + + if self.context.oauth_metadata and self.context.oauth_metadata.registration_endpoint: + registration_url = str(self.context.oauth_metadata.registration_endpoint) + else: + auth_base_url = self.context.get_authorization_base_url(self.context.server_url) + registration_url = urljoin(auth_base_url, "/register") + + registration_data = self.context.client_metadata.model_dump(by_alias=True, mode="json", exclude_none=True) + + return httpx.Request( + "POST", registration_url, json=registration_data, headers={"Content-Type": "application/json"} + ) + + async def _handle_registration_response(self, response: httpx.Response) -> None: + """Handle registration response.""" + if response.status_code not in (200, 201): + raise OAuthRegistrationError(f"Registration failed: {response.status_code} {response.text}") + + try: + content = await response.aread() + client_info = OAuthClientInformationFull.model_validate_json(content) + 
self.context.client_info = client_info + await self.context.storage.set_client_info(client_info) + except ValidationError as e: + raise OAuthRegistrationError(f"Invalid registration response: {e}") + + async def _perform_authorization(self) -> tuple[str, str]: + """Perform the authorization redirect and get auth code.""" + if self.context.oauth_metadata and self.context.oauth_metadata.authorization_endpoint: + auth_endpoint = str(self.context.oauth_metadata.authorization_endpoint) + else: + auth_base_url = self.context.get_authorization_base_url(self.context.server_url) + auth_endpoint = urljoin(auth_base_url, "/authorize") + + if not self.context.client_info: + raise OAuthFlowError("No client info available for authorization") + + # Generate PKCE parameters + pkce_params = PKCEParameters.generate() + state = secrets.token_urlsafe(32) + + auth_params = { + "response_type": "code", + "client_id": self.context.client_info.client_id, + "redirect_uri": str(self.context.client_metadata.redirect_uris[0]), + "state": state, + "code_challenge": pkce_params.code_challenge, + "code_challenge_method": "S256", + "resource": self.context.get_resource_url(), # RFC 8707 + } + + if self.context.client_metadata.scope: + auth_params["scope"] = self.context.client_metadata.scope + + authorization_url = f"{auth_endpoint}?{urlencode(auth_params)}" + await self.context.redirect_handler(authorization_url) + + # Wait for callback + auth_code, returned_state = await self.context.callback_handler() + + if returned_state is None or not secrets.compare_digest(returned_state, state): + raise OAuthFlowError(f"State parameter mismatch: {returned_state} != {state}") + + if not auth_code: + raise OAuthFlowError("No authorization code received") + + # Return auth code and code verifier for token exchange + return auth_code, pkce_params.code_verifier + + async def _exchange_token(self, auth_code: str, code_verifier: str) -> httpx.Request: + """Build token exchange request.""" + if not self.context.client_info: + raise OAuthFlowError("Missing client info") + + if self.context.oauth_metadata and self.context.oauth_metadata.token_endpoint: + token_url = str(self.context.oauth_metadata.token_endpoint) + else: + auth_base_url = self.context.get_authorization_base_url(self.context.server_url) + token_url = urljoin(auth_base_url, "/token") + + token_data = { + "grant_type": "authorization_code", + "code": auth_code, + "redirect_uri": str(self.context.client_metadata.redirect_uris[0]), + "client_id": self.context.client_info.client_id, + "code_verifier": code_verifier, + "resource": self.context.get_resource_url(), # RFC 8707 + } + + if self.context.client_info.client_secret: + token_data["client_secret"] = self.context.client_info.client_secret + + return httpx.Request( + "POST", token_url, data=token_data, headers={"Content-Type": "application/x-www-form-urlencoded"} + ) + + async def _handle_token_response(self, response: httpx.Response) -> None: + """Handle token exchange response.""" + if response.status_code != 200: + raise OAuthTokenError(f"Token exchange failed: {response.status_code}") + + try: + content = await response.aread() + token_response = OAuthToken.model_validate_json(content) + + # Validate scopes + if token_response.scope and self.context.client_metadata.scope: + requested_scopes = set(self.context.client_metadata.scope.split()) + returned_scopes = set(token_response.scope.split()) + unauthorized_scopes = returned_scopes - requested_scopes + if unauthorized_scopes: + raise OAuthTokenError(f"Server granted 
unauthorized scopes: {unauthorized_scopes}") + + self.context.current_tokens = token_response + self.context.update_token_expiry(token_response) + await self.context.storage.set_tokens(token_response) + except ValidationError as e: + raise OAuthTokenError(f"Invalid token response: {e}") + + async def _refresh_token(self) -> httpx.Request: + """Build token refresh request.""" + if not self.context.current_tokens or not self.context.current_tokens.refresh_token: + raise OAuthTokenError("No refresh token available") + + if not self.context.client_info: + raise OAuthTokenError("No client info available") + + if self.context.oauth_metadata and self.context.oauth_metadata.token_endpoint: + token_url = str(self.context.oauth_metadata.token_endpoint) + else: + auth_base_url = self.context.get_authorization_base_url(self.context.server_url) + token_url = urljoin(auth_base_url, "/token") + + refresh_data = { + "grant_type": "refresh_token", + "refresh_token": self.context.current_tokens.refresh_token, + "client_id": self.context.client_info.client_id, + "resource": self.context.get_resource_url(), # RFC 8707 + } + + if self.context.client_info.client_secret: + refresh_data["client_secret"] = self.context.client_info.client_secret + + return httpx.Request( + "POST", token_url, data=refresh_data, headers={"Content-Type": "application/x-www-form-urlencoded"} + ) + + async def _handle_refresh_response(self, response: httpx.Response) -> bool: + """Handle token refresh response. Returns True if successful.""" + if response.status_code != 200: + logger.warning(f"Token refresh failed: {response.status_code}") + self.context.clear_tokens() + return False + + try: + content = await response.aread() + token_response = OAuthToken.model_validate_json(content) + + self.context.current_tokens = token_response + self.context.update_token_expiry(token_response) + await self.context.storage.set_tokens(token_response) + + return True + except ValidationError as e: + logger.error(f"Invalid refresh response: {e}") + self.context.clear_tokens() + return False + + async def _initialize(self) -> None: + """Load stored tokens and client info.""" + self.context.current_tokens = await self.context.storage.get_tokens() + self.context.client_info = await self.context.storage.get_client_info() + self._initialized = True + + def _add_auth_header(self, request: httpx.Request) -> None: + """Add authorization header to request if we have valid tokens.""" + if self.context.current_tokens and self.context.current_tokens.access_token: + request.headers["Authorization"] = f"Bearer {self.context.current_tokens.access_token}" + + async def async_auth_flow(self, request: httpx.Request) -> AsyncGenerator[httpx.Request, httpx.Response]: + """HTTPX auth flow integration.""" + async with self.context.lock: + if not self._initialized: + await self._initialize() + + # Perform OAuth flow if not authenticated + if not self.context.is_token_valid(): + try: + # OAuth flow must be inline due to generator constraints + # Step 1: Discover protected resource metadata (spec revision 2025-06-18) + discovery_request = await self._discover_protected_resource() + discovery_response = yield discovery_request + await self._handle_protected_resource_response(discovery_response) + + # Step 2: Discover OAuth metadata + oauth_request = await self._discover_oauth_metadata() + oauth_response = yield oauth_request + await self._handle_oauth_metadata_response(oauth_response) + + # Step 3: Register client if needed + registration_request = await self._register_client() 
+ if registration_request: + registration_response = yield registration_request + await self._handle_registration_response(registration_response) + + # Step 4: Perform authorization + auth_code, code_verifier = await self._perform_authorization() + + # Step 5: Exchange authorization code for tokens + token_request = await self._exchange_token(auth_code, code_verifier) + token_response = yield token_request + await self._handle_token_response(token_response) + except Exception as e: + logger.error(f"OAuth flow error: {e}") + raise + + # Add authorization header and make request + self._add_auth_header(request) + response = yield request + + # Handle 401 responses + if response.status_code == 401 and self.context.can_refresh_token(): + # Try to refresh token + refresh_request = await self._refresh_token() + refresh_response = yield refresh_request + + if await self._handle_refresh_response(refresh_response): + # Retry original request with new token + self._add_auth_header(request) + yield request + else: + # Refresh failed, need full re-authentication + self._initialized = False + + # OAuth flow must be inline due to generator constraints + # Step 1: Discover protected resource metadata (spec revision 2025-06-18) + discovery_request = await self._discover_protected_resource() + discovery_response = yield discovery_request + await self._handle_protected_resource_response(discovery_response) + + # Step 2: Discover OAuth metadata + oauth_request = await self._discover_oauth_metadata() + oauth_response = yield oauth_request + await self._handle_oauth_metadata_response(oauth_response) + + # Step 3: Register client if needed + registration_request = await self._register_client() + if registration_request: + registration_response = yield registration_request + await self._handle_registration_response(registration_response) + + # Step 4: Perform authorization + auth_code, code_verifier = await self._perform_authorization() + + # Step 5: Exchange authorization code for tokens + token_request = await self._exchange_token(auth_code, code_verifier) + token_response = yield token_request + await self._handle_token_response(token_response) + + # Retry with new tokens + self._add_auth_header(request) + yield request diff --git a/src/mcp/client/session.py b/src/mcp/client/session.py index e29797d17..948817140 100644 --- a/src/mcp/client/session.py +++ b/src/mcp/client/session.py @@ -7,7 +7,8 @@ import mcp.types as types from mcp.shared.context import RequestContext -from mcp.shared.session import BaseSession, RequestResponder +from mcp.shared.message import SessionMessage +from mcp.shared.session import BaseSession, ProgressFnT, RequestResponder from mcp.shared.version import SUPPORTED_PROTOCOL_VERSIONS DEFAULT_CLIENT_INFO = types.Implementation(name="mcp", version="0.1.0") @@ -21,6 +22,14 @@ async def __call__( ) -> types.CreateMessageResult | types.ErrorData: ... +class ElicitationFnT(Protocol): + async def __call__( + self, + context: RequestContext["ClientSession", Any], + params: types.ElicitRequestParams, + ) -> types.ElicitResult | types.ErrorData: ... + + class ListRootsFnT(Protocol): async def __call__( self, context: RequestContext["ClientSession", Any] @@ -37,16 +46,12 @@ async def __call__( class MessageHandlerFnT(Protocol): async def __call__( self, - message: RequestResponder[types.ServerRequest, types.ClientResult] - | types.ServerNotification - | Exception, + message: RequestResponder[types.ServerRequest, types.ClientResult] | types.ServerNotification | Exception, ) -> None: ... 
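For illustration, a client-side elicitation handler matching the new ElicitationFnT protocol above might look like the following sketch. It is not part of this patch: the helper name decline_elicitation and the decline-only behavior are assumptions, and the "decline" action value follows the MCP elicitation specification rather than anything shown in this diff.

from typing import Any

import mcp.types as types
from mcp.client.session import ClientSession
from mcp.shared.context import RequestContext


async def decline_elicitation(
    context: RequestContext[ClientSession, Any],
    params: types.ElicitRequestParams,
) -> types.ElicitResult | types.ErrorData:
    # A headless client can simply decline every elicitation request it receives.
    return types.ElicitResult(action="decline")

A callback like this can be supplied to ClientSession through the new elicitation_callback parameter introduced later in this file.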
async def _default_message_handler( - message: RequestResponder[types.ServerRequest, types.ClientResult] - | types.ServerNotification - | Exception, + message: RequestResponder[types.ServerRequest, types.ClientResult] | types.ServerNotification | Exception, ) -> None: await anyio.lowlevel.checkpoint() @@ -61,6 +66,16 @@ async def _default_sampling_callback( ) +async def _default_elicitation_callback( + context: RequestContext["ClientSession", Any], + params: types.ElicitRequestParams, +) -> types.ElicitResult | types.ErrorData: + return types.ErrorData( + code=types.INVALID_REQUEST, + message="Elicitation not supported", + ) + + async def _default_list_roots_callback( context: RequestContext["ClientSession", Any], ) -> types.ListRootsResult | types.ErrorData: @@ -76,9 +91,7 @@ async def _default_logging_callback( pass -ClientResponse: TypeAdapter[types.ClientResult | types.ErrorData] = TypeAdapter( - types.ClientResult | types.ErrorData -) +ClientResponse: TypeAdapter[types.ClientResult | types.ErrorData] = TypeAdapter(types.ClientResult | types.ErrorData) class ClientSession( @@ -92,10 +105,11 @@ class ClientSession( ): def __init__( self, - read_stream: MemoryObjectReceiveStream[types.JSONRPCMessage | Exception], - write_stream: MemoryObjectSendStream[types.JSONRPCMessage], + read_stream: MemoryObjectReceiveStream[SessionMessage | Exception], + write_stream: MemoryObjectSendStream[SessionMessage], read_timeout_seconds: timedelta | None = None, sampling_callback: SamplingFnT | None = None, + elicitation_callback: ElicitationFnT | None = None, list_roots_callback: ListRootsFnT | None = None, logging_callback: LoggingFnT | None = None, message_handler: MessageHandlerFnT | None = None, @@ -110,17 +124,23 @@ def __init__( ) self._client_info = client_info or DEFAULT_CLIENT_INFO self._sampling_callback = sampling_callback or _default_sampling_callback + self._elicitation_callback = elicitation_callback or _default_elicitation_callback self._list_roots_callback = list_roots_callback or _default_list_roots_callback self._logging_callback = logging_callback or _default_logging_callback self._message_handler = message_handler or _default_message_handler async def initialize(self) -> types.InitializeResult: - sampling = types.SamplingCapability() - roots = types.RootsCapability( + sampling = types.SamplingCapability() if self._sampling_callback is not _default_sampling_callback else None + elicitation = ( + types.ElicitationCapability() if self._elicitation_callback is not _default_elicitation_callback else None + ) + roots = ( # TODO: Should this be based on whether we # _will_ send notifications, or only whether # they're supported? 
- listChanged=True, + types.RootsCapability(listChanged=True) + if self._list_roots_callback is not _default_list_roots_callback + else None ) result = await self.send_request( @@ -131,6 +151,7 @@ async def initialize(self) -> types.InitializeResult: protocolVersion=types.LATEST_PROTOCOL_VERSION, capabilities=types.ClientCapabilities( sampling=sampling, + elicitation=elicitation, experimental=None, roots=roots, ), @@ -142,15 +163,10 @@ async def initialize(self) -> types.InitializeResult: ) if result.protocolVersion not in SUPPORTED_PROTOCOL_VERSIONS: - raise RuntimeError( - "Unsupported protocol version from the server: " - f"{result.protocolVersion}" - ) + raise RuntimeError("Unsupported protocol version from the server: " f"{result.protocolVersion}") await self.send_notification( - types.ClientNotification( - types.InitializedNotification(method="notifications/initialized") - ) + types.ClientNotification(types.InitializedNotification(method="notifications/initialized")) ) return result @@ -167,7 +183,11 @@ async def send_ping(self) -> types.EmptyResult: ) async def send_progress_notification( - self, progress_token: str | int, progress: float, total: float | None = None + self, + progress_token: str | int, + progress: float, + total: float | None = None, + message: str | None = None, ) -> None: """Send a progress notification.""" await self.send_notification( @@ -178,6 +198,7 @@ async def send_progress_notification( progressToken=progress_token, progress=progress, total=total, + message=message, ), ), ) @@ -195,23 +216,25 @@ async def set_logging_level(self, level: types.LoggingLevel) -> types.EmptyResul types.EmptyResult, ) - async def list_resources(self) -> types.ListResourcesResult: + async def list_resources(self, cursor: str | None = None) -> types.ListResourcesResult: """Send a resources/list request.""" return await self.send_request( types.ClientRequest( types.ListResourcesRequest( method="resources/list", + params=types.PaginatedRequestParams(cursor=cursor) if cursor is not None else None, ) ), types.ListResourcesResult, ) - async def list_resource_templates(self) -> types.ListResourceTemplatesResult: + async def list_resource_templates(self, cursor: str | None = None) -> types.ListResourceTemplatesResult: """Send a resources/templates/list request.""" return await self.send_request( types.ClientRequest( types.ListResourceTemplatesRequest( method="resources/templates/list", + params=types.PaginatedRequestParams(cursor=cursor) if cursor is not None else None, ) ), types.ListResourceTemplatesResult, @@ -254,33 +277,42 @@ async def unsubscribe_resource(self, uri: AnyUrl) -> types.EmptyResult: ) async def call_tool( - self, name: str, arguments: dict[str, Any] | None = None + self, + name: str, + arguments: dict[str, Any] | None = None, + read_timeout_seconds: timedelta | None = None, + progress_callback: ProgressFnT | None = None, ) -> types.CallToolResult: - """Send a tools/call request.""" + """Send a tools/call request with optional progress callback support.""" + return await self.send_request( types.ClientRequest( types.CallToolRequest( method="tools/call", - params=types.CallToolRequestParams(name=name, arguments=arguments), + params=types.CallToolRequestParams( + name=name, + arguments=arguments, + ), ) ), types.CallToolResult, + request_read_timeout_seconds=read_timeout_seconds, + progress_callback=progress_callback, ) - async def list_prompts(self) -> types.ListPromptsResult: + async def list_prompts(self, cursor: str | None = None) -> types.ListPromptsResult: """Send a 
prompts/list request.""" return await self.send_request( types.ClientRequest( types.ListPromptsRequest( method="prompts/list", + params=types.PaginatedRequestParams(cursor=cursor) if cursor is not None else None, ) ), types.ListPromptsResult, ) - async def get_prompt( - self, name: str, arguments: dict[str, str] | None = None - ) -> types.GetPromptResult: + async def get_prompt(self, name: str, arguments: dict[str, str] | None = None) -> types.GetPromptResult: """Send a prompts/get request.""" return await self.send_request( types.ClientRequest( @@ -294,10 +326,15 @@ async def get_prompt( async def complete( self, - ref: types.ResourceReference | types.PromptReference, + ref: types.ResourceTemplateReference | types.PromptReference, argument: dict[str, str], + context_arguments: dict[str, str] | None = None, ) -> types.CompleteResult: """Send a completion/complete request.""" + context = None + if context_arguments is not None: + context = types.CompletionContext(arguments=context_arguments) + return await self.send_request( types.ClientRequest( types.CompleteRequest( @@ -305,18 +342,20 @@ async def complete( params=types.CompleteRequestParams( ref=ref, argument=types.CompletionArgument(**argument), + context=context, ), ) ), types.CompleteResult, ) - async def list_tools(self) -> types.ListToolsResult: + async def list_tools(self, cursor: str | None = None) -> types.ListToolsResult: """Send a tools/list request.""" return await self.send_request( types.ClientRequest( types.ListToolsRequest( method="tools/list", + params=types.PaginatedRequestParams(cursor=cursor) if cursor is not None else None, ) ), types.ListToolsResult, @@ -332,9 +371,7 @@ async def send_roots_list_changed(self) -> None: ) ) - async def _received_request( - self, responder: RequestResponder[types.ServerRequest, types.ClientResult] - ) -> None: + async def _received_request(self, responder: RequestResponder[types.ServerRequest, types.ClientResult]) -> None: ctx = RequestContext[ClientSession, Any]( request_id=responder.request_id, meta=responder.request_meta, @@ -349,6 +386,12 @@ async def _received_request( client_response = ClientResponse.validate_python(response) await responder.respond(client_response) + case types.ElicitRequest(params=params): + with responder: + response = await self._elicitation_callback(ctx, params) + client_response = ClientResponse.validate_python(response) + await responder.respond(client_response) + case types.ListRootsRequest(): with responder: response = await self._list_roots_callback(ctx) @@ -357,22 +400,16 @@ async def _received_request( case types.PingRequest(): with responder: - return await responder.respond( - types.ClientResult(root=types.EmptyResult()) - ) + return await responder.respond(types.ClientResult(root=types.EmptyResult())) async def _handle_incoming( self, - req: RequestResponder[types.ServerRequest, types.ClientResult] - | types.ServerNotification - | Exception, + req: RequestResponder[types.ServerRequest, types.ClientResult] | types.ServerNotification | Exception, ) -> None: """Handle incoming messages by forwarding to the message handler.""" await self._message_handler(req) - async def _received_notification( - self, notification: types.ServerNotification - ) -> None: + async def _received_notification(self, notification: types.ServerNotification) -> None: """Handle notifications from the server.""" # Process specific notification types match notification.root: diff --git a/src/mcp/client/session_group.py b/src/mcp/client/session_group.py new file mode 100644 index 
000000000..700b5417f --- /dev/null +++ b/src/mcp/client/session_group.py @@ -0,0 +1,366 @@ +""" +SessionGroup concurrently manages multiple MCP session connections. + +Tools, resources, and prompts are aggregated across servers. Servers may +be connected to or disconnected from at any point after initialization. + +This abstraction can handle naming collisions using a custom user-provided +hook. +""" + +import contextlib +import logging +from collections.abc import Callable +from datetime import timedelta +from types import TracebackType +from typing import Any, TypeAlias + +import anyio +from pydantic import BaseModel +from typing_extensions import Self + +import mcp +from mcp import types +from mcp.client.sse import sse_client +from mcp.client.stdio import StdioServerParameters +from mcp.client.streamable_http import streamablehttp_client +from mcp.shared.exceptions import McpError + + +class SseServerParameters(BaseModel): + """Parameters for initializing an sse_client.""" + + # The endpoint URL. + url: str + + # Optional headers to include in requests. + headers: dict[str, Any] | None = None + + # HTTP timeout for regular operations. + timeout: float = 5 + + # Timeout for SSE read operations. + sse_read_timeout: float = 60 * 5 + + +class StreamableHttpParameters(BaseModel): + """Parameters for initializing a streamablehttp_client.""" + + # The endpoint URL. + url: str + + # Optional headers to include in requests. + headers: dict[str, Any] | None = None + + # HTTP timeout for regular operations. + timeout: timedelta = timedelta(seconds=30) + + # Timeout for SSE read operations. + sse_read_timeout: timedelta = timedelta(seconds=60 * 5) + + # Close the client session when the transport closes. + terminate_on_close: bool = True + + +ServerParameters: TypeAlias = StdioServerParameters | SseServerParameters | StreamableHttpParameters + + +class ClientSessionGroup: + """Client for managing connections to multiple MCP servers. + + This class is responsible for encapsulating management of server connections. + It aggregates tools, resources, and prompts from all connected servers. + + For auxiliary handlers, such as resource subscription, this is delegated to + the client and can be accessed via the session. + + Example Usage: + name_fn = lambda name, server_info: f"{(server_info.name)}_{name}" + async with ClientSessionGroup(component_name_hook=name_fn) as group: + for server_param in server_params: + await group.connect_to_server(server_param) + ... + + """ + + class _ComponentNames(BaseModel): + """Used for reverse index to find components.""" + + prompts: set[str] = set() + resources: set[str] = set() + tools: set[str] = set() + + # Standard MCP components. + _prompts: dict[str, types.Prompt] + _resources: dict[str, types.Resource] + _tools: dict[str, types.Tool] + + # Client-server connection management. + _sessions: dict[mcp.ClientSession, _ComponentNames] + _tool_to_session: dict[str, mcp.ClientSession] + _exit_stack: contextlib.AsyncExitStack + _session_exit_stacks: dict[mcp.ClientSession, contextlib.AsyncExitStack] + + # Optional fn consuming (component_name, serverInfo) for custom names. + # This provides a means to mitigate naming conflicts across servers.
+ # Example: (tool_name, serverInfo) => "{result.serverInfo.name}.{tool_name}" + _ComponentNameHook: TypeAlias = Callable[[str, types.Implementation], str] + _component_name_hook: _ComponentNameHook | None + + def __init__( + self, + exit_stack: contextlib.AsyncExitStack | None = None, + component_name_hook: _ComponentNameHook | None = None, + ) -> None: + """Initializes the MCP client.""" + + self._tools = {} + self._resources = {} + self._prompts = {} + + self._sessions = {} + self._tool_to_session = {} + if exit_stack is None: + self._exit_stack = contextlib.AsyncExitStack() + self._owns_exit_stack = True + else: + self._exit_stack = exit_stack + self._owns_exit_stack = False + self._session_exit_stacks = {} + self._component_name_hook = component_name_hook + + async def __aenter__(self) -> Self: + # Enter the exit stack only if we created it ourselves + if self._owns_exit_stack: + await self._exit_stack.__aenter__() + return self + + async def __aexit__( + self, + _exc_type: type[BaseException] | None, + _exc_val: BaseException | None, + _exc_tb: TracebackType | None, + ) -> bool | None: + """Closes session exit stacks and main exit stack upon completion.""" + + # Only close the main exit stack if we created it + if self._owns_exit_stack: + await self._exit_stack.aclose() + + # Concurrently close session stacks. + async with anyio.create_task_group() as tg: + for exit_stack in self._session_exit_stacks.values(): + tg.start_soon(exit_stack.aclose) + + @property + def sessions(self) -> list[mcp.ClientSession]: + """Returns the list of sessions being managed.""" + return list(self._sessions.keys()) + + @property + def prompts(self) -> dict[str, types.Prompt]: + """Returns the prompts as a dictionary of names to prompts.""" + return self._prompts + + @property + def resources(self) -> dict[str, types.Resource]: + """Returns the resources as a dictionary of names to resources.""" + return self._resources + + @property + def tools(self) -> dict[str, types.Tool]: + """Returns the tools as a dictionary of names to tools.""" + return self._tools + + async def call_tool(self, name: str, args: dict[str, Any]) -> types.CallToolResult: + """Executes a tool given its name and arguments.""" + session = self._tool_to_session[name] + session_tool_name = self.tools[name].name + return await session.call_tool(session_tool_name, args) + + async def disconnect_from_server(self, session: mcp.ClientSession) -> None: + """Disconnects from a single MCP server.""" + + session_known_for_components = session in self._sessions + session_known_for_stack = session in self._session_exit_stacks + + if not session_known_for_components and not session_known_for_stack: + raise McpError( + types.ErrorData( + code=types.INVALID_PARAMS, + message="Provided session is not managed or already disconnected.", + ) + ) + + if session_known_for_components: + component_names = self._sessions.pop(session) # Pop from _sessions tracking + + # Remove prompts associated with the session. + for name in component_names.prompts: + if name in self._prompts: + del self._prompts[name] + # Remove resources associated with the session. + for name in component_names.resources: + if name in self._resources: + del self._resources[name] + # Remove tools associated with the session. 
+ for name in component_names.tools: + if name in self._tools: + del self._tools[name] + if name in self._tool_to_session: + del self._tool_to_session[name] + + # Clean up the session's resources via its dedicated exit stack + if session_known_for_stack: + session_stack_to_close = self._session_exit_stacks.pop(session) + await session_stack_to_close.aclose() + + async def connect_with_session( + self, server_info: types.Implementation, session: mcp.ClientSession + ) -> mcp.ClientSession: + """Connects to a single MCP server.""" + await self._aggregate_components(server_info, session) + return session + + async def connect_to_server( + self, + server_params: ServerParameters, + ) -> mcp.ClientSession: + """Connects to a single MCP server.""" + server_info, session = await self._establish_session(server_params) + return await self.connect_with_session(server_info, session) + + async def _establish_session( + self, server_params: ServerParameters + ) -> tuple[types.Implementation, mcp.ClientSession]: + """Establish a client session to an MCP server.""" + + session_stack = contextlib.AsyncExitStack() + try: + # Create read and write streams that facilitate io with the server. + if isinstance(server_params, StdioServerParameters): + client = mcp.stdio_client(server_params) + read, write = await session_stack.enter_async_context(client) + elif isinstance(server_params, SseServerParameters): + client = sse_client( + url=server_params.url, + headers=server_params.headers, + timeout=server_params.timeout, + sse_read_timeout=server_params.sse_read_timeout, + ) + read, write = await session_stack.enter_async_context(client) + else: + client = streamablehttp_client( + url=server_params.url, + headers=server_params.headers, + timeout=server_params.timeout, + sse_read_timeout=server_params.sse_read_timeout, + terminate_on_close=server_params.terminate_on_close, + ) + read, write, _ = await session_stack.enter_async_context(client) + + session = await session_stack.enter_async_context(mcp.ClientSession(read, write)) + result = await session.initialize() + + # Session successfully initialized. + # Store its stack and register the stack with the main group stack. + self._session_exit_stacks[session] = session_stack + # session_stack itself becomes a resource managed by the + # main _exit_stack. + await self._exit_stack.enter_async_context(session_stack) + + return result.serverInfo, session + except Exception: + # If anything during this setup fails, ensure the session-specific + # stack is closed. + await session_stack.aclose() + raise + + async def _aggregate_components(self, server_info: types.Implementation, session: mcp.ClientSession) -> None: + """Aggregates prompts, resources, and tools from a given session.""" + + # Create a reverse index so we can find all prompts, resources, and + # tools belonging to this session. Used for removing components from + # the session group via self.disconnect_from_server. + component_names = self._ComponentNames() + + # Temporary components dicts. We do not want to modify the aggregate + # lists in case of an intermediate failure. + prompts_temp: dict[str, types.Prompt] = {} + resources_temp: dict[str, types.Resource] = {} + tools_temp: dict[str, types.Tool] = {} + tool_to_session_temp: dict[str, mcp.ClientSession] = {} + + # Query the server for its prompts and aggregate to list. 
+ try: + prompts = (await session.list_prompts()).prompts + for prompt in prompts: + name = self._component_name(prompt.name, server_info) + prompts_temp[name] = prompt + component_names.prompts.add(name) + except McpError as err: + logging.warning(f"Could not fetch prompts: {err}") + + # Query the server for its resources and aggregate to list. + try: + resources = (await session.list_resources()).resources + for resource in resources: + name = self._component_name(resource.name, server_info) + resources_temp[name] = resource + component_names.resources.add(name) + except McpError as err: + logging.warning(f"Could not fetch resources: {err}") + + # Query the server for its tools and aggregate to list. + try: + tools = (await session.list_tools()).tools + for tool in tools: + name = self._component_name(tool.name, server_info) + tools_temp[name] = tool + tool_to_session_temp[name] = session + component_names.tools.add(name) + except McpError as err: + logging.warning(f"Could not fetch tools: {err}") + + # Clean up exit stack for session if we couldn't retrieve anything + # from the server. + if not any((prompts_temp, resources_temp, tools_temp)): + del self._session_exit_stacks[session] + + # Check for duplicates. + matching_prompts = prompts_temp.keys() & self._prompts.keys() + if matching_prompts: + raise McpError( + types.ErrorData( + code=types.INVALID_PARAMS, + message=f"{matching_prompts} already exist in group prompts.", + ) + ) + matching_resources = resources_temp.keys() & self._resources.keys() + if matching_resources: + raise McpError( + types.ErrorData( + code=types.INVALID_PARAMS, + message=f"{matching_resources} already exist in group resources.", + ) + ) + matching_tools = tools_temp.keys() & self._tools.keys() + if matching_tools: + raise McpError( + types.ErrorData( + code=types.INVALID_PARAMS, + message=f"{matching_tools} already exist in group tools.", + ) + ) + + # Aggregate components. + self._sessions[session] = component_names + self._prompts.update(prompts_temp) + self._resources.update(resources_temp) + self._tools.update(tools_temp) + self._tool_to_session.update(tool_to_session_temp) + + def _component_name(self, name: str, server_info: types.Implementation) -> str: + if self._component_name_hook: + return self._component_name_hook(name, server_info) + return name diff --git a/src/mcp/client/sse.py b/src/mcp/client/sse.py index 4f6241a72..0c05c6def 100644 --- a/src/mcp/client/sse.py +++ b/src/mcp/client/sse.py @@ -10,6 +10,8 @@ from httpx_sse import aconnect_sse import mcp.types as types +from mcp.shared._httpx_utils import McpHttpClientFactory, create_mcp_http_client +from mcp.shared.message import SessionMessage logger = logging.getLogger(__name__) @@ -24,31 +26,41 @@ async def sse_client( headers: dict[str, Any] | None = None, timeout: float = 5, sse_read_timeout: float = 60 * 5, + httpx_client_factory: McpHttpClientFactory = create_mcp_http_client, + auth: httpx.Auth | None = None, ): """ Client transport for SSE. `sse_read_timeout` determines how long (in seconds) the client will wait for a new event before disconnecting. All other HTTP operations are controlled by `timeout`. + + Args: + url: The SSE endpoint URL. + headers: Optional headers to include in requests. + timeout: HTTP timeout for regular operations. + sse_read_timeout: Timeout for SSE read operations. + auth: Optional HTTPX authentication handler. 
""" - read_stream: MemoryObjectReceiveStream[types.JSONRPCMessage | Exception] - read_stream_writer: MemoryObjectSendStream[types.JSONRPCMessage | Exception] + read_stream: MemoryObjectReceiveStream[SessionMessage | Exception] + read_stream_writer: MemoryObjectSendStream[SessionMessage | Exception] - write_stream: MemoryObjectSendStream[types.JSONRPCMessage] - write_stream_reader: MemoryObjectReceiveStream[types.JSONRPCMessage] + write_stream: MemoryObjectSendStream[SessionMessage] + write_stream_reader: MemoryObjectReceiveStream[SessionMessage] read_stream_writer, read_stream = anyio.create_memory_object_stream(0) write_stream, write_stream_reader = anyio.create_memory_object_stream(0) async with anyio.create_task_group() as tg: try: - logger.info(f"Connecting to SSE endpoint: {remove_request_params(url)}") - async with httpx.AsyncClient(headers=headers) as client: + logger.debug(f"Connecting to SSE endpoint: {remove_request_params(url)}") + async with httpx_client_factory( + headers=headers, auth=auth, timeout=httpx.Timeout(timeout, read=sse_read_timeout) + ) as client: async with aconnect_sse( client, "GET", url, - timeout=httpx.Timeout(timeout, read=sse_read_timeout), ) as event_source: event_source.response.raise_for_status() logger.debug("SSE connection established") @@ -62,20 +74,16 @@ async def sse_reader( match sse.event: case "endpoint": endpoint_url = urljoin(url, sse.data) - logger.info( - f"Received endpoint URL: {endpoint_url}" - ) + logger.debug(f"Received endpoint URL: {endpoint_url}") url_parsed = urlparse(url) endpoint_parsed = urlparse(endpoint_url) if ( url_parsed.netloc != endpoint_parsed.netloc - or url_parsed.scheme - != endpoint_parsed.scheme + or url_parsed.scheme != endpoint_parsed.scheme ): error_msg = ( - "Endpoint origin does not match " - f"connection origin: {endpoint_url}" + "Endpoint origin does not match " f"connection origin: {endpoint_url}" ) logger.error(error_msg) raise ValueError(error_msg) @@ -87,21 +95,16 @@ async def sse_reader( message = types.JSONRPCMessage.model_validate_json( # noqa: E501 sse.data ) - logger.debug( - f"Received server message: {message}" - ) + logger.debug(f"Received server message: {message}") except Exception as exc: - logger.error( - f"Error parsing server message: {exc}" - ) + logger.error(f"Error parsing server message: {exc}") await read_stream_writer.send(exc) continue - await read_stream_writer.send(message) + session_message = SessionMessage(message) + await read_stream_writer.send(session_message) case _: - logger.warning( - f"Unknown SSE event: {sse.event}" - ) + logger.warning(f"Unknown SSE event: {sse.event}") except Exception as exc: logger.error(f"Error in sse_reader: {exc}") await read_stream_writer.send(exc) @@ -111,30 +114,25 @@ async def sse_reader( async def post_writer(endpoint_url: str): try: async with write_stream_reader: - async for message in write_stream_reader: - logger.debug(f"Sending client message: {message}") + async for session_message in write_stream_reader: + logger.debug(f"Sending client message: {session_message}") response = await client.post( endpoint_url, - json=message.model_dump( + json=session_message.message.model_dump( by_alias=True, mode="json", exclude_none=True, ), ) response.raise_for_status() - logger.debug( - "Client message sent successfully: " - f"{response.status_code}" - ) + logger.debug("Client message sent successfully: " f"{response.status_code}") except Exception as exc: logger.error(f"Error in post_writer: {exc}") finally: await write_stream.aclose() endpoint_url = 
await tg.start(sse_reader) - logger.info( - f"Starting post writer with endpoint URL: {endpoint_url}" - ) + logger.debug(f"Starting post writer with endpoint URL: {endpoint_url}") tg.start_soon(post_writer, endpoint_url) try: diff --git a/src/mcp/client/stdio/__init__.py b/src/mcp/client/stdio/__init__.py index 83de57a2b..a75cfd764 100644 --- a/src/mcp/client/stdio/__init__.py +++ b/src/mcp/client/stdio/__init__.py @@ -11,6 +11,7 @@ from pydantic import BaseModel, Field import mcp.types as types +from mcp.shared.message import SessionMessage from .win32 import ( create_windows_process, @@ -98,29 +99,33 @@ async def stdio_client(server: StdioServerParameters, errlog: TextIO = sys.stder Client transport for stdio: this will connect to a server by spawning a process and communicating with it over stdin/stdout. """ - read_stream: MemoryObjectReceiveStream[types.JSONRPCMessage | Exception] - read_stream_writer: MemoryObjectSendStream[types.JSONRPCMessage | Exception] + read_stream: MemoryObjectReceiveStream[SessionMessage | Exception] + read_stream_writer: MemoryObjectSendStream[SessionMessage | Exception] - write_stream: MemoryObjectSendStream[types.JSONRPCMessage] - write_stream_reader: MemoryObjectReceiveStream[types.JSONRPCMessage] + write_stream: MemoryObjectSendStream[SessionMessage] + write_stream_reader: MemoryObjectReceiveStream[SessionMessage] read_stream_writer, read_stream = anyio.create_memory_object_stream(0) write_stream, write_stream_reader = anyio.create_memory_object_stream(0) - command = _get_executable_command(server.command) - - # Open process with stderr piped for capture - process = await _create_platform_compatible_process( - command=command, - args=server.args, - env=( - {**get_default_environment(), **server.env} - if server.env is not None - else get_default_environment() - ), - errlog=errlog, - cwd=server.cwd, - ) + try: + command = _get_executable_command(server.command) + + # Open process with stderr piped for capture + process = await _create_platform_compatible_process( + command=command, + args=server.args, + env=({**get_default_environment(), **server.env} if server.env is not None else get_default_environment()), + errlog=errlog, + cwd=server.cwd, + ) + except OSError: + # Clean up streams if process creation fails + await read_stream.aclose() + await write_stream.aclose() + await read_stream_writer.aclose() + await write_stream_reader.aclose() + raise async def stdout_reader(): assert process.stdout, "Opened process is missing stdout" @@ -143,7 +148,8 @@ async def stdout_reader(): await read_stream_writer.send(exc) continue - await read_stream_writer.send(message) + session_message = SessionMessage(message) + await read_stream_writer.send(session_message) except anyio.ClosedResourceError: await anyio.lowlevel.checkpoint() @@ -152,8 +158,8 @@ async def stdin_writer(): try: async with write_stream_reader: - async for message in write_stream_reader: - json = message.model_dump_json(by_alias=True, exclude_none=True) + async for session_message in write_stream_reader: + json = session_message.message.model_dump_json(by_alias=True, exclude_none=True) await process.stdin.send( (json + "\n").encode( encoding=server.encoding, @@ -173,10 +179,18 @@ async def stdin_writer(): yield read_stream, write_stream finally: # Clean up process to prevent any dangling orphaned processes - if sys.platform == "win32": - await terminate_windows_process(process) - else: - process.terminate() + try: + if sys.platform == "win32": + await terminate_windows_process(process) + else: + 
process.terminate() + except ProcessLookupError: + # Process already exited, which is fine + pass + await read_stream.aclose() + await write_stream.aclose() + await read_stream_writer.aclose() + await write_stream_reader.aclose() def _get_executable_command(command: str) -> str: @@ -209,8 +223,6 @@ async def _create_platform_compatible_process( if sys.platform == "win32": process = await create_windows_process(command, args, env, errlog, cwd) else: - process = await anyio.open_process( - [command, *args], env=env, stderr=errlog, cwd=cwd - ) + process = await anyio.open_process([command, *args], env=env, stderr=errlog, cwd=cwd) return process diff --git a/src/mcp/client/stdio/win32.py b/src/mcp/client/stdio/win32.py index 825a0477d..e4f252dc9 100644 --- a/src/mcp/client/stdio/win32.py +++ b/src/mcp/client/stdio/win32.py @@ -82,9 +82,7 @@ async def create_windows_process( return process except Exception: # Don't raise, let's try to create the process without creation flags - process = await anyio.open_process( - [command, *args], env=env, stderr=errlog, cwd=cwd - ) + process = await anyio.open_process([command, *args], env=env, stderr=errlog, cwd=cwd) return process diff --git a/src/mcp/client/streamable_http.py b/src/mcp/client/streamable_http.py new file mode 100644 index 000000000..39ac34d8a --- /dev/null +++ b/src/mcp/client/streamable_http.py @@ -0,0 +1,509 @@ +""" +StreamableHTTP Client Transport Module + +This module implements the StreamableHTTP transport for MCP clients, +providing support for HTTP POST requests with optional SSE streaming responses +and session management. +""" + +import logging +from collections.abc import AsyncGenerator, Awaitable, Callable +from contextlib import asynccontextmanager +from dataclasses import dataclass +from datetime import timedelta + +import anyio +import httpx +from anyio.abc import TaskGroup +from anyio.streams.memory import MemoryObjectReceiveStream, MemoryObjectSendStream +from httpx_sse import EventSource, ServerSentEvent, aconnect_sse + +from mcp.shared._httpx_utils import McpHttpClientFactory, create_mcp_http_client +from mcp.shared.message import ClientMessageMetadata, SessionMessage +from mcp.types import ( + ErrorData, + InitializeResult, + JSONRPCError, + JSONRPCMessage, + JSONRPCNotification, + JSONRPCRequest, + JSONRPCResponse, + RequestId, +) + +logger = logging.getLogger(__name__) + + +SessionMessageOrError = SessionMessage | Exception +StreamWriter = MemoryObjectSendStream[SessionMessageOrError] +StreamReader = MemoryObjectReceiveStream[SessionMessage] +GetSessionIdCallback = Callable[[], str | None] + +MCP_SESSION_ID = "mcp-session-id" +MCP_PROTOCOL_VERSION = "mcp-protocol-version" +LAST_EVENT_ID = "last-event-id" +CONTENT_TYPE = "content-type" +ACCEPT = "Accept" + + +JSON = "application/json" +SSE = "text/event-stream" + + +class StreamableHTTPError(Exception): + """Base exception for StreamableHTTP transport errors.""" + + +class ResumptionError(StreamableHTTPError): + """Raised when resumption request is invalid.""" + + +@dataclass +class RequestContext: + """Context for a request operation.""" + + client: httpx.AsyncClient + headers: dict[str, str] + session_id: str | None + session_message: SessionMessage + metadata: ClientMessageMetadata | None + read_stream_writer: StreamWriter + sse_read_timeout: float + + +class StreamableHTTPTransport: + """StreamableHTTP client transport implementation.""" + + def __init__( + self, + url: str, + headers: dict[str, str] | None = None, + timeout: float | timedelta = 30, + 
sse_read_timeout: float | timedelta = 60 * 5, + auth: httpx.Auth | None = None, + ) -> None: + """Initialize the StreamableHTTP transport. + + Args: + url: The endpoint URL. + headers: Optional headers to include in requests. + timeout: HTTP timeout for regular operations. + sse_read_timeout: Timeout for SSE read operations. + auth: Optional HTTPX authentication handler. + """ + self.url = url + self.headers = headers or {} + self.timeout = timeout.total_seconds() if isinstance(timeout, timedelta) else timeout + self.sse_read_timeout = ( + sse_read_timeout.total_seconds() if isinstance(sse_read_timeout, timedelta) else sse_read_timeout + ) + self.auth = auth + self.session_id = None + self.protocol_version = None + self.request_headers = { + ACCEPT: f"{JSON}, {SSE}", + CONTENT_TYPE: JSON, + **self.headers, + } + + def _prepare_request_headers(self, base_headers: dict[str, str]) -> dict[str, str]: + """Update headers with session ID and protocol version if available.""" + headers = base_headers.copy() + if self.session_id: + headers[MCP_SESSION_ID] = self.session_id + if self.protocol_version: + headers[MCP_PROTOCOL_VERSION] = self.protocol_version + return headers + + def _is_initialization_request(self, message: JSONRPCMessage) -> bool: + """Check if the message is an initialization request.""" + return isinstance(message.root, JSONRPCRequest) and message.root.method == "initialize" + + def _is_initialized_notification(self, message: JSONRPCMessage) -> bool: + """Check if the message is an initialized notification.""" + return isinstance(message.root, JSONRPCNotification) and message.root.method == "notifications/initialized" + + def _maybe_extract_session_id_from_response( + self, + response: httpx.Response, + ) -> None: + """Extract and store session ID from response headers.""" + new_session_id = response.headers.get(MCP_SESSION_ID) + if new_session_id: + self.session_id = new_session_id + logger.info(f"Received session ID: {self.session_id}") + + def _maybe_extract_protocol_version_from_message( + self, + message: JSONRPCMessage, + ) -> None: + """Extract protocol version from initialization response message.""" + if isinstance(message.root, JSONRPCResponse) and message.root.result: + try: + # Parse the result as InitializeResult for type safety + init_result = InitializeResult.model_validate(message.root.result) + self.protocol_version = str(init_result.protocolVersion) + logger.info(f"Negotiated protocol version: {self.protocol_version}") + except Exception as exc: + logger.warning(f"Failed to parse initialization response as InitializeResult: {exc}") + logger.warning(f"Raw result: {message.root.result}") + + async def _handle_sse_event( + self, + sse: ServerSentEvent, + read_stream_writer: StreamWriter, + original_request_id: RequestId | None = None, + resumption_callback: Callable[[str], Awaitable[None]] | None = None, + is_initialization: bool = False, + ) -> bool: + """Handle an SSE event, returning True if the response is complete.""" + if sse.event == "message": + try: + message = JSONRPCMessage.model_validate_json(sse.data) + logger.debug(f"SSE message: {message}") + + # Extract protocol version from initialization response + if is_initialization: + self._maybe_extract_protocol_version_from_message(message) + + # If this is a response and we have original_request_id, replace it + if original_request_id is not None and isinstance(message.root, JSONRPCResponse | JSONRPCError): + message.root.id = original_request_id + + session_message = SessionMessage(message) + await 
read_stream_writer.send(session_message) + + # Call resumption token callback if we have an ID + if sse.id and resumption_callback: + await resumption_callback(sse.id) + + # If this is a response or error return True indicating completion + # Otherwise, return False to continue listening + return isinstance(message.root, JSONRPCResponse | JSONRPCError) + + except Exception as exc: + logger.exception("Error parsing SSE message") + await read_stream_writer.send(exc) + return False + else: + logger.warning(f"Unknown SSE event: {sse.event}") + return False + + async def handle_get_stream( + self, + client: httpx.AsyncClient, + read_stream_writer: StreamWriter, + ) -> None: + """Handle GET stream for server-initiated messages.""" + try: + if not self.session_id: + return + + headers = self._prepare_request_headers(self.request_headers) + + async with aconnect_sse( + client, + "GET", + self.url, + headers=headers, + timeout=httpx.Timeout(self.timeout, read=self.sse_read_timeout), + ) as event_source: + event_source.response.raise_for_status() + logger.debug("GET SSE connection established") + + async for sse in event_source.aiter_sse(): + await self._handle_sse_event(sse, read_stream_writer) + + except Exception as exc: + logger.debug(f"GET stream error (non-fatal): {exc}") + + async def _handle_resumption_request(self, ctx: RequestContext) -> None: + """Handle a resumption request using GET with SSE.""" + headers = self._prepare_request_headers(ctx.headers) + if ctx.metadata and ctx.metadata.resumption_token: + headers[LAST_EVENT_ID] = ctx.metadata.resumption_token + else: + raise ResumptionError("Resumption request requires a resumption token") + + # Extract original request ID to map responses + original_request_id = None + if isinstance(ctx.session_message.message.root, JSONRPCRequest): + original_request_id = ctx.session_message.message.root.id + + async with aconnect_sse( + ctx.client, + "GET", + self.url, + headers=headers, + timeout=httpx.Timeout(self.timeout, read=self.sse_read_timeout), + ) as event_source: + event_source.response.raise_for_status() + logger.debug("Resumption GET SSE connection established") + + async for sse in event_source.aiter_sse(): + is_complete = await self._handle_sse_event( + sse, + ctx.read_stream_writer, + original_request_id, + ctx.metadata.on_resumption_token_update if ctx.metadata else None, + ) + if is_complete: + break + + async def _handle_post_request(self, ctx: RequestContext) -> None: + """Handle a POST request with response processing.""" + headers = self._prepare_request_headers(ctx.headers) + message = ctx.session_message.message + is_initialization = self._is_initialization_request(message) + + async with ctx.client.stream( + "POST", + self.url, + json=message.model_dump(by_alias=True, mode="json", exclude_none=True), + headers=headers, + ) as response: + if response.status_code == 202: + logger.debug("Received 202 Accepted") + return + + if response.status_code == 404: + if isinstance(message.root, JSONRPCRequest): + await self._send_session_terminated_error( + ctx.read_stream_writer, + message.root.id, + ) + return + + response.raise_for_status() + if is_initialization: + self._maybe_extract_session_id_from_response(response) + + content_type = response.headers.get(CONTENT_TYPE, "").lower() + + if content_type.startswith(JSON): + await self._handle_json_response(response, ctx.read_stream_writer, is_initialization) + elif content_type.startswith(SSE): + await self._handle_sse_response(response, ctx, is_initialization) + else: + await 
self._handle_unexpected_content_type( + content_type, + ctx.read_stream_writer, + ) + + async def _handle_json_response( + self, + response: httpx.Response, + read_stream_writer: StreamWriter, + is_initialization: bool = False, + ) -> None: + """Handle JSON response from the server.""" + try: + content = await response.aread() + message = JSONRPCMessage.model_validate_json(content) + + # Extract protocol version from initialization response + if is_initialization: + self._maybe_extract_protocol_version_from_message(message) + + session_message = SessionMessage(message) + await read_stream_writer.send(session_message) + except Exception as exc: + logger.error(f"Error parsing JSON response: {exc}") + await read_stream_writer.send(exc) + + async def _handle_sse_response( + self, + response: httpx.Response, + ctx: RequestContext, + is_initialization: bool = False, + ) -> None: + """Handle SSE response from the server.""" + try: + event_source = EventSource(response) + async for sse in event_source.aiter_sse(): + is_complete = await self._handle_sse_event( + sse, + ctx.read_stream_writer, + resumption_callback=(ctx.metadata.on_resumption_token_update if ctx.metadata else None), + is_initialization=is_initialization, + ) + # If the SSE event indicates completion, like returning respose/error + # break the loop + if is_complete: + break + except Exception as e: + logger.exception("Error reading SSE stream:") + await ctx.read_stream_writer.send(e) + + async def _handle_unexpected_content_type( + self, + content_type: str, + read_stream_writer: StreamWriter, + ) -> None: + """Handle unexpected content type in response.""" + error_msg = f"Unexpected content type: {content_type}" + logger.error(error_msg) + await read_stream_writer.send(ValueError(error_msg)) + + async def _send_session_terminated_error( + self, + read_stream_writer: StreamWriter, + request_id: RequestId, + ) -> None: + """Send a session terminated error response.""" + jsonrpc_error = JSONRPCError( + jsonrpc="2.0", + id=request_id, + error=ErrorData(code=32600, message="Session terminated"), + ) + session_message = SessionMessage(JSONRPCMessage(jsonrpc_error)) + await read_stream_writer.send(session_message) + + async def post_writer( + self, + client: httpx.AsyncClient, + write_stream_reader: StreamReader, + read_stream_writer: StreamWriter, + write_stream: MemoryObjectSendStream[SessionMessage], + start_get_stream: Callable[[], None], + tg: TaskGroup, + ) -> None: + """Handle writing requests to the server.""" + try: + async with write_stream_reader: + async for session_message in write_stream_reader: + message = session_message.message + metadata = ( + session_message.metadata + if isinstance(session_message.metadata, ClientMessageMetadata) + else None + ) + + # Check if this is a resumption request + is_resumption = bool(metadata and metadata.resumption_token) + + logger.debug(f"Sending client message: {message}") + + # Handle initialized notification + if self._is_initialized_notification(message): + start_get_stream() + + ctx = RequestContext( + client=client, + headers=self.request_headers, + session_id=self.session_id, + session_message=session_message, + metadata=metadata, + read_stream_writer=read_stream_writer, + sse_read_timeout=self.sse_read_timeout, + ) + + async def handle_request_async(): + if is_resumption: + await self._handle_resumption_request(ctx) + else: + await self._handle_post_request(ctx) + + # If this is a request, start a new task to handle it + if isinstance(message.root, JSONRPCRequest): + 
tg.start_soon(handle_request_async) + else: + await handle_request_async() + + except Exception as exc: + logger.error(f"Error in post_writer: {exc}") + finally: + await read_stream_writer.aclose() + await write_stream.aclose() + + async def terminate_session(self, client: httpx.AsyncClient) -> None: + """Terminate the session by sending a DELETE request.""" + if not self.session_id: + return + + try: + headers = self._prepare_request_headers(self.request_headers) + response = await client.delete(self.url, headers=headers) + + if response.status_code == 405: + logger.debug("Server does not allow session termination") + elif response.status_code not in (200, 204): + logger.warning(f"Session termination failed: {response.status_code}") + except Exception as exc: + logger.warning(f"Session termination failed: {exc}") + + def get_session_id(self) -> str | None: + """Get the current session ID.""" + return self.session_id + + +@asynccontextmanager +async def streamablehttp_client( + url: str, + headers: dict[str, str] | None = None, + timeout: float | timedelta = 30, + sse_read_timeout: float | timedelta = 60 * 5, + terminate_on_close: bool = True, + httpx_client_factory: McpHttpClientFactory = create_mcp_http_client, + auth: httpx.Auth | None = None, +) -> AsyncGenerator[ + tuple[ + MemoryObjectReceiveStream[SessionMessage | Exception], + MemoryObjectSendStream[SessionMessage], + GetSessionIdCallback, + ], + None, +]: + """ + Client transport for StreamableHTTP. + + `sse_read_timeout` determines how long (in seconds) the client will wait for a new + event before disconnecting. All other HTTP operations are controlled by `timeout`. + + Yields: + Tuple containing: + - read_stream: Stream for reading messages from the server + - write_stream: Stream for sending messages to the server + - get_session_id_callback: Function to retrieve the current session ID + """ + transport = StreamableHTTPTransport(url, headers, timeout, sse_read_timeout, auth) + + read_stream_writer, read_stream = anyio.create_memory_object_stream[SessionMessage | Exception](0) + write_stream, write_stream_reader = anyio.create_memory_object_stream[SessionMessage](0) + + async with anyio.create_task_group() as tg: + try: + logger.debug(f"Connecting to StreamableHTTP endpoint: {url}") + + async with httpx_client_factory( + headers=transport.request_headers, + timeout=httpx.Timeout(transport.timeout, read=transport.sse_read_timeout), + auth=transport.auth, + ) as client: + # Define callbacks that need access to tg + def start_get_stream() -> None: + tg.start_soon(transport.handle_get_stream, client, read_stream_writer) + + tg.start_soon( + transport.post_writer, + client, + write_stream_reader, + read_stream_writer, + write_stream, + start_get_stream, + tg, + ) + + try: + yield ( + read_stream, + write_stream, + transport.get_session_id, + ) + finally: + if transport.session_id and terminate_on_close: + await transport.terminate_session(client) + tg.cancel_scope.cancel() + finally: + await read_stream_writer.aclose() + await write_stream.aclose() diff --git a/src/mcp/client/websocket.py b/src/mcp/client/websocket.py index 2c2ed38b9..0a371610b 100644 --- a/src/mcp/client/websocket.py +++ b/src/mcp/client/websocket.py @@ -10,6 +10,7 @@ from websockets.typing import Subprotocol import mcp.types as types +from mcp.shared.message import SessionMessage logger = logging.getLogger(__name__) @@ -18,10 +19,7 @@ async def websocket_client( url: str, ) -> AsyncGenerator[ - tuple[ - MemoryObjectReceiveStream[types.JSONRPCMessage | Exception], 
- MemoryObjectSendStream[types.JSONRPCMessage], - ], + tuple[MemoryObjectReceiveStream[SessionMessage | Exception], MemoryObjectSendStream[SessionMessage]], None, ]: """ @@ -39,10 +37,10 @@ async def websocket_client( # Create two in-memory streams: # - One for incoming messages (read_stream, written by ws_reader) # - One for outgoing messages (write_stream, read by ws_writer) - read_stream: MemoryObjectReceiveStream[types.JSONRPCMessage | Exception] - read_stream_writer: MemoryObjectSendStream[types.JSONRPCMessage | Exception] - write_stream: MemoryObjectSendStream[types.JSONRPCMessage] - write_stream_reader: MemoryObjectReceiveStream[types.JSONRPCMessage] + read_stream: MemoryObjectReceiveStream[SessionMessage | Exception] + read_stream_writer: MemoryObjectSendStream[SessionMessage | Exception] + write_stream: MemoryObjectSendStream[SessionMessage] + write_stream_reader: MemoryObjectReceiveStream[SessionMessage] read_stream_writer, read_stream = anyio.create_memory_object_stream(0) write_stream, write_stream_reader = anyio.create_memory_object_stream(0) @@ -59,7 +57,8 @@ async def ws_reader(): async for raw_text in ws: try: message = types.JSONRPCMessage.model_validate_json(raw_text) - await read_stream_writer.send(message) + session_message = SessionMessage(message) + await read_stream_writer.send(session_message) except ValidationError as exc: # If JSON parse or model validation fails, send the exception await read_stream_writer.send(exc) @@ -70,11 +69,9 @@ async def ws_writer(): sends them to the server. """ async with write_stream_reader: - async for message in write_stream_reader: + async for session_message in write_stream_reader: # Convert to a dict, then to JSON - msg_dict = message.model_dump( - by_alias=True, mode="json", exclude_none=True - ) + msg_dict = session_message.message.model_dump(by_alias=True, mode="json", exclude_none=True) await ws.send(json.dumps(msg_dict)) async with anyio.create_task_group() as tg: diff --git a/src/mcp/server/auth/__init__.py b/src/mcp/server/auth/__init__.py new file mode 100644 index 000000000..6888ffe8d --- /dev/null +++ b/src/mcp/server/auth/__init__.py @@ -0,0 +1,3 @@ +""" +MCP OAuth server authorization components. +""" diff --git a/src/mcp/server/auth/errors.py b/src/mcp/server/auth/errors.py new file mode 100644 index 000000000..117deea83 --- /dev/null +++ b/src/mcp/server/auth/errors.py @@ -0,0 +1,5 @@ +from pydantic import ValidationError + + +def stringify_pydantic_error(validation_error: ValidationError) -> str: + return "\n".join(f"{'.'.join(str(loc) for loc in e['loc'])}: {e['msg']}" for e in validation_error.errors()) diff --git a/src/mcp/server/auth/handlers/__init__.py b/src/mcp/server/auth/handlers/__init__.py new file mode 100644 index 000000000..e99a62de1 --- /dev/null +++ b/src/mcp/server/auth/handlers/__init__.py @@ -0,0 +1,3 @@ +""" +Request handlers for MCP authorization endpoints. 
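For reference, a minimal sketch of driving the reworked client transports end to end, here with the new streamablehttp_client; the endpoint URL is illustrative and the surrounding ClientSession usage is the SDK's usual session layer rather than part of this change:

    import anyio

    from mcp import ClientSession
    from mcp.client.streamable_http import streamablehttp_client


    async def main() -> None:
        # The transport yields SessionMessage streams plus a session-id callback.
        async with streamablehttp_client("https://example.com/mcp") as (read_stream, write_stream, get_session_id):
            async with ClientSession(read_stream, write_stream) as session:
                await session.initialize()
                print("session id:", get_session_id())  # populated once the server assigns one
                tools = await session.list_tools()
                print([tool.name for tool in tools.tools])


    anyio.run(main)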
+""" diff --git a/src/mcp/server/auth/handlers/authorize.py b/src/mcp/server/auth/handlers/authorize.py new file mode 100644 index 000000000..3ce4c34bc --- /dev/null +++ b/src/mcp/server/auth/handlers/authorize.py @@ -0,0 +1,224 @@ +import logging +from dataclasses import dataclass +from typing import Any, Literal + +from pydantic import AnyUrl, BaseModel, Field, RootModel, ValidationError +from starlette.datastructures import FormData, QueryParams +from starlette.requests import Request +from starlette.responses import RedirectResponse, Response + +from mcp.server.auth.errors import stringify_pydantic_error +from mcp.server.auth.json_response import PydanticJSONResponse +from mcp.server.auth.provider import ( + AuthorizationErrorCode, + AuthorizationParams, + AuthorizeError, + OAuthAuthorizationServerProvider, + construct_redirect_uri, +) +from mcp.shared.auth import InvalidRedirectUriError, InvalidScopeError + +logger = logging.getLogger(__name__) + + +class AuthorizationRequest(BaseModel): + # See https://datatracker.ietf.org/doc/html/rfc6749#section-4.1.1 + client_id: str = Field(..., description="The client ID") + redirect_uri: AnyUrl | None = Field(None, description="URL to redirect to after authorization") + + # see OAuthClientMetadata; we only support `code` + response_type: Literal["code"] = Field(..., description="Must be 'code' for authorization code flow") + code_challenge: str = Field(..., description="PKCE code challenge") + code_challenge_method: Literal["S256"] = Field("S256", description="PKCE code challenge method, must be S256") + state: str | None = Field(None, description="Optional state parameter") + scope: str | None = Field( + None, + description="Optional scope; if specified, should be " "a space-separated list of scope strings", + ) + resource: str | None = Field( + None, + description="RFC 8707 resource indicator - the MCP server this token will be used with", + ) + + +class AuthorizationErrorResponse(BaseModel): + error: AuthorizationErrorCode + error_description: str | None + error_uri: AnyUrl | None = None + # must be set if provided in the request + state: str | None = None + + +def best_effort_extract_string(key: str, params: None | FormData | QueryParams) -> str | None: + if params is None: + return None + value = params.get(key) + if isinstance(value, str): + return value + return None + + +class AnyUrlModel(RootModel[AnyUrl]): + root: AnyUrl + + +@dataclass +class AuthorizationHandler: + provider: OAuthAuthorizationServerProvider[Any, Any, Any] + + async def handle(self, request: Request) -> Response: + # implements authorization requests for grant_type=code; + # see https://datatracker.ietf.org/doc/html/rfc6749#section-4.1.1 + + state = None + redirect_uri = None + client = None + params = None + + async def error_response( + error: AuthorizationErrorCode, + error_description: str | None, + attempt_load_client: bool = True, + ): + # Error responses take two different formats: + # 1. The request has a valid client ID & redirect_uri: we issue a redirect + # back to the redirect_uri with the error response fields as query + # parameters. This allows the client to be notified of the error. + # 2. Otherwise, we return an error response directly to the end user; + # we choose to do so in JSON, but this is left undefined in the + # specification. 
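# Illustrative example of the two shapes (all values hypothetical): a scope error for a
# known client with a valid redirect_uri is reported via redirect, e.g.
#   302 Location: https://client.example.com/cb?error=invalid_scope&error_description=...&state=xyz
# whereas an unknown client_id cannot be redirected safely and gets a direct JSON error, e.g.
#   400 {"error": "invalid_request", "error_description": "Client ID 'abc' not found"}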
+ # See https://datatracker.ietf.org/doc/html/rfc6749#section-4.1.2.1 + # + # This logic is a bit awkward to handle, because the error might be thrown + # very early in request validation, before we've done the usual Pydantic + # validation, loaded the client, etc. To handle this, error_response() + # contains fallback logic which attempts to load the parameters directly + # from the request. + + nonlocal client, redirect_uri, state + if client is None and attempt_load_client: + # make last-ditch attempt to load the client + client_id = best_effort_extract_string("client_id", params) + client = client_id and await self.provider.get_client(client_id) + if redirect_uri is None and client: + # make last-ditch effort to load the redirect uri + try: + if params is not None and "redirect_uri" not in params: + raw_redirect_uri = None + else: + raw_redirect_uri = AnyUrlModel.model_validate( + best_effort_extract_string("redirect_uri", params) + ).root + redirect_uri = client.validate_redirect_uri(raw_redirect_uri) + except (ValidationError, InvalidRedirectUriError): + # if the redirect URI is invalid, ignore it & just return the + # initial error + pass + + # the error response MUST contain the state specified by the client, if any + if state is None: + # make last-ditch effort to load state + state = best_effort_extract_string("state", params) + + error_resp = AuthorizationErrorResponse( + error=error, + error_description=error_description, + state=state, + ) + + if redirect_uri and client: + return RedirectResponse( + url=construct_redirect_uri(str(redirect_uri), **error_resp.model_dump(exclude_none=True)), + status_code=302, + headers={"Cache-Control": "no-store"}, + ) + else: + return PydanticJSONResponse( + status_code=400, + content=error_resp, + headers={"Cache-Control": "no-store"}, + ) + + try: + # Parse request parameters + if request.method == "GET": + # Convert query_params to dict for pydantic validation + params = request.query_params + else: + # Parse form data for POST requests + params = await request.form() + + # Save state if it exists, even before validation + state = best_effort_extract_string("state", params) + + try: + auth_request = AuthorizationRequest.model_validate(params) + state = auth_request.state # Update with validated state + except ValidationError as validation_error: + error: AuthorizationErrorCode = "invalid_request" + for e in validation_error.errors(): + if e["loc"] == ("response_type",) and e["type"] == "literal_error": + error = "unsupported_response_type" + break + return await error_response(error, stringify_pydantic_error(validation_error)) + + # Get client information + client = await self.provider.get_client( + auth_request.client_id, + ) + if not client: + # For client_id validation errors, return direct error (no redirect) + return await error_response( + error="invalid_request", + error_description=f"Client ID '{auth_request.client_id}' not found", + attempt_load_client=False, + ) + + # Validate redirect_uri against client's registered URIs + try: + redirect_uri = client.validate_redirect_uri(auth_request.redirect_uri) + except InvalidRedirectUriError as validation_error: + # For redirect_uri validation errors, return direct error (no redirect) + return await error_response( + error="invalid_request", + error_description=validation_error.message, + ) + + # Validate scope - for scope errors, we can redirect + try: + scopes = client.validate_scope(auth_request.scope) + except InvalidScopeError as validation_error: + # For scope errors, redirect with 
error parameters + return await error_response( + error="invalid_scope", + error_description=validation_error.message, + ) + + # Setup authorization parameters + auth_params = AuthorizationParams( + state=state, + scopes=scopes, + code_challenge=auth_request.code_challenge, + redirect_uri=redirect_uri, + redirect_uri_provided_explicitly=auth_request.redirect_uri is not None, + resource=auth_request.resource, # RFC 8707 + ) + + try: + # Let the provider pick the next URI to redirect to + return RedirectResponse( + url=await self.provider.authorize( + client, + auth_params, + ), + status_code=302, + headers={"Cache-Control": "no-store"}, + ) + except AuthorizeError as e: + # Handle authorization errors as defined in RFC 6749 Section 4.1.2.1 + return await error_response(error=e.error, error_description=e.error_description) + + except Exception as validation_error: + # Catch-all for unexpected errors + logger.exception("Unexpected error in authorization_handler", exc_info=validation_error) + return await error_response(error="server_error", error_description="An unexpected error occurred") diff --git a/src/mcp/server/auth/handlers/metadata.py b/src/mcp/server/auth/handlers/metadata.py new file mode 100644 index 000000000..f12644215 --- /dev/null +++ b/src/mcp/server/auth/handlers/metadata.py @@ -0,0 +1,29 @@ +from dataclasses import dataclass + +from starlette.requests import Request +from starlette.responses import Response + +from mcp.server.auth.json_response import PydanticJSONResponse +from mcp.shared.auth import OAuthMetadata, ProtectedResourceMetadata + + +@dataclass +class MetadataHandler: + metadata: OAuthMetadata + + async def handle(self, request: Request) -> Response: + return PydanticJSONResponse( + content=self.metadata, + headers={"Cache-Control": "public, max-age=3600"}, # Cache for 1 hour + ) + + +@dataclass +class ProtectedResourceMetadataHandler: + metadata: ProtectedResourceMetadata + + async def handle(self, request: Request) -> Response: + return PydanticJSONResponse( + content=self.metadata, + headers={"Cache-Control": "public, max-age=3600"}, # Cache for 1 hour + ) diff --git a/src/mcp/server/auth/handlers/register.py b/src/mcp/server/auth/handlers/register.py new file mode 100644 index 000000000..61e403aca --- /dev/null +++ b/src/mcp/server/auth/handlers/register.py @@ -0,0 +1,120 @@ +import secrets +import time +from dataclasses import dataclass +from typing import Any +from uuid import uuid4 + +from pydantic import BaseModel, RootModel, ValidationError +from starlette.requests import Request +from starlette.responses import Response + +from mcp.server.auth.errors import stringify_pydantic_error +from mcp.server.auth.json_response import PydanticJSONResponse +from mcp.server.auth.provider import OAuthAuthorizationServerProvider, RegistrationError, RegistrationErrorCode +from mcp.server.auth.settings import ClientRegistrationOptions +from mcp.shared.auth import OAuthClientInformationFull, OAuthClientMetadata + + +class RegistrationRequest(RootModel[OAuthClientMetadata]): + # this wrapper is a no-op; it's just to separate out the types exposed to the + # provider from what we use in the HTTP handler + root: OAuthClientMetadata + + +class RegistrationErrorResponse(BaseModel): + error: RegistrationErrorCode + error_description: str | None + + +@dataclass +class RegistrationHandler: + provider: OAuthAuthorizationServerProvider[Any, Any, Any] + options: ClientRegistrationOptions + + async def handle(self, request: Request) -> Response: + # Implements dynamic client 
registration as defined in https://datatracker.ietf.org/doc/html/rfc7591#section-3.1 + try: + # Parse request body as JSON + body = await request.json() + client_metadata = OAuthClientMetadata.model_validate(body) + + # Scope validation is handled below + except ValidationError as validation_error: + return PydanticJSONResponse( + content=RegistrationErrorResponse( + error="invalid_client_metadata", + error_description=stringify_pydantic_error(validation_error), + ), + status_code=400, + ) + + client_id = str(uuid4()) + client_secret = None + if client_metadata.token_endpoint_auth_method != "none": + # cryptographically secure random 32-byte hex string + client_secret = secrets.token_hex(32) + + if client_metadata.scope is None and self.options.default_scopes is not None: + client_metadata.scope = " ".join(self.options.default_scopes) + elif client_metadata.scope is not None and self.options.valid_scopes is not None: + requested_scopes = set(client_metadata.scope.split()) + valid_scopes = set(self.options.valid_scopes) + if not requested_scopes.issubset(valid_scopes): + return PydanticJSONResponse( + content=RegistrationErrorResponse( + error="invalid_client_metadata", + error_description="Requested scopes are not valid: " + f"{', '.join(requested_scopes - valid_scopes)}", + ), + status_code=400, + ) + if set(client_metadata.grant_types) != {"authorization_code", "refresh_token"}: + return PydanticJSONResponse( + content=RegistrationErrorResponse( + error="invalid_client_metadata", + error_description="grant_types must be authorization_code " "and refresh_token", + ), + status_code=400, + ) + + client_id_issued_at = int(time.time()) + client_secret_expires_at = ( + client_id_issued_at + self.options.client_secret_expiry_seconds + if self.options.client_secret_expiry_seconds is not None + else None + ) + + client_info = OAuthClientInformationFull( + client_id=client_id, + client_id_issued_at=client_id_issued_at, + client_secret=client_secret, + client_secret_expires_at=client_secret_expires_at, + # passthrough information from the client request + redirect_uris=client_metadata.redirect_uris, + token_endpoint_auth_method=client_metadata.token_endpoint_auth_method, + grant_types=client_metadata.grant_types, + response_types=client_metadata.response_types, + client_name=client_metadata.client_name, + client_uri=client_metadata.client_uri, + logo_uri=client_metadata.logo_uri, + scope=client_metadata.scope, + contacts=client_metadata.contacts, + tos_uri=client_metadata.tos_uri, + policy_uri=client_metadata.policy_uri, + jwks_uri=client_metadata.jwks_uri, + jwks=client_metadata.jwks, + software_id=client_metadata.software_id, + software_version=client_metadata.software_version, + ) + try: + # Register client + await self.provider.register_client(client_info) + + # Return client information + return PydanticJSONResponse(content=client_info, status_code=201) + except RegistrationError as e: + # Handle registration errors as defined in RFC 7591 Section 3.2.2 + return PydanticJSONResponse( + content=RegistrationErrorResponse(error=e.error, error_description=e.error_description), + status_code=400, + ) diff --git a/src/mcp/server/auth/handlers/revoke.py b/src/mcp/server/auth/handlers/revoke.py new file mode 100644 index 000000000..478ad7a01 --- /dev/null +++ b/src/mcp/server/auth/handlers/revoke.py @@ -0,0 +1,94 @@ +from dataclasses import dataclass +from functools import partial +from typing import Any, Literal + +from pydantic import BaseModel, ValidationError +from starlette.requests import Request 
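As a concrete illustration of the registration handler above, a client might POST metadata like the following to the /register endpoint; with these (hypothetical) values the handler mints a fresh client_id, a 64-character hex client_secret (because the auth method is not "none"), and returns the stored client information with status 201:

    registration_request = {
        "client_name": "Example MCP Client",
        "redirect_uris": ["https://client.example.com/callback"],
        "grant_types": ["authorization_code", "refresh_token"],  # must be exactly this pair
        "response_types": ["code"],
        "token_endpoint_auth_method": "client_secret_post",
        "scope": "profile",  # validated against valid_scopes, when configured
    }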
+from starlette.responses import Response + +from mcp.server.auth.errors import ( + stringify_pydantic_error, +) +from mcp.server.auth.json_response import PydanticJSONResponse +from mcp.server.auth.middleware.client_auth import AuthenticationError, ClientAuthenticator +from mcp.server.auth.provider import AccessToken, OAuthAuthorizationServerProvider, RefreshToken + + +class RevocationRequest(BaseModel): + """ + # See https://datatracker.ietf.org/doc/html/rfc7009#section-2.1 + """ + + token: str + token_type_hint: Literal["access_token", "refresh_token"] | None = None + client_id: str + client_secret: str | None + + +class RevocationErrorResponse(BaseModel): + error: Literal["invalid_request", "unauthorized_client"] + error_description: str | None = None + + +@dataclass +class RevocationHandler: + provider: OAuthAuthorizationServerProvider[Any, Any, Any] + client_authenticator: ClientAuthenticator + + async def handle(self, request: Request) -> Response: + """ + Handler for the OAuth 2.0 Token Revocation endpoint. + """ + try: + form_data = await request.form() + revocation_request = RevocationRequest.model_validate(dict(form_data)) + except ValidationError as e: + return PydanticJSONResponse( + status_code=400, + content=RevocationErrorResponse( + error="invalid_request", + error_description=stringify_pydantic_error(e), + ), + ) + + # Authenticate client + try: + client = await self.client_authenticator.authenticate( + revocation_request.client_id, revocation_request.client_secret + ) + except AuthenticationError as e: + return PydanticJSONResponse( + status_code=401, + content=RevocationErrorResponse( + error="unauthorized_client", + error_description=e.message, + ), + ) + + loaders = [ + self.provider.load_access_token, + partial(self.provider.load_refresh_token, client), + ] + if revocation_request.token_type_hint == "refresh_token": + loaders = reversed(loaders) + + token: None | AccessToken | RefreshToken = None + for loader in loaders: + token = await loader(revocation_request.token) + if token is not None: + break + + # if token is not found, just return HTTP 200 per the RFC + if token and token.client_id == client.client_id: + # Revoke token; provider is not meant to be able to do validation + # at this point that would result in an error + await self.provider.revoke_token(token) + + # Return successful empty response + return Response( + status_code=200, + headers={ + "Cache-Control": "no-store", + "Pragma": "no-cache", + }, + ) diff --git a/src/mcp/server/auth/handlers/token.py b/src/mcp/server/auth/handlers/token.py new file mode 100644 index 000000000..552417169 --- /dev/null +++ b/src/mcp/server/auth/handlers/token.py @@ -0,0 +1,240 @@ +import base64 +import hashlib +import time +from dataclasses import dataclass +from typing import Annotated, Any, Literal + +from pydantic import AnyHttpUrl, AnyUrl, BaseModel, Field, RootModel, ValidationError +from starlette.requests import Request + +from mcp.server.auth.errors import stringify_pydantic_error +from mcp.server.auth.json_response import PydanticJSONResponse +from mcp.server.auth.middleware.client_auth import AuthenticationError, ClientAuthenticator +from mcp.server.auth.provider import OAuthAuthorizationServerProvider, TokenError, TokenErrorCode +from mcp.shared.auth import OAuthToken + + +class AuthorizationCodeRequest(BaseModel): + # See https://datatracker.ietf.org/doc/html/rfc6749#section-4.1.3 + grant_type: Literal["authorization_code"] + code: str = Field(..., description="The authorization code") + redirect_uri: 
AnyUrl | None = Field(None, description="Must be the same as redirect URI provided in /authorize") + client_id: str + # we use the client_secret param, per https://datatracker.ietf.org/doc/html/rfc6749#section-2.3.1 + client_secret: str | None = None + # See https://datatracker.ietf.org/doc/html/rfc7636#section-4.5 + code_verifier: str = Field(..., description="PKCE code verifier") + # RFC 8707 resource indicator + resource: str | None = Field(None, description="Resource indicator for the token") + + +class RefreshTokenRequest(BaseModel): + # See https://datatracker.ietf.org/doc/html/rfc6749#section-6 + grant_type: Literal["refresh_token"] + refresh_token: str = Field(..., description="The refresh token") + scope: str | None = Field(None, description="Optional scope parameter") + client_id: str + # we use the client_secret param, per https://datatracker.ietf.org/doc/html/rfc6749#section-2.3.1 + client_secret: str | None = None + # RFC 8707 resource indicator + resource: str | None = Field(None, description="Resource indicator for the token") + + +class TokenRequest( + RootModel[ + Annotated[ + AuthorizationCodeRequest | RefreshTokenRequest, + Field(discriminator="grant_type"), + ] + ] +): + root: Annotated[ + AuthorizationCodeRequest | RefreshTokenRequest, + Field(discriminator="grant_type"), + ] + + +class TokenErrorResponse(BaseModel): + """ + See https://datatracker.ietf.org/doc/html/rfc6749#section-5.2 + """ + + error: TokenErrorCode + error_description: str | None = None + error_uri: AnyHttpUrl | None = None + + +class TokenSuccessResponse(RootModel[OAuthToken]): + # this is just a wrapper over OAuthToken; the only reason we do this + # is to have some separation between the HTTP response type, and the + # type returned by the provider + root: OAuthToken + + +@dataclass +class TokenHandler: + provider: OAuthAuthorizationServerProvider[Any, Any, Any] + client_authenticator: ClientAuthenticator + + def response(self, obj: TokenSuccessResponse | TokenErrorResponse): + status_code = 200 + if isinstance(obj, TokenErrorResponse): + status_code = 400 + + return PydanticJSONResponse( + content=obj, + status_code=status_code, + headers={ + "Cache-Control": "no-store", + "Pragma": "no-cache", + }, + ) + + async def handle(self, request: Request): + try: + form_data = await request.form() + token_request = TokenRequest.model_validate(dict(form_data)).root + except ValidationError as validation_error: + return self.response( + TokenErrorResponse( + error="invalid_request", + error_description=stringify_pydantic_error(validation_error), + ) + ) + + try: + client_info = await self.client_authenticator.authenticate( + client_id=token_request.client_id, + client_secret=token_request.client_secret, + ) + except AuthenticationError as e: + return self.response( + TokenErrorResponse( + error="unauthorized_client", + error_description=e.message, + ) + ) + + if token_request.grant_type not in client_info.grant_types: + return self.response( + TokenErrorResponse( + error="unsupported_grant_type", + error_description=( + f"Unsupported grant type (supported grant types are " f"{client_info.grant_types})" + ), + ) + ) + + tokens: OAuthToken + + match token_request: + case AuthorizationCodeRequest(): + auth_code = await self.provider.load_authorization_code(client_info, token_request.code) + if auth_code is None or auth_code.client_id != token_request.client_id: + # if code belongs to different client, pretend it doesn't exist + return self.response( + TokenErrorResponse( + error="invalid_grant", + 
error_description="authorization code does not exist", + ) + ) + + # make auth codes expire after a deadline + # see https://datatracker.ietf.org/doc/html/rfc6749#section-10.5 + if auth_code.expires_at < time.time(): + return self.response( + TokenErrorResponse( + error="invalid_grant", + error_description="authorization code has expired", + ) + ) + + # verify redirect_uri doesn't change between /authorize and /tokens + # see https://datatracker.ietf.org/doc/html/rfc6749#section-10.6 + if auth_code.redirect_uri_provided_explicitly: + authorize_request_redirect_uri = auth_code.redirect_uri + else: + authorize_request_redirect_uri = None + + # Convert both sides to strings for comparison to handle AnyUrl vs string issues + token_redirect_str = str(token_request.redirect_uri) if token_request.redirect_uri is not None else None + auth_redirect_str = ( + str(authorize_request_redirect_uri) if authorize_request_redirect_uri is not None else None + ) + + if token_redirect_str != auth_redirect_str: + return self.response( + TokenErrorResponse( + error="invalid_request", + error_description=("redirect_uri did not match the one " "used when creating auth code"), + ) + ) + + # Verify PKCE code verifier + sha256 = hashlib.sha256(token_request.code_verifier.encode()).digest() + hashed_code_verifier = base64.urlsafe_b64encode(sha256).decode().rstrip("=") + + if hashed_code_verifier != auth_code.code_challenge: + # see https://datatracker.ietf.org/doc/html/rfc7636#section-4.6 + return self.response( + TokenErrorResponse( + error="invalid_grant", + error_description="incorrect code_verifier", + ) + ) + + try: + # Exchange authorization code for tokens + tokens = await self.provider.exchange_authorization_code(client_info, auth_code) + except TokenError as e: + return self.response( + TokenErrorResponse( + error=e.error, + error_description=e.error_description, + ) + ) + + case RefreshTokenRequest(): + refresh_token = await self.provider.load_refresh_token(client_info, token_request.refresh_token) + if refresh_token is None or refresh_token.client_id != token_request.client_id: + # if token belongs to different client, pretend it doesn't exist + return self.response( + TokenErrorResponse( + error="invalid_grant", + error_description="refresh token does not exist", + ) + ) + + if refresh_token.expires_at and refresh_token.expires_at < time.time(): + # if the refresh token has expired, pretend it doesn't exist + return self.response( + TokenErrorResponse( + error="invalid_grant", + error_description="refresh token has expired", + ) + ) + + # Parse scopes if provided + scopes = token_request.scope.split(" ") if token_request.scope else refresh_token.scopes + + for scope in scopes: + if scope not in refresh_token.scopes: + return self.response( + TokenErrorResponse( + error="invalid_scope", + error_description=(f"cannot request scope `{scope}` " "not provided by refresh token"), + ) + ) + + try: + # Exchange refresh token for new tokens + tokens = await self.provider.exchange_refresh_token(client_info, refresh_token, scopes) + except TokenError as e: + return self.response( + TokenErrorResponse( + error=e.error, + error_description=e.error_description, + ) + ) + + return self.response(TokenSuccessResponse(root=tokens)) diff --git a/src/mcp/server/auth/json_response.py b/src/mcp/server/auth/json_response.py new file mode 100644 index 000000000..bd95bd693 --- /dev/null +++ b/src/mcp/server/auth/json_response.py @@ -0,0 +1,10 @@ +from typing import Any + +from starlette.responses import JSONResponse + + +class 
PydanticJSONResponse(JSONResponse): + # use pydantic json serialization instead of the stock `json.dumps`, + # so that we can handle serializing pydantic models like AnyHttpUrl + def render(self, content: Any) -> bytes: + return content.model_dump_json(exclude_none=True).encode("utf-8") diff --git a/src/mcp/server/auth/middleware/__init__.py b/src/mcp/server/auth/middleware/__init__.py new file mode 100644 index 000000000..ba3ff63c3 --- /dev/null +++ b/src/mcp/server/auth/middleware/__init__.py @@ -0,0 +1,3 @@ +""" +Middleware for MCP authorization. +""" diff --git a/src/mcp/server/auth/middleware/auth_context.py b/src/mcp/server/auth/middleware/auth_context.py new file mode 100644 index 000000000..e2116c3bf --- /dev/null +++ b/src/mcp/server/auth/middleware/auth_context.py @@ -0,0 +1,48 @@ +import contextvars + +from starlette.types import ASGIApp, Receive, Scope, Send + +from mcp.server.auth.middleware.bearer_auth import AuthenticatedUser +from mcp.server.auth.provider import AccessToken + +# Create a contextvar to store the authenticated user +# The default is None, indicating no authenticated user is present +auth_context_var = contextvars.ContextVar[AuthenticatedUser | None]("auth_context", default=None) + + +def get_access_token() -> AccessToken | None: + """ + Get the access token from the current context. + + Returns: + The access token if an authenticated user is available, None otherwise. + """ + auth_user = auth_context_var.get() + return auth_user.access_token if auth_user else None + + +class AuthContextMiddleware: + """ + Middleware that extracts the authenticated user from the request + and sets it in a contextvar for easy access throughout the request lifecycle. + + This middleware should be added after the AuthenticationMiddleware in the + middleware stack to ensure that the user is properly authenticated before + being stored in the context. + """ + + def __init__(self, app: ASGIApp): + self.app = app + + async def __call__(self, scope: Scope, receive: Receive, send: Send): + user = scope.get("user") + if isinstance(user, AuthenticatedUser): + # Set the authenticated user in the contextvar + token = auth_context_var.set(user) + try: + await self.app(scope, receive, send) + finally: + auth_context_var.reset(token) + else: + # No authenticated user, just process the request + await self.app(scope, receive, send) diff --git a/src/mcp/server/auth/middleware/bearer_auth.py b/src/mcp/server/auth/middleware/bearer_auth.py new file mode 100644 index 000000000..6251e5ad5 --- /dev/null +++ b/src/mcp/server/auth/middleware/bearer_auth.py @@ -0,0 +1,128 @@ +import json +import time +from typing import Any + +from pydantic import AnyHttpUrl +from starlette.authentication import AuthCredentials, AuthenticationBackend, SimpleUser +from starlette.requests import HTTPConnection +from starlette.types import Receive, Scope, Send + +from mcp.server.auth.provider import AccessToken, TokenVerifier + + +class AuthenticatedUser(SimpleUser): + """User with authentication info.""" + + def __init__(self, auth_info: AccessToken): + super().__init__(auth_info.client_id) + self.access_token = auth_info + self.scopes = auth_info.scopes + + +class BearerAuthBackend(AuthenticationBackend): + """ + Authentication backend that validates Bearer tokens using a TokenVerifier. 
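Once AuthContextMiddleware is installed after Starlette's AuthenticationMiddleware, handlers can read the verified token from the contextvar instead of threading it through call signatures; a minimal sketch, with a hypothetical whoami endpoint:

    from starlette.requests import Request
    from starlette.responses import JSONResponse

    from mcp.server.auth.middleware.auth_context import get_access_token


    async def whoami(request: Request) -> JSONResponse:
        token = get_access_token()  # AccessToken set by AuthContextMiddleware, or None
        if token is None:
            return JSONResponse({"authenticated": False}, status_code=401)
        return JSONResponse({"client_id": token.client_id, "scopes": token.scopes})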
+ """ + + def __init__(self, token_verifier: TokenVerifier): + self.token_verifier = token_verifier + + async def authenticate(self, conn: HTTPConnection): + auth_header = next( + (conn.headers.get(key) for key in conn.headers if key.lower() == "authorization"), + None, + ) + if not auth_header or not auth_header.lower().startswith("bearer "): + return None + + token = auth_header[7:] # Remove "Bearer " prefix + + # Validate the token with the verifier + auth_info = await self.token_verifier.verify_token(token) + + if not auth_info: + return None + + if auth_info.expires_at and auth_info.expires_at < int(time.time()): + return None + + return AuthCredentials(auth_info.scopes), AuthenticatedUser(auth_info) + + +class RequireAuthMiddleware: + """ + Middleware that requires a valid Bearer token in the Authorization header. + + This will validate the token with the auth provider and store the resulting + auth info in the request state. + """ + + def __init__( + self, + app: Any, + required_scopes: list[str], + resource_metadata_url: AnyHttpUrl | None = None, + ): + """ + Initialize the middleware. + + Args: + app: ASGI application + required_scopes: List of scopes that the token must have + resource_metadata_url: Optional protected resource metadata URL for WWW-Authenticate header + """ + self.app = app + self.required_scopes = required_scopes + self.resource_metadata_url = resource_metadata_url + + async def __call__(self, scope: Scope, receive: Receive, send: Send) -> None: + auth_user = scope.get("user") + if not isinstance(auth_user, AuthenticatedUser): + await self._send_auth_error( + send, status_code=401, error="invalid_token", description="Authentication required" + ) + return + + auth_credentials = scope.get("auth") + + for required_scope in self.required_scopes: + # auth_credentials should always be provided; this is just paranoia + if auth_credentials is None or required_scope not in auth_credentials.scopes: + await self._send_auth_error( + send, status_code=403, error="insufficient_scope", description=f"Required scope: {required_scope}" + ) + return + + await self.app(scope, receive, send) + + async def _send_auth_error(self, send: Send, status_code: int, error: str, description: str) -> None: + """Send an authentication error response with WWW-Authenticate header.""" + # Build WWW-Authenticate header value + www_auth_parts = [f'error="{error}"', f'error_description="{description}"'] + if self.resource_metadata_url: + www_auth_parts.append(f'resource_metadata="{self.resource_metadata_url}"') + + www_authenticate = f"Bearer {', '.join(www_auth_parts)}" + + # Send response + body = {"error": error, "error_description": description} + body_bytes = json.dumps(body).encode() + + await send( + { + "type": "http.response.start", + "status": status_code, + "headers": [ + (b"content-type", b"application/json"), + (b"content-length", str(len(body_bytes)).encode()), + (b"www-authenticate", www_authenticate.encode()), + ], + } + ) + + await send( + { + "type": "http.response.body", + "body": body_bytes, + } + ) diff --git a/src/mcp/server/auth/middleware/client_auth.py b/src/mcp/server/auth/middleware/client_auth.py new file mode 100644 index 000000000..d5f473b48 --- /dev/null +++ b/src/mcp/server/auth/middleware/client_auth.py @@ -0,0 +1,51 @@ +import time +from typing import Any + +from mcp.server.auth.provider import OAuthAuthorizationServerProvider +from mcp.shared.auth import OAuthClientInformationFull + + +class AuthenticationError(Exception): + def __init__(self, message: str): + 
self.message = message + + +class ClientAuthenticator: + """ + ClientAuthenticator is a callable which validates requests from a client + application, used to verify /token calls. + If, during registration, the client requested to be issued a secret, the + authenticator asserts that /token calls must be authenticated with + that same token. + NOTE: clients can opt for no authentication during registration, in which case this + logic is skipped. + """ + + def __init__(self, provider: OAuthAuthorizationServerProvider[Any, Any, Any]): + """ + Initialize the dependency. + + Args: + provider: Provider to look up client information + """ + self.provider = provider + + async def authenticate(self, client_id: str, client_secret: str | None) -> OAuthClientInformationFull: + # Look up client information + client = await self.provider.get_client(client_id) + if not client: + raise AuthenticationError("Invalid client_id") + + # If client from the store expects a secret, validate that the request provides + # that secret + if client.client_secret: + if not client_secret: + raise AuthenticationError("Client secret is required") + + if client.client_secret != client_secret: + raise AuthenticationError("Invalid client_secret") + + if client.client_secret_expires_at and client.client_secret_expires_at < int(time.time()): + raise AuthenticationError("Client secret has expired") + + return client diff --git a/src/mcp/server/auth/provider.py b/src/mcp/server/auth/provider.py new file mode 100644 index 000000000..b84db89a2 --- /dev/null +++ b/src/mcp/server/auth/provider.py @@ -0,0 +1,306 @@ +from dataclasses import dataclass +from typing import Generic, Literal, Protocol, TypeVar +from urllib.parse import parse_qs, urlencode, urlparse, urlunparse + +from pydantic import AnyUrl, BaseModel + +from mcp.shared.auth import OAuthClientInformationFull, OAuthToken + + +class AuthorizationParams(BaseModel): + state: str | None + scopes: list[str] | None + code_challenge: str + redirect_uri: AnyUrl + redirect_uri_provided_explicitly: bool + resource: str | None = None # RFC 8707 resource indicator + + +class AuthorizationCode(BaseModel): + code: str + scopes: list[str] + expires_at: float + client_id: str + code_challenge: str + redirect_uri: AnyUrl + redirect_uri_provided_explicitly: bool + resource: str | None = None # RFC 8707 resource indicator + + +class RefreshToken(BaseModel): + token: str + client_id: str + scopes: list[str] + expires_at: int | None = None + + +class AccessToken(BaseModel): + token: str + client_id: str + scopes: list[str] + expires_at: int | None = None + resource: str | None = None # RFC 8707 resource indicator + + +RegistrationErrorCode = Literal[ + "invalid_redirect_uri", + "invalid_client_metadata", + "invalid_software_statement", + "unapproved_software_statement", +] + + +@dataclass(frozen=True) +class RegistrationError(Exception): + error: RegistrationErrorCode + error_description: str | None = None + + +AuthorizationErrorCode = Literal[ + "invalid_request", + "unauthorized_client", + "access_denied", + "unsupported_response_type", + "invalid_scope", + "server_error", + "temporarily_unavailable", +] + + +@dataclass(frozen=True) +class AuthorizeError(Exception): + error: AuthorizationErrorCode + error_description: str | None = None + + +TokenErrorCode = Literal[ + "invalid_request", + "invalid_client", + "invalid_grant", + "unauthorized_client", + "unsupported_grant_type", + "invalid_scope", +] + + +@dataclass(frozen=True) +class TokenError(Exception): + error: TokenErrorCode + 
error_description: str | None = None + + +class TokenVerifier(Protocol): + """Protocol for verifying bearer tokens.""" + + async def verify_token(self, token: str) -> AccessToken | None: + """Verify a bearer token and return access info if valid.""" + + +# NOTE: FastMCP doesn't render any of these types in the user response, so it's +# OK to add fields to subclasses which should not be exposed externally. +AuthorizationCodeT = TypeVar("AuthorizationCodeT", bound=AuthorizationCode) +RefreshTokenT = TypeVar("RefreshTokenT", bound=RefreshToken) +AccessTokenT = TypeVar("AccessTokenT", bound=AccessToken) + + +class OAuthAuthorizationServerProvider(Protocol, Generic[AuthorizationCodeT, RefreshTokenT, AccessTokenT]): + async def get_client(self, client_id: str) -> OAuthClientInformationFull | None: + """ + Retrieves client information by client ID. + + Implementors MAY raise NotImplementedError if dynamic client registration is + disabled in ClientRegistrationOptions. + + Args: + client_id: The ID of the client to retrieve. + + Returns: + The client information, or None if the client does not exist. + """ + ... + + async def register_client(self, client_info: OAuthClientInformationFull) -> None: + """ + Saves client information as part of registering it. + + Implementors MAY raise NotImplementedError if dynamic client registration is + disabled in ClientRegistrationOptions. + + Args: + client_info: The client metadata to register. + + Raises: + RegistrationError: If the client metadata is invalid. + """ + ... + + async def authorize(self, client: OAuthClientInformationFull, params: AuthorizationParams) -> str: + """ + Called as part of the /authorize endpoint, and returns a URL that the client + will be redirected to. + Many MCP implementations will redirect to a third-party provider to perform + a second OAuth exchange with that provider. In this sort of setup, the client + has an OAuth connection with the MCP server, and the MCP server has an OAuth + connection with the 3rd-party provider. At the end of this flow, the client + should be redirected to the redirect_uri from params.redirect_uri. + + +--------+ +------------+ +-------------------+ + | | | | | | + | Client | --> | MCP Server | --> | 3rd Party OAuth | + | | | | | Server | + +--------+ +------------+ +-------------------+ + | ^ | + +------------+ | | | + | | | | Redirect | + |redirect_uri|<-----+ +------------------+ + | | + +------------+ + + Implementations will need to define another handler on the MCP server return + flow to perform the second redirect, and generate and store an authorization + code as part of completing the OAuth authorization step. + + Implementations SHOULD generate an authorization code with at least 160 bits of + entropy, + and MUST generate an authorization code with at least 128 bits of entropy. + See https://datatracker.ietf.org/doc/html/rfc6749#section-10.10. + + Args: + client: The client requesting authorization. + params: The parameters of the authorization request. + + Returns: + A URL to redirect the client to for authorization. + + Raises: + AuthorizeError: If the authorization request is invalid. + """ + ... + + async def load_authorization_code( + self, client: OAuthClientInformationFull, authorization_code: str + ) -> AuthorizationCodeT | None: + """ + Loads an AuthorizationCode by its code. + + Args: + client: The client that requested the authorization code. + authorization_code: The authorization code to get the challenge for. 
+ + Returns: + The AuthorizationCode, or None if not found + """ + ... + + async def exchange_authorization_code( + self, client: OAuthClientInformationFull, authorization_code: AuthorizationCodeT + ) -> OAuthToken: + """ + Exchanges an authorization code for an access token and refresh token. + + Args: + client: The client exchanging the authorization code. + authorization_code: The authorization code to exchange. + + Returns: + The OAuth token, containing access and refresh tokens. + + Raises: + TokenError: If the request is invalid + """ + ... + + async def load_refresh_token(self, client: OAuthClientInformationFull, refresh_token: str) -> RefreshTokenT | None: + """ + Loads a RefreshToken by its token string. + + Args: + client: The client that is requesting to load the refresh token. + refresh_token: The refresh token string to load. + + Returns: + The RefreshToken object if found, or None if not found. + """ + + ... + + async def exchange_refresh_token( + self, + client: OAuthClientInformationFull, + refresh_token: RefreshTokenT, + scopes: list[str], + ) -> OAuthToken: + """ + Exchanges a refresh token for an access token and refresh token. + + Implementations SHOULD rotate both the access token and refresh token. + + Args: + client: The client exchanging the refresh token. + refresh_token: The refresh token to exchange. + scopes: Optional scopes to request with the new access token. + + Returns: + The OAuth token, containing access and refresh tokens. + + Raises: + TokenError: If the request is invalid + """ + ... + + async def load_access_token(self, token: str) -> AccessTokenT | None: + """ + Loads an access token by its token. + + Args: + token: The access token to verify. + + Returns: + The AuthInfo, or None if the token is invalid. + """ + ... + + async def revoke_token( + self, + token: AccessTokenT | RefreshTokenT, + ) -> None: + """ + Revokes an access or refresh token. + + If the given token is invalid or already revoked, this method should do nothing. + + Implementations SHOULD revoke both the access token and its corresponding + refresh token, regardless of which of the access token or refresh token is + provided. + + Args: + token: the token to revoke + """ + ... + + +def construct_redirect_uri(redirect_uri_base: str, **params: str | None) -> str: + parsed_uri = urlparse(redirect_uri_base) + query_params = [(k, v) for k, vs in parse_qs(parsed_uri.query) for v in vs] + for k, v in params.items(): + if v is not None: + query_params.append((k, v)) + + redirect_uri = urlunparse(parsed_uri._replace(query=urlencode(query_params))) + return redirect_uri + + +class ProviderTokenVerifier(TokenVerifier): + """Token verifier that uses an OAuthAuthorizationServerProvider. + + This is provided for backwards compatibility with existing auth_server_provider + configurations. For new implementations using AS/RS separation, consider using + the TokenVerifier protocol with a dedicated implementation like IntrospectionTokenVerifier. 
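+
+    Example (an illustrative sketch; ``MyProvider`` stands in for any concrete
+    OAuthAuthorizationServerProvider implementation)::
+
+        verifier = ProviderTokenVerifier(MyProvider())
+        access_token = await verifier.verify_token(token)
+        if access_token is None:
+            ...  # token was not recognized by the provider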
+ """ + + def __init__(self, provider: "OAuthAuthorizationServerProvider[AuthorizationCode, RefreshToken, AccessToken]"): + self.provider = provider + + async def verify_token(self, token: str) -> AccessToken | None: + """Verify token using the provider's load_access_token method.""" + return await self.provider.load_access_token(token) diff --git a/src/mcp/server/auth/routes.py b/src/mcp/server/auth/routes.py new file mode 100644 index 000000000..305440242 --- /dev/null +++ b/src/mcp/server/auth/routes.py @@ -0,0 +1,219 @@ +from collections.abc import Awaitable, Callable +from typing import Any + +from pydantic import AnyHttpUrl +from starlette.middleware.cors import CORSMiddleware +from starlette.requests import Request +from starlette.responses import Response +from starlette.routing import Route, request_response # type: ignore +from starlette.types import ASGIApp + +from mcp.server.auth.handlers.authorize import AuthorizationHandler +from mcp.server.auth.handlers.metadata import MetadataHandler +from mcp.server.auth.handlers.register import RegistrationHandler +from mcp.server.auth.handlers.revoke import RevocationHandler +from mcp.server.auth.handlers.token import TokenHandler +from mcp.server.auth.middleware.client_auth import ClientAuthenticator +from mcp.server.auth.provider import OAuthAuthorizationServerProvider +from mcp.server.auth.settings import ClientRegistrationOptions, RevocationOptions +from mcp.server.streamable_http import MCP_PROTOCOL_VERSION_HEADER +from mcp.shared.auth import OAuthMetadata + + +def validate_issuer_url(url: AnyHttpUrl): + """ + Validate that the issuer URL meets OAuth 2.0 requirements. + + Args: + url: The issuer URL to validate + + Raises: + ValueError: If the issuer URL is invalid + """ + + # RFC 8414 requires HTTPS, but we allow localhost HTTP for testing + if url.scheme != "https" and url.host != "localhost" and not url.host.startswith("127.0.0.1"): + raise ValueError("Issuer URL must be HTTPS") + + # No fragments or query parameters allowed + if url.fragment: + raise ValueError("Issuer URL must not have a fragment") + if url.query: + raise ValueError("Issuer URL must not have a query string") + + +AUTHORIZATION_PATH = "/authorize" +TOKEN_PATH = "/token" +REGISTRATION_PATH = "/register" +REVOCATION_PATH = "/revoke" + + +def cors_middleware( + handler: Callable[[Request], Response | Awaitable[Response]], + allow_methods: list[str], +) -> ASGIApp: + cors_app = CORSMiddleware( + app=request_response(handler), + allow_origins="*", + allow_methods=allow_methods, + allow_headers=[MCP_PROTOCOL_VERSION_HEADER], + ) + return cors_app + + +def create_auth_routes( + provider: OAuthAuthorizationServerProvider[Any, Any, Any], + issuer_url: AnyHttpUrl, + service_documentation_url: AnyHttpUrl | None = None, + client_registration_options: ClientRegistrationOptions | None = None, + revocation_options: RevocationOptions | None = None, +) -> list[Route]: + validate_issuer_url(issuer_url) + + client_registration_options = client_registration_options or ClientRegistrationOptions() + revocation_options = revocation_options or RevocationOptions() + metadata = build_metadata( + issuer_url, + service_documentation_url, + client_registration_options, + revocation_options, + ) + client_authenticator = ClientAuthenticator(provider) + + # Create routes + # Allow CORS requests for endpoints meant to be hit by the OAuth client + # (with the client secret). This is intended to support things like MCP Inspector, + # where the client runs in a web browser. 
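+    #
+    # Note: the routes built below are plain Starlette Route objects. Callers
+    # (for example FastMCP's sse_app / streamable_http_app) simply extend their
+    # own route table with the returned list rather than mounting a sub-app.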
+ routes = [ + Route( + "/.well-known/oauth-authorization-server", + endpoint=cors_middleware( + MetadataHandler(metadata).handle, + ["GET", "OPTIONS"], + ), + methods=["GET", "OPTIONS"], + ), + Route( + AUTHORIZATION_PATH, + # do not allow CORS for authorization endpoint; + # clients should just redirect to this + endpoint=AuthorizationHandler(provider).handle, + methods=["GET", "POST"], + ), + Route( + TOKEN_PATH, + endpoint=cors_middleware( + TokenHandler(provider, client_authenticator).handle, + ["POST", "OPTIONS"], + ), + methods=["POST", "OPTIONS"], + ), + ] + + if client_registration_options.enabled: + registration_handler = RegistrationHandler( + provider, + options=client_registration_options, + ) + routes.append( + Route( + REGISTRATION_PATH, + endpoint=cors_middleware( + registration_handler.handle, + ["POST", "OPTIONS"], + ), + methods=["POST", "OPTIONS"], + ) + ) + + if revocation_options.enabled: + revocation_handler = RevocationHandler(provider, client_authenticator) + routes.append( + Route( + REVOCATION_PATH, + endpoint=cors_middleware( + revocation_handler.handle, + ["POST", "OPTIONS"], + ), + methods=["POST", "OPTIONS"], + ) + ) + + return routes + + +def build_metadata( + issuer_url: AnyHttpUrl, + service_documentation_url: AnyHttpUrl | None, + client_registration_options: ClientRegistrationOptions, + revocation_options: RevocationOptions, +) -> OAuthMetadata: + authorization_url = AnyHttpUrl(str(issuer_url).rstrip("/") + AUTHORIZATION_PATH) + token_url = AnyHttpUrl(str(issuer_url).rstrip("/") + TOKEN_PATH) + + # Create metadata + metadata = OAuthMetadata( + issuer=issuer_url, + authorization_endpoint=authorization_url, + token_endpoint=token_url, + scopes_supported=client_registration_options.valid_scopes, + response_types_supported=["code"], + response_modes_supported=None, + grant_types_supported=["authorization_code", "refresh_token"], + token_endpoint_auth_methods_supported=["client_secret_post"], + token_endpoint_auth_signing_alg_values_supported=None, + service_documentation=service_documentation_url, + ui_locales_supported=None, + op_policy_uri=None, + op_tos_uri=None, + introspection_endpoint=None, + code_challenge_methods_supported=["S256"], + ) + + # Add registration endpoint if supported + if client_registration_options.enabled: + metadata.registration_endpoint = AnyHttpUrl(str(issuer_url).rstrip("/") + REGISTRATION_PATH) + + # Add revocation endpoint if supported + if revocation_options.enabled: + metadata.revocation_endpoint = AnyHttpUrl(str(issuer_url).rstrip("/") + REVOCATION_PATH) + metadata.revocation_endpoint_auth_methods_supported = ["client_secret_post"] + + return metadata + + +def create_protected_resource_routes( + resource_url: AnyHttpUrl, + authorization_servers: list[AnyHttpUrl], + scopes_supported: list[str] | None = None, +) -> list[Route]: + """ + Create routes for OAuth 2.0 Protected Resource Metadata (RFC 9728). 
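+
+    The returned list contains a single route serving
+    ``/.well-known/oauth-protected-resource``, which lets clients discover the
+    authorization servers protecting this resource.
+
+    Example (an illustrative sketch; the URLs are placeholders)::
+
+        routes = create_protected_resource_routes(
+            resource_url=AnyHttpUrl("https://mcp.example.com"),
+            authorization_servers=[AnyHttpUrl("https://auth.example.com")],
+            scopes_supported=["user"],
+        )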
+ + Args: + resource_url: The URL of this resource server + authorization_servers: List of authorization servers that can issue tokens + scopes_supported: Optional list of scopes supported by this resource + + Returns: + List of Starlette routes for protected resource metadata + """ + from mcp.server.auth.handlers.metadata import ProtectedResourceMetadataHandler + from mcp.shared.auth import ProtectedResourceMetadata + + metadata = ProtectedResourceMetadata( + resource=resource_url, + authorization_servers=authorization_servers, + scopes_supported=scopes_supported, + # bearer_methods_supported defaults to ["header"] in the model + ) + + handler = ProtectedResourceMetadataHandler(metadata) + + return [ + Route( + "/.well-known/oauth-protected-resource", + endpoint=cors_middleware(handler.handle, ["GET", "OPTIONS"]), + methods=["GET", "OPTIONS"], + ) + ] diff --git a/src/mcp/server/auth/settings.py b/src/mcp/server/auth/settings.py new file mode 100644 index 000000000..1649826db --- /dev/null +++ b/src/mcp/server/auth/settings.py @@ -0,0 +1,30 @@ +from pydantic import AnyHttpUrl, BaseModel, Field + + +class ClientRegistrationOptions(BaseModel): + enabled: bool = False + client_secret_expiry_seconds: int | None = None + valid_scopes: list[str] | None = None + default_scopes: list[str] | None = None + + +class RevocationOptions(BaseModel): + enabled: bool = False + + +class AuthSettings(BaseModel): + issuer_url: AnyHttpUrl = Field( + ..., + description="OAuth authorization server URL that issues tokens for this resource server.", + ) + service_documentation_url: AnyHttpUrl | None = None + client_registration_options: ClientRegistrationOptions | None = None + revocation_options: RevocationOptions | None = None + required_scopes: list[str] | None = None + + # Resource Server settings (when operating as RS only) + resource_server_url: AnyHttpUrl | None = Field( + ..., + description="The URL of the MCP server to be used as the resource identifier " + "and base route to look up OAuth Protected Resource Metadata.", + ) diff --git a/src/mcp/server/elicitation.py b/src/mcp/server/elicitation.py new file mode 100644 index 000000000..1e48738c8 --- /dev/null +++ b/src/mcp/server/elicitation.py @@ -0,0 +1,111 @@ +"""Elicitation utilities for MCP servers.""" + +from __future__ import annotations + +import types +from typing import Generic, Literal, TypeVar, Union, get_args, get_origin + +from pydantic import BaseModel +from pydantic.fields import FieldInfo + +from mcp.server.session import ServerSession +from mcp.types import RequestId + +ElicitSchemaModelT = TypeVar("ElicitSchemaModelT", bound=BaseModel) + + +class AcceptedElicitation(BaseModel, Generic[ElicitSchemaModelT]): + """Result when user accepts the elicitation.""" + + action: Literal["accept"] = "accept" + data: ElicitSchemaModelT + + +class DeclinedElicitation(BaseModel): + """Result when user declines the elicitation.""" + + action: Literal["decline"] = "decline" + + +class CancelledElicitation(BaseModel): + """Result when user cancels the elicitation.""" + + action: Literal["cancel"] = "cancel" + + +ElicitationResult = AcceptedElicitation[ElicitSchemaModelT] | DeclinedElicitation | CancelledElicitation + + +# Primitive types allowed in elicitation schemas +_ELICITATION_PRIMITIVE_TYPES = (str, int, float, bool) + + +def _validate_elicitation_schema(schema: type[BaseModel]) -> None: + """Validate that a Pydantic model only contains primitive field types.""" + for field_name, field_info in schema.model_fields.items(): + if not 
_is_primitive_field(field_info): + raise TypeError( + f"Elicitation schema field '{field_name}' must be a primitive type " + f"{_ELICITATION_PRIMITIVE_TYPES} or Optional of these types. " + f"Complex types like lists, dicts, or nested models are not allowed." + ) + + +def _is_primitive_field(field_info: FieldInfo) -> bool: + """Check if a field is a primitive type allowed in elicitation schemas.""" + annotation = field_info.annotation + + # Handle None type + if annotation is types.NoneType: + return True + + # Handle basic primitive types + if annotation in _ELICITATION_PRIMITIVE_TYPES: + return True + + # Handle Union types + origin = get_origin(annotation) + if origin is Union or origin is types.UnionType: + args = get_args(annotation) + # All args must be primitive types or None + return all(arg is types.NoneType or arg in _ELICITATION_PRIMITIVE_TYPES for arg in args) + + return False + + +async def elicit_with_validation( + session: ServerSession, + message: str, + schema: type[ElicitSchemaModelT], + related_request_id: RequestId | None = None, +) -> ElicitationResult[ElicitSchemaModelT]: + """Elicit information from the client/user with schema validation. + + This method can be used to interactively ask for additional information from the + client within a tool's execution. The client might display the message to the + user and collect a response according to the provided schema. Or in case a + client is an agent, it might decide how to handle the elicitation -- either by asking + the user or automatically generating a response. + """ + # Validate that schema only contains primitive types and fail loudly if not + _validate_elicitation_schema(schema) + + json_schema = schema.model_json_schema() + + result = await session.elicit( + message=message, + requestedSchema=json_schema, + related_request_id=related_request_id, + ) + + if result.action == "accept" and result.content: + # Validate and parse the content using the schema + validated_data = schema.model_validate(result.content) + return AcceptedElicitation(data=validated_data) + elif result.action == "decline": + return DeclinedElicitation() + elif result.action == "cancel": + return CancelledElicitation() + else: + # This should never happen, but handle it just in case + raise ValueError(f"Unexpected elicitation action: {result.action}") diff --git a/src/mcp/server/fastmcp/prompts/base.py b/src/mcp/server/fastmcp/prompts/base.py index 71c48724e..b45cfc917 100644 --- a/src/mcp/server/fastmcp/prompts/base.py +++ b/src/mcp/server/fastmcp/prompts/base.py @@ -1,25 +1,22 @@ """Base classes for FastMCP prompts.""" import inspect -import json from collections.abc import Awaitable, Callable, Sequence from typing import Any, Literal import pydantic_core from pydantic import BaseModel, Field, TypeAdapter, validate_call -from mcp.types import EmbeddedResource, ImageContent, TextContent - -CONTENT_TYPES = TextContent | ImageContent | EmbeddedResource +from mcp.types import ContentBlock, TextContent class Message(BaseModel): """Base class for all prompt messages.""" role: Literal["user", "assistant"] - content: CONTENT_TYPES + content: ContentBlock - def __init__(self, content: str | CONTENT_TYPES, **kwargs: Any): + def __init__(self, content: str | ContentBlock, **kwargs: Any): if isinstance(content, str): content = TextContent(type="text", text=content) super().__init__(content=content, **kwargs) @@ -30,7 +27,7 @@ class UserMessage(Message): role: Literal["user", "assistant"] = "user" - def __init__(self, content: str | CONTENT_TYPES, 
**kwargs: Any): + def __init__(self, content: str | ContentBlock, **kwargs: Any): super().__init__(content=content, **kwargs) @@ -39,17 +36,13 @@ class AssistantMessage(Message): role: Literal["user", "assistant"] = "assistant" - def __init__(self, content: str | CONTENT_TYPES, **kwargs: Any): + def __init__(self, content: str | ContentBlock, **kwargs: Any): super().__init__(content=content, **kwargs) -message_validator = TypeAdapter[UserMessage | AssistantMessage]( - UserMessage | AssistantMessage -) +message_validator = TypeAdapter[UserMessage | AssistantMessage](UserMessage | AssistantMessage) -SyncPromptResult = ( - str | Message | dict[str, Any] | Sequence[str | Message | dict[str, Any]] -) +SyncPromptResult = str | Message | dict[str, Any] | Sequence[str | Message | dict[str, Any]] PromptResult = SyncPromptResult | Awaitable[SyncPromptResult] @@ -57,24 +50,17 @@ class PromptArgument(BaseModel): """An argument that can be passed to a prompt.""" name: str = Field(description="Name of the argument") - description: str | None = Field( - None, description="Description of what the argument does" - ) - required: bool = Field( - default=False, description="Whether the argument is required" - ) + description: str | None = Field(None, description="Description of what the argument does") + required: bool = Field(default=False, description="Whether the argument is required") class Prompt(BaseModel): """A prompt template that can be rendered with parameters.""" name: str = Field(description="Name of the prompt") - description: str | None = Field( - None, description="Description of what the prompt does" - ) - arguments: list[PromptArgument] | None = Field( - None, description="Arguments that can be passed to the prompt" - ) + title: str | None = Field(None, description="Human-readable title of the prompt") + description: str | None = Field(None, description="Description of what the prompt does") + arguments: list[PromptArgument] | None = Field(None, description="Arguments that can be passed to the prompt") fn: Callable[..., PromptResult | Awaitable[PromptResult]] = Field(exclude=True) @classmethod @@ -82,6 +68,7 @@ def from_function( cls, fn: Callable[..., PromptResult | Awaitable[PromptResult]], name: str | None = None, + title: str | None = None, description: str | None = None, ) -> "Prompt": """Create a Prompt from a function. 
@@ -118,6 +105,7 @@ def from_function( return cls( name=func_name, + title=title, description=description or fn.__doc__ or "", arguments=arguments, fn=fn, @@ -155,12 +143,10 @@ async def render(self, arguments: dict[str, Any] | None = None) -> list[Message] content = TextContent(type="text", text=msg) messages.append(UserMessage(content=content)) else: - content = json.dumps(pydantic_core.to_jsonable_python(msg)) + content = pydantic_core.to_json(msg, fallback=str, indent=2).decode() messages.append(Message(role="user", content=content)) except Exception: - raise ValueError( - f"Could not convert prompt result to message: {msg}" - ) + raise ValueError(f"Could not convert prompt result to message: {msg}") return messages except Exception as e: diff --git a/src/mcp/server/fastmcp/prompts/manager.py b/src/mcp/server/fastmcp/prompts/manager.py index 7ccbdef36..6b01d91cd 100644 --- a/src/mcp/server/fastmcp/prompts/manager.py +++ b/src/mcp/server/fastmcp/prompts/manager.py @@ -39,9 +39,7 @@ def add_prompt( self._prompts[prompt.name] = prompt return prompt - async def render_prompt( - self, name: str, arguments: dict[str, Any] | None = None - ) -> list[Message]: + async def render_prompt(self, name: str, arguments: dict[str, Any] | None = None) -> list[Message]: """Render a prompt by name with arguments.""" prompt = self.get_prompt(name) if not prompt: diff --git a/src/mcp/server/fastmcp/resources/base.py b/src/mcp/server/fastmcp/resources/base.py index b2050e7f8..f57631cc1 100644 --- a/src/mcp/server/fastmcp/resources/base.py +++ b/src/mcp/server/fastmcp/resources/base.py @@ -19,13 +19,10 @@ class Resource(BaseModel, abc.ABC): model_config = ConfigDict(validate_default=True) - uri: Annotated[AnyUrl, UrlConstraints(host_required=False)] = Field( - default=..., description="URI of the resource" - ) + uri: Annotated[AnyUrl, UrlConstraints(host_required=False)] = Field(default=..., description="URI of the resource") name: str | None = Field(description="Name of the resource", default=None) - description: str | None = Field( - description="Description of the resource", default=None - ) + title: str | None = Field(description="Human-readable title of the resource", default=None) + description: str | None = Field(description="Description of the resource", default=None) mime_type: str = Field( default="text/plain", description="MIME type of the resource content", diff --git a/src/mcp/server/fastmcp/resources/resource_manager.py b/src/mcp/server/fastmcp/resources/resource_manager.py index d27e6ac12..35e4ec04d 100644 --- a/src/mcp/server/fastmcp/resources/resource_manager.py +++ b/src/mcp/server/fastmcp/resources/resource_manager.py @@ -51,6 +51,7 @@ def add_template( fn: Callable[..., Any], uri_template: str, name: str | None = None, + title: str | None = None, description: str | None = None, mime_type: str | None = None, ) -> ResourceTemplate: @@ -59,6 +60,7 @@ def add_template( fn, uri_template=uri_template, name=name, + title=title, description=description, mime_type=mime_type, ) diff --git a/src/mcp/server/fastmcp/resources/templates.py b/src/mcp/server/fastmcp/resources/templates.py index a30b18253..b1c7b2711 100644 --- a/src/mcp/server/fastmcp/resources/templates.py +++ b/src/mcp/server/fastmcp/resources/templates.py @@ -15,18 +15,13 @@ class ResourceTemplate(BaseModel): """A template for dynamically creating resources.""" - uri_template: str = Field( - description="URI template with parameters (e.g. 
weather://{city}/current)" - ) + uri_template: str = Field(description="URI template with parameters (e.g. weather://{city}/current)") name: str = Field(description="Name of the resource") + title: str | None = Field(description="Human-readable title of the resource", default=None) description: str | None = Field(description="Description of what the resource does") - mime_type: str = Field( - default="text/plain", description="MIME type of the resource content" - ) + mime_type: str = Field(default="text/plain", description="MIME type of the resource content") fn: Callable[..., Any] = Field(exclude=True) - parameters: dict[str, Any] = Field( - description="JSON schema for function parameters" - ) + parameters: dict[str, Any] = Field(description="JSON schema for function parameters") @classmethod def from_function( @@ -34,6 +29,7 @@ def from_function( fn: Callable[..., Any], uri_template: str, name: str | None = None, + title: str | None = None, description: str | None = None, mime_type: str | None = None, ) -> ResourceTemplate: @@ -51,6 +47,7 @@ def from_function( return cls( uri_template=uri_template, name=func_name, + title=title, description=description or fn.__doc__ or "", mime_type=mime_type or "text/plain", fn=fn, @@ -77,6 +74,7 @@ async def create_resource(self, uri: str, params: dict[str, Any]) -> Resource: return FunctionResource( uri=uri, # type: ignore name=self.name, + title=self.title, description=self.description, mime_type=self.mime_type, fn=lambda: result, # Capture result in closure diff --git a/src/mcp/server/fastmcp/resources/types.py b/src/mcp/server/fastmcp/resources/types.py index d9fe2de6c..9c980dff1 100644 --- a/src/mcp/server/fastmcp/resources/types.py +++ b/src/mcp/server/fastmcp/resources/types.py @@ -9,9 +9,9 @@ import anyio import anyio.to_thread import httpx -import pydantic.json +import pydantic import pydantic_core -from pydantic import Field, ValidationInfo +from pydantic import AnyUrl, Field, ValidationInfo, validate_call from mcp.server.fastmcp.resources.base import Resource @@ -54,23 +54,45 @@ class FunctionResource(Resource): async def read(self) -> str | bytes: """Read the resource by calling the wrapped function.""" try: - result = ( - await self.fn() if inspect.iscoroutinefunction(self.fn) else self.fn() - ) + result = await self.fn() if inspect.iscoroutinefunction(self.fn) else self.fn() if isinstance(result, Resource): return await result.read() - if isinstance(result, bytes): + elif isinstance(result, bytes): return result - if isinstance(result, str): + elif isinstance(result, str): return result - try: - return json.dumps(pydantic_core.to_jsonable_python(result)) - except (TypeError, pydantic_core.PydanticSerializationError): - # If JSON serialization fails, try str() - return str(result) + else: + return pydantic_core.to_json(result, fallback=str, indent=2).decode() except Exception as e: raise ValueError(f"Error reading resource {self.uri}: {e}") + @classmethod + def from_function( + cls, + fn: Callable[..., Any], + uri: str, + name: str | None = None, + title: str | None = None, + description: str | None = None, + mime_type: str | None = None, + ) -> "FunctionResource": + """Create a FunctionResource from a function.""" + func_name = name or fn.__name__ + if func_name == "": + raise ValueError("You must provide a name for lambda functions") + + # ensure the arguments are properly cast + fn = validate_call(fn) + + return cls( + uri=AnyUrl(uri), + name=func_name, + title=title, + description=description or fn.__doc__ or "", + mime_type=mime_type 
or "text/plain", + fn=fn, + ) + class FileResource(Resource): """A resource that reads from a file. @@ -119,9 +141,7 @@ class HttpResource(Resource): """A resource that reads from an HTTP endpoint.""" url: str = Field(description="URL to fetch content from") - mime_type: str = Field( - default="application/json", description="MIME type of the resource content" - ) + mime_type: str = Field(default="application/json", description="MIME type of the resource content") async def read(self) -> str | bytes: """Read the HTTP content.""" @@ -135,15 +155,9 @@ class DirectoryResource(Resource): """A resource that lists files in a directory.""" path: Path = Field(description="Path to the directory") - recursive: bool = Field( - default=False, description="Whether to list files recursively" - ) - pattern: str | None = Field( - default=None, description="Optional glob pattern to filter files" - ) - mime_type: str = Field( - default="application/json", description="MIME type of the resource content" - ) + recursive: bool = Field(default=False, description="Whether to list files recursively") + pattern: str | None = Field(default=None, description="Optional glob pattern to filter files") + mime_type: str = Field(default="application/json", description="MIME type of the resource content") @pydantic.field_validator("path") @classmethod @@ -162,16 +176,8 @@ def list_files(self) -> list[Path]: try: if self.pattern: - return ( - list(self.path.glob(self.pattern)) - if not self.recursive - else list(self.path.rglob(self.pattern)) - ) - return ( - list(self.path.glob("*")) - if not self.recursive - else list(self.path.rglob("*")) - ) + return list(self.path.glob(self.pattern)) if not self.recursive else list(self.path.rglob(self.pattern)) + return list(self.path.glob("*")) if not self.recursive else list(self.path.rglob("*")) except Exception as e: raise ValueError(f"Error listing directory {self.path}: {e}") diff --git a/src/mcp/server/fastmcp/server.py b/src/mcp/server/fastmcp/server.py index f3bb2586a..668c6df82 100644 --- a/src/mcp/server/fastmcp/server.py +++ b/src/mcp/server/fastmcp/server.py @@ -3,9 +3,8 @@ from __future__ import annotations as _annotations import inspect -import json import re -from collections.abc import AsyncIterator, Callable, Iterable, Sequence +from collections.abc import AsyncIterator, Awaitable, Callable, Iterable, Sequence from contextlib import ( AbstractAsyncContextManager, asynccontextmanager, @@ -19,13 +18,25 @@ from pydantic.networks import AnyUrl from pydantic_settings import BaseSettings, SettingsConfigDict from starlette.applications import Starlette +from starlette.middleware import Middleware +from starlette.middleware.authentication import AuthenticationMiddleware from starlette.requests import Request +from starlette.responses import Response from starlette.routing import Mount, Route +from starlette.types import Receive, Scope, Send +from mcp.server.auth.middleware.auth_context import AuthContextMiddleware +from mcp.server.auth.middleware.bearer_auth import ( + BearerAuthBackend, + RequireAuthMiddleware, +) +from mcp.server.auth.provider import OAuthAuthorizationServerProvider, ProviderTokenVerifier, TokenVerifier +from mcp.server.auth.settings import AuthSettings +from mcp.server.elicitation import ElicitationResult, ElicitSchemaModelT, elicit_with_validation from mcp.server.fastmcp.exceptions import ResourceError from mcp.server.fastmcp.prompts import Prompt, PromptManager from mcp.server.fastmcp.resources import FunctionResource, Resource, ResourceManager -from 
mcp.server.fastmcp.tools import ToolManager +from mcp.server.fastmcp.tools import Tool, ToolManager from mcp.server.fastmcp.utilities.logging import configure_logging, get_logger from mcp.server.fastmcp.utilities.types import Image from mcp.server.lowlevel.helper_types import ReadResourceContents @@ -35,13 +46,16 @@ from mcp.server.session import ServerSession, ServerSessionT from mcp.server.sse import SseServerTransport from mcp.server.stdio import stdio_server -from mcp.shared.context import LifespanContextT, RequestContext +from mcp.server.streamable_http import EventStore +from mcp.server.streamable_http_manager import StreamableHTTPSessionManager +from mcp.server.transport_security import TransportSecuritySettings +from mcp.shared.context import LifespanContextT, RequestContext, RequestT from mcp.types import ( AnyFunction, - EmbeddedResource, + ContentBlock, GetPromptResult, - ImageContent, TextContent, + ToolAnnotations, ) from mcp.types import Prompt as MCPPrompt from mcp.types import PromptArgument as MCPPromptArgument @@ -62,6 +76,8 @@ class Settings(BaseSettings, Generic[LifespanResultT]): model_config = SettingsConfigDict( env_prefix="FASTMCP_", env_file=".env", + env_nested_delimiter="__", + nested_model_default_partial_update=True, extra="ignore", ) @@ -70,10 +86,16 @@ class Settings(BaseSettings, Generic[LifespanResultT]): log_level: Literal["DEBUG", "INFO", "WARNING", "ERROR", "CRITICAL"] = "INFO" # HTTP settings - host: str = "0.0.0.0" + host: str = "127.0.0.1" port: int = 8000 + mount_path: str = "/" # Mount path (e.g. "/github", defaults to root path) sse_path: str = "/sse" message_path: str = "/messages/" + streamable_http_path: str = "/mcp" + + # StreamableHTTP settings + json_response: bool = False + stateless_http: bool = False # If True, uses true stateless mode (new transport per request) # resource settings warn_on_duplicate_resources: bool = True @@ -89,17 +111,22 @@ class Settings(BaseSettings, Generic[LifespanResultT]): description="List of dependencies to install in the server environment", ) - lifespan: ( - Callable[[FastMCP], AbstractAsyncContextManager[LifespanResultT]] | None - ) = Field(None, description="Lifespan context manager") + lifespan: Callable[[FastMCP], AbstractAsyncContextManager[LifespanResultT]] | None = Field( + None, description="Lifespan context manager" + ) + + auth: AuthSettings | None = None + + # Transport security settings (DNS rebinding protection) + transport_security: TransportSecuritySettings | None = None def lifespan_wrapper( app: FastMCP, lifespan: Callable[[FastMCP], AbstractAsyncContextManager[LifespanResultT]], -) -> Callable[[MCPServer[LifespanResultT]], AbstractAsyncContextManager[object]]: +) -> Callable[[MCPServer[LifespanResultT, Request]], AbstractAsyncContextManager[object]]: @asynccontextmanager - async def wrap(s: MCPServer[LifespanResultT]) -> AsyncIterator[object]: + async def wrap(s: MCPServer[LifespanResultT, Request]) -> AsyncIterator[object]: async with lifespan(app) as context: yield context @@ -108,27 +135,46 @@ async def wrap(s: MCPServer[LifespanResultT]) -> AsyncIterator[object]: class FastMCP: def __init__( - self, name: str | None = None, instructions: str | None = None, **settings: Any + self, + name: str | None = None, + instructions: str | None = None, + auth_server_provider: OAuthAuthorizationServerProvider[Any, Any, Any] | None = None, + token_verifier: TokenVerifier | None = None, + event_store: EventStore | None = None, + *, + tools: list[Tool] | None = None, + **settings: Any, ): self.settings = 
Settings(**settings) self._mcp_server = MCPServer( name=name or "FastMCP", instructions=instructions, - lifespan=lifespan_wrapper(self, self.settings.lifespan) - if self.settings.lifespan - else default_lifespan, - ) - self._tool_manager = ToolManager( - warn_on_duplicate_tools=self.settings.warn_on_duplicate_tools - ) - self._resource_manager = ResourceManager( - warn_on_duplicate_resources=self.settings.warn_on_duplicate_resources - ) - self._prompt_manager = PromptManager( - warn_on_duplicate_prompts=self.settings.warn_on_duplicate_prompts + lifespan=(lifespan_wrapper(self, self.settings.lifespan) if self.settings.lifespan else default_lifespan), ) + self._tool_manager = ToolManager(tools=tools, warn_on_duplicate_tools=self.settings.warn_on_duplicate_tools) + self._resource_manager = ResourceManager(warn_on_duplicate_resources=self.settings.warn_on_duplicate_resources) + self._prompt_manager = PromptManager(warn_on_duplicate_prompts=self.settings.warn_on_duplicate_prompts) + # Validate auth configuration + if self.settings.auth is not None: + if auth_server_provider and token_verifier: + raise ValueError("Cannot specify both auth_server_provider and token_verifier") + if not auth_server_provider and not token_verifier: + raise ValueError("Must specify either auth_server_provider or token_verifier when auth is enabled") + else: + if auth_server_provider or token_verifier: + raise ValueError("Cannot specify auth_server_provider or token_verifier without auth settings") + + self._auth_server_provider = auth_server_provider + self._token_verifier = token_verifier + + # Create token verifier from provider if needed (backwards compatibility) + if auth_server_provider and not token_verifier: + self._token_verifier = ProviderTokenVerifier(auth_server_provider) + self._event_store = event_store + self._custom_starlette_routes: list[Route] = [] self.dependencies = self.settings.dependencies + self._session_manager: StreamableHTTPSessionManager | None = None # Set up MCP protocol handlers self._setup_handlers() @@ -144,20 +190,47 @@ def name(self) -> str: def instructions(self) -> str | None: return self._mcp_server.instructions - def run(self, transport: Literal["stdio", "sse"] = "stdio") -> None: + @property + def session_manager(self) -> StreamableHTTPSessionManager: + """Get the StreamableHTTP session manager. + + This is exposed to enable advanced use cases like mounting multiple + FastMCP servers in a single FastAPI application. + + Raises: + RuntimeError: If called before streamable_http_app() has been called. + """ + if self._session_manager is None: + raise RuntimeError( + "Session manager can only be accessed after" + "calling streamable_http_app()." + "The session manager is created lazily" + "to avoid unnecessary initialization." + ) + return self._session_manager + + def run( + self, + transport: Literal["stdio", "sse", "streamable-http"] = "stdio", + mount_path: str | None = None, + ) -> None: """Run the FastMCP server. Note this is a synchronous function. 
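+
+        For example, ``mcp.run()`` serves over stdio, while
+        ``mcp.run("streamable-http")`` starts an HTTP server on the configured
+        host and port.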
Args: - transport: Transport protocol to use ("stdio" or "sse") + transport: Transport protocol to use ("stdio", "sse", or "streamable-http") + mount_path: Optional mount path for SSE transport """ - TRANSPORTS = Literal["stdio", "sse"] + TRANSPORTS = Literal["stdio", "sse", "streamable-http"] if transport not in TRANSPORTS.__args__: # type: ignore raise ValueError(f"Unknown transport: {transport}") - if transport == "stdio": - anyio.run(self.run_stdio_async) - else: # transport == "sse" - anyio.run(self.run_sse_async) + match transport: + case "stdio": + anyio.run(self.run_stdio_async) + case "sse": + anyio.run(lambda: self.run_sse_async(mount_path)) + case "streamable-http": + anyio.run(self.run_streamable_http_async) def _setup_handlers(self) -> None: """Set up core MCP protocol handlers.""" @@ -175,13 +248,15 @@ async def list_tools(self) -> list[MCPTool]: return [ MCPTool( name=info.name, + title=info.title, description=info.description, inputSchema=info.parameters, + annotations=info.annotations, ) for info in tools ] - def get_context(self) -> Context[ServerSession, object]: + def get_context(self) -> Context[ServerSession, object, Request]: """ Returns a Context object. Note that the context will only be valid during a request; outside a request, most methods will error. @@ -192,9 +267,7 @@ def get_context(self) -> Context[ServerSession, object]: request_context = None return Context(request_context=request_context, fastmcp=self) - async def call_tool( - self, name: str, arguments: dict[str, Any] - ) -> Sequence[TextContent | ImageContent | EmbeddedResource]: + async def call_tool(self, name: str, arguments: dict[str, Any]) -> Sequence[ContentBlock]: """Call a tool by name with arguments.""" context = self.get_context() result = await self._tool_manager.call_tool(name, arguments, context=context) @@ -209,6 +282,7 @@ async def list_resources(self) -> list[MCPResource]: MCPResource( uri=resource.uri, name=resource.name or "", + title=resource.title, description=resource.description, mimeType=resource.mime_type, ) @@ -221,6 +295,7 @@ async def list_resource_templates(self) -> list[MCPResourceTemplate]: MCPResourceTemplate( uriTemplate=template.uri_template, name=template.name, + title=template.title, description=template.description, ) for template in templates @@ -244,7 +319,9 @@ def add_tool( self, fn: AnyFunction, name: str | None = None, + title: str | None = None, description: str | None = None, + annotations: ToolAnnotations | None = None, ) -> None: """Add a tool to the server. @@ -254,12 +331,18 @@ def add_tool( Args: fn: The function to register as a tool name: Optional name for the tool (defaults to function name) + title: Optional human-readable title for the tool description: Optional description of what the tool does + annotations: Optional ToolAnnotations providing additional tool information """ - self._tool_manager.add_tool(fn, name=name, description=description) + self._tool_manager.add_tool(fn, name=name, title=title, description=description, annotations=annotations) def tool( - self, name: str | None = None, description: str | None = None + self, + name: str | None = None, + title: str | None = None, + description: str | None = None, + annotations: ToolAnnotations | None = None, ) -> Callable[[AnyFunction], AnyFunction]: """Decorator to register a tool. 
@@ -269,7 +352,9 @@ def tool( Args: name: Optional name for the tool (defaults to function name) + title: Optional human-readable title for the tool description: Optional description of what the tool does + annotations: Optional ToolAnnotations providing additional tool information Example: @server.tool() @@ -289,16 +374,33 @@ async def async_tool(x: int, context: Context) -> str: # Check if user passed function directly instead of calling decorator if callable(name): raise TypeError( - "The @tool decorator was used incorrectly. " - "Did you forget to call it? Use @tool() instead of @tool" + "The @tool decorator was used incorrectly. " "Did you forget to call it? Use @tool() instead of @tool" ) def decorator(fn: AnyFunction) -> AnyFunction: - self.add_tool(fn, name=name, description=description) + self.add_tool(fn, name=name, title=title, description=description, annotations=annotations) return fn return decorator + def completion(self): + """Decorator to register a completion handler. + + The completion handler receives: + - ref: PromptReference or ResourceTemplateReference + - argument: CompletionArgument with name and partial value + - context: Optional CompletionContext with previously resolved arguments + + Example: + @mcp.completion() + async def handle_completion(ref, argument, context): + if isinstance(ref, ResourceTemplateReference): + # Return completions based on ref, argument, and context + return Completion(values=["option1", "option2"]) + return None + """ + return self._mcp_server.completion() + def add_resource(self, resource: Resource) -> None: """Add a resource to the server. @@ -312,6 +414,7 @@ def resource( uri: str, *, name: str | None = None, + title: str | None = None, description: str | None = None, mime_type: str | None = None, ) -> Callable[[AnyFunction], AnyFunction]: @@ -329,6 +432,7 @@ def resource( Args: uri: URI for the resource (e.g. "resource://my-resource" or "resource://{param}") name: Optional name for the resource + title: Optional human-readable title for the resource description: Optional description of the resource mime_type: Optional MIME type for the resource @@ -370,8 +474,7 @@ def decorator(fn: AnyFunction) -> AnyFunction: if uri_params != func_params: raise ValueError( - f"Mismatch between URI parameters {uri_params} " - f"and function parameters {func_params}" + f"Mismatch between URI parameters {uri_params} " f"and function parameters {func_params}" ) # Register as template @@ -379,17 +482,19 @@ def decorator(fn: AnyFunction) -> AnyFunction: fn=fn, uri_template=uri, name=name, + title=title, description=description, - mime_type=mime_type or "text/plain", + mime_type=mime_type, ) else: # Register as regular resource - resource = FunctionResource( - uri=AnyUrl(uri), + resource = FunctionResource.from_function( + fn=fn, + uri=uri, name=name, + title=title, description=description, - mime_type=mime_type or "text/plain", - fn=fn, + mime_type=mime_type, ) self.add_resource(resource) return fn @@ -405,12 +510,13 @@ def add_prompt(self, prompt: Prompt) -> None: self._prompt_manager.add_prompt(prompt) def prompt( - self, name: str | None = None, description: str | None = None + self, name: str | None = None, title: str | None = None, description: str | None = None ) -> Callable[[AnyFunction], AnyFunction]: """Decorator to register a prompt. 
Args: name: Optional name for the prompt (defaults to function name) + title: Optional human-readable title for the prompt description: Optional description of what the prompt does Example: @@ -448,12 +554,56 @@ async def analyze_file(path: str) -> list[Message]: ) def decorator(func: AnyFunction) -> AnyFunction: - prompt = Prompt.from_function(func, name=name, description=description) + prompt = Prompt.from_function(func, name=name, title=title, description=description) self.add_prompt(prompt) return func return decorator + def custom_route( + self, + path: str, + methods: list[str], + name: str | None = None, + include_in_schema: bool = True, + ): + """ + Decorator to register a custom HTTP route on the FastMCP server. + + Allows adding arbitrary HTTP endpoints outside the standard MCP protocol, + which can be useful for OAuth callbacks, health checks, or admin APIs. + The handler function must be an async function that accepts a Starlette + Request and returns a Response. + + Args: + path: URL path for the route (e.g., "/oauth/callback") + methods: List of HTTP methods to support (e.g., ["GET", "POST"]) + name: Optional name for the route (to reference this route with + Starlette's reverse URL lookup feature) + include_in_schema: Whether to include in OpenAPI schema, defaults to True + + Example: + @server.custom_route("/health", methods=["GET"]) + async def health_check(request: Request) -> Response: + return JSONResponse({"status": "ok"}) + """ + + def decorator( + func: Callable[[Request], Awaitable[Response]], + ) -> Callable[[Request], Awaitable[Response]]: + self._custom_starlette_routes.append( + Route( + path, + endpoint=func, + methods=methods, + name=name, + include_in_schema=include_in_schema, + ) + ) + return func + + return decorator + async def run_stdio_async(self) -> None: """Run the server using stdio transport.""" async with stdio_server() as (read_stream, write_stream): @@ -463,10 +613,11 @@ async def run_stdio_async(self) -> None: self._mcp_server.create_initialization_options(), ) - async def run_sse_async(self) -> None: + async def run_sse_async(self, mount_path: str | None = None) -> None: """Run the server using SSE transport.""" import uvicorn - starlette_app = self.sse_app() + + starlette_app = self.sse_app(mount_path) config = uvicorn.Config( starlette_app, @@ -477,28 +628,287 @@ async def run_sse_async(self) -> None: server = uvicorn.Server(config) await server.serve() - def sse_app(self) -> Starlette: + async def run_streamable_http_async(self) -> None: + """Run the server using StreamableHTTP transport.""" + import uvicorn + + starlette_app = self.streamable_http_app() + + config = uvicorn.Config( + starlette_app, + host=self.settings.host, + port=self.settings.port, + log_level=self.settings.log_level.lower(), + ) + server = uvicorn.Server(config) + await server.serve() + + def _normalize_path(self, mount_path: str, endpoint: str) -> str: + """ + Combine mount path and endpoint to return a normalized path. + + Args: + mount_path: The mount path (e.g. "/github" or "/") + endpoint: The endpoint path (e.g. "/messages/") + + Returns: + Normalized path (e.g. 
"/github/messages/") + """ + # Special case: root path + if mount_path == "/": + return endpoint + + # Remove trailing slash from mount path + if mount_path.endswith("/"): + mount_path = mount_path[:-1] + + # Ensure endpoint starts with slash + if not endpoint.startswith("/"): + endpoint = "/" + endpoint + + # Combine paths + return mount_path + endpoint + + def sse_app(self, mount_path: str | None = None) -> Starlette: """Return an instance of the SSE server app.""" - sse = SseServerTransport(self.settings.message_path) + from starlette.middleware import Middleware + from starlette.routing import Mount, Route + + # Update mount_path in settings if provided + if mount_path is not None: + self.settings.mount_path = mount_path + + # Create normalized endpoint considering the mount path + normalized_message_endpoint = self._normalize_path(self.settings.mount_path, self.settings.message_path) + + # Set up auth context and dependencies + + sse = SseServerTransport( + normalized_message_endpoint, + security_settings=self.settings.transport_security, + ) + + async def handle_sse(scope: Scope, receive: Receive, send: Send): + # Add client ID from auth context into request context if available - async def handle_sse(request: Request) -> None: async with sse.connect_sse( - request.scope, - request.receive, - request._send, # type: ignore[reportPrivateUsage] + scope, + receive, + send, ) as streams: await self._mcp_server.run( streams[0], streams[1], self._mcp_server.create_initialization_options(), ) + return Response() + + # Create routes + routes: list[Route | Mount] = [] + middleware: list[Middleware] = [] + required_scopes = [] + + # Set up auth if configured + if self.settings.auth: + required_scopes = self.settings.auth.required_scopes or [] + + # Add auth middleware if token verifier is available + if self._token_verifier: + middleware = [ + # extract auth info from request (but do not require it) + Middleware( + AuthenticationMiddleware, + backend=BearerAuthBackend(self._token_verifier), + ), + # Add the auth context middleware to store + # authenticated user in a contextvar + Middleware(AuthContextMiddleware), + ] + + # Add auth endpoints if auth server provider is configured + if self._auth_server_provider: + from mcp.server.auth.routes import create_auth_routes + + routes.extend( + create_auth_routes( + provider=self._auth_server_provider, + issuer_url=self.settings.auth.issuer_url, + service_documentation_url=self.settings.auth.service_documentation_url, + client_registration_options=self.settings.auth.client_registration_options, + revocation_options=self.settings.auth.revocation_options, + ) + ) + + # When auth is configured, require authentication + if self._token_verifier: + # Determine resource metadata URL + resource_metadata_url = None + if self.settings.auth and self.settings.auth.resource_server_url: + from pydantic import AnyHttpUrl + + resource_metadata_url = AnyHttpUrl( + str(self.settings.auth.resource_server_url).rstrip("/") + "/.well-known/oauth-protected-resource" + ) + + # Auth is enabled, wrap the endpoints with RequireAuthMiddleware + routes.append( + Route( + self.settings.sse_path, + endpoint=RequireAuthMiddleware(handle_sse, required_scopes, resource_metadata_url), + methods=["GET"], + ) + ) + routes.append( + Mount( + self.settings.message_path, + app=RequireAuthMiddleware(sse.handle_post_message, required_scopes, resource_metadata_url), + ) + ) + else: + # Auth is disabled, no need for RequireAuthMiddleware + # Since handle_sse is an ASGI app, we need to create a 
compatible endpoint + async def sse_endpoint(request: Request) -> Response: + # Convert the Starlette request to ASGI parameters + return await handle_sse(request.scope, request.receive, request._send) # type: ignore[reportPrivateUsage] + + routes.append( + Route( + self.settings.sse_path, + endpoint=sse_endpoint, + methods=["GET"], + ) + ) + routes.append( + Mount( + self.settings.message_path, + app=sse.handle_post_message, + ) + ) + # Add protected resource metadata endpoint if configured as RS + if self.settings.auth and self.settings.auth.resource_server_url: + from mcp.server.auth.routes import create_protected_resource_routes + + routes.extend( + create_protected_resource_routes( + resource_url=self.settings.auth.resource_server_url, + authorization_servers=[self.settings.auth.issuer_url], + scopes_supported=self.settings.auth.required_scopes, + ) + ) + + # mount these routes last, so they have the lowest route matching precedence + routes.extend(self._custom_starlette_routes) + + # Create Starlette app with routes and middleware + return Starlette(debug=self.settings.debug, routes=routes, middleware=middleware) + + def streamable_http_app(self) -> Starlette: + """Return an instance of the StreamableHTTP server app.""" + from starlette.middleware import Middleware + from starlette.routing import Mount + + # Create session manager on first call (lazy initialization) + if self._session_manager is None: + self._session_manager = StreamableHTTPSessionManager( + app=self._mcp_server, + event_store=self._event_store, + json_response=self.settings.json_response, + stateless=self.settings.stateless_http, # Use the stateless setting + security_settings=self.settings.transport_security, + ) + + # Create the ASGI handler + async def handle_streamable_http(scope: Scope, receive: Receive, send: Send) -> None: + await self.session_manager.handle_request(scope, receive, send) + + # Create routes + routes: list[Route | Mount] = [] + middleware: list[Middleware] = [] + required_scopes = [] + + # Set up auth if configured + if self.settings.auth: + required_scopes = self.settings.auth.required_scopes or [] + + # Add auth middleware if token verifier is available + if self._token_verifier: + middleware = [ + Middleware( + AuthenticationMiddleware, + backend=BearerAuthBackend(self._token_verifier), + ), + Middleware(AuthContextMiddleware), + ] + + # Add auth endpoints if auth server provider is configured + if self._auth_server_provider: + from mcp.server.auth.routes import create_auth_routes + + routes.extend( + create_auth_routes( + provider=self._auth_server_provider, + issuer_url=self.settings.auth.issuer_url, + service_documentation_url=self.settings.auth.service_documentation_url, + client_registration_options=self.settings.auth.client_registration_options, + revocation_options=self.settings.auth.revocation_options, + ) + ) + + # Set up routes with or without auth + if self._token_verifier: + # Determine resource metadata URL + resource_metadata_url = None + if self.settings.auth and self.settings.auth.resource_server_url: + from pydantic import AnyHttpUrl + + resource_metadata_url = AnyHttpUrl( + str(self.settings.auth.resource_server_url).rstrip("/") + "/.well-known/oauth-protected-resource" + ) + + routes.append( + Mount( + self.settings.streamable_http_path, + app=RequireAuthMiddleware(handle_streamable_http, required_scopes, resource_metadata_url), + ) + ) + else: + # Auth is disabled, no wrapper needed + routes.append( + Mount( + self.settings.streamable_http_path, + 
app=handle_streamable_http, + ) + ) + + # Add protected resource metadata endpoint if configured as RS + if self.settings.auth and self.settings.auth.resource_server_url: + from mcp.server.auth.handlers.metadata import ProtectedResourceMetadataHandler + from mcp.server.auth.routes import cors_middleware + from mcp.shared.auth import ProtectedResourceMetadata + + protected_resource_metadata = ProtectedResourceMetadata( + resource=self.settings.auth.resource_server_url, + authorization_servers=[self.settings.auth.issuer_url], + scopes_supported=self.settings.auth.required_scopes, + ) + routes.append( + Route( + "/.well-known/oauth-protected-resource", + endpoint=cors_middleware( + ProtectedResourceMetadataHandler(protected_resource_metadata).handle, + ["GET", "OPTIONS"], + ), + methods=["GET", "OPTIONS"], + ) + ) + + routes.extend(self._custom_starlette_routes) return Starlette( debug=self.settings.debug, - routes=[ - Route(self.settings.sse_path, endpoint=handle_sse), - Mount(self.settings.message_path, app=sse.handle_post_message), - ], + routes=routes, + middleware=middleware, + lifespan=lambda app: self.session_manager.run(), ) async def list_prompts(self) -> list[MCPPrompt]: @@ -507,6 +917,7 @@ async def list_prompts(self) -> list[MCPPrompt]: return [ MCPPrompt( name=prompt.name, + title=prompt.title, description=prompt.description, arguments=[ MCPPromptArgument( @@ -520,9 +931,7 @@ async def list_prompts(self) -> list[MCPPrompt]: for prompt in prompts ] - async def get_prompt( - self, name: str, arguments: dict[str, Any] | None = None - ) -> GetPromptResult: + async def get_prompt(self, name: str, arguments: dict[str, Any] | None = None) -> GetPromptResult: """Get a prompt by name with arguments.""" try: messages = await self._prompt_manager.render_prompt(name, arguments) @@ -535,12 +944,12 @@ async def get_prompt( def _convert_to_content( result: Any, -) -> Sequence[TextContent | ImageContent | EmbeddedResource]: +) -> Sequence[ContentBlock]: """Convert a result to a sequence of content objects.""" if result is None: return [] - if isinstance(result, TextContent | ImageContent | EmbeddedResource): + if isinstance(result, ContentBlock): return [result] if isinstance(result, Image): @@ -550,15 +959,12 @@ def _convert_to_content( return list(chain.from_iterable(_convert_to_content(item) for item in result)) # type: ignore[reportUnknownVariableType] if not isinstance(result, str): - try: - result = json.dumps(pydantic_core.to_jsonable_python(result)) - except Exception: - result = str(result) + result = pydantic_core.to_json(result, fallback=str, indent=2).decode() return [TextContent(type="text", text=result)] -class Context(BaseModel, Generic[ServerSessionT, LifespanContextT]): +class Context(BaseModel, Generic[ServerSessionT, LifespanContextT, RequestT]): """Context object providing access to MCP capabilities. This provides a cleaner interface to MCP's RequestContext functionality. @@ -592,13 +998,13 @@ def my_tool(x: int, ctx: Context) -> str: The context is optional - tools that don't need it can omit the parameter. 
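+
+    Async tools can await the context helpers as well, e.g. (an illustrative
+    sketch using the progress API shown below):
+
+        @server.tool()
+        async def long_task(path: str, ctx: Context) -> str:
+            await ctx.report_progress(50, total=100, message="halfway there")
+            return f"done with {path}"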
""" - _request_context: RequestContext[ServerSessionT, LifespanContextT] | None + _request_context: RequestContext[ServerSessionT, LifespanContextT, RequestT] | None _fastmcp: FastMCP | None def __init__( self, *, - request_context: RequestContext[ServerSessionT, LifespanContextT] | None = None, + request_context: (RequestContext[ServerSessionT, LifespanContextT, RequestT] | None) = None, fastmcp: FastMCP | None = None, **kwargs: Any, ): @@ -614,33 +1020,32 @@ def fastmcp(self) -> FastMCP: return self._fastmcp @property - def request_context(self) -> RequestContext[ServerSessionT, LifespanContextT]: + def request_context( + self, + ) -> RequestContext[ServerSessionT, LifespanContextT, RequestT]: """Access to the underlying request context.""" if self._request_context is None: raise ValueError("Context is not available outside of a request") return self._request_context - async def report_progress( - self, progress: float, total: float | None = None - ) -> None: + async def report_progress(self, progress: float, total: float | None = None, message: str | None = None) -> None: """Report progress for the current operation. Args: progress: Current progress value e.g. 24 total: Optional total value e.g. 100 + message: Optional message e.g. Starting render... """ - - progress_token = ( - self.request_context.meta.progressToken - if self.request_context.meta - else None - ) + progress_token = self.request_context.meta.progressToken if self.request_context.meta else None if progress_token is None: return await self.request_context.session.send_progress_notification( - progress_token=progress_token, progress=progress, total=total + progress_token=progress_token, + progress=progress, + total=total, + message=message, ) async def read_resource(self, uri: str | AnyUrl) -> Iterable[ReadResourceContents]: @@ -652,11 +1057,40 @@ async def read_resource(self, uri: str | AnyUrl) -> Iterable[ReadResourceContent Returns: The resource content as either text or bytes """ - assert ( - self._fastmcp is not None - ), "Context is not available outside of a request" + assert self._fastmcp is not None, "Context is not available outside of a request" return await self._fastmcp.read_resource(uri) + async def elicit( + self, + message: str, + schema: type[ElicitSchemaModelT], + ) -> ElicitationResult[ElicitSchemaModelT]: + """Elicit information from the client/user. + + This method can be used to interactively ask for additional information from the + client within a tool's execution. The client might display the message to the + user and collect a response according to the provided schema. Or in case a + client is an agent, it might decide how to handle the elicitation -- either by asking + the user or automatically generating a response. + + Args: + schema: A Pydantic model class defining the expected response structure, according to the specification, + only primive types are allowed. + message: Optional message to present to the user. If not provided, will use + a default message based on the schema + + Returns: + An ElicitationResult containing the action taken and the data if accepted + + Note: + Check the result.action to determine if the user accepted, declined, or cancelled. + The result.data will only be populated if action is "accept" and validation succeeded. 
+ """ + + return await elicit_with_validation( + session=self.request_context.session, message=message, schema=schema, related_request_id=self.request_id + ) + async def log( self, level: Literal["debug", "info", "warning", "error"], @@ -673,17 +1107,16 @@ async def log( **extra: Additional structured data to include """ await self.request_context.session.send_log_message( - level=level, data=message, logger=logger_name + level=level, + data=message, + logger=logger_name, + related_request_id=self.request_id, ) @property def client_id(self) -> str | None: """Get the client ID if available.""" - return ( - getattr(self.request_context.meta, "client_id", None) - if self.request_context.meta - else None - ) + return getattr(self.request_context.meta, "client_id", None) if self.request_context.meta else None @property def request_id(self) -> str: diff --git a/src/mcp/server/fastmcp/tools/base.py b/src/mcp/server/fastmcp/tools/base.py index 92a216f56..2f7c48e8b 100644 --- a/src/mcp/server/fastmcp/tools/base.py +++ b/src/mcp/server/fastmcp/tools/base.py @@ -1,5 +1,6 @@ from __future__ import annotations as _annotations +import functools import inspect from collections.abc import Callable from typing import TYPE_CHECKING, Any, get_origin @@ -8,11 +9,12 @@ from mcp.server.fastmcp.exceptions import ToolError from mcp.server.fastmcp.utilities.func_metadata import FuncMetadata, func_metadata +from mcp.types import ToolAnnotations if TYPE_CHECKING: from mcp.server.fastmcp.server import Context from mcp.server.session import ServerSessionT - from mcp.shared.context import LifespanContextT + from mcp.shared.context import LifespanContextT, RequestT class Tool(BaseModel): @@ -20,27 +22,28 @@ class Tool(BaseModel): fn: Callable[..., Any] = Field(exclude=True) name: str = Field(description="Name of the tool") + title: str | None = Field(None, description="Human-readable title of the tool") description: str = Field(description="Description of what the tool does") parameters: dict[str, Any] = Field(description="JSON schema for tool parameters") fn_metadata: FuncMetadata = Field( - description="Metadata about the function including a pydantic model for tool" - " arguments" + description="Metadata about the function including a pydantic model for tool" " arguments" ) is_async: bool = Field(description="Whether the tool is async") - context_kwarg: str | None = Field( - None, description="Name of the kwarg that should receive context" - ) + context_kwarg: str | None = Field(None, description="Name of the kwarg that should receive context") + annotations: ToolAnnotations | None = Field(None, description="Optional annotations for the tool") @classmethod def from_function( cls, fn: Callable[..., Any], name: str | None = None, + title: str | None = None, description: str | None = None, context_kwarg: str | None = None, + annotations: ToolAnnotations | None = None, ) -> Tool: """Create a Tool from a function.""" - from mcp.server.fastmcp import Context + from mcp.server.fastmcp.server import Context func_name = name or fn.__name__ @@ -48,7 +51,7 @@ def from_function( raise ValueError("You must provide a name for lambda functions") func_doc = description or fn.__doc__ or "" - is_async = inspect.iscoroutinefunction(fn) + is_async = _is_async_callable(fn) if context_kwarg is None: sig = inspect.signature(fn) @@ -68,17 +71,19 @@ def from_function( return cls( fn=fn, name=func_name, + title=title, description=func_doc, parameters=parameters, fn_metadata=func_arg_metadata, is_async=is_async, context_kwarg=context_kwarg, + 
annotations=annotations, ) async def run( self, arguments: dict[str, Any], - context: Context[ServerSessionT, LifespanContextT] | None = None, + context: Context[ServerSessionT, LifespanContextT, RequestT] | None = None, ) -> Any: """Run the tool with arguments.""" try: @@ -86,9 +91,16 @@ async def run( self.fn, self.is_async, arguments, - {self.context_kwarg: context} - if self.context_kwarg is not None - else None, + {self.context_kwarg: context} if self.context_kwarg is not None else None, ) except Exception as e: raise ToolError(f"Error executing tool {self.name}: {e}") from e + + +def _is_async_callable(obj: Any) -> bool: + while isinstance(obj, functools.partial): + obj = obj.func + + return inspect.iscoroutinefunction(obj) or ( + callable(obj) and inspect.iscoroutinefunction(getattr(obj, "__call__", None)) + ) diff --git a/src/mcp/server/fastmcp/tools/tool_manager.py b/src/mcp/server/fastmcp/tools/tool_manager.py index 4d6ac268f..b9ca1655d 100644 --- a/src/mcp/server/fastmcp/tools/tool_manager.py +++ b/src/mcp/server/fastmcp/tools/tool_manager.py @@ -6,7 +6,8 @@ from mcp.server.fastmcp.exceptions import ToolError from mcp.server.fastmcp.tools.base import Tool from mcp.server.fastmcp.utilities.logging import get_logger -from mcp.shared.context import LifespanContextT +from mcp.shared.context import LifespanContextT, RequestT +from mcp.types import ToolAnnotations if TYPE_CHECKING: from mcp.server.fastmcp.server import Context @@ -18,8 +19,19 @@ class ToolManager: """Manages FastMCP tools.""" - def __init__(self, warn_on_duplicate_tools: bool = True): + def __init__( + self, + warn_on_duplicate_tools: bool = True, + *, + tools: list[Tool] | None = None, + ): self._tools: dict[str, Tool] = {} + if tools is not None: + for tool in tools: + if warn_on_duplicate_tools and tool.name in self._tools: + logger.warning(f"Tool already exists: {tool.name}") + self._tools[tool.name] = tool + self.warn_on_duplicate_tools = warn_on_duplicate_tools def get_tool(self, name: str) -> Tool | None: @@ -34,10 +46,12 @@ def add_tool( self, fn: Callable[..., Any], name: str | None = None, + title: str | None = None, description: str | None = None, + annotations: ToolAnnotations | None = None, ) -> Tool: """Add a tool to the server.""" - tool = Tool.from_function(fn, name=name, description=description) + tool = Tool.from_function(fn, name=name, title=title, description=description, annotations=annotations) existing = self._tools.get(tool.name) if existing: if self.warn_on_duplicate_tools: @@ -50,7 +64,7 @@ async def call_tool( self, name: str, arguments: dict[str, Any], - context: Context[ServerSessionT, LifespanContextT] | None = None, + context: Context[ServerSessionT, LifespanContextT, RequestT] | None = None, ) -> Any: """Call a tool by name with arguments.""" tool = self.get_tool(name) diff --git a/src/mcp/server/fastmcp/utilities/func_metadata.py b/src/mcp/server/fastmcp/utilities/func_metadata.py index 374391325..9f8d9177a 100644 --- a/src/mcp/server/fastmcp/utilities/func_metadata.py +++ b/src/mcp/server/fastmcp/utilities/func_metadata.py @@ -102,9 +102,7 @@ def pre_parse_json(self, data: dict[str, Any]) -> dict[str, Any]: ) -def func_metadata( - func: Callable[..., Any], skip_names: Sequence[str] = () -) -> FuncMetadata: +def func_metadata(func: Callable[..., Any], skip_names: Sequence[str] = ()) -> FuncMetadata: """Given a function, return metadata including a pydantic model representing its signature. 
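A brief sketch of the `Tool` and `ToolManager` changes above (the new `title`, `annotations`, and `tools=` parameters, plus async detection that unwraps `functools.partial`); the `add` function and the annotation value are illustrative.

```python
# Illustrative use of Tool.from_function and the new ToolManager(tools=...) parameter.
from mcp.server.fastmcp.tools.base import Tool
from mcp.server.fastmcp.tools.tool_manager import ToolManager
from mcp.types import ToolAnnotations


def add(a: int, b: int) -> int:
    """Add two numbers."""
    return a + b


tool = Tool.from_function(
    add,
    title="Addition",
    annotations=ToolAnnotations(readOnlyHint=True),  # hints that the tool has no side effects
)
manager = ToolManager(tools=[tool])
assert manager.get_tool("add") is not None
```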
@@ -131,9 +129,7 @@ def func_metadata( globalns = getattr(func, "__globals__", {}) for param in params.values(): if param.name.startswith("_"): - raise InvalidSignature( - f"Parameter {param.name} of {func.__name__} cannot start with '_'" - ) + raise InvalidSignature(f"Parameter {param.name} of {func.__name__} cannot start with '_'") if param.name in skip_names: continue annotation = param.annotation @@ -142,11 +138,7 @@ def func_metadata( if annotation is None: annotation = Annotated[ None, - Field( - default=param.default - if param.default is not inspect.Parameter.empty - else PydanticUndefined - ), + Field(default=param.default if param.default is not inspect.Parameter.empty else PydanticUndefined), ] # Untyped field @@ -160,9 +152,7 @@ def func_metadata( field_info = FieldInfo.from_annotated_attribute( _get_typed_annotation(annotation, globalns), - param.default - if param.default is not inspect.Parameter.empty - else PydanticUndefined, + param.default if param.default is not inspect.Parameter.empty else PydanticUndefined, ) dynamic_pydantic_model_params[param.name] = (field_info.annotation, field_info) continue @@ -177,9 +167,7 @@ def func_metadata( def _get_typed_annotation(annotation: Any, globalns: dict[str, Any]) -> Any: - def try_eval_type( - value: Any, globalns: dict[str, Any], localns: dict[str, Any] - ) -> tuple[Any, bool]: + def try_eval_type(value: Any, globalns: dict[str, Any], localns: dict[str, Any]) -> tuple[Any, bool]: try: return eval_type_backport(value, globalns, localns), True except NameError: diff --git a/src/mcp/server/lowlevel/server.py b/src/mcp/server/lowlevel/server.py index dbaff3051..0a8ab7f97 100644 --- a/src/mcp/server/lowlevel/server.py +++ b/src/mcp/server/lowlevel/server.py @@ -37,7 +37,8 @@ async def handle_list_resource_templates() -> list[types.ResourceTemplate]: 3. 
Define notification handlers if needed: @server.progress_notification() async def handle_progress( - progress_token: str | int, progress: float, total: float | None + progress_token: str | int, progress: float, total: float | None, + message: str | None ) -> None: # Implementation @@ -71,11 +72,12 @@ async def main(): import warnings from collections.abc import AsyncIterator, Awaitable, Callable, Iterable from contextlib import AbstractAsyncContextManager, AsyncExitStack, asynccontextmanager -from typing import Any, Generic, TypeVar +from typing import Any, Generic import anyio from anyio.streams.memory import MemoryObjectReceiveStream, MemoryObjectSendStream from pydantic import AnyUrl +from typing_extensions import TypeVar import mcp.types as types from mcp.server.lowlevel.helper_types import ReadResourceContents @@ -84,16 +86,16 @@ async def main(): from mcp.server.stdio import stdio_server as stdio_server from mcp.shared.context import RequestContext from mcp.shared.exceptions import McpError +from mcp.shared.message import ServerMessageMetadata, SessionMessage from mcp.shared.session import RequestResponder logger = logging.getLogger(__name__) LifespanResultT = TypeVar("LifespanResultT") +RequestT = TypeVar("RequestT", default=Any) # This will be properly typed in each Server instance's context -request_ctx: contextvars.ContextVar[RequestContext[ServerSession, Any]] = ( - contextvars.ContextVar("request_ctx") -) +request_ctx: contextvars.ContextVar[RequestContext[ServerSession, Any, Any]] = contextvars.ContextVar("request_ctx") class NotificationOptions: @@ -109,7 +111,7 @@ def __init__( @asynccontextmanager -async def lifespan(server: Server[LifespanResultT]) -> AsyncIterator[object]: +async def lifespan(server: Server[LifespanResultT, RequestT]) -> AsyncIterator[object]: """Default lifespan context manager that does nothing. 
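As a hedged sketch of the two-parameter generic signature above (`Server[LifespanResultT, RequestT]`, with `RequestT` defaulting to `Any`), here is what a typed lifespan might look like; `AppState`, the database URL, and the server name are illustrative.

```python
# Hypothetical typed lifespan for the lowlevel Server generics shown above.
from collections.abc import AsyncIterator
from contextlib import asynccontextmanager
from dataclasses import dataclass
from typing import Any

from mcp.server.lowlevel import Server


@dataclass
class AppState:
    db_url: str


@asynccontextmanager
async def app_lifespan(server: Server[AppState, Any]) -> AsyncIterator[AppState]:
    # Acquire shared resources on startup; they are released when the server exits.
    yield AppState(db_url="sqlite:///:memory:")


server = Server("example-server", lifespan=app_lifespan)
# Handlers can then read server.request_context.lifespan_context.db_url
```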
Args: @@ -121,28 +123,27 @@ async def lifespan(server: Server[LifespanResultT]) -> AsyncIterator[object]: yield {} -class Server(Generic[LifespanResultT]): +class Server(Generic[LifespanResultT, RequestT]): def __init__( self, name: str, version: str | None = None, instructions: str | None = None, lifespan: Callable[ - [Server[LifespanResultT]], AbstractAsyncContextManager[LifespanResultT] + [Server[LifespanResultT, RequestT]], + AbstractAsyncContextManager[LifespanResultT], ] = lifespan, ): self.name = name self.version = version self.instructions = instructions self.lifespan = lifespan - self.request_handlers: dict[ - type, Callable[..., Awaitable[types.ServerResult]] - ] = { + self.request_handlers: dict[type, Callable[..., Awaitable[types.ServerResult]]] = { types.PingRequest: _ping_handler, } self.notification_handlers: dict[type, Callable[..., Awaitable[None]]] = {} self.notification_options = NotificationOptions() - logger.debug(f"Initializing server '{name}'") + logger.debug("Initializing server %r", name) def create_initialization_options( self, @@ -184,9 +185,7 @@ def get_capabilities( # Set prompt capabilities if handler exists if types.ListPromptsRequest in self.request_handlers: - prompts_capability = types.PromptsCapability( - listChanged=notification_options.prompts_changed - ) + prompts_capability = types.PromptsCapability(listChanged=notification_options.prompts_changed) # Set resource capabilities if handler exists if types.ListResourcesRequest in self.request_handlers: @@ -196,9 +195,7 @@ def get_capabilities( # Set tool capabilities if handler exists if types.ListToolsRequest in self.request_handlers: - tools_capability = types.ToolsCapability( - listChanged=notification_options.tools_changed - ) + tools_capability = types.ToolsCapability(listChanged=notification_options.tools_changed) # Set logging capabilities if handler exists if types.SetLevelRequest in self.request_handlers: @@ -213,7 +210,9 @@ def get_capabilities( ) @property - def request_context(self) -> RequestContext[ServerSession, LifespanResultT]: + def request_context( + self, + ) -> RequestContext[ServerSession, LifespanResultT, RequestT]: """If called outside of a request context, this will raise a LookupError.""" return request_ctx.get() @@ -232,9 +231,7 @@ async def handler(_: Any): def get_prompt(self): def decorator( - func: Callable[ - [str, dict[str, str] | None], Awaitable[types.GetPromptResult] - ], + func: Callable[[str, dict[str, str] | None], Awaitable[types.GetPromptResult]], ): logger.debug("Registering handler for GetPromptRequest") @@ -253,9 +250,7 @@ def decorator(func: Callable[[], Awaitable[list[types.Resource]]]): async def handler(_: Any): resources = await func() - return types.ServerResult( - types.ListResourcesResult(resources=resources) - ) + return types.ServerResult(types.ListResourcesResult(resources=resources)) self.request_handlers[types.ListResourcesRequest] = handler return func @@ -268,9 +263,7 @@ def decorator(func: Callable[[], Awaitable[list[types.ResourceTemplate]]]): async def handler(_: Any): templates = await func() - return types.ServerResult( - types.ListResourceTemplatesResult(resourceTemplates=templates) - ) + return types.ServerResult(types.ListResourceTemplatesResult(resourceTemplates=templates)) self.request_handlers[types.ListResourceTemplatesRequest] = handler return func @@ -279,9 +272,7 @@ async def handler(_: Any): def read_resource(self): def decorator( - func: Callable[ - [AnyUrl], Awaitable[str | bytes | Iterable[ReadResourceContents]] - ], + func: 
Callable[[AnyUrl], Awaitable[str | bytes | Iterable[ReadResourceContents]]], ): logger.debug("Registering handler for ReadResourceRequest") @@ -316,8 +307,7 @@ def create_content(data: str | bytes, mime_type: str | None): content = create_content(data, None) case Iterable() as contents: contents_list = [ - create_content(content_item.content, content_item.mime_type) - for content_item in contents + create_content(content_item.content, content_item.mime_type) for content_item in contents ] return types.ServerResult( types.ReadResourceResult( @@ -325,9 +315,7 @@ def create_content(data: str | bytes, mime_type: str | None): ) ) case _: - raise ValueError( - f"Unexpected return type from read_resource: {type(result)}" - ) + raise ValueError(f"Unexpected return type from read_resource: {type(result)}") return types.ServerResult( types.ReadResourceResult( @@ -396,11 +384,7 @@ def call_tool(self): def decorator( func: Callable[ ..., - Awaitable[ - Iterable[ - types.TextContent | types.ImageContent | types.EmbeddedResource - ] - ], + Awaitable[Iterable[types.ContentBlock]], ], ): logger.debug("Registering handler for CallToolRequest") @@ -408,9 +392,7 @@ def decorator( async def handler(req: types.CallToolRequest): try: results = await func(req.params.name, (req.params.arguments or {})) - return types.ServerResult( - types.CallToolResult(content=list(results), isError=False) - ) + return types.ServerResult(types.CallToolResult(content=list(results), isError=False)) except Exception as e: return types.ServerResult( types.CallToolResult( @@ -426,13 +408,16 @@ async def handler(req: types.CallToolRequest): def progress_notification(self): def decorator( - func: Callable[[str | int, float, float | None], Awaitable[None]], + func: Callable[[str | int, float, float | None, str | None], Awaitable[None]], ): logger.debug("Registering handler for ProgressNotification") async def handler(req: types.ProgressNotification): await func( - req.params.progressToken, req.params.progress, req.params.total + req.params.progressToken, + req.params.progress, + req.params.total, + req.params.message, ) self.notification_handlers[types.ProgressNotification] = handler @@ -446,8 +431,9 @@ def completion(self): def decorator( func: Callable[ [ - types.PromptReference | types.ResourceReference, + types.PromptReference | types.ResourceTemplateReference, types.CompletionArgument, + types.CompletionContext | None, ], Awaitable[types.Completion | None], ], @@ -455,7 +441,7 @@ def decorator( logger.debug("Registering handler for CompleteRequest") async def handler(req: types.CompleteRequest): - completion = await func(req.params.ref, req.params.argument) + completion = await func(req.params.ref, req.params.argument, req.params.context) return types.ServerResult( types.CompleteResult( completion=completion @@ -471,24 +457,34 @@ async def handler(req: types.CompleteRequest): async def run( self, - read_stream: MemoryObjectReceiveStream[types.JSONRPCMessage | Exception], - write_stream: MemoryObjectSendStream[types.JSONRPCMessage], + read_stream: MemoryObjectReceiveStream[SessionMessage | Exception], + write_stream: MemoryObjectSendStream[SessionMessage], initialization_options: InitializationOptions, # When False, exceptions are returned as messages to the client. # When True, exceptions are raised, which will cause the server to shut down # but also make tracing exceptions much easier during testing and when using # in-process servers. 
raise_exceptions: bool = False, + # When True, the server is stateless and + # clients can perform initialization with any node. The client must still follow + # the initialization lifecycle, but can do so with any available node + # rather than requiring initialization for each connection. + stateless: bool = False, ): async with AsyncExitStack() as stack: lifespan_context = await stack.enter_async_context(self.lifespan(self)) session = await stack.enter_async_context( - ServerSession(read_stream, write_stream, initialization_options) + ServerSession( + read_stream, + write_stream, + initialization_options, + stateless=stateless, + ) ) async with anyio.create_task_group() as tg: async for message in session.incoming_messages: - logger.debug(f"Received message: {message}") + logger.debug("Received message: %s", message) tg.start_soon( self._handle_message, @@ -500,9 +496,7 @@ async def run( async def _handle_message( self, - message: RequestResponder[types.ClientRequest, types.ServerResult] - | types.ClientNotification - | Exception, + message: RequestResponder[types.ClientRequest, types.ServerResult] | types.ClientNotification | Exception, session: ServerSession, lifespan_context: LifespanResultT, raise_exceptions: bool = False, @@ -510,18 +504,14 @@ async def _handle_message( with warnings.catch_warnings(record=True) as w: # TODO(Marcelo): We should be checking if message is Exception here. match message: # type: ignore[reportMatchNotExhaustive] - case ( - RequestResponder(request=types.ClientRequest(root=req)) as responder - ): + case RequestResponder(request=types.ClientRequest(root=req)) as responder: with responder: - await self._handle_request( - message, req, session, lifespan_context, raise_exceptions - ) + await self._handle_request(message, req, session, lifespan_context, raise_exceptions) case types.ClientNotification(root=notify): await self._handle_notification(notify) for warning in w: - logger.info(f"Warning: {warning.category.__name__}: {warning.message}") + logger.info("Warning: %s: %s", warning.category.__name__, warning.message) async def _handle_request( self, @@ -531,13 +521,17 @@ async def _handle_request( lifespan_context: LifespanResultT, raise_exceptions: bool, ): - logger.info(f"Processing request of type {type(req).__name__}") - if type(req) in self.request_handlers: - handler = self.request_handlers[type(req)] - logger.debug(f"Dispatching request of type {type(req).__name__}") + logger.info("Processing request of type %s", type(req).__name__) + if handler := self.request_handlers.get(type(req)): # type: ignore + logger.debug("Dispatching request of type %s", type(req).__name__) token = None try: + # Extract request context from message metadata + request_data = None + if message.message_metadata is not None and isinstance(message.message_metadata, ServerMessageMetadata): + request_data = message.message_metadata.request_context + # Set our global state that can be retrieved via # app.get_request_context() token = request_ctx.set( @@ -546,6 +540,7 @@ async def _handle_request( message.request_meta, session, lifespan_context, + request=request_data, ) ) response = await handler(req) @@ -572,18 +567,13 @@ async def _handle_request( logger.debug("Response sent") async def _handle_notification(self, notify: Any): - if type(notify) in self.notification_handlers: - assert type(notify) in self.notification_handlers - - handler = self.notification_handlers[type(notify)] - logger.debug( - f"Dispatching notification of type " f"{type(notify).__name__}" - ) + if handler := 
self.notification_handlers.get(type(notify)): # type: ignore + logger.debug("Dispatching notification of type %s", type(notify).__name__) try: await handler(notify) - except Exception as err: - logger.error(f"Uncaught exception in notification handler: " f"{err}") + except Exception: + logger.exception("Uncaught exception in notification handler") async def _ping_handler(request: types.PingRequest) -> types.ServerResult: diff --git a/src/mcp/server/session.py b/src/mcp/server/session.py index 568ecd4b9..5c696b136 100644 --- a/src/mcp/server/session.py +++ b/src/mcp/server/session.py @@ -47,10 +47,12 @@ async def handle_list_prompts(ctx: RequestContext) -> list[types.Prompt]: import mcp.types as types from mcp.server.models import InitializationOptions +from mcp.shared.message import ServerMessageMetadata, SessionMessage from mcp.shared.session import ( BaseSession, RequestResponder, ) +from mcp.shared.version import SUPPORTED_PROTOCOL_VERSIONS class InitializationState(Enum): @@ -62,9 +64,7 @@ class InitializationState(Enum): ServerSessionT = TypeVar("ServerSessionT", bound="ServerSession") ServerRequestResponder = ( - RequestResponder[types.ClientRequest, types.ServerResult] - | types.ClientNotification - | Exception + RequestResponder[types.ClientRequest, types.ServerResult] | types.ClientNotification | Exception ) @@ -82,24 +82,21 @@ class ServerSession( def __init__( self, - read_stream: MemoryObjectReceiveStream[types.JSONRPCMessage | Exception], - write_stream: MemoryObjectSendStream[types.JSONRPCMessage], + read_stream: MemoryObjectReceiveStream[SessionMessage | Exception], + write_stream: MemoryObjectSendStream[SessionMessage], init_options: InitializationOptions, + stateless: bool = False, ) -> None: - super().__init__( - read_stream, write_stream, types.ClientRequest, types.ClientNotification + super().__init__(read_stream, write_stream, types.ClientRequest, types.ClientNotification) + self._initialization_state = ( + InitializationState.Initialized if stateless else InitializationState.NotInitialized ) - self._initialization_state = InitializationState.NotInitialized + self._init_options = init_options - self._incoming_message_stream_writer, self._incoming_message_stream_reader = ( - anyio.create_memory_object_stream[ServerRequestResponder](0) - ) - self._exit_stack.push_async_callback( - lambda: self._incoming_message_stream_reader.aclose() - ) - self._exit_stack.push_async_callback( - lambda: self._incoming_message_stream_writer.aclose() - ) + self._incoming_message_stream_writer, self._incoming_message_stream_reader = anyio.create_memory_object_stream[ + ServerRequestResponder + ](0) + self._exit_stack.push_async_callback(lambda: self._incoming_message_stream_reader.aclose()) @property def client_params(self) -> types.InitializeRequestParams | None: @@ -124,31 +121,37 @@ def check_client_capability(self, capability: types.ClientCapabilities) -> bool: if client_caps.sampling is None: return False + if capability.elicitation is not None: + if client_caps.elicitation is None: + return False + if capability.experimental is not None: if client_caps.experimental is None: return False # Check each experimental capability for exp_key, exp_value in capability.experimental.items(): - if ( - exp_key not in client_caps.experimental - or client_caps.experimental[exp_key] != exp_value - ): + if exp_key not in client_caps.experimental or client_caps.experimental[exp_key] != exp_value: return False return True - async def _received_request( - self, responder: 
RequestResponder[types.ClientRequest, types.ServerResult] - ): + async def _receive_loop(self) -> None: + async with self._incoming_message_stream_writer: + await super()._receive_loop() + + async def _received_request(self, responder: RequestResponder[types.ClientRequest, types.ServerResult]): match responder.request.root: case types.InitializeRequest(params=params): + requested_version = params.protocolVersion self._initialization_state = InitializationState.Initializing self._client_params = params with responder: await responder.respond( types.ServerResult( types.InitializeResult( - protocolVersion=types.LATEST_PROTOCOL_VERSION, + protocolVersion=requested_version + if requested_version in SUPPORTED_PROTOCOL_VERSIONS + else types.LATEST_PROTOCOL_VERSION, capabilities=self._init_options.capabilities, serverInfo=types.Implementation( name=self._init_options.server_name, @@ -160,13 +163,9 @@ async def _received_request( ) case _: if self._initialization_state != InitializationState.Initialized: - raise RuntimeError( - "Received request before initialization was complete" - ) + raise RuntimeError("Received request before initialization was complete") - async def _received_notification( - self, notification: types.ClientNotification - ) -> None: + async def _received_notification(self, notification: types.ClientNotification) -> None: # Need this to avoid ASYNC910 await anyio.lowlevel.checkpoint() match notification.root: @@ -174,12 +173,14 @@ async def _received_notification( self._initialization_state = InitializationState.Initialized case _: if self._initialization_state != InitializationState.Initialized: - raise RuntimeError( - "Received notification before initialization was complete" - ) + raise RuntimeError("Received notification before initialization was complete") async def send_log_message( - self, level: types.LoggingLevel, data: Any, logger: str | None = None + self, + level: types.LoggingLevel, + data: Any, + logger: str | None = None, + related_request_id: types.RequestId | None = None, ) -> None: """Send a log message notification.""" await self.send_notification( @@ -192,7 +193,8 @@ async def send_log_message( logger=logger, ), ) - ) + ), + related_request_id, ) async def send_resource_updated(self, uri: AnyUrl) -> None: @@ -217,10 +219,11 @@ async def create_message( stop_sequences: list[str] | None = None, metadata: dict[str, Any] | None = None, model_preferences: types.ModelPreferences | None = None, + related_request_id: types.RequestId | None = None, ) -> types.CreateMessageResult: """Send a sampling/create_message request.""" return await self.send_request( - types.ServerRequest( + request=types.ServerRequest( types.CreateMessageRequest( method="sampling/createMessage", params=types.CreateMessageRequestParams( @@ -235,7 +238,10 @@ async def create_message( ), ) ), - types.CreateMessageResult, + result_type=types.CreateMessageResult, + metadata=ServerMessageMetadata( + related_request_id=related_request_id, + ), ) async def list_roots(self) -> types.ListRootsResult: @@ -249,6 +255,35 @@ async def list_roots(self) -> types.ListRootsResult: types.ListRootsResult, ) + async def elicit( + self, + message: str, + requestedSchema: types.ElicitRequestedSchema, + related_request_id: types.RequestId | None = None, + ) -> types.ElicitResult: + """Send an elicitation/create request. 
+ + Args: + message: The message to present to the user + requestedSchema: Schema defining the expected response structure + + Returns: + The client's response + """ + return await self.send_request( + types.ServerRequest( + types.ElicitRequest( + method="elicitation/create", + params=types.ElicitRequestParams( + message=message, + requestedSchema=requestedSchema, + ), + ) + ), + types.ElicitResult, + metadata=ServerMessageMetadata(related_request_id=related_request_id), + ) + async def send_ping(self) -> types.EmptyResult: """Send a ping request.""" return await self.send_request( @@ -261,7 +296,12 @@ async def send_ping(self) -> types.EmptyResult: ) async def send_progress_notification( - self, progress_token: str | int, progress: float, total: float | None = None + self, + progress_token: str | int, + progress: float, + total: float | None = None, + message: str | None = None, + related_request_id: str | None = None, ) -> None: """Send a progress notification.""" await self.send_notification( @@ -272,9 +312,11 @@ async def send_progress_notification( progressToken=progress_token, progress=progress, total=total, + message=message, ), ) - ) + ), + related_request_id, ) async def send_resource_list_changed(self) -> None: diff --git a/src/mcp/server/sse.py b/src/mcp/server/sse.py index d051c25bf..41145e49f 100644 --- a/src/mcp/server/sse.py +++ b/src/mcp/server/sse.py @@ -10,7 +10,7 @@ # Create Starlette routes for SSE and message handling routes = [ - Route("/sse", endpoint=handle_sse), + Route("/sse", endpoint=handle_sse, methods=["GET"]), Mount("/messages/", app=sse.handle_post_message), ] @@ -22,12 +22,18 @@ async def handle_sse(request): await app.run( streams[0], streams[1], app.create_initialization_options() ) + # Return empty response to avoid NoneType error + return Response() # Create and run Starlette app starlette_app = Starlette(routes=routes) - uvicorn.run(starlette_app, host="0.0.0.0", port=port) + uvicorn.run(starlette_app, host="127.0.0.1", port=port) ``` +Note: The handle_sse function must return a Response to avoid a "TypeError: 'NoneType' +object is not callable" error when client disconnects. The example above returns +an empty Response() after the SSE connection ends to fix this. + See SseServerTransport class documentation for more details. """ @@ -46,6 +52,11 @@ async def handle_sse(request): from starlette.types import Receive, Scope, Send import mcp.types as types +from mcp.server.transport_security import ( + TransportSecurityMiddleware, + TransportSecuritySettings, +) +from mcp.shared.message import ServerMessageMetadata, SessionMessage logger = logging.getLogger(__name__) @@ -63,19 +74,23 @@ class SseServerTransport: """ _endpoint: str - _read_stream_writers: dict[ - UUID, MemoryObjectSendStream[types.JSONRPCMessage | Exception] - ] + _read_stream_writers: dict[UUID, MemoryObjectSendStream[SessionMessage | Exception]] + _security: TransportSecurityMiddleware - def __init__(self, endpoint: str) -> None: + def __init__(self, endpoint: str, security_settings: TransportSecuritySettings | None = None) -> None: """ Creates a new SSE server transport, which will direct the client to POST messages to the relative or absolute URL given. + + Args: + endpoint: The relative or absolute URL for POST messages. + security_settings: Optional security settings for DNS rebinding protection. 
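A hedged sketch of wiring the new DNS rebinding protection into the SSE transport. The diff only shows that `security_settings` is accepted and forwarded to `TransportSecurityMiddleware`; the `allowed_hosts`/`allowed_origins` field names and values below are assumptions for illustration.

```python
# Assumed configuration sketch: enabling DNS rebinding protection on the SSE transport.
from mcp.server.sse import SseServerTransport
from mcp.server.transport_security import TransportSecuritySettings

security = TransportSecuritySettings(
    allowed_hosts=["example.com", "example.com:443"],  # assumed field name
    allowed_origins=["https://example.com"],           # assumed field name
)
sse = SseServerTransport("/messages/", security_settings=security)
```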
""" super().__init__() self._endpoint = endpoint self._read_stream_writers = {} + self._security = TransportSecurityMiddleware(security_settings) logger.debug(f"SseServerTransport initialized with endpoint: {endpoint}") @asynccontextmanager @@ -84,58 +99,89 @@ async def connect_sse(self, scope: Scope, receive: Receive, send: Send): logger.error("connect_sse received non-HTTP request") raise ValueError("connect_sse can only handle HTTP requests") + # Validate request headers for DNS rebinding protection + request = Request(scope, receive) + error_response = await self._security.validate_request(request, is_post=False) + if error_response: + await error_response(scope, receive, send) + raise ValueError("Request validation failed") + logger.debug("Setting up SSE connection") - read_stream: MemoryObjectReceiveStream[types.JSONRPCMessage | Exception] - read_stream_writer: MemoryObjectSendStream[types.JSONRPCMessage | Exception] + read_stream: MemoryObjectReceiveStream[SessionMessage | Exception] + read_stream_writer: MemoryObjectSendStream[SessionMessage | Exception] - write_stream: MemoryObjectSendStream[types.JSONRPCMessage] - write_stream_reader: MemoryObjectReceiveStream[types.JSONRPCMessage] + write_stream: MemoryObjectSendStream[SessionMessage] + write_stream_reader: MemoryObjectReceiveStream[SessionMessage] read_stream_writer, read_stream = anyio.create_memory_object_stream(0) write_stream, write_stream_reader = anyio.create_memory_object_stream(0) session_id = uuid4() - session_uri = f"{quote(self._endpoint)}?session_id={session_id.hex}" self._read_stream_writers[session_id] = read_stream_writer logger.debug(f"Created new session with ID: {session_id}") - sse_stream_writer, sse_stream_reader = anyio.create_memory_object_stream[ - dict[str, Any] - ](0) + # Determine the full path for the message endpoint to be sent to the client. + # scope['root_path'] is the prefix where the current Starlette app + # instance is mounted. + # e.g., "" if top-level, or "/api_prefix" if mounted under "/api_prefix". + root_path = scope.get("root_path", "") + + # self._endpoint is the path *within* this app, e.g., "/messages". + # Concatenating them gives the full absolute path from the server root. + # e.g., "" + "/messages" -> "/messages" + # e.g., "/api_prefix" + "/messages" -> "/api_prefix/messages" + full_message_path_for_client = root_path.rstrip("/") + self._endpoint + + # This is the URI (path + query) the client will use to POST messages. 
+ client_post_uri_data = f"{quote(full_message_path_for_client)}?session_id={session_id.hex}" + + sse_stream_writer, sse_stream_reader = anyio.create_memory_object_stream[dict[str, Any]](0) async def sse_writer(): logger.debug("Starting SSE writer") async with sse_stream_writer, write_stream_reader: - await sse_stream_writer.send({"event": "endpoint", "data": session_uri}) - logger.debug(f"Sent endpoint event: {session_uri}") + await sse_stream_writer.send({"event": "endpoint", "data": client_post_uri_data}) + logger.debug(f"Sent endpoint event: {client_post_uri_data}") - async for message in write_stream_reader: - logger.debug(f"Sending message via SSE: {message}") + async for session_message in write_stream_reader: + logger.debug(f"Sending message via SSE: {session_message}") await sse_stream_writer.send( { "event": "message", - "data": message.model_dump_json( - by_alias=True, exclude_none=True - ), + "data": session_message.message.model_dump_json(by_alias=True, exclude_none=True), } ) async with anyio.create_task_group() as tg: - response = EventSourceResponse( - content=sse_stream_reader, data_sender_callable=sse_writer - ) + + async def response_wrapper(scope: Scope, receive: Receive, send: Send): + """ + The EventSourceResponse returning signals a client close / disconnect. + In this case we close our side of the streams to signal the client that + the connection has been closed. + """ + await EventSourceResponse(content=sse_stream_reader, data_sender_callable=sse_writer)( + scope, receive, send + ) + await read_stream_writer.aclose() + await write_stream_reader.aclose() + logging.debug(f"Client session disconnected {session_id}") + logger.debug("Starting SSE response task") - tg.start_soon(response, scope, receive, send) + tg.start_soon(response_wrapper, scope, receive, send) logger.debug("Yielding read and write streams") yield (read_stream, write_stream) - async def handle_post_message( - self, scope: Scope, receive: Receive, send: Send - ) -> None: + async def handle_post_message(self, scope: Scope, receive: Receive, send: Send) -> None: logger.debug("Handling POST message") request = Request(scope, receive) + # Validate request headers for DNS rebinding protection + error_response = await self._security.validate_request(request, is_post=True) + if error_response: + return await error_response(scope, receive, send) + session_id_param = request.query_params.get("session_id") if session_id_param is None: logger.warning("Received request without session_id") @@ -169,7 +215,10 @@ async def handle_post_message( await writer.send(err) return - logger.debug(f"Sending message to writer: {message}") + # Pass the ASGI scope for framework-agnostic access to request data + metadata = ServerMessageMetadata(request_context=request) + session_message = SessionMessage(message, metadata=metadata) + logger.debug(f"Sending session message to writer: {session_message}") response = Response("Accepted", status_code=202) await response(scope, receive, send) - await writer.send(message) + await writer.send(session_message) diff --git a/src/mcp/server/stdio.py b/src/mcp/server/stdio.py index 0e0e49129..d1618a371 100644 --- a/src/mcp/server/stdio.py +++ b/src/mcp/server/stdio.py @@ -27,6 +27,7 @@ async def run_server(): from anyio.streams.memory import MemoryObjectReceiveStream, MemoryObjectSendStream import mcp.types as types +from mcp.shared.message import SessionMessage @asynccontextmanager @@ -47,11 +48,11 @@ async def stdio_server( if not stdout: stdout = 
anyio.wrap_file(TextIOWrapper(sys.stdout.buffer, encoding="utf-8")) - read_stream: MemoryObjectReceiveStream[types.JSONRPCMessage | Exception] - read_stream_writer: MemoryObjectSendStream[types.JSONRPCMessage | Exception] + read_stream: MemoryObjectReceiveStream[SessionMessage | Exception] + read_stream_writer: MemoryObjectSendStream[SessionMessage | Exception] - write_stream: MemoryObjectSendStream[types.JSONRPCMessage] - write_stream_reader: MemoryObjectReceiveStream[types.JSONRPCMessage] + write_stream: MemoryObjectSendStream[SessionMessage] + write_stream_reader: MemoryObjectReceiveStream[SessionMessage] read_stream_writer, read_stream = anyio.create_memory_object_stream(0) write_stream, write_stream_reader = anyio.create_memory_object_stream(0) @@ -66,15 +67,16 @@ async def stdin_reader(): await read_stream_writer.send(exc) continue - await read_stream_writer.send(message) + session_message = SessionMessage(message) + await read_stream_writer.send(session_message) except anyio.ClosedResourceError: await anyio.lowlevel.checkpoint() async def stdout_writer(): try: async with write_stream_reader: - async for message in write_stream_reader: - json = message.model_dump_json(by_alias=True, exclude_none=True) + async for session_message in write_stream_reader: + json = session_message.message.model_dump_json(by_alias=True, exclude_none=True) await stdout.write(json + "\n") await stdout.flush() except anyio.ClosedResourceError: diff --git a/src/mcp/server/streamable_http.py b/src/mcp/server/streamable_http.py new file mode 100644 index 000000000..d46549929 --- /dev/null +++ b/src/mcp/server/streamable_http.py @@ -0,0 +1,911 @@ +""" +StreamableHTTP Server Transport Module + +This module implements an HTTP transport layer with Streamable HTTP. + +The transport handles bidirectional communication using HTTP requests and +responses, with streaming support for long-running operations. 
+""" + +import json +import logging +import re +from abc import ABC, abstractmethod +from collections.abc import AsyncGenerator, Awaitable, Callable +from contextlib import asynccontextmanager +from dataclasses import dataclass +from http import HTTPStatus + +import anyio +from anyio.streams.memory import MemoryObjectReceiveStream, MemoryObjectSendStream +from pydantic import ValidationError +from sse_starlette import EventSourceResponse +from starlette.requests import Request +from starlette.responses import Response +from starlette.types import Receive, Scope, Send + +from mcp.server.transport_security import ( + TransportSecurityMiddleware, + TransportSecuritySettings, +) +from mcp.shared.message import ServerMessageMetadata, SessionMessage +from mcp.shared.version import SUPPORTED_PROTOCOL_VERSIONS +from mcp.types import ( + DEFAULT_NEGOTIATED_VERSION, + INTERNAL_ERROR, + INVALID_PARAMS, + INVALID_REQUEST, + PARSE_ERROR, + ErrorData, + JSONRPCError, + JSONRPCMessage, + JSONRPCRequest, + JSONRPCResponse, + RequestId, +) + +logger = logging.getLogger(__name__) + +# Maximum size for incoming messages +MAXIMUM_MESSAGE_SIZE = 4 * 1024 * 1024 # 4MB + +# Header names +MCP_SESSION_ID_HEADER = "mcp-session-id" +MCP_PROTOCOL_VERSION_HEADER = "mcp-protocol-version" +LAST_EVENT_ID_HEADER = "last-event-id" + +# Content types +CONTENT_TYPE_JSON = "application/json" +CONTENT_TYPE_SSE = "text/event-stream" + +# Special key for the standalone GET stream +GET_STREAM_KEY = "_GET_stream" + +# Session ID validation pattern (visible ASCII characters ranging from 0x21 to 0x7E) +# Pattern ensures entire string contains only valid characters by using ^ and $ anchors +SESSION_ID_PATTERN = re.compile(r"^[\x21-\x7E]+$") + +# Type aliases +StreamId = str +EventId = str + + +@dataclass +class EventMessage: + """ + A JSONRPCMessage with an optional event ID for stream resumability. + """ + + message: JSONRPCMessage + event_id: str | None = None + + +EventCallback = Callable[[EventMessage], Awaitable[None]] + + +class EventStore(ABC): + """ + Interface for resumability support via event storage. + """ + + @abstractmethod + async def store_event(self, stream_id: StreamId, message: JSONRPCMessage) -> EventId: + """ + Stores an event for later retrieval. + + Args: + stream_id: ID of the stream the event belongs to + message: The JSON-RPC message to store + + Returns: + The generated event ID for the stored event + """ + pass + + @abstractmethod + async def replay_events_after( + self, + last_event_id: EventId, + send_callback: EventCallback, + ) -> StreamId | None: + """ + Replays events that occurred after the specified event ID. + + Args: + last_event_id: The ID of the last event the client received + send_callback: A callback function to send events to the client + + Returns: + The stream ID of the replayed events + """ + pass + + +class StreamableHTTPServerTransport: + """ + HTTP server transport with event streaming support for MCP. + + Handles JSON-RPC messages in HTTP POST requests with SSE streaming. + Supports optional JSON responses and session management. 
+ """ + + # Server notification streams for POST requests as well as standalone SSE stream + _read_stream_writer: MemoryObjectSendStream[SessionMessage | Exception] | None = None + _read_stream: MemoryObjectReceiveStream[SessionMessage | Exception] | None = None + _write_stream: MemoryObjectSendStream[SessionMessage] | None = None + _write_stream_reader: MemoryObjectReceiveStream[SessionMessage] | None = None + _security: TransportSecurityMiddleware + + def __init__( + self, + mcp_session_id: str | None, + is_json_response_enabled: bool = False, + event_store: EventStore | None = None, + security_settings: TransportSecuritySettings | None = None, + ) -> None: + """ + Initialize a new StreamableHTTP server transport. + + Args: + mcp_session_id: Optional session identifier for this connection. + Must contain only visible ASCII characters (0x21-0x7E). + is_json_response_enabled: If True, return JSON responses for requests + instead of SSE streams. Default is False. + event_store: Event store for resumability support. If provided, + resumability will be enabled, allowing clients to + reconnect and resume messages. + security_settings: Optional security settings for DNS rebinding protection. + + Raises: + ValueError: If the session ID contains invalid characters. + """ + if mcp_session_id is not None and not SESSION_ID_PATTERN.fullmatch(mcp_session_id): + raise ValueError("Session ID must only contain visible ASCII characters (0x21-0x7E)") + + self.mcp_session_id = mcp_session_id + self.is_json_response_enabled = is_json_response_enabled + self._event_store = event_store + self._security = TransportSecurityMiddleware(security_settings) + self._request_streams: dict[ + RequestId, + tuple[ + MemoryObjectSendStream[EventMessage], + MemoryObjectReceiveStream[EventMessage], + ], + ] = {} + self._terminated = False + + def _create_error_response( + self, + error_message: str, + status_code: HTTPStatus, + error_code: int = INVALID_REQUEST, + headers: dict[str, str] | None = None, + ) -> Response: + """Create an error response with a simple string message.""" + response_headers = {"Content-Type": CONTENT_TYPE_JSON} + if headers: + response_headers.update(headers) + + if self.mcp_session_id: + response_headers[MCP_SESSION_ID_HEADER] = self.mcp_session_id + + # Return a properly formatted JSON error response + error_response = JSONRPCError( + jsonrpc="2.0", + id="server-error", # We don't have a request ID for general errors + error=ErrorData( + code=error_code, + message=error_message, + ), + ) + + return Response( + error_response.model_dump_json(by_alias=True, exclude_none=True), + status_code=status_code, + headers=response_headers, + ) + + def _create_json_response( + self, + response_message: JSONRPCMessage | None, + status_code: HTTPStatus = HTTPStatus.OK, + headers: dict[str, str] | None = None, + ) -> Response: + """Create a JSON response from a JSONRPCMessage""" + response_headers = {"Content-Type": CONTENT_TYPE_JSON} + if headers: + response_headers.update(headers) + + if self.mcp_session_id: + response_headers[MCP_SESSION_ID_HEADER] = self.mcp_session_id + + return Response( + response_message.model_dump_json(by_alias=True, exclude_none=True) if response_message else None, + status_code=status_code, + headers=response_headers, + ) + + def _get_session_id(self, request: Request) -> str | None: + """Extract the session ID from request headers.""" + return request.headers.get(MCP_SESSION_ID_HEADER) + + def _create_event_data(self, event_message: EventMessage) -> dict[str, str]: + """Create event 
data dictionary from an EventMessage.""" + event_data = { + "event": "message", + "data": event_message.message.model_dump_json(by_alias=True, exclude_none=True), + } + + # If an event ID was provided, include it + if event_message.event_id: + event_data["id"] = event_message.event_id + + return event_data + + async def _clean_up_memory_streams(self, request_id: RequestId) -> None: + """Clean up memory streams for a given request ID.""" + if request_id in self._request_streams: + try: + # Close the request stream + await self._request_streams[request_id][0].aclose() + await self._request_streams[request_id][1].aclose() + except Exception as e: + logger.debug(f"Error closing memory streams: {e}") + finally: + # Remove the request stream from the mapping + self._request_streams.pop(request_id, None) + + async def handle_request(self, scope: Scope, receive: Receive, send: Send) -> None: + """Application entry point that handles all HTTP requests""" + request = Request(scope, receive) + + # Validate request headers for DNS rebinding protection + is_post = request.method == "POST" + error_response = await self._security.validate_request(request, is_post=is_post) + if error_response: + await error_response(scope, receive, send) + return + + if self._terminated: + # If the session has been terminated, return 404 Not Found + response = self._create_error_response( + "Not Found: Session has been terminated", + HTTPStatus.NOT_FOUND, + ) + await response(scope, receive, send) + return + + if request.method == "POST": + await self._handle_post_request(scope, request, receive, send) + elif request.method == "GET": + await self._handle_get_request(request, send) + elif request.method == "DELETE": + await self._handle_delete_request(request, send) + else: + await self._handle_unsupported_request(request, send) + + def _check_accept_headers(self, request: Request) -> tuple[bool, bool]: + """Check if the request accepts the required media types.""" + accept_header = request.headers.get("accept", "") + accept_types = [media_type.strip() for media_type in accept_header.split(",")] + + has_json = any(media_type.startswith(CONTENT_TYPE_JSON) for media_type in accept_types) + has_sse = any(media_type.startswith(CONTENT_TYPE_SSE) for media_type in accept_types) + + return has_json, has_sse + + def _check_content_type(self, request: Request) -> bool: + """Check if the request has the correct Content-Type.""" + content_type = request.headers.get("content-type", "") + content_type_parts = [part.strip() for part in content_type.split(";")[0].split(",")] + + return any(part == CONTENT_TYPE_JSON for part in content_type_parts) + + async def _handle_post_request(self, scope: Scope, request: Request, receive: Receive, send: Send) -> None: + """Handle POST requests containing JSON-RPC messages.""" + writer = self._read_stream_writer + if writer is None: + raise ValueError("No read stream writer available. 
Ensure connect() is called first.") + try: + # Check Accept headers + has_json, has_sse = self._check_accept_headers(request) + if not (has_json and has_sse): + response = self._create_error_response( + ("Not Acceptable: Client must accept both application/json and text/event-stream"), + HTTPStatus.NOT_ACCEPTABLE, + ) + await response(scope, receive, send) + return + + # Validate Content-Type + if not self._check_content_type(request): + response = self._create_error_response( + "Unsupported Media Type: Content-Type must be application/json", + HTTPStatus.UNSUPPORTED_MEDIA_TYPE, + ) + await response(scope, receive, send) + return + + # Parse the body - only read it once + body = await request.body() + if len(body) > MAXIMUM_MESSAGE_SIZE: + response = self._create_error_response( + "Payload Too Large: Message exceeds maximum size", + HTTPStatus.REQUEST_ENTITY_TOO_LARGE, + ) + await response(scope, receive, send) + return + + try: + raw_message = json.loads(body) + except json.JSONDecodeError as e: + response = self._create_error_response(f"Parse error: {str(e)}", HTTPStatus.BAD_REQUEST, PARSE_ERROR) + await response(scope, receive, send) + return + + try: + message = JSONRPCMessage.model_validate(raw_message) + except ValidationError as e: + response = self._create_error_response( + f"Validation error: {str(e)}", + HTTPStatus.BAD_REQUEST, + INVALID_PARAMS, + ) + await response(scope, receive, send) + return + + # Check if this is an initialization request + is_initialization_request = isinstance(message.root, JSONRPCRequest) and message.root.method == "initialize" + + if is_initialization_request: + # Check if the server already has an established session + if self.mcp_session_id: + # Check if request has a session ID + request_session_id = self._get_session_id(request) + + # If request has a session ID but doesn't match, return 404 + if request_session_id and request_session_id != self.mcp_session_id: + response = self._create_error_response( + "Not Found: Invalid or expired session ID", + HTTPStatus.NOT_FOUND, + ) + await response(scope, receive, send) + return + elif not await self._validate_request_headers(request, send): + return + + # For notifications and responses only, return 202 Accepted + if not isinstance(message.root, JSONRPCRequest): + # Create response object and send it + response = self._create_json_response( + None, + HTTPStatus.ACCEPTED, + ) + await response(scope, receive, send) + + # Process the message after sending the response + metadata = ServerMessageMetadata(request_context=request) + session_message = SessionMessage(message, metadata=metadata) + await writer.send(session_message) + + return + + # Extract the request ID outside the try block for proper scope + request_id = str(message.root.id) + # Register this stream for the request ID + self._request_streams[request_id] = anyio.create_memory_object_stream[EventMessage](0) + request_stream_reader = self._request_streams[request_id][1] + + if self.is_json_response_enabled: + # Process the message + metadata = ServerMessageMetadata(request_context=request) + session_message = SessionMessage(message, metadata=metadata) + await writer.send(session_message) + try: + # Process messages from the request-specific stream + # We need to collect all messages until we get a response + response_message = None + + # Use similar approach to SSE writer for consistency + async for event_message in request_stream_reader: + # If it's a response, this is what we're waiting for + if isinstance(event_message.message.root, 
JSONRPCResponse | JSONRPCError): + response_message = event_message.message + break + # For notifications and request, keep waiting + else: + logger.debug(f"received: {event_message.message.root.method}") + + # At this point we should have a response + if response_message: + # Create JSON response + response = self._create_json_response(response_message) + await response(scope, receive, send) + else: + # This shouldn't happen in normal operation + logger.error("No response message received before stream closed") + response = self._create_error_response( + "Error processing request: No response received", + HTTPStatus.INTERNAL_SERVER_ERROR, + ) + await response(scope, receive, send) + except Exception as e: + logger.exception(f"Error processing JSON response: {e}") + response = self._create_error_response( + f"Error processing request: {str(e)}", + HTTPStatus.INTERNAL_SERVER_ERROR, + INTERNAL_ERROR, + ) + await response(scope, receive, send) + finally: + await self._clean_up_memory_streams(request_id) + else: + # Create SSE stream + sse_stream_writer, sse_stream_reader = anyio.create_memory_object_stream[dict[str, str]](0) + + async def sse_writer(): + # Get the request ID from the incoming request message + try: + async with sse_stream_writer, request_stream_reader: + # Process messages from the request-specific stream + async for event_message in request_stream_reader: + # Build the event data + event_data = self._create_event_data(event_message) + await sse_stream_writer.send(event_data) + + # If response, remove from pending streams and close + if isinstance( + event_message.message.root, + JSONRPCResponse | JSONRPCError, + ): + break + except Exception as e: + logger.exception(f"Error in SSE writer: {e}") + finally: + logger.debug("Closing SSE writer") + await self._clean_up_memory_streams(request_id) + + # Create and start EventSourceResponse + # SSE stream mode (original behavior) + # Set up headers + headers = { + "Cache-Control": "no-cache, no-transform", + "Connection": "keep-alive", + "Content-Type": CONTENT_TYPE_SSE, + **({MCP_SESSION_ID_HEADER: self.mcp_session_id} if self.mcp_session_id else {}), + } + response = EventSourceResponse( + content=sse_stream_reader, + data_sender_callable=sse_writer, + headers=headers, + ) + + # Start the SSE response (this will send headers immediately) + try: + # First send the response to establish the SSE connection + async with anyio.create_task_group() as tg: + tg.start_soon(response, scope, receive, send) + # Then send the message to be processed by the server + metadata = ServerMessageMetadata(request_context=request) + session_message = SessionMessage(message, metadata=metadata) + await writer.send(session_message) + except Exception: + logger.exception("SSE response error") + await sse_stream_writer.aclose() + await sse_stream_reader.aclose() + await self._clean_up_memory_streams(request_id) + + except Exception as err: + logger.exception("Error handling POST request") + response = self._create_error_response( + f"Error handling POST request: {err}", + HTTPStatus.INTERNAL_SERVER_ERROR, + INTERNAL_ERROR, + ) + await response(scope, receive, send) + if writer: + await writer.send(Exception(err)) + return + + async def _handle_get_request(self, request: Request, send: Send) -> None: + """ + Handle GET request to establish SSE. + + This allows the server to communicate to the client without the client + first sending data via HTTP POST. The server can send JSON-RPC requests + and notifications on this stream. 
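A hedged sketch of what a raw client POST against the handler above must look like: both Accept types and a JSON Content-Type are required. The `/mcp` path, session ID, and protocol version header value are illustrative, not taken from this diff.

```python
# Assumed client-side request shape for the POST handling above.
import httpx

headers = {
    "Accept": "application/json, text/event-stream",
    "Content-Type": "application/json",
    "mcp-session-id": "abc123",            # omit on the initial initialize request
    "mcp-protocol-version": "2025-03-26",  # assumed version string for illustration
}
body = {"jsonrpc": "2.0", "id": 1, "method": "ping"}
response = httpx.post("http://127.0.0.1:8000/mcp", json=body, headers=headers)
print(response.status_code, response.headers.get("content-type"))
```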
+ """ + writer = self._read_stream_writer + if writer is None: + raise ValueError("No read stream writer available. Ensure connect() is called first.") + + # Validate Accept header - must include text/event-stream + _, has_sse = self._check_accept_headers(request) + + if not has_sse: + response = self._create_error_response( + "Not Acceptable: Client must accept text/event-stream", + HTTPStatus.NOT_ACCEPTABLE, + ) + await response(request.scope, request.receive, send) + return + + if not await self._validate_request_headers(request, send): + return + + # Handle resumability: check for Last-Event-ID header + if last_event_id := request.headers.get(LAST_EVENT_ID_HEADER): + await self._replay_events(last_event_id, request, send) + return + + headers = { + "Cache-Control": "no-cache, no-transform", + "Connection": "keep-alive", + "Content-Type": CONTENT_TYPE_SSE, + } + + if self.mcp_session_id: + headers[MCP_SESSION_ID_HEADER] = self.mcp_session_id + + # Check if we already have an active GET stream + if GET_STREAM_KEY in self._request_streams: + response = self._create_error_response( + "Conflict: Only one SSE stream is allowed per session", + HTTPStatus.CONFLICT, + ) + await response(request.scope, request.receive, send) + return + + # Create SSE stream + sse_stream_writer, sse_stream_reader = anyio.create_memory_object_stream[dict[str, str]](0) + + async def standalone_sse_writer(): + try: + # Create a standalone message stream for server-initiated messages + + self._request_streams[GET_STREAM_KEY] = anyio.create_memory_object_stream[EventMessage](0) + standalone_stream_reader = self._request_streams[GET_STREAM_KEY][1] + + async with sse_stream_writer, standalone_stream_reader: + # Process messages from the standalone stream + async for event_message in standalone_stream_reader: + # For the standalone stream, we handle: + # - JSONRPCNotification (server sends notifications to client) + # - JSONRPCRequest (server sends requests to client) + # We should NOT receive JSONRPCResponse + + # Send the message via SSE + event_data = self._create_event_data(event_message) + await sse_stream_writer.send(event_data) + except Exception as e: + logger.exception(f"Error in standalone SSE writer: {e}") + finally: + logger.debug("Closing standalone SSE writer") + await self._clean_up_memory_streams(GET_STREAM_KEY) + + # Create and start EventSourceResponse + response = EventSourceResponse( + content=sse_stream_reader, + data_sender_callable=standalone_sse_writer, + headers=headers, + ) + + try: + # This will send headers immediately and establish the SSE connection + await response(request.scope, request.receive, send) + except Exception as e: + logger.exception(f"Error in standalone SSE response: {e}") + await sse_stream_writer.aclose() + await sse_stream_reader.aclose() + await self._clean_up_memory_streams(GET_STREAM_KEY) + + async def _handle_delete_request(self, request: Request, send: Send) -> None: + """Handle DELETE requests for explicit session termination.""" + # Validate session ID + if not self.mcp_session_id: + # If no session ID set, return Method Not Allowed + response = self._create_error_response( + "Method Not Allowed: Session termination not supported", + HTTPStatus.METHOD_NOT_ALLOWED, + ) + await response(request.scope, request.receive, send) + return + + if not await self._validate_request_headers(request, send): + return + + await self._terminate_session() + + response = self._create_json_response( + None, + HTTPStatus.OK, + ) + await response(request.scope, request.receive, send) + + 
async def _terminate_session(self) -> None: + """Terminate the current session, closing all streams. + + Once terminated, all requests with this session ID will receive 404 Not Found. + """ + + self._terminated = True + logger.info(f"Terminating session: {self.mcp_session_id}") + + # We need a copy of the keys to avoid modification during iteration + request_stream_keys = list(self._request_streams.keys()) + + # Close all request streams asynchronously + for key in request_stream_keys: + try: + await self._clean_up_memory_streams(key) + except Exception as e: + logger.debug(f"Error closing stream {key} during termination: {e}") + + # Clear the request streams dictionary immediately + self._request_streams.clear() + try: + if self._read_stream_writer is not None: + await self._read_stream_writer.aclose() + if self._read_stream is not None: + await self._read_stream.aclose() + if self._write_stream_reader is not None: + await self._write_stream_reader.aclose() + if self._write_stream is not None: + await self._write_stream.aclose() + except Exception as e: + logger.debug(f"Error closing streams: {e}") + + async def _handle_unsupported_request(self, request: Request, send: Send) -> None: + """Handle unsupported HTTP methods.""" + headers = { + "Content-Type": CONTENT_TYPE_JSON, + "Allow": "GET, POST, DELETE", + } + if self.mcp_session_id: + headers[MCP_SESSION_ID_HEADER] = self.mcp_session_id + + response = self._create_error_response( + "Method Not Allowed", + HTTPStatus.METHOD_NOT_ALLOWED, + headers=headers, + ) + await response(request.scope, request.receive, send) + + async def _validate_request_headers(self, request: Request, send: Send) -> bool: + if not await self._validate_session(request, send): + return False + if not await self._validate_protocol_version(request, send): + return False + return True + + async def _validate_session(self, request: Request, send: Send) -> bool: + """Validate the session ID in the request.""" + if not self.mcp_session_id: + # If we're not using session IDs, return True + return True + + # Get the session ID from the request headers + request_session_id = self._get_session_id(request) + + # If no session ID provided but required, return error + if not request_session_id: + response = self._create_error_response( + "Bad Request: Missing session ID", + HTTPStatus.BAD_REQUEST, + ) + await response(request.scope, request.receive, send) + return False + + # If session ID doesn't match, return error + if request_session_id != self.mcp_session_id: + response = self._create_error_response( + "Not Found: Invalid or expired session ID", + HTTPStatus.NOT_FOUND, + ) + await response(request.scope, request.receive, send) + return False + + return True + + async def _validate_protocol_version(self, request: Request, send: Send) -> bool: + """Validate the protocol version header in the request.""" + # Get the protocol version from the request headers + protocol_version = request.headers.get(MCP_PROTOCOL_VERSION_HEADER) + + # If no protocol version provided, assume default version + if protocol_version is None: + protocol_version = DEFAULT_NEGOTIATED_VERSION + + # Check if the protocol version is supported + if protocol_version not in SUPPORTED_PROTOCOL_VERSIONS: + supported_versions = ", ".join(SUPPORTED_PROTOCOL_VERSIONS) + response = self._create_error_response( + f"Bad Request: Unsupported protocol version: {protocol_version}. 
" + + f"Supported versions: {supported_versions}", + HTTPStatus.BAD_REQUEST, + ) + await response(request.scope, request.receive, send) + return False + + return True + + async def _replay_events(self, last_event_id: str, request: Request, send: Send) -> None: + """ + Replays events that would have been sent after the specified event ID. + Only used when resumability is enabled. + """ + event_store = self._event_store + if not event_store: + return + + try: + headers = { + "Cache-Control": "no-cache, no-transform", + "Connection": "keep-alive", + "Content-Type": CONTENT_TYPE_SSE, + } + + if self.mcp_session_id: + headers[MCP_SESSION_ID_HEADER] = self.mcp_session_id + + # Create SSE stream for replay + sse_stream_writer, sse_stream_reader = anyio.create_memory_object_stream[dict[str, str]](0) + + async def replay_sender(): + try: + async with sse_stream_writer: + # Define an async callback for sending events + async def send_event(event_message: EventMessage) -> None: + event_data = self._create_event_data(event_message) + await sse_stream_writer.send(event_data) + + # Replay past events and get the stream ID + stream_id = await event_store.replay_events_after(last_event_id, send_event) + + # If stream ID not in mapping, create it + if stream_id and stream_id not in self._request_streams: + self._request_streams[stream_id] = anyio.create_memory_object_stream[EventMessage](0) + msg_reader = self._request_streams[stream_id][1] + + # Forward messages to SSE + async with msg_reader: + async for event_message in msg_reader: + event_data = self._create_event_data(event_message) + + await sse_stream_writer.send(event_data) + except Exception as e: + logger.exception(f"Error in replay sender: {e}") + + # Create and start EventSourceResponse + response = EventSourceResponse( + content=sse_stream_reader, + data_sender_callable=replay_sender, + headers=headers, + ) + + try: + await response(request.scope, request.receive, send) + except Exception as e: + logger.exception(f"Error in replay response: {e}") + finally: + await sse_stream_writer.aclose() + await sse_stream_reader.aclose() + + except Exception as e: + logger.exception(f"Error replaying events: {e}") + response = self._create_error_response( + f"Error replaying events: {str(e)}", + HTTPStatus.INTERNAL_SERVER_ERROR, + INTERNAL_ERROR, + ) + await response(request.scope, request.receive, send) + + @asynccontextmanager + async def connect( + self, + ) -> AsyncGenerator[ + tuple[ + MemoryObjectReceiveStream[SessionMessage | Exception], + MemoryObjectSendStream[SessionMessage], + ], + None, + ]: + """Context manager that provides read and write streams for a connection. 
+
+        Yields:
+            Tuple of (read_stream, write_stream) for bidirectional communication
+        """
+
+        # Create the memory streams for this connection
+
+        read_stream_writer, read_stream = anyio.create_memory_object_stream[SessionMessage | Exception](0)
+        write_stream, write_stream_reader = anyio.create_memory_object_stream[SessionMessage](0)
+
+        # Store the streams
+        self._read_stream_writer = read_stream_writer
+        self._read_stream = read_stream
+        self._write_stream_reader = write_stream_reader
+        self._write_stream = write_stream
+
+        # Start a task group for message routing
+        async with anyio.create_task_group() as tg:
+            # Create a message router that distributes messages to request streams
+            async def message_router():
+                try:
+                    async for session_message in write_stream_reader:
+                        # Determine which request stream(s) should receive this message
+                        message = session_message.message
+                        target_request_id = None
+                        # Check if this is a response
+                        if isinstance(message.root, JSONRPCResponse | JSONRPCError):
+                            response_id = str(message.root.id)
+                            # If this response is for an existing request stream,
+                            # send it there
+                            if response_id in self._request_streams:
+                                target_request_id = response_id
+
+                        else:
+                            # Extract related_request_id from meta if it exists
+                            if (
+                                session_message.metadata is not None
+                                and isinstance(
+                                    session_message.metadata,
+                                    ServerMessageMetadata,
+                                )
+                                and session_message.metadata.related_request_id is not None
+                            ):
+                                target_request_id = str(session_message.metadata.related_request_id)
+
+                        request_stream_id = target_request_id if target_request_id is not None else GET_STREAM_KEY
+
+                        # Store the event if we have an event store, regardless of whether a client
+                        # is connected; messages will be replayed on reconnect.
+                        event_id = None
+                        if self._event_store:
+                            event_id = await self._event_store.store_event(request_stream_id, message)
+                            logger.debug(f"Stored {event_id} from {request_stream_id}")
+
+                        if request_stream_id in self._request_streams:
+                            try:
+                                # Send both the message and the event ID
+                                await self._request_streams[request_stream_id][0].send(EventMessage(message, event_id))
+                            except (
+                                anyio.BrokenResourceError,
+                                anyio.ClosedResourceError,
+                            ):
+                                # Stream might be closed, remove from registry
+                                self._request_streams.pop(request_stream_id, None)
+                        else:
+                            logger.debug(
+                                f"Request stream {request_stream_id} not found for message. "
+                                "Still processing message as the client might reconnect and replay."
+                            )
+                except Exception as e:
+                    logger.exception(f"Error in message router: {e}")
+
+            # Start the message router
+            tg.start_soon(message_router)
+
+            try:
+                # Yield the streams for the caller to use
+                yield read_stream, write_stream
+            finally:
+                for stream_id in list(self._request_streams.keys()):
+                    try:
+                        await self._clean_up_memory_streams(stream_id)
+                    except Exception as e:
+                        logger.debug(f"Error closing request stream: {e}")
+                self._request_streams.clear()
+
+                # Clean up the read and write streams
+                try:
+                    await read_stream_writer.aclose()
+                    await read_stream.aclose()
+                    await write_stream_reader.aclose()
+                    await write_stream.aclose()
+                except Exception as e:
+                    logger.debug(f"Error closing streams: {e}")
diff --git a/src/mcp/server/streamable_http_manager.py b/src/mcp/server/streamable_http_manager.py
new file mode 100644
index 000000000..41b807388
--- /dev/null
+++ b/src/mcp/server/streamable_http_manager.py
@@ -0,0 +1,256 @@
+"""StreamableHTTP Session Manager for MCP servers."""
+
+from __future__ import annotations
+
+import contextlib
+import logging
+import threading
+from collections.abc import AsyncIterator
+from http import HTTPStatus
+from typing import Any
+from uuid import uuid4
+
+import anyio
+from anyio.abc import TaskStatus
+from starlette.requests import Request
+from starlette.responses import Response
+from starlette.types import Receive, Scope, Send
+
+from mcp.server.lowlevel.server import Server as MCPServer
+from mcp.server.streamable_http import (
+    MCP_SESSION_ID_HEADER,
+    EventStore,
+    StreamableHTTPServerTransport,
+)
+from mcp.server.transport_security import TransportSecuritySettings
+
+logger = logging.getLogger(__name__)
+
+
+class StreamableHTTPSessionManager:
+    """
+    Manages StreamableHTTP sessions with optional resumability via event store.
+
+    This class abstracts away the complexity of session management, event storage,
+    and request handling for StreamableHTTP transports. It handles:
+
+    1. Session tracking for clients
+    2. Resumability via an optional event store
+    3. Connection management and lifecycle
+    4. Request handling and transport setup
+
+    Important: Only one StreamableHTTPSessionManager instance should be created
+    per application. The instance cannot be reused after its run() context has
+    completed. If you need to restart the manager, create a new instance.
+
+    Args:
+        app: The MCP server instance
+        event_store: Optional event store for resumability support.
+            If provided, enables resumable connections where clients
+            can reconnect and receive missed events.
+            If None, sessions are still tracked but not resumable.
+        json_response: Whether to use JSON responses instead of SSE streams
+        stateless: If True, creates a completely fresh transport for each request
+            with no session tracking or state persistence between requests.
+ + """ + + def __init__( + self, + app: MCPServer[Any, Any], + event_store: EventStore | None = None, + json_response: bool = False, + stateless: bool = False, + security_settings: TransportSecuritySettings | None = None, + ): + self.app = app + self.event_store = event_store + self.json_response = json_response + self.stateless = stateless + self.security_settings = security_settings + + # Session tracking (only used if not stateless) + self._session_creation_lock = anyio.Lock() + self._server_instances: dict[str, StreamableHTTPServerTransport] = {} + + # The task group will be set during lifespan + self._task_group = None + # Thread-safe tracking of run() calls + self._run_lock = threading.Lock() + self._has_started = False + + @contextlib.asynccontextmanager + async def run(self) -> AsyncIterator[None]: + """ + Run the session manager with proper lifecycle management. + + This creates and manages the task group for all session operations. + + Important: This method can only be called once per instance. The same + StreamableHTTPSessionManager instance cannot be reused after this + context manager exits. Create a new instance if you need to restart. + + Use this in the lifespan context manager of your Starlette app: + + @contextlib.asynccontextmanager + async def lifespan(app: Starlette) -> AsyncIterator[None]: + async with session_manager.run(): + yield + """ + # Thread-safe check to ensure run() is only called once + with self._run_lock: + if self._has_started: + raise RuntimeError( + "StreamableHTTPSessionManager .run() can only be called " + "once per instance. Create a new instance if you need to run again." + ) + self._has_started = True + + async with anyio.create_task_group() as tg: + # Store the task group for later use + self._task_group = tg + logger.info("StreamableHTTP session manager started") + try: + yield # Let the application run + finally: + logger.info("StreamableHTTP session manager shutting down") + # Cancel task group to stop all spawned tasks + tg.cancel_scope.cancel() + self._task_group = None + # Clear any remaining server instances + self._server_instances.clear() + + async def handle_request( + self, + scope: Scope, + receive: Receive, + send: Send, + ) -> None: + """ + Process ASGI request with proper session handling and transport setup. + + Dispatches to the appropriate handler based on stateless mode. + + Args: + scope: ASGI scope + receive: ASGI receive function + send: ASGI send function + """ + if self._task_group is None: + raise RuntimeError("Task group is not initialized. Make sure to use run().") + + # Dispatch to the appropriate handler + if self.stateless: + await self._handle_stateless_request(scope, receive, send) + else: + await self._handle_stateful_request(scope, receive, send) + + async def _handle_stateless_request( + self, + scope: Scope, + receive: Receive, + send: Send, + ) -> None: + """ + Process request in stateless mode - creating a new transport for each request. 
+ + Args: + scope: ASGI scope + receive: ASGI receive function + send: ASGI send function + """ + logger.debug("Stateless mode: Creating new transport for this request") + # No session ID needed in stateless mode + http_transport = StreamableHTTPServerTransport( + mcp_session_id=None, # No session tracking in stateless mode + is_json_response_enabled=self.json_response, + event_store=None, # No event store in stateless mode + security_settings=self.security_settings, + ) + + # Start server in a new task + async def run_stateless_server(*, task_status: TaskStatus[None] = anyio.TASK_STATUS_IGNORED): + async with http_transport.connect() as streams: + read_stream, write_stream = streams + task_status.started() + await self.app.run( + read_stream, + write_stream, + self.app.create_initialization_options(), + stateless=True, + ) + + # Assert task group is not None for type checking + assert self._task_group is not None + # Start the server task + await self._task_group.start(run_stateless_server) + + # Handle the HTTP request and return the response + await http_transport.handle_request(scope, receive, send) + + async def _handle_stateful_request( + self, + scope: Scope, + receive: Receive, + send: Send, + ) -> None: + """ + Process request in stateful mode - maintaining session state between requests. + + Args: + scope: ASGI scope + receive: ASGI receive function + send: ASGI send function + """ + request = Request(scope, receive) + request_mcp_session_id = request.headers.get(MCP_SESSION_ID_HEADER) + + # Existing session case + if request_mcp_session_id is not None and request_mcp_session_id in self._server_instances: + transport = self._server_instances[request_mcp_session_id] + logger.debug("Session already exists, handling request directly") + await transport.handle_request(scope, receive, send) + return + + if request_mcp_session_id is None: + # New session case + logger.debug("Creating new transport") + async with self._session_creation_lock: + new_session_id = uuid4().hex + http_transport = StreamableHTTPServerTransport( + mcp_session_id=new_session_id, + is_json_response_enabled=self.json_response, + event_store=self.event_store, # May be None (no resumability) + security_settings=self.security_settings, + ) + + assert http_transport.mcp_session_id is not None + self._server_instances[http_transport.mcp_session_id] = http_transport + logger.info(f"Created new transport with session ID: {new_session_id}") + + # Define the server runner + async def run_server(*, task_status: TaskStatus[None] = anyio.TASK_STATUS_IGNORED) -> None: + async with http_transport.connect() as streams: + read_stream, write_stream = streams + task_status.started() + await self.app.run( + read_stream, + write_stream, + self.app.create_initialization_options(), + stateless=False, # Stateful mode + ) + + # Assert task group is not None for type checking + assert self._task_group is not None + # Start the server task + await self._task_group.start(run_server) + + # Handle the HTTP request and return the response + await http_transport.handle_request(scope, receive, send) + else: + # Invalid session ID + response = Response( + "Bad Request: No valid session ID provided", + status_code=HTTPStatus.BAD_REQUEST, + ) + await response(scope, receive, send) diff --git a/src/mcp/server/streaming_asgi_transport.py b/src/mcp/server/streaming_asgi_transport.py new file mode 100644 index 000000000..a74751312 --- /dev/null +++ b/src/mcp/server/streaming_asgi_transport.py @@ -0,0 +1,203 @@ +""" +A modified version of 
httpx.ASGITransport that supports streaming responses.
+
+This transport runs the ASGI app as a separate anyio task, allowing it to
+handle streaming responses like SSE where the app doesn't terminate until
+the connection is closed.
+
+This is only intended for writing tests for the SSE transport.
+"""
+
+import typing
+from typing import Any, cast
+
+import anyio
+import anyio.abc
+import anyio.streams.memory
+from httpx._models import Request, Response
+from httpx._transports.base import AsyncBaseTransport
+from httpx._types import AsyncByteStream
+from starlette.types import ASGIApp, Receive, Scope, Send
+
+
+class StreamingASGITransport(AsyncBaseTransport):
+    """
+    A custom AsyncTransport that handles sending requests directly to an ASGI app
+    and supports streaming responses like SSE.
+
+    Unlike the standard ASGITransport, this transport runs the ASGI app in a
+    separate anyio task, allowing it to handle responses from apps that don't
+    terminate immediately (like SSE endpoints).
+
+    Arguments:
+
+    * `app` - The ASGI application.
+    * `task_group` - The anyio task group used to run the ASGI app and to pump
+      its response messages in the background.
+    * `raise_app_exceptions` - Boolean indicating if exceptions in the application
+      should be raised. Defaults to `True`. Can be set to `False` for use cases
+      such as testing the content of a client 500 response.
+    * `root_path` - The root path on which the ASGI application should be mounted.
+    * `client` - A two-tuple indicating the client IP and port of incoming requests.
+
+    TODO: https://github.com/encode/httpx/pull/3059 is adding something similar to
+    upstream httpx. When that merges, we should delete this & switch back to the
+    upstream implementation.
+    """
+
+    def __init__(
+        self,
+        app: ASGIApp,
+        task_group: anyio.abc.TaskGroup,
+        raise_app_exceptions: bool = True,
+        root_path: str = "",
+        client: tuple[str, int] = ("127.0.0.1", 123),
+    ) -> None:
+        self.app = app
+        self.raise_app_exceptions = raise_app_exceptions
+        self.root_path = root_path
+        self.client = client
+        self.task_group = task_group
+
+    async def handle_async_request(
+        self,
+        request: Request,
+    ) -> Response:
+        assert isinstance(request.stream, AsyncByteStream)
+
+        # ASGI scope.
+        scope = {
+            "type": "http",
+            "asgi": {"version": "3.0"},
+            "http_version": "1.1",
+            "method": request.method,
+            "headers": [(k.lower(), v) for (k, v) in request.headers.raw],
+            "scheme": request.url.scheme,
+            "path": request.url.path,
+            "raw_path": request.url.raw_path.split(b"?")[0],
+            "query_string": request.url.query,
+            "server": (request.url.host, request.url.port),
+            "client": self.client,
+            "root_path": self.root_path,
+        }
+
+        # Request body
+        request_body_chunks = request.stream.__aiter__()
+        request_complete = False
+
+        # Response state
+        status_code = 499
+        response_headers = None
+        response_started = False
+        response_complete = anyio.Event()
+        initial_response_ready = anyio.Event()
+
+        # Synchronization for streaming response
+        asgi_send_channel, asgi_receive_channel = anyio.create_memory_object_stream[dict[str, Any]](100)
+        content_send_channel, content_receive_channel = anyio.create_memory_object_stream[bytes](100)
+
+        # ASGI callables.
+ async def receive() -> dict[str, Any]: + nonlocal request_complete + + if request_complete: + await response_complete.wait() + return {"type": "http.disconnect"} + + try: + body = await request_body_chunks.__anext__() + except StopAsyncIteration: + request_complete = True + return {"type": "http.request", "body": b"", "more_body": False} + return {"type": "http.request", "body": body, "more_body": True} + + async def send(message: dict[str, Any]) -> None: + nonlocal status_code, response_headers, response_started + + await asgi_send_channel.send(message) + + # Start the ASGI application in a separate task + async def run_app() -> None: + try: + # Cast the receive and send functions to the ASGI types + await self.app(cast(Scope, scope), cast(Receive, receive), cast(Send, send)) + except Exception: + if self.raise_app_exceptions: + raise + + if not response_started: + await asgi_send_channel.send({"type": "http.response.start", "status": 500, "headers": []}) + + await asgi_send_channel.send({"type": "http.response.body", "body": b"", "more_body": False}) + finally: + await asgi_send_channel.aclose() + + # Process messages from the ASGI app + async def process_messages() -> None: + nonlocal status_code, response_headers, response_started + + try: + async with asgi_receive_channel: + async for message in asgi_receive_channel: + if message["type"] == "http.response.start": + assert not response_started + status_code = message["status"] + response_headers = message.get("headers", []) + response_started = True + + # As soon as we have headers, we can return a response + initial_response_ready.set() + + elif message["type"] == "http.response.body": + body = message.get("body", b"") + more_body = message.get("more_body", False) + + if body and request.method != "HEAD": + await content_send_channel.send(body) + + if not more_body: + response_complete.set() + await content_send_channel.aclose() + break + finally: + # Ensure events are set even if there's an error + initial_response_ready.set() + response_complete.set() + await content_send_channel.aclose() + + # Create tasks for running the app and processing messages + self.task_group.start_soon(run_app) + self.task_group.start_soon(process_messages) + + # Wait for the initial response or timeout + await initial_response_ready.wait() + + # Create a streaming response + return Response( + status_code, + headers=response_headers, + stream=StreamingASGIResponseStream(content_receive_channel), + ) + + +class StreamingASGIResponseStream(AsyncByteStream): + """ + A modified ASGIResponseStream that supports streaming responses. + + This class extends the standard ASGIResponseStream to handle cases where + the response body continues to be generated after the initial response + is returned. 
+ """ + + def __init__( + self, + receive_channel: anyio.streams.memory.MemoryObjectReceiveStream[bytes], + ) -> None: + self.receive_channel = receive_channel + + async def __aiter__(self) -> typing.AsyncIterator[bytes]: + try: + async for chunk in self.receive_channel: + yield chunk + finally: + await self.receive_channel.aclose() diff --git a/src/mcp/server/transport_security.py b/src/mcp/server/transport_security.py new file mode 100644 index 000000000..3a884ee2b --- /dev/null +++ b/src/mcp/server/transport_security.py @@ -0,0 +1,127 @@ +"""DNS rebinding protection for MCP server transports.""" + +import logging + +from pydantic import BaseModel, Field +from starlette.requests import Request +from starlette.responses import Response + +logger = logging.getLogger(__name__) + + +class TransportSecuritySettings(BaseModel): + """Settings for MCP transport security features. + + These settings help protect against DNS rebinding attacks by validating + incoming request headers. + """ + + enable_dns_rebinding_protection: bool = Field( + default=True, + description="Enable DNS rebinding protection (recommended for production)", + ) + + allowed_hosts: list[str] = Field( + default=[], + description="List of allowed Host header values. Only applies when " + + "enable_dns_rebinding_protection is True.", + ) + + allowed_origins: list[str] = Field( + default=[], + description="List of allowed Origin header values. Only applies when " + + "enable_dns_rebinding_protection is True.", + ) + + +class TransportSecurityMiddleware: + """Middleware to enforce DNS rebinding protection for MCP transport endpoints.""" + + def __init__(self, settings: TransportSecuritySettings | None = None): + # If not specified, disable DNS rebinding protection by default + # for backwards compatibility + self.settings = settings or TransportSecuritySettings(enable_dns_rebinding_protection=False) + + def _validate_host(self, host: str | None) -> bool: + """Validate the Host header against allowed values.""" + if not host: + logger.warning("Missing Host header in request") + return False + + # Check exact match first + if host in self.settings.allowed_hosts: + return True + + # Check wildcard port patterns + for allowed in self.settings.allowed_hosts: + if allowed.endswith(":*"): + # Extract base host from pattern + base_host = allowed[:-2] + # Check if the actual host starts with base host and has a port + if host.startswith(base_host + ":"): + return True + + logger.warning(f"Invalid Host header: {host}") + return False + + def _validate_origin(self, origin: str | None) -> bool: + """Validate the Origin header against allowed values.""" + # Origin can be absent for same-origin requests + if not origin: + return True + + # Check exact match first + if origin in self.settings.allowed_origins: + return True + + # Check wildcard port patterns + for allowed in self.settings.allowed_origins: + if allowed.endswith(":*"): + # Extract base origin from pattern + base_origin = allowed[:-2] + # Check if the actual origin starts with base origin and has a port + if origin.startswith(base_origin + ":"): + return True + + logger.warning(f"Invalid Origin header: {origin}") + return False + + def _validate_content_type(self, content_type: str | None) -> bool: + """Validate the Content-Type header for POST requests.""" + if not content_type: + logger.warning("Missing Content-Type header in POST request") + return False + + # Content-Type must start with application/json + if not content_type.lower().startswith("application/json"): + 
logger.warning(f"Invalid Content-Type header: {content_type}") + return False + + return True + + async def validate_request(self, request: Request, is_post: bool = False) -> Response | None: + """Validate request headers for DNS rebinding protection. + + Returns None if validation passes, or an error Response if validation fails. + """ + # Always validate Content-Type for POST requests + if is_post: + content_type = request.headers.get("content-type") + if not self._validate_content_type(content_type): + return Response("Invalid Content-Type header", status_code=400) + + # Skip remaining validation if DNS rebinding protection is disabled + if not self.settings.enable_dns_rebinding_protection: + return None + + # Validate Host header + host = request.headers.get("host") + if not self._validate_host(host): + return Response("Invalid Host header", status_code=421) + + # Validate Origin header + origin = request.headers.get("origin") + if not self._validate_origin(origin): + return Response("Invalid Origin header", status_code=400) + + return None diff --git a/src/mcp/server/websocket.py b/src/mcp/server/websocket.py index aee855cf1..7c0d8789c 100644 --- a/src/mcp/server/websocket.py +++ b/src/mcp/server/websocket.py @@ -8,6 +8,7 @@ from starlette.websockets import WebSocket import mcp.types as types +from mcp.shared.message import SessionMessage logger = logging.getLogger(__name__) @@ -22,11 +23,11 @@ async def websocket_server(scope: Scope, receive: Receive, send: Send): websocket = WebSocket(scope, receive, send) await websocket.accept(subprotocol="mcp") - read_stream: MemoryObjectReceiveStream[types.JSONRPCMessage | Exception] - read_stream_writer: MemoryObjectSendStream[types.JSONRPCMessage | Exception] + read_stream: MemoryObjectReceiveStream[SessionMessage | Exception] + read_stream_writer: MemoryObjectSendStream[SessionMessage | Exception] - write_stream: MemoryObjectSendStream[types.JSONRPCMessage] - write_stream_reader: MemoryObjectReceiveStream[types.JSONRPCMessage] + write_stream: MemoryObjectSendStream[SessionMessage] + write_stream_reader: MemoryObjectReceiveStream[SessionMessage] read_stream_writer, read_stream = anyio.create_memory_object_stream(0) write_stream, write_stream_reader = anyio.create_memory_object_stream(0) @@ -41,15 +42,16 @@ async def ws_reader(): await read_stream_writer.send(exc) continue - await read_stream_writer.send(client_message) + session_message = SessionMessage(client_message) + await read_stream_writer.send(session_message) except anyio.ClosedResourceError: await websocket.close() async def ws_writer(): try: async with write_stream_reader: - async for message in write_stream_reader: - obj = message.model_dump_json(by_alias=True, exclude_none=True) + async for session_message in write_stream_reader: + obj = session_message.message.model_dump_json(by_alias=True, exclude_none=True) await websocket.send_text(obj) except anyio.ClosedResourceError: await websocket.close() diff --git a/src/mcp/shared/_httpx_utils.py b/src/mcp/shared/_httpx_utils.py new file mode 100644 index 000000000..e0611ce73 --- /dev/null +++ b/src/mcp/shared/_httpx_utils.py @@ -0,0 +1,83 @@ +"""Utilities for creating standardized httpx AsyncClient instances.""" + +from typing import Any, Protocol + +import httpx + +__all__ = ["create_mcp_http_client"] + + +class McpHttpClientFactory(Protocol): + def __call__( + self, + headers: dict[str, str] | None = None, + timeout: httpx.Timeout | None = None, + auth: httpx.Auth | None = None, + ) -> httpx.AsyncClient: ... 
+ + +def create_mcp_http_client( + headers: dict[str, str] | None = None, + timeout: httpx.Timeout | None = None, + auth: httpx.Auth | None = None, +) -> httpx.AsyncClient: + """Create a standardized httpx AsyncClient with MCP defaults. + + This function provides common defaults used throughout the MCP codebase: + - follow_redirects=True (always enabled) + - Default timeout of 30 seconds if not specified + + Args: + headers: Optional headers to include with all requests. + timeout: Request timeout as httpx.Timeout object. + Defaults to 30 seconds if not specified. + auth: Optional authentication handler. + + Returns: + Configured httpx.AsyncClient instance with MCP defaults. + + Note: + The returned AsyncClient must be used as a context manager to ensure + proper cleanup of connections. + + Examples: + # Basic usage with MCP defaults + async with create_mcp_http_client() as client: + response = await client.get("https://api.example.com") + + # With custom headers + headers = {"Authorization": "Bearer token"} + async with create_mcp_http_client(headers) as client: + response = await client.get("/endpoint") + + # With both custom headers and timeout + timeout = httpx.Timeout(60.0, read=300.0) + async with create_mcp_http_client(headers, timeout) as client: + response = await client.get("/long-request") + + # With authentication + from httpx import BasicAuth + auth = BasicAuth(username="user", password="pass") + async with create_mcp_http_client(headers, timeout, auth) as client: + response = await client.get("/protected-endpoint") + """ + # Set MCP defaults + kwargs: dict[str, Any] = { + "follow_redirects": True, + } + + # Handle timeout + if timeout is None: + kwargs["timeout"] = httpx.Timeout(30.0) + else: + kwargs["timeout"] = timeout + + # Handle headers + if headers is not None: + kwargs["headers"] = headers + + # Handle authentication + if auth is not None: + kwargs["auth"] = auth + + return httpx.AsyncClient(**kwargs) diff --git a/src/mcp/shared/auth.py b/src/mcp/shared/auth.py new file mode 100644 index 000000000..1f2d1659a --- /dev/null +++ b/src/mcp/shared/auth.py @@ -0,0 +1,144 @@ +from typing import Any, Literal + +from pydantic import AnyHttpUrl, AnyUrl, BaseModel, Field, field_validator + + +class OAuthToken(BaseModel): + """ + See https://datatracker.ietf.org/doc/html/rfc6749#section-5.1 + """ + + access_token: str + token_type: Literal["Bearer"] = "Bearer" + expires_in: int | None = None + scope: str | None = None + refresh_token: str | None = None + + @field_validator("token_type", mode="before") + @classmethod + def normalize_token_type(cls, v: str | None) -> str | None: + if isinstance(v, str): + # Bearer is title-cased in the spec, so we normalize it + # https://datatracker.ietf.org/doc/html/rfc6750#section-4 + return v.title() + return v + + +class InvalidScopeError(Exception): + def __init__(self, message: str): + self.message = message + + +class InvalidRedirectUriError(Exception): + def __init__(self, message: str): + self.message = message + + +class OAuthClientMetadata(BaseModel): + """ + RFC 7591 OAuth 2.0 Dynamic Client Registration metadata. + See https://datatracker.ietf.org/doc/html/rfc7591#section-2 + for the full specification. 
+ """ + + redirect_uris: list[AnyUrl] = Field(..., min_length=1) + # token_endpoint_auth_method: this implementation only supports none & + # client_secret_post; + # ie: we do not support client_secret_basic + token_endpoint_auth_method: Literal["none", "client_secret_post"] = "client_secret_post" + # grant_types: this implementation only supports authorization_code & refresh_token + grant_types: list[Literal["authorization_code", "refresh_token"]] = [ + "authorization_code", + "refresh_token", + ] + # this implementation only supports code; ie: it does not support implicit grants + response_types: list[Literal["code"]] = ["code"] + scope: str | None = None + + # these fields are currently unused, but we support & store them for potential + # future use + client_name: str | None = None + client_uri: AnyHttpUrl | None = None + logo_uri: AnyHttpUrl | None = None + contacts: list[str] | None = None + tos_uri: AnyHttpUrl | None = None + policy_uri: AnyHttpUrl | None = None + jwks_uri: AnyHttpUrl | None = None + jwks: Any | None = None + software_id: str | None = None + software_version: str | None = None + + def validate_scope(self, requested_scope: str | None) -> list[str] | None: + if requested_scope is None: + return None + requested_scopes = requested_scope.split(" ") + allowed_scopes = [] if self.scope is None else self.scope.split(" ") + for scope in requested_scopes: + if scope not in allowed_scopes: + raise InvalidScopeError(f"Client was not registered with scope {scope}") + return requested_scopes + + def validate_redirect_uri(self, redirect_uri: AnyUrl | None) -> AnyUrl: + if redirect_uri is not None: + # Validate redirect_uri against client's registered redirect URIs + if redirect_uri not in self.redirect_uris: + raise InvalidRedirectUriError(f"Redirect URI '{redirect_uri}' not registered for client") + return redirect_uri + elif len(self.redirect_uris) == 1: + return self.redirect_uris[0] + else: + raise InvalidRedirectUriError("redirect_uri must be specified when client " "has multiple registered URIs") + + +class OAuthClientInformationFull(OAuthClientMetadata): + """ + RFC 7591 OAuth 2.0 Dynamic Client Registration full response + (client information plus metadata). + """ + + client_id: str + client_secret: str | None = None + client_id_issued_at: int | None = None + client_secret_expires_at: int | None = None + + +class OAuthMetadata(BaseModel): + """ + RFC 8414 OAuth 2.0 Authorization Server Metadata. 
+ See https://datatracker.ietf.org/doc/html/rfc8414#section-2 + """ + + issuer: AnyHttpUrl + authorization_endpoint: AnyHttpUrl + token_endpoint: AnyHttpUrl + registration_endpoint: AnyHttpUrl | None = None + scopes_supported: list[str] | None = None + response_types_supported: list[str] = ["code"] + response_modes_supported: list[Literal["query", "fragment"]] | None = None + grant_types_supported: list[str] | None = None + token_endpoint_auth_methods_supported: list[str] | None = None + token_endpoint_auth_signing_alg_values_supported: None = None + service_documentation: AnyHttpUrl | None = None + ui_locales_supported: list[str] | None = None + op_policy_uri: AnyHttpUrl | None = None + op_tos_uri: AnyHttpUrl | None = None + revocation_endpoint: AnyHttpUrl | None = None + revocation_endpoint_auth_methods_supported: list[str] | None = None + revocation_endpoint_auth_signing_alg_values_supported: None = None + introspection_endpoint: AnyHttpUrl | None = None + introspection_endpoint_auth_methods_supported: list[str] | None = None + introspection_endpoint_auth_signing_alg_values_supported: None = None + code_challenge_methods_supported: list[str] | None = None + + +class ProtectedResourceMetadata(BaseModel): + """ + RFC 9728 OAuth 2.0 Protected Resource Metadata. + See https://datatracker.ietf.org/doc/html/rfc9728#section-2 + """ + + resource: AnyHttpUrl + authorization_servers: list[AnyHttpUrl] = Field(..., min_length=1) + scopes_supported: list[str] | None = None + bearer_methods_supported: list[str] | None = Field(default=["header"]) # MCP only supports header method + resource_documentation: AnyHttpUrl | None = None diff --git a/src/mcp/shared/auth_utils.py b/src/mcp/shared/auth_utils.py new file mode 100644 index 000000000..6d6300c9c --- /dev/null +++ b/src/mcp/shared/auth_utils.py @@ -0,0 +1,69 @@ +"""Utilities for OAuth 2.0 Resource Indicators (RFC 8707).""" + +from urllib.parse import urlparse, urlsplit, urlunsplit + +from pydantic import AnyUrl, HttpUrl + + +def resource_url_from_server_url(url: str | HttpUrl | AnyUrl) -> str: + """Convert server URL to canonical resource URL per RFC 8707. + + RFC 8707 section 2 states that resource URIs "MUST NOT include a fragment component". + Returns absolute URI with lowercase scheme/host for canonical form. + + Args: + url: Server URL to convert + + Returns: + Canonical resource URL string + """ + # Convert to string if needed + url_str = str(url) + + # Parse the URL and remove fragment, create canonical form + parsed = urlsplit(url_str) + canonical = urlunsplit(parsed._replace(scheme=parsed.scheme.lower(), netloc=parsed.netloc.lower(), fragment="")) + + return canonical + + +def check_resource_allowed(requested_resource: str, configured_resource: str) -> bool: + """Check if a requested resource URL matches a configured resource URL. + + A requested resource matches if it has the same scheme, domain, port, + and its path starts with the configured resource's path. This allows + hierarchical matching where a token for a parent resource can be used + for child resources. 
+ + Args: + requested_resource: The resource URL being requested + configured_resource: The resource URL that has been configured + + Returns: + True if the requested resource matches the configured resource + """ + # Parse both URLs + requested = urlparse(requested_resource) + configured = urlparse(configured_resource) + + # Compare scheme, host, and port (origin) + if requested.scheme.lower() != configured.scheme.lower() or requested.netloc.lower() != configured.netloc.lower(): + return False + + # Handle cases like requested=/foo and configured=/foo/ + requested_path = requested.path + configured_path = configured.path + + # If requested path is shorter, it cannot be a child + if len(requested_path) < len(configured_path): + return False + + # Check if the requested path starts with the configured path + # Ensure both paths end with / for proper comparison + # This ensures that paths like "/api123" don't incorrectly match "/api" + if not requested_path.endswith("/"): + requested_path += "/" + if not configured_path.endswith("/"): + configured_path += "/" + + return requested_path.startswith(configured_path) diff --git a/src/mcp/shared/context.py b/src/mcp/shared/context.py index ae85d3a19..f3006e7d5 100644 --- a/src/mcp/shared/context.py +++ b/src/mcp/shared/context.py @@ -8,11 +8,13 @@ SessionT = TypeVar("SessionT", bound=BaseSession[Any, Any, Any, Any, Any]) LifespanContextT = TypeVar("LifespanContextT") +RequestT = TypeVar("RequestT", default=Any) @dataclass -class RequestContext(Generic[SessionT, LifespanContextT]): +class RequestContext(Generic[SessionT, LifespanContextT, RequestT]): request_id: RequestId meta: RequestParams.Meta | None session: SessionT lifespan_context: LifespanContextT + request: RequestT | None = None diff --git a/src/mcp/shared/memory.py b/src/mcp/shared/memory.py index abf87a3aa..c94e5e6ac 100644 --- a/src/mcp/shared/memory.py +++ b/src/mcp/shared/memory.py @@ -13,24 +13,20 @@ import mcp.types as types from mcp.client.session import ( ClientSession, + ElicitationFnT, ListRootsFnT, LoggingFnT, MessageHandlerFnT, SamplingFnT, ) from mcp.server import Server -from mcp.types import JSONRPCMessage +from mcp.shared.message import SessionMessage -MessageStream = tuple[ - MemoryObjectReceiveStream[JSONRPCMessage | Exception], - MemoryObjectSendStream[JSONRPCMessage], -] +MessageStream = tuple[MemoryObjectReceiveStream[SessionMessage | Exception], MemoryObjectSendStream[SessionMessage]] @asynccontextmanager -async def create_client_server_memory_streams() -> ( - AsyncGenerator[tuple[MessageStream, MessageStream], None] -): +async def create_client_server_memory_streams() -> AsyncGenerator[tuple[MessageStream, MessageStream], None]: """ Creates a pair of bidirectional memory streams for client-server communication. 
@@ -39,12 +35,8 @@ async def create_client_server_memory_streams() -> ( (read_stream, write_stream) """ # Create streams for both directions - server_to_client_send, server_to_client_receive = anyio.create_memory_object_stream[ - JSONRPCMessage | Exception - ](1) - client_to_server_send, client_to_server_receive = anyio.create_memory_object_stream[ - JSONRPCMessage | Exception - ](1) + server_to_client_send, server_to_client_receive = anyio.create_memory_object_stream[SessionMessage | Exception](1) + client_to_server_send, client_to_server_receive = anyio.create_memory_object_stream[SessionMessage | Exception](1) client_streams = (server_to_client_receive, client_to_server_send) server_streams = (client_to_server_receive, server_to_client_send) @@ -68,6 +60,7 @@ async def create_connected_server_and_client_session( message_handler: MessageHandlerFnT | None = None, client_info: types.Implementation | None = None, raise_exceptions: bool = False, + elicitation_callback: ElicitationFnT | None = None, ) -> AsyncGenerator[ClientSession, None]: """Creates a ClientSession that is connected to a running MCP server.""" async with create_client_server_memory_streams() as ( @@ -98,6 +91,7 @@ async def create_connected_server_and_client_session( logging_callback=logging_callback, message_handler=message_handler, client_info=client_info, + elicitation_callback=elicitation_callback, ) as client_session: await client_session.initialize() yield client_session diff --git a/src/mcp/shared/message.py b/src/mcp/shared/message.py new file mode 100644 index 000000000..4b6df23eb --- /dev/null +++ b/src/mcp/shared/message.py @@ -0,0 +1,43 @@ +""" +Message wrapper with metadata support. + +This module defines a wrapper type that combines JSONRPCMessage with metadata +to support transport-specific features like resumability. +""" + +from collections.abc import Awaitable, Callable +from dataclasses import dataclass + +from mcp.types import JSONRPCMessage, RequestId + +ResumptionToken = str + +ResumptionTokenUpdateCallback = Callable[[ResumptionToken], Awaitable[None]] + + +@dataclass +class ClientMessageMetadata: + """Metadata specific to client messages.""" + + resumption_token: ResumptionToken | None = None + on_resumption_token_update: Callable[[ResumptionToken], Awaitable[None]] | None = None + + +@dataclass +class ServerMessageMetadata: + """Metadata specific to server messages.""" + + related_request_id: RequestId | None = None + # Request-specific context (e.g., headers, auth info) + request_context: object | None = None + + +MessageMetadata = ClientMessageMetadata | ServerMessageMetadata | None + + +@dataclass +class SessionMessage: + """A message with specific metadata for transport-specific features.""" + + message: JSONRPCMessage + metadata: MessageMetadata = None diff --git a/src/mcp/shared/metadata_utils.py b/src/mcp/shared/metadata_utils.py new file mode 100644 index 000000000..e3f49daf4 --- /dev/null +++ b/src/mcp/shared/metadata_utils.py @@ -0,0 +1,45 @@ +"""Utility functions for working with metadata in MCP types. + +These utilities are primarily intended for client-side usage to properly display +human-readable names in user interfaces in a spec compliant way. +""" + +from mcp.types import Implementation, Prompt, Resource, ResourceTemplate, Tool + + +def get_display_name(obj: Tool | Resource | Prompt | ResourceTemplate | Implementation) -> str: + """ + Get the display name for an MCP object with proper precedence. 
+ + This is a client-side utility function designed to help MCP clients display + human-readable names in their user interfaces. When servers provide a 'title' + field, it should be preferred over the programmatic 'name' field for display. + + For tools: title > annotations.title > name + For other objects: title > name + + Example: + # In a client displaying available tools + tools = await session.list_tools() + for tool in tools.tools: + display_name = get_display_name(tool) + print(f"Available tool: {display_name}") + + Args: + obj: An MCP object with name and optional title fields + + Returns: + The display name to use for UI presentation + """ + if isinstance(obj, Tool): + # Tools have special precedence: title > annotations.title > name + if hasattr(obj, "title") and obj.title is not None: + return obj.title + if obj.annotations and hasattr(obj.annotations, "title") and obj.annotations.title is not None: + return obj.annotations.title + return obj.name + else: + # All other objects: title > name + if hasattr(obj, "title") and obj.title is not None: + return obj.title + return obj.name diff --git a/src/mcp/shared/progress.py b/src/mcp/shared/progress.py index 52e0017d0..1ad81a779 100644 --- a/src/mcp/shared/progress.py +++ b/src/mcp/shared/progress.py @@ -23,55 +23,29 @@ class Progress(BaseModel): @dataclass -class ProgressContext( - Generic[ - SendRequestT, - SendNotificationT, - SendResultT, - ReceiveRequestT, - ReceiveNotificationT, - ] -): - session: BaseSession[ - SendRequestT, - SendNotificationT, - SendResultT, - ReceiveRequestT, - ReceiveNotificationT, - ] +class ProgressContext(Generic[SendRequestT, SendNotificationT, SendResultT, ReceiveRequestT, ReceiveNotificationT]): + session: BaseSession[SendRequestT, SendNotificationT, SendResultT, ReceiveRequestT, ReceiveNotificationT] progress_token: ProgressToken total: float | None current: float = field(default=0.0, init=False) - async def progress(self, amount: float) -> None: + async def progress(self, amount: float, message: str | None = None) -> None: self.current += amount await self.session.send_progress_notification( - self.progress_token, self.current, total=self.total + self.progress_token, self.current, total=self.total, message=message ) @contextmanager def progress( ctx: RequestContext[ - BaseSession[ - SendRequestT, - SendNotificationT, - SendResultT, - ReceiveRequestT, - ReceiveNotificationT, - ], + BaseSession[SendRequestT, SendNotificationT, SendResultT, ReceiveRequestT, ReceiveNotificationT], LifespanContextT, ], total: float | None = None, ) -> Generator[ - ProgressContext[ - SendRequestT, - SendNotificationT, - SendResultT, - ReceiveRequestT, - ReceiveNotificationT, - ], + ProgressContext[SendRequestT, SendNotificationT, SendResultT, ReceiveRequestT, ReceiveNotificationT], None, ]: if ctx.meta is None or ctx.meta.progressToken is None: diff --git a/src/mcp/shared/session.py b/src/mcp/shared/session.py index 05fd3ce37..6536272d9 100644 --- a/src/mcp/shared/session.py +++ b/src/mcp/shared/session.py @@ -3,17 +3,19 @@ from contextlib import AsyncExitStack from datetime import timedelta from types import TracebackType -from typing import Any, Generic, TypeVar +from typing import Any, Generic, Protocol, TypeVar import anyio -import anyio.lowlevel import httpx from anyio.streams.memory import MemoryObjectReceiveStream, MemoryObjectSendStream from pydantic import BaseModel from typing_extensions import Self from mcp.shared.exceptions import McpError +from mcp.shared.message import MessageMetadata, 
ServerMessageMetadata, SessionMessage from mcp.types import ( + CONNECTION_CLOSED, + INVALID_PARAMS, CancelledNotification, ClientNotification, ClientRequest, @@ -24,6 +26,7 @@ JSONRPCNotification, JSONRPCRequest, JSONRPCResponse, + ProgressNotification, RequestParams, ServerNotification, ServerRequest, @@ -35,13 +38,17 @@ SendNotificationT = TypeVar("SendNotificationT", ClientNotification, ServerNotification) ReceiveRequestT = TypeVar("ReceiveRequestT", ClientRequest, ServerRequest) ReceiveResultT = TypeVar("ReceiveResultT", bound=BaseModel) -ReceiveNotificationT = TypeVar( - "ReceiveNotificationT", ClientNotification, ServerNotification -) +ReceiveNotificationT = TypeVar("ReceiveNotificationT", ClientNotification, ServerNotification) RequestId = str | int +class ProgressFnT(Protocol): + """Protocol for progress notification callbacks.""" + + async def __call__(self, progress: float, total: float | None, message: str | None) -> None: ... + + class RequestResponder(Generic[ReceiveRequestT, SendResultT]): """Handles responding to MCP requests and manages request lifecycle. @@ -71,10 +78,12 @@ def __init__( ReceiveNotificationT ]""", on_complete: Callable[["RequestResponder[ReceiveRequestT, SendResultT]"], Any], + message_metadata: MessageMetadata = None, ) -> None: self.request_id = request_id self.request_meta = request_meta self.request = request + self.message_metadata = message_metadata self._session = session self._completed = False self._cancel_scope = anyio.CancelScope() @@ -164,16 +173,15 @@ class BaseSession( messages when entered. """ - _response_streams: dict[ - RequestId, MemoryObjectSendStream[JSONRPCResponse | JSONRPCError] - ] + _response_streams: dict[RequestId, MemoryObjectSendStream[JSONRPCResponse | JSONRPCError]] _request_id: int _in_flight: dict[RequestId, RequestResponder[ReceiveRequestT, SendResultT]] + _progress_callbacks: dict[RequestId, ProgressFnT] def __init__( self, - read_stream: MemoryObjectReceiveStream[JSONRPCMessage | Exception], - write_stream: MemoryObjectSendStream[JSONRPCMessage], + read_stream: MemoryObjectReceiveStream[SessionMessage | Exception], + write_stream: MemoryObjectSendStream[SessionMessage], receive_request_type: type[ReceiveRequestT], receive_notification_type: type[ReceiveNotificationT], # If none, reading will never time out @@ -185,9 +193,9 @@ def __init__( self._request_id = 0 self._receive_request_type = receive_request_type self._receive_notification_type = receive_notification_type - self._read_timeout_seconds = read_timeout_seconds + self._session_read_timeout_seconds = read_timeout_seconds self._in_flight = {} - + self._progress_callbacks = {} self._exit_stack = AsyncExitStack() async def __aenter__(self) -> Self: @@ -213,155 +221,221 @@ async def send_request( self, request: SendRequestT, result_type: type[ReceiveResultT], + request_read_timeout_seconds: timedelta | None = None, + metadata: MessageMetadata = None, + progress_callback: ProgressFnT | None = None, ) -> ReceiveResultT: """ Sends a request and wait for a response. Raises an McpError if the - response contains an error. + response contains an error. If a request read timeout is provided, it + will take precedence over the session read timeout. Do not use this method to emit notifications! Use send_notification() instead. 
""" - request_id = self._request_id self._request_id = request_id + 1 - response_stream, response_stream_reader = anyio.create_memory_object_stream[ - JSONRPCResponse | JSONRPCError - ](1) + response_stream, response_stream_reader = anyio.create_memory_object_stream[JSONRPCResponse | JSONRPCError](1) self._response_streams[request_id] = response_stream - self._exit_stack.push_async_callback(lambda: response_stream.aclose()) - self._exit_stack.push_async_callback(lambda: response_stream_reader.aclose()) - - jsonrpc_request = JSONRPCRequest( - jsonrpc="2.0", - id=request_id, - **request.model_dump(by_alias=True, mode="json", exclude_none=True), - ) - - # TODO: Support progress callbacks - - await self._write_stream.send(JSONRPCMessage(jsonrpc_request)) + # Set up progress token if progress callback is provided + request_data = request.model_dump(by_alias=True, mode="json", exclude_none=True) + if progress_callback is not None: + # Use request_id as progress token + if "params" not in request_data: + request_data["params"] = {} + if "_meta" not in request_data["params"]: + request_data["params"]["_meta"] = {} + request_data["params"]["_meta"]["progressToken"] = request_id + # Store the callback for this request + self._progress_callbacks[request_id] = progress_callback try: - with anyio.fail_after( - None - if self._read_timeout_seconds is None - else self._read_timeout_seconds.total_seconds() - ): - response_or_error = await response_stream_reader.receive() - except TimeoutError: - raise McpError( - ErrorData( - code=httpx.codes.REQUEST_TIMEOUT, - message=( - f"Timed out while waiting for response to " - f"{request.__class__.__name__}. Waited " - f"{self._read_timeout_seconds} seconds." - ), - ) + jsonrpc_request = JSONRPCRequest( + jsonrpc="2.0", + id=request_id, + **request_data, ) - if isinstance(response_or_error, JSONRPCError): - raise McpError(response_or_error.error) - else: - return result_type.model_validate(response_or_error.result) + await self._write_stream.send(SessionMessage(message=JSONRPCMessage(jsonrpc_request), metadata=metadata)) + + # request read timeout takes precedence over session read timeout + timeout = None + if request_read_timeout_seconds is not None: + timeout = request_read_timeout_seconds.total_seconds() + elif self._session_read_timeout_seconds is not None: + timeout = self._session_read_timeout_seconds.total_seconds() + + try: + with anyio.fail_after(timeout): + response_or_error = await response_stream_reader.receive() + except TimeoutError: + raise McpError( + ErrorData( + code=httpx.codes.REQUEST_TIMEOUT, + message=( + f"Timed out while waiting for response to " + f"{request.__class__.__name__}. Waited " + f"{timeout} seconds." + ), + ) + ) + + if isinstance(response_or_error, JSONRPCError): + raise McpError(response_or_error.error) + else: + return result_type.model_validate(response_or_error.result) + + finally: + self._response_streams.pop(request_id, None) + self._progress_callbacks.pop(request_id, None) + await response_stream.aclose() + await response_stream_reader.aclose() - async def send_notification(self, notification: SendNotificationT) -> None: + async def send_notification( + self, + notification: SendNotificationT, + related_request_id: RequestId | None = None, + ) -> None: """ Emits a notification, which is a one-way message that does not expect a response. """ + # Some transport implementations may need to set the related_request_id + # to attribute to the notifications to the request that triggered them. 
jsonrpc_notification = JSONRPCNotification( jsonrpc="2.0", **notification.model_dump(by_alias=True, mode="json", exclude_none=True), ) + session_message = SessionMessage( + message=JSONRPCMessage(jsonrpc_notification), + metadata=ServerMessageMetadata(related_request_id=related_request_id) if related_request_id else None, + ) + await self._write_stream.send(session_message) - await self._write_stream.send(JSONRPCMessage(jsonrpc_notification)) - - async def _send_response( - self, request_id: RequestId, response: SendResultT | ErrorData - ) -> None: + async def _send_response(self, request_id: RequestId, response: SendResultT | ErrorData) -> None: if isinstance(response, ErrorData): jsonrpc_error = JSONRPCError(jsonrpc="2.0", id=request_id, error=response) - await self._write_stream.send(JSONRPCMessage(jsonrpc_error)) + session_message = SessionMessage(message=JSONRPCMessage(jsonrpc_error)) + await self._write_stream.send(session_message) else: jsonrpc_response = JSONRPCResponse( jsonrpc="2.0", id=request_id, - result=response.model_dump( - by_alias=True, mode="json", exclude_none=True - ), + result=response.model_dump(by_alias=True, mode="json", exclude_none=True), ) - await self._write_stream.send(JSONRPCMessage(jsonrpc_response)) + session_message = SessionMessage(message=JSONRPCMessage(jsonrpc_response)) + await self._write_stream.send(session_message) async def _receive_loop(self) -> None: async with ( self._read_stream, self._write_stream, ): - async for message in self._read_stream: - if isinstance(message, Exception): - await self._handle_incoming(message) - elif isinstance(message.root, JSONRPCRequest): - validated_request = self._receive_request_type.model_validate( - message.root.model_dump( - by_alias=True, mode="json", exclude_none=True - ) - ) - - responder = RequestResponder( - request_id=message.root.id, - request_meta=validated_request.root.params.meta - if validated_request.root.params - else None, - request=validated_request, - session=self, - on_complete=lambda r: self._in_flight.pop(r.request_id, None), - ) - - self._in_flight[responder.request_id] = responder - await self._received_request(responder) - - if not responder._completed: # type: ignore[reportPrivateUsage] - await self._handle_incoming(responder) + try: + async for message in self._read_stream: + if isinstance(message, Exception): + await self._handle_incoming(message) + elif isinstance(message.message.root, JSONRPCRequest): + try: + validated_request = self._receive_request_type.model_validate( + message.message.root.model_dump(by_alias=True, mode="json", exclude_none=True) + ) + responder = RequestResponder( + request_id=message.message.root.id, + request_meta=validated_request.root.params.meta + if validated_request.root.params + else None, + request=validated_request, + session=self, + on_complete=lambda r: self._in_flight.pop(r.request_id, None), + message_metadata=message.metadata, + ) + self._in_flight[responder.request_id] = responder + await self._received_request(responder) + + if not responder._completed: # type: ignore[reportPrivateUsage] + await self._handle_incoming(responder) + except Exception as e: + # For request validation errors, send a proper JSON-RPC error + # response instead of crashing the server + logging.warning(f"Failed to validate request: {e}") + logging.debug(f"Message that failed validation: {message.message.root}") + error_response = JSONRPCError( + jsonrpc="2.0", + id=message.message.root.id, + error=ErrorData( + code=INVALID_PARAMS, + message="Invalid request parameters", + 
data="", + ), + ) + session_message = SessionMessage(message=JSONRPCMessage(error_response)) + await self._write_stream.send(session_message) - elif isinstance(message.root, JSONRPCNotification): - try: - notification = self._receive_notification_type.model_validate( - message.root.model_dump( - by_alias=True, mode="json", exclude_none=True + elif isinstance(message.message.root, JSONRPCNotification): + try: + notification = self._receive_notification_type.model_validate( + message.message.root.model_dump(by_alias=True, mode="json", exclude_none=True) ) - ) - # Handle cancellation notifications - if isinstance(notification.root, CancelledNotification): - cancelled_id = notification.root.params.requestId - if cancelled_id in self._in_flight: - await self._in_flight[cancelled_id].cancel() + # Handle cancellation notifications + if isinstance(notification.root, CancelledNotification): + cancelled_id = notification.root.params.requestId + if cancelled_id in self._in_flight: + await self._in_flight[cancelled_id].cancel() + else: + # Handle progress notifications callback + if isinstance(notification.root, ProgressNotification): + progress_token = notification.root.params.progressToken + # If there is a progress callback for this token, + # call it with the progress information + if progress_token in self._progress_callbacks: + callback = self._progress_callbacks[progress_token] + await callback( + notification.root.params.progress, + notification.root.params.total, + notification.root.params.message, + ) + await self._received_notification(notification) + await self._handle_incoming(notification) + except Exception as e: + # For other validation errors, log and continue + logging.warning( + f"Failed to validate notification: {e}. " f"Message was: {message.message.root}" + ) + else: # Response or error + stream = self._response_streams.pop(message.message.root.id, None) + if stream: + await stream.send(message.message.root) else: - await self._received_notification(notification) - await self._handle_incoming(notification) - except Exception as e: - # For other validation errors, log and continue - logging.warning( - f"Failed to validate notification: {e}. " - f"Message was: {message.root}" - ) - else: # Response or error - stream = self._response_streams.pop(message.root.id, None) - if stream: - await stream.send(message.root) - else: - await self._handle_incoming( - RuntimeError( - "Received response with an unknown " - f"request ID: {message}" + await self._handle_incoming( + RuntimeError("Received response with an unknown " f"request ID: {message}") ) - ) - async def _received_request( - self, responder: RequestResponder[ReceiveRequestT, SendResultT] - ) -> None: + except anyio.ClosedResourceError: + # This is expected when the client disconnects abruptly. + # Without this handler, the exception would propagate up and + # crash the server's task group. + logging.debug("Read stream closed by client") + except Exception as e: + # Other exceptions are not expected and should be logged. We purposefully + # catch all exceptions here to avoid crashing the server. 
+ logging.exception(f"Unhandled exception in receive loop: {e}") + finally: + # after the read stream is closed, we need to send errors + # to any pending requests + for id, stream in self._response_streams.items(): + error = ErrorData(code=CONNECTION_CLOSED, message="Connection closed") + try: + await stream.send(JSONRPCError(jsonrpc="2.0", id=id, error=error)) + await stream.aclose() + except Exception: + # Stream might already be closed + pass + self._response_streams.clear() + + async def _received_request(self, responder: RequestResponder[ReceiveRequestT, SendResultT]) -> None: """ Can be overridden by subclasses to handle a request without needing to listen on the message stream. @@ -377,7 +451,11 @@ async def _received_notification(self, notification: ReceiveNotificationT) -> No """ async def send_progress_notification( - self, progress_token: str | int, progress: float, total: float | None = None + self, + progress_token: str | int, + progress: float, + total: float | None = None, + message: str | None = None, ) -> None: """ Sends a progress notification for a request that is currently being @@ -386,9 +464,7 @@ async def send_progress_notification( async def _handle_incoming( self, - req: RequestResponder[ReceiveRequestT, SendResultT] - | ReceiveNotificationT - | Exception, + req: RequestResponder[ReceiveRequestT, SendResultT] | ReceiveNotificationT | Exception, ) -> None: """A generic handler for incoming messages. Overwritten by subclasses.""" pass diff --git a/src/mcp/shared/version.py b/src/mcp/shared/version.py index 8fd13b992..d00077705 100644 --- a/src/mcp/shared/version.py +++ b/src/mcp/shared/version.py @@ -1,3 +1,3 @@ from mcp.types import LATEST_PROTOCOL_VERSION -SUPPORTED_PROTOCOL_VERSIONS: tuple[int, str] = (1, LATEST_PROTOCOL_VERSION) +SUPPORTED_PROTOCOL_VERSIONS: list[str] = ["2024-11-05", LATEST_PROTOCOL_VERSION] diff --git a/src/mcp/types.py b/src/mcp/types.py index bd71d51f0..d5663dad6 100644 --- a/src/mcp/types.py +++ b/src/mcp/types.py @@ -1,15 +1,9 @@ from collections.abc import Callable -from typing import ( - Annotated, - Any, - Generic, - Literal, - TypeAlias, - TypeVar, -) +from typing import Annotated, Any, Generic, Literal, TypeAlias, TypeVar from pydantic import BaseModel, ConfigDict, Field, FileUrl, RootModel from pydantic.networks import AnyUrl, UrlConstraints +from typing_extensions import deprecated """ Model Context Protocol bindings for Python @@ -29,12 +23,20 @@ not separate types in the schema. """ -LATEST_PROTOCOL_VERSION = "2024-11-05" +LATEST_PROTOCOL_VERSION = "2025-03-26" + +""" +The default negotiated version of the Model Context Protocol when no version is specified. +We need this to satisfy the MCP specification, which requires the server to assume a +specific version if none is provided by the client. See section "Protocol Version Header" at +https://modelcontextprotocol.io/specification +""" +DEFAULT_NEGOTIATED_VERSION = "2025-03-26" ProgressToken = str | int Cursor = str Role = Literal["user", "assistant"] -RequestId = str | int +RequestId = Annotated[int | str, Field(union_mode="left_to_right")] AnyFunction: TypeAlias = Callable[..., Any] @@ -53,21 +55,27 @@ class Meta(BaseModel): meta: Meta | None = Field(alias="_meta", default=None) +class PaginatedRequestParams(RequestParams): + cursor: Cursor | None = None + """ + An opaque token representing the current pagination position. + If provided, the server should return results starting after this cursor. 
+ """ + + class NotificationParams(BaseModel): class Meta(BaseModel): model_config = ConfigDict(extra="allow") meta: Meta | None = Field(alias="_meta", default=None) """ - This parameter name is reserved by MCP to allow clients and servers to attach - additional metadata to their notifications. + See [MCP specification](https://github.com/modelcontextprotocol/modelcontextprotocol/blob/47339c03c143bb4ec01a26e721a1b8fe66634ebe/docs/specification/draft/basic/index.mdx#general-fields) + for notes on _meta usage. """ RequestParamsT = TypeVar("RequestParamsT", bound=RequestParams | dict[str, Any] | None) -NotificationParamsT = TypeVar( - "NotificationParamsT", bound=NotificationParams | dict[str, Any] | None -) +NotificationParamsT = TypeVar("NotificationParamsT", bound=NotificationParams | dict[str, Any] | None) MethodT = TypeVar("MethodT", bound=str) @@ -79,12 +87,11 @@ class Request(BaseModel, Generic[RequestParamsT, MethodT]): model_config = ConfigDict(extra="allow") -class PaginatedRequest(Request[RequestParamsT, MethodT]): - cursor: Cursor | None = None - """ - An opaque token representing the current pagination position. - If provided, the server should return results starting after this cursor. - """ +class PaginatedRequest(Request[PaginatedRequestParams | None, MethodT], Generic[MethodT]): + """Base class for paginated requests, + matching the schema's PaginatedRequest interface.""" + + params: PaginatedRequestParams | None = None class Notification(BaseModel, Generic[NotificationParamsT, MethodT]): @@ -98,13 +105,12 @@ class Notification(BaseModel, Generic[NotificationParamsT, MethodT]): class Result(BaseModel): """Base class for JSON-RPC results.""" - model_config = ConfigDict(extra="allow") - meta: dict[str, Any] | None = Field(alias="_meta", default=None) """ - This result property is reserved by the protocol to allow clients and servers to - attach additional metadata to their responses. + See [MCP specification](https://github.com/modelcontextprotocol/modelcontextprotocol/blob/47339c03c143bb4ec01a26e721a1b8fe66634ebe/docs/specification/draft/basic/index.mdx#general-fields) + for notes on _meta usage. """ + model_config = ConfigDict(extra="allow") class PaginatedResult(Result): @@ -140,6 +146,10 @@ class JSONRPCResponse(BaseModel): model_config = ConfigDict(extra="allow") +# SDK error codes +CONNECTION_CLOSED = -32000 +# REQUEST_TIMEOUT = -32001 # the typescript sdk uses this + # Standard JSON-RPC error codes PARSE_ERROR = -32700 INVALID_REQUEST = -32600 @@ -178,9 +188,7 @@ class JSONRPCError(BaseModel): model_config = ConfigDict(extra="allow") -class JSONRPCMessage( - RootModel[JSONRPCRequest | JSONRPCNotification | JSONRPCResponse | JSONRPCError] -): +class JSONRPCMessage(RootModel[JSONRPCRequest | JSONRPCNotification | JSONRPCResponse | JSONRPCError]): pass @@ -188,10 +196,26 @@ class EmptyResult(Result): """A response that indicates success but carries no data.""" -class Implementation(BaseModel): - """Describes the name and version of an MCP implementation.""" +class BaseMetadata(BaseModel): + """Base class for entities with name and optional title fields.""" name: str + """The programmatic name of the entity.""" + + title: str | None = None + """ + Intended for UI and end-user contexts — optimized to be human-readable and easily understood, + even by those unfamiliar with domain-specific terminology. + + If not provided, the name should be used for display (except for Tool, + where `annotations.title` should be given precedence over using `name`, + if present). 
+ """ + + +class Implementation(BaseMetadata): + """Describes the name and version of an MCP implementation.""" + version: str model_config = ConfigDict(extra="allow") @@ -205,7 +229,13 @@ class RootsCapability(BaseModel): class SamplingCapability(BaseModel): - """Capability for logging operations.""" + """Capability for sampling operations.""" + + model_config = ConfigDict(extra="allow") + + +class ElicitationCapability(BaseModel): + """Capability for elicitation operations.""" model_config = ConfigDict(extra="allow") @@ -217,6 +247,8 @@ class ClientCapabilities(BaseModel): """Experimental, non-standard capabilities that the client supports.""" sampling: SamplingCapability | None = None """Present if the client supports sampling from an LLM.""" + elicitation: ElicitationCapability | None = None + """Present if the client supports elicitation from the user.""" roots: RootsCapability | None = None """Present if the client supports listing roots.""" model_config = ConfigDict(extra="allow") @@ -301,9 +333,7 @@ class InitializeResult(Result): """Instructions describing how to use the server and its features.""" -class InitializedNotification( - Notification[NotificationParams | None, Literal["notifications/initialized"]] -): +class InitializedNotification(Notification[NotificationParams | None, Literal["notifications/initialized"]]): """ This notification is sent from the client to the server after initialization has finished. @@ -338,12 +368,15 @@ class ProgressNotificationParams(NotificationParams): """ total: float | None = None """Total number of items to process (or total progress required), if known.""" + message: str | None = None + """ + Message related to progress. This should provide relevant human readable + progress information. + """ model_config = ConfigDict(extra="allow") -class ProgressNotification( - Notification[ProgressNotificationParams, Literal["notifications/progress"]] -): +class ProgressNotification(Notification[ProgressNotificationParams, Literal["notifications/progress"]]): """ An out-of-band notification used to inform the receiver of a progress update for a long-running request. @@ -353,13 +386,10 @@ class ProgressNotification( params: ProgressNotificationParams -class ListResourcesRequest( - PaginatedRequest[RequestParams | None, Literal["resources/list"]] -): +class ListResourcesRequest(PaginatedRequest[Literal["resources/list"]]): """Sent from the client to request a list of resources the server has.""" method: Literal["resources/list"] - params: RequestParams | None = None class Annotations(BaseModel): @@ -368,13 +398,11 @@ class Annotations(BaseModel): model_config = ConfigDict(extra="allow") -class Resource(BaseModel): +class Resource(BaseMetadata): """A known resource that the server is capable of reading.""" uri: Annotated[AnyUrl, UrlConstraints(host_required=False)] """The URI of this resource.""" - name: str - """A human-readable name for this resource.""" description: str | None = None """A description of what this resource represents.""" mimeType: str | None = None @@ -387,10 +415,15 @@ class Resource(BaseModel): This can be used by Hosts to display file sizes and estimate context window usage. """ annotations: Annotations | None = None + meta: dict[str, Any] | None = Field(alias="_meta", default=None) + """ + See [MCP specification](https://github.com/modelcontextprotocol/modelcontextprotocol/blob/47339c03c143bb4ec01a26e721a1b8fe66634ebe/docs/specification/draft/basic/index.mdx#general-fields) + for notes on _meta usage. 
+ """ model_config = ConfigDict(extra="allow") -class ResourceTemplate(BaseModel): +class ResourceTemplate(BaseMetadata): """A template description for resources available on the server.""" uriTemplate: str @@ -398,8 +431,6 @@ class ResourceTemplate(BaseModel): A URI template (according to RFC 6570) that can be used to construct resource URIs. """ - name: str - """A human-readable name for the type of resource this template refers to.""" description: str | None = None """A human-readable description of what this template is for.""" mimeType: str | None = None @@ -408,6 +439,11 @@ class ResourceTemplate(BaseModel): included if all resources matching this template have the same type. """ annotations: Annotations | None = None + meta: dict[str, Any] | None = Field(alias="_meta", default=None) + """ + See [MCP specification](https://github.com/modelcontextprotocol/modelcontextprotocol/blob/47339c03c143bb4ec01a26e721a1b8fe66634ebe/docs/specification/draft/basic/index.mdx#general-fields) + for notes on _meta usage. + """ model_config = ConfigDict(extra="allow") @@ -417,13 +453,10 @@ class ListResourcesResult(PaginatedResult): resources: list[Resource] -class ListResourceTemplatesRequest( - PaginatedRequest[RequestParams | None, Literal["resources/templates/list"]] -): +class ListResourceTemplatesRequest(PaginatedRequest[Literal["resources/templates/list"]]): """Sent from the client to request a list of resource templates the server has.""" method: Literal["resources/templates/list"] - params: RequestParams | None = None class ListResourceTemplatesResult(PaginatedResult): @@ -443,9 +476,7 @@ class ReadResourceRequestParams(RequestParams): model_config = ConfigDict(extra="allow") -class ReadResourceRequest( - Request[ReadResourceRequestParams, Literal["resources/read"]] -): +class ReadResourceRequest(Request[ReadResourceRequestParams, Literal["resources/read"]]): """Sent from the client to the server, to read a specific resource URI.""" method: Literal["resources/read"] @@ -459,6 +490,11 @@ class ResourceContents(BaseModel): """The URI of this resource.""" mimeType: str | None = None """The MIME type of this resource, if known.""" + meta: dict[str, Any] | None = Field(alias="_meta", default=None) + """ + See [MCP specification](https://github.com/modelcontextprotocol/modelcontextprotocol/blob/47339c03c143bb4ec01a26e721a1b8fe66634ebe/docs/specification/draft/basic/index.mdx#general-fields) + for notes on _meta usage. + """ model_config = ConfigDict(extra="allow") @@ -486,9 +522,7 @@ class ReadResourceResult(Result): class ResourceListChangedNotification( - Notification[ - NotificationParams | None, Literal["notifications/resources/list_changed"] - ] + Notification[NotificationParams | None, Literal["notifications/resources/list_changed"]] ): """ An optional notification from the server to the client, informing it that the list @@ -528,9 +562,7 @@ class UnsubscribeRequestParams(RequestParams): model_config = ConfigDict(extra="allow") -class UnsubscribeRequest( - Request[UnsubscribeRequestParams, Literal["resources/unsubscribe"]] -): +class UnsubscribeRequest(Request[UnsubscribeRequestParams, Literal["resources/unsubscribe"]]): """ Sent from the client to request cancellation of resources/updated notifications from the server. 
@@ -552,9 +584,7 @@ class ResourceUpdatedNotificationParams(NotificationParams): class ResourceUpdatedNotification( - Notification[ - ResourceUpdatedNotificationParams, Literal["notifications/resources/updated"] - ] + Notification[ResourceUpdatedNotificationParams, Literal["notifications/resources/updated"]] ): """ A notification from the server to the client, informing it that a resource has @@ -565,13 +595,10 @@ class ResourceUpdatedNotification( params: ResourceUpdatedNotificationParams -class ListPromptsRequest( - PaginatedRequest[RequestParams | None, Literal["prompts/list"]] -): +class ListPromptsRequest(PaginatedRequest[Literal["prompts/list"]]): """Sent from the client to request a list of prompts and prompt templates.""" method: Literal["prompts/list"] - params: RequestParams | None = None class PromptArgument(BaseModel): @@ -586,15 +613,18 @@ class PromptArgument(BaseModel): model_config = ConfigDict(extra="allow") -class Prompt(BaseModel): +class Prompt(BaseMetadata): """A prompt or prompt template that the server offers.""" - name: str - """The name of the prompt or prompt template.""" description: str | None = None """An optional description of what this prompt provides.""" arguments: list[PromptArgument] | None = None """A list of arguments to use for templating the prompt.""" + meta: dict[str, Any] | None = Field(alias="_meta", default=None) + """ + See [MCP specification](https://github.com/modelcontextprotocol/modelcontextprotocol/blob/47339c03c143bb4ec01a26e721a1b8fe66634ebe/docs/specification/draft/basic/index.mdx#general-fields) + for notes on _meta usage. + """ model_config = ConfigDict(extra="allow") @@ -628,6 +658,11 @@ class TextContent(BaseModel): text: str """The text content of the message.""" annotations: Annotations | None = None + meta: dict[str, Any] | None = Field(alias="_meta", default=None) + """ + See [MCP specification](https://github.com/modelcontextprotocol/modelcontextprotocol/blob/47339c03c143bb4ec01a26e721a1b8fe66634ebe/docs/specification/draft/basic/index.mdx#general-fields) + for notes on _meta usage. + """ model_config = ConfigDict(extra="allow") @@ -643,6 +678,31 @@ class ImageContent(BaseModel): image types. """ annotations: Annotations | None = None + meta: dict[str, Any] | None = Field(alias="_meta", default=None) + """ + See [MCP specification](https://github.com/modelcontextprotocol/modelcontextprotocol/blob/47339c03c143bb4ec01a26e721a1b8fe66634ebe/docs/specification/draft/basic/index.mdx#general-fields) + for notes on _meta usage. + """ + model_config = ConfigDict(extra="allow") + + +class AudioContent(BaseModel): + """Audio content for a message.""" + + type: Literal["audio"] + data: str + """The base64-encoded audio data.""" + mimeType: str + """ + The MIME type of the audio. Different providers may support different + audio types. + """ + annotations: Annotations | None = None + meta: dict[str, Any] | None = Field(alias="_meta", default=None) + """ + See [MCP specification](https://github.com/modelcontextprotocol/modelcontextprotocol/blob/47339c03c143bb4ec01a26e721a1b8fe66634ebe/docs/specification/draft/basic/index.mdx#general-fields) + for notes on _meta usage. 
+ """ model_config = ConfigDict(extra="allow") @@ -650,7 +710,7 @@ class SamplingMessage(BaseModel): """Describes a message issued to or received from an LLM API.""" role: Role - content: TextContent | ImageContent + content: TextContent | ImageContent | AudioContent model_config = ConfigDict(extra="allow") @@ -665,14 +725,36 @@ class EmbeddedResource(BaseModel): type: Literal["resource"] resource: TextResourceContents | BlobResourceContents annotations: Annotations | None = None + meta: dict[str, Any] | None = Field(alias="_meta", default=None) + """ + See [MCP specification](https://github.com/modelcontextprotocol/modelcontextprotocol/blob/47339c03c143bb4ec01a26e721a1b8fe66634ebe/docs/specification/draft/basic/index.mdx#general-fields) + for notes on _meta usage. + """ model_config = ConfigDict(extra="allow") +class ResourceLink(Resource): + """ + A resource that the server is capable of reading, included in a prompt or tool call result. + + Note: resource links returned by tools are not guaranteed to appear in the results of `resources/list` requests. + """ + + type: Literal["resource_link"] + + +ContentBlock = TextContent | ImageContent | AudioContent | ResourceLink | EmbeddedResource +"""A content block that can be used in prompts and tool results.""" + +Content: TypeAlias = ContentBlock +# """DEPRECATED: Content is deprecated, you should use ContentBlock directly.""" + + class PromptMessage(BaseModel): """Describes a message returned as part of a prompt.""" role: Role - content: TextContent | ImageContent | EmbeddedResource + content: ContentBlock model_config = ConfigDict(extra="allow") @@ -685,9 +767,7 @@ class GetPromptResult(Result): class PromptListChangedNotification( - Notification[ - NotificationParams | None, Literal["notifications/prompts/list_changed"] - ] + Notification[NotificationParams | None, Literal["notifications/prompts/list_changed"]] ): """ An optional notification from the server to the client, informing it that the list @@ -698,22 +778,74 @@ class PromptListChangedNotification( params: NotificationParams | None = None -class ListToolsRequest(PaginatedRequest[RequestParams | None, Literal["tools/list"]]): +class ListToolsRequest(PaginatedRequest[Literal["tools/list"]]): """Sent from the client to request a list of tools the server has.""" method: Literal["tools/list"] - params: RequestParams | None = None -class Tool(BaseModel): +class ToolAnnotations(BaseModel): + """ + Additional properties describing a Tool to clients. + + NOTE: all properties in ToolAnnotations are **hints**. + They are not guaranteed to provide a faithful description of + tool behavior (including descriptive properties like `title`). + + Clients should never make tool use decisions based on ToolAnnotations + received from untrusted servers. + """ + + title: str | None = None + """A human-readable title for the tool.""" + + readOnlyHint: bool | None = None + """ + If true, the tool does not modify its environment. + Default: false + """ + + destructiveHint: bool | None = None + """ + If true, the tool may perform destructive updates to its environment. + If false, the tool performs only additive updates. + (This property is meaningful only when `readOnlyHint == false`) + Default: true + """ + + idempotentHint: bool | None = None + """ + If true, calling the tool repeatedly with the same arguments + will have no additional effect on the its environment. 
+ (This property is meaningful only when `readOnlyHint == false`) + Default: false + """ + + openWorldHint: bool | None = None + """ + If true, this tool may interact with an "open world" of external + entities. If false, the tool's domain of interaction is closed. + For example, the world of a web search tool is open, whereas that + of a memory tool is not. + Default: true + """ + model_config = ConfigDict(extra="allow") + + +class Tool(BaseMetadata): """Definition for a tool the client can call.""" - name: str - """The name of the tool.""" description: str | None = None """A human-readable description of the tool.""" inputSchema: dict[str, Any] """A JSON Schema object defining the expected parameters for the tool.""" + annotations: ToolAnnotations | None = None + """Optional additional tool information.""" + meta: dict[str, Any] | None = Field(alias="_meta", default=None) + """ + See [MCP specification](https://github.com/modelcontextprotocol/modelcontextprotocol/blob/47339c03c143bb4ec01a26e721a1b8fe66634ebe/docs/specification/draft/basic/index.mdx#general-fields) + for notes on _meta usage. + """ model_config = ConfigDict(extra="allow") @@ -741,13 +873,11 @@ class CallToolRequest(Request[CallToolRequestParams, Literal["tools/call"]]): class CallToolResult(Result): """The server's response to a tool call.""" - content: list[TextContent | ImageContent | EmbeddedResource] + content: list[ContentBlock] isError: bool = False -class ToolListChangedNotification( - Notification[NotificationParams | None, Literal["notifications/tools/list_changed"]] -): +class ToolListChangedNotification(Notification[NotificationParams | None, Literal["notifications/tools/list_changed"]]): """ An optional notification from the server to the client, informing it that the list of tools it offers has changed. 
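As a concrete, hypothetical illustration of the tool metadata above, a read-only, open-world tool could be described as in the sketch below; the name, schema, and hint values are examples only, and, per the note above, clients should treat the hints as untrusted:

    from mcp.types import Tool, ToolAnnotations

    # Hypothetical search tool; annotation hints are advisory and must not be
    # used for trust decisions when they come from untrusted servers.
    search_tool = Tool(
        name="web_search",
        title="Web Search",
        description="Search the public web for a query string.",
        inputSchema={
            "type": "object",
            "properties": {"query": {"type": "string"}},
            "required": ["query"],
        },
        annotations=ToolAnnotations(readOnlyHint=True, openWorldHint=True),
    )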
@@ -757,9 +887,7 @@ class ToolListChangedNotification( params: NotificationParams | None = None -LoggingLevel = Literal[ - "debug", "info", "notice", "warning", "error", "critical", "alert", "emergency" -] +LoggingLevel = Literal["debug", "info", "notice", "warning", "error", "critical", "alert", "emergency"] class SetLevelRequestParams(RequestParams): @@ -792,9 +920,7 @@ class LoggingMessageNotificationParams(NotificationParams): model_config = ConfigDict(extra="allow") -class LoggingMessageNotification( - Notification[LoggingMessageNotificationParams, Literal["notifications/message"]] -): +class LoggingMessageNotification(Notification[LoggingMessageNotificationParams, Literal["notifications/message"]]): """Notification of a log message passed from server to client.""" method: Literal["notifications/message"] @@ -889,9 +1015,7 @@ class CreateMessageRequestParams(RequestParams): model_config = ConfigDict(extra="allow") -class CreateMessageRequest( - Request[CreateMessageRequestParams, Literal["sampling/createMessage"]] -): +class CreateMessageRequest(Request[CreateMessageRequestParams, Literal["sampling/createMessage"]]): """A request from the server to sample an LLM via the client.""" method: Literal["sampling/createMessage"] @@ -905,14 +1029,14 @@ class CreateMessageResult(Result): """The client's response to a sampling/create_message request from the server.""" role: Role - content: TextContent | ImageContent + content: TextContent | ImageContent | AudioContent model: str """The name of the model that generated the message.""" stopReason: StopReason | None = None """The reason why sampling stopped, if known.""" -class ResourceReference(BaseModel): +class ResourceTemplateReference(BaseModel): """A reference to a resource or resource template definition.""" type: Literal["ref/resource"] @@ -921,6 +1045,11 @@ class ResourceReference(BaseModel): model_config = ConfigDict(extra="allow") +@deprecated("`ResourceReference` is deprecated, you should use `ResourceTemplateReference`.") +class ResourceReference(ResourceTemplateReference): + pass + + class PromptReference(BaseModel): """Identifies a prompt.""" @@ -940,11 +1069,21 @@ class CompletionArgument(BaseModel): model_config = ConfigDict(extra="allow") +class CompletionContext(BaseModel): + """Additional, optional context for completions.""" + + arguments: dict[str, str] | None = None + """Previously-resolved variables in a URI template or prompt.""" + model_config = ConfigDict(extra="allow") + + class CompleteRequestParams(RequestParams): """Parameters for completion requests.""" - ref: ResourceReference | PromptReference + ref: ResourceTemplateReference | PromptReference argument: CompletionArgument + context: CompletionContext | None = None + """Additional, optional context for completions""" model_config = ConfigDict(extra="allow") @@ -1009,6 +1148,11 @@ class Root(BaseModel): identifier for the root, which may be useful for display purposes or for referencing the root in other parts of the application. """ + meta: dict[str, Any] | None = Field(alias="_meta", default=None) + """ + See [MCP specification](https://github.com/modelcontextprotocol/modelcontextprotocol/blob/47339c03c143bb4ec01a26e721a1b8fe66634ebe/docs/specification/draft/basic/index.mdx#general-fields) + for notes on _meta usage. 
+ """ model_config = ConfigDict(extra="allow") @@ -1048,9 +1192,7 @@ class CancelledNotificationParams(NotificationParams): model_config = ConfigDict(extra="allow") -class CancelledNotification( - Notification[CancelledNotificationParams, Literal["notifications/cancelled"]] -): +class CancelledNotification(Notification[CancelledNotificationParams, Literal["notifications/cancelled"]]): """ This notification can be sent by either side to indicate that it is canceling a previously-issued request. @@ -1081,21 +1223,54 @@ class ClientRequest( class ClientNotification( - RootModel[ - CancelledNotification - | ProgressNotification - | InitializedNotification - | RootsListChangedNotification - ] + RootModel[CancelledNotification | ProgressNotification | InitializedNotification | RootsListChangedNotification] ): pass -class ClientResult(RootModel[EmptyResult | CreateMessageResult | ListRootsResult]): +# Type for elicitation schema - a JSON Schema dict +ElicitRequestedSchema: TypeAlias = dict[str, Any] +"""Schema for elicitation requests.""" + + +class ElicitRequestParams(RequestParams): + """Parameters for elicitation requests.""" + + message: str + requestedSchema: ElicitRequestedSchema + model_config = ConfigDict(extra="allow") + + +class ElicitRequest(Request[ElicitRequestParams, Literal["elicitation/create"]]): + """A request from the server to elicit information from the client.""" + + method: Literal["elicitation/create"] + params: ElicitRequestParams + + +class ElicitResult(Result): + """The client's response to an elicitation request.""" + + action: Literal["accept", "decline", "cancel"] + """ + The user action in response to the elicitation. + - "accept": User submitted the form/confirmed the action + - "decline": User explicitly declined the action + - "cancel": User dismissed without making an explicit choice + """ + + content: dict[str, str | int | float | bool | None] | None = None + """ + The submitted form data, only present when action is "accept". + Contains values matching the requested schema. 
+ """ + + +class ClientResult(RootModel[EmptyResult | CreateMessageResult | ListRootsResult | ElicitResult]): pass -class ServerRequest(RootModel[PingRequest | CreateMessageRequest | ListRootsRequest]): +class ServerRequest(RootModel[PingRequest | CreateMessageRequest | ListRootsRequest | ElicitRequest]): pass diff --git a/tests/client/conftest.py b/tests/client/conftest.py new file mode 100644 index 000000000..0c8283903 --- /dev/null +++ b/tests/client/conftest.py @@ -0,0 +1,137 @@ +from contextlib import asynccontextmanager +from unittest.mock import patch + +import pytest + +import mcp.shared.memory +from mcp.shared.message import SessionMessage +from mcp.types import ( + JSONRPCNotification, + JSONRPCRequest, +) + + +class SpyMemoryObjectSendStream: + def __init__(self, original_stream): + self.original_stream = original_stream + self.sent_messages: list[SessionMessage] = [] + + async def send(self, message): + self.sent_messages.append(message) + await self.original_stream.send(message) + + async def aclose(self): + await self.original_stream.aclose() + + async def __aenter__(self): + return self + + async def __aexit__(self, *args): + await self.aclose() + + +class StreamSpyCollection: + def __init__( + self, + client_spy: SpyMemoryObjectSendStream, + server_spy: SpyMemoryObjectSendStream, + ): + self.client = client_spy + self.server = server_spy + + def clear(self) -> None: + """Clear all captured messages.""" + self.client.sent_messages.clear() + self.server.sent_messages.clear() + + def get_client_requests(self, method: str | None = None) -> list[JSONRPCRequest]: + """Get client-sent requests, optionally filtered by method.""" + return [ + req.message.root + for req in self.client.sent_messages + if isinstance(req.message.root, JSONRPCRequest) and (method is None or req.message.root.method == method) + ] + + def get_server_requests(self, method: str | None = None) -> list[JSONRPCRequest]: + """Get server-sent requests, optionally filtered by method.""" + return [ + req.message.root + for req in self.server.sent_messages + if isinstance(req.message.root, JSONRPCRequest) and (method is None or req.message.root.method == method) + ] + + def get_client_notifications(self, method: str | None = None) -> list[JSONRPCNotification]: + """Get client-sent notifications, optionally filtered by method.""" + return [ + notif.message.root + for notif in self.client.sent_messages + if isinstance(notif.message.root, JSONRPCNotification) + and (method is None or notif.message.root.method == method) + ] + + def get_server_notifications(self, method: str | None = None) -> list[JSONRPCNotification]: + """Get server-sent notifications, optionally filtered by method.""" + return [ + notif.message.root + for notif in self.server.sent_messages + if isinstance(notif.message.root, JSONRPCNotification) + and (method is None or notif.message.root.method == method) + ] + + +@pytest.fixture +def stream_spy(): + """Fixture that provides spies for both client and server write streams. + + Example usage: + async def test_something(stream_spy): + # ... set up server and client ... 
+ + spies = stream_spy() + + # Run some operation that sends messages + await client.some_operation() + + # Check the messages + requests = spies.get_client_requests(method="some/method") + assert len(requests) == 1 + + # Clear for the next operation + spies.clear() + """ + client_spy = None + server_spy = None + + # Store references to our spy objects + def capture_spies(c_spy, s_spy): + nonlocal client_spy, server_spy + client_spy = c_spy + server_spy = s_spy + + # Create patched version of stream creation + original_create_streams = mcp.shared.memory.create_client_server_memory_streams + + @asynccontextmanager + async def patched_create_streams(): + async with original_create_streams() as (client_streams, server_streams): + client_read, client_write = client_streams + server_read, server_write = server_streams + + # Create spy wrappers + spy_client_write = SpyMemoryObjectSendStream(client_write) + spy_server_write = SpyMemoryObjectSendStream(server_write) + + # Capture references for the test to use + capture_spies(spy_client_write, spy_server_write) + + yield (client_read, spy_client_write), (server_read, spy_server_write) + + # Apply the patch for the duration of the test + with patch("mcp.shared.memory.create_client_server_memory_streams", patched_create_streams): + # Return a collection with helper methods + def get_spy_collection() -> StreamSpyCollection: + assert client_spy is not None, "client_spy was not initialized" + assert server_spy is not None, "server_spy was not initialized" + return StreamSpyCollection(client_spy, server_spy) + + yield get_spy_collection diff --git a/tests/client/test_auth.py b/tests/client/test_auth.py new file mode 100644 index 000000000..8dee687a9 --- /dev/null +++ b/tests/client/test_auth.py @@ -0,0 +1,315 @@ +""" +Tests for refactored OAuth client authentication implementation. 
+""" + +import time + +import httpx +import pytest +from pydantic import AnyHttpUrl, AnyUrl + +from mcp.client.auth import OAuthClientProvider, PKCEParameters +from mcp.shared.auth import ( + OAuthClientInformationFull, + OAuthClientMetadata, + OAuthToken, +) + + +class MockTokenStorage: + """Mock token storage for testing.""" + + def __init__(self): + self._tokens: OAuthToken | None = None + self._client_info: OAuthClientInformationFull | None = None + + async def get_tokens(self) -> OAuthToken | None: + return self._tokens + + async def set_tokens(self, tokens: OAuthToken) -> None: + self._tokens = tokens + + async def get_client_info(self) -> OAuthClientInformationFull | None: + return self._client_info + + async def set_client_info(self, client_info: OAuthClientInformationFull) -> None: + self._client_info = client_info + + +@pytest.fixture +def mock_storage(): + return MockTokenStorage() + + +@pytest.fixture +def client_metadata(): + return OAuthClientMetadata( + client_name="Test Client", + client_uri=AnyHttpUrl("https://example.com"), + redirect_uris=[AnyUrl("http://localhost:3030/callback")], + scope="read write", + ) + + +@pytest.fixture +def valid_tokens(): + return OAuthToken( + access_token="test_access_token", + token_type="Bearer", + expires_in=3600, + refresh_token="test_refresh_token", + scope="read write", + ) + + +@pytest.fixture +def oauth_provider(client_metadata, mock_storage): + async def redirect_handler(url: str) -> None: + """Mock redirect handler.""" + pass + + async def callback_handler() -> tuple[str, str | None]: + """Mock callback handler.""" + return "test_auth_code", "test_state" + + return OAuthClientProvider( + server_url="https://api.example.com/v1/mcp", + client_metadata=client_metadata, + storage=mock_storage, + redirect_handler=redirect_handler, + callback_handler=callback_handler, + ) + + +class TestPKCEParameters: + """Test PKCE parameter generation.""" + + def test_pkce_generation(self): + """Test PKCE parameter generation creates valid values.""" + pkce = PKCEParameters.generate() + + # Verify lengths + assert len(pkce.code_verifier) == 128 + assert 43 <= len(pkce.code_challenge) <= 128 + + # Verify characters used in verifier + allowed_chars = set("ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789-._~") + assert all(c in allowed_chars for c in pkce.code_verifier) + + # Verify base64url encoding in challenge (no padding) + assert "=" not in pkce.code_challenge + + def test_pkce_uniqueness(self): + """Test PKCE generates unique values each time.""" + pkce1 = PKCEParameters.generate() + pkce2 = PKCEParameters.generate() + + assert pkce1.code_verifier != pkce2.code_verifier + assert pkce1.code_challenge != pkce2.code_challenge + + +class TestOAuthContext: + """Test OAuth context functionality.""" + + @pytest.mark.anyio + async def test_oauth_provider_initialization(self, oauth_provider, client_metadata, mock_storage): + """Test OAuthClientProvider basic setup.""" + assert oauth_provider.context.server_url == "https://api.example.com/v1/mcp" + assert oauth_provider.context.client_metadata == client_metadata + assert oauth_provider.context.storage == mock_storage + assert oauth_provider.context.timeout == 300.0 + assert oauth_provider.context is not None + + def test_context_url_parsing(self, oauth_provider): + """Test get_authorization_base_url() extracts base URLs correctly.""" + context = oauth_provider.context + + # Test with path + assert context.get_authorization_base_url("https://api.example.com/v1/mcp") == "https://api.example.com" 
+ + # Test with no path + assert context.get_authorization_base_url("https://api.example.com") == "https://api.example.com" + + # Test with port + assert ( + context.get_authorization_base_url("https://api.example.com:8080/path/to/mcp") + == "https://api.example.com:8080" + ) + + # Test with query params + assert ( + context.get_authorization_base_url("https://api.example.com/path?param=value") == "https://api.example.com" + ) + + @pytest.mark.anyio + async def test_token_validity_checking(self, oauth_provider, mock_storage, valid_tokens): + """Test is_token_valid() and can_refresh_token() logic.""" + context = oauth_provider.context + + # No tokens - should be invalid + assert not context.is_token_valid() + assert not context.can_refresh_token() + + # Set valid tokens and client info + context.current_tokens = valid_tokens + context.token_expiry_time = time.time() + 1800 # 30 minutes from now + context.client_info = OAuthClientInformationFull( + client_id="test_client_id", + client_secret="test_client_secret", + redirect_uris=[AnyUrl("http://localhost:3030/callback")], + ) + + # Should be valid + assert context.is_token_valid() + assert context.can_refresh_token() # Has refresh token and client info + + # Expire the token + context.token_expiry_time = time.time() - 100 # Expired 100 seconds ago + assert not context.is_token_valid() + assert context.can_refresh_token() # Can still refresh + + # Remove refresh token + context.current_tokens.refresh_token = None + assert not context.can_refresh_token() + + # Remove client info + context.current_tokens.refresh_token = "test_refresh_token" + context.client_info = None + assert not context.can_refresh_token() + + def test_clear_tokens(self, oauth_provider, valid_tokens): + """Test clear_tokens() removes token data.""" + context = oauth_provider.context + context.current_tokens = valid_tokens + context.token_expiry_time = time.time() + 1800 + + # Clear tokens + context.clear_tokens() + + # Verify cleared + assert context.current_tokens is None + assert context.token_expiry_time is None + + +class TestOAuthFlow: + """Test OAuth flow methods.""" + + @pytest.mark.anyio + async def test_discover_protected_resource_request(self, oauth_provider): + """Test protected resource discovery request building.""" + request = await oauth_provider._discover_protected_resource() + + assert request.method == "GET" + assert str(request.url) == "https://api.example.com/.well-known/oauth-protected-resource" + assert "mcp-protocol-version" in request.headers + + @pytest.mark.anyio + async def test_discover_oauth_metadata_request(self, oauth_provider): + """Test OAuth metadata discovery request building.""" + request = await oauth_provider._discover_oauth_metadata() + + assert request.method == "GET" + assert str(request.url) == "https://api.example.com/.well-known/oauth-authorization-server" + assert "mcp-protocol-version" in request.headers + + @pytest.mark.anyio + async def test_register_client_request(self, oauth_provider): + """Test client registration request building.""" + request = await oauth_provider._register_client() + + assert request is not None + assert request.method == "POST" + assert str(request.url) == "https://api.example.com/register" + assert request.headers["Content-Type"] == "application/json" + + @pytest.mark.anyio + async def test_register_client_skip_if_registered(self, oauth_provider, mock_storage): + """Test client registration is skipped if already registered.""" + # Set existing client info + client_info = OAuthClientInformationFull( + 
client_id="existing_client", + redirect_uris=[AnyUrl("http://localhost:3030/callback")], + ) + oauth_provider.context.client_info = client_info + + # Should return None (skip registration) + request = await oauth_provider._register_client() + assert request is None + + @pytest.mark.anyio + async def test_token_exchange_request(self, oauth_provider): + """Test token exchange request building.""" + # Set up required context + oauth_provider.context.client_info = OAuthClientInformationFull( + client_id="test_client", + client_secret="test_secret", + redirect_uris=[AnyUrl("http://localhost:3030/callback")], + ) + + request = await oauth_provider._exchange_token("test_auth_code", "test_verifier") + + assert request.method == "POST" + assert str(request.url) == "https://api.example.com/token" + assert request.headers["Content-Type"] == "application/x-www-form-urlencoded" + + # Check form data + content = request.content.decode() + assert "grant_type=authorization_code" in content + assert "code=test_auth_code" in content + assert "code_verifier=test_verifier" in content + assert "client_id=test_client" in content + assert "client_secret=test_secret" in content + + @pytest.mark.anyio + async def test_refresh_token_request(self, oauth_provider, valid_tokens): + """Test refresh token request building.""" + # Set up required context + oauth_provider.context.current_tokens = valid_tokens + oauth_provider.context.client_info = OAuthClientInformationFull( + client_id="test_client", + client_secret="test_secret", + redirect_uris=[AnyUrl("http://localhost:3030/callback")], + ) + + request = await oauth_provider._refresh_token() + + assert request.method == "POST" + assert str(request.url) == "https://api.example.com/token" + assert request.headers["Content-Type"] == "application/x-www-form-urlencoded" + + # Check form data + content = request.content.decode() + assert "grant_type=refresh_token" in content + assert "refresh_token=test_refresh_token" in content + assert "client_id=test_client" in content + assert "client_secret=test_secret" in content + + +class TestAuthFlow: + """Test the auth flow in httpx.""" + + @pytest.mark.anyio + async def test_auth_flow_with_valid_tokens(self, oauth_provider, mock_storage, valid_tokens): + """Test auth flow when tokens are already valid.""" + # Pre-store valid tokens + await mock_storage.set_tokens(valid_tokens) + oauth_provider.context.current_tokens = valid_tokens + oauth_provider.context.token_expiry_time = time.time() + 1800 + oauth_provider._initialized = True + + # Create a test request + test_request = httpx.Request("GET", "https://api.example.com/test") + + # Mock the auth flow + auth_flow = oauth_provider.async_auth_flow(test_request) + + # Should get the request with auth header added + request = await auth_flow.__anext__() + assert request.headers["Authorization"] == "Bearer test_access_token" + + # Send a successful response + response = httpx.Response(200) + try: + await auth_flow.asend(response) + except StopAsyncIteration: + pass # Expected diff --git a/tests/client/test_config.py b/tests/client/test_config.py index 97030e069..f144dcffb 100644 --- a/tests/client/test_config.py +++ b/tests/client/test_config.py @@ -44,7 +44,32 @@ def test_command_execution(mock_config_path: Path): test_args = [command] + args + ["--help"] - result = subprocess.run(test_args, capture_output=True, text=True, timeout=5) + result = subprocess.run(test_args, capture_output=True, text=True, timeout=5, check=False) assert result.returncode == 0 assert "usage" in 
result.stdout.lower() + + +def test_absolute_uv_path(mock_config_path: Path): + """Test that the absolute path to uv is used when available.""" + # Mock the shutil.which function to return a fake path + mock_uv_path = "/usr/local/bin/uv" + + with patch("mcp.cli.claude.get_uv_path", return_value=mock_uv_path): + # Setup + server_name = "test_server" + file_spec = "test_server.py:app" + + # Update config + success = update_claude_config(file_spec=file_spec, server_name=server_name) + assert success + + # Read the generated config + config_file = mock_config_path / "claude_desktop_config.json" + config = json.loads(config_file.read_text()) + + # Verify the command is the absolute path + server_config = config["mcpServers"][server_name] + command = server_config["command"] + + assert command == mock_uv_path diff --git a/tests/client/test_list_methods_cursor.py b/tests/client/test_list_methods_cursor.py new file mode 100644 index 000000000..f7b031737 --- /dev/null +++ b/tests/client/test_list_methods_cursor.py @@ -0,0 +1,213 @@ +import pytest + +from mcp.server.fastmcp import FastMCP +from mcp.shared.memory import ( + create_connected_server_and_client_session as create_session, +) + +# Mark the whole module for async tests +pytestmark = pytest.mark.anyio + + +async def test_list_tools_cursor_parameter(stream_spy): + """Test that the cursor parameter is accepted for list_tools + and that it is correctly passed to the server. + + See: https://modelcontextprotocol.io/specification/2025-03-26/server/utilities/pagination#request-format + """ + server = FastMCP("test") + + # Create a couple of test tools + @server.tool(name="test_tool_1") + async def test_tool_1() -> str: + """First test tool""" + return "Result 1" + + @server.tool(name="test_tool_2") + async def test_tool_2() -> str: + """Second test tool""" + return "Result 2" + + async with create_session(server._mcp_server) as client_session: + spies = stream_spy() + + # Test without cursor parameter (omitted) + _ = await client_session.list_tools() + list_tools_requests = spies.get_client_requests(method="tools/list") + assert len(list_tools_requests) == 1 + assert list_tools_requests[0].params is None + + spies.clear() + + # Test with cursor=None + _ = await client_session.list_tools(cursor=None) + list_tools_requests = spies.get_client_requests(method="tools/list") + assert len(list_tools_requests) == 1 + assert list_tools_requests[0].params is None + + spies.clear() + + # Test with cursor as string + _ = await client_session.list_tools(cursor="some_cursor_value") + list_tools_requests = spies.get_client_requests(method="tools/list") + assert len(list_tools_requests) == 1 + assert list_tools_requests[0].params is not None + assert list_tools_requests[0].params["cursor"] == "some_cursor_value" + + spies.clear() + + # Test with empty string cursor + _ = await client_session.list_tools(cursor="") + list_tools_requests = spies.get_client_requests(method="tools/list") + assert len(list_tools_requests) == 1 + assert list_tools_requests[0].params is not None + assert list_tools_requests[0].params["cursor"] == "" + + +async def test_list_resources_cursor_parameter(stream_spy): + """Test that the cursor parameter is accepted for list_resources + and that it is correctly passed to the server. 
+ + See: https://modelcontextprotocol.io/specification/2025-03-26/server/utilities/pagination#request-format + """ + server = FastMCP("test") + + # Create a test resource + @server.resource("resource://test/data") + async def test_resource() -> str: + """Test resource""" + return "Test data" + + async with create_session(server._mcp_server) as client_session: + spies = stream_spy() + + # Test without cursor parameter (omitted) + _ = await client_session.list_resources() + list_resources_requests = spies.get_client_requests(method="resources/list") + assert len(list_resources_requests) == 1 + assert list_resources_requests[0].params is None + + spies.clear() + + # Test with cursor=None + _ = await client_session.list_resources(cursor=None) + list_resources_requests = spies.get_client_requests(method="resources/list") + assert len(list_resources_requests) == 1 + assert list_resources_requests[0].params is None + + spies.clear() + + # Test with cursor as string + _ = await client_session.list_resources(cursor="some_cursor") + list_resources_requests = spies.get_client_requests(method="resources/list") + assert len(list_resources_requests) == 1 + assert list_resources_requests[0].params is not None + assert list_resources_requests[0].params["cursor"] == "some_cursor" + + spies.clear() + + # Test with empty string cursor + _ = await client_session.list_resources(cursor="") + list_resources_requests = spies.get_client_requests(method="resources/list") + assert len(list_resources_requests) == 1 + assert list_resources_requests[0].params is not None + assert list_resources_requests[0].params["cursor"] == "" + + +async def test_list_prompts_cursor_parameter(stream_spy): + """Test that the cursor parameter is accepted for list_prompts + and that it is correctly passed to the server. + See: https://modelcontextprotocol.io/specification/2025-03-26/server/utilities/pagination#request-format + """ + server = FastMCP("test") + + # Create a test prompt + @server.prompt() + async def test_prompt(name: str) -> str: + """Test prompt""" + return f"Hello, {name}!" 
+ + async with create_session(server._mcp_server) as client_session: + spies = stream_spy() + + # Test without cursor parameter (omitted) + _ = await client_session.list_prompts() + list_prompts_requests = spies.get_client_requests(method="prompts/list") + assert len(list_prompts_requests) == 1 + assert list_prompts_requests[0].params is None + + spies.clear() + + # Test with cursor=None + _ = await client_session.list_prompts(cursor=None) + list_prompts_requests = spies.get_client_requests(method="prompts/list") + assert len(list_prompts_requests) == 1 + assert list_prompts_requests[0].params is None + + spies.clear() + + # Test with cursor as string + _ = await client_session.list_prompts(cursor="some_cursor") + list_prompts_requests = spies.get_client_requests(method="prompts/list") + assert len(list_prompts_requests) == 1 + assert list_prompts_requests[0].params is not None + assert list_prompts_requests[0].params["cursor"] == "some_cursor" + + spies.clear() + + # Test with empty string cursor + _ = await client_session.list_prompts(cursor="") + list_prompts_requests = spies.get_client_requests(method="prompts/list") + assert len(list_prompts_requests) == 1 + assert list_prompts_requests[0].params is not None + assert list_prompts_requests[0].params["cursor"] == "" + + +async def test_list_resource_templates_cursor_parameter(stream_spy): + """Test that the cursor parameter is accepted for list_resource_templates + and that it is correctly passed to the server. + + See: https://modelcontextprotocol.io/specification/2025-03-26/server/utilities/pagination#request-format + """ + server = FastMCP("test") + + # Create a test resource template + @server.resource("resource://test/{name}") + async def test_template(name: str) -> str: + """Test resource template""" + return f"Data for {name}" + + async with create_session(server._mcp_server) as client_session: + spies = stream_spy() + + # Test without cursor parameter (omitted) + _ = await client_session.list_resource_templates() + list_templates_requests = spies.get_client_requests(method="resources/templates/list") + assert len(list_templates_requests) == 1 + assert list_templates_requests[0].params is None + + spies.clear() + + # Test with cursor=None + _ = await client_session.list_resource_templates(cursor=None) + list_templates_requests = spies.get_client_requests(method="resources/templates/list") + assert len(list_templates_requests) == 1 + assert list_templates_requests[0].params is None + + spies.clear() + + # Test with cursor as string + _ = await client_session.list_resource_templates(cursor="some_cursor") + list_templates_requests = spies.get_client_requests(method="resources/templates/list") + assert len(list_templates_requests) == 1 + assert list_templates_requests[0].params is not None + assert list_templates_requests[0].params["cursor"] == "some_cursor" + + spies.clear() + + # Test with empty string cursor + _ = await client_session.list_resource_templates(cursor="") + list_templates_requests = spies.get_client_requests(method="resources/templates/list") + assert len(list_templates_requests) == 1 + assert list_templates_requests[0].params is not None + assert list_templates_requests[0].params["cursor"] == "" diff --git a/tests/client/test_list_roots_callback.py b/tests/client/test_list_roots_callback.py index f5b598218..f65490421 100644 --- a/tests/client/test_list_roots_callback.py +++ b/tests/client/test_list_roots_callback.py @@ -41,13 +41,9 @@ async def test_list_roots(context: Context, message: str): # type: ignore[repor 
return True # Test with list_roots callback - async with create_session( - server._mcp_server, list_roots_callback=list_roots_callback - ) as client_session: + async with create_session(server._mcp_server, list_roots_callback=list_roots_callback) as client_session: # Make a request to trigger sampling callback - result = await client_session.call_tool( - "test_list_roots", {"message": "test message"} - ) + result = await client_session.call_tool("test_list_roots", {"message": "test message"}) assert result.isError is False assert isinstance(result.content[0], TextContent) assert result.content[0].text == "true" @@ -55,12 +51,7 @@ async def test_list_roots(context: Context, message: str): # type: ignore[repor # Test without list_roots callback async with create_session(server._mcp_server) as client_session: # Make a request to trigger sampling callback - result = await client_session.call_tool( - "test_list_roots", {"message": "test message"} - ) + result = await client_session.call_tool("test_list_roots", {"message": "test message"}) assert result.isError is True assert isinstance(result.content[0], TextContent) - assert ( - result.content[0].text - == "Error executing tool test_list_roots: List roots not supported" - ) + assert result.content[0].text == "Error executing tool test_list_roots: List roots not supported" diff --git a/tests/client/test_logging_callback.py b/tests/client/test_logging_callback.py index 797f817e1..f298ee287 100644 --- a/tests/client/test_logging_callback.py +++ b/tests/client/test_logging_callback.py @@ -49,9 +49,7 @@ async def test_tool_with_log( # Create a message handler to catch exceptions async def message_handler( - message: RequestResponder[types.ServerRequest, types.ClientResult] - | types.ServerNotification - | Exception, + message: RequestResponder[types.ServerRequest, types.ClientResult] | types.ServerNotification | Exception, ) -> None: if isinstance(message, Exception): raise message @@ -78,6 +76,8 @@ async def message_handler( ) assert log_result.isError is False assert len(logging_collector.log_messages) == 1 - assert logging_collector.log_messages[0] == LoggingMessageNotificationParams( - level="info", logger="test_logger", data="Test log message" - ) + # Create meta object with related_request_id added dynamically + log = logging_collector.log_messages[0] + assert log.level == "info" + assert log.logger == "test_logger" + assert log.data == "Test log message" diff --git a/tests/client/test_resource_cleanup.py b/tests/client/test_resource_cleanup.py new file mode 100644 index 000000000..990b3a89a --- /dev/null +++ b/tests/client/test_resource_cleanup.py @@ -0,0 +1,68 @@ +from unittest.mock import patch + +import anyio +import pytest + +from mcp.shared.session import BaseSession +from mcp.types import ( + ClientRequest, + EmptyResult, + PingRequest, +) + + +@pytest.mark.anyio +async def test_send_request_stream_cleanup(): + """ + Test that send_request properly cleans up streams when an exception occurs. + + This test mocks out most of the session functionality to focus on stream cleanup. 
+ """ + + # Create a mock session with the minimal required functionality + class TestSession(BaseSession): + async def _send_response(self, request_id, response): + pass + + # Create streams + write_stream_send, write_stream_receive = anyio.create_memory_object_stream(1) + read_stream_send, read_stream_receive = anyio.create_memory_object_stream(1) + + # Create the session + session = TestSession( + read_stream_receive, + write_stream_send, + object, # Request type doesn't matter for this test + object, # Notification type doesn't matter for this test + ) + + # Create a test request + request = ClientRequest( + PingRequest( + method="ping", + ) + ) + + # Patch the _write_stream.send method to raise an exception + async def mock_send(*args, **kwargs): + raise RuntimeError("Simulated network error") + + # Record the response streams before the test + initial_stream_count = len(session._response_streams) + + # Run the test with the patched method + with patch.object(session._write_stream, "send", mock_send): + with pytest.raises(RuntimeError): + await session.send_request(request, EmptyResult) + + # Verify that no response streams were leaked + assert len(session._response_streams) == initial_stream_count, ( + f"Expected {initial_stream_count} response streams after request, " + f"but found {len(session._response_streams)}" + ) + + # Clean up + await write_stream_send.aclose() + await write_stream_receive.aclose() + await read_stream_send.aclose() + await read_stream_receive.aclose() diff --git a/tests/client/test_sampling_callback.py b/tests/client/test_sampling_callback.py index ba586d4a8..a3f6affda 100644 --- a/tests/client/test_sampling_callback.py +++ b/tests/client/test_sampling_callback.py @@ -21,9 +21,7 @@ async def test_sampling_callback(): callback_return = CreateMessageResult( role="assistant", - content=TextContent( - type="text", text="This is a response from the sampling callback" - ), + content=TextContent(type="text", text="This is a response from the sampling callback"), model="test-model", stopReason="endTurn", ) @@ -37,24 +35,16 @@ async def sampling_callback( @server.tool("test_sampling") async def test_sampling_tool(message: str): value = await server.get_context().session.create_message( - messages=[ - SamplingMessage( - role="user", content=TextContent(type="text", text=message) - ) - ], + messages=[SamplingMessage(role="user", content=TextContent(type="text", text=message))], max_tokens=100, ) assert value == callback_return return True # Test with sampling callback - async with create_session( - server._mcp_server, sampling_callback=sampling_callback - ) as client_session: + async with create_session(server._mcp_server, sampling_callback=sampling_callback) as client_session: # Make a request to trigger sampling callback - result = await client_session.call_tool( - "test_sampling", {"message": "Test message for sampling"} - ) + result = await client_session.call_tool("test_sampling", {"message": "Test message for sampling"}) assert result.isError is False assert isinstance(result.content[0], TextContent) assert result.content[0].text == "true" @@ -62,12 +52,7 @@ async def test_sampling_tool(message: str): # Test without sampling callback async with create_session(server._mcp_server) as client_session: # Make a request to trigger sampling callback - result = await client_session.call_tool( - "test_sampling", {"message": "Test message for sampling"} - ) + result = await client_session.call_tool("test_sampling", {"message": "Test message for sampling"}) assert 
result.isError is True assert isinstance(result.content[0], TextContent) - assert ( - result.content[0].text - == "Error executing tool test_sampling: Sampling not supported" - ) + assert result.content[0].text == "Error executing tool test_sampling: Sampling not supported" diff --git a/tests/client/test_session.py b/tests/client/test_session.py index 543ebb2f0..327d1a9e4 100644 --- a/tests/client/test_session.py +++ b/tests/client/test_session.py @@ -1,9 +1,14 @@ +from typing import Any + import anyio import pytest import mcp.types as types from mcp.client.session import DEFAULT_CLIENT_INFO, ClientSession +from mcp.shared.context import RequestContext +from mcp.shared.message import SessionMessage from mcp.shared.session import RequestResponder +from mcp.shared.version import SUPPORTED_PROTOCOL_VERSIONS from mcp.types import ( LATEST_PROTOCOL_VERSION, ClientNotification, @@ -23,19 +28,16 @@ @pytest.mark.anyio async def test_client_session_initialize(): - client_to_server_send, client_to_server_receive = anyio.create_memory_object_stream[ - JSONRPCMessage - ](1) - server_to_client_send, server_to_client_receive = anyio.create_memory_object_stream[ - JSONRPCMessage - ](1) + client_to_server_send, client_to_server_receive = anyio.create_memory_object_stream[SessionMessage](1) + server_to_client_send, server_to_client_receive = anyio.create_memory_object_stream[SessionMessage](1) initialized_notification = None async def mock_server(): nonlocal initialized_notification - jsonrpc_request = await client_to_server_receive.receive() + session_message = await client_to_server_receive.receive() + jsonrpc_request = session_message.message assert isinstance(jsonrpc_request.root, JSONRPCRequest) request = ClientRequest.model_validate( jsonrpc_request.model_dump(by_alias=True, mode="json", exclude_none=True) @@ -59,29 +61,26 @@ async def mock_server(): async with server_to_client_send: await server_to_client_send.send( - JSONRPCMessage( - JSONRPCResponse( - jsonrpc="2.0", - id=jsonrpc_request.root.id, - result=result.model_dump( - by_alias=True, mode="json", exclude_none=True - ), + SessionMessage( + JSONRPCMessage( + JSONRPCResponse( + jsonrpc="2.0", + id=jsonrpc_request.root.id, + result=result.model_dump(by_alias=True, mode="json", exclude_none=True), + ) ) ) ) - jsonrpc_notification = await client_to_server_receive.receive() + session_notification = await client_to_server_receive.receive() + jsonrpc_notification = session_notification.message assert isinstance(jsonrpc_notification.root, JSONRPCNotification) initialized_notification = ClientNotification.model_validate( - jsonrpc_notification.model_dump( - by_alias=True, mode="json", exclude_none=True - ) + jsonrpc_notification.model_dump(by_alias=True, mode="json", exclude_none=True) ) # Create a message handler to catch exceptions async def message_handler( - message: RequestResponder[types.ServerRequest, types.ClientResult] - | types.ServerNotification - | Exception, + message: RequestResponder[types.ServerRequest, types.ClientResult] | types.ServerNotification | Exception, ) -> None: if isinstance(message, Exception): raise message @@ -115,12 +114,8 @@ async def message_handler( @pytest.mark.anyio async def test_client_session_custom_client_info(): - client_to_server_send, client_to_server_receive = anyio.create_memory_object_stream[ - JSONRPCMessage - ](1) - server_to_client_send, server_to_client_receive = anyio.create_memory_object_stream[ - JSONRPCMessage - ](1) + client_to_server_send, client_to_server_receive = 
anyio.create_memory_object_stream[SessionMessage](1) + server_to_client_send, server_to_client_receive = anyio.create_memory_object_stream[SessionMessage](1) custom_client_info = Implementation(name="test-client", version="1.2.3") received_client_info = None @@ -128,7 +123,8 @@ async def test_client_session_custom_client_info(): async def mock_server(): nonlocal received_client_info - jsonrpc_request = await client_to_server_receive.receive() + session_message = await client_to_server_receive.receive() + jsonrpc_request = session_message.message assert isinstance(jsonrpc_request.root, JSONRPCRequest) request = ClientRequest.model_validate( jsonrpc_request.model_dump(by_alias=True, mode="json", exclude_none=True) @@ -146,13 +142,13 @@ async def mock_server(): async with server_to_client_send: await server_to_client_send.send( - JSONRPCMessage( - JSONRPCResponse( - jsonrpc="2.0", - id=jsonrpc_request.root.id, - result=result.model_dump( - by_alias=True, mode="json", exclude_none=True - ), + SessionMessage( + JSONRPCMessage( + JSONRPCResponse( + jsonrpc="2.0", + id=jsonrpc_request.root.id, + result=result.model_dump(by_alias=True, mode="json", exclude_none=True), + ) ) ) ) @@ -180,19 +176,16 @@ async def mock_server(): @pytest.mark.anyio async def test_client_session_default_client_info(): - client_to_server_send, client_to_server_receive = anyio.create_memory_object_stream[ - JSONRPCMessage - ](1) - server_to_client_send, server_to_client_receive = anyio.create_memory_object_stream[ - JSONRPCMessage - ](1) + client_to_server_send, client_to_server_receive = anyio.create_memory_object_stream[SessionMessage](1) + server_to_client_send, server_to_client_receive = anyio.create_memory_object_stream[SessionMessage](1) received_client_info = None async def mock_server(): nonlocal received_client_info - jsonrpc_request = await client_to_server_receive.receive() + session_message = await client_to_server_receive.receive() + jsonrpc_request = session_message.message assert isinstance(jsonrpc_request.root, JSONRPCRequest) request = ClientRequest.model_validate( jsonrpc_request.model_dump(by_alias=True, mode="json", exclude_none=True) @@ -210,13 +203,13 @@ async def mock_server(): async with server_to_client_send: await server_to_client_send.send( - JSONRPCMessage( - JSONRPCResponse( - jsonrpc="2.0", - id=jsonrpc_request.root.id, - result=result.model_dump( - by_alias=True, mode="json", exclude_none=True - ), + SessionMessage( + JSONRPCMessage( + JSONRPCResponse( + jsonrpc="2.0", + id=jsonrpc_request.root.id, + result=result.model_dump(by_alias=True, mode="json", exclude_none=True), + ) ) ) ) @@ -239,3 +232,266 @@ async def mock_server(): # Assert that the default client info was sent assert received_client_info == DEFAULT_CLIENT_INFO + + +@pytest.mark.anyio +async def test_client_session_version_negotiation_success(): + """Test successful version negotiation with supported version""" + client_to_server_send, client_to_server_receive = anyio.create_memory_object_stream[SessionMessage](1) + server_to_client_send, server_to_client_receive = anyio.create_memory_object_stream[SessionMessage](1) + + async def mock_server(): + session_message = await client_to_server_receive.receive() + jsonrpc_request = session_message.message + assert isinstance(jsonrpc_request.root, JSONRPCRequest) + request = ClientRequest.model_validate( + jsonrpc_request.model_dump(by_alias=True, mode="json", exclude_none=True) + ) + assert isinstance(request.root, InitializeRequest) + + # Verify client sent the latest protocol version 
+ assert request.root.params.protocolVersion == LATEST_PROTOCOL_VERSION + + # Server responds with a supported older version + result = ServerResult( + InitializeResult( + protocolVersion="2024-11-05", + capabilities=ServerCapabilities(), + serverInfo=Implementation(name="mock-server", version="0.1.0"), + ) + ) + + async with server_to_client_send: + await server_to_client_send.send( + SessionMessage( + JSONRPCMessage( + JSONRPCResponse( + jsonrpc="2.0", + id=jsonrpc_request.root.id, + result=result.model_dump(by_alias=True, mode="json", exclude_none=True), + ) + ) + ) + ) + # Receive initialized notification + await client_to_server_receive.receive() + + async with ( + ClientSession( + server_to_client_receive, + client_to_server_send, + ) as session, + anyio.create_task_group() as tg, + client_to_server_send, + client_to_server_receive, + server_to_client_send, + server_to_client_receive, + ): + tg.start_soon(mock_server) + result = await session.initialize() + + # Assert the result with negotiated version + assert isinstance(result, InitializeResult) + assert result.protocolVersion == "2024-11-05" + assert result.protocolVersion in SUPPORTED_PROTOCOL_VERSIONS + + +@pytest.mark.anyio +async def test_client_session_version_negotiation_failure(): + """Test version negotiation failure with unsupported version""" + client_to_server_send, client_to_server_receive = anyio.create_memory_object_stream[SessionMessage](1) + server_to_client_send, server_to_client_receive = anyio.create_memory_object_stream[SessionMessage](1) + + async def mock_server(): + session_message = await client_to_server_receive.receive() + jsonrpc_request = session_message.message + assert isinstance(jsonrpc_request.root, JSONRPCRequest) + request = ClientRequest.model_validate( + jsonrpc_request.model_dump(by_alias=True, mode="json", exclude_none=True) + ) + assert isinstance(request.root, InitializeRequest) + + # Server responds with an unsupported version + result = ServerResult( + InitializeResult( + protocolVersion="2020-01-01", # Unsupported old version + capabilities=ServerCapabilities(), + serverInfo=Implementation(name="mock-server", version="0.1.0"), + ) + ) + + async with server_to_client_send: + await server_to_client_send.send( + SessionMessage( + JSONRPCMessage( + JSONRPCResponse( + jsonrpc="2.0", + id=jsonrpc_request.root.id, + result=result.model_dump(by_alias=True, mode="json", exclude_none=True), + ) + ) + ) + ) + + async with ( + ClientSession( + server_to_client_receive, + client_to_server_send, + ) as session, + anyio.create_task_group() as tg, + client_to_server_send, + client_to_server_receive, + server_to_client_send, + server_to_client_receive, + ): + tg.start_soon(mock_server) + + # Should raise RuntimeError for unsupported version + with pytest.raises(RuntimeError, match="Unsupported protocol version"): + await session.initialize() + + +@pytest.mark.anyio +async def test_client_capabilities_default(): + """Test that client capabilities are properly set with default callbacks""" + client_to_server_send, client_to_server_receive = anyio.create_memory_object_stream[SessionMessage](1) + server_to_client_send, server_to_client_receive = anyio.create_memory_object_stream[SessionMessage](1) + + received_capabilities = None + + async def mock_server(): + nonlocal received_capabilities + + session_message = await client_to_server_receive.receive() + jsonrpc_request = session_message.message + assert isinstance(jsonrpc_request.root, JSONRPCRequest) + request = ClientRequest.model_validate( + 
jsonrpc_request.model_dump(by_alias=True, mode="json", exclude_none=True) + ) + assert isinstance(request.root, InitializeRequest) + received_capabilities = request.root.params.capabilities + + result = ServerResult( + InitializeResult( + protocolVersion=LATEST_PROTOCOL_VERSION, + capabilities=ServerCapabilities(), + serverInfo=Implementation(name="mock-server", version="0.1.0"), + ) + ) + + async with server_to_client_send: + await server_to_client_send.send( + SessionMessage( + JSONRPCMessage( + JSONRPCResponse( + jsonrpc="2.0", + id=jsonrpc_request.root.id, + result=result.model_dump(by_alias=True, mode="json", exclude_none=True), + ) + ) + ) + ) + # Receive initialized notification + await client_to_server_receive.receive() + + async with ( + ClientSession( + server_to_client_receive, + client_to_server_send, + ) as session, + anyio.create_task_group() as tg, + client_to_server_send, + client_to_server_receive, + server_to_client_send, + server_to_client_receive, + ): + tg.start_soon(mock_server) + await session.initialize() + + # Assert that capabilities are properly set with defaults + assert received_capabilities is not None + assert received_capabilities.sampling is None # No custom sampling callback + assert received_capabilities.roots is None # No custom list_roots callback + + +@pytest.mark.anyio +async def test_client_capabilities_with_custom_callbacks(): + """Test that client capabilities are properly set with custom callbacks""" + client_to_server_send, client_to_server_receive = anyio.create_memory_object_stream[SessionMessage](1) + server_to_client_send, server_to_client_receive = anyio.create_memory_object_stream[SessionMessage](1) + + received_capabilities = None + + async def custom_sampling_callback( + context: RequestContext["ClientSession", Any], + params: types.CreateMessageRequestParams, + ) -> types.CreateMessageResult | types.ErrorData: + return types.CreateMessageResult( + role="assistant", + content=types.TextContent(type="text", text="test"), + model="test-model", + ) + + async def custom_list_roots_callback( + context: RequestContext["ClientSession", Any], + ) -> types.ListRootsResult | types.ErrorData: + return types.ListRootsResult(roots=[]) + + async def mock_server(): + nonlocal received_capabilities + + session_message = await client_to_server_receive.receive() + jsonrpc_request = session_message.message + assert isinstance(jsonrpc_request.root, JSONRPCRequest) + request = ClientRequest.model_validate( + jsonrpc_request.model_dump(by_alias=True, mode="json", exclude_none=True) + ) + assert isinstance(request.root, InitializeRequest) + received_capabilities = request.root.params.capabilities + + result = ServerResult( + InitializeResult( + protocolVersion=LATEST_PROTOCOL_VERSION, + capabilities=ServerCapabilities(), + serverInfo=Implementation(name="mock-server", version="0.1.0"), + ) + ) + + async with server_to_client_send: + await server_to_client_send.send( + SessionMessage( + JSONRPCMessage( + JSONRPCResponse( + jsonrpc="2.0", + id=jsonrpc_request.root.id, + result=result.model_dump(by_alias=True, mode="json", exclude_none=True), + ) + ) + ) + ) + # Receive initialized notification + await client_to_server_receive.receive() + + async with ( + ClientSession( + server_to_client_receive, + client_to_server_send, + sampling_callback=custom_sampling_callback, + list_roots_callback=custom_list_roots_callback, + ) as session, + anyio.create_task_group() as tg, + client_to_server_send, + client_to_server_receive, + server_to_client_send, + 
server_to_client_receive, + ): + tg.start_soon(mock_server) + await session.initialize() + + # Assert that capabilities are properly set with custom callbacks + assert received_capabilities is not None + assert received_capabilities.sampling is not None # Custom sampling callback provided + assert isinstance(received_capabilities.sampling, types.SamplingCapability) + assert received_capabilities.roots is not None # Custom list_roots callback provided + assert isinstance(received_capabilities.roots, types.RootsCapability) + assert received_capabilities.roots.listChanged is True # Should be True for custom callback diff --git a/tests/client/test_session_group.py b/tests/client/test_session_group.py new file mode 100644 index 000000000..16a887e00 --- /dev/null +++ b/tests/client/test_session_group.py @@ -0,0 +1,367 @@ +import contextlib +from unittest import mock + +import pytest + +import mcp +from mcp import types +from mcp.client.session_group import ( + ClientSessionGroup, + SseServerParameters, + StreamableHttpParameters, +) +from mcp.client.stdio import StdioServerParameters +from mcp.shared.exceptions import McpError + + +@pytest.fixture +def mock_exit_stack(): + """Fixture for a mocked AsyncExitStack.""" + # Use unittest.mock.Mock directly if needed, or just a plain object + # if only attribute access/existence is needed. + # For AsyncExitStack, Mock or MagicMock is usually fine. + return mock.MagicMock(spec=contextlib.AsyncExitStack) + + +@pytest.mark.anyio +class TestClientSessionGroup: + def test_init(self): + mcp_session_group = ClientSessionGroup() + assert not mcp_session_group._tools + assert not mcp_session_group._resources + assert not mcp_session_group._prompts + assert not mcp_session_group._tool_to_session + + def test_component_properties(self): + # --- Mock Dependencies --- + mock_prompt = mock.Mock() + mock_resource = mock.Mock() + mock_tool = mock.Mock() + + # --- Prepare Session Group --- + mcp_session_group = ClientSessionGroup() + mcp_session_group._prompts = {"my_prompt": mock_prompt} + mcp_session_group._resources = {"my_resource": mock_resource} + mcp_session_group._tools = {"my_tool": mock_tool} + + # --- Assertions --- + assert mcp_session_group.prompts == {"my_prompt": mock_prompt} + assert mcp_session_group.resources == {"my_resource": mock_resource} + assert mcp_session_group.tools == {"my_tool": mock_tool} + + async def test_call_tool(self): + # --- Mock Dependencies --- + mock_session = mock.AsyncMock() + + # --- Prepare Session Group --- + def hook(name, server_info): + return f"{(server_info.name)}-{name}" + + mcp_session_group = ClientSessionGroup(component_name_hook=hook) + mcp_session_group._tools = {"server1-my_tool": types.Tool(name="my_tool", inputSchema={})} + mcp_session_group._tool_to_session = {"server1-my_tool": mock_session} + text_content = types.TextContent(type="text", text="OK") + mock_session.call_tool.return_value = types.CallToolResult(content=[text_content]) + + # --- Test Execution --- + result = await mcp_session_group.call_tool( + name="server1-my_tool", + args={ + "name": "value1", + "args": {}, + }, + ) + + # --- Assertions --- + assert result.content == [text_content] + mock_session.call_tool.assert_called_once_with( + "my_tool", + {"name": "value1", "args": {}}, + ) + + async def test_connect_to_server(self, mock_exit_stack): + """Test connecting to a server and aggregating components.""" + # --- Mock Dependencies --- + mock_server_info = mock.Mock(spec=types.Implementation) + mock_server_info.name = "TestServer1" + 
mock_session = mock.AsyncMock(spec=mcp.ClientSession) + mock_tool1 = mock.Mock(spec=types.Tool) + mock_tool1.name = "tool_a" + mock_resource1 = mock.Mock(spec=types.Resource) + mock_resource1.name = "resource_b" + mock_prompt1 = mock.Mock(spec=types.Prompt) + mock_prompt1.name = "prompt_c" + mock_session.list_tools.return_value = mock.AsyncMock(tools=[mock_tool1]) + mock_session.list_resources.return_value = mock.AsyncMock(resources=[mock_resource1]) + mock_session.list_prompts.return_value = mock.AsyncMock(prompts=[mock_prompt1]) + + # --- Test Execution --- + group = ClientSessionGroup(exit_stack=mock_exit_stack) + with mock.patch.object(group, "_establish_session", return_value=(mock_server_info, mock_session)): + await group.connect_to_server(StdioServerParameters(command="test")) + + # --- Assertions --- + assert mock_session in group._sessions + assert len(group.tools) == 1 + assert "tool_a" in group.tools + assert group.tools["tool_a"] == mock_tool1 + assert group._tool_to_session["tool_a"] == mock_session + assert len(group.resources) == 1 + assert "resource_b" in group.resources + assert group.resources["resource_b"] == mock_resource1 + assert len(group.prompts) == 1 + assert "prompt_c" in group.prompts + assert group.prompts["prompt_c"] == mock_prompt1 + mock_session.list_tools.assert_awaited_once() + mock_session.list_resources.assert_awaited_once() + mock_session.list_prompts.assert_awaited_once() + + async def test_connect_to_server_with_name_hook(self, mock_exit_stack): + """Test connecting with a component name hook.""" + # --- Mock Dependencies --- + mock_server_info = mock.Mock(spec=types.Implementation) + mock_server_info.name = "HookServer" + mock_session = mock.AsyncMock(spec=mcp.ClientSession) + mock_tool = mock.Mock(spec=types.Tool) + mock_tool.name = "base_tool" + mock_session.list_tools.return_value = mock.AsyncMock(tools=[mock_tool]) + mock_session.list_resources.return_value = mock.AsyncMock(resources=[]) + mock_session.list_prompts.return_value = mock.AsyncMock(prompts=[]) + + # --- Test Setup --- + def name_hook(name: str, server_info: types.Implementation) -> str: + return f"{server_info.name}.{name}" + + # --- Test Execution --- + group = ClientSessionGroup(exit_stack=mock_exit_stack, component_name_hook=name_hook) + with mock.patch.object(group, "_establish_session", return_value=(mock_server_info, mock_session)): + await group.connect_to_server(StdioServerParameters(command="test")) + + # --- Assertions --- + assert mock_session in group._sessions + assert len(group.tools) == 1 + expected_tool_name = "HookServer.base_tool" + assert expected_tool_name in group.tools + assert group.tools[expected_tool_name] == mock_tool + assert group._tool_to_session[expected_tool_name] == mock_session + + async def test_disconnect_from_server(self): # No mock arguments needed + """Test disconnecting from a server.""" + # --- Test Setup --- + group = ClientSessionGroup() + server_name = "ServerToDisconnect" + + # Manually populate state using standard mocks + mock_session1 = mock.MagicMock(spec=mcp.ClientSession) + mock_session2 = mock.MagicMock(spec=mcp.ClientSession) + mock_tool1 = mock.Mock(spec=types.Tool) + mock_tool1.name = "tool1" + mock_resource1 = mock.Mock(spec=types.Resource) + mock_resource1.name = "res1" + mock_prompt1 = mock.Mock(spec=types.Prompt) + mock_prompt1.name = "prm1" + mock_tool2 = mock.Mock(spec=types.Tool) + mock_tool2.name = "tool2" + mock_component_named_like_server = mock.Mock() + mock_session = mock.Mock(spec=mcp.ClientSession) + + group._tools = 
{ + "tool1": mock_tool1, + "tool2": mock_tool2, + server_name: mock_component_named_like_server, + } + group._tool_to_session = { + "tool1": mock_session1, + "tool2": mock_session2, + server_name: mock_session1, + } + group._resources = { + "res1": mock_resource1, + server_name: mock_component_named_like_server, + } + group._prompts = { + "prm1": mock_prompt1, + server_name: mock_component_named_like_server, + } + group._sessions = { + mock_session: ClientSessionGroup._ComponentNames( + prompts=set({"prm1"}), + resources=set({"res1"}), + tools=set({"tool1", "tool2"}), + ) + } + + # --- Assertions --- + assert mock_session in group._sessions + assert "tool1" in group._tools + assert "tool2" in group._tools + assert "res1" in group._resources + assert "prm1" in group._prompts + + # --- Test Execution --- + await group.disconnect_from_server(mock_session) + + # --- Assertions --- + assert mock_session not in group._sessions + assert "tool1" not in group._tools + assert "tool2" not in group._tools + assert "res1" not in group._resources + assert "prm1" not in group._prompts + + async def test_connect_to_server_duplicate_tool_raises_error(self, mock_exit_stack): + """Test that McpError is raised when connecting a server with a duplicate tool name.""" + # --- Setup Pre-existing State --- + group = ClientSessionGroup(exit_stack=mock_exit_stack) + existing_tool_name = "shared_tool" + # Manually add a tool to simulate a previous connection + group._tools[existing_tool_name] = mock.Mock(spec=types.Tool) + group._tools[existing_tool_name].name = existing_tool_name + # Need a dummy session associated with the existing tool + mock_session = mock.MagicMock(spec=mcp.ClientSession) + group._tool_to_session[existing_tool_name] = mock_session + group._session_exit_stacks[mock_session] = mock.Mock(spec=contextlib.AsyncExitStack) + + # --- Mock New Connection Attempt --- + mock_server_info_new = mock.Mock(spec=types.Implementation) + mock_server_info_new.name = "ServerWithDuplicate" + mock_session_new = mock.AsyncMock(spec=mcp.ClientSession) + + # Configure the new session to return a tool with the *same name* + duplicate_tool = mock.Mock(spec=types.Tool) + duplicate_tool.name = existing_tool_name + mock_session_new.list_tools.return_value = mock.AsyncMock(tools=[duplicate_tool]) + # Keep other lists empty for simplicity + mock_session_new.list_resources.return_value = mock.AsyncMock(resources=[]) + mock_session_new.list_prompts.return_value = mock.AsyncMock(prompts=[]) + + # --- Test Execution and Assertion --- + with pytest.raises(McpError) as excinfo: + with mock.patch.object( + group, + "_establish_session", + return_value=(mock_server_info_new, mock_session_new), + ): + await group.connect_to_server(StdioServerParameters(command="test")) + + # Assert details about the raised error + assert excinfo.value.error.code == types.INVALID_PARAMS + assert existing_tool_name in excinfo.value.error.message + assert "already exist " in excinfo.value.error.message + + # Verify the duplicate tool was *not* added again (state should be unchanged) + assert len(group._tools) == 1 # Should still only have the original + assert group._tools[existing_tool_name] is not duplicate_tool # Ensure it's the original mock + + # No patching needed here + async def test_disconnect_non_existent_server(self): + """Test disconnecting a server that isn't connected.""" + session = mock.Mock(spec=mcp.ClientSession) + group = ClientSessionGroup() + with pytest.raises(McpError): + await group.disconnect_from_server(session) + + @pytest.mark.parametrize( +
"server_params_instance, client_type_name, patch_target_for_client_func", + [ + ( + StdioServerParameters(command="test_stdio_cmd"), + "stdio", + "mcp.client.session_group.mcp.stdio_client", + ), + ( + SseServerParameters(url="http://test.com/sse", timeout=10), + "sse", + "mcp.client.session_group.sse_client", + ), # url, headers, timeout, sse_read_timeout + ( + StreamableHttpParameters(url="http://test.com/stream", terminate_on_close=False), + "streamablehttp", + "mcp.client.session_group.streamablehttp_client", + ), # url, headers, timeout, sse_read_timeout, terminate_on_close + ], + ) + async def test_establish_session_parameterized( + self, + server_params_instance, + client_type_name, # Just for clarity or conditional logic if needed + patch_target_for_client_func, + ): + with mock.patch("mcp.client.session_group.mcp.ClientSession") as mock_ClientSession_class: + with mock.patch(patch_target_for_client_func) as mock_specific_client_func: + mock_client_cm_instance = mock.AsyncMock(name=f"{client_type_name}ClientCM") + mock_read_stream = mock.AsyncMock(name=f"{client_type_name}Read") + mock_write_stream = mock.AsyncMock(name=f"{client_type_name}Write") + + # streamablehttp_client's __aenter__ returns three values + if client_type_name == "streamablehttp": + mock_extra_stream_val = mock.AsyncMock(name="StreamableExtra") + mock_client_cm_instance.__aenter__.return_value = ( + mock_read_stream, + mock_write_stream, + mock_extra_stream_val, + ) + else: + mock_client_cm_instance.__aenter__.return_value = ( + mock_read_stream, + mock_write_stream, + ) + + mock_client_cm_instance.__aexit__ = mock.AsyncMock(return_value=None) + mock_specific_client_func.return_value = mock_client_cm_instance + + # --- Mock mcp.ClientSession (class) --- + # mock_ClientSession_class is already provided by the outer patch + mock_raw_session_cm = mock.AsyncMock(name="RawSessionCM") + mock_ClientSession_class.return_value = mock_raw_session_cm + + mock_entered_session = mock.AsyncMock(name="EnteredSessionInstance") + mock_raw_session_cm.__aenter__.return_value = mock_entered_session + mock_raw_session_cm.__aexit__ = mock.AsyncMock(return_value=None) + + # Mock session.initialize() + mock_initialize_result = mock.AsyncMock(name="InitializeResult") + mock_initialize_result.serverInfo = types.Implementation(name="foo", version="1") + mock_entered_session.initialize.return_value = mock_initialize_result + + # --- Test Execution --- + group = ClientSessionGroup() + returned_server_info = None + returned_session = None + + async with contextlib.AsyncExitStack() as stack: + group._exit_stack = stack + ( + returned_server_info, + returned_session, + ) = await group._establish_session(server_params_instance) + + # --- Assertions --- + # 1. 
Assert the correct specific client function was called + if client_type_name == "stdio": + mock_specific_client_func.assert_called_once_with(server_params_instance) + elif client_type_name == "sse": + mock_specific_client_func.assert_called_once_with( + url=server_params_instance.url, + headers=server_params_instance.headers, + timeout=server_params_instance.timeout, + sse_read_timeout=server_params_instance.sse_read_timeout, + ) + elif client_type_name == "streamablehttp": + mock_specific_client_func.assert_called_once_with( + url=server_params_instance.url, + headers=server_params_instance.headers, + timeout=server_params_instance.timeout, + sse_read_timeout=server_params_instance.sse_read_timeout, + terminate_on_close=server_params_instance.terminate_on_close, + ) + + mock_client_cm_instance.__aenter__.assert_awaited_once() + + # 2. Assert ClientSession was called correctly + mock_ClientSession_class.assert_called_once_with(mock_read_stream, mock_write_stream) + mock_raw_session_cm.__aenter__.assert_awaited_once() + mock_entered_session.initialize.assert_awaited_once() + + # 3. Assert returned values + assert returned_server_info is mock_initialize_result.serverInfo + assert returned_session is mock_entered_session diff --git a/tests/client/test_stdio.py b/tests/client/test_stdio.py index 95747ffd1..c66a16ab9 100644 --- a/tests/client/test_stdio.py +++ b/tests/client/test_stdio.py @@ -2,10 +2,24 @@ import pytest -from mcp.client.stdio import StdioServerParameters, stdio_client -from mcp.types import JSONRPCMessage, JSONRPCRequest, JSONRPCResponse +from mcp.client.session import ClientSession +from mcp.client.stdio import ( + StdioServerParameters, + stdio_client, +) +from mcp.shared.exceptions import McpError +from mcp.shared.message import SessionMessage +from mcp.types import CONNECTION_CLOSED, JSONRPCMessage, JSONRPCRequest, JSONRPCResponse tee: str = shutil.which("tee") # type: ignore +python: str = shutil.which("python") # type: ignore + + +@pytest.mark.anyio +@pytest.mark.skipif(tee is None, reason="could not find tee command") +async def test_stdio_context_manager_exiting(): + async with stdio_client(StdioServerParameters(command=tee)) as (_, _): + pass @pytest.mark.anyio @@ -22,7 +36,8 @@ async def test_stdio_client(): async with write_stream: for message in messages: - await write_stream.send(message) + session_message = SessionMessage(message) + await write_stream.send(session_message) read_messages = [] async with read_stream: @@ -30,14 +45,48 @@ async def test_stdio_client(): if isinstance(message, Exception): raise message - read_messages.append(message) + read_messages.append(message.message) if len(read_messages) == 2: break assert len(read_messages) == 2 - assert read_messages[0] == JSONRPCMessage( - root=JSONRPCRequest(jsonrpc="2.0", id=1, method="ping") - ) - assert read_messages[1] == JSONRPCMessage( - root=JSONRPCResponse(jsonrpc="2.0", id=2, result={}) - ) + assert read_messages[0] == JSONRPCMessage(root=JSONRPCRequest(jsonrpc="2.0", id=1, method="ping")) + assert read_messages[1] == JSONRPCMessage(root=JSONRPCResponse(jsonrpc="2.0", id=2, result={})) + + +@pytest.mark.anyio +async def test_stdio_client_bad_path(): + """Check that the connection doesn't hang if process errors.""" + server_params = StdioServerParameters(command="python", args=["-c", "non-existent-file.py"]) + async with stdio_client(server_params) as (read_stream, write_stream): + async with ClientSession(read_stream, write_stream) as session: + # The session should raise an error when the connection 
closes + with pytest.raises(McpError) as exc_info: + await session.initialize() + + # Check that we got a connection closed error + assert exc_info.value.error.code == CONNECTION_CLOSED + assert "Connection closed" in exc_info.value.error.message + + +@pytest.mark.anyio +async def test_stdio_client_nonexistent_command(): + """Test that stdio_client raises an error for non-existent commands.""" + # Create a server with a non-existent command + server_params = StdioServerParameters( + command="/path/to/nonexistent/command", + args=["--help"], + ) + + # Should raise an error when trying to start the process + with pytest.raises(Exception) as exc_info: + async with stdio_client(server_params) as (_, _): + pass + + # The error should indicate the command was not found + error_message = str(exc_info.value) + assert ( + "nonexistent" in error_message + or "not found" in error_message.lower() + or "cannot find the file" in error_message.lower() # Windows error message + ) diff --git a/tests/issues/test_100_tool_listing.py b/tests/issues/test_100_tool_listing.py index 2bc386c96..6dccec84d 100644 --- a/tests/issues/test_100_tool_listing.py +++ b/tests/issues/test_100_tool_listing.py @@ -17,9 +17,7 @@ def dummy_tool_func(): f"""Tool number {i}""" return i - globals()[f"dummy_tool_{i}"] = ( - dummy_tool_func # Keep reference to avoid garbage collection - ) + globals()[f"dummy_tool_{i}"] = dummy_tool_func # Keep reference to avoid garbage collection # Get all tools tools = await mcp.list_tools() @@ -30,6 +28,4 @@ def dummy_tool_func(): # Verify each tool is unique and has the correct name tool_names = [tool.name for tool in tools] expected_names = [f"tool_{i}" for i in range(num_tools)] - assert sorted(tool_names) == sorted( - expected_names - ), "Tool names don't match expected names" + assert sorted(tool_names) == sorted(expected_names), "Tool names don't match expected names" diff --git a/tests/issues/test_129_resource_templates.py b/tests/issues/test_129_resource_templates.py index e6eff3d46..4bedb15d5 100644 --- a/tests/issues/test_129_resource_templates.py +++ b/tests/issues/test_129_resource_templates.py @@ -24,9 +24,7 @@ def get_user_profile(user_id: str) -> str: # Note: list_resource_templates() returns a decorator that wraps the handler # The handler returns a ServerResult with a ListResourceTemplatesResult inside result = await mcp._mcp_server.request_handlers[types.ListResourceTemplatesRequest]( - types.ListResourceTemplatesRequest( - method="resources/templates/list", params=None, cursor=None - ) + types.ListResourceTemplatesRequest(method="resources/templates/list", params=None) ) assert isinstance(result.root, types.ListResourceTemplatesResult) templates = result.root.resourceTemplates diff --git a/tests/issues/test_141_resource_templates.py b/tests/issues/test_141_resource_templates.py index 3c17cd559..3145f65e8 100644 --- a/tests/issues/test_141_resource_templates.py +++ b/tests/issues/test_141_resource_templates.py @@ -61,9 +61,7 @@ def get_user_profile_missing(user_id: str) -> str: await mcp.read_resource("resource://users/123/posts") # Missing post_id with pytest.raises(ValueError, match="Unknown resource"): - await mcp.read_resource( - "resource://users/123/posts/456/extra" - ) # Extra path component + await mcp.read_resource("resource://users/123/posts/456/extra") # Extra path component @pytest.mark.anyio @@ -110,11 +108,7 @@ def get_user_profile(user_id: str) -> str: # Verify invalid resource URIs raise appropriate errors with pytest.raises(Exception): # Specific exception type 
may vary - await session.read_resource( - AnyUrl("resource://users/123/posts") - ) # Missing post_id + await session.read_resource(AnyUrl("resource://users/123/posts")) # Missing post_id with pytest.raises(Exception): # Specific exception type may vary - await session.read_resource( - AnyUrl("resource://users/123/invalid") - ) # Invalid template + await session.read_resource(AnyUrl("resource://users/123/invalid")) # Invalid template diff --git a/tests/issues/test_152_resource_mime_type.py b/tests/issues/test_152_resource_mime_type.py index 1143195e5..a99e5a5c7 100644 --- a/tests/issues/test_152_resource_mime_type.py +++ b/tests/issues/test_152_resource_mime_type.py @@ -45,31 +45,19 @@ def get_image_as_bytes() -> bytes: bytes_resource = mapping["test://image_bytes"] # Verify mime types - assert ( - string_resource.mimeType == "image/png" - ), "String resource mime type not respected" - assert ( - bytes_resource.mimeType == "image/png" - ), "Bytes resource mime type not respected" + assert string_resource.mimeType == "image/png", "String resource mime type not respected" + assert bytes_resource.mimeType == "image/png", "Bytes resource mime type not respected" # Also verify the content can be read correctly string_result = await client.read_resource(AnyUrl("test://image")) assert len(string_result.contents) == 1 - assert ( - getattr(string_result.contents[0], "text") == base64_string - ), "Base64 string mismatch" - assert ( - string_result.contents[0].mimeType == "image/png" - ), "String content mime type not preserved" + assert getattr(string_result.contents[0], "text") == base64_string, "Base64 string mismatch" + assert string_result.contents[0].mimeType == "image/png", "String content mime type not preserved" bytes_result = await client.read_resource(AnyUrl("test://image_bytes")) assert len(bytes_result.contents) == 1 - assert ( - base64.b64decode(getattr(bytes_result.contents[0], "blob")) == image_bytes - ), "Bytes mismatch" - assert ( - bytes_result.contents[0].mimeType == "image/png" - ), "Bytes content mime type not preserved" + assert base64.b64decode(getattr(bytes_result.contents[0], "blob")) == image_bytes, "Bytes mismatch" + assert bytes_result.contents[0].mimeType == "image/png", "Bytes content mime type not preserved" async def test_lowlevel_resource_mime_type(): @@ -82,9 +70,7 @@ async def test_lowlevel_resource_mime_type(): # Create test resources with specific mime types test_resources = [ - types.Resource( - uri=AnyUrl("test://image"), name="test image", mimeType="image/png" - ), + types.Resource(uri=AnyUrl("test://image"), name="test image", mimeType="image/png"), types.Resource( uri=AnyUrl("test://image_bytes"), name="test image bytes", @@ -101,9 +87,7 @@ async def handle_read_resource(uri: AnyUrl): if str(uri) == "test://image": return [ReadResourceContents(content=base64_string, mime_type="image/png")] elif str(uri) == "test://image_bytes": - return [ - ReadResourceContents(content=bytes(image_bytes), mime_type="image/png") - ] + return [ReadResourceContents(content=bytes(image_bytes), mime_type="image/png")] raise Exception(f"Resource not found: {uri}") # Test that resources are listed with correct mime type @@ -119,28 +103,16 @@ async def handle_read_resource(uri: AnyUrl): bytes_resource = mapping["test://image_bytes"] # Verify mime types - assert ( - string_resource.mimeType == "image/png" - ), "String resource mime type not respected" - assert ( - bytes_resource.mimeType == "image/png" - ), "Bytes resource mime type not respected" + assert string_resource.mimeType == 
"image/png", "String resource mime type not respected" + assert bytes_resource.mimeType == "image/png", "Bytes resource mime type not respected" # Also verify the content can be read correctly string_result = await client.read_resource(AnyUrl("test://image")) assert len(string_result.contents) == 1 - assert ( - getattr(string_result.contents[0], "text") == base64_string - ), "Base64 string mismatch" - assert ( - string_result.contents[0].mimeType == "image/png" - ), "String content mime type not preserved" + assert getattr(string_result.contents[0], "text") == base64_string, "Base64 string mismatch" + assert string_result.contents[0].mimeType == "image/png", "String content mime type not preserved" bytes_result = await client.read_resource(AnyUrl("test://image_bytes")) assert len(bytes_result.contents) == 1 - assert ( - base64.b64decode(getattr(bytes_result.contents[0], "blob")) == image_bytes - ), "Bytes mismatch" - assert ( - bytes_result.contents[0].mimeType == "image/png" - ), "Bytes content mime type not preserved" + assert base64.b64decode(getattr(bytes_result.contents[0], "blob")) == image_bytes, "Bytes mismatch" + assert bytes_result.contents[0].mimeType == "image/png", "Bytes content mime type not preserved" diff --git a/tests/issues/test_176_progress_token.py b/tests/issues/test_176_progress_token.py index 7f9131a1e..eb5f19d64 100644 --- a/tests/issues/test_176_progress_token.py +++ b/tests/issues/test_176_progress_token.py @@ -35,15 +35,7 @@ async def test_progress_token_zero_first_call(): await ctx.report_progress(10, 10) # Complete # Verify progress notifications - assert ( - mock_session.send_progress_notification.call_count == 3 - ), "All progress notifications should be sent" - mock_session.send_progress_notification.assert_any_call( - progress_token=0, progress=0.0, total=10.0 - ) - mock_session.send_progress_notification.assert_any_call( - progress_token=0, progress=5.0, total=10.0 - ) - mock_session.send_progress_notification.assert_any_call( - progress_token=0, progress=10.0, total=10.0 - ) + assert mock_session.send_progress_notification.call_count == 3, "All progress notifications should be sent" + mock_session.send_progress_notification.assert_any_call(progress_token=0, progress=0.0, total=10.0, message=None) + mock_session.send_progress_notification.assert_any_call(progress_token=0, progress=5.0, total=10.0, message=None) + mock_session.send_progress_notification.assert_any_call(progress_token=0, progress=10.0, total=10.0, message=None) diff --git a/tests/issues/test_188_concurrency.py b/tests/issues/test_188_concurrency.py index 2aa6c49cb..9ccffefa9 100644 --- a/tests/issues/test_188_concurrency.py +++ b/tests/issues/test_188_concurrency.py @@ -35,7 +35,7 @@ async def slow_resource(): end_time = anyio.current_time() duration = end_time - start_time - assert duration < 3 * _sleep_time_seconds + assert duration < 10 * _sleep_time_seconds print(duration) diff --git a/tests/issues/test_192_request_id.py b/tests/issues/test_192_request_id.py index 00e187895..3c63f00b7 100644 --- a/tests/issues/test_192_request_id.py +++ b/tests/issues/test_192_request_id.py @@ -3,6 +3,7 @@ from mcp.server.lowlevel import NotificationOptions, Server from mcp.server.models import InitializationOptions +from mcp.shared.message import SessionMessage from mcp.types import ( LATEST_PROTOCOL_VERSION, ClientCapabilities, @@ -64,8 +65,8 @@ async def run_server(): jsonrpc="2.0", ) - await client_writer.send(JSONRPCMessage(root=init_req)) - await server_reader.receive() # Get init response but 
don't need to check it + await client_writer.send(SessionMessage(JSONRPCMessage(root=init_req))) + response = await server_reader.receive() # Get init response but don't need to check it # Send initialized notification initialized_notification = JSONRPCNotification( @@ -73,22 +74,18 @@ async def run_server(): params=NotificationParams().model_dump(by_alias=True, exclude_none=True), jsonrpc="2.0", ) - await client_writer.send(JSONRPCMessage(root=initialized_notification)) + await client_writer.send(SessionMessage(JSONRPCMessage(root=initialized_notification))) # Send ping request with custom ID - ping_request = JSONRPCRequest( - id=custom_request_id, method="ping", params={}, jsonrpc="2.0" - ) + ping_request = JSONRPCRequest(id=custom_request_id, method="ping", params={}, jsonrpc="2.0") - await client_writer.send(JSONRPCMessage(root=ping_request)) + await client_writer.send(SessionMessage(JSONRPCMessage(root=ping_request))) # Read response response = await server_reader.receive() # Verify response ID matches request ID - assert ( - response.root.id == custom_request_id - ), "Response ID should match request ID" + assert response.message.root.id == custom_request_id, "Response ID should match request ID" # Cancel server task tg.cancel_scope.cancel() diff --git a/tests/issues/test_342_base64_encoding.py b/tests/issues/test_342_base64_encoding.py index cff8ec543..6a6e410c7 100644 --- a/tests/issues/test_342_base64_encoding.py +++ b/tests/issues/test_342_base64_encoding.py @@ -47,11 +47,7 @@ async def test_server_base64_encoding_issue(): # Register a resource handler that returns our test data @server.read_resource() async def read_resource(uri: AnyUrl) -> list[ReadResourceContents]: - return [ - ReadResourceContents( - content=binary_data, mime_type="application/octet-stream" - ) - ] + return [ReadResourceContents(content=binary_data, mime_type="application/octet-stream")] # Get the handler directly from the server handler = server.request_handlers[ReadResourceRequest] diff --git a/tests/issues/test_88_random_error.py b/tests/issues/test_88_random_error.py index 88e41d66d..7ba970f0b 100644 --- a/tests/issues/test_88_random_error.py +++ b/tests/issues/test_88_random_error.py @@ -11,11 +11,7 @@ from mcp.client.session import ClientSession from mcp.server.lowlevel import Server from mcp.shared.exceptions import McpError -from mcp.types import ( - EmbeddedResource, - ImageContent, - TextContent, -) +from mcp.types import ContentBlock, TextContent @pytest.mark.anyio @@ -35,9 +31,7 @@ async def test_notification_validation_error(tmp_path: Path): slow_request_complete = anyio.Event() @server.call_tool() - async def slow_tool( - name: str, arg - ) -> Sequence[TextContent | ImageContent | EmbeddedResource]: + async def slow_tool(name: str, arg) -> Sequence[ContentBlock]: nonlocal request_count request_count += 1 @@ -74,9 +68,7 @@ async def client(read_stream, write_stream, scope): # - Long enough for fast operations (>10ms) # - Short enough for slow operations (<200ms) # - Not too short to avoid flakiness - async with ClientSession( - read_stream, write_stream, read_timeout_seconds=timedelta(milliseconds=50) - ) as session: + async with ClientSession(read_stream, write_stream, read_timeout_seconds=timedelta(milliseconds=50)) as session: await session.initialize() # First call should work (fast operation) diff --git a/tests/issues/test_malformed_input.py b/tests/issues/test_malformed_input.py new file mode 100644 index 000000000..97edb651e --- /dev/null +++ b/tests/issues/test_malformed_input.py @@ -0,0 
+1,160 @@ +# Claude Debug +"""Test for HackerOne vulnerability report #3156202 - malformed input DOS.""" + +import anyio +import pytest + +from mcp.server.models import InitializationOptions +from mcp.server.session import ServerSession +from mcp.shared.message import SessionMessage +from mcp.types import ( + INVALID_PARAMS, + JSONRPCError, + JSONRPCMessage, + JSONRPCRequest, + ServerCapabilities, +) + + +@pytest.mark.anyio +async def test_malformed_initialize_request_does_not_crash_server(): + """ + Test that malformed initialize requests return proper error responses + instead of crashing the server (HackerOne #3156202). + """ + # Create in-memory streams for testing + read_send_stream, read_receive_stream = anyio.create_memory_object_stream[SessionMessage | Exception](10) + write_send_stream, write_receive_stream = anyio.create_memory_object_stream[SessionMessage](10) + + try: + # Create a malformed initialize request (missing required params field) + malformed_request = JSONRPCRequest( + jsonrpc="2.0", + id="f20fe86132ed4cd197f89a7134de5685", + method="initialize", + # params=None # Missing required params field + ) + + # Wrap in session message + request_message = SessionMessage(message=JSONRPCMessage(malformed_request)) + + # Start a server session + async with ServerSession( + read_stream=read_receive_stream, + write_stream=write_send_stream, + init_options=InitializationOptions( + server_name="test_server", + server_version="1.0.0", + capabilities=ServerCapabilities(), + ), + ): + # Send the malformed request + await read_send_stream.send(request_message) + + # Give the session time to process the request + await anyio.sleep(0.1) + + # Check that we received an error response instead of a crash + try: + response_message = write_receive_stream.receive_nowait() + response = response_message.message.root + + # Verify it's a proper JSON-RPC error response + assert isinstance(response, JSONRPCError) + assert response.jsonrpc == "2.0" + assert response.id == "f20fe86132ed4cd197f89a7134de5685" + assert response.error.code == INVALID_PARAMS + assert "Invalid request parameters" in response.error.message + + # Verify the session is still alive and can handle more requests + # Send another malformed request to confirm server stability + another_malformed_request = JSONRPCRequest( + jsonrpc="2.0", + id="test_id_2", + method="tools/call", + # params=None # Missing required params + ) + another_request_message = SessionMessage(message=JSONRPCMessage(another_malformed_request)) + + await read_send_stream.send(another_request_message) + await anyio.sleep(0.1) + + # Should get another error response, not a crash + second_response_message = write_receive_stream.receive_nowait() + second_response = second_response_message.message.root + + assert isinstance(second_response, JSONRPCError) + assert second_response.id == "test_id_2" + assert second_response.error.code == INVALID_PARAMS + + except anyio.WouldBlock: + pytest.fail("No response received - server likely crashed") + finally: + # Close all streams to ensure proper cleanup + await read_send_stream.aclose() + await write_send_stream.aclose() + await read_receive_stream.aclose() + await write_receive_stream.aclose() + + +@pytest.mark.anyio +async def test_multiple_concurrent_malformed_requests(): + """ + Test that multiple concurrent malformed requests don't crash the server. 
+ """ + # Create in-memory streams for testing + read_send_stream, read_receive_stream = anyio.create_memory_object_stream[SessionMessage | Exception](100) + write_send_stream, write_receive_stream = anyio.create_memory_object_stream[SessionMessage](100) + + try: + # Start a server session + async with ServerSession( + read_stream=read_receive_stream, + write_stream=write_send_stream, + init_options=InitializationOptions( + server_name="test_server", + server_version="1.0.0", + capabilities=ServerCapabilities(), + ), + ): + # Send multiple malformed requests concurrently + malformed_requests = [] + for i in range(10): + malformed_request = JSONRPCRequest( + jsonrpc="2.0", + id=f"malformed_{i}", + method="initialize", + # params=None # Missing required params + ) + request_message = SessionMessage(message=JSONRPCMessage(malformed_request)) + malformed_requests.append(request_message) + + # Send all requests + for request in malformed_requests: + await read_send_stream.send(request) + + # Give time to process + await anyio.sleep(0.2) + + # Verify we get error responses for all requests + error_responses = [] + try: + while True: + response_message = write_receive_stream.receive_nowait() + error_responses.append(response_message.message.root) + except anyio.WouldBlock: + pass # No more messages + + # Should have received 10 error responses + assert len(error_responses) == 10 + + for i, response in enumerate(error_responses): + assert isinstance(response, JSONRPCError) + assert response.id == f"malformed_{i}" + assert response.error.code == INVALID_PARAMS + finally: + # Close all streams to ensure proper cleanup + await read_send_stream.aclose() + await write_send_stream.aclose() + await read_receive_stream.aclose() + await write_receive_stream.aclose() diff --git a/tests/server/auth/middleware/test_auth_context.py b/tests/server/auth/middleware/test_auth_context.py new file mode 100644 index 000000000..916640714 --- /dev/null +++ b/tests/server/auth/middleware/test_auth_context.py @@ -0,0 +1,122 @@ +""" +Tests for the AuthContext middleware components. 
+""" + +import time + +import pytest +from starlette.types import Message, Receive, Scope, Send + +from mcp.server.auth.middleware.auth_context import ( + AuthContextMiddleware, + auth_context_var, + get_access_token, +) +from mcp.server.auth.middleware.bearer_auth import AuthenticatedUser +from mcp.server.auth.provider import AccessToken + + +class MockApp: + """Mock ASGI app for testing.""" + + def __init__(self): + self.called = False + self.scope: Scope | None = None + self.receive: Receive | None = None + self.send: Send | None = None + self.access_token_during_call: AccessToken | None = None + + async def __call__(self, scope: Scope, receive: Receive, send: Send) -> None: + self.called = True + self.scope = scope + self.receive = receive + self.send = send + # Check the context during the call + self.access_token_during_call = get_access_token() + + +@pytest.fixture +def valid_access_token() -> AccessToken: + """Create a valid access token.""" + return AccessToken( + token="valid_token", + client_id="test_client", + scopes=["read", "write"], + expires_at=int(time.time()) + 3600, # 1 hour from now + ) + + +@pytest.mark.anyio +class TestAuthContextMiddleware: + """Tests for the AuthContextMiddleware class.""" + + async def test_with_authenticated_user(self, valid_access_token: AccessToken): + """Test middleware with an authenticated user in scope.""" + app = MockApp() + middleware = AuthContextMiddleware(app) + + # Create an authenticated user + user = AuthenticatedUser(valid_access_token) + + scope: Scope = {"type": "http", "user": user} + + # Create dummy async functions for receive and send + async def receive() -> Message: + return {"type": "http.request"} + + async def send(message: Message) -> None: + pass + + # Verify context is empty before middleware + assert auth_context_var.get() is None + assert get_access_token() is None + + # Run the middleware + await middleware(scope, receive, send) + + # Verify the app was called + assert app.called + assert app.scope == scope + assert app.receive == receive + assert app.send == send + + # Verify the access token was available during the call + assert app.access_token_during_call == valid_access_token + + # Verify context is reset after middleware + assert auth_context_var.get() is None + assert get_access_token() is None + + async def test_with_no_user(self): + """Test middleware with no user in scope.""" + app = MockApp() + middleware = AuthContextMiddleware(app) + + scope: Scope = {"type": "http"} # No user + + # Create dummy async functions for receive and send + async def receive() -> Message: + return {"type": "http.request"} + + async def send(message: Message) -> None: + pass + + # Verify context is empty before middleware + assert auth_context_var.get() is None + assert get_access_token() is None + + # Run the middleware + await middleware(scope, receive, send) + + # Verify the app was called + assert app.called + assert app.scope == scope + assert app.receive == receive + assert app.send == send + + # Verify the access token was not available during the call + assert app.access_token_during_call is None + + # Verify context is still empty after middleware + assert auth_context_var.get() is None + assert get_access_token() is None diff --git a/tests/server/auth/middleware/test_bearer_auth.py b/tests/server/auth/middleware/test_bearer_auth.py new file mode 100644 index 000000000..5bb0f969e --- /dev/null +++ b/tests/server/auth/middleware/test_bearer_auth.py @@ -0,0 +1,458 @@ +""" +Tests for the BearerAuth middleware components. 
+""" + +import time +from typing import Any, cast + +import pytest +from starlette.authentication import AuthCredentials +from starlette.datastructures import Headers +from starlette.requests import Request +from starlette.types import Message, Receive, Scope, Send + +from mcp.server.auth.middleware.bearer_auth import ( + AuthenticatedUser, + BearerAuthBackend, + RequireAuthMiddleware, +) +from mcp.server.auth.provider import ( + AccessToken, + OAuthAuthorizationServerProvider, + ProviderTokenVerifier, +) + + +class MockOAuthProvider: + """Mock OAuth provider for testing. + + This is a simplified version that only implements the methods needed for testing + the BearerAuthMiddleware components. + """ + + def __init__(self): + self.tokens = {} # token -> AccessToken + + def add_token(self, token: str, access_token: AccessToken) -> None: + """Add a token to the provider.""" + self.tokens[token] = access_token + + async def load_access_token(self, token: str) -> AccessToken | None: + """Load an access token.""" + return self.tokens.get(token) + + +def add_token_to_provider( + provider: OAuthAuthorizationServerProvider[Any, Any, Any], + token: str, + access_token: AccessToken, +) -> None: + """Helper function to add a token to a provider. + + This is used to work around type checking issues with our mock provider. + """ + # We know this is actually a MockOAuthProvider + mock_provider = cast(MockOAuthProvider, provider) + mock_provider.add_token(token, access_token) + + +class MockApp: + """Mock ASGI app for testing.""" + + def __init__(self): + self.called = False + self.scope: Scope | None = None + self.receive: Receive | None = None + self.send: Send | None = None + + async def __call__(self, scope: Scope, receive: Receive, send: Send) -> None: + self.called = True + self.scope = scope + self.receive = receive + self.send = send + + +@pytest.fixture +def mock_oauth_provider() -> OAuthAuthorizationServerProvider[Any, Any, Any]: + """Create a mock OAuth provider.""" + # Use type casting to satisfy the type checker + return cast(OAuthAuthorizationServerProvider[Any, Any, Any], MockOAuthProvider()) + + +@pytest.fixture +def valid_access_token() -> AccessToken: + """Create a valid access token.""" + return AccessToken( + token="valid_token", + client_id="test_client", + scopes=["read", "write"], + expires_at=int(time.time()) + 3600, # 1 hour from now + ) + + +@pytest.fixture +def expired_access_token() -> AccessToken: + """Create an expired access token.""" + return AccessToken( + token="expired_token", + client_id="test_client", + scopes=["read"], + expires_at=int(time.time()) - 3600, # 1 hour ago + ) + + +@pytest.fixture +def no_expiry_access_token() -> AccessToken: + """Create an access token with no expiry.""" + return AccessToken( + token="no_expiry_token", + client_id="test_client", + scopes=["read", "write"], + expires_at=None, + ) + + +@pytest.mark.anyio +class TestBearerAuthBackend: + """Tests for the BearerAuthBackend class.""" + + async def test_no_auth_header(self, mock_oauth_provider: OAuthAuthorizationServerProvider[Any, Any, Any]): + """Test authentication with no Authorization header.""" + backend = BearerAuthBackend(token_verifier=ProviderTokenVerifier(mock_oauth_provider)) + request = Request({"type": "http", "headers": []}) + result = await backend.authenticate(request) + assert result is None + + async def test_non_bearer_auth_header(self, mock_oauth_provider: OAuthAuthorizationServerProvider[Any, Any, Any]): + """Test authentication with non-Bearer Authorization header.""" + 
backend = BearerAuthBackend(token_verifier=ProviderTokenVerifier(mock_oauth_provider)) + request = Request( + { + "type": "http", + "headers": [(b"authorization", b"Basic dXNlcjpwYXNz")], + } + ) + result = await backend.authenticate(request) + assert result is None + + async def test_invalid_token(self, mock_oauth_provider: OAuthAuthorizationServerProvider[Any, Any, Any]): + """Test authentication with invalid token.""" + backend = BearerAuthBackend(token_verifier=ProviderTokenVerifier(mock_oauth_provider)) + request = Request( + { + "type": "http", + "headers": [(b"authorization", b"Bearer invalid_token")], + } + ) + result = await backend.authenticate(request) + assert result is None + + async def test_expired_token( + self, + mock_oauth_provider: OAuthAuthorizationServerProvider[Any, Any, Any], + expired_access_token: AccessToken, + ): + """Test authentication with expired token.""" + backend = BearerAuthBackend(token_verifier=ProviderTokenVerifier(mock_oauth_provider)) + add_token_to_provider(mock_oauth_provider, "expired_token", expired_access_token) + request = Request( + { + "type": "http", + "headers": [(b"authorization", b"Bearer expired_token")], + } + ) + result = await backend.authenticate(request) + assert result is None + + async def test_valid_token( + self, + mock_oauth_provider: OAuthAuthorizationServerProvider[Any, Any, Any], + valid_access_token: AccessToken, + ): + """Test authentication with valid token.""" + backend = BearerAuthBackend(token_verifier=ProviderTokenVerifier(mock_oauth_provider)) + add_token_to_provider(mock_oauth_provider, "valid_token", valid_access_token) + request = Request( + { + "type": "http", + "headers": [(b"authorization", b"Bearer valid_token")], + } + ) + result = await backend.authenticate(request) + assert result is not None + credentials, user = result + assert isinstance(credentials, AuthCredentials) + assert isinstance(user, AuthenticatedUser) + assert credentials.scopes == ["read", "write"] + assert user.display_name == "test_client" + assert user.access_token == valid_access_token + assert user.scopes == ["read", "write"] + + async def test_token_without_expiry( + self, + mock_oauth_provider: OAuthAuthorizationServerProvider[Any, Any, Any], + no_expiry_access_token: AccessToken, + ): + """Test authentication with token that has no expiry.""" + backend = BearerAuthBackend(token_verifier=ProviderTokenVerifier(mock_oauth_provider)) + add_token_to_provider(mock_oauth_provider, "no_expiry_token", no_expiry_access_token) + request = Request( + { + "type": "http", + "headers": [(b"authorization", b"Bearer no_expiry_token")], + } + ) + result = await backend.authenticate(request) + assert result is not None + credentials, user = result + assert isinstance(credentials, AuthCredentials) + assert isinstance(user, AuthenticatedUser) + assert credentials.scopes == ["read", "write"] + assert user.display_name == "test_client" + assert user.access_token == no_expiry_access_token + assert user.scopes == ["read", "write"] + + async def test_lowercase_bearer_prefix( + self, + mock_oauth_provider: OAuthAuthorizationServerProvider[Any, Any, Any], + valid_access_token: AccessToken, + ): + """Test with lowercase 'bearer' prefix in Authorization header""" + backend = BearerAuthBackend(token_verifier=ProviderTokenVerifier(mock_oauth_provider)) + add_token_to_provider(mock_oauth_provider, "valid_token", valid_access_token) + headers = Headers({"Authorization": "bearer valid_token"}) + scope = {"type": "http", "headers": headers.raw} + request = Request(scope) 
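+        # The scheme should be matched case-insensitively, so the lowercase "bearer" prefix still authenticates.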
+ result = await backend.authenticate(request) + assert result is not None + credentials, user = result + assert isinstance(credentials, AuthCredentials) + assert isinstance(user, AuthenticatedUser) + assert credentials.scopes == ["read", "write"] + assert user.display_name == "test_client" + assert user.access_token == valid_access_token + + async def test_mixed_case_bearer_prefix( + self, + mock_oauth_provider: OAuthAuthorizationServerProvider[Any, Any, Any], + valid_access_token: AccessToken, + ): + """Test with mixed 'BeArEr' prefix in Authorization header""" + backend = BearerAuthBackend(token_verifier=ProviderTokenVerifier(mock_oauth_provider)) + add_token_to_provider(mock_oauth_provider, "valid_token", valid_access_token) + headers = Headers({"authorization": "BeArEr valid_token"}) + scope = {"type": "http", "headers": headers.raw} + request = Request(scope) + result = await backend.authenticate(request) + assert result is not None + credentials, user = result + assert isinstance(credentials, AuthCredentials) + assert isinstance(user, AuthenticatedUser) + assert credentials.scopes == ["read", "write"] + assert user.display_name == "test_client" + assert user.access_token == valid_access_token + + async def test_mixed_case_authorization_header( + self, + mock_oauth_provider: OAuthAuthorizationServerProvider[Any, Any, Any], + valid_access_token: AccessToken, + ): + """Test authentication with mixed 'Authorization' header.""" + backend = BearerAuthBackend(token_verifier=ProviderTokenVerifier(mock_oauth_provider)) + add_token_to_provider(mock_oauth_provider, "valid_token", valid_access_token) + headers = Headers({"AuThOrIzAtIoN": "BeArEr valid_token"}) + scope = {"type": "http", "headers": headers.raw} + request = Request(scope) + result = await backend.authenticate(request) + assert result is not None + credentials, user = result + assert isinstance(credentials, AuthCredentials) + assert isinstance(user, AuthenticatedUser) + assert credentials.scopes == ["read", "write"] + assert user.display_name == "test_client" + assert user.access_token == valid_access_token + + +@pytest.mark.anyio +class TestRequireAuthMiddleware: + """Tests for the RequireAuthMiddleware class.""" + + async def test_no_user(self): + """Test middleware with no user in scope.""" + app = MockApp() + middleware = RequireAuthMiddleware(app, required_scopes=["read"]) + scope: Scope = {"type": "http"} + + # Create dummy async functions for receive and send + async def receive() -> Message: + return {"type": "http.request"} + + sent_messages = [] + + async def send(message: Message) -> None: + sent_messages.append(message) + + await middleware(scope, receive, send) + + # Check that a 401 response was sent + assert len(sent_messages) == 2 + assert sent_messages[0]["type"] == "http.response.start" + assert sent_messages[0]["status"] == 401 + assert any(h[0] == b"www-authenticate" for h in sent_messages[0]["headers"]) + assert not app.called + + async def test_non_authenticated_user(self): + """Test middleware with non-authenticated user in scope.""" + app = MockApp() + middleware = RequireAuthMiddleware(app, required_scopes=["read"]) + scope: Scope = {"type": "http", "user": object()} + + # Create dummy async functions for receive and send + async def receive() -> Message: + return {"type": "http.request"} + + sent_messages = [] + + async def send(message: Message) -> None: + sent_messages.append(message) + + await middleware(scope, receive, send) + + # Check that a 401 response was sent + assert len(sent_messages) == 2 + 
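        # The first ASGI message starts the 401 response (status and headers); the second carries the body. +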
assert sent_messages[0]["type"] == "http.response.start" + assert sent_messages[0]["status"] == 401 + assert any(h[0] == b"www-authenticate" for h in sent_messages[0]["headers"]) + assert not app.called + + async def test_missing_required_scope(self, valid_access_token: AccessToken): + """Test middleware with user missing required scope.""" + app = MockApp() + middleware = RequireAuthMiddleware(app, required_scopes=["admin"]) + + # Create a user with read/write scopes but not admin + user = AuthenticatedUser(valid_access_token) + auth = AuthCredentials(["read", "write"]) + + scope: Scope = {"type": "http", "user": user, "auth": auth} + + # Create dummy async functions for receive and send + async def receive() -> Message: + return {"type": "http.request"} + + sent_messages = [] + + async def send(message: Message) -> None: + sent_messages.append(message) + + await middleware(scope, receive, send) + + # Check that a 403 response was sent + assert len(sent_messages) == 2 + assert sent_messages[0]["type"] == "http.response.start" + assert sent_messages[0]["status"] == 403 + assert any(h[0] == b"www-authenticate" for h in sent_messages[0]["headers"]) + assert not app.called + + async def test_no_auth_credentials(self, valid_access_token: AccessToken): + """Test middleware with no auth credentials in scope.""" + app = MockApp() + middleware = RequireAuthMiddleware(app, required_scopes=["read"]) + + # Create a user with read/write scopes + user = AuthenticatedUser(valid_access_token) + + scope: Scope = {"type": "http", "user": user} # No auth credentials + + # Create dummy async functions for receive and send + async def receive() -> Message: + return {"type": "http.request"} + + sent_messages = [] + + async def send(message: Message) -> None: + sent_messages.append(message) + + await middleware(scope, receive, send) + + # Check that a 403 response was sent + assert len(sent_messages) == 2 + assert sent_messages[0]["type"] == "http.response.start" + assert sent_messages[0]["status"] == 403 + assert any(h[0] == b"www-authenticate" for h in sent_messages[0]["headers"]) + assert not app.called + + async def test_has_required_scopes(self, valid_access_token: AccessToken): + """Test middleware with user having all required scopes.""" + app = MockApp() + middleware = RequireAuthMiddleware(app, required_scopes=["read"]) + + # Create a user with read/write scopes + user = AuthenticatedUser(valid_access_token) + auth = AuthCredentials(["read", "write"]) + + scope: Scope = {"type": "http", "user": user, "auth": auth} + + # Create dummy async functions for receive and send + async def receive() -> Message: + return {"type": "http.request"} + + async def send(message: Message) -> None: + pass + + await middleware(scope, receive, send) + + assert app.called + assert app.scope == scope + assert app.receive == receive + assert app.send == send + + async def test_multiple_required_scopes(self, valid_access_token: AccessToken): + """Test middleware with multiple required scopes.""" + app = MockApp() + middleware = RequireAuthMiddleware(app, required_scopes=["read", "write"]) + + # Create a user with read/write scopes + user = AuthenticatedUser(valid_access_token) + auth = AuthCredentials(["read", "write"]) + + scope: Scope = {"type": "http", "user": user, "auth": auth} + + # Create dummy async functions for receive and send + async def receive() -> Message: + return {"type": "http.request"} + + async def send(message: Message) -> None: + pass + + await middleware(scope, receive, send) + + assert app.called + 
assert app.scope == scope + assert app.receive == receive + assert app.send == send + + async def test_no_required_scopes(self, valid_access_token: AccessToken): + """Test middleware with no required scopes.""" + app = MockApp() + middleware = RequireAuthMiddleware(app, required_scopes=[]) + + # Create a user with read/write scopes + user = AuthenticatedUser(valid_access_token) + auth = AuthCredentials(["read", "write"]) + + scope: Scope = {"type": "http", "user": user, "auth": auth} + + # Create dummy async functions for receive and send + async def receive() -> Message: + return {"type": "http.request"} + + async def send(message: Message) -> None: + pass + + await middleware(scope, receive, send) + + assert app.called + assert app.scope == scope + assert app.receive == receive + assert app.send == send diff --git a/tests/server/auth/test_error_handling.py b/tests/server/auth/test_error_handling.py new file mode 100644 index 000000000..7846c8adb --- /dev/null +++ b/tests/server/auth/test_error_handling.py @@ -0,0 +1,286 @@ +""" +Tests for OAuth error handling in the auth handlers. +""" + +import unittest.mock +from urllib.parse import parse_qs, urlparse + +import httpx +import pytest +from httpx import ASGITransport +from pydantic import AnyHttpUrl +from starlette.applications import Starlette + +from mcp.server.auth.provider import ( + AuthorizeError, + RegistrationError, + TokenError, +) +from mcp.server.auth.routes import create_auth_routes +from tests.server.fastmcp.auth.test_auth_integration import ( + MockOAuthProvider, +) + + +@pytest.fixture +def oauth_provider(): + """Return a MockOAuthProvider instance that can be configured to raise errors.""" + return MockOAuthProvider() + + +@pytest.fixture +def app(oauth_provider): + from mcp.server.auth.settings import ClientRegistrationOptions, RevocationOptions + + # Enable client registration + client_registration_options = ClientRegistrationOptions(enabled=True) + revocation_options = RevocationOptions(enabled=True) + + # Create auth routes + auth_routes = create_auth_routes( + oauth_provider, + issuer_url=AnyHttpUrl("http://localhost"), + client_registration_options=client_registration_options, + revocation_options=revocation_options, + ) + + # Create Starlette app with routes directly + return Starlette(routes=auth_routes) + + +@pytest.fixture +def client(app): + transport = ASGITransport(app=app) + # Use base_url without a path since routes are directly on the app + return httpx.AsyncClient(transport=transport, base_url="http://localhost") + + +@pytest.fixture +def pkce_challenge(): + """Create a PKCE challenge with code_verifier and code_challenge.""" + import base64 + import hashlib + import secrets + + # Generate a code verifier + code_verifier = secrets.token_urlsafe(64)[:128] + + # Create code challenge using S256 method + code_verifier_bytes = code_verifier.encode("ascii") + sha256 = hashlib.sha256(code_verifier_bytes).digest() + code_challenge = base64.urlsafe_b64encode(sha256).decode().rstrip("=") + + return {"code_verifier": code_verifier, "code_challenge": code_challenge} + + +@pytest.fixture +async def registered_client(client): + """Create and register a test client.""" + # Default client metadata + client_metadata = { + "redirect_uris": ["https://client.example.com/callback"], + "token_endpoint_auth_method": "client_secret_post", + "grant_types": ["authorization_code", "refresh_token"], + "response_types": ["code"], + "client_name": "Test Client", + } + + response = await client.post("/register", json=client_metadata) + 
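    # Dynamic client registration should succeed and return the issued client credentials. +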
assert response.status_code == 201, f"Failed to register client: {response.content}" + + client_info = response.json() + return client_info + + +class TestRegistrationErrorHandling: + @pytest.mark.anyio + async def test_registration_error_handling(self, client, oauth_provider): + # Mock the register_client method to raise a registration error + with unittest.mock.patch.object( + oauth_provider, + "register_client", + side_effect=RegistrationError( + error="invalid_redirect_uri", + error_description="The redirect URI is invalid", + ), + ): + # Prepare a client registration request + client_data = { + "redirect_uris": ["https://client.example.com/callback"], + "token_endpoint_auth_method": "client_secret_post", + "grant_types": ["authorization_code", "refresh_token"], + "response_types": ["code"], + "client_name": "Test Client", + } + + # Send the registration request + response = await client.post( + "/register", + json=client_data, + ) + + # Verify the response + assert response.status_code == 400, response.content + data = response.json() + assert data["error"] == "invalid_redirect_uri" + assert data["error_description"] == "The redirect URI is invalid" + + +class TestAuthorizeErrorHandling: + @pytest.mark.anyio + async def test_authorize_error_handling(self, client, oauth_provider, registered_client, pkce_challenge): + # Mock the authorize method to raise an authorize error + with unittest.mock.patch.object( + oauth_provider, + "authorize", + side_effect=AuthorizeError(error="access_denied", error_description="The user denied the request"), + ): + # Register the client + client_id = registered_client["client_id"] + redirect_uri = registered_client["redirect_uris"][0] + + # Prepare an authorization request + params = { + "client_id": client_id, + "redirect_uri": redirect_uri, + "response_type": "code", + "code_challenge": pkce_challenge["code_challenge"], + "code_challenge_method": "S256", + "state": "test_state", + } + + # Send the authorization request + response = await client.get("/authorize", params=params) + + # Verify the response is a redirect with error parameters + assert response.status_code == 302 + redirect_url = response.headers["location"] + parsed_url = urlparse(redirect_url) + query_params = parse_qs(parsed_url.query) + + assert query_params["error"][0] == "access_denied" + assert "error_description" in query_params + assert query_params["state"][0] == "test_state" + + +class TestTokenErrorHandling: + @pytest.mark.anyio + async def test_token_error_handling_auth_code(self, client, oauth_provider, registered_client, pkce_challenge): + # Register the client and get an auth code + client_id = registered_client["client_id"] + client_secret = registered_client["client_secret"] + redirect_uri = registered_client["redirect_uris"][0] + + # First get an authorization code + auth_response = await client.get( + "/authorize", + params={ + "client_id": client_id, + "redirect_uri": redirect_uri, + "response_type": "code", + "code_challenge": pkce_challenge["code_challenge"], + "code_challenge_method": "S256", + "state": "test_state", + }, + ) + + redirect_url = auth_response.headers["location"] + parsed_url = urlparse(redirect_url) + query_params = parse_qs(parsed_url.query) + code = query_params["code"][0] + + # Mock the exchange_authorization_code method to raise a token error + with unittest.mock.patch.object( + oauth_provider, + "exchange_authorization_code", + side_effect=TokenError( + error="invalid_grant", + error_description="The authorization code is invalid", + ), + ): + # 
Try to exchange the code for tokens + token_response = await client.post( + "/token", + data={ + "grant_type": "authorization_code", + "code": code, + "redirect_uri": redirect_uri, + "client_id": client_id, + "client_secret": client_secret, + "code_verifier": pkce_challenge["code_verifier"], + }, + ) + + # Verify the response + assert token_response.status_code == 400 + data = token_response.json() + assert data["error"] == "invalid_grant" + assert data["error_description"] == "The authorization code is invalid" + + @pytest.mark.anyio + async def test_token_error_handling_refresh_token(self, client, oauth_provider, registered_client, pkce_challenge): + # Register the client and get tokens + client_id = registered_client["client_id"] + client_secret = registered_client["client_secret"] + redirect_uri = registered_client["redirect_uris"][0] + + # First get an authorization code + auth_response = await client.get( + "/authorize", + params={ + "client_id": client_id, + "redirect_uri": redirect_uri, + "response_type": "code", + "code_challenge": pkce_challenge["code_challenge"], + "code_challenge_method": "S256", + "state": "test_state", + }, + ) + assert auth_response.status_code == 302, auth_response.content + + redirect_url = auth_response.headers["location"] + parsed_url = urlparse(redirect_url) + query_params = parse_qs(parsed_url.query) + code = query_params["code"][0] + + # Exchange the code for tokens + token_response = await client.post( + "/token", + data={ + "grant_type": "authorization_code", + "code": code, + "redirect_uri": redirect_uri, + "client_id": client_id, + "client_secret": client_secret, + "code_verifier": pkce_challenge["code_verifier"], + }, + ) + + tokens = token_response.json() + refresh_token = tokens["refresh_token"] + + # Mock the exchange_refresh_token method to raise a token error + with unittest.mock.patch.object( + oauth_provider, + "exchange_refresh_token", + side_effect=TokenError( + error="invalid_scope", + error_description="The requested scope is invalid", + ), + ): + # Try to use the refresh token + refresh_response = await client.post( + "/token", + data={ + "grant_type": "refresh_token", + "refresh_token": refresh_token, + "client_id": client_id, + "client_secret": client_secret, + }, + ) + + # Verify the response + assert refresh_response.status_code == 400 + data = refresh_response.json() + assert data["error"] == "invalid_scope" + assert data["error_description"] == "The requested scope is invalid" diff --git a/tests/server/fastmcp/auth/__init__.py b/tests/server/fastmcp/auth/__init__.py new file mode 100644 index 000000000..64d318ec4 --- /dev/null +++ b/tests/server/fastmcp/auth/__init__.py @@ -0,0 +1,3 @@ +""" +Tests for the MCP server auth components. +""" diff --git a/tests/server/fastmcp/auth/test_auth_integration.py b/tests/server/fastmcp/auth/test_auth_integration.py new file mode 100644 index 000000000..5db5d58c2 --- /dev/null +++ b/tests/server/fastmcp/auth/test_auth_integration.py @@ -0,0 +1,1203 @@ +""" +Integration tests for MCP authorization components. 
+""" + +import base64 +import hashlib +import secrets +import time +import unittest.mock +from urllib.parse import parse_qs, urlparse + +import httpx +import pytest +from pydantic import AnyHttpUrl +from starlette.applications import Starlette + +from mcp.server.auth.provider import ( + AccessToken, + AuthorizationCode, + AuthorizationParams, + OAuthAuthorizationServerProvider, + RefreshToken, + construct_redirect_uri, +) +from mcp.server.auth.routes import ( + ClientRegistrationOptions, + RevocationOptions, + create_auth_routes, +) +from mcp.shared.auth import ( + OAuthClientInformationFull, + OAuthToken, +) + + +# Mock OAuth provider for testing +class MockOAuthProvider(OAuthAuthorizationServerProvider): + def __init__(self): + self.clients = {} + self.auth_codes = {} # code -> {client_id, code_challenge, redirect_uri} + self.tokens = {} # token -> {client_id, scopes, expires_at} + self.refresh_tokens = {} # refresh_token -> access_token + + async def get_client(self, client_id: str) -> OAuthClientInformationFull | None: + return self.clients.get(client_id) + + async def register_client(self, client_info: OAuthClientInformationFull): + self.clients[client_info.client_id] = client_info + + async def authorize(self, client: OAuthClientInformationFull, params: AuthorizationParams) -> str: + # toy authorize implementation which just immediately generates an authorization + # code and completes the redirect + code = AuthorizationCode( + code=f"code_{int(time.time())}", + client_id=client.client_id, + code_challenge=params.code_challenge, + redirect_uri=params.redirect_uri, + redirect_uri_provided_explicitly=params.redirect_uri_provided_explicitly, + expires_at=time.time() + 300, + scopes=params.scopes or ["read", "write"], + ) + self.auth_codes[code.code] = code + + return construct_redirect_uri(str(params.redirect_uri), code=code.code, state=params.state) + + async def load_authorization_code( + self, client: OAuthClientInformationFull, authorization_code: str + ) -> AuthorizationCode | None: + return self.auth_codes.get(authorization_code) + + async def exchange_authorization_code( + self, client: OAuthClientInformationFull, authorization_code: AuthorizationCode + ) -> OAuthToken: + assert authorization_code.code in self.auth_codes + + # Generate an access token and refresh token + access_token = f"access_{secrets.token_hex(32)}" + refresh_token = f"refresh_{secrets.token_hex(32)}" + + # Store the tokens + self.tokens[access_token] = AccessToken( + token=access_token, + client_id=client.client_id, + scopes=authorization_code.scopes, + expires_at=int(time.time()) + 3600, + ) + + self.refresh_tokens[refresh_token] = access_token + + # Remove the used code + del self.auth_codes[authorization_code.code] + + return OAuthToken( + access_token=access_token, + token_type="Bearer", + expires_in=3600, + scope="read write", + refresh_token=refresh_token, + ) + + async def load_refresh_token(self, client: OAuthClientInformationFull, refresh_token: str) -> RefreshToken | None: + old_access_token = self.refresh_tokens.get(refresh_token) + if old_access_token is None: + return None + token_info = self.tokens.get(old_access_token) + if token_info is None: + return None + + # Create a RefreshToken object that matches what is expected in later code + refresh_obj = RefreshToken( + token=refresh_token, + client_id=token_info.client_id, + scopes=token_info.scopes, + expires_at=token_info.expires_at, + ) + + return refresh_obj + + async def exchange_refresh_token( + self, + client: OAuthClientInformationFull, + 
refresh_token: RefreshToken, + scopes: list[str], + ) -> OAuthToken: + # Check if refresh token exists + assert refresh_token.token in self.refresh_tokens + + old_access_token = self.refresh_tokens[refresh_token.token] + + # Check if the access token exists + assert old_access_token in self.tokens + + # Check if the token was issued to this client + token_info = self.tokens[old_access_token] + assert token_info.client_id == client.client_id + + # Generate a new access token and refresh token + new_access_token = f"access_{secrets.token_hex(32)}" + new_refresh_token = f"refresh_{secrets.token_hex(32)}" + + # Store the new tokens + self.tokens[new_access_token] = AccessToken( + token=new_access_token, + client_id=client.client_id, + scopes=scopes or token_info.scopes, + expires_at=int(time.time()) + 3600, + ) + + self.refresh_tokens[new_refresh_token] = new_access_token + + # Remove the old tokens + del self.refresh_tokens[refresh_token.token] + del self.tokens[old_access_token] + + return OAuthToken( + access_token=new_access_token, + token_type="Bearer", + expires_in=3600, + scope=" ".join(scopes) if scopes else " ".join(token_info.scopes), + refresh_token=new_refresh_token, + ) + + async def load_access_token(self, token: str) -> AccessToken | None: + token_info = self.tokens.get(token) + + # Check if token is expired + # if token_info.expires_at < int(time.time()): + # raise InvalidTokenError("Access token has expired") + + return token_info and AccessToken( + token=token, + client_id=token_info.client_id, + scopes=token_info.scopes, + expires_at=token_info.expires_at, + ) + + async def revoke_token(self, token: AccessToken | RefreshToken) -> None: + match token: + case RefreshToken(): + # Remove the refresh token + del self.refresh_tokens[token.token] + + case AccessToken(): + # Remove the access token + del self.tokens[token.token] + + # Also remove any refresh tokens that point to this access token + for refresh_token, access_token in list(self.refresh_tokens.items()): + if access_token == token.token: + del self.refresh_tokens[refresh_token] + + +@pytest.fixture +def mock_oauth_provider(): + return MockOAuthProvider() + + +@pytest.fixture +def auth_app(mock_oauth_provider): + # Create auth router + auth_routes = create_auth_routes( + mock_oauth_provider, + AnyHttpUrl("https://auth.example.com"), + AnyHttpUrl("https://docs.example.com"), + client_registration_options=ClientRegistrationOptions( + enabled=True, + valid_scopes=["read", "write", "profile"], + default_scopes=["read", "write"], + ), + revocation_options=RevocationOptions(enabled=True), + ) + + # Create Starlette app + app = Starlette(routes=auth_routes) + + return app + + +@pytest.fixture +async def test_client(auth_app): + async with httpx.AsyncClient(transport=httpx.ASGITransport(app=auth_app), base_url="https://mcptest.com") as client: + yield client + + +@pytest.fixture +async def registered_client(test_client: httpx.AsyncClient, request): + """Create and register a test client. 
+ + Parameters can be customized via indirect parameterization: + @pytest.mark.parametrize("registered_client", + [{"grant_types": ["authorization_code"]}], + indirect=True) + """ + # Default client metadata + client_metadata = { + "redirect_uris": ["https://client.example.com/callback"], + "client_name": "Test Client", + "grant_types": ["authorization_code", "refresh_token"], + } + + # Override with any parameters from the test + if hasattr(request, "param") and request.param: + client_metadata.update(request.param) + + response = await test_client.post("/register", json=client_metadata) + assert response.status_code == 201, f"Failed to register client: {response.content}" + + client_info = response.json() + return client_info + + +@pytest.fixture +def pkce_challenge(): + """Create a PKCE challenge with code_verifier and code_challenge.""" + code_verifier = "some_random_verifier_string" + code_challenge = base64.urlsafe_b64encode(hashlib.sha256(code_verifier.encode()).digest()).decode().rstrip("=") + + return {"code_verifier": code_verifier, "code_challenge": code_challenge} + + +@pytest.fixture +async def auth_code(test_client, registered_client, pkce_challenge, request): + """Get an authorization code. + + Parameters can be customized via indirect parameterization: + @pytest.mark.parametrize("auth_code", + [{"redirect_uri": "https://client.example.com/other-callback"}], + indirect=True) + """ + # Default authorize params + auth_params = { + "response_type": "code", + "client_id": registered_client["client_id"], + "redirect_uri": "https://client.example.com/callback", + "code_challenge": pkce_challenge["code_challenge"], + "code_challenge_method": "S256", + "state": "test_state", + } + + # Override with any parameters from the test + if hasattr(request, "param") and request.param: + auth_params.update(request.param) + + response = await test_client.get("/authorize", params=auth_params) + assert response.status_code == 302, f"Failed to get auth code: {response.content}" + + # Extract the authorization code + redirect_url = response.headers["location"] + parsed_url = urlparse(redirect_url) + query_params = parse_qs(parsed_url.query) + + assert "code" in query_params, f"No code in response: {query_params}" + auth_code = query_params["code"][0] + + return { + "code": auth_code, + "redirect_uri": auth_params["redirect_uri"], + "state": query_params.get("state", [None])[0], + } + + +@pytest.fixture +async def tokens(test_client, registered_client, auth_code, pkce_challenge, request): + """Exchange authorization code for tokens. 
+ + Parameters can be customized via indirect parameterization: + @pytest.mark.parametrize("tokens", + [{"code_verifier": "wrong_verifier"}], + indirect=True) + """ + # Default token request params + token_params = { + "grant_type": "authorization_code", + "client_id": registered_client["client_id"], + "client_secret": registered_client["client_secret"], + "code": auth_code["code"], + "code_verifier": pkce_challenge["code_verifier"], + "redirect_uri": auth_code["redirect_uri"], + } + + # Override with any parameters from the test + if hasattr(request, "param") and request.param: + token_params.update(request.param) + + response = await test_client.post("/token", data=token_params) + + # Don't assert success here since some tests will intentionally cause errors + return { + "response": response, + "params": token_params, + } + + +class TestAuthEndpoints: + @pytest.mark.anyio + async def test_metadata_endpoint(self, test_client: httpx.AsyncClient): + """Test the OAuth 2.0 metadata endpoint.""" + print("Sending request to metadata endpoint") + response = await test_client.get("/.well-known/oauth-authorization-server") + print(f"Got response: {response.status_code}") + if response.status_code != 200: + print(f"Response content: {response.content}") + assert response.status_code == 200 + + metadata = response.json() + assert metadata["issuer"] == "https://auth.example.com/" + assert metadata["authorization_endpoint"] == "https://auth.example.com/authorize" + assert metadata["token_endpoint"] == "https://auth.example.com/token" + assert metadata["registration_endpoint"] == "https://auth.example.com/register" + assert metadata["revocation_endpoint"] == "https://auth.example.com/revoke" + assert metadata["response_types_supported"] == ["code"] + assert metadata["code_challenge_methods_supported"] == ["S256"] + assert metadata["token_endpoint_auth_methods_supported"] == ["client_secret_post"] + assert metadata["grant_types_supported"] == [ + "authorization_code", + "refresh_token", + ] + assert metadata["service_documentation"] == "https://docs.example.com/" + + @pytest.mark.anyio + async def test_token_validation_error(self, test_client: httpx.AsyncClient): + """Test token endpoint error - validation error.""" + # Missing required fields + response = await test_client.post( + "/token", + data={ + "grant_type": "authorization_code", + # Missing code, code_verifier, client_id, etc. 
+ }, + ) + error_response = response.json() + assert error_response["error"] == "invalid_request" + assert "error_description" in error_response # Contains validation error messages + + @pytest.mark.anyio + async def test_token_invalid_auth_code(self, test_client, registered_client, pkce_challenge): + """Test token endpoint error - authorization code does not exist.""" + # Try to use a non-existent authorization code + response = await test_client.post( + "/token", + data={ + "grant_type": "authorization_code", + "client_id": registered_client["client_id"], + "client_secret": registered_client["client_secret"], + "code": "non_existent_auth_code", + "code_verifier": pkce_challenge["code_verifier"], + "redirect_uri": "https://client.example.com/callback", + }, + ) + print(f"Status code: {response.status_code}") + print(f"Response body: {response.content}") + print(f"Response JSON: {response.json()}") + assert response.status_code == 400 + error_response = response.json() + assert error_response["error"] == "invalid_grant" + assert "authorization code does not exist" in error_response["error_description"] + + @pytest.mark.anyio + async def test_token_expired_auth_code( + self, + test_client, + registered_client, + auth_code, + pkce_challenge, + mock_oauth_provider, + ): + """Test token endpoint error - authorization code has expired.""" + # Get the current time for our time mocking + current_time = time.time() + + # Find the auth code object + code_value = auth_code["code"] + found_code = None + for code_obj in mock_oauth_provider.auth_codes.values(): + if code_obj.code == code_value: + found_code = code_obj + break + + assert found_code is not None + + # Authorization codes are typically short-lived (5 minutes = 300 seconds) + # So we'll mock time to be 10 minutes (600 seconds) in the future + with unittest.mock.patch("time.time", return_value=current_time + 600): + # Try to use the expired authorization code + response = await test_client.post( + "/token", + data={ + "grant_type": "authorization_code", + "client_id": registered_client["client_id"], + "client_secret": registered_client["client_secret"], + "code": code_value, + "code_verifier": pkce_challenge["code_verifier"], + "redirect_uri": auth_code["redirect_uri"], + }, + ) + assert response.status_code == 400 + error_response = response.json() + assert error_response["error"] == "invalid_grant" + assert "authorization code has expired" in error_response["error_description"] + + @pytest.mark.anyio + @pytest.mark.parametrize( + "registered_client", + [ + { + "redirect_uris": [ + "https://client.example.com/callback", + "https://client.example.com/other-callback", + ] + } + ], + indirect=True, + ) + async def test_token_redirect_uri_mismatch(self, test_client, registered_client, auth_code, pkce_challenge): + """Test token endpoint error - redirect URI mismatch.""" + # Try to use the code with a different redirect URI + response = await test_client.post( + "/token", + data={ + "grant_type": "authorization_code", + "client_id": registered_client["client_id"], + "client_secret": registered_client["client_secret"], + "code": auth_code["code"], + "code_verifier": pkce_challenge["code_verifier"], + # Different from the one used in /authorize + "redirect_uri": "https://client.example.com/other-callback", + }, + ) + assert response.status_code == 400 + error_response = response.json() + assert error_response["error"] == "invalid_request" + assert "redirect_uri did not match" in error_response["error_description"] + + @pytest.mark.anyio + async 
def test_token_code_verifier_mismatch(self, test_client, registered_client, auth_code): + """Test token endpoint error - PKCE code verifier mismatch.""" + # Try to use the code with an incorrect code verifier + response = await test_client.post( + "/token", + data={ + "grant_type": "authorization_code", + "client_id": registered_client["client_id"], + "client_secret": registered_client["client_secret"], + "code": auth_code["code"], + # Different from the one used to create challenge + "code_verifier": "incorrect_code_verifier", + "redirect_uri": auth_code["redirect_uri"], + }, + ) + assert response.status_code == 400 + error_response = response.json() + assert error_response["error"] == "invalid_grant" + assert "incorrect code_verifier" in error_response["error_description"] + + @pytest.mark.anyio + async def test_token_invalid_refresh_token(self, test_client, registered_client): + """Test token endpoint error - refresh token does not exist.""" + # Try to use a non-existent refresh token + response = await test_client.post( + "/token", + data={ + "grant_type": "refresh_token", + "client_id": registered_client["client_id"], + "client_secret": registered_client["client_secret"], + "refresh_token": "non_existent_refresh_token", + }, + ) + assert response.status_code == 400 + error_response = response.json() + assert error_response["error"] == "invalid_grant" + assert "refresh token does not exist" in error_response["error_description"] + + @pytest.mark.anyio + async def test_token_expired_refresh_token( + self, + test_client, + registered_client, + auth_code, + pkce_challenge, + mock_oauth_provider, + ): + """Test token endpoint error - refresh token has expired.""" + # Step 1: First, let's create a token and refresh token at the current time + current_time = time.time() + + # Exchange authorization code for tokens normally + token_response = await test_client.post( + "/token", + data={ + "grant_type": "authorization_code", + "client_id": registered_client["client_id"], + "client_secret": registered_client["client_secret"], + "code": auth_code["code"], + "code_verifier": pkce_challenge["code_verifier"], + "redirect_uri": auth_code["redirect_uri"], + }, + ) + assert token_response.status_code == 200 + tokens = token_response.json() + refresh_token = tokens["refresh_token"] + + # Step 2: Time travel forward 4 hours (tokens expire in 1 hour by default) + # Mock the time.time() function to return a value 4 hours in the future + with unittest.mock.patch("time.time", return_value=current_time + 14400): # 4 hours = 14400 seconds + # Try to use the refresh token which should now be considered expired + response = await test_client.post( + "/token", + data={ + "grant_type": "refresh_token", + "client_id": registered_client["client_id"], + "client_secret": registered_client["client_secret"], + "refresh_token": refresh_token, + }, + ) + + # In the "future", the token should be considered expired + assert response.status_code == 400 + error_response = response.json() + assert error_response["error"] == "invalid_grant" + assert "refresh token has expired" in error_response["error_description"] + + @pytest.mark.anyio + async def test_token_invalid_scope(self, test_client, registered_client, auth_code, pkce_challenge): + """Test token endpoint error - invalid scope in refresh token request.""" + # Exchange authorization code for tokens + token_response = await test_client.post( + "/token", + data={ + "grant_type": "authorization_code", + "client_id": registered_client["client_id"], + "client_secret": 
registered_client["client_secret"], + "code": auth_code["code"], + "code_verifier": pkce_challenge["code_verifier"], + "redirect_uri": auth_code["redirect_uri"], + }, + ) + assert token_response.status_code == 200 + + tokens = token_response.json() + refresh_token = tokens["refresh_token"] + + # Try to use refresh token with an invalid scope + response = await test_client.post( + "/token", + data={ + "grant_type": "refresh_token", + "client_id": registered_client["client_id"], + "client_secret": registered_client["client_secret"], + "refresh_token": refresh_token, + "scope": "read write invalid_scope", # Adding an invalid scope + }, + ) + assert response.status_code == 400 + error_response = response.json() + assert error_response["error"] == "invalid_scope" + assert "cannot request scope" in error_response["error_description"] + + @pytest.mark.anyio + async def test_client_registration(self, test_client: httpx.AsyncClient, mock_oauth_provider: MockOAuthProvider): + """Test client registration.""" + client_metadata = { + "redirect_uris": ["https://client.example.com/callback"], + "client_name": "Test Client", + "client_uri": "https://client.example.com", + } + + response = await test_client.post( + "/register", + json=client_metadata, + ) + assert response.status_code == 201, response.content + + client_info = response.json() + assert "client_id" in client_info + assert "client_secret" in client_info + assert client_info["client_name"] == "Test Client" + assert client_info["redirect_uris"] == ["https://client.example.com/callback"] + + # Verify that the client was registered + # assert await mock_oauth_provider.clients_store.get_client( + # client_info["client_id"] + # ) is not None + + @pytest.mark.anyio + async def test_client_registration_missing_required_fields(self, test_client: httpx.AsyncClient): + """Test client registration with missing required fields.""" + # Missing redirect_uris which is a required field + client_metadata = { + "client_name": "Test Client", + "client_uri": "https://client.example.com", + } + + response = await test_client.post( + "/register", + json=client_metadata, + ) + assert response.status_code == 400 + error_data = response.json() + assert "error" in error_data + assert error_data["error"] == "invalid_client_metadata" + assert error_data["error_description"] == "redirect_uris: Field required" + + @pytest.mark.anyio + async def test_client_registration_invalid_uri(self, test_client: httpx.AsyncClient): + """Test client registration with invalid URIs.""" + # Invalid redirect_uri format + client_metadata = { + "redirect_uris": ["not-a-valid-uri"], + "client_name": "Test Client", + } + + response = await test_client.post( + "/register", + json=client_metadata, + ) + assert response.status_code == 400 + error_data = response.json() + assert "error" in error_data + assert error_data["error"] == "invalid_client_metadata" + assert error_data["error_description"] == ( + "redirect_uris.0: Input should be a valid URL, " "relative URL without a base" + ) + + @pytest.mark.anyio + async def test_client_registration_empty_redirect_uris(self, test_client: httpx.AsyncClient): + """Test client registration with empty redirect_uris array.""" + client_metadata = { + "redirect_uris": [], # Empty array + "client_name": "Test Client", + } + + response = await test_client.post( + "/register", + json=client_metadata, + ) + assert response.status_code == 400 + error_data = response.json() + assert "error" in error_data + assert error_data["error"] == "invalid_client_metadata" + 
assert ( + error_data["error_description"] == "redirect_uris: List should have at least 1 item after validation, not 0" + ) + + @pytest.mark.anyio + async def test_authorize_form_post( + self, + test_client: httpx.AsyncClient, + mock_oauth_provider: MockOAuthProvider, + pkce_challenge, + ): + """Test the authorization endpoint using POST with form-encoded data.""" + # Register a client + client_metadata = { + "redirect_uris": ["https://client.example.com/callback"], + "client_name": "Test Client", + "grant_types": ["authorization_code", "refresh_token"], + } + + response = await test_client.post( + "/register", + json=client_metadata, + ) + assert response.status_code == 201 + client_info = response.json() + + # Use POST with form-encoded data for authorization + response = await test_client.post( + "/authorize", + data={ + "response_type": "code", + "client_id": client_info["client_id"], + "redirect_uri": "https://client.example.com/callback", + "code_challenge": pkce_challenge["code_challenge"], + "code_challenge_method": "S256", + "state": "test_form_state", + }, + ) + assert response.status_code == 302 + + # Extract the authorization code from the redirect URL + redirect_url = response.headers["location"] + parsed_url = urlparse(redirect_url) + query_params = parse_qs(parsed_url.query) + + assert "code" in query_params + assert query_params["state"][0] == "test_form_state" + + @pytest.mark.anyio + async def test_authorization_get( + self, + test_client: httpx.AsyncClient, + mock_oauth_provider: MockOAuthProvider, + pkce_challenge, + ): + """Test the full authorization flow.""" + # 1. Register a client + client_metadata = { + "redirect_uris": ["https://client.example.com/callback"], + "client_name": "Test Client", + "grant_types": ["authorization_code", "refresh_token"], + } + + response = await test_client.post( + "/register", + json=client_metadata, + ) + assert response.status_code == 201 + client_info = response.json() + + # 2. Request authorization using GET with query params + response = await test_client.get( + "/authorize", + params={ + "response_type": "code", + "client_id": client_info["client_id"], + "redirect_uri": "https://client.example.com/callback", + "code_challenge": pkce_challenge["code_challenge"], + "code_challenge_method": "S256", + "state": "test_state", + }, + ) + assert response.status_code == 302 + + # 3. Extract the authorization code from the redirect URL + redirect_url = response.headers["location"] + parsed_url = urlparse(redirect_url) + query_params = parse_qs(parsed_url.query) + + assert "code" in query_params + assert query_params["state"][0] == "test_state" + auth_code = query_params["code"][0] + + # 4. Exchange the authorization code for tokens + response = await test_client.post( + "/token", + data={ + "grant_type": "authorization_code", + "client_id": client_info["client_id"], + "client_secret": client_info["client_secret"], + "code": auth_code, + "code_verifier": pkce_challenge["code_verifier"], + "redirect_uri": "https://client.example.com/callback", + }, + ) + assert response.status_code == 200 + + token_response = response.json() + assert "access_token" in token_response + assert "token_type" in token_response + assert "refresh_token" in token_response + assert "expires_in" in token_response + assert token_response["token_type"] == "Bearer" + + # 5. 
Verify the access token + access_token = token_response["access_token"] + refresh_token = token_response["refresh_token"] + + # Create a test client with the token + auth_info = await mock_oauth_provider.load_access_token(access_token) + assert auth_info + assert auth_info.client_id == client_info["client_id"] + assert "read" in auth_info.scopes + assert "write" in auth_info.scopes + + # 6. Refresh the token + response = await test_client.post( + "/token", + data={ + "grant_type": "refresh_token", + "client_id": client_info["client_id"], + "client_secret": client_info["client_secret"], + "refresh_token": refresh_token, + "redirect_uri": "https://client.example.com/callback", + }, + ) + assert response.status_code == 200 + + new_token_response = response.json() + assert "access_token" in new_token_response + assert "refresh_token" in new_token_response + assert new_token_response["access_token"] != access_token + assert new_token_response["refresh_token"] != refresh_token + + # 7. Revoke the token + response = await test_client.post( + "/revoke", + data={ + "client_id": client_info["client_id"], + "client_secret": client_info["client_secret"], + "token": new_token_response["access_token"], + }, + ) + assert response.status_code == 200 + + # Verify that the token was revoked + assert await mock_oauth_provider.load_access_token(new_token_response["access_token"]) is None + + @pytest.mark.anyio + async def test_revoke_invalid_token(self, test_client, registered_client): + """Test revoking an invalid token.""" + response = await test_client.post( + "/revoke", + data={ + "client_id": registered_client["client_id"], + "client_secret": registered_client["client_secret"], + "token": "invalid_token", + }, + ) + # per RFC, this should return 200 even if the token is invalid + assert response.status_code == 200 + + @pytest.mark.anyio + async def test_revoke_with_malformed_token(self, test_client, registered_client): + response = await test_client.post( + "/revoke", + data={ + "client_id": registered_client["client_id"], + "client_secret": registered_client["client_secret"], + "token": 123, + "token_type_hint": "asdf", + }, + ) + assert response.status_code == 400 + error_response = response.json() + assert error_response["error"] == "invalid_request" + assert "token_type_hint" in error_response["error_description"] + + @pytest.mark.anyio + async def test_client_registration_disallowed_scopes(self, test_client: httpx.AsyncClient): + """Test client registration with scopes that are not allowed.""" + client_metadata = { + "redirect_uris": ["https://client.example.com/callback"], + "client_name": "Test Client", + "scope": "read write profile admin", # 'admin' is not in valid_scopes + } + + response = await test_client.post( + "/register", + json=client_metadata, + ) + assert response.status_code == 400 + error_data = response.json() + assert "error" in error_data + assert error_data["error"] == "invalid_client_metadata" + assert "scope" in error_data["error_description"] + assert "admin" in error_data["error_description"] + + @pytest.mark.anyio + async def test_client_registration_default_scopes( + self, test_client: httpx.AsyncClient, mock_oauth_provider: MockOAuthProvider + ): + client_metadata = { + "redirect_uris": ["https://client.example.com/callback"], + "client_name": "Test Client", + # No scope specified + } + + response = await test_client.post( + "/register", + json=client_metadata, + ) + assert response.status_code == 201 + client_info = response.json() + + # Verify client was registered 
successfully + assert client_info["scope"] == "read write" + + # Retrieve the client from the store to verify default scopes + registered_client = await mock_oauth_provider.get_client(client_info["client_id"]) + assert registered_client is not None + + # Check that default scopes were applied + assert registered_client.scope == "read write" + + @pytest.mark.anyio + async def test_client_registration_invalid_grant_type(self, test_client: httpx.AsyncClient): + client_metadata = { + "redirect_uris": ["https://client.example.com/callback"], + "client_name": "Test Client", + "grant_types": ["authorization_code"], + } + + response = await test_client.post( + "/register", + json=client_metadata, + ) + assert response.status_code == 400 + error_data = response.json() + assert "error" in error_data + assert error_data["error"] == "invalid_client_metadata" + assert error_data["error_description"] == "grant_types must be authorization_code and refresh_token" + + +class TestAuthorizeEndpointErrors: + """Test error handling in the OAuth authorization endpoint.""" + + @pytest.mark.anyio + async def test_authorize_missing_client_id(self, test_client: httpx.AsyncClient, pkce_challenge): + """Test authorization endpoint with missing client_id. + + According to the OAuth2.0 spec, if client_id is missing, the server should + inform the resource owner and NOT redirect. + """ + response = await test_client.get( + "/authorize", + params={ + "response_type": "code", + # Missing client_id + "redirect_uri": "https://client.example.com/callback", + "state": "test_state", + "code_challenge": pkce_challenge["code_challenge"], + "code_challenge_method": "S256", + }, + ) + + # Should NOT redirect, should show an error page + assert response.status_code == 400 + # The response should include an error message about missing client_id + assert "client_id" in response.text.lower() + + @pytest.mark.anyio + async def test_authorize_invalid_client_id(self, test_client: httpx.AsyncClient, pkce_challenge): + """Test authorization endpoint with invalid client_id. + + According to the OAuth2.0 spec, if client_id is invalid, the server should + inform the resource owner and NOT redirect. + """ + response = await test_client.get( + "/authorize", + params={ + "response_type": "code", + "client_id": "invalid_client_id_that_does_not_exist", + "redirect_uri": "https://client.example.com/callback", + "state": "test_state", + "code_challenge": pkce_challenge["code_challenge"], + "code_challenge_method": "S256", + }, + ) + + # Should NOT redirect, should show an error page + assert response.status_code == 400 + # The response should include an error message about invalid client_id + assert "client" in response.text.lower() + + @pytest.mark.anyio + async def test_authorize_missing_redirect_uri( + self, test_client: httpx.AsyncClient, registered_client, pkce_challenge + ): + """Test authorization endpoint with missing redirect_uri. + + If client has only one registered redirect_uri, it can be omitted. 
+ """ + + response = await test_client.get( + "/authorize", + params={ + "response_type": "code", + "client_id": registered_client["client_id"], + # Missing redirect_uri + "code_challenge": pkce_challenge["code_challenge"], + "code_challenge_method": "S256", + "state": "test_state", + }, + ) + + # Should redirect to the registered redirect_uri + assert response.status_code == 302, response.content + redirect_url = response.headers["location"] + assert redirect_url.startswith("https://client.example.com/callback") + + @pytest.mark.anyio + async def test_authorize_invalid_redirect_uri( + self, test_client: httpx.AsyncClient, registered_client, pkce_challenge + ): + """Test authorization endpoint with invalid redirect_uri. + + According to the OAuth2.0 spec, if redirect_uri is invalid or doesn't match, + the server should inform the resource owner and NOT redirect. + """ + + response = await test_client.get( + "/authorize", + params={ + "response_type": "code", + "client_id": registered_client["client_id"], + # Non-matching URI + "redirect_uri": "https://attacker.example.com/callback", + "code_challenge": pkce_challenge["code_challenge"], + "code_challenge_method": "S256", + "state": "test_state", + }, + ) + + # Should NOT redirect, should show an error page + assert response.status_code == 400, response.content + # The response should include an error message about redirect_uri mismatch + assert "redirect" in response.text.lower() + + @pytest.mark.anyio + @pytest.mark.parametrize( + "registered_client", + [ + { + "redirect_uris": [ + "https://client.example.com/callback", + "https://client.example.com/other-callback", + ] + } + ], + indirect=True, + ) + async def test_authorize_missing_redirect_uri_multiple_registered( + self, test_client: httpx.AsyncClient, registered_client, pkce_challenge + ): + """Test endpoint with missing redirect_uri with multiple registered URIs. + + If client has multiple registered redirect_uris, redirect_uri must be provided. + """ + + response = await test_client.get( + "/authorize", + params={ + "response_type": "code", + "client_id": registered_client["client_id"], + # Missing redirect_uri + "code_challenge": pkce_challenge["code_challenge"], + "code_challenge_method": "S256", + "state": "test_state", + }, + ) + + # Should NOT redirect, should return a 400 error + assert response.status_code == 400 + # The response should include an error message about missing redirect_uri + assert "redirect_uri" in response.text.lower() + + @pytest.mark.anyio + async def test_authorize_unsupported_response_type( + self, test_client: httpx.AsyncClient, registered_client, pkce_challenge + ): + """Test authorization endpoint with unsupported response_type. + + According to the OAuth2.0 spec, for other errors like unsupported_response_type, + the server should redirect with error parameters. 
+ """ + + response = await test_client.get( + "/authorize", + params={ + "response_type": "token", # Unsupported (we only support "code") + "client_id": registered_client["client_id"], + "redirect_uri": "https://client.example.com/callback", + "code_challenge": pkce_challenge["code_challenge"], + "code_challenge_method": "S256", + "state": "test_state", + }, + ) + + # Should redirect with error parameters + assert response.status_code == 302 + redirect_url = response.headers["location"] + parsed_url = urlparse(redirect_url) + query_params = parse_qs(parsed_url.query) + + assert "error" in query_params + assert query_params["error"][0] == "unsupported_response_type" + # State should be preserved + assert "state" in query_params + assert query_params["state"][0] == "test_state" + + @pytest.mark.anyio + async def test_authorize_missing_response_type( + self, test_client: httpx.AsyncClient, registered_client, pkce_challenge + ): + """Test authorization endpoint with missing response_type. + + Missing required parameter should result in invalid_request error. + """ + + response = await test_client.get( + "/authorize", + params={ + # Missing response_type + "client_id": registered_client["client_id"], + "redirect_uri": "https://client.example.com/callback", + "code_challenge": pkce_challenge["code_challenge"], + "code_challenge_method": "S256", + "state": "test_state", + }, + ) + + # Should redirect with error parameters + assert response.status_code == 302 + redirect_url = response.headers["location"] + parsed_url = urlparse(redirect_url) + query_params = parse_qs(parsed_url.query) + + assert "error" in query_params + assert query_params["error"][0] == "invalid_request" + # State should be preserved + assert "state" in query_params + assert query_params["state"][0] == "test_state" + + @pytest.mark.anyio + async def test_authorize_missing_pkce_challenge(self, test_client: httpx.AsyncClient, registered_client): + """Test authorization endpoint with missing PKCE code_challenge. + + Missing PKCE parameters should result in invalid_request error. + """ + response = await test_client.get( + "/authorize", + params={ + "response_type": "code", + "client_id": registered_client["client_id"], + # Missing code_challenge + "state": "test_state", + # using default URL + }, + ) + + # Should redirect with error parameters + assert response.status_code == 302 + redirect_url = response.headers["location"] + parsed_url = urlparse(redirect_url) + query_params = parse_qs(parsed_url.query) + + assert "error" in query_params + assert query_params["error"][0] == "invalid_request" + # State should be preserved + assert "state" in query_params + assert query_params["state"][0] == "test_state" + + @pytest.mark.anyio + async def test_authorize_invalid_scope(self, test_client: httpx.AsyncClient, registered_client, pkce_challenge): + """Test authorization endpoint with invalid scope. + + Invalid scope should redirect with invalid_scope error. 
+ """ + + response = await test_client.get( + "/authorize", + params={ + "response_type": "code", + "client_id": registered_client["client_id"], + "redirect_uri": "https://client.example.com/callback", + "code_challenge": pkce_challenge["code_challenge"], + "code_challenge_method": "S256", + "scope": "invalid_scope_that_does_not_exist", + "state": "test_state", + }, + ) + + # Should redirect with error parameters + assert response.status_code == 302 + redirect_url = response.headers["location"] + parsed_url = urlparse(redirect_url) + query_params = parse_qs(parsed_url.query) + + assert "error" in query_params + assert query_params["error"][0] == "invalid_scope" + # State should be preserved + assert "state" in query_params + assert query_params["state"][0] == "test_state" diff --git a/tests/server/fastmcp/prompts/test_base.py b/tests/server/fastmcp/prompts/test_base.py index c4af044a6..5b7b50e63 100644 --- a/tests/server/fastmcp/prompts/test_base.py +++ b/tests/server/fastmcp/prompts/test_base.py @@ -18,9 +18,7 @@ def fn() -> str: return "Hello, world!" prompt = Prompt.from_function(fn) - assert await prompt.render() == [ - UserMessage(content=TextContent(type="text", text="Hello, world!")) - ] + assert await prompt.render() == [UserMessage(content=TextContent(type="text", text="Hello, world!"))] @pytest.mark.anyio async def test_async_fn(self): @@ -28,9 +26,7 @@ async def fn() -> str: return "Hello, world!" prompt = Prompt.from_function(fn) - assert await prompt.render() == [ - UserMessage(content=TextContent(type="text", text="Hello, world!")) - ] + assert await prompt.render() == [UserMessage(content=TextContent(type="text", text="Hello, world!"))] @pytest.mark.anyio async def test_fn_with_args(self): @@ -39,11 +35,7 @@ async def fn(name: str, age: int = 30) -> str: prompt = Prompt.from_function(fn) assert await prompt.render(arguments={"name": "World"}) == [ - UserMessage( - content=TextContent( - type="text", text="Hello, World! You're 30 years old." - ) - ) + UserMessage(content=TextContent(type="text", text="Hello, World! 
You're 30 years old.")) ] @pytest.mark.anyio @@ -61,21 +53,15 @@ async def fn() -> UserMessage: return UserMessage(content="Hello, world!") prompt = Prompt.from_function(fn) - assert await prompt.render() == [ - UserMessage(content=TextContent(type="text", text="Hello, world!")) - ] + assert await prompt.render() == [UserMessage(content=TextContent(type="text", text="Hello, world!"))] @pytest.mark.anyio async def test_fn_returns_assistant_message(self): async def fn() -> AssistantMessage: - return AssistantMessage( - content=TextContent(type="text", text="Hello, world!") - ) + return AssistantMessage(content=TextContent(type="text", text="Hello, world!")) prompt = Prompt.from_function(fn) - assert await prompt.render() == [ - AssistantMessage(content=TextContent(type="text", text="Hello, world!")) - ] + assert await prompt.render() == [AssistantMessage(content=TextContent(type="text", text="Hello, world!"))] @pytest.mark.anyio async def test_fn_returns_multiple_messages(self): @@ -156,9 +142,7 @@ async def fn() -> list[Message]: prompt = Prompt.from_function(fn) assert await prompt.render() == [ - UserMessage( - content=TextContent(type="text", text="Please analyze this file:") - ), + UserMessage(content=TextContent(type="text", text="Please analyze this file:")), UserMessage( content=EmbeddedResource( type="resource", @@ -169,9 +153,7 @@ async def fn() -> list[Message]: ), ) ), - AssistantMessage( - content=TextContent(type="text", text="I'll help analyze that file.") - ), + AssistantMessage(content=TextContent(type="text", text="I'll help analyze that file.")), ] @pytest.mark.anyio diff --git a/tests/server/fastmcp/prompts/test_manager.py b/tests/server/fastmcp/prompts/test_manager.py index c64a4a564..82b234638 100644 --- a/tests/server/fastmcp/prompts/test_manager.py +++ b/tests/server/fastmcp/prompts/test_manager.py @@ -72,9 +72,7 @@ def fn() -> str: prompt = Prompt.from_function(fn) manager.add_prompt(prompt) messages = await manager.render_prompt("fn") - assert messages == [ - UserMessage(content=TextContent(type="text", text="Hello, world!")) - ] + assert messages == [UserMessage(content=TextContent(type="text", text="Hello, world!"))] @pytest.mark.anyio async def test_render_prompt_with_args(self): @@ -87,9 +85,7 @@ def fn(name: str) -> str: prompt = Prompt.from_function(fn) manager.add_prompt(prompt) messages = await manager.render_prompt("fn", arguments={"name": "World"}) - assert messages == [ - UserMessage(content=TextContent(type="text", text="Hello, World!")) - ] + assert messages == [UserMessage(content=TextContent(type="text", text="Hello, World!"))] @pytest.mark.anyio async def test_render_unknown_prompt(self): diff --git a/tests/server/fastmcp/resources/test_file_resources.py b/tests/server/fastmcp/resources/test_file_resources.py index 36cbca32c..ec3c85d8d 100644 --- a/tests/server/fastmcp/resources/test_file_resources.py +++ b/tests/server/fastmcp/resources/test_file_resources.py @@ -100,9 +100,7 @@ async def test_missing_file_error(self, temp_file: Path): with pytest.raises(ValueError, match="Error reading file"): await resource.read() - @pytest.mark.skipif( - os.name == "nt", reason="File permissions behave differently on Windows" - ) + @pytest.mark.skipif(os.name == "nt", reason="File permissions behave differently on Windows") @pytest.mark.anyio async def test_permission_error(self, temp_file: Path): """Test reading a file without permissions.""" diff --git a/tests/server/fastmcp/resources/test_function_resources.py 
b/tests/server/fastmcp/resources/test_function_resources.py index 5bfc72bf6..f59436ae3 100644 --- a/tests/server/fastmcp/resources/test_function_resources.py +++ b/tests/server/fastmcp/resources/test_function_resources.py @@ -100,7 +100,7 @@ class MyModel(BaseModel): fn=lambda: MyModel(name="test"), ) content = await resource.read() - assert content == '{"name": "test"}' + assert content == '{\n "name": "test"\n}' @pytest.mark.anyio async def test_custom_type_conversion(self): @@ -136,3 +136,22 @@ async def get_data() -> str: content = await resource.read() assert content == "Hello, world!" assert resource.mime_type == "text/plain" + + @pytest.mark.anyio + async def test_from_function(self): + """Test creating a FunctionResource from a function.""" + + async def get_data() -> str: + """get_data returns a string""" + return "Hello, world!" + + resource = FunctionResource.from_function( + fn=get_data, + uri="function://test", + name="test", + ) + + assert resource.description == "get_data returns a string" + assert resource.mime_type == "text/plain" + assert resource.name == "test" + assert resource.uri == AnyUrl("function://test") diff --git a/tests/server/fastmcp/resources/test_resource_template.py b/tests/server/fastmcp/resources/test_resource_template.py index 09bc600d0..f47244361 100644 --- a/tests/server/fastmcp/resources/test_resource_template.py +++ b/tests/server/fastmcp/resources/test_resource_template.py @@ -185,4 +185,4 @@ def get_data(value: str) -> CustomData: assert isinstance(resource, FunctionResource) content = await resource.read() - assert content == "hello" + assert content == '"hello"' diff --git a/tests/server/fastmcp/servers/test_file_server.py b/tests/server/fastmcp/servers/test_file_server.py index c1f51cabe..b40778ea8 100644 --- a/tests/server/fastmcp/servers/test_file_server.py +++ b/tests/server/fastmcp/servers/test_file_server.py @@ -114,17 +114,13 @@ async def test_read_resource_file(mcp: FastMCP): @pytest.mark.anyio async def test_delete_file(mcp: FastMCP, test_dir: Path): - await mcp.call_tool( - "delete_file", arguments={"path": str(test_dir / "example.py")} - ) + await mcp.call_tool("delete_file", arguments={"path": str(test_dir / "example.py")}) assert not (test_dir / "example.py").exists() @pytest.mark.anyio async def test_delete_file_and_check_resources(mcp: FastMCP, test_dir: Path): - await mcp.call_tool( - "delete_file", arguments={"path": str(test_dir / "example.py")} - ) + await mcp.call_tool("delete_file", arguments={"path": str(test_dir / "example.py")}) res_iter = await mcp.read_resource("file://test_dir/example.py") res_list = list(res_iter) assert len(res_list) == 1 diff --git a/tests/server/fastmcp/test_elicitation.py b/tests/server/fastmcp/test_elicitation.py new file mode 100644 index 000000000..20937d91d --- /dev/null +++ b/tests/server/fastmcp/test_elicitation.py @@ -0,0 +1,210 @@ +""" +Test the elicitation feature using stdio transport. 
+""" + +import pytest +from pydantic import BaseModel, Field + +from mcp.server.fastmcp import Context, FastMCP +from mcp.shared.memory import create_connected_server_and_client_session +from mcp.types import ElicitResult, TextContent + + +# Shared schema for basic tests +class AnswerSchema(BaseModel): + answer: str = Field(description="The user's answer to the question") + + +def create_ask_user_tool(mcp: FastMCP): + """Create a standard ask_user tool that handles all elicitation responses.""" + + @mcp.tool(description="A tool that uses elicitation") + async def ask_user(prompt: str, ctx: Context) -> str: + result = await ctx.elicit( + message=f"Tool wants to ask: {prompt}", + schema=AnswerSchema, + ) + + if result.action == "accept" and result.data: + return f"User answered: {result.data.answer}" + elif result.action == "decline": + return "User declined to answer" + else: + return "User cancelled" + + return ask_user + + +async def call_tool_and_assert( + mcp: FastMCP, + elicitation_callback, + tool_name: str, + args: dict, + expected_text: str | None = None, + text_contains: list[str] | None = None, +): + """Helper to create session, call tool, and assert result.""" + async with create_connected_server_and_client_session( + mcp._mcp_server, elicitation_callback=elicitation_callback + ) as client_session: + await client_session.initialize() + + result = await client_session.call_tool(tool_name, args) + assert len(result.content) == 1 + assert isinstance(result.content[0], TextContent) + + if expected_text is not None: + assert result.content[0].text == expected_text + elif text_contains is not None: + for substring in text_contains: + assert substring in result.content[0].text + + return result + + +@pytest.mark.anyio +async def test_stdio_elicitation(): + """Test the elicitation feature using stdio transport.""" + mcp = FastMCP(name="StdioElicitationServer") + create_ask_user_tool(mcp) + + # Create a custom handler for elicitation requests + async def elicitation_callback(context, params): + if params.message == "Tool wants to ask: What is your name?": + return ElicitResult(action="accept", content={"answer": "Test User"}) + else: + raise ValueError(f"Unexpected elicitation message: {params.message}") + + await call_tool_and_assert( + mcp, elicitation_callback, "ask_user", {"prompt": "What is your name?"}, "User answered: Test User" + ) + + +@pytest.mark.anyio +async def test_stdio_elicitation_decline(): + """Test elicitation with user declining.""" + mcp = FastMCP(name="StdioElicitationDeclineServer") + create_ask_user_tool(mcp) + + async def elicitation_callback(context, params): + return ElicitResult(action="decline") + + await call_tool_and_assert( + mcp, elicitation_callback, "ask_user", {"prompt": "What is your name?"}, "User declined to answer" + ) + + +@pytest.mark.anyio +async def test_elicitation_schema_validation(): + """Test that elicitation schemas must only contain primitive types.""" + mcp = FastMCP(name="ValidationTestServer") + + def create_validation_tool(name: str, schema_class: type[BaseModel]): + @mcp.tool(name=name, description=f"Tool testing {name}") + async def tool(ctx: Context) -> str: + try: + await ctx.elicit(message="This should fail validation", schema=schema_class) + return "Should not reach here" + except TypeError as e: + return f"Validation failed as expected: {str(e)}" + + return tool + + # Test cases for invalid schemas + class InvalidListSchema(BaseModel): + names: list[str] = Field(description="List of names") + + class NestedModel(BaseModel): + 
value: str + + class InvalidNestedSchema(BaseModel): + nested: NestedModel = Field(description="Nested model") + + create_validation_tool("invalid_list", InvalidListSchema) + create_validation_tool("nested_model", InvalidNestedSchema) + + # Dummy callback (won't be called due to validation failure) + async def elicitation_callback(context, params): + return ElicitResult(action="accept", content={}) + + async with create_connected_server_and_client_session( + mcp._mcp_server, elicitation_callback=elicitation_callback + ) as client_session: + await client_session.initialize() + + # Test both invalid schemas + for tool_name, field_name in [("invalid_list", "names"), ("nested_model", "nested")]: + result = await client_session.call_tool(tool_name, {}) + assert len(result.content) == 1 + assert isinstance(result.content[0], TextContent) + assert "Validation failed as expected" in result.content[0].text + assert field_name in result.content[0].text + + +@pytest.mark.anyio +async def test_elicitation_with_optional_fields(): + """Test that Optional fields work correctly in elicitation schemas.""" + mcp = FastMCP(name="OptionalFieldServer") + + class OptionalSchema(BaseModel): + required_name: str = Field(description="Your name (required)") + optional_age: int | None = Field(default=None, description="Your age (optional)") + optional_email: str | None = Field(default=None, description="Your email (optional)") + subscribe: bool | None = Field(default=False, description="Subscribe to newsletter?") + + @mcp.tool(description="Tool with optional fields") + async def optional_tool(ctx: Context) -> str: + result = await ctx.elicit(message="Please provide your information", schema=OptionalSchema) + + if result.action == "accept" and result.data: + info = [f"Name: {result.data.required_name}"] + if result.data.optional_age is not None: + info.append(f"Age: {result.data.optional_age}") + if result.data.optional_email is not None: + info.append(f"Email: {result.data.optional_email}") + info.append(f"Subscribe: {result.data.subscribe}") + return ", ".join(info) + else: + return f"User {result.action}" + + # Test cases with different field combinations + test_cases = [ + ( + # All fields provided + {"required_name": "John Doe", "optional_age": 30, "optional_email": "john@example.com", "subscribe": True}, + "Name: John Doe, Age: 30, Email: john@example.com, Subscribe: True", + ), + ( + # Only required fields + {"required_name": "Jane Smith"}, + "Name: Jane Smith, Subscribe: False", + ), + ] + + for content, expected in test_cases: + + async def callback(context, params): + return ElicitResult(action="accept", content=content) + + await call_tool_and_assert(mcp, callback, "optional_tool", {}, expected) + + # Test invalid optional field + class InvalidOptionalSchema(BaseModel): + name: str = Field(description="Name") + optional_list: list[str] | None = Field(default=None, description="Invalid optional list") + + @mcp.tool(description="Tool with invalid optional field") + async def invalid_optional_tool(ctx: Context) -> str: + try: + await ctx.elicit(message="This should fail", schema=InvalidOptionalSchema) + return "Should not reach here" + except TypeError as e: + return f"Validation failed: {str(e)}" + + await call_tool_and_assert( + mcp, + lambda c, p: ElicitResult(action="accept", content={}), + "invalid_optional_tool", + {}, + text_contains=["Validation failed:", "optional_list"], + ) diff --git a/tests/server/fastmcp/test_func_metadata.py b/tests/server/fastmcp/test_func_metadata.py index 
b1828ffe9..b13685e88 100644 --- a/tests/server/fastmcp/test_func_metadata.py +++ b/tests/server/fastmcp/test_func_metadata.py @@ -28,9 +28,7 @@ def complex_arguments_fn( # list[str] | str is an interesting case because if it comes in as JSON like # "[\"a\", \"b\"]" then it will be naively parsed as a string. list_str_or_str: list[str] | str, - an_int_annotated_with_field: Annotated[ - int, Field(description="An int with a field") - ], + an_int_annotated_with_field: Annotated[int, Field(description="An int with a field")], an_int_annotated_with_field_and_others: Annotated[ int, str, # Should be ignored, really @@ -42,9 +40,7 @@ def complex_arguments_fn( "123", 456, ], - field_with_default_via_field_annotation_before_nondefault_arg: Annotated[ - int, Field(1) - ], + field_with_default_via_field_annotation_before_nondefault_arg: Annotated[int, Field(1)], unannotated, my_model_a: SomeInputModelA, my_model_a_forward_ref: "SomeInputModelA", @@ -179,9 +175,7 @@ def func_with_str_types(str_or_list: str | list[str]): def test_skip_names(): """Test that skipped parameters are not included in the model""" - def func_with_many_params( - keep_this: int, skip_this: str, also_keep: float, also_skip: bool - ): + def func_with_many_params(keep_this: int, skip_this: str, also_keep: float, also_skip: bool): return keep_this, skip_this, also_keep, also_skip # Skip some parameters diff --git a/tests/server/fastmcp/test_integration.py b/tests/server/fastmcp/test_integration.py new file mode 100644 index 000000000..526201f9a --- /dev/null +++ b/tests/server/fastmcp/test_integration.py @@ -0,0 +1,1171 @@ +""" +Integration tests for FastMCP server functionality. + +These tests validate the proper functioning of FastMCP in various configurations, +including with and without authentication. 
+""" + +import json +import multiprocessing +import socket +import time +from collections.abc import Generator +from typing import Any + +import pytest +import uvicorn +from pydantic import AnyUrl, BaseModel, Field +from starlette.applications import Starlette +from starlette.requests import Request + +from mcp.client.session import ClientSession +from mcp.client.sse import sse_client +from mcp.client.streamable_http import streamablehttp_client +from mcp.server.fastmcp import Context, FastMCP +from mcp.server.fastmcp.resources import FunctionResource +from mcp.server.transport_security import TransportSecuritySettings +from mcp.shared.context import RequestContext +from mcp.types import ( + Completion, + CompletionArgument, + CompletionContext, + CreateMessageRequestParams, + CreateMessageResult, + ElicitResult, + GetPromptResult, + InitializeResult, + LoggingMessageNotification, + ProgressNotification, + PromptReference, + ReadResourceResult, + ResourceLink, + ResourceListChangedNotification, + ResourceTemplateReference, + SamplingMessage, + ServerNotification, + TextContent, + TextResourceContents, + ToolListChangedNotification, +) + + +@pytest.fixture +def server_port() -> int: + """Get a free port for testing.""" + with socket.socket() as s: + s.bind(("127.0.0.1", 0)) + return s.getsockname()[1] + + +@pytest.fixture +def server_url(server_port: int) -> str: + """Get the server URL for testing.""" + return f"http://127.0.0.1:{server_port}" + + +@pytest.fixture +def http_server_port() -> int: + """Get a free port for testing the StreamableHTTP server.""" + with socket.socket() as s: + s.bind(("127.0.0.1", 0)) + return s.getsockname()[1] + + +@pytest.fixture +def http_server_url(http_server_port: int) -> str: + """Get the StreamableHTTP server URL for testing.""" + return f"http://127.0.0.1:{http_server_port}" + + +@pytest.fixture +def stateless_http_server_port() -> int: + """Get a free port for testing the stateless StreamableHTTP server.""" + with socket.socket() as s: + s.bind(("127.0.0.1", 0)) + return s.getsockname()[1] + + +@pytest.fixture +def stateless_http_server_url(stateless_http_server_port: int) -> str: + """Get the stateless StreamableHTTP server URL for testing.""" + return f"http://127.0.0.1:{stateless_http_server_port}" + + +# Create a function to make the FastMCP server app +def make_fastmcp_app(): + """Create a FastMCP server without auth settings.""" + transport_security = TransportSecuritySettings( + allowed_hosts=["127.0.0.1:*", "localhost:*"], allowed_origins=["http://127.0.0.1:*", "http://localhost:*"] + ) + mcp = FastMCP(name="NoAuthServer", transport_security=transport_security) + + # Add a simple tool + @mcp.tool(description="A simple echo tool") + def echo(message: str) -> str: + return f"Echo: {message}" + + # Add a tool that uses elicitation + @mcp.tool(description="A tool that uses elicitation") + async def ask_user(prompt: str, ctx: Context) -> str: + class AnswerSchema(BaseModel): + answer: str = Field(description="The user's answer to the question") + + result = await ctx.elicit(message=f"Tool wants to ask: {prompt}", schema=AnswerSchema) + + if result.action == "accept" and result.data: + return f"User answered: {result.data.answer}" + else: + # Handle cancellation or decline + return f"User cancelled or declined: {result.action}" + + # Create the SSE app + app = mcp.sse_app() + + return mcp, app + + +def make_everything_fastmcp() -> FastMCP: + """Create a FastMCP server with all features enabled for testing.""" + transport_security = 
TransportSecuritySettings( + allowed_hosts=["127.0.0.1:*", "localhost:*"], allowed_origins=["http://127.0.0.1:*", "http://localhost:*"] + ) + mcp = FastMCP(name="EverythingServer", transport_security=transport_security) + + # Tool with context for logging and progress + @mcp.tool(description="A tool that demonstrates logging and progress", title="Progress Tool") + async def tool_with_progress(message: str, ctx: Context, steps: int = 3) -> str: + await ctx.info(f"Starting processing of '{message}' with {steps} steps") + + # Send progress notifications + for i in range(steps): + progress_value = (i + 1) / steps + await ctx.report_progress( + progress=progress_value, + total=1.0, + message=f"Processing step {i + 1} of {steps}", + ) + await ctx.debug(f"Completed step {i + 1}") + + return f"Processed '{message}' in {steps} steps" + + # Simple tool for basic functionality + @mcp.tool(description="A simple echo tool", title="Echo Tool") + def echo(message: str) -> str: + return f"Echo: {message}" + + # Tool that returns ResourceLinks + @mcp.tool(description="Lists files and returns resource links", title="List Files Tool") + def list_files() -> list[ResourceLink]: + """Returns a list of resource links for files matching the pattern.""" + + # Mock some file resources for testing + file_resources = [ + { + "type": "resource_link", + "uri": "file:///project/README.md", + "name": "README.md", + "mimeType": "text/markdown", + } + ] + + result: list[ResourceLink] = [ResourceLink.model_validate(file_json) for file_json in file_resources] + + return result + + # Tool with sampling capability + @mcp.tool(description="A tool that uses sampling to generate content", title="Sampling Tool") + async def sampling_tool(prompt: str, ctx: Context) -> str: + await ctx.info(f"Requesting sampling for prompt: {prompt}") + + # Request sampling from the client + result = await ctx.session.create_message( + messages=[SamplingMessage(role="user", content=TextContent(type="text", text=prompt))], + max_tokens=100, + temperature=0.7, + ) + + await ctx.info(f"Received sampling result from model: {result.model}") + # Handle different content types + if result.content.type == "text": + return f"Sampling result: {result.content.text[:100]}..." + else: + return f"Sampling result: {str(result.content)[:100]}..." 
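+
+    # sampling_tool exercises the server-to-client sampling round trip: ctx.session.create_message
+    # asks the connected client to generate a message for the given prompt, and the tool returns
+    # the first 100 characters of whatever content comes back (text or otherwise).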
+ + # Tool that sends notifications and logging + @mcp.tool(description="A tool that demonstrates notifications and logging", title="Notification Tool") + async def notification_tool(message: str, ctx: Context) -> str: + # Send different log levels + await ctx.debug("Debug: Starting notification tool") + await ctx.info(f"Info: Processing message '{message}'") + await ctx.warning("Warning: This is a test warning") + + # Send resource change notifications + await ctx.session.send_resource_list_changed() + await ctx.session.send_tool_list_changed() + + await ctx.info("Completed notification tool successfully") + return f"Sent notifications and logs for: {message}" + + # Resource - static + def get_static_info() -> str: + return "This is static resource content" + + static_resource = FunctionResource( + uri=AnyUrl("resource://static/info"), + name="Static Info", + title="Static Information", + description="Static information resource", + fn=get_static_info, + ) + mcp.add_resource(static_resource) + + # Resource - dynamic function + @mcp.resource("resource://dynamic/{category}", title="Dynamic Resource") + def dynamic_resource(category: str) -> str: + return f"Dynamic resource content for category: {category}" + + # Resource template + @mcp.resource("resource://template/{id}/data", title="Template Resource") + def template_resource(id: str) -> str: + return f"Template resource data for ID: {id}" + + # Prompt - simple + @mcp.prompt(description="A simple prompt", title="Simple Prompt") + def simple_prompt(topic: str) -> str: + return f"Tell me about {topic}" + + # Prompt - complex with multiple messages + @mcp.prompt(description="Complex prompt with context", title="Complex Prompt") + def complex_prompt(user_query: str, context: str = "general") -> str: + # For simplicity, return a single string that incorporates the context + # Since FastMCP doesn't support system messages in the same way + return f"Context: {context}. 
Query: {user_query}" + + # Resource template with completion support + @mcp.resource("github://repos/{owner}/{repo}", title="GitHub Repository") + def github_repo_resource(owner: str, repo: str) -> str: + return f"Repository: {owner}/{repo}" + + # Add completion handler for the server + @mcp.completion() + async def handle_completion( + ref: PromptReference | ResourceTemplateReference, + argument: CompletionArgument, + context: CompletionContext | None, + ) -> Completion | None: + # Handle GitHub repository completion + if isinstance(ref, ResourceTemplateReference): + if ref.uri == "github://repos/{owner}/{repo}" and argument.name == "repo": + if context and context.arguments and context.arguments.get("owner") == "modelcontextprotocol": + # Return repos for modelcontextprotocol org + return Completion(values=["python-sdk", "typescript-sdk", "specification"], total=3, hasMore=False) + elif context and context.arguments and context.arguments.get("owner") == "test-org": + # Return repos for test-org + return Completion(values=["test-repo1", "test-repo2"], total=2, hasMore=False) + + # Handle prompt completions + if isinstance(ref, PromptReference): + if ref.name == "complex_prompt" and argument.name == "context": + # Complete context values + contexts = ["general", "technical", "business", "academic"] + return Completion( + values=[c for c in contexts if c.startswith(argument.value)], total=None, hasMore=False + ) + + # Default: no completion available + return Completion(values=[], total=0, hasMore=False) + + # Tool that echoes request headers from context + @mcp.tool(description="Echo request headers from context", title="Echo Headers") + def echo_headers(ctx: Context[Any, Any, Request]) -> str: + """Returns the request headers as JSON.""" + headers_info = {} + if ctx.request_context.request: + # Now the type system knows request is a Starlette Request object + headers_info = dict(ctx.request_context.request.headers) + return json.dumps(headers_info) + + # Tool that returns full request context + @mcp.tool(description="Echo request context with custom data", title="Echo Context") + def echo_context(custom_request_id: str, ctx: Context[Any, Any, Request]) -> str: + """Returns request context including headers and custom data.""" + context_data = { + "custom_request_id": custom_request_id, + "headers": {}, + "method": None, + "path": None, + } + if ctx.request_context.request: + request = ctx.request_context.request + context_data["headers"] = dict(request.headers) + context_data["method"] = request.method + context_data["path"] = request.url.path + return json.dumps(context_data) + + # Restaurant booking tool with elicitation + @mcp.tool(description="Book a table at a restaurant with elicitation", title="Restaurant Booking") + async def book_restaurant( + date: str, + time: str, + party_size: int, + ctx: Context, + ) -> str: + """Book a table - uses elicitation if requested date is unavailable.""" + + class AlternativeDateSchema(BaseModel): + checkAlternative: bool = Field(description="Would you like to try another date?") + alternativeDate: str = Field( + default="2024-12-26", + description="What date would you prefer? (YYYY-MM-DD)", + ) + + # For testing: assume dates starting with "2024-12-25" are unavailable + if date.startswith("2024-12-25"): + # Use elicitation to ask about alternatives + result = await ctx.elicit( + message=( + f"No tables available for {party_size} people on {date} " + f"at {time}. Would you like to check another date?" 
+ ), + schema=AlternativeDateSchema, + ) + + if result.action == "accept" and result.data: + if result.data.checkAlternative: + alt_date = result.data.alternativeDate + return f"āœ… Booked table for {party_size} on {alt_date} at {time}" + else: + return "āŒ No booking made" + elif result.action in ("decline", "cancel"): + return "āŒ Booking cancelled" + else: + # Handle case where action is "accept" but data is None + return "āŒ No booking data received" + else: + # Available - book directly + return f"āœ… Booked table for {party_size} on {date} at {time}" + + return mcp + + +def make_everything_fastmcp_app(): + """Create a comprehensive FastMCP server with SSE transport.""" + mcp = make_everything_fastmcp() + # Create the SSE app + app = mcp.sse_app() + return mcp, app + + +def make_fastmcp_streamable_http_app(): + """Create a FastMCP server with StreamableHTTP transport.""" + transport_security = TransportSecuritySettings( + allowed_hosts=["127.0.0.1:*", "localhost:*"], allowed_origins=["http://127.0.0.1:*", "http://localhost:*"] + ) + mcp = FastMCP(name="NoAuthServer", transport_security=transport_security) + + # Add a simple tool + @mcp.tool(description="A simple echo tool") + def echo(message: str) -> str: + return f"Echo: {message}" + + # Create the StreamableHTTP app + app: Starlette = mcp.streamable_http_app() + + return mcp, app + + +def make_everything_fastmcp_streamable_http_app(): + """Create a comprehensive FastMCP server with StreamableHTTP transport.""" + # Create a new instance with different name for HTTP transport + mcp = make_everything_fastmcp() + # We can't change the name after creation, so we'll use the same name + # Create the StreamableHTTP app + app: Starlette = mcp.streamable_http_app() + return mcp, app + + +def make_fastmcp_stateless_http_app(): + """Create a FastMCP server with stateless StreamableHTTP transport.""" + transport_security = TransportSecuritySettings( + allowed_hosts=["127.0.0.1:*", "localhost:*"], allowed_origins=["http://127.0.0.1:*", "http://localhost:*"] + ) + mcp = FastMCP(name="StatelessServer", stateless_http=True, transport_security=transport_security) + + # Add a simple tool + @mcp.tool(description="A simple echo tool") + def echo(message: str) -> str: + return f"Echo: {message}" + + # Create the StreamableHTTP app + app: Starlette = mcp.streamable_http_app() + + return mcp, app + + +def run_server(server_port: int) -> None: + """Run the server.""" + _, app = make_fastmcp_app() + server = uvicorn.Server(config=uvicorn.Config(app=app, host="127.0.0.1", port=server_port, log_level="error")) + print(f"Starting server on port {server_port}") + server.run() + + +def run_everything_legacy_sse_http_server(server_port: int) -> None: + """Run the comprehensive server with all features.""" + _, app = make_everything_fastmcp_app() + server = uvicorn.Server(config=uvicorn.Config(app=app, host="127.0.0.1", port=server_port, log_level="error")) + print(f"Starting comprehensive server on port {server_port}") + server.run() + + +def run_streamable_http_server(server_port: int) -> None: + """Run the StreamableHTTP server.""" + _, app = make_fastmcp_streamable_http_app() + server = uvicorn.Server(config=uvicorn.Config(app=app, host="127.0.0.1", port=server_port, log_level="error")) + print(f"Starting StreamableHTTP server on port {server_port}") + server.run() + + +def run_everything_server(server_port: int) -> None: + """Run the comprehensive StreamableHTTP server with all features.""" + _, app = make_everything_fastmcp_streamable_http_app() + server 
= uvicorn.Server(config=uvicorn.Config(app=app, host="127.0.0.1", port=server_port, log_level="error")) + print(f"Starting comprehensive StreamableHTTP server on port {server_port}") + server.run() + + +def run_stateless_http_server(server_port: int) -> None: + """Run the stateless StreamableHTTP server.""" + _, app = make_fastmcp_stateless_http_app() + server = uvicorn.Server(config=uvicorn.Config(app=app, host="127.0.0.1", port=server_port, log_level="error")) + print(f"Starting stateless StreamableHTTP server on port {server_port}") + server.run() + + +@pytest.fixture() +def server(server_port: int) -> Generator[None, None, None]: + """Start the server in a separate process and clean up after the test.""" + proc = multiprocessing.Process(target=run_server, args=(server_port,), daemon=True) + print("Starting server process") + proc.start() + + # Wait for server to be running + max_attempts = 20 + attempt = 0 + print("Waiting for server to start") + while attempt < max_attempts: + try: + with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s: + s.connect(("127.0.0.1", server_port)) + break + except ConnectionRefusedError: + time.sleep(0.1) + attempt += 1 + else: + raise RuntimeError(f"Server failed to start after {max_attempts} attempts") + + yield + + print("Killing server") + proc.kill() + proc.join(timeout=2) + if proc.is_alive(): + print("Server process failed to terminate") + + +@pytest.fixture() +def streamable_http_server(http_server_port: int) -> Generator[None, None, None]: + """Start the StreamableHTTP server in a separate process.""" + proc = multiprocessing.Process(target=run_streamable_http_server, args=(http_server_port,), daemon=True) + print("Starting StreamableHTTP server process") + proc.start() + + # Wait for server to be running + max_attempts = 20 + attempt = 0 + print("Waiting for StreamableHTTP server to start") + while attempt < max_attempts: + try: + with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s: + s.connect(("127.0.0.1", http_server_port)) + break + except ConnectionRefusedError: + time.sleep(0.1) + attempt += 1 + else: + raise RuntimeError(f"StreamableHTTP server failed to start after {max_attempts} attempts") + + yield + + print("Killing StreamableHTTP server") + proc.kill() + proc.join(timeout=2) + if proc.is_alive(): + print("StreamableHTTP server process failed to terminate") + + +@pytest.fixture() +def stateless_http_server( + stateless_http_server_port: int, +) -> Generator[None, None, None]: + """Start the stateless StreamableHTTP server in a separate process.""" + proc = multiprocessing.Process( + target=run_stateless_http_server, + args=(stateless_http_server_port,), + daemon=True, + ) + print("Starting stateless StreamableHTTP server process") + proc.start() + + # Wait for server to be running + max_attempts = 20 + attempt = 0 + print("Waiting for stateless StreamableHTTP server to start") + while attempt < max_attempts: + try: + with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s: + s.connect(("127.0.0.1", stateless_http_server_port)) + break + except ConnectionRefusedError: + time.sleep(0.1) + attempt += 1 + else: + raise RuntimeError(f"Stateless server failed to start after {max_attempts} attempts") + + yield + + print("Killing stateless StreamableHTTP server") + proc.kill() + proc.join(timeout=2) + if proc.is_alive(): + print("Stateless StreamableHTTP server process failed to terminate") + + +@pytest.mark.anyio +async def test_fastmcp_without_auth(server: None, server_url: str) -> None: + """Test that FastMCP works when 
auth settings are not provided.""" + # Connect to the server + async with sse_client(server_url + "/sse") as streams: + async with ClientSession(*streams) as session: + # Test initialization + result = await session.initialize() + assert isinstance(result, InitializeResult) + assert result.serverInfo.name == "NoAuthServer" + + # Test that we can call tools without authentication + tool_result = await session.call_tool("echo", {"message": "hello"}) + assert len(tool_result.content) == 1 + assert isinstance(tool_result.content[0], TextContent) + assert tool_result.content[0].text == "Echo: hello" + + +@pytest.mark.anyio +async def test_fastmcp_streamable_http(streamable_http_server: None, http_server_url: str) -> None: + """Test that FastMCP works with StreamableHTTP transport.""" + # Connect to the server using StreamableHTTP + async with streamablehttp_client(http_server_url + "/mcp") as ( + read_stream, + write_stream, + _, + ): + # Create a session using the client streams + async with ClientSession(read_stream, write_stream) as session: + # Test initialization + result = await session.initialize() + assert isinstance(result, InitializeResult) + assert result.serverInfo.name == "NoAuthServer" + + # Test that we can call tools without authentication + tool_result = await session.call_tool("echo", {"message": "hello"}) + assert len(tool_result.content) == 1 + assert isinstance(tool_result.content[0], TextContent) + assert tool_result.content[0].text == "Echo: hello" + + +@pytest.mark.anyio +async def test_fastmcp_stateless_streamable_http(stateless_http_server: None, stateless_http_server_url: str) -> None: + """Test that FastMCP works with stateless StreamableHTTP transport.""" + # Connect to the server using StreamableHTTP + async with streamablehttp_client(stateless_http_server_url + "/mcp") as ( + read_stream, + write_stream, + _, + ): + async with ClientSession(read_stream, write_stream) as session: + result = await session.initialize() + assert isinstance(result, InitializeResult) + assert result.serverInfo.name == "StatelessServer" + tool_result = await session.call_tool("echo", {"message": "hello"}) + assert len(tool_result.content) == 1 + assert isinstance(tool_result.content[0], TextContent) + assert tool_result.content[0].text == "Echo: hello" + + for i in range(3): + tool_result = await session.call_tool("echo", {"message": f"test_{i}"}) + assert len(tool_result.content) == 1 + assert isinstance(tool_result.content[0], TextContent) + assert tool_result.content[0].text == f"Echo: test_{i}" + + +@pytest.fixture +def everything_server_port() -> int: + """Get a free port for testing the comprehensive server.""" + with socket.socket() as s: + s.bind(("127.0.0.1", 0)) + return s.getsockname()[1] + + +@pytest.fixture +def everything_server_url(everything_server_port: int) -> str: + """Get the comprehensive server URL for testing.""" + return f"http://127.0.0.1:{everything_server_port}" + + +@pytest.fixture +def everything_http_server_port() -> int: + """Get a free port for testing the comprehensive StreamableHTTP server.""" + with socket.socket() as s: + s.bind(("127.0.0.1", 0)) + return s.getsockname()[1] + + +@pytest.fixture +def everything_http_server_url(everything_http_server_port: int) -> str: + """Get the comprehensive StreamableHTTP server URL for testing.""" + return f"http://127.0.0.1:{everything_http_server_port}" + + +@pytest.fixture() +def everything_server(everything_server_port: int) -> Generator[None, None, None]: + """Start the comprehensive server in a separate 
process and clean up after.""" + proc = multiprocessing.Process( + target=run_everything_legacy_sse_http_server, + args=(everything_server_port,), + daemon=True, + ) + print("Starting comprehensive server process") + proc.start() + + # Wait for server to be running + max_attempts = 20 + attempt = 0 + print("Waiting for comprehensive server to start") + while attempt < max_attempts: + try: + with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s: + s.connect(("127.0.0.1", everything_server_port)) + break + except ConnectionRefusedError: + time.sleep(0.1) + attempt += 1 + else: + raise RuntimeError(f"Comprehensive server failed to start after {max_attempts} attempts") + + yield + + print("Killing comprehensive server") + proc.kill() + proc.join(timeout=2) + if proc.is_alive(): + print("Comprehensive server process failed to terminate") + + +@pytest.fixture() +def everything_streamable_http_server( + everything_http_server_port: int, +) -> Generator[None, None, None]: + """Start the comprehensive StreamableHTTP server in a separate process.""" + proc = multiprocessing.Process( + target=run_everything_server, + args=(everything_http_server_port,), + daemon=True, + ) + print("Starting comprehensive StreamableHTTP server process") + proc.start() + + # Wait for server to be running + max_attempts = 20 + attempt = 0 + print("Waiting for comprehensive StreamableHTTP server to start") + while attempt < max_attempts: + try: + with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s: + s.connect(("127.0.0.1", everything_http_server_port)) + break + except ConnectionRefusedError: + time.sleep(0.1) + attempt += 1 + else: + raise RuntimeError(f"Comprehensive StreamableHTTP server failed to start after " f"{max_attempts} attempts") + + yield + + print("Killing comprehensive StreamableHTTP server") + proc.kill() + proc.join(timeout=2) + if proc.is_alive(): + print("Comprehensive StreamableHTTP server process failed to terminate") + + +class NotificationCollector: + def __init__(self): + self.progress_notifications: list = [] + self.log_messages: list = [] + self.resource_notifications: list = [] + self.tool_notifications: list = [] + + async def handle_progress(self, params) -> None: + self.progress_notifications.append(params) + + async def handle_log(self, params) -> None: + self.log_messages.append(params) + + async def handle_resource_list_changed(self, params) -> None: + self.resource_notifications.append(params) + + async def handle_tool_list_changed(self, params) -> None: + self.tool_notifications.append(params) + + async def handle_generic_notification(self, message) -> None: + # Check if this is a ServerNotification + if isinstance(message, ServerNotification): + # Check the specific notification type + if isinstance(message.root, ProgressNotification): + await self.handle_progress(message.root.params) + elif isinstance(message.root, LoggingMessageNotification): + await self.handle_log(message.root.params) + elif isinstance(message.root, ResourceListChangedNotification): + await self.handle_resource_list_changed(message.root.params) + elif isinstance(message.root, ToolListChangedNotification): + await self.handle_tool_list_changed(message.root.params) + + +async def create_test_elicitation_callback(context, params): + """Shared elicitation callback for tests. + + Handles elicitation requests for restaurant booking tests. 
+ """ + # For restaurant booking test + if "No tables available" in params.message: + return ElicitResult( + action="accept", + content={"checkAlternative": True, "alternativeDate": "2024-12-26"}, + ) + else: + # Default response + return ElicitResult(action="decline") + + +async def call_all_mcp_features(session: ClientSession, collector: NotificationCollector) -> None: + """ + Test all MCP features using the provided session. + + Args: + session: The MCP client session to test with + collector: Notification collector for capturing server notifications + """ + # Test initialization + result = await session.initialize() + assert isinstance(result, InitializeResult) + assert result.serverInfo.name == "EverythingServer" + + # Check server features are reported + assert result.capabilities.prompts is not None + assert result.capabilities.resources is not None + assert result.capabilities.tools is not None + # Note: logging capability may be None if no tools use context logging + + # Test tools + # 1. Simple echo tool + tool_result = await session.call_tool("echo", {"message": "hello"}) + assert len(tool_result.content) == 1 + assert isinstance(tool_result.content[0], TextContent) + assert tool_result.content[0].text == "Echo: hello" + + # 2. Test tool that returns ResourceLinks + list_files_result = await session.call_tool("list_files") + assert len(list_files_result.content) == 1 + + # Rest should be ResourceLinks + content = list_files_result.content[0] + assert isinstance(content, ResourceLink) + assert str(content.uri).startswith("file:///") + assert content.name is not None + assert content.mimeType is not None + + # Test progress callback functionality + progress_updates = [] + + async def progress_callback(progress: float, total: float | None, message: str | None) -> None: + """Collect progress updates for testing (async version).""" + progress_updates.append((progress, total, message)) + print(f"Progress: {progress}/{total} - {message}") + + test_message = "test" + steps = 3 + params = { + "message": test_message, + "steps": steps, + } + tool_result = await session.call_tool( + "tool_with_progress", + params, + progress_callback=progress_callback, + ) + assert len(tool_result.content) == 1 + assert isinstance(tool_result.content[0], TextContent) + assert f"Processed '{test_message}' in {steps} steps" in tool_result.content[0].text + + # Verify progress callback was called + assert len(progress_updates) == steps + for i, (progress, total, message) in enumerate(progress_updates): + expected_progress = (i + 1) / steps + assert abs(progress - expected_progress) < 0.01 + assert total == 1.0 + assert message is not None + assert f"step {i + 1} of {steps}" in message + + # Verify we received log messages from the tool + # Note: Progress notifications require special handling in the MCP client + # that's not implemented by default, so we focus on testing logging + assert len(collector.log_messages) > 0 + + # 3. Test sampling tool + prompt = "What is the meaning of life?" 
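+    # The text asserted below is produced by the test's sampling_callback (defined later in
+    # this file), which answers the server's sampling request with a canned
+    # "This is a simulated LLM response to: ..." string, so no real model is involved.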
+ sampling_result = await session.call_tool("sampling_tool", {"prompt": prompt}) + assert len(sampling_result.content) == 1 + assert isinstance(sampling_result.content[0], TextContent) + assert "Sampling result:" in sampling_result.content[0].text + assert "This is a simulated LLM response" in sampling_result.content[0].text + + # Verify we received log messages from the sampling tool + assert len(collector.log_messages) > 0 + assert any("Requesting sampling for prompt" in msg.data for msg in collector.log_messages) + assert any("Received sampling result from model" in msg.data for msg in collector.log_messages) + + # 4. Test notification tool + notification_message = "test_notifications" + notification_result = await session.call_tool("notification_tool", {"message": notification_message}) + assert len(notification_result.content) == 1 + assert isinstance(notification_result.content[0], TextContent) + assert "Sent notifications and logs" in notification_result.content[0].text + + # Verify we received various notification types + assert len(collector.log_messages) > 3 # Should have logs from both tools + assert len(collector.resource_notifications) > 0 + assert len(collector.tool_notifications) > 0 + + # Check that we got different log levels + log_levels = [msg.level for msg in collector.log_messages] + assert "debug" in log_levels + assert "info" in log_levels + assert "warning" in log_levels + + # 5. Test elicitation tool + # Test restaurant booking with unavailable date (triggers elicitation) + booking_result = await session.call_tool( + "book_restaurant", + { + "date": "2024-12-25", # Unavailable date to trigger elicitation + "time": "19:00", + "party_size": 4, + }, + ) + assert len(booking_result.content) == 1 + assert isinstance(booking_result.content[0], TextContent) + # Should have booked the alternative date from elicitation callback + assert "āœ… Booked table for 4 on 2024-12-26" in booking_result.content[0].text + + # Test resources + # 1. Static resource + resources = await session.list_resources() + # Try using string comparison since AnyUrl might not match directly + static_resource = next( + (r for r in resources.resources if str(r.uri) == "resource://static/info"), + None, + ) + assert static_resource is not None + assert static_resource.name == "Static Info" + + static_content = await session.read_resource(AnyUrl("resource://static/info")) + assert isinstance(static_content, ReadResourceResult) + assert len(static_content.contents) == 1 + assert isinstance(static_content.contents[0], TextResourceContents) + assert static_content.contents[0].text == "This is static resource content" + + # 2. Dynamic resource + resource_category = "test" + dynamic_content = await session.read_resource(AnyUrl(f"resource://dynamic/{resource_category}")) + assert isinstance(dynamic_content, ReadResourceResult) + assert len(dynamic_content.contents) == 1 + assert isinstance(dynamic_content.contents[0], TextResourceContents) + assert f"Dynamic resource content for category: {resource_category}" in dynamic_content.contents[0].text + + # 3. Template resource + resource_id = "456" + template_content = await session.read_resource(AnyUrl(f"resource://template/{resource_id}/data")) + assert isinstance(template_content, ReadResourceResult) + assert len(template_content.contents) == 1 + assert isinstance(template_content.contents[0], TextResourceContents) + assert f"Template resource data for ID: {resource_id}" in template_content.contents[0].text + + # Test prompts + # 1. 
Simple prompt + prompts = await session.list_prompts() + simple_prompt = next((p for p in prompts.prompts if p.name == "simple_prompt"), None) + assert simple_prompt is not None + + prompt_topic = "AI" + prompt_result = await session.get_prompt("simple_prompt", {"topic": prompt_topic}) + assert isinstance(prompt_result, GetPromptResult) + assert len(prompt_result.messages) >= 1 + # The actual message structure depends on the prompt implementation + + # 2. Complex prompt + complex_prompt = next((p for p in prompts.prompts if p.name == "complex_prompt"), None) + assert complex_prompt is not None + + query = "What is AI?" + context = "technical" + complex_result = await session.get_prompt("complex_prompt", {"user_query": query, "context": context}) + assert isinstance(complex_result, GetPromptResult) + assert len(complex_result.messages) >= 1 + + # Test request context propagation (only works when headers are available) + + headers_result = await session.call_tool("echo_headers", {}) + assert len(headers_result.content) == 1 + assert isinstance(headers_result.content[0], TextContent) + + # If we got headers, verify they exist + headers_data = json.loads(headers_result.content[0].text) + # The headers depend on the transport and test setup + print(f"Received headers: {headers_data}") + + # Test 6: Call tool that returns full context + context_result = await session.call_tool("echo_context", {"custom_request_id": "test-123"}) + assert len(context_result.content) == 1 + assert isinstance(context_result.content[0], TextContent) + + context_data = json.loads(context_result.content[0].text) + assert context_data["custom_request_id"] == "test-123" + # The method should be POST for most transports + if context_data["method"]: + assert context_data["method"] == "POST" + + # Test completion functionality + # 1. Test resource template completion with context + repo_result = await session.complete( + ref=ResourceTemplateReference(type="ref/resource", uri="github://repos/{owner}/{repo}"), + argument={"name": "repo", "value": ""}, + context_arguments={"owner": "modelcontextprotocol"}, + ) + assert repo_result.completion.values == ["python-sdk", "typescript-sdk", "specification"] + assert repo_result.completion.total == 3 + assert repo_result.completion.hasMore is False + + # 2. Test with different context + repo_result2 = await session.complete( + ref=ResourceTemplateReference(type="ref/resource", uri="github://repos/{owner}/{repo}"), + argument={"name": "repo", "value": ""}, + context_arguments={"owner": "test-org"}, + ) + assert repo_result2.completion.values == ["test-repo1", "test-repo2"] + assert repo_result2.completion.total == 2 + + # 3. Test prompt argument completion + context_result = await session.complete( + ref=PromptReference(type="ref/prompt", name="complex_prompt"), + argument={"name": "context", "value": "tech"}, + ) + assert "technical" in context_result.completion.values + + # 4. 
Test completion without context (should return empty) + no_context_result = await session.complete( + ref=ResourceTemplateReference(type="ref/resource", uri="github://repos/{owner}/{repo}"), + argument={"name": "repo", "value": "test"}, + ) + assert no_context_result.completion.values == [] + assert no_context_result.completion.total == 0 + + +async def sampling_callback( + context: RequestContext[ClientSession, None], + params: CreateMessageRequestParams, +) -> CreateMessageResult: + # Simulate LLM response based on the input + if params.messages and isinstance(params.messages[0].content, TextContent): + input_text = params.messages[0].content.text + else: + input_text = "No input" + response_text = f"This is a simulated LLM response to: {input_text}" + + model_name = "test-llm-model" + return CreateMessageResult( + role="assistant", + content=TextContent(type="text", text=response_text), + model=model_name, + stopReason="endTurn", + ) + + +@pytest.mark.anyio +async def test_fastmcp_all_features_sse(everything_server: None, everything_server_url: str) -> None: + """Test all MCP features work correctly with SSE transport.""" + + # Create notification collector + collector = NotificationCollector() + + # Connect to the server with callbacks + async with sse_client(everything_server_url + "/sse") as streams: + # Set up message handler to capture notifications + async def message_handler(message): + print(f"Received message: {message}") + await collector.handle_generic_notification(message) + if isinstance(message, Exception): + raise message + + async with ClientSession( + *streams, + sampling_callback=sampling_callback, + elicitation_callback=create_test_elicitation_callback, + message_handler=message_handler, + ) as session: + # Run the common test suite + await call_all_mcp_features(session, collector) + + +@pytest.mark.anyio +async def test_fastmcp_all_features_streamable_http( + everything_streamable_http_server: None, everything_http_server_url: str +) -> None: + """Test all MCP features work correctly with StreamableHTTP transport.""" + + # Create notification collector + collector = NotificationCollector() + + # Connect to the server using StreamableHTTP + async with streamablehttp_client(everything_http_server_url + "/mcp") as ( + read_stream, + write_stream, + _, + ): + # Set up message handler to capture notifications + async def message_handler(message): + print(f"Received message: {message}") + await collector.handle_generic_notification(message) + if isinstance(message, Exception): + raise message + + async with ClientSession( + read_stream, + write_stream, + sampling_callback=sampling_callback, + elicitation_callback=create_test_elicitation_callback, + message_handler=message_handler, + ) as session: + # Run the common test suite with HTTP-specific test suffix + await call_all_mcp_features(session, collector) + + +@pytest.mark.anyio +async def test_elicitation_feature(server: None, server_url: str) -> None: + """Test the elicitation feature.""" + + # Create a custom handler for elicitation requests + async def elicitation_callback(context, params): + # Verify the elicitation parameters + if params.message == "Tool wants to ask: What is your name?": + return ElicitResult(content={"answer": "Test User"}, action="accept") + else: + raise ValueError("Unexpected elicitation message") + + # Connect to the server with our custom elicitation handler + async with sse_client(server_url + "/sse") as streams: + async with ClientSession(*streams, elicitation_callback=elicitation_callback) 
as session: + # First initialize the session + result = await session.initialize() + assert isinstance(result, InitializeResult) + assert result.serverInfo.name == "NoAuthServer" + + # Call the tool that uses elicitation + tool_result = await session.call_tool("ask_user", {"prompt": "What is your name?"}) + # Verify the result + assert len(tool_result.content) == 1 + assert isinstance(tool_result.content[0], TextContent) + # The test should only succeed with the successful elicitation response + assert tool_result.content[0].text == "User answered: Test User" + + +@pytest.mark.anyio +async def test_title_precedence(everything_server: None, everything_server_url: str) -> None: + """Test that titles are properly returned for tools, resources, and prompts.""" + from mcp.shared.metadata_utils import get_display_name + + async with sse_client(everything_server_url + "/sse") as streams: + async with ClientSession(*streams) as session: + # Initialize the session + result = await session.initialize() + assert isinstance(result, InitializeResult) + + # Test tools have titles + tools_result = await session.list_tools() + assert tools_result.tools + + # Check specific tools have titles + tool_names_to_titles = { + "tool_with_progress": "Progress Tool", + "echo": "Echo Tool", + "sampling_tool": "Sampling Tool", + "notification_tool": "Notification Tool", + "echo_headers": "Echo Headers", + "echo_context": "Echo Context", + "book_restaurant": "Restaurant Booking", + } + + for tool in tools_result.tools: + if tool.name in tool_names_to_titles: + assert tool.title == tool_names_to_titles[tool.name] + # Test get_display_name utility + assert get_display_name(tool) == tool_names_to_titles[tool.name] + + # Test resources have titles + resources_result = await session.list_resources() + assert resources_result.resources + + # Check specific resources have titles + static_resource = next((r for r in resources_result.resources if r.name == "Static Info"), None) + assert static_resource is not None + assert static_resource.title == "Static Information" + assert get_display_name(static_resource) == "Static Information" + + # Test resource templates have titles + resource_templates = await session.list_resource_templates() + assert resource_templates.resourceTemplates + + # Check specific resource templates have titles + template_uris_to_titles = { + "resource://dynamic/{category}": "Dynamic Resource", + "resource://template/{id}/data": "Template Resource", + "github://repos/{owner}/{repo}": "GitHub Repository", + } + + for template in resource_templates.resourceTemplates: + if template.uriTemplate in template_uris_to_titles: + assert template.title == template_uris_to_titles[template.uriTemplate] + assert get_display_name(template) == template_uris_to_titles[template.uriTemplate] + + # Test prompts have titles + prompts_result = await session.list_prompts() + assert prompts_result.prompts + + # Check specific prompts have titles + prompt_names_to_titles = { + "simple_prompt": "Simple Prompt", + "complex_prompt": "Complex Prompt", + } + + for prompt in prompts_result.prompts: + if prompt.name in prompt_names_to_titles: + assert prompt.title == prompt_names_to_titles[prompt.name] + assert get_display_name(prompt) == prompt_names_to_titles[prompt.name] diff --git a/tests/server/fastmcp/test_server.py b/tests/server/fastmcp/test_server.py index e76e59c52..8719b78d5 100644 --- a/tests/server/fastmcp/test_server.py +++ b/tests/server/fastmcp/test_server.py @@ -1,12 +1,14 @@ import base64 from pathlib import Path
from typing import TYPE_CHECKING +from unittest.mock import patch import pytest from pydantic import AnyUrl +from starlette.routing import Mount, Route from mcp.server.fastmcp import Context, FastMCP -from mcp.server.fastmcp.prompts.base import EmbeddedResource, Message, UserMessage +from mcp.server.fastmcp.prompts.base import Message, UserMessage from mcp.server.fastmcp.resources import FileResource, FunctionResource from mcp.server.fastmcp.utilities.types import Image from mcp.shared.exceptions import McpError @@ -14,7 +16,10 @@ create_connected_server_and_client_session as client_session, ) from mcp.types import ( + AudioContent, BlobResourceContents, + ContentBlock, + EmbeddedResource, ImageContent, TextContent, TextResourceContents, @@ -31,16 +36,93 @@ async def test_create_server(self): assert mcp.name == "FastMCP" assert mcp.instructions == "Server instructions" + @pytest.mark.anyio + async def test_normalize_path(self): + """Test path normalization for mount paths.""" + mcp = FastMCP() + + # Test root path + assert mcp._normalize_path("/", "/messages/") == "/messages/" + + # Test path with trailing slash + assert mcp._normalize_path("/github/", "/messages/") == "/github/messages/" + + # Test path without trailing slash + assert mcp._normalize_path("/github", "/messages/") == "/github/messages/" + + # Test endpoint without leading slash + assert mcp._normalize_path("/github", "messages/") == "/github/messages/" + + # Test both with trailing/leading slashes + assert mcp._normalize_path("/api/", "/v1/") == "/api/v1/" + + @pytest.mark.anyio + async def test_sse_app_with_mount_path(self): + """Test SSE app creation with different mount paths.""" + # Test with default mount path + mcp = FastMCP() + with patch.object(mcp, "_normalize_path", return_value="/messages/") as mock_normalize: + mcp.sse_app() + # Verify _normalize_path was called with correct args + mock_normalize.assert_called_once_with("/", "/messages/") + + # Test with custom mount path in settings + mcp = FastMCP() + mcp.settings.mount_path = "/custom" + with patch.object(mcp, "_normalize_path", return_value="/custom/messages/") as mock_normalize: + mcp.sse_app() + # Verify _normalize_path was called with correct args + mock_normalize.assert_called_once_with("/custom", "/messages/") + + # Test with mount_path parameter + mcp = FastMCP() + with patch.object(mcp, "_normalize_path", return_value="/param/messages/") as mock_normalize: + mcp.sse_app(mount_path="/param") + # Verify _normalize_path was called with correct args + mock_normalize.assert_called_once_with("/param", "/messages/") + + @pytest.mark.anyio + async def test_starlette_routes_with_mount_path(self): + """Test that Starlette routes are correctly configured with mount path.""" + # Test with mount path in settings + mcp = FastMCP() + mcp.settings.mount_path = "/api" + app = mcp.sse_app() + + # Find routes by type + sse_routes = [r for r in app.routes if isinstance(r, Route)] + mount_routes = [r for r in app.routes if isinstance(r, Mount)] + + # Verify routes exist + assert len(sse_routes) == 1, "Should have one SSE route" + assert len(mount_routes) == 1, "Should have one mount route" + + # Verify path values + assert sse_routes[0].path == "/sse", "SSE route path should be /sse" + assert mount_routes[0].path == "/messages", "Mount route path should be /messages" + + # Test with mount path as parameter + mcp = FastMCP() + app = mcp.sse_app(mount_path="/param") + + # Find routes by type + sse_routes = [r for r in app.routes if isinstance(r, Route)] + mount_routes = [r 
for r in app.routes if isinstance(r, Mount)] + + # Verify routes exist + assert len(sse_routes) == 1, "Should have one SSE route" + assert len(mount_routes) == 1, "Should have one mount route" + + # Verify path values + assert sse_routes[0].path == "/sse", "SSE route path should be /sse" + assert mount_routes[0].path == "/messages", "Mount route path should be /messages" + @pytest.mark.anyio async def test_non_ascii_description(self): """Test that FastMCP handles non-ASCII characters in descriptions correctly""" mcp = FastMCP() - @mcp.tool( - description=( - "🌟 This tool uses emojis and UTF-8 characters: Ć” Ć© Ć­ ó Ćŗ Ʊ 漢字 šŸŽ‰" - ) - ) + @mcp.tool(description=("🌟 This tool uses emojis and UTF-8 characters: Ć” Ć© Ć­ ó Ćŗ Ʊ 漢字 šŸŽ‰")) def hello_world(name: str = "äø–ē•Œ") -> str: return f"Ā”Hola, {name}! šŸ‘‹" @@ -93,9 +175,7 @@ def get_data(x: str) -> str: async def test_add_resource_decorator_incorrect_usage(self): mcp = FastMCP() - with pytest.raises( - TypeError, match="The @resource decorator was used incorrectly" - ): + with pytest.raises(TypeError, match="The @resource decorator was used incorrectly"): @mcp.resource # Missing parentheses #type: ignore def get_data(x: str) -> str: @@ -114,10 +194,11 @@ def image_tool_fn(path: str) -> Image: return Image(path) -def mixed_content_tool_fn() -> list[TextContent | ImageContent]: +def mixed_content_tool_fn() -> list[ContentBlock]: return [ TextContent(type="text", text="Hello"), ImageContent(type="image", data="abc", mimeType="image/png"), + AudioContent(type="audio", data="def", mimeType="audio/wav"), ] @@ -219,14 +300,16 @@ async def test_tool_mixed_content(self): mcp.add_tool(mixed_content_tool_fn) async with client_session(mcp._mcp_server) as client: result = await client.call_tool("mixed_content_tool_fn", {}) - assert len(result.content) == 2 - content1 = result.content[0] - content2 = result.content[1] + assert len(result.content) == 3 + content1, content2, content3 = result.content assert isinstance(content1, TextContent) assert content1.text == "Hello" assert isinstance(content2, ImageContent) assert content2.mimeType == "image/png" assert content2.data == "abc" + assert isinstance(content3, AudioContent) + assert content3.mimeType == "audio/wav" + assert content3.data == "def" @pytest.mark.anyio async def test_tool_mixed_list_with_image(self, tmp_path: Path): @@ -276,9 +359,7 @@ async def test_text_resource(self): def get_text(): return "Hello, world!" 
- resource = FunctionResource( - uri=AnyUrl("resource://test"), name="test", fn=get_text - ) + resource = FunctionResource(uri=AnyUrl("resource://test"), name="test", fn=get_text) mcp.add_resource(resource) async with client_session(mcp._mcp_server) as client: @@ -314,9 +395,7 @@ async def test_file_resource_text(self, tmp_path: Path): text_file = tmp_path / "test.txt" text_file.write_text("Hello from file!") - resource = FileResource( - uri=AnyUrl("file://test.txt"), name="test.txt", path=text_file - ) + resource = FileResource(uri=AnyUrl("file://test.txt"), name="test.txt", path=text_file) mcp.add_resource(resource) async with client_session(mcp._mcp_server) as client: @@ -343,10 +422,25 @@ async def test_file_resource_binary(self, tmp_path: Path): async with client_session(mcp._mcp_server) as client: result = await client.read_resource(AnyUrl("file://test.bin")) assert isinstance(result.contents[0], BlobResourceContents) - assert ( - result.contents[0].blob - == base64.b64encode(b"Binary file data").decode() - ) + assert result.contents[0].blob == base64.b64encode(b"Binary file data").decode() + + @pytest.mark.anyio + async def test_function_resource(self): + mcp = FastMCP() + + @mcp.resource("function://test", name="test_get_data") + def get_data() -> str: + """get_data returns a string""" + return "Hello, world!" + + async with client_session(mcp._mcp_server) as client: + resources = await client.list_resources() + assert len(resources.resources) == 1 + resource = resources.resources[0] + assert resource.description == "get_data returns a string" + assert resource.uri == AnyUrl("function://test") + assert resource.name == "test_get_data" + assert resource.mimeType == "text/plain" class TestServerResourceTemplates: @@ -417,9 +511,7 @@ def get_data(org: str, repo: str) -> str: return f"Data for {org}/{repo}" async with client_session(mcp._mcp_server) as client: - result = await client.read_resource( - AnyUrl("resource://cursor/fastmcp/data") - ) + result = await client.read_resource(AnyUrl("resource://cursor/fastmcp/data")) assert isinstance(result.contents[0], TextResourceContents) assert result.contents[0].text == "Data for cursor/fastmcp" @@ -518,8 +610,6 @@ async def async_tool(x: int, ctx: Context) -> str: @pytest.mark.anyio async def test_context_logging(self): - from unittest.mock import patch - import mcp.server.session """Test that context logging methods work.""" @@ -544,14 +634,28 @@ async def logging_tool(msg: str, ctx: Context) -> str: assert mock_log.call_count == 4 mock_log.assert_any_call( - level="debug", data="Debug message", logger=None + level="debug", + data="Debug message", + logger=None, + related_request_id="1", + ) + mock_log.assert_any_call( + level="info", + data="Info message", + logger=None, + related_request_id="1", ) - mock_log.assert_any_call(level="info", data="Info message", logger=None) mock_log.assert_any_call( - level="warning", data="Warning message", logger=None + level="warning", + data="Warning message", + logger=None, + related_request_id="1", ) mock_log.assert_any_call( - level="error", data="Error message", logger=None + level="error", + data="Error message", + logger=None, + related_request_id="1", ) @pytest.mark.anyio diff --git a/tests/server/fastmcp/test_title.py b/tests/server/fastmcp/test_title.py new file mode 100644 index 000000000..a94f6671d --- /dev/null +++ b/tests/server/fastmcp/test_title.py @@ -0,0 +1,215 @@ +"""Integration tests for title field functionality.""" + +import pytest +from pydantic import AnyUrl + +from 
mcp.server.fastmcp import FastMCP +from mcp.server.fastmcp.resources import FunctionResource +from mcp.shared.memory import create_connected_server_and_client_session +from mcp.shared.metadata_utils import get_display_name +from mcp.types import Prompt, Resource, ResourceTemplate, Tool, ToolAnnotations + + +@pytest.mark.anyio +async def test_tool_title_precedence(): + """Test that tool title precedence works correctly: title > annotations.title > name.""" + # Create server with various tool configurations + mcp = FastMCP(name="TitleTestServer") + + # Tool with only name + @mcp.tool(description="Basic tool") + def basic_tool(message: str) -> str: + return message + + # Tool with title + @mcp.tool(description="Tool with title", title="User-Friendly Tool") + def tool_with_title(message: str) -> str: + return message + + # Tool with annotations.title (when title is not supported on decorator) + # We'll need to add this manually after registration + @mcp.tool(description="Tool with annotations") + def tool_with_annotations(message: str) -> str: + return message + + # Tool with both title and annotations.title + @mcp.tool(description="Tool with both", title="Primary Title") + def tool_with_both(message: str) -> str: + return message + + # Start server and connect client + async with create_connected_server_and_client_session(mcp._mcp_server) as client: + await client.initialize() + + # List tools + tools_result = await client.list_tools() + tools = {tool.name: tool for tool in tools_result.tools} + + # Verify basic tool uses name + assert "basic_tool" in tools + basic = tools["basic_tool"] + # Since we haven't implemented get_display_name yet, we'll check the raw fields + assert basic.title is None + assert basic.name == "basic_tool" + + # Verify tool with title + assert "tool_with_title" in tools + titled = tools["tool_with_title"] + assert titled.title == "User-Friendly Tool" + + # For now, we'll skip the annotations.title test as it requires modifying + # the tool after registration, which we'll implement later + + # Verify tool with both uses title over annotations.title + assert "tool_with_both" in tools + both = tools["tool_with_both"] + assert both.title == "Primary Title" + + +@pytest.mark.anyio +async def test_prompt_title(): + """Test that prompt titles work correctly.""" + mcp = FastMCP(name="PromptTitleServer") + + # Prompt with only name + @mcp.prompt(description="Basic prompt") + def basic_prompt(topic: str) -> str: + return f"Tell me about {topic}" + + # Prompt with title + @mcp.prompt(description="Titled prompt", title="Ask About Topic") + def titled_prompt(topic: str) -> str: + return f"Tell me about {topic}" + + # Start server and connect client + async with create_connected_server_and_client_session(mcp._mcp_server) as client: + await client.initialize() + + # List prompts + prompts_result = await client.list_prompts() + prompts = {prompt.name: prompt for prompt in prompts_result.prompts} + + # Verify basic prompt uses name + assert "basic_prompt" in prompts + basic = prompts["basic_prompt"] + assert basic.title is None + assert basic.name == "basic_prompt" + + # Verify prompt with title + assert "titled_prompt" in prompts + titled = prompts["titled_prompt"] + assert titled.title == "Ask About Topic" + + +@pytest.mark.anyio +async def test_resource_title(): + """Test that resource titles work correctly.""" + mcp = FastMCP(name="ResourceTitleServer") + + # Static resource without title + def get_basic_data() -> str: + return "Basic data" + + basic_resource = FunctionResource( + 
uri=AnyUrl("resource://basic"), + name="basic_resource", + description="Basic resource", + fn=get_basic_data, + ) + mcp.add_resource(basic_resource) + + # Static resource with title + def get_titled_data() -> str: + return "Titled data" + + titled_resource = FunctionResource( + uri=AnyUrl("resource://titled"), + name="titled_resource", + title="User-Friendly Resource", + description="Resource with title", + fn=get_titled_data, + ) + mcp.add_resource(titled_resource) + + # Dynamic resource without title + @mcp.resource("resource://dynamic/{id}") + def dynamic_resource(id: str) -> str: + return f"Data for {id}" + + # Dynamic resource with title (when supported) + @mcp.resource("resource://titled-dynamic/{id}", title="Dynamic Data") + def titled_dynamic_resource(id: str) -> str: + return f"Data for {id}" + + # Start server and connect client + async with create_connected_server_and_client_session(mcp._mcp_server) as client: + await client.initialize() + + # List resources + resources_result = await client.list_resources() + resources = {str(res.uri): res for res in resources_result.resources} + + # Verify basic resource uses name + assert "resource://basic" in resources + basic = resources["resource://basic"] + assert basic.title is None + assert basic.name == "basic_resource" + + # Verify resource with title + assert "resource://titled" in resources + titled = resources["resource://titled"] + assert titled.title == "User-Friendly Resource" + + # List resource templates + templates_result = await client.list_resource_templates() + templates = {tpl.uriTemplate: tpl for tpl in templates_result.resourceTemplates} + + # Verify dynamic resource template + assert "resource://dynamic/{id}" in templates + dynamic = templates["resource://dynamic/{id}"] + assert dynamic.title is None + assert dynamic.name == "dynamic_resource" + + # Verify titled dynamic resource template (when supported) + if "resource://titled-dynamic/{id}" in templates: + titled_dynamic = templates["resource://titled-dynamic/{id}"] + assert titled_dynamic.title == "Dynamic Data" + + +@pytest.mark.anyio +async def test_get_display_name_utility(): + """Test the get_display_name utility function.""" + + # Test tool precedence: title > annotations.title > name + tool_name_only = Tool(name="test_tool", inputSchema={}) + assert get_display_name(tool_name_only) == "test_tool" + + tool_with_title = Tool(name="test_tool", title="Test Tool", inputSchema={}) + assert get_display_name(tool_with_title) == "Test Tool" + + tool_with_annotations = Tool(name="test_tool", inputSchema={}, annotations=ToolAnnotations(title="Annotated Tool")) + assert get_display_name(tool_with_annotations) == "Annotated Tool" + + tool_with_both = Tool( + name="test_tool", title="Primary Title", inputSchema={}, annotations=ToolAnnotations(title="Secondary Title") + ) + assert get_display_name(tool_with_both) == "Primary Title" + + # Test other types: title > name + resource = Resource(uri=AnyUrl("file://test"), name="test_res") + assert get_display_name(resource) == "test_res" + + resource_with_title = Resource(uri=AnyUrl("file://test"), name="test_res", title="Test Resource") + assert get_display_name(resource_with_title) == "Test Resource" + + prompt = Prompt(name="test_prompt") + assert get_display_name(prompt) == "test_prompt" + + prompt_with_title = Prompt(name="test_prompt", title="Test Prompt") + assert get_display_name(prompt_with_title) == "Test Prompt" + + template = ResourceTemplate(uriTemplate="file://{id}", name="test_template") + assert 
get_display_name(template) == "test_template" + + template_with_title = ResourceTemplate(uriTemplate="file://{id}", name="test_template", title="Test Template") + assert get_display_name(template_with_title) == "Test Template" diff --git a/tests/server/fastmcp/test_tool_manager.py b/tests/server/fastmcp/test_tool_manager.py index 8f52e3d85..206df42d7 100644 --- a/tests/server/fastmcp/test_tool_manager.py +++ b/tests/server/fastmcp/test_tool_manager.py @@ -6,9 +6,11 @@ from mcp.server.fastmcp import Context, FastMCP from mcp.server.fastmcp.exceptions import ToolError -from mcp.server.fastmcp.tools import ToolManager +from mcp.server.fastmcp.tools import Tool, ToolManager +from mcp.server.fastmcp.utilities.func_metadata import ArgModelBase, FuncMetadata from mcp.server.session import ServerSessionT -from mcp.shared.context import LifespanContextT +from mcp.shared.context import LifespanContextT, RequestT +from mcp.types import ToolAnnotations class TestAddTools: @@ -30,6 +32,36 @@ def add(a: int, b: int) -> int: assert tool.parameters["properties"]["a"]["type"] == "integer" assert tool.parameters["properties"]["b"]["type"] == "integer" + def test_init_with_tools(self, caplog): + def add(a: int, b: int) -> int: + return a + b + + class AddArguments(ArgModelBase): + a: int + b: int + + fn_metadata = FuncMetadata(arg_model=AddArguments) + + original_tool = Tool( + name="add", + title="Add Tool", + description="Add two numbers.", + fn=add, + fn_metadata=fn_metadata, + is_async=False, + parameters=AddArguments.model_json_schema(), + context_kwarg=None, + annotations=None, + ) + manager = ToolManager(tools=[original_tool]) + saved_tool = manager.get_tool("add") + assert saved_tool == original_tool + + # warn on duplicate tools + with caplog.at_level(logging.WARNING): + manager = ToolManager(True, tools=[original_tool, original_tool]) + assert "Tool already exists: add" in caplog.text + @pytest.mark.anyio async def test_async_function(self): """Test registering and running an async function.""" @@ -71,6 +103,39 @@ def create_user(user: UserInput, flag: bool) -> dict: assert "age" in tool.parameters["$defs"]["UserInput"]["properties"] assert "flag" in tool.parameters["properties"] + def test_add_callable_object(self): + """Test registering a callable object.""" + + class MyTool: + def __init__(self): + self.__name__ = "MyTool" + + def __call__(self, x: int) -> int: + return x * 2 + + manager = ToolManager() + tool = manager.add_tool(MyTool()) + assert tool.name == "MyTool" + assert tool.is_async is False + assert tool.parameters["properties"]["x"]["type"] == "integer" + + @pytest.mark.anyio + async def test_add_async_callable_object(self): + """Test registering an async callable object.""" + + class MyAsyncTool: + def __init__(self): + self.__name__ = "MyAsyncTool" + + async def __call__(self, x: int) -> int: + return x * 2 + + manager = ToolManager() + tool = manager.add_tool(MyAsyncTool()) + assert tool.name == "MyAsyncTool" + assert tool.is_async is True + assert tool.parameters["properties"]["x"]["type"] == "integer" + def test_add_invalid_tool(self): manager = ToolManager() with pytest.raises(AttributeError): @@ -83,9 +148,7 @@ def test_add_lambda(self): def test_add_lambda_with_no_name(self): manager = ToolManager() - with pytest.raises( - ValueError, match="You must provide a name for lambda functions" - ): + with pytest.raises(ValueError, match="You must provide a name for lambda functions"): manager.add_tool(lambda x: x) def test_warn_on_duplicate_tools(self, caplog): @@ -137,6 +200,34 @@ 
async def double(n: int) -> int: result = await manager.call_tool("double", {"n": 5}) assert result == 10 + @pytest.mark.anyio + async def test_call_object_tool(self): + class MyTool: + def __init__(self): + self.__name__ = "MyTool" + + def __call__(self, x: int) -> int: + return x * 2 + + manager = ToolManager() + tool = manager.add_tool(MyTool()) + result = await tool.run({"x": 5}) + assert result == 10 + + @pytest.mark.anyio + async def test_call_async_object_tool(self): + class MyAsyncTool: + def __init__(self): + self.__name__ = "MyAsyncTool" + + async def __call__(self, x: int) -> int: + return x * 2 + + manager = ToolManager() + tool = manager.add_tool(MyAsyncTool()) + result = await tool.run({"x": 5}) + assert result == 10 + @pytest.mark.anyio async def test_call_tool_with_default_args(self): def add(a: int, b: int = 1) -> int: @@ -254,9 +345,7 @@ def tool_without_context(x: int) -> str: tool = manager.add_tool(tool_without_context) assert tool.context_kwarg is None - def tool_with_parametrized_context( - x: int, ctx: Context[ServerSessionT, LifespanContextT] - ) -> str: + def tool_with_parametrized_context(x: int, ctx: Context[ServerSessionT, LifespanContextT, RequestT]) -> str: return str(x) tool = manager.add_tool(tool_with_parametrized_context) @@ -321,3 +410,43 @@ def tool_with_context(x: int, ctx: Context) -> str: ctx = mcp.get_context() with pytest.raises(ToolError, match="Error executing tool tool_with_context"): await manager.call_tool("tool_with_context", {"x": 42}, context=ctx) + + +class TestToolAnnotations: + def test_tool_annotations(self): + """Test that tool annotations are correctly added to tools.""" + + def read_data(path: str) -> str: + """Read data from a file.""" + return f"Data from {path}" + + annotations = ToolAnnotations( + title="File Reader", + readOnlyHint=True, + openWorldHint=False, + ) + + manager = ToolManager() + tool = manager.add_tool(read_data, annotations=annotations) + + assert tool.annotations is not None + assert tool.annotations.title == "File Reader" + assert tool.annotations.readOnlyHint is True + assert tool.annotations.openWorldHint is False + + @pytest.mark.anyio + async def test_tool_annotations_in_fastmcp(self): + """Test that tool annotations are included in MCPTool conversion.""" + + app = FastMCP() + + @app.tool(annotations=ToolAnnotations(title="Echo Tool", readOnlyHint=True)) + def echo(message: str) -> str: + """Echo a message back.""" + return message + + tools = await app.list_tools() + assert len(tools) == 1 + assert tools[0].annotations is not None + assert tools[0].annotations.title == "Echo Tool" + assert tools[0].annotations.readOnlyHint is True diff --git a/tests/server/test_completion_with_context.py b/tests/server/test_completion_with_context.py new file mode 100644 index 000000000..f0d154587 --- /dev/null +++ b/tests/server/test_completion_with_context.py @@ -0,0 +1,180 @@ +""" +Tests for completion handler with context functionality. 
+""" + +import pytest + +from mcp.server.lowlevel import Server +from mcp.shared.memory import create_connected_server_and_client_session +from mcp.types import ( + Completion, + CompletionArgument, + CompletionContext, + PromptReference, + ResourceTemplateReference, +) + + +@pytest.mark.anyio +async def test_completion_handler_receives_context(): + """Test that the completion handler receives context correctly.""" + server = Server("test-server") + + # Track what the handler receives + received_args = {} + + @server.completion() + async def handle_completion( + ref: PromptReference | ResourceTemplateReference, + argument: CompletionArgument, + context: CompletionContext | None, + ) -> Completion | None: + received_args["ref"] = ref + received_args["argument"] = argument + received_args["context"] = context + + # Return test completion + return Completion(values=["test-completion"], total=1, hasMore=False) + + async with create_connected_server_and_client_session(server) as client: + # Test with context + result = await client.complete( + ref=ResourceTemplateReference(type="ref/resource", uri="test://resource/{param}"), + argument={"name": "param", "value": "test"}, + context_arguments={"previous": "value"}, + ) + + # Verify handler received the context + assert received_args["context"] is not None + assert received_args["context"].arguments == {"previous": "value"} + assert result.completion.values == ["test-completion"] + + +@pytest.mark.anyio +async def test_completion_backward_compatibility(): + """Test that completion works without context (backward compatibility).""" + server = Server("test-server") + + context_was_none = False + + @server.completion() + async def handle_completion( + ref: PromptReference | ResourceTemplateReference, + argument: CompletionArgument, + context: CompletionContext | None, + ) -> Completion | None: + nonlocal context_was_none + context_was_none = context is None + + return Completion(values=["no-context-completion"], total=1, hasMore=False) + + async with create_connected_server_and_client_session(server) as client: + # Test without context + result = await client.complete( + ref=PromptReference(type="ref/prompt", name="test-prompt"), argument={"name": "arg", "value": "val"} + ) + + # Verify context was None + assert context_was_none + assert result.completion.values == ["no-context-completion"] + + +@pytest.mark.anyio +async def test_dependent_completion_scenario(): + """Test a real-world scenario with dependent completions.""" + server = Server("test-server") + + @server.completion() + async def handle_completion( + ref: PromptReference | ResourceTemplateReference, + argument: CompletionArgument, + context: CompletionContext | None, + ) -> Completion | None: + # Simulate database/table completion scenario + if isinstance(ref, ResourceTemplateReference): + if ref.uri == "db://{database}/{table}": + if argument.name == "database": + # Complete database names + return Completion(values=["users_db", "products_db", "analytics_db"], total=3, hasMore=False) + elif argument.name == "table": + # Complete table names based on selected database + if context and context.arguments: + db = context.arguments.get("database") + if db == "users_db": + return Completion(values=["users", "sessions", "permissions"], total=3, hasMore=False) + elif db == "products_db": + return Completion(values=["products", "categories", "inventory"], total=3, hasMore=False) + + return Completion(values=[], total=0, hasMore=False) + + async with 
create_connected_server_and_client_session(server) as client: + # First, complete database + db_result = await client.complete( + ref=ResourceTemplateReference(type="ref/resource", uri="db://{database}/{table}"), + argument={"name": "database", "value": ""}, + ) + assert "users_db" in db_result.completion.values + assert "products_db" in db_result.completion.values + + # Then complete table with database context + table_result = await client.complete( + ref=ResourceTemplateReference(type="ref/resource", uri="db://{database}/{table}"), + argument={"name": "table", "value": ""}, + context_arguments={"database": "users_db"}, + ) + assert table_result.completion.values == ["users", "sessions", "permissions"] + + # Different database gives different tables + table_result2 = await client.complete( + ref=ResourceTemplateReference(type="ref/resource", uri="db://{database}/{table}"), + argument={"name": "table", "value": ""}, + context_arguments={"database": "products_db"}, + ) + assert table_result2.completion.values == ["products", "categories", "inventory"] + + +@pytest.mark.anyio +async def test_completion_error_on_missing_context(): + """Test that server can raise error when required context is missing.""" + server = Server("test-server") + + @server.completion() + async def handle_completion( + ref: PromptReference | ResourceTemplateReference, + argument: CompletionArgument, + context: CompletionContext | None, + ) -> Completion | None: + if isinstance(ref, ResourceTemplateReference): + if ref.uri == "db://{database}/{table}": + if argument.name == "table": + # Check if database context is provided + if not context or not context.arguments or "database" not in context.arguments: + # Raise an error instead of returning error as completion + raise ValueError("Please select a database first to see available tables") + # Normal completion if context is provided + db = context.arguments.get("database") + if db == "test_db": + return Completion(values=["users", "orders", "products"], total=3, hasMore=False) + + return Completion(values=[], total=0, hasMore=False) + + async with create_connected_server_and_client_session(server) as client: + # Try to complete table without database context - should raise error + with pytest.raises(Exception) as exc_info: + await client.complete( + ref=ResourceTemplateReference(type="ref/resource", uri="db://{database}/{table}"), + argument={"name": "table", "value": ""}, + ) + + # Verify error message + assert "Please select a database first" in str(exc_info.value) + + # Now complete with proper context - should work normally + result_with_context = await client.complete( + ref=ResourceTemplateReference(type="ref/resource", uri="db://{database}/{table}"), + argument={"name": "table", "value": ""}, + context_arguments={"database": "test_db"}, + ) + + # Should get normal completions + assert result_with_context.completion.values == ["users", "orders", "products"] diff --git a/tests/server/test_lifespan.py b/tests/server/test_lifespan.py index 309a44b87..a3ff59bc1 100644 --- a/tests/server/test_lifespan.py +++ b/tests/server/test_lifespan.py @@ -10,6 +10,7 @@ from mcp.server.fastmcp import Context, FastMCP from mcp.server.lowlevel.server import NotificationOptions, Server from mcp.server.models import InitializationOptions +from mcp.shared.message import SessionMessage from mcp.types import ( ClientCapabilities, Implementation, @@ -82,41 +83,49 @@ async def run_server(): clientInfo=Implementation(name="test-client", version="0.1.0"), ) await send_stream1.send( - 
JSONRPCMessage( - root=JSONRPCRequest( - jsonrpc="2.0", - id=1, - method="initialize", - params=TypeAdapter(InitializeRequestParams).dump_python(params), + SessionMessage( + JSONRPCMessage( + root=JSONRPCRequest( + jsonrpc="2.0", + id=1, + method="initialize", + params=TypeAdapter(InitializeRequestParams).dump_python(params), + ) ) ) ) response = await receive_stream2.receive() + response = response.message # Send initialized notification await send_stream1.send( - JSONRPCMessage( - root=JSONRPCNotification( - jsonrpc="2.0", - method="notifications/initialized", + SessionMessage( + JSONRPCMessage( + root=JSONRPCNotification( + jsonrpc="2.0", + method="notifications/initialized", + ) ) ) ) # Call the tool to verify lifespan context await send_stream1.send( - JSONRPCMessage( - root=JSONRPCRequest( - jsonrpc="2.0", - id=2, - method="tools/call", - params={"name": "check_lifespan", "arguments": {}}, + SessionMessage( + JSONRPCMessage( + root=JSONRPCRequest( + jsonrpc="2.0", + id=2, + method="tools/call", + params={"name": "check_lifespan", "arguments": {}}, + ) ) ) ) # Get response and verify response = await receive_stream2.receive() + response = response.message assert response.root.result["content"][0]["text"] == "true" # Cancel server task @@ -178,41 +187,49 @@ async def run_server(): clientInfo=Implementation(name="test-client", version="0.1.0"), ) await send_stream1.send( - JSONRPCMessage( - root=JSONRPCRequest( - jsonrpc="2.0", - id=1, - method="initialize", - params=TypeAdapter(InitializeRequestParams).dump_python(params), + SessionMessage( + JSONRPCMessage( + root=JSONRPCRequest( + jsonrpc="2.0", + id=1, + method="initialize", + params=TypeAdapter(InitializeRequestParams).dump_python(params), + ) ) ) ) response = await receive_stream2.receive() + response = response.message # Send initialized notification await send_stream1.send( - JSONRPCMessage( - root=JSONRPCNotification( - jsonrpc="2.0", - method="notifications/initialized", + SessionMessage( + JSONRPCMessage( + root=JSONRPCNotification( + jsonrpc="2.0", + method="notifications/initialized", + ) ) ) ) # Call the tool to verify lifespan context await send_stream1.send( - JSONRPCMessage( - root=JSONRPCRequest( - jsonrpc="2.0", - id=2, - method="tools/call", - params={"name": "check_lifespan", "arguments": {}}, + SessionMessage( + JSONRPCMessage( + root=JSONRPCRequest( + jsonrpc="2.0", + id=2, + method="tools/call", + params={"name": "check_lifespan", "arguments": {}}, + ) ) ) ) # Get response and verify response = await receive_stream2.receive() + response = response.message assert response.root.result["content"][0]["text"] == "true" # Cancel server task diff --git a/tests/server/test_lowlevel_tool_annotations.py b/tests/server/test_lowlevel_tool_annotations.py new file mode 100644 index 000000000..2eb3b7ddb --- /dev/null +++ b/tests/server/test_lowlevel_tool_annotations.py @@ -0,0 +1,99 @@ +"""Tests for tool annotations in low-level server.""" + +import anyio +import pytest + +from mcp.client.session import ClientSession +from mcp.server import Server +from mcp.server.lowlevel import NotificationOptions +from mcp.server.models import InitializationOptions +from mcp.server.session import ServerSession +from mcp.shared.message import SessionMessage +from mcp.shared.session import RequestResponder +from mcp.types import ClientResult, ServerNotification, ServerRequest, Tool, ToolAnnotations + + +@pytest.mark.anyio +async def test_lowlevel_server_tool_annotations(): + """Test that tool annotations work in low-level server.""" + server = 
Server("test") + + # Create a tool with annotations + @server.list_tools() + async def list_tools(): + return [ + Tool( + name="echo", + description="Echo a message back", + inputSchema={ + "type": "object", + "properties": { + "message": {"type": "string"}, + }, + "required": ["message"], + }, + annotations=ToolAnnotations( + title="Echo Tool", + readOnlyHint=True, + ), + ) + ] + + server_to_client_send, server_to_client_receive = anyio.create_memory_object_stream[SessionMessage](10) + client_to_server_send, client_to_server_receive = anyio.create_memory_object_stream[SessionMessage](10) + + # Message handler for client + async def message_handler( + message: RequestResponder[ServerRequest, ClientResult] | ServerNotification | Exception, + ) -> None: + if isinstance(message, Exception): + raise message + + # Server task + async def run_server(): + async with ServerSession( + client_to_server_receive, + server_to_client_send, + InitializationOptions( + server_name="test-server", + server_version="1.0.0", + capabilities=server.get_capabilities( + notification_options=NotificationOptions(), + experimental_capabilities={}, + ), + ), + ) as server_session: + async with anyio.create_task_group() as tg: + + async def handle_messages(): + async for message in server_session.incoming_messages: + await server._handle_message(message, server_session, {}, False) + + tg.start_soon(handle_messages) + await anyio.sleep_forever() + + # Run the test + async with anyio.create_task_group() as tg: + tg.start_soon(run_server) + + async with ClientSession( + server_to_client_receive, + client_to_server_send, + message_handler=message_handler, + ) as client_session: + # Initialize the session + await client_session.initialize() + + # List tools + tools_result = await client_session.list_tools() + + # Cancel the server task + tg.cancel_scope.cancel() + + # Verify results + assert tools_result is not None + assert len(tools_result.tools) == 1 + assert tools_result.tools[0].name == "echo" + assert tools_result.tools[0].annotations is not None + assert tools_result.tools[0].annotations.title == "Echo Tool" + assert tools_result.tools[0].annotations.readOnlyHint is True diff --git a/tests/server/test_read_resource.py b/tests/server/test_read_resource.py index 469eef857..91f6ef8c8 100644 --- a/tests/server/test_read_resource.py +++ b/tests/server/test_read_resource.py @@ -56,11 +56,7 @@ async def test_read_resource_binary(temp_file: Path): @server.read_resource() async def read_resource(uri: AnyUrl) -> Iterable[ReadResourceContents]: - return [ - ReadResourceContents( - content=b"Hello World", mime_type="application/octet-stream" - ) - ] + return [ReadResourceContents(content=b"Hello World", mime_type="application/octet-stream")] # Get the handler directly from the server handler = server.request_handlers[types.ReadResourceRequest] diff --git a/tests/server/test_session.py b/tests/server/test_session.py index 561a94b64..69321f87c 100644 --- a/tests/server/test_session.py +++ b/tests/server/test_session.py @@ -7,11 +7,11 @@ from mcp.server.lowlevel import NotificationOptions from mcp.server.models import InitializationOptions from mcp.server.session import ServerSession +from mcp.shared.message import SessionMessage from mcp.shared.session import RequestResponder from mcp.types import ( ClientNotification, InitializedNotification, - JSONRPCMessage, PromptsCapability, ResourcesCapability, ServerCapabilities, @@ -20,18 +20,12 @@ @pytest.mark.anyio async def test_server_session_initialize(): - server_to_client_send, 
server_to_client_receive = anyio.create_memory_object_stream[ - JSONRPCMessage - ](1) - client_to_server_send, client_to_server_receive = anyio.create_memory_object_stream[ - JSONRPCMessage - ](1) + server_to_client_send, server_to_client_receive = anyio.create_memory_object_stream[SessionMessage](1) + client_to_server_send, client_to_server_receive = anyio.create_memory_object_stream[SessionMessage](1) # Create a message handler to catch exceptions async def message_handler( - message: RequestResponder[types.ServerRequest, types.ClientResult] - | types.ServerNotification - | Exception, + message: RequestResponder[types.ServerRequest, types.ClientResult] | types.ServerNotification | Exception, ) -> None: if isinstance(message, Exception): raise message @@ -54,9 +48,7 @@ async def run_server(): if isinstance(message, Exception): raise message - if isinstance(message, ClientNotification) and isinstance( - message.root, InitializedNotification - ): + if isinstance(message, ClientNotification) and isinstance(message.root, InitializedNotification): received_initialized = True return @@ -106,3 +98,89 @@ async def list_resources(): caps = server.get_capabilities(notification_options, experimental_capabilities) assert caps.prompts == PromptsCapability(listChanged=False) assert caps.resources == ResourcesCapability(subscribe=False, listChanged=False) + + +@pytest.mark.anyio +async def test_server_session_initialize_with_older_protocol_version(): + """Test that server accepts and responds with older protocol (2024-11-05).""" + server_to_client_send, server_to_client_receive = anyio.create_memory_object_stream[SessionMessage](1) + client_to_server_send, client_to_server_receive = anyio.create_memory_object_stream[SessionMessage | Exception](1) + + received_initialized = False + received_protocol_version = None + + async def run_server(): + nonlocal received_initialized + + async with ServerSession( + client_to_server_receive, + server_to_client_send, + InitializationOptions( + server_name="mcp", + server_version="0.1.0", + capabilities=ServerCapabilities(), + ), + ) as server_session: + async for message in server_session.incoming_messages: + if isinstance(message, Exception): + raise message + + if isinstance(message, types.ClientNotification) and isinstance(message.root, InitializedNotification): + received_initialized = True + return + + async def mock_client(): + nonlocal received_protocol_version + + # Send initialization request with older protocol version (2024-11-05) + await client_to_server_send.send( + SessionMessage( + types.JSONRPCMessage( + types.JSONRPCRequest( + jsonrpc="2.0", + id=1, + method="initialize", + params=types.InitializeRequestParams( + protocolVersion="2024-11-05", + capabilities=types.ClientCapabilities(), + clientInfo=types.Implementation(name="test-client", version="1.0.0"), + ).model_dump(by_alias=True, mode="json", exclude_none=True), + ) + ) + ) + ) + + # Wait for the initialize response + init_response_message = await server_to_client_receive.receive() + assert isinstance(init_response_message.message.root, types.JSONRPCResponse) + result_data = init_response_message.message.root.result + init_result = types.InitializeResult.model_validate(result_data) + + # Check that the server responded with the requested protocol version + received_protocol_version = init_result.protocolVersion + assert received_protocol_version == "2024-11-05" + + # Send initialized notification + await client_to_server_send.send( + SessionMessage( + types.JSONRPCMessage( + 
types.JSONRPCNotification( + jsonrpc="2.0", + method="notifications/initialized", + ) + ) + ) + ) + + async with ( + client_to_server_send, + client_to_server_receive, + server_to_client_send, + server_to_client_receive, + anyio.create_task_group() as tg, + ): + tg.start_soon(run_server) + tg.start_soon(mock_client) + + assert received_initialized + assert received_protocol_version == "2024-11-05" diff --git a/tests/server/test_sse_security.py b/tests/server/test_sse_security.py new file mode 100644 index 000000000..43af35061 --- /dev/null +++ b/tests/server/test_sse_security.py @@ -0,0 +1,293 @@ +"""Tests for SSE server DNS rebinding protection.""" + +import logging +import multiprocessing +import socket +import time + +import httpx +import pytest +import uvicorn +from starlette.applications import Starlette +from starlette.requests import Request +from starlette.responses import Response +from starlette.routing import Mount, Route + +from mcp.server import Server +from mcp.server.sse import SseServerTransport +from mcp.server.transport_security import TransportSecuritySettings +from mcp.types import Tool + +logger = logging.getLogger(__name__) +SERVER_NAME = "test_sse_security_server" + + +@pytest.fixture +def server_port() -> int: + with socket.socket() as s: + s.bind(("127.0.0.1", 0)) + return s.getsockname()[1] + + +@pytest.fixture +def server_url(server_port: int) -> str: + return f"http://127.0.0.1:{server_port}" + + +class SecurityTestServer(Server): + def __init__(self): + super().__init__(SERVER_NAME) + + async def on_list_tools(self) -> list[Tool]: + return [] + + +def run_server_with_settings(port: int, security_settings: TransportSecuritySettings | None = None): + """Run the SSE server with specified security settings.""" + app = SecurityTestServer() + sse_transport = SseServerTransport("/messages/", security_settings) + + async def handle_sse(request: Request): + try: + async with sse_transport.connect_sse(request.scope, request.receive, request._send) as streams: + if streams: + await app.run(streams[0], streams[1], app.create_initialization_options()) + except ValueError as e: + # Validation error was already handled inside connect_sse + logger.debug(f"SSE connection failed validation: {e}") + return Response() + + routes = [ + Route("/sse", endpoint=handle_sse), + Mount("/messages/", app=sse_transport.handle_post_message), + ] + + starlette_app = Starlette(routes=routes) + uvicorn.run(starlette_app, host="127.0.0.1", port=port, log_level="error") + + +def start_server_process(port: int, security_settings: TransportSecuritySettings | None = None): + """Start server in a separate process.""" + process = multiprocessing.Process(target=run_server_with_settings, args=(port, security_settings)) + process.start() + # Give server time to start + time.sleep(1) + return process + + +@pytest.mark.anyio +async def test_sse_security_default_settings(server_port: int): + """Test SSE with default security settings (protection disabled).""" + process = start_server_process(server_port) + + try: + headers = {"Host": "evil.com", "Origin": "http://evil.com"} + + async with httpx.AsyncClient(timeout=5.0) as client: + async with client.stream("GET", f"http://127.0.0.1:{server_port}/sse", headers=headers) as response: + assert response.status_code == 200 + finally: + process.terminate() + process.join() + + +@pytest.mark.anyio +async def test_sse_security_invalid_host_header(server_port: int): + """Test SSE with invalid Host header.""" + # Enable security by providing settings with an empty 
allowed_hosts list + security_settings = TransportSecuritySettings(enable_dns_rebinding_protection=True, allowed_hosts=["example.com"]) + process = start_server_process(server_port, security_settings) + + try: + # Test with invalid host header + headers = {"Host": "evil.com"} + + async with httpx.AsyncClient() as client: + response = await client.get(f"http://127.0.0.1:{server_port}/sse", headers=headers) + assert response.status_code == 421 + assert response.text == "Invalid Host header" + + finally: + process.terminate() + process.join() + + +@pytest.mark.anyio +async def test_sse_security_invalid_origin_header(server_port: int): + """Test SSE with invalid Origin header.""" + # Configure security to allow the host but restrict origins + security_settings = TransportSecuritySettings( + enable_dns_rebinding_protection=True, allowed_hosts=["127.0.0.1:*"], allowed_origins=["http://localhost:*"] + ) + process = start_server_process(server_port, security_settings) + + try: + # Test with invalid origin header + headers = {"Origin": "http://evil.com"} + + async with httpx.AsyncClient() as client: + response = await client.get(f"http://127.0.0.1:{server_port}/sse", headers=headers) + assert response.status_code == 400 + assert response.text == "Invalid Origin header" + + finally: + process.terminate() + process.join() + + +@pytest.mark.anyio +async def test_sse_security_post_invalid_content_type(server_port: int): + """Test POST endpoint with invalid Content-Type header.""" + # Configure security to allow the host + security_settings = TransportSecuritySettings( + enable_dns_rebinding_protection=True, allowed_hosts=["127.0.0.1:*"], allowed_origins=["http://127.0.0.1:*"] + ) + process = start_server_process(server_port, security_settings) + + try: + async with httpx.AsyncClient(timeout=5.0) as client: + # Test POST with invalid content type + fake_session_id = "12345678123456781234567812345678" + response = await client.post( + f"http://127.0.0.1:{server_port}/messages/?session_id={fake_session_id}", + headers={"Content-Type": "text/plain"}, + content="test", + ) + assert response.status_code == 400 + assert response.text == "Invalid Content-Type header" + + # Test POST with missing content type + response = await client.post( + f"http://127.0.0.1:{server_port}/messages/?session_id={fake_session_id}", content="test" + ) + assert response.status_code == 400 + assert response.text == "Invalid Content-Type header" + + finally: + process.terminate() + process.join() + + +@pytest.mark.anyio +async def test_sse_security_disabled(server_port: int): + """Test SSE with security disabled.""" + settings = TransportSecuritySettings(enable_dns_rebinding_protection=False) + process = start_server_process(server_port, settings) + + try: + # Test with invalid host header - should still work + headers = {"Host": "evil.com"} + + async with httpx.AsyncClient(timeout=5.0) as client: + # For SSE endpoints, we need to use stream to avoid timeout + async with client.stream("GET", f"http://127.0.0.1:{server_port}/sse", headers=headers) as response: + # Should connect successfully even with invalid host + assert response.status_code == 200 + + finally: + process.terminate() + process.join() + + +@pytest.mark.anyio +async def test_sse_security_custom_allowed_hosts(server_port: int): + """Test SSE with custom allowed hosts.""" + settings = TransportSecuritySettings( + enable_dns_rebinding_protection=True, + allowed_hosts=["localhost", "127.0.0.1", "custom.host"], + allowed_origins=["http://localhost", "http://127.0.0.1", 
"http://custom.host"], + ) + process = start_server_process(server_port, settings) + + try: + # Test with custom allowed host + headers = {"Host": "custom.host"} + + async with httpx.AsyncClient(timeout=5.0) as client: + # For SSE endpoints, we need to use stream to avoid timeout + async with client.stream("GET", f"http://127.0.0.1:{server_port}/sse", headers=headers) as response: + # Should connect successfully with custom host + assert response.status_code == 200 + + # Test with non-allowed host + headers = {"Host": "evil.com"} + + async with httpx.AsyncClient() as client: + response = await client.get(f"http://127.0.0.1:{server_port}/sse", headers=headers) + assert response.status_code == 421 + assert response.text == "Invalid Host header" + + finally: + process.terminate() + process.join() + + +@pytest.mark.anyio +async def test_sse_security_wildcard_ports(server_port: int): + """Test SSE with wildcard port patterns.""" + settings = TransportSecuritySettings( + enable_dns_rebinding_protection=True, + allowed_hosts=["localhost:*", "127.0.0.1:*"], + allowed_origins=["http://localhost:*", "http://127.0.0.1:*"], + ) + process = start_server_process(server_port, settings) + + try: + # Test with various port numbers + for test_port in [8080, 3000, 9999]: + headers = {"Host": f"localhost:{test_port}"} + + async with httpx.AsyncClient(timeout=5.0) as client: + # For SSE endpoints, we need to use stream to avoid timeout + async with client.stream("GET", f"http://127.0.0.1:{server_port}/sse", headers=headers) as response: + # Should connect successfully with any port + assert response.status_code == 200 + + headers = {"Origin": f"http://localhost:{test_port}"} + + async with httpx.AsyncClient(timeout=5.0) as client: + # For SSE endpoints, we need to use stream to avoid timeout + async with client.stream("GET", f"http://127.0.0.1:{server_port}/sse", headers=headers) as response: + # Should connect successfully with any port + assert response.status_code == 200 + + finally: + process.terminate() + process.join() + + +@pytest.mark.anyio +async def test_sse_security_post_valid_content_type(server_port: int): + """Test POST endpoint with valid Content-Type headers.""" + # Configure security to allow the host + security_settings = TransportSecuritySettings( + enable_dns_rebinding_protection=True, allowed_hosts=["127.0.0.1:*"], allowed_origins=["http://127.0.0.1:*"] + ) + process = start_server_process(server_port, security_settings) + + try: + async with httpx.AsyncClient() as client: + # Test with various valid content types + valid_content_types = [ + "application/json", + "application/json; charset=utf-8", + "application/json;charset=utf-8", + "APPLICATION/JSON", # Case insensitive + ] + + for content_type in valid_content_types: + # Use a valid UUID format (even though session won't exist) + fake_session_id = "12345678123456781234567812345678" + response = await client.post( + f"http://127.0.0.1:{server_port}/messages/?session_id={fake_session_id}", + headers={"Content-Type": content_type}, + json={"test": "data"}, + ) + # Will get 404 because session doesn't exist, but that's OK + # We're testing that it passes the content-type check + assert response.status_code == 404 + assert response.text == "Could not find session" + + finally: + process.terminate() + process.join() diff --git a/tests/server/test_stdio.py b/tests/server/test_stdio.py index 85c5bf219..2d1850b73 100644 --- a/tests/server/test_stdio.py +++ b/tests/server/test_stdio.py @@ -4,6 +4,7 @@ import pytest from mcp.server.stdio import 
stdio_server +from mcp.shared.message import SessionMessage from mcp.types import JSONRPCMessage, JSONRPCRequest, JSONRPCResponse @@ -21,26 +22,23 @@ async def test_stdio_server(): stdin.write(message.model_dump_json(by_alias=True, exclude_none=True) + "\n") stdin.seek(0) - async with stdio_server( - stdin=anyio.AsyncFile(stdin), stdout=anyio.AsyncFile(stdout) - ) as (read_stream, write_stream): + async with stdio_server(stdin=anyio.AsyncFile(stdin), stdout=anyio.AsyncFile(stdout)) as ( + read_stream, + write_stream, + ): received_messages = [] async with read_stream: async for message in read_stream: if isinstance(message, Exception): raise message - received_messages.append(message) + received_messages.append(message.message) if len(received_messages) == 2: break # Verify received messages assert len(received_messages) == 2 - assert received_messages[0] == JSONRPCMessage( - root=JSONRPCRequest(jsonrpc="2.0", id=1, method="ping") - ) - assert received_messages[1] == JSONRPCMessage( - root=JSONRPCResponse(jsonrpc="2.0", id=2, result={}) - ) + assert received_messages[0] == JSONRPCMessage(root=JSONRPCRequest(jsonrpc="2.0", id=1, method="ping")) + assert received_messages[1] == JSONRPCMessage(root=JSONRPCResponse(jsonrpc="2.0", id=2, result={})) # Test sending responses from the server responses = [ @@ -50,19 +48,14 @@ async def test_stdio_server(): async with write_stream: for response in responses: - await write_stream.send(response) + session_message = SessionMessage(response) + await write_stream.send(session_message) stdout.seek(0) output_lines = stdout.readlines() assert len(output_lines) == 2 - received_responses = [ - JSONRPCMessage.model_validate_json(line.strip()) for line in output_lines - ] + received_responses = [JSONRPCMessage.model_validate_json(line.strip()) for line in output_lines] assert len(received_responses) == 2 - assert received_responses[0] == JSONRPCMessage( - root=JSONRPCRequest(jsonrpc="2.0", id=3, method="ping") - ) - assert received_responses[1] == JSONRPCMessage( - root=JSONRPCResponse(jsonrpc="2.0", id=4, result={}) - ) + assert received_responses[0] == JSONRPCMessage(root=JSONRPCRequest(jsonrpc="2.0", id=3, method="ping")) + assert received_responses[1] == JSONRPCMessage(root=JSONRPCResponse(jsonrpc="2.0", id=4, result={})) diff --git a/tests/server/test_streamable_http_manager.py b/tests/server/test_streamable_http_manager.py new file mode 100644 index 000000000..65828b63b --- /dev/null +++ b/tests/server/test_streamable_http_manager.py @@ -0,0 +1,73 @@ +"""Tests for StreamableHTTPSessionManager.""" + +import anyio +import pytest + +from mcp.server.lowlevel import Server +from mcp.server.streamable_http_manager import StreamableHTTPSessionManager + + +@pytest.mark.anyio +async def test_run_can_only_be_called_once(): + """Test that run() can only be called once per instance.""" + app = Server("test-server") + manager = StreamableHTTPSessionManager(app=app) + + # First call should succeed + async with manager.run(): + pass + + # Second call should raise RuntimeError + with pytest.raises(RuntimeError) as excinfo: + async with manager.run(): + pass + + assert "StreamableHTTPSessionManager .run() can only be called once per instance" in str(excinfo.value) + + +@pytest.mark.anyio +async def test_run_prevents_concurrent_calls(): + """Test that concurrent calls to run() are prevented.""" + app = Server("test-server") + manager = StreamableHTTPSessionManager(app=app) + + errors = [] + + async def try_run(): + try: + async with manager.run(): + # Simulate some work + 
await anyio.sleep(0.1) + except RuntimeError as e: + errors.append(e) + + # Try to run concurrently + async with anyio.create_task_group() as tg: + tg.start_soon(try_run) + tg.start_soon(try_run) + + # One should succeed, one should fail + assert len(errors) == 1 + assert "StreamableHTTPSessionManager .run() can only be called once per instance" in str(errors[0]) + + +@pytest.mark.anyio +async def test_handle_request_without_run_raises_error(): + """Test that handle_request raises error if run() hasn't been called.""" + app = Server("test-server") + manager = StreamableHTTPSessionManager(app=app) + + # Mock ASGI parameters + scope = {"type": "http", "method": "POST", "path": "/test"} + + async def receive(): + return {"type": "http.request", "body": b""} + + async def send(message): + pass + + # Should raise error because run() hasn't been called + with pytest.raises(RuntimeError) as excinfo: + await manager.handle_request(scope, receive, send) + + assert "Task group is not initialized. Make sure to use run()." in str(excinfo.value) diff --git a/tests/server/test_streamable_http_security.py b/tests/server/test_streamable_http_security.py new file mode 100644 index 000000000..eed791924 --- /dev/null +++ b/tests/server/test_streamable_http_security.py @@ -0,0 +1,293 @@ +"""Tests for StreamableHTTP server DNS rebinding protection.""" + +import logging +import multiprocessing +import socket +import time +from collections.abc import AsyncGenerator +from contextlib import asynccontextmanager + +import httpx +import pytest +import uvicorn +from starlette.applications import Starlette +from starlette.routing import Mount +from starlette.types import Receive, Scope, Send + +from mcp.server import Server +from mcp.server.streamable_http_manager import StreamableHTTPSessionManager +from mcp.server.transport_security import TransportSecuritySettings +from mcp.types import Tool + +logger = logging.getLogger(__name__) +SERVER_NAME = "test_streamable_http_security_server" + + +@pytest.fixture +def server_port() -> int: + with socket.socket() as s: + s.bind(("127.0.0.1", 0)) + return s.getsockname()[1] + + +@pytest.fixture +def server_url(server_port: int) -> str: + return f"http://127.0.0.1:{server_port}" + + +class SecurityTestServer(Server): + def __init__(self): + super().__init__(SERVER_NAME) + + async def on_list_tools(self) -> list[Tool]: + return [] + + +def run_server_with_settings(port: int, security_settings: TransportSecuritySettings | None = None): + """Run the StreamableHTTP server with specified security settings.""" + app = SecurityTestServer() + + # Create session manager with security settings + session_manager = StreamableHTTPSessionManager( + app=app, + json_response=False, + stateless=False, + security_settings=security_settings, + ) + + # Create the ASGI handler + async def handle_streamable_http(scope: Scope, receive: Receive, send: Send) -> None: + await session_manager.handle_request(scope, receive, send) + + # Create Starlette app with lifespan + @asynccontextmanager + async def lifespan(app: Starlette) -> AsyncGenerator[None, None]: + async with session_manager.run(): + yield + + routes = [ + Mount("/", app=handle_streamable_http), + ] + + starlette_app = Starlette(routes=routes, lifespan=lifespan) + uvicorn.run(starlette_app, host="127.0.0.1", port=port, log_level="error") + + +def start_server_process(port: int, security_settings: TransportSecuritySettings | None = None): + """Start server in a separate process.""" + process = 
multiprocessing.Process(target=run_server_with_settings, args=(port, security_settings)) + process.start() + # Give server time to start + time.sleep(1) + return process + + +@pytest.mark.anyio +async def test_streamable_http_security_default_settings(server_port: int): + """Test StreamableHTTP with default security settings (protection enabled).""" + process = start_server_process(server_port) + + try: + # Test with valid localhost headers + async with httpx.AsyncClient(timeout=5.0) as client: + # POST request to initialize session + response = await client.post( + f"http://127.0.0.1:{server_port}/", + json={"jsonrpc": "2.0", "method": "initialize", "id": 1, "params": {}}, + headers={ + "Accept": "application/json, text/event-stream", + "Content-Type": "application/json", + }, + ) + assert response.status_code == 200 + assert "mcp-session-id" in response.headers + + finally: + process.terminate() + process.join() + + +@pytest.mark.anyio +async def test_streamable_http_security_invalid_host_header(server_port: int): + """Test StreamableHTTP with invalid Host header.""" + security_settings = TransportSecuritySettings(enable_dns_rebinding_protection=True) + process = start_server_process(server_port, security_settings) + + try: + # Test with invalid host header + headers = { + "Host": "evil.com", + "Accept": "application/json, text/event-stream", + "Content-Type": "application/json", + } + + async with httpx.AsyncClient(timeout=5.0) as client: + response = await client.post( + f"http://127.0.0.1:{server_port}/", + json={"jsonrpc": "2.0", "method": "initialize", "id": 1, "params": {}}, + headers=headers, + ) + assert response.status_code == 421 + assert response.text == "Invalid Host header" + + finally: + process.terminate() + process.join() + + +@pytest.mark.anyio +async def test_streamable_http_security_invalid_origin_header(server_port: int): + """Test StreamableHTTP with invalid Origin header.""" + security_settings = TransportSecuritySettings(enable_dns_rebinding_protection=True, allowed_hosts=["127.0.0.1:*"]) + process = start_server_process(server_port, security_settings) + + try: + # Test with invalid origin header + headers = { + "Origin": "http://evil.com", + "Accept": "application/json, text/event-stream", + "Content-Type": "application/json", + } + + async with httpx.AsyncClient(timeout=5.0) as client: + response = await client.post( + f"http://127.0.0.1:{server_port}/", + json={"jsonrpc": "2.0", "method": "initialize", "id": 1, "params": {}}, + headers=headers, + ) + assert response.status_code == 400 + assert response.text == "Invalid Origin header" + + finally: + process.terminate() + process.join() + + +@pytest.mark.anyio +async def test_streamable_http_security_invalid_content_type(server_port: int): + """Test StreamableHTTP POST with invalid Content-Type header.""" + process = start_server_process(server_port) + + try: + async with httpx.AsyncClient(timeout=5.0) as client: + # Test POST with invalid content type + response = await client.post( + f"http://127.0.0.1:{server_port}/", + headers={ + "Content-Type": "text/plain", + "Accept": "application/json, text/event-stream", + }, + content="test", + ) + assert response.status_code == 400 + assert response.text == "Invalid Content-Type header" + + # Test POST with missing content type + response = await client.post( + f"http://127.0.0.1:{server_port}/", + headers={"Accept": "application/json, text/event-stream"}, + content="test", + ) + assert response.status_code == 400 + assert response.text == "Invalid Content-Type header" 
+ + finally: + process.terminate() + process.join() + + +@pytest.mark.anyio +async def test_streamable_http_security_disabled(server_port: int): + """Test StreamableHTTP with security disabled.""" + settings = TransportSecuritySettings(enable_dns_rebinding_protection=False) + process = start_server_process(server_port, settings) + + try: + # Test with invalid host header - should still work + headers = { + "Host": "evil.com", + "Accept": "application/json, text/event-stream", + "Content-Type": "application/json", + } + + async with httpx.AsyncClient(timeout=5.0) as client: + response = await client.post( + f"http://127.0.0.1:{server_port}/", + json={"jsonrpc": "2.0", "method": "initialize", "id": 1, "params": {}}, + headers=headers, + ) + # Should connect successfully even with invalid host + assert response.status_code == 200 + + finally: + process.terminate() + process.join() + + +@pytest.mark.anyio +async def test_streamable_http_security_custom_allowed_hosts(server_port: int): + """Test StreamableHTTP with custom allowed hosts.""" + settings = TransportSecuritySettings( + enable_dns_rebinding_protection=True, + allowed_hosts=["localhost", "127.0.0.1", "custom.host"], + allowed_origins=["http://localhost", "http://127.0.0.1", "http://custom.host"], + ) + process = start_server_process(server_port, settings) + + try: + # Test with custom allowed host + headers = { + "Host": "custom.host", + "Accept": "application/json, text/event-stream", + "Content-Type": "application/json", + } + + async with httpx.AsyncClient(timeout=5.0) as client: + response = await client.post( + f"http://127.0.0.1:{server_port}/", + json={"jsonrpc": "2.0", "method": "initialize", "id": 1, "params": {}}, + headers=headers, + ) + # Should connect successfully with custom host + assert response.status_code == 200 + finally: + process.terminate() + process.join() + + +@pytest.mark.anyio +async def test_streamable_http_security_get_request(server_port: int): + """Test StreamableHTTP GET request with security.""" + security_settings = TransportSecuritySettings(enable_dns_rebinding_protection=True, allowed_hosts=["127.0.0.1"]) + process = start_server_process(server_port, security_settings) + + try: + # Test GET request with invalid host header + headers = { + "Host": "evil.com", + "Accept": "text/event-stream", + } + + async with httpx.AsyncClient(timeout=5.0) as client: + response = await client.get(f"http://127.0.0.1:{server_port}/", headers=headers) + assert response.status_code == 421 + assert response.text == "Invalid Host header" + + # Test GET request with valid host header + headers = { + "Host": "127.0.0.1", + "Accept": "text/event-stream", + } + + async with httpx.AsyncClient(timeout=5.0) as client: + # GET requests need a session ID in StreamableHTTP + # So it will fail with "Missing session ID" not security error + response = await client.get(f"http://127.0.0.1:{server_port}/", headers=headers) + # This should pass security but fail on session validation + assert response.status_code == 400 + body = response.json() + assert "Missing session ID" in body["error"]["message"] + + finally: + process.terminate() + process.join() diff --git a/tests/shared/test_auth_utils.py b/tests/shared/test_auth_utils.py new file mode 100644 index 000000000..5b12dc677 --- /dev/null +++ b/tests/shared/test_auth_utils.py @@ -0,0 +1,112 @@ +"""Tests for OAuth 2.0 Resource Indicators utilities.""" + +from mcp.shared.auth_utils import check_resource_allowed, resource_url_from_server_url + + +class TestResourceUrlFromServerUrl: + 
"""Tests for resource_url_from_server_url function.""" + + def test_removes_fragment(self): + """Fragment should be removed per RFC 8707.""" + assert resource_url_from_server_url("https://example.com/path#fragment") == "https://example.com/path" + assert resource_url_from_server_url("https://example.com/#fragment") == "https://example.com/" + + def test_preserves_path(self): + """Path should be preserved.""" + assert ( + resource_url_from_server_url("https://example.com/path/to/resource") + == "https://example.com/path/to/resource" + ) + assert resource_url_from_server_url("https://example.com/") == "https://example.com/" + assert resource_url_from_server_url("https://example.com") == "https://example.com" + + def test_preserves_query(self): + """Query parameters should be preserved.""" + assert resource_url_from_server_url("https://example.com/path?foo=bar") == "https://example.com/path?foo=bar" + assert resource_url_from_server_url("https://example.com/?key=value") == "https://example.com/?key=value" + + def test_preserves_port(self): + """Non-default ports should be preserved.""" + assert resource_url_from_server_url("https://example.com:8443/path") == "https://example.com:8443/path" + assert resource_url_from_server_url("http://example.com:8080/") == "http://example.com:8080/" + + def test_lowercase_scheme_and_host(self): + """Scheme and host should be lowercase for canonical form.""" + assert resource_url_from_server_url("HTTPS://EXAMPLE.COM/path") == "https://example.com/path" + assert resource_url_from_server_url("Http://Example.Com:8080/") == "http://example.com:8080/" + + def test_handles_pydantic_urls(self): + """Should handle Pydantic URL types.""" + from pydantic import HttpUrl + + url = HttpUrl("https://example.com/path") + assert resource_url_from_server_url(url) == "https://example.com/path" + + +class TestCheckResourceAllowed: + """Tests for check_resource_allowed function.""" + + def test_identical_urls(self): + """Identical URLs should match.""" + assert check_resource_allowed("https://example.com/path", "https://example.com/path") is True + assert check_resource_allowed("https://example.com/", "https://example.com/") is True + assert check_resource_allowed("https://example.com", "https://example.com") is True + + def test_different_schemes(self): + """Different schemes should not match.""" + assert check_resource_allowed("https://example.com/path", "http://example.com/path") is False + assert check_resource_allowed("http://example.com/", "https://example.com/") is False + + def test_different_domains(self): + """Different domains should not match.""" + assert check_resource_allowed("https://example.com/path", "https://example.org/path") is False + assert check_resource_allowed("https://sub.example.com/", "https://example.com/") is False + + def test_different_ports(self): + """Different ports should not match.""" + assert check_resource_allowed("https://example.com:8443/path", "https://example.com/path") is False + assert check_resource_allowed("https://example.com:8080/", "https://example.com:8443/") is False + + def test_hierarchical_matching(self): + """Child paths should match parent paths.""" + # Parent resource allows child resources + assert check_resource_allowed("https://example.com/api/v1/users", "https://example.com/api") is True + assert check_resource_allowed("https://example.com/api/v1", "https://example.com/api") is True + assert check_resource_allowed("https://example.com/mcp/server", "https://example.com/mcp") is True + + # Exact match + assert 
check_resource_allowed("https://example.com/api", "https://example.com/api") is True + + # Parent cannot use child's token + assert check_resource_allowed("https://example.com/api", "https://example.com/api/v1") is False + assert check_resource_allowed("https://example.com/", "https://example.com/api") is False + + def test_path_boundary_matching(self): + """Path matching should respect boundaries.""" + # Should not match partial path segments + assert check_resource_allowed("https://example.com/apiextra", "https://example.com/api") is False + assert check_resource_allowed("https://example.com/api123", "https://example.com/api") is False + + # Should match with trailing slash + assert check_resource_allowed("https://example.com/api/", "https://example.com/api") is True + assert check_resource_allowed("https://example.com/api/v1", "https://example.com/api/") is True + + def test_trailing_slash_handling(self): + """Trailing slashes should be handled correctly.""" + # With and without trailing slashes + assert check_resource_allowed("https://example.com/api/", "https://example.com/api") is True + assert check_resource_allowed("https://example.com/api", "https://example.com/api/") is False + assert check_resource_allowed("https://example.com/api/v1", "https://example.com/api") is True + assert check_resource_allowed("https://example.com/api/v1", "https://example.com/api/") is True + + def test_case_insensitive_origin(self): + """Origin comparison should be case-insensitive.""" + assert check_resource_allowed("https://EXAMPLE.COM/path", "https://example.com/path") is True + assert check_resource_allowed("HTTPS://example.com/path", "https://example.com/path") is True + assert check_resource_allowed("https://Example.Com:8080/api", "https://example.com:8080/api") is True + + def test_empty_paths(self): + """Empty paths should be handled correctly.""" + assert check_resource_allowed("https://example.com", "https://example.com") is True + assert check_resource_allowed("https://example.com/", "https://example.com") is True + assert check_resource_allowed("https://example.com/api", "https://example.com") is True diff --git a/tests/shared/test_httpx_utils.py b/tests/shared/test_httpx_utils.py new file mode 100644 index 000000000..dcc6fd003 --- /dev/null +++ b/tests/shared/test_httpx_utils.py @@ -0,0 +1,24 @@ +"""Tests for httpx utility functions.""" + +import httpx + +from mcp.shared._httpx_utils import create_mcp_http_client + + +def test_default_settings(): + """Test that default settings are applied correctly.""" + client = create_mcp_http_client() + + assert client.follow_redirects is True + assert client.timeout.connect == 30.0 + + +def test_custom_parameters(): + """Test custom headers and timeout are set correctly.""" + headers = {"Authorization": "Bearer token"} + timeout = httpx.Timeout(60.0) + + client = create_mcp_http_client(headers, timeout) + + assert client.headers["Authorization"] == "Bearer token" + assert client.timeout.connect == 60.0 diff --git a/tests/shared/test_progress_notifications.py b/tests/shared/test_progress_notifications.py new file mode 100644 index 000000000..08bcb2662 --- /dev/null +++ b/tests/shared/test_progress_notifications.py @@ -0,0 +1,335 @@ +from typing import Any, cast + +import anyio +import pytest + +import mcp.types as types +from mcp.client.session import ClientSession +from mcp.server import Server +from mcp.server.lowlevel import NotificationOptions +from mcp.server.models import InitializationOptions +from mcp.server.session import ServerSession +from 
mcp.shared.context import RequestContext
+from mcp.shared.progress import progress
+from mcp.shared.session import (
+    BaseSession,
+    RequestResponder,
+    SessionMessage,
+)
+
+
+@pytest.mark.anyio
+async def test_bidirectional_progress_notifications():
+    """Test that both client and server can send progress notifications."""
+    # Create memory streams for client/server
+    server_to_client_send, server_to_client_receive = anyio.create_memory_object_stream[SessionMessage](5)
+    client_to_server_send, client_to_server_receive = anyio.create_memory_object_stream[SessionMessage](5)
+
+    # Run a server session so we can send progress updates from the tool handler
+    async def run_server():
+        # Create a server session
+        async with ServerSession(
+            client_to_server_receive,
+            server_to_client_send,
+            InitializationOptions(
+                server_name="ProgressTestServer",
+                server_version="0.1.0",
+                capabilities=server.get_capabilities(NotificationOptions(), {}),
+            ),
+        ) as server_session:
+            global serv_sesh
+
+            serv_sesh = server_session
+            async for message in server_session.incoming_messages:
+                try:
+                    await server._handle_message(message, server_session, ())
+                except Exception as e:
+                    raise e
+
+    # Track progress updates
+    server_progress_updates = []
+    client_progress_updates = []
+
+    # Progress tokens
+    server_progress_token = "server_token_123"
+    client_progress_token = "client_token_456"
+
+    # Create a server with progress capability
+    server = Server(name="ProgressTestServer")
+
+    # Register progress handler
+    @server.progress_notification()
+    async def handle_progress(
+        progress_token: str | int,
+        progress: float,
+        total: float | None,
+        message: str | None,
+    ):
+        server_progress_updates.append(
+            {
+                "token": progress_token,
+                "progress": progress,
+                "total": total,
+                "message": message,
+            }
+        )
+
+    # Register list tool handler
+    @server.list_tools()
+    async def handle_list_tools() -> list[types.Tool]:
+        return [
+            types.Tool(
+                name="test_tool",
+                description="A tool that sends progress notifications",
+                inputSchema={"type": "object", "properties": {}},
+            )
+        ]
+
+    # Register tool call handler
+    @server.call_tool()
+    async def handle_call_tool(name: str, arguments: dict | None) -> list:
+        # Make sure we received a progress token
+        if name == "test_tool":
+            if arguments and "_meta" in arguments:
+                progressToken = arguments["_meta"]["progressToken"]
+
+                if not progressToken:
+                    raise ValueError("Empty progress token received")
+
+                if progressToken != client_progress_token:
+                    raise ValueError("Server sending back incorrect progressToken")
+
+                # Send progress notifications
+                await serv_sesh.send_progress_notification(
+                    progress_token=progressToken,
+                    progress=0.25,
+                    total=1.0,
+                    message="Server progress 25%",
+                )
+
+                await serv_sesh.send_progress_notification(
+                    progress_token=progressToken,
+                    progress=0.5,
+                    total=1.0,
+                    message="Server progress 50%",
+                )
+
+                await serv_sesh.send_progress_notification(
+                    progress_token=progressToken,
+                    progress=1.0,
+                    total=1.0,
+                    message="Server progress 100%",
+                )
+
+            else:
+                raise ValueError("Progress token not sent.")
+
+            return ["Tool executed successfully"]
+
+        raise ValueError(f"Unknown tool: {name}")
+
+    # Client message handler to store progress notifications
+    async def handle_client_message(
+        message: RequestResponder[types.ServerRequest, types.ClientResult] | types.ServerNotification | Exception,
+    ) -> None:
+        if isinstance(message, Exception):
+            raise message
+
+        if isinstance(message, types.ServerNotification):
+            if isinstance(message.root, types.ProgressNotification):
+                params = message.root.params
+                client_progress_updates.append(
+                    {
+                        "token": params.progressToken,
+                        "progress": params.progress,
+                        "total": params.total,
+                        "message": 
params.message, + } + ) + + # Test using client + async with ( + ClientSession( + server_to_client_receive, + client_to_server_send, + message_handler=handle_client_message, + ) as client_session, + anyio.create_task_group() as tg, + ): + # Start the server in a background task + tg.start_soon(run_server) + + # Initialize the client connection + await client_session.initialize() + + # Call list_tools with progress token + await client_session.list_tools() + + # Call test_tool with progress token + await client_session.call_tool("test_tool", {"_meta": {"progressToken": client_progress_token}}) + + # Send progress notifications from client to server + await client_session.send_progress_notification( + progress_token=server_progress_token, + progress=0.33, + total=1.0, + message="Client progress 33%", + ) + + await client_session.send_progress_notification( + progress_token=server_progress_token, + progress=0.66, + total=1.0, + message="Client progress 66%", + ) + + await client_session.send_progress_notification( + progress_token=server_progress_token, + progress=1.0, + total=1.0, + message="Client progress 100%", + ) + + # Wait and exit + await anyio.sleep(0.5) + tg.cancel_scope.cancel() + + # Verify client received progress updates from server + assert len(client_progress_updates) == 3 + assert client_progress_updates[0]["token"] == client_progress_token + assert client_progress_updates[0]["progress"] == 0.25 + assert client_progress_updates[0]["message"] == "Server progress 25%" + assert client_progress_updates[2]["progress"] == 1.0 + + # Verify server received progress updates from client + assert len(server_progress_updates) == 3 + assert server_progress_updates[0]["token"] == server_progress_token + assert server_progress_updates[0]["progress"] == 0.33 + assert server_progress_updates[0]["message"] == "Client progress 33%" + assert server_progress_updates[2]["progress"] == 1.0 + + +@pytest.mark.anyio +async def test_progress_context_manager(): + """Test client using progress context manager for sending progress notifications.""" + # Create memory streams for client/server + server_to_client_send, server_to_client_receive = anyio.create_memory_object_stream[SessionMessage](5) + client_to_server_send, client_to_server_receive = anyio.create_memory_object_stream[SessionMessage](5) + + # Track progress updates + server_progress_updates = [] + + server = Server(name="ProgressContextTestServer") + + # Register progress handler + @server.progress_notification() + async def handle_progress( + progress_token: str | int, + progress: float, + total: float | None, + message: str | None, + ): + server_progress_updates.append( + { + "token": progress_token, + "progress": progress, + "total": total, + "message": message, + } + ) + + # Run server session to receive progress updates + async def run_server(): + # Create a server session + async with ServerSession( + client_to_server_receive, + server_to_client_send, + InitializationOptions( + server_name="ProgressContextTestServer", + server_version="0.1.0", + capabilities=server.get_capabilities(NotificationOptions(), {}), + ), + ) as server_session: + async for message in server_session.incoming_messages: + try: + await server._handle_message(message, server_session, ()) + except Exception as e: + raise e + + # Client message handler + async def handle_client_message( + message: RequestResponder[types.ServerRequest, types.ClientResult] | types.ServerNotification | Exception, + ) -> None: + if isinstance(message, Exception): + raise message + + # run 
client session + async with ( + ClientSession( + server_to_client_receive, + client_to_server_send, + message_handler=handle_client_message, + ) as client_session, + anyio.create_task_group() as tg, + ): + tg.start_soon(run_server) + + await client_session.initialize() + + progress_token = "client_token_456" + + # Create request context + meta = types.RequestParams.Meta(progressToken=progress_token) + request_context = RequestContext( + request_id="test-request", + session=client_session, + meta=meta, + lifespan_context=None, + ) + + # cast for type checker + typed_context = cast( + RequestContext[ + BaseSession[Any, Any, Any, Any, Any], + Any, + ], + request_context, + ) + + # Utilize progress context manager + with progress(typed_context, total=100) as p: + await p.progress(10, message="Loading configuration...") + await p.progress(30, message="Connecting to database...") + await p.progress(40, message="Fetching data...") + await p.progress(20, message="Processing results...") + + # Wait for all messages to be processed + await anyio.sleep(0.5) + tg.cancel_scope.cancel() + + # Verify progress updates were received by server + assert len(server_progress_updates) == 4 + + # first update + assert server_progress_updates[0]["token"] == progress_token + assert server_progress_updates[0]["progress"] == 10 + assert server_progress_updates[0]["total"] == 100 + assert server_progress_updates[0]["message"] == "Loading configuration..." + + # second update + assert server_progress_updates[1]["token"] == progress_token + assert server_progress_updates[1]["progress"] == 40 + assert server_progress_updates[1]["total"] == 100 + assert server_progress_updates[1]["message"] == "Connecting to database..." + + # third update + assert server_progress_updates[2]["token"] == progress_token + assert server_progress_updates[2]["progress"] == 80 + assert server_progress_updates[2]["total"] == 100 + assert server_progress_updates[2]["message"] == "Fetching data..." + + # final update + assert server_progress_updates[3]["token"] == progress_token + assert server_progress_updates[3]["progress"] == 100 + assert server_progress_updates[3]["total"] == 100 + assert server_progress_updates[3]["message"] == "Processing results..." 
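The low-level plumbing exercised above (`send_progress_notification`, the `@server.progress_notification()` handler, and the `progress()` context manager) is what the high-level FastMCP `Context.report_progress` helper drives for server-side tools. As a rough illustrative sketch only (the `progress-demo` server and `process_items` tool are hypothetical and not part of this change), a tool that forwards progress to the caller might look like:

```python
from mcp.server.fastmcp import Context, FastMCP

mcp = FastMCP("progress-demo")


@mcp.tool()
async def process_items(items: list[str], ctx: Context) -> str:
    """Process items one at a time, reporting progress to the client."""
    total = len(items)
    for index, _item in enumerate(items, start=1):
        # Forwarded to the client as a notifications/progress message when the
        # originating request carried a progressToken in its _meta field.
        await ctx.report_progress(progress=index, total=total)
    return f"Processed {total} items"
```

A client would observe these updates through the same `message_handler` pattern used in the tests above, receiving one `ProgressNotification` per reported step.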
diff --git a/tests/shared/test_session.py b/tests/shared/test_session.py index 59cb30c86..864e0d1b4 100644 --- a/tests/shared/test_session.py +++ b/tests/shared/test_session.py @@ -7,7 +7,10 @@ from mcp.client.session import ClientSession from mcp.server.lowlevel.server import Server from mcp.shared.exceptions import McpError -from mcp.shared.memory import create_connected_server_and_client_session +from mcp.shared.memory import ( + create_client_server_memory_streams, + create_connected_server_and_client_session, +) from mcp.types import ( CancelledNotification, CancelledNotificationParams, @@ -87,9 +90,7 @@ async def make_request(client_session): ClientRequest( types.CallToolRequest( method="tools/call", - params=types.CallToolRequestParams( - name="slow_tool", arguments={} - ), + params=types.CallToolRequestParams(name="slow_tool", arguments={}), ) ), types.CallToolResult, @@ -100,9 +101,7 @@ async def make_request(client_session): assert "Request cancelled" in str(e) ev_cancelled.set() - async with create_connected_server_and_client_session( - make_server() - ) as client_session: + async with create_connected_server_and_client_session(make_server()) as client_session: async with anyio.create_task_group() as tg: tg.start_soon(make_request, client_session) @@ -124,3 +123,57 @@ async def make_request(client_session): # Give cancellation time to process with anyio.fail_after(1): await ev_cancelled.wait() + + +@pytest.mark.anyio +async def test_connection_closed(): + """ + Test that pending requests are cancelled when the connection is closed remotely. + """ + + ev_closed = anyio.Event() + ev_response = anyio.Event() + + async with create_client_server_memory_streams() as ( + client_streams, + server_streams, + ): + client_read, client_write = client_streams + server_read, server_write = server_streams + + async def make_request(client_session): + """Send a request in a separate task""" + nonlocal ev_response + try: + # any request will do + await client_session.initialize() + pytest.fail("Request should have errored") + except McpError as e: + # Expected - request errored + assert "Connection closed" in str(e) + ev_response.set() + + async def mock_server(): + """Wait for a request, then close the connection""" + nonlocal ev_closed + # Wait for a request + await server_read.receive() + # Close the connection, as if the server exited + server_write.close() + server_read.close() + ev_closed.set() + + async with ( + anyio.create_task_group() as tg, + ClientSession( + read_stream=client_read, + write_stream=client_write, + ) as client_session, + ): + tg.start_soon(make_request, client_session) + tg.start_soon(mock_server) + + with anyio.fail_after(1): + await ev_closed.wait() + with anyio.fail_after(1): + await ev_response.wait() diff --git a/tests/shared/test_sse.py b/tests/shared/test_sse.py index f5158c3c3..8e1912e9b 100644 --- a/tests/shared/test_sse.py +++ b/tests/shared/test_sse.py @@ -1,3 +1,4 @@ +import json import multiprocessing import socket import time @@ -7,15 +8,19 @@ import httpx import pytest import uvicorn +from inline_snapshot import snapshot from pydantic import AnyUrl from starlette.applications import Starlette from starlette.requests import Request +from starlette.responses import Response from starlette.routing import Mount, Route +import mcp.types as types from mcp.client.session import ClientSession from mcp.client.sse import sse_client from mcp.server import Server from mcp.server.sse import SseServerTransport +from mcp.server.transport_security import 
TransportSecuritySettings from mcp.shared.exceptions import McpError from mcp.types import ( EmptyResult, @@ -56,11 +61,7 @@ async def handle_read_resource(uri: AnyUrl) -> str | bytes: await anyio.sleep(2.0) return f"Slow response from {uri.host}" - raise McpError( - error=ErrorData( - code=404, message="OOPS! no resource with that URI was found" - ) - ) + raise McpError(error=ErrorData(code=404, message="OOPS! no resource with that URI was found")) @self.list_tools() async def handle_list_tools() -> list[Tool]: @@ -80,16 +81,17 @@ async def handle_call_tool(name: str, args: dict) -> list[TextContent]: # Test fixtures def make_server_app() -> Starlette: """Create test Starlette app with SSE transport""" - sse = SseServerTransport("/messages/") + # Configure security with allowed hosts/origins for testing + security_settings = TransportSecuritySettings( + allowed_hosts=["127.0.0.1:*", "localhost:*"], allowed_origins=["http://127.0.0.1:*", "http://localhost:*"] + ) + sse = SseServerTransport("/messages/", security_settings=security_settings) server = ServerTest() - async def handle_sse(request: Request) -> None: - async with sse.connect_sse( - request.scope, request.receive, request._send - ) as streams: - await server.run( - streams[0], streams[1], server.create_initialization_options() - ) + async def handle_sse(request: Request) -> Response: + async with sse.connect_sse(request.scope, request.receive, request._send) as streams: + await server.run(streams[0], streams[1], server.create_initialization_options()) + return Response() app = Starlette( routes=[ @@ -103,11 +105,7 @@ async def handle_sse(request: Request) -> None: def run_server(server_port: int) -> None: app = make_server_app() - server = uvicorn.Server( - config=uvicorn.Config( - app=app, host="127.0.0.1", port=server_port, log_level="error" - ) - ) + server = uvicorn.Server(config=uvicorn.Config(app=app, host="127.0.0.1", port=server_port, log_level="error")) print(f"starting server on {server_port}") server.run() @@ -119,9 +117,7 @@ def run_server(server_port: int) -> None: @pytest.fixture() def server(server_port: int) -> Generator[None, None, None]: - proc = multiprocessing.Process( - target=run_server, kwargs={"server_port": server_port}, daemon=True - ) + proc = multiprocessing.Process(target=run_server, kwargs={"server_port": server_port}, daemon=True) print("starting process") proc.start() @@ -166,10 +162,7 @@ async def test_raw_sse_connection(http_client: httpx.AsyncClient) -> None: async def connection_test() -> None: async with http_client.stream("GET", "/sse") as response: assert response.status_code == 200 - assert ( - response.headers["content-type"] - == "text/event-stream; charset=utf-8" - ) + assert response.headers["content-type"] == "text/event-stream; charset=utf-8" line_number = 0 async for line in response.aiter_lines(): @@ -201,9 +194,7 @@ async def test_sse_client_basic_connection(server: None, server_url: str) -> Non @pytest.fixture -async def initialized_sse_client_session( - server, server_url: str -) -> AsyncGenerator[ClientSession, None]: +async def initialized_sse_client_session(server, server_url: str) -> AsyncGenerator[ClientSession, None]: async with sse_client(server_url + "/sse", sse_read_timeout=0.5) as streams: async with ClientSession(*streams) as session: await session.initialize() @@ -231,9 +222,7 @@ async def test_sse_client_exception_handling( @pytest.mark.anyio -@pytest.mark.skip( - "this test highlights a possible bug in SSE read timeout exception handling" -) +@pytest.mark.skip("this 
test highlights a possible bug in SSE read timeout exception handling") async def test_sse_client_timeout( initialized_sse_client_session: ClientSession, ) -> None: @@ -250,3 +239,237 @@ async def test_sse_client_timeout( return pytest.fail("the client should have timed out and returned an error already") + + +def run_mounted_server(server_port: int) -> None: + app = make_server_app() + main_app = Starlette(routes=[Mount("/mounted_app", app=app)]) + server = uvicorn.Server(config=uvicorn.Config(app=main_app, host="127.0.0.1", port=server_port, log_level="error")) + print(f"starting server on {server_port}") + server.run() + + # Give server time to start + while not server.started: + print("waiting for server to start") + time.sleep(0.5) + + +@pytest.fixture() +def mounted_server(server_port: int) -> Generator[None, None, None]: + proc = multiprocessing.Process(target=run_mounted_server, kwargs={"server_port": server_port}, daemon=True) + print("starting process") + proc.start() + + # Wait for server to be running + max_attempts = 20 + attempt = 0 + print("waiting for server to start") + while attempt < max_attempts: + try: + with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s: + s.connect(("127.0.0.1", server_port)) + break + except ConnectionRefusedError: + time.sleep(0.1) + attempt += 1 + else: + raise RuntimeError(f"Server failed to start after {max_attempts} attempts") + + yield + + print("killing server") + # Signal the server to stop + proc.kill() + proc.join(timeout=2) + if proc.is_alive(): + print("server process failed to terminate") + + +@pytest.mark.anyio +async def test_sse_client_basic_connection_mounted_app(mounted_server: None, server_url: str) -> None: + async with sse_client(server_url + "/mounted_app/sse") as streams: + async with ClientSession(*streams) as session: + # Test initialization + result = await session.initialize() + assert isinstance(result, InitializeResult) + assert result.serverInfo.name == SERVER_NAME + + # Test ping + ping_result = await session.send_ping() + assert isinstance(ping_result, EmptyResult) + + +# Test server with request context that returns headers in the response +class RequestContextServer(Server[object, Request]): + def __init__(self): + super().__init__("request_context_server") + + @self.call_tool() + async def handle_call_tool(name: str, args: dict) -> list[TextContent]: + headers_info = {} + context = self.request_context + if context.request: + headers_info = dict(context.request.headers) + + if name == "echo_headers": + return [TextContent(type="text", text=json.dumps(headers_info))] + elif name == "echo_context": + context_data = { + "request_id": args.get("request_id"), + "headers": headers_info, + } + return [TextContent(type="text", text=json.dumps(context_data))] + + return [TextContent(type="text", text=f"Called {name}")] + + @self.list_tools() + async def handle_list_tools() -> list[Tool]: + return [ + Tool( + name="echo_headers", + description="Echoes request headers", + inputSchema={"type": "object", "properties": {}}, + ), + Tool( + name="echo_context", + description="Echoes request context", + inputSchema={ + "type": "object", + "properties": {"request_id": {"type": "string"}}, + "required": ["request_id"], + }, + ), + ] + + +def run_context_server(server_port: int) -> None: + """Run a server that captures request context""" + # Configure security with allowed hosts/origins for testing + security_settings = TransportSecuritySettings( + allowed_hosts=["127.0.0.1:*", "localhost:*"], 
allowed_origins=["http://127.0.0.1:*", "http://localhost:*"] + ) + sse = SseServerTransport("/messages/", security_settings=security_settings) + context_server = RequestContextServer() + + async def handle_sse(request: Request) -> Response: + async with sse.connect_sse(request.scope, request.receive, request._send) as streams: + await context_server.run(streams[0], streams[1], context_server.create_initialization_options()) + return Response() + + app = Starlette( + routes=[ + Route("/sse", endpoint=handle_sse), + Mount("/messages/", app=sse.handle_post_message), + ] + ) + + server = uvicorn.Server(config=uvicorn.Config(app=app, host="127.0.0.1", port=server_port, log_level="error")) + print(f"starting context server on {server_port}") + server.run() + + +@pytest.fixture() +def context_server(server_port: int) -> Generator[None, None, None]: + """Fixture that provides a server with request context capture""" + proc = multiprocessing.Process(target=run_context_server, kwargs={"server_port": server_port}, daemon=True) + print("starting context server process") + proc.start() + + # Wait for server to be running + max_attempts = 20 + attempt = 0 + print("waiting for context server to start") + while attempt < max_attempts: + try: + with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s: + s.connect(("127.0.0.1", server_port)) + break + except ConnectionRefusedError: + time.sleep(0.1) + attempt += 1 + else: + raise RuntimeError(f"Context server failed to start after {max_attempts} attempts") + + yield + + print("killing context server") + proc.kill() + proc.join(timeout=2) + if proc.is_alive(): + print("context server process failed to terminate") + + +@pytest.mark.anyio +async def test_request_context_propagation(context_server: None, server_url: str) -> None: + """Test that request context is properly propagated through SSE transport.""" + # Test with custom headers + custom_headers = { + "Authorization": "Bearer test-token", + "X-Custom-Header": "test-value", + "X-Trace-Id": "trace-123", + } + + async with sse_client(server_url + "/sse", headers=custom_headers) as ( + read_stream, + write_stream, + ): + async with ClientSession(read_stream, write_stream) as session: + # Initialize the session + result = await session.initialize() + assert isinstance(result, InitializeResult) + + # Call the tool that echoes headers back + tool_result = await session.call_tool("echo_headers", {}) + + # Parse the JSON response + + assert len(tool_result.content) == 1 + headers_data = json.loads(tool_result.content[0].text if tool_result.content[0].type == "text" else "{}") + + # Verify headers were propagated + assert headers_data.get("authorization") == "Bearer test-token" + assert headers_data.get("x-custom-header") == "test-value" + assert headers_data.get("x-trace-id") == "trace-123" + + +@pytest.mark.anyio +async def test_request_context_isolation(context_server: None, server_url: str) -> None: + """Test that request contexts are isolated between different SSE clients.""" + contexts = [] + + # Create multiple clients with different headers + for i in range(3): + headers = {"X-Request-Id": f"request-{i}", "X-Custom-Value": f"value-{i}"} + + async with sse_client(server_url + "/sse", headers=headers) as ( + read_stream, + write_stream, + ): + async with ClientSession(read_stream, write_stream) as session: + await session.initialize() + + # Call the tool that echoes context + tool_result = await session.call_tool("echo_context", {"request_id": f"request-{i}"}) + + assert len(tool_result.content) == 1 + 
context_data = json.loads( + tool_result.content[0].text if tool_result.content[0].type == "text" else "{}" + ) + contexts.append(context_data) + + # Verify each request had its own context + assert len(contexts) == 3 + for i, ctx in enumerate(contexts): + assert ctx["request_id"] == f"request-{i}" + assert ctx["headers"].get("x-request-id") == f"request-{i}" + assert ctx["headers"].get("x-custom-value") == f"value-{i}" + + +def test_sse_message_id_coercion(): + """Test that string message IDs that look like integers are parsed as integers. + + See for more details. + """ + json_message = '{"jsonrpc": "2.0", "id": "123", "method": "ping", "params": null}' + msg = types.JSONRPCMessage.model_validate_json(json_message) + assert msg == snapshot(types.JSONRPCMessage(root=types.JSONRPCRequest(method="ping", jsonrpc="2.0", id=123))) diff --git a/tests/shared/test_streamable_http.py b/tests/shared/test_streamable_http.py new file mode 100644 index 000000000..1ffcc13b0 --- /dev/null +++ b/tests/shared/test_streamable_http.py @@ -0,0 +1,1566 @@ +""" +Tests for the StreamableHTTP server and client transport. + +Contains tests for both server and client sides of the StreamableHTTP transport. +""" + +import json +import multiprocessing +import socket +import time +from collections.abc import Generator +from typing import Any + +import anyio +import httpx +import pytest +import requests +import uvicorn +from pydantic import AnyUrl +from starlette.applications import Starlette +from starlette.requests import Request +from starlette.routing import Mount + +import mcp.types as types +from mcp.client.session import ClientSession +from mcp.client.streamable_http import streamablehttp_client +from mcp.server import Server +from mcp.server.streamable_http import ( + MCP_PROTOCOL_VERSION_HEADER, + MCP_SESSION_ID_HEADER, + SESSION_ID_PATTERN, + EventCallback, + EventId, + EventMessage, + EventStore, + StreamableHTTPServerTransport, + StreamId, +) +from mcp.server.streamable_http_manager import StreamableHTTPSessionManager +from mcp.server.transport_security import TransportSecuritySettings +from mcp.shared.context import RequestContext +from mcp.shared.exceptions import McpError +from mcp.shared.message import ( + ClientMessageMetadata, +) +from mcp.shared.session import RequestResponder +from mcp.types import ( + InitializeResult, + TextContent, + TextResourceContents, + Tool, +) + +# Test constants +SERVER_NAME = "test_streamable_http_server" +TEST_SESSION_ID = "test-session-id-12345" +INIT_REQUEST = { + "jsonrpc": "2.0", + "method": "initialize", + "params": { + "clientInfo": {"name": "test-client", "version": "1.0"}, + "protocolVersion": "2025-03-26", + "capabilities": {}, + }, + "id": "init-1", +} + + +# Helper functions +def extract_protocol_version_from_sse(response: requests.Response) -> str: + """Extract the negotiated protocol version from an SSE initialization response.""" + assert response.headers.get("Content-Type") == "text/event-stream" + for line in response.text.splitlines(): + if line.startswith("data: "): + init_data = json.loads(line[6:]) + return init_data["result"]["protocolVersion"] + raise ValueError("Could not extract protocol version from SSE response") + + +# Simple in-memory event store for testing +class SimpleEventStore(EventStore): + """Simple in-memory event store for testing.""" + + def __init__(self): + self._events: list[tuple[StreamId, EventId, types.JSONRPCMessage]] = [] + self._event_id_counter = 0 + + async def store_event(self, stream_id: StreamId, message: 
types.JSONRPCMessage) -> EventId: + """Store an event and return its ID.""" + self._event_id_counter += 1 + event_id = str(self._event_id_counter) + self._events.append((stream_id, event_id, message)) + return event_id + + async def replay_events_after( + self, + last_event_id: EventId, + send_callback: EventCallback, + ) -> StreamId | None: + """Replay events after the specified ID.""" + # Find the index of the last event ID + start_index = None + for i, (_, event_id, _) in enumerate(self._events): + if event_id == last_event_id: + start_index = i + 1 + break + + if start_index is None: + # If event ID not found, start from beginning + start_index = 0 + + stream_id = None + # Replay events + for _, event_id, message in self._events[start_index:]: + await send_callback(EventMessage(message, event_id)) + # Capture the stream ID from the first replayed event + if stream_id is None and len(self._events) > start_index: + stream_id = self._events[start_index][0] + + return stream_id + + +# Test server implementation that follows MCP protocol +class ServerTest(Server): + def __init__(self): + super().__init__(SERVER_NAME) + + @self.read_resource() + async def handle_read_resource(uri: AnyUrl) -> str | bytes: + if uri.scheme == "foobar": + return f"Read {uri.host}" + elif uri.scheme == "slow": + # Simulate a slow resource + await anyio.sleep(2.0) + return f"Slow response from {uri.host}" + + raise ValueError(f"Unknown resource: {uri}") + + @self.list_tools() + async def handle_list_tools() -> list[Tool]: + return [ + Tool( + name="test_tool", + description="A test tool", + inputSchema={"type": "object", "properties": {}}, + ), + Tool( + name="test_tool_with_standalone_notification", + description="A test tool that sends a notification", + inputSchema={"type": "object", "properties": {}}, + ), + Tool( + name="long_running_with_checkpoints", + description="A long-running tool that sends periodic notifications", + inputSchema={"type": "object", "properties": {}}, + ), + Tool( + name="test_sampling_tool", + description="A tool that triggers server-side sampling", + inputSchema={"type": "object", "properties": {}}, + ), + ] + + @self.call_tool() + async def handle_call_tool(name: str, args: dict) -> list[TextContent]: + ctx = self.request_context + + # When the tool is called, send a notification to test GET stream + if name == "test_tool_with_standalone_notification": + await ctx.session.send_resource_updated(uri=AnyUrl("http://test_resource")) + return [TextContent(type="text", text=f"Called {name}")] + + elif name == "long_running_with_checkpoints": + # Send notifications that are part of the response stream + # This simulates a long-running tool that sends logs + + await ctx.session.send_log_message( + level="info", + data="Tool started", + logger="tool", + related_request_id=ctx.request_id, # need for stream association + ) + + await anyio.sleep(0.1) + + await ctx.session.send_log_message( + level="info", + data="Tool is almost done", + logger="tool", + related_request_id=ctx.request_id, + ) + + return [TextContent(type="text", text="Completed!")] + + elif name == "test_sampling_tool": + # Test sampling by requesting the client to sample a message + sampling_result = await ctx.session.create_message( + messages=[ + types.SamplingMessage( + role="user", + content=types.TextContent(type="text", text="Server needs client sampling"), + ) + ], + max_tokens=100, + related_request_id=ctx.request_id, + ) + + # Return the sampling result in the tool response + response = sampling_result.content.text if 
sampling_result.content.type == "text" else None + return [ + TextContent( + type="text", + text=f"Response from sampling: {response}", + ) + ] + + return [TextContent(type="text", text=f"Called {name}")] + + +def create_app(is_json_response_enabled=False, event_store: EventStore | None = None) -> Starlette: + """Create a Starlette application for testing using the session manager. + + Args: + is_json_response_enabled: If True, use JSON responses instead of SSE streams. + event_store: Optional event store for testing resumability. + """ + # Create server instance + server = ServerTest() + + # Create the session manager + security_settings = TransportSecuritySettings( + allowed_hosts=["127.0.0.1:*", "localhost:*"], allowed_origins=["http://127.0.0.1:*", "http://localhost:*"] + ) + session_manager = StreamableHTTPSessionManager( + app=server, + event_store=event_store, + json_response=is_json_response_enabled, + security_settings=security_settings, + ) + + # Create an ASGI application that uses the session manager + app = Starlette( + debug=True, + routes=[ + Mount("/mcp", app=session_manager.handle_request), + ], + lifespan=lambda app: session_manager.run(), + ) + + return app + + +def run_server(port: int, is_json_response_enabled=False, event_store: EventStore | None = None) -> None: + """Run the test server. + + Args: + port: Port to listen on. + is_json_response_enabled: If True, use JSON responses instead of SSE streams. + event_store: Optional event store for testing resumability. + """ + + app = create_app(is_json_response_enabled, event_store) + # Configure server + config = uvicorn.Config( + app=app, + host="127.0.0.1", + port=port, + log_level="info", + limit_concurrency=10, + timeout_keep_alive=5, + access_log=False, + ) + + # Start the server + server = uvicorn.Server(config=config) + + # This is important to catch exceptions and prevent test hangs + try: + server.run() + except Exception: + import traceback + + traceback.print_exc() + + +# Test fixtures - using same approach as SSE tests +@pytest.fixture +def basic_server_port() -> int: + """Find an available port for the basic server.""" + with socket.socket() as s: + s.bind(("127.0.0.1", 0)) + return s.getsockname()[1] + + +@pytest.fixture +def json_server_port() -> int: + """Find an available port for the JSON response server.""" + with socket.socket() as s: + s.bind(("127.0.0.1", 0)) + return s.getsockname()[1] + + +@pytest.fixture +def basic_server(basic_server_port: int) -> Generator[None, None, None]: + """Start a basic server.""" + proc = multiprocessing.Process(target=run_server, kwargs={"port": basic_server_port}, daemon=True) + proc.start() + + # Wait for server to be running + max_attempts = 20 + attempt = 0 + while attempt < max_attempts: + try: + with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s: + s.connect(("127.0.0.1", basic_server_port)) + break + except ConnectionRefusedError: + time.sleep(0.1) + attempt += 1 + else: + raise RuntimeError(f"Server failed to start after {max_attempts} attempts") + + yield + + # Clean up + proc.kill() + proc.join(timeout=2) + + +@pytest.fixture +def event_store() -> SimpleEventStore: + """Create a test event store.""" + return SimpleEventStore() + + +@pytest.fixture +def event_server_port() -> int: + """Find an available port for the event store server.""" + with socket.socket() as s: + s.bind(("127.0.0.1", 0)) + return s.getsockname()[1] + + +@pytest.fixture +def event_server( + event_server_port: int, event_store: SimpleEventStore +) -> 
Generator[tuple[SimpleEventStore, str], None, None]: + """Start a server with event store enabled.""" + proc = multiprocessing.Process( + target=run_server, + kwargs={"port": event_server_port, "event_store": event_store}, + daemon=True, + ) + proc.start() + + # Wait for server to be running + max_attempts = 20 + attempt = 0 + while attempt < max_attempts: + try: + with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s: + s.connect(("127.0.0.1", event_server_port)) + break + except ConnectionRefusedError: + time.sleep(0.1) + attempt += 1 + else: + raise RuntimeError(f"Server failed to start after {max_attempts} attempts") + + yield event_store, f"http://127.0.0.1:{event_server_port}" + + # Clean up + proc.kill() + proc.join(timeout=2) + + +@pytest.fixture +def json_response_server(json_server_port: int) -> Generator[None, None, None]: + """Start a server with JSON response enabled.""" + proc = multiprocessing.Process( + target=run_server, + kwargs={"port": json_server_port, "is_json_response_enabled": True}, + daemon=True, + ) + proc.start() + + # Wait for server to be running + max_attempts = 20 + attempt = 0 + while attempt < max_attempts: + try: + with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s: + s.connect(("127.0.0.1", json_server_port)) + break + except ConnectionRefusedError: + time.sleep(0.1) + attempt += 1 + else: + raise RuntimeError(f"Server failed to start after {max_attempts} attempts") + + yield + + # Clean up + proc.kill() + proc.join(timeout=2) + + +@pytest.fixture +def basic_server_url(basic_server_port: int) -> str: + """Get the URL for the basic test server.""" + return f"http://127.0.0.1:{basic_server_port}" + + +@pytest.fixture +def json_server_url(json_server_port: int) -> str: + """Get the URL for the JSON response test server.""" + return f"http://127.0.0.1:{json_server_port}" + + +# Basic request validation tests +def test_accept_header_validation(basic_server, basic_server_url): + """Test that Accept header is properly validated.""" + # Test without Accept header + response = requests.post( + f"{basic_server_url}/mcp", + headers={"Content-Type": "application/json"}, + json={"jsonrpc": "2.0", "method": "initialize", "id": 1}, + ) + assert response.status_code == 406 + assert "Not Acceptable" in response.text + + +def test_content_type_validation(basic_server, basic_server_url): + """Test that Content-Type header is properly validated.""" + # Test with incorrect Content-Type + response = requests.post( + f"{basic_server_url}/mcp", + headers={ + "Accept": "application/json, text/event-stream", + "Content-Type": "text/plain", + }, + data="This is not JSON", + ) + + assert response.status_code == 400 + assert "Invalid Content-Type" in response.text + + +def test_json_validation(basic_server, basic_server_url): + """Test that JSON content is properly validated.""" + # Test with invalid JSON + response = requests.post( + f"{basic_server_url}/mcp", + headers={ + "Accept": "application/json, text/event-stream", + "Content-Type": "application/json", + }, + data="this is not valid json", + ) + assert response.status_code == 400 + assert "Parse error" in response.text + + +def test_json_parsing(basic_server, basic_server_url): + """Test that JSON content is properly parse.""" + # Test with valid JSON but invalid JSON-RPC + response = requests.post( + f"{basic_server_url}/mcp", + headers={ + "Accept": "application/json, text/event-stream", + "Content-Type": "application/json", + }, + json={"foo": "bar"}, + ) + assert response.status_code == 400 + assert 
"Validation error" in response.text + + +def test_method_not_allowed(basic_server, basic_server_url): + """Test that unsupported HTTP methods are rejected.""" + # Test with unsupported method (PUT) + response = requests.put( + f"{basic_server_url}/mcp", + headers={ + "Accept": "application/json, text/event-stream", + "Content-Type": "application/json", + }, + json={"jsonrpc": "2.0", "method": "initialize", "id": 1}, + ) + assert response.status_code == 405 + assert "Method Not Allowed" in response.text + + +def test_session_validation(basic_server, basic_server_url): + """Test session ID validation.""" + # session_id not used directly in this test + + # Test without session ID + response = requests.post( + f"{basic_server_url}/mcp", + headers={ + "Accept": "application/json, text/event-stream", + "Content-Type": "application/json", + }, + json={"jsonrpc": "2.0", "method": "list_tools", "id": 1}, + ) + assert response.status_code == 400 + assert "Missing session ID" in response.text + + +def test_session_id_pattern(): + """Test that SESSION_ID_PATTERN correctly validates session IDs.""" + # Valid session IDs (visible ASCII characters from 0x21 to 0x7E) + valid_session_ids = [ + "test-session-id", + "1234567890", + "session!@#$%^&*()_+-=[]{}|;:,.<>?/", + "~`", + ] + + for session_id in valid_session_ids: + assert SESSION_ID_PATTERN.match(session_id) is not None + # Ensure fullmatch matches too (whole string) + assert SESSION_ID_PATTERN.fullmatch(session_id) is not None + + # Invalid session IDs + invalid_session_ids = [ + "", # Empty string + " test", # Space (0x20) + "test\t", # Tab + "test\n", # Newline + "test\r", # Carriage return + "test" + chr(0x7F), # DEL character + "test" + chr(0x80), # Extended ASCII + "test" + chr(0x00), # Null character + "test" + chr(0x20), # Space (0x20) + ] + + for session_id in invalid_session_ids: + # For invalid IDs, either match will fail or fullmatch will fail + if SESSION_ID_PATTERN.match(session_id) is not None: + # If match succeeds, fullmatch should fail (partial match case) + assert SESSION_ID_PATTERN.fullmatch(session_id) is None + + +def test_streamable_http_transport_init_validation(): + """Test that StreamableHTTPServerTransport validates session ID on init.""" + # Valid session ID should initialize without errors + valid_transport = StreamableHTTPServerTransport(mcp_session_id="valid-id") + assert valid_transport.mcp_session_id == "valid-id" + + # None should be accepted + none_transport = StreamableHTTPServerTransport(mcp_session_id=None) + assert none_transport.mcp_session_id is None + + # Invalid session ID should raise ValueError + with pytest.raises(ValueError) as excinfo: + StreamableHTTPServerTransport(mcp_session_id="invalid id with space") + assert "Session ID must only contain visible ASCII characters" in str(excinfo.value) + + # Test with control characters + with pytest.raises(ValueError): + StreamableHTTPServerTransport(mcp_session_id="test\nid") + + with pytest.raises(ValueError): + StreamableHTTPServerTransport(mcp_session_id="test\n") + + +def test_session_termination(basic_server, basic_server_url): + """Test session termination via DELETE and subsequent request handling.""" + response = requests.post( + f"{basic_server_url}/mcp", + headers={ + "Accept": "application/json, text/event-stream", + "Content-Type": "application/json", + }, + json=INIT_REQUEST, + ) + assert response.status_code == 200 + + # Extract negotiated protocol version from SSE response + negotiated_version = extract_protocol_version_from_sse(response) + + # 
Now terminate the session + session_id = response.headers.get(MCP_SESSION_ID_HEADER) + response = requests.delete( + f"{basic_server_url}/mcp", + headers={ + MCP_SESSION_ID_HEADER: session_id, + MCP_PROTOCOL_VERSION_HEADER: negotiated_version, + }, + ) + assert response.status_code == 200 + + # Try to use the terminated session + response = requests.post( + f"{basic_server_url}/mcp", + headers={ + "Accept": "application/json, text/event-stream", + "Content-Type": "application/json", + MCP_SESSION_ID_HEADER: session_id, + }, + json={"jsonrpc": "2.0", "method": "ping", "id": 2}, + ) + assert response.status_code == 404 + assert "Session has been terminated" in response.text + + +def test_response(basic_server, basic_server_url): + """Test response handling for a valid request.""" + mcp_url = f"{basic_server_url}/mcp" + response = requests.post( + mcp_url, + headers={ + "Accept": "application/json, text/event-stream", + "Content-Type": "application/json", + }, + json=INIT_REQUEST, + ) + assert response.status_code == 200 + + # Extract negotiated protocol version from SSE response + negotiated_version = extract_protocol_version_from_sse(response) + + # Now get the session ID + session_id = response.headers.get(MCP_SESSION_ID_HEADER) + + # Try to use the session with proper headers + tools_response = requests.post( + mcp_url, + headers={ + "Accept": "application/json, text/event-stream", + "Content-Type": "application/json", + MCP_SESSION_ID_HEADER: session_id, # Use the session ID we got earlier + MCP_PROTOCOL_VERSION_HEADER: negotiated_version, + }, + json={"jsonrpc": "2.0", "method": "tools/list", "id": "tools-1"}, + stream=True, + ) + assert tools_response.status_code == 200 + assert tools_response.headers.get("Content-Type") == "text/event-stream" + + +def test_json_response(json_response_server, json_server_url): + """Test response handling when is_json_response_enabled is True.""" + mcp_url = f"{json_server_url}/mcp" + response = requests.post( + mcp_url, + headers={ + "Accept": "application/json, text/event-stream", + "Content-Type": "application/json", + }, + json=INIT_REQUEST, + ) + assert response.status_code == 200 + assert response.headers.get("Content-Type") == "application/json" + + +def test_get_sse_stream(basic_server, basic_server_url): + """Test establishing an SSE stream via GET request.""" + # First, we need to initialize a session + mcp_url = f"{basic_server_url}/mcp" + init_response = requests.post( + mcp_url, + headers={ + "Accept": "application/json, text/event-stream", + "Content-Type": "application/json", + }, + json=INIT_REQUEST, + ) + assert init_response.status_code == 200 + + # Get the session ID + session_id = init_response.headers.get(MCP_SESSION_ID_HEADER) + assert session_id is not None + + # Extract negotiated protocol version from SSE response + init_data = None + assert init_response.headers.get("Content-Type") == "text/event-stream" + for line in init_response.text.splitlines(): + if line.startswith("data: "): + init_data = json.loads(line[6:]) + break + assert init_data is not None + negotiated_version = init_data["result"]["protocolVersion"] + + # Now attempt to establish an SSE stream via GET + get_response = requests.get( + mcp_url, + headers={ + "Accept": "text/event-stream", + MCP_SESSION_ID_HEADER: session_id, + MCP_PROTOCOL_VERSION_HEADER: negotiated_version, + }, + stream=True, + ) + + # Verify we got a successful response with the right content type + assert get_response.status_code == 200 + assert get_response.headers.get("Content-Type") == 
"text/event-stream" + + # Test that a second GET request gets rejected (only one stream allowed) + second_get = requests.get( + mcp_url, + headers={ + "Accept": "text/event-stream", + MCP_SESSION_ID_HEADER: session_id, + MCP_PROTOCOL_VERSION_HEADER: negotiated_version, + }, + stream=True, + ) + + # Should get CONFLICT (409) since there's already a stream + # Note: This might fail if the first stream fully closed before this runs, + # but generally it should work in the test environment where it runs quickly + assert second_get.status_code == 409 + + +def test_get_validation(basic_server, basic_server_url): + """Test validation for GET requests.""" + # First, we need to initialize a session + mcp_url = f"{basic_server_url}/mcp" + init_response = requests.post( + mcp_url, + headers={ + "Accept": "application/json, text/event-stream", + "Content-Type": "application/json", + }, + json=INIT_REQUEST, + ) + assert init_response.status_code == 200 + + # Get the session ID + session_id = init_response.headers.get(MCP_SESSION_ID_HEADER) + assert session_id is not None + + # Extract negotiated protocol version from SSE response + init_data = None + assert init_response.headers.get("Content-Type") == "text/event-stream" + for line in init_response.text.splitlines(): + if line.startswith("data: "): + init_data = json.loads(line[6:]) + break + assert init_data is not None + negotiated_version = init_data["result"]["protocolVersion"] + + # Test without Accept header + response = requests.get( + mcp_url, + headers={ + MCP_SESSION_ID_HEADER: session_id, + MCP_PROTOCOL_VERSION_HEADER: negotiated_version, + }, + stream=True, + ) + assert response.status_code == 406 + assert "Not Acceptable" in response.text + + # Test with wrong Accept header + response = requests.get( + mcp_url, + headers={ + "Accept": "application/json", + MCP_SESSION_ID_HEADER: session_id, + MCP_PROTOCOL_VERSION_HEADER: negotiated_version, + }, + ) + assert response.status_code == 406 + assert "Not Acceptable" in response.text + + +# Client-specific fixtures +@pytest.fixture +async def http_client(basic_server, basic_server_url): + """Create test client matching the SSE test pattern.""" + async with httpx.AsyncClient(base_url=basic_server_url) as client: + yield client + + +@pytest.fixture +async def initialized_client_session(basic_server, basic_server_url): + """Create initialized StreamableHTTP client session.""" + async with streamablehttp_client(f"{basic_server_url}/mcp") as ( + read_stream, + write_stream, + _, + ): + async with ClientSession( + read_stream, + write_stream, + ) as session: + await session.initialize() + yield session + + +@pytest.mark.anyio +async def test_streamablehttp_client_basic_connection(basic_server, basic_server_url): + """Test basic client connection with initialization.""" + async with streamablehttp_client(f"{basic_server_url}/mcp") as ( + read_stream, + write_stream, + _, + ): + async with ClientSession( + read_stream, + write_stream, + ) as session: + # Test initialization + result = await session.initialize() + assert isinstance(result, InitializeResult) + assert result.serverInfo.name == SERVER_NAME + + +@pytest.mark.anyio +async def test_streamablehttp_client_resource_read(initialized_client_session): + """Test client resource read functionality.""" + response = await initialized_client_session.read_resource(uri=AnyUrl("foobar://test-resource")) + assert len(response.contents) == 1 + assert response.contents[0].uri == AnyUrl("foobar://test-resource") + assert response.contents[0].text == "Read 
test-resource" + + +@pytest.mark.anyio +async def test_streamablehttp_client_tool_invocation(initialized_client_session): + """Test client tool invocation.""" + # First list tools + tools = await initialized_client_session.list_tools() + assert len(tools.tools) == 4 + assert tools.tools[0].name == "test_tool" + + # Call the tool + result = await initialized_client_session.call_tool("test_tool", {}) + assert len(result.content) == 1 + assert result.content[0].type == "text" + assert result.content[0].text == "Called test_tool" + + +@pytest.mark.anyio +async def test_streamablehttp_client_error_handling(initialized_client_session): + """Test error handling in client.""" + with pytest.raises(McpError) as exc_info: + await initialized_client_session.read_resource(uri=AnyUrl("unknown://test-error")) + assert exc_info.value.error.code == 0 + assert "Unknown resource: unknown://test-error" in exc_info.value.error.message + + +@pytest.mark.anyio +async def test_streamablehttp_client_session_persistence(basic_server, basic_server_url): + """Test that session ID persists across requests.""" + async with streamablehttp_client(f"{basic_server_url}/mcp") as ( + read_stream, + write_stream, + _, + ): + async with ClientSession( + read_stream, + write_stream, + ) as session: + # Initialize the session + result = await session.initialize() + assert isinstance(result, InitializeResult) + + # Make multiple requests to verify session persistence + tools = await session.list_tools() + assert len(tools.tools) == 4 + + # Read a resource + resource = await session.read_resource(uri=AnyUrl("foobar://test-persist")) + assert isinstance(resource.contents[0], TextResourceContents) is True + content = resource.contents[0] + assert isinstance(content, TextResourceContents) + assert content.text == "Read test-persist" + + +@pytest.mark.anyio +async def test_streamablehttp_client_json_response(json_response_server, json_server_url): + """Test client with JSON response mode.""" + async with streamablehttp_client(f"{json_server_url}/mcp") as ( + read_stream, + write_stream, + _, + ): + async with ClientSession( + read_stream, + write_stream, + ) as session: + # Initialize the session + result = await session.initialize() + assert isinstance(result, InitializeResult) + assert result.serverInfo.name == SERVER_NAME + + # Check tool listing + tools = await session.list_tools() + assert len(tools.tools) == 4 + + # Call a tool and verify JSON response handling + result = await session.call_tool("test_tool", {}) + assert len(result.content) == 1 + assert result.content[0].type == "text" + assert result.content[0].text == "Called test_tool" + + +@pytest.mark.anyio +async def test_streamablehttp_client_get_stream(basic_server, basic_server_url): + """Test GET stream functionality for server-initiated messages.""" + import mcp.types as types + from mcp.shared.session import RequestResponder + + notifications_received = [] + + # Define message handler to capture notifications + async def message_handler( + message: RequestResponder[types.ServerRequest, types.ClientResult] | types.ServerNotification | Exception, + ) -> None: + if isinstance(message, types.ServerNotification): + notifications_received.append(message) + + async with streamablehttp_client(f"{basic_server_url}/mcp") as ( + read_stream, + write_stream, + _, + ): + async with ClientSession(read_stream, write_stream, message_handler=message_handler) as session: + # Initialize the session - this triggers the GET stream setup + result = await session.initialize() + assert 
isinstance(result, InitializeResult) + + # Call the special tool that sends a notification + await session.call_tool("test_tool_with_standalone_notification", {}) + + # Verify we received the notification + assert len(notifications_received) > 0 + + # Verify the notification is a ResourceUpdatedNotification + resource_update_found = False + for notif in notifications_received: + if isinstance(notif.root, types.ResourceUpdatedNotification): + assert str(notif.root.params.uri) == "http://test_resource/" + resource_update_found = True + + assert resource_update_found, "ResourceUpdatedNotification not received via GET stream" + + +@pytest.mark.anyio +async def test_streamablehttp_client_session_termination(basic_server, basic_server_url): + """Test client session termination functionality.""" + + captured_session_id = None + + # Create the streamablehttp_client with a custom httpx client to capture headers + async with streamablehttp_client(f"{basic_server_url}/mcp") as ( + read_stream, + write_stream, + get_session_id, + ): + async with ClientSession(read_stream, write_stream) as session: + # Initialize the session + result = await session.initialize() + assert isinstance(result, InitializeResult) + captured_session_id = get_session_id() + assert captured_session_id is not None + + # Make a request to confirm session is working + tools = await session.list_tools() + assert len(tools.tools) == 4 + + headers = {} + if captured_session_id: + headers[MCP_SESSION_ID_HEADER] = captured_session_id + + async with streamablehttp_client(f"{basic_server_url}/mcp", headers=headers) as ( + read_stream, + write_stream, + _, + ): + async with ClientSession(read_stream, write_stream) as session: + # Attempt to make a request after termination + with pytest.raises( + McpError, + match="Session terminated", + ): + await session.list_tools() + + +@pytest.mark.anyio +async def test_streamablehttp_client_session_termination_204(basic_server, basic_server_url, monkeypatch): + """Test client session termination functionality with a 204 response. + + This test patches the httpx client to return a 204 response for DELETEs. 
+ """ + + # Save the original delete method to restore later + original_delete = httpx.AsyncClient.delete + + # Mock the client's delete method to return a 204 + async def mock_delete(self, *args, **kwargs): + # Call the original method to get the real response + response = await original_delete(self, *args, **kwargs) + + # Create a new response with 204 status code but same headers + mocked_response = httpx.Response( + 204, + headers=response.headers, + content=response.content, + request=response.request, + ) + return mocked_response + + # Apply the patch to the httpx client + monkeypatch.setattr(httpx.AsyncClient, "delete", mock_delete) + + captured_session_id = None + + # Create the streamablehttp_client with a custom httpx client to capture headers + async with streamablehttp_client(f"{basic_server_url}/mcp") as ( + read_stream, + write_stream, + get_session_id, + ): + async with ClientSession(read_stream, write_stream) as session: + # Initialize the session + result = await session.initialize() + assert isinstance(result, InitializeResult) + captured_session_id = get_session_id() + assert captured_session_id is not None + + # Make a request to confirm session is working + tools = await session.list_tools() + assert len(tools.tools) == 4 + + headers = {} + if captured_session_id: + headers[MCP_SESSION_ID_HEADER] = captured_session_id + + async with streamablehttp_client(f"{basic_server_url}/mcp", headers=headers) as ( + read_stream, + write_stream, + _, + ): + async with ClientSession(read_stream, write_stream) as session: + # Attempt to make a request after termination + with pytest.raises( + McpError, + match="Session terminated", + ): + await session.list_tools() + + +@pytest.mark.anyio +async def test_streamablehttp_client_resumption(event_server): + """Test client session to resume a long running tool.""" + _, server_url = event_server + + # Variables to track the state + captured_session_id = None + captured_resumption_token = None + captured_notifications = [] + tool_started = False + captured_protocol_version = None + + async def message_handler( + message: RequestResponder[types.ServerRequest, types.ClientResult] | types.ServerNotification | Exception, + ) -> None: + if isinstance(message, types.ServerNotification): + captured_notifications.append(message) + # Look for our special notification that indicates the tool is running + if isinstance(message.root, types.LoggingMessageNotification): + if message.root.params.data == "Tool started": + nonlocal tool_started + tool_started = True + + async def on_resumption_token_update(token: str) -> None: + nonlocal captured_resumption_token + captured_resumption_token = token + + # First, start the client session and begin the long-running tool + async with streamablehttp_client(f"{server_url}/mcp", terminate_on_close=False) as ( + read_stream, + write_stream, + get_session_id, + ): + async with ClientSession(read_stream, write_stream, message_handler=message_handler) as session: + # Initialize the session + result = await session.initialize() + assert isinstance(result, InitializeResult) + captured_session_id = get_session_id() + assert captured_session_id is not None + # Capture the negotiated protocol version + captured_protocol_version = result.protocolVersion + + # Start a long-running tool in a task + async with anyio.create_task_group() as tg: + + async def run_tool(): + metadata = ClientMessageMetadata( + on_resumption_token_update=on_resumption_token_update, + ) + await session.send_request( + types.ClientRequest( + 
types.CallToolRequest( + method="tools/call", + params=types.CallToolRequestParams(name="long_running_with_checkpoints", arguments={}), + ) + ), + types.CallToolResult, + metadata=metadata, + ) + + tg.start_soon(run_tool) + + # Wait for the tool to start and at least one notification + # and then kill the task group + while not tool_started or not captured_resumption_token: + await anyio.sleep(0.1) + tg.cancel_scope.cancel() + + # Store pre notifications and clear the captured notifications + # for the post-resumption check + captured_notifications_pre = captured_notifications.copy() + captured_notifications = [] + + # Now resume the session with the same mcp-session-id and protocol version + headers = {} + if captured_session_id: + headers[MCP_SESSION_ID_HEADER] = captured_session_id + if captured_protocol_version: + headers[MCP_PROTOCOL_VERSION_HEADER] = captured_protocol_version + + async with streamablehttp_client(f"{server_url}/mcp", headers=headers) as ( + read_stream, + write_stream, + _, + ): + async with ClientSession(read_stream, write_stream, message_handler=message_handler) as session: + # Don't initialize - just use the existing session + + # Resume the tool with the resumption token + assert captured_resumption_token is not None + + metadata = ClientMessageMetadata( + resumption_token=captured_resumption_token, + ) + result = await session.send_request( + types.ClientRequest( + types.CallToolRequest( + method="tools/call", + params=types.CallToolRequestParams(name="long_running_with_checkpoints", arguments={}), + ) + ), + types.CallToolResult, + metadata=metadata, + ) + + # We should get a complete result + assert len(result.content) == 1 + assert result.content[0].type == "text" + assert "Completed" in result.content[0].text + + # We should have received the remaining notifications + assert len(captured_notifications) > 0 + + # Should not have the first notification + # Check that "Tool started" notification isn't repeated when resuming + assert not any( + isinstance(n.root, types.LoggingMessageNotification) and n.root.params.data == "Tool started" + for n in captured_notifications + ) + # there is no intersection between pre and post notifications + assert not any(n in captured_notifications_pre for n in captured_notifications) + + +@pytest.mark.anyio +async def test_streamablehttp_server_sampling(basic_server, basic_server_url): + """Test server-initiated sampling request through streamable HTTP transport.""" + print("Testing server sampling...") + # Variable to track if sampling callback was invoked + sampling_callback_invoked = False + captured_message_params = None + + # Define sampling callback that returns a mock response + async def sampling_callback( + context: RequestContext[ClientSession, Any], + params: types.CreateMessageRequestParams, + ) -> types.CreateMessageResult: + nonlocal sampling_callback_invoked, captured_message_params + sampling_callback_invoked = True + captured_message_params = params + message_received = params.messages[0].content.text if params.messages[0].content.type == "text" else None + + return types.CreateMessageResult( + role="assistant", + content=types.TextContent( + type="text", + text=f"Received message from server: {message_received}", + ), + model="test-model", + stopReason="endTurn", + ) + + # Create client with sampling callback + async with streamablehttp_client(f"{basic_server_url}/mcp") as ( + read_stream, + write_stream, + _, + ): + async with ClientSession( + read_stream, + write_stream, + sampling_callback=sampling_callback, 
+ ) as session: + # Initialize the session + result = await session.initialize() + assert isinstance(result, InitializeResult) + + # Call the tool that triggers server-side sampling + tool_result = await session.call_tool("test_sampling_tool", {}) + + # Verify the tool result contains the expected content + assert len(tool_result.content) == 1 + assert tool_result.content[0].type == "text" + assert "Response from sampling: Received message from server" in tool_result.content[0].text + + # Verify sampling callback was invoked + assert sampling_callback_invoked + assert captured_message_params is not None + assert len(captured_message_params.messages) == 1 + assert captured_message_params.messages[0].content.text == "Server needs client sampling" + + +# Context-aware server implementation for testing request context propagation +class ContextAwareServerTest(Server): + def __init__(self): + super().__init__("ContextAwareServer") + + @self.list_tools() + async def handle_list_tools() -> list[Tool]: + return [ + Tool( + name="echo_headers", + description="Echo request headers from context", + inputSchema={"type": "object", "properties": {}}, + ), + Tool( + name="echo_context", + description="Echo request context with custom data", + inputSchema={ + "type": "object", + "properties": { + "request_id": {"type": "string"}, + }, + "required": ["request_id"], + }, + ), + ] + + @self.call_tool() + async def handle_call_tool(name: str, args: dict) -> list[TextContent]: + ctx = self.request_context + + if name == "echo_headers": + # Access the request object from context + headers_info = {} + if ctx.request and isinstance(ctx.request, Request): + headers_info = dict(ctx.request.headers) + return [ + TextContent( + type="text", + text=json.dumps(headers_info), + ) + ] + + elif name == "echo_context": + # Return full context information + context_data = { + "request_id": args.get("request_id"), + "headers": {}, + "method": None, + "path": None, + } + if ctx.request and isinstance(ctx.request, Request): + request = ctx.request + context_data["headers"] = dict(request.headers) + context_data["method"] = request.method + context_data["path"] = request.url.path + return [ + TextContent( + type="text", + text=json.dumps(context_data), + ) + ] + + return [TextContent(type="text", text=f"Unknown tool: {name}")] + + +# Server runner for context-aware testing +def run_context_aware_server(port: int): + """Run the context-aware test server.""" + server = ContextAwareServerTest() + + session_manager = StreamableHTTPSessionManager( + app=server, + event_store=None, + json_response=False, + ) + + app = Starlette( + debug=True, + routes=[ + Mount("/mcp", app=session_manager.handle_request), + ], + lifespan=lambda app: session_manager.run(), + ) + + server_instance = uvicorn.Server( + config=uvicorn.Config( + app=app, + host="127.0.0.1", + port=port, + log_level="error", + ) + ) + server_instance.run() + + +@pytest.fixture +def context_aware_server(basic_server_port: int) -> Generator[None, None, None]: + """Start the context-aware server in a separate process.""" + proc = multiprocessing.Process(target=run_context_aware_server, args=(basic_server_port,), daemon=True) + proc.start() + + # Wait for server to be running + max_attempts = 20 + attempt = 0 + while attempt < max_attempts: + try: + with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s: + s.connect(("127.0.0.1", basic_server_port)) + break + except ConnectionRefusedError: + time.sleep(0.1) + attempt += 1 + else: + raise RuntimeError(f"Context-aware server 
failed to start after {max_attempts} attempts") + + yield + + proc.kill() + proc.join(timeout=2) + if proc.is_alive(): + print("Context-aware server process failed to terminate") + + +@pytest.mark.anyio +async def test_streamablehttp_request_context_propagation(context_aware_server: None, basic_server_url: str) -> None: + """Test that request context is properly propagated through StreamableHTTP.""" + custom_headers = { + "Authorization": "Bearer test-token", + "X-Custom-Header": "test-value", + "X-Trace-Id": "trace-123", + } + + async with streamablehttp_client(f"{basic_server_url}/mcp", headers=custom_headers) as ( + read_stream, + write_stream, + _, + ): + async with ClientSession(read_stream, write_stream) as session: + result = await session.initialize() + assert isinstance(result, InitializeResult) + assert result.serverInfo.name == "ContextAwareServer" + + # Call the tool that echoes headers back + tool_result = await session.call_tool("echo_headers", {}) + + # Parse the JSON response + assert len(tool_result.content) == 1 + assert isinstance(tool_result.content[0], TextContent) + headers_data = json.loads(tool_result.content[0].text) + + # Verify headers were propagated + assert headers_data.get("authorization") == "Bearer test-token" + assert headers_data.get("x-custom-header") == "test-value" + assert headers_data.get("x-trace-id") == "trace-123" + + +@pytest.mark.anyio +async def test_streamablehttp_request_context_isolation(context_aware_server: None, basic_server_url: str) -> None: + """Test that request contexts are isolated between StreamableHTTP clients.""" + contexts = [] + + # Create multiple clients with different headers + for i in range(3): + headers = { + "X-Request-Id": f"request-{i}", + "X-Custom-Value": f"value-{i}", + "Authorization": f"Bearer token-{i}", + } + + async with streamablehttp_client(f"{basic_server_url}/mcp", headers=headers) as (read_stream, write_stream, _): + async with ClientSession(read_stream, write_stream) as session: + await session.initialize() + + # Call the tool that echoes context + tool_result = await session.call_tool("echo_context", {"request_id": f"request-{i}"}) + + assert len(tool_result.content) == 1 + assert isinstance(tool_result.content[0], TextContent) + context_data = json.loads(tool_result.content[0].text) + contexts.append(context_data) + + # Verify each request had its own context + assert len(contexts) == 3 + for i, ctx in enumerate(contexts): + assert ctx["request_id"] == f"request-{i}" + assert ctx["headers"].get("x-request-id") == f"request-{i}" + assert ctx["headers"].get("x-custom-value") == f"value-{i}" + assert ctx["headers"].get("authorization") == f"Bearer token-{i}" + + +@pytest.mark.anyio +async def test_client_includes_protocol_version_header_after_init(context_aware_server, basic_server_url): + """Test that client includes mcp-protocol-version header after initialization.""" + async with streamablehttp_client(f"{basic_server_url}/mcp") as ( + read_stream, + write_stream, + _, + ): + async with ClientSession(read_stream, write_stream) as session: + # Initialize and get the negotiated version + init_result = await session.initialize() + negotiated_version = init_result.protocolVersion + + # Call a tool that echoes headers to verify the header is present + tool_result = await session.call_tool("echo_headers", {}) + + assert len(tool_result.content) == 1 + assert isinstance(tool_result.content[0], TextContent) + headers_data = json.loads(tool_result.content[0].text) + + # Verify protocol version header is present + 
assert "mcp-protocol-version" in headers_data + assert headers_data[MCP_PROTOCOL_VERSION_HEADER] == negotiated_version + + +def test_server_validates_protocol_version_header(basic_server, basic_server_url): + """Test that server returns 400 Bad Request version if header unsupported or invalid.""" + # First initialize a session to get a valid session ID + init_response = requests.post( + f"{basic_server_url}/mcp", + headers={ + "Accept": "application/json, text/event-stream", + "Content-Type": "application/json", + }, + json=INIT_REQUEST, + ) + assert init_response.status_code == 200 + session_id = init_response.headers.get(MCP_SESSION_ID_HEADER) + + # Test request with invalid protocol version (should fail) + response = requests.post( + f"{basic_server_url}/mcp", + headers={ + "Accept": "application/json, text/event-stream", + "Content-Type": "application/json", + MCP_SESSION_ID_HEADER: session_id, + MCP_PROTOCOL_VERSION_HEADER: "invalid-version", + }, + json={"jsonrpc": "2.0", "method": "tools/list", "id": "test-2"}, + ) + assert response.status_code == 400 + assert MCP_PROTOCOL_VERSION_HEADER in response.text or "protocol version" in response.text.lower() + + # Test request with unsupported protocol version (should fail) + response = requests.post( + f"{basic_server_url}/mcp", + headers={ + "Accept": "application/json, text/event-stream", + "Content-Type": "application/json", + MCP_SESSION_ID_HEADER: session_id, + MCP_PROTOCOL_VERSION_HEADER: "1999-01-01", # Very old unsupported version + }, + json={"jsonrpc": "2.0", "method": "tools/list", "id": "test-3"}, + ) + assert response.status_code == 400 + assert MCP_PROTOCOL_VERSION_HEADER in response.text or "protocol version" in response.text.lower() + + # Test request with valid protocol version (should succeed) + negotiated_version = extract_protocol_version_from_sse(init_response) + + response = requests.post( + f"{basic_server_url}/mcp", + headers={ + "Accept": "application/json, text/event-stream", + "Content-Type": "application/json", + MCP_SESSION_ID_HEADER: session_id, + MCP_PROTOCOL_VERSION_HEADER: negotiated_version, + }, + json={"jsonrpc": "2.0", "method": "tools/list", "id": "test-4"}, + ) + assert response.status_code == 200 + + +def test_server_backwards_compatibility_no_protocol_version(basic_server, basic_server_url): + """Test server accepts requests without protocol version header.""" + # First initialize a session to get a valid session ID + init_response = requests.post( + f"{basic_server_url}/mcp", + headers={ + "Accept": "application/json, text/event-stream", + "Content-Type": "application/json", + }, + json=INIT_REQUEST, + ) + assert init_response.status_code == 200 + session_id = init_response.headers.get(MCP_SESSION_ID_HEADER) + + # Test request without mcp-protocol-version header (backwards compatibility) + response = requests.post( + f"{basic_server_url}/mcp", + headers={ + "Accept": "application/json, text/event-stream", + "Content-Type": "application/json", + MCP_SESSION_ID_HEADER: session_id, + }, + json={"jsonrpc": "2.0", "method": "tools/list", "id": "test-backwards-compat"}, + stream=True, + ) + assert response.status_code == 200 # Should succeed for backwards compatibility + assert response.headers.get("Content-Type") == "text/event-stream" + + +@pytest.mark.anyio +async def test_client_crash_handled(basic_server, basic_server_url): + """Test that cases where the client crashes are handled gracefully.""" + + # Simulate bad client that crashes after init + async def bad_client(): + """Client that triggers 
ClosedResourceError""" + async with streamablehttp_client(f"{basic_server_url}/mcp") as ( + read_stream, + write_stream, + _, + ): + async with ClientSession(read_stream, write_stream) as session: + await session.initialize() + raise Exception("client crash") + + # Run bad client a few times to trigger the crash + for _ in range(3): + try: + await bad_client() + except Exception: + pass + await anyio.sleep(0.1) + + # Try a good client, it should still be able to connect and list tools + async with streamablehttp_client(f"{basic_server_url}/mcp") as ( + read_stream, + write_stream, + _, + ): + async with ClientSession(read_stream, write_stream) as session: + result = await session.initialize() + assert isinstance(result, InitializeResult) + tools = await session.list_tools() + assert tools.tools diff --git a/tests/shared/test_ws.py b/tests/shared/test_ws.py index 1381c8153..5081f1d53 100644 --- a/tests/shared/test_ws.py +++ b/tests/shared/test_ws.py @@ -54,11 +54,7 @@ async def handle_read_resource(uri: AnyUrl) -> str | bytes: await anyio.sleep(2.0) return f"Slow response from {uri.host}" - raise McpError( - error=ErrorData( - code=404, message="OOPS! no resource with that URI was found" - ) - ) + raise McpError(error=ErrorData(code=404, message="OOPS! no resource with that URI was found")) @self.list_tools() async def handle_list_tools() -> list[Tool]: @@ -81,12 +77,8 @@ def make_server_app() -> Starlette: server = ServerTest() async def handle_ws(websocket): - async with websocket_server( - websocket.scope, websocket.receive, websocket.send - ) as streams: - await server.run( - streams[0], streams[1], server.create_initialization_options() - ) + async with websocket_server(websocket.scope, websocket.receive, websocket.send) as streams: + await server.run(streams[0], streams[1], server.create_initialization_options()) app = Starlette( routes=[ @@ -99,11 +91,7 @@ async def handle_ws(websocket): def run_server(server_port: int) -> None: app = make_server_app() - server = uvicorn.Server( - config=uvicorn.Config( - app=app, host="127.0.0.1", port=server_port, log_level="error" - ) - ) + server = uvicorn.Server(config=uvicorn.Config(app=app, host="127.0.0.1", port=server_port, log_level="error")) print(f"starting server on {server_port}") server.run() @@ -115,9 +103,7 @@ def run_server(server_port: int) -> None: @pytest.fixture() def server(server_port: int) -> Generator[None, None, None]: - proc = multiprocessing.Process( - target=run_server, kwargs={"server_port": server_port}, daemon=True - ) + proc = multiprocessing.Process(target=run_server, kwargs={"server_port": server_port}, daemon=True) print("starting process") proc.start() @@ -147,9 +133,7 @@ def server(server_port: int) -> Generator[None, None, None]: @pytest.fixture() -async def initialized_ws_client_session( - server, server_url: str -) -> AsyncGenerator[ClientSession, None]: +async def initialized_ws_client_session(server, server_url: str) -> AsyncGenerator[ClientSession, None]: """Create and initialize a WebSocket client session""" async with websocket_client(server_url + "/ws") as streams: async with ClientSession(*streams) as session: @@ -186,9 +170,7 @@ async def test_ws_client_happy_request_and_response( initialized_ws_client_session: ClientSession, ) -> None: """Test a successful request and response via WebSocket""" - result = await initialized_ws_client_session.read_resource( - AnyUrl("foobar://example") - ) + result = await initialized_ws_client_session.read_resource(AnyUrl("foobar://example")) assert isinstance(result, 
ReadResourceResult) assert isinstance(result.contents, list) assert len(result.contents) > 0 @@ -218,9 +200,7 @@ async def test_ws_client_timeout( # Now test that we can still use the session after a timeout with anyio.fail_after(5): # Longer timeout to allow completion - result = await initialized_ws_client_session.read_resource( - AnyUrl("foobar://example") - ) + result = await initialized_ws_client_session.read_resource(AnyUrl("foobar://example")) assert isinstance(result, ReadResourceResult) assert isinstance(result.contents, list) assert len(result.contents) > 0 diff --git a/tests/test_examples.py b/tests/test_examples.py index c5e8ec9d7..230e7d394 100644 --- a/tests/test_examples.py +++ b/tests/test_examples.py @@ -1,5 +1,7 @@ """Tests for example servers""" +import sys + import pytest from pytest_examples import CodeExample, EvalExample, find_examples @@ -29,9 +31,7 @@ async def test_complex_inputs(): async with client_session(mcp._mcp_server) as client: tank = {"shrimp": [{"name": "bob"}, {"name": "alice"}]} - result = await client.call_tool( - "name_shrimp", {"tank": tank, "extra_names": ["charlie"]} - ) + result = await client.call_tool("name_shrimp", {"tank": tank, "extra_names": ["charlie"]}) assert len(result.content) == 3 assert isinstance(result.content[0], TextContent) assert isinstance(result.content[1], TextContent) @@ -69,17 +69,22 @@ async def test_desktop(monkeypatch): content = result.contents[0] assert isinstance(content, TextResourceContents) assert isinstance(content.text, str) - assert "/fake/path/file1.txt" in content.text - assert "/fake/path/file2.txt" in content.text + if sys.platform == "win32": + file_1 = "/fake/path/file1.txt".replace("/", "\\\\") # might be a bug + file_2 = "/fake/path/file2.txt".replace("/", "\\\\") # might be a bug + assert file_1 in content.text + assert file_2 in content.text + # might be a bug, but the test is passing + else: + assert "/fake/path/file1.txt" in content.text + assert "/fake/path/file2.txt" in content.text @pytest.mark.parametrize("example", find_examples("README.md"), ids=str) def test_docs_examples(example: CodeExample, eval_example: EvalExample): ruff_ignore: list[str] = ["F841", "I001"] - eval_example.set_config( - ruff_ignore=ruff_ignore, target_version="py310", line_length=88 - ) + eval_example.set_config(ruff_ignore=ruff_ignore, target_version="py310", line_length=88) if eval_example.update_examples: # pragma: no cover eval_example.format(example) diff --git a/uv.lock b/uv.lock index 78f46f47c..180d5a9c1 100644 --- a/uv.lock +++ b/uv.lock @@ -1,5 +1,5 @@ version = 1 -revision = 1 +revision = 2 requires-python = ">=3.10" [options] @@ -8,8 +8,11 @@ resolution-mode = "lowest-direct" [manifest] members = [ "mcp", + "mcp-simple-auth", "mcp-simple-prompt", "mcp-simple-resource", + "mcp-simple-streamablehttp", + "mcp-simple-streamablehttp-stateless", "mcp-simple-tool", ] @@ -17,9 +20,9 @@ members = [ name = "annotated-types" version = "0.7.0" source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/ee/67/531ea369ba64dcff5ec9c3402f9f51bf748cec26dde048a2f973a4eea7f5/annotated_types-0.7.0.tar.gz", hash = "sha256:aff07c09a53a08bc8cfccb9c85b05f1aa9a2a6f23728d790723543408344ce89", size = 16081 } +sdist = { url = "https://files.pythonhosted.org/packages/ee/67/531ea369ba64dcff5ec9c3402f9f51bf748cec26dde048a2f973a4eea7f5/annotated_types-0.7.0.tar.gz", hash = "sha256:aff07c09a53a08bc8cfccb9c85b05f1aa9a2a6f23728d790723543408344ce89", size = 16081, upload-time = 
"2024-05-20T21:33:25.928Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/78/b6/6307fbef88d9b5ee7421e68d78a9f162e0da4900bc5f5793f6d3d0e34fb8/annotated_types-0.7.0-py3-none-any.whl", hash = "sha256:1f02e8b43a8fbbc3f3e0d4f0f4bfc8131bcb4eebe8849b8e5c773f3a1c582a53", size = 13643 }, + { url = "https://files.pythonhosted.org/packages/78/b6/6307fbef88d9b5ee7421e68d78a9f162e0da4900bc5f5793f6d3d0e34fb8/annotated_types-0.7.0-py3-none-any.whl", hash = "sha256:1f02e8b43a8fbbc3f3e0d4f0f4bfc8131bcb4eebe8849b8e5c773f3a1c582a53", size = 13643, upload-time = "2024-05-20T21:33:24.1Z" }, ] [[package]] @@ -32,27 +35,39 @@ dependencies = [ { name = "sniffio" }, { name = "typing-extensions", marker = "python_full_version < '3.11'" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/a0/44/66874c5256e9fbc30103b31927fd9341c8da6ccafd4721b2b3e81e6ef176/anyio-4.5.0.tar.gz", hash = "sha256:c5a275fe5ca0afd788001f58fca1e69e29ce706d746e317d660e21f70c530ef9", size = 169376 } +sdist = { url = "https://files.pythonhosted.org/packages/a0/44/66874c5256e9fbc30103b31927fd9341c8da6ccafd4721b2b3e81e6ef176/anyio-4.5.0.tar.gz", hash = "sha256:c5a275fe5ca0afd788001f58fca1e69e29ce706d746e317d660e21f70c530ef9", size = 169376, upload-time = "2024-09-19T09:28:45.477Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/3b/68/f9e9bf6324c46e6b8396610aef90ad423ec3e18c9079547ceafea3dce0ec/anyio-4.5.0-py3-none-any.whl", hash = "sha256:fdeb095b7cc5a5563175eedd926ec4ae55413bb4be5770c424af0ba46ccb4a78", size = 89250 }, + { url = "https://files.pythonhosted.org/packages/3b/68/f9e9bf6324c46e6b8396610aef90ad423ec3e18c9079547ceafea3dce0ec/anyio-4.5.0-py3-none-any.whl", hash = "sha256:fdeb095b7cc5a5563175eedd926ec4ae55413bb4be5770c424af0ba46ccb4a78", size = 89250, upload-time = "2024-09-19T09:28:42.699Z" }, +] + +[[package]] +name = "asttokens" +version = "2.4.1" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "six" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/45/1d/f03bcb60c4a3212e15f99a56085d93093a497718adf828d050b9d675da81/asttokens-2.4.1.tar.gz", hash = "sha256:b03869718ba9a6eb027e134bfdf69f38a236d681c83c160d510768af11254ba0", size = 62284, upload-time = "2023-10-26T10:03:05.06Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/45/86/4736ac618d82a20d87d2f92ae19441ebc7ac9e7a581d7e58bbe79233b24a/asttokens-2.4.1-py2.py3-none-any.whl", hash = "sha256:051ed49c3dcae8913ea7cd08e46a606dba30b79993209636c4875bc1d637bc24", size = 27764, upload-time = "2023-10-26T10:03:01.789Z" }, ] [[package]] name = "attrs" version = "24.3.0" source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/48/c8/6260f8ccc11f0917360fc0da435c5c9c7504e3db174d5a12a1494887b045/attrs-24.3.0.tar.gz", hash = "sha256:8f5c07333d543103541ba7be0e2ce16eeee8130cb0b3f9238ab904ce1e85baff", size = 805984 } +sdist = { url = "https://files.pythonhosted.org/packages/48/c8/6260f8ccc11f0917360fc0da435c5c9c7504e3db174d5a12a1494887b045/attrs-24.3.0.tar.gz", hash = "sha256:8f5c07333d543103541ba7be0e2ce16eeee8130cb0b3f9238ab904ce1e85baff", size = 805984, upload-time = "2024-12-16T06:59:29.899Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/89/aa/ab0f7891a01eeb2d2e338ae8fecbe57fcebea1a24dbb64d45801bfab481d/attrs-24.3.0-py3-none-any.whl", hash = "sha256:ac96cd038792094f438ad1f6ff80837353805ac950cd2aa0e0625ef19850c308", size = 63397 }, + { url = 
"https://files.pythonhosted.org/packages/89/aa/ab0f7891a01eeb2d2e338ae8fecbe57fcebea1a24dbb64d45801bfab481d/attrs-24.3.0-py3-none-any.whl", hash = "sha256:ac96cd038792094f438ad1f6ff80837353805ac950cd2aa0e0625ef19850c308", size = 63397, upload-time = "2024-12-16T06:59:26.977Z" }, ] [[package]] name = "babel" version = "2.17.0" source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/7d/6b/d52e42361e1aa00709585ecc30b3f9684b3ab62530771402248b1b1d6240/babel-2.17.0.tar.gz", hash = "sha256:0c54cffb19f690cdcc52a3b50bcbf71e07a808d1c80d549f2459b9d2cf0afb9d", size = 9951852 } +sdist = { url = "https://files.pythonhosted.org/packages/7d/6b/d52e42361e1aa00709585ecc30b3f9684b3ab62530771402248b1b1d6240/babel-2.17.0.tar.gz", hash = "sha256:0c54cffb19f690cdcc52a3b50bcbf71e07a808d1c80d549f2459b9d2cf0afb9d", size = 9951852, upload-time = "2025-02-01T15:17:41.026Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/b7/b8/3fe70c75fe32afc4bb507f75563d39bc5642255d1d94f1f23604725780bf/babel-2.17.0-py3-none-any.whl", hash = "sha256:4d0b53093fdfb4b21c92b5213dba5a1b23885afa8383709427046b21c366e5f2", size = 10182537 }, + { url = "https://files.pythonhosted.org/packages/b7/b8/3fe70c75fe32afc4bb507f75563d39bc5642255d1d94f1f23604725780bf/babel-2.17.0-py3-none-any.whl", hash = "sha256:4d0b53093fdfb4b21c92b5213dba5a1b23885afa8383709427046b21c366e5f2", size = 10182537, upload-time = "2025-02-01T15:17:37.39Z" }, ] [[package]] @@ -68,25 +83,25 @@ dependencies = [ { name = "tomli", marker = "python_full_version < '3.11'" }, { name = "typing-extensions", marker = "python_full_version < '3.11'" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/94/49/26a7b0f3f35da4b5a65f081943b7bcd22d7002f5f0fb8098ec1ff21cb6ef/black-25.1.0.tar.gz", hash = "sha256:33496d5cd1222ad73391352b4ae8da15253c5de89b93a80b3e2c8d9a19ec2666", size = 649449 } +sdist = { url = "https://files.pythonhosted.org/packages/94/49/26a7b0f3f35da4b5a65f081943b7bcd22d7002f5f0fb8098ec1ff21cb6ef/black-25.1.0.tar.gz", hash = "sha256:33496d5cd1222ad73391352b4ae8da15253c5de89b93a80b3e2c8d9a19ec2666", size = 649449, upload-time = "2025-01-29T04:15:40.373Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/4d/3b/4ba3f93ac8d90410423fdd31d7541ada9bcee1df32fb90d26de41ed40e1d/black-25.1.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:759e7ec1e050a15f89b770cefbf91ebee8917aac5c20483bc2d80a6c3a04df32", size = 1629419 }, - { url = "https://files.pythonhosted.org/packages/b4/02/0bde0485146a8a5e694daed47561785e8b77a0466ccc1f3e485d5ef2925e/black-25.1.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:0e519ecf93120f34243e6b0054db49c00a35f84f195d5bce7e9f5cfc578fc2da", size = 1461080 }, - { url = "https://files.pythonhosted.org/packages/52/0e/abdf75183c830eaca7589144ff96d49bce73d7ec6ad12ef62185cc0f79a2/black-25.1.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:055e59b198df7ac0b7efca5ad7ff2516bca343276c466be72eb04a3bcc1f82d7", size = 1766886 }, - { url = "https://files.pythonhosted.org/packages/dc/a6/97d8bb65b1d8a41f8a6736222ba0a334db7b7b77b8023ab4568288f23973/black-25.1.0-cp310-cp310-win_amd64.whl", hash = "sha256:db8ea9917d6f8fc62abd90d944920d95e73c83a5ee3383493e35d271aca872e9", size = 1419404 }, - { url = "https://files.pythonhosted.org/packages/7e/4f/87f596aca05c3ce5b94b8663dbfe242a12843caaa82dd3f85f1ffdc3f177/black-25.1.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:a39337598244de4bae26475f77dda852ea00a93bd4c728e09eacd827ec929df0", 
size = 1614372 }, - { url = "https://files.pythonhosted.org/packages/e7/d0/2c34c36190b741c59c901e56ab7f6e54dad8df05a6272a9747ecef7c6036/black-25.1.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:96c1c7cd856bba8e20094e36e0f948718dc688dba4a9d78c3adde52b9e6c2299", size = 1442865 }, - { url = "https://files.pythonhosted.org/packages/21/d4/7518c72262468430ead45cf22bd86c883a6448b9eb43672765d69a8f1248/black-25.1.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:bce2e264d59c91e52d8000d507eb20a9aca4a778731a08cfff7e5ac4a4bb7096", size = 1749699 }, - { url = "https://files.pythonhosted.org/packages/58/db/4f5beb989b547f79096e035c4981ceb36ac2b552d0ac5f2620e941501c99/black-25.1.0-cp311-cp311-win_amd64.whl", hash = "sha256:172b1dbff09f86ce6f4eb8edf9dede08b1fce58ba194c87d7a4f1a5aa2f5b3c2", size = 1428028 }, - { url = "https://files.pythonhosted.org/packages/83/71/3fe4741df7adf015ad8dfa082dd36c94ca86bb21f25608eb247b4afb15b2/black-25.1.0-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:4b60580e829091e6f9238c848ea6750efed72140b91b048770b64e74fe04908b", size = 1650988 }, - { url = "https://files.pythonhosted.org/packages/13/f3/89aac8a83d73937ccd39bbe8fc6ac8860c11cfa0af5b1c96d081facac844/black-25.1.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:1e2978f6df243b155ef5fa7e558a43037c3079093ed5d10fd84c43900f2d8ecc", size = 1453985 }, - { url = "https://files.pythonhosted.org/packages/6f/22/b99efca33f1f3a1d2552c714b1e1b5ae92efac6c43e790ad539a163d1754/black-25.1.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:3b48735872ec535027d979e8dcb20bf4f70b5ac75a8ea99f127c106a7d7aba9f", size = 1783816 }, - { url = "https://files.pythonhosted.org/packages/18/7e/a27c3ad3822b6f2e0e00d63d58ff6299a99a5b3aee69fa77cd4b0076b261/black-25.1.0-cp312-cp312-win_amd64.whl", hash = "sha256:ea0213189960bda9cf99be5b8c8ce66bb054af5e9e861249cd23471bd7b0b3ba", size = 1440860 }, - { url = "https://files.pythonhosted.org/packages/98/87/0edf98916640efa5d0696e1abb0a8357b52e69e82322628f25bf14d263d1/black-25.1.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:8f0b18a02996a836cc9c9c78e5babec10930862827b1b724ddfe98ccf2f2fe4f", size = 1650673 }, - { url = "https://files.pythonhosted.org/packages/52/e5/f7bf17207cf87fa6e9b676576749c6b6ed0d70f179a3d812c997870291c3/black-25.1.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:afebb7098bfbc70037a053b91ae8437c3857482d3a690fefc03e9ff7aa9a5fd3", size = 1453190 }, - { url = "https://files.pythonhosted.org/packages/e3/ee/adda3d46d4a9120772fae6de454c8495603c37c4c3b9c60f25b1ab6401fe/black-25.1.0-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:030b9759066a4ee5e5aca28c3c77f9c64789cdd4de8ac1df642c40b708be6171", size = 1782926 }, - { url = "https://files.pythonhosted.org/packages/cc/64/94eb5f45dcb997d2082f097a3944cfc7fe87e071907f677e80788a2d7b7a/black-25.1.0-cp313-cp313-win_amd64.whl", hash = "sha256:a22f402b410566e2d1c950708c77ebf5ebd5d0d88a6a2e87c86d9fb48afa0d18", size = 1442613 }, - { url = "https://files.pythonhosted.org/packages/09/71/54e999902aed72baf26bca0d50781b01838251a462612966e9fc4891eadd/black-25.1.0-py3-none-any.whl", hash = "sha256:95e8176dae143ba9097f351d174fdaf0ccd29efb414b362ae3fd72bf0f710717", size = 207646 }, + { url = "https://files.pythonhosted.org/packages/4d/3b/4ba3f93ac8d90410423fdd31d7541ada9bcee1df32fb90d26de41ed40e1d/black-25.1.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = 
"sha256:759e7ec1e050a15f89b770cefbf91ebee8917aac5c20483bc2d80a6c3a04df32", size = 1629419, upload-time = "2025-01-29T05:37:06.642Z" }, + { url = "https://files.pythonhosted.org/packages/b4/02/0bde0485146a8a5e694daed47561785e8b77a0466ccc1f3e485d5ef2925e/black-25.1.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:0e519ecf93120f34243e6b0054db49c00a35f84f195d5bce7e9f5cfc578fc2da", size = 1461080, upload-time = "2025-01-29T05:37:09.321Z" }, + { url = "https://files.pythonhosted.org/packages/52/0e/abdf75183c830eaca7589144ff96d49bce73d7ec6ad12ef62185cc0f79a2/black-25.1.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:055e59b198df7ac0b7efca5ad7ff2516bca343276c466be72eb04a3bcc1f82d7", size = 1766886, upload-time = "2025-01-29T04:18:24.432Z" }, + { url = "https://files.pythonhosted.org/packages/dc/a6/97d8bb65b1d8a41f8a6736222ba0a334db7b7b77b8023ab4568288f23973/black-25.1.0-cp310-cp310-win_amd64.whl", hash = "sha256:db8ea9917d6f8fc62abd90d944920d95e73c83a5ee3383493e35d271aca872e9", size = 1419404, upload-time = "2025-01-29T04:19:04.296Z" }, + { url = "https://files.pythonhosted.org/packages/7e/4f/87f596aca05c3ce5b94b8663dbfe242a12843caaa82dd3f85f1ffdc3f177/black-25.1.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:a39337598244de4bae26475f77dda852ea00a93bd4c728e09eacd827ec929df0", size = 1614372, upload-time = "2025-01-29T05:37:11.71Z" }, + { url = "https://files.pythonhosted.org/packages/e7/d0/2c34c36190b741c59c901e56ab7f6e54dad8df05a6272a9747ecef7c6036/black-25.1.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:96c1c7cd856bba8e20094e36e0f948718dc688dba4a9d78c3adde52b9e6c2299", size = 1442865, upload-time = "2025-01-29T05:37:14.309Z" }, + { url = "https://files.pythonhosted.org/packages/21/d4/7518c72262468430ead45cf22bd86c883a6448b9eb43672765d69a8f1248/black-25.1.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:bce2e264d59c91e52d8000d507eb20a9aca4a778731a08cfff7e5ac4a4bb7096", size = 1749699, upload-time = "2025-01-29T04:18:17.688Z" }, + { url = "https://files.pythonhosted.org/packages/58/db/4f5beb989b547f79096e035c4981ceb36ac2b552d0ac5f2620e941501c99/black-25.1.0-cp311-cp311-win_amd64.whl", hash = "sha256:172b1dbff09f86ce6f4eb8edf9dede08b1fce58ba194c87d7a4f1a5aa2f5b3c2", size = 1428028, upload-time = "2025-01-29T04:18:51.711Z" }, + { url = "https://files.pythonhosted.org/packages/83/71/3fe4741df7adf015ad8dfa082dd36c94ca86bb21f25608eb247b4afb15b2/black-25.1.0-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:4b60580e829091e6f9238c848ea6750efed72140b91b048770b64e74fe04908b", size = 1650988, upload-time = "2025-01-29T05:37:16.707Z" }, + { url = "https://files.pythonhosted.org/packages/13/f3/89aac8a83d73937ccd39bbe8fc6ac8860c11cfa0af5b1c96d081facac844/black-25.1.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:1e2978f6df243b155ef5fa7e558a43037c3079093ed5d10fd84c43900f2d8ecc", size = 1453985, upload-time = "2025-01-29T05:37:18.273Z" }, + { url = "https://files.pythonhosted.org/packages/6f/22/b99efca33f1f3a1d2552c714b1e1b5ae92efac6c43e790ad539a163d1754/black-25.1.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:3b48735872ec535027d979e8dcb20bf4f70b5ac75a8ea99f127c106a7d7aba9f", size = 1783816, upload-time = "2025-01-29T04:18:33.823Z" }, + { url = "https://files.pythonhosted.org/packages/18/7e/a27c3ad3822b6f2e0e00d63d58ff6299a99a5b3aee69fa77cd4b0076b261/black-25.1.0-cp312-cp312-win_amd64.whl", hash = 
"sha256:ea0213189960bda9cf99be5b8c8ce66bb054af5e9e861249cd23471bd7b0b3ba", size = 1440860, upload-time = "2025-01-29T04:19:12.944Z" }, + { url = "https://files.pythonhosted.org/packages/98/87/0edf98916640efa5d0696e1abb0a8357b52e69e82322628f25bf14d263d1/black-25.1.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:8f0b18a02996a836cc9c9c78e5babec10930862827b1b724ddfe98ccf2f2fe4f", size = 1650673, upload-time = "2025-01-29T05:37:20.574Z" }, + { url = "https://files.pythonhosted.org/packages/52/e5/f7bf17207cf87fa6e9b676576749c6b6ed0d70f179a3d812c997870291c3/black-25.1.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:afebb7098bfbc70037a053b91ae8437c3857482d3a690fefc03e9ff7aa9a5fd3", size = 1453190, upload-time = "2025-01-29T05:37:22.106Z" }, + { url = "https://files.pythonhosted.org/packages/e3/ee/adda3d46d4a9120772fae6de454c8495603c37c4c3b9c60f25b1ab6401fe/black-25.1.0-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:030b9759066a4ee5e5aca28c3c77f9c64789cdd4de8ac1df642c40b708be6171", size = 1782926, upload-time = "2025-01-29T04:18:58.564Z" }, + { url = "https://files.pythonhosted.org/packages/cc/64/94eb5f45dcb997d2082f097a3944cfc7fe87e071907f677e80788a2d7b7a/black-25.1.0-cp313-cp313-win_amd64.whl", hash = "sha256:a22f402b410566e2d1c950708c77ebf5ebd5d0d88a6a2e87c86d9fb48afa0d18", size = 1442613, upload-time = "2025-01-29T04:19:27.63Z" }, + { url = "https://files.pythonhosted.org/packages/09/71/54e999902aed72baf26bca0d50781b01838251a462612966e9fc4891eadd/black-25.1.0-py3-none-any.whl", hash = "sha256:95e8176dae143ba9097f351d174fdaf0ccd29efb414b362ae3fd72bf0f710717", size = 207646, upload-time = "2025-01-29T04:15:38.082Z" }, ] [[package]] @@ -96,9 +111,9 @@ source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "cffi" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/70/c5/1a4dc131459e68a173cbdab5fad6b524f53f9c1ef7861b7698e998b837cc/cairocffi-1.7.1.tar.gz", hash = "sha256:2e48ee864884ec4a3a34bfa8c9ab9999f688286eb714a15a43ec9d068c36557b", size = 88096 } +sdist = { url = "https://files.pythonhosted.org/packages/70/c5/1a4dc131459e68a173cbdab5fad6b524f53f9c1ef7861b7698e998b837cc/cairocffi-1.7.1.tar.gz", hash = "sha256:2e48ee864884ec4a3a34bfa8c9ab9999f688286eb714a15a43ec9d068c36557b", size = 88096, upload-time = "2024-06-18T10:56:06.741Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/93/d8/ba13451aa6b745c49536e87b6bf8f629b950e84bd0e8308f7dc6883b67e2/cairocffi-1.7.1-py3-none-any.whl", hash = "sha256:9803a0e11f6c962f3b0ae2ec8ba6ae45e957a146a004697a1ac1bbf16b073b3f", size = 75611 }, + { url = "https://files.pythonhosted.org/packages/93/d8/ba13451aa6b745c49536e87b6bf8f629b950e84bd0e8308f7dc6883b67e2/cairocffi-1.7.1-py3-none-any.whl", hash = "sha256:9803a0e11f6c962f3b0ae2ec8ba6ae45e957a146a004697a1ac1bbf16b073b3f", size = 75611, upload-time = "2024-06-18T10:55:59.489Z" }, ] [[package]] @@ -112,18 +127,18 @@ dependencies = [ { name = "pillow" }, { name = "tinycss2" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/d5/e6/ec5900b724e3c44af7f6f51f719919137284e5da4aabe96508baec8a1b40/CairoSVG-2.7.1.tar.gz", hash = "sha256:432531d72347291b9a9ebfb6777026b607563fd8719c46ee742db0aef7271ba0", size = 8399085 } +sdist = { url = "https://files.pythonhosted.org/packages/d5/e6/ec5900b724e3c44af7f6f51f719919137284e5da4aabe96508baec8a1b40/CairoSVG-2.7.1.tar.gz", hash = "sha256:432531d72347291b9a9ebfb6777026b607563fd8719c46ee742db0aef7271ba0", size = 8399085, upload-time = "2023-08-05T09:08:05.75Z" } 
wheels = [ - { url = "https://files.pythonhosted.org/packages/01/a5/1866b42151f50453f1a0d28fc4c39f5be5f412a2e914f33449c42daafdf1/CairoSVG-2.7.1-py3-none-any.whl", hash = "sha256:8a5222d4e6c3f86f1f7046b63246877a63b49923a1cd202184c3a634ef546b3b", size = 43235 }, + { url = "https://files.pythonhosted.org/packages/01/a5/1866b42151f50453f1a0d28fc4c39f5be5f412a2e914f33449c42daafdf1/CairoSVG-2.7.1-py3-none-any.whl", hash = "sha256:8a5222d4e6c3f86f1f7046b63246877a63b49923a1cd202184c3a634ef546b3b", size = 43235, upload-time = "2023-08-05T09:08:01.583Z" }, ] [[package]] name = "certifi" version = "2024.12.14" source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/0f/bd/1d41ee578ce09523c81a15426705dd20969f5abf006d1afe8aeff0dd776a/certifi-2024.12.14.tar.gz", hash = "sha256:b650d30f370c2b724812bee08008be0c4163b163ddaec3f2546c1caf65f191db", size = 166010 } +sdist = { url = "https://files.pythonhosted.org/packages/0f/bd/1d41ee578ce09523c81a15426705dd20969f5abf006d1afe8aeff0dd776a/certifi-2024.12.14.tar.gz", hash = "sha256:b650d30f370c2b724812bee08008be0c4163b163ddaec3f2546c1caf65f191db", size = 166010, upload-time = "2024-12-14T13:52:38.02Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/a5/32/8f6669fc4798494966bf446c8c4a162e0b5d893dff088afddf76414f70e1/certifi-2024.12.14-py3-none-any.whl", hash = "sha256:1275f7a45be9464efc1173084eaa30f866fe2e47d389406136d332ed4967ec56", size = 164927 }, + { url = "https://files.pythonhosted.org/packages/a5/32/8f6669fc4798494966bf446c8c4a162e0b5d893dff088afddf76414f70e1/certifi-2024.12.14-py3-none-any.whl", hash = "sha256:1275f7a45be9464efc1173084eaa30f866fe2e47d389406136d332ed4967ec56", size = 164927, upload-time = "2024-12-14T13:52:36.114Z" }, ] [[package]] @@ -133,115 +148,115 @@ source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "pycparser" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/fc/97/c783634659c2920c3fc70419e3af40972dbaf758daa229a7d6ea6135c90d/cffi-1.17.1.tar.gz", hash = "sha256:1c39c6016c32bc48dd54561950ebd6836e1670f2ae46128f67cf49e789c52824", size = 516621 } -wheels = [ - { url = "https://files.pythonhosted.org/packages/90/07/f44ca684db4e4f08a3fdc6eeb9a0d15dc6883efc7b8c90357fdbf74e186c/cffi-1.17.1-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:df8b1c11f177bc2313ec4b2d46baec87a5f3e71fc8b45dab2ee7cae86d9aba14", size = 182191 }, - { url = "https://files.pythonhosted.org/packages/08/fd/cc2fedbd887223f9f5d170c96e57cbf655df9831a6546c1727ae13fa977a/cffi-1.17.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:8f2cdc858323644ab277e9bb925ad72ae0e67f69e804f4898c070998d50b1a67", size = 178592 }, - { url = "https://files.pythonhosted.org/packages/de/cc/4635c320081c78d6ffc2cab0a76025b691a91204f4aa317d568ff9280a2d/cffi-1.17.1-cp310-cp310-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:edae79245293e15384b51f88b00613ba9f7198016a5948b5dddf4917d4d26382", size = 426024 }, - { url = "https://files.pythonhosted.org/packages/b6/7b/3b2b250f3aab91abe5f8a51ada1b717935fdaec53f790ad4100fe2ec64d1/cffi-1.17.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:45398b671ac6d70e67da8e4224a065cec6a93541bb7aebe1b198a61b58c7b702", size = 448188 }, - { url = "https://files.pythonhosted.org/packages/d3/48/1b9283ebbf0ec065148d8de05d647a986c5f22586b18120020452fff8f5d/cffi-1.17.1-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = 
"sha256:ad9413ccdeda48c5afdae7e4fa2192157e991ff761e7ab8fdd8926f40b160cc3", size = 455571 }, - { url = "https://files.pythonhosted.org/packages/40/87/3b8452525437b40f39ca7ff70276679772ee7e8b394934ff60e63b7b090c/cffi-1.17.1-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:5da5719280082ac6bd9aa7becb3938dc9f9cbd57fac7d2871717b1feb0902ab6", size = 436687 }, - { url = "https://files.pythonhosted.org/packages/8d/fb/4da72871d177d63649ac449aec2e8a29efe0274035880c7af59101ca2232/cffi-1.17.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2bb1a08b8008b281856e5971307cc386a8e9c5b625ac297e853d36da6efe9c17", size = 446211 }, - { url = "https://files.pythonhosted.org/packages/ab/a0/62f00bcb411332106c02b663b26f3545a9ef136f80d5df746c05878f8c4b/cffi-1.17.1-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:045d61c734659cc045141be4bae381a41d89b741f795af1dd018bfb532fd0df8", size = 461325 }, - { url = "https://files.pythonhosted.org/packages/36/83/76127035ed2e7e27b0787604d99da630ac3123bfb02d8e80c633f218a11d/cffi-1.17.1-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:6883e737d7d9e4899a8a695e00ec36bd4e5e4f18fabe0aca0efe0a4b44cdb13e", size = 438784 }, - { url = "https://files.pythonhosted.org/packages/21/81/a6cd025db2f08ac88b901b745c163d884641909641f9b826e8cb87645942/cffi-1.17.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:6b8b4a92e1c65048ff98cfe1f735ef8f1ceb72e3d5f0c25fdb12087a23da22be", size = 461564 }, - { url = "https://files.pythonhosted.org/packages/f8/fe/4d41c2f200c4a457933dbd98d3cf4e911870877bd94d9656cc0fcb390681/cffi-1.17.1-cp310-cp310-win32.whl", hash = "sha256:c9c3d058ebabb74db66e431095118094d06abf53284d9c81f27300d0e0d8bc7c", size = 171804 }, - { url = "https://files.pythonhosted.org/packages/d1/b6/0b0f5ab93b0df4acc49cae758c81fe4e5ef26c3ae2e10cc69249dfd8b3ab/cffi-1.17.1-cp310-cp310-win_amd64.whl", hash = "sha256:0f048dcf80db46f0098ccac01132761580d28e28bc0f78ae0d58048063317e15", size = 181299 }, - { url = "https://files.pythonhosted.org/packages/6b/f4/927e3a8899e52a27fa57a48607ff7dc91a9ebe97399b357b85a0c7892e00/cffi-1.17.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:a45e3c6913c5b87b3ff120dcdc03f6131fa0065027d0ed7ee6190736a74cd401", size = 182264 }, - { url = "https://files.pythonhosted.org/packages/6c/f5/6c3a8efe5f503175aaddcbea6ad0d2c96dad6f5abb205750d1b3df44ef29/cffi-1.17.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:30c5e0cb5ae493c04c8b42916e52ca38079f1b235c2f8ae5f4527b963c401caf", size = 178651 }, - { url = "https://files.pythonhosted.org/packages/94/dd/a3f0118e688d1b1a57553da23b16bdade96d2f9bcda4d32e7d2838047ff7/cffi-1.17.1-cp311-cp311-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:f75c7ab1f9e4aca5414ed4d8e5c0e303a34f4421f8a0d47a4d019ceff0ab6af4", size = 445259 }, - { url = "https://files.pythonhosted.org/packages/2e/ea/70ce63780f096e16ce8588efe039d3c4f91deb1dc01e9c73a287939c79a6/cffi-1.17.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a1ed2dd2972641495a3ec98445e09766f077aee98a1c896dcb4ad0d303628e41", size = 469200 }, - { url = "https://files.pythonhosted.org/packages/1c/a0/a4fa9f4f781bda074c3ddd57a572b060fa0df7655d2a4247bbe277200146/cffi-1.17.1-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:46bf43160c1a35f7ec506d254e5c890f3c03648a4dbac12d624e4490a7046cd1", size = 477235 }, - { url = 
"https://files.pythonhosted.org/packages/62/12/ce8710b5b8affbcdd5c6e367217c242524ad17a02fe5beec3ee339f69f85/cffi-1.17.1-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:a24ed04c8ffd54b0729c07cee15a81d964e6fee0e3d4d342a27b020d22959dc6", size = 459721 }, - { url = "https://files.pythonhosted.org/packages/ff/6b/d45873c5e0242196f042d555526f92aa9e0c32355a1be1ff8c27f077fd37/cffi-1.17.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:610faea79c43e44c71e1ec53a554553fa22321b65fae24889706c0a84d4ad86d", size = 467242 }, - { url = "https://files.pythonhosted.org/packages/1a/52/d9a0e523a572fbccf2955f5abe883cfa8bcc570d7faeee06336fbd50c9fc/cffi-1.17.1-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:a9b15d491f3ad5d692e11f6b71f7857e7835eb677955c00cc0aefcd0669adaf6", size = 477999 }, - { url = "https://files.pythonhosted.org/packages/44/74/f2a2460684a1a2d00ca799ad880d54652841a780c4c97b87754f660c7603/cffi-1.17.1-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:de2ea4b5833625383e464549fec1bc395c1bdeeb5f25c4a3a82b5a8c756ec22f", size = 454242 }, - { url = "https://files.pythonhosted.org/packages/f8/4a/34599cac7dfcd888ff54e801afe06a19c17787dfd94495ab0c8d35fe99fb/cffi-1.17.1-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:fc48c783f9c87e60831201f2cce7f3b2e4846bf4d8728eabe54d60700b318a0b", size = 478604 }, - { url = "https://files.pythonhosted.org/packages/34/33/e1b8a1ba29025adbdcda5fb3a36f94c03d771c1b7b12f726ff7fef2ebe36/cffi-1.17.1-cp311-cp311-win32.whl", hash = "sha256:85a950a4ac9c359340d5963966e3e0a94a676bd6245a4b55bc43949eee26a655", size = 171727 }, - { url = "https://files.pythonhosted.org/packages/3d/97/50228be003bb2802627d28ec0627837ac0bf35c90cf769812056f235b2d1/cffi-1.17.1-cp311-cp311-win_amd64.whl", hash = "sha256:caaf0640ef5f5517f49bc275eca1406b0ffa6aa184892812030f04c2abf589a0", size = 181400 }, - { url = "https://files.pythonhosted.org/packages/5a/84/e94227139ee5fb4d600a7a4927f322e1d4aea6fdc50bd3fca8493caba23f/cffi-1.17.1-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:805b4371bf7197c329fcb3ead37e710d1bca9da5d583f5073b799d5c5bd1eee4", size = 183178 }, - { url = "https://files.pythonhosted.org/packages/da/ee/fb72c2b48656111c4ef27f0f91da355e130a923473bf5ee75c5643d00cca/cffi-1.17.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:733e99bc2df47476e3848417c5a4540522f234dfd4ef3ab7fafdf555b082ec0c", size = 178840 }, - { url = "https://files.pythonhosted.org/packages/cc/b6/db007700f67d151abadf508cbfd6a1884f57eab90b1bb985c4c8c02b0f28/cffi-1.17.1-cp312-cp312-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1257bdabf294dceb59f5e70c64a3e2f462c30c7ad68092d01bbbfb1c16b1ba36", size = 454803 }, - { url = "https://files.pythonhosted.org/packages/1a/df/f8d151540d8c200eb1c6fba8cd0dfd40904f1b0682ea705c36e6c2e97ab3/cffi-1.17.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:da95af8214998d77a98cc14e3a3bd00aa191526343078b530ceb0bd710fb48a5", size = 478850 }, - { url = "https://files.pythonhosted.org/packages/28/c0/b31116332a547fd2677ae5b78a2ef662dfc8023d67f41b2a83f7c2aa78b1/cffi-1.17.1-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:d63afe322132c194cf832bfec0dc69a99fb9bb6bbd550f161a49e9e855cc78ff", size = 485729 }, - { url = "https://files.pythonhosted.org/packages/91/2b/9a1ddfa5c7f13cab007a2c9cc295b70fbbda7cb10a286aa6810338e60ea1/cffi-1.17.1-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = 
"sha256:f79fc4fc25f1c8698ff97788206bb3c2598949bfe0fef03d299eb1b5356ada99", size = 471256 }, - { url = "https://files.pythonhosted.org/packages/b2/d5/da47df7004cb17e4955df6a43d14b3b4ae77737dff8bf7f8f333196717bf/cffi-1.17.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b62ce867176a75d03a665bad002af8e6d54644fad99a3c70905c543130e39d93", size = 479424 }, - { url = "https://files.pythonhosted.org/packages/0b/ac/2a28bcf513e93a219c8a4e8e125534f4f6db03e3179ba1c45e949b76212c/cffi-1.17.1-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:386c8bf53c502fff58903061338ce4f4950cbdcb23e2902d86c0f722b786bbe3", size = 484568 }, - { url = "https://files.pythonhosted.org/packages/d4/38/ca8a4f639065f14ae0f1d9751e70447a261f1a30fa7547a828ae08142465/cffi-1.17.1-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:4ceb10419a9adf4460ea14cfd6bc43d08701f0835e979bf821052f1805850fe8", size = 488736 }, - { url = "https://files.pythonhosted.org/packages/86/c5/28b2d6f799ec0bdecf44dced2ec5ed43e0eb63097b0f58c293583b406582/cffi-1.17.1-cp312-cp312-win32.whl", hash = "sha256:a08d7e755f8ed21095a310a693525137cfe756ce62d066e53f502a83dc550f65", size = 172448 }, - { url = "https://files.pythonhosted.org/packages/50/b9/db34c4755a7bd1cb2d1603ac3863f22bcecbd1ba29e5ee841a4bc510b294/cffi-1.17.1-cp312-cp312-win_amd64.whl", hash = "sha256:51392eae71afec0d0c8fb1a53b204dbb3bcabcb3c9b807eedf3e1e6ccf2de903", size = 181976 }, - { url = "https://files.pythonhosted.org/packages/8d/f8/dd6c246b148639254dad4d6803eb6a54e8c85c6e11ec9df2cffa87571dbe/cffi-1.17.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:f3a2b4222ce6b60e2e8b337bb9596923045681d71e5a082783484d845390938e", size = 182989 }, - { url = "https://files.pythonhosted.org/packages/8b/f1/672d303ddf17c24fc83afd712316fda78dc6fce1cd53011b839483e1ecc8/cffi-1.17.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:0984a4925a435b1da406122d4d7968dd861c1385afe3b45ba82b750f229811e2", size = 178802 }, - { url = "https://files.pythonhosted.org/packages/0e/2d/eab2e858a91fdff70533cab61dcff4a1f55ec60425832ddfdc9cd36bc8af/cffi-1.17.1-cp313-cp313-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d01b12eeeb4427d3110de311e1774046ad344f5b1a7403101878976ecd7a10f3", size = 454792 }, - { url = "https://files.pythonhosted.org/packages/75/b2/fbaec7c4455c604e29388d55599b99ebcc250a60050610fadde58932b7ee/cffi-1.17.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:706510fe141c86a69c8ddc029c7910003a17353970cff3b904ff0686a5927683", size = 478893 }, - { url = "https://files.pythonhosted.org/packages/4f/b7/6e4a2162178bf1935c336d4da8a9352cccab4d3a5d7914065490f08c0690/cffi-1.17.1-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:de55b766c7aa2e2a3092c51e0483d700341182f08e67c63630d5b6f200bb28e5", size = 485810 }, - { url = "https://files.pythonhosted.org/packages/c7/8a/1d0e4a9c26e54746dc08c2c6c037889124d4f59dffd853a659fa545f1b40/cffi-1.17.1-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:c59d6e989d07460165cc5ad3c61f9fd8f1b4796eacbd81cee78957842b834af4", size = 471200 }, - { url = "https://files.pythonhosted.org/packages/26/9f/1aab65a6c0db35f43c4d1b4f580e8df53914310afc10ae0397d29d697af4/cffi-1.17.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:dd398dbc6773384a17fe0d3e7eeb8d1a21c2200473ee6806bb5e6a8e62bb73dd", size = 479447 }, - { url = 
"https://files.pythonhosted.org/packages/5f/e4/fb8b3dd8dc0e98edf1135ff067ae070bb32ef9d509d6cb0f538cd6f7483f/cffi-1.17.1-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:3edc8d958eb099c634dace3c7e16560ae474aa3803a5df240542b305d14e14ed", size = 484358 }, - { url = "https://files.pythonhosted.org/packages/f1/47/d7145bf2dc04684935d57d67dff9d6d795b2ba2796806bb109864be3a151/cffi-1.17.1-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:72e72408cad3d5419375fc87d289076ee319835bdfa2caad331e377589aebba9", size = 488469 }, - { url = "https://files.pythonhosted.org/packages/bf/ee/f94057fa6426481d663b88637a9a10e859e492c73d0384514a17d78ee205/cffi-1.17.1-cp313-cp313-win32.whl", hash = "sha256:e03eab0a8677fa80d646b5ddece1cbeaf556c313dcfac435ba11f107ba117b5d", size = 172475 }, - { url = "https://files.pythonhosted.org/packages/7c/fc/6a8cb64e5f0324877d503c854da15d76c1e50eb722e320b15345c4d0c6de/cffi-1.17.1-cp313-cp313-win_amd64.whl", hash = "sha256:f6a16c31041f09ead72d69f583767292f750d24913dadacf5756b966aacb3f1a", size = 182009 }, +sdist = { url = "https://files.pythonhosted.org/packages/fc/97/c783634659c2920c3fc70419e3af40972dbaf758daa229a7d6ea6135c90d/cffi-1.17.1.tar.gz", hash = "sha256:1c39c6016c32bc48dd54561950ebd6836e1670f2ae46128f67cf49e789c52824", size = 516621, upload-time = "2024-09-04T20:45:21.852Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/90/07/f44ca684db4e4f08a3fdc6eeb9a0d15dc6883efc7b8c90357fdbf74e186c/cffi-1.17.1-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:df8b1c11f177bc2313ec4b2d46baec87a5f3e71fc8b45dab2ee7cae86d9aba14", size = 182191, upload-time = "2024-09-04T20:43:30.027Z" }, + { url = "https://files.pythonhosted.org/packages/08/fd/cc2fedbd887223f9f5d170c96e57cbf655df9831a6546c1727ae13fa977a/cffi-1.17.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:8f2cdc858323644ab277e9bb925ad72ae0e67f69e804f4898c070998d50b1a67", size = 178592, upload-time = "2024-09-04T20:43:32.108Z" }, + { url = "https://files.pythonhosted.org/packages/de/cc/4635c320081c78d6ffc2cab0a76025b691a91204f4aa317d568ff9280a2d/cffi-1.17.1-cp310-cp310-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:edae79245293e15384b51f88b00613ba9f7198016a5948b5dddf4917d4d26382", size = 426024, upload-time = "2024-09-04T20:43:34.186Z" }, + { url = "https://files.pythonhosted.org/packages/b6/7b/3b2b250f3aab91abe5f8a51ada1b717935fdaec53f790ad4100fe2ec64d1/cffi-1.17.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:45398b671ac6d70e67da8e4224a065cec6a93541bb7aebe1b198a61b58c7b702", size = 448188, upload-time = "2024-09-04T20:43:36.286Z" }, + { url = "https://files.pythonhosted.org/packages/d3/48/1b9283ebbf0ec065148d8de05d647a986c5f22586b18120020452fff8f5d/cffi-1.17.1-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ad9413ccdeda48c5afdae7e4fa2192157e991ff761e7ab8fdd8926f40b160cc3", size = 455571, upload-time = "2024-09-04T20:43:38.586Z" }, + { url = "https://files.pythonhosted.org/packages/40/87/3b8452525437b40f39ca7ff70276679772ee7e8b394934ff60e63b7b090c/cffi-1.17.1-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:5da5719280082ac6bd9aa7becb3938dc9f9cbd57fac7d2871717b1feb0902ab6", size = 436687, upload-time = "2024-09-04T20:43:40.084Z" }, + { url = "https://files.pythonhosted.org/packages/8d/fb/4da72871d177d63649ac449aec2e8a29efe0274035880c7af59101ca2232/cffi-1.17.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = 
"sha256:2bb1a08b8008b281856e5971307cc386a8e9c5b625ac297e853d36da6efe9c17", size = 446211, upload-time = "2024-09-04T20:43:41.526Z" }, + { url = "https://files.pythonhosted.org/packages/ab/a0/62f00bcb411332106c02b663b26f3545a9ef136f80d5df746c05878f8c4b/cffi-1.17.1-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:045d61c734659cc045141be4bae381a41d89b741f795af1dd018bfb532fd0df8", size = 461325, upload-time = "2024-09-04T20:43:43.117Z" }, + { url = "https://files.pythonhosted.org/packages/36/83/76127035ed2e7e27b0787604d99da630ac3123bfb02d8e80c633f218a11d/cffi-1.17.1-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:6883e737d7d9e4899a8a695e00ec36bd4e5e4f18fabe0aca0efe0a4b44cdb13e", size = 438784, upload-time = "2024-09-04T20:43:45.256Z" }, + { url = "https://files.pythonhosted.org/packages/21/81/a6cd025db2f08ac88b901b745c163d884641909641f9b826e8cb87645942/cffi-1.17.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:6b8b4a92e1c65048ff98cfe1f735ef8f1ceb72e3d5f0c25fdb12087a23da22be", size = 461564, upload-time = "2024-09-04T20:43:46.779Z" }, + { url = "https://files.pythonhosted.org/packages/f8/fe/4d41c2f200c4a457933dbd98d3cf4e911870877bd94d9656cc0fcb390681/cffi-1.17.1-cp310-cp310-win32.whl", hash = "sha256:c9c3d058ebabb74db66e431095118094d06abf53284d9c81f27300d0e0d8bc7c", size = 171804, upload-time = "2024-09-04T20:43:48.186Z" }, + { url = "https://files.pythonhosted.org/packages/d1/b6/0b0f5ab93b0df4acc49cae758c81fe4e5ef26c3ae2e10cc69249dfd8b3ab/cffi-1.17.1-cp310-cp310-win_amd64.whl", hash = "sha256:0f048dcf80db46f0098ccac01132761580d28e28bc0f78ae0d58048063317e15", size = 181299, upload-time = "2024-09-04T20:43:49.812Z" }, + { url = "https://files.pythonhosted.org/packages/6b/f4/927e3a8899e52a27fa57a48607ff7dc91a9ebe97399b357b85a0c7892e00/cffi-1.17.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:a45e3c6913c5b87b3ff120dcdc03f6131fa0065027d0ed7ee6190736a74cd401", size = 182264, upload-time = "2024-09-04T20:43:51.124Z" }, + { url = "https://files.pythonhosted.org/packages/6c/f5/6c3a8efe5f503175aaddcbea6ad0d2c96dad6f5abb205750d1b3df44ef29/cffi-1.17.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:30c5e0cb5ae493c04c8b42916e52ca38079f1b235c2f8ae5f4527b963c401caf", size = 178651, upload-time = "2024-09-04T20:43:52.872Z" }, + { url = "https://files.pythonhosted.org/packages/94/dd/a3f0118e688d1b1a57553da23b16bdade96d2f9bcda4d32e7d2838047ff7/cffi-1.17.1-cp311-cp311-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:f75c7ab1f9e4aca5414ed4d8e5c0e303a34f4421f8a0d47a4d019ceff0ab6af4", size = 445259, upload-time = "2024-09-04T20:43:56.123Z" }, + { url = "https://files.pythonhosted.org/packages/2e/ea/70ce63780f096e16ce8588efe039d3c4f91deb1dc01e9c73a287939c79a6/cffi-1.17.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a1ed2dd2972641495a3ec98445e09766f077aee98a1c896dcb4ad0d303628e41", size = 469200, upload-time = "2024-09-04T20:43:57.891Z" }, + { url = "https://files.pythonhosted.org/packages/1c/a0/a4fa9f4f781bda074c3ddd57a572b060fa0df7655d2a4247bbe277200146/cffi-1.17.1-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:46bf43160c1a35f7ec506d254e5c890f3c03648a4dbac12d624e4490a7046cd1", size = 477235, upload-time = "2024-09-04T20:44:00.18Z" }, + { url = "https://files.pythonhosted.org/packages/62/12/ce8710b5b8affbcdd5c6e367217c242524ad17a02fe5beec3ee339f69f85/cffi-1.17.1-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = 
"sha256:a24ed04c8ffd54b0729c07cee15a81d964e6fee0e3d4d342a27b020d22959dc6", size = 459721, upload-time = "2024-09-04T20:44:01.585Z" }, + { url = "https://files.pythonhosted.org/packages/ff/6b/d45873c5e0242196f042d555526f92aa9e0c32355a1be1ff8c27f077fd37/cffi-1.17.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:610faea79c43e44c71e1ec53a554553fa22321b65fae24889706c0a84d4ad86d", size = 467242, upload-time = "2024-09-04T20:44:03.467Z" }, + { url = "https://files.pythonhosted.org/packages/1a/52/d9a0e523a572fbccf2955f5abe883cfa8bcc570d7faeee06336fbd50c9fc/cffi-1.17.1-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:a9b15d491f3ad5d692e11f6b71f7857e7835eb677955c00cc0aefcd0669adaf6", size = 477999, upload-time = "2024-09-04T20:44:05.023Z" }, + { url = "https://files.pythonhosted.org/packages/44/74/f2a2460684a1a2d00ca799ad880d54652841a780c4c97b87754f660c7603/cffi-1.17.1-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:de2ea4b5833625383e464549fec1bc395c1bdeeb5f25c4a3a82b5a8c756ec22f", size = 454242, upload-time = "2024-09-04T20:44:06.444Z" }, + { url = "https://files.pythonhosted.org/packages/f8/4a/34599cac7dfcd888ff54e801afe06a19c17787dfd94495ab0c8d35fe99fb/cffi-1.17.1-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:fc48c783f9c87e60831201f2cce7f3b2e4846bf4d8728eabe54d60700b318a0b", size = 478604, upload-time = "2024-09-04T20:44:08.206Z" }, + { url = "https://files.pythonhosted.org/packages/34/33/e1b8a1ba29025adbdcda5fb3a36f94c03d771c1b7b12f726ff7fef2ebe36/cffi-1.17.1-cp311-cp311-win32.whl", hash = "sha256:85a950a4ac9c359340d5963966e3e0a94a676bd6245a4b55bc43949eee26a655", size = 171727, upload-time = "2024-09-04T20:44:09.481Z" }, + { url = "https://files.pythonhosted.org/packages/3d/97/50228be003bb2802627d28ec0627837ac0bf35c90cf769812056f235b2d1/cffi-1.17.1-cp311-cp311-win_amd64.whl", hash = "sha256:caaf0640ef5f5517f49bc275eca1406b0ffa6aa184892812030f04c2abf589a0", size = 181400, upload-time = "2024-09-04T20:44:10.873Z" }, + { url = "https://files.pythonhosted.org/packages/5a/84/e94227139ee5fb4d600a7a4927f322e1d4aea6fdc50bd3fca8493caba23f/cffi-1.17.1-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:805b4371bf7197c329fcb3ead37e710d1bca9da5d583f5073b799d5c5bd1eee4", size = 183178, upload-time = "2024-09-04T20:44:12.232Z" }, + { url = "https://files.pythonhosted.org/packages/da/ee/fb72c2b48656111c4ef27f0f91da355e130a923473bf5ee75c5643d00cca/cffi-1.17.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:733e99bc2df47476e3848417c5a4540522f234dfd4ef3ab7fafdf555b082ec0c", size = 178840, upload-time = "2024-09-04T20:44:13.739Z" }, + { url = "https://files.pythonhosted.org/packages/cc/b6/db007700f67d151abadf508cbfd6a1884f57eab90b1bb985c4c8c02b0f28/cffi-1.17.1-cp312-cp312-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1257bdabf294dceb59f5e70c64a3e2f462c30c7ad68092d01bbbfb1c16b1ba36", size = 454803, upload-time = "2024-09-04T20:44:15.231Z" }, + { url = "https://files.pythonhosted.org/packages/1a/df/f8d151540d8c200eb1c6fba8cd0dfd40904f1b0682ea705c36e6c2e97ab3/cffi-1.17.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:da95af8214998d77a98cc14e3a3bd00aa191526343078b530ceb0bd710fb48a5", size = 478850, upload-time = "2024-09-04T20:44:17.188Z" }, + { url = "https://files.pythonhosted.org/packages/28/c0/b31116332a547fd2677ae5b78a2ef662dfc8023d67f41b2a83f7c2aa78b1/cffi-1.17.1-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = 
"sha256:d63afe322132c194cf832bfec0dc69a99fb9bb6bbd550f161a49e9e855cc78ff", size = 485729, upload-time = "2024-09-04T20:44:18.688Z" }, + { url = "https://files.pythonhosted.org/packages/91/2b/9a1ddfa5c7f13cab007a2c9cc295b70fbbda7cb10a286aa6810338e60ea1/cffi-1.17.1-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:f79fc4fc25f1c8698ff97788206bb3c2598949bfe0fef03d299eb1b5356ada99", size = 471256, upload-time = "2024-09-04T20:44:20.248Z" }, + { url = "https://files.pythonhosted.org/packages/b2/d5/da47df7004cb17e4955df6a43d14b3b4ae77737dff8bf7f8f333196717bf/cffi-1.17.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b62ce867176a75d03a665bad002af8e6d54644fad99a3c70905c543130e39d93", size = 479424, upload-time = "2024-09-04T20:44:21.673Z" }, + { url = "https://files.pythonhosted.org/packages/0b/ac/2a28bcf513e93a219c8a4e8e125534f4f6db03e3179ba1c45e949b76212c/cffi-1.17.1-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:386c8bf53c502fff58903061338ce4f4950cbdcb23e2902d86c0f722b786bbe3", size = 484568, upload-time = "2024-09-04T20:44:23.245Z" }, + { url = "https://files.pythonhosted.org/packages/d4/38/ca8a4f639065f14ae0f1d9751e70447a261f1a30fa7547a828ae08142465/cffi-1.17.1-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:4ceb10419a9adf4460ea14cfd6bc43d08701f0835e979bf821052f1805850fe8", size = 488736, upload-time = "2024-09-04T20:44:24.757Z" }, + { url = "https://files.pythonhosted.org/packages/86/c5/28b2d6f799ec0bdecf44dced2ec5ed43e0eb63097b0f58c293583b406582/cffi-1.17.1-cp312-cp312-win32.whl", hash = "sha256:a08d7e755f8ed21095a310a693525137cfe756ce62d066e53f502a83dc550f65", size = 172448, upload-time = "2024-09-04T20:44:26.208Z" }, + { url = "https://files.pythonhosted.org/packages/50/b9/db34c4755a7bd1cb2d1603ac3863f22bcecbd1ba29e5ee841a4bc510b294/cffi-1.17.1-cp312-cp312-win_amd64.whl", hash = "sha256:51392eae71afec0d0c8fb1a53b204dbb3bcabcb3c9b807eedf3e1e6ccf2de903", size = 181976, upload-time = "2024-09-04T20:44:27.578Z" }, + { url = "https://files.pythonhosted.org/packages/8d/f8/dd6c246b148639254dad4d6803eb6a54e8c85c6e11ec9df2cffa87571dbe/cffi-1.17.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:f3a2b4222ce6b60e2e8b337bb9596923045681d71e5a082783484d845390938e", size = 182989, upload-time = "2024-09-04T20:44:28.956Z" }, + { url = "https://files.pythonhosted.org/packages/8b/f1/672d303ddf17c24fc83afd712316fda78dc6fce1cd53011b839483e1ecc8/cffi-1.17.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:0984a4925a435b1da406122d4d7968dd861c1385afe3b45ba82b750f229811e2", size = 178802, upload-time = "2024-09-04T20:44:30.289Z" }, + { url = "https://files.pythonhosted.org/packages/0e/2d/eab2e858a91fdff70533cab61dcff4a1f55ec60425832ddfdc9cd36bc8af/cffi-1.17.1-cp313-cp313-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d01b12eeeb4427d3110de311e1774046ad344f5b1a7403101878976ecd7a10f3", size = 454792, upload-time = "2024-09-04T20:44:32.01Z" }, + { url = "https://files.pythonhosted.org/packages/75/b2/fbaec7c4455c604e29388d55599b99ebcc250a60050610fadde58932b7ee/cffi-1.17.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:706510fe141c86a69c8ddc029c7910003a17353970cff3b904ff0686a5927683", size = 478893, upload-time = "2024-09-04T20:44:33.606Z" }, + { url = "https://files.pythonhosted.org/packages/4f/b7/6e4a2162178bf1935c336d4da8a9352cccab4d3a5d7914065490f08c0690/cffi-1.17.1-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = 
"sha256:de55b766c7aa2e2a3092c51e0483d700341182f08e67c63630d5b6f200bb28e5", size = 485810, upload-time = "2024-09-04T20:44:35.191Z" }, + { url = "https://files.pythonhosted.org/packages/c7/8a/1d0e4a9c26e54746dc08c2c6c037889124d4f59dffd853a659fa545f1b40/cffi-1.17.1-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:c59d6e989d07460165cc5ad3c61f9fd8f1b4796eacbd81cee78957842b834af4", size = 471200, upload-time = "2024-09-04T20:44:36.743Z" }, + { url = "https://files.pythonhosted.org/packages/26/9f/1aab65a6c0db35f43c4d1b4f580e8df53914310afc10ae0397d29d697af4/cffi-1.17.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:dd398dbc6773384a17fe0d3e7eeb8d1a21c2200473ee6806bb5e6a8e62bb73dd", size = 479447, upload-time = "2024-09-04T20:44:38.492Z" }, + { url = "https://files.pythonhosted.org/packages/5f/e4/fb8b3dd8dc0e98edf1135ff067ae070bb32ef9d509d6cb0f538cd6f7483f/cffi-1.17.1-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:3edc8d958eb099c634dace3c7e16560ae474aa3803a5df240542b305d14e14ed", size = 484358, upload-time = "2024-09-04T20:44:40.046Z" }, + { url = "https://files.pythonhosted.org/packages/f1/47/d7145bf2dc04684935d57d67dff9d6d795b2ba2796806bb109864be3a151/cffi-1.17.1-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:72e72408cad3d5419375fc87d289076ee319835bdfa2caad331e377589aebba9", size = 488469, upload-time = "2024-09-04T20:44:41.616Z" }, + { url = "https://files.pythonhosted.org/packages/bf/ee/f94057fa6426481d663b88637a9a10e859e492c73d0384514a17d78ee205/cffi-1.17.1-cp313-cp313-win32.whl", hash = "sha256:e03eab0a8677fa80d646b5ddece1cbeaf556c313dcfac435ba11f107ba117b5d", size = 172475, upload-time = "2024-09-04T20:44:43.733Z" }, + { url = "https://files.pythonhosted.org/packages/7c/fc/6a8cb64e5f0324877d503c854da15d76c1e50eb722e320b15345c4d0c6de/cffi-1.17.1-cp313-cp313-win_amd64.whl", hash = "sha256:f6a16c31041f09ead72d69f583767292f750d24913dadacf5756b966aacb3f1a", size = 182009, upload-time = "2024-09-04T20:44:45.309Z" }, ] [[package]] name = "charset-normalizer" version = "3.4.1" source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/16/b0/572805e227f01586461c80e0fd25d65a2115599cc9dad142fee4b747c357/charset_normalizer-3.4.1.tar.gz", hash = "sha256:44251f18cd68a75b56585dd00dae26183e102cd5e0f9f1466e6df5da2ed64ea3", size = 123188 } -wheels = [ - { url = "https://files.pythonhosted.org/packages/0d/58/5580c1716040bc89206c77d8f74418caf82ce519aae06450393ca73475d1/charset_normalizer-3.4.1-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:91b36a978b5ae0ee86c394f5a54d6ef44db1de0815eb43de826d41d21e4af3de", size = 198013 }, - { url = "https://files.pythonhosted.org/packages/d0/11/00341177ae71c6f5159a08168bcb98c6e6d196d372c94511f9f6c9afe0c6/charset_normalizer-3.4.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:7461baadb4dc00fd9e0acbe254e3d7d2112e7f92ced2adc96e54ef6501c5f176", size = 141285 }, - { url = "https://files.pythonhosted.org/packages/01/09/11d684ea5819e5a8f5100fb0b38cf8d02b514746607934134d31233e02c8/charset_normalizer-3.4.1-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:e218488cd232553829be0664c2292d3af2eeeb94b32bea483cf79ac6a694e037", size = 151449 }, - { url = "https://files.pythonhosted.org/packages/08/06/9f5a12939db324d905dc1f70591ae7d7898d030d7662f0d426e2286f68c9/charset_normalizer-3.4.1-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = 
"sha256:80ed5e856eb7f30115aaf94e4a08114ccc8813e6ed1b5efa74f9f82e8509858f", size = 143892 }, - { url = "https://files.pythonhosted.org/packages/93/62/5e89cdfe04584cb7f4d36003ffa2936681b03ecc0754f8e969c2becb7e24/charset_normalizer-3.4.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b010a7a4fd316c3c484d482922d13044979e78d1861f0e0650423144c616a46a", size = 146123 }, - { url = "https://files.pythonhosted.org/packages/a9/ac/ab729a15c516da2ab70a05f8722ecfccc3f04ed7a18e45c75bbbaa347d61/charset_normalizer-3.4.1-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:4532bff1b8421fd0a320463030c7520f56a79c9024a4e88f01c537316019005a", size = 147943 }, - { url = "https://files.pythonhosted.org/packages/03/d2/3f392f23f042615689456e9a274640c1d2e5dd1d52de36ab8f7955f8f050/charset_normalizer-3.4.1-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:d973f03c0cb71c5ed99037b870f2be986c3c05e63622c017ea9816881d2dd247", size = 142063 }, - { url = "https://files.pythonhosted.org/packages/f2/e3/e20aae5e1039a2cd9b08d9205f52142329f887f8cf70da3650326670bddf/charset_normalizer-3.4.1-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:3a3bd0dcd373514dcec91c411ddb9632c0d7d92aed7093b8c3bbb6d69ca74408", size = 150578 }, - { url = "https://files.pythonhosted.org/packages/8d/af/779ad72a4da0aed925e1139d458adc486e61076d7ecdcc09e610ea8678db/charset_normalizer-3.4.1-cp310-cp310-musllinux_1_2_ppc64le.whl", hash = "sha256:d9c3cdf5390dcd29aa8056d13e8e99526cda0305acc038b96b30352aff5ff2bb", size = 153629 }, - { url = "https://files.pythonhosted.org/packages/c2/b6/7aa450b278e7aa92cf7732140bfd8be21f5f29d5bf334ae987c945276639/charset_normalizer-3.4.1-cp310-cp310-musllinux_1_2_s390x.whl", hash = "sha256:2bdfe3ac2e1bbe5b59a1a63721eb3b95fc9b6817ae4a46debbb4e11f6232428d", size = 150778 }, - { url = "https://files.pythonhosted.org/packages/39/f4/d9f4f712d0951dcbfd42920d3db81b00dd23b6ab520419626f4023334056/charset_normalizer-3.4.1-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:eab677309cdb30d047996b36d34caeda1dc91149e4fdca0b1a039b3f79d9a807", size = 146453 }, - { url = "https://files.pythonhosted.org/packages/49/2b/999d0314e4ee0cff3cb83e6bc9aeddd397eeed693edb4facb901eb8fbb69/charset_normalizer-3.4.1-cp310-cp310-win32.whl", hash = "sha256:c0429126cf75e16c4f0ad00ee0eae4242dc652290f940152ca8c75c3a4b6ee8f", size = 95479 }, - { url = "https://files.pythonhosted.org/packages/2d/ce/3cbed41cff67e455a386fb5e5dd8906cdda2ed92fbc6297921f2e4419309/charset_normalizer-3.4.1-cp310-cp310-win_amd64.whl", hash = "sha256:9f0b8b1c6d84c8034a44893aba5e767bf9c7a211e313a9605d9c617d7083829f", size = 102790 }, - { url = "https://files.pythonhosted.org/packages/72/80/41ef5d5a7935d2d3a773e3eaebf0a9350542f2cab4eac59a7a4741fbbbbe/charset_normalizer-3.4.1-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:8bfa33f4f2672964266e940dd22a195989ba31669bd84629f05fab3ef4e2d125", size = 194995 }, - { url = "https://files.pythonhosted.org/packages/7a/28/0b9fefa7b8b080ec492110af6d88aa3dea91c464b17d53474b6e9ba5d2c5/charset_normalizer-3.4.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:28bf57629c75e810b6ae989f03c0828d64d6b26a5e205535585f96093e405ed1", size = 139471 }, - { url = "https://files.pythonhosted.org/packages/71/64/d24ab1a997efb06402e3fc07317e94da358e2585165930d9d59ad45fcae2/charset_normalizer-3.4.1-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:f08ff5e948271dc7e18a35641d2f11a4cd8dfd5634f55228b691e62b37125eb3", size = 
149831 }, - { url = "https://files.pythonhosted.org/packages/37/ed/be39e5258e198655240db5e19e0b11379163ad7070962d6b0c87ed2c4d39/charset_normalizer-3.4.1-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:234ac59ea147c59ee4da87a0c0f098e9c8d169f4dc2a159ef720f1a61bbe27cd", size = 142335 }, - { url = "https://files.pythonhosted.org/packages/88/83/489e9504711fa05d8dde1574996408026bdbdbd938f23be67deebb5eca92/charset_normalizer-3.4.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:fd4ec41f914fa74ad1b8304bbc634b3de73d2a0889bd32076342a573e0779e00", size = 143862 }, - { url = "https://files.pythonhosted.org/packages/c6/c7/32da20821cf387b759ad24627a9aca289d2822de929b8a41b6241767b461/charset_normalizer-3.4.1-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:eea6ee1db730b3483adf394ea72f808b6e18cf3cb6454b4d86e04fa8c4327a12", size = 145673 }, - { url = "https://files.pythonhosted.org/packages/68/85/f4288e96039abdd5aeb5c546fa20a37b50da71b5cf01e75e87f16cd43304/charset_normalizer-3.4.1-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:c96836c97b1238e9c9e3fe90844c947d5afbf4f4c92762679acfe19927d81d77", size = 140211 }, - { url = "https://files.pythonhosted.org/packages/28/a3/a42e70d03cbdabc18997baf4f0227c73591a08041c149e710045c281f97b/charset_normalizer-3.4.1-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:4d86f7aff21ee58f26dcf5ae81a9addbd914115cdebcbb2217e4f0ed8982e146", size = 148039 }, - { url = "https://files.pythonhosted.org/packages/85/e4/65699e8ab3014ecbe6f5c71d1a55d810fb716bbfd74f6283d5c2aa87febf/charset_normalizer-3.4.1-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:09b5e6733cbd160dcc09589227187e242a30a49ca5cefa5a7edd3f9d19ed53fd", size = 151939 }, - { url = "https://files.pythonhosted.org/packages/b1/82/8e9fe624cc5374193de6860aba3ea8070f584c8565ee77c168ec13274bd2/charset_normalizer-3.4.1-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:5777ee0881f9499ed0f71cc82cf873d9a0ca8af166dfa0af8ec4e675b7df48e6", size = 149075 }, - { url = "https://files.pythonhosted.org/packages/3d/7b/82865ba54c765560c8433f65e8acb9217cb839a9e32b42af4aa8e945870f/charset_normalizer-3.4.1-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:237bdbe6159cff53b4f24f397d43c6336c6b0b42affbe857970cefbb620911c8", size = 144340 }, - { url = "https://files.pythonhosted.org/packages/b5/b6/9674a4b7d4d99a0d2df9b215da766ee682718f88055751e1e5e753c82db0/charset_normalizer-3.4.1-cp311-cp311-win32.whl", hash = "sha256:8417cb1f36cc0bc7eaba8ccb0e04d55f0ee52df06df3ad55259b9a323555fc8b", size = 95205 }, - { url = "https://files.pythonhosted.org/packages/1e/ab/45b180e175de4402dcf7547e4fb617283bae54ce35c27930a6f35b6bef15/charset_normalizer-3.4.1-cp311-cp311-win_amd64.whl", hash = "sha256:d7f50a1f8c450f3925cb367d011448c39239bb3eb4117c36a6d354794de4ce76", size = 102441 }, - { url = "https://files.pythonhosted.org/packages/0a/9a/dd1e1cdceb841925b7798369a09279bd1cf183cef0f9ddf15a3a6502ee45/charset_normalizer-3.4.1-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:73d94b58ec7fecbc7366247d3b0b10a21681004153238750bb67bd9012414545", size = 196105 }, - { url = "https://files.pythonhosted.org/packages/d3/8c/90bfabf8c4809ecb648f39794cf2a84ff2e7d2a6cf159fe68d9a26160467/charset_normalizer-3.4.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:dad3e487649f498dd991eeb901125411559b22e8d7ab25d3aeb1af367df5efd7", size = 140404 }, - { url = 
"https://files.pythonhosted.org/packages/ad/8f/e410d57c721945ea3b4f1a04b74f70ce8fa800d393d72899f0a40526401f/charset_normalizer-3.4.1-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:c30197aa96e8eed02200a83fba2657b4c3acd0f0aa4bdc9f6c1af8e8962e0757", size = 150423 }, - { url = "https://files.pythonhosted.org/packages/f0/b8/e6825e25deb691ff98cf5c9072ee0605dc2acfca98af70c2d1b1bc75190d/charset_normalizer-3.4.1-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:2369eea1ee4a7610a860d88f268eb39b95cb588acd7235e02fd5a5601773d4fa", size = 143184 }, - { url = "https://files.pythonhosted.org/packages/3e/a2/513f6cbe752421f16d969e32f3583762bfd583848b763913ddab8d9bfd4f/charset_normalizer-3.4.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bc2722592d8998c870fa4e290c2eec2c1569b87fe58618e67d38b4665dfa680d", size = 145268 }, - { url = "https://files.pythonhosted.org/packages/74/94/8a5277664f27c3c438546f3eb53b33f5b19568eb7424736bdc440a88a31f/charset_normalizer-3.4.1-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ffc9202a29ab3920fa812879e95a9e78b2465fd10be7fcbd042899695d75e616", size = 147601 }, - { url = "https://files.pythonhosted.org/packages/7c/5f/6d352c51ee763623a98e31194823518e09bfa48be2a7e8383cf691bbb3d0/charset_normalizer-3.4.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:804a4d582ba6e5b747c625bf1255e6b1507465494a40a2130978bda7b932c90b", size = 141098 }, - { url = "https://files.pythonhosted.org/packages/78/d4/f5704cb629ba5ab16d1d3d741396aec6dc3ca2b67757c45b0599bb010478/charset_normalizer-3.4.1-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:0f55e69f030f7163dffe9fd0752b32f070566451afe180f99dbeeb81f511ad8d", size = 149520 }, - { url = "https://files.pythonhosted.org/packages/c5/96/64120b1d02b81785f222b976c0fb79a35875457fa9bb40827678e54d1bc8/charset_normalizer-3.4.1-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:c4c3e6da02df6fa1410a7680bd3f63d4f710232d3139089536310d027950696a", size = 152852 }, - { url = "https://files.pythonhosted.org/packages/84/c9/98e3732278a99f47d487fd3468bc60b882920cef29d1fa6ca460a1fdf4e6/charset_normalizer-3.4.1-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:5df196eb874dae23dcfb968c83d4f8fdccb333330fe1fc278ac5ceeb101003a9", size = 150488 }, - { url = "https://files.pythonhosted.org/packages/13/0e/9c8d4cb99c98c1007cc11eda969ebfe837bbbd0acdb4736d228ccaabcd22/charset_normalizer-3.4.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:e358e64305fe12299a08e08978f51fc21fac060dcfcddd95453eabe5b93ed0e1", size = 146192 }, - { url = "https://files.pythonhosted.org/packages/b2/21/2b6b5b860781a0b49427309cb8670785aa543fb2178de875b87b9cc97746/charset_normalizer-3.4.1-cp312-cp312-win32.whl", hash = "sha256:9b23ca7ef998bc739bf6ffc077c2116917eabcc901f88da1b9856b210ef63f35", size = 95550 }, - { url = "https://files.pythonhosted.org/packages/21/5b/1b390b03b1d16c7e382b561c5329f83cc06623916aab983e8ab9239c7d5c/charset_normalizer-3.4.1-cp312-cp312-win_amd64.whl", hash = "sha256:6ff8a4a60c227ad87030d76e99cd1698345d4491638dfa6673027c48b3cd395f", size = 102785 }, - { url = "https://files.pythonhosted.org/packages/38/94/ce8e6f63d18049672c76d07d119304e1e2d7c6098f0841b51c666e9f44a0/charset_normalizer-3.4.1-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:aabfa34badd18f1da5ec1bc2715cadc8dca465868a4e73a0173466b688f29dda", size = 195698 }, - { url = 
"https://files.pythonhosted.org/packages/24/2e/dfdd9770664aae179a96561cc6952ff08f9a8cd09a908f259a9dfa063568/charset_normalizer-3.4.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:22e14b5d70560b8dd51ec22863f370d1e595ac3d024cb8ad7d308b4cd95f8313", size = 140162 }, - { url = "https://files.pythonhosted.org/packages/24/4e/f646b9093cff8fc86f2d60af2de4dc17c759de9d554f130b140ea4738ca6/charset_normalizer-3.4.1-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:8436c508b408b82d87dc5f62496973a1805cd46727c34440b0d29d8a2f50a6c9", size = 150263 }, - { url = "https://files.pythonhosted.org/packages/5e/67/2937f8d548c3ef6e2f9aab0f6e21001056f692d43282b165e7c56023e6dd/charset_normalizer-3.4.1-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:2d074908e1aecee37a7635990b2c6d504cd4766c7bc9fc86d63f9c09af3fa11b", size = 142966 }, - { url = "https://files.pythonhosted.org/packages/52/ed/b7f4f07de100bdb95c1756d3a4d17b90c1a3c53715c1a476f8738058e0fa/charset_normalizer-3.4.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:955f8851919303c92343d2f66165294848d57e9bba6cf6e3625485a70a038d11", size = 144992 }, - { url = "https://files.pythonhosted.org/packages/96/2c/d49710a6dbcd3776265f4c923bb73ebe83933dfbaa841c5da850fe0fd20b/charset_normalizer-3.4.1-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:44ecbf16649486d4aebafeaa7ec4c9fed8b88101f4dd612dcaf65d5e815f837f", size = 147162 }, - { url = "https://files.pythonhosted.org/packages/b4/41/35ff1f9a6bd380303dea55e44c4933b4cc3c4850988927d4082ada230273/charset_normalizer-3.4.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:0924e81d3d5e70f8126529951dac65c1010cdf117bb75eb02dd12339b57749dd", size = 140972 }, - { url = "https://files.pythonhosted.org/packages/fb/43/c6a0b685fe6910d08ba971f62cd9c3e862a85770395ba5d9cad4fede33ab/charset_normalizer-3.4.1-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:2967f74ad52c3b98de4c3b32e1a44e32975e008a9cd2a8cc8966d6a5218c5cb2", size = 149095 }, - { url = "https://files.pythonhosted.org/packages/4c/ff/a9a504662452e2d2878512115638966e75633519ec11f25fca3d2049a94a/charset_normalizer-3.4.1-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:c75cb2a3e389853835e84a2d8fb2b81a10645b503eca9bcb98df6b5a43eb8886", size = 152668 }, - { url = "https://files.pythonhosted.org/packages/6c/71/189996b6d9a4b932564701628af5cee6716733e9165af1d5e1b285c530ed/charset_normalizer-3.4.1-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:09b26ae6b1abf0d27570633b2b078a2a20419c99d66fb2823173d73f188ce601", size = 150073 }, - { url = "https://files.pythonhosted.org/packages/e4/93/946a86ce20790e11312c87c75ba68d5f6ad2208cfb52b2d6a2c32840d922/charset_normalizer-3.4.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:fa88b843d6e211393a37219e6a1c1df99d35e8fd90446f1118f4216e307e48cd", size = 145732 }, - { url = "https://files.pythonhosted.org/packages/cd/e5/131d2fb1b0dddafc37be4f3a2fa79aa4c037368be9423061dccadfd90091/charset_normalizer-3.4.1-cp313-cp313-win32.whl", hash = "sha256:eb8178fe3dba6450a3e024e95ac49ed3400e506fd4e9e5c32d30adda88cbd407", size = 95391 }, - { url = "https://files.pythonhosted.org/packages/27/f2/4f9a69cc7712b9b5ad8fdb87039fd89abba997ad5cbe690d1835d40405b0/charset_normalizer-3.4.1-cp313-cp313-win_amd64.whl", hash = "sha256:b1ac5992a838106edb89654e0aebfc24f5848ae2547d22c2c3f66454daa11971", size = 102702 }, - { url = 
"https://files.pythonhosted.org/packages/0e/f6/65ecc6878a89bb1c23a086ea335ad4bf21a588990c3f535a227b9eea9108/charset_normalizer-3.4.1-py3-none-any.whl", hash = "sha256:d98b1668f06378c6dbefec3b92299716b931cd4e6061f3c875a71ced1780ab85", size = 49767 }, +sdist = { url = "https://files.pythonhosted.org/packages/16/b0/572805e227f01586461c80e0fd25d65a2115599cc9dad142fee4b747c357/charset_normalizer-3.4.1.tar.gz", hash = "sha256:44251f18cd68a75b56585dd00dae26183e102cd5e0f9f1466e6df5da2ed64ea3", size = 123188, upload-time = "2024-12-24T18:12:35.43Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/0d/58/5580c1716040bc89206c77d8f74418caf82ce519aae06450393ca73475d1/charset_normalizer-3.4.1-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:91b36a978b5ae0ee86c394f5a54d6ef44db1de0815eb43de826d41d21e4af3de", size = 198013, upload-time = "2024-12-24T18:09:43.671Z" }, + { url = "https://files.pythonhosted.org/packages/d0/11/00341177ae71c6f5159a08168bcb98c6e6d196d372c94511f9f6c9afe0c6/charset_normalizer-3.4.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:7461baadb4dc00fd9e0acbe254e3d7d2112e7f92ced2adc96e54ef6501c5f176", size = 141285, upload-time = "2024-12-24T18:09:48.113Z" }, + { url = "https://files.pythonhosted.org/packages/01/09/11d684ea5819e5a8f5100fb0b38cf8d02b514746607934134d31233e02c8/charset_normalizer-3.4.1-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:e218488cd232553829be0664c2292d3af2eeeb94b32bea483cf79ac6a694e037", size = 151449, upload-time = "2024-12-24T18:09:50.845Z" }, + { url = "https://files.pythonhosted.org/packages/08/06/9f5a12939db324d905dc1f70591ae7d7898d030d7662f0d426e2286f68c9/charset_normalizer-3.4.1-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:80ed5e856eb7f30115aaf94e4a08114ccc8813e6ed1b5efa74f9f82e8509858f", size = 143892, upload-time = "2024-12-24T18:09:52.078Z" }, + { url = "https://files.pythonhosted.org/packages/93/62/5e89cdfe04584cb7f4d36003ffa2936681b03ecc0754f8e969c2becb7e24/charset_normalizer-3.4.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b010a7a4fd316c3c484d482922d13044979e78d1861f0e0650423144c616a46a", size = 146123, upload-time = "2024-12-24T18:09:54.575Z" }, + { url = "https://files.pythonhosted.org/packages/a9/ac/ab729a15c516da2ab70a05f8722ecfccc3f04ed7a18e45c75bbbaa347d61/charset_normalizer-3.4.1-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:4532bff1b8421fd0a320463030c7520f56a79c9024a4e88f01c537316019005a", size = 147943, upload-time = "2024-12-24T18:09:57.324Z" }, + { url = "https://files.pythonhosted.org/packages/03/d2/3f392f23f042615689456e9a274640c1d2e5dd1d52de36ab8f7955f8f050/charset_normalizer-3.4.1-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:d973f03c0cb71c5ed99037b870f2be986c3c05e63622c017ea9816881d2dd247", size = 142063, upload-time = "2024-12-24T18:09:59.794Z" }, + { url = "https://files.pythonhosted.org/packages/f2/e3/e20aae5e1039a2cd9b08d9205f52142329f887f8cf70da3650326670bddf/charset_normalizer-3.4.1-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:3a3bd0dcd373514dcec91c411ddb9632c0d7d92aed7093b8c3bbb6d69ca74408", size = 150578, upload-time = "2024-12-24T18:10:02.357Z" }, + { url = "https://files.pythonhosted.org/packages/8d/af/779ad72a4da0aed925e1139d458adc486e61076d7ecdcc09e610ea8678db/charset_normalizer-3.4.1-cp310-cp310-musllinux_1_2_ppc64le.whl", hash = "sha256:d9c3cdf5390dcd29aa8056d13e8e99526cda0305acc038b96b30352aff5ff2bb", 
size = 153629, upload-time = "2024-12-24T18:10:03.678Z" }, + { url = "https://files.pythonhosted.org/packages/c2/b6/7aa450b278e7aa92cf7732140bfd8be21f5f29d5bf334ae987c945276639/charset_normalizer-3.4.1-cp310-cp310-musllinux_1_2_s390x.whl", hash = "sha256:2bdfe3ac2e1bbe5b59a1a63721eb3b95fc9b6817ae4a46debbb4e11f6232428d", size = 150778, upload-time = "2024-12-24T18:10:06.197Z" }, + { url = "https://files.pythonhosted.org/packages/39/f4/d9f4f712d0951dcbfd42920d3db81b00dd23b6ab520419626f4023334056/charset_normalizer-3.4.1-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:eab677309cdb30d047996b36d34caeda1dc91149e4fdca0b1a039b3f79d9a807", size = 146453, upload-time = "2024-12-24T18:10:08.848Z" }, + { url = "https://files.pythonhosted.org/packages/49/2b/999d0314e4ee0cff3cb83e6bc9aeddd397eeed693edb4facb901eb8fbb69/charset_normalizer-3.4.1-cp310-cp310-win32.whl", hash = "sha256:c0429126cf75e16c4f0ad00ee0eae4242dc652290f940152ca8c75c3a4b6ee8f", size = 95479, upload-time = "2024-12-24T18:10:10.044Z" }, + { url = "https://files.pythonhosted.org/packages/2d/ce/3cbed41cff67e455a386fb5e5dd8906cdda2ed92fbc6297921f2e4419309/charset_normalizer-3.4.1-cp310-cp310-win_amd64.whl", hash = "sha256:9f0b8b1c6d84c8034a44893aba5e767bf9c7a211e313a9605d9c617d7083829f", size = 102790, upload-time = "2024-12-24T18:10:11.323Z" }, + { url = "https://files.pythonhosted.org/packages/72/80/41ef5d5a7935d2d3a773e3eaebf0a9350542f2cab4eac59a7a4741fbbbbe/charset_normalizer-3.4.1-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:8bfa33f4f2672964266e940dd22a195989ba31669bd84629f05fab3ef4e2d125", size = 194995, upload-time = "2024-12-24T18:10:12.838Z" }, + { url = "https://files.pythonhosted.org/packages/7a/28/0b9fefa7b8b080ec492110af6d88aa3dea91c464b17d53474b6e9ba5d2c5/charset_normalizer-3.4.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:28bf57629c75e810b6ae989f03c0828d64d6b26a5e205535585f96093e405ed1", size = 139471, upload-time = "2024-12-24T18:10:14.101Z" }, + { url = "https://files.pythonhosted.org/packages/71/64/d24ab1a997efb06402e3fc07317e94da358e2585165930d9d59ad45fcae2/charset_normalizer-3.4.1-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:f08ff5e948271dc7e18a35641d2f11a4cd8dfd5634f55228b691e62b37125eb3", size = 149831, upload-time = "2024-12-24T18:10:15.512Z" }, + { url = "https://files.pythonhosted.org/packages/37/ed/be39e5258e198655240db5e19e0b11379163ad7070962d6b0c87ed2c4d39/charset_normalizer-3.4.1-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:234ac59ea147c59ee4da87a0c0f098e9c8d169f4dc2a159ef720f1a61bbe27cd", size = 142335, upload-time = "2024-12-24T18:10:18.369Z" }, + { url = "https://files.pythonhosted.org/packages/88/83/489e9504711fa05d8dde1574996408026bdbdbd938f23be67deebb5eca92/charset_normalizer-3.4.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:fd4ec41f914fa74ad1b8304bbc634b3de73d2a0889bd32076342a573e0779e00", size = 143862, upload-time = "2024-12-24T18:10:19.743Z" }, + { url = "https://files.pythonhosted.org/packages/c6/c7/32da20821cf387b759ad24627a9aca289d2822de929b8a41b6241767b461/charset_normalizer-3.4.1-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:eea6ee1db730b3483adf394ea72f808b6e18cf3cb6454b4d86e04fa8c4327a12", size = 145673, upload-time = "2024-12-24T18:10:21.139Z" }, + { url = 
"https://files.pythonhosted.org/packages/68/85/f4288e96039abdd5aeb5c546fa20a37b50da71b5cf01e75e87f16cd43304/charset_normalizer-3.4.1-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:c96836c97b1238e9c9e3fe90844c947d5afbf4f4c92762679acfe19927d81d77", size = 140211, upload-time = "2024-12-24T18:10:22.382Z" }, + { url = "https://files.pythonhosted.org/packages/28/a3/a42e70d03cbdabc18997baf4f0227c73591a08041c149e710045c281f97b/charset_normalizer-3.4.1-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:4d86f7aff21ee58f26dcf5ae81a9addbd914115cdebcbb2217e4f0ed8982e146", size = 148039, upload-time = "2024-12-24T18:10:24.802Z" }, + { url = "https://files.pythonhosted.org/packages/85/e4/65699e8ab3014ecbe6f5c71d1a55d810fb716bbfd74f6283d5c2aa87febf/charset_normalizer-3.4.1-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:09b5e6733cbd160dcc09589227187e242a30a49ca5cefa5a7edd3f9d19ed53fd", size = 151939, upload-time = "2024-12-24T18:10:26.124Z" }, + { url = "https://files.pythonhosted.org/packages/b1/82/8e9fe624cc5374193de6860aba3ea8070f584c8565ee77c168ec13274bd2/charset_normalizer-3.4.1-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:5777ee0881f9499ed0f71cc82cf873d9a0ca8af166dfa0af8ec4e675b7df48e6", size = 149075, upload-time = "2024-12-24T18:10:30.027Z" }, + { url = "https://files.pythonhosted.org/packages/3d/7b/82865ba54c765560c8433f65e8acb9217cb839a9e32b42af4aa8e945870f/charset_normalizer-3.4.1-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:237bdbe6159cff53b4f24f397d43c6336c6b0b42affbe857970cefbb620911c8", size = 144340, upload-time = "2024-12-24T18:10:32.679Z" }, + { url = "https://files.pythonhosted.org/packages/b5/b6/9674a4b7d4d99a0d2df9b215da766ee682718f88055751e1e5e753c82db0/charset_normalizer-3.4.1-cp311-cp311-win32.whl", hash = "sha256:8417cb1f36cc0bc7eaba8ccb0e04d55f0ee52df06df3ad55259b9a323555fc8b", size = 95205, upload-time = "2024-12-24T18:10:34.724Z" }, + { url = "https://files.pythonhosted.org/packages/1e/ab/45b180e175de4402dcf7547e4fb617283bae54ce35c27930a6f35b6bef15/charset_normalizer-3.4.1-cp311-cp311-win_amd64.whl", hash = "sha256:d7f50a1f8c450f3925cb367d011448c39239bb3eb4117c36a6d354794de4ce76", size = 102441, upload-time = "2024-12-24T18:10:37.574Z" }, + { url = "https://files.pythonhosted.org/packages/0a/9a/dd1e1cdceb841925b7798369a09279bd1cf183cef0f9ddf15a3a6502ee45/charset_normalizer-3.4.1-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:73d94b58ec7fecbc7366247d3b0b10a21681004153238750bb67bd9012414545", size = 196105, upload-time = "2024-12-24T18:10:38.83Z" }, + { url = "https://files.pythonhosted.org/packages/d3/8c/90bfabf8c4809ecb648f39794cf2a84ff2e7d2a6cf159fe68d9a26160467/charset_normalizer-3.4.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:dad3e487649f498dd991eeb901125411559b22e8d7ab25d3aeb1af367df5efd7", size = 140404, upload-time = "2024-12-24T18:10:44.272Z" }, + { url = "https://files.pythonhosted.org/packages/ad/8f/e410d57c721945ea3b4f1a04b74f70ce8fa800d393d72899f0a40526401f/charset_normalizer-3.4.1-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:c30197aa96e8eed02200a83fba2657b4c3acd0f0aa4bdc9f6c1af8e8962e0757", size = 150423, upload-time = "2024-12-24T18:10:45.492Z" }, + { url = "https://files.pythonhosted.org/packages/f0/b8/e6825e25deb691ff98cf5c9072ee0605dc2acfca98af70c2d1b1bc75190d/charset_normalizer-3.4.1-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:2369eea1ee4a7610a860d88f268eb39b95cb588acd7235e02fd5a5601773d4fa", size = 143184, upload-time = 
"2024-12-24T18:10:47.898Z" }, + { url = "https://files.pythonhosted.org/packages/3e/a2/513f6cbe752421f16d969e32f3583762bfd583848b763913ddab8d9bfd4f/charset_normalizer-3.4.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bc2722592d8998c870fa4e290c2eec2c1569b87fe58618e67d38b4665dfa680d", size = 145268, upload-time = "2024-12-24T18:10:50.589Z" }, + { url = "https://files.pythonhosted.org/packages/74/94/8a5277664f27c3c438546f3eb53b33f5b19568eb7424736bdc440a88a31f/charset_normalizer-3.4.1-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ffc9202a29ab3920fa812879e95a9e78b2465fd10be7fcbd042899695d75e616", size = 147601, upload-time = "2024-12-24T18:10:52.541Z" }, + { url = "https://files.pythonhosted.org/packages/7c/5f/6d352c51ee763623a98e31194823518e09bfa48be2a7e8383cf691bbb3d0/charset_normalizer-3.4.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:804a4d582ba6e5b747c625bf1255e6b1507465494a40a2130978bda7b932c90b", size = 141098, upload-time = "2024-12-24T18:10:53.789Z" }, + { url = "https://files.pythonhosted.org/packages/78/d4/f5704cb629ba5ab16d1d3d741396aec6dc3ca2b67757c45b0599bb010478/charset_normalizer-3.4.1-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:0f55e69f030f7163dffe9fd0752b32f070566451afe180f99dbeeb81f511ad8d", size = 149520, upload-time = "2024-12-24T18:10:55.048Z" }, + { url = "https://files.pythonhosted.org/packages/c5/96/64120b1d02b81785f222b976c0fb79a35875457fa9bb40827678e54d1bc8/charset_normalizer-3.4.1-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:c4c3e6da02df6fa1410a7680bd3f63d4f710232d3139089536310d027950696a", size = 152852, upload-time = "2024-12-24T18:10:57.647Z" }, + { url = "https://files.pythonhosted.org/packages/84/c9/98e3732278a99f47d487fd3468bc60b882920cef29d1fa6ca460a1fdf4e6/charset_normalizer-3.4.1-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:5df196eb874dae23dcfb968c83d4f8fdccb333330fe1fc278ac5ceeb101003a9", size = 150488, upload-time = "2024-12-24T18:10:59.43Z" }, + { url = "https://files.pythonhosted.org/packages/13/0e/9c8d4cb99c98c1007cc11eda969ebfe837bbbd0acdb4736d228ccaabcd22/charset_normalizer-3.4.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:e358e64305fe12299a08e08978f51fc21fac060dcfcddd95453eabe5b93ed0e1", size = 146192, upload-time = "2024-12-24T18:11:00.676Z" }, + { url = "https://files.pythonhosted.org/packages/b2/21/2b6b5b860781a0b49427309cb8670785aa543fb2178de875b87b9cc97746/charset_normalizer-3.4.1-cp312-cp312-win32.whl", hash = "sha256:9b23ca7ef998bc739bf6ffc077c2116917eabcc901f88da1b9856b210ef63f35", size = 95550, upload-time = "2024-12-24T18:11:01.952Z" }, + { url = "https://files.pythonhosted.org/packages/21/5b/1b390b03b1d16c7e382b561c5329f83cc06623916aab983e8ab9239c7d5c/charset_normalizer-3.4.1-cp312-cp312-win_amd64.whl", hash = "sha256:6ff8a4a60c227ad87030d76e99cd1698345d4491638dfa6673027c48b3cd395f", size = 102785, upload-time = "2024-12-24T18:11:03.142Z" }, + { url = "https://files.pythonhosted.org/packages/38/94/ce8e6f63d18049672c76d07d119304e1e2d7c6098f0841b51c666e9f44a0/charset_normalizer-3.4.1-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:aabfa34badd18f1da5ec1bc2715cadc8dca465868a4e73a0173466b688f29dda", size = 195698, upload-time = "2024-12-24T18:11:05.834Z" }, + { url = "https://files.pythonhosted.org/packages/24/2e/dfdd9770664aae179a96561cc6952ff08f9a8cd09a908f259a9dfa063568/charset_normalizer-3.4.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = 
"sha256:22e14b5d70560b8dd51ec22863f370d1e595ac3d024cb8ad7d308b4cd95f8313", size = 140162, upload-time = "2024-12-24T18:11:07.064Z" }, + { url = "https://files.pythonhosted.org/packages/24/4e/f646b9093cff8fc86f2d60af2de4dc17c759de9d554f130b140ea4738ca6/charset_normalizer-3.4.1-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:8436c508b408b82d87dc5f62496973a1805cd46727c34440b0d29d8a2f50a6c9", size = 150263, upload-time = "2024-12-24T18:11:08.374Z" }, + { url = "https://files.pythonhosted.org/packages/5e/67/2937f8d548c3ef6e2f9aab0f6e21001056f692d43282b165e7c56023e6dd/charset_normalizer-3.4.1-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:2d074908e1aecee37a7635990b2c6d504cd4766c7bc9fc86d63f9c09af3fa11b", size = 142966, upload-time = "2024-12-24T18:11:09.831Z" }, + { url = "https://files.pythonhosted.org/packages/52/ed/b7f4f07de100bdb95c1756d3a4d17b90c1a3c53715c1a476f8738058e0fa/charset_normalizer-3.4.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:955f8851919303c92343d2f66165294848d57e9bba6cf6e3625485a70a038d11", size = 144992, upload-time = "2024-12-24T18:11:12.03Z" }, + { url = "https://files.pythonhosted.org/packages/96/2c/d49710a6dbcd3776265f4c923bb73ebe83933dfbaa841c5da850fe0fd20b/charset_normalizer-3.4.1-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:44ecbf16649486d4aebafeaa7ec4c9fed8b88101f4dd612dcaf65d5e815f837f", size = 147162, upload-time = "2024-12-24T18:11:13.372Z" }, + { url = "https://files.pythonhosted.org/packages/b4/41/35ff1f9a6bd380303dea55e44c4933b4cc3c4850988927d4082ada230273/charset_normalizer-3.4.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:0924e81d3d5e70f8126529951dac65c1010cdf117bb75eb02dd12339b57749dd", size = 140972, upload-time = "2024-12-24T18:11:14.628Z" }, + { url = "https://files.pythonhosted.org/packages/fb/43/c6a0b685fe6910d08ba971f62cd9c3e862a85770395ba5d9cad4fede33ab/charset_normalizer-3.4.1-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:2967f74ad52c3b98de4c3b32e1a44e32975e008a9cd2a8cc8966d6a5218c5cb2", size = 149095, upload-time = "2024-12-24T18:11:17.672Z" }, + { url = "https://files.pythonhosted.org/packages/4c/ff/a9a504662452e2d2878512115638966e75633519ec11f25fca3d2049a94a/charset_normalizer-3.4.1-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:c75cb2a3e389853835e84a2d8fb2b81a10645b503eca9bcb98df6b5a43eb8886", size = 152668, upload-time = "2024-12-24T18:11:18.989Z" }, + { url = "https://files.pythonhosted.org/packages/6c/71/189996b6d9a4b932564701628af5cee6716733e9165af1d5e1b285c530ed/charset_normalizer-3.4.1-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:09b26ae6b1abf0d27570633b2b078a2a20419c99d66fb2823173d73f188ce601", size = 150073, upload-time = "2024-12-24T18:11:21.507Z" }, + { url = "https://files.pythonhosted.org/packages/e4/93/946a86ce20790e11312c87c75ba68d5f6ad2208cfb52b2d6a2c32840d922/charset_normalizer-3.4.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:fa88b843d6e211393a37219e6a1c1df99d35e8fd90446f1118f4216e307e48cd", size = 145732, upload-time = "2024-12-24T18:11:22.774Z" }, + { url = "https://files.pythonhosted.org/packages/cd/e5/131d2fb1b0dddafc37be4f3a2fa79aa4c037368be9423061dccadfd90091/charset_normalizer-3.4.1-cp313-cp313-win32.whl", hash = "sha256:eb8178fe3dba6450a3e024e95ac49ed3400e506fd4e9e5c32d30adda88cbd407", size = 95391, upload-time = "2024-12-24T18:11:24.139Z" }, + { url = 
"https://files.pythonhosted.org/packages/27/f2/4f9a69cc7712b9b5ad8fdb87039fd89abba997ad5cbe690d1835d40405b0/charset_normalizer-3.4.1-cp313-cp313-win_amd64.whl", hash = "sha256:b1ac5992a838106edb89654e0aebfc24f5848ae2547d22c2c3f66454daa11971", size = 102702, upload-time = "2024-12-24T18:11:26.535Z" }, + { url = "https://files.pythonhosted.org/packages/0e/f6/65ecc6878a89bb1c23a086ea335ad4bf21a588990c3f535a227b9eea9108/charset_normalizer-3.4.1-py3-none-any.whl", hash = "sha256:d98b1668f06378c6dbefec3b92299716b931cd4e6061f3c875a71ced1780ab85", size = 49767, upload-time = "2024-12-24T18:12:32.852Z" }, ] [[package]] @@ -251,18 +266,18 @@ source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "colorama", marker = "sys_platform == 'win32'" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/45/2b/7ebad1e59a99207d417c0784f7fb67893465eef84b5b47c788324f1b4095/click-8.1.0.tar.gz", hash = "sha256:977c213473c7665d3aa092b41ff12063227751c41d7b17165013e10069cc5cd2", size = 329986 } +sdist = { url = "https://files.pythonhosted.org/packages/45/2b/7ebad1e59a99207d417c0784f7fb67893465eef84b5b47c788324f1b4095/click-8.1.0.tar.gz", hash = "sha256:977c213473c7665d3aa092b41ff12063227751c41d7b17165013e10069cc5cd2", size = 329986, upload-time = "2022-03-28T17:37:50.944Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/86/3e/3a523bdd24510288b1b850428e01172116a29268378b1da9a8d0b894a115/click-8.1.0-py3-none-any.whl", hash = "sha256:19a4baa64da924c5e0cd889aba8e947f280309f1a2ce0947a3e3a7bcb7cc72d6", size = 96400 }, + { url = "https://files.pythonhosted.org/packages/86/3e/3a523bdd24510288b1b850428e01172116a29268378b1da9a8d0b894a115/click-8.1.0-py3-none-any.whl", hash = "sha256:19a4baa64da924c5e0cd889aba8e947f280309f1a2ce0947a3e3a7bcb7cc72d6", size = 96400, upload-time = "2022-03-28T17:37:48.879Z" }, ] [[package]] name = "colorama" version = "0.4.6" source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/d8/53/6f443c9a4a8358a93a6792e2acffb9d9d5cb0a5cfd8802644b7b1c9a02e4/colorama-0.4.6.tar.gz", hash = "sha256:08695f5cb7ed6e0531a20572697297273c47b8cae5a63ffc6d6ed5c201be6e44", size = 27697 } +sdist = { url = "https://files.pythonhosted.org/packages/d8/53/6f443c9a4a8358a93a6792e2acffb9d9d5cb0a5cfd8802644b7b1c9a02e4/colorama-0.4.6.tar.gz", hash = "sha256:08695f5cb7ed6e0531a20572697297273c47b8cae5a63ffc6d6ed5c201be6e44", size = 27697, upload-time = "2022-10-25T02:36:22.414Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/d1/d6/3965ed04c63042e047cb6a3e6ed1a63a35087b6a609aa3a15ed8ac56c221/colorama-0.4.6-py2.py3-none-any.whl", hash = "sha256:4f1d9991f5acc0ca119f9d443620b77f9d6b33703e51011c16baf57afb285fc6", size = 25335 }, + { url = "https://files.pythonhosted.org/packages/d1/d6/3965ed04c63042e047cb6a3e6ed1a63a35087b6a609aa3a15ed8ac56c221/colorama-0.4.6-py2.py3-none-any.whl", hash = "sha256:4f1d9991f5acc0ca119f9d443620b77f9d6b33703e51011c16baf57afb285fc6", size = 25335, upload-time = "2022-10-25T02:36:20.889Z" }, ] [[package]] @@ -273,36 +288,45 @@ dependencies = [ { name = "tinycss2" }, { name = "webencodings" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/9f/86/fd7f58fc498b3166f3a7e8e0cddb6e620fe1da35b02248b1bd59e95dbaaa/cssselect2-0.8.0.tar.gz", hash = "sha256:7674ffb954a3b46162392aee2a3a0aedb2e14ecf99fcc28644900f4e6e3e9d3a", size = 35716 } +sdist = { url = "https://files.pythonhosted.org/packages/9f/86/fd7f58fc498b3166f3a7e8e0cddb6e620fe1da35b02248b1bd59e95dbaaa/cssselect2-0.8.0.tar.gz", hash = 
"sha256:7674ffb954a3b46162392aee2a3a0aedb2e14ecf99fcc28644900f4e6e3e9d3a", size = 35716, upload-time = "2025-03-05T14:46:07.988Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/0f/e7/aa315e6a749d9b96c2504a1ba0ba031ba2d0517e972ce22682e3fccecb09/cssselect2-0.8.0-py3-none-any.whl", hash = "sha256:46fc70ebc41ced7a32cd42d58b1884d72ade23d21e5a4eaaf022401c13f0e76e", size = 15454 }, + { url = "https://files.pythonhosted.org/packages/0f/e7/aa315e6a749d9b96c2504a1ba0ba031ba2d0517e972ce22682e3fccecb09/cssselect2-0.8.0-py3-none-any.whl", hash = "sha256:46fc70ebc41ced7a32cd42d58b1884d72ade23d21e5a4eaaf022401c13f0e76e", size = 15454, upload-time = "2025-03-05T14:46:06.463Z" }, ] [[package]] name = "defusedxml" version = "0.7.1" source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/0f/d5/c66da9b79e5bdb124974bfe172b4daf3c984ebd9c2a06e2b8a4dc7331c72/defusedxml-0.7.1.tar.gz", hash = "sha256:1bb3032db185915b62d7c6209c5a8792be6a32ab2fedacc84e01b52c51aa3e69", size = 75520 } +sdist = { url = "https://files.pythonhosted.org/packages/0f/d5/c66da9b79e5bdb124974bfe172b4daf3c984ebd9c2a06e2b8a4dc7331c72/defusedxml-0.7.1.tar.gz", hash = "sha256:1bb3032db185915b62d7c6209c5a8792be6a32ab2fedacc84e01b52c51aa3e69", size = 75520, upload-time = "2021-03-08T10:59:26.269Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/07/6c/aa3f2f849e01cb6a001cd8554a88d4c77c5c1a31c95bdf1cf9301e6d9ef4/defusedxml-0.7.1-py2.py3-none-any.whl", hash = "sha256:a352e7e428770286cc899e2542b6cdaedb2b4953ff269a210103ec58f6198a61", size = 25604 }, + { url = "https://files.pythonhosted.org/packages/07/6c/aa3f2f849e01cb6a001cd8554a88d4c77c5c1a31c95bdf1cf9301e6d9ef4/defusedxml-0.7.1-py2.py3-none-any.whl", hash = "sha256:a352e7e428770286cc899e2542b6cdaedb2b4953ff269a210103ec58f6198a61", size = 25604, upload-time = "2021-03-08T10:59:24.45Z" }, ] [[package]] name = "exceptiongroup" version = "1.2.2" source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/09/35/2495c4ac46b980e4ca1f6ad6db102322ef3ad2410b79fdde159a4b0f3b92/exceptiongroup-1.2.2.tar.gz", hash = "sha256:47c2edf7c6738fafb49fd34290706d1a1a2f4d1c6df275526b62cbb4aa5393cc", size = 28883 } +sdist = { url = "https://files.pythonhosted.org/packages/09/35/2495c4ac46b980e4ca1f6ad6db102322ef3ad2410b79fdde159a4b0f3b92/exceptiongroup-1.2.2.tar.gz", hash = "sha256:47c2edf7c6738fafb49fd34290706d1a1a2f4d1c6df275526b62cbb4aa5393cc", size = 28883, upload-time = "2024-07-12T22:26:00.161Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/02/cc/b7e31358aac6ed1ef2bb790a9746ac2c69bcb3c8588b41616914eb106eaf/exceptiongroup-1.2.2-py3-none-any.whl", hash = "sha256:3111b9d131c238bec2f8f516e123e14ba243563fb135d3fe885990585aa7795b", size = 16453 }, + { url = "https://files.pythonhosted.org/packages/02/cc/b7e31358aac6ed1ef2bb790a9746ac2c69bcb3c8588b41616914eb106eaf/exceptiongroup-1.2.2-py3-none-any.whl", hash = "sha256:3111b9d131c238bec2f8f516e123e14ba243563fb135d3fe885990585aa7795b", size = 16453, upload-time = "2024-07-12T22:25:58.476Z" }, ] [[package]] name = "execnet" version = "2.1.1" source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/bb/ff/b4c0dc78fbe20c3e59c0c7334de0c27eb4001a2b2017999af398bf730817/execnet-2.1.1.tar.gz", hash = "sha256:5189b52c6121c24feae288166ab41b32549c7e2348652736540b9e6e7d4e72e3", size = 166524 } +sdist = { url = 
"https://files.pythonhosted.org/packages/bb/ff/b4c0dc78fbe20c3e59c0c7334de0c27eb4001a2b2017999af398bf730817/execnet-2.1.1.tar.gz", hash = "sha256:5189b52c6121c24feae288166ab41b32549c7e2348652736540b9e6e7d4e72e3", size = 166524, upload-time = "2024-04-08T09:04:19.245Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/43/09/2aea36ff60d16dd8879bdb2f5b3ee0ba8d08cbbdcdfe870e695ce3784385/execnet-2.1.1-py3-none-any.whl", hash = "sha256:26dee51f1b80cebd6d0ca8e74dd8745419761d3bef34163928cbebbdc4749fdc", size = 40612, upload-time = "2024-04-08T09:04:17.414Z" }, +] + +[[package]] +name = "executing" +version = "2.2.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/91/50/a9d80c47ff289c611ff12e63f7c5d13942c65d68125160cefd768c73e6e4/executing-2.2.0.tar.gz", hash = "sha256:5d108c028108fe2551d1a7b2e8b713341e2cb4fc0aa7dcf966fa4327a5226755", size = 978693, upload-time = "2025-01-22T15:41:29.403Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/43/09/2aea36ff60d16dd8879bdb2f5b3ee0ba8d08cbbdcdfe870e695ce3784385/execnet-2.1.1-py3-none-any.whl", hash = "sha256:26dee51f1b80cebd6d0ca8e74dd8745419761d3bef34163928cbebbdc4749fdc", size = 40612 }, + { url = "https://files.pythonhosted.org/packages/7b/8f/c4d9bafc34ad7ad5d8dc16dd1347ee0e507a52c3adb6bfa8887e1c6a26ba/executing-2.2.0-py2.py3-none-any.whl", hash = "sha256:11387150cad388d62750327a53d3339fad4888b39a6fe233c3afbb54ecffd3aa", size = 26702, upload-time = "2025-01-22T15:41:25.929Z" }, ] [[package]] @@ -312,9 +336,9 @@ source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "python-dateutil" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/d9/29/d40217cbe2f6b1359e00c6c307bb3fc876ba74068cbab3dde77f03ca0dc4/ghp-import-2.1.0.tar.gz", hash = "sha256:9c535c4c61193c2df8871222567d7fd7e5014d835f97dc7b7439069e2413d343", size = 10943 } +sdist = { url = "https://files.pythonhosted.org/packages/d9/29/d40217cbe2f6b1359e00c6c307bb3fc876ba74068cbab3dde77f03ca0dc4/ghp-import-2.1.0.tar.gz", hash = "sha256:9c535c4c61193c2df8871222567d7fd7e5014d835f97dc7b7439069e2413d343", size = 10943, upload-time = "2022-05-02T15:47:16.11Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/f7/ec/67fbef5d497f86283db54c22eec6f6140243aae73265799baaaa19cd17fb/ghp_import-2.1.0-py3-none-any.whl", hash = "sha256:8337dd7b50877f163d4c0289bc1f1c7f127550241988d568c1db512c4324a619", size = 11034 }, + { url = "https://files.pythonhosted.org/packages/f7/ec/67fbef5d497f86283db54c22eec6f6140243aae73265799baaaa19cd17fb/ghp_import-2.1.0-py3-none-any.whl", hash = "sha256:8337dd7b50877f163d4c0289bc1f1c7f127550241988d568c1db512c4324a619", size = 11034, upload-time = "2022-05-02T15:47:14.552Z" }, ] [[package]] @@ -324,18 +348,18 @@ source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "colorama" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/2f/f2/b00eb72b853ecb5bf31dd47857cdf6767e380ca24ec2910d43b3fa7cc500/griffe-1.6.2.tar.gz", hash = "sha256:3a46fa7bd83280909b63c12b9a975732a927dd97809efe5b7972290b606c5d91", size = 392836 } +sdist = { url = "https://files.pythonhosted.org/packages/2f/f2/b00eb72b853ecb5bf31dd47857cdf6767e380ca24ec2910d43b3fa7cc500/griffe-1.6.2.tar.gz", hash = "sha256:3a46fa7bd83280909b63c12b9a975732a927dd97809efe5b7972290b606c5d91", size = 392836, upload-time = "2025-03-20T12:40:25.569Z" } wheels = [ - { url = 
"https://files.pythonhosted.org/packages/4e/bc/bd8b7de5e748e078b6be648e76b47189a9182b1ac1eb7791ff7969f39f27/griffe-1.6.2-py3-none-any.whl", hash = "sha256:6399f7e663150e4278a312a8e8a14d2f3d7bd86e2ef2f8056a1058e38579c2ee", size = 128638 }, + { url = "https://files.pythonhosted.org/packages/4e/bc/bd8b7de5e748e078b6be648e76b47189a9182b1ac1eb7791ff7969f39f27/griffe-1.6.2-py3-none-any.whl", hash = "sha256:6399f7e663150e4278a312a8e8a14d2f3d7bd86e2ef2f8056a1058e38579c2ee", size = 128638, upload-time = "2025-03-20T12:40:23.548Z" }, ] [[package]] name = "h11" version = "0.14.0" source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/f5/38/3af3d3633a34a3316095b39c8e8fb4853a28a536e55d347bd8d8e9a14b03/h11-0.14.0.tar.gz", hash = "sha256:8f19fbbe99e72420ff35c00b27a34cb9937e902a8b810e2c88300c6f0a3b699d", size = 100418 } +sdist = { url = "https://files.pythonhosted.org/packages/f5/38/3af3d3633a34a3316095b39c8e8fb4853a28a536e55d347bd8d8e9a14b03/h11-0.14.0.tar.gz", hash = "sha256:8f19fbbe99e72420ff35c00b27a34cb9937e902a8b810e2c88300c6f0a3b699d", size = 100418, upload-time = "2022-09-25T15:40:01.519Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/95/04/ff642e65ad6b90db43e668d70ffb6736436c7ce41fcc549f4e9472234127/h11-0.14.0-py3-none-any.whl", hash = "sha256:e3fe4ac4b851c468cc8363d500db52c2ead036020723024a109d37346efaa761", size = 58259 }, + { url = "https://files.pythonhosted.org/packages/95/04/ff642e65ad6b90db43e668d70ffb6736436c7ce41fcc549f4e9472234127/h11-0.14.0-py3-none-any.whl", hash = "sha256:e3fe4ac4b851c468cc8363d500db52c2ead036020723024a109d37346efaa761", size = 58259, upload-time = "2022-09-25T15:39:59.68Z" }, ] [[package]] @@ -346,9 +370,9 @@ dependencies = [ { name = "certifi" }, { name = "h11" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/6a/41/d7d0a89eb493922c37d343b607bc1b5da7f5be7e383740b4753ad8943e90/httpcore-1.0.7.tar.gz", hash = "sha256:8551cb62a169ec7162ac7be8d4817d561f60e08eaa485234898414bb5a8a0b4c", size = 85196 } +sdist = { url = "https://files.pythonhosted.org/packages/6a/41/d7d0a89eb493922c37d343b607bc1b5da7f5be7e383740b4753ad8943e90/httpcore-1.0.7.tar.gz", hash = "sha256:8551cb62a169ec7162ac7be8d4817d561f60e08eaa485234898414bb5a8a0b4c", size = 85196, upload-time = "2024-11-15T12:30:47.531Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/87/f5/72347bc88306acb359581ac4d52f23c0ef445b57157adedb9aee0cd689d2/httpcore-1.0.7-py3-none-any.whl", hash = "sha256:a3fff8f43dc260d5bd363d9f9cf1830fa3a458b332856f34282de498ed420edd", size = 78551 }, + { url = "https://files.pythonhosted.org/packages/87/f5/72347bc88306acb359581ac4d52f23c0ef445b57157adedb9aee0cd689d2/httpcore-1.0.7-py3-none-any.whl", hash = "sha256:a3fff8f43dc260d5bd363d9f9cf1830fa3a458b332856f34282de498ed420edd", size = 78551, upload-time = "2024-11-15T12:30:45.782Z" }, ] [[package]] @@ -362,36 +386,52 @@ dependencies = [ { name = "idna" }, { name = "sniffio" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/5c/2d/3da5bdf4408b8b2800061c339f240c1802f2e82d55e50bd39c5a881f47f0/httpx-0.27.0.tar.gz", hash = "sha256:a0cb88a46f32dc874e04ee956e4c2764aba2aa228f650b06788ba6bda2962ab5", size = 126413 } +sdist = { url = "https://files.pythonhosted.org/packages/5c/2d/3da5bdf4408b8b2800061c339f240c1802f2e82d55e50bd39c5a881f47f0/httpx-0.27.0.tar.gz", hash = "sha256:a0cb88a46f32dc874e04ee956e4c2764aba2aa228f650b06788ba6bda2962ab5", size = 126413, upload-time = "2024-02-21T13:07:52.434Z" } wheels = [ - { url = 
"https://files.pythonhosted.org/packages/41/7b/ddacf6dcebb42466abd03f368782142baa82e08fc0c1f8eaa05b4bae87d5/httpx-0.27.0-py3-none-any.whl", hash = "sha256:71d5465162c13681bff01ad59b2cc68dd838ea1f10e51574bac27103f00c91a5", size = 75590 }, + { url = "https://files.pythonhosted.org/packages/41/7b/ddacf6dcebb42466abd03f368782142baa82e08fc0c1f8eaa05b4bae87d5/httpx-0.27.0-py3-none-any.whl", hash = "sha256:71d5465162c13681bff01ad59b2cc68dd838ea1f10e51574bac27103f00c91a5", size = 75590, upload-time = "2024-02-21T13:07:50.455Z" }, ] [[package]] name = "httpx-sse" version = "0.4.0" source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/4c/60/8f4281fa9bbf3c8034fd54c0e7412e66edbab6bc74c4996bd616f8d0406e/httpx-sse-0.4.0.tar.gz", hash = "sha256:1e81a3a3070ce322add1d3529ed42eb5f70817f45ed6ec915ab753f961139721", size = 12624 } +sdist = { url = "https://files.pythonhosted.org/packages/4c/60/8f4281fa9bbf3c8034fd54c0e7412e66edbab6bc74c4996bd616f8d0406e/httpx-sse-0.4.0.tar.gz", hash = "sha256:1e81a3a3070ce322add1d3529ed42eb5f70817f45ed6ec915ab753f961139721", size = 12624, upload-time = "2023-12-22T08:01:21.083Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/e1/9b/a181f281f65d776426002f330c31849b86b31fc9d848db62e16f03ff739f/httpx_sse-0.4.0-py3-none-any.whl", hash = "sha256:f329af6eae57eaa2bdfd962b42524764af68075ea87370a2de920af5341e318f", size = 7819 }, + { url = "https://files.pythonhosted.org/packages/e1/9b/a181f281f65d776426002f330c31849b86b31fc9d848db62e16f03ff739f/httpx_sse-0.4.0-py3-none-any.whl", hash = "sha256:f329af6eae57eaa2bdfd962b42524764af68075ea87370a2de920af5341e318f", size = 7819, upload-time = "2023-12-22T08:01:19.89Z" }, ] [[package]] name = "idna" version = "3.10" source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/f1/70/7703c29685631f5a7590aa73f1f1d3fa9a380e654b86af429e0934a32f7d/idna-3.10.tar.gz", hash = "sha256:12f65c9b470abda6dc35cf8e63cc574b1c52b11df2c86030af0ac09b01b13ea9", size = 190490 } +sdist = { url = "https://files.pythonhosted.org/packages/f1/70/7703c29685631f5a7590aa73f1f1d3fa9a380e654b86af429e0934a32f7d/idna-3.10.tar.gz", hash = "sha256:12f65c9b470abda6dc35cf8e63cc574b1c52b11df2c86030af0ac09b01b13ea9", size = 190490, upload-time = "2024-09-15T18:07:39.745Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/76/c6/c88e154df9c4e1a2a66ccf0005a88dfb2650c1dffb6f5ce603dfbd452ce3/idna-3.10-py3-none-any.whl", hash = "sha256:946d195a0d259cbba61165e88e65941f16e9b36ea6ddb97f00452bae8b1287d3", size = 70442 }, + { url = "https://files.pythonhosted.org/packages/76/c6/c88e154df9c4e1a2a66ccf0005a88dfb2650c1dffb6f5ce603dfbd452ce3/idna-3.10-py3-none-any.whl", hash = "sha256:946d195a0d259cbba61165e88e65941f16e9b36ea6ddb97f00452bae8b1287d3", size = 70442, upload-time = "2024-09-15T18:07:37.964Z" }, ] [[package]] name = "iniconfig" version = "2.0.0" source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/d7/4b/cbd8e699e64a6f16ca3a8220661b5f83792b3017d0f79807cb8708d33913/iniconfig-2.0.0.tar.gz", hash = "sha256:2d91e135bf72d31a410b17c16da610a82cb55f6b0477d1a902134b24a455b8b3", size = 4646 } +sdist = { url = "https://files.pythonhosted.org/packages/d7/4b/cbd8e699e64a6f16ca3a8220661b5f83792b3017d0f79807cb8708d33913/iniconfig-2.0.0.tar.gz", hash = "sha256:2d91e135bf72d31a410b17c16da610a82cb55f6b0477d1a902134b24a455b8b3", size = 4646, upload-time = "2023-01-07T11:08:11.254Z" } +wheels = [ + { url = 
"https://files.pythonhosted.org/packages/ef/a6/62565a6e1cf69e10f5727360368e451d4b7f58beeac6173dc9db836a5b46/iniconfig-2.0.0-py3-none-any.whl", hash = "sha256:b6a85871a79d2e3b22d2d1b94ac2824226a63c6b741c88f7ae975f18b6778374", size = 5892, upload-time = "2023-01-07T11:08:09.864Z" }, +] + +[[package]] +name = "inline-snapshot" +version = "0.23.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "asttokens" }, + { name = "executing" }, + { name = "pytest" }, + { name = "rich" }, + { name = "tomli", marker = "python_full_version < '3.11'" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/8d/a3/9e78da370d20b896861cd5db3fa6fde89348b700ec144d8c1457c18ea113/inline_snapshot-0.23.0.tar.gz", hash = "sha256:872d027b1eae4e3e3b4028e0d46128bafbf62889a2424a2667dbe4b69cb1ffdf", size = 259375, upload-time = "2025-04-25T18:14:36.48Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/ef/a6/62565a6e1cf69e10f5727360368e451d4b7f58beeac6173dc9db836a5b46/iniconfig-2.0.0-py3-none-any.whl", hash = "sha256:b6a85871a79d2e3b22d2d1b94ac2824226a63c6b741c88f7ae975f18b6778374", size = 5892 }, + { url = "https://files.pythonhosted.org/packages/71/ab/4ad6bb9808f242e659ca8437ee475efaa201f18ff20a0dd5553280c85ae5/inline_snapshot-0.23.0-py3-none-any.whl", hash = "sha256:b1a5feab675aee8d03a51f1b6291f412100ce750d846c2d58eab16c90ee2c4dd", size = 50119, upload-time = "2025-04-25T18:14:34.46Z" }, ] [[package]] @@ -401,18 +441,18 @@ source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "markupsafe" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/df/bf/f7da0350254c0ed7c72f3e33cef02e048281fec7ecec5f032d4aac52226b/jinja2-3.1.6.tar.gz", hash = "sha256:0137fb05990d35f1275a587e9aee6d56da821fc83491a0fb838183be43f66d6d", size = 245115 } +sdist = { url = "https://files.pythonhosted.org/packages/df/bf/f7da0350254c0ed7c72f3e33cef02e048281fec7ecec5f032d4aac52226b/jinja2-3.1.6.tar.gz", hash = "sha256:0137fb05990d35f1275a587e9aee6d56da821fc83491a0fb838183be43f66d6d", size = 245115, upload-time = "2025-03-05T20:05:02.478Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/62/a1/3d680cbfd5f4b8f15abc1d571870c5fc3e594bb582bc3b64ea099db13e56/jinja2-3.1.6-py3-none-any.whl", hash = "sha256:85ece4451f492d0c13c5dd7c13a64681a86afae63a5f347908daf103ce6d2f67", size = 134899 }, + { url = "https://files.pythonhosted.org/packages/62/a1/3d680cbfd5f4b8f15abc1d571870c5fc3e594bb582bc3b64ea099db13e56/jinja2-3.1.6-py3-none-any.whl", hash = "sha256:85ece4451f492d0c13c5dd7c13a64681a86afae63a5f347908daf103ce6d2f67", size = 134899, upload-time = "2025-03-05T20:05:00.369Z" }, ] [[package]] name = "markdown" version = "3.7" source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/54/28/3af612670f82f4c056911fbbbb42760255801b3068c48de792d354ff4472/markdown-3.7.tar.gz", hash = "sha256:2ae2471477cfd02dbbf038d5d9bc226d40def84b4fe2986e49b59b6b472bbed2", size = 357086 } +sdist = { url = "https://files.pythonhosted.org/packages/54/28/3af612670f82f4c056911fbbbb42760255801b3068c48de792d354ff4472/markdown-3.7.tar.gz", hash = "sha256:2ae2471477cfd02dbbf038d5d9bc226d40def84b4fe2986e49b59b6b472bbed2", size = 357086, upload-time = "2024-08-16T15:55:17.812Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/3f/08/83871f3c50fc983b88547c196d11cf8c3340e37c32d2e9d6152abe2c61f7/Markdown-3.7-py3-none-any.whl", hash = "sha256:7eb6df5690b81a1d7942992c97fad2938e956e79df20cbc6186e9c3a77b1c803", size = 106349 }, + { url = 
"https://files.pythonhosted.org/packages/3f/08/83871f3c50fc983b88547c196d11cf8c3340e37c32d2e9d6152abe2c61f7/Markdown-3.7-py3-none-any.whl", hash = "sha256:7eb6df5690b81a1d7942992c97fad2938e956e79df20cbc6186e9c3a77b1c803", size = 106349, upload-time = "2024-08-16T15:55:16.176Z" }, ] [[package]] @@ -422,67 +462,67 @@ source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "mdurl" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/38/71/3b932df36c1a044d397a1f92d1cf91ee0a503d91e470cbd670aa66b07ed0/markdown-it-py-3.0.0.tar.gz", hash = "sha256:e3f60a94fa066dc52ec76661e37c851cb232d92f9886b15cb560aaada2df8feb", size = 74596 } +sdist = { url = "https://files.pythonhosted.org/packages/38/71/3b932df36c1a044d397a1f92d1cf91ee0a503d91e470cbd670aa66b07ed0/markdown-it-py-3.0.0.tar.gz", hash = "sha256:e3f60a94fa066dc52ec76661e37c851cb232d92f9886b15cb560aaada2df8feb", size = 74596, upload-time = "2023-06-03T06:41:14.443Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/42/d7/1ec15b46af6af88f19b8e5ffea08fa375d433c998b8a7639e76935c14f1f/markdown_it_py-3.0.0-py3-none-any.whl", hash = "sha256:355216845c60bd96232cd8d8c40e8f9765cc86f46880e43a8fd22dc1a1a8cab1", size = 87528 }, + { url = "https://files.pythonhosted.org/packages/42/d7/1ec15b46af6af88f19b8e5ffea08fa375d433c998b8a7639e76935c14f1f/markdown_it_py-3.0.0-py3-none-any.whl", hash = "sha256:355216845c60bd96232cd8d8c40e8f9765cc86f46880e43a8fd22dc1a1a8cab1", size = 87528, upload-time = "2023-06-03T06:41:11.019Z" }, ] [[package]] name = "markupsafe" version = "3.0.2" source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/b2/97/5d42485e71dfc078108a86d6de8fa46db44a1a9295e89c5d6d4a06e23a62/markupsafe-3.0.2.tar.gz", hash = "sha256:ee55d3edf80167e48ea11a923c7386f4669df67d7994554387f84e7d8b0a2bf0", size = 20537 } -wheels = [ - { url = "https://files.pythonhosted.org/packages/04/90/d08277ce111dd22f77149fd1a5d4653eeb3b3eaacbdfcbae5afb2600eebd/MarkupSafe-3.0.2-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:7e94c425039cde14257288fd61dcfb01963e658efbc0ff54f5306b06054700f8", size = 14357 }, - { url = "https://files.pythonhosted.org/packages/04/e1/6e2194baeae0bca1fae6629dc0cbbb968d4d941469cbab11a3872edff374/MarkupSafe-3.0.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:9e2d922824181480953426608b81967de705c3cef4d1af983af849d7bd619158", size = 12393 }, - { url = "https://files.pythonhosted.org/packages/1d/69/35fa85a8ece0a437493dc61ce0bb6d459dcba482c34197e3efc829aa357f/MarkupSafe-3.0.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:38a9ef736c01fccdd6600705b09dc574584b89bea478200c5fbf112a6b0d5579", size = 21732 }, - { url = "https://files.pythonhosted.org/packages/22/35/137da042dfb4720b638d2937c38a9c2df83fe32d20e8c8f3185dbfef05f7/MarkupSafe-3.0.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bbcb445fa71794da8f178f0f6d66789a28d7319071af7a496d4d507ed566270d", size = 20866 }, - { url = "https://files.pythonhosted.org/packages/29/28/6d029a903727a1b62edb51863232152fd335d602def598dade38996887f0/MarkupSafe-3.0.2-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:57cb5a3cf367aeb1d316576250f65edec5bb3be939e9247ae594b4bcbc317dfb", size = 20964 }, - { url = "https://files.pythonhosted.org/packages/cc/cd/07438f95f83e8bc028279909d9c9bd39e24149b0d60053a97b2bc4f8aa51/MarkupSafe-3.0.2-cp310-cp310-musllinux_1_2_aarch64.whl", hash = 
"sha256:3809ede931876f5b2ec92eef964286840ed3540dadf803dd570c3b7e13141a3b", size = 21977 }, - { url = "https://files.pythonhosted.org/packages/29/01/84b57395b4cc062f9c4c55ce0df7d3108ca32397299d9df00fedd9117d3d/MarkupSafe-3.0.2-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:e07c3764494e3776c602c1e78e298937c3315ccc9043ead7e685b7f2b8d47b3c", size = 21366 }, - { url = "https://files.pythonhosted.org/packages/bd/6e/61ebf08d8940553afff20d1fb1ba7294b6f8d279df9fd0c0db911b4bbcfd/MarkupSafe-3.0.2-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:b424c77b206d63d500bcb69fa55ed8d0e6a3774056bdc4839fc9298a7edca171", size = 21091 }, - { url = "https://files.pythonhosted.org/packages/11/23/ffbf53694e8c94ebd1e7e491de185124277964344733c45481f32ede2499/MarkupSafe-3.0.2-cp310-cp310-win32.whl", hash = "sha256:fcabf5ff6eea076f859677f5f0b6b5c1a51e70a376b0579e0eadef8db48c6b50", size = 15065 }, - { url = "https://files.pythonhosted.org/packages/44/06/e7175d06dd6e9172d4a69a72592cb3f7a996a9c396eee29082826449bbc3/MarkupSafe-3.0.2-cp310-cp310-win_amd64.whl", hash = "sha256:6af100e168aa82a50e186c82875a5893c5597a0c1ccdb0d8b40240b1f28b969a", size = 15514 }, - { url = "https://files.pythonhosted.org/packages/6b/28/bbf83e3f76936960b850435576dd5e67034e200469571be53f69174a2dfd/MarkupSafe-3.0.2-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:9025b4018f3a1314059769c7bf15441064b2207cb3f065e6ea1e7359cb46db9d", size = 14353 }, - { url = "https://files.pythonhosted.org/packages/6c/30/316d194b093cde57d448a4c3209f22e3046c5bb2fb0820b118292b334be7/MarkupSafe-3.0.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:93335ca3812df2f366e80509ae119189886b0f3c2b81325d39efdb84a1e2ae93", size = 12392 }, - { url = "https://files.pythonhosted.org/packages/f2/96/9cdafba8445d3a53cae530aaf83c38ec64c4d5427d975c974084af5bc5d2/MarkupSafe-3.0.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2cb8438c3cbb25e220c2ab33bb226559e7afb3baec11c4f218ffa7308603c832", size = 23984 }, - { url = "https://files.pythonhosted.org/packages/f1/a4/aefb044a2cd8d7334c8a47d3fb2c9f328ac48cb349468cc31c20b539305f/MarkupSafe-3.0.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a123e330ef0853c6e822384873bef7507557d8e4a082961e1defa947aa59ba84", size = 23120 }, - { url = "https://files.pythonhosted.org/packages/8d/21/5e4851379f88f3fad1de30361db501300d4f07bcad047d3cb0449fc51f8c/MarkupSafe-3.0.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1e084f686b92e5b83186b07e8a17fc09e38fff551f3602b249881fec658d3eca", size = 23032 }, - { url = "https://files.pythonhosted.org/packages/00/7b/e92c64e079b2d0d7ddf69899c98842f3f9a60a1ae72657c89ce2655c999d/MarkupSafe-3.0.2-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:d8213e09c917a951de9d09ecee036d5c7d36cb6cb7dbaece4c71a60d79fb9798", size = 24057 }, - { url = "https://files.pythonhosted.org/packages/f9/ac/46f960ca323037caa0a10662ef97d0a4728e890334fc156b9f9e52bcc4ca/MarkupSafe-3.0.2-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:5b02fb34468b6aaa40dfc198d813a641e3a63b98c2b05a16b9f80b7ec314185e", size = 23359 }, - { url = "https://files.pythonhosted.org/packages/69/84/83439e16197337b8b14b6a5b9c2105fff81d42c2a7c5b58ac7b62ee2c3b1/MarkupSafe-3.0.2-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:0bff5e0ae4ef2e1ae4fdf2dfd5b76c75e5c2fa4132d05fc1b0dabcd20c7e28c4", size = 23306 }, - { url = 
"https://files.pythonhosted.org/packages/9a/34/a15aa69f01e2181ed8d2b685c0d2f6655d5cca2c4db0ddea775e631918cd/MarkupSafe-3.0.2-cp311-cp311-win32.whl", hash = "sha256:6c89876f41da747c8d3677a2b540fb32ef5715f97b66eeb0c6b66f5e3ef6f59d", size = 15094 }, - { url = "https://files.pythonhosted.org/packages/da/b8/3a3bd761922d416f3dc5d00bfbed11f66b1ab89a0c2b6e887240a30b0f6b/MarkupSafe-3.0.2-cp311-cp311-win_amd64.whl", hash = "sha256:70a87b411535ccad5ef2f1df5136506a10775d267e197e4cf531ced10537bd6b", size = 15521 }, - { url = "https://files.pythonhosted.org/packages/22/09/d1f21434c97fc42f09d290cbb6350d44eb12f09cc62c9476effdb33a18aa/MarkupSafe-3.0.2-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:9778bd8ab0a994ebf6f84c2b949e65736d5575320a17ae8984a77fab08db94cf", size = 14274 }, - { url = "https://files.pythonhosted.org/packages/6b/b0/18f76bba336fa5aecf79d45dcd6c806c280ec44538b3c13671d49099fdd0/MarkupSafe-3.0.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:846ade7b71e3536c4e56b386c2a47adf5741d2d8b94ec9dc3e92e5e1ee1e2225", size = 12348 }, - { url = "https://files.pythonhosted.org/packages/e0/25/dd5c0f6ac1311e9b40f4af06c78efde0f3b5cbf02502f8ef9501294c425b/MarkupSafe-3.0.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1c99d261bd2d5f6b59325c92c73df481e05e57f19837bdca8413b9eac4bd8028", size = 24149 }, - { url = "https://files.pythonhosted.org/packages/f3/f0/89e7aadfb3749d0f52234a0c8c7867877876e0a20b60e2188e9850794c17/MarkupSafe-3.0.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e17c96c14e19278594aa4841ec148115f9c7615a47382ecb6b82bd8fea3ab0c8", size = 23118 }, - { url = "https://files.pythonhosted.org/packages/d5/da/f2eeb64c723f5e3777bc081da884b414671982008c47dcc1873d81f625b6/MarkupSafe-3.0.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:88416bd1e65dcea10bc7569faacb2c20ce071dd1f87539ca2ab364bf6231393c", size = 22993 }, - { url = "https://files.pythonhosted.org/packages/da/0e/1f32af846df486dce7c227fe0f2398dc7e2e51d4a370508281f3c1c5cddc/MarkupSafe-3.0.2-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:2181e67807fc2fa785d0592dc2d6206c019b9502410671cc905d132a92866557", size = 24178 }, - { url = "https://files.pythonhosted.org/packages/c4/f6/bb3ca0532de8086cbff5f06d137064c8410d10779c4c127e0e47d17c0b71/MarkupSafe-3.0.2-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:52305740fe773d09cffb16f8ed0427942901f00adedac82ec8b67752f58a1b22", size = 23319 }, - { url = "https://files.pythonhosted.org/packages/a2/82/8be4c96ffee03c5b4a034e60a31294daf481e12c7c43ab8e34a1453ee48b/MarkupSafe-3.0.2-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:ad10d3ded218f1039f11a75f8091880239651b52e9bb592ca27de44eed242a48", size = 23352 }, - { url = "https://files.pythonhosted.org/packages/51/ae/97827349d3fcffee7e184bdf7f41cd6b88d9919c80f0263ba7acd1bbcb18/MarkupSafe-3.0.2-cp312-cp312-win32.whl", hash = "sha256:0f4ca02bea9a23221c0182836703cbf8930c5e9454bacce27e767509fa286a30", size = 15097 }, - { url = "https://files.pythonhosted.org/packages/c1/80/a61f99dc3a936413c3ee4e1eecac96c0da5ed07ad56fd975f1a9da5bc630/MarkupSafe-3.0.2-cp312-cp312-win_amd64.whl", hash = "sha256:8e06879fc22a25ca47312fbe7c8264eb0b662f6db27cb2d3bbbc74b1df4b9b87", size = 15601 }, - { url = "https://files.pythonhosted.org/packages/83/0e/67eb10a7ecc77a0c2bbe2b0235765b98d164d81600746914bebada795e97/MarkupSafe-3.0.2-cp313-cp313-macosx_10_13_universal2.whl", hash = 
"sha256:ba9527cdd4c926ed0760bc301f6728ef34d841f405abf9d4f959c478421e4efd", size = 14274 }, - { url = "https://files.pythonhosted.org/packages/2b/6d/9409f3684d3335375d04e5f05744dfe7e9f120062c9857df4ab490a1031a/MarkupSafe-3.0.2-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:f8b3d067f2e40fe93e1ccdd6b2e1d16c43140e76f02fb1319a05cf2b79d99430", size = 12352 }, - { url = "https://files.pythonhosted.org/packages/d2/f5/6eadfcd3885ea85fe2a7c128315cc1bb7241e1987443d78c8fe712d03091/MarkupSafe-3.0.2-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:569511d3b58c8791ab4c2e1285575265991e6d8f8700c7be0e88f86cb0672094", size = 24122 }, - { url = "https://files.pythonhosted.org/packages/0c/91/96cf928db8236f1bfab6ce15ad070dfdd02ed88261c2afafd4b43575e9e9/MarkupSafe-3.0.2-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:15ab75ef81add55874e7ab7055e9c397312385bd9ced94920f2802310c930396", size = 23085 }, - { url = "https://files.pythonhosted.org/packages/c2/cf/c9d56af24d56ea04daae7ac0940232d31d5a8354f2b457c6d856b2057d69/MarkupSafe-3.0.2-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:f3818cb119498c0678015754eba762e0d61e5b52d34c8b13d770f0719f7b1d79", size = 22978 }, - { url = "https://files.pythonhosted.org/packages/2a/9f/8619835cd6a711d6272d62abb78c033bda638fdc54c4e7f4272cf1c0962b/MarkupSafe-3.0.2-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:cdb82a876c47801bb54a690c5ae105a46b392ac6099881cdfb9f6e95e4014c6a", size = 24208 }, - { url = "https://files.pythonhosted.org/packages/f9/bf/176950a1792b2cd2102b8ffeb5133e1ed984547b75db47c25a67d3359f77/MarkupSafe-3.0.2-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:cabc348d87e913db6ab4aa100f01b08f481097838bdddf7c7a84b7575b7309ca", size = 23357 }, - { url = "https://files.pythonhosted.org/packages/ce/4f/9a02c1d335caabe5c4efb90e1b6e8ee944aa245c1aaaab8e8a618987d816/MarkupSafe-3.0.2-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:444dcda765c8a838eaae23112db52f1efaf750daddb2d9ca300bcae1039adc5c", size = 23344 }, - { url = "https://files.pythonhosted.org/packages/ee/55/c271b57db36f748f0e04a759ace9f8f759ccf22b4960c270c78a394f58be/MarkupSafe-3.0.2-cp313-cp313-win32.whl", hash = "sha256:bcf3e58998965654fdaff38e58584d8937aa3096ab5354d493c77d1fdd66d7a1", size = 15101 }, - { url = "https://files.pythonhosted.org/packages/29/88/07df22d2dd4df40aba9f3e402e6dc1b8ee86297dddbad4872bd5e7b0094f/MarkupSafe-3.0.2-cp313-cp313-win_amd64.whl", hash = "sha256:e6a2a455bd412959b57a172ce6328d2dd1f01cb2135efda2e4576e8a23fa3b0f", size = 15603 }, - { url = "https://files.pythonhosted.org/packages/62/6a/8b89d24db2d32d433dffcd6a8779159da109842434f1dd2f6e71f32f738c/MarkupSafe-3.0.2-cp313-cp313t-macosx_10_13_universal2.whl", hash = "sha256:b5a6b3ada725cea8a5e634536b1b01c30bcdcd7f9c6fff4151548d5bf6b3a36c", size = 14510 }, - { url = "https://files.pythonhosted.org/packages/7a/06/a10f955f70a2e5a9bf78d11a161029d278eeacbd35ef806c3fd17b13060d/MarkupSafe-3.0.2-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:a904af0a6162c73e3edcb969eeeb53a63ceeb5d8cf642fade7d39e7963a22ddb", size = 12486 }, - { url = "https://files.pythonhosted.org/packages/34/cf/65d4a571869a1a9078198ca28f39fba5fbb910f952f9dbc5220afff9f5e6/MarkupSafe-3.0.2-cp313-cp313t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4aa4e5faecf353ed117801a068ebab7b7e09ffb6e1d5e412dc852e0da018126c", size = 25480 }, - { url = 
"https://files.pythonhosted.org/packages/0c/e3/90e9651924c430b885468b56b3d597cabf6d72be4b24a0acd1fa0e12af67/MarkupSafe-3.0.2-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c0ef13eaeee5b615fb07c9a7dadb38eac06a0608b41570d8ade51c56539e509d", size = 23914 }, - { url = "https://files.pythonhosted.org/packages/66/8c/6c7cf61f95d63bb866db39085150df1f2a5bd3335298f14a66b48e92659c/MarkupSafe-3.0.2-cp313-cp313t-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d16a81a06776313e817c951135cf7340a3e91e8c1ff2fac444cfd75fffa04afe", size = 23796 }, - { url = "https://files.pythonhosted.org/packages/bb/35/cbe9238ec3f47ac9a7c8b3df7a808e7cb50fe149dc7039f5f454b3fba218/MarkupSafe-3.0.2-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:6381026f158fdb7c72a168278597a5e3a5222e83ea18f543112b2662a9b699c5", size = 25473 }, - { url = "https://files.pythonhosted.org/packages/e6/32/7621a4382488aa283cc05e8984a9c219abad3bca087be9ec77e89939ded9/MarkupSafe-3.0.2-cp313-cp313t-musllinux_1_2_i686.whl", hash = "sha256:3d79d162e7be8f996986c064d1c7c817f6df3a77fe3d6859f6f9e7be4b8c213a", size = 24114 }, - { url = "https://files.pythonhosted.org/packages/0d/80/0985960e4b89922cb5a0bac0ed39c5b96cbc1a536a99f30e8c220a996ed9/MarkupSafe-3.0.2-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:131a3c7689c85f5ad20f9f6fb1b866f402c445b220c19fe4308c0b147ccd2ad9", size = 24098 }, - { url = "https://files.pythonhosted.org/packages/82/78/fedb03c7d5380df2427038ec8d973587e90561b2d90cd472ce9254cf348b/MarkupSafe-3.0.2-cp313-cp313t-win32.whl", hash = "sha256:ba8062ed2cf21c07a9e295d5b8a2a5ce678b913b45fdf68c32d95d6c1291e0b6", size = 15208 }, - { url = "https://files.pythonhosted.org/packages/4f/65/6079a46068dfceaeabb5dcad6d674f5f5c61a6fa5673746f42a9f4c233b3/MarkupSafe-3.0.2-cp313-cp313t-win_amd64.whl", hash = "sha256:e444a31f8db13eb18ada366ab3cf45fd4b31e4db1236a4448f68778c1d1a5a2f", size = 15739 }, +sdist = { url = "https://files.pythonhosted.org/packages/b2/97/5d42485e71dfc078108a86d6de8fa46db44a1a9295e89c5d6d4a06e23a62/markupsafe-3.0.2.tar.gz", hash = "sha256:ee55d3edf80167e48ea11a923c7386f4669df67d7994554387f84e7d8b0a2bf0", size = 20537, upload-time = "2024-10-18T15:21:54.129Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/04/90/d08277ce111dd22f77149fd1a5d4653eeb3b3eaacbdfcbae5afb2600eebd/MarkupSafe-3.0.2-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:7e94c425039cde14257288fd61dcfb01963e658efbc0ff54f5306b06054700f8", size = 14357, upload-time = "2024-10-18T15:20:51.44Z" }, + { url = "https://files.pythonhosted.org/packages/04/e1/6e2194baeae0bca1fae6629dc0cbbb968d4d941469cbab11a3872edff374/MarkupSafe-3.0.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:9e2d922824181480953426608b81967de705c3cef4d1af983af849d7bd619158", size = 12393, upload-time = "2024-10-18T15:20:52.426Z" }, + { url = "https://files.pythonhosted.org/packages/1d/69/35fa85a8ece0a437493dc61ce0bb6d459dcba482c34197e3efc829aa357f/MarkupSafe-3.0.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:38a9ef736c01fccdd6600705b09dc574584b89bea478200c5fbf112a6b0d5579", size = 21732, upload-time = "2024-10-18T15:20:53.578Z" }, + { url = "https://files.pythonhosted.org/packages/22/35/137da042dfb4720b638d2937c38a9c2df83fe32d20e8c8f3185dbfef05f7/MarkupSafe-3.0.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bbcb445fa71794da8f178f0f6d66789a28d7319071af7a496d4d507ed566270d", size = 20866, upload-time = "2024-10-18T15:20:55.06Z" }, + { 
url = "https://files.pythonhosted.org/packages/29/28/6d029a903727a1b62edb51863232152fd335d602def598dade38996887f0/MarkupSafe-3.0.2-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:57cb5a3cf367aeb1d316576250f65edec5bb3be939e9247ae594b4bcbc317dfb", size = 20964, upload-time = "2024-10-18T15:20:55.906Z" }, + { url = "https://files.pythonhosted.org/packages/cc/cd/07438f95f83e8bc028279909d9c9bd39e24149b0d60053a97b2bc4f8aa51/MarkupSafe-3.0.2-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:3809ede931876f5b2ec92eef964286840ed3540dadf803dd570c3b7e13141a3b", size = 21977, upload-time = "2024-10-18T15:20:57.189Z" }, + { url = "https://files.pythonhosted.org/packages/29/01/84b57395b4cc062f9c4c55ce0df7d3108ca32397299d9df00fedd9117d3d/MarkupSafe-3.0.2-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:e07c3764494e3776c602c1e78e298937c3315ccc9043ead7e685b7f2b8d47b3c", size = 21366, upload-time = "2024-10-18T15:20:58.235Z" }, + { url = "https://files.pythonhosted.org/packages/bd/6e/61ebf08d8940553afff20d1fb1ba7294b6f8d279df9fd0c0db911b4bbcfd/MarkupSafe-3.0.2-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:b424c77b206d63d500bcb69fa55ed8d0e6a3774056bdc4839fc9298a7edca171", size = 21091, upload-time = "2024-10-18T15:20:59.235Z" }, + { url = "https://files.pythonhosted.org/packages/11/23/ffbf53694e8c94ebd1e7e491de185124277964344733c45481f32ede2499/MarkupSafe-3.0.2-cp310-cp310-win32.whl", hash = "sha256:fcabf5ff6eea076f859677f5f0b6b5c1a51e70a376b0579e0eadef8db48c6b50", size = 15065, upload-time = "2024-10-18T15:21:00.307Z" }, + { url = "https://files.pythonhosted.org/packages/44/06/e7175d06dd6e9172d4a69a72592cb3f7a996a9c396eee29082826449bbc3/MarkupSafe-3.0.2-cp310-cp310-win_amd64.whl", hash = "sha256:6af100e168aa82a50e186c82875a5893c5597a0c1ccdb0d8b40240b1f28b969a", size = 15514, upload-time = "2024-10-18T15:21:01.122Z" }, + { url = "https://files.pythonhosted.org/packages/6b/28/bbf83e3f76936960b850435576dd5e67034e200469571be53f69174a2dfd/MarkupSafe-3.0.2-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:9025b4018f3a1314059769c7bf15441064b2207cb3f065e6ea1e7359cb46db9d", size = 14353, upload-time = "2024-10-18T15:21:02.187Z" }, + { url = "https://files.pythonhosted.org/packages/6c/30/316d194b093cde57d448a4c3209f22e3046c5bb2fb0820b118292b334be7/MarkupSafe-3.0.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:93335ca3812df2f366e80509ae119189886b0f3c2b81325d39efdb84a1e2ae93", size = 12392, upload-time = "2024-10-18T15:21:02.941Z" }, + { url = "https://files.pythonhosted.org/packages/f2/96/9cdafba8445d3a53cae530aaf83c38ec64c4d5427d975c974084af5bc5d2/MarkupSafe-3.0.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2cb8438c3cbb25e220c2ab33bb226559e7afb3baec11c4f218ffa7308603c832", size = 23984, upload-time = "2024-10-18T15:21:03.953Z" }, + { url = "https://files.pythonhosted.org/packages/f1/a4/aefb044a2cd8d7334c8a47d3fb2c9f328ac48cb349468cc31c20b539305f/MarkupSafe-3.0.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a123e330ef0853c6e822384873bef7507557d8e4a082961e1defa947aa59ba84", size = 23120, upload-time = "2024-10-18T15:21:06.495Z" }, + { url = "https://files.pythonhosted.org/packages/8d/21/5e4851379f88f3fad1de30361db501300d4f07bcad047d3cb0449fc51f8c/MarkupSafe-3.0.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1e084f686b92e5b83186b07e8a17fc09e38fff551f3602b249881fec658d3eca", size = 23032, upload-time = 
"2024-10-18T15:21:07.295Z" }, + { url = "https://files.pythonhosted.org/packages/00/7b/e92c64e079b2d0d7ddf69899c98842f3f9a60a1ae72657c89ce2655c999d/MarkupSafe-3.0.2-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:d8213e09c917a951de9d09ecee036d5c7d36cb6cb7dbaece4c71a60d79fb9798", size = 24057, upload-time = "2024-10-18T15:21:08.073Z" }, + { url = "https://files.pythonhosted.org/packages/f9/ac/46f960ca323037caa0a10662ef97d0a4728e890334fc156b9f9e52bcc4ca/MarkupSafe-3.0.2-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:5b02fb34468b6aaa40dfc198d813a641e3a63b98c2b05a16b9f80b7ec314185e", size = 23359, upload-time = "2024-10-18T15:21:09.318Z" }, + { url = "https://files.pythonhosted.org/packages/69/84/83439e16197337b8b14b6a5b9c2105fff81d42c2a7c5b58ac7b62ee2c3b1/MarkupSafe-3.0.2-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:0bff5e0ae4ef2e1ae4fdf2dfd5b76c75e5c2fa4132d05fc1b0dabcd20c7e28c4", size = 23306, upload-time = "2024-10-18T15:21:10.185Z" }, + { url = "https://files.pythonhosted.org/packages/9a/34/a15aa69f01e2181ed8d2b685c0d2f6655d5cca2c4db0ddea775e631918cd/MarkupSafe-3.0.2-cp311-cp311-win32.whl", hash = "sha256:6c89876f41da747c8d3677a2b540fb32ef5715f97b66eeb0c6b66f5e3ef6f59d", size = 15094, upload-time = "2024-10-18T15:21:11.005Z" }, + { url = "https://files.pythonhosted.org/packages/da/b8/3a3bd761922d416f3dc5d00bfbed11f66b1ab89a0c2b6e887240a30b0f6b/MarkupSafe-3.0.2-cp311-cp311-win_amd64.whl", hash = "sha256:70a87b411535ccad5ef2f1df5136506a10775d267e197e4cf531ced10537bd6b", size = 15521, upload-time = "2024-10-18T15:21:12.911Z" }, + { url = "https://files.pythonhosted.org/packages/22/09/d1f21434c97fc42f09d290cbb6350d44eb12f09cc62c9476effdb33a18aa/MarkupSafe-3.0.2-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:9778bd8ab0a994ebf6f84c2b949e65736d5575320a17ae8984a77fab08db94cf", size = 14274, upload-time = "2024-10-18T15:21:13.777Z" }, + { url = "https://files.pythonhosted.org/packages/6b/b0/18f76bba336fa5aecf79d45dcd6c806c280ec44538b3c13671d49099fdd0/MarkupSafe-3.0.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:846ade7b71e3536c4e56b386c2a47adf5741d2d8b94ec9dc3e92e5e1ee1e2225", size = 12348, upload-time = "2024-10-18T15:21:14.822Z" }, + { url = "https://files.pythonhosted.org/packages/e0/25/dd5c0f6ac1311e9b40f4af06c78efde0f3b5cbf02502f8ef9501294c425b/MarkupSafe-3.0.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1c99d261bd2d5f6b59325c92c73df481e05e57f19837bdca8413b9eac4bd8028", size = 24149, upload-time = "2024-10-18T15:21:15.642Z" }, + { url = "https://files.pythonhosted.org/packages/f3/f0/89e7aadfb3749d0f52234a0c8c7867877876e0a20b60e2188e9850794c17/MarkupSafe-3.0.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e17c96c14e19278594aa4841ec148115f9c7615a47382ecb6b82bd8fea3ab0c8", size = 23118, upload-time = "2024-10-18T15:21:17.133Z" }, + { url = "https://files.pythonhosted.org/packages/d5/da/f2eeb64c723f5e3777bc081da884b414671982008c47dcc1873d81f625b6/MarkupSafe-3.0.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:88416bd1e65dcea10bc7569faacb2c20ce071dd1f87539ca2ab364bf6231393c", size = 22993, upload-time = "2024-10-18T15:21:18.064Z" }, + { url = "https://files.pythonhosted.org/packages/da/0e/1f32af846df486dce7c227fe0f2398dc7e2e51d4a370508281f3c1c5cddc/MarkupSafe-3.0.2-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:2181e67807fc2fa785d0592dc2d6206c019b9502410671cc905d132a92866557", size = 24178, upload-time = "2024-10-18T15:21:18.859Z" 
}, + { url = "https://files.pythonhosted.org/packages/c4/f6/bb3ca0532de8086cbff5f06d137064c8410d10779c4c127e0e47d17c0b71/MarkupSafe-3.0.2-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:52305740fe773d09cffb16f8ed0427942901f00adedac82ec8b67752f58a1b22", size = 23319, upload-time = "2024-10-18T15:21:19.671Z" }, + { url = "https://files.pythonhosted.org/packages/a2/82/8be4c96ffee03c5b4a034e60a31294daf481e12c7c43ab8e34a1453ee48b/MarkupSafe-3.0.2-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:ad10d3ded218f1039f11a75f8091880239651b52e9bb592ca27de44eed242a48", size = 23352, upload-time = "2024-10-18T15:21:20.971Z" }, + { url = "https://files.pythonhosted.org/packages/51/ae/97827349d3fcffee7e184bdf7f41cd6b88d9919c80f0263ba7acd1bbcb18/MarkupSafe-3.0.2-cp312-cp312-win32.whl", hash = "sha256:0f4ca02bea9a23221c0182836703cbf8930c5e9454bacce27e767509fa286a30", size = 15097, upload-time = "2024-10-18T15:21:22.646Z" }, + { url = "https://files.pythonhosted.org/packages/c1/80/a61f99dc3a936413c3ee4e1eecac96c0da5ed07ad56fd975f1a9da5bc630/MarkupSafe-3.0.2-cp312-cp312-win_amd64.whl", hash = "sha256:8e06879fc22a25ca47312fbe7c8264eb0b662f6db27cb2d3bbbc74b1df4b9b87", size = 15601, upload-time = "2024-10-18T15:21:23.499Z" }, + { url = "https://files.pythonhosted.org/packages/83/0e/67eb10a7ecc77a0c2bbe2b0235765b98d164d81600746914bebada795e97/MarkupSafe-3.0.2-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:ba9527cdd4c926ed0760bc301f6728ef34d841f405abf9d4f959c478421e4efd", size = 14274, upload-time = "2024-10-18T15:21:24.577Z" }, + { url = "https://files.pythonhosted.org/packages/2b/6d/9409f3684d3335375d04e5f05744dfe7e9f120062c9857df4ab490a1031a/MarkupSafe-3.0.2-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:f8b3d067f2e40fe93e1ccdd6b2e1d16c43140e76f02fb1319a05cf2b79d99430", size = 12352, upload-time = "2024-10-18T15:21:25.382Z" }, + { url = "https://files.pythonhosted.org/packages/d2/f5/6eadfcd3885ea85fe2a7c128315cc1bb7241e1987443d78c8fe712d03091/MarkupSafe-3.0.2-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:569511d3b58c8791ab4c2e1285575265991e6d8f8700c7be0e88f86cb0672094", size = 24122, upload-time = "2024-10-18T15:21:26.199Z" }, + { url = "https://files.pythonhosted.org/packages/0c/91/96cf928db8236f1bfab6ce15ad070dfdd02ed88261c2afafd4b43575e9e9/MarkupSafe-3.0.2-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:15ab75ef81add55874e7ab7055e9c397312385bd9ced94920f2802310c930396", size = 23085, upload-time = "2024-10-18T15:21:27.029Z" }, + { url = "https://files.pythonhosted.org/packages/c2/cf/c9d56af24d56ea04daae7ac0940232d31d5a8354f2b457c6d856b2057d69/MarkupSafe-3.0.2-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:f3818cb119498c0678015754eba762e0d61e5b52d34c8b13d770f0719f7b1d79", size = 22978, upload-time = "2024-10-18T15:21:27.846Z" }, + { url = "https://files.pythonhosted.org/packages/2a/9f/8619835cd6a711d6272d62abb78c033bda638fdc54c4e7f4272cf1c0962b/MarkupSafe-3.0.2-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:cdb82a876c47801bb54a690c5ae105a46b392ac6099881cdfb9f6e95e4014c6a", size = 24208, upload-time = "2024-10-18T15:21:28.744Z" }, + { url = "https://files.pythonhosted.org/packages/f9/bf/176950a1792b2cd2102b8ffeb5133e1ed984547b75db47c25a67d3359f77/MarkupSafe-3.0.2-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:cabc348d87e913db6ab4aa100f01b08f481097838bdddf7c7a84b7575b7309ca", size = 23357, upload-time = "2024-10-18T15:21:29.545Z" }, + { url = 
"https://files.pythonhosted.org/packages/ce/4f/9a02c1d335caabe5c4efb90e1b6e8ee944aa245c1aaaab8e8a618987d816/MarkupSafe-3.0.2-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:444dcda765c8a838eaae23112db52f1efaf750daddb2d9ca300bcae1039adc5c", size = 23344, upload-time = "2024-10-18T15:21:30.366Z" }, + { url = "https://files.pythonhosted.org/packages/ee/55/c271b57db36f748f0e04a759ace9f8f759ccf22b4960c270c78a394f58be/MarkupSafe-3.0.2-cp313-cp313-win32.whl", hash = "sha256:bcf3e58998965654fdaff38e58584d8937aa3096ab5354d493c77d1fdd66d7a1", size = 15101, upload-time = "2024-10-18T15:21:31.207Z" }, + { url = "https://files.pythonhosted.org/packages/29/88/07df22d2dd4df40aba9f3e402e6dc1b8ee86297dddbad4872bd5e7b0094f/MarkupSafe-3.0.2-cp313-cp313-win_amd64.whl", hash = "sha256:e6a2a455bd412959b57a172ce6328d2dd1f01cb2135efda2e4576e8a23fa3b0f", size = 15603, upload-time = "2024-10-18T15:21:32.032Z" }, + { url = "https://files.pythonhosted.org/packages/62/6a/8b89d24db2d32d433dffcd6a8779159da109842434f1dd2f6e71f32f738c/MarkupSafe-3.0.2-cp313-cp313t-macosx_10_13_universal2.whl", hash = "sha256:b5a6b3ada725cea8a5e634536b1b01c30bcdcd7f9c6fff4151548d5bf6b3a36c", size = 14510, upload-time = "2024-10-18T15:21:33.625Z" }, + { url = "https://files.pythonhosted.org/packages/7a/06/a10f955f70a2e5a9bf78d11a161029d278eeacbd35ef806c3fd17b13060d/MarkupSafe-3.0.2-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:a904af0a6162c73e3edcb969eeeb53a63ceeb5d8cf642fade7d39e7963a22ddb", size = 12486, upload-time = "2024-10-18T15:21:34.611Z" }, + { url = "https://files.pythonhosted.org/packages/34/cf/65d4a571869a1a9078198ca28f39fba5fbb910f952f9dbc5220afff9f5e6/MarkupSafe-3.0.2-cp313-cp313t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4aa4e5faecf353ed117801a068ebab7b7e09ffb6e1d5e412dc852e0da018126c", size = 25480, upload-time = "2024-10-18T15:21:35.398Z" }, + { url = "https://files.pythonhosted.org/packages/0c/e3/90e9651924c430b885468b56b3d597cabf6d72be4b24a0acd1fa0e12af67/MarkupSafe-3.0.2-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c0ef13eaeee5b615fb07c9a7dadb38eac06a0608b41570d8ade51c56539e509d", size = 23914, upload-time = "2024-10-18T15:21:36.231Z" }, + { url = "https://files.pythonhosted.org/packages/66/8c/6c7cf61f95d63bb866db39085150df1f2a5bd3335298f14a66b48e92659c/MarkupSafe-3.0.2-cp313-cp313t-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d16a81a06776313e817c951135cf7340a3e91e8c1ff2fac444cfd75fffa04afe", size = 23796, upload-time = "2024-10-18T15:21:37.073Z" }, + { url = "https://files.pythonhosted.org/packages/bb/35/cbe9238ec3f47ac9a7c8b3df7a808e7cb50fe149dc7039f5f454b3fba218/MarkupSafe-3.0.2-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:6381026f158fdb7c72a168278597a5e3a5222e83ea18f543112b2662a9b699c5", size = 25473, upload-time = "2024-10-18T15:21:37.932Z" }, + { url = "https://files.pythonhosted.org/packages/e6/32/7621a4382488aa283cc05e8984a9c219abad3bca087be9ec77e89939ded9/MarkupSafe-3.0.2-cp313-cp313t-musllinux_1_2_i686.whl", hash = "sha256:3d79d162e7be8f996986c064d1c7c817f6df3a77fe3d6859f6f9e7be4b8c213a", size = 24114, upload-time = "2024-10-18T15:21:39.799Z" }, + { url = "https://files.pythonhosted.org/packages/0d/80/0985960e4b89922cb5a0bac0ed39c5b96cbc1a536a99f30e8c220a996ed9/MarkupSafe-3.0.2-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:131a3c7689c85f5ad20f9f6fb1b866f402c445b220c19fe4308c0b147ccd2ad9", size = 24098, upload-time = "2024-10-18T15:21:40.813Z" }, + { url = 
"https://files.pythonhosted.org/packages/82/78/fedb03c7d5380df2427038ec8d973587e90561b2d90cd472ce9254cf348b/MarkupSafe-3.0.2-cp313-cp313t-win32.whl", hash = "sha256:ba8062ed2cf21c07a9e295d5b8a2a5ce678b913b45fdf68c32d95d6c1291e0b6", size = 15208, upload-time = "2024-10-18T15:21:41.814Z" }, + { url = "https://files.pythonhosted.org/packages/4f/65/6079a46068dfceaeabb5dcad6d674f5f5c61a6fa5673746f42a9f4c233b3/MarkupSafe-3.0.2-cp313-cp313t-win_amd64.whl", hash = "sha256:e444a31f8db13eb18ada366ab3cf45fd4b31e4db1236a4448f68778c1d1a5a2f", size = 15739, upload-time = "2024-10-18T15:21:42.784Z" }, ] [[package]] @@ -494,6 +534,7 @@ dependencies = [ { name = "httpx-sse" }, { name = "pydantic" }, { name = "pydantic-settings" }, + { name = "python-multipart" }, { name = "sse-starlette" }, { name = "starlette" }, { name = "uvicorn", marker = "sys_platform != 'emscripten'" }, @@ -513,10 +554,12 @@ ws = [ [package.dev-dependencies] dev = [ + { name = "inline-snapshot" }, { name = "pyright" }, { name = "pytest" }, { name = "pytest-examples" }, { name = "pytest-flakefinder" }, + { name = "pytest-pretty" }, { name = "pytest-xdist" }, { name = "ruff" }, { name = "trio" }, @@ -536,6 +579,7 @@ requires-dist = [ { name = "pydantic", specifier = ">=2.7.2,<3.0.0" }, { name = "pydantic-settings", specifier = ">=2.5.2" }, { name = "python-dotenv", marker = "extra == 'cli'", specifier = ">=1.0.0" }, + { name = "python-multipart", specifier = ">=0.0.9" }, { name = "rich", marker = "extra == 'rich'", specifier = ">=13.9.4" }, { name = "sse-starlette", specifier = ">=1.6.1" }, { name = "starlette", specifier = ">=0.27" }, @@ -547,10 +591,12 @@ provides-extras = ["cli", "rich", "ws"] [package.metadata.requires-dev] dev = [ + { name = "inline-snapshot", specifier = ">=0.23.0" }, { name = "pyright", specifier = ">=1.1.391" }, { name = "pytest", specifier = ">=8.3.4" }, { name = "pytest-examples", specifier = ">=0.0.14" }, { name = "pytest-flakefinder", specifier = ">=1.1.0" }, + { name = "pytest-pretty", specifier = ">=1.2.0" }, { name = "pytest-xdist", specifier = ">=3.6.1" }, { name = "ruff", specifier = ">=0.8.5" }, { name = "trio", specifier = ">=0.26.2" }, @@ -562,6 +608,47 @@ docs = [ { name = "mkdocstrings-python", specifier = ">=1.12.2" }, ] +[[package]] +name = "mcp-simple-auth" +version = "0.1.0" +source = { editable = "examples/servers/simple-auth" } +dependencies = [ + { name = "anyio" }, + { name = "click" }, + { name = "httpx" }, + { name = "mcp" }, + { name = "pydantic" }, + { name = "pydantic-settings" }, + { name = "sse-starlette" }, + { name = "uvicorn", marker = "sys_platform != 'emscripten'" }, +] + +[package.dev-dependencies] +dev = [ + { name = "pyright" }, + { name = "pytest" }, + { name = "ruff" }, +] + +[package.metadata] +requires-dist = [ + { name = "anyio", specifier = ">=4.5" }, + { name = "click", specifier = ">=8.1.0" }, + { name = "httpx", specifier = ">=0.27" }, + { name = "mcp", editable = "." 
}, + { name = "pydantic", specifier = ">=2.0" }, + { name = "pydantic-settings", specifier = ">=2.5.2" }, + { name = "sse-starlette", specifier = ">=1.6.1" }, + { name = "uvicorn", marker = "sys_platform != 'emscripten'", specifier = ">=0.23.1" }, +] + +[package.metadata.requires-dev] +dev = [ + { name = "pyright", specifier = ">=1.1.391" }, + { name = "pytest", specifier = ">=8.3.4" }, + { name = "ruff", specifier = ">=0.8.5" }, +] + [[package]] name = "mcp-simple-prompt" version = "0.1.0" @@ -628,6 +715,80 @@ dev = [ { name = "ruff", specifier = ">=0.6.9" }, ] +[[package]] +name = "mcp-simple-streamablehttp" +version = "0.1.0" +source = { editable = "examples/servers/simple-streamablehttp" } +dependencies = [ + { name = "anyio" }, + { name = "click" }, + { name = "httpx" }, + { name = "mcp" }, + { name = "starlette" }, + { name = "uvicorn" }, +] + +[package.dev-dependencies] +dev = [ + { name = "pyright" }, + { name = "pytest" }, + { name = "ruff" }, +] + +[package.metadata] +requires-dist = [ + { name = "anyio", specifier = ">=4.5" }, + { name = "click", specifier = ">=8.1.0" }, + { name = "httpx", specifier = ">=0.27" }, + { name = "mcp", editable = "." }, + { name = "starlette" }, + { name = "uvicorn" }, +] + +[package.metadata.requires-dev] +dev = [ + { name = "pyright", specifier = ">=1.1.378" }, + { name = "pytest", specifier = ">=8.3.3" }, + { name = "ruff", specifier = ">=0.6.9" }, +] + +[[package]] +name = "mcp-simple-streamablehttp-stateless" +version = "0.1.0" +source = { editable = "examples/servers/simple-streamablehttp-stateless" } +dependencies = [ + { name = "anyio" }, + { name = "click" }, + { name = "httpx" }, + { name = "mcp" }, + { name = "starlette" }, + { name = "uvicorn" }, +] + +[package.dev-dependencies] +dev = [ + { name = "pyright" }, + { name = "pytest" }, + { name = "ruff" }, +] + +[package.metadata] +requires-dist = [ + { name = "anyio", specifier = ">=4.5" }, + { name = "click", specifier = ">=8.1.0" }, + { name = "httpx", specifier = ">=0.27" }, + { name = "mcp", editable = "." 
}, + { name = "starlette" }, + { name = "uvicorn" }, +] + +[package.metadata.requires-dev] +dev = [ + { name = "pyright", specifier = ">=1.1.378" }, + { name = "pytest", specifier = ">=8.3.3" }, + { name = "ruff", specifier = ">=0.6.9" }, +] + [[package]] name = "mcp-simple-tool" version = "0.1.0" @@ -665,18 +826,18 @@ dev = [ name = "mdurl" version = "0.1.2" source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/d6/54/cfe61301667036ec958cb99bd3efefba235e65cdeb9c84d24a8293ba1d90/mdurl-0.1.2.tar.gz", hash = "sha256:bb413d29f5eea38f31dd4754dd7377d4465116fb207585f97bf925588687c1ba", size = 8729 } +sdist = { url = "https://files.pythonhosted.org/packages/d6/54/cfe61301667036ec958cb99bd3efefba235e65cdeb9c84d24a8293ba1d90/mdurl-0.1.2.tar.gz", hash = "sha256:bb413d29f5eea38f31dd4754dd7377d4465116fb207585f97bf925588687c1ba", size = 8729, upload-time = "2022-08-14T12:40:10.846Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/b3/38/89ba8ad64ae25be8de66a6d463314cf1eb366222074cfda9ee839c56a4b4/mdurl-0.1.2-py3-none-any.whl", hash = "sha256:84008a41e51615a49fc9966191ff91509e3c40b939176e643fd50a5c2196b8f8", size = 9979 }, + { url = "https://files.pythonhosted.org/packages/b3/38/89ba8ad64ae25be8de66a6d463314cf1eb366222074cfda9ee839c56a4b4/mdurl-0.1.2-py3-none-any.whl", hash = "sha256:84008a41e51615a49fc9966191ff91509e3c40b939176e643fd50a5c2196b8f8", size = 9979, upload-time = "2022-08-14T12:40:09.779Z" }, ] [[package]] name = "mergedeep" version = "1.3.4" source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/3a/41/580bb4006e3ed0361b8151a01d324fb03f420815446c7def45d02f74c270/mergedeep-1.3.4.tar.gz", hash = "sha256:0096d52e9dad9939c3d975a774666af186eda617e6ca84df4c94dec30004f2a8", size = 4661 } +sdist = { url = "https://files.pythonhosted.org/packages/3a/41/580bb4006e3ed0361b8151a01d324fb03f420815446c7def45d02f74c270/mergedeep-1.3.4.tar.gz", hash = "sha256:0096d52e9dad9939c3d975a774666af186eda617e6ca84df4c94dec30004f2a8", size = 4661, upload-time = "2021-02-05T18:55:30.623Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/2c/19/04f9b178c2d8a15b076c8b5140708fa6ffc5601fb6f1e975537072df5b2a/mergedeep-1.3.4-py3-none-any.whl", hash = "sha256:70775750742b25c0d8f36c55aed03d24c3384d17c951b3175d898bd778ef0307", size = 6354 }, + { url = "https://files.pythonhosted.org/packages/2c/19/04f9b178c2d8a15b076c8b5140708fa6ffc5601fb6f1e975537072df5b2a/mergedeep-1.3.4-py3-none-any.whl", hash = "sha256:70775750742b25c0d8f36c55aed03d24c3384d17c951b3175d898bd778ef0307", size = 6354, upload-time = "2021-02-05T18:55:29.583Z" }, ] [[package]] @@ -698,9 +859,9 @@ dependencies = [ { name = "pyyaml-env-tag" }, { name = "watchdog" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/bc/c6/bbd4f061bd16b378247f12953ffcb04786a618ce5e904b8c5a01a0309061/mkdocs-1.6.1.tar.gz", hash = "sha256:7b432f01d928c084353ab39c57282f29f92136665bdd6abf7c1ec8d822ef86f2", size = 3889159 } +sdist = { url = "https://files.pythonhosted.org/packages/bc/c6/bbd4f061bd16b378247f12953ffcb04786a618ce5e904b8c5a01a0309061/mkdocs-1.6.1.tar.gz", hash = "sha256:7b432f01d928c084353ab39c57282f29f92136665bdd6abf7c1ec8d822ef86f2", size = 3889159, upload-time = "2024-08-30T12:24:06.899Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/22/5b/dbc6a8cddc9cfa9c4971d59fb12bb8d42e161b7e7f8cc89e49137c5b279c/mkdocs-1.6.1-py3-none-any.whl", hash = "sha256:db91759624d1647f3f34aa0c3f327dd2601beae39a366d6e064c03468d35c20e", 
size = 3864451 }, + { url = "https://files.pythonhosted.org/packages/22/5b/dbc6a8cddc9cfa9c4971d59fb12bb8d42e161b7e7f8cc89e49137c5b279c/mkdocs-1.6.1-py3-none-any.whl", hash = "sha256:db91759624d1647f3f34aa0c3f327dd2601beae39a366d6e064c03468d35c20e", size = 3864451, upload-time = "2024-08-30T12:24:05.054Z" }, ] [[package]] @@ -712,9 +873,9 @@ dependencies = [ { name = "markupsafe" }, { name = "mkdocs" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/c2/44/140469d87379c02f1e1870315f3143718036a983dd0416650827b8883192/mkdocs_autorefs-1.4.1.tar.gz", hash = "sha256:4b5b6235a4becb2b10425c2fa191737e415b37aa3418919db33e5d774c9db079", size = 4131355 } +sdist = { url = "https://files.pythonhosted.org/packages/c2/44/140469d87379c02f1e1870315f3143718036a983dd0416650827b8883192/mkdocs_autorefs-1.4.1.tar.gz", hash = "sha256:4b5b6235a4becb2b10425c2fa191737e415b37aa3418919db33e5d774c9db079", size = 4131355, upload-time = "2025-03-08T13:35:21.232Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/f8/29/1125f7b11db63e8e32bcfa0752a4eea30abff3ebd0796f808e14571ddaa2/mkdocs_autorefs-1.4.1-py3-none-any.whl", hash = "sha256:9793c5ac06a6ebbe52ec0f8439256e66187badf4b5334b5fde0b128ec134df4f", size = 5782047 }, + { url = "https://files.pythonhosted.org/packages/f8/29/1125f7b11db63e8e32bcfa0752a4eea30abff3ebd0796f808e14571ddaa2/mkdocs_autorefs-1.4.1-py3-none-any.whl", hash = "sha256:9793c5ac06a6ebbe52ec0f8439256e66187badf4b5334b5fde0b128ec134df4f", size = 5782047, upload-time = "2025-03-08T13:35:18.889Z" }, ] [[package]] @@ -726,18 +887,18 @@ dependencies = [ { name = "platformdirs" }, { name = "pyyaml" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/98/f5/ed29cd50067784976f25ed0ed6fcd3c2ce9eb90650aa3b2796ddf7b6870b/mkdocs_get_deps-0.2.0.tar.gz", hash = "sha256:162b3d129c7fad9b19abfdcb9c1458a651628e4b1dea628ac68790fb3061c60c", size = 10239 } +sdist = { url = "https://files.pythonhosted.org/packages/98/f5/ed29cd50067784976f25ed0ed6fcd3c2ce9eb90650aa3b2796ddf7b6870b/mkdocs_get_deps-0.2.0.tar.gz", hash = "sha256:162b3d129c7fad9b19abfdcb9c1458a651628e4b1dea628ac68790fb3061c60c", size = 10239, upload-time = "2023-11-20T17:51:09.981Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/9f/d4/029f984e8d3f3b6b726bd33cafc473b75e9e44c0f7e80a5b29abc466bdea/mkdocs_get_deps-0.2.0-py3-none-any.whl", hash = "sha256:2bf11d0b133e77a0dd036abeeb06dec8775e46efa526dc70667d8863eefc6134", size = 9521 }, + { url = "https://files.pythonhosted.org/packages/9f/d4/029f984e8d3f3b6b726bd33cafc473b75e9e44c0f7e80a5b29abc466bdea/mkdocs_get_deps-0.2.0-py3-none-any.whl", hash = "sha256:2bf11d0b133e77a0dd036abeeb06dec8775e46efa526dc70667d8863eefc6134", size = 9521, upload-time = "2023-11-20T17:51:08.587Z" }, ] [[package]] name = "mkdocs-glightbox" version = "0.4.0" source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/86/5a/0bc456397ba0acc684b5b1daa4ca232ed717938fd37198251d8bcc4053bf/mkdocs-glightbox-0.4.0.tar.gz", hash = "sha256:392b34207bf95991071a16d5f8916d1d2f2cd5d5bb59ae2997485ccd778c70d9", size = 32010 } +sdist = { url = "https://files.pythonhosted.org/packages/86/5a/0bc456397ba0acc684b5b1daa4ca232ed717938fd37198251d8bcc4053bf/mkdocs-glightbox-0.4.0.tar.gz", hash = "sha256:392b34207bf95991071a16d5f8916d1d2f2cd5d5bb59ae2997485ccd778c70d9", size = 32010, upload-time = "2024-05-06T14:31:43.063Z" } wheels = [ - { url = 
"https://files.pythonhosted.org/packages/c1/72/b0c2128bb569c732c11ae8e49a777089e77d83c05946062caa19b841e6fb/mkdocs_glightbox-0.4.0-py3-none-any.whl", hash = "sha256:e0107beee75d3eb7380ac06ea2d6eac94c999eaa49f8c3cbab0e7be2ac006ccf", size = 31154 }, + { url = "https://files.pythonhosted.org/packages/c1/72/b0c2128bb569c732c11ae8e49a777089e77d83c05946062caa19b841e6fb/mkdocs_glightbox-0.4.0-py3-none-any.whl", hash = "sha256:e0107beee75d3eb7380ac06ea2d6eac94c999eaa49f8c3cbab0e7be2ac006ccf", size = 31154, upload-time = "2024-05-06T14:31:41.011Z" }, ] [[package]] @@ -757,9 +918,9 @@ dependencies = [ { name = "regex" }, { name = "requests" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/02/02/38f1f76252462b8e9652eb3778905206c1f3b9b4c25bf60aafc029675a2b/mkdocs_material-9.5.45.tar.gz", hash = "sha256:286489cf0beca4a129d91d59d6417419c63bceed1ce5cd0ec1fc7e1ebffb8189", size = 3906694 } +sdist = { url = "https://files.pythonhosted.org/packages/02/02/38f1f76252462b8e9652eb3778905206c1f3b9b4c25bf60aafc029675a2b/mkdocs_material-9.5.45.tar.gz", hash = "sha256:286489cf0beca4a129d91d59d6417419c63bceed1ce5cd0ec1fc7e1ebffb8189", size = 3906694, upload-time = "2024-11-20T08:17:02.051Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/5c/43/f5f866cd840e14f82068831e53446ea1f66a128cd38a229c5b9c9243ed9e/mkdocs_material-9.5.45-py3-none-any.whl", hash = "sha256:a9be237cfd0be14be75f40f1726d83aa3a81ce44808dc3594d47a7a592f44547", size = 8615700 }, + { url = "https://files.pythonhosted.org/packages/5c/43/f5f866cd840e14f82068831e53446ea1f66a128cd38a229c5b9c9243ed9e/mkdocs_material-9.5.45-py3-none-any.whl", hash = "sha256:a9be237cfd0be14be75f40f1726d83aa3a81ce44808dc3594d47a7a592f44547", size = 8615700, upload-time = "2024-11-20T08:16:58.906Z" }, ] [package.optional-dependencies] @@ -772,9 +933,9 @@ imaging = [ name = "mkdocs-material-extensions" version = "1.3.1" source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/79/9b/9b4c96d6593b2a541e1cb8b34899a6d021d208bb357042823d4d2cabdbe7/mkdocs_material_extensions-1.3.1.tar.gz", hash = "sha256:10c9511cea88f568257f960358a467d12b970e1f7b2c0e5fb2bb48cab1928443", size = 11847 } +sdist = { url = "https://files.pythonhosted.org/packages/79/9b/9b4c96d6593b2a541e1cb8b34899a6d021d208bb357042823d4d2cabdbe7/mkdocs_material_extensions-1.3.1.tar.gz", hash = "sha256:10c9511cea88f568257f960358a467d12b970e1f7b2c0e5fb2bb48cab1928443", size = 11847, upload-time = "2023-11-22T19:09:45.208Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/5b/54/662a4743aa81d9582ee9339d4ffa3c8fd40a4965e033d77b9da9774d3960/mkdocs_material_extensions-1.3.1-py3-none-any.whl", hash = "sha256:adff8b62700b25cb77b53358dad940f3ef973dd6db797907c49e3c2ef3ab4e31", size = 8728 }, + { url = "https://files.pythonhosted.org/packages/5b/54/662a4743aa81d9582ee9339d4ffa3c8fd40a4965e033d77b9da9774d3960/mkdocs_material_extensions-1.3.1-py3-none-any.whl", hash = "sha256:adff8b62700b25cb77b53358dad940f3ef973dd6db797907c49e3c2ef3ab4e31", size = 8728, upload-time = "2023-11-22T19:09:43.465Z" }, ] [[package]] @@ -789,9 +950,9 @@ dependencies = [ { name = "mkdocs-autorefs" }, { name = "pymdown-extensions" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/8e/4d/a9484dc5d926295bdf308f1f6c4f07fcc99735b970591edc414d401fcc91/mkdocstrings-0.29.0.tar.gz", hash = "sha256:3657be1384543ce0ee82112c3e521bbf48e41303aa0c229b9ffcccba057d922e", size = 1212185 } +sdist = { url = 
"https://files.pythonhosted.org/packages/8e/4d/a9484dc5d926295bdf308f1f6c4f07fcc99735b970591edc414d401fcc91/mkdocstrings-0.29.0.tar.gz", hash = "sha256:3657be1384543ce0ee82112c3e521bbf48e41303aa0c229b9ffcccba057d922e", size = 1212185, upload-time = "2025-03-10T13:10:11.445Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/15/47/eb876dfd84e48f31ff60897d161b309cf6a04ca270155b0662aae562b3fb/mkdocstrings-0.29.0-py3-none-any.whl", hash = "sha256:8ea98358d2006f60befa940fdebbbc88a26b37ecbcded10be726ba359284f73d", size = 1630824 }, + { url = "https://files.pythonhosted.org/packages/15/47/eb876dfd84e48f31ff60897d161b309cf6a04ca270155b0662aae562b3fb/mkdocstrings-0.29.0-py3-none-any.whl", hash = "sha256:8ea98358d2006f60befa940fdebbbc88a26b37ecbcded10be726ba359284f73d", size = 1630824, upload-time = "2025-03-10T13:10:09.712Z" }, ] [[package]] @@ -803,27 +964,27 @@ dependencies = [ { name = "mkdocs-autorefs" }, { name = "mkdocstrings" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/23/ec/cb6debe2db77f1ef42b25b21d93b5021474de3037cd82385e586aee72545/mkdocstrings_python-1.12.2.tar.gz", hash = "sha256:7a1760941c0b52a2cd87b960a9e21112ffe52e7df9d0b9583d04d47ed2e186f3", size = 168207 } +sdist = { url = "https://files.pythonhosted.org/packages/23/ec/cb6debe2db77f1ef42b25b21d93b5021474de3037cd82385e586aee72545/mkdocstrings_python-1.12.2.tar.gz", hash = "sha256:7a1760941c0b52a2cd87b960a9e21112ffe52e7df9d0b9583d04d47ed2e186f3", size = 168207, upload-time = "2024-10-19T17:54:41.672Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/5b/c1/ac524e1026d9580cbc654b5d19f5843c8b364a66d30f956372cd09fd2f92/mkdocstrings_python-1.12.2-py3-none-any.whl", hash = "sha256:7f7d40d6db3cb1f5d19dbcd80e3efe4d0ba32b073272c0c0de9de2e604eda62a", size = 111759 }, + { url = "https://files.pythonhosted.org/packages/5b/c1/ac524e1026d9580cbc654b5d19f5843c8b364a66d30f956372cd09fd2f92/mkdocstrings_python-1.12.2-py3-none-any.whl", hash = "sha256:7f7d40d6db3cb1f5d19dbcd80e3efe4d0ba32b073272c0c0de9de2e604eda62a", size = 111759, upload-time = "2024-10-19T17:54:39.338Z" }, ] [[package]] name = "mypy-extensions" version = "1.0.0" source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/98/a4/1ab47638b92648243faf97a5aeb6ea83059cc3624972ab6b8d2316078d3f/mypy_extensions-1.0.0.tar.gz", hash = "sha256:75dbf8955dc00442a438fc4d0666508a9a97b6bd41aa2f0ffe9d2f2725af0782", size = 4433 } +sdist = { url = "https://files.pythonhosted.org/packages/98/a4/1ab47638b92648243faf97a5aeb6ea83059cc3624972ab6b8d2316078d3f/mypy_extensions-1.0.0.tar.gz", hash = "sha256:75dbf8955dc00442a438fc4d0666508a9a97b6bd41aa2f0ffe9d2f2725af0782", size = 4433, upload-time = "2023-02-04T12:11:27.157Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/2a/e2/5d3f6ada4297caebe1a2add3b126fe800c96f56dbe5d1988a2cbe0b267aa/mypy_extensions-1.0.0-py3-none-any.whl", hash = "sha256:4392f6c0eb8a5668a69e23d168ffa70f0be9ccfd32b5cc2d26a34ae5b844552d", size = 4695 }, + { url = "https://files.pythonhosted.org/packages/2a/e2/5d3f6ada4297caebe1a2add3b126fe800c96f56dbe5d1988a2cbe0b267aa/mypy_extensions-1.0.0-py3-none-any.whl", hash = "sha256:4392f6c0eb8a5668a69e23d168ffa70f0be9ccfd32b5cc2d26a34ae5b844552d", size = 4695, upload-time = "2023-02-04T12:11:25.002Z" }, ] [[package]] name = "nodeenv" version = "1.9.1" source = { registry = "https://pypi.org/simple" } -sdist = { url = 
"https://files.pythonhosted.org/packages/43/16/fc88b08840de0e0a72a2f9d8c6bae36be573e475a6326ae854bcc549fc45/nodeenv-1.9.1.tar.gz", hash = "sha256:6ec12890a2dab7946721edbfbcd91f3319c6ccc9aec47be7c7e6b7011ee6645f", size = 47437 } +sdist = { url = "https://files.pythonhosted.org/packages/43/16/fc88b08840de0e0a72a2f9d8c6bae36be573e475a6326ae854bcc549fc45/nodeenv-1.9.1.tar.gz", hash = "sha256:6ec12890a2dab7946721edbfbcd91f3319c6ccc9aec47be7c7e6b7011ee6645f", size = 47437, upload-time = "2024-06-04T18:44:11.171Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/d2/1d/1b658dbd2b9fa9c4c9f32accbfc0205d532c8c6194dc0f2a4c0428e7128a/nodeenv-1.9.1-py2.py3-none-any.whl", hash = "sha256:ba11c9782d29c27c70ffbdda2d7415098754709be8a7056d79a737cd901155c9", size = 22314 }, + { url = "https://files.pythonhosted.org/packages/d2/1d/1b658dbd2b9fa9c4c9f32accbfc0205d532c8c6194dc0f2a4c0428e7128a/nodeenv-1.9.1-py2.py3-none-any.whl", hash = "sha256:ba11c9782d29c27c70ffbdda2d7415098754709be8a7056d79a737cd901155c9", size = 22314, upload-time = "2024-06-04T18:44:08.352Z" }, ] [[package]] @@ -833,122 +994,122 @@ source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "attrs" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/98/df/77698abfac98571e65ffeb0c1fba8ffd692ab8458d617a0eed7d9a8d38f2/outcome-1.3.0.post0.tar.gz", hash = "sha256:9dcf02e65f2971b80047b377468e72a268e15c0af3cf1238e6ff14f7f91143b8", size = 21060 } +sdist = { url = "https://files.pythonhosted.org/packages/98/df/77698abfac98571e65ffeb0c1fba8ffd692ab8458d617a0eed7d9a8d38f2/outcome-1.3.0.post0.tar.gz", hash = "sha256:9dcf02e65f2971b80047b377468e72a268e15c0af3cf1238e6ff14f7f91143b8", size = 21060, upload-time = "2023-10-26T04:26:04.361Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/55/8b/5ab7257531a5d830fc8000c476e63c935488d74609b50f9384a643ec0a62/outcome-1.3.0.post0-py2.py3-none-any.whl", hash = "sha256:e771c5ce06d1415e356078d3bdd68523f284b4ce5419828922b6871e65eda82b", size = 10692 }, + { url = "https://files.pythonhosted.org/packages/55/8b/5ab7257531a5d830fc8000c476e63c935488d74609b50f9384a643ec0a62/outcome-1.3.0.post0-py2.py3-none-any.whl", hash = "sha256:e771c5ce06d1415e356078d3bdd68523f284b4ce5419828922b6871e65eda82b", size = 10692, upload-time = "2023-10-26T04:26:02.532Z" }, ] [[package]] name = "packaging" version = "24.2" source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/d0/63/68dbb6eb2de9cb10ee4c9c14a0148804425e13c4fb20d61cce69f53106da/packaging-24.2.tar.gz", hash = "sha256:c228a6dc5e932d346bc5739379109d49e8853dd8223571c7c5b55260edc0b97f", size = 163950 } +sdist = { url = "https://files.pythonhosted.org/packages/d0/63/68dbb6eb2de9cb10ee4c9c14a0148804425e13c4fb20d61cce69f53106da/packaging-24.2.tar.gz", hash = "sha256:c228a6dc5e932d346bc5739379109d49e8853dd8223571c7c5b55260edc0b97f", size = 163950, upload-time = "2024-11-08T09:47:47.202Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/88/ef/eb23f262cca3c0c4eb7ab1933c3b1f03d021f2c48f54763065b6f0e321be/packaging-24.2-py3-none-any.whl", hash = "sha256:09abb1bccd265c01f4a3aa3f7a7db064b36514d2cba19a2f694fe6150451a759", size = 65451 }, + { url = "https://files.pythonhosted.org/packages/88/ef/eb23f262cca3c0c4eb7ab1933c3b1f03d021f2c48f54763065b6f0e321be/packaging-24.2-py3-none-any.whl", hash = "sha256:09abb1bccd265c01f4a3aa3f7a7db064b36514d2cba19a2f694fe6150451a759", size = 65451, upload-time = "2024-11-08T09:47:44.722Z" }, ] [[package]] name = "paginate" version = "0.5.7" 
source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/ec/46/68dde5b6bc00c1296ec6466ab27dddede6aec9af1b99090e1107091b3b84/paginate-0.5.7.tar.gz", hash = "sha256:22bd083ab41e1a8b4f3690544afb2c60c25e5c9a63a30fa2f483f6c60c8e5945", size = 19252 } +sdist = { url = "https://files.pythonhosted.org/packages/ec/46/68dde5b6bc00c1296ec6466ab27dddede6aec9af1b99090e1107091b3b84/paginate-0.5.7.tar.gz", hash = "sha256:22bd083ab41e1a8b4f3690544afb2c60c25e5c9a63a30fa2f483f6c60c8e5945", size = 19252, upload-time = "2024-08-25T14:17:24.139Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/90/96/04b8e52da071d28f5e21a805b19cb9390aa17a47462ac87f5e2696b9566d/paginate-0.5.7-py2.py3-none-any.whl", hash = "sha256:b885e2af73abcf01d9559fd5216b57ef722f8c42affbb63942377668e35c7591", size = 13746 }, + { url = "https://files.pythonhosted.org/packages/90/96/04b8e52da071d28f5e21a805b19cb9390aa17a47462ac87f5e2696b9566d/paginate-0.5.7-py2.py3-none-any.whl", hash = "sha256:b885e2af73abcf01d9559fd5216b57ef722f8c42affbb63942377668e35c7591", size = 13746, upload-time = "2024-08-25T14:17:22.55Z" }, ] [[package]] name = "pathspec" version = "0.12.1" source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/ca/bc/f35b8446f4531a7cb215605d100cd88b7ac6f44ab3fc94870c120ab3adbf/pathspec-0.12.1.tar.gz", hash = "sha256:a482d51503a1ab33b1c67a6c3813a26953dbdc71c31dacaef9a838c4e29f5712", size = 51043 } +sdist = { url = "https://files.pythonhosted.org/packages/ca/bc/f35b8446f4531a7cb215605d100cd88b7ac6f44ab3fc94870c120ab3adbf/pathspec-0.12.1.tar.gz", hash = "sha256:a482d51503a1ab33b1c67a6c3813a26953dbdc71c31dacaef9a838c4e29f5712", size = 51043, upload-time = "2023-12-10T22:30:45Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/cc/20/ff623b09d963f88bfde16306a54e12ee5ea43e9b597108672ff3a408aad6/pathspec-0.12.1-py3-none-any.whl", hash = "sha256:a0d503e138a4c123b27490a4f7beda6a01c6f288df0e4a8b79c7eb0dc7b4cc08", size = 31191 }, + { url = "https://files.pythonhosted.org/packages/cc/20/ff623b09d963f88bfde16306a54e12ee5ea43e9b597108672ff3a408aad6/pathspec-0.12.1-py3-none-any.whl", hash = "sha256:a0d503e138a4c123b27490a4f7beda6a01c6f288df0e4a8b79c7eb0dc7b4cc08", size = 31191, upload-time = "2023-12-10T22:30:43.14Z" }, ] [[package]] name = "pillow" version = "10.4.0" source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/cd/74/ad3d526f3bf7b6d3f408b73fde271ec69dfac8b81341a318ce825f2b3812/pillow-10.4.0.tar.gz", hash = "sha256:166c1cd4d24309b30d61f79f4a9114b7b2313d7450912277855ff5dfd7cd4a06", size = 46555059 } -wheels = [ - { url = "https://files.pythonhosted.org/packages/0e/69/a31cccd538ca0b5272be2a38347f8839b97a14be104ea08b0db92f749c74/pillow-10.4.0-cp310-cp310-macosx_10_10_x86_64.whl", hash = "sha256:4d9667937cfa347525b319ae34375c37b9ee6b525440f3ef48542fcf66f2731e", size = 3509271 }, - { url = "https://files.pythonhosted.org/packages/9a/9e/4143b907be8ea0bce215f2ae4f7480027473f8b61fcedfda9d851082a5d2/pillow-10.4.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:543f3dc61c18dafb755773efc89aae60d06b6596a63914107f75459cf984164d", size = 3375658 }, - { url = "https://files.pythonhosted.org/packages/8a/25/1fc45761955f9359b1169aa75e241551e74ac01a09f487adaaf4c3472d11/pillow-10.4.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:7928ecbf1ece13956b95d9cbcfc77137652b02763ba384d9ab508099a2eca856", size = 4332075 }, - { url = 
"https://files.pythonhosted.org/packages/5e/dd/425b95d0151e1d6c951f45051112394f130df3da67363b6bc75dc4c27aba/pillow-10.4.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e4d49b85c4348ea0b31ea63bc75a9f3857869174e2bf17e7aba02945cd218e6f", size = 4444808 }, - { url = "https://files.pythonhosted.org/packages/b1/84/9a15cc5726cbbfe7f9f90bfb11f5d028586595907cd093815ca6644932e3/pillow-10.4.0-cp310-cp310-manylinux_2_28_aarch64.whl", hash = "sha256:6c762a5b0997f5659a5ef2266abc1d8851ad7749ad9a6a5506eb23d314e4f46b", size = 4356290 }, - { url = "https://files.pythonhosted.org/packages/b5/5b/6651c288b08df3b8c1e2f8c1152201e0b25d240e22ddade0f1e242fc9fa0/pillow-10.4.0-cp310-cp310-manylinux_2_28_x86_64.whl", hash = "sha256:a985e028fc183bf12a77a8bbf36318db4238a3ded7fa9df1b9a133f1cb79f8fc", size = 4525163 }, - { url = "https://files.pythonhosted.org/packages/07/8b/34854bf11a83c248505c8cb0fcf8d3d0b459a2246c8809b967963b6b12ae/pillow-10.4.0-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:812f7342b0eee081eaec84d91423d1b4650bb9828eb53d8511bcef8ce5aecf1e", size = 4463100 }, - { url = "https://files.pythonhosted.org/packages/78/63/0632aee4e82476d9cbe5200c0cdf9ba41ee04ed77887432845264d81116d/pillow-10.4.0-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:ac1452d2fbe4978c2eec89fb5a23b8387aba707ac72810d9490118817d9c0b46", size = 4592880 }, - { url = "https://files.pythonhosted.org/packages/df/56/b8663d7520671b4398b9d97e1ed9f583d4afcbefbda3c6188325e8c297bd/pillow-10.4.0-cp310-cp310-win32.whl", hash = "sha256:bcd5e41a859bf2e84fdc42f4edb7d9aba0a13d29a2abadccafad99de3feff984", size = 2235218 }, - { url = "https://files.pythonhosted.org/packages/f4/72/0203e94a91ddb4a9d5238434ae6c1ca10e610e8487036132ea9bf806ca2a/pillow-10.4.0-cp310-cp310-win_amd64.whl", hash = "sha256:ecd85a8d3e79cd7158dec1c9e5808e821feea088e2f69a974db5edf84dc53141", size = 2554487 }, - { url = "https://files.pythonhosted.org/packages/bd/52/7e7e93d7a6e4290543f17dc6f7d3af4bd0b3dd9926e2e8a35ac2282bc5f4/pillow-10.4.0-cp310-cp310-win_arm64.whl", hash = "sha256:ff337c552345e95702c5fde3158acb0625111017d0e5f24bf3acdb9cc16b90d1", size = 2243219 }, - { url = "https://files.pythonhosted.org/packages/a7/62/c9449f9c3043c37f73e7487ec4ef0c03eb9c9afc91a92b977a67b3c0bbc5/pillow-10.4.0-cp311-cp311-macosx_10_10_x86_64.whl", hash = "sha256:0a9ec697746f268507404647e531e92889890a087e03681a3606d9b920fbee3c", size = 3509265 }, - { url = "https://files.pythonhosted.org/packages/f4/5f/491dafc7bbf5a3cc1845dc0430872e8096eb9e2b6f8161509d124594ec2d/pillow-10.4.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:dfe91cb65544a1321e631e696759491ae04a2ea11d36715eca01ce07284738be", size = 3375655 }, - { url = "https://files.pythonhosted.org/packages/73/d5/c4011a76f4207a3c151134cd22a1415741e42fa5ddecec7c0182887deb3d/pillow-10.4.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5dc6761a6efc781e6a1544206f22c80c3af4c8cf461206d46a1e6006e4429ff3", size = 4340304 }, - { url = "https://files.pythonhosted.org/packages/ac/10/c67e20445a707f7a610699bba4fe050583b688d8cd2d202572b257f46600/pillow-10.4.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:5e84b6cc6a4a3d76c153a6b19270b3526a5a8ed6b09501d3af891daa2a9de7d6", size = 4452804 }, - { url = "https://files.pythonhosted.org/packages/a9/83/6523837906d1da2b269dee787e31df3b0acb12e3d08f024965a3e7f64665/pillow-10.4.0-cp311-cp311-manylinux_2_28_aarch64.whl", hash = "sha256:bbc527b519bd3aa9d7f429d152fea69f9ad37c95f0b02aebddff592688998abe", size = 4365126 }, - { url 
= "https://files.pythonhosted.org/packages/ba/e5/8c68ff608a4203085158cff5cc2a3c534ec384536d9438c405ed6370d080/pillow-10.4.0-cp311-cp311-manylinux_2_28_x86_64.whl", hash = "sha256:76a911dfe51a36041f2e756b00f96ed84677cdeb75d25c767f296c1c1eda1319", size = 4533541 }, - { url = "https://files.pythonhosted.org/packages/f4/7c/01b8dbdca5bc6785573f4cee96e2358b0918b7b2c7b60d8b6f3abf87a070/pillow-10.4.0-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:59291fb29317122398786c2d44427bbd1a6d7ff54017075b22be9d21aa59bd8d", size = 4471616 }, - { url = "https://files.pythonhosted.org/packages/c8/57/2899b82394a35a0fbfd352e290945440e3b3785655a03365c0ca8279f351/pillow-10.4.0-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:416d3a5d0e8cfe4f27f574362435bc9bae57f679a7158e0096ad2beb427b8696", size = 4600802 }, - { url = "https://files.pythonhosted.org/packages/4d/d7/a44f193d4c26e58ee5d2d9db3d4854b2cfb5b5e08d360a5e03fe987c0086/pillow-10.4.0-cp311-cp311-win32.whl", hash = "sha256:7086cc1d5eebb91ad24ded9f58bec6c688e9f0ed7eb3dbbf1e4800280a896496", size = 2235213 }, - { url = "https://files.pythonhosted.org/packages/c1/d0/5866318eec2b801cdb8c82abf190c8343d8a1cd8bf5a0c17444a6f268291/pillow-10.4.0-cp311-cp311-win_amd64.whl", hash = "sha256:cbed61494057c0f83b83eb3a310f0bf774b09513307c434d4366ed64f4128a91", size = 2554498 }, - { url = "https://files.pythonhosted.org/packages/d4/c8/310ac16ac2b97e902d9eb438688de0d961660a87703ad1561fd3dfbd2aa0/pillow-10.4.0-cp311-cp311-win_arm64.whl", hash = "sha256:f5f0c3e969c8f12dd2bb7e0b15d5c468b51e5017e01e2e867335c81903046a22", size = 2243219 }, - { url = "https://files.pythonhosted.org/packages/05/cb/0353013dc30c02a8be34eb91d25e4e4cf594b59e5a55ea1128fde1e5f8ea/pillow-10.4.0-cp312-cp312-macosx_10_10_x86_64.whl", hash = "sha256:673655af3eadf4df6b5457033f086e90299fdd7a47983a13827acf7459c15d94", size = 3509350 }, - { url = "https://files.pythonhosted.org/packages/e7/cf/5c558a0f247e0bf9cec92bff9b46ae6474dd736f6d906315e60e4075f737/pillow-10.4.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:866b6942a92f56300012f5fbac71f2d610312ee65e22f1aa2609e491284e5597", size = 3374980 }, - { url = "https://files.pythonhosted.org/packages/84/48/6e394b86369a4eb68b8a1382c78dc092245af517385c086c5094e3b34428/pillow-10.4.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:29dbdc4207642ea6aad70fbde1a9338753d33fb23ed6956e706936706f52dd80", size = 4343799 }, - { url = "https://files.pythonhosted.org/packages/3b/f3/a8c6c11fa84b59b9df0cd5694492da8c039a24cd159f0f6918690105c3be/pillow-10.4.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bf2342ac639c4cf38799a44950bbc2dfcb685f052b9e262f446482afaf4bffca", size = 4459973 }, - { url = "https://files.pythonhosted.org/packages/7d/1b/c14b4197b80150fb64453585247e6fb2e1d93761fa0fa9cf63b102fde822/pillow-10.4.0-cp312-cp312-manylinux_2_28_aarch64.whl", hash = "sha256:f5b92f4d70791b4a67157321c4e8225d60b119c5cc9aee8ecf153aace4aad4ef", size = 4370054 }, - { url = "https://files.pythonhosted.org/packages/55/77/40daddf677897a923d5d33329acd52a2144d54a9644f2a5422c028c6bf2d/pillow-10.4.0-cp312-cp312-manylinux_2_28_x86_64.whl", hash = "sha256:86dcb5a1eb778d8b25659d5e4341269e8590ad6b4e8b44d9f4b07f8d136c414a", size = 4539484 }, - { url = "https://files.pythonhosted.org/packages/40/54/90de3e4256b1207300fb2b1d7168dd912a2fb4b2401e439ba23c2b2cabde/pillow-10.4.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:780c072c2e11c9b2c7ca37f9a2ee8ba66f44367ac3e5c7832afcfe5104fd6d1b", size = 4477375 }, - { url = 
"https://files.pythonhosted.org/packages/13/24/1bfba52f44193860918ff7c93d03d95e3f8748ca1de3ceaf11157a14cf16/pillow-10.4.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:37fb69d905be665f68f28a8bba3c6d3223c8efe1edf14cc4cfa06c241f8c81d9", size = 4608773 }, - { url = "https://files.pythonhosted.org/packages/55/04/5e6de6e6120451ec0c24516c41dbaf80cce1b6451f96561235ef2429da2e/pillow-10.4.0-cp312-cp312-win32.whl", hash = "sha256:7dfecdbad5c301d7b5bde160150b4db4c659cee2b69589705b6f8a0c509d9f42", size = 2235690 }, - { url = "https://files.pythonhosted.org/packages/74/0a/d4ce3c44bca8635bd29a2eab5aa181b654a734a29b263ca8efe013beea98/pillow-10.4.0-cp312-cp312-win_amd64.whl", hash = "sha256:1d846aea995ad352d4bdcc847535bd56e0fd88d36829d2c90be880ef1ee4668a", size = 2554951 }, - { url = "https://files.pythonhosted.org/packages/b5/ca/184349ee40f2e92439be9b3502ae6cfc43ac4b50bc4fc6b3de7957563894/pillow-10.4.0-cp312-cp312-win_arm64.whl", hash = "sha256:e553cad5179a66ba15bb18b353a19020e73a7921296a7979c4a2b7f6a5cd57f9", size = 2243427 }, - { url = "https://files.pythonhosted.org/packages/c3/00/706cebe7c2c12a6318aabe5d354836f54adff7156fd9e1bd6c89f4ba0e98/pillow-10.4.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:8bc1a764ed8c957a2e9cacf97c8b2b053b70307cf2996aafd70e91a082e70df3", size = 3525685 }, - { url = "https://files.pythonhosted.org/packages/cf/76/f658cbfa49405e5ecbfb9ba42d07074ad9792031267e782d409fd8fe7c69/pillow-10.4.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:6209bb41dc692ddfee4942517c19ee81b86c864b626dbfca272ec0f7cff5d9fb", size = 3374883 }, - { url = "https://files.pythonhosted.org/packages/46/2b/99c28c4379a85e65378211971c0b430d9c7234b1ec4d59b2668f6299e011/pillow-10.4.0-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:bee197b30783295d2eb680b311af15a20a8b24024a19c3a26431ff83eb8d1f70", size = 4339837 }, - { url = "https://files.pythonhosted.org/packages/f1/74/b1ec314f624c0c43711fdf0d8076f82d9d802afd58f1d62c2a86878e8615/pillow-10.4.0-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1ef61f5dd14c300786318482456481463b9d6b91ebe5ef12f405afbba77ed0be", size = 4455562 }, - { url = "https://files.pythonhosted.org/packages/4a/2a/4b04157cb7b9c74372fa867096a1607e6fedad93a44deeff553ccd307868/pillow-10.4.0-cp313-cp313-manylinux_2_28_aarch64.whl", hash = "sha256:297e388da6e248c98bc4a02e018966af0c5f92dfacf5a5ca22fa01cb3179bca0", size = 4366761 }, - { url = "https://files.pythonhosted.org/packages/ac/7b/8f1d815c1a6a268fe90481232c98dd0e5fa8c75e341a75f060037bd5ceae/pillow-10.4.0-cp313-cp313-manylinux_2_28_x86_64.whl", hash = "sha256:e4db64794ccdf6cb83a59d73405f63adbe2a1887012e308828596100a0b2f6cc", size = 4536767 }, - { url = "https://files.pythonhosted.org/packages/e5/77/05fa64d1f45d12c22c314e7b97398ffb28ef2813a485465017b7978b3ce7/pillow-10.4.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:bd2880a07482090a3bcb01f4265f1936a903d70bc740bfcb1fd4e8a2ffe5cf5a", size = 4477989 }, - { url = "https://files.pythonhosted.org/packages/12/63/b0397cfc2caae05c3fb2f4ed1b4fc4fc878f0243510a7a6034ca59726494/pillow-10.4.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:4b35b21b819ac1dbd1233317adeecd63495f6babf21b7b2512d244ff6c6ce309", size = 4610255 }, - { url = "https://files.pythonhosted.org/packages/7b/f9/cfaa5082ca9bc4a6de66ffe1c12c2d90bf09c309a5f52b27759a596900e7/pillow-10.4.0-cp313-cp313-win32.whl", hash = "sha256:551d3fd6e9dc15e4c1eb6fc4ba2b39c0c7933fa113b220057a34f4bb3268a060", size = 2235603 }, - { url = 
"https://files.pythonhosted.org/packages/01/6a/30ff0eef6e0c0e71e55ded56a38d4859bf9d3634a94a88743897b5f96936/pillow-10.4.0-cp313-cp313-win_amd64.whl", hash = "sha256:030abdbe43ee02e0de642aee345efa443740aa4d828bfe8e2eb11922ea6a21ea", size = 2554972 }, - { url = "https://files.pythonhosted.org/packages/48/2c/2e0a52890f269435eee38b21c8218e102c621fe8d8df8b9dd06fabf879ba/pillow-10.4.0-cp313-cp313-win_arm64.whl", hash = "sha256:5b001114dd152cfd6b23befeb28d7aee43553e2402c9f159807bf55f33af8a8d", size = 2243375 }, - { url = "https://files.pythonhosted.org/packages/38/30/095d4f55f3a053392f75e2eae45eba3228452783bab3d9a920b951ac495c/pillow-10.4.0-pp310-pypy310_pp73-macosx_10_15_x86_64.whl", hash = "sha256:5b4815f2e65b30f5fbae9dfffa8636d992d49705723fe86a3661806e069352d4", size = 3493889 }, - { url = "https://files.pythonhosted.org/packages/f3/e8/4ff79788803a5fcd5dc35efdc9386af153569853767bff74540725b45863/pillow-10.4.0-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:8f0aef4ef59694b12cadee839e2ba6afeab89c0f39a3adc02ed51d109117b8da", size = 3346160 }, - { url = "https://files.pythonhosted.org/packages/d7/ac/4184edd511b14f760c73f5bb8a5d6fd85c591c8aff7c2229677a355c4179/pillow-10.4.0-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9f4727572e2918acaa9077c919cbbeb73bd2b3ebcfe033b72f858fc9fbef0026", size = 3435020 }, - { url = "https://files.pythonhosted.org/packages/da/21/1749cd09160149c0a246a81d646e05f35041619ce76f6493d6a96e8d1103/pillow-10.4.0-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ff25afb18123cea58a591ea0244b92eb1e61a1fd497bf6d6384f09bc3262ec3e", size = 3490539 }, - { url = "https://files.pythonhosted.org/packages/b6/f5/f71fe1888b96083b3f6dfa0709101f61fc9e972c0c8d04e9d93ccef2a045/pillow-10.4.0-pp310-pypy310_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:dc3e2db6ba09ffd7d02ae9141cfa0ae23393ee7687248d46a7507b75d610f4f5", size = 3476125 }, - { url = "https://files.pythonhosted.org/packages/96/b9/c0362c54290a31866c3526848583a2f45a535aa9d725fd31e25d318c805f/pillow-10.4.0-pp310-pypy310_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:02a2be69f9c9b8c1e97cf2713e789d4e398c751ecfd9967c18d0ce304efbf885", size = 3579373 }, - { url = "https://files.pythonhosted.org/packages/52/3b/ce7a01026a7cf46e5452afa86f97a5e88ca97f562cafa76570178ab56d8d/pillow-10.4.0-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:0755ffd4a0c6f267cccbae2e9903d95477ca2f77c4fcf3a3a09570001856c8a5", size = 2554661 }, +sdist = { url = "https://files.pythonhosted.org/packages/cd/74/ad3d526f3bf7b6d3f408b73fde271ec69dfac8b81341a318ce825f2b3812/pillow-10.4.0.tar.gz", hash = "sha256:166c1cd4d24309b30d61f79f4a9114b7b2313d7450912277855ff5dfd7cd4a06", size = 46555059, upload-time = "2024-07-01T09:48:43.583Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/0e/69/a31cccd538ca0b5272be2a38347f8839b97a14be104ea08b0db92f749c74/pillow-10.4.0-cp310-cp310-macosx_10_10_x86_64.whl", hash = "sha256:4d9667937cfa347525b319ae34375c37b9ee6b525440f3ef48542fcf66f2731e", size = 3509271, upload-time = "2024-07-01T09:45:22.07Z" }, + { url = "https://files.pythonhosted.org/packages/9a/9e/4143b907be8ea0bce215f2ae4f7480027473f8b61fcedfda9d851082a5d2/pillow-10.4.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:543f3dc61c18dafb755773efc89aae60d06b6596a63914107f75459cf984164d", size = 3375658, upload-time = "2024-07-01T09:45:25.292Z" }, + { url = 
"https://files.pythonhosted.org/packages/8a/25/1fc45761955f9359b1169aa75e241551e74ac01a09f487adaaf4c3472d11/pillow-10.4.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:7928ecbf1ece13956b95d9cbcfc77137652b02763ba384d9ab508099a2eca856", size = 4332075, upload-time = "2024-07-01T09:45:27.94Z" }, + { url = "https://files.pythonhosted.org/packages/5e/dd/425b95d0151e1d6c951f45051112394f130df3da67363b6bc75dc4c27aba/pillow-10.4.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e4d49b85c4348ea0b31ea63bc75a9f3857869174e2bf17e7aba02945cd218e6f", size = 4444808, upload-time = "2024-07-01T09:45:30.305Z" }, + { url = "https://files.pythonhosted.org/packages/b1/84/9a15cc5726cbbfe7f9f90bfb11f5d028586595907cd093815ca6644932e3/pillow-10.4.0-cp310-cp310-manylinux_2_28_aarch64.whl", hash = "sha256:6c762a5b0997f5659a5ef2266abc1d8851ad7749ad9a6a5506eb23d314e4f46b", size = 4356290, upload-time = "2024-07-01T09:45:32.868Z" }, + { url = "https://files.pythonhosted.org/packages/b5/5b/6651c288b08df3b8c1e2f8c1152201e0b25d240e22ddade0f1e242fc9fa0/pillow-10.4.0-cp310-cp310-manylinux_2_28_x86_64.whl", hash = "sha256:a985e028fc183bf12a77a8bbf36318db4238a3ded7fa9df1b9a133f1cb79f8fc", size = 4525163, upload-time = "2024-07-01T09:45:35.279Z" }, + { url = "https://files.pythonhosted.org/packages/07/8b/34854bf11a83c248505c8cb0fcf8d3d0b459a2246c8809b967963b6b12ae/pillow-10.4.0-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:812f7342b0eee081eaec84d91423d1b4650bb9828eb53d8511bcef8ce5aecf1e", size = 4463100, upload-time = "2024-07-01T09:45:37.74Z" }, + { url = "https://files.pythonhosted.org/packages/78/63/0632aee4e82476d9cbe5200c0cdf9ba41ee04ed77887432845264d81116d/pillow-10.4.0-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:ac1452d2fbe4978c2eec89fb5a23b8387aba707ac72810d9490118817d9c0b46", size = 4592880, upload-time = "2024-07-01T09:45:39.89Z" }, + { url = "https://files.pythonhosted.org/packages/df/56/b8663d7520671b4398b9d97e1ed9f583d4afcbefbda3c6188325e8c297bd/pillow-10.4.0-cp310-cp310-win32.whl", hash = "sha256:bcd5e41a859bf2e84fdc42f4edb7d9aba0a13d29a2abadccafad99de3feff984", size = 2235218, upload-time = "2024-07-01T09:45:42.771Z" }, + { url = "https://files.pythonhosted.org/packages/f4/72/0203e94a91ddb4a9d5238434ae6c1ca10e610e8487036132ea9bf806ca2a/pillow-10.4.0-cp310-cp310-win_amd64.whl", hash = "sha256:ecd85a8d3e79cd7158dec1c9e5808e821feea088e2f69a974db5edf84dc53141", size = 2554487, upload-time = "2024-07-01T09:45:45.176Z" }, + { url = "https://files.pythonhosted.org/packages/bd/52/7e7e93d7a6e4290543f17dc6f7d3af4bd0b3dd9926e2e8a35ac2282bc5f4/pillow-10.4.0-cp310-cp310-win_arm64.whl", hash = "sha256:ff337c552345e95702c5fde3158acb0625111017d0e5f24bf3acdb9cc16b90d1", size = 2243219, upload-time = "2024-07-01T09:45:47.274Z" }, + { url = "https://files.pythonhosted.org/packages/a7/62/c9449f9c3043c37f73e7487ec4ef0c03eb9c9afc91a92b977a67b3c0bbc5/pillow-10.4.0-cp311-cp311-macosx_10_10_x86_64.whl", hash = "sha256:0a9ec697746f268507404647e531e92889890a087e03681a3606d9b920fbee3c", size = 3509265, upload-time = "2024-07-01T09:45:49.812Z" }, + { url = "https://files.pythonhosted.org/packages/f4/5f/491dafc7bbf5a3cc1845dc0430872e8096eb9e2b6f8161509d124594ec2d/pillow-10.4.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:dfe91cb65544a1321e631e696759491ae04a2ea11d36715eca01ce07284738be", size = 3375655, upload-time = "2024-07-01T09:45:52.462Z" }, + { url = 
"https://files.pythonhosted.org/packages/73/d5/c4011a76f4207a3c151134cd22a1415741e42fa5ddecec7c0182887deb3d/pillow-10.4.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5dc6761a6efc781e6a1544206f22c80c3af4c8cf461206d46a1e6006e4429ff3", size = 4340304, upload-time = "2024-07-01T09:45:55.006Z" }, + { url = "https://files.pythonhosted.org/packages/ac/10/c67e20445a707f7a610699bba4fe050583b688d8cd2d202572b257f46600/pillow-10.4.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:5e84b6cc6a4a3d76c153a6b19270b3526a5a8ed6b09501d3af891daa2a9de7d6", size = 4452804, upload-time = "2024-07-01T09:45:58.437Z" }, + { url = "https://files.pythonhosted.org/packages/a9/83/6523837906d1da2b269dee787e31df3b0acb12e3d08f024965a3e7f64665/pillow-10.4.0-cp311-cp311-manylinux_2_28_aarch64.whl", hash = "sha256:bbc527b519bd3aa9d7f429d152fea69f9ad37c95f0b02aebddff592688998abe", size = 4365126, upload-time = "2024-07-01T09:46:00.713Z" }, + { url = "https://files.pythonhosted.org/packages/ba/e5/8c68ff608a4203085158cff5cc2a3c534ec384536d9438c405ed6370d080/pillow-10.4.0-cp311-cp311-manylinux_2_28_x86_64.whl", hash = "sha256:76a911dfe51a36041f2e756b00f96ed84677cdeb75d25c767f296c1c1eda1319", size = 4533541, upload-time = "2024-07-01T09:46:03.235Z" }, + { url = "https://files.pythonhosted.org/packages/f4/7c/01b8dbdca5bc6785573f4cee96e2358b0918b7b2c7b60d8b6f3abf87a070/pillow-10.4.0-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:59291fb29317122398786c2d44427bbd1a6d7ff54017075b22be9d21aa59bd8d", size = 4471616, upload-time = "2024-07-01T09:46:05.356Z" }, + { url = "https://files.pythonhosted.org/packages/c8/57/2899b82394a35a0fbfd352e290945440e3b3785655a03365c0ca8279f351/pillow-10.4.0-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:416d3a5d0e8cfe4f27f574362435bc9bae57f679a7158e0096ad2beb427b8696", size = 4600802, upload-time = "2024-07-01T09:46:08.145Z" }, + { url = "https://files.pythonhosted.org/packages/4d/d7/a44f193d4c26e58ee5d2d9db3d4854b2cfb5b5e08d360a5e03fe987c0086/pillow-10.4.0-cp311-cp311-win32.whl", hash = "sha256:7086cc1d5eebb91ad24ded9f58bec6c688e9f0ed7eb3dbbf1e4800280a896496", size = 2235213, upload-time = "2024-07-01T09:46:10.211Z" }, + { url = "https://files.pythonhosted.org/packages/c1/d0/5866318eec2b801cdb8c82abf190c8343d8a1cd8bf5a0c17444a6f268291/pillow-10.4.0-cp311-cp311-win_amd64.whl", hash = "sha256:cbed61494057c0f83b83eb3a310f0bf774b09513307c434d4366ed64f4128a91", size = 2554498, upload-time = "2024-07-01T09:46:12.685Z" }, + { url = "https://files.pythonhosted.org/packages/d4/c8/310ac16ac2b97e902d9eb438688de0d961660a87703ad1561fd3dfbd2aa0/pillow-10.4.0-cp311-cp311-win_arm64.whl", hash = "sha256:f5f0c3e969c8f12dd2bb7e0b15d5c468b51e5017e01e2e867335c81903046a22", size = 2243219, upload-time = "2024-07-01T09:46:14.83Z" }, + { url = "https://files.pythonhosted.org/packages/05/cb/0353013dc30c02a8be34eb91d25e4e4cf594b59e5a55ea1128fde1e5f8ea/pillow-10.4.0-cp312-cp312-macosx_10_10_x86_64.whl", hash = "sha256:673655af3eadf4df6b5457033f086e90299fdd7a47983a13827acf7459c15d94", size = 3509350, upload-time = "2024-07-01T09:46:17.177Z" }, + { url = "https://files.pythonhosted.org/packages/e7/cf/5c558a0f247e0bf9cec92bff9b46ae6474dd736f6d906315e60e4075f737/pillow-10.4.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:866b6942a92f56300012f5fbac71f2d610312ee65e22f1aa2609e491284e5597", size = 3374980, upload-time = "2024-07-01T09:46:19.169Z" }, + { url = 
"https://files.pythonhosted.org/packages/84/48/6e394b86369a4eb68b8a1382c78dc092245af517385c086c5094e3b34428/pillow-10.4.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:29dbdc4207642ea6aad70fbde1a9338753d33fb23ed6956e706936706f52dd80", size = 4343799, upload-time = "2024-07-01T09:46:21.883Z" }, + { url = "https://files.pythonhosted.org/packages/3b/f3/a8c6c11fa84b59b9df0cd5694492da8c039a24cd159f0f6918690105c3be/pillow-10.4.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bf2342ac639c4cf38799a44950bbc2dfcb685f052b9e262f446482afaf4bffca", size = 4459973, upload-time = "2024-07-01T09:46:24.321Z" }, + { url = "https://files.pythonhosted.org/packages/7d/1b/c14b4197b80150fb64453585247e6fb2e1d93761fa0fa9cf63b102fde822/pillow-10.4.0-cp312-cp312-manylinux_2_28_aarch64.whl", hash = "sha256:f5b92f4d70791b4a67157321c4e8225d60b119c5cc9aee8ecf153aace4aad4ef", size = 4370054, upload-time = "2024-07-01T09:46:26.825Z" }, + { url = "https://files.pythonhosted.org/packages/55/77/40daddf677897a923d5d33329acd52a2144d54a9644f2a5422c028c6bf2d/pillow-10.4.0-cp312-cp312-manylinux_2_28_x86_64.whl", hash = "sha256:86dcb5a1eb778d8b25659d5e4341269e8590ad6b4e8b44d9f4b07f8d136c414a", size = 4539484, upload-time = "2024-07-01T09:46:29.355Z" }, + { url = "https://files.pythonhosted.org/packages/40/54/90de3e4256b1207300fb2b1d7168dd912a2fb4b2401e439ba23c2b2cabde/pillow-10.4.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:780c072c2e11c9b2c7ca37f9a2ee8ba66f44367ac3e5c7832afcfe5104fd6d1b", size = 4477375, upload-time = "2024-07-01T09:46:31.756Z" }, + { url = "https://files.pythonhosted.org/packages/13/24/1bfba52f44193860918ff7c93d03d95e3f8748ca1de3ceaf11157a14cf16/pillow-10.4.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:37fb69d905be665f68f28a8bba3c6d3223c8efe1edf14cc4cfa06c241f8c81d9", size = 4608773, upload-time = "2024-07-01T09:46:33.73Z" }, + { url = "https://files.pythonhosted.org/packages/55/04/5e6de6e6120451ec0c24516c41dbaf80cce1b6451f96561235ef2429da2e/pillow-10.4.0-cp312-cp312-win32.whl", hash = "sha256:7dfecdbad5c301d7b5bde160150b4db4c659cee2b69589705b6f8a0c509d9f42", size = 2235690, upload-time = "2024-07-01T09:46:36.587Z" }, + { url = "https://files.pythonhosted.org/packages/74/0a/d4ce3c44bca8635bd29a2eab5aa181b654a734a29b263ca8efe013beea98/pillow-10.4.0-cp312-cp312-win_amd64.whl", hash = "sha256:1d846aea995ad352d4bdcc847535bd56e0fd88d36829d2c90be880ef1ee4668a", size = 2554951, upload-time = "2024-07-01T09:46:38.777Z" }, + { url = "https://files.pythonhosted.org/packages/b5/ca/184349ee40f2e92439be9b3502ae6cfc43ac4b50bc4fc6b3de7957563894/pillow-10.4.0-cp312-cp312-win_arm64.whl", hash = "sha256:e553cad5179a66ba15bb18b353a19020e73a7921296a7979c4a2b7f6a5cd57f9", size = 2243427, upload-time = "2024-07-01T09:46:43.15Z" }, + { url = "https://files.pythonhosted.org/packages/c3/00/706cebe7c2c12a6318aabe5d354836f54adff7156fd9e1bd6c89f4ba0e98/pillow-10.4.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:8bc1a764ed8c957a2e9cacf97c8b2b053b70307cf2996aafd70e91a082e70df3", size = 3525685, upload-time = "2024-07-01T09:46:45.194Z" }, + { url = "https://files.pythonhosted.org/packages/cf/76/f658cbfa49405e5ecbfb9ba42d07074ad9792031267e782d409fd8fe7c69/pillow-10.4.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:6209bb41dc692ddfee4942517c19ee81b86c864b626dbfca272ec0f7cff5d9fb", size = 3374883, upload-time = "2024-07-01T09:46:47.331Z" }, + { url = 
"https://files.pythonhosted.org/packages/46/2b/99c28c4379a85e65378211971c0b430d9c7234b1ec4d59b2668f6299e011/pillow-10.4.0-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:bee197b30783295d2eb680b311af15a20a8b24024a19c3a26431ff83eb8d1f70", size = 4339837, upload-time = "2024-07-01T09:46:49.647Z" }, + { url = "https://files.pythonhosted.org/packages/f1/74/b1ec314f624c0c43711fdf0d8076f82d9d802afd58f1d62c2a86878e8615/pillow-10.4.0-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1ef61f5dd14c300786318482456481463b9d6b91ebe5ef12f405afbba77ed0be", size = 4455562, upload-time = "2024-07-01T09:46:51.811Z" }, + { url = "https://files.pythonhosted.org/packages/4a/2a/4b04157cb7b9c74372fa867096a1607e6fedad93a44deeff553ccd307868/pillow-10.4.0-cp313-cp313-manylinux_2_28_aarch64.whl", hash = "sha256:297e388da6e248c98bc4a02e018966af0c5f92dfacf5a5ca22fa01cb3179bca0", size = 4366761, upload-time = "2024-07-01T09:46:53.961Z" }, + { url = "https://files.pythonhosted.org/packages/ac/7b/8f1d815c1a6a268fe90481232c98dd0e5fa8c75e341a75f060037bd5ceae/pillow-10.4.0-cp313-cp313-manylinux_2_28_x86_64.whl", hash = "sha256:e4db64794ccdf6cb83a59d73405f63adbe2a1887012e308828596100a0b2f6cc", size = 4536767, upload-time = "2024-07-01T09:46:56.664Z" }, + { url = "https://files.pythonhosted.org/packages/e5/77/05fa64d1f45d12c22c314e7b97398ffb28ef2813a485465017b7978b3ce7/pillow-10.4.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:bd2880a07482090a3bcb01f4265f1936a903d70bc740bfcb1fd4e8a2ffe5cf5a", size = 4477989, upload-time = "2024-07-01T09:46:58.977Z" }, + { url = "https://files.pythonhosted.org/packages/12/63/b0397cfc2caae05c3fb2f4ed1b4fc4fc878f0243510a7a6034ca59726494/pillow-10.4.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:4b35b21b819ac1dbd1233317adeecd63495f6babf21b7b2512d244ff6c6ce309", size = 4610255, upload-time = "2024-07-01T09:47:01.189Z" }, + { url = "https://files.pythonhosted.org/packages/7b/f9/cfaa5082ca9bc4a6de66ffe1c12c2d90bf09c309a5f52b27759a596900e7/pillow-10.4.0-cp313-cp313-win32.whl", hash = "sha256:551d3fd6e9dc15e4c1eb6fc4ba2b39c0c7933fa113b220057a34f4bb3268a060", size = 2235603, upload-time = "2024-07-01T09:47:03.918Z" }, + { url = "https://files.pythonhosted.org/packages/01/6a/30ff0eef6e0c0e71e55ded56a38d4859bf9d3634a94a88743897b5f96936/pillow-10.4.0-cp313-cp313-win_amd64.whl", hash = "sha256:030abdbe43ee02e0de642aee345efa443740aa4d828bfe8e2eb11922ea6a21ea", size = 2554972, upload-time = "2024-07-01T09:47:06.152Z" }, + { url = "https://files.pythonhosted.org/packages/48/2c/2e0a52890f269435eee38b21c8218e102c621fe8d8df8b9dd06fabf879ba/pillow-10.4.0-cp313-cp313-win_arm64.whl", hash = "sha256:5b001114dd152cfd6b23befeb28d7aee43553e2402c9f159807bf55f33af8a8d", size = 2243375, upload-time = "2024-07-01T09:47:09.065Z" }, + { url = "https://files.pythonhosted.org/packages/38/30/095d4f55f3a053392f75e2eae45eba3228452783bab3d9a920b951ac495c/pillow-10.4.0-pp310-pypy310_pp73-macosx_10_15_x86_64.whl", hash = "sha256:5b4815f2e65b30f5fbae9dfffa8636d992d49705723fe86a3661806e069352d4", size = 3493889, upload-time = "2024-07-01T09:48:04.815Z" }, + { url = "https://files.pythonhosted.org/packages/f3/e8/4ff79788803a5fcd5dc35efdc9386af153569853767bff74540725b45863/pillow-10.4.0-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:8f0aef4ef59694b12cadee839e2ba6afeab89c0f39a3adc02ed51d109117b8da", size = 3346160, upload-time = "2024-07-01T09:48:07.206Z" }, + { url = 
"https://files.pythonhosted.org/packages/d7/ac/4184edd511b14f760c73f5bb8a5d6fd85c591c8aff7c2229677a355c4179/pillow-10.4.0-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9f4727572e2918acaa9077c919cbbeb73bd2b3ebcfe033b72f858fc9fbef0026", size = 3435020, upload-time = "2024-07-01T09:48:09.66Z" }, + { url = "https://files.pythonhosted.org/packages/da/21/1749cd09160149c0a246a81d646e05f35041619ce76f6493d6a96e8d1103/pillow-10.4.0-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ff25afb18123cea58a591ea0244b92eb1e61a1fd497bf6d6384f09bc3262ec3e", size = 3490539, upload-time = "2024-07-01T09:48:12.529Z" }, + { url = "https://files.pythonhosted.org/packages/b6/f5/f71fe1888b96083b3f6dfa0709101f61fc9e972c0c8d04e9d93ccef2a045/pillow-10.4.0-pp310-pypy310_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:dc3e2db6ba09ffd7d02ae9141cfa0ae23393ee7687248d46a7507b75d610f4f5", size = 3476125, upload-time = "2024-07-01T09:48:14.891Z" }, + { url = "https://files.pythonhosted.org/packages/96/b9/c0362c54290a31866c3526848583a2f45a535aa9d725fd31e25d318c805f/pillow-10.4.0-pp310-pypy310_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:02a2be69f9c9b8c1e97cf2713e789d4e398c751ecfd9967c18d0ce304efbf885", size = 3579373, upload-time = "2024-07-01T09:48:17.601Z" }, + { url = "https://files.pythonhosted.org/packages/52/3b/ce7a01026a7cf46e5452afa86f97a5e88ca97f562cafa76570178ab56d8d/pillow-10.4.0-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:0755ffd4a0c6f267cccbae2e9903d95477ca2f77c4fcf3a3a09570001856c8a5", size = 2554661, upload-time = "2024-07-01T09:48:20.293Z" }, ] [[package]] name = "platformdirs" version = "4.3.6" source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/13/fc/128cc9cb8f03208bdbf93d3aa862e16d376844a14f9a0ce5cf4507372de4/platformdirs-4.3.6.tar.gz", hash = "sha256:357fb2acbc885b0419afd3ce3ed34564c13c9b95c89360cd9563f73aa5e2b907", size = 21302 } +sdist = { url = "https://files.pythonhosted.org/packages/13/fc/128cc9cb8f03208bdbf93d3aa862e16d376844a14f9a0ce5cf4507372de4/platformdirs-4.3.6.tar.gz", hash = "sha256:357fb2acbc885b0419afd3ce3ed34564c13c9b95c89360cd9563f73aa5e2b907", size = 21302, upload-time = "2024-09-17T19:06:50.688Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/3c/a6/bc1012356d8ece4d66dd75c4b9fc6c1f6650ddd5991e421177d9f8f671be/platformdirs-4.3.6-py3-none-any.whl", hash = "sha256:73e575e1408ab8103900836b97580d5307456908a03e92031bab39e4554cc3fb", size = 18439 }, + { url = "https://files.pythonhosted.org/packages/3c/a6/bc1012356d8ece4d66dd75c4b9fc6c1f6650ddd5991e421177d9f8f671be/platformdirs-4.3.6-py3-none-any.whl", hash = "sha256:73e575e1408ab8103900836b97580d5307456908a03e92031bab39e4554cc3fb", size = 18439, upload-time = "2024-09-17T19:06:49.212Z" }, ] [[package]] name = "pluggy" version = "1.5.0" source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/96/2d/02d4312c973c6050a18b314a5ad0b3210edb65a906f868e31c111dede4a6/pluggy-1.5.0.tar.gz", hash = "sha256:2cffa88e94fdc978c4c574f15f9e59b7f4201d439195c3715ca9e2486f1d0cf1", size = 67955 } +sdist = { url = "https://files.pythonhosted.org/packages/96/2d/02d4312c973c6050a18b314a5ad0b3210edb65a906f868e31c111dede4a6/pluggy-1.5.0.tar.gz", hash = "sha256:2cffa88e94fdc978c4c574f15f9e59b7f4201d439195c3715ca9e2486f1d0cf1", size = 67955, upload-time = "2024-04-20T21:34:42.531Z" } wheels = [ - { url = 
"https://files.pythonhosted.org/packages/88/5f/e351af9a41f866ac3f1fac4ca0613908d9a41741cfcf2228f4ad853b697d/pluggy-1.5.0-py3-none-any.whl", hash = "sha256:44e1ad92c8ca002de6377e165f3e0f1be63266ab4d554740532335b9d75ea669", size = 20556 }, + { url = "https://files.pythonhosted.org/packages/88/5f/e351af9a41f866ac3f1fac4ca0613908d9a41741cfcf2228f4ad853b697d/pluggy-1.5.0-py3-none-any.whl", hash = "sha256:44e1ad92c8ca002de6377e165f3e0f1be63266ab4d554740532335b9d75ea669", size = 20556, upload-time = "2024-04-20T21:34:40.434Z" }, ] [[package]] name = "pycparser" version = "2.22" source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/1d/b2/31537cf4b1ca988837256c910a668b553fceb8f069bedc4b1c826024b52c/pycparser-2.22.tar.gz", hash = "sha256:491c8be9c040f5390f5bf44a5b07752bd07f56edf992381b05c701439eec10f6", size = 172736 } +sdist = { url = "https://files.pythonhosted.org/packages/1d/b2/31537cf4b1ca988837256c910a668b553fceb8f069bedc4b1c826024b52c/pycparser-2.22.tar.gz", hash = "sha256:491c8be9c040f5390f5bf44a5b07752bd07f56edf992381b05c701439eec10f6", size = 172736, upload-time = "2024-03-30T13:22:22.564Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/13/a3/a812df4e2dd5696d1f351d58b8fe16a405b234ad2886a0dab9183fb78109/pycparser-2.22-py3-none-any.whl", hash = "sha256:c3702b6d3dd8c7abc1afa565d7e63d53a1d0bd86cdc24edd75470f4de499cfcc", size = 117552 }, + { url = "https://files.pythonhosted.org/packages/13/a3/a812df4e2dd5696d1f351d58b8fe16a405b234ad2886a0dab9183fb78109/pycparser-2.22-py3-none-any.whl", hash = "sha256:c3702b6d3dd8c7abc1afa565d7e63d53a1d0bd86cdc24edd75470f4de499cfcc", size = 117552, upload-time = "2024-03-30T13:22:20.476Z" }, ] [[package]] @@ -960,9 +1121,9 @@ dependencies = [ { name = "pydantic-core" }, { name = "typing-extensions" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/c4/bd/7fc610993f616d2398958d0028d15eaf53bde5f80cb2edb7aa4f1feaf3a7/pydantic-2.10.1.tar.gz", hash = "sha256:a4daca2dc0aa429555e0656d6bf94873a7dc5f54ee42b1f5873d666fb3f35560", size = 783717 } +sdist = { url = "https://files.pythonhosted.org/packages/c4/bd/7fc610993f616d2398958d0028d15eaf53bde5f80cb2edb7aa4f1feaf3a7/pydantic-2.10.1.tar.gz", hash = "sha256:a4daca2dc0aa429555e0656d6bf94873a7dc5f54ee42b1f5873d666fb3f35560", size = 783717, upload-time = "2024-11-22T00:58:43.709Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/e0/fc/fda48d347bd50a788dd2a0f318a52160f911b86fc2d8b4c86f4d7c9bceea/pydantic-2.10.1-py3-none-any.whl", hash = "sha256:a8d20db84de64cf4a7d59e899c2caf0fe9d660c7cfc482528e7020d7dd189a7e", size = 455329 }, + { url = "https://files.pythonhosted.org/packages/e0/fc/fda48d347bd50a788dd2a0f318a52160f911b86fc2d8b4c86f4d7c9bceea/pydantic-2.10.1-py3-none-any.whl", hash = "sha256:a8d20db84de64cf4a7d59e899c2caf0fe9d660c7cfc482528e7020d7dd189a7e", size = 455329, upload-time = "2024-11-22T00:58:40.347Z" }, ] [[package]] @@ -972,72 +1133,72 @@ source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "typing-extensions" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/a6/9f/7de1f19b6aea45aeb441838782d68352e71bfa98ee6fa048d5041991b33e/pydantic_core-2.27.1.tar.gz", hash = "sha256:62a763352879b84aa31058fc931884055fd75089cccbd9d58bb6afd01141b235", size = 412785 } -wheels = [ - { url = "https://files.pythonhosted.org/packages/6e/ce/60fd96895c09738648c83f3f00f595c807cb6735c70d3306b548cc96dd49/pydantic_core-2.27.1-cp310-cp310-macosx_10_12_x86_64.whl", hash = 
"sha256:71a5e35c75c021aaf400ac048dacc855f000bdfed91614b4a726f7432f1f3d6a", size = 1897984 }, - { url = "https://files.pythonhosted.org/packages/fd/b9/84623d6b6be98cc209b06687d9bca5a7b966ffed008d15225dd0d20cce2e/pydantic_core-2.27.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:f82d068a2d6ecfc6e054726080af69a6764a10015467d7d7b9f66d6ed5afa23b", size = 1807491 }, - { url = "https://files.pythonhosted.org/packages/01/72/59a70165eabbc93b1111d42df9ca016a4aa109409db04304829377947028/pydantic_core-2.27.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:121ceb0e822f79163dd4699e4c54f5ad38b157084d97b34de8b232bcaad70278", size = 1831953 }, - { url = "https://files.pythonhosted.org/packages/7c/0c/24841136476adafd26f94b45bb718a78cb0500bd7b4f8d667b67c29d7b0d/pydantic_core-2.27.1-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:4603137322c18eaf2e06a4495f426aa8d8388940f3c457e7548145011bb68e05", size = 1856071 }, - { url = "https://files.pythonhosted.org/packages/53/5e/c32957a09cceb2af10d7642df45d1e3dbd8596061f700eac93b801de53c0/pydantic_core-2.27.1-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a33cd6ad9017bbeaa9ed78a2e0752c5e250eafb9534f308e7a5f7849b0b1bfb4", size = 2038439 }, - { url = "https://files.pythonhosted.org/packages/e4/8f/979ab3eccd118b638cd6d8f980fea8794f45018255a36044dea40fe579d4/pydantic_core-2.27.1-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:15cc53a3179ba0fcefe1e3ae50beb2784dede4003ad2dfd24f81bba4b23a454f", size = 2787416 }, - { url = "https://files.pythonhosted.org/packages/02/1d/00f2e4626565b3b6d3690dab4d4fe1a26edd6a20e53749eb21ca892ef2df/pydantic_core-2.27.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:45d9c5eb9273aa50999ad6adc6be5e0ecea7e09dbd0d31bd0c65a55a2592ca08", size = 2134548 }, - { url = "https://files.pythonhosted.org/packages/9d/46/3112621204128b90898adc2e721a3cd6cf5626504178d6f32c33b5a43b79/pydantic_core-2.27.1-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:8bf7b66ce12a2ac52d16f776b31d16d91033150266eb796967a7e4621707e4f6", size = 1989882 }, - { url = "https://files.pythonhosted.org/packages/49/ec/557dd4ff5287ffffdf16a31d08d723de6762bb1b691879dc4423392309bc/pydantic_core-2.27.1-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:655d7dd86f26cb15ce8a431036f66ce0318648f8853d709b4167786ec2fa4807", size = 1995829 }, - { url = "https://files.pythonhosted.org/packages/6e/b2/610dbeb74d8d43921a7234555e4c091cb050a2bdb8cfea86d07791ce01c5/pydantic_core-2.27.1-cp310-cp310-musllinux_1_1_armv7l.whl", hash = "sha256:5556470f1a2157031e676f776c2bc20acd34c1990ca5f7e56f1ebf938b9ab57c", size = 2091257 }, - { url = "https://files.pythonhosted.org/packages/8c/7f/4bf8e9d26a9118521c80b229291fa9558a07cdd9a968ec2d5c1026f14fbc/pydantic_core-2.27.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:f69ed81ab24d5a3bd93861c8c4436f54afdf8e8cc421562b0c7504cf3be58206", size = 2143894 }, - { url = "https://files.pythonhosted.org/packages/1f/1c/875ac7139c958f4390f23656fe696d1acc8edf45fb81e4831960f12cd6e4/pydantic_core-2.27.1-cp310-none-win32.whl", hash = "sha256:f5a823165e6d04ccea61a9f0576f345f8ce40ed533013580e087bd4d7442b52c", size = 1816081 }, - { url = "https://files.pythonhosted.org/packages/d7/41/55a117acaeda25ceae51030b518032934f251b1dac3704a53781383e3491/pydantic_core-2.27.1-cp310-none-win_amd64.whl", hash = "sha256:57866a76e0b3823e0b56692d1a0bf722bffb324839bb5b7226a7dbd6c9a40b17", size = 1981109 }, - { url = 
"https://files.pythonhosted.org/packages/27/39/46fe47f2ad4746b478ba89c561cafe4428e02b3573df882334bd2964f9cb/pydantic_core-2.27.1-cp311-cp311-macosx_10_12_x86_64.whl", hash = "sha256:ac3b20653bdbe160febbea8aa6c079d3df19310d50ac314911ed8cc4eb7f8cb8", size = 1895553 }, - { url = "https://files.pythonhosted.org/packages/1c/00/0804e84a78b7fdb394fff4c4f429815a10e5e0993e6ae0e0b27dd20379ee/pydantic_core-2.27.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:a5a8e19d7c707c4cadb8c18f5f60c843052ae83c20fa7d44f41594c644a1d330", size = 1807220 }, - { url = "https://files.pythonhosted.org/packages/01/de/df51b3bac9820d38371f5a261020f505025df732ce566c2a2e7970b84c8c/pydantic_core-2.27.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:7f7059ca8d64fea7f238994c97d91f75965216bcbe5f695bb44f354893f11d52", size = 1829727 }, - { url = "https://files.pythonhosted.org/packages/5f/d9/c01d19da8f9e9fbdb2bf99f8358d145a312590374d0dc9dd8dbe484a9cde/pydantic_core-2.27.1-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:bed0f8a0eeea9fb72937ba118f9db0cb7e90773462af7962d382445f3005e5a4", size = 1854282 }, - { url = "https://files.pythonhosted.org/packages/5f/84/7db66eb12a0dc88c006abd6f3cbbf4232d26adfd827a28638c540d8f871d/pydantic_core-2.27.1-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a3cb37038123447cf0f3ea4c74751f6a9d7afef0eb71aa07bf5f652b5e6a132c", size = 2037437 }, - { url = "https://files.pythonhosted.org/packages/34/ac/a2537958db8299fbabed81167d58cc1506049dba4163433524e06a7d9f4c/pydantic_core-2.27.1-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:84286494f6c5d05243456e04223d5a9417d7f443c3b76065e75001beb26f88de", size = 2780899 }, - { url = "https://files.pythonhosted.org/packages/4a/c1/3e38cd777ef832c4fdce11d204592e135ddeedb6c6f525478a53d1c7d3e5/pydantic_core-2.27.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:acc07b2cfc5b835444b44a9956846b578d27beeacd4b52e45489e93276241025", size = 2135022 }, - { url = "https://files.pythonhosted.org/packages/7a/69/b9952829f80fd555fe04340539d90e000a146f2a003d3fcd1e7077c06c71/pydantic_core-2.27.1-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:4fefee876e07a6e9aad7a8c8c9f85b0cdbe7df52b8a9552307b09050f7512c7e", size = 1987969 }, - { url = "https://files.pythonhosted.org/packages/05/72/257b5824d7988af43460c4e22b63932ed651fe98804cc2793068de7ec554/pydantic_core-2.27.1-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:258c57abf1188926c774a4c94dd29237e77eda19462e5bb901d88adcab6af919", size = 1994625 }, - { url = "https://files.pythonhosted.org/packages/73/c3/78ed6b7f3278a36589bcdd01243189ade7fc9b26852844938b4d7693895b/pydantic_core-2.27.1-cp311-cp311-musllinux_1_1_armv7l.whl", hash = "sha256:35c14ac45fcfdf7167ca76cc80b2001205a8d5d16d80524e13508371fb8cdd9c", size = 2090089 }, - { url = "https://files.pythonhosted.org/packages/8d/c8/b4139b2f78579960353c4cd987e035108c93a78371bb19ba0dc1ac3b3220/pydantic_core-2.27.1-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:d1b26e1dff225c31897696cab7d4f0a315d4c0d9e8666dbffdb28216f3b17fdc", size = 2142496 }, - { url = "https://files.pythonhosted.org/packages/3e/f8/171a03e97eb36c0b51981efe0f78460554a1d8311773d3d30e20c005164e/pydantic_core-2.27.1-cp311-none-win32.whl", hash = "sha256:2cdf7d86886bc6982354862204ae3b2f7f96f21a3eb0ba5ca0ac42c7b38598b9", size = 1811758 }, - { url = 
"https://files.pythonhosted.org/packages/6a/fe/4e0e63c418c1c76e33974a05266e5633e879d4061f9533b1706a86f77d5b/pydantic_core-2.27.1-cp311-none-win_amd64.whl", hash = "sha256:3af385b0cee8df3746c3f406f38bcbfdc9041b5c2d5ce3e5fc6637256e60bbc5", size = 1980864 }, - { url = "https://files.pythonhosted.org/packages/50/fc/93f7238a514c155a8ec02fc7ac6376177d449848115e4519b853820436c5/pydantic_core-2.27.1-cp311-none-win_arm64.whl", hash = "sha256:81f2ec23ddc1b476ff96563f2e8d723830b06dceae348ce02914a37cb4e74b89", size = 1864327 }, - { url = "https://files.pythonhosted.org/packages/be/51/2e9b3788feb2aebff2aa9dfbf060ec739b38c05c46847601134cc1fed2ea/pydantic_core-2.27.1-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:9cbd94fc661d2bab2bc702cddd2d3370bbdcc4cd0f8f57488a81bcce90c7a54f", size = 1895239 }, - { url = "https://files.pythonhosted.org/packages/7b/9e/f8063952e4a7d0127f5d1181addef9377505dcce3be224263b25c4f0bfd9/pydantic_core-2.27.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:5f8c4718cd44ec1580e180cb739713ecda2bdee1341084c1467802a417fe0f02", size = 1805070 }, - { url = "https://files.pythonhosted.org/packages/2c/9d/e1d6c4561d262b52e41b17a7ef8301e2ba80b61e32e94520271029feb5d8/pydantic_core-2.27.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:15aae984e46de8d376df515f00450d1522077254ef6b7ce189b38ecee7c9677c", size = 1828096 }, - { url = "https://files.pythonhosted.org/packages/be/65/80ff46de4266560baa4332ae3181fffc4488ea7d37282da1a62d10ab89a4/pydantic_core-2.27.1-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:1ba5e3963344ff25fc8c40da90f44b0afca8cfd89d12964feb79ac1411a260ac", size = 1857708 }, - { url = "https://files.pythonhosted.org/packages/d5/ca/3370074ad758b04d9562b12ecdb088597f4d9d13893a48a583fb47682cdf/pydantic_core-2.27.1-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:992cea5f4f3b29d6b4f7f1726ed8ee46c8331c6b4eed6db5b40134c6fe1768bb", size = 2037751 }, - { url = "https://files.pythonhosted.org/packages/b1/e2/4ab72d93367194317b99d051947c071aef6e3eb95f7553eaa4208ecf9ba4/pydantic_core-2.27.1-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:0325336f348dbee6550d129b1627cb8f5351a9dc91aad141ffb96d4937bd9529", size = 2733863 }, - { url = "https://files.pythonhosted.org/packages/8a/c6/8ae0831bf77f356bb73127ce5a95fe115b10f820ea480abbd72d3cc7ccf3/pydantic_core-2.27.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7597c07fbd11515f654d6ece3d0e4e5093edc30a436c63142d9a4b8e22f19c35", size = 2161161 }, - { url = "https://files.pythonhosted.org/packages/f1/f4/b2fe73241da2429400fc27ddeaa43e35562f96cf5b67499b2de52b528cad/pydantic_core-2.27.1-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:3bbd5d8cc692616d5ef6fbbbd50dbec142c7e6ad9beb66b78a96e9c16729b089", size = 1993294 }, - { url = "https://files.pythonhosted.org/packages/77/29/4bb008823a7f4cc05828198153f9753b3bd4c104d93b8e0b1bfe4e187540/pydantic_core-2.27.1-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:dc61505e73298a84a2f317255fcc72b710b72980f3a1f670447a21efc88f8381", size = 2001468 }, - { url = "https://files.pythonhosted.org/packages/f2/a9/0eaceeba41b9fad851a4107e0cf999a34ae8f0d0d1f829e2574f3d8897b0/pydantic_core-2.27.1-cp312-cp312-musllinux_1_1_armv7l.whl", hash = "sha256:e1f735dc43da318cad19b4173dd1ffce1d84aafd6c9b782b3abc04a0d5a6f5bb", size = 2091413 }, - { url = 
"https://files.pythonhosted.org/packages/d8/36/eb8697729725bc610fd73940f0d860d791dc2ad557faaefcbb3edbd2b349/pydantic_core-2.27.1-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:f4e5658dbffe8843a0f12366a4c2d1c316dbe09bb4dfbdc9d2d9cd6031de8aae", size = 2154735 }, - { url = "https://files.pythonhosted.org/packages/52/e5/4f0fbd5c5995cc70d3afed1b5c754055bb67908f55b5cb8000f7112749bf/pydantic_core-2.27.1-cp312-none-win32.whl", hash = "sha256:672ebbe820bb37988c4d136eca2652ee114992d5d41c7e4858cdd90ea94ffe5c", size = 1833633 }, - { url = "https://files.pythonhosted.org/packages/ee/f2/c61486eee27cae5ac781305658779b4a6b45f9cc9d02c90cb21b940e82cc/pydantic_core-2.27.1-cp312-none-win_amd64.whl", hash = "sha256:66ff044fd0bb1768688aecbe28b6190f6e799349221fb0de0e6f4048eca14c16", size = 1986973 }, - { url = "https://files.pythonhosted.org/packages/df/a6/e3f12ff25f250b02f7c51be89a294689d175ac76e1096c32bf278f29ca1e/pydantic_core-2.27.1-cp312-none-win_arm64.whl", hash = "sha256:9a3b0793b1bbfd4146304e23d90045f2a9b5fd5823aa682665fbdaf2a6c28f3e", size = 1883215 }, - { url = "https://files.pythonhosted.org/packages/0f/d6/91cb99a3c59d7b072bded9959fbeab0a9613d5a4935773c0801f1764c156/pydantic_core-2.27.1-cp313-cp313-macosx_10_12_x86_64.whl", hash = "sha256:f216dbce0e60e4d03e0c4353c7023b202d95cbaeff12e5fd2e82ea0a66905073", size = 1895033 }, - { url = "https://files.pythonhosted.org/packages/07/42/d35033f81a28b27dedcade9e967e8a40981a765795c9ebae2045bcef05d3/pydantic_core-2.27.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:a2e02889071850bbfd36b56fd6bc98945e23670773bc7a76657e90e6b6603c08", size = 1807542 }, - { url = "https://files.pythonhosted.org/packages/41/c2/491b59e222ec7e72236e512108ecad532c7f4391a14e971c963f624f7569/pydantic_core-2.27.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:42b0e23f119b2b456d07ca91b307ae167cc3f6c846a7b169fca5326e32fdc6cf", size = 1827854 }, - { url = "https://files.pythonhosted.org/packages/e3/f3/363652651779113189cefdbbb619b7b07b7a67ebb6840325117cc8cc3460/pydantic_core-2.27.1-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:764be71193f87d460a03f1f7385a82e226639732214b402f9aa61f0d025f0737", size = 1857389 }, - { url = "https://files.pythonhosted.org/packages/5f/97/be804aed6b479af5a945daec7538d8bf358d668bdadde4c7888a2506bdfb/pydantic_core-2.27.1-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1c00666a3bd2f84920a4e94434f5974d7bbc57e461318d6bb34ce9cdbbc1f6b2", size = 2037934 }, - { url = "https://files.pythonhosted.org/packages/42/01/295f0bd4abf58902917e342ddfe5f76cf66ffabfc57c2e23c7681a1a1197/pydantic_core-2.27.1-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:3ccaa88b24eebc0f849ce0a4d09e8a408ec5a94afff395eb69baf868f5183107", size = 2735176 }, - { url = "https://files.pythonhosted.org/packages/9d/a0/cd8e9c940ead89cc37812a1a9f310fef59ba2f0b22b4e417d84ab09fa970/pydantic_core-2.27.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c65af9088ac534313e1963443d0ec360bb2b9cba6c2909478d22c2e363d98a51", size = 2160720 }, - { url = "https://files.pythonhosted.org/packages/73/ae/9d0980e286627e0aeca4c352a60bd760331622c12d576e5ea4441ac7e15e/pydantic_core-2.27.1-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:206b5cf6f0c513baffaeae7bd817717140770c74528f3e4c3e1cec7871ddd61a", size = 1992972 }, - { url = 
"https://files.pythonhosted.org/packages/bf/ba/ae4480bc0292d54b85cfb954e9d6bd226982949f8316338677d56541b85f/pydantic_core-2.27.1-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:062f60e512fc7fff8b8a9d680ff0ddaaef0193dba9fa83e679c0c5f5fbd018bc", size = 2001477 }, - { url = "https://files.pythonhosted.org/packages/55/b7/e26adf48c2f943092ce54ae14c3c08d0d221ad34ce80b18a50de8ed2cba8/pydantic_core-2.27.1-cp313-cp313-musllinux_1_1_armv7l.whl", hash = "sha256:a0697803ed7d4af5e4c1adf1670af078f8fcab7a86350e969f454daf598c4960", size = 2091186 }, - { url = "https://files.pythonhosted.org/packages/ba/cc/8491fff5b608b3862eb36e7d29d36a1af1c945463ca4c5040bf46cc73f40/pydantic_core-2.27.1-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:58ca98a950171f3151c603aeea9303ef6c235f692fe555e883591103da709b23", size = 2154429 }, - { url = "https://files.pythonhosted.org/packages/78/d8/c080592d80edd3441ab7f88f865f51dae94a157fc64283c680e9f32cf6da/pydantic_core-2.27.1-cp313-none-win32.whl", hash = "sha256:8065914ff79f7eab1599bd80406681f0ad08f8e47c880f17b416c9f8f7a26d05", size = 1833713 }, - { url = "https://files.pythonhosted.org/packages/83/84/5ab82a9ee2538ac95a66e51f6838d6aba6e0a03a42aa185ad2fe404a4e8f/pydantic_core-2.27.1-cp313-none-win_amd64.whl", hash = "sha256:ba630d5e3db74c79300d9a5bdaaf6200172b107f263c98a0539eeecb857b2337", size = 1987897 }, - { url = "https://files.pythonhosted.org/packages/df/c3/b15fb833926d91d982fde29c0624c9f225da743c7af801dace0d4e187e71/pydantic_core-2.27.1-cp313-none-win_arm64.whl", hash = "sha256:45cf8588c066860b623cd11c4ba687f8d7175d5f7ef65f7129df8a394c502de5", size = 1882983 }, - { url = "https://files.pythonhosted.org/packages/7c/60/e5eb2d462595ba1f622edbe7b1d19531e510c05c405f0b87c80c1e89d5b1/pydantic_core-2.27.1-pp310-pypy310_pp73-macosx_10_12_x86_64.whl", hash = "sha256:3fa80ac2bd5856580e242dbc202db873c60a01b20309c8319b5c5986fbe53ce6", size = 1894016 }, - { url = "https://files.pythonhosted.org/packages/61/20/da7059855225038c1c4326a840908cc7ca72c7198cb6addb8b92ec81c1d6/pydantic_core-2.27.1-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:d950caa237bb1954f1b8c9227b5065ba6875ac9771bb8ec790d956a699b78676", size = 1771648 }, - { url = "https://files.pythonhosted.org/packages/8f/fc/5485cf0b0bb38da31d1d292160a4d123b5977841ddc1122c671a30b76cfd/pydantic_core-2.27.1-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0e4216e64d203e39c62df627aa882f02a2438d18a5f21d7f721621f7a5d3611d", size = 1826929 }, - { url = "https://files.pythonhosted.org/packages/a1/ff/fb1284a210e13a5f34c639efc54d51da136074ffbe25ec0c279cf9fbb1c4/pydantic_core-2.27.1-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:02a3d637bd387c41d46b002f0e49c52642281edacd2740e5a42f7017feea3f2c", size = 1980591 }, - { url = "https://files.pythonhosted.org/packages/f1/14/77c1887a182d05af74f6aeac7b740da3a74155d3093ccc7ee10b900cc6b5/pydantic_core-2.27.1-pp310-pypy310_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:161c27ccce13b6b0c8689418da3885d3220ed2eae2ea5e9b2f7f3d48f1d52c27", size = 1981326 }, - { url = "https://files.pythonhosted.org/packages/06/aa/6f1b2747f811a9c66b5ef39d7f02fbb200479784c75e98290d70004b1253/pydantic_core-2.27.1-pp310-pypy310_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:19910754e4cc9c63bc1c7f6d73aa1cfee82f42007e407c0f413695c2f7ed777f", size = 1989205 }, - { url = 
"https://files.pythonhosted.org/packages/7a/d2/8ce2b074d6835f3c88d85f6d8a399790043e9fdb3d0e43455e72d19df8cc/pydantic_core-2.27.1-pp310-pypy310_pp73-musllinux_1_1_armv7l.whl", hash = "sha256:e173486019cc283dc9778315fa29a363579372fe67045e971e89b6365cc035ed", size = 2079616 }, - { url = "https://files.pythonhosted.org/packages/65/71/af01033d4e58484c3db1e5d13e751ba5e3d6b87cc3368533df4c50932c8b/pydantic_core-2.27.1-pp310-pypy310_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:af52d26579b308921b73b956153066481f064875140ccd1dfd4e77db89dbb12f", size = 2133265 }, - { url = "https://files.pythonhosted.org/packages/33/72/f881b5e18fbb67cf2fb4ab253660de3c6899dbb2dba409d0b757e3559e3d/pydantic_core-2.27.1-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:981fb88516bd1ae8b0cbbd2034678a39dedc98752f264ac9bc5839d3923fa04c", size = 2001864 }, +sdist = { url = "https://files.pythonhosted.org/packages/a6/9f/7de1f19b6aea45aeb441838782d68352e71bfa98ee6fa048d5041991b33e/pydantic_core-2.27.1.tar.gz", hash = "sha256:62a763352879b84aa31058fc931884055fd75089cccbd9d58bb6afd01141b235", size = 412785, upload-time = "2024-11-22T00:24:49.865Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/6e/ce/60fd96895c09738648c83f3f00f595c807cb6735c70d3306b548cc96dd49/pydantic_core-2.27.1-cp310-cp310-macosx_10_12_x86_64.whl", hash = "sha256:71a5e35c75c021aaf400ac048dacc855f000bdfed91614b4a726f7432f1f3d6a", size = 1897984, upload-time = "2024-11-22T00:21:25.431Z" }, + { url = "https://files.pythonhosted.org/packages/fd/b9/84623d6b6be98cc209b06687d9bca5a7b966ffed008d15225dd0d20cce2e/pydantic_core-2.27.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:f82d068a2d6ecfc6e054726080af69a6764a10015467d7d7b9f66d6ed5afa23b", size = 1807491, upload-time = "2024-11-22T00:21:27.318Z" }, + { url = "https://files.pythonhosted.org/packages/01/72/59a70165eabbc93b1111d42df9ca016a4aa109409db04304829377947028/pydantic_core-2.27.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:121ceb0e822f79163dd4699e4c54f5ad38b157084d97b34de8b232bcaad70278", size = 1831953, upload-time = "2024-11-22T00:21:28.606Z" }, + { url = "https://files.pythonhosted.org/packages/7c/0c/24841136476adafd26f94b45bb718a78cb0500bd7b4f8d667b67c29d7b0d/pydantic_core-2.27.1-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:4603137322c18eaf2e06a4495f426aa8d8388940f3c457e7548145011bb68e05", size = 1856071, upload-time = "2024-11-22T00:21:29.931Z" }, + { url = "https://files.pythonhosted.org/packages/53/5e/c32957a09cceb2af10d7642df45d1e3dbd8596061f700eac93b801de53c0/pydantic_core-2.27.1-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a33cd6ad9017bbeaa9ed78a2e0752c5e250eafb9534f308e7a5f7849b0b1bfb4", size = 2038439, upload-time = "2024-11-22T00:21:32.245Z" }, + { url = "https://files.pythonhosted.org/packages/e4/8f/979ab3eccd118b638cd6d8f980fea8794f45018255a36044dea40fe579d4/pydantic_core-2.27.1-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:15cc53a3179ba0fcefe1e3ae50beb2784dede4003ad2dfd24f81bba4b23a454f", size = 2787416, upload-time = "2024-11-22T00:21:33.708Z" }, + { url = "https://files.pythonhosted.org/packages/02/1d/00f2e4626565b3b6d3690dab4d4fe1a26edd6a20e53749eb21ca892ef2df/pydantic_core-2.27.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:45d9c5eb9273aa50999ad6adc6be5e0ecea7e09dbd0d31bd0c65a55a2592ca08", size = 2134548, upload-time = "2024-11-22T00:21:35.823Z" }, + { url = 
"https://files.pythonhosted.org/packages/9d/46/3112621204128b90898adc2e721a3cd6cf5626504178d6f32c33b5a43b79/pydantic_core-2.27.1-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:8bf7b66ce12a2ac52d16f776b31d16d91033150266eb796967a7e4621707e4f6", size = 1989882, upload-time = "2024-11-22T00:21:37.872Z" }, + { url = "https://files.pythonhosted.org/packages/49/ec/557dd4ff5287ffffdf16a31d08d723de6762bb1b691879dc4423392309bc/pydantic_core-2.27.1-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:655d7dd86f26cb15ce8a431036f66ce0318648f8853d709b4167786ec2fa4807", size = 1995829, upload-time = "2024-11-22T00:21:39.966Z" }, + { url = "https://files.pythonhosted.org/packages/6e/b2/610dbeb74d8d43921a7234555e4c091cb050a2bdb8cfea86d07791ce01c5/pydantic_core-2.27.1-cp310-cp310-musllinux_1_1_armv7l.whl", hash = "sha256:5556470f1a2157031e676f776c2bc20acd34c1990ca5f7e56f1ebf938b9ab57c", size = 2091257, upload-time = "2024-11-22T00:21:41.99Z" }, + { url = "https://files.pythonhosted.org/packages/8c/7f/4bf8e9d26a9118521c80b229291fa9558a07cdd9a968ec2d5c1026f14fbc/pydantic_core-2.27.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:f69ed81ab24d5a3bd93861c8c4436f54afdf8e8cc421562b0c7504cf3be58206", size = 2143894, upload-time = "2024-11-22T00:21:44.193Z" }, + { url = "https://files.pythonhosted.org/packages/1f/1c/875ac7139c958f4390f23656fe696d1acc8edf45fb81e4831960f12cd6e4/pydantic_core-2.27.1-cp310-none-win32.whl", hash = "sha256:f5a823165e6d04ccea61a9f0576f345f8ce40ed533013580e087bd4d7442b52c", size = 1816081, upload-time = "2024-11-22T00:21:45.468Z" }, + { url = "https://files.pythonhosted.org/packages/d7/41/55a117acaeda25ceae51030b518032934f251b1dac3704a53781383e3491/pydantic_core-2.27.1-cp310-none-win_amd64.whl", hash = "sha256:57866a76e0b3823e0b56692d1a0bf722bffb324839bb5b7226a7dbd6c9a40b17", size = 1981109, upload-time = "2024-11-22T00:21:47.452Z" }, + { url = "https://files.pythonhosted.org/packages/27/39/46fe47f2ad4746b478ba89c561cafe4428e02b3573df882334bd2964f9cb/pydantic_core-2.27.1-cp311-cp311-macosx_10_12_x86_64.whl", hash = "sha256:ac3b20653bdbe160febbea8aa6c079d3df19310d50ac314911ed8cc4eb7f8cb8", size = 1895553, upload-time = "2024-11-22T00:21:48.859Z" }, + { url = "https://files.pythonhosted.org/packages/1c/00/0804e84a78b7fdb394fff4c4f429815a10e5e0993e6ae0e0b27dd20379ee/pydantic_core-2.27.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:a5a8e19d7c707c4cadb8c18f5f60c843052ae83c20fa7d44f41594c644a1d330", size = 1807220, upload-time = "2024-11-22T00:21:50.354Z" }, + { url = "https://files.pythonhosted.org/packages/01/de/df51b3bac9820d38371f5a261020f505025df732ce566c2a2e7970b84c8c/pydantic_core-2.27.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:7f7059ca8d64fea7f238994c97d91f75965216bcbe5f695bb44f354893f11d52", size = 1829727, upload-time = "2024-11-22T00:21:51.722Z" }, + { url = "https://files.pythonhosted.org/packages/5f/d9/c01d19da8f9e9fbdb2bf99f8358d145a312590374d0dc9dd8dbe484a9cde/pydantic_core-2.27.1-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:bed0f8a0eeea9fb72937ba118f9db0cb7e90773462af7962d382445f3005e5a4", size = 1854282, upload-time = "2024-11-22T00:21:53.098Z" }, + { url = "https://files.pythonhosted.org/packages/5f/84/7db66eb12a0dc88c006abd6f3cbbf4232d26adfd827a28638c540d8f871d/pydantic_core-2.27.1-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a3cb37038123447cf0f3ea4c74751f6a9d7afef0eb71aa07bf5f652b5e6a132c", size = 2037437, upload-time = 
"2024-11-22T00:21:55.185Z" }, + { url = "https://files.pythonhosted.org/packages/34/ac/a2537958db8299fbabed81167d58cc1506049dba4163433524e06a7d9f4c/pydantic_core-2.27.1-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:84286494f6c5d05243456e04223d5a9417d7f443c3b76065e75001beb26f88de", size = 2780899, upload-time = "2024-11-22T00:21:56.633Z" }, + { url = "https://files.pythonhosted.org/packages/4a/c1/3e38cd777ef832c4fdce11d204592e135ddeedb6c6f525478a53d1c7d3e5/pydantic_core-2.27.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:acc07b2cfc5b835444b44a9956846b578d27beeacd4b52e45489e93276241025", size = 2135022, upload-time = "2024-11-22T00:21:59.154Z" }, + { url = "https://files.pythonhosted.org/packages/7a/69/b9952829f80fd555fe04340539d90e000a146f2a003d3fcd1e7077c06c71/pydantic_core-2.27.1-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:4fefee876e07a6e9aad7a8c8c9f85b0cdbe7df52b8a9552307b09050f7512c7e", size = 1987969, upload-time = "2024-11-22T00:22:01.325Z" }, + { url = "https://files.pythonhosted.org/packages/05/72/257b5824d7988af43460c4e22b63932ed651fe98804cc2793068de7ec554/pydantic_core-2.27.1-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:258c57abf1188926c774a4c94dd29237e77eda19462e5bb901d88adcab6af919", size = 1994625, upload-time = "2024-11-22T00:22:03.447Z" }, + { url = "https://files.pythonhosted.org/packages/73/c3/78ed6b7f3278a36589bcdd01243189ade7fc9b26852844938b4d7693895b/pydantic_core-2.27.1-cp311-cp311-musllinux_1_1_armv7l.whl", hash = "sha256:35c14ac45fcfdf7167ca76cc80b2001205a8d5d16d80524e13508371fb8cdd9c", size = 2090089, upload-time = "2024-11-22T00:22:04.941Z" }, + { url = "https://files.pythonhosted.org/packages/8d/c8/b4139b2f78579960353c4cd987e035108c93a78371bb19ba0dc1ac3b3220/pydantic_core-2.27.1-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:d1b26e1dff225c31897696cab7d4f0a315d4c0d9e8666dbffdb28216f3b17fdc", size = 2142496, upload-time = "2024-11-22T00:22:06.57Z" }, + { url = "https://files.pythonhosted.org/packages/3e/f8/171a03e97eb36c0b51981efe0f78460554a1d8311773d3d30e20c005164e/pydantic_core-2.27.1-cp311-none-win32.whl", hash = "sha256:2cdf7d86886bc6982354862204ae3b2f7f96f21a3eb0ba5ca0ac42c7b38598b9", size = 1811758, upload-time = "2024-11-22T00:22:08.445Z" }, + { url = "https://files.pythonhosted.org/packages/6a/fe/4e0e63c418c1c76e33974a05266e5633e879d4061f9533b1706a86f77d5b/pydantic_core-2.27.1-cp311-none-win_amd64.whl", hash = "sha256:3af385b0cee8df3746c3f406f38bcbfdc9041b5c2d5ce3e5fc6637256e60bbc5", size = 1980864, upload-time = "2024-11-22T00:22:10Z" }, + { url = "https://files.pythonhosted.org/packages/50/fc/93f7238a514c155a8ec02fc7ac6376177d449848115e4519b853820436c5/pydantic_core-2.27.1-cp311-none-win_arm64.whl", hash = "sha256:81f2ec23ddc1b476ff96563f2e8d723830b06dceae348ce02914a37cb4e74b89", size = 1864327, upload-time = "2024-11-22T00:22:11.478Z" }, + { url = "https://files.pythonhosted.org/packages/be/51/2e9b3788feb2aebff2aa9dfbf060ec739b38c05c46847601134cc1fed2ea/pydantic_core-2.27.1-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:9cbd94fc661d2bab2bc702cddd2d3370bbdcc4cd0f8f57488a81bcce90c7a54f", size = 1895239, upload-time = "2024-11-22T00:22:13.775Z" }, + { url = "https://files.pythonhosted.org/packages/7b/9e/f8063952e4a7d0127f5d1181addef9377505dcce3be224263b25c4f0bfd9/pydantic_core-2.27.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:5f8c4718cd44ec1580e180cb739713ecda2bdee1341084c1467802a417fe0f02", size = 1805070, upload-time = "2024-11-22T00:22:15.438Z" 
}, + { url = "https://files.pythonhosted.org/packages/2c/9d/e1d6c4561d262b52e41b17a7ef8301e2ba80b61e32e94520271029feb5d8/pydantic_core-2.27.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:15aae984e46de8d376df515f00450d1522077254ef6b7ce189b38ecee7c9677c", size = 1828096, upload-time = "2024-11-22T00:22:17.892Z" }, + { url = "https://files.pythonhosted.org/packages/be/65/80ff46de4266560baa4332ae3181fffc4488ea7d37282da1a62d10ab89a4/pydantic_core-2.27.1-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:1ba5e3963344ff25fc8c40da90f44b0afca8cfd89d12964feb79ac1411a260ac", size = 1857708, upload-time = "2024-11-22T00:22:19.412Z" }, + { url = "https://files.pythonhosted.org/packages/d5/ca/3370074ad758b04d9562b12ecdb088597f4d9d13893a48a583fb47682cdf/pydantic_core-2.27.1-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:992cea5f4f3b29d6b4f7f1726ed8ee46c8331c6b4eed6db5b40134c6fe1768bb", size = 2037751, upload-time = "2024-11-22T00:22:20.979Z" }, + { url = "https://files.pythonhosted.org/packages/b1/e2/4ab72d93367194317b99d051947c071aef6e3eb95f7553eaa4208ecf9ba4/pydantic_core-2.27.1-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:0325336f348dbee6550d129b1627cb8f5351a9dc91aad141ffb96d4937bd9529", size = 2733863, upload-time = "2024-11-22T00:22:22.951Z" }, + { url = "https://files.pythonhosted.org/packages/8a/c6/8ae0831bf77f356bb73127ce5a95fe115b10f820ea480abbd72d3cc7ccf3/pydantic_core-2.27.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7597c07fbd11515f654d6ece3d0e4e5093edc30a436c63142d9a4b8e22f19c35", size = 2161161, upload-time = "2024-11-22T00:22:24.785Z" }, + { url = "https://files.pythonhosted.org/packages/f1/f4/b2fe73241da2429400fc27ddeaa43e35562f96cf5b67499b2de52b528cad/pydantic_core-2.27.1-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:3bbd5d8cc692616d5ef6fbbbd50dbec142c7e6ad9beb66b78a96e9c16729b089", size = 1993294, upload-time = "2024-11-22T00:22:27.076Z" }, + { url = "https://files.pythonhosted.org/packages/77/29/4bb008823a7f4cc05828198153f9753b3bd4c104d93b8e0b1bfe4e187540/pydantic_core-2.27.1-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:dc61505e73298a84a2f317255fcc72b710b72980f3a1f670447a21efc88f8381", size = 2001468, upload-time = "2024-11-22T00:22:29.346Z" }, + { url = "https://files.pythonhosted.org/packages/f2/a9/0eaceeba41b9fad851a4107e0cf999a34ae8f0d0d1f829e2574f3d8897b0/pydantic_core-2.27.1-cp312-cp312-musllinux_1_1_armv7l.whl", hash = "sha256:e1f735dc43da318cad19b4173dd1ffce1d84aafd6c9b782b3abc04a0d5a6f5bb", size = 2091413, upload-time = "2024-11-22T00:22:30.984Z" }, + { url = "https://files.pythonhosted.org/packages/d8/36/eb8697729725bc610fd73940f0d860d791dc2ad557faaefcbb3edbd2b349/pydantic_core-2.27.1-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:f4e5658dbffe8843a0f12366a4c2d1c316dbe09bb4dfbdc9d2d9cd6031de8aae", size = 2154735, upload-time = "2024-11-22T00:22:32.616Z" }, + { url = "https://files.pythonhosted.org/packages/52/e5/4f0fbd5c5995cc70d3afed1b5c754055bb67908f55b5cb8000f7112749bf/pydantic_core-2.27.1-cp312-none-win32.whl", hash = "sha256:672ebbe820bb37988c4d136eca2652ee114992d5d41c7e4858cdd90ea94ffe5c", size = 1833633, upload-time = "2024-11-22T00:22:35.027Z" }, + { url = "https://files.pythonhosted.org/packages/ee/f2/c61486eee27cae5ac781305658779b4a6b45f9cc9d02c90cb21b940e82cc/pydantic_core-2.27.1-cp312-none-win_amd64.whl", hash = 
"sha256:66ff044fd0bb1768688aecbe28b6190f6e799349221fb0de0e6f4048eca14c16", size = 1986973, upload-time = "2024-11-22T00:22:37.502Z" }, + { url = "https://files.pythonhosted.org/packages/df/a6/e3f12ff25f250b02f7c51be89a294689d175ac76e1096c32bf278f29ca1e/pydantic_core-2.27.1-cp312-none-win_arm64.whl", hash = "sha256:9a3b0793b1bbfd4146304e23d90045f2a9b5fd5823aa682665fbdaf2a6c28f3e", size = 1883215, upload-time = "2024-11-22T00:22:39.186Z" }, + { url = "https://files.pythonhosted.org/packages/0f/d6/91cb99a3c59d7b072bded9959fbeab0a9613d5a4935773c0801f1764c156/pydantic_core-2.27.1-cp313-cp313-macosx_10_12_x86_64.whl", hash = "sha256:f216dbce0e60e4d03e0c4353c7023b202d95cbaeff12e5fd2e82ea0a66905073", size = 1895033, upload-time = "2024-11-22T00:22:41.087Z" }, + { url = "https://files.pythonhosted.org/packages/07/42/d35033f81a28b27dedcade9e967e8a40981a765795c9ebae2045bcef05d3/pydantic_core-2.27.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:a2e02889071850bbfd36b56fd6bc98945e23670773bc7a76657e90e6b6603c08", size = 1807542, upload-time = "2024-11-22T00:22:43.341Z" }, + { url = "https://files.pythonhosted.org/packages/41/c2/491b59e222ec7e72236e512108ecad532c7f4391a14e971c963f624f7569/pydantic_core-2.27.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:42b0e23f119b2b456d07ca91b307ae167cc3f6c846a7b169fca5326e32fdc6cf", size = 1827854, upload-time = "2024-11-22T00:22:44.96Z" }, + { url = "https://files.pythonhosted.org/packages/e3/f3/363652651779113189cefdbbb619b7b07b7a67ebb6840325117cc8cc3460/pydantic_core-2.27.1-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:764be71193f87d460a03f1f7385a82e226639732214b402f9aa61f0d025f0737", size = 1857389, upload-time = "2024-11-22T00:22:47.305Z" }, + { url = "https://files.pythonhosted.org/packages/5f/97/be804aed6b479af5a945daec7538d8bf358d668bdadde4c7888a2506bdfb/pydantic_core-2.27.1-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1c00666a3bd2f84920a4e94434f5974d7bbc57e461318d6bb34ce9cdbbc1f6b2", size = 2037934, upload-time = "2024-11-22T00:22:49.093Z" }, + { url = "https://files.pythonhosted.org/packages/42/01/295f0bd4abf58902917e342ddfe5f76cf66ffabfc57c2e23c7681a1a1197/pydantic_core-2.27.1-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:3ccaa88b24eebc0f849ce0a4d09e8a408ec5a94afff395eb69baf868f5183107", size = 2735176, upload-time = "2024-11-22T00:22:50.822Z" }, + { url = "https://files.pythonhosted.org/packages/9d/a0/cd8e9c940ead89cc37812a1a9f310fef59ba2f0b22b4e417d84ab09fa970/pydantic_core-2.27.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c65af9088ac534313e1963443d0ec360bb2b9cba6c2909478d22c2e363d98a51", size = 2160720, upload-time = "2024-11-22T00:22:52.638Z" }, + { url = "https://files.pythonhosted.org/packages/73/ae/9d0980e286627e0aeca4c352a60bd760331622c12d576e5ea4441ac7e15e/pydantic_core-2.27.1-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:206b5cf6f0c513baffaeae7bd817717140770c74528f3e4c3e1cec7871ddd61a", size = 1992972, upload-time = "2024-11-22T00:22:54.31Z" }, + { url = "https://files.pythonhosted.org/packages/bf/ba/ae4480bc0292d54b85cfb954e9d6bd226982949f8316338677d56541b85f/pydantic_core-2.27.1-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:062f60e512fc7fff8b8a9d680ff0ddaaef0193dba9fa83e679c0c5f5fbd018bc", size = 2001477, upload-time = "2024-11-22T00:22:56.451Z" }, + { url = 
"https://files.pythonhosted.org/packages/55/b7/e26adf48c2f943092ce54ae14c3c08d0d221ad34ce80b18a50de8ed2cba8/pydantic_core-2.27.1-cp313-cp313-musllinux_1_1_armv7l.whl", hash = "sha256:a0697803ed7d4af5e4c1adf1670af078f8fcab7a86350e969f454daf598c4960", size = 2091186, upload-time = "2024-11-22T00:22:58.226Z" }, + { url = "https://files.pythonhosted.org/packages/ba/cc/8491fff5b608b3862eb36e7d29d36a1af1c945463ca4c5040bf46cc73f40/pydantic_core-2.27.1-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:58ca98a950171f3151c603aeea9303ef6c235f692fe555e883591103da709b23", size = 2154429, upload-time = "2024-11-22T00:22:59.985Z" }, + { url = "https://files.pythonhosted.org/packages/78/d8/c080592d80edd3441ab7f88f865f51dae94a157fc64283c680e9f32cf6da/pydantic_core-2.27.1-cp313-none-win32.whl", hash = "sha256:8065914ff79f7eab1599bd80406681f0ad08f8e47c880f17b416c9f8f7a26d05", size = 1833713, upload-time = "2024-11-22T00:23:01.715Z" }, + { url = "https://files.pythonhosted.org/packages/83/84/5ab82a9ee2538ac95a66e51f6838d6aba6e0a03a42aa185ad2fe404a4e8f/pydantic_core-2.27.1-cp313-none-win_amd64.whl", hash = "sha256:ba630d5e3db74c79300d9a5bdaaf6200172b107f263c98a0539eeecb857b2337", size = 1987897, upload-time = "2024-11-22T00:23:03.497Z" }, + { url = "https://files.pythonhosted.org/packages/df/c3/b15fb833926d91d982fde29c0624c9f225da743c7af801dace0d4e187e71/pydantic_core-2.27.1-cp313-none-win_arm64.whl", hash = "sha256:45cf8588c066860b623cd11c4ba687f8d7175d5f7ef65f7129df8a394c502de5", size = 1882983, upload-time = "2024-11-22T00:23:05.983Z" }, + { url = "https://files.pythonhosted.org/packages/7c/60/e5eb2d462595ba1f622edbe7b1d19531e510c05c405f0b87c80c1e89d5b1/pydantic_core-2.27.1-pp310-pypy310_pp73-macosx_10_12_x86_64.whl", hash = "sha256:3fa80ac2bd5856580e242dbc202db873c60a01b20309c8319b5c5986fbe53ce6", size = 1894016, upload-time = "2024-11-22T00:24:03.815Z" }, + { url = "https://files.pythonhosted.org/packages/61/20/da7059855225038c1c4326a840908cc7ca72c7198cb6addb8b92ec81c1d6/pydantic_core-2.27.1-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:d950caa237bb1954f1b8c9227b5065ba6875ac9771bb8ec790d956a699b78676", size = 1771648, upload-time = "2024-11-22T00:24:05.981Z" }, + { url = "https://files.pythonhosted.org/packages/8f/fc/5485cf0b0bb38da31d1d292160a4d123b5977841ddc1122c671a30b76cfd/pydantic_core-2.27.1-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0e4216e64d203e39c62df627aa882f02a2438d18a5f21d7f721621f7a5d3611d", size = 1826929, upload-time = "2024-11-22T00:24:08.163Z" }, + { url = "https://files.pythonhosted.org/packages/a1/ff/fb1284a210e13a5f34c639efc54d51da136074ffbe25ec0c279cf9fbb1c4/pydantic_core-2.27.1-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:02a3d637bd387c41d46b002f0e49c52642281edacd2740e5a42f7017feea3f2c", size = 1980591, upload-time = "2024-11-22T00:24:10.291Z" }, + { url = "https://files.pythonhosted.org/packages/f1/14/77c1887a182d05af74f6aeac7b740da3a74155d3093ccc7ee10b900cc6b5/pydantic_core-2.27.1-pp310-pypy310_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:161c27ccce13b6b0c8689418da3885d3220ed2eae2ea5e9b2f7f3d48f1d52c27", size = 1981326, upload-time = "2024-11-22T00:24:13.169Z" }, + { url = "https://files.pythonhosted.org/packages/06/aa/6f1b2747f811a9c66b5ef39d7f02fbb200479784c75e98290d70004b1253/pydantic_core-2.27.1-pp310-pypy310_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:19910754e4cc9c63bc1c7f6d73aa1cfee82f42007e407c0f413695c2f7ed777f", size = 1989205, upload-time = 
"2024-11-22T00:24:16.049Z" }, + { url = "https://files.pythonhosted.org/packages/7a/d2/8ce2b074d6835f3c88d85f6d8a399790043e9fdb3d0e43455e72d19df8cc/pydantic_core-2.27.1-pp310-pypy310_pp73-musllinux_1_1_armv7l.whl", hash = "sha256:e173486019cc283dc9778315fa29a363579372fe67045e971e89b6365cc035ed", size = 2079616, upload-time = "2024-11-22T00:24:19.099Z" }, + { url = "https://files.pythonhosted.org/packages/65/71/af01033d4e58484c3db1e5d13e751ba5e3d6b87cc3368533df4c50932c8b/pydantic_core-2.27.1-pp310-pypy310_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:af52d26579b308921b73b956153066481f064875140ccd1dfd4e77db89dbb12f", size = 2133265, upload-time = "2024-11-22T00:24:21.397Z" }, + { url = "https://files.pythonhosted.org/packages/33/72/f881b5e18fbb67cf2fb4ab253660de3c6899dbb2dba409d0b757e3559e3d/pydantic_core-2.27.1-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:981fb88516bd1ae8b0cbbd2034678a39dedc98752f264ac9bc5839d3923fa04c", size = 2001864, upload-time = "2024-11-22T00:24:24.354Z" }, ] [[package]] @@ -1048,18 +1209,18 @@ dependencies = [ { name = "pydantic" }, { name = "python-dotenv" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/b5/d4/9dfbe238f45ad8b168f5c96ee49a3df0598ce18a0795a983b419949ce65b/pydantic_settings-2.6.1.tar.gz", hash = "sha256:e0f92546d8a9923cb8941689abf85d6601a8c19a23e97a34b2964a2e3f813ca0", size = 75646 } +sdist = { url = "https://files.pythonhosted.org/packages/b5/d4/9dfbe238f45ad8b168f5c96ee49a3df0598ce18a0795a983b419949ce65b/pydantic_settings-2.6.1.tar.gz", hash = "sha256:e0f92546d8a9923cb8941689abf85d6601a8c19a23e97a34b2964a2e3f813ca0", size = 75646, upload-time = "2024-11-01T11:00:05.17Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/5e/f9/ff95fd7d760af42f647ea87f9b8a383d891cdb5e5dbd4613edaeb094252a/pydantic_settings-2.6.1-py3-none-any.whl", hash = "sha256:7fb0637c786a558d3103436278a7c4f1cfd29ba8973238a50c5bb9a55387da87", size = 28595 }, + { url = "https://files.pythonhosted.org/packages/5e/f9/ff95fd7d760af42f647ea87f9b8a383d891cdb5e5dbd4613edaeb094252a/pydantic_settings-2.6.1-py3-none-any.whl", hash = "sha256:7fb0637c786a558d3103436278a7c4f1cfd29ba8973238a50c5bb9a55387da87", size = 28595, upload-time = "2024-11-01T11:00:02.64Z" }, ] [[package]] name = "pygments" version = "2.18.0" source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/8e/62/8336eff65bcbc8e4cb5d05b55faf041285951b6e80f33e2bff2024788f31/pygments-2.18.0.tar.gz", hash = "sha256:786ff802f32e91311bff3889f6e9a86e81505fe99f2735bb6d60ae0c5004f199", size = 4891905 } +sdist = { url = "https://files.pythonhosted.org/packages/8e/62/8336eff65bcbc8e4cb5d05b55faf041285951b6e80f33e2bff2024788f31/pygments-2.18.0.tar.gz", hash = "sha256:786ff802f32e91311bff3889f6e9a86e81505fe99f2735bb6d60ae0c5004f199", size = 4891905, upload-time = "2024-05-04T13:42:02.013Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/f7/3f/01c8b82017c199075f8f788d0d906b9ffbbc5a47dc9918a945e13d5a2bda/pygments-2.18.0-py3-none-any.whl", hash = "sha256:b8e6aca0523f3ab76fee51799c488e38782ac06eafcf95e7ba832985c8e7b13a", size = 1205513 }, + { url = "https://files.pythonhosted.org/packages/f7/3f/01c8b82017c199075f8f788d0d906b9ffbbc5a47dc9918a945e13d5a2bda/pygments-2.18.0-py3-none-any.whl", hash = "sha256:b8e6aca0523f3ab76fee51799c488e38782ac06eafcf95e7ba832985c8e7b13a", size = 1205513, upload-time = "2024-05-04T13:41:57.345Z" }, ] [[package]] @@ -1070,9 +1231,9 @@ dependencies = [ { name = "markdown" }, { name = "pyyaml" }, ] -sdist = { url = 
"https://files.pythonhosted.org/packages/7c/44/e6de2fdc880ad0ec7547ca2e087212be815efbc9a425a8d5ba9ede602cbb/pymdown_extensions-10.14.3.tar.gz", hash = "sha256:41e576ce3f5d650be59e900e4ceff231e0aed2a88cf30acaee41e02f063a061b", size = 846846 } +sdist = { url = "https://files.pythonhosted.org/packages/7c/44/e6de2fdc880ad0ec7547ca2e087212be815efbc9a425a8d5ba9ede602cbb/pymdown_extensions-10.14.3.tar.gz", hash = "sha256:41e576ce3f5d650be59e900e4ceff231e0aed2a88cf30acaee41e02f063a061b", size = 846846, upload-time = "2025-02-01T15:43:15.42Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/eb/f5/b9e2a42aa8f9e34d52d66de87941ecd236570c7ed2e87775ed23bbe4e224/pymdown_extensions-10.14.3-py3-none-any.whl", hash = "sha256:05e0bee73d64b9c71a4ae17c72abc2f700e8bc8403755a00580b49a4e9f189e9", size = 264467 }, + { url = "https://files.pythonhosted.org/packages/eb/f5/b9e2a42aa8f9e34d52d66de87941ecd236570c7ed2e87775ed23bbe4e224/pymdown_extensions-10.14.3-py3-none-any.whl", hash = "sha256:05e0bee73d64b9c71a4ae17c72abc2f700e8bc8403755a00580b49a4e9f189e9", size = 264467, upload-time = "2025-02-01T15:43:13.995Z" }, ] [[package]] @@ -1083,9 +1244,9 @@ dependencies = [ { name = "nodeenv" }, { name = "typing-extensions" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/11/05/4ea52a8a45cc28897edb485b4102d37cbfd5fce8445d679cdeb62bfad221/pyright-1.1.391.tar.gz", hash = "sha256:66b2d42cdf5c3cbab05f2f4b76e8bec8aa78e679bfa0b6ad7b923d9e027cadb2", size = 21965 } +sdist = { url = "https://files.pythonhosted.org/packages/11/05/4ea52a8a45cc28897edb485b4102d37cbfd5fce8445d679cdeb62bfad221/pyright-1.1.391.tar.gz", hash = "sha256:66b2d42cdf5c3cbab05f2f4b76e8bec8aa78e679bfa0b6ad7b923d9e027cadb2", size = 21965, upload-time = "2024-12-18T10:33:09.13Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/ad/89/66f49552fbeb21944c8077d11834b2201514a56fd1b7747ffff9630f1bd9/pyright-1.1.391-py3-none-any.whl", hash = "sha256:54fa186f8b3e8a55a44ebfa842636635688670c6896dcf6cf4a7fc75062f4d15", size = 18579 }, + { url = "https://files.pythonhosted.org/packages/ad/89/66f49552fbeb21944c8077d11834b2201514a56fd1b7747ffff9630f1bd9/pyright-1.1.391-py3-none-any.whl", hash = "sha256:54fa186f8b3e8a55a44ebfa842636635688670c6896dcf6cf4a7fc75062f4d15", size = 18579, upload-time = "2024-12-18T10:33:06.634Z" }, ] [[package]] @@ -1100,9 +1261,9 @@ dependencies = [ { name = "pluggy" }, { name = "tomli", marker = "python_full_version < '3.11'" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/05/35/30e0d83068951d90a01852cb1cef56e5d8a09d20c7f511634cc2f7e0372a/pytest-8.3.4.tar.gz", hash = "sha256:965370d062bce11e73868e0335abac31b4d3de0e82f4007408d242b4f8610761", size = 1445919 } +sdist = { url = "https://files.pythonhosted.org/packages/05/35/30e0d83068951d90a01852cb1cef56e5d8a09d20c7f511634cc2f7e0372a/pytest-8.3.4.tar.gz", hash = "sha256:965370d062bce11e73868e0335abac31b4d3de0e82f4007408d242b4f8610761", size = 1445919, upload-time = "2024-12-01T12:54:25.98Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/11/92/76a1c94d3afee238333bc0a42b82935dd8f9cf8ce9e336ff87ee14d9e1cf/pytest-8.3.4-py3-none-any.whl", hash = "sha256:50e16d954148559c9a74109af1eaf0c945ba2d8f30f0a3d3335edde19788b6f6", size = 343083 }, + { url = "https://files.pythonhosted.org/packages/11/92/76a1c94d3afee238333bc0a42b82935dd8f9cf8ce9e336ff87ee14d9e1cf/pytest-8.3.4-py3-none-any.whl", hash = "sha256:50e16d954148559c9a74109af1eaf0c945ba2d8f30f0a3d3335edde19788b6f6", size = 343083, upload-time = "2024-12-01T12:54:19.735Z" }, ] [[package]] 
@@ -1114,9 +1275,9 @@ dependencies = [ { name = "pytest" }, { name = "ruff" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/d2/a7/b81d5cf26e9713a2d4c8e6863ee009360c5c07a0cfb880456ec8b09adab7/pytest_examples-0.0.14.tar.gz", hash = "sha256:776d1910709c0c5ce01b29bfe3651c5312d5cfe5c063e23ca6f65aed9af23f09", size = 20767 } +sdist = { url = "https://files.pythonhosted.org/packages/d2/a7/b81d5cf26e9713a2d4c8e6863ee009360c5c07a0cfb880456ec8b09adab7/pytest_examples-0.0.14.tar.gz", hash = "sha256:776d1910709c0c5ce01b29bfe3651c5312d5cfe5c063e23ca6f65aed9af23f09", size = 20767, upload-time = "2024-11-15T20:41:05.409Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/2b/99/f418071551ff2b5e8c06bd8b82b1f4fd472b5e4162f018773ba4ef52b6e8/pytest_examples-0.0.14-py3-none-any.whl", hash = "sha256:867a7ea105635d395df712a4b8d0df3bda4c3d78ae97a57b4f115721952b5e25", size = 17919 }, + { url = "https://files.pythonhosted.org/packages/2b/99/f418071551ff2b5e8c06bd8b82b1f4fd472b5e4162f018773ba4ef52b6e8/pytest_examples-0.0.14-py3-none-any.whl", hash = "sha256:867a7ea105635d395df712a4b8d0df3bda4c3d78ae97a57b4f115721952b5e25", size = 17919, upload-time = "2024-11-15T20:41:03.484Z" }, ] [[package]] @@ -1126,9 +1287,22 @@ source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "pytest" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/ec/53/69c56a93ea057895b5761c5318455804873a6cd9d796d7c55d41c2358125/pytest-flakefinder-1.1.0.tar.gz", hash = "sha256:e2412a1920bdb8e7908783b20b3d57e9dad590cc39a93e8596ffdd493b403e0e", size = 6795 } +sdist = { url = "https://files.pythonhosted.org/packages/ec/53/69c56a93ea057895b5761c5318455804873a6cd9d796d7c55d41c2358125/pytest-flakefinder-1.1.0.tar.gz", hash = "sha256:e2412a1920bdb8e7908783b20b3d57e9dad590cc39a93e8596ffdd493b403e0e", size = 6795, upload-time = "2022-10-26T18:27:54.243Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/33/8b/06787150d0fd0cbd3a8054262b56f91631c7778c1bc91bf4637e47f909ad/pytest_flakefinder-1.1.0-py2.py3-none-any.whl", hash = "sha256:741e0e8eea427052f5b8c89c2b3c3019a50c39a59ce4df6a305a2c2d9ba2bd13", size = 4644 }, + { url = "https://files.pythonhosted.org/packages/33/8b/06787150d0fd0cbd3a8054262b56f91631c7778c1bc91bf4637e47f909ad/pytest_flakefinder-1.1.0-py2.py3-none-any.whl", hash = "sha256:741e0e8eea427052f5b8c89c2b3c3019a50c39a59ce4df6a305a2c2d9ba2bd13", size = 4644, upload-time = "2022-10-26T18:27:52.128Z" }, +] + +[[package]] +name = "pytest-pretty" +version = "1.2.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "pytest" }, + { name = "rich" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/a5/18/30ad0408295f3157f7a4913f0eaa51a0a377ebad0ffa51ff239e833c6c72/pytest_pretty-1.2.0.tar.gz", hash = "sha256:105a355f128e392860ad2c478ae173ff96d2f03044692f9818ff3d49205d3a60", size = 6542, upload-time = "2023-04-05T17:11:50.917Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/bf/fe/d44d391312c1b8abee2af58ee70fabb1c00b6577ac4e0bdf25b70c1caffb/pytest_pretty-1.2.0-py3-none-any.whl", hash = "sha256:6f79122bf53864ae2951b6c9e94d7a06a87ef753476acd4588aeac018f062036", size = 6180, upload-time = "2023-04-05T17:11:49.801Z" }, ] [[package]] @@ -1139,9 +1313,9 @@ dependencies = [ { name = "execnet" }, { name = "pytest" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/41/c4/3c310a19bc1f1e9ef50075582652673ef2bfc8cd62afef9585683821902f/pytest_xdist-3.6.1.tar.gz", hash = 
"sha256:ead156a4db231eec769737f57668ef58a2084a34b2e55c4a8fa20d861107300d", size = 84060 } +sdist = { url = "https://files.pythonhosted.org/packages/41/c4/3c310a19bc1f1e9ef50075582652673ef2bfc8cd62afef9585683821902f/pytest_xdist-3.6.1.tar.gz", hash = "sha256:ead156a4db231eec769737f57668ef58a2084a34b2e55c4a8fa20d861107300d", size = 84060, upload-time = "2024-04-28T19:29:54.414Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/6d/82/1d96bf03ee4c0fdc3c0cbe61470070e659ca78dc0086fb88b66c185e2449/pytest_xdist-3.6.1-py3-none-any.whl", hash = "sha256:9ed4adfb68a016610848639bb7e02c9352d5d9f03d04809919e2dafc3be4cca7", size = 46108 }, + { url = "https://files.pythonhosted.org/packages/6d/82/1d96bf03ee4c0fdc3c0cbe61470070e659ca78dc0086fb88b66c185e2449/pytest_xdist-3.6.1-py3-none-any.whl", hash = "sha256:9ed4adfb68a016610848639bb7e02c9352d5d9f03d04809919e2dafc3be4cca7", size = 46108, upload-time = "2024-04-28T19:29:52.813Z" }, ] [[package]] @@ -1151,62 +1325,71 @@ source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "six" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/66/c0/0c8b6ad9f17a802ee498c46e004a0eb49bc148f2fd230864601a86dcf6db/python-dateutil-2.9.0.post0.tar.gz", hash = "sha256:37dd54208da7e1cd875388217d5e00ebd4179249f90fb72437e91a35459a0ad3", size = 342432 } +sdist = { url = "https://files.pythonhosted.org/packages/66/c0/0c8b6ad9f17a802ee498c46e004a0eb49bc148f2fd230864601a86dcf6db/python-dateutil-2.9.0.post0.tar.gz", hash = "sha256:37dd54208da7e1cd875388217d5e00ebd4179249f90fb72437e91a35459a0ad3", size = 342432, upload-time = "2024-03-01T18:36:20.211Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/ec/57/56b9bcc3c9c6a792fcbaf139543cee77261f3651ca9da0c93f5c1221264b/python_dateutil-2.9.0.post0-py2.py3-none-any.whl", hash = "sha256:a8b2bc7bffae282281c8140a97d3aa9c14da0b136dfe83f850eea9a5f7470427", size = 229892 }, + { url = "https://files.pythonhosted.org/packages/ec/57/56b9bcc3c9c6a792fcbaf139543cee77261f3651ca9da0c93f5c1221264b/python_dateutil-2.9.0.post0-py2.py3-none-any.whl", hash = "sha256:a8b2bc7bffae282281c8140a97d3aa9c14da0b136dfe83f850eea9a5f7470427", size = 229892, upload-time = "2024-03-01T18:36:18.57Z" }, ] [[package]] name = "python-dotenv" version = "1.0.0" source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/31/06/1ef763af20d0572c032fa22882cfbfb005fba6e7300715a37840858c919e/python-dotenv-1.0.0.tar.gz", hash = "sha256:a8df96034aae6d2d50a4ebe8216326c61c3eb64836776504fcca410e5937a3ba", size = 37399 } +sdist = { url = "https://files.pythonhosted.org/packages/31/06/1ef763af20d0572c032fa22882cfbfb005fba6e7300715a37840858c919e/python-dotenv-1.0.0.tar.gz", hash = "sha256:a8df96034aae6d2d50a4ebe8216326c61c3eb64836776504fcca410e5937a3ba", size = 37399, upload-time = "2023-02-24T06:46:37.282Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/44/2f/62ea1c8b593f4e093cc1a7768f0d46112107e790c3e478532329e434f00b/python_dotenv-1.0.0-py3-none-any.whl", hash = "sha256:f5971a9226b701070a4bf2c38c89e5a3f0d64de8debda981d1db98583009122a", size = 19482, upload-time = "2023-02-24T06:46:36.009Z" }, +] + +[[package]] +name = "python-multipart" +version = "0.0.9" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/5c/0f/9c55ac6c84c0336e22a26fa84ca6c51d58d7ac3a2d78b0dfa8748826c883/python_multipart-0.0.9.tar.gz", hash = "sha256:03f54688c663f1b7977105f021043b0793151e4cb1c1a9d4a11fc13d622c4026", size = 31516, 
upload-time = "2024-02-10T13:32:04.684Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/44/2f/62ea1c8b593f4e093cc1a7768f0d46112107e790c3e478532329e434f00b/python_dotenv-1.0.0-py3-none-any.whl", hash = "sha256:f5971a9226b701070a4bf2c38c89e5a3f0d64de8debda981d1db98583009122a", size = 19482 }, + { url = "https://files.pythonhosted.org/packages/3d/47/444768600d9e0ebc82f8e347775d24aef8f6348cf00e9fa0e81910814e6d/python_multipart-0.0.9-py3-none-any.whl", hash = "sha256:97ca7b8ea7b05f977dc3849c3ba99d51689822fab725c3703af7c866a0c2b215", size = 22299, upload-time = "2024-02-10T13:32:02.969Z" }, ] [[package]] name = "pyyaml" version = "6.0.2" source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/54/ed/79a089b6be93607fa5cdaedf301d7dfb23af5f25c398d5ead2525b063e17/pyyaml-6.0.2.tar.gz", hash = "sha256:d584d9ec91ad65861cc08d42e834324ef890a082e591037abe114850ff7bbc3e", size = 130631 } -wheels = [ - { url = "https://files.pythonhosted.org/packages/9b/95/a3fac87cb7158e231b5a6012e438c647e1a87f09f8e0d123acec8ab8bf71/PyYAML-6.0.2-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:0a9a2848a5b7feac301353437eb7d5957887edbf81d56e903999a75a3d743086", size = 184199 }, - { url = "https://files.pythonhosted.org/packages/c7/7a/68bd47624dab8fd4afbfd3c48e3b79efe09098ae941de5b58abcbadff5cb/PyYAML-6.0.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:29717114e51c84ddfba879543fb232a6ed60086602313ca38cce623c1d62cfbf", size = 171758 }, - { url = "https://files.pythonhosted.org/packages/49/ee/14c54df452143b9ee9f0f29074d7ca5516a36edb0b4cc40c3f280131656f/PyYAML-6.0.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8824b5a04a04a047e72eea5cec3bc266db09e35de6bdfe34c9436ac5ee27d237", size = 718463 }, - { url = "https://files.pythonhosted.org/packages/4d/61/de363a97476e766574650d742205be468921a7b532aa2499fcd886b62530/PyYAML-6.0.2-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:7c36280e6fb8385e520936c3cb3b8042851904eba0e58d277dca80a5cfed590b", size = 719280 }, - { url = "https://files.pythonhosted.org/packages/6b/4e/1523cb902fd98355e2e9ea5e5eb237cbc5f3ad5f3075fa65087aa0ecb669/PyYAML-6.0.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ec031d5d2feb36d1d1a24380e4db6d43695f3748343d99434e6f5f9156aaa2ed", size = 751239 }, - { url = "https://files.pythonhosted.org/packages/b7/33/5504b3a9a4464893c32f118a9cc045190a91637b119a9c881da1cf6b7a72/PyYAML-6.0.2-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:936d68689298c36b53b29f23c6dbb74de12b4ac12ca6cfe0e047bedceea56180", size = 695802 }, - { url = "https://files.pythonhosted.org/packages/5c/20/8347dcabd41ef3a3cdc4f7b7a2aff3d06598c8779faa189cdbf878b626a4/PyYAML-6.0.2-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:23502f431948090f597378482b4812b0caae32c22213aecf3b55325e049a6c68", size = 720527 }, - { url = "https://files.pythonhosted.org/packages/be/aa/5afe99233fb360d0ff37377145a949ae258aaab831bde4792b32650a4378/PyYAML-6.0.2-cp310-cp310-win32.whl", hash = "sha256:2e99c6826ffa974fe6e27cdb5ed0021786b03fc98e5ee3c5bfe1fd5015f42b99", size = 144052 }, - { url = "https://files.pythonhosted.org/packages/b5/84/0fa4b06f6d6c958d207620fc60005e241ecedceee58931bb20138e1e5776/PyYAML-6.0.2-cp310-cp310-win_amd64.whl", hash = "sha256:a4d3091415f010369ae4ed1fc6b79def9416358877534caf6a0fdd2146c87a3e", size = 161774 }, - { url = 
"https://files.pythonhosted.org/packages/f8/aa/7af4e81f7acba21a4c6be026da38fd2b872ca46226673c89a758ebdc4fd2/PyYAML-6.0.2-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:cc1c1159b3d456576af7a3e4d1ba7e6924cb39de8f67111c735f6fc832082774", size = 184612 }, - { url = "https://files.pythonhosted.org/packages/8b/62/b9faa998fd185f65c1371643678e4d58254add437edb764a08c5a98fb986/PyYAML-6.0.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:1e2120ef853f59c7419231f3bf4e7021f1b936f6ebd222406c3b60212205d2ee", size = 172040 }, - { url = "https://files.pythonhosted.org/packages/ad/0c/c804f5f922a9a6563bab712d8dcc70251e8af811fce4524d57c2c0fd49a4/PyYAML-6.0.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5d225db5a45f21e78dd9358e58a98702a0302f2659a3c6cd320564b75b86f47c", size = 736829 }, - { url = "https://files.pythonhosted.org/packages/51/16/6af8d6a6b210c8e54f1406a6b9481febf9c64a3109c541567e35a49aa2e7/PyYAML-6.0.2-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:5ac9328ec4831237bec75defaf839f7d4564be1e6b25ac710bd1a96321cc8317", size = 764167 }, - { url = "https://files.pythonhosted.org/packages/75/e4/2c27590dfc9992f73aabbeb9241ae20220bd9452df27483b6e56d3975cc5/PyYAML-6.0.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3ad2a3decf9aaba3d29c8f537ac4b243e36bef957511b4766cb0057d32b0be85", size = 762952 }, - { url = "https://files.pythonhosted.org/packages/9b/97/ecc1abf4a823f5ac61941a9c00fe501b02ac3ab0e373c3857f7d4b83e2b6/PyYAML-6.0.2-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:ff3824dc5261f50c9b0dfb3be22b4567a6f938ccce4587b38952d85fd9e9afe4", size = 735301 }, - { url = "https://files.pythonhosted.org/packages/45/73/0f49dacd6e82c9430e46f4a027baa4ca205e8b0a9dce1397f44edc23559d/PyYAML-6.0.2-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:797b4f722ffa07cc8d62053e4cff1486fa6dc094105d13fea7b1de7d8bf71c9e", size = 756638 }, - { url = "https://files.pythonhosted.org/packages/22/5f/956f0f9fc65223a58fbc14459bf34b4cc48dec52e00535c79b8db361aabd/PyYAML-6.0.2-cp311-cp311-win32.whl", hash = "sha256:11d8f3dd2b9c1207dcaf2ee0bbbfd5991f571186ec9cc78427ba5bd32afae4b5", size = 143850 }, - { url = "https://files.pythonhosted.org/packages/ed/23/8da0bbe2ab9dcdd11f4f4557ccaf95c10b9811b13ecced089d43ce59c3c8/PyYAML-6.0.2-cp311-cp311-win_amd64.whl", hash = "sha256:e10ce637b18caea04431ce14fabcf5c64a1c61ec9c56b071a4b7ca131ca52d44", size = 161980 }, - { url = "https://files.pythonhosted.org/packages/86/0c/c581167fc46d6d6d7ddcfb8c843a4de25bdd27e4466938109ca68492292c/PyYAML-6.0.2-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:c70c95198c015b85feafc136515252a261a84561b7b1d51e3384e0655ddf25ab", size = 183873 }, - { url = "https://files.pythonhosted.org/packages/a8/0c/38374f5bb272c051e2a69281d71cba6fdb983413e6758b84482905e29a5d/PyYAML-6.0.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:ce826d6ef20b1bc864f0a68340c8b3287705cae2f8b4b1d932177dcc76721725", size = 173302 }, - { url = "https://files.pythonhosted.org/packages/c3/93/9916574aa8c00aa06bbac729972eb1071d002b8e158bd0e83a3b9a20a1f7/PyYAML-6.0.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1f71ea527786de97d1a0cc0eacd1defc0985dcf6b3f17bb77dcfc8c34bec4dc5", size = 739154 }, - { url = "https://files.pythonhosted.org/packages/95/0f/b8938f1cbd09739c6da569d172531567dbcc9789e0029aa070856f123984/PyYAML-6.0.2-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:9b22676e8097e9e22e36d6b7bda33190d0d400f345f23d4065d48f4ca7ae0425", size = 
766223 }, - { url = "https://files.pythonhosted.org/packages/b9/2b/614b4752f2e127db5cc206abc23a8c19678e92b23c3db30fc86ab731d3bd/PyYAML-6.0.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:80bab7bfc629882493af4aa31a4cfa43a4c57c83813253626916b8c7ada83476", size = 767542 }, - { url = "https://files.pythonhosted.org/packages/d4/00/dd137d5bcc7efea1836d6264f049359861cf548469d18da90cd8216cf05f/PyYAML-6.0.2-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:0833f8694549e586547b576dcfaba4a6b55b9e96098b36cdc7ebefe667dfed48", size = 731164 }, - { url = "https://files.pythonhosted.org/packages/c9/1f/4f998c900485e5c0ef43838363ba4a9723ac0ad73a9dc42068b12aaba4e4/PyYAML-6.0.2-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:8b9c7197f7cb2738065c481a0461e50ad02f18c78cd75775628afb4d7137fb3b", size = 756611 }, - { url = "https://files.pythonhosted.org/packages/df/d1/f5a275fdb252768b7a11ec63585bc38d0e87c9e05668a139fea92b80634c/PyYAML-6.0.2-cp312-cp312-win32.whl", hash = "sha256:ef6107725bd54b262d6dedcc2af448a266975032bc85ef0172c5f059da6325b4", size = 140591 }, - { url = "https://files.pythonhosted.org/packages/0c/e8/4f648c598b17c3d06e8753d7d13d57542b30d56e6c2dedf9c331ae56312e/PyYAML-6.0.2-cp312-cp312-win_amd64.whl", hash = "sha256:7e7401d0de89a9a855c839bc697c079a4af81cf878373abd7dc625847d25cbd8", size = 156338 }, - { url = "https://files.pythonhosted.org/packages/ef/e3/3af305b830494fa85d95f6d95ef7fa73f2ee1cc8ef5b495c7c3269fb835f/PyYAML-6.0.2-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:efdca5630322a10774e8e98e1af481aad470dd62c3170801852d752aa7a783ba", size = 181309 }, - { url = "https://files.pythonhosted.org/packages/45/9f/3b1c20a0b7a3200524eb0076cc027a970d320bd3a6592873c85c92a08731/PyYAML-6.0.2-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:50187695423ffe49e2deacb8cd10510bc361faac997de9efef88badc3bb9e2d1", size = 171679 }, - { url = "https://files.pythonhosted.org/packages/7c/9a/337322f27005c33bcb656c655fa78325b730324c78620e8328ae28b64d0c/PyYAML-6.0.2-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0ffe8360bab4910ef1b9e87fb812d8bc0a308b0d0eef8c8f44e0254ab3b07133", size = 733428 }, - { url = "https://files.pythonhosted.org/packages/a3/69/864fbe19e6c18ea3cc196cbe5d392175b4cf3d5d0ac1403ec3f2d237ebb5/PyYAML-6.0.2-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:17e311b6c678207928d649faa7cb0d7b4c26a0ba73d41e99c4fff6b6c3276484", size = 763361 }, - { url = "https://files.pythonhosted.org/packages/04/24/b7721e4845c2f162d26f50521b825fb061bc0a5afcf9a386840f23ea19fa/PyYAML-6.0.2-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:70b189594dbe54f75ab3a1acec5f1e3faa7e8cf2f1e08d9b561cb41b845f69d5", size = 759523 }, - { url = "https://files.pythonhosted.org/packages/2b/b2/e3234f59ba06559c6ff63c4e10baea10e5e7df868092bf9ab40e5b9c56b6/PyYAML-6.0.2-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:41e4e3953a79407c794916fa277a82531dd93aad34e29c2a514c2c0c5fe971cc", size = 726660 }, - { url = "https://files.pythonhosted.org/packages/fe/0f/25911a9f080464c59fab9027482f822b86bf0608957a5fcc6eaac85aa515/PyYAML-6.0.2-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:68ccc6023a3400877818152ad9a1033e3db8625d899c72eacb5a668902e4d652", size = 751597 }, - { url = "https://files.pythonhosted.org/packages/14/0d/e2c3b43bbce3cf6bd97c840b46088a3031085179e596d4929729d8d68270/PyYAML-6.0.2-cp313-cp313-win32.whl", hash = "sha256:bc2fa7c6b47d6bc618dd7fb02ef6fdedb1090ec036abab80d4681424b84c1183", size = 140527 }, - 
{ url = "https://files.pythonhosted.org/packages/fa/de/02b54f42487e3d3c6efb3f89428677074ca7bf43aae402517bc7cca949f3/PyYAML-6.0.2-cp313-cp313-win_amd64.whl", hash = "sha256:8388ee1976c416731879ac16da0aff3f63b286ffdd57cdeb95f3f2e085687563", size = 156446 }, +sdist = { url = "https://files.pythonhosted.org/packages/54/ed/79a089b6be93607fa5cdaedf301d7dfb23af5f25c398d5ead2525b063e17/pyyaml-6.0.2.tar.gz", hash = "sha256:d584d9ec91ad65861cc08d42e834324ef890a082e591037abe114850ff7bbc3e", size = 130631, upload-time = "2024-08-06T20:33:50.674Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/9b/95/a3fac87cb7158e231b5a6012e438c647e1a87f09f8e0d123acec8ab8bf71/PyYAML-6.0.2-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:0a9a2848a5b7feac301353437eb7d5957887edbf81d56e903999a75a3d743086", size = 184199, upload-time = "2024-08-06T20:31:40.178Z" }, + { url = "https://files.pythonhosted.org/packages/c7/7a/68bd47624dab8fd4afbfd3c48e3b79efe09098ae941de5b58abcbadff5cb/PyYAML-6.0.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:29717114e51c84ddfba879543fb232a6ed60086602313ca38cce623c1d62cfbf", size = 171758, upload-time = "2024-08-06T20:31:42.173Z" }, + { url = "https://files.pythonhosted.org/packages/49/ee/14c54df452143b9ee9f0f29074d7ca5516a36edb0b4cc40c3f280131656f/PyYAML-6.0.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8824b5a04a04a047e72eea5cec3bc266db09e35de6bdfe34c9436ac5ee27d237", size = 718463, upload-time = "2024-08-06T20:31:44.263Z" }, + { url = "https://files.pythonhosted.org/packages/4d/61/de363a97476e766574650d742205be468921a7b532aa2499fcd886b62530/PyYAML-6.0.2-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:7c36280e6fb8385e520936c3cb3b8042851904eba0e58d277dca80a5cfed590b", size = 719280, upload-time = "2024-08-06T20:31:50.199Z" }, + { url = "https://files.pythonhosted.org/packages/6b/4e/1523cb902fd98355e2e9ea5e5eb237cbc5f3ad5f3075fa65087aa0ecb669/PyYAML-6.0.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ec031d5d2feb36d1d1a24380e4db6d43695f3748343d99434e6f5f9156aaa2ed", size = 751239, upload-time = "2024-08-06T20:31:52.292Z" }, + { url = "https://files.pythonhosted.org/packages/b7/33/5504b3a9a4464893c32f118a9cc045190a91637b119a9c881da1cf6b7a72/PyYAML-6.0.2-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:936d68689298c36b53b29f23c6dbb74de12b4ac12ca6cfe0e047bedceea56180", size = 695802, upload-time = "2024-08-06T20:31:53.836Z" }, + { url = "https://files.pythonhosted.org/packages/5c/20/8347dcabd41ef3a3cdc4f7b7a2aff3d06598c8779faa189cdbf878b626a4/PyYAML-6.0.2-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:23502f431948090f597378482b4812b0caae32c22213aecf3b55325e049a6c68", size = 720527, upload-time = "2024-08-06T20:31:55.565Z" }, + { url = "https://files.pythonhosted.org/packages/be/aa/5afe99233fb360d0ff37377145a949ae258aaab831bde4792b32650a4378/PyYAML-6.0.2-cp310-cp310-win32.whl", hash = "sha256:2e99c6826ffa974fe6e27cdb5ed0021786b03fc98e5ee3c5bfe1fd5015f42b99", size = 144052, upload-time = "2024-08-06T20:31:56.914Z" }, + { url = "https://files.pythonhosted.org/packages/b5/84/0fa4b06f6d6c958d207620fc60005e241ecedceee58931bb20138e1e5776/PyYAML-6.0.2-cp310-cp310-win_amd64.whl", hash = "sha256:a4d3091415f010369ae4ed1fc6b79def9416358877534caf6a0fdd2146c87a3e", size = 161774, upload-time = "2024-08-06T20:31:58.304Z" }, + { url = 
"https://files.pythonhosted.org/packages/f8/aa/7af4e81f7acba21a4c6be026da38fd2b872ca46226673c89a758ebdc4fd2/PyYAML-6.0.2-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:cc1c1159b3d456576af7a3e4d1ba7e6924cb39de8f67111c735f6fc832082774", size = 184612, upload-time = "2024-08-06T20:32:03.408Z" }, + { url = "https://files.pythonhosted.org/packages/8b/62/b9faa998fd185f65c1371643678e4d58254add437edb764a08c5a98fb986/PyYAML-6.0.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:1e2120ef853f59c7419231f3bf4e7021f1b936f6ebd222406c3b60212205d2ee", size = 172040, upload-time = "2024-08-06T20:32:04.926Z" }, + { url = "https://files.pythonhosted.org/packages/ad/0c/c804f5f922a9a6563bab712d8dcc70251e8af811fce4524d57c2c0fd49a4/PyYAML-6.0.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5d225db5a45f21e78dd9358e58a98702a0302f2659a3c6cd320564b75b86f47c", size = 736829, upload-time = "2024-08-06T20:32:06.459Z" }, + { url = "https://files.pythonhosted.org/packages/51/16/6af8d6a6b210c8e54f1406a6b9481febf9c64a3109c541567e35a49aa2e7/PyYAML-6.0.2-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:5ac9328ec4831237bec75defaf839f7d4564be1e6b25ac710bd1a96321cc8317", size = 764167, upload-time = "2024-08-06T20:32:08.338Z" }, + { url = "https://files.pythonhosted.org/packages/75/e4/2c27590dfc9992f73aabbeb9241ae20220bd9452df27483b6e56d3975cc5/PyYAML-6.0.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3ad2a3decf9aaba3d29c8f537ac4b243e36bef957511b4766cb0057d32b0be85", size = 762952, upload-time = "2024-08-06T20:32:14.124Z" }, + { url = "https://files.pythonhosted.org/packages/9b/97/ecc1abf4a823f5ac61941a9c00fe501b02ac3ab0e373c3857f7d4b83e2b6/PyYAML-6.0.2-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:ff3824dc5261f50c9b0dfb3be22b4567a6f938ccce4587b38952d85fd9e9afe4", size = 735301, upload-time = "2024-08-06T20:32:16.17Z" }, + { url = "https://files.pythonhosted.org/packages/45/73/0f49dacd6e82c9430e46f4a027baa4ca205e8b0a9dce1397f44edc23559d/PyYAML-6.0.2-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:797b4f722ffa07cc8d62053e4cff1486fa6dc094105d13fea7b1de7d8bf71c9e", size = 756638, upload-time = "2024-08-06T20:32:18.555Z" }, + { url = "https://files.pythonhosted.org/packages/22/5f/956f0f9fc65223a58fbc14459bf34b4cc48dec52e00535c79b8db361aabd/PyYAML-6.0.2-cp311-cp311-win32.whl", hash = "sha256:11d8f3dd2b9c1207dcaf2ee0bbbfd5991f571186ec9cc78427ba5bd32afae4b5", size = 143850, upload-time = "2024-08-06T20:32:19.889Z" }, + { url = "https://files.pythonhosted.org/packages/ed/23/8da0bbe2ab9dcdd11f4f4557ccaf95c10b9811b13ecced089d43ce59c3c8/PyYAML-6.0.2-cp311-cp311-win_amd64.whl", hash = "sha256:e10ce637b18caea04431ce14fabcf5c64a1c61ec9c56b071a4b7ca131ca52d44", size = 161980, upload-time = "2024-08-06T20:32:21.273Z" }, + { url = "https://files.pythonhosted.org/packages/86/0c/c581167fc46d6d6d7ddcfb8c843a4de25bdd27e4466938109ca68492292c/PyYAML-6.0.2-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:c70c95198c015b85feafc136515252a261a84561b7b1d51e3384e0655ddf25ab", size = 183873, upload-time = "2024-08-06T20:32:25.131Z" }, + { url = "https://files.pythonhosted.org/packages/a8/0c/38374f5bb272c051e2a69281d71cba6fdb983413e6758b84482905e29a5d/PyYAML-6.0.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:ce826d6ef20b1bc864f0a68340c8b3287705cae2f8b4b1d932177dcc76721725", size = 173302, upload-time = "2024-08-06T20:32:26.511Z" }, + { url = 
"https://files.pythonhosted.org/packages/c3/93/9916574aa8c00aa06bbac729972eb1071d002b8e158bd0e83a3b9a20a1f7/PyYAML-6.0.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1f71ea527786de97d1a0cc0eacd1defc0985dcf6b3f17bb77dcfc8c34bec4dc5", size = 739154, upload-time = "2024-08-06T20:32:28.363Z" }, + { url = "https://files.pythonhosted.org/packages/95/0f/b8938f1cbd09739c6da569d172531567dbcc9789e0029aa070856f123984/PyYAML-6.0.2-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:9b22676e8097e9e22e36d6b7bda33190d0d400f345f23d4065d48f4ca7ae0425", size = 766223, upload-time = "2024-08-06T20:32:30.058Z" }, + { url = "https://files.pythonhosted.org/packages/b9/2b/614b4752f2e127db5cc206abc23a8c19678e92b23c3db30fc86ab731d3bd/PyYAML-6.0.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:80bab7bfc629882493af4aa31a4cfa43a4c57c83813253626916b8c7ada83476", size = 767542, upload-time = "2024-08-06T20:32:31.881Z" }, + { url = "https://files.pythonhosted.org/packages/d4/00/dd137d5bcc7efea1836d6264f049359861cf548469d18da90cd8216cf05f/PyYAML-6.0.2-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:0833f8694549e586547b576dcfaba4a6b55b9e96098b36cdc7ebefe667dfed48", size = 731164, upload-time = "2024-08-06T20:32:37.083Z" }, + { url = "https://files.pythonhosted.org/packages/c9/1f/4f998c900485e5c0ef43838363ba4a9723ac0ad73a9dc42068b12aaba4e4/PyYAML-6.0.2-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:8b9c7197f7cb2738065c481a0461e50ad02f18c78cd75775628afb4d7137fb3b", size = 756611, upload-time = "2024-08-06T20:32:38.898Z" }, + { url = "https://files.pythonhosted.org/packages/df/d1/f5a275fdb252768b7a11ec63585bc38d0e87c9e05668a139fea92b80634c/PyYAML-6.0.2-cp312-cp312-win32.whl", hash = "sha256:ef6107725bd54b262d6dedcc2af448a266975032bc85ef0172c5f059da6325b4", size = 140591, upload-time = "2024-08-06T20:32:40.241Z" }, + { url = "https://files.pythonhosted.org/packages/0c/e8/4f648c598b17c3d06e8753d7d13d57542b30d56e6c2dedf9c331ae56312e/PyYAML-6.0.2-cp312-cp312-win_amd64.whl", hash = "sha256:7e7401d0de89a9a855c839bc697c079a4af81cf878373abd7dc625847d25cbd8", size = 156338, upload-time = "2024-08-06T20:32:41.93Z" }, + { url = "https://files.pythonhosted.org/packages/ef/e3/3af305b830494fa85d95f6d95ef7fa73f2ee1cc8ef5b495c7c3269fb835f/PyYAML-6.0.2-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:efdca5630322a10774e8e98e1af481aad470dd62c3170801852d752aa7a783ba", size = 181309, upload-time = "2024-08-06T20:32:43.4Z" }, + { url = "https://files.pythonhosted.org/packages/45/9f/3b1c20a0b7a3200524eb0076cc027a970d320bd3a6592873c85c92a08731/PyYAML-6.0.2-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:50187695423ffe49e2deacb8cd10510bc361faac997de9efef88badc3bb9e2d1", size = 171679, upload-time = "2024-08-06T20:32:44.801Z" }, + { url = "https://files.pythonhosted.org/packages/7c/9a/337322f27005c33bcb656c655fa78325b730324c78620e8328ae28b64d0c/PyYAML-6.0.2-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0ffe8360bab4910ef1b9e87fb812d8bc0a308b0d0eef8c8f44e0254ab3b07133", size = 733428, upload-time = "2024-08-06T20:32:46.432Z" }, + { url = "https://files.pythonhosted.org/packages/a3/69/864fbe19e6c18ea3cc196cbe5d392175b4cf3d5d0ac1403ec3f2d237ebb5/PyYAML-6.0.2-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:17e311b6c678207928d649faa7cb0d7b4c26a0ba73d41e99c4fff6b6c3276484", size = 763361, upload-time = "2024-08-06T20:32:51.188Z" }, + { url = 
"https://files.pythonhosted.org/packages/04/24/b7721e4845c2f162d26f50521b825fb061bc0a5afcf9a386840f23ea19fa/PyYAML-6.0.2-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:70b189594dbe54f75ab3a1acec5f1e3faa7e8cf2f1e08d9b561cb41b845f69d5", size = 759523, upload-time = "2024-08-06T20:32:53.019Z" }, + { url = "https://files.pythonhosted.org/packages/2b/b2/e3234f59ba06559c6ff63c4e10baea10e5e7df868092bf9ab40e5b9c56b6/PyYAML-6.0.2-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:41e4e3953a79407c794916fa277a82531dd93aad34e29c2a514c2c0c5fe971cc", size = 726660, upload-time = "2024-08-06T20:32:54.708Z" }, + { url = "https://files.pythonhosted.org/packages/fe/0f/25911a9f080464c59fab9027482f822b86bf0608957a5fcc6eaac85aa515/PyYAML-6.0.2-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:68ccc6023a3400877818152ad9a1033e3db8625d899c72eacb5a668902e4d652", size = 751597, upload-time = "2024-08-06T20:32:56.985Z" }, + { url = "https://files.pythonhosted.org/packages/14/0d/e2c3b43bbce3cf6bd97c840b46088a3031085179e596d4929729d8d68270/PyYAML-6.0.2-cp313-cp313-win32.whl", hash = "sha256:bc2fa7c6b47d6bc618dd7fb02ef6fdedb1090ec036abab80d4681424b84c1183", size = 140527, upload-time = "2024-08-06T20:33:03.001Z" }, + { url = "https://files.pythonhosted.org/packages/fa/de/02b54f42487e3d3c6efb3f89428677074ca7bf43aae402517bc7cca949f3/PyYAML-6.0.2-cp313-cp313-win_amd64.whl", hash = "sha256:8388ee1976c416731879ac16da0aff3f63b286ffdd57cdeb95f3f2e085687563", size = 156446, upload-time = "2024-08-06T20:33:04.33Z" }, ] [[package]] @@ -1216,78 +1399,78 @@ source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "pyyaml" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/fb/8e/da1c6c58f751b70f8ceb1eb25bc25d524e8f14fe16edcce3f4e3ba08629c/pyyaml_env_tag-0.1.tar.gz", hash = "sha256:70092675bda14fdec33b31ba77e7543de9ddc88f2e5b99160396572d11525bdb", size = 5631 } +sdist = { url = "https://files.pythonhosted.org/packages/fb/8e/da1c6c58f751b70f8ceb1eb25bc25d524e8f14fe16edcce3f4e3ba08629c/pyyaml_env_tag-0.1.tar.gz", hash = "sha256:70092675bda14fdec33b31ba77e7543de9ddc88f2e5b99160396572d11525bdb", size = 5631, upload-time = "2020-11-12T02:38:26.239Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/5a/66/bbb1dd374f5c870f59c5bb1db0e18cbe7fa739415a24cbd95b2d1f5ae0c4/pyyaml_env_tag-0.1-py3-none-any.whl", hash = "sha256:af31106dec8a4d68c60207c1886031cbf839b68aa7abccdb19868200532c2069", size = 3911 }, + { url = "https://files.pythonhosted.org/packages/5a/66/bbb1dd374f5c870f59c5bb1db0e18cbe7fa739415a24cbd95b2d1f5ae0c4/pyyaml_env_tag-0.1-py3-none-any.whl", hash = "sha256:af31106dec8a4d68c60207c1886031cbf839b68aa7abccdb19868200532c2069", size = 3911, upload-time = "2020-11-12T02:38:24.638Z" }, ] [[package]] name = "regex" version = "2024.11.6" source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/8e/5f/bd69653fbfb76cf8604468d3b4ec4c403197144c7bfe0e6a5fc9e02a07cb/regex-2024.11.6.tar.gz", hash = "sha256:7ab159b063c52a0333c884e4679f8d7a85112ee3078fe3d9004b2dd875585519", size = 399494 } -wheels = [ - { url = "https://files.pythonhosted.org/packages/95/3c/4651f6b130c6842a8f3df82461a8950f923925db8b6961063e82744bddcc/regex-2024.11.6-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:ff590880083d60acc0433f9c3f713c51f7ac6ebb9adf889c79a261ecf541aa91", size = 482674 }, - { url = 
"https://files.pythonhosted.org/packages/15/51/9f35d12da8434b489c7b7bffc205c474a0a9432a889457026e9bc06a297a/regex-2024.11.6-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:658f90550f38270639e83ce492f27d2c8d2cd63805c65a13a14d36ca126753f0", size = 287684 }, - { url = "https://files.pythonhosted.org/packages/bd/18/b731f5510d1b8fb63c6b6d3484bfa9a59b84cc578ac8b5172970e05ae07c/regex-2024.11.6-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:164d8b7b3b4bcb2068b97428060b2a53be050085ef94eca7f240e7947f1b080e", size = 284589 }, - { url = "https://files.pythonhosted.org/packages/78/a2/6dd36e16341ab95e4c6073426561b9bfdeb1a9c9b63ab1b579c2e96cb105/regex-2024.11.6-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d3660c82f209655a06b587d55e723f0b813d3a7db2e32e5e7dc64ac2a9e86fde", size = 782511 }, - { url = "https://files.pythonhosted.org/packages/1b/2b/323e72d5d2fd8de0d9baa443e1ed70363ed7e7b2fb526f5950c5cb99c364/regex-2024.11.6-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:d22326fcdef5e08c154280b71163ced384b428343ae16a5ab2b3354aed12436e", size = 821149 }, - { url = "https://files.pythonhosted.org/packages/90/30/63373b9ea468fbef8a907fd273e5c329b8c9535fee36fc8dba5fecac475d/regex-2024.11.6-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:f1ac758ef6aebfc8943560194e9fd0fa18bcb34d89fd8bd2af18183afd8da3a2", size = 809707 }, - { url = "https://files.pythonhosted.org/packages/f2/98/26d3830875b53071f1f0ae6d547f1d98e964dd29ad35cbf94439120bb67a/regex-2024.11.6-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:997d6a487ff00807ba810e0f8332c18b4eb8d29463cfb7c820dc4b6e7562d0cf", size = 781702 }, - { url = "https://files.pythonhosted.org/packages/87/55/eb2a068334274db86208ab9d5599ffa63631b9f0f67ed70ea7c82a69bbc8/regex-2024.11.6-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:02a02d2bb04fec86ad61f3ea7f49c015a0681bf76abb9857f945d26159d2968c", size = 771976 }, - { url = "https://files.pythonhosted.org/packages/74/c0/be707bcfe98254d8f9d2cff55d216e946f4ea48ad2fd8cf1428f8c5332ba/regex-2024.11.6-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:f02f93b92358ee3f78660e43b4b0091229260c5d5c408d17d60bf26b6c900e86", size = 697397 }, - { url = "https://files.pythonhosted.org/packages/49/dc/bb45572ceb49e0f6509f7596e4ba7031f6819ecb26bc7610979af5a77f45/regex-2024.11.6-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:06eb1be98df10e81ebaded73fcd51989dcf534e3c753466e4b60c4697a003b67", size = 768726 }, - { url = "https://files.pythonhosted.org/packages/5a/db/f43fd75dc4c0c2d96d0881967897926942e935d700863666f3c844a72ce6/regex-2024.11.6-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:040df6fe1a5504eb0f04f048e6d09cd7c7110fef851d7c567a6b6e09942feb7d", size = 775098 }, - { url = "https://files.pythonhosted.org/packages/99/d7/f94154db29ab5a89d69ff893159b19ada89e76b915c1293e98603d39838c/regex-2024.11.6-cp310-cp310-musllinux_1_2_ppc64le.whl", hash = "sha256:fdabbfc59f2c6edba2a6622c647b716e34e8e3867e0ab975412c5c2f79b82da2", size = 839325 }, - { url = "https://files.pythonhosted.org/packages/f7/17/3cbfab1f23356fbbf07708220ab438a7efa1e0f34195bf857433f79f1788/regex-2024.11.6-cp310-cp310-musllinux_1_2_s390x.whl", hash = "sha256:8447d2d39b5abe381419319f942de20b7ecd60ce86f16a23b0698f22e1b70008", size = 843277 }, - { url = 
"https://files.pythonhosted.org/packages/7e/f2/48b393b51900456155de3ad001900f94298965e1cad1c772b87f9cfea011/regex-2024.11.6-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:da8f5fc57d1933de22a9e23eec290a0d8a5927a5370d24bda9a6abe50683fe62", size = 773197 }, - { url = "https://files.pythonhosted.org/packages/45/3f/ef9589aba93e084cd3f8471fded352826dcae8489b650d0b9b27bc5bba8a/regex-2024.11.6-cp310-cp310-win32.whl", hash = "sha256:b489578720afb782f6ccf2840920f3a32e31ba28a4b162e13900c3e6bd3f930e", size = 261714 }, - { url = "https://files.pythonhosted.org/packages/42/7e/5f1b92c8468290c465fd50c5318da64319133231415a8aa6ea5ab995a815/regex-2024.11.6-cp310-cp310-win_amd64.whl", hash = "sha256:5071b2093e793357c9d8b2929dfc13ac5f0a6c650559503bb81189d0a3814519", size = 274042 }, - { url = "https://files.pythonhosted.org/packages/58/58/7e4d9493a66c88a7da6d205768119f51af0f684fe7be7bac8328e217a52c/regex-2024.11.6-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:5478c6962ad548b54a591778e93cd7c456a7a29f8eca9c49e4f9a806dcc5d638", size = 482669 }, - { url = "https://files.pythonhosted.org/packages/34/4c/8f8e631fcdc2ff978609eaeef1d6994bf2f028b59d9ac67640ed051f1218/regex-2024.11.6-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:2c89a8cc122b25ce6945f0423dc1352cb9593c68abd19223eebbd4e56612c5b7", size = 287684 }, - { url = "https://files.pythonhosted.org/packages/c5/1b/f0e4d13e6adf866ce9b069e191f303a30ab1277e037037a365c3aad5cc9c/regex-2024.11.6-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:94d87b689cdd831934fa3ce16cc15cd65748e6d689f5d2b8f4f4df2065c9fa20", size = 284589 }, - { url = "https://files.pythonhosted.org/packages/25/4d/ab21047f446693887f25510887e6820b93f791992994f6498b0318904d4a/regex-2024.11.6-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1062b39a0a2b75a9c694f7a08e7183a80c63c0d62b301418ffd9c35f55aaa114", size = 792121 }, - { url = "https://files.pythonhosted.org/packages/45/ee/c867e15cd894985cb32b731d89576c41a4642a57850c162490ea34b78c3b/regex-2024.11.6-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:167ed4852351d8a750da48712c3930b031f6efdaa0f22fa1933716bfcd6bf4a3", size = 831275 }, - { url = "https://files.pythonhosted.org/packages/b3/12/b0f480726cf1c60f6536fa5e1c95275a77624f3ac8fdccf79e6727499e28/regex-2024.11.6-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:2d548dafee61f06ebdb584080621f3e0c23fff312f0de1afc776e2a2ba99a74f", size = 818257 }, - { url = "https://files.pythonhosted.org/packages/bf/ce/0d0e61429f603bac433910d99ef1a02ce45a8967ffbe3cbee48599e62d88/regex-2024.11.6-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f2a19f302cd1ce5dd01a9099aaa19cae6173306d1302a43b627f62e21cf18ac0", size = 792727 }, - { url = "https://files.pythonhosted.org/packages/e4/c1/243c83c53d4a419c1556f43777ccb552bccdf79d08fda3980e4e77dd9137/regex-2024.11.6-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:bec9931dfb61ddd8ef2ebc05646293812cb6b16b60cf7c9511a832b6f1854b55", size = 780667 }, - { url = "https://files.pythonhosted.org/packages/c5/f4/75eb0dd4ce4b37f04928987f1d22547ddaf6c4bae697623c1b05da67a8aa/regex-2024.11.6-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:9714398225f299aa85267fd222f7142fcb5c769e73d7733344efc46f2ef5cf89", size = 776963 }, - { url = "https://files.pythonhosted.org/packages/16/5d/95c568574e630e141a69ff8a254c2f188b4398e813c40d49228c9bbd9875/regex-2024.11.6-cp311-cp311-musllinux_1_2_i686.whl", hash = 
"sha256:202eb32e89f60fc147a41e55cb086db2a3f8cb82f9a9a88440dcfc5d37faae8d", size = 784700 }, - { url = "https://files.pythonhosted.org/packages/8e/b5/f8495c7917f15cc6fee1e7f395e324ec3e00ab3c665a7dc9d27562fd5290/regex-2024.11.6-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:4181b814e56078e9b00427ca358ec44333765f5ca1b45597ec7446d3a1ef6e34", size = 848592 }, - { url = "https://files.pythonhosted.org/packages/1c/80/6dd7118e8cb212c3c60b191b932dc57db93fb2e36fb9e0e92f72a5909af9/regex-2024.11.6-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:068376da5a7e4da51968ce4c122a7cd31afaaec4fccc7856c92f63876e57b51d", size = 852929 }, - { url = "https://files.pythonhosted.org/packages/11/9b/5a05d2040297d2d254baf95eeeb6df83554e5e1df03bc1a6687fc4ba1f66/regex-2024.11.6-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:ac10f2c4184420d881a3475fb2c6f4d95d53a8d50209a2500723d831036f7c45", size = 781213 }, - { url = "https://files.pythonhosted.org/packages/26/b7/b14e2440156ab39e0177506c08c18accaf2b8932e39fb092074de733d868/regex-2024.11.6-cp311-cp311-win32.whl", hash = "sha256:c36f9b6f5f8649bb251a5f3f66564438977b7ef8386a52460ae77e6070d309d9", size = 261734 }, - { url = "https://files.pythonhosted.org/packages/80/32/763a6cc01d21fb3819227a1cc3f60fd251c13c37c27a73b8ff4315433a8e/regex-2024.11.6-cp311-cp311-win_amd64.whl", hash = "sha256:02e28184be537f0e75c1f9b2f8847dc51e08e6e171c6bde130b2687e0c33cf60", size = 274052 }, - { url = "https://files.pythonhosted.org/packages/ba/30/9a87ce8336b172cc232a0db89a3af97929d06c11ceaa19d97d84fa90a8f8/regex-2024.11.6-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:52fb28f528778f184f870b7cf8f225f5eef0a8f6e3778529bdd40c7b3920796a", size = 483781 }, - { url = "https://files.pythonhosted.org/packages/01/e8/00008ad4ff4be8b1844786ba6636035f7ef926db5686e4c0f98093612add/regex-2024.11.6-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:fdd6028445d2460f33136c55eeb1f601ab06d74cb3347132e1c24250187500d9", size = 288455 }, - { url = "https://files.pythonhosted.org/packages/60/85/cebcc0aff603ea0a201667b203f13ba75d9fc8668fab917ac5b2de3967bc/regex-2024.11.6-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:805e6b60c54bf766b251e94526ebad60b7de0c70f70a4e6210ee2891acb70bf2", size = 284759 }, - { url = "https://files.pythonhosted.org/packages/94/2b/701a4b0585cb05472a4da28ee28fdfe155f3638f5e1ec92306d924e5faf0/regex-2024.11.6-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b85c2530be953a890eaffde05485238f07029600e8f098cdf1848d414a8b45e4", size = 794976 }, - { url = "https://files.pythonhosted.org/packages/4b/bf/fa87e563bf5fee75db8915f7352e1887b1249126a1be4813837f5dbec965/regex-2024.11.6-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:bb26437975da7dc36b7efad18aa9dd4ea569d2357ae6b783bf1118dabd9ea577", size = 833077 }, - { url = "https://files.pythonhosted.org/packages/a1/56/7295e6bad94b047f4d0834e4779491b81216583c00c288252ef625c01d23/regex-2024.11.6-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:abfa5080c374a76a251ba60683242bc17eeb2c9818d0d30117b4486be10c59d3", size = 823160 }, - { url = "https://files.pythonhosted.org/packages/fb/13/e3b075031a738c9598c51cfbc4c7879e26729c53aa9cca59211c44235314/regex-2024.11.6-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:70b7fa6606c2881c1db9479b0eaa11ed5dfa11c8d60a474ff0e095099f39d98e", size = 796896 }, - { url = 
"https://files.pythonhosted.org/packages/24/56/0b3f1b66d592be6efec23a795b37732682520b47c53da5a32c33ed7d84e3/regex-2024.11.6-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:0c32f75920cf99fe6b6c539c399a4a128452eaf1af27f39bce8909c9a3fd8cbe", size = 783997 }, - { url = "https://files.pythonhosted.org/packages/f9/a1/eb378dada8b91c0e4c5f08ffb56f25fcae47bf52ad18f9b2f33b83e6d498/regex-2024.11.6-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:982e6d21414e78e1f51cf595d7f321dcd14de1f2881c5dc6a6e23bbbbd68435e", size = 781725 }, - { url = "https://files.pythonhosted.org/packages/83/f2/033e7dec0cfd6dda93390089864732a3409246ffe8b042e9554afa9bff4e/regex-2024.11.6-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:a7c2155f790e2fb448faed6dd241386719802296ec588a8b9051c1f5c481bc29", size = 789481 }, - { url = "https://files.pythonhosted.org/packages/83/23/15d4552ea28990a74e7696780c438aadd73a20318c47e527b47a4a5a596d/regex-2024.11.6-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:149f5008d286636e48cd0b1dd65018548944e495b0265b45e1bffecce1ef7f39", size = 852896 }, - { url = "https://files.pythonhosted.org/packages/e3/39/ed4416bc90deedbfdada2568b2cb0bc1fdb98efe11f5378d9892b2a88f8f/regex-2024.11.6-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:e5364a4502efca094731680e80009632ad6624084aff9a23ce8c8c6820de3e51", size = 860138 }, - { url = "https://files.pythonhosted.org/packages/93/2d/dd56bb76bd8e95bbce684326302f287455b56242a4f9c61f1bc76e28360e/regex-2024.11.6-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:0a86e7eeca091c09e021db8eb72d54751e527fa47b8d5787caf96d9831bd02ad", size = 787692 }, - { url = "https://files.pythonhosted.org/packages/0b/55/31877a249ab7a5156758246b9c59539abbeba22461b7d8adc9e8475ff73e/regex-2024.11.6-cp312-cp312-win32.whl", hash = "sha256:32f9a4c643baad4efa81d549c2aadefaeba12249b2adc5af541759237eee1c54", size = 262135 }, - { url = "https://files.pythonhosted.org/packages/38/ec/ad2d7de49a600cdb8dd78434a1aeffe28b9d6fc42eb36afab4a27ad23384/regex-2024.11.6-cp312-cp312-win_amd64.whl", hash = "sha256:a93c194e2df18f7d264092dc8539b8ffb86b45b899ab976aa15d48214138e81b", size = 273567 }, - { url = "https://files.pythonhosted.org/packages/90/73/bcb0e36614601016552fa9344544a3a2ae1809dc1401b100eab02e772e1f/regex-2024.11.6-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:a6ba92c0bcdf96cbf43a12c717eae4bc98325ca3730f6b130ffa2e3c3c723d84", size = 483525 }, - { url = "https://files.pythonhosted.org/packages/0f/3f/f1a082a46b31e25291d830b369b6b0c5576a6f7fb89d3053a354c24b8a83/regex-2024.11.6-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:525eab0b789891ac3be914d36893bdf972d483fe66551f79d3e27146191a37d4", size = 288324 }, - { url = "https://files.pythonhosted.org/packages/09/c9/4e68181a4a652fb3ef5099e077faf4fd2a694ea6e0f806a7737aff9e758a/regex-2024.11.6-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:086a27a0b4ca227941700e0b31425e7a28ef1ae8e5e05a33826e17e47fbfdba0", size = 284617 }, - { url = "https://files.pythonhosted.org/packages/fc/fd/37868b75eaf63843165f1d2122ca6cb94bfc0271e4428cf58c0616786dce/regex-2024.11.6-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:bde01f35767c4a7899b7eb6e823b125a64de314a8ee9791367c9a34d56af18d0", size = 795023 }, - { url = "https://files.pythonhosted.org/packages/c4/7c/d4cd9c528502a3dedb5c13c146e7a7a539a3853dc20209c8e75d9ba9d1b2/regex-2024.11.6-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = 
"sha256:b583904576650166b3d920d2bcce13971f6f9e9a396c673187f49811b2769dc7", size = 833072 }, - { url = "https://files.pythonhosted.org/packages/4f/db/46f563a08f969159c5a0f0e722260568425363bea43bb7ae370becb66a67/regex-2024.11.6-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:1c4de13f06a0d54fa0d5ab1b7138bfa0d883220965a29616e3ea61b35d5f5fc7", size = 823130 }, - { url = "https://files.pythonhosted.org/packages/db/60/1eeca2074f5b87df394fccaa432ae3fc06c9c9bfa97c5051aed70e6e00c2/regex-2024.11.6-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3cde6e9f2580eb1665965ce9bf17ff4952f34f5b126beb509fee8f4e994f143c", size = 796857 }, - { url = "https://files.pythonhosted.org/packages/10/db/ac718a08fcee981554d2f7bb8402f1faa7e868c1345c16ab1ebec54b0d7b/regex-2024.11.6-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:0d7f453dca13f40a02b79636a339c5b62b670141e63efd511d3f8f73fba162b3", size = 784006 }, - { url = "https://files.pythonhosted.org/packages/c2/41/7da3fe70216cea93144bf12da2b87367590bcf07db97604edeea55dac9ad/regex-2024.11.6-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:59dfe1ed21aea057a65c6b586afd2a945de04fc7db3de0a6e3ed5397ad491b07", size = 781650 }, - { url = "https://files.pythonhosted.org/packages/a7/d5/880921ee4eec393a4752e6ab9f0fe28009435417c3102fc413f3fe81c4e5/regex-2024.11.6-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:b97c1e0bd37c5cd7902e65f410779d39eeda155800b65fc4d04cc432efa9bc6e", size = 789545 }, - { url = "https://files.pythonhosted.org/packages/dc/96/53770115e507081122beca8899ab7f5ae28ae790bfcc82b5e38976df6a77/regex-2024.11.6-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:f9d1e379028e0fc2ae3654bac3cbbef81bf3fd571272a42d56c24007979bafb6", size = 853045 }, - { url = "https://files.pythonhosted.org/packages/31/d3/1372add5251cc2d44b451bd94f43b2ec78e15a6e82bff6a290ef9fd8f00a/regex-2024.11.6-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:13291b39131e2d002a7940fb176e120bec5145f3aeb7621be6534e46251912c4", size = 860182 }, - { url = "https://files.pythonhosted.org/packages/ed/e3/c446a64984ea9f69982ba1a69d4658d5014bc7a0ea468a07e1a1265db6e2/regex-2024.11.6-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:4f51f88c126370dcec4908576c5a627220da6c09d0bff31cfa89f2523843316d", size = 787733 }, - { url = "https://files.pythonhosted.org/packages/2b/f1/e40c8373e3480e4f29f2692bd21b3e05f296d3afebc7e5dcf21b9756ca1c/regex-2024.11.6-cp313-cp313-win32.whl", hash = "sha256:63b13cfd72e9601125027202cad74995ab26921d8cd935c25f09c630436348ff", size = 262122 }, - { url = "https://files.pythonhosted.org/packages/45/94/bc295babb3062a731f52621cdc992d123111282e291abaf23faa413443ea/regex-2024.11.6-cp313-cp313-win_amd64.whl", hash = "sha256:2b3361af3198667e99927da8b84c1b010752fa4b1115ee30beaa332cabc3ef1a", size = 273545 }, +sdist = { url = "https://files.pythonhosted.org/packages/8e/5f/bd69653fbfb76cf8604468d3b4ec4c403197144c7bfe0e6a5fc9e02a07cb/regex-2024.11.6.tar.gz", hash = "sha256:7ab159b063c52a0333c884e4679f8d7a85112ee3078fe3d9004b2dd875585519", size = 399494, upload-time = "2024-11-06T20:12:31.635Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/95/3c/4651f6b130c6842a8f3df82461a8950f923925db8b6961063e82744bddcc/regex-2024.11.6-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:ff590880083d60acc0433f9c3f713c51f7ac6ebb9adf889c79a261ecf541aa91", size = 482674, upload-time = "2024-11-06T20:08:57.575Z" }, + { url = 
"https://files.pythonhosted.org/packages/15/51/9f35d12da8434b489c7b7bffc205c474a0a9432a889457026e9bc06a297a/regex-2024.11.6-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:658f90550f38270639e83ce492f27d2c8d2cd63805c65a13a14d36ca126753f0", size = 287684, upload-time = "2024-11-06T20:08:59.787Z" }, + { url = "https://files.pythonhosted.org/packages/bd/18/b731f5510d1b8fb63c6b6d3484bfa9a59b84cc578ac8b5172970e05ae07c/regex-2024.11.6-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:164d8b7b3b4bcb2068b97428060b2a53be050085ef94eca7f240e7947f1b080e", size = 284589, upload-time = "2024-11-06T20:09:01.896Z" }, + { url = "https://files.pythonhosted.org/packages/78/a2/6dd36e16341ab95e4c6073426561b9bfdeb1a9c9b63ab1b579c2e96cb105/regex-2024.11.6-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d3660c82f209655a06b587d55e723f0b813d3a7db2e32e5e7dc64ac2a9e86fde", size = 782511, upload-time = "2024-11-06T20:09:04.062Z" }, + { url = "https://files.pythonhosted.org/packages/1b/2b/323e72d5d2fd8de0d9baa443e1ed70363ed7e7b2fb526f5950c5cb99c364/regex-2024.11.6-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:d22326fcdef5e08c154280b71163ced384b428343ae16a5ab2b3354aed12436e", size = 821149, upload-time = "2024-11-06T20:09:06.237Z" }, + { url = "https://files.pythonhosted.org/packages/90/30/63373b9ea468fbef8a907fd273e5c329b8c9535fee36fc8dba5fecac475d/regex-2024.11.6-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:f1ac758ef6aebfc8943560194e9fd0fa18bcb34d89fd8bd2af18183afd8da3a2", size = 809707, upload-time = "2024-11-06T20:09:07.715Z" }, + { url = "https://files.pythonhosted.org/packages/f2/98/26d3830875b53071f1f0ae6d547f1d98e964dd29ad35cbf94439120bb67a/regex-2024.11.6-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:997d6a487ff00807ba810e0f8332c18b4eb8d29463cfb7c820dc4b6e7562d0cf", size = 781702, upload-time = "2024-11-06T20:09:10.101Z" }, + { url = "https://files.pythonhosted.org/packages/87/55/eb2a068334274db86208ab9d5599ffa63631b9f0f67ed70ea7c82a69bbc8/regex-2024.11.6-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:02a02d2bb04fec86ad61f3ea7f49c015a0681bf76abb9857f945d26159d2968c", size = 771976, upload-time = "2024-11-06T20:09:11.566Z" }, + { url = "https://files.pythonhosted.org/packages/74/c0/be707bcfe98254d8f9d2cff55d216e946f4ea48ad2fd8cf1428f8c5332ba/regex-2024.11.6-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:f02f93b92358ee3f78660e43b4b0091229260c5d5c408d17d60bf26b6c900e86", size = 697397, upload-time = "2024-11-06T20:09:13.119Z" }, + { url = "https://files.pythonhosted.org/packages/49/dc/bb45572ceb49e0f6509f7596e4ba7031f6819ecb26bc7610979af5a77f45/regex-2024.11.6-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:06eb1be98df10e81ebaded73fcd51989dcf534e3c753466e4b60c4697a003b67", size = 768726, upload-time = "2024-11-06T20:09:14.85Z" }, + { url = "https://files.pythonhosted.org/packages/5a/db/f43fd75dc4c0c2d96d0881967897926942e935d700863666f3c844a72ce6/regex-2024.11.6-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:040df6fe1a5504eb0f04f048e6d09cd7c7110fef851d7c567a6b6e09942feb7d", size = 775098, upload-time = "2024-11-06T20:09:16.504Z" }, + { url = "https://files.pythonhosted.org/packages/99/d7/f94154db29ab5a89d69ff893159b19ada89e76b915c1293e98603d39838c/regex-2024.11.6-cp310-cp310-musllinux_1_2_ppc64le.whl", hash = 
"sha256:fdabbfc59f2c6edba2a6622c647b716e34e8e3867e0ab975412c5c2f79b82da2", size = 839325, upload-time = "2024-11-06T20:09:18.698Z" }, + { url = "https://files.pythonhosted.org/packages/f7/17/3cbfab1f23356fbbf07708220ab438a7efa1e0f34195bf857433f79f1788/regex-2024.11.6-cp310-cp310-musllinux_1_2_s390x.whl", hash = "sha256:8447d2d39b5abe381419319f942de20b7ecd60ce86f16a23b0698f22e1b70008", size = 843277, upload-time = "2024-11-06T20:09:21.725Z" }, + { url = "https://files.pythonhosted.org/packages/7e/f2/48b393b51900456155de3ad001900f94298965e1cad1c772b87f9cfea011/regex-2024.11.6-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:da8f5fc57d1933de22a9e23eec290a0d8a5927a5370d24bda9a6abe50683fe62", size = 773197, upload-time = "2024-11-06T20:09:24.092Z" }, + { url = "https://files.pythonhosted.org/packages/45/3f/ef9589aba93e084cd3f8471fded352826dcae8489b650d0b9b27bc5bba8a/regex-2024.11.6-cp310-cp310-win32.whl", hash = "sha256:b489578720afb782f6ccf2840920f3a32e31ba28a4b162e13900c3e6bd3f930e", size = 261714, upload-time = "2024-11-06T20:09:26.36Z" }, + { url = "https://files.pythonhosted.org/packages/42/7e/5f1b92c8468290c465fd50c5318da64319133231415a8aa6ea5ab995a815/regex-2024.11.6-cp310-cp310-win_amd64.whl", hash = "sha256:5071b2093e793357c9d8b2929dfc13ac5f0a6c650559503bb81189d0a3814519", size = 274042, upload-time = "2024-11-06T20:09:28.762Z" }, + { url = "https://files.pythonhosted.org/packages/58/58/7e4d9493a66c88a7da6d205768119f51af0f684fe7be7bac8328e217a52c/regex-2024.11.6-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:5478c6962ad548b54a591778e93cd7c456a7a29f8eca9c49e4f9a806dcc5d638", size = 482669, upload-time = "2024-11-06T20:09:31.064Z" }, + { url = "https://files.pythonhosted.org/packages/34/4c/8f8e631fcdc2ff978609eaeef1d6994bf2f028b59d9ac67640ed051f1218/regex-2024.11.6-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:2c89a8cc122b25ce6945f0423dc1352cb9593c68abd19223eebbd4e56612c5b7", size = 287684, upload-time = "2024-11-06T20:09:32.915Z" }, + { url = "https://files.pythonhosted.org/packages/c5/1b/f0e4d13e6adf866ce9b069e191f303a30ab1277e037037a365c3aad5cc9c/regex-2024.11.6-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:94d87b689cdd831934fa3ce16cc15cd65748e6d689f5d2b8f4f4df2065c9fa20", size = 284589, upload-time = "2024-11-06T20:09:35.504Z" }, + { url = "https://files.pythonhosted.org/packages/25/4d/ab21047f446693887f25510887e6820b93f791992994f6498b0318904d4a/regex-2024.11.6-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1062b39a0a2b75a9c694f7a08e7183a80c63c0d62b301418ffd9c35f55aaa114", size = 792121, upload-time = "2024-11-06T20:09:37.701Z" }, + { url = "https://files.pythonhosted.org/packages/45/ee/c867e15cd894985cb32b731d89576c41a4642a57850c162490ea34b78c3b/regex-2024.11.6-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:167ed4852351d8a750da48712c3930b031f6efdaa0f22fa1933716bfcd6bf4a3", size = 831275, upload-time = "2024-11-06T20:09:40.371Z" }, + { url = "https://files.pythonhosted.org/packages/b3/12/b0f480726cf1c60f6536fa5e1c95275a77624f3ac8fdccf79e6727499e28/regex-2024.11.6-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:2d548dafee61f06ebdb584080621f3e0c23fff312f0de1afc776e2a2ba99a74f", size = 818257, upload-time = "2024-11-06T20:09:43.059Z" }, + { url = "https://files.pythonhosted.org/packages/bf/ce/0d0e61429f603bac433910d99ef1a02ce45a8967ffbe3cbee48599e62d88/regex-2024.11.6-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = 
"sha256:f2a19f302cd1ce5dd01a9099aaa19cae6173306d1302a43b627f62e21cf18ac0", size = 792727, upload-time = "2024-11-06T20:09:48.19Z" }, + { url = "https://files.pythonhosted.org/packages/e4/c1/243c83c53d4a419c1556f43777ccb552bccdf79d08fda3980e4e77dd9137/regex-2024.11.6-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:bec9931dfb61ddd8ef2ebc05646293812cb6b16b60cf7c9511a832b6f1854b55", size = 780667, upload-time = "2024-11-06T20:09:49.828Z" }, + { url = "https://files.pythonhosted.org/packages/c5/f4/75eb0dd4ce4b37f04928987f1d22547ddaf6c4bae697623c1b05da67a8aa/regex-2024.11.6-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:9714398225f299aa85267fd222f7142fcb5c769e73d7733344efc46f2ef5cf89", size = 776963, upload-time = "2024-11-06T20:09:51.819Z" }, + { url = "https://files.pythonhosted.org/packages/16/5d/95c568574e630e141a69ff8a254c2f188b4398e813c40d49228c9bbd9875/regex-2024.11.6-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:202eb32e89f60fc147a41e55cb086db2a3f8cb82f9a9a88440dcfc5d37faae8d", size = 784700, upload-time = "2024-11-06T20:09:53.982Z" }, + { url = "https://files.pythonhosted.org/packages/8e/b5/f8495c7917f15cc6fee1e7f395e324ec3e00ab3c665a7dc9d27562fd5290/regex-2024.11.6-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:4181b814e56078e9b00427ca358ec44333765f5ca1b45597ec7446d3a1ef6e34", size = 848592, upload-time = "2024-11-06T20:09:56.222Z" }, + { url = "https://files.pythonhosted.org/packages/1c/80/6dd7118e8cb212c3c60b191b932dc57db93fb2e36fb9e0e92f72a5909af9/regex-2024.11.6-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:068376da5a7e4da51968ce4c122a7cd31afaaec4fccc7856c92f63876e57b51d", size = 852929, upload-time = "2024-11-06T20:09:58.642Z" }, + { url = "https://files.pythonhosted.org/packages/11/9b/5a05d2040297d2d254baf95eeeb6df83554e5e1df03bc1a6687fc4ba1f66/regex-2024.11.6-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:ac10f2c4184420d881a3475fb2c6f4d95d53a8d50209a2500723d831036f7c45", size = 781213, upload-time = "2024-11-06T20:10:00.867Z" }, + { url = "https://files.pythonhosted.org/packages/26/b7/b14e2440156ab39e0177506c08c18accaf2b8932e39fb092074de733d868/regex-2024.11.6-cp311-cp311-win32.whl", hash = "sha256:c36f9b6f5f8649bb251a5f3f66564438977b7ef8386a52460ae77e6070d309d9", size = 261734, upload-time = "2024-11-06T20:10:03.361Z" }, + { url = "https://files.pythonhosted.org/packages/80/32/763a6cc01d21fb3819227a1cc3f60fd251c13c37c27a73b8ff4315433a8e/regex-2024.11.6-cp311-cp311-win_amd64.whl", hash = "sha256:02e28184be537f0e75c1f9b2f8847dc51e08e6e171c6bde130b2687e0c33cf60", size = 274052, upload-time = "2024-11-06T20:10:05.179Z" }, + { url = "https://files.pythonhosted.org/packages/ba/30/9a87ce8336b172cc232a0db89a3af97929d06c11ceaa19d97d84fa90a8f8/regex-2024.11.6-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:52fb28f528778f184f870b7cf8f225f5eef0a8f6e3778529bdd40c7b3920796a", size = 483781, upload-time = "2024-11-06T20:10:07.07Z" }, + { url = "https://files.pythonhosted.org/packages/01/e8/00008ad4ff4be8b1844786ba6636035f7ef926db5686e4c0f98093612add/regex-2024.11.6-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:fdd6028445d2460f33136c55eeb1f601ab06d74cb3347132e1c24250187500d9", size = 288455, upload-time = "2024-11-06T20:10:09.117Z" }, + { url = "https://files.pythonhosted.org/packages/60/85/cebcc0aff603ea0a201667b203f13ba75d9fc8668fab917ac5b2de3967bc/regex-2024.11.6-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:805e6b60c54bf766b251e94526ebad60b7de0c70f70a4e6210ee2891acb70bf2", 
size = 284759, upload-time = "2024-11-06T20:10:11.155Z" }, + { url = "https://files.pythonhosted.org/packages/94/2b/701a4b0585cb05472a4da28ee28fdfe155f3638f5e1ec92306d924e5faf0/regex-2024.11.6-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b85c2530be953a890eaffde05485238f07029600e8f098cdf1848d414a8b45e4", size = 794976, upload-time = "2024-11-06T20:10:13.24Z" }, + { url = "https://files.pythonhosted.org/packages/4b/bf/fa87e563bf5fee75db8915f7352e1887b1249126a1be4813837f5dbec965/regex-2024.11.6-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:bb26437975da7dc36b7efad18aa9dd4ea569d2357ae6b783bf1118dabd9ea577", size = 833077, upload-time = "2024-11-06T20:10:15.37Z" }, + { url = "https://files.pythonhosted.org/packages/a1/56/7295e6bad94b047f4d0834e4779491b81216583c00c288252ef625c01d23/regex-2024.11.6-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:abfa5080c374a76a251ba60683242bc17eeb2c9818d0d30117b4486be10c59d3", size = 823160, upload-time = "2024-11-06T20:10:19.027Z" }, + { url = "https://files.pythonhosted.org/packages/fb/13/e3b075031a738c9598c51cfbc4c7879e26729c53aa9cca59211c44235314/regex-2024.11.6-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:70b7fa6606c2881c1db9479b0eaa11ed5dfa11c8d60a474ff0e095099f39d98e", size = 796896, upload-time = "2024-11-06T20:10:21.85Z" }, + { url = "https://files.pythonhosted.org/packages/24/56/0b3f1b66d592be6efec23a795b37732682520b47c53da5a32c33ed7d84e3/regex-2024.11.6-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:0c32f75920cf99fe6b6c539c399a4a128452eaf1af27f39bce8909c9a3fd8cbe", size = 783997, upload-time = "2024-11-06T20:10:24.329Z" }, + { url = "https://files.pythonhosted.org/packages/f9/a1/eb378dada8b91c0e4c5f08ffb56f25fcae47bf52ad18f9b2f33b83e6d498/regex-2024.11.6-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:982e6d21414e78e1f51cf595d7f321dcd14de1f2881c5dc6a6e23bbbbd68435e", size = 781725, upload-time = "2024-11-06T20:10:28.067Z" }, + { url = "https://files.pythonhosted.org/packages/83/f2/033e7dec0cfd6dda93390089864732a3409246ffe8b042e9554afa9bff4e/regex-2024.11.6-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:a7c2155f790e2fb448faed6dd241386719802296ec588a8b9051c1f5c481bc29", size = 789481, upload-time = "2024-11-06T20:10:31.612Z" }, + { url = "https://files.pythonhosted.org/packages/83/23/15d4552ea28990a74e7696780c438aadd73a20318c47e527b47a4a5a596d/regex-2024.11.6-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:149f5008d286636e48cd0b1dd65018548944e495b0265b45e1bffecce1ef7f39", size = 852896, upload-time = "2024-11-06T20:10:34.054Z" }, + { url = "https://files.pythonhosted.org/packages/e3/39/ed4416bc90deedbfdada2568b2cb0bc1fdb98efe11f5378d9892b2a88f8f/regex-2024.11.6-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:e5364a4502efca094731680e80009632ad6624084aff9a23ce8c8c6820de3e51", size = 860138, upload-time = "2024-11-06T20:10:36.142Z" }, + { url = "https://files.pythonhosted.org/packages/93/2d/dd56bb76bd8e95bbce684326302f287455b56242a4f9c61f1bc76e28360e/regex-2024.11.6-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:0a86e7eeca091c09e021db8eb72d54751e527fa47b8d5787caf96d9831bd02ad", size = 787692, upload-time = "2024-11-06T20:10:38.394Z" }, + { url = "https://files.pythonhosted.org/packages/0b/55/31877a249ab7a5156758246b9c59539abbeba22461b7d8adc9e8475ff73e/regex-2024.11.6-cp312-cp312-win32.whl", hash = 
"sha256:32f9a4c643baad4efa81d549c2aadefaeba12249b2adc5af541759237eee1c54", size = 262135, upload-time = "2024-11-06T20:10:40.367Z" }, + { url = "https://files.pythonhosted.org/packages/38/ec/ad2d7de49a600cdb8dd78434a1aeffe28b9d6fc42eb36afab4a27ad23384/regex-2024.11.6-cp312-cp312-win_amd64.whl", hash = "sha256:a93c194e2df18f7d264092dc8539b8ffb86b45b899ab976aa15d48214138e81b", size = 273567, upload-time = "2024-11-06T20:10:43.467Z" }, + { url = "https://files.pythonhosted.org/packages/90/73/bcb0e36614601016552fa9344544a3a2ae1809dc1401b100eab02e772e1f/regex-2024.11.6-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:a6ba92c0bcdf96cbf43a12c717eae4bc98325ca3730f6b130ffa2e3c3c723d84", size = 483525, upload-time = "2024-11-06T20:10:45.19Z" }, + { url = "https://files.pythonhosted.org/packages/0f/3f/f1a082a46b31e25291d830b369b6b0c5576a6f7fb89d3053a354c24b8a83/regex-2024.11.6-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:525eab0b789891ac3be914d36893bdf972d483fe66551f79d3e27146191a37d4", size = 288324, upload-time = "2024-11-06T20:10:47.177Z" }, + { url = "https://files.pythonhosted.org/packages/09/c9/4e68181a4a652fb3ef5099e077faf4fd2a694ea6e0f806a7737aff9e758a/regex-2024.11.6-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:086a27a0b4ca227941700e0b31425e7a28ef1ae8e5e05a33826e17e47fbfdba0", size = 284617, upload-time = "2024-11-06T20:10:49.312Z" }, + { url = "https://files.pythonhosted.org/packages/fc/fd/37868b75eaf63843165f1d2122ca6cb94bfc0271e4428cf58c0616786dce/regex-2024.11.6-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:bde01f35767c4a7899b7eb6e823b125a64de314a8ee9791367c9a34d56af18d0", size = 795023, upload-time = "2024-11-06T20:10:51.102Z" }, + { url = "https://files.pythonhosted.org/packages/c4/7c/d4cd9c528502a3dedb5c13c146e7a7a539a3853dc20209c8e75d9ba9d1b2/regex-2024.11.6-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:b583904576650166b3d920d2bcce13971f6f9e9a396c673187f49811b2769dc7", size = 833072, upload-time = "2024-11-06T20:10:52.926Z" }, + { url = "https://files.pythonhosted.org/packages/4f/db/46f563a08f969159c5a0f0e722260568425363bea43bb7ae370becb66a67/regex-2024.11.6-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:1c4de13f06a0d54fa0d5ab1b7138bfa0d883220965a29616e3ea61b35d5f5fc7", size = 823130, upload-time = "2024-11-06T20:10:54.828Z" }, + { url = "https://files.pythonhosted.org/packages/db/60/1eeca2074f5b87df394fccaa432ae3fc06c9c9bfa97c5051aed70e6e00c2/regex-2024.11.6-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3cde6e9f2580eb1665965ce9bf17ff4952f34f5b126beb509fee8f4e994f143c", size = 796857, upload-time = "2024-11-06T20:10:56.634Z" }, + { url = "https://files.pythonhosted.org/packages/10/db/ac718a08fcee981554d2f7bb8402f1faa7e868c1345c16ab1ebec54b0d7b/regex-2024.11.6-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:0d7f453dca13f40a02b79636a339c5b62b670141e63efd511d3f8f73fba162b3", size = 784006, upload-time = "2024-11-06T20:10:59.369Z" }, + { url = "https://files.pythonhosted.org/packages/c2/41/7da3fe70216cea93144bf12da2b87367590bcf07db97604edeea55dac9ad/regex-2024.11.6-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:59dfe1ed21aea057a65c6b586afd2a945de04fc7db3de0a6e3ed5397ad491b07", size = 781650, upload-time = "2024-11-06T20:11:02.042Z" }, + { url = 
"https://files.pythonhosted.org/packages/a7/d5/880921ee4eec393a4752e6ab9f0fe28009435417c3102fc413f3fe81c4e5/regex-2024.11.6-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:b97c1e0bd37c5cd7902e65f410779d39eeda155800b65fc4d04cc432efa9bc6e", size = 789545, upload-time = "2024-11-06T20:11:03.933Z" }, + { url = "https://files.pythonhosted.org/packages/dc/96/53770115e507081122beca8899ab7f5ae28ae790bfcc82b5e38976df6a77/regex-2024.11.6-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:f9d1e379028e0fc2ae3654bac3cbbef81bf3fd571272a42d56c24007979bafb6", size = 853045, upload-time = "2024-11-06T20:11:06.497Z" }, + { url = "https://files.pythonhosted.org/packages/31/d3/1372add5251cc2d44b451bd94f43b2ec78e15a6e82bff6a290ef9fd8f00a/regex-2024.11.6-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:13291b39131e2d002a7940fb176e120bec5145f3aeb7621be6534e46251912c4", size = 860182, upload-time = "2024-11-06T20:11:09.06Z" }, + { url = "https://files.pythonhosted.org/packages/ed/e3/c446a64984ea9f69982ba1a69d4658d5014bc7a0ea468a07e1a1265db6e2/regex-2024.11.6-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:4f51f88c126370dcec4908576c5a627220da6c09d0bff31cfa89f2523843316d", size = 787733, upload-time = "2024-11-06T20:11:11.256Z" }, + { url = "https://files.pythonhosted.org/packages/2b/f1/e40c8373e3480e4f29f2692bd21b3e05f296d3afebc7e5dcf21b9756ca1c/regex-2024.11.6-cp313-cp313-win32.whl", hash = "sha256:63b13cfd72e9601125027202cad74995ab26921d8cd935c25f09c630436348ff", size = 262122, upload-time = "2024-11-06T20:11:13.161Z" }, + { url = "https://files.pythonhosted.org/packages/45/94/bc295babb3062a731f52621cdc992d123111282e291abaf23faa413443ea/regex-2024.11.6-cp313-cp313-win_amd64.whl", hash = "sha256:2b3361af3198667e99927da8b84c1b010752fa4b1115ee30beaa332cabc3ef1a", size = 273545, upload-time = "2024-11-06T20:11:15Z" }, ] [[package]] @@ -1300,9 +1483,9 @@ dependencies = [ { name = "idna" }, { name = "urllib3" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/63/70/2bf7780ad2d390a8d301ad0b550f1581eadbd9a20f896afe06353c2a2913/requests-2.32.3.tar.gz", hash = "sha256:55365417734eb18255590a9ff9eb97e9e1da868d4ccd6402399eaf68af20a760", size = 131218 } +sdist = { url = "https://files.pythonhosted.org/packages/63/70/2bf7780ad2d390a8d301ad0b550f1581eadbd9a20f896afe06353c2a2913/requests-2.32.3.tar.gz", hash = "sha256:55365417734eb18255590a9ff9eb97e9e1da868d4ccd6402399eaf68af20a760", size = 131218, upload-time = "2024-05-29T15:37:49.536Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/f9/9b/335f9764261e915ed497fcdeb11df5dfd6f7bf257d4a6a2a686d80da4d54/requests-2.32.3-py3-none-any.whl", hash = "sha256:70761cfe03c773ceb22aa2f671b4757976145175cdfca038c02654d061d6dcc6", size = 64928 }, + { url = "https://files.pythonhosted.org/packages/f9/9b/335f9764261e915ed497fcdeb11df5dfd6f7bf257d4a6a2a686d80da4d54/requests-2.32.3-py3-none-any.whl", hash = "sha256:70761cfe03c773ceb22aa2f671b4757976145175cdfca038c02654d061d6dcc6", size = 64928, upload-time = "2024-05-29T15:37:47.027Z" }, ] [[package]] @@ -1314,70 +1497,70 @@ dependencies = [ { name = "pygments" }, { name = "typing-extensions", marker = "python_full_version < '3.11'" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/ab/3a/0316b28d0761c6734d6bc14e770d85506c986c85ffb239e688eeaab2c2bc/rich-13.9.4.tar.gz", hash = "sha256:439594978a49a09530cff7ebc4b5c7103ef57baf48d5ea3184f21d9a2befa098", size = 223149 } +sdist = { url = 
"https://files.pythonhosted.org/packages/ab/3a/0316b28d0761c6734d6bc14e770d85506c986c85ffb239e688eeaab2c2bc/rich-13.9.4.tar.gz", hash = "sha256:439594978a49a09530cff7ebc4b5c7103ef57baf48d5ea3184f21d9a2befa098", size = 223149, upload-time = "2024-11-01T16:43:57.873Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/19/71/39c7c0d87f8d4e6c020a393182060eaefeeae6c01dab6a84ec346f2567df/rich-13.9.4-py3-none-any.whl", hash = "sha256:6049d5e6ec054bf2779ab3358186963bac2ea89175919d699e378b99738c2a90", size = 242424 }, + { url = "https://files.pythonhosted.org/packages/19/71/39c7c0d87f8d4e6c020a393182060eaefeeae6c01dab6a84ec346f2567df/rich-13.9.4-py3-none-any.whl", hash = "sha256:6049d5e6ec054bf2779ab3358186963bac2ea89175919d699e378b99738c2a90", size = 242424, upload-time = "2024-11-01T16:43:55.817Z" }, ] [[package]] name = "ruff" version = "0.8.5" source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/25/5d/4b5403f3e89837decfd54c51bea7f94b7d3fae77e08858603d0e04d7ad17/ruff-0.8.5.tar.gz", hash = "sha256:1098d36f69831f7ff2a1da3e6407d5fbd6dfa2559e4f74ff2d260c5588900317", size = 3454835 } +sdist = { url = "https://files.pythonhosted.org/packages/25/5d/4b5403f3e89837decfd54c51bea7f94b7d3fae77e08858603d0e04d7ad17/ruff-0.8.5.tar.gz", hash = "sha256:1098d36f69831f7ff2a1da3e6407d5fbd6dfa2559e4f74ff2d260c5588900317", size = 3454835, upload-time = "2025-01-02T12:04:16.105Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/73/f8/03391745a703ce11678eb37c48ae89ec60396ea821e9d0bcea7c8e88fd91/ruff-0.8.5-py3-none-linux_armv6l.whl", hash = "sha256:5ad11a5e3868a73ca1fa4727fe7e33735ea78b416313f4368c504dbeb69c0f88", size = 10626889 }, - { url = "https://files.pythonhosted.org/packages/55/74/83bb74a44183b904216f3edfb9995b89830c83aaa6ce84627f74da0e0cf8/ruff-0.8.5-py3-none-macosx_10_12_x86_64.whl", hash = "sha256:f69ab37771ea7e0715fead8624ec42996d101269a96e31f4d31be6fc33aa19b7", size = 10398233 }, - { url = "https://files.pythonhosted.org/packages/e8/7a/a162a4feb3ef85d594527165e366dde09d7a1e534186ff4ba5d127eda850/ruff-0.8.5-py3-none-macosx_11_0_arm64.whl", hash = "sha256:b5462d7804558ccff9c08fe8cbf6c14b7efe67404316696a2dde48297b1925bb", size = 10001843 }, - { url = "https://files.pythonhosted.org/packages/e7/9f/5ee5dcd135411402e35b6ec6a8dfdadbd31c5cd1c36a624d356a38d76090/ruff-0.8.5-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d56de7220a35607f9fe59f8a6d018e14504f7b71d784d980835e20fc0611cd50", size = 10872507 }, - { url = "https://files.pythonhosted.org/packages/b6/67/db2df2dd4a34b602d7f6ebb1b3744c8157f0d3579973ffc58309c9c272e8/ruff-0.8.5-py3-none-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:9d99cf80b0429cbebf31cbbf6f24f05a29706f0437c40413d950e67e2d4faca4", size = 10377200 }, - { url = "https://files.pythonhosted.org/packages/fe/ff/fe3a6a73006bced73e60d171d154a82430f61d97e787f511a24bd6302611/ruff-0.8.5-py3-none-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:7b75ac29715ac60d554a049dbb0ef3b55259076181c3369d79466cb130eb5afd", size = 11433155 }, - { url = "https://files.pythonhosted.org/packages/e3/95/c1d1a1fe36658c1f3e1b47e1cd5f688b72d5786695b9e621c2c38399a95e/ruff-0.8.5-py3-none-manylinux_2_17_ppc64.manylinux2014_ppc64.whl", hash = "sha256:c9d526a62c9eda211b38463528768fd0ada25dad524cb33c0e99fcff1c67b5dc", size = 12139227 }, - { url = 
"https://files.pythonhosted.org/packages/1b/fe/644b70d473a27b5112ac7a3428edcc1ce0db775c301ff11aa146f71886e0/ruff-0.8.5-py3-none-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:587c5e95007612c26509f30acc506c874dab4c4abbacd0357400bd1aa799931b", size = 11697941 }, - { url = "https://files.pythonhosted.org/packages/00/39/4f83e517ec173e16a47c6d102cd22a1aaebe80e1208a1f2e83ab9a0e4134/ruff-0.8.5-py3-none-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:622b82bf3429ff0e346835ec213aec0a04d9730480cbffbb6ad9372014e31bbd", size = 12967686 }, - { url = "https://files.pythonhosted.org/packages/1a/f6/52a2973ff108d74b5da706a573379eea160bece098f7cfa3f35dc4622710/ruff-0.8.5-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f99be814d77a5dac8a8957104bdd8c359e85c86b0ee0e38dca447cb1095f70fb", size = 11253788 }, - { url = "https://files.pythonhosted.org/packages/ce/1f/3b30f3c65b1303cb8e268ec3b046b77ab21ed8e26921cfc7e8232aa57f2c/ruff-0.8.5-py3-none-musllinux_1_2_aarch64.whl", hash = "sha256:c01c048f9c3385e0fd7822ad0fd519afb282af9cf1778f3580e540629df89725", size = 10860360 }, - { url = "https://files.pythonhosted.org/packages/a5/a8/2a3ea6bacead963f7aeeba0c61815d9b27b0d638e6a74984aa5cc5d27733/ruff-0.8.5-py3-none-musllinux_1_2_armv7l.whl", hash = "sha256:7512e8cb038db7f5db6aae0e24735ff9ea03bb0ed6ae2ce534e9baa23c1dc9ea", size = 10457922 }, - { url = "https://files.pythonhosted.org/packages/17/47/8f9514b670969aab57c5fc826fb500a16aee8feac1bcf8a91358f153a5ba/ruff-0.8.5-py3-none-musllinux_1_2_i686.whl", hash = "sha256:762f113232acd5b768d6b875d16aad6b00082add40ec91c927f0673a8ec4ede8", size = 10958347 }, - { url = "https://files.pythonhosted.org/packages/0d/d6/78a9af8209ad99541816d74f01ce678fc01ebb3f37dd7ab8966646dcd92b/ruff-0.8.5-py3-none-musllinux_1_2_x86_64.whl", hash = "sha256:03a90200c5dfff49e4c967b405f27fdfa81594cbb7c5ff5609e42d7fe9680da5", size = 11328882 }, - { url = "https://files.pythonhosted.org/packages/54/77/5c8072ec7afdfdf42c7a4019044486a2b6c85ee73617f8875ec94b977fed/ruff-0.8.5-py3-none-win32.whl", hash = "sha256:8710ffd57bdaa6690cbf6ecff19884b8629ec2a2a2a2f783aa94b1cc795139ed", size = 8802515 }, - { url = "https://files.pythonhosted.org/packages/bc/b6/47d2b06784de8ae992c45cceb2a30f3f205b3236a629d7ca4c0c134839a2/ruff-0.8.5-py3-none-win_amd64.whl", hash = "sha256:4020d8bf8d3a32325c77af452a9976a9ad6455773bcb94991cf15bd66b347e47", size = 9684231 }, - { url = "https://files.pythonhosted.org/packages/bf/5e/ffee22bf9f9e4b2669d1f0179ae8804584939fb6502b51f2401e26b1e028/ruff-0.8.5-py3-none-win_arm64.whl", hash = "sha256:134ae019ef13e1b060ab7136e7828a6d83ea727ba123381307eb37c6bd5e01cb", size = 9124741 }, + { url = "https://files.pythonhosted.org/packages/73/f8/03391745a703ce11678eb37c48ae89ec60396ea821e9d0bcea7c8e88fd91/ruff-0.8.5-py3-none-linux_armv6l.whl", hash = "sha256:5ad11a5e3868a73ca1fa4727fe7e33735ea78b416313f4368c504dbeb69c0f88", size = 10626889, upload-time = "2025-01-02T12:03:14.406Z" }, + { url = "https://files.pythonhosted.org/packages/55/74/83bb74a44183b904216f3edfb9995b89830c83aaa6ce84627f74da0e0cf8/ruff-0.8.5-py3-none-macosx_10_12_x86_64.whl", hash = "sha256:f69ab37771ea7e0715fead8624ec42996d101269a96e31f4d31be6fc33aa19b7", size = 10398233, upload-time = "2025-01-02T12:03:18.107Z" }, + { url = "https://files.pythonhosted.org/packages/e8/7a/a162a4feb3ef85d594527165e366dde09d7a1e534186ff4ba5d127eda850/ruff-0.8.5-py3-none-macosx_11_0_arm64.whl", hash = "sha256:b5462d7804558ccff9c08fe8cbf6c14b7efe67404316696a2dde48297b1925bb", size = 10001843, upload-time 
= "2025-01-02T12:03:22.265Z" }, + { url = "https://files.pythonhosted.org/packages/e7/9f/5ee5dcd135411402e35b6ec6a8dfdadbd31c5cd1c36a624d356a38d76090/ruff-0.8.5-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d56de7220a35607f9fe59f8a6d018e14504f7b71d784d980835e20fc0611cd50", size = 10872507, upload-time = "2025-01-02T12:03:25.198Z" }, + { url = "https://files.pythonhosted.org/packages/b6/67/db2df2dd4a34b602d7f6ebb1b3744c8157f0d3579973ffc58309c9c272e8/ruff-0.8.5-py3-none-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:9d99cf80b0429cbebf31cbbf6f24f05a29706f0437c40413d950e67e2d4faca4", size = 10377200, upload-time = "2025-01-02T12:03:29.499Z" }, + { url = "https://files.pythonhosted.org/packages/fe/ff/fe3a6a73006bced73e60d171d154a82430f61d97e787f511a24bd6302611/ruff-0.8.5-py3-none-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:7b75ac29715ac60d554a049dbb0ef3b55259076181c3369d79466cb130eb5afd", size = 11433155, upload-time = "2025-01-02T12:03:33.293Z" }, + { url = "https://files.pythonhosted.org/packages/e3/95/c1d1a1fe36658c1f3e1b47e1cd5f688b72d5786695b9e621c2c38399a95e/ruff-0.8.5-py3-none-manylinux_2_17_ppc64.manylinux2014_ppc64.whl", hash = "sha256:c9d526a62c9eda211b38463528768fd0ada25dad524cb33c0e99fcff1c67b5dc", size = 12139227, upload-time = "2025-01-02T12:03:36.318Z" }, + { url = "https://files.pythonhosted.org/packages/1b/fe/644b70d473a27b5112ac7a3428edcc1ce0db775c301ff11aa146f71886e0/ruff-0.8.5-py3-none-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:587c5e95007612c26509f30acc506c874dab4c4abbacd0357400bd1aa799931b", size = 11697941, upload-time = "2025-01-02T12:03:40.544Z" }, + { url = "https://files.pythonhosted.org/packages/00/39/4f83e517ec173e16a47c6d102cd22a1aaebe80e1208a1f2e83ab9a0e4134/ruff-0.8.5-py3-none-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:622b82bf3429ff0e346835ec213aec0a04d9730480cbffbb6ad9372014e31bbd", size = 12967686, upload-time = "2025-01-02T12:03:43.751Z" }, + { url = "https://files.pythonhosted.org/packages/1a/f6/52a2973ff108d74b5da706a573379eea160bece098f7cfa3f35dc4622710/ruff-0.8.5-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f99be814d77a5dac8a8957104bdd8c359e85c86b0ee0e38dca447cb1095f70fb", size = 11253788, upload-time = "2025-01-02T12:03:48.222Z" }, + { url = "https://files.pythonhosted.org/packages/ce/1f/3b30f3c65b1303cb8e268ec3b046b77ab21ed8e26921cfc7e8232aa57f2c/ruff-0.8.5-py3-none-musllinux_1_2_aarch64.whl", hash = "sha256:c01c048f9c3385e0fd7822ad0fd519afb282af9cf1778f3580e540629df89725", size = 10860360, upload-time = "2025-01-02T12:03:51.34Z" }, + { url = "https://files.pythonhosted.org/packages/a5/a8/2a3ea6bacead963f7aeeba0c61815d9b27b0d638e6a74984aa5cc5d27733/ruff-0.8.5-py3-none-musllinux_1_2_armv7l.whl", hash = "sha256:7512e8cb038db7f5db6aae0e24735ff9ea03bb0ed6ae2ce534e9baa23c1dc9ea", size = 10457922, upload-time = "2025-01-02T12:03:55.212Z" }, + { url = "https://files.pythonhosted.org/packages/17/47/8f9514b670969aab57c5fc826fb500a16aee8feac1bcf8a91358f153a5ba/ruff-0.8.5-py3-none-musllinux_1_2_i686.whl", hash = "sha256:762f113232acd5b768d6b875d16aad6b00082add40ec91c927f0673a8ec4ede8", size = 10958347, upload-time = "2025-01-02T12:03:59.214Z" }, + { url = "https://files.pythonhosted.org/packages/0d/d6/78a9af8209ad99541816d74f01ce678fc01ebb3f37dd7ab8966646dcd92b/ruff-0.8.5-py3-none-musllinux_1_2_x86_64.whl", hash = "sha256:03a90200c5dfff49e4c967b405f27fdfa81594cbb7c5ff5609e42d7fe9680da5", size = 11328882, upload-time = 
"2025-01-02T12:04:02.224Z" }, + { url = "https://files.pythonhosted.org/packages/54/77/5c8072ec7afdfdf42c7a4019044486a2b6c85ee73617f8875ec94b977fed/ruff-0.8.5-py3-none-win32.whl", hash = "sha256:8710ffd57bdaa6690cbf6ecff19884b8629ec2a2a2a2f783aa94b1cc795139ed", size = 8802515, upload-time = "2025-01-02T12:04:05.399Z" }, + { url = "https://files.pythonhosted.org/packages/bc/b6/47d2b06784de8ae992c45cceb2a30f3f205b3236a629d7ca4c0c134839a2/ruff-0.8.5-py3-none-win_amd64.whl", hash = "sha256:4020d8bf8d3a32325c77af452a9976a9ad6455773bcb94991cf15bd66b347e47", size = 9684231, upload-time = "2025-01-02T12:04:08.414Z" }, + { url = "https://files.pythonhosted.org/packages/bf/5e/ffee22bf9f9e4b2669d1f0179ae8804584939fb6502b51f2401e26b1e028/ruff-0.8.5-py3-none-win_arm64.whl", hash = "sha256:134ae019ef13e1b060ab7136e7828a6d83ea727ba123381307eb37c6bd5e01cb", size = 9124741, upload-time = "2025-01-02T12:04:11.189Z" }, ] [[package]] name = "shellingham" version = "1.5.4" source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/58/15/8b3609fd3830ef7b27b655beb4b4e9c62313a4e8da8c676e142cc210d58e/shellingham-1.5.4.tar.gz", hash = "sha256:8dbca0739d487e5bd35ab3ca4b36e11c4078f3a234bfce294b0a0291363404de", size = 10310 } +sdist = { url = "https://files.pythonhosted.org/packages/58/15/8b3609fd3830ef7b27b655beb4b4e9c62313a4e8da8c676e142cc210d58e/shellingham-1.5.4.tar.gz", hash = "sha256:8dbca0739d487e5bd35ab3ca4b36e11c4078f3a234bfce294b0a0291363404de", size = 10310, upload-time = "2023-10-24T04:13:40.426Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/e0/f9/0595336914c5619e5f28a1fb793285925a8cd4b432c9da0a987836c7f822/shellingham-1.5.4-py2.py3-none-any.whl", hash = "sha256:7ecfff8f2fd72616f7481040475a65b2bf8af90a56c89140852d1120324e8686", size = 9755 }, + { url = "https://files.pythonhosted.org/packages/e0/f9/0595336914c5619e5f28a1fb793285925a8cd4b432c9da0a987836c7f822/shellingham-1.5.4-py2.py3-none-any.whl", hash = "sha256:7ecfff8f2fd72616f7481040475a65b2bf8af90a56c89140852d1120324e8686", size = 9755, upload-time = "2023-10-24T04:13:38.866Z" }, ] [[package]] name = "six" version = "1.17.0" source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/94/e7/b2c673351809dca68a0e064b6af791aa332cf192da575fd474ed7d6f16a2/six-1.17.0.tar.gz", hash = "sha256:ff70335d468e7eb6ec65b95b99d3a2836546063f63acc5171de367e834932a81", size = 34031 } +sdist = { url = "https://files.pythonhosted.org/packages/94/e7/b2c673351809dca68a0e064b6af791aa332cf192da575fd474ed7d6f16a2/six-1.17.0.tar.gz", hash = "sha256:ff70335d468e7eb6ec65b95b99d3a2836546063f63acc5171de367e834932a81", size = 34031, upload-time = "2024-12-04T17:35:28.174Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/b7/ce/149a00dd41f10bc29e5921b496af8b574d8413afcd5e30dfa0ed46c2cc5e/six-1.17.0-py2.py3-none-any.whl", hash = "sha256:4721f391ed90541fddacab5acf947aa0d3dc7d27b2e1e8eda2be8970586c3274", size = 11050 }, + { url = "https://files.pythonhosted.org/packages/b7/ce/149a00dd41f10bc29e5921b496af8b574d8413afcd5e30dfa0ed46c2cc5e/six-1.17.0-py2.py3-none-any.whl", hash = "sha256:4721f391ed90541fddacab5acf947aa0d3dc7d27b2e1e8eda2be8970586c3274", size = 11050, upload-time = "2024-12-04T17:35:26.475Z" }, ] [[package]] name = "sniffio" version = "1.3.1" source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/a2/87/a6771e1546d97e7e041b6ae58d80074f81b7d5121207425c964ddf5cfdbd/sniffio-1.3.1.tar.gz", hash = 
"sha256:f4324edc670a0f49750a81b895f35c3adb843cca46f0530f79fc1babb23789dc", size = 20372 } +sdist = { url = "https://files.pythonhosted.org/packages/a2/87/a6771e1546d97e7e041b6ae58d80074f81b7d5121207425c964ddf5cfdbd/sniffio-1.3.1.tar.gz", hash = "sha256:f4324edc670a0f49750a81b895f35c3adb843cca46f0530f79fc1babb23789dc", size = 20372, upload-time = "2024-02-25T23:20:04.057Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/e9/44/75a9c9421471a6c4805dbf2356f7c181a29c1879239abab1ea2cc8f38b40/sniffio-1.3.1-py3-none-any.whl", hash = "sha256:2f6da418d1f1e0fddd844478f41680e794e6051915791a034ff65e5f100525a2", size = 10235 }, + { url = "https://files.pythonhosted.org/packages/e9/44/75a9c9421471a6c4805dbf2356f7c181a29c1879239abab1ea2cc8f38b40/sniffio-1.3.1-py3-none-any.whl", hash = "sha256:2f6da418d1f1e0fddd844478f41680e794e6051915791a034ff65e5f100525a2", size = 10235, upload-time = "2024-02-25T23:20:01.196Z" }, ] [[package]] name = "sortedcontainers" version = "2.4.0" source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/e8/c4/ba2f8066cceb6f23394729afe52f3bf7adec04bf9ed2c820b39e19299111/sortedcontainers-2.4.0.tar.gz", hash = "sha256:25caa5a06cc30b6b83d11423433f65d1f9d76c4c6a0c90e3379eaa43b9bfdb88", size = 30594 } +sdist = { url = "https://files.pythonhosted.org/packages/e8/c4/ba2f8066cceb6f23394729afe52f3bf7adec04bf9ed2c820b39e19299111/sortedcontainers-2.4.0.tar.gz", hash = "sha256:25caa5a06cc30b6b83d11423433f65d1f9d76c4c6a0c90e3379eaa43b9bfdb88", size = 30594, upload-time = "2021-05-16T22:03:42.897Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/32/46/9cb0e58b2deb7f82b84065f37f3bffeb12413f947f9388e4cac22c4621ce/sortedcontainers-2.4.0-py2.py3-none-any.whl", hash = "sha256:a163dcaede0f1c021485e957a39245190e74249897e2ae4b2aa38595db237ee0", size = 29575 }, + { url = "https://files.pythonhosted.org/packages/32/46/9cb0e58b2deb7f82b84065f37f3bffeb12413f947f9388e4cac22c4621ce/sortedcontainers-2.4.0-py2.py3-none-any.whl", hash = "sha256:a163dcaede0f1c021485e957a39245190e74249897e2ae4b2aa38595db237ee0", size = 29575, upload-time = "2021-05-16T22:03:41.177Z" }, ] [[package]] @@ -1387,9 +1570,9 @@ source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "starlette" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/40/88/0af7f586894cfe61bd212f33e571785c4570085711b24fb7445425a5eeb0/sse-starlette-1.6.1.tar.gz", hash = "sha256:6208af2bd7d0887c92f1379da14bd1f4db56bd1274cc5d36670c683d2aa1de6a", size = 14555 } +sdist = { url = "https://files.pythonhosted.org/packages/40/88/0af7f586894cfe61bd212f33e571785c4570085711b24fb7445425a5eeb0/sse-starlette-1.6.1.tar.gz", hash = "sha256:6208af2bd7d0887c92f1379da14bd1f4db56bd1274cc5d36670c683d2aa1de6a", size = 14555, upload-time = "2023-05-18T11:26:30.829Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/5e/f7/499e5d0c181a52a205d5b0982fd71cf162d1e070c97dca90c60520bbf8bf/sse_starlette-1.6.1-py3-none-any.whl", hash = "sha256:d8f18f1c633e355afe61cc5e9c92eea85badcb8b2d56ec8cfb0a006994aa55da", size = 9553 }, + { url = "https://files.pythonhosted.org/packages/5e/f7/499e5d0c181a52a205d5b0982fd71cf162d1e070c97dca90c60520bbf8bf/sse_starlette-1.6.1-py3-none-any.whl", hash = "sha256:d8f18f1c633e355afe61cc5e9c92eea85badcb8b2d56ec8cfb0a006994aa55da", size = 9553, upload-time = "2023-05-18T11:26:28.894Z" }, ] [[package]] @@ -1399,9 +1582,9 @@ source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "anyio" }, ] -sdist = { url = 
"https://files.pythonhosted.org/packages/06/68/559bed5484e746f1ab2ebbe22312f2c25ec62e4b534916d41a8c21147bf8/starlette-0.27.0.tar.gz", hash = "sha256:6a6b0d042acb8d469a01eba54e9cda6cbd24ac602c4cd016723117d6a7e73b75", size = 51394 } +sdist = { url = "https://files.pythonhosted.org/packages/06/68/559bed5484e746f1ab2ebbe22312f2c25ec62e4b534916d41a8c21147bf8/starlette-0.27.0.tar.gz", hash = "sha256:6a6b0d042acb8d469a01eba54e9cda6cbd24ac602c4cd016723117d6a7e73b75", size = 51394, upload-time = "2023-05-16T10:59:56.286Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/58/f8/e2cca22387965584a409795913b774235752be4176d276714e15e1a58884/starlette-0.27.0-py3-none-any.whl", hash = "sha256:918416370e846586541235ccd38a474c08b80443ed31c578a418e2209b3eef91", size = 66978 }, + { url = "https://files.pythonhosted.org/packages/58/f8/e2cca22387965584a409795913b774235752be4176d276714e15e1a58884/starlette-0.27.0-py3-none-any.whl", hash = "sha256:918416370e846586541235ccd38a474c08b80443ed31c578a418e2209b3eef91", size = 66978, upload-time = "2023-05-16T10:59:53.927Z" }, ] [[package]] @@ -1411,48 +1594,48 @@ source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "webencodings" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/7a/fd/7a5ee21fd08ff70d3d33a5781c255cbe779659bd03278feb98b19ee550f4/tinycss2-1.4.0.tar.gz", hash = "sha256:10c0972f6fc0fbee87c3edb76549357415e94548c1ae10ebccdea16fb404a9b7", size = 87085 } +sdist = { url = "https://files.pythonhosted.org/packages/7a/fd/7a5ee21fd08ff70d3d33a5781c255cbe779659bd03278feb98b19ee550f4/tinycss2-1.4.0.tar.gz", hash = "sha256:10c0972f6fc0fbee87c3edb76549357415e94548c1ae10ebccdea16fb404a9b7", size = 87085, upload-time = "2024-10-24T14:58:29.895Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/e6/34/ebdc18bae6aa14fbee1a08b63c015c72b64868ff7dae68808ab500c492e2/tinycss2-1.4.0-py3-none-any.whl", hash = "sha256:3a49cf47b7675da0b15d0c6e1df8df4ebd96e9394bb905a5775adb0d884c5289", size = 26610 }, + { url = "https://files.pythonhosted.org/packages/e6/34/ebdc18bae6aa14fbee1a08b63c015c72b64868ff7dae68808ab500c492e2/tinycss2-1.4.0-py3-none-any.whl", hash = "sha256:3a49cf47b7675da0b15d0c6e1df8df4ebd96e9394bb905a5775adb0d884c5289", size = 26610, upload-time = "2024-10-24T14:58:28.029Z" }, ] [[package]] name = "tomli" version = "2.2.1" source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/18/87/302344fed471e44a87289cf4967697d07e532f2421fdaf868a303cbae4ff/tomli-2.2.1.tar.gz", hash = "sha256:cd45e1dc79c835ce60f7404ec8119f2eb06d38b1deba146f07ced3bbc44505ff", size = 17175 } -wheels = [ - { url = "https://files.pythonhosted.org/packages/43/ca/75707e6efa2b37c77dadb324ae7d9571cb424e61ea73fad7c56c2d14527f/tomli-2.2.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:678e4fa69e4575eb77d103de3df8a895e1591b48e740211bd1067378c69e8249", size = 131077 }, - { url = "https://files.pythonhosted.org/packages/c7/16/51ae563a8615d472fdbffc43a3f3d46588c264ac4f024f63f01283becfbb/tomli-2.2.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:023aa114dd824ade0100497eb2318602af309e5a55595f76b626d6d9f3b7b0a6", size = 123429 }, - { url = "https://files.pythonhosted.org/packages/f1/dd/4f6cd1e7b160041db83c694abc78e100473c15d54620083dbd5aae7b990e/tomli-2.2.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ece47d672db52ac607a3d9599a9d48dcb2f2f735c6c2d1f34130085bb12b112a", size = 226067 }, - { url = 
"https://files.pythonhosted.org/packages/a9/6b/c54ede5dc70d648cc6361eaf429304b02f2871a345bbdd51e993d6cdf550/tomli-2.2.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6972ca9c9cc9f0acaa56a8ca1ff51e7af152a9f87fb64623e31d5c83700080ee", size = 236030 }, - { url = "https://files.pythonhosted.org/packages/1f/47/999514fa49cfaf7a92c805a86c3c43f4215621855d151b61c602abb38091/tomli-2.2.1-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:c954d2250168d28797dd4e3ac5cf812a406cd5a92674ee4c8f123c889786aa8e", size = 240898 }, - { url = "https://files.pythonhosted.org/packages/73/41/0a01279a7ae09ee1573b423318e7934674ce06eb33f50936655071d81a24/tomli-2.2.1-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:8dd28b3e155b80f4d54beb40a441d366adcfe740969820caf156c019fb5c7ec4", size = 229894 }, - { url = "https://files.pythonhosted.org/packages/55/18/5d8bc5b0a0362311ce4d18830a5d28943667599a60d20118074ea1b01bb7/tomli-2.2.1-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:e59e304978767a54663af13c07b3d1af22ddee3bb2fb0618ca1593e4f593a106", size = 245319 }, - { url = "https://files.pythonhosted.org/packages/92/a3/7ade0576d17f3cdf5ff44d61390d4b3febb8a9fc2b480c75c47ea048c646/tomli-2.2.1-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:33580bccab0338d00994d7f16f4c4ec25b776af3ffaac1ed74e0b3fc95e885a8", size = 238273 }, - { url = "https://files.pythonhosted.org/packages/72/6f/fa64ef058ac1446a1e51110c375339b3ec6be245af9d14c87c4a6412dd32/tomli-2.2.1-cp311-cp311-win32.whl", hash = "sha256:465af0e0875402f1d226519c9904f37254b3045fc5084697cefb9bdde1ff99ff", size = 98310 }, - { url = "https://files.pythonhosted.org/packages/6a/1c/4a2dcde4a51b81be3530565e92eda625d94dafb46dbeb15069df4caffc34/tomli-2.2.1-cp311-cp311-win_amd64.whl", hash = "sha256:2d0f2fdd22b02c6d81637a3c95f8cd77f995846af7414c5c4b8d0545afa1bc4b", size = 108309 }, - { url = "https://files.pythonhosted.org/packages/52/e1/f8af4c2fcde17500422858155aeb0d7e93477a0d59a98e56cbfe75070fd0/tomli-2.2.1-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:4a8f6e44de52d5e6c657c9fe83b562f5f4256d8ebbfe4ff922c495620a7f6cea", size = 132762 }, - { url = "https://files.pythonhosted.org/packages/03/b8/152c68bb84fc00396b83e7bbddd5ec0bd3dd409db4195e2a9b3e398ad2e3/tomli-2.2.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:8d57ca8095a641b8237d5b079147646153d22552f1c637fd3ba7f4b0b29167a8", size = 123453 }, - { url = "https://files.pythonhosted.org/packages/c8/d6/fc9267af9166f79ac528ff7e8c55c8181ded34eb4b0e93daa767b8841573/tomli-2.2.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4e340144ad7ae1533cb897d406382b4b6fede8890a03738ff1683af800d54192", size = 233486 }, - { url = "https://files.pythonhosted.org/packages/5c/51/51c3f2884d7bab89af25f678447ea7d297b53b5a3b5730a7cb2ef6069f07/tomli-2.2.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:db2b95f9de79181805df90bedc5a5ab4c165e6ec3fe99f970d0e302f384ad222", size = 242349 }, - { url = "https://files.pythonhosted.org/packages/ab/df/bfa89627d13a5cc22402e441e8a931ef2108403db390ff3345c05253935e/tomli-2.2.1-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:40741994320b232529c802f8bc86da4e1aa9f413db394617b9a256ae0f9a7f77", size = 252159 }, - { url = "https://files.pythonhosted.org/packages/9e/6e/fa2b916dced65763a5168c6ccb91066f7639bdc88b48adda990db10c8c0b/tomli-2.2.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = 
"sha256:400e720fe168c0f8521520190686ef8ef033fb19fc493da09779e592861b78c6", size = 237243 }, - { url = "https://files.pythonhosted.org/packages/b4/04/885d3b1f650e1153cbb93a6a9782c58a972b94ea4483ae4ac5cedd5e4a09/tomli-2.2.1-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:02abe224de6ae62c19f090f68da4e27b10af2b93213d36cf44e6e1c5abd19fdd", size = 259645 }, - { url = "https://files.pythonhosted.org/packages/9c/de/6b432d66e986e501586da298e28ebeefd3edc2c780f3ad73d22566034239/tomli-2.2.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:b82ebccc8c8a36f2094e969560a1b836758481f3dc360ce9a3277c65f374285e", size = 244584 }, - { url = "https://files.pythonhosted.org/packages/1c/9a/47c0449b98e6e7d1be6cbac02f93dd79003234ddc4aaab6ba07a9a7482e2/tomli-2.2.1-cp312-cp312-win32.whl", hash = "sha256:889f80ef92701b9dbb224e49ec87c645ce5df3fa2cc548664eb8a25e03127a98", size = 98875 }, - { url = "https://files.pythonhosted.org/packages/ef/60/9b9638f081c6f1261e2688bd487625cd1e660d0a85bd469e91d8db969734/tomli-2.2.1-cp312-cp312-win_amd64.whl", hash = "sha256:7fc04e92e1d624a4a63c76474610238576942d6b8950a2d7f908a340494e67e4", size = 109418 }, - { url = "https://files.pythonhosted.org/packages/04/90/2ee5f2e0362cb8a0b6499dc44f4d7d48f8fff06d28ba46e6f1eaa61a1388/tomli-2.2.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:f4039b9cbc3048b2416cc57ab3bda989a6fcf9b36cf8937f01a6e731b64f80d7", size = 132708 }, - { url = "https://files.pythonhosted.org/packages/c0/ec/46b4108816de6b385141f082ba99e315501ccd0a2ea23db4a100dd3990ea/tomli-2.2.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:286f0ca2ffeeb5b9bd4fcc8d6c330534323ec51b2f52da063b11c502da16f30c", size = 123582 }, - { url = "https://files.pythonhosted.org/packages/a0/bd/b470466d0137b37b68d24556c38a0cc819e8febe392d5b199dcd7f578365/tomli-2.2.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a92ef1a44547e894e2a17d24e7557a5e85a9e1d0048b0b5e7541f76c5032cb13", size = 232543 }, - { url = "https://files.pythonhosted.org/packages/d9/e5/82e80ff3b751373f7cead2815bcbe2d51c895b3c990686741a8e56ec42ab/tomli-2.2.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9316dc65bed1684c9a98ee68759ceaed29d229e985297003e494aa825ebb0281", size = 241691 }, - { url = "https://files.pythonhosted.org/packages/05/7e/2a110bc2713557d6a1bfb06af23dd01e7dde52b6ee7dadc589868f9abfac/tomli-2.2.1-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:e85e99945e688e32d5a35c1ff38ed0b3f41f43fad8df0bdf79f72b2ba7bc5272", size = 251170 }, - { url = "https://files.pythonhosted.org/packages/64/7b/22d713946efe00e0adbcdfd6d1aa119ae03fd0b60ebed51ebb3fa9f5a2e5/tomli-2.2.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:ac065718db92ca818f8d6141b5f66369833d4a80a9d74435a268c52bdfa73140", size = 236530 }, - { url = "https://files.pythonhosted.org/packages/38/31/3a76f67da4b0cf37b742ca76beaf819dca0ebef26d78fc794a576e08accf/tomli-2.2.1-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:d920f33822747519673ee656a4b6ac33e382eca9d331c87770faa3eef562aeb2", size = 258666 }, - { url = "https://files.pythonhosted.org/packages/07/10/5af1293da642aded87e8a988753945d0cf7e00a9452d3911dd3bb354c9e2/tomli-2.2.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:a198f10c4d1b1375d7687bc25294306e551bf1abfa4eace6650070a5c1ae2744", size = 243954 }, - { url = "https://files.pythonhosted.org/packages/5b/b9/1ed31d167be802da0fc95020d04cd27b7d7065cc6fbefdd2f9186f60d7bd/tomli-2.2.1-cp313-cp313-win32.whl", hash = 
"sha256:d3f5614314d758649ab2ab3a62d4f2004c825922f9e370b29416484086b264ec", size = 98724 }, - { url = "https://files.pythonhosted.org/packages/c7/32/b0963458706accd9afcfeb867c0f9175a741bf7b19cd424230714d722198/tomli-2.2.1-cp313-cp313-win_amd64.whl", hash = "sha256:a38aa0308e754b0e3c67e344754dff64999ff9b513e691d0e786265c93583c69", size = 109383 }, - { url = "https://files.pythonhosted.org/packages/6e/c2/61d3e0f47e2b74ef40a68b9e6ad5984f6241a942f7cd3bbfbdbd03861ea9/tomli-2.2.1-py3-none-any.whl", hash = "sha256:cb55c73c5f4408779d0cf3eef9f762b9c9f147a77de7b258bef0a5628adc85cc", size = 14257 }, +sdist = { url = "https://files.pythonhosted.org/packages/18/87/302344fed471e44a87289cf4967697d07e532f2421fdaf868a303cbae4ff/tomli-2.2.1.tar.gz", hash = "sha256:cd45e1dc79c835ce60f7404ec8119f2eb06d38b1deba146f07ced3bbc44505ff", size = 17175, upload-time = "2024-11-27T22:38:36.873Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/43/ca/75707e6efa2b37c77dadb324ae7d9571cb424e61ea73fad7c56c2d14527f/tomli-2.2.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:678e4fa69e4575eb77d103de3df8a895e1591b48e740211bd1067378c69e8249", size = 131077, upload-time = "2024-11-27T22:37:54.956Z" }, + { url = "https://files.pythonhosted.org/packages/c7/16/51ae563a8615d472fdbffc43a3f3d46588c264ac4f024f63f01283becfbb/tomli-2.2.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:023aa114dd824ade0100497eb2318602af309e5a55595f76b626d6d9f3b7b0a6", size = 123429, upload-time = "2024-11-27T22:37:56.698Z" }, + { url = "https://files.pythonhosted.org/packages/f1/dd/4f6cd1e7b160041db83c694abc78e100473c15d54620083dbd5aae7b990e/tomli-2.2.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ece47d672db52ac607a3d9599a9d48dcb2f2f735c6c2d1f34130085bb12b112a", size = 226067, upload-time = "2024-11-27T22:37:57.63Z" }, + { url = "https://files.pythonhosted.org/packages/a9/6b/c54ede5dc70d648cc6361eaf429304b02f2871a345bbdd51e993d6cdf550/tomli-2.2.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6972ca9c9cc9f0acaa56a8ca1ff51e7af152a9f87fb64623e31d5c83700080ee", size = 236030, upload-time = "2024-11-27T22:37:59.344Z" }, + { url = "https://files.pythonhosted.org/packages/1f/47/999514fa49cfaf7a92c805a86c3c43f4215621855d151b61c602abb38091/tomli-2.2.1-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:c954d2250168d28797dd4e3ac5cf812a406cd5a92674ee4c8f123c889786aa8e", size = 240898, upload-time = "2024-11-27T22:38:00.429Z" }, + { url = "https://files.pythonhosted.org/packages/73/41/0a01279a7ae09ee1573b423318e7934674ce06eb33f50936655071d81a24/tomli-2.2.1-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:8dd28b3e155b80f4d54beb40a441d366adcfe740969820caf156c019fb5c7ec4", size = 229894, upload-time = "2024-11-27T22:38:02.094Z" }, + { url = "https://files.pythonhosted.org/packages/55/18/5d8bc5b0a0362311ce4d18830a5d28943667599a60d20118074ea1b01bb7/tomli-2.2.1-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:e59e304978767a54663af13c07b3d1af22ddee3bb2fb0618ca1593e4f593a106", size = 245319, upload-time = "2024-11-27T22:38:03.206Z" }, + { url = "https://files.pythonhosted.org/packages/92/a3/7ade0576d17f3cdf5ff44d61390d4b3febb8a9fc2b480c75c47ea048c646/tomli-2.2.1-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:33580bccab0338d00994d7f16f4c4ec25b776af3ffaac1ed74e0b3fc95e885a8", size = 238273, upload-time = "2024-11-27T22:38:04.217Z" }, + { url = 
"https://files.pythonhosted.org/packages/72/6f/fa64ef058ac1446a1e51110c375339b3ec6be245af9d14c87c4a6412dd32/tomli-2.2.1-cp311-cp311-win32.whl", hash = "sha256:465af0e0875402f1d226519c9904f37254b3045fc5084697cefb9bdde1ff99ff", size = 98310, upload-time = "2024-11-27T22:38:05.908Z" }, + { url = "https://files.pythonhosted.org/packages/6a/1c/4a2dcde4a51b81be3530565e92eda625d94dafb46dbeb15069df4caffc34/tomli-2.2.1-cp311-cp311-win_amd64.whl", hash = "sha256:2d0f2fdd22b02c6d81637a3c95f8cd77f995846af7414c5c4b8d0545afa1bc4b", size = 108309, upload-time = "2024-11-27T22:38:06.812Z" }, + { url = "https://files.pythonhosted.org/packages/52/e1/f8af4c2fcde17500422858155aeb0d7e93477a0d59a98e56cbfe75070fd0/tomli-2.2.1-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:4a8f6e44de52d5e6c657c9fe83b562f5f4256d8ebbfe4ff922c495620a7f6cea", size = 132762, upload-time = "2024-11-27T22:38:07.731Z" }, + { url = "https://files.pythonhosted.org/packages/03/b8/152c68bb84fc00396b83e7bbddd5ec0bd3dd409db4195e2a9b3e398ad2e3/tomli-2.2.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:8d57ca8095a641b8237d5b079147646153d22552f1c637fd3ba7f4b0b29167a8", size = 123453, upload-time = "2024-11-27T22:38:09.384Z" }, + { url = "https://files.pythonhosted.org/packages/c8/d6/fc9267af9166f79ac528ff7e8c55c8181ded34eb4b0e93daa767b8841573/tomli-2.2.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4e340144ad7ae1533cb897d406382b4b6fede8890a03738ff1683af800d54192", size = 233486, upload-time = "2024-11-27T22:38:10.329Z" }, + { url = "https://files.pythonhosted.org/packages/5c/51/51c3f2884d7bab89af25f678447ea7d297b53b5a3b5730a7cb2ef6069f07/tomli-2.2.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:db2b95f9de79181805df90bedc5a5ab4c165e6ec3fe99f970d0e302f384ad222", size = 242349, upload-time = "2024-11-27T22:38:11.443Z" }, + { url = "https://files.pythonhosted.org/packages/ab/df/bfa89627d13a5cc22402e441e8a931ef2108403db390ff3345c05253935e/tomli-2.2.1-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:40741994320b232529c802f8bc86da4e1aa9f413db394617b9a256ae0f9a7f77", size = 252159, upload-time = "2024-11-27T22:38:13.099Z" }, + { url = "https://files.pythonhosted.org/packages/9e/6e/fa2b916dced65763a5168c6ccb91066f7639bdc88b48adda990db10c8c0b/tomli-2.2.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:400e720fe168c0f8521520190686ef8ef033fb19fc493da09779e592861b78c6", size = 237243, upload-time = "2024-11-27T22:38:14.766Z" }, + { url = "https://files.pythonhosted.org/packages/b4/04/885d3b1f650e1153cbb93a6a9782c58a972b94ea4483ae4ac5cedd5e4a09/tomli-2.2.1-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:02abe224de6ae62c19f090f68da4e27b10af2b93213d36cf44e6e1c5abd19fdd", size = 259645, upload-time = "2024-11-27T22:38:15.843Z" }, + { url = "https://files.pythonhosted.org/packages/9c/de/6b432d66e986e501586da298e28ebeefd3edc2c780f3ad73d22566034239/tomli-2.2.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:b82ebccc8c8a36f2094e969560a1b836758481f3dc360ce9a3277c65f374285e", size = 244584, upload-time = "2024-11-27T22:38:17.645Z" }, + { url = "https://files.pythonhosted.org/packages/1c/9a/47c0449b98e6e7d1be6cbac02f93dd79003234ddc4aaab6ba07a9a7482e2/tomli-2.2.1-cp312-cp312-win32.whl", hash = "sha256:889f80ef92701b9dbb224e49ec87c645ce5df3fa2cc548664eb8a25e03127a98", size = 98875, upload-time = "2024-11-27T22:38:19.159Z" }, + { url = 
"https://files.pythonhosted.org/packages/ef/60/9b9638f081c6f1261e2688bd487625cd1e660d0a85bd469e91d8db969734/tomli-2.2.1-cp312-cp312-win_amd64.whl", hash = "sha256:7fc04e92e1d624a4a63c76474610238576942d6b8950a2d7f908a340494e67e4", size = 109418, upload-time = "2024-11-27T22:38:20.064Z" }, + { url = "https://files.pythonhosted.org/packages/04/90/2ee5f2e0362cb8a0b6499dc44f4d7d48f8fff06d28ba46e6f1eaa61a1388/tomli-2.2.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:f4039b9cbc3048b2416cc57ab3bda989a6fcf9b36cf8937f01a6e731b64f80d7", size = 132708, upload-time = "2024-11-27T22:38:21.659Z" }, + { url = "https://files.pythonhosted.org/packages/c0/ec/46b4108816de6b385141f082ba99e315501ccd0a2ea23db4a100dd3990ea/tomli-2.2.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:286f0ca2ffeeb5b9bd4fcc8d6c330534323ec51b2f52da063b11c502da16f30c", size = 123582, upload-time = "2024-11-27T22:38:22.693Z" }, + { url = "https://files.pythonhosted.org/packages/a0/bd/b470466d0137b37b68d24556c38a0cc819e8febe392d5b199dcd7f578365/tomli-2.2.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a92ef1a44547e894e2a17d24e7557a5e85a9e1d0048b0b5e7541f76c5032cb13", size = 232543, upload-time = "2024-11-27T22:38:24.367Z" }, + { url = "https://files.pythonhosted.org/packages/d9/e5/82e80ff3b751373f7cead2815bcbe2d51c895b3c990686741a8e56ec42ab/tomli-2.2.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9316dc65bed1684c9a98ee68759ceaed29d229e985297003e494aa825ebb0281", size = 241691, upload-time = "2024-11-27T22:38:26.081Z" }, + { url = "https://files.pythonhosted.org/packages/05/7e/2a110bc2713557d6a1bfb06af23dd01e7dde52b6ee7dadc589868f9abfac/tomli-2.2.1-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:e85e99945e688e32d5a35c1ff38ed0b3f41f43fad8df0bdf79f72b2ba7bc5272", size = 251170, upload-time = "2024-11-27T22:38:27.921Z" }, + { url = "https://files.pythonhosted.org/packages/64/7b/22d713946efe00e0adbcdfd6d1aa119ae03fd0b60ebed51ebb3fa9f5a2e5/tomli-2.2.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:ac065718db92ca818f8d6141b5f66369833d4a80a9d74435a268c52bdfa73140", size = 236530, upload-time = "2024-11-27T22:38:29.591Z" }, + { url = "https://files.pythonhosted.org/packages/38/31/3a76f67da4b0cf37b742ca76beaf819dca0ebef26d78fc794a576e08accf/tomli-2.2.1-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:d920f33822747519673ee656a4b6ac33e382eca9d331c87770faa3eef562aeb2", size = 258666, upload-time = "2024-11-27T22:38:30.639Z" }, + { url = "https://files.pythonhosted.org/packages/07/10/5af1293da642aded87e8a988753945d0cf7e00a9452d3911dd3bb354c9e2/tomli-2.2.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:a198f10c4d1b1375d7687bc25294306e551bf1abfa4eace6650070a5c1ae2744", size = 243954, upload-time = "2024-11-27T22:38:31.702Z" }, + { url = "https://files.pythonhosted.org/packages/5b/b9/1ed31d167be802da0fc95020d04cd27b7d7065cc6fbefdd2f9186f60d7bd/tomli-2.2.1-cp313-cp313-win32.whl", hash = "sha256:d3f5614314d758649ab2ab3a62d4f2004c825922f9e370b29416484086b264ec", size = 98724, upload-time = "2024-11-27T22:38:32.837Z" }, + { url = "https://files.pythonhosted.org/packages/c7/32/b0963458706accd9afcfeb867c0f9175a741bf7b19cd424230714d722198/tomli-2.2.1-cp313-cp313-win_amd64.whl", hash = "sha256:a38aa0308e754b0e3c67e344754dff64999ff9b513e691d0e786265c93583c69", size = 109383, upload-time = "2024-11-27T22:38:34.455Z" }, + { url = 
"https://files.pythonhosted.org/packages/6e/c2/61d3e0f47e2b74ef40a68b9e6ad5984f6241a942f7cd3bbfbdbd03861ea9/tomli-2.2.1-py3-none-any.whl", hash = "sha256:cb55c73c5f4408779d0cf3eef9f762b9c9f147a77de7b258bef0a5628adc85cc", size = 14257, upload-time = "2024-11-27T22:38:35.385Z" }, ] [[package]] @@ -1468,9 +1651,9 @@ dependencies = [ { name = "sniffio" }, { name = "sortedcontainers" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/9a/03/ab0e9509be0c6465e2773768ec25ee0cb8053c0b91471ab3854bbf2294b2/trio-0.26.2.tar.gz", hash = "sha256:0346c3852c15e5c7d40ea15972c4805689ef2cb8b5206f794c9c19450119f3a4", size = 561156 } +sdist = { url = "https://files.pythonhosted.org/packages/9a/03/ab0e9509be0c6465e2773768ec25ee0cb8053c0b91471ab3854bbf2294b2/trio-0.26.2.tar.gz", hash = "sha256:0346c3852c15e5c7d40ea15972c4805689ef2cb8b5206f794c9c19450119f3a4", size = 561156, upload-time = "2024-08-08T00:47:10.705Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/1c/70/efa56ce2271c44a7f4f43533a0477e6854a0948e9f7b76491de1fd3be7c9/trio-0.26.2-py3-none-any.whl", hash = "sha256:c5237e8133eb0a1d72f09a971a55c28ebe69e351c783fc64bc37db8db8bbe1d0", size = 475996 }, + { url = "https://files.pythonhosted.org/packages/1c/70/efa56ce2271c44a7f4f43533a0477e6854a0948e9f7b76491de1fd3be7c9/trio-0.26.2-py3-none-any.whl", hash = "sha256:c5237e8133eb0a1d72f09a971a55c28ebe69e351c783fc64bc37db8db8bbe1d0", size = 475996, upload-time = "2024-08-08T00:47:08.721Z" }, ] [[package]] @@ -1483,27 +1666,27 @@ dependencies = [ { name = "shellingham" }, { name = "typing-extensions" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/d4/f7/f174a1cae84848ae8b27170a96187b91937b743f0580ff968078fe16930a/typer-0.12.4.tar.gz", hash = "sha256:c9c1613ed6a166162705b3347b8d10b661ccc5d95692654d0fb628118f2c34e6", size = 97945 } +sdist = { url = "https://files.pythonhosted.org/packages/d4/f7/f174a1cae84848ae8b27170a96187b91937b743f0580ff968078fe16930a/typer-0.12.4.tar.gz", hash = "sha256:c9c1613ed6a166162705b3347b8d10b661ccc5d95692654d0fb628118f2c34e6", size = 97945, upload-time = "2024-08-17T03:33:12.974Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/ae/cc/15083dcde1252a663398b1b2a173637a3ec65adadfb95137dc95df1e6adc/typer-0.12.4-py3-none-any.whl", hash = "sha256:819aa03699f438397e876aa12b0d63766864ecba1b579092cc9fe35d886e34b6", size = 47402 }, + { url = "https://files.pythonhosted.org/packages/ae/cc/15083dcde1252a663398b1b2a173637a3ec65adadfb95137dc95df1e6adc/typer-0.12.4-py3-none-any.whl", hash = "sha256:819aa03699f438397e876aa12b0d63766864ecba1b579092cc9fe35d886e34b6", size = 47402, upload-time = "2024-08-17T03:33:11.25Z" }, ] [[package]] name = "typing-extensions" version = "4.12.2" source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/df/db/f35a00659bc03fec321ba8bce9420de607a1d37f8342eee1863174c69557/typing_extensions-4.12.2.tar.gz", hash = "sha256:1a7ead55c7e559dd4dee8856e3a88b41225abfe1ce8df57b7c13915fe121ffb8", size = 85321 } +sdist = { url = "https://files.pythonhosted.org/packages/df/db/f35a00659bc03fec321ba8bce9420de607a1d37f8342eee1863174c69557/typing_extensions-4.12.2.tar.gz", hash = "sha256:1a7ead55c7e559dd4dee8856e3a88b41225abfe1ce8df57b7c13915fe121ffb8", size = 85321, upload-time = "2024-06-07T18:52:15.995Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/26/9f/ad63fc0248c5379346306f8668cda6e2e2e9c95e01216d2b8ffd9ff037d0/typing_extensions-4.12.2-py3-none-any.whl", hash = 
"sha256:04e5ca0351e0f3f85c6853954072df659d0d13fac324d0072316b67d7794700d", size = 37438 }, + { url = "https://files.pythonhosted.org/packages/26/9f/ad63fc0248c5379346306f8668cda6e2e2e9c95e01216d2b8ffd9ff037d0/typing_extensions-4.12.2-py3-none-any.whl", hash = "sha256:04e5ca0351e0f3f85c6853954072df659d0d13fac324d0072316b67d7794700d", size = 37438, upload-time = "2024-06-07T18:52:13.582Z" }, ] [[package]] name = "urllib3" version = "2.3.0" source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/aa/63/e53da845320b757bf29ef6a9062f5c669fe997973f966045cb019c3f4b66/urllib3-2.3.0.tar.gz", hash = "sha256:f8c5449b3cf0861679ce7e0503c7b44b5ec981bec0d1d3795a07f1ba96f0204d", size = 307268 } +sdist = { url = "https://files.pythonhosted.org/packages/aa/63/e53da845320b757bf29ef6a9062f5c669fe997973f966045cb019c3f4b66/urllib3-2.3.0.tar.gz", hash = "sha256:f8c5449b3cf0861679ce7e0503c7b44b5ec981bec0d1d3795a07f1ba96f0204d", size = 307268, upload-time = "2024-12-22T07:47:30.032Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/c8/19/4ec628951a74043532ca2cf5d97b7b14863931476d117c471e8e2b1eb39f/urllib3-2.3.0-py3-none-any.whl", hash = "sha256:1cee9ad369867bfdbbb48b7dd50374c0967a0bb7710050facf0dd6911440e3df", size = 128369 }, + { url = "https://files.pythonhosted.org/packages/c8/19/4ec628951a74043532ca2cf5d97b7b14863931476d117c471e8e2b1eb39f/urllib3-2.3.0-py3-none-any.whl", hash = "sha256:1cee9ad369867bfdbbb48b7dd50374c0967a0bb7710050facf0dd6911440e3df", size = 128369, upload-time = "2024-12-22T07:47:28.074Z" }, ] [[package]] @@ -1515,107 +1698,107 @@ dependencies = [ { name = "h11" }, { name = "typing-extensions", marker = "python_full_version < '3.11'" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/d3/f7/4ad826703a49b320a4adf2470fdd2a3481ea13f4460cb615ad12c75be003/uvicorn-0.30.0.tar.gz", hash = "sha256:f678dec4fa3a39706bbf49b9ec5fc40049d42418716cea52b53f07828a60aa37", size = 42560 } +sdist = { url = "https://files.pythonhosted.org/packages/d3/f7/4ad826703a49b320a4adf2470fdd2a3481ea13f4460cb615ad12c75be003/uvicorn-0.30.0.tar.gz", hash = "sha256:f678dec4fa3a39706bbf49b9ec5fc40049d42418716cea52b53f07828a60aa37", size = 42560, upload-time = "2024-05-28T07:20:42.231Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/2a/a1/d57e38417a8dabb22df02b6aebc209dc73485792e6c5620e501547133d0b/uvicorn-0.30.0-py3-none-any.whl", hash = "sha256:78fa0b5f56abb8562024a59041caeb555c86e48d0efdd23c3fe7de7a4075bdab", size = 62388 }, + { url = "https://files.pythonhosted.org/packages/2a/a1/d57e38417a8dabb22df02b6aebc209dc73485792e6c5620e501547133d0b/uvicorn-0.30.0-py3-none-any.whl", hash = "sha256:78fa0b5f56abb8562024a59041caeb555c86e48d0efdd23c3fe7de7a4075bdab", size = 62388, upload-time = "2024-05-28T07:20:38.256Z" }, ] [[package]] name = "watchdog" version = "6.0.0" source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/db/7d/7f3d619e951c88ed75c6037b246ddcf2d322812ee8ea189be89511721d54/watchdog-6.0.0.tar.gz", hash = "sha256:9ddf7c82fda3ae8e24decda1338ede66e1c99883db93711d8fb941eaa2d8c282", size = 131220 } -wheels = [ - { url = "https://files.pythonhosted.org/packages/0c/56/90994d789c61df619bfc5ce2ecdabd5eeff564e1eb47512bd01b5e019569/watchdog-6.0.0-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:d1cdb490583ebd691c012b3d6dae011000fe42edb7a82ece80965b42abd61f26", size = 96390 }, - { url = 
"https://files.pythonhosted.org/packages/55/46/9a67ee697342ddf3c6daa97e3a587a56d6c4052f881ed926a849fcf7371c/watchdog-6.0.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:bc64ab3bdb6a04d69d4023b29422170b74681784ffb9463ed4870cf2f3e66112", size = 88389 }, - { url = "https://files.pythonhosted.org/packages/44/65/91b0985747c52064d8701e1075eb96f8c40a79df889e59a399453adfb882/watchdog-6.0.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:c897ac1b55c5a1461e16dae288d22bb2e412ba9807df8397a635d88f671d36c3", size = 89020 }, - { url = "https://files.pythonhosted.org/packages/e0/24/d9be5cd6642a6aa68352ded4b4b10fb0d7889cb7f45814fb92cecd35f101/watchdog-6.0.0-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:6eb11feb5a0d452ee41f824e271ca311a09e250441c262ca2fd7ebcf2461a06c", size = 96393 }, - { url = "https://files.pythonhosted.org/packages/63/7a/6013b0d8dbc56adca7fdd4f0beed381c59f6752341b12fa0886fa7afc78b/watchdog-6.0.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:ef810fbf7b781a5a593894e4f439773830bdecb885e6880d957d5b9382a960d2", size = 88392 }, - { url = "https://files.pythonhosted.org/packages/d1/40/b75381494851556de56281e053700e46bff5b37bf4c7267e858640af5a7f/watchdog-6.0.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:afd0fe1b2270917c5e23c2a65ce50c2a4abb63daafb0d419fde368e272a76b7c", size = 89019 }, - { url = "https://files.pythonhosted.org/packages/39/ea/3930d07dafc9e286ed356a679aa02d777c06e9bfd1164fa7c19c288a5483/watchdog-6.0.0-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:bdd4e6f14b8b18c334febb9c4425a878a2ac20efd1e0b231978e7b150f92a948", size = 96471 }, - { url = "https://files.pythonhosted.org/packages/12/87/48361531f70b1f87928b045df868a9fd4e253d9ae087fa4cf3f7113be363/watchdog-6.0.0-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:c7c15dda13c4eb00d6fb6fc508b3c0ed88b9d5d374056b239c4ad1611125c860", size = 88449 }, - { url = "https://files.pythonhosted.org/packages/5b/7e/8f322f5e600812e6f9a31b75d242631068ca8f4ef0582dd3ae6e72daecc8/watchdog-6.0.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:6f10cb2d5902447c7d0da897e2c6768bca89174d0c6e1e30abec5421af97a5b0", size = 89054 }, - { url = "https://files.pythonhosted.org/packages/68/98/b0345cabdce2041a01293ba483333582891a3bd5769b08eceb0d406056ef/watchdog-6.0.0-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:490ab2ef84f11129844c23fb14ecf30ef3d8a6abafd3754a6f75ca1e6654136c", size = 96480 }, - { url = "https://files.pythonhosted.org/packages/85/83/cdf13902c626b28eedef7ec4f10745c52aad8a8fe7eb04ed7b1f111ca20e/watchdog-6.0.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:76aae96b00ae814b181bb25b1b98076d5fc84e8a53cd8885a318b42b6d3a5134", size = 88451 }, - { url = "https://files.pythonhosted.org/packages/fe/c4/225c87bae08c8b9ec99030cd48ae9c4eca050a59bf5c2255853e18c87b50/watchdog-6.0.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:a175f755fc2279e0b7312c0035d52e27211a5bc39719dd529625b1930917345b", size = 89057 }, - { url = "https://files.pythonhosted.org/packages/30/ad/d17b5d42e28a8b91f8ed01cb949da092827afb9995d4559fd448d0472763/watchdog-6.0.0-pp310-pypy310_pp73-macosx_10_15_x86_64.whl", hash = "sha256:c7ac31a19f4545dd92fc25d200694098f42c9a8e391bc00bdd362c5736dbf881", size = 87902 }, - { url = "https://files.pythonhosted.org/packages/5c/ca/c3649991d140ff6ab67bfc85ab42b165ead119c9e12211e08089d763ece5/watchdog-6.0.0-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:9513f27a1a582d9808cf21a07dae516f0fab1cf2d7683a742c498b93eedabb11", size = 88380 }, - { url = 
"https://files.pythonhosted.org/packages/a9/c7/ca4bf3e518cb57a686b2feb4f55a1892fd9a3dd13f470fca14e00f80ea36/watchdog-6.0.0-py3-none-manylinux2014_aarch64.whl", hash = "sha256:7607498efa04a3542ae3e05e64da8202e58159aa1fa4acddf7678d34a35d4f13", size = 79079 }, - { url = "https://files.pythonhosted.org/packages/5c/51/d46dc9332f9a647593c947b4b88e2381c8dfc0942d15b8edc0310fa4abb1/watchdog-6.0.0-py3-none-manylinux2014_armv7l.whl", hash = "sha256:9041567ee8953024c83343288ccc458fd0a2d811d6a0fd68c4c22609e3490379", size = 79078 }, - { url = "https://files.pythonhosted.org/packages/d4/57/04edbf5e169cd318d5f07b4766fee38e825d64b6913ca157ca32d1a42267/watchdog-6.0.0-py3-none-manylinux2014_i686.whl", hash = "sha256:82dc3e3143c7e38ec49d61af98d6558288c415eac98486a5c581726e0737c00e", size = 79076 }, - { url = "https://files.pythonhosted.org/packages/ab/cc/da8422b300e13cb187d2203f20b9253e91058aaf7db65b74142013478e66/watchdog-6.0.0-py3-none-manylinux2014_ppc64.whl", hash = "sha256:212ac9b8bf1161dc91bd09c048048a95ca3a4c4f5e5d4a7d1b1a7d5752a7f96f", size = 79077 }, - { url = "https://files.pythonhosted.org/packages/2c/3b/b8964e04ae1a025c44ba8e4291f86e97fac443bca31de8bd98d3263d2fcf/watchdog-6.0.0-py3-none-manylinux2014_ppc64le.whl", hash = "sha256:e3df4cbb9a450c6d49318f6d14f4bbc80d763fa587ba46ec86f99f9e6876bb26", size = 79078 }, - { url = "https://files.pythonhosted.org/packages/62/ae/a696eb424bedff7407801c257d4b1afda455fe40821a2be430e173660e81/watchdog-6.0.0-py3-none-manylinux2014_s390x.whl", hash = "sha256:2cce7cfc2008eb51feb6aab51251fd79b85d9894e98ba847408f662b3395ca3c", size = 79077 }, - { url = "https://files.pythonhosted.org/packages/b5/e8/dbf020b4d98251a9860752a094d09a65e1b436ad181faf929983f697048f/watchdog-6.0.0-py3-none-manylinux2014_x86_64.whl", hash = "sha256:20ffe5b202af80ab4266dcd3e91aae72bf2da48c0d33bdb15c66658e685e94e2", size = 79078 }, - { url = "https://files.pythonhosted.org/packages/07/f6/d0e5b343768e8bcb4cda79f0f2f55051bf26177ecd5651f84c07567461cf/watchdog-6.0.0-py3-none-win32.whl", hash = "sha256:07df1fdd701c5d4c8e55ef6cf55b8f0120fe1aef7ef39a1c6fc6bc2e606d517a", size = 79065 }, - { url = "https://files.pythonhosted.org/packages/db/d9/c495884c6e548fce18a8f40568ff120bc3a4b7b99813081c8ac0c936fa64/watchdog-6.0.0-py3-none-win_amd64.whl", hash = "sha256:cbafb470cf848d93b5d013e2ecb245d4aa1c8fd0504e863ccefa32445359d680", size = 79070 }, - { url = "https://files.pythonhosted.org/packages/33/e8/e40370e6d74ddba47f002a32919d91310d6074130fe4e17dabcafc15cbf1/watchdog-6.0.0-py3-none-win_ia64.whl", hash = "sha256:a1914259fa9e1454315171103c6a30961236f508b9b623eae470268bbcc6a22f", size = 79067 }, +sdist = { url = "https://files.pythonhosted.org/packages/db/7d/7f3d619e951c88ed75c6037b246ddcf2d322812ee8ea189be89511721d54/watchdog-6.0.0.tar.gz", hash = "sha256:9ddf7c82fda3ae8e24decda1338ede66e1c99883db93711d8fb941eaa2d8c282", size = 131220, upload-time = "2024-11-01T14:07:13.037Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/0c/56/90994d789c61df619bfc5ce2ecdabd5eeff564e1eb47512bd01b5e019569/watchdog-6.0.0-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:d1cdb490583ebd691c012b3d6dae011000fe42edb7a82ece80965b42abd61f26", size = 96390, upload-time = "2024-11-01T14:06:24.793Z" }, + { url = "https://files.pythonhosted.org/packages/55/46/9a67ee697342ddf3c6daa97e3a587a56d6c4052f881ed926a849fcf7371c/watchdog-6.0.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:bc64ab3bdb6a04d69d4023b29422170b74681784ffb9463ed4870cf2f3e66112", size = 88389, upload-time = "2024-11-01T14:06:27.112Z" }, + { url = 
"https://files.pythonhosted.org/packages/44/65/91b0985747c52064d8701e1075eb96f8c40a79df889e59a399453adfb882/watchdog-6.0.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:c897ac1b55c5a1461e16dae288d22bb2e412ba9807df8397a635d88f671d36c3", size = 89020, upload-time = "2024-11-01T14:06:29.876Z" }, + { url = "https://files.pythonhosted.org/packages/e0/24/d9be5cd6642a6aa68352ded4b4b10fb0d7889cb7f45814fb92cecd35f101/watchdog-6.0.0-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:6eb11feb5a0d452ee41f824e271ca311a09e250441c262ca2fd7ebcf2461a06c", size = 96393, upload-time = "2024-11-01T14:06:31.756Z" }, + { url = "https://files.pythonhosted.org/packages/63/7a/6013b0d8dbc56adca7fdd4f0beed381c59f6752341b12fa0886fa7afc78b/watchdog-6.0.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:ef810fbf7b781a5a593894e4f439773830bdecb885e6880d957d5b9382a960d2", size = 88392, upload-time = "2024-11-01T14:06:32.99Z" }, + { url = "https://files.pythonhosted.org/packages/d1/40/b75381494851556de56281e053700e46bff5b37bf4c7267e858640af5a7f/watchdog-6.0.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:afd0fe1b2270917c5e23c2a65ce50c2a4abb63daafb0d419fde368e272a76b7c", size = 89019, upload-time = "2024-11-01T14:06:34.963Z" }, + { url = "https://files.pythonhosted.org/packages/39/ea/3930d07dafc9e286ed356a679aa02d777c06e9bfd1164fa7c19c288a5483/watchdog-6.0.0-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:bdd4e6f14b8b18c334febb9c4425a878a2ac20efd1e0b231978e7b150f92a948", size = 96471, upload-time = "2024-11-01T14:06:37.745Z" }, + { url = "https://files.pythonhosted.org/packages/12/87/48361531f70b1f87928b045df868a9fd4e253d9ae087fa4cf3f7113be363/watchdog-6.0.0-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:c7c15dda13c4eb00d6fb6fc508b3c0ed88b9d5d374056b239c4ad1611125c860", size = 88449, upload-time = "2024-11-01T14:06:39.748Z" }, + { url = "https://files.pythonhosted.org/packages/5b/7e/8f322f5e600812e6f9a31b75d242631068ca8f4ef0582dd3ae6e72daecc8/watchdog-6.0.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:6f10cb2d5902447c7d0da897e2c6768bca89174d0c6e1e30abec5421af97a5b0", size = 89054, upload-time = "2024-11-01T14:06:41.009Z" }, + { url = "https://files.pythonhosted.org/packages/68/98/b0345cabdce2041a01293ba483333582891a3bd5769b08eceb0d406056ef/watchdog-6.0.0-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:490ab2ef84f11129844c23fb14ecf30ef3d8a6abafd3754a6f75ca1e6654136c", size = 96480, upload-time = "2024-11-01T14:06:42.952Z" }, + { url = "https://files.pythonhosted.org/packages/85/83/cdf13902c626b28eedef7ec4f10745c52aad8a8fe7eb04ed7b1f111ca20e/watchdog-6.0.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:76aae96b00ae814b181bb25b1b98076d5fc84e8a53cd8885a318b42b6d3a5134", size = 88451, upload-time = "2024-11-01T14:06:45.084Z" }, + { url = "https://files.pythonhosted.org/packages/fe/c4/225c87bae08c8b9ec99030cd48ae9c4eca050a59bf5c2255853e18c87b50/watchdog-6.0.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:a175f755fc2279e0b7312c0035d52e27211a5bc39719dd529625b1930917345b", size = 89057, upload-time = "2024-11-01T14:06:47.324Z" }, + { url = "https://files.pythonhosted.org/packages/30/ad/d17b5d42e28a8b91f8ed01cb949da092827afb9995d4559fd448d0472763/watchdog-6.0.0-pp310-pypy310_pp73-macosx_10_15_x86_64.whl", hash = "sha256:c7ac31a19f4545dd92fc25d200694098f42c9a8e391bc00bdd362c5736dbf881", size = 87902, upload-time = "2024-11-01T14:06:53.119Z" }, + { url = 
"https://files.pythonhosted.org/packages/5c/ca/c3649991d140ff6ab67bfc85ab42b165ead119c9e12211e08089d763ece5/watchdog-6.0.0-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:9513f27a1a582d9808cf21a07dae516f0fab1cf2d7683a742c498b93eedabb11", size = 88380, upload-time = "2024-11-01T14:06:55.19Z" }, + { url = "https://files.pythonhosted.org/packages/a9/c7/ca4bf3e518cb57a686b2feb4f55a1892fd9a3dd13f470fca14e00f80ea36/watchdog-6.0.0-py3-none-manylinux2014_aarch64.whl", hash = "sha256:7607498efa04a3542ae3e05e64da8202e58159aa1fa4acddf7678d34a35d4f13", size = 79079, upload-time = "2024-11-01T14:06:59.472Z" }, + { url = "https://files.pythonhosted.org/packages/5c/51/d46dc9332f9a647593c947b4b88e2381c8dfc0942d15b8edc0310fa4abb1/watchdog-6.0.0-py3-none-manylinux2014_armv7l.whl", hash = "sha256:9041567ee8953024c83343288ccc458fd0a2d811d6a0fd68c4c22609e3490379", size = 79078, upload-time = "2024-11-01T14:07:01.431Z" }, + { url = "https://files.pythonhosted.org/packages/d4/57/04edbf5e169cd318d5f07b4766fee38e825d64b6913ca157ca32d1a42267/watchdog-6.0.0-py3-none-manylinux2014_i686.whl", hash = "sha256:82dc3e3143c7e38ec49d61af98d6558288c415eac98486a5c581726e0737c00e", size = 79076, upload-time = "2024-11-01T14:07:02.568Z" }, + { url = "https://files.pythonhosted.org/packages/ab/cc/da8422b300e13cb187d2203f20b9253e91058aaf7db65b74142013478e66/watchdog-6.0.0-py3-none-manylinux2014_ppc64.whl", hash = "sha256:212ac9b8bf1161dc91bd09c048048a95ca3a4c4f5e5d4a7d1b1a7d5752a7f96f", size = 79077, upload-time = "2024-11-01T14:07:03.893Z" }, + { url = "https://files.pythonhosted.org/packages/2c/3b/b8964e04ae1a025c44ba8e4291f86e97fac443bca31de8bd98d3263d2fcf/watchdog-6.0.0-py3-none-manylinux2014_ppc64le.whl", hash = "sha256:e3df4cbb9a450c6d49318f6d14f4bbc80d763fa587ba46ec86f99f9e6876bb26", size = 79078, upload-time = "2024-11-01T14:07:05.189Z" }, + { url = "https://files.pythonhosted.org/packages/62/ae/a696eb424bedff7407801c257d4b1afda455fe40821a2be430e173660e81/watchdog-6.0.0-py3-none-manylinux2014_s390x.whl", hash = "sha256:2cce7cfc2008eb51feb6aab51251fd79b85d9894e98ba847408f662b3395ca3c", size = 79077, upload-time = "2024-11-01T14:07:06.376Z" }, + { url = "https://files.pythonhosted.org/packages/b5/e8/dbf020b4d98251a9860752a094d09a65e1b436ad181faf929983f697048f/watchdog-6.0.0-py3-none-manylinux2014_x86_64.whl", hash = "sha256:20ffe5b202af80ab4266dcd3e91aae72bf2da48c0d33bdb15c66658e685e94e2", size = 79078, upload-time = "2024-11-01T14:07:07.547Z" }, + { url = "https://files.pythonhosted.org/packages/07/f6/d0e5b343768e8bcb4cda79f0f2f55051bf26177ecd5651f84c07567461cf/watchdog-6.0.0-py3-none-win32.whl", hash = "sha256:07df1fdd701c5d4c8e55ef6cf55b8f0120fe1aef7ef39a1c6fc6bc2e606d517a", size = 79065, upload-time = "2024-11-01T14:07:09.525Z" }, + { url = "https://files.pythonhosted.org/packages/db/d9/c495884c6e548fce18a8f40568ff120bc3a4b7b99813081c8ac0c936fa64/watchdog-6.0.0-py3-none-win_amd64.whl", hash = "sha256:cbafb470cf848d93b5d013e2ecb245d4aa1c8fd0504e863ccefa32445359d680", size = 79070, upload-time = "2024-11-01T14:07:10.686Z" }, + { url = "https://files.pythonhosted.org/packages/33/e8/e40370e6d74ddba47f002a32919d91310d6074130fe4e17dabcafc15cbf1/watchdog-6.0.0-py3-none-win_ia64.whl", hash = "sha256:a1914259fa9e1454315171103c6a30961236f508b9b623eae470268bbcc6a22f", size = 79067, upload-time = "2024-11-01T14:07:11.845Z" }, ] [[package]] name = "webencodings" version = "0.5.1" source = { registry = "https://pypi.org/simple" } -sdist = { url = 
"https://files.pythonhosted.org/packages/0b/02/ae6ceac1baeda530866a85075641cec12989bd8d31af6d5ab4a3e8c92f47/webencodings-0.5.1.tar.gz", hash = "sha256:b36a1c245f2d304965eb4e0a82848379241dc04b865afcc4aab16748587e1923", size = 9721 } +sdist = { url = "https://files.pythonhosted.org/packages/0b/02/ae6ceac1baeda530866a85075641cec12989bd8d31af6d5ab4a3e8c92f47/webencodings-0.5.1.tar.gz", hash = "sha256:b36a1c245f2d304965eb4e0a82848379241dc04b865afcc4aab16748587e1923", size = 9721, upload-time = "2017-04-05T20:21:34.189Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/f4/24/2a3e3df732393fed8b3ebf2ec078f05546de641fe1b667ee316ec1dcf3b7/webencodings-0.5.1-py2.py3-none-any.whl", hash = "sha256:a0af1213f3c2226497a97e2b3aa01a7e4bee4f403f95be16fc9acd2947514a78", size = 11774 }, + { url = "https://files.pythonhosted.org/packages/f4/24/2a3e3df732393fed8b3ebf2ec078f05546de641fe1b667ee316ec1dcf3b7/webencodings-0.5.1-py2.py3-none-any.whl", hash = "sha256:a0af1213f3c2226497a97e2b3aa01a7e4bee4f403f95be16fc9acd2947514a78", size = 11774, upload-time = "2017-04-05T20:21:32.581Z" }, ] [[package]] name = "websockets" version = "15.0.1" source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/21/e6/26d09fab466b7ca9c7737474c52be4f76a40301b08362eb2dbc19dcc16c1/websockets-15.0.1.tar.gz", hash = "sha256:82544de02076bafba038ce055ee6412d68da13ab47f0c60cab827346de828dee", size = 177016 } -wheels = [ - { url = "https://files.pythonhosted.org/packages/1e/da/6462a9f510c0c49837bbc9345aca92d767a56c1fb2939e1579df1e1cdcf7/websockets-15.0.1-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:d63efaa0cd96cf0c5fe4d581521d9fa87744540d4bc999ae6e08595a1014b45b", size = 175423 }, - { url = "https://files.pythonhosted.org/packages/1c/9f/9d11c1a4eb046a9e106483b9ff69bce7ac880443f00e5ce64261b47b07e7/websockets-15.0.1-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:ac60e3b188ec7574cb761b08d50fcedf9d77f1530352db4eef1707fe9dee7205", size = 173080 }, - { url = "https://files.pythonhosted.org/packages/d5/4f/b462242432d93ea45f297b6179c7333dd0402b855a912a04e7fc61c0d71f/websockets-15.0.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:5756779642579d902eed757b21b0164cd6fe338506a8083eb58af5c372e39d9a", size = 173329 }, - { url = "https://files.pythonhosted.org/packages/6e/0c/6afa1f4644d7ed50284ac59cc70ef8abd44ccf7d45850d989ea7310538d0/websockets-15.0.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0fdfe3e2a29e4db3659dbd5bbf04560cea53dd9610273917799f1cde46aa725e", size = 182312 }, - { url = "https://files.pythonhosted.org/packages/dd/d4/ffc8bd1350b229ca7a4db2a3e1c482cf87cea1baccd0ef3e72bc720caeec/websockets-15.0.1-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:4c2529b320eb9e35af0fa3016c187dffb84a3ecc572bcee7c3ce302bfeba52bf", size = 181319 }, - { url = "https://files.pythonhosted.org/packages/97/3a/5323a6bb94917af13bbb34009fac01e55c51dfde354f63692bf2533ffbc2/websockets-15.0.1-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ac1e5c9054fe23226fb11e05a6e630837f074174c4c2f0fe442996112a6de4fb", size = 181631 }, - { url = "https://files.pythonhosted.org/packages/a6/cc/1aeb0f7cee59ef065724041bb7ed667b6ab1eeffe5141696cccec2687b66/websockets-15.0.1-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:5df592cd503496351d6dc14f7cdad49f268d8e618f80dce0cd5a36b93c3fc08d", size = 182016 }, - { url = 
"https://files.pythonhosted.org/packages/79/f9/c86f8f7af208e4161a7f7e02774e9d0a81c632ae76db2ff22549e1718a51/websockets-15.0.1-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:0a34631031a8f05657e8e90903e656959234f3a04552259458aac0b0f9ae6fd9", size = 181426 }, - { url = "https://files.pythonhosted.org/packages/c7/b9/828b0bc6753db905b91df6ae477c0b14a141090df64fb17f8a9d7e3516cf/websockets-15.0.1-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:3d00075aa65772e7ce9e990cab3ff1de702aa09be3940d1dc88d5abf1ab8a09c", size = 181360 }, - { url = "https://files.pythonhosted.org/packages/89/fb/250f5533ec468ba6327055b7d98b9df056fb1ce623b8b6aaafb30b55d02e/websockets-15.0.1-cp310-cp310-win32.whl", hash = "sha256:1234d4ef35db82f5446dca8e35a7da7964d02c127b095e172e54397fb6a6c256", size = 176388 }, - { url = "https://files.pythonhosted.org/packages/1c/46/aca7082012768bb98e5608f01658ff3ac8437e563eca41cf068bd5849a5e/websockets-15.0.1-cp310-cp310-win_amd64.whl", hash = "sha256:39c1fec2c11dc8d89bba6b2bf1556af381611a173ac2b511cf7231622058af41", size = 176830 }, - { url = "https://files.pythonhosted.org/packages/9f/32/18fcd5919c293a398db67443acd33fde142f283853076049824fc58e6f75/websockets-15.0.1-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:823c248b690b2fd9303ba00c4f66cd5e2d8c3ba4aa968b2779be9532a4dad431", size = 175423 }, - { url = "https://files.pythonhosted.org/packages/76/70/ba1ad96b07869275ef42e2ce21f07a5b0148936688c2baf7e4a1f60d5058/websockets-15.0.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:678999709e68425ae2593acf2e3ebcbcf2e69885a5ee78f9eb80e6e371f1bf57", size = 173082 }, - { url = "https://files.pythonhosted.org/packages/86/f2/10b55821dd40eb696ce4704a87d57774696f9451108cff0d2824c97e0f97/websockets-15.0.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:d50fd1ee42388dcfb2b3676132c78116490976f1300da28eb629272d5d93e905", size = 173330 }, - { url = "https://files.pythonhosted.org/packages/a5/90/1c37ae8b8a113d3daf1065222b6af61cc44102da95388ac0018fcb7d93d9/websockets-15.0.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d99e5546bf73dbad5bf3547174cd6cb8ba7273062a23808ffea025ecb1cf8562", size = 182878 }, - { url = "https://files.pythonhosted.org/packages/8e/8d/96e8e288b2a41dffafb78e8904ea7367ee4f891dafc2ab8d87e2124cb3d3/websockets-15.0.1-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:66dd88c918e3287efc22409d426c8f729688d89a0c587c88971a0faa2c2f3792", size = 181883 }, - { url = "https://files.pythonhosted.org/packages/93/1f/5d6dbf551766308f6f50f8baf8e9860be6182911e8106da7a7f73785f4c4/websockets-15.0.1-cp311-cp311-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8dd8327c795b3e3f219760fa603dcae1dcc148172290a8ab15158cf85a953413", size = 182252 }, - { url = "https://files.pythonhosted.org/packages/d4/78/2d4fed9123e6620cbf1706c0de8a1632e1a28e7774d94346d7de1bba2ca3/websockets-15.0.1-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:8fdc51055e6ff4adeb88d58a11042ec9a5eae317a0a53d12c062c8a8865909e8", size = 182521 }, - { url = "https://files.pythonhosted.org/packages/e7/3b/66d4c1b444dd1a9823c4a81f50231b921bab54eee2f69e70319b4e21f1ca/websockets-15.0.1-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:693f0192126df6c2327cce3baa7c06f2a117575e32ab2308f7f8216c29d9e2e3", size = 181958 }, - { url = 
"https://files.pythonhosted.org/packages/08/ff/e9eed2ee5fed6f76fdd6032ca5cd38c57ca9661430bb3d5fb2872dc8703c/websockets-15.0.1-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:54479983bd5fb469c38f2f5c7e3a24f9a4e70594cd68cd1fa6b9340dadaff7cf", size = 181918 }, - { url = "https://files.pythonhosted.org/packages/d8/75/994634a49b7e12532be6a42103597b71098fd25900f7437d6055ed39930a/websockets-15.0.1-cp311-cp311-win32.whl", hash = "sha256:16b6c1b3e57799b9d38427dda63edcbe4926352c47cf88588c0be4ace18dac85", size = 176388 }, - { url = "https://files.pythonhosted.org/packages/98/93/e36c73f78400a65f5e236cd376713c34182e6663f6889cd45a4a04d8f203/websockets-15.0.1-cp311-cp311-win_amd64.whl", hash = "sha256:27ccee0071a0e75d22cb35849b1db43f2ecd3e161041ac1ee9d2352ddf72f065", size = 176828 }, - { url = "https://files.pythonhosted.org/packages/51/6b/4545a0d843594f5d0771e86463606a3988b5a09ca5123136f8a76580dd63/websockets-15.0.1-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:3e90baa811a5d73f3ca0bcbf32064d663ed81318ab225ee4f427ad4e26e5aff3", size = 175437 }, - { url = "https://files.pythonhosted.org/packages/f4/71/809a0f5f6a06522af902e0f2ea2757f71ead94610010cf570ab5c98e99ed/websockets-15.0.1-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:592f1a9fe869c778694f0aa806ba0374e97648ab57936f092fd9d87f8bc03665", size = 173096 }, - { url = "https://files.pythonhosted.org/packages/3d/69/1a681dd6f02180916f116894181eab8b2e25b31e484c5d0eae637ec01f7c/websockets-15.0.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:0701bc3cfcb9164d04a14b149fd74be7347a530ad3bbf15ab2c678a2cd3dd9a2", size = 173332 }, - { url = "https://files.pythonhosted.org/packages/a6/02/0073b3952f5bce97eafbb35757f8d0d54812b6174ed8dd952aa08429bcc3/websockets-15.0.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e8b56bdcdb4505c8078cb6c7157d9811a85790f2f2b3632c7d1462ab5783d215", size = 183152 }, - { url = "https://files.pythonhosted.org/packages/74/45/c205c8480eafd114b428284840da0b1be9ffd0e4f87338dc95dc6ff961a1/websockets-15.0.1-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:0af68c55afbd5f07986df82831c7bff04846928ea8d1fd7f30052638788bc9b5", size = 182096 }, - { url = "https://files.pythonhosted.org/packages/14/8f/aa61f528fba38578ec553c145857a181384c72b98156f858ca5c8e82d9d3/websockets-15.0.1-cp312-cp312-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:64dee438fed052b52e4f98f76c5790513235efaa1ef7f3f2192c392cd7c91b65", size = 182523 }, - { url = "https://files.pythonhosted.org/packages/ec/6d/0267396610add5bc0d0d3e77f546d4cd287200804fe02323797de77dbce9/websockets-15.0.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:d5f6b181bb38171a8ad1d6aa58a67a6aa9d4b38d0f8c5f496b9e42561dfc62fe", size = 182790 }, - { url = "https://files.pythonhosted.org/packages/02/05/c68c5adbf679cf610ae2f74a9b871ae84564462955d991178f95a1ddb7dd/websockets-15.0.1-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:5d54b09eba2bada6011aea5375542a157637b91029687eb4fdb2dab11059c1b4", size = 182165 }, - { url = "https://files.pythonhosted.org/packages/29/93/bb672df7b2f5faac89761cb5fa34f5cec45a4026c383a4b5761c6cea5c16/websockets-15.0.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:3be571a8b5afed347da347bfcf27ba12b069d9d7f42cb8c7028b5e98bbb12597", size = 182160 }, - { url = "https://files.pythonhosted.org/packages/ff/83/de1f7709376dc3ca9b7eeb4b9a07b4526b14876b6d372a4dc62312bebee0/websockets-15.0.1-cp312-cp312-win32.whl", hash = 
"sha256:c338ffa0520bdb12fbc527265235639fb76e7bc7faafbb93f6ba80d9c06578a9", size = 176395 }, - { url = "https://files.pythonhosted.org/packages/7d/71/abf2ebc3bbfa40f391ce1428c7168fb20582d0ff57019b69ea20fa698043/websockets-15.0.1-cp312-cp312-win_amd64.whl", hash = "sha256:fcd5cf9e305d7b8338754470cf69cf81f420459dbae8a3b40cee57417f4614a7", size = 176841 }, - { url = "https://files.pythonhosted.org/packages/cb/9f/51f0cf64471a9d2b4d0fc6c534f323b664e7095640c34562f5182e5a7195/websockets-15.0.1-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:ee443ef070bb3b6ed74514f5efaa37a252af57c90eb33b956d35c8e9c10a1931", size = 175440 }, - { url = "https://files.pythonhosted.org/packages/8a/05/aa116ec9943c718905997412c5989f7ed671bc0188ee2ba89520e8765d7b/websockets-15.0.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:5a939de6b7b4e18ca683218320fc67ea886038265fd1ed30173f5ce3f8e85675", size = 173098 }, - { url = "https://files.pythonhosted.org/packages/ff/0b/33cef55ff24f2d92924923c99926dcce78e7bd922d649467f0eda8368923/websockets-15.0.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:746ee8dba912cd6fc889a8147168991d50ed70447bf18bcda7039f7d2e3d9151", size = 173329 }, - { url = "https://files.pythonhosted.org/packages/31/1d/063b25dcc01faa8fada1469bdf769de3768b7044eac9d41f734fd7b6ad6d/websockets-15.0.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:595b6c3969023ecf9041b2936ac3827e4623bfa3ccf007575f04c5a6aa318c22", size = 183111 }, - { url = "https://files.pythonhosted.org/packages/93/53/9a87ee494a51bf63e4ec9241c1ccc4f7c2f45fff85d5bde2ff74fcb68b9e/websockets-15.0.1-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:3c714d2fc58b5ca3e285461a4cc0c9a66bd0e24c5da9911e30158286c9b5be7f", size = 182054 }, - { url = "https://files.pythonhosted.org/packages/ff/b2/83a6ddf56cdcbad4e3d841fcc55d6ba7d19aeb89c50f24dd7e859ec0805f/websockets-15.0.1-cp313-cp313-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0f3c1e2ab208db911594ae5b4f79addeb3501604a165019dd221c0bdcabe4db8", size = 182496 }, - { url = "https://files.pythonhosted.org/packages/98/41/e7038944ed0abf34c45aa4635ba28136f06052e08fc2168520bb8b25149f/websockets-15.0.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:229cf1d3ca6c1804400b0a9790dc66528e08a6a1feec0d5040e8b9eb14422375", size = 182829 }, - { url = "https://files.pythonhosted.org/packages/e0/17/de15b6158680c7623c6ef0db361da965ab25d813ae54fcfeae2e5b9ef910/websockets-15.0.1-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:756c56e867a90fb00177d530dca4b097dd753cde348448a1012ed6c5131f8b7d", size = 182217 }, - { url = "https://files.pythonhosted.org/packages/33/2b/1f168cb6041853eef0362fb9554c3824367c5560cbdaad89ac40f8c2edfc/websockets-15.0.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:558d023b3df0bffe50a04e710bc87742de35060580a293c2a984299ed83bc4e4", size = 182195 }, - { url = "https://files.pythonhosted.org/packages/86/eb/20b6cdf273913d0ad05a6a14aed4b9a85591c18a987a3d47f20fa13dcc47/websockets-15.0.1-cp313-cp313-win32.whl", hash = "sha256:ba9e56e8ceeeedb2e080147ba85ffcd5cd0711b89576b83784d8605a7df455fa", size = 176393 }, - { url = "https://files.pythonhosted.org/packages/1b/6c/c65773d6cab416a64d191d6ee8a8b1c68a09970ea6909d16965d26bfed1e/websockets-15.0.1-cp313-cp313-win_amd64.whl", hash = "sha256:e09473f095a819042ecb2ab9465aee615bd9c2028e4ef7d933600a8401c79561", size = 176837 }, - { url = 
"https://files.pythonhosted.org/packages/02/9e/d40f779fa16f74d3468357197af8d6ad07e7c5a27ea1ca74ceb38986f77a/websockets-15.0.1-pp310-pypy310_pp73-macosx_10_15_x86_64.whl", hash = "sha256:0c9e74d766f2818bb95f84c25be4dea09841ac0f734d1966f415e4edfc4ef1c3", size = 173109 }, - { url = "https://files.pythonhosted.org/packages/bc/cd/5b887b8585a593073fd92f7c23ecd3985cd2c3175025a91b0d69b0551372/websockets-15.0.1-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:1009ee0c7739c08a0cd59de430d6de452a55e42d6b522de7aa15e6f67db0b8e1", size = 173343 }, - { url = "https://files.pythonhosted.org/packages/fe/ae/d34f7556890341e900a95acf4886833646306269f899d58ad62f588bf410/websockets-15.0.1-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:76d1f20b1c7a2fa82367e04982e708723ba0e7b8d43aa643d3dcd404d74f1475", size = 174599 }, - { url = "https://files.pythonhosted.org/packages/71/e6/5fd43993a87db364ec60fc1d608273a1a465c0caba69176dd160e197ce42/websockets-15.0.1-pp310-pypy310_pp73-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:f29d80eb9a9263b8d109135351caf568cc3f80b9928bccde535c235de55c22d9", size = 174207 }, - { url = "https://files.pythonhosted.org/packages/2b/fb/c492d6daa5ec067c2988ac80c61359ace5c4c674c532985ac5a123436cec/websockets-15.0.1-pp310-pypy310_pp73-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b359ed09954d7c18bbc1680f380c7301f92c60bf924171629c5db97febb12f04", size = 174155 }, - { url = "https://files.pythonhosted.org/packages/68/a1/dcb68430b1d00b698ae7a7e0194433bce4f07ded185f0ee5fb21e2a2e91e/websockets-15.0.1-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:cad21560da69f4ce7658ca2cb83138fb4cf695a2ba3e475e0559e05991aa8122", size = 176884 }, - { url = "https://files.pythonhosted.org/packages/fa/a8/5b41e0da817d64113292ab1f8247140aac61cbf6cfd085d6a0fa77f4984f/websockets-15.0.1-py3-none-any.whl", hash = "sha256:f7a866fbc1e97b5c617ee4116daaa09b722101d4a3c170c787450ba409f9736f", size = 169743 }, +sdist = { url = "https://files.pythonhosted.org/packages/21/e6/26d09fab466b7ca9c7737474c52be4f76a40301b08362eb2dbc19dcc16c1/websockets-15.0.1.tar.gz", hash = "sha256:82544de02076bafba038ce055ee6412d68da13ab47f0c60cab827346de828dee", size = 177016, upload-time = "2025-03-05T20:03:41.606Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/1e/da/6462a9f510c0c49837bbc9345aca92d767a56c1fb2939e1579df1e1cdcf7/websockets-15.0.1-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:d63efaa0cd96cf0c5fe4d581521d9fa87744540d4bc999ae6e08595a1014b45b", size = 175423, upload-time = "2025-03-05T20:01:35.363Z" }, + { url = "https://files.pythonhosted.org/packages/1c/9f/9d11c1a4eb046a9e106483b9ff69bce7ac880443f00e5ce64261b47b07e7/websockets-15.0.1-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:ac60e3b188ec7574cb761b08d50fcedf9d77f1530352db4eef1707fe9dee7205", size = 173080, upload-time = "2025-03-05T20:01:37.304Z" }, + { url = "https://files.pythonhosted.org/packages/d5/4f/b462242432d93ea45f297b6179c7333dd0402b855a912a04e7fc61c0d71f/websockets-15.0.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:5756779642579d902eed757b21b0164cd6fe338506a8083eb58af5c372e39d9a", size = 173329, upload-time = "2025-03-05T20:01:39.668Z" }, + { url = "https://files.pythonhosted.org/packages/6e/0c/6afa1f4644d7ed50284ac59cc70ef8abd44ccf7d45850d989ea7310538d0/websockets-15.0.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = 
"sha256:0fdfe3e2a29e4db3659dbd5bbf04560cea53dd9610273917799f1cde46aa725e", size = 182312, upload-time = "2025-03-05T20:01:41.815Z" }, + { url = "https://files.pythonhosted.org/packages/dd/d4/ffc8bd1350b229ca7a4db2a3e1c482cf87cea1baccd0ef3e72bc720caeec/websockets-15.0.1-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:4c2529b320eb9e35af0fa3016c187dffb84a3ecc572bcee7c3ce302bfeba52bf", size = 181319, upload-time = "2025-03-05T20:01:43.967Z" }, + { url = "https://files.pythonhosted.org/packages/97/3a/5323a6bb94917af13bbb34009fac01e55c51dfde354f63692bf2533ffbc2/websockets-15.0.1-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ac1e5c9054fe23226fb11e05a6e630837f074174c4c2f0fe442996112a6de4fb", size = 181631, upload-time = "2025-03-05T20:01:46.104Z" }, + { url = "https://files.pythonhosted.org/packages/a6/cc/1aeb0f7cee59ef065724041bb7ed667b6ab1eeffe5141696cccec2687b66/websockets-15.0.1-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:5df592cd503496351d6dc14f7cdad49f268d8e618f80dce0cd5a36b93c3fc08d", size = 182016, upload-time = "2025-03-05T20:01:47.603Z" }, + { url = "https://files.pythonhosted.org/packages/79/f9/c86f8f7af208e4161a7f7e02774e9d0a81c632ae76db2ff22549e1718a51/websockets-15.0.1-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:0a34631031a8f05657e8e90903e656959234f3a04552259458aac0b0f9ae6fd9", size = 181426, upload-time = "2025-03-05T20:01:48.949Z" }, + { url = "https://files.pythonhosted.org/packages/c7/b9/828b0bc6753db905b91df6ae477c0b14a141090df64fb17f8a9d7e3516cf/websockets-15.0.1-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:3d00075aa65772e7ce9e990cab3ff1de702aa09be3940d1dc88d5abf1ab8a09c", size = 181360, upload-time = "2025-03-05T20:01:50.938Z" }, + { url = "https://files.pythonhosted.org/packages/89/fb/250f5533ec468ba6327055b7d98b9df056fb1ce623b8b6aaafb30b55d02e/websockets-15.0.1-cp310-cp310-win32.whl", hash = "sha256:1234d4ef35db82f5446dca8e35a7da7964d02c127b095e172e54397fb6a6c256", size = 176388, upload-time = "2025-03-05T20:01:52.213Z" }, + { url = "https://files.pythonhosted.org/packages/1c/46/aca7082012768bb98e5608f01658ff3ac8437e563eca41cf068bd5849a5e/websockets-15.0.1-cp310-cp310-win_amd64.whl", hash = "sha256:39c1fec2c11dc8d89bba6b2bf1556af381611a173ac2b511cf7231622058af41", size = 176830, upload-time = "2025-03-05T20:01:53.922Z" }, + { url = "https://files.pythonhosted.org/packages/9f/32/18fcd5919c293a398db67443acd33fde142f283853076049824fc58e6f75/websockets-15.0.1-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:823c248b690b2fd9303ba00c4f66cd5e2d8c3ba4aa968b2779be9532a4dad431", size = 175423, upload-time = "2025-03-05T20:01:56.276Z" }, + { url = "https://files.pythonhosted.org/packages/76/70/ba1ad96b07869275ef42e2ce21f07a5b0148936688c2baf7e4a1f60d5058/websockets-15.0.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:678999709e68425ae2593acf2e3ebcbcf2e69885a5ee78f9eb80e6e371f1bf57", size = 173082, upload-time = "2025-03-05T20:01:57.563Z" }, + { url = "https://files.pythonhosted.org/packages/86/f2/10b55821dd40eb696ce4704a87d57774696f9451108cff0d2824c97e0f97/websockets-15.0.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:d50fd1ee42388dcfb2b3676132c78116490976f1300da28eb629272d5d93e905", size = 173330, upload-time = "2025-03-05T20:01:59.063Z" }, + { url = 
"https://files.pythonhosted.org/packages/a5/90/1c37ae8b8a113d3daf1065222b6af61cc44102da95388ac0018fcb7d93d9/websockets-15.0.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d99e5546bf73dbad5bf3547174cd6cb8ba7273062a23808ffea025ecb1cf8562", size = 182878, upload-time = "2025-03-05T20:02:00.305Z" }, + { url = "https://files.pythonhosted.org/packages/8e/8d/96e8e288b2a41dffafb78e8904ea7367ee4f891dafc2ab8d87e2124cb3d3/websockets-15.0.1-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:66dd88c918e3287efc22409d426c8f729688d89a0c587c88971a0faa2c2f3792", size = 181883, upload-time = "2025-03-05T20:02:03.148Z" }, + { url = "https://files.pythonhosted.org/packages/93/1f/5d6dbf551766308f6f50f8baf8e9860be6182911e8106da7a7f73785f4c4/websockets-15.0.1-cp311-cp311-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8dd8327c795b3e3f219760fa603dcae1dcc148172290a8ab15158cf85a953413", size = 182252, upload-time = "2025-03-05T20:02:05.29Z" }, + { url = "https://files.pythonhosted.org/packages/d4/78/2d4fed9123e6620cbf1706c0de8a1632e1a28e7774d94346d7de1bba2ca3/websockets-15.0.1-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:8fdc51055e6ff4adeb88d58a11042ec9a5eae317a0a53d12c062c8a8865909e8", size = 182521, upload-time = "2025-03-05T20:02:07.458Z" }, + { url = "https://files.pythonhosted.org/packages/e7/3b/66d4c1b444dd1a9823c4a81f50231b921bab54eee2f69e70319b4e21f1ca/websockets-15.0.1-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:693f0192126df6c2327cce3baa7c06f2a117575e32ab2308f7f8216c29d9e2e3", size = 181958, upload-time = "2025-03-05T20:02:09.842Z" }, + { url = "https://files.pythonhosted.org/packages/08/ff/e9eed2ee5fed6f76fdd6032ca5cd38c57ca9661430bb3d5fb2872dc8703c/websockets-15.0.1-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:54479983bd5fb469c38f2f5c7e3a24f9a4e70594cd68cd1fa6b9340dadaff7cf", size = 181918, upload-time = "2025-03-05T20:02:11.968Z" }, + { url = "https://files.pythonhosted.org/packages/d8/75/994634a49b7e12532be6a42103597b71098fd25900f7437d6055ed39930a/websockets-15.0.1-cp311-cp311-win32.whl", hash = "sha256:16b6c1b3e57799b9d38427dda63edcbe4926352c47cf88588c0be4ace18dac85", size = 176388, upload-time = "2025-03-05T20:02:13.32Z" }, + { url = "https://files.pythonhosted.org/packages/98/93/e36c73f78400a65f5e236cd376713c34182e6663f6889cd45a4a04d8f203/websockets-15.0.1-cp311-cp311-win_amd64.whl", hash = "sha256:27ccee0071a0e75d22cb35849b1db43f2ecd3e161041ac1ee9d2352ddf72f065", size = 176828, upload-time = "2025-03-05T20:02:14.585Z" }, + { url = "https://files.pythonhosted.org/packages/51/6b/4545a0d843594f5d0771e86463606a3988b5a09ca5123136f8a76580dd63/websockets-15.0.1-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:3e90baa811a5d73f3ca0bcbf32064d663ed81318ab225ee4f427ad4e26e5aff3", size = 175437, upload-time = "2025-03-05T20:02:16.706Z" }, + { url = "https://files.pythonhosted.org/packages/f4/71/809a0f5f6a06522af902e0f2ea2757f71ead94610010cf570ab5c98e99ed/websockets-15.0.1-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:592f1a9fe869c778694f0aa806ba0374e97648ab57936f092fd9d87f8bc03665", size = 173096, upload-time = "2025-03-05T20:02:18.832Z" }, + { url = "https://files.pythonhosted.org/packages/3d/69/1a681dd6f02180916f116894181eab8b2e25b31e484c5d0eae637ec01f7c/websockets-15.0.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:0701bc3cfcb9164d04a14b149fd74be7347a530ad3bbf15ab2c678a2cd3dd9a2", size = 173332, upload-time = 
"2025-03-05T20:02:20.187Z" }, + { url = "https://files.pythonhosted.org/packages/a6/02/0073b3952f5bce97eafbb35757f8d0d54812b6174ed8dd952aa08429bcc3/websockets-15.0.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e8b56bdcdb4505c8078cb6c7157d9811a85790f2f2b3632c7d1462ab5783d215", size = 183152, upload-time = "2025-03-05T20:02:22.286Z" }, + { url = "https://files.pythonhosted.org/packages/74/45/c205c8480eafd114b428284840da0b1be9ffd0e4f87338dc95dc6ff961a1/websockets-15.0.1-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:0af68c55afbd5f07986df82831c7bff04846928ea8d1fd7f30052638788bc9b5", size = 182096, upload-time = "2025-03-05T20:02:24.368Z" }, + { url = "https://files.pythonhosted.org/packages/14/8f/aa61f528fba38578ec553c145857a181384c72b98156f858ca5c8e82d9d3/websockets-15.0.1-cp312-cp312-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:64dee438fed052b52e4f98f76c5790513235efaa1ef7f3f2192c392cd7c91b65", size = 182523, upload-time = "2025-03-05T20:02:25.669Z" }, + { url = "https://files.pythonhosted.org/packages/ec/6d/0267396610add5bc0d0d3e77f546d4cd287200804fe02323797de77dbce9/websockets-15.0.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:d5f6b181bb38171a8ad1d6aa58a67a6aa9d4b38d0f8c5f496b9e42561dfc62fe", size = 182790, upload-time = "2025-03-05T20:02:26.99Z" }, + { url = "https://files.pythonhosted.org/packages/02/05/c68c5adbf679cf610ae2f74a9b871ae84564462955d991178f95a1ddb7dd/websockets-15.0.1-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:5d54b09eba2bada6011aea5375542a157637b91029687eb4fdb2dab11059c1b4", size = 182165, upload-time = "2025-03-05T20:02:30.291Z" }, + { url = "https://files.pythonhosted.org/packages/29/93/bb672df7b2f5faac89761cb5fa34f5cec45a4026c383a4b5761c6cea5c16/websockets-15.0.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:3be571a8b5afed347da347bfcf27ba12b069d9d7f42cb8c7028b5e98bbb12597", size = 182160, upload-time = "2025-03-05T20:02:31.634Z" }, + { url = "https://files.pythonhosted.org/packages/ff/83/de1f7709376dc3ca9b7eeb4b9a07b4526b14876b6d372a4dc62312bebee0/websockets-15.0.1-cp312-cp312-win32.whl", hash = "sha256:c338ffa0520bdb12fbc527265235639fb76e7bc7faafbb93f6ba80d9c06578a9", size = 176395, upload-time = "2025-03-05T20:02:33.017Z" }, + { url = "https://files.pythonhosted.org/packages/7d/71/abf2ebc3bbfa40f391ce1428c7168fb20582d0ff57019b69ea20fa698043/websockets-15.0.1-cp312-cp312-win_amd64.whl", hash = "sha256:fcd5cf9e305d7b8338754470cf69cf81f420459dbae8a3b40cee57417f4614a7", size = 176841, upload-time = "2025-03-05T20:02:34.498Z" }, + { url = "https://files.pythonhosted.org/packages/cb/9f/51f0cf64471a9d2b4d0fc6c534f323b664e7095640c34562f5182e5a7195/websockets-15.0.1-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:ee443ef070bb3b6ed74514f5efaa37a252af57c90eb33b956d35c8e9c10a1931", size = 175440, upload-time = "2025-03-05T20:02:36.695Z" }, + { url = "https://files.pythonhosted.org/packages/8a/05/aa116ec9943c718905997412c5989f7ed671bc0188ee2ba89520e8765d7b/websockets-15.0.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:5a939de6b7b4e18ca683218320fc67ea886038265fd1ed30173f5ce3f8e85675", size = 173098, upload-time = "2025-03-05T20:02:37.985Z" }, + { url = "https://files.pythonhosted.org/packages/ff/0b/33cef55ff24f2d92924923c99926dcce78e7bd922d649467f0eda8368923/websockets-15.0.1-cp313-cp313-macosx_11_0_arm64.whl", hash = 
"sha256:746ee8dba912cd6fc889a8147168991d50ed70447bf18bcda7039f7d2e3d9151", size = 173329, upload-time = "2025-03-05T20:02:39.298Z" }, + { url = "https://files.pythonhosted.org/packages/31/1d/063b25dcc01faa8fada1469bdf769de3768b7044eac9d41f734fd7b6ad6d/websockets-15.0.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:595b6c3969023ecf9041b2936ac3827e4623bfa3ccf007575f04c5a6aa318c22", size = 183111, upload-time = "2025-03-05T20:02:40.595Z" }, + { url = "https://files.pythonhosted.org/packages/93/53/9a87ee494a51bf63e4ec9241c1ccc4f7c2f45fff85d5bde2ff74fcb68b9e/websockets-15.0.1-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:3c714d2fc58b5ca3e285461a4cc0c9a66bd0e24c5da9911e30158286c9b5be7f", size = 182054, upload-time = "2025-03-05T20:02:41.926Z" }, + { url = "https://files.pythonhosted.org/packages/ff/b2/83a6ddf56cdcbad4e3d841fcc55d6ba7d19aeb89c50f24dd7e859ec0805f/websockets-15.0.1-cp313-cp313-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0f3c1e2ab208db911594ae5b4f79addeb3501604a165019dd221c0bdcabe4db8", size = 182496, upload-time = "2025-03-05T20:02:43.304Z" }, + { url = "https://files.pythonhosted.org/packages/98/41/e7038944ed0abf34c45aa4635ba28136f06052e08fc2168520bb8b25149f/websockets-15.0.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:229cf1d3ca6c1804400b0a9790dc66528e08a6a1feec0d5040e8b9eb14422375", size = 182829, upload-time = "2025-03-05T20:02:48.812Z" }, + { url = "https://files.pythonhosted.org/packages/e0/17/de15b6158680c7623c6ef0db361da965ab25d813ae54fcfeae2e5b9ef910/websockets-15.0.1-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:756c56e867a90fb00177d530dca4b097dd753cde348448a1012ed6c5131f8b7d", size = 182217, upload-time = "2025-03-05T20:02:50.14Z" }, + { url = "https://files.pythonhosted.org/packages/33/2b/1f168cb6041853eef0362fb9554c3824367c5560cbdaad89ac40f8c2edfc/websockets-15.0.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:558d023b3df0bffe50a04e710bc87742de35060580a293c2a984299ed83bc4e4", size = 182195, upload-time = "2025-03-05T20:02:51.561Z" }, + { url = "https://files.pythonhosted.org/packages/86/eb/20b6cdf273913d0ad05a6a14aed4b9a85591c18a987a3d47f20fa13dcc47/websockets-15.0.1-cp313-cp313-win32.whl", hash = "sha256:ba9e56e8ceeeedb2e080147ba85ffcd5cd0711b89576b83784d8605a7df455fa", size = 176393, upload-time = "2025-03-05T20:02:53.814Z" }, + { url = "https://files.pythonhosted.org/packages/1b/6c/c65773d6cab416a64d191d6ee8a8b1c68a09970ea6909d16965d26bfed1e/websockets-15.0.1-cp313-cp313-win_amd64.whl", hash = "sha256:e09473f095a819042ecb2ab9465aee615bd9c2028e4ef7d933600a8401c79561", size = 176837, upload-time = "2025-03-05T20:02:55.237Z" }, + { url = "https://files.pythonhosted.org/packages/02/9e/d40f779fa16f74d3468357197af8d6ad07e7c5a27ea1ca74ceb38986f77a/websockets-15.0.1-pp310-pypy310_pp73-macosx_10_15_x86_64.whl", hash = "sha256:0c9e74d766f2818bb95f84c25be4dea09841ac0f734d1966f415e4edfc4ef1c3", size = 173109, upload-time = "2025-03-05T20:03:17.769Z" }, + { url = "https://files.pythonhosted.org/packages/bc/cd/5b887b8585a593073fd92f7c23ecd3985cd2c3175025a91b0d69b0551372/websockets-15.0.1-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:1009ee0c7739c08a0cd59de430d6de452a55e42d6b522de7aa15e6f67db0b8e1", size = 173343, upload-time = "2025-03-05T20:03:19.094Z" }, + { url = 
"https://files.pythonhosted.org/packages/fe/ae/d34f7556890341e900a95acf4886833646306269f899d58ad62f588bf410/websockets-15.0.1-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:76d1f20b1c7a2fa82367e04982e708723ba0e7b8d43aa643d3dcd404d74f1475", size = 174599, upload-time = "2025-03-05T20:03:21.1Z" }, + { url = "https://files.pythonhosted.org/packages/71/e6/5fd43993a87db364ec60fc1d608273a1a465c0caba69176dd160e197ce42/websockets-15.0.1-pp310-pypy310_pp73-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:f29d80eb9a9263b8d109135351caf568cc3f80b9928bccde535c235de55c22d9", size = 174207, upload-time = "2025-03-05T20:03:23.221Z" }, + { url = "https://files.pythonhosted.org/packages/2b/fb/c492d6daa5ec067c2988ac80c61359ace5c4c674c532985ac5a123436cec/websockets-15.0.1-pp310-pypy310_pp73-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b359ed09954d7c18bbc1680f380c7301f92c60bf924171629c5db97febb12f04", size = 174155, upload-time = "2025-03-05T20:03:25.321Z" }, + { url = "https://files.pythonhosted.org/packages/68/a1/dcb68430b1d00b698ae7a7e0194433bce4f07ded185f0ee5fb21e2a2e91e/websockets-15.0.1-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:cad21560da69f4ce7658ca2cb83138fb4cf695a2ba3e475e0559e05991aa8122", size = 176884, upload-time = "2025-03-05T20:03:27.934Z" }, + { url = "https://files.pythonhosted.org/packages/fa/a8/5b41e0da817d64113292ab1f8247140aac61cbf6cfd085d6a0fa77f4984f/websockets-15.0.1-py3-none-any.whl", hash = "sha256:f7a866fbc1e97b5c617ee4116daaa09b722101d4a3c170c787450ba409f9736f", size = 169743, upload-time = "2025-03-05T20:03:39.41Z" }, ]