from_openapi server raises tons of Pydantic errors trying to parse JSON returned from the upstream api #187
Comments
I believe this is the same issue as #186 and may already be fixed on
Update: 2.2 is out now if you'd like to confirm
I'm sorry to hear that! Can you share any code so I can replicate against that API?
Let me see what I can throw together tomorrow, will ping
https://github.com/dbsanfte/topdesk-mcp-repro-bug That should do the job
I've been trying to debug an issue in fastmcp over the past few days related to JSON deserialisation.
I've stood up a FastMCP instance using the from_openapi method, against an OpenAPI JSON spec generated here: https://converter.swagger.io/api/convert?url=https%3A%2F%2Fdevelopers.topdesk.com%2Fswagger%2Fincident_specification_3.8.5.yaml
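Roughly, the setup looks like this. This is a minimal sketch assuming fastmcp 2.x's FastMCP.from_openapi(openapi_spec=..., client=...) call; the TopDesk base URL and credentials below are placeholders, not values from the actual repro:

```python
# Minimal sketch (assumes fastmcp 2.x exposes FastMCP.from_openapi taking a
# spec dict and an httpx.AsyncClient; base URL and credentials are placeholders).
import httpx
from fastmcp import FastMCP

# Fetch the Swagger -> OpenAPI 3 conversion of the TopDesk incident spec.
CONVERTER_URL = (
    "https://converter.swagger.io/api/convert"
    "?url=https%3A%2F%2Fdevelopers.topdesk.com%2Fswagger%2Fincident_specification_3.8.5.yaml"
)
spec = httpx.get(CONVERTER_URL, timeout=30).json()

# HTTP client pointed at the upstream TopDesk API.
client = httpx.AsyncClient(
    base_url="https://yourcompany.topdesk.net/tas/api",
    auth=("username", "app-password"),
)

# Build the MCP server from the OpenAPI spec and run it.
mcp = FastMCP.from_openapi(openapi_spec=spec, client=client)

if __name__ == "__main__":
    mcp.run()
```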
This is the TopDesk Incident API:
https://developers.topdesk.com/explorer/?page=incident
When my local LLM calls the tool, the MCP server successfully calls the correct endpoint on the upstream TopDesk API and gets a list of incidents back. The response appears to be well-structured JSON, but the MCP server fails to parse it and raises a batch of Pydantic validation errors.
See below:
The tool call itself made by the LLM is:
And that gives me:
So the error is definitely in the server somewhere.
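A quick way to double-check that the upstream response itself is valid JSON is to fetch the same endpoint directly with httpx, outside of FastMCP. This is only a sketch; the host, path, and credentials are placeholders:

```python
# Sketch: fetch the incident list directly from the upstream API and parse it,
# bypassing FastMCP entirely (host, path, and credentials are placeholders).
import httpx

resp = httpx.get(
    "https://yourcompany.topdesk.net/tas/api/incidents",
    auth=("username", "app-password"),
)
resp.raise_for_status()
incidents = resp.json()  # if this parses, the failure is in the MCP layer, not the payload
print(type(incidents), len(incidents))
```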
I'm not quite sure where to go from here, but I'm happy to help debug further if necessary.