How does Tavily MCP's web mapping tool differ from its web crawler?
The web mapping tool focuses on creating a structured representation or 'map' of a website's layout and content organization, which is useful for understanding site architecture. In contrast, the web crawler systematically explores websites to discover and index content, often for broader data collection or search purposes.
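The contrast can be sketched as two different request shapes. This is a hypothetical illustration only: the parameter names (max_depth, extract_depth) are assumptions modeled on Tavily's REST API, not taken from this FAQ.

```python
import json

def build_map_request(url):
    """Mapping: return the site's link structure and page organization,
    without extracting page contents."""
    return {"url": url, "max_depth": 1}

def build_crawl_request(url):
    """Crawling: follow links and also extract content from each
    discovered page, for broader data collection."""
    return {"url": url, "max_depth": 1, "extract_depth": "basic"}

# The key difference: a crawl request asks for content extraction,
# while a map request only asks for structure.
print(json.dumps(build_map_request("https://example.com")))
print(json.dumps(build_crawl_request("https://example.com")))
```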
Can I specify custom parameters for all my search or extraction requests when using the remote MCP server?
Yes, when using the remote MCP server, you can specify default parameters for all requests by including a DEFAULT_PARAMETERS header. This header should contain a JSON object with your desired defaults, such as {"include_images": true, "search_depth": "basic", "max_results": 10}.
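As a minimal sketch, the header can be built by JSON-encoding the defaults and attaching them to every request sent to the remote endpoint. The header name and the endpoint URL come from this FAQ; the Authorization scheme and placeholder key are assumptions for illustration.

```python
import json

# Defaults applied to every search/extraction request, per the FAQ above.
defaults = {"include_images": True, "search_depth": "basic", "max_results": 10}

headers = {
    "Authorization": "Bearer <your-tavily-api-key>",  # placeholder, assumed scheme
    "DEFAULT_PARAMETERS": json.dumps(defaults),
}

# A client would then send `headers` with each POST to
# https://mcp.tavily.com/mcp; the network call is omitted here.
print(headers["DEFAULT_PARAMETERS"])
```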
What are the authentication options for connecting Tavily MCP with Anthropic's Claude Code?
When integrating with Claude Code, you have two authentication options: you can pass your Tavily API key directly in the URL when adding the server, or you can use an OAuth flow by adding the server without the API key in the URL and then completing the authentication process from Claude Code's MCP server management.
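The two options above might look like the following. This is a sketch, not an official recipe: the `claude mcp add` flags may differ by Claude Code version, and the query-parameter name for the API key is an assumption.

```shell
# Option 1: API key embedded in the server URL (key shown as a placeholder)
claude mcp add --transport http tavily "https://mcp.tavily.com/mcp?tavilyApiKey=<your-key>"

# Option 2: add the server without a key, then complete OAuth inside
# Claude Code (e.g. via its MCP server management / the /mcp command)
claude mcp add --transport http tavily "https://mcp.tavily.com/mcp"
```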
If I'm using OAuth authentication with the remote MCP server, how do I manage which Tavily API key is utilized?
After successful OAuth authentication, you can control which API key is used by creating a key named mcp_auth_default in your personal Tavily account. Any key with this specific name will be prioritized for OAuth-authenticated requests.
Is it possible to run the Tavily MCP server locally instead of relying on the remote service?
Yes, Tavily MCP can be run locally. The documentation outlines the server's components, so it can be deployed in your own environment as an alternative to connecting directly to the remote https://mcp.tavily.com/mcp endpoint.
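A local run might be as simple as launching the server with your API key in the environment. The package name `tavily-mcp` and the environment variable `TAVILY_API_KEY` are assumptions based on Tavily's published npm package; check the documentation for the current name and transport options.

```shell
# Provide your API key to the local server process (placeholder value)
export TAVILY_API_KEY="<your-key>"

# Launch the MCP server locally via npx; an MCP client (e.g. Claude
# Desktop or Claude Code) would then be pointed at this local process
# instead of the remote https://mcp.tavily.com/mcp endpoint.
npx -y tavily-mcp
```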