The Model Context Protocol (MCP) standardizes how AI applications interact with external tools and resources.
Spring joined the MCP ecosystem early as a key contributor, helping to develop and maintain the official MCP Java SDK that serves as the foundation for Java-based MCP implementations.
Building on this contribution, Spring AI has embraced MCP with comprehensive support through dedicated Boot Starters and MCP Java Annotations, making it easier than ever to build sophisticated AI-powered applications that can seamlessly connect to external systems.
This blog introduces core MCP components and demonstrates building both MCP Servers and Clients using Spring AI, showcasing basic and advanced features. The complete source code is available at: MCP Weather Example.
Note: This content applies only to Spring AI 1.1.0-SNAPSHOT or Spring AI 1.1.0-M1+ versions.
What is the Model Context Protocol?
The Model Context Protocol (MCP) is a standardized protocol that enables AI models to interact with external tools and resources in a structured way. Think of it as a bridge between your AI models and the real world - allowing them to access databases, APIs, file systems, and other external services through a consistent interface. You can bootstrap AI applications with MCP support using Spring Initializr. For comprehensive details, see the Spring AI MCP Overview documentation.
MCP Client-Server Architecture
AI Application/Host Developers handle the complexity of orchestrating multiple MCP servers (connected via MCP Clients) and integrating with AI models. AI developers build AI applications that:
- Use MCP Clients to consume capabilities from multiple MCP Servers
- Handle AI model integration and prompt engineering
- Manage conversation context and user interactions
- Orchestrate complex workflows across different services
- Focus on creating compelling user experiences
MCP Server (Provider) Developers focus on exposing specific capabilities (tools, resources, prompts) from third-party services as MCP Servers. Server developers create servers that:
- Wrap third-party services and APIs (databases, file systems, external APIs)
- Expose service capabilities through standardized MCP primitives (tools, resources, prompts)
- Handle authentication and authorization for their specific services
Such separation ensures that Server developers can concentrate on wrapping their domain-specific services without worrying about AI orchestration, while AI application developers can leverage existing MCP servers without understanding the intricacies of each third-party service. The division of labor means that a database expert can create an MCP server for PostgreSQL without needing to understand LLM prompting, while an AI application developer can use that PostgreSQL server without knowing SQL internals. The MCP protocol acts as the universal language between them.
Spring AI embraces this architecture with MCP Client and MCP Server Boot Starters. This means Spring developers can participate in both sides of the MCP ecosystem - building AI applications that consume MCP servers and creating MCP servers that expose Spring-based services to the wider AI community.
MCP Features
- Expose tools that AI models can invoke
- Share resources and data with AI applications
- Provide prompt templates for consistent interactions
- Offer argument autocompletion suggestions for prompts and resource URIs
- Handle real-time notifications and progress updates
- Support client-side sampling, elicitation, structured logging and progress tracking
- Support various transport protocols: STDIO, Streamable-HTTP, and SSE
Important: Tools are owned by the LLM, unlike other MCP features such as prompts and resources. The LLM—not the Host—decides if, when, and in what order to call tools. The Host only controls which tool descriptions are offered to the LLM.
Client Features
MCP Clients enable AI applications to consume capabilities from MCP Servers:
- Roots: Expose filesystem “roots” to servers
- Sampling: Standardized way for servers to request LLM sampling from LLMs via clients
- Elicitation: Standardized way for servers to request additional information from users through the client during interactions
- Progress Tracking Listener: Monitor long-running operations with real-time progress updates
- Structured Logging Listener: Receive detailed log messages from servers for debugging and monitoring
- Change Notifications Listeners: Get notified when server capabilities (tools, resources, prompts) are updated
Server Features
MCP Servers expose capabilities and services to AI applications:
- Tools: Publish functions that AI models can invoke through standardized interfaces
- Resources: Provide access to data sources, files, and external systems
- Prompts: Share reusable prompt templates with parameter support
- Completion: Argument autocompletion suggestions for prompts and resource URIs
- Real-time Notifications: Send updates about capability changes to connected clients
- Progress Reporting: Emit progress updates for long-running operations
- Structured Logging: Send detailed log messages to clients for transparency
Build an MCP Server
Let’s build a Streamable-HTTP MCP Server that provides real-time weather forecast information.
Spring Boot Server Application
Create a new Spring Boot application (mcp-weather-server) with the Spring AI MCP Server Boot Starter dependency (for example, the spring-ai-starter-mcp-server-webmvc artifact).
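A minimal application class is enough to boot the server; the package and class names below are illustrative.

```java
package com.example.mcpweatherserver;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

// Standard Spring Boot entry point; the MCP Server Boot Starter auto-configures the MCP endpoint.
@SpringBootApplication
public class McpWeatherServerApplication {

    public static void main(String[] args) {
        SpringApplication.run(McpWeatherServerApplication.class, args);
    }
}
```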
Configure application.properties to enable the Streamable HTTP server transport (a sketch follows below). You can start the server with either the STREAMABLE, STATELESS, or SSE transport. To enable the STDIO transport, set spring.ai.mcp.server.stdio=true.
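A configuration sketch; spring.ai.mcp.server.name follows the MCP Server Boot Starter, while the protocol property name is an assumption to verify against the starter reference for your Spring AI version:

```properties
# Server identity advertised to connected clients
spring.ai.mcp.server.name=mcp-weather-server
# Assumed property for selecting the transport (STREAMABLE, STATELESS, or SSE); verify against the starter docs
spring.ai.mcp.server.protocol=STREAMABLE
```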
Weather Service
Leverage the free Weather REST API to build a service that can retrieve weather forecasts by location coordinates. Add the @McpTool and @McpToolParam annotations to register the getTemperature method as an MCP Server Tool:
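A sketch of such a service, using the Open-Meteo forecast API as the free weather endpoint; the annotation import package is an assumption based on the Spring AI MCP annotations module, so adjust it to your version:

```java
package com.example.mcpweatherserver;

import org.springaicommunity.mcp.annotation.McpTool;
import org.springaicommunity.mcp.annotation.McpToolParam;
import org.springframework.stereotype.Service;
import org.springframework.web.client.RestClient;

@Service
public class WeatherService {

    // Free weather REST API used for the forecast lookups (no API key required).
    private final RestClient restClient = RestClient.create("https://api.open-meteo.com");

    @McpTool(description = "Get the temperature (in celsius) for a specific location")
    public String getTemperature(
            @McpToolParam(description = "The location latitude") double latitude,
            @McpToolParam(description = "The location longitude") double longitude) {

        // Fetch the current temperature for the given coordinates.
        return restClient.get()
                .uri("/v1/forecast?latitude={latitude}&longitude={longitude}&current=temperature_2m",
                        latitude, longitude)
                .retrieve()
                .body(String.class);
    }
}
```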
Build & Run
Build the project with Maven and run the generated jar; the server starts on the default port 8080.
Using the MCP Server
Once the MCP Weather Server is up and running, you can interact with it using various MCP-compliant client applications.
MCP Inspector
The MCP Inspector is an interactive developer tool for testing and debugging MCP servers. To start the Inspector, run npx @modelcontextprotocol/inspector. In the browser UI, set the Transport Type to Streamable HTTP and the URL to http://localhost:8080/mcp. Click Connect to establish the connection. Then list the tools and run getTemperature.
MCP Java SDK
Use the MCP Java SDK client to programmatically connect to the server:
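A sketch using the SDK’s synchronous client; the transport class and McpSchema type names are assumptions based on recent MCP Java SDK versions, so check them against the SDK version you use:

```java
import java.util.Map;

import io.modelcontextprotocol.client.McpClient;
import io.modelcontextprotocol.client.McpSyncClient;
import io.modelcontextprotocol.client.transport.HttpClientStreamableHttpTransport;
import io.modelcontextprotocol.spec.McpSchema;

public class WeatherSdkClient {

    public static void main(String[] args) {
        // Streamable HTTP transport pointing at the locally running MCP weather server (class name assumed).
        var transport = HttpClientStreamableHttpTransport.builder("http://localhost:8080").build();

        McpSyncClient client = McpClient.sync(transport).build();
        client.initialize();

        // List the tools exposed by the server.
        client.listTools().tools().forEach(tool -> System.out.println("Tool: " + tool.name()));

        // Call the getTemperature tool with Amsterdam's coordinates.
        McpSchema.CallToolResult result = client.callTool(
                new McpSchema.CallToolRequest("getTemperature",
                        Map.of("latitude", 52.37, "longitude", 4.89)));
        System.out.println(result.content());

        client.closeGracefully();
    }
}
```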
Other MCP-compliant AI Applications/SDKs
Connect your MCP server to popular AI applications:
- Cline - AI coding assistant for VS Code
- VS Code MCP - GitHub Copilot MCP integration
- Cursor MCP
- Non-Java MCP client - Build an MCP Client using other (non-Java) SDKs
- …
Claude Desktop
To integrate with Claude Desktop using the local STDIO transport, add the following configuration to your Claude Desktop settings:
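A sketch of the Claude Desktop entry (typically claude_desktop_config.json); the server key and jar file name are placeholders, and the -D flag enables the STDIO transport property mentioned above:

```json
{
  "mcpServers": {
    "spring-ai-mcp-weather": {
      "command": "java",
      "args": [
        "-Dspring.ai.mcp.server.stdio=true",
        "-jar",
        "/absolute/path/to/mcp-weather-server-0.0.1-SNAPSHOT.jar"
      ]
    }
  }
}
```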
Replace /absolute/path/to/ with the actual path to your built JAR file. Follow the MCP server installation guide for Claude Desktop for further guidance. Note that the free version of Claude Desktop doesn’t support Sampling.
Advanced Server Features
Let’s extend our MCP Weather Server to demonstrate advanced MCP capabilities, including Logging, Progress Tracking, and Sampling. These features enable rich interactions between servers and clients:
- Logging: Send structured log messages to connected clients for debugging and monitoring
- Progress Tracking: Report real-time progress updates for long-running operations
- Sampling: Request the client’s LLM to generate content based on server data
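Below is a sketch of the extended WeatherService; the exchange and progress-token handling follows the explanation in the list that comes next, but the McpSchema builders and constructor signatures, as well as the annotation import package, are assumptions based on the MCP Java SDK and Spring AI MCP annotations, so verify them against your versions:

```java
package com.example.mcpweatherserver;

import java.util.List;

import io.modelcontextprotocol.server.McpSyncServerExchange;
import io.modelcontextprotocol.spec.McpSchema;
import org.springaicommunity.mcp.annotation.McpProgressToken;
import org.springaicommunity.mcp.annotation.McpTool;
import org.springaicommunity.mcp.annotation.McpToolParam;
import org.springframework.stereotype.Service;
import org.springframework.web.client.RestClient;

@Service
public class WeatherService {

    private final RestClient restClient = RestClient.create("https://api.open-meteo.com");

    @McpTool(description = "Get the temperature (in celsius) for a specific location")
    public String getTemperature(McpSyncServerExchange exchange,
            @McpToolParam(description = "The location latitude") double latitude,
            @McpToolParam(description = "The location longitude") double longitude,
            @McpProgressToken String progressToken) {

        // Logging notification: structured log message sent to the connected client.
        exchange.loggingNotification(McpSchema.LoggingMessageNotification.builder()
                .level(McpSchema.LoggingLevel.INFO)
                .data("Call getTemperature Tool with latitude: " + latitude + " and longitude: " + longitude)
                .build());

        String temperature = restClient.get()
                .uri("/v1/forecast?latitude={latitude}&longitude={longitude}&current=temperature_2m",
                        latitude, longitude)
                .retrieve()
                .body(String.class);

        // Progress update (50%) if the client supplied a progress token.
        if (progressToken != null) {
            exchange.progressNotification(new McpSchema.ProgressNotification(
                    progressToken, 0.5, 1.0, "Forecast retrieved, starting sampling"));
        }

        // Sampling: if the client supports it, ask its LLM for a short poem about the weather.
        String poem = "";
        if (exchange.getClientCapabilities().sampling() != null) {
            McpSchema.CreateMessageResult sampling = exchange.createMessage(McpSchema.CreateMessageRequest.builder()
                    .messages(List.of(new McpSchema.SamplingMessage(McpSchema.Role.USER,
                            new McpSchema.TextContent("Write a short poem about this weather: " + temperature))))
                    .maxTokens(200)
                    .build());
            poem = ((McpSchema.TextContent) sampling.content()).text();
        }

        // Final progress update before returning the combined result.
        if (progressToken != null) {
            exchange.progressNotification(new McpSchema.ProgressNotification(progressToken, 1.0, 1.0, "Done"));
        }

        return temperature + "\n" + poem;
    }
}
```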
- McpSyncServerExchange - the exchange parameter provides access to server-client communication capabilities. It allows the server to send notifications and make requests back to the client.
- @McpProgressToken - the progressToken parameter enables progress tracking. The client provides this token, and the server uses it to send progress updates.
- Logging Notifications - sends structured log messages to the client for debugging and monitoring purposes.
- Progress Updates - reports operation progress (50% in this case) to the client with a descriptive message.
- Sampling Capability - the most powerful feature: the server can request the client’s LLM to generate content. This allows the server to leverage the client’s AI capabilities, creating a bidirectional AI interaction pattern.
Build an MCP Client
Let’s build an AI application that uses an LLM and connects to MCP Servers via MCP Clients.
Client Configuration
Create a new Spring Boot project (mcp-weather-client) with the following dependencies: the Spring AI MCP Client Boot Starter and an AI model starter (Anthropic Claude in this example).
In application.yml, configure the connection to the MCP Server:
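A sketch of the connection configuration; the streamable-http connection properties follow the MCP Client Boot Starter naming referenced later in this post, and the /mcp endpoint is assumed to be the default:

```yaml
spring:
  ai:
    mcp:
      client:
        streamable-http:
          connections:
            my-weather-server:
              url: http://localhost:8080
```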
We assigned the my-weather-server name to the server connection.
Spring Boot Client Application
Create a client application that uses a ChatClient connected to an LLM and to the MCP Weather Server:
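A sketch of the application; the overall wiring matches the explanation below, while the exact prompt text and the progressToken key passed through the tool context are illustrative assumptions:

```java
package com.example.mcpweatherclient;

import java.util.Map;
import java.util.UUID;

import org.springframework.ai.chat.client.ChatClient;
import org.springframework.ai.tool.ToolCallbackProvider;
import org.springframework.boot.CommandLineRunner;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.context.annotation.Bean;

@SpringBootApplication
public class McpWeatherClientApplication {

    public static void main(String[] args) {
        SpringApplication.run(McpWeatherClientApplication.class, args);
    }

    @Bean
    ChatClient chatClient(ChatClient.Builder chatClientBuilder) {
        // The builder is auto-configured with the AI model settings (Anthropic Claude in our case).
        return chatClientBuilder.build();
    }

    @Bean
    CommandLineRunner run(ChatClient chatClient, ToolCallbackProvider mcpToolProvider) {
        return args -> {
            String response = chatClient.prompt()
                    .user("What is the current temperature in Amsterdam?")
                    // Pass a unique progress token so the server can report progress back to this client
                    // (the context key name is an assumption).
                    .toolContext(Map.of("progressToken", UUID.randomUUID().toString()))
                    // Register all MCP tools discovered from the connected servers.
                    .toolCallbacks(mcpToolProvider.getToolCallbacks())
                    .call()
                    .content();
            System.out.println(response);
        };
    }
}
```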
- Application Lifecycle Management - the application starts, executes the weather query, displays the result, and then exits cleanly.
- ChatClient Configuration - creates a configured ChatClient bean using Spring AI’s auto-configured builder. The builder is automatically populated with:
  - The AI model configuration (Anthropic Claude in our case)
  - Default settings and configurations from application.properties
- CommandLineRunner - runs automatically after the application context is fully loaded. It injects the configured ChatClient for AI model interaction and the ToolCallbackProvider, which contains all registered MCP tools from connected servers.
- AI Prompt - instructs the AI model to get Amsterdam’s current weather. The AI model automatically discovers and calls the appropriate MCP tools based on the prompt.
- Progress Token - uses the toolContext to pass a unique progressToken to MCP tools annotated with the @McpProgressToken parameter.
- MCP Tool Integration - this crucial line connects the ChatClient to all available MCP tools:
  - mcpToolProvider is auto-configured by Spring AI’s MCP Client starter
  - Contains all tools from connected MCP servers (configured via spring.ai.mcp.client.*.connections.*)
  - The AI model can automatically discover and invoke these tools during conversation
Client MCP Handlers
Create a service class to handle MCP notifications and requests from the server. These handlers are the client-side counterparts to the advanced server features we implemented above, enabling bidirectional communication between the MCP Server and Client.
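A sketch of such a handler class; the handler annotations (@McpProgress, @McpLogging, @McpSampling) and their clients attribute follow the client annotations referenced in this post, while the import package and McpSchema accessor names are assumptions to verify against your versions:

```java
package com.example.mcpweatherclient;

import io.modelcontextprotocol.spec.McpSchema;
import org.springaicommunity.mcp.annotation.McpLogging;
import org.springaicommunity.mcp.annotation.McpProgress;
import org.springaicommunity.mcp.annotation.McpSampling;
import org.springframework.ai.chat.client.ChatClient;
import org.springframework.context.annotation.Lazy;
import org.springframework.stereotype.Service;

@Service
public class McpClientHandlers {

    private final ChatClient chatClient;

    // @Lazy avoids a circular dependency, since the ChatClient itself depends on MCP components.
    public McpClientHandlers(@Lazy ChatClient chatClient) {
        this.chatClient = chatClient;
    }

    @McpProgress(clients = "my-weather-server")
    public void progressHandler(McpSchema.ProgressNotification notification) {
        // Triggered by exchange.progressNotification(...) on the server.
        System.out.println("Progress: " + (int) (notification.progress() * 100) + "%");
    }

    @McpLogging(clients = "my-weather-server")
    public void loggingHandler(McpSchema.LoggingMessageNotification notification) {
        // Triggered by exchange.loggingNotification(...) on the server.
        System.out.println("Server log [" + notification.level() + "]: " + notification.data());
    }

    @McpSampling(clients = "my-weather-server")
    public McpSchema.CreateMessageResult samplingHandler(McpSchema.CreateMessageRequest request) {
        // Triggered by exchange.createMessage(...) on the server: delegate the request to the local LLM.
        String userPrompt = ((McpSchema.TextContent) request.messages().get(0).content()).text();
        String generated = this.chatClient.prompt().user(userPrompt).call().content();
        return McpSchema.CreateMessageResult.builder()
                .content(new McpSchema.TextContent(generated))
                .build();
    }
}
```

Understanding the Handler Components: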
- Progress Handler - receives real-time progress updates from the server’s long-running operations. Triggered when the server calls exchange.progressNotification(...). For example, the weather server sends 50% progress when starting sampling, then 100% when complete. Commonly used to display progress bars, update UI status, or log operation progress.
- Logging Handler - receives structured log messages from the server for debugging and monitoring. Triggered when the server calls exchange.loggingNotification(...). For example, the weather server logs “Call getTemperature Tool with latitude: X and longitude: Y”. Used for debugging server operations, audit trails, or monitoring dashboards.
- Sampling Handler - the most powerful feature: it enables the server to request AI-generated content from the client’s LLM. Used for bidirectional AI interactions, creative content generation, and dynamic responses. Triggered when the server calls exchange.createMessage(...) after a sampling capability check. The execution flow looks like this:
  - If the client supports sampling, the server requests a poem about the weather
  - The client handler receives the request and uses its ChatClient to interact with the LLM and generate the poem
  - The generated poem is returned to the server and incorporated into the final tool response
Key Design Patterns:
- Annotation-Based Routing: The clients = "my-weather-server" attribute ensures handlers only process notifications from the specific MCP server connection defined in your configuration: spring.ai.mcp.client.streamable-http.connections.[my-weather-server].url. If your application connects to multiple MCP servers, use the clients attribute to assign each handler to the corresponding MCP Client (see the sketch below).
- The @Lazy annotation on ChatClient prevents circular dependency issues that can occur when the ChatClient also depends on MCP components.
- Bidirectional AI Communication: The sampling handler creates a powerful pattern where:
  - The server (domain expert) can leverage the client’s AI capabilities
  - The client’s LLM generates creative content based on server-provided context
  - This enables sophisticated AI-to-AI interactions beyond simple tool invocation

This architecture makes the MCP Client a reactive participant in server operations, enabling sophisticated interactions rather than just passive tool consumption.
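For example, when multiple connections are configured, each handler can be pinned to its connection by name via the clients attribute; the brave-search name below refers to the second connection added in the next section and is otherwise illustrative:

```java
// Route notifications per connection name (as configured under spring.ai.mcp.client.*.connections.*).
@McpLogging(clients = "my-weather-server")
public void weatherServerLogs(McpSchema.LoggingMessageNotification notification) {
    System.out.println("[weather] " + notification.data());
}

@McpLogging(clients = "brave-search")
public void braveSearchLogs(McpSchema.LoggingMessageNotification notification) {
    System.out.println("[search] " + notification.data());
}
```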
Multiple MCP Servers
Connect to multiple MCP servers using different transports. Here’s how to add the Brave Search MCP Server for web search alongside your weather server:
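A sketch of the combined configuration; the stdio connection properties and the Brave Search server package follow common MCP client setups, but treat the exact property names and the BRAVE_API_KEY handling as assumptions to verify:

```yaml
spring:
  ai:
    mcp:
      client:
        streamable-http:
          connections:
            my-weather-server:
              url: http://localhost:8080
        stdio:
          connections:
            brave-search:
              command: npx
              args:
                - "-y"
                - "@modelcontextprotocol/server-brave-search"
              env:
                BRAVE_API_KEY: ${BRAVE_API_KEY}
```

Build & Run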
Make sure that your MCP Weather Server is up and running, then build and start your client application.
Conclusion
The combination of Spring’s proven development model with MCP’s standardized protocol creates a powerful foundation for the next generation of AI applications. Whether you’re building chatbots, data analysis tools, or development assistants, Spring AI’s MCP support provides the building blocks you need. This introduction covered the essential MCP concepts and demonstrated how to build both MCP Servers and Clients using Spring AI’s Boot Starters with basic Tool functionality. However, the MCP ecosystem offers much more sophisticated capabilities that we’ll explore in upcoming blog posts:
- Java MCP Annotations Deep Dive: Learn how to leverage Spring AI’s annotation-based approach for creating more maintainable and declarative MCP implementations, including advanced annotation patterns and best practices.
- Beyond Tools - Prompts, Resources & Completions: Discover how to implement the full spectrum of MCP capabilities including shared prompt templates, dynamic resource provisioning, and intelligent autocompletion features that make your MCP servers more user-friendly and powerful.
- Authorization support - securing MCP Servers: Secure your MCP Servers with OAuth 2, and ensure only authorized users can access tools, resources and other capabilities. Add authorization support to your MCP Clients, so they can obtain OAuth 2 tokens to authenticate with secure MCP servers.
Additional Resources
- Model Context Protocol Specification - Official MCP protocol documentation
- MCP Java SDK - MCP Java SDK documentation
- MCP Weather Example Code
- Spring AI MCP Overview - Complete architectural overview and concepts
- MCP Client Boot Starter - Client configuration and usage guide
- MCP Server Boot Starter - Server setup and configuration
- STDIO and SSE Servers - Traditional transport mechanisms
- Streamable-HTTP Servers - Modern HTTP-based transport
- Stateless Streamable-HTTP Servers - Cloud-native deployment options
- Spring AI MCP Java Annotations - Annotation-based method handling for MCP servers and clients in Java
- Client Annotations - Declarative way to implement MCP client handlers using Java annotations
- Server Annotations - Declarative way to implement MCP server functionality using Java annotations
- Special Parameters - Special parameter types that provide additional context and functionality to annotated methods
For the latest updates and comprehensive documentation, visit the Spring AI Reference Documentation.