Introduction

The intersection of cloud-native architecture and AI capabilities presents exciting opportunities for .NET developers. In this post, I'll walk through a practical implementation that combines .NET Aspire's orchestration capabilities with Ollama's local LLM hosting and the ModelContextProtocol (MCP) for tool usage.

This architecture allows AI models to leverage strongly typed .NET methods through a clean, extensible interface, enabling us to build applications where LLMs can reason about problems while delegating specific operations to specialized tools.

The Components

Our solution consists of three main parts:

  1. AppHost: The .NET Aspire orchestrator that coordinates our distributed application
  2. McpServer: A server exposing mathematical operations as tools via ModelContextProtocol
  3. ApiService: A web API that connects client requests to the LLM and enables function calling

Let's explore how these components work together.

Setting Up the Aspire Host

First, our AppHost project coordinates the various services:

// AppHost
var builder = DistributedApplication.CreateBuilder(args);

var ollama = builder.AddOllama("ollama")
    .WithDataVolume()
    .WithGPUSupport();

var mistral = ollama.AddModel("mistral", "mistral:7b");

builder.AddProject<Projects.McpServer>("mcpserver");

builder.AddProject<Projects.ApiService>("apiservice")
    .WithReference(mistral)
    .WaitFor(mistral);

builder.Build().Run();

This configuration:

  • Sets up an Ollama instance with GPU support and persistent storage
  • Loads the Mistral 7B model
  • Adds our McpServer and ApiService projects
  • Establishes dependencies between services

Creating the MCP Server for Tool Definitions

The McpServer project defines our mathematical tools:

// McpServer
using ModelContextProtocol.Server;
using System.ComponentModel;

var builder = WebApplication.CreateBuilder(args);

builder.Services
    .AddMcpServer()
    .WithTools<MathTools>();

var app = builder.Build();

app.MapMcp();

app.Run();

[McpServerToolType]
public sealed class MathTools
{
    [McpServerTool, Description("Calculate the square root of a number.")]
    public static string SquareRoot(double number)
    {
        return number < 0 
            ? "Error: Cannot compute square root of a negative number." 
            : Math.Sqrt(number).ToString("F2");
    }

    [McpServerTool, Description("Calculate the factorial of a non-negative integer.")]
    public static string Factorial(int n)
    {
        if (n < 0)
        {
            return "Error: Factorial is not defined for negative numbers.";
        }
        if (n > 20)
        {
            return "Error: Input too large; maximum supported value is 20 to avoid overflow.";
        }

        long result = 1;
        for (int i = 2; i <= n; i++)
        {
            result *= i;
        }
        return result.ToString();
    }

    [McpServerTool, Description("Raise a number to a power.")]
    public static string Power(double baseNum, double exponent)
    {
        double result = Math.Pow(baseNum, exponent);
        return double.IsNaN(result) || double.IsInfinity(result) 
            ? "Error: Result is undefined or too large." 
            : result.ToString("F2");
    }
}

This server exposes three mathematical operations as tools, each with proper input validation and error handling.
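To make the wire format concrete, here is roughly what a `tools/call` exchange for the SquareRoot tool looks like over MCP's JSON-RPC transport. The shapes follow the MCP specification; the exact envelope emitted by this SDK preview may differ slightly, and the `id` value is arbitrary:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "SquareRoot",
    "arguments": { "number": 144 }
  }
}
```

with a result along the lines of:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "result": {
    "content": [ { "type": "text", "text": "12.00" } ]
  }
}
```

The `"12.00"` comes straight from the `ToString("F2")` formatting in the SquareRoot method above.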

Connecting Everything with the API Service

The ApiService acts as the glue between user requests, the LLM, and our tools:

// ApiService (Program.cs, abbreviated)
var builder = WebApplication.CreateBuilder(args);

builder
    .AddKeyedOllamaApiClient(ServiceKeys.Mistral)
    .AddKeyedChatClient()
    .UseFunctionInvocation();

var app = builder.Build();
app.MapPost("/ollama", async ([FromKeyedServices(ServiceKeys.Mistral)] IChatClient client, string prompt) =>
{
    var serverConfig = new McpServerConfiguration
    {
        Name = "Ollama Math MCP Server",
        Command = "http://localhost:5062/sse",
        TransportType = McpServerTransportType.Sse
    };

    var mcpTools = await Tools.GetFromMcpServers(serverConfig);

    var tools = mcpTools.Cast<McpClientTool>()
        .Select(t => new McpFunctionAdapter(t))
        .ToList();

    var response = await client.GetResponseAsync(prompt, new ChatOptions
    {
        Tools = [.. tools]
    });

    return response.Text;
});

app.Run();

public static class ServiceKeys
{
    public const string Mistral = "mistral";
}
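One design note: the endpoint above re-fetches the tool list from the MCP server on every request. For anything beyond a demo, you could resolve the tools once at startup and register the adapted list for reuse. A minimal sketch using the same helpers shown above, assuming the MCP server is reachable when ApiService starts (the singleton registration shape is illustrative):

```csharp
// Sketch: resolve the MCP tool list once at startup and reuse it per request.
// Uses the same Tools.GetFromMcpServers helper and McpFunctionAdapter as above.
var serverConfig = new McpServerConfiguration
{
    Name = "Ollama Math MCP Server",
    Command = "http://localhost:5062/sse",
    TransportType = McpServerTransportType.Sse
};

var mcpTools = await Tools.GetFromMcpServers(serverConfig);

// Adapt once; the resulting list can be handed to ChatOptions.Tools in each request.
var adapters = mcpTools.Cast<McpClientTool>()
    .Select(t => new McpFunctionAdapter(t))
    .ToList<AITool>();

builder.Services.AddSingleton<IReadOnlyList<AITool>>(adapters);
```

Since the AppHost does not currently `WaitFor` the McpServer, a production version would also want retry logic around that startup fetch.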

The Crucial Adapter

The McpFunctionAdapter is what makes this work, bridging the gap between MCP tool definitions and AI functions:

public class McpFunctionAdapter(McpClientTool mcpTool) : AIFunction
{
    private readonly McpClientTool _mcpTool = mcpTool ?? throw new ArgumentNullException(nameof(mcpTool));

    private static readonly JsonSerializerOptions DefaultJsonSerializerOptions = new()
    {
        PropertyNamingPolicy = JsonNamingPolicy.CamelCase,
        Converters = { new JsonStringEnumConverter() }
    };

    public override string Name => _mcpTool.Function?.Name ?? "Unknown Function";

    public override string Description => _mcpTool.Function?.Description ?? "No description available";

    public override IReadOnlyDictionary<string, object?> AdditionalProperties =>
        _mcpTool.Function?.Parameters?.Properties?.ToDictionary(
            kvp => kvp.Key, kvp => (object?)new { kvp.Value.Type, kvp.Value.Description }
        ) ?? new Dictionary<string, object?>();

    public override JsonElement JsonSchema => GetJsonSchema();

    public override JsonSerializerOptions JsonSerializerOptions => DefaultJsonSerializerOptions;

    protected override async Task<object?> InvokeCoreAsync(IEnumerable<KeyValuePair<string, object?>> arguments,
        CancellationToken cancellationToken)
    {
        var argsDict = arguments.ToDictionary(kvp => kvp.Key, kvp => kvp.Value);
        return await _mcpTool.InvokeMethodAsync(argsDict);
    }

    private JsonElement GetJsonSchema()
    {
        if (_mcpTool.Function?.Parameters == null)
        {
            return JsonDocument.Parse("{}").RootElement;
        }

        var schema = new
        {
            type = _mcpTool.Function.Parameters.Type,
            properties = _mcpTool.Function.Parameters.Properties?.ToDictionary(
                kvp => kvp.Key, kvp => (object?)new { type = kvp.Value.Type, description = kvp.Value.Description }
            ) ?? new Dictionary<string, object?>(),
            required = _mcpTool.Function.Parameters.Required
        };

        var json = JsonSerializer.Serialize(schema, JsonSerializerOptions);
        return JsonDocument.Parse(json).RootElement;
    }
}

This adapter handles:

  • Converting between MCP tool definitions and AI function schemas
  • Managing parameter serialization/deserialization
  • Proper handling of function invocation
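As a concrete illustration, for the SquareRoot tool the GetJsonSchema method would emit something along these lines (the parameter description is illustrative; whether one is populated depends on what the MCP server reports for each parameter):

```json
{
  "type": "object",
  "properties": {
    "number": { "type": "number", "description": "The input value" }
  },
  "required": ["number"]
}
```

This is the schema the function-invocation layer hands to the LLM so it knows what arguments SquareRoot expects.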

Why This Architecture Matters

This implementation pattern offers several advantages:

  1. Separation of Concerns: The LLM handles reasoning while specialized .NET code handles computations
  2. Type Safety: We get all the benefits of .NET's strong typing and validation
  3. Extensibility: Adding new tools is as simple as creating new methods with appropriate attributes
  4. Cloud-Native Design: Using .NET Aspire means this solution is ready for containerized deployments

Example Usage

With everything set up, users can send natural language queries to the API endpoint like:

  • "What's the square root of 144?"
  • "Calculate 5 to the power of 3"
  • "What's the factorial of 7?"

The LLM interprets these requests and calls the appropriate mathematical tools to get accurate results.
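Since the minimal API binds the `prompt` parameter from the query string, a request can be issued with curl. The port here is illustrative; use whichever port Aspire assigns to the ApiService:

```shell
curl -X POST "http://localhost:5000/ollama?prompt=What%20is%20the%20square%20root%20of%20144%3F"
```

Behind the scenes, the `UseFunctionInvocation` middleware handles the round trip: the model emits a tool call, the adapter invokes the MCP server, and the tool result is fed back to the model before the final text is returned.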

Technical Details

For this implementation, I used:

  • ModelContextProtocol 0.1.0-preview.6
  • OllamaSharp.ModelContextProtocol 5.1.9
  • .NET 8.0
  • Mistral 7B model via Ollama

Conclusion

The integration of .NET Aspire, Ollama, and ModelContextProtocol demonstrates the power of combining cloud-native architecture with AI capabilities. This pattern enables developers to create sophisticated applications where LLMs work alongside traditional code, each playing to their strengths.

As these technologies mature, we can expect to see more patterns emerge for effectively incorporating AI into distributed applications. The clean separation between tool definition and tool usage provided by ModelContextProtocol is particularly promising for building maintainable AI-enhanced systems.

What interesting integrations are you building with these technologies? I'd love to hear about your experiences in the comments below.