DevExpress Blazor AI Chat — Build a Multi-LLM Chat Application

AI services offer a variety of models tailored to meet user needs, preferences, and resource constraints. As you'd expect, models have strengths/weaknesses — some are optimized for coding tasks, while others are better suited for creative writing or real-time information retrieval.

To select the appropriate model, you often need to balance performance and cost. For example, advanced models like GPT-4.1 deliver exceptional results but require greater computational resources. In contrast, lighter models such as GPT-4.1 Mini or Nano offer faster responses and lower costs, making them ideal for those seeking efficiency and responsiveness.

For enterprise applications, the ability to connect to a cloud-based AI service — with the option to fall back to offline models running in restricted environments — is often a must.

In this post, I’ll show you how to build a multi-LLM (Large Language Model) chat application that uses DevExpress Blazor AI Chat and ComboBox components. You’ll learn how to implement the IChatClient interface to manage multiple chat clients and their respective conversation histories (and how to use the DevExpress Blazor ComboBox to switch between LLMs during a chat session).

Getting Started

To get started, integrate the DxAIChat component into your application (see our official guide, Add AI Chat to a Project, for additional information).

In this guide, I will use two LLMs: GPT-4o from Azure OpenAI and Phi-4 from Ollama running locally.

Here are my sample registrations for both chat clients from my Program.cs file. Note that I store LLM credentials and settings in the app configuration file (appsettings.Development.json). You can modify the following code to match your specific requirements:


using Azure;
using Azure.AI.OpenAI;
using Microsoft.Extensions.AI;
...
var openAiServiceSettings = builder.Configuration.GetSection("OpenAISettings").Get<OpenAIServiceSettings>();
var ollamaSettings = builder.Configuration.GetSection("OllamaSettings").Get<OllamaSettings>();

IChatClient azureChatClient = new AzureOpenAIClient(
    new Uri(openAiServiceSettings.Endpoint),
    new AzureKeyCredential(openAiServiceSettings.Key))
    .AsChatClient(openAiServiceSettings.DeploymentName);

IChatClient ollamaChatClient = new OllamaChatClient(
    new Uri(ollamaSettings.Uri), 
    ollamaSettings.ModelName);

NOTE: To install Ollama and download/run a model, refer to the DevExpress AI Extensions — Prerequisites help topic.
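For reference, here is a sketch of what the corresponding configuration section might look like. The section and property names (OpenAISettings, OllamaSettings, Endpoint, Key, DeploymentName, Uri, ModelName) are assumptions inferred from the registration code above — adjust them to match your own settings classes:

```json
{
  "OpenAISettings": {
    "Endpoint": "https://your-resource.openai.azure.com/",
    "Key": "<your-azure-openai-key>",
    "DeploymentName": "gpt-4o"
  },
  "OllamaSettings": {
    "Uri": "http://localhost:11434",
    "ModelName": "phi4"
  }
}
```

Remember to keep API keys out of source control — user secrets or environment variables are safer homes for the Key value than a checked-in configuration file.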

Next, we will implement the infrastructure needed to track each client and its conversation history.

CompositeChatClient Implementation

First, declare the ChatClientSession class that represents an individual LLM session (includes the user-friendly model/service name, a client instance, and message history):


using Microsoft.Extensions.AI;
//...
public class ChatClientSession {
    public string Name { get; set; }
    public IChatClient Client { get; }
    public List<BlazorChatMessage> Messages { get; set; }

    public ChatClientSession(IChatClient client, string name) {
        Name = name;
        Client = client ?? throw new ArgumentNullException(nameof(client));
        Messages = new List<BlazorChatMessage>();
    }
}

Next, define the CompositeChatClient class, which implements the IChatClient interface and serves as a container for multiple chat clients. By wrapping multiple clients within a single interface implementation, we can integrate this functionality with the DevExpress Blazor AI Chat component (which relies on the IChatClient interface).

This class is designed to support LLM switching while preserving individual conversation histories. Key considerations include:

  • Chat Client Management: Stores a list of ChatClientSession objects, where each object represents an LLM and its associated chat history.
  • Message Handling: Routes messages to the currently selected chat client and retrieves responses.

Here’s the implementation:


using Microsoft.Extensions.AI;
//...
public class CompositeChatClient : IChatClient {
    public List<ChatClientSession> AvailableChatClients { get; }
    public ChatClientSession? SelectedSession { get; set; }

    public CompositeChatClient(params ChatClientSession[] chatClients) {
        if(chatClients.Length == 0)
            throw new ArgumentException("At least one chat client is required.", nameof(chatClients));
        AvailableChatClients = chatClients.ToList();
        SelectedSession = AvailableChatClients[0];
    }

    ChatClientSession CurrentSession => SelectedSession
        ?? throw new InvalidOperationException("No chat client is selected.");

    // Route the request to the currently selected chat client.
    public Task<ChatResponse> GetResponseAsync(IEnumerable<ChatMessage> messages, ChatOptions? options = null,
        CancellationToken cancellationToken = default) {
        return CurrentSession.Client.GetResponseAsync(messages, options, cancellationToken);
    }

    public IAsyncEnumerable<ChatResponseUpdate> GetStreamingResponseAsync(IEnumerable<ChatMessage> messages, ChatOptions? options = null,
        CancellationToken cancellationToken = default) {
        return CurrentSession.Client.GetStreamingResponseAsync(messages, options, cancellationToken);
    }

    public void Dispose() {
        foreach(var session in AvailableChatClients) {
            session.Client.Dispose();
            session.Messages.Clear();
        }
    }

    // Delegate service requests to the selected client so framework
    // features that rely on GetService continue to work.
    public object? GetService(Type serviceType, object? serviceKey = null) {
        return serviceKey is null && serviceType.IsInstanceOfType(this)
            ? this
            : SelectedSession?.Client.GetService(serviceType, serviceKey);
    }
}

Finally, switch back to Program.cs and register the CompositeChatClient as follows:


// Register both clients behind a single IChatClient instance
var compositeChatClient = new CompositeChatClient(
    new ChatClientSession(azureChatClient, "Azure Open AI — GPT4o"), 
    new ChatClientSession(ollamaChatClient, "Ollama — Phi 4"));

builder.Services.AddScoped<IChatClient>((provider) => compositeChatClient);
builder.Services.AddDevExpressAI();

LLM Selection Using a ComboBox

To allow LLM selection, open the Razor page that hosts the DxAIChat component and add the DevExpress Blazor ComboBox. Key considerations include:

  • CompositeChatClient Injection: The CompositeChatClient is injected into the page. It supplies the list of available chat clients to the UI.
  • LLM Selection: The DxComboBox component is bound to a list of clients and triggers a callback when selection changes.
  • Chat Session Management: The OnModelChanged callback handles chat history save and load operations (when a user switches between LLMs).

Here’s a snippet of the implementation:


@page "/"
@using DevExpress.AIIntegration.Blazor.Chat
@using DXBlazorChatSelector.Services
@using Microsoft.Extensions.AI

<div class="main-container">
    <div class="top-container">
        <DxComboBox Data="@ModelsList"
                    CssClass="selector-container-combo-editor"
                    TextFieldName="@nameof(ChatClientSession.Name)"
                    Value="ChatClientProvider.SelectedSession"
                    ValueChanged="@((ChatClientSession session) => OnModelChanged(session))" />
    </div>
    <DxAIChat @ref="DxAiChat" CssClass="my-chat"></DxAIChat>
</div>

@code {
    [Inject]
    IChatClient? ChatClient { get; set; }
    CompositeChatClient ChatClientProvider => ChatClient as CompositeChatClient;
    DxAIChat? DxAiChat { get; set; }
    List<ChatClientSession> ModelsList => ChatClientProvider?.AvailableChatClients;

    private void OnModelChanged(ChatClientSession value) {
        // Save the current session's history before switching.
        SaveLastAssistantMessage(DxAiChat.SaveMessages());
        ChatClientProvider.SelectedSession = value;
        // Load the history that belongs to the newly selected LLM.
        DxAiChat.LoadMessages(ChatClientProvider.SelectedSession.Messages);
    }

    private void SaveLastAssistantMessage(IEnumerable<BlazorChatMessage> saveMessages) {
        if(ChatClientProvider.SelectedSession != null) {
            ChatClientProvider.SelectedSession.Messages.Clear();
            ChatClientProvider.SelectedSession.Messages.AddRange(saveMessages);
        }
    }
}

Start New Chat Implementation

To clear the message history for the selected LLM and start a new chat session, we will add a DxButton and place it near the DxComboBox in the Index.razor page. Key considerations include:

  • Button Style: The button uses an SVG icon declared in CSS.
  • Session Reset: On click, the button clears the message history for the selected LLM and updates the chat UI.

The updated code for the Index.razor page:


@page "/"
@using DevExpress.AIIntegration.Blazor.Chat
@using DXBlazorChatSelector.Services
@using Microsoft.Extensions.AI

<div class="main-container">
    <div class="top-container">
        <DxButton RenderStyle="ButtonRenderStyle.Primary"
                  RenderStyleMode="ButtonRenderStyleMode.Contained"
                  IconCssClass="refresh"
                  IconPosition="ButtonIconPosition.BeforeText"
                  CssClass="refresh-button"
                  Text="Start New Chat"
                  Click="ClearHistory" />
        <DxComboBox Data="@ModelsList"
                    CssClass="selector-container-combo-editor"
                    TextFieldName="@nameof(ChatClientSession.Name)"
                    Value="ChatClientProvider.SelectedSession"
                    ValueChanged="@((ChatClientSession session) => OnModelChanged(session))" />
    </div>
    <DxAIChat @ref="DxAiChat" CssClass="my-chat"></DxAIChat>
</div>

@code {
    [Inject]
    IChatClient? ChatClient { get; set; }
    CompositeChatClient ChatClientProvider => ChatClient as CompositeChatClient;
    DxAIChat? DxAiChat { get; set; }
    List<ChatClientSession> ModelsList => ChatClientProvider?.AvailableChatClients;

    private void ClearHistory() {
        // Clear the selected session's history and refresh the chat UI.
        ChatClientProvider.SelectedSession.Messages.Clear();
        DxAiChat.LoadMessages(ChatClientProvider.SelectedSession.Messages);
    }

    private void OnModelChanged(ChatClientSession value) {
        // Save the current session's history before switching.
        SaveLastAssistantMessage(DxAiChat.SaveMessages());
        ChatClientProvider.SelectedSession = value;
        // Load the history that belongs to the newly selected LLM.
        DxAiChat.LoadMessages(ChatClientProvider.SelectedSession.Messages);
    }

    private void SaveLastAssistantMessage(IEnumerable<BlazorChatMessage> saveMessages) {
        if(ChatClientProvider.SelectedSession != null) {
            ChatClientProvider.SelectedSession.Messages.Clear();
            ChatClientProvider.SelectedSession.Messages.AddRange(saveMessages);
        }
    }
}
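The `refresh` icon class referenced by IconCssClass is not shown in the snippet above. A minimal sketch of what its CSS declaration might look like follows — the data-URI value is a placeholder for your own SVG, not the actual icon used in the demo:

```css
/* Icon class applied to the DxButton via IconCssClass="refresh". */
.refresh-button .refresh {
    width: 16px;
    height: 16px;
    /* Placeholder; substitute an inline SVG data URI or an image file. */
    background-image: url("data:image/svg+xml,...");
    background-repeat: no-repeat;
    background-size: contain;
}
```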

The Final Result

The following video demonstrates our implementation:

Review our GitHub Example

To download a complete sample and explore the implementation, refer to the following GitHub Repository: DevExpress Blazor AI Chat — Implement Switching Between Multiple AI Services.

Conclusion

By combining the power of the IChatClient interface, the DevExpress Blazor ComboBox, and the DevExpress Blazor AI Chat component, you can deliver curated user experiences when using multiple LLMs.

If you have any questions related to this implementation or would like to discuss AI-related integration needs, feel free to submit a support ticket via the DevExpress Support Center. We'll be happy to follow up.

Free DevExpress Products - Get Your Copy Today

The following free DevExpress product offers remain available. Should you have any questions about the free offers below, please submit a ticket via the DevExpress Support Center at your convenience. We'll be happy to follow up.