Host an LLM-Powered Chatbot in .NET

Haiping Chen
4 min read · Oct 1, 2023


2023 is the year when LLMs began to explode. Everyone is busy integrating LLMs into their existing systems to take full advantage of their language and reasoning capabilities. With the help of LLMs, chatbots are showing true intelligence. Several tools can help developers integrate LLMs, such as LangChain, LlamaIndex, and EmbedChain, as well as Microsoft’s Semantic Kernel and AutoGen. These frameworks can accelerate LLM application development to a certain extent.

I would like to introduce another open-source LLM framework for .NET: BotSharp. It is an enterprise-grade LLM application framework written entirely in C#. It effectively accelerates the integration of enterprise business systems with LLMs, providing a complete set of mechanisms covering everything from agent management to conversation management. It is built to production-level code quality and follows sound programming paradigms and best practices.

Let’s start with a simple introductory example to see how BotSharp connects traditional applications with an LLM. The example is a pizza store’s AI customer service, which can help users create new pizza orders. Assume that you can already run the project and access the local OpenAPI interface. The basic project setup code can be found here.

  1. Create a Pizza Ordering Agent

After creating the agent, you will see a new folder named after the AgentId. Each file inside serves a different purpose:

  • agent.json
    This file contains the agent’s basic profile information.
  • instruction.liquid
    This file describes the agent’s capabilities, telling the LLM its current role and business process. The prompt is defined using a Liquid template, and you can use the Hook mechanism to dynamically set variable values.
  • functions.json
    This file contains the agent’s function definitions. Functions are the only way for the agent to interact with the outside world; the agent can reach external systems solely through function calls. They are defined using the JSON Schema standard.

Defining the LLM’s prompts and functions in files makes them easy to modify as the project evolves, and implements the design principle of separating prompts from code.
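For example, the get_pizza_types function used later in this article could be declared in functions.json roughly like this. This is a hedged sketch following the JSON Schema function-definition convention; the exact envelope BotSharp expects may differ:

```json
[
  {
    "name": "get_pizza_types",
    "description": "Get the list of pizza types the store currently offers.",
    "parameters": {
      "type": "object",
      "properties": {},
      "required": []
    }
  }
]
```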

2. Now you can start your chatbot with the following command:

dotnet run --project .\src\WebStarter\WebStarter.csproj -p SolutionName=PizzaBot

Start a conversation with the bot in Postman
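If you prefer the command line to Postman, the conversation endpoints can be exercised with curl along these lines. The port, routes, and payload shape here are assumptions based on a typical BotSharp setup; check your local OpenAPI page for the exact contract:

```shell
# Open a new conversation with the pizza agent (agent id is a placeholder)
curl -X POST "http://localhost:5500/conversation/{agentId}" \
  -H "Content-Type: application/json"

# Send a message within that conversation
curl -X POST "http://localhost:5500/conversation/{agentId}/{conversationId}" \
  -H "Content-Type: application/json" \
  -d '{"text": "What pizzas do you have?"}'
```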

Did you notice that in the second turn of the conversation, the LLM triggered a call to the function get_pizza_types? This function is matched to a C# implementation by name, so its concrete behavior can be defined and executed in C# code.

// Namespaces for IFunctionCallback and RoleDialogModel as of
// BotSharp at the time of writing.
using BotSharp.Abstraction.Conversations.Models;
using BotSharp.Abstraction.Functions;

public class GetPizzaTypesFn : IFunctionCallback
{
    // Must match the function name declared in functions.json.
    public string Name => "get_pizza_types";

    public async Task<bool> Execute(RoleDialogModel message)
    {
        // The text result is returned to the LLM as the function output.
        message.ExecutionResult = "Pepperoni Pizza, Cheese Pizza, Margherita Pizza";

        // Structured data can be attached for downstream consumers.
        message.ExecutionData = new List<string>
        {
            "Pepperoni Pizza",
            "Cheese Pizza",
            "Margherita Pizza"
        };
        return true;
    }
}

Let’s continue our conversation with the pizza ordering bot:

The bot communicates with users according to the business process we defined earlier in the instruction.liquid template. After the user confirms the quantity, the get_pizza_price function is called to calculate the total order price.
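The article does not show get_pizza_price, but following the same IFunctionCallback pattern it could look roughly like the sketch below. The argument property names ("pizza_type", "quantity"), the prices, and the use of FunctionArgs for the JSON arguments are illustrative assumptions:

```csharp
using System.Text.Json;
using BotSharp.Abstraction.Conversations.Models;
using BotSharp.Abstraction.Functions;

public class GetPizzaPriceFn : IFunctionCallback
{
    public string Name => "get_pizza_price";

    // Hypothetical unit prices, for illustration only.
    private static readonly Dictionary<string, decimal> Prices = new()
    {
        ["Pepperoni Pizza"] = 12.99m,
        ["Cheese Pizza"] = 9.99m,
        ["Margherita Pizza"] = 10.99m
    };

    public async Task<bool> Execute(RoleDialogModel message)
    {
        // The LLM passes the function arguments as a JSON string; the
        // property names here are assumptions for this sketch.
        var args = JsonSerializer.Deserialize<JsonElement>(message.FunctionArgs);
        var pizzaType = args.GetProperty("pizza_type").GetString() ?? "";
        var quantity = args.GetProperty("quantity").GetInt32();

        var total = Prices.TryGetValue(pizzaType, out var unit)
            ? unit * quantity
            : 0m;

        message.ExecutionResult = $"The total price is {total:C}";
        return true;
    }
}
```

The text placed in ExecutionResult is what the LLM sees as the function’s return value, so it can relay the total back to the user in natural language.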

After the user confirms the order, the bot asks how they would like to complete it. These steps follow our process design exactly, meeting our business expectations.

From this simple example, you can see that building an LLM application with BotSharp is straightforward, with good scalability and code readability. Of course, such a small example cannot demonstrate the full power of BotSharp. In subsequent articles, I will introduce some of its more powerful features and show how it can handle complex enterprise applications. The complete sample code can be found here.

