Two .NET 8 microservices (Authentication & User‑Data) expose MCP tools via LangChain and include a built-in chat UI so LLMs can drive backend operations in real time. The services are orchestrated by Kubernetes and deployed to the AWS cloud with Terraform. An Ansible playbook is also provided to set up a local development environment ready to run the project.
The project includes:
- Ansible to prepare the local development environment.
- AuthenticationService, a service that uses AspNet.Identity to handle authentication and send emails.
- UserService, a service that stores and retrieves user data for demonstration purposes and communicates with the AuthenticationService.
- Dockerfiles for the respective services and a Docker Compose file to run everything easily.
- MCP to expose tools that let LLMs access and control business logic.
- LangChain .NET to connect to LLMs (OpenAI), translate LLM chat inputs into MCP tool calls, and automatically invoke the appropriate microservice endpoints.
- Kubernetes Pods, Deployments, Services, and more to orchestrate, scale and run everything.
- Terraform to create AWS resources (including EKS) and populate them with the Kubernetes services.
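
To give a feel for how backend operations become available to the LLM, here is a minimal sketch of an MCP tool in C#, written against the official `ModelContextProtocol` C# SDK. The `UserTools` class and the `GetUser` tool are hypothetical examples, not code from this repository:

```csharp
using System.ComponentModel;
using ModelContextProtocol.Server;

// Hypothetical sketch: a UserService operation exposed as an MCP tool.
// Attribute names follow the ModelContextProtocol C# SDK; the tool
// itself and its signature are illustrative assumptions.
[McpServerToolType]
public static class UserTools
{
    [McpServerTool, Description("Returns the stored data for a given user id.")]
    public static string GetUser(int id)
    {
        // In the real service this would query the UserService store.
        return $"User {id}";
    }
}
```

Tools declared this way can be discovered by the MCP server at startup and offered to the LLM, which then invokes them through chat.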
 
```bash
git clone https://github.com/genaray/AspNet.PlatformEngineering.Template
cd AspNet.PlatformEngineering.Template
```

Execute the Ansible playbook in /Ansible:

```bash
cd Ansible
ansible-playbook -i inventory.ini prepare_environment.yml
```

This will install all the necessary dependencies on your computer (macOS).
Modify appsettings.json in /AuthenticationService and /UserService for:
- Database connection (`ConnectionStrings:DefaultConnection`)
- Email sender settings (`EmailSender` section)
- UserService settings (`UserService` section; use `http://user-service:8080` for Docker Compose and `http://user-service-svc:8080` for Kubernetes and the cloud)
- AuthService settings (`AuthService` section; use `http://authentication-service:8080` for Docker Compose and `http://authentication-service-svc:8080` for Kubernetes and the cloud)
- OpenAI settings (`OpenAi` section)
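
As a rough orientation, the relevant sections of appsettings.json for a Docker Compose run might look like this. The section names follow the list above; the individual keys inside each section are assumptions and may differ from the actual repository:

```json
{
  "ConnectionStrings": {
    "DefaultConnection": "Host=db;Database=auth;Username=postgres;Password=changeme"
  },
  "EmailSender": {
    "SmtpHost": "smtp.example.com",
    "SmtpPort": 587
  },
  "UserService": {
    "BaseUrl": "http://user-service:8080"
  },
  "OpenAi": {
    "ApiKey": "sk-..."
  }
}
```

For Kubernetes and cloud deployments, swap the service URLs for their `-svc` counterparts as noted above.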
Modify main.tf in /Terraform for:
- AWS credentials (`access_key` and `secret_key`)
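
For reference, these credentials are typically wired into the AWS provider block roughly as follows. This is a sketch; the actual variable names and region in main.tf may differ:

```hcl
provider "aws" {
  region     = "eu-central-1" # example region, adjust to match main.tf
  access_key = var.access_key
  secret_key = var.secret_key
}
```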
Execute the following commands in the project's root folder.

This will build the AuthenticationService:

```bash
docker build \
  -f Docker/authentication.dockerfile \
  -t authentication-service:latest \
  .
```

And this will build the UserService:

```bash
docker build \
  -f Docker/user.dockerfile \
  -t user-service:latest \
  .
```

Execute the following command in the /Docker folder:

```bash
docker-compose up --build
```

This will:
- Build and start the ASP.NET Core projects.
 
Execute the following command in the /k8s folder:

```bash
kubectl apply -k .
```

Execute the following commands in the /Terraform folder:

```bash
export AWS_ACCESS_KEY_ID=AKIA…YOUR_KEY
export AWS_SECRET_ACCESS_KEY=abcd…YOUR_SECRET
terraform init
terraform apply
```

To establish a connection to the EKS cluster, you may have to adapt Terraform/kubeconfig.tpl and integrate it via the CLI.
The output `kubeconfig_raw` returns the kubeconfig that can be used to connect to the cluster afterwards.
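
Assuming the `kubeconfig_raw` output described above, the connection can typically be established like this (a sketch; the file name and location are arbitrary choices):

```shell
# Write the rendered kubeconfig to a file and point kubectl at it.
terraform output -raw kubeconfig_raw > kubeconfig.yaml
export KUBECONFIG="$PWD/kubeconfig.yaml"
kubectl get nodes
```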
The backend includes:
- An MCP server to expose tools to interact with the UserService.
- LangChain .NET to connect to OpenAI, orchestrate and execute tools.
- MCP tools to modify, delete and query data from the UserService.
In this project, the Model Context Protocol (MCP) acts as a standardized interface that defines how external tools and services are exposed to language models. LangChain leverages this protocol to dynamically discover available tools, understand their inputs and outputs, and route user prompts accordingly. The language model uses LangChain (with OpenAI) to interpret natural language instructions, while MCP ensures consistent access to backend capabilities such as data and user operations. Together, MCP and LangChain enable seamless, real-time interaction between the LLM and the underlying service infrastructure, all through a single, unified chat interface.
| Service | URL | 
|---|---|
| API (Swagger) | http://localhost:8080/swagger | 
| Chat-UI (LangChain) | http://localhost:8080/langchain | 
This is how the services can be addressed when deployed via Docker Compose. With Kubernetes, additional port configuration or port forwarding is necessary.
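
For Kubernetes, such port forwarding can be set up with kubectl, for example as follows. The Service name matches the `-svc` naming used in the settings above; the port numbers are assumptions:

```shell
# Forward local port 8080 to the authentication Service in the cluster.
kubectl port-forward svc/authentication-service-svc 8080:8080
```

Swagger and the chat UI are then reachable on localhost as in the table above.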
This project demonstrates a simple AI template leveraging LLM capabilities with MCP servers and LangChain. In the future, this could be complemented by a customized CI/CD pipeline to test the project, scan for security vulnerabilities (SonarQube, OWASP) and deploy to the cloud with ease; secrets and credentials could also be injected from the pipeline. Instead of using AspNet.Identity's own login mechanism, you could rely on a provider such as Keycloak or communicate directly with AWS Cognito. Instead of OpenAI, AWS Bedrock or another LLM could be integrated; more tools for controlling the backend could be added (user creation, invitation); and the LLM could contribute ML-based analysis functions as tools that, for example, perform user analyses or make predictions. As an extension, you could add products to the backend, allowing the LLM to create, manage, and analyze products using ML tools.
Feel free to submit pull requests or open issues!
Apache 2.0 License
