Well, who could have guessed? My first real post is about AI. I was thinking of starting with something more traditional, but the point of this blog is to document my learning journey, and right now, AI is what I’m learning the most about.
There are hundreds of tools and articles out there, and technology moves so fast that it’s hard to keep up. I want to document the process of leveraging AI in a real-world project and share my experience.
Maybe “real-world” is a stretch, but it’s at least something more than a console app. Here’s what I have in mind:
We’ll build a simple, simulated IoT environment with devices that generate telemetry data. This data will be saved to a database and visualized on a dashboard. The same devices will be controllable via commands that we can send from the same dashboard using our APIs. Finally, we’ll build an AI agent that can interact with the system based on user requests.
I know, it’s nothing jaw-dropping, but it’s a start, and I hope to expand on this project in the future with more AI features (and not just AI).
The High-Level Architecture
To bring this to life, we’ll be building several interconnected services, orchestrated with .NET Aspire:
- ClimateCore.AspireHost: This is the orchestration project that uses .NET Aspire to manage the infrastructure, services, and their interactions. It will spin up RabbitMQ, Postgres, and PgAdmin, and tie everything together. It will also run the Next.js frontend and connect it to the backend.
- ClimateCore.HvacSimulator: This project will simulate 3 IoT HVAC devices, generating temperature data and responding to commands.
- ClimateCore.IngestionService: This service will listen for telemetry data from the devices and save it to a Postgres database.
- ClimateCore.WebApi: This will expose APIs to get device status, telemetry data, and send commands to the devices.
- climate-core-ui: A simple Next.js frontend to visualize the data and interact with the devices.
In this first part, we’ll focus on building the world in which our Agent will operate.
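To make the orchestration a bit more concrete, here’s a rough sketch of what the AppHost wiring can look like. The resource and database names below are illustrative, not the exact code from the repo:

```csharp
// ClimateCore.AspireHost/Program.cs -- illustrative sketch, not the exact repo code.
var builder = DistributedApplication.CreateBuilder(args);

// Containerized infrastructure managed by Aspire.
var rabbitMq = builder.AddRabbitMQ("rabbitmq")
    .WithManagementPlugin();                         // enables the RabbitMQ management UI

var postgres = builder.AddPostgres("postgres")
    .WithPgAdmin();                                  // adds a PgAdmin container alongside
var telemetryDb = postgres.AddDatabase("climatecore");

// Backend services, each referencing the infrastructure it depends on.
builder.AddProject<Projects.ClimateCore_HvacSimulator>("hvac-simulator")
    .WithReference(rabbitMq);

builder.AddProject<Projects.ClimateCore_IngestionService>("ingestion")
    .WithReference(rabbitMq)
    .WithReference(telemetryDb);

var webApi = builder.AddProject<Projects.ClimateCore_WebApi>("webapi")
    .WithReference(rabbitMq)
    .WithReference(telemetryDb);

// The Next.js frontend is registered too -- see the snippet in the next section.

builder.Build().Run();
```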
Getting Up and Running
To follow along, you’ll need a few things installed first:
- .NET 10: I’m on preview `10.0.100-rc.1.25451.107`.
- Docker: Aspire needs it to manage our containerized services like RabbitMQ and Postgres.
- Node.js: For our Next.js frontend. I’m using version `24.6.0`.
Also, I’m using pnpm to install the `node_modules` for the UI. If you don’t want to enable pnpm and prefer another package manager, you can configure it in the AppHost of the Aspire project:
builder
.AddNpmApp("frontend", "../climate-core-ui", "dev") // Can be AddNpmApp or AddYarnApp etc
.WithPnpmPackageInstallation() // Can be WithNpmPackageInstallation or WithYarnPackageInstallation etc
.WaitFor(webApi)
.WithReference(webApi)
.WithHttpEndpoint(env: "PORT")
.WithExternalHttpEndpoints();
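For context on what this does: `WithReference(webApi)` makes the Web API’s address available to the frontend through environment variables (Aspire’s service discovery), `WaitFor(webApi)` holds the frontend back until the API is up, and `WithHttpEndpoint(env: "PORT")` passes the port Aspire assigns to the app via the `PORT` environment variable, which is what the Next.js dev server reads.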
Instead of a line-by-line setup guide, I’ve made the complete source code for this part available on GitHub. This lets you jump right into the running application.
Once you’ve cloned the repository, make sure Docker is running, and then execute this single command from the project’s root directory:
dotnet run --project ClimateCore.AspireHost
First Run: A Tour of the System
The first time you run the project, Aspire will automatically download the necessary Docker images for services like RabbitMQ and Postgres. This might take a few moments, but subsequent launches will be much faster.
Once the build is complete, your console will display several endpoints. Find the one for the Aspire Dashboard and open it in your browser. You should see something like this:
The dashboard is your central command center. From here, you can view logs, inspect configurations, and find the direct links to all the other running services. Let’s explore them:
- RabbitMQ: Our message broker that facilitates communication between the HVAC simulator and the ingestion service. You can access the RabbitMQ management dashboard via the link provided by Aspire and log in with `guest` as both username and password.
- Postgres: Our database where all telemetry data will be stored. You can access PgAdmin via the link provided by Aspire and use `postgres` as both username and password to quickly inspect the database and tables.
- Web API: This is where our backend APIs are exposed. You can download the OpenAPI specification by clicking on the link provided by Aspire for the `webapi` service and appending `/openapi/v1.json` to it, then import the spec into a tool like Bruno or Postman to explore the endpoints.
- Next.js Frontend: This is our simple web dashboard to visualize data and interact with the devices. You can access it by clicking on the link provided by Aspire for the `frontend` service.
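If you’re wondering what actually flows through RabbitMQ into Postgres: each simulated unit periodically publishes a small telemetry reading that the ingestion service persists. The shape below is only illustrative; the real contract lives in the repo.

```csharp
// Illustrative telemetry contract -- the actual record in the repo may differ.
public record TelemetryReading(
    string DeviceId,            // e.g. "hvac-1"
    double Temperature,         // current room temperature
    double Setpoint,            // target temperature
    bool CompressorOn,          // whether the compressor is currently running
    DateTimeOffset Timestamp);
```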
A Look at the Frontend
The UI is simple and functional, mostly whipped up by an LLM to get us started quickly. The real magic will be in the AI agent later!
On the left, you can choose which HVAC unit you want to monitor and select the start and end times for the telemetry data.
On the right, you get the live snapshot in the Device State panel, showing the current temperature and setpoint. Below that, you can send commands to update the setpoint or toggle the manual override.
Everything comes together in the big Telemetry chart at the bottom, which visualizes the history you selected. You can see the temperature (orange line) reacting to the setpoint (green line), with the blue bars showing exactly when the compressor kicks in.
When the manual override is active, the compressor won’t turn on or off automatically, which gives us (and our future AI) full control.
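To make the override behaviour concrete, here’s roughly the kind of control loop the simulator runs on every tick. This is a simplified illustration (names and constants are made up), not the actual simulator code:

```csharp
// Simplified sketch of the simulator's per-tick logic -- illustrative only.
public class HvacUnit
{
    public double Temperature { get; set; } = 24.0;
    public double Setpoint { get; set; } = 21.0;
    public bool CompressorOn { get; set; }
    public bool ManualOverride { get; set; }

    public void Tick()
    {
        if (!ManualOverride)
        {
            // Simple thermostat with a small hysteresis band around the setpoint.
            if (Temperature > Setpoint + 0.5) CompressorOn = true;
            else if (Temperature < Setpoint - 0.5) CompressorOn = false;
        }
        // With the override active, CompressorOn stays wherever it was last set.

        // The room cools while the compressor runs and slowly warms back up otherwise.
        Temperature += CompressorOn ? -0.2 : 0.1;
    }
}
```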
For now, you’ll need to refresh the page to see the latest device state. The focus of this series is on AI Agents, but I might add WebSockets later for real-time updates.
Next Steps
Now that the foundation is set, in the next part we will create an Agent that can interact with the system by acting on user requests! Can’t wait to see how it goes!