🚀 Building Arrodes 2.0 – A Developer’s Journey

Creating Arrodes 2.0, my take on a Jarvis-style AI assistant, was an intense yet rewarding ride through system design, AI integration, and interface development. The goal? Build a local, private, powerful assistant that could both converse intelligently and manage my system like a digital co-pilot.

🧱 Week 1–2: Planning & Architecture

I started with a clear goal: the assistant needed to be modular, fast, and private—no cloud dependencies. I sketched out a rough architecture: a FastAPI backend, an Ollama-based AI chat module, system monitoring hooks, and a responsive frontend using HTMX + Tailwind.

⚙️ Week 3–4: Backend & Core Features

This phase was backend-heavy. I implemented:

  • The base FastAPI app structure
  • System monitoring using psutil (CPU, memory, disk, network)
  • File browsing and process control endpoints
  • Chat sessions and basic context handling
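The metrics side of that list is only a few lines with psutil — a minimal sketch (the field names here are my illustration, not necessarily what the repo uses):

```python
import psutil  # third-party dependency, as mentioned above


def snapshot() -> dict:
    """One lightweight sample of the four metric families the dashboard shows."""
    mem = psutil.virtual_memory()
    net = psutil.net_io_counters()
    return {
        # interval=None returns instantly using the delta since the last call
        "cpu_percent": psutil.cpu_percent(interval=None),
        "memory_percent": mem.percent,
        "disk_percent": psutil.disk_usage("/").percent,
        "net_bytes_sent": net.bytes_sent,
        "net_bytes_recv": net.bytes_recv,
    }
```

Passing `interval=None` matters here: the blocking variant (`interval=1`) would stall a request for a full second.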

I faced my first real hurdle here: keeping system metric queries lightweight and non-blocking, especially when updating the UI in real time.
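One standard way to keep blocking metric calls off the event loop is `asyncio.to_thread` — a sketch with a stand-in probe in place of the real psutil call:

```python
import asyncio
import time


def slow_probe() -> dict:
    """Stand-in for a blocking metrics call (e.g. a psutil sample with an interval)."""
    time.sleep(0.1)
    return {"cpu_percent": 12.5}


async def metrics_endpoint() -> dict:
    # Run the blocking probe in a worker thread so the event loop stays free
    # to serve other requests while the sample is taken.
    return await asyncio.to_thread(slow_probe)
```

In an async FastAPI route, the same `await asyncio.to_thread(...)` pattern keeps concurrent UI polls from queueing behind each other.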

💬 Week 5: AI Integration

Integrating the Ollama LLM locally took longer than expected. Handling context windows, multiple sessions, and caching responses required deep testing. I also added endpoints to manage new chats, context clearing, and history tracking.
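For reference, Ollama exposes a local REST endpoint (`/api/chat` on port 11434 by default), so the integration boils down to shaping a message list and POSTing it. A minimal blocking sketch (helper names and the model name are placeholders, not the actual code):

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/chat"  # Ollama's default local endpoint


def build_chat_payload(model: str, messages: list[dict]) -> dict:
    # /api/chat takes a model name and an OpenAI-style message list;
    # stream=False asks for one complete JSON response instead of chunks.
    return {"model": model, "messages": messages, "stream": False}


def ask(model: str, messages: list[dict]) -> str:
    # Blocking call; inside the app this would run in a thread or an async client.
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(build_chat_payload(model, messages)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["message"]["content"]
```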

The challenge? Memory management and performance. Long sessions would start to drag if not handled carefully.
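One way to keep long sessions from dragging is trimming history to a rough budget before each request — a sketch using a character count as a crude stand-in for real token counting (names and limits are illustrative):

```python
def trim_history(messages: list[dict], max_chars: int = 8000) -> list[dict]:
    """Keep the system prompt plus the newest messages that fit the budget."""
    system = [m for m in messages if m["role"] == "system"]
    rest = [m for m in messages if m["role"] != "system"]

    kept, total = [], sum(len(m["content"]) for m in system)
    for m in reversed(rest):  # walk newest-first
        total += len(m["content"])
        if total > max_chars:
            break
        kept.append(m)
    return system + list(reversed(kept))
```

A real tokenizer would be more accurate, but even this keeps the context window (and latency) bounded as a session grows.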

🌐 Week 6–7: Frontend UI

With the backend stable, I shifted to the UI. HTMX made things easier, but real-time updates and making the UI feel alive required WebSocket-like behavior without actually using WebSockets.
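HTMX's polling triggers are one way to get that effect: the server renders a fragment that re-requests itself on a timer. A sketch (the `/metrics` path and layout are my assumptions):

```python
def metrics_panel(cpu: float, mem: float) -> str:
    # An HTMX fragment that re-polls the metrics endpoint every 2 seconds
    # and swaps itself out, giving a "live" dashboard without WebSockets.
    return (
        '<div id="metrics" hx-get="/metrics" '
        'hx-trigger="every 2s" hx-swap="outerHTML">'
        f"CPU: {cpu:.0f}% | RAM: {mem:.0f}%</div>"
    )
```

Because the fragment carries its own `hx-get`/`hx-trigger` attributes, the endpoint just returns fresh HTML and the browser keeps the loop going.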

I added:

  • A live dashboard for metrics
  • A chat interface
  • File browser and process manager UI

Styling with Tailwind was a breeze, but keeping the layout responsive across devices took some trial and error.

🔍 Week 8: Debugging, Polish & Docs

This week was about fixing edge cases (like non-UTF-8 file names crashing the file browser), adding error messages, writing documentation, and cleaning up the repo.
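Defensive decoding is one fix for that class of bug — a sketch where undecodable bytes are replaced rather than raised on:

```python
def display_name(raw: bytes) -> str:
    # Decode defensively: undecodable bytes become U+FFFD instead of raising,
    # so one odd filename can't crash the whole file-browser response.
    return raw.decode("utf-8", errors="replace")
```

(Python's `os.fsdecode` with its `surrogateescape` handling is the other common approach when you need to round-trip the original bytes.)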

I also added developer conveniences: live-reloading, environment configs, and a simple CLI to manage the server.
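A CLI like that can be a thin argparse wrapper — a hypothetical sketch (subcommand and flag names are illustrative, not the actual interface):

```python
import argparse


def build_parser() -> argparse.ArgumentParser:
    parser = argparse.ArgumentParser(prog="arrodes")
    sub = parser.add_subparsers(dest="command", required=True)

    serve = sub.add_parser("serve", help="run the FastAPI server")
    serve.add_argument("--port", type=int, default=8000)
    serve.add_argument("--reload", action="store_true",
                       help="enable live-reload for development")
    return parser
```

The `serve` handler would then hand the parsed options to uvicorn or an equivalent runner.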


🧠 Reflections

Building Arrodes 2.0 took about 2 months, from planning to polish. The biggest lessons?

  • Local AI assistants are very possible—and fun.
  • FastAPI and HTMX are a powerful combo for reactive apps.
  • Monitoring your own system gives a whole new respect for what’s going on under the hood.

Arrodes 2.0 is still evolving, but it’s already become a reliable sidekick in my daily workflow. You can explore it on GitHub.

