How I work with AI



2 min read · AI · Workflow · Engineering · MCP

There is a trend right now called "Vibe Coding"—typing vague instructions to an AI and shipping whatever comes out without really looking at it.

For me, that doesn't work. If I don't understand the logic behind the code, I feel like I’m losing control of the craft. I don't use AI to replace my thinking; I use it to handle the execution so I can focus on the architecture.

Here is a look at the actual protocol I use to build systems like this website.

1. The Process

My workflow is designed to prevent "hallucinations" and ensure the code is actually maintainable.

Step 1: Alignment (The Interview)

I never start by coding. I start by talking. I treat the LLM as a sounding board, dumping my abstract ideas into the chat until the model proves it understands the intent of the feature, not just the syntax.

Step 2: Context Injection

Once the AI understands the goal, I ask it to generate a System Prompt that summarizes our conversation. This ensures that every subsequent agent I spin up starts with the full context of the project, acting as a "clone" of my architectural intent.
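This step can be sketched as a small helper that wraps the agreed-upon summary into a reusable system prompt. Everything here is a hypothetical illustration of the pattern, not the actual prompt I use; the function and parameter names are my own placeholders.

```python
# A minimal sketch of the "context injection" step: after the alignment
# interview, the model's summary of our conversation becomes the system
# prompt for every subsequent agent. All names here are placeholders.

def build_system_prompt(project_name: str, summary: str, constraints: list[str]) -> str:
    """Wrap the conversation summary and hard rules into a system prompt."""
    rules = "\n".join(f"- {c}" for c in constraints)
    return (
        f"You are an engineering agent working on {project_name}.\n"
        f"Project context (agreed during the alignment interview):\n{summary}\n"
        f"Hard rules:\n{rules}"
    )

prompt = build_system_prompt(
    "personal-site",
    "SvelteKit front end, Directus headless CMS, tooling wired up via MCP.",
    ["Never invent API endpoints", "Ask before changing the schema"],
)
```

The point is that the context lives in one artifact, so spinning up a fresh agent never means re-explaining the project from scratch.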

Step 3: Architecture (ADO + MCP)

I use the Model Context Protocol (MCP) to connect the AI to my project management tools (Azure DevOps). We break the goal down into atomic, testable work items. If the plan is solid, the code usually follows.

Step 4: The Review

My default stance is skepticism. I review AI-generated code the same way I would review a junior engineer's Pull Request. I check for logic errors, security flaws, and inefficiency. I am the filter that ensures quality.

2. Case Studies

Case Study 01: The Self-Writing Knowledge Base

The Challenge: I wanted a digital twin to greet visitors, but I didn't want to spend hours manually writing documents about my own history.

The Solution:
I created an automated loop using Directus (Headless CMS) and MCP.
1. The Interview: I spun up an agent designed to interview me about my background.
2. The Injection: The agent listened to my natural answers, structured them into clean JSON data, and used the Directus API to inject them directly into the database.
3. The Result: I built a searchable "Source of Truth" for my bot without writing a single line of manual documentation.
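The injection step above can be sketched roughly as follows. Directus exposes a REST Items API (`POST /items/<collection>`), but the collection name `knowledge_base`, the field names, and the URL are assumptions for illustration; the request is built but not sent here.

```python
# A sketch of Case Study 01's injection loop: an interview answer is
# structured into clean JSON and written to Directus via its Items API.
# Collection and field names are assumptions, not my real schema.

import json
import urllib.request

def to_record(question: str, answer: str, tags: list[str]) -> dict:
    """Shape one raw interview exchange into a clean knowledge-base row."""
    return {"question": question.strip(), "answer": answer.strip(), "tags": tags}

def inject(base_url: str, token: str, record: dict) -> urllib.request.Request:
    """Build (but do not send) the Directus request that stores one record."""
    return urllib.request.Request(
        f"{base_url}/items/knowledge_base",
        data=json.dumps(record).encode(),
        headers={"Authorization": f"Bearer {token}", "Content-Type": "application/json"},
        method="POST",
    )
```

In the real loop, the agent calls the equivalent of `inject` after every answer, so the knowledge base grows as the conversation does.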

Case Study 02: Maintaining Design Consistency

The Challenge: AI is notoriously bad at CSS. It tends to generate generic or broken styles because it lacks "taste."

The Solution:
I don't ask the AI to design; I ask it to replicate.
1. The Seed: I manually built one "Golden Sample" component with perfect Svelte structure and Tailwind classes.
2. The Guardrails: I fed this sample into the context with a strict rule: Follow this structure exactly.
3. The Result: The AI stopped guessing. It started generating UI components that perfectly matched my design system, simply by following the pattern I established.
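The guardrail above amounts to pinning a reference component in the prompt with a strict replication rule. Here is a minimal sketch of that pattern; the Svelte snippet is an illustrative stand-in, not my actual golden sample.

```python
# A sketch of the "golden sample" guardrail from Case Study 02: a
# hand-built reference component is embedded in every generation prompt
# with a strict rule to replicate its structure. The component below is
# a made-up stand-in for the real golden sample.

GOLDEN_SAMPLE = """<script lang="ts">
  export let title: string;
</script>

<section class="rounded-2xl bg-zinc-900 p-6 shadow-lg">
  <h2 class="text-xl font-semibold text-zinc-100">{title}</h2>
  <slot />
</section>"""

def component_prompt(component_name: str, requirements: str) -> str:
    """Build a generation prompt anchored to the golden sample."""
    return (
        f"Generate a Svelte component named {component_name}.\n"
        f"Requirements: {requirements}\n"
        "Follow the structure, class conventions, and formatting of this "
        "golden sample EXACTLY:\n\n" + GOLDEN_SAMPLE
    )
```

Because the model is pattern-matching against a concrete example rather than inventing styles, its output stays inside the design system.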

3. The Future

People often ask if I’m worried about AI replacing engineers.

I think the role of "writing code" will definitely change. But the role of "solving problems" isn't going anywhere. Even if AI can do the heavy lifting, I’ll still be here defining what we should build and why it matters.

For now, I'm just happy to have a tool that lets me build worlds from my single monitor.
