I built an MCP server this week. Not a toy, and not just a proof of concept; I wanted a working tool that lets a Copilot agent publish blog posts to a twenty-year-old blogging engine: thirteen tools, an XML-RPC client, multi-blog profile support, credential persistence, and auto-discovery of blog APIs. I built the whole thing in a single session with GitHub Copilot, describing what I wanted more than typing code.

The DasBlog Core blogging engine is an open-source project I've maintained and upgraded for years. It speaks MetaWeblog, Blogger, and Movable Type APIs over XML-RPC, protocols designed over 20 years ago so that desktop clients like Open Live Writer could talk to blog servers.

The Model Context Protocol was published in 2024 so that agents could actually use tools. The MCP server I built sits between the agent and my blog. It translates natural language requests from an agent into XML-RPC calls that DasBlog knows how to respond to.

DasBlog was never designed for this. Nobody writing XML-RPC handlers in 2003 was imagining that an AI would one day call metaWeblog.newPost on behalf of a human having a conversation in a terminal or an IDE. Here is what that bridge looks like:

// Map the agent-facing BlogPost model onto the MetaWeblog struct,
// then invoke the 2003-era XML-RPC endpoint.
public async Task<string> CreatePostAsync(BlogPost post, bool publish)
{
    var postStruct = Struct(MapFromPost(post));

    // metaWeblog.newPost returns the new post's ID as a string.
    var result = await Rpc.InvokeAsync("metaWeblog.newPost",
        Str(Profile.BlogId), Str(Profile.Username), Str(Profile.Password), postStruct, Bool(publish));

    return result.AsString();
}

A few lines of C# connecting a 2024 protocol to a 2003 API.
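On the agent-facing side, each of those thirteen operations is advertised as a named tool. A minimal sketch of that dispatch pattern (this is illustrative, not the actual MCP SDK wiring; `ToolRegistry` and its shapes are my invention here):

```csharp
using System;
using System.Collections.Generic;
using System.Threading.Tasks;

// Hypothetical sketch: each MCP tool name maps to a handler that turns
// the agent's arguments into a typed call on the blog client.
public class ToolRegistry
{
    private readonly Dictionary<string, Func<IDictionary<string, string>, Task<string>>> _tools = new();

    public void Register(string name, Func<IDictionary<string, string>, Task<string>> handler)
        => _tools[name] = handler;

    public Task<string> InvokeAsync(string name, IDictionary<string, string> args)
        => _tools.TryGetValue(name, out var handler)
            ? handler(args)
            : throw new ArgumentException($"Unknown tool: {name}");
}
```

The real server does more (schemas, descriptions, error shaping), but the essential move is the same: a named tool on one side, a typed client call on the other.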

Anil Dash wrote a piece called "Codeless" earlier this year framing the current moment as the latest abstraction in a long history of letting developers focus on the problem rather than the plumbing. He makes a compelling case, and the part about putting power in the hands of product managers resonates with me personally as I have moved into a product role. But he also concedes something quietly:

"Codeless approaches are probably not a great way to take over a big legacy codebase, since they rely on accurately describing an entire problem, which can often be difficult to completely capture."

This is exactly where I started. The software I needed already existed. The interesting question is not whether AI can generate greenfield applications from a strategic plan. It is whether AI can connect what you already have to what comes next.

Greenfield is easy to demo. You describe a feature, the agents generate code, you "ship" it. But brownfield development is where I believe the real test lives. Brownfield means case-sensitive method names that will silently fail if you get the capitalization wrong. It means timezone conversion bugs buried in a refactoring that someone started and never finished 22 years ago. It means an API that has no deletePost method in one protocol but does in another, so your abstraction layer has to quietly delegate to a different client for that one operation.
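That last case can be sketched as a small facade. The names and delegate shapes here are illustrative, not DasBlog's real types, but the structure is the point: most operations ride on MetaWeblog, while delete quietly routes to the Blogger API client behind the same surface.

```csharp
using System;
using System.Threading.Tasks;

// Illustrative sketch -- each underlying API client is represented
// by a delegate to keep the example self-contained.
public class BlogClientFacade
{
    private readonly Func<string, string, Task<string>> _metaWeblogCreate;
    private readonly Func<string, Task<bool>> _bloggerDelete;

    public BlogClientFacade(
        Func<string, string, Task<string>> metaWeblogCreate,
        Func<string, Task<bool>> bloggerDelete)
        => (_metaWeblogCreate, _bloggerDelete) = (metaWeblogCreate, bloggerDelete);

    // Most operations go through metaWeblog.* ...
    public Task<string> CreatePostAsync(string title, string body)
        => _metaWeblogCreate(title, body);

    // ...but delete has to route to blogger.deletePost,
    // because MetaWeblog never defined an equivalent.
    public Task<bool> DeletePostAsync(string postId)
        => _bloggerDelete(postId);
}
```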

I encountered all of these building this MCP server. Copilot helped me write the code, but the knowledge that made it work was mine: understanding how DasBlog stores entries, knowing that XML-RPC dateTime.iso8601 carries no timezone information, recognizing that the Movable Type API is additive, layering extra methods on top of MetaWeblog rather than replacing it. You cannot bridge what you do not know.
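The timezone point is worth making concrete. An XML-RPC dateTime.iso8601 value looks like 19980717T14:08:55 (that example is from the spec itself): no offset, no zone. A bridge has to decide explicitly what the wire value means; here is a minimal sketch where the helper name and the choice to treat the value as UTC are mine, not DasBlog's:

```csharp
using System;
using System.Globalization;

public static class XmlRpcDates
{
    // XML-RPC dateTime.iso8601 carries no timezone information.
    private const string Format = "yyyyMMdd'T'HH:mm:ss";

    // Treat the wire value as UTC explicitly, rather than letting
    // DateTime.Parse silently assume local time.
    public static DateTime ParseAsUtc(string value) =>
        DateTime.SpecifyKind(
            DateTime.ParseExact(value, Format, CultureInfo.InvariantCulture),
            DateTimeKind.Utc);

    public static string Serialize(DateTime utc) =>
        utc.ToUniversalTime().ToString(Format, CultureInfo.InvariantCulture);
}
```

Whichever convention you pick, it has to match what the server on the other end assumed twenty years ago; the code cannot tell you that, only the system's history can.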

This is where I think the conversation about AI and software development overlooks the brownfield reality. The abstractions Anil describes are real and valuable. But an abstraction built over ignorance will not simplify things in the long term. It only hides complexity with indirection.

The twenty-year-old XML-RPC layer in DasBlog only became useful again because someone (me) understood what it did, knew why it existed, and how to connect it to something new. Copilot wrote most of the glue, but it could not know which pieces belonged together or that a bridge was needed.

There are tens of thousands of systems like DasBlog sitting in production environments right now. Internal tools, legacy APIs, department-level applications that do their jobs quietly and well. Scott Hanselman once called the people who build and maintain these systems dark matter developers, the unseen majority who are not chasing the latest beta but are getting work done with mature, well-understood technology.

They were not building for this moment, but many of them have structured interfaces, documented protocols, and stable systems. The real work now is in understanding what you have well enough to connect it to what has become possible.

[Image: Construction cranes and scaffolding netting surround the ornate stone towers of Gaudí's Sagrada Familia in Barcelona, a basilica begun in 1882 and still under construction.]

