Non-technical guide to MCP servers

A complete guide to Model Context Protocol (MCP) servers: what they are, how they work, and why they matter.

5 min read
May 22, 2025
Author
Megan Johnson

Megan is a Technical Content Marketer. 



As artificial intelligence models become more capable, a new problem emerges: How do we reliably connect these models to the tools, data, and systems they need to interact with in real time?

In November 2024, Anthropic answered this question with the Model Context Protocol (MCP), a new open standard for integrating LLMs with external tools. Instead of hardcoding integrations for every external service, you can use MCP to create a single, universal interface.

I’ve been getting into vibe coding recently and building a lot of cool projects, so it was only a matter of time before I stumbled on MCP. I’ve written this guide to help you get up to speed: I’m going to break down what MCP is, how it works, and why it’s a game-changer for builders and users alike.

What is the Model Context Protocol (MCP)?

The Model Context Protocol acts as a universal connector between your AI assistant and real-world tools like GitHub, Notion, Supabase, Webflow, and more. Anthropic originally built MCP to improve Claude’s interaction with external tools, and open-sourced it at launch to encourage broader industry adoption; developers have been buzzing ever since.

Why was MCP created?

Large language models are undeniably powerful, but they’re often disconnected from real-time or task-specific data. MCP solves this by providing a secure and standardized way to:

  • Fetch live data from APIs
  • Access structured databases
  • Control applications and perform tasks
  • Generate dynamic, context-aware responses

For instance, with Webflow's new MCP server, you can connect an AI model like Claude directly to your Webflow site. Once connected, you can simply type commands like “Create a collection called Blog Posts” or “Publish my site” in plain English.

Claude interprets your instruction, invokes the right tool via the MCP server, and performs the action instantly: no code, no manual UI clicks. Within seconds, your CMS updates, collections are created, and sites are published, all from a natural language interface.

MCP was created to solve some of the most persistent challenges in building useful AI agents.

What problems does MCP solve?

Fragmented APIs and insufficient context: An LLM doesn’t intuitively know which API to use for your task. It needs to infer the task type (e.g., email vs. Slack search), pick the correct API function, and then parse results into natural language. This leads to hallucinations and incorrect outputs; we’ve all been there when ChatGPT gives us the wrong answer.

LLMs can struggle with multi-step workflows: Multi-step workflows (like fetching a contact ID, reading contact data, then updating it) are second nature in traditional code. But LLMs can lose context between actions, leading to incomplete tasks.

Vendor lock-in: If you build around GPT-4, you might find yourself forced to rebuild everything if you switch to another model like Claude or Gemini. MCP introduces a model-agnostic layer that allows the same tools to be reused across multiple LLMs without rewriting logic or descriptions.

How does MCP work?

MCP enables language models to interact with external tools in a secure, structured, and modular way. Instead of building separate, custom integrations for every tool (Webflow, Notion, and so on), you set up an MCP server that makes those tools accessible, and the AI can then find and use them when needed.

To understand how MCP works, let’s walk through the example I mentioned earlier: using Claude Desktop to interact with your Webflow site via its MCP server. Suppose you want to create a new CMS collection called Blog Posts. Here’s how that simple request travels through the MCP architecture:

  1. You send a request to the AI assistant.

    You type a natural language instruction like: “Create a collection called Blog Posts” into Claude Desktop.

  2. The MCP Host checks which tools are available.

    Claude Desktop, acting as the MCP Host, is already connected to Webflow’s MCP server. It identifies that Webflow supports tools like createCollection, getItems, and publishSite.

  3. The MCP Client passes your request and tool options to the model.

    Behind the scenes, Claude’s MCP Client sends your request, along with a list of available Webflow tools, to the language model.

  4. The AI decides which tool to use and sends a structured request.

    Claude interprets your intent, realizes you want to create a Collection and generates a structured tool invocation, complete with the right parameters (like the name “Blog Posts”).

  5. The MCP Server talks to Webflow and performs the action.

    Webflow’s MCP server receives the request, communicates with the Webflow API, and executes the action, creating a new collection in your CMS.

  6. The result is sent back to the AI.

    Once Webflow confirms the collection has been created (with details like name, ID, and slug), that response is sent back through the MCP pipeline.

  7. The AI generates a response based on that result.

    Claude receives the confirmation and lets you know: “I’ve successfully created a collection called Blog Posts.”

  8. You see the final AI-generated response and your new collection.

    You refresh your Webflow CMS and see the new collection live, just as Claude described. No manual steps. No code.
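The round trip above boils down to a pair of JSON-RPC messages. The sketch below is a simplified illustration of the MCP wire format; real messages carry more fields, and the createCollection tool name and its arguments mirror the hypothetical Webflow example rather than any documented API:

```python
import json

# Steps 2-3: the host asks the MCP server which tools it exposes ("tools/list").
list_request = {"jsonrpc": "2.0", "id": 1, "method": "tools/list"}

# Step 4: the model picks a tool, and the client sends a structured call ("tools/call").
call_request = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/call",
    "params": {
        "name": "createCollection",                  # tool advertised by the server
        "arguments": {"displayName": "Blog Posts"},  # parameters the model filled in
    },
}

# Step 6: the server replies with a structured result the model turns into prose.
call_result = {
    "jsonrpc": "2.0",
    "id": 2,
    "result": {"content": [{"type": "text", "text": "Created collection 'Blog Posts'"}]},
}

# The whole exchange is serialized JSON on both sides.
print(json.dumps(call_request, indent=2))
```

Because every tool speaks this same request/response shape, the model never needs bespoke glue code per service; it only has to fill in a name and its arguments.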

Key design principles behind MCP

The simplicity of the example above is powered by this thoughtful and scalable architecture. These core principles explain why MCP works across so many different tools and use cases:

Standard interface: MCP defines a universal format for integrating AI with external APIs, databases, file systems, and developer tools.

Client-server architecture: AI assistants act as MCP clients that connect to lightweight servers, which expose tool capabilities and data.

Two-way communication: Agents can both retrieve data and take action.

Dynamic tool discovery: Unlike fixed, build-time integrations (such as those defined with OpenAPI), MCP supports runtime tool discovery, allowing models to see what’s available in real time.

Agentic interaction: Supports multi-step workflows and session-based memory, enabling AI agents to carry out tasks over multiple turns and tools.

The building blocks of MCP

MCP follows a client-server architecture that allows AI models to communicate with external tools in a structured and modular way. Each part of the system plays a distinct role in enabling the model to discover, understand, and interact with tools in real time. Here’s a breakdown of the key components that make it all work:

MCP Host

The MCP host is the environment where the AI agent operates; it manages communication between the language model and the external tools available through MCP. An example of an MCP host is Claude Desktop.

MCP Client

The client acts like your app’s translator. It handles all the back-and-forth between the AI and the external tools by talking to MCP servers. Let’s say you’re using Claude Desktop to manage your Webflow site. When you type “Create a blog post collection,” the MCP Client sends that instruction to Webflow’s MCP Server.

MCP Server

The MCP Server is the bridge between the AI assistant and the external tools or data sources you want it to use. It connects to services like GitHub, Notion, Webflow, Supabase, Airtable, or even files on your computer.
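As a concrete example, a desktop host like Claude Desktop is typically pointed at MCP servers through a JSON config file. The entry below is hypothetical; the package name and token variable for Webflow’s server are assumptions, so check the server’s own docs for the real values:

```json
{
  "mcpServers": {
    "webflow": {
      "command": "npx",
      "args": ["-y", "webflow-mcp-server"],
      "env": {
        "WEBFLOW_TOKEN": "<your-api-token>"
      }
    }
  }
}
```

When the host restarts, it launches each configured server with its command, and the tools those servers expose become available to the model automatically.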

MCP’s benefits for developers and builders

By standardizing how AI assistants connect to external tools, MCP makes it easier to build, scale, and maintain powerful AI-driven applications.

For developers

MCP simplifies tool integration by introducing a consistent, reusable interface. Instead of building custom integrations for every model or platform, developers can expose tools once and use them across multiple AI systems.

  • Develop once, deploy across any LLM
  • Minimize hallucinations with structured, validated tool calls
  • Enable secure, permission-based access to real-time data
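To make the “structured, validated tool calls” point concrete, here is a minimal Python sketch of the kind of validation an MCP-style server performs before running anything. The tool name and schema are illustrative, not any real server’s API:

```python
# Illustrative registry: each tool declares its required parameters and a handler.
TOOLS = {
    "createCollection": {
        "required": {"displayName"},
        "handler": lambda args: f"Created collection '{args['displayName']}'",
    },
}

def call_tool(name: str, arguments: dict) -> str:
    """Reject unknown tools and missing parameters instead of letting the model guess."""
    if name not in TOOLS:
        raise ValueError(f"Unknown tool: {name}")
    missing = TOOLS[name]["required"] - arguments.keys()
    if missing:
        raise ValueError(f"Missing parameters: {sorted(missing)}")
    return TOOLS[name]["handler"](arguments)

print(call_tool("createCollection", {"displayName": "Blog Posts"}))
```

Because a call either matches a declared schema or fails loudly, a hallucinated tool name or a dropped parameter becomes an explicit error the model can correct, rather than a silently wrong result.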

For AI builders

MCP unlocks new levels of functionality for AI assistants. It allows you to go beyond chat: interacting with tools, running workflows, and responding based on live data, without needing constant fine-tuning. You can:

  • Equip models with the ability to reason, take action, and adapt to context
  • Add tool-based capabilities without retraining the model
  • Create rich, multi-step workflows with minimal engineering effort

Popular tools with MCP servers

MCP is already integrated with several widely used platforms, making it easier than ever to connect your AI assistant to the tools you rely on daily.

Below are examples of how MCP servers work with some of the most popular tools.

Webflow MCP server

Webflow’s MCP server lets AI assistants like Claude interact directly with your website’s CMS and design structure. Once connected, you can use natural language to create and update collections, modify fields, fetch page data, or even publish your site.

For example, you could type “Create a collection called Blog Posts with fields for Title and Author” or “Publish the homepage,” and Claude will handle it end to end.

Supabase MCP Server

Supabase’s MCP server connects your AI assistant to a fully featured Postgres database, complete with auth, storage, and real-time capabilities. This makes it possible to run live queries, insert records, or fetch user data securely and on demand. You could ask Claude, “Get the number of new users in the last 7 days,” or “Add a user with email user@example.com to the beta testers table.” With fine-grained access control and schema awareness, the AI can interact with your backend intelligently and safely.

Notion MCP Server

The Notion MCP server allows AI assistants to access, search, and update content within your Notion workspace. This includes retrieving documents, adding notes, editing databases, and even filtering tasks.

For example, you could say “Find my meeting notes from last Thursday,” “Create a task for writing the Q2 report,” or “Summarize all action items in this page.” By making your Notion data accessible in real time, the MCP server turns your workspace into a dynamic memory system for the AI.

What is the future with MCP?

We’re on the verge of a major shift in how AI applications are built and deployed. I believe that we’ll see the rise of plug-and-play MCP tool marketplaces, faster product development with fewer integration hurdles, and AI ecosystems that are interoperable by default.

We’ll see AI agents powered by MCP orchestrate real work: drafting reports, updating blog collections, summarizing data. Whalesync fits directly into this future. With two-way syncing across platforms like Notion, Airtable, and Webflow, Whalesync ensures data stays up to date, no matter where the AI is acting.

Subscribe for more

Stay up to date with the latest no-code data news, strategies, and insights sent straight to your inbox!
