Fabric Pro-Dev MCP Server overview

Fabric Pro-Dev MCP Server is a local, development-focused implementation of the Model Context Protocol (MCP) designed for building and extending Fabric solutions on your development machine.

Note

Fabric Pro-Dev MCP Server is available as an open-source project on GitHub. This article provides a brief overview, but full documentation is maintained in the repository.

What is Fabric Pro-Dev MCP Server?

Fabric Pro-Dev MCP Server runs as a local subprocess on your development machine, providing AI agents with access to Fabric operations and local file system resources. It's optimized for development workflows where you need:

  • Local execution — Runs on your machine, no cloud dependency
  • File system access — Read/write local configuration and data files
  • Extensibility — Add custom tools and workflows for your needs
  • Development focus — Tools optimized for building Fabric solutions

Pro-Dev runs entirely on your machine and can integrate with your local development environment.
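For example, MCP-capable clients typically register a local stdio server in their configuration so the agent can launch it as a subprocess. The following is a minimal sketch using the common `mcpServers` configuration shape; the server name and package identifier shown here are placeholders, so check the GitHub repository for the actual install command and arguments:

```json
{
  "mcpServers": {
    "fabric-pro-dev": {
      "command": "npx",
      "args": ["-y", "<fabric-pro-dev-mcp-package>"]
    }
  }
}
```

With an entry like this, the client starts the server on demand and communicates with it over standard input and output, which is what makes the fully local, offline-capable workflow possible.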

Key features

  • Local subprocess architecture — Starts as a subprocess when your AI agent needs it and shuts down when the session ends
  • Development tooling — Focused on building and testing Fabric solutions
  • File system integration — Access local files and configurations
  • Open source — Extend and customize for your workflows
  • Offline capable — Works in disconnected development environments

When to use Pro-Dev server

Choose Fabric Pro-Dev MCP Server for:

  • Active Fabric development — Building semantic models, reports, or data pipelines
  • Custom workflows — Automating repetitive development tasks
  • Local testing — Testing Fabric integrations before deployment
  • File-based operations — Working with local configuration or data files
  • Team extensibility — Sharing custom tools across your development team

Architecture

AI Agent ↔ Fabric Pro-Dev MCP Server (local) ↔ Fabric REST APIs
          ↕                                      ↕
     Local File System                    Local Dev Tools

The Pro-Dev server:

  1. Runs as a subprocess started by your AI agent
  2. Authenticates using locally configured credentials
  3. Calls Fabric APIs or accesses local resources
  4. Returns results through the MCP protocol
  5. Terminates when the AI agent session ends
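The messages exchanged in steps 3 and 4 are JSON-RPC 2.0 objects sent over the subprocess's stdin and stdout. The sketch below shows the general shape of a `tools/call` request an agent might send; the tool name `list_workspaces` and its arguments are hypothetical, not names taken from the server's actual tools reference:

```python
import json

# An MCP tool invocation is a JSON-RPC 2.0 request. The method name
# "tools/call" comes from the MCP specification; the tool name and
# arguments below are hypothetical placeholders.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "list_workspaces",           # hypothetical tool name
        "arguments": {"capacity": "trial"},  # hypothetical arguments
    },
}

# Over the stdio transport, each message travels as a serialized JSON
# object on the subprocess's stdin/stdout.
wire_message = json.dumps(request)
decoded = json.loads(wire_message)
print(decoded["method"])
```

The server's reply is a matching JSON-RPC response carrying the tool's result, which the agent reads from the subprocess's stdout.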

Getting started

Full installation, configuration, and usage documentation is available in the GitHub repository.

Key documentation sections

In the GitHub repository, you'll find:

  • Installation guide — npm package or source code installation
  • Configuration — Local authentication and settings
  • Tools reference — Available development-focused tools
  • Extension guide — Add custom tools and workflows
  • Examples — Common development scenarios
  • Contributing — Help improve the server