Building Custom Power BI Connectors for REST APIs: The Complete 2026 Playbook

December 19, 2025 at 02:27 AM | Est. read time: 15 min

By Valentina Vianna

Community manager and producer of specialized marketing content

Power BI is fantastic out of the box—but when your data lives behind bespoke REST APIs, the native connectors and the generic Web connector can quickly hit their limits. That’s where custom connectors come in. With a well-built Power BI custom connector (written in Power Query M), you can give analysts a simple, secure, reusable way to pull governed, production-ready data from any REST API into semantic models and reports.

This end-to-end guide walks you through planning, building, testing, deploying, and governing a REST API custom connector for Power BI—plus the patterns you’ll need for OAuth 2.0, pagination, rate limiting, and incremental refresh. Whether you’re connecting to a SaaS platform, a partner API, or your own microservices, this is the playbook to get it right.

If you’re new to the platform, a quick primer on capabilities and components is here: What is Microsoft Power BI?

Why build a custom Power BI connector?

  • One-click reuse across teams: Analysts select your connector in Get Data, authenticate, and choose a table—no manual M scripts, no brittle copy-paste queries.
  • Centralized authentication: Support API key, Basic, or OAuth 2.0 safely. Rotate secrets without breaking dozens of reports.
  • Performance patterns baked in: Handle pagination, throttling, and large datasets with tested logic instead of ad-hoc queries.
  • Governed access: Publish a single source of truth and control refresh through gateways and service settings.
  • Maintainability: Version, test, and evolve the connector as the API changes—without touching every report.

In 2026, custom connectors are becoming the default for enterprises standardizing access to internal APIs and third-party platforms while preserving governance and developer velocity.

How a Power BI custom connector works (in plain English)

Under the hood, a connector is a Power Query M extension packaged as a .mez file. It defines:

  • A Data Source Kind: name, authentication type, test connection behavior, metadata (label, category, icons).
  • One or more shared functions (e.g., YourConnector.Contents) that call the API and return tables.
  • Optional OAuth handlers (StartLogin, FinishLogin, Refresh) for secure sign-in flows.

Where it runs:

  • Power BI Desktop: You install the .mez to Documents\Power BI Desktop\Custom Connectors.
  • Power BI Service: Data refresh runs through an on-premises data gateway (even if your API is cloud-hosted). Install the same .mez on the gateway machine and enable custom connectors.

Tip: Aim for a UX where business users only choose the connector, authenticate, and select logical entities (Orders, Users, Events), not raw endpoints.

Prerequisites and tooling

  • Power BI Desktop (latest)
  • Power Query SDK (the Visual Studio Code extension, which replaces the older Visual Studio SDK) to scaffold the M extension project and build the .mez
  • Access to the target API (sandbox credentials are ideal)
  • Optional debugging tools: Fiddler/Charles Proxy, Postman, Power Query Diagnostics

Design first: write a short connector spec

Before coding, capture the essentials:

  • Base URL and versioning: https://api.example.com/v1/
  • Authentication: API key header? OAuth 2.0 Authorization Code? Service account?
  • Entities and output tables: Which endpoints map to clean, typed tables?
  • Pagination: page/per_page, offset/limit, cursor tokens, or RFC5988 Link headers
  • Rate limiting: 429 and Retry-After header? Provider-specific headers (X-RateLimit-Remaining)?
  • Filters and date ranges: Can you pass created_after/updated_since for incremental refresh?
  • Error handling: 4xx vs 5xx, structured error payloads
  • Data volume: expected rows, refresh frequency, Pro vs Premium capacity considerations

This spec keeps the M code lean and future-proof.

Core architecture patterns for REST APIs

1) A safe request wrapper (retry + backoff)

  • Use Web.Contents() with ManualStatusHandling to capture 429/5xx.
  • Implement exponential backoff with Function.InvokeAfter().
  • Respect Retry-After headers if present.

2) Pagination strategies

  • Page/offset: increment until empty or has_more=false.
  • Cursor/continuation: pass the next token from each response into the following request (see the sketch after this list).
  • Link headers: parse “next” rel to continue.
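
For the cursor/continuation pattern, a minimal sketch looks like the following—assuming the response carries a data array and a next_cursor token (both names, plus the limit parameter, are placeholders you'd map to your API):

```
// Cursor-based pagination sketch; "data", "next_cursor", and "limit" are placeholder names
GetAllByCursor = (baseUrl as text, path as text) as list =>
    let
        GetPage = (token as nullable text) as record =>
            let
                Query = if token = null then [limit = "100"] else [limit = "100", cursor = token],
                Json = Json.Document(Web.Contents(baseUrl, [RelativePath = path, Query = Query])),
                Data = try Json[data] otherwise {},
                Next = try Json[next_cursor] otherwise null
            in
                [Data = Data, Next = Next],
        // Keep requesting pages until a page comes back empty
        Pages = List.Generate(
            () => GetPage(null),
            each List.Count([Data]) > 0,
            each if [Next] <> null then GetPage([Next]) else [Data = {}, Next = null],
            each [Data]
        )
    in
        List.Combine(Pages);
```

The same List.Generate shape covers page/offset APIs—only the state record and the stop condition change.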

3) Schema and typing

  • Convert JSON to records and then to tables.
  • Lock types (id as Int64, timestamps as datetimezone, etc.) to stabilize models.
  • Handle nested objects with Table.ExpandRecordColumn/Table.ExpandListColumn.
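
As a small illustration (sample rows and column names are invented), expanding a nested record and locking types might look like this:

```
let
    // One sample row with a nested "customer" record
    Source = Table.FromRecords({
        [id = 1, created_at = "2026-01-05T10:00:00Z", customer = [name = "Acme", country = "BR"]]
    }),
    // Flatten the nested record into stable, prefixed columns
    Expanded = Table.ExpandRecordColumn(Source, "customer", {"name", "country"}, {"customer_name", "customer_country"}),
    // Lock the types so the model schema stays stable across refreshes
    Typed = Table.TransformColumnTypes(Expanded, {
        {"id", Int64.Type},
        {"created_at", type datetimezone},
        {"customer_name", type text},
        {"customer_country", type text}
    })
in
    Typed
```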

4) Incremental refresh

  • Add RangeStart and RangeEnd parameters.
  • Pass them as query filters to the API so each partition calls only the relevant slice.

5) Caching and performance

  • Use Web.Contents with a static base URL plus RelativePath and Query options so Power Query can cache requests and the service can resolve the data source for refresh (see the sketch after this list).
  • Buffer intermediate tables only when necessary to avoid repeated calls.
  • Keep transformations after data retrieval as simple and columnar as possible.
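
A sketch of the preferred call shape from the first bullet above; the endpoint and query parameters are illustrative:

```
let
    // Static base URL plus RelativePath/Query options: cache-friendly and
    // lets the service identify the data source for credentials and refresh
    Preferred = Web.Contents(
        "https://api.example.com/v1/",
        [RelativePath = "orders", Query = [page = "1", per_page = "100"]]
    ),
    // Equivalent request, but a fully dynamic URL is harder to cache and to refresh reliably
    Concatenated = Web.Contents("https://api.example.com/v1/orders?page=1&per_page=100")
in
    Preferred
```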

A minimal M skeleton (API key + pagination)

Note: This is a conceptual example to show patterns you’ll adapt to your API.

```
section RestConnector;

BaseUrl = "https://api.example.com/v1/";

[DataSource.Kind = "RestConnector", Publish = "RestConnector.Publish"]
shared RestConnector.Contents = (optional options as record) as table =>
    let
        // User options with sensible defaults
        Path = if options <> null and options[path]? <> null then options[path] else "items",
        PageSize = if options <> null and options[pageSize]? <> null then options[pageSize] else 100,

        // Fetch one page; assumes the API returns a "data" array and a "has_more" flag
        GetPage = (page as number) as record =>
            let
                Response = CallWithRetry(
                    BaseUrl,
                    [
                        RelativePath = Path,
                        Query = [page = Text.From(page), per_page = Text.From(PageSize)],
                        Headers = DefaultHeaders()
                    ]
                ),
                Json = Json.Document(Response),
                Data = try Json[data] otherwise Json,
                HasMore = (try Json[has_more] otherwise false) = true and List.Count(Data) > 0
            in
                [Data = Data, NextPage = if HasMore then page + 1 else null],

        // Walk pages until the API reports no more data
        Pages = List.Generate(
            () => GetPage(1),
            each List.Count([Data]) > 0,
            each if [NextPage] <> null then GetPage([NextPage]) else [Data = {}, NextPage = null],
            each [Data]
        ),

        Combined = List.Combine(Pages),
        TableOut = if List.Count(Combined) = 0 then #table({}, {}) else Table.FromRecords(Combined),

        // In a real connector, replace "type any" with explicit types per column
        // (Int64.Type, datetimezone, ...) to keep the model schema stable
        Typed =
            let
                Cols = if Table.IsEmpty(TableOut) then {} else Table.ColumnNames(TableOut),
                Types = List.Transform(Cols, (c) => {c, type any})
            in
                Table.TransformColumnTypes(TableOut, Types)
    in
        Typed;

// Default headers with the API key read from Power BI's credential store
DefaultHeaders = () as record =>
    let
        Key = Extension.CurrentCredential()[Key],
        Headers = [#"Authorization" = "Bearer " & Key, #"Accept" = "application/json"]
    in
        Headers;

// Resilient HTTP: manual status handling, exponential backoff, Retry-After support
CallWithRetry = (base as text, opts as record, optional maxRetries as number) as binary =>
    let
        Max = if maxRetries = null then 5 else maxRetries,
        Do = (attempt as number) =>
            let
                // ManualStatusHandling stops Web.Contents from raising errors on these
                // status codes so we can inspect them; IsRetry bypasses the request cache
                Response = Web.Contents(base, opts & [ManualStatusHandling = {429, 500, 502, 503, 504}, IsRetry = attempt > 0]),
                Status = Value.Metadata(Response)[Response.Status],
                ResponseHeaders = Value.Metadata(Response)[Headers]?,
                RetryAfter = if ResponseHeaders = null then null else Record.FieldOrDefault(ResponseHeaders, "Retry-After", null),
                // Honor Retry-After (seconds) when present, otherwise back off exponentially
                BackoffSeconds =
                    if RetryAfter <> null
                    then (try Number.From(RetryAfter) otherwise Number.Power(2, attempt))
                    else Number.Power(2, attempt)
            in
                // Retry only the manually handled statuses; @Do is the recursive self-reference
                if Status >= 400 then
                    if attempt < Max
                    then Function.InvokeAfter(() => @Do(attempt + 1), #duration(0, 0, 0, BackoffSeconds))
                    else error Error.Record("DataSource.Error", "Request failed with HTTP status " & Text.From(Status))
                else
                    Response
    in
        Do(0);

// Data Source Kind: authentication, test connection, label
RestConnector = [
    TestConnection = (dataSourcePath) => {"RestConnector.Contents", null},
    Authentication = [Key = []],
    Label = "REST API (Custom)"
];

// Publish metadata shown in the Get Data dialog
RestConnector.Publish = [
    Beta = true,
    Category = "Other",
    ButtonText = {"REST API (Custom)", "REST API (Custom)"}
];
```

What this shows:

  • A shared function users can invoke from Get Data.
  • API Key–based authentication that reads the key from Power BI’s credential store.
  • Resilient pagination and retry logic.
  • A predictable typed table.

OAuth 2.0 (high-level pattern)

For OAuth, define Authentication = [ OAuth = [ StartLogin, FinishLogin, Refresh ] ] and implement:

  • StartLogin: build the authorization URL and provide the redirect URI. Power BI commonly uses https://oauth.powerbi.com/views/oauthredirect.html.
  • FinishLogin: exchange the code for access/refresh tokens.
  • Refresh: exchange the refresh token when needed.

Keep client_id and client_secret out of source control. Consider using environment variables at build time.
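
As a conceptual sketch (section-level members of the connector file; the authorize/token URLs, client_id, client_secret, and scope are placeholders for your provider's values), the three handlers typically look like this:

```
client_id = "YOUR_CLIENT_ID";          // placeholder; inject at build time, don't commit
client_secret = "YOUR_CLIENT_SECRET";  // placeholder; inject at build time, don't commit
redirect_uri = "https://oauth.powerbi.com/views/oauthredirect.html";
authorize_uri = "https://auth.example.com/oauth/authorize";  // placeholder
token_uri = "https://auth.example.com/oauth/token";          // placeholder

StartLogin = (resourceUrl, state, display) =>
    [
        LoginUri = authorize_uri & "?" & Uri.BuildQueryString([
            client_id = client_id,
            redirect_uri = redirect_uri,
            response_type = "code",
            scope = "read",   // placeholder scope
            state = state
        ]),
        CallbackUri = redirect_uri,
        WindowHeight = 720,
        WindowWidth = 1024,
        Context = null
    ];

FinishLogin = (context, callbackUri, state) =>
    // The authorization code comes back on the callback URL's query string
    TokenMethod("authorization_code", "code", Uri.Parts(callbackUri)[Query][code]);

Refresh = (resourceUrl, refresh_token) =>
    TokenMethod("refresh_token", "refresh_token", refresh_token);

// Exchange an authorization code or refresh token for access/refresh tokens
TokenMethod = (grantType as text, tokenField as text, tokenValue as text) =>
    let
        Body = Uri.BuildQueryString(Record.AddField(
            [grant_type = grantType, client_id = client_id, client_secret = client_secret, redirect_uri = redirect_uri],
            tokenField,
            tokenValue)),
        Response = Web.Contents(token_uri, [
            Content = Text.ToBinary(Body),
            Headers = [#"Content-Type" = "application/x-www-form-urlencoded", #"Accept" = "application/json"]
        ])
    in
        Json.Document(Response);
```

Wire the handlers into the Data Source Kind record with Authentication = [ OAuth = [ StartLogin = StartLogin, FinishLogin = FinishLogin, Refresh = Refresh ] ]; Power BI stores and refreshes the resulting tokens for you.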

Securing and governing your connector

  • Credentials: Store API keys and OAuth tokens in Power BI’s credential store—never hardcode.
  • Privacy levels: Set correctly (Organizational vs Public) to prevent data leakage across sources.
  • Gateways: For refresh in the Service, install the .mez on the on-premises data gateway and enable custom connectors. Use a standard (enterprise) gateway rather than personal mode for multi-user scenarios.
  • Certification: If you plan to distribute broadly, explore Microsoft’s connector certification to reduce trust prompts and increase adoption.
  • Governance: Define naming conventions, versioning, and lifecycle rules. For a practical framework, see Power BI governance: the practical guide to balancing control and self-service.

Incremental refresh that actually reduces API load

1) Create RangeStart and RangeEnd parameters (datetime).

2) In your connector, pass these as query parameters (e.g., updated_after, updated_before).

3) Build the table so Power BI can filter by those parameters during partition refresh.

4) Configure incremental refresh in the model (e.g., store 3 years, refresh last 7 days).

Because REST sources generally don't support query folding, your connector must implement server-side filtering with those parameters; otherwise, Power BI will refetch everything on every partition refresh.
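
A sketch of the model-side query, assuming the API accepts updated_after/updated_before filters and returns a data array (the orders endpoint and updated_at column are placeholders); RangeStart and RangeEnd are the datetime parameters Power BI injects per partition:

```
let
    // Format the partition boundaries the way the API expects (ISO 8601 UTC here)
    ToIso = (d as datetime) as text => DateTime.ToText(d, [Format = "yyyy-MM-dd'T'HH:mm:ss'Z'"]),
    Source = Json.Document(Web.Contents(
        "https://api.example.com/v1/",
        [
            RelativePath = "orders",
            Query = [
                updated_after = ToIso(RangeStart),
                updated_before = ToIso(RangeEnd),
                per_page = "1000"
            ]
        ]
    )),
    Orders = Table.FromRecords(Source[data]),
    // Type the filter column so Power BI can manage partitions on it
    Typed = Table.TransformColumnTypes(Orders, {{"updated_at", type datetimezone}})
in
    Typed
```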

Handling common REST complications

  • Cursor-based pagination: return the next cursor from each response and feed it into the next request.
  • Rate limiting and backoff: honor Retry-After; exponential backoff caps wasted calls.
  • Nested arrays and objects: expand carefully and keep column names stable; avoid dynamic schemas across refreshes.
  • Large arrays: paginate on the server side and stream pages into Power BI; avoid buffering huge lists in memory.
  • Time zones: standardize to UTC (datetimezone) and document expectations.
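
For the time-zone point, a minimal normalization sketch (created_at is a placeholder column):

```
let
    // Sample rows standing in for API output
    Source = #table({"id", "created_at"}, {{1, "2026-01-05T10:00:00-03:00"}}),
    // Parse the timestamp and shift it to UTC so refreshes are deterministic
    Utc = Table.TransformColumns(
        Source,
        {{"created_at", each DateTimeZone.SwitchZone(DateTimeZone.From(_), 0), type datetimezone}}
    )
in
    Utc
```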

Testing and debugging tips

  • Validate endpoints and parameters in Postman first.
  • Use Power Query Diagnostics and Diagnostics.Trace in M to log key events (see the sketch after this list).
  • Test negative paths (expired token, 429, malformed query).
  • Simulate slow responses to verify retry and backoff.
  • Add unit-like tests in a sample PBIX that asserts row counts and schema.
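
A minimal Diagnostics.Trace sketch for the logging tip above; the message and wrapped request are illustrative:

```
let
    TracedPage = Diagnostics.Trace(
        TraceLevel.Information,
        "Fetching /orders page 1",
        // delayed = true: the wrapped function is evaluated only when the result is needed
        () => Json.Document(Web.Contents("https://api.example.com/v1/", [RelativePath = "orders"])),
        true
    )
in
    TracedPage
```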

Packaging and deployment

  • Build the .mez with the Power Query SDK (Visual Studio Code extension).
  • Install locally: Documents\Power BI Desktop\Custom Connectors.
  • Enable: File > Options and settings > Options > Security > Data Extensions > Allow any extension (temporary for dev).
  • Gateway install: C:\Program Files\On-premises data gateway\Custom Connectors. In the Power BI Service, enable the gateway’s custom connectors and configure credentials.
  • Document versioning and changelog so report authors know what changed and when.

Example scenario: Product analytics via REST

If your product team tracks behavior events in a tool like PostHog, a custom connector can expose curated dimensions and facts (users, events, funnels) with incremental refresh on event_time—no CSV exports, no brittle scripts. For a practical view of turning behavior data into insights, explore this hands-on guide: PostHog + Power BI: the practical guide to turning user behavior into business insights.

Best practices checklist

  • Define a stable table schema with explicit types
  • Implement robust retry with exponential backoff
  • Support incremental refresh via date filters
  • Keep the base URL static and pass RelativePath/Query options to Web.Contents for caching and reliable refresh
  • Parameterize environment (base URL, page size)
  • Never hardcode secrets; rely on credential store
  • Add TestConnection and clear error messages
  • Document limits, supported filters, and known edge cases
  • Validate refresh behavior through the gateway before rollout

When not to build a custom connector

  • The API is unstable or lacks pagination/filtering—fix upstream first.
  • A certified connector already covers the use case.
  • No governance model exists to maintain and distribute the connector safely.

In those cases, consider dataflows, ingestion pipelines, or staging data in a warehouse before modeling in Power BI.

Final thoughts

Custom Power BI connectors turn REST APIs into discoverable, governed, refreshable data sources that non-technical users can trust. Invest in good patterns—pagination, retries, incremental refresh, and security—and your connector will scale with your organization’s analytics needs throughout 2026 and beyond.

If you’re just ramping up on the platform’s building blocks, this overview helps contextualize your connector work: What is Microsoft Power BI?


FAQ: Custom Power BI Connectors for REST APIs

1) Do I always need a gateway to refresh a custom connector in the Power BI Service?

Yes. For custom connectors, scheduled refresh in the Service requires an on-premises data gateway with the same .mez installed. This is true even if your REST API is internet-accessible.

2) Can a REST-based custom connector use DirectQuery?

Generally no. REST APIs typically don’t support query folding, which DirectQuery requires. Assume Import mode with scheduled refresh. Use incremental refresh to keep data fresh without re-pulling everything.

3) How should I handle OAuth 2.0 authentication?

Declare Authentication = [ OAuth = [ StartLogin, FinishLogin, Refresh ] ] and implement:

  • StartLogin: build the authorization URL and return the redirect.
  • FinishLogin: exchange the code for tokens.
  • Refresh: exchange the refresh token when needed.

Use Power BI’s credential store; never embed secrets in code.

4) What’s the best way to implement pagination?

Match the API’s pattern:

  • Page/per_page or offset/limit: iterate until empty/has_more=false.
  • Cursor tokens: pass the next cursor until none is returned.
  • Link headers: follow rel="next".

Wrap calls with retry and backoff to survive 429s and 5xx.

5) How do I reduce refresh time and API usage?

  • Implement server-side filtering for incremental refresh (RangeStart/RangeEnd).
  • Use reasonable page sizes (e.g., 500–5,000 depending on API limits).
  • Let Power Query cache requests by using RelativePath and Query options in Web.Contents, and avoid redundant calls.
  • Avoid heavy row-by-row transformations; keep transformations columnar.

6) What happens if the API schema changes?

Power BI expects a stable schema. Add defensive logic:

  • Select and order columns explicitly.
  • Use Table.TransformColumnTypes with defaults.
  • Handle missing fields gracefully (e.g., try/otherwise).

Document changes and bump the connector version.
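
A sketch of that defensive pattern (the expected column list and types are placeholders):

```
let
    // Enforce a fixed output schema even if the API adds, drops, or reorders fields
    EnforceSchema = (t as table) as table =>
        let
            Expected = {"id", "name", "updated_at"},
            // Missing columns come back as null instead of breaking the refresh
            Selected = Table.SelectColumns(t, Expected, MissingField.UseNull),
            Typed = Table.TransformColumnTypes(Selected, {
                {"id", Int64.Type},
                {"name", type text},
                {"updated_at", type datetimezone}
            })
        in
            Typed
in
    EnforceSchema
```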

7) Is certification worth it?

If you plan broad distribution, yes. Certification improves trust and discoverability and can reduce friction in enterprise environments. It requires meeting Microsoft’s quality and security criteria.

8) Where do I store API keys securely?

In Power BI’s credential store. Choose the appropriate authentication type (Key, Basic, OAuth) in the connector definition and prompt users to enter credentials at connection time.

9) Can I use a custom connector in Dataflows?

Yes—via the gateway with the same .mez installed. Confirm that privacy levels and credentials are set on the gateway for the dataflow workspace.

10) How should I govern connector usage across the organization?

Treat the connector like any other shared data asset: define naming conventions, versioning, and lifecycle rules; keep a changelog so report authors know what changed and when; distribute the .mez through controlled gateway installs; and review credentials and privacy levels as part of your broader Power BI governance model (see the governance section above).