While working on a mature Power BI semantic model, we needed to quickly answer a simple but critical question:
Where is each table’s data coming from?
The model had evolved over time and included data from multiple sources—SQL databases, APIs, SharePoint, and third-party systems. Although Power BI allowed us to check sources table by table, there was no consolidated view showing all tables and their underlying data sources in one place.
This became a challenge during audits, troubleshooting, and knowledge transfer. Manually opening each Power Query was time-consuming and error-prone, especially under tight delivery timelines. What we needed was a repeatable and automated way to document data sources across the entire model.
That gap is what led to the solution described in this article.
1. Introduction
In large Power BI and Microsoft Fabric environments, semantic models often grow complex, pulling data from multiple systems such as SQL databases, APIs, SharePoint, flat files, and third-party tools like Solver / BI360. Understanding where each table originates is critical for audits, troubleshooting, and governance.
2. Business Problem
Power BI does not provide a single consolidated view that maps tables to their data sources. This makes model audits, client handovers, and governance activities time-consuming and error-prone.
3. Conceptual Approach
Each imported table in a Power BI semantic model contains a partition with a Power Query (M) expression. These expressions include connector calls such as Sql.Database, Web.Contents, or BI360.Contents. By parsing these expressions as text, we can extract meaningful source information.
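The extraction idea can be sketched outside Tabular Editor. Below is a minimal Python illustration (the table names and M expressions are made up for the example) showing how a single regular expression can pull the connector name and its first string argument out of a partition expression:

```python
import re

# Hypothetical M expressions as they might appear in table partitions.
expressions = {
    "Sales": 'let Source = Sql.Database("sqlprod01", "SalesDW") in Source',
    "Budget": 'let Source = BI360.Contents("https://bi360.example.com") in Source',
    "Mapping": "DATATABLE(...)",  # e.g. a calculated table with no connector call
}

# Match a connector call such as Sql.Database("...") and capture
# the connector (group 1.group 2) and its first string argument (group 3).
pattern = re.compile(r'([A-Za-z0-9_]+)\.([A-Za-z0-9_]+)\(\s*"([^"]+)"')

for table, expr in expressions.items():
    m = pattern.search(expr)
    if m:
        print(f"{table}: connector={m.group(1)}.{m.group(2)}, source={m.group(3)}")
    else:
        print(f"{table}: Unknown / Complex M")
```

Expressions that do not start with a simple connector call (merged queries, calculated tables, multi-step transformations) fall through to the "Unknown / Complex M" bucket, which is exactly the behavior the full script below implements.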
4. Step-by-Step Implementation
Step 1: Open Tabular Editor
Open your Power BI report, go to External Tools, and launch Tabular Editor connected to the semantic model.
– Power BI Desktop with Tabular Editor opened
– Model connected to the semantic model
Step 2: Open Advanced Scripting
In Tabular Editor, open the Advanced Scripting tab (labelled C# Script in Tabular Editor 3). This allows you to execute C# scripts against the model metadata.
– Advanced Scripting window in Tabular Editor
Step 3: Run the Connector-Agnostic Script
Paste the following script into the Advanced Scripting window and adjust the file path as needed.
using System.Text.RegularExpressions;

var result = "TableName,Connector,Source\n";

foreach (var t in Model.Tables)
{
    string connector = "Calculated";
    string source = "Calculated";

    if (t.Partitions.Count > 0 && !string.IsNullOrEmpty(t.Partitions[0].Expression))
    {
        var expr = t.Partitions[0].Expression;

        // Match a connector call such as Sql.Database("server", ...) and capture
        // the connector name and its first string argument.
        var match = Regex.Match(
            expr,
            @"([A-Za-z0-9_]+)\.([A-Za-z0-9_]+)\(\s*""([^""]+)""",
            RegexOptions.IgnoreCase
        );

        if (match.Success)
        {
            connector = $"{match.Groups[1].Value}.{match.Groups[2].Value}";
            source = match.Groups[3].Value;
        }
        else
        {
            connector = "Unknown / Complex M";
            source = "See M Expression";
        }
    }

    result += $"{t.Name},{connector},{source}\n";
}

System.IO.File.WriteAllText(
    @"C:\Users\YourUser\Downloads\PBIX_Table_Sources.csv", // path to store your output
    result
);
– Script pasted in Advanced Scripting
– Successful execution
Step 4: Review Output
Open the generated CSV file to review table names, connectors, and data sources.
– CSV output opened in Excel
– Press Ctrl+T to convert the data into an Excel table
– Filter the Connector column for Unknown / Complex M to find tables that need a manual look at their M expression
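If you prefer to review the output programmatically rather than in Excel, a short Python sketch can flag the tables needing manual follow-up (the sample rows below are illustrative, in the same TableName,Connector,Source shape the script emits):

```python
import csv
from io import StringIO

# Illustrative rows in the shape the Tabular Editor script writes out.
sample_csv = """TableName,Connector,Source
Sales,Sql.Database,sqlprod01
Budget,BI360.Contents,https://bi360.example.com
Mapping,Unknown / Complex M,See M Expression
"""

# In practice, replace StringIO(sample_csv) with open(r"...\PBIX_Table_Sources.csv").
rows = list(csv.DictReader(StringIO(sample_csv)))

# Tables whose M expression could not be parsed and needs a manual review.
needs_review = [r["TableName"] for r in rows if r["Connector"] == "Unknown / Complex M"]
print(needs_review)  # ['Mapping']
```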
5. Business Value
This approach improves governance, accelerates audits, reduces onboarding time, and provides transparency into Power BI data sources.
6. Conclusion
A small and targeted automation can eliminate a significant blind spot in Power BI models. This method provides a scalable and repeatable way to document data sources at the semantic model level.