Well, that's a wrap on FabCon 2025! Here are some of my thoughts and key takeaways.
Executive Summary
If you are looking for the Cliffs Notes version of this post, here are my high-level thoughts on the sessions that I attended. The details, with links, follow below this summary.
- Keynotes - Held on days one and three of the conference and moved to T-Mobile Arena this year. The walk was long, but the weather was nice. The big announcements were all made during these sessions. The biggest of these announcements for my clients are:
- Azure Key Vault support for connection credentials
- Copilot now available in all capacities
- Data Agents in Fabric / Azure AI Foundry integration
- Azure Synapse to Fabric migration
- Command Line Interface for Fabric
Hetz's Details
Below are my more detailed thoughts and opinions on why the highlighted Fabric announcements are game-changing for our clients. As many of you know, I am a Solution Architect for Eide Bailly, LLP and work with a variety of clients across the SMB, SMC, and Enterprise spaces. As a Microsoft Fabric Featured Partner, Eide Bailly gets involved with client projects that range from building new data warehouses from the ground up and migrating data warehouses from Azure Synapse to Fabric, to building data integration pipelines in Fabric and implementing the new SQL Database in Fabric. Some of our Fabric capabilities and experiences can be found in this blog article that was co-published with the Microsoft Fabric team. We are also able to deliver Fabric Analyst and Dashboard in a Day training to your organization. What follows are some deeper dives into the announcements from which our clients have already realized added value.
Everyone should be using Azure Key Vault for managing connection credentials, connection strings, secrets, and certificates.
Azure Key Vault Connection Credential Support
We consultants have been recommending that our clients store their connection credentials in Azure Key Vault for a LONG time - in fact, for as long as Azure Key Vault has been around. These days, I rarely encounter a client working in a cloud environment that is not leveraging Azure Key Vault for their connection settings. So it was unfortunate that, when Fabric first released, there was no support for the secure credential service that Microsoft has been touting (and consultants have been pushing) for years.
Well, that all changed on the first day of the conference, when Guy in a Cube legend Patrick LeBlanc took the stage and demoed creating Key Vault secured connections in Fabric and how to leverage them in Fabric notebooks. Here is a link to the Guy in a Cube guide to using Key Vault in Fabric YouTube video that was recorded on the main stage at FabCon 2025. It should start at the good part and skip the Key Vault setup piece, since I assume anyone who reads or follows this blog knows how to do that already. If you don't, several Key Vault links have already been provided to guide you through that (or you can restart the video from the beginning :). One client of mine has already switched to using their Key Vault credentials; previously, they were copying them out of the vault and into the connections within Fabric. In fact, this is the same client that I discussed in my presentation at the conference, so my presentation was already out of date the first time I gave it!
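To give a flavor of the notebook side of the demo, here is a minimal sketch of pulling a secret from Key Vault inside a Fabric notebook and using it to read from an external SQL source. The `notebookutils.credentials.getSecret` helper is the built-in NotebookUtils call for this (double-check the current syntax on MS Learn); the vault URL, server, database, table, and service account names are placeholders I made up.

```python
# Minimal sketch: fetch a secret from Azure Key Vault inside a Fabric notebook
# and use it to read an external SQL table, without hard-coding the credential.
# notebookutils, spark, and display are provided by the Fabric notebook runtime.
key_vault_url = "https://my-demo-vault.vault.azure.net/"          # placeholder vault
sql_password = notebookutils.credentials.getSecret(key_vault_url, "sql-reader-password")

jdbc_url = (
    "jdbc:sqlserver://my-demo-server.database.windows.net:1433;"  # placeholder server
    "database=SalesDW;encrypt=true;"
)

df = (
    spark.read.format("jdbc")
    .option("url", jdbc_url)
    .option("dbtable", "dbo.Customers")        # placeholder table
    .option("user", "svc_fabric_reader")       # placeholder service account
    .option("password", sql_password)          # secret never appears in the notebook
    .load()
)
display(df)
```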
No F64, no Copilot for you!
Copilot in all Fabric Capacities
One of the near-magical capabilities of Fabric has been the introduction of Copilot into the toolbox that analysts have for building stunning reports and dashboards. Equally amazing, but less flashy for the C-suite crowd, are Copilot's capabilities within notebooks for generating code and within data warehouses for generating or tweaking SQL statements. But until the Fabric Community Conference a few weeks ago, you could only unlock the Copilot features in Fabric if you were running an F64 or larger capacity. As a Microsoft Partner that specializes in the SMB and SMC space, we found that the cost of that capacity just did not deliver the ROI most clients would need to justify the spend.
Well, all of that changed during the keynote on day one of the conference. When this announcement was made, I was hot and sweaty - but I think that was from the walk over to T-Mobile from the MGM Grand. At any rate, here is the lowdown on what this new capability means for several different areas within Fabric. To get started, you will need to enable Copilot in Fabric first.
- Writing SQL queries in your data warehouse just got a whole lot faster with code completions and suggestions. You can accept suggestions as they appear while writing your query, just like IntelliSense in Visual Studio, and you can provide SQL comments at the beginning of your query which Copilot can use to reason over the query that you want to create. This, in turn, helps Copilot generate better and more complete suggestions. Here is the MS Learn article: Copilot Code Completion.
- Asking natural language questions of your data in the chat pane. This feature gives non-SQL savvy analysts the ability to query the organization's data warehouse using natural language. They do need to prefix their question with syntax that tells Copilot that they are asking a question of the data in the warehouse. To do this, you simply use the prefix '/question' and then ask your question. Here is an example from the MS Learn article:
/question what types of security are Supported for this warehouse?
- You can now leverage "Quick Actions" to fix query errors when they occur. Let's say you are writing a query and executing it in real time to see the results, making sure the query is correct and returns what you are after. We have all run into execution errors when working this way. The developer then copies the error message and starts searching the internet or, more recently, opens Copilot in the browser and pastes the error message into the chat pane. Now, in the Fabric canvas, we can click the "Fix query errors" button and voila - the query is fixed! At least, it may be fixed; you definitely have to test and re-test the query to ensure the A.I. applied the fix correctly.
What we really would like is an agent in Teams that can answer questions about our data in Fabric and that is secured according to our Graph and OneLake security settings. And yes, we are aware that we will need an F64...
Data Agents
What Copilot Agents do for your files in OneDrive, Data Agents in Fabric do for your data in OneLake. Imagine being able to create an agent, grounded in your data, that your employees can interact with using natural language. That day is here with Data Agents in Fabric. These agents can also interact with models that you have created and fine-tuned in Azure A.I. Foundry, providing you with a full end-to-end, low-code, custom A.I. experience. Speaking of Azure A.I. Foundry, you can leverage the Fabric connection to call your data agents from your Azure A.I. Foundry agents. This allows you to create an "agent chain" like we are able to do in Copilot Studio. While there are still some limitations with these new agents, I have no doubt these will evaporate over time, just as many of the limitations with Fabric have. Just as with Copilot before, these agents do require an F64 paid capacity.
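To make the Foundry integration a little more concrete, here is a rough sketch of registering a Fabric data agent as a tool on an Azure A.I. Foundry agent using the Python SDK. This is based on the preview azure-ai-projects package, so treat the class and method names (AIProjectClient, FabricTool, create_agent) as subject to change, and the connection name, model deployment, and instructions as placeholders I made up.

```python
# Rough sketch (preview SDK - verify against the current Azure A.I. Foundry docs):
# expose a Fabric data agent to a Foundry agent as a callable tool.
import os

from azure.ai.projects import AIProjectClient
from azure.ai.projects.models import FabricTool
from azure.identity import DefaultAzureCredential

# Connect to the Foundry project using its connection string.
project_client = AIProjectClient.from_connection_string(
    credential=DefaultAzureCredential(),
    conn_str=os.environ["PROJECT_CONNECTION_STRING"],
)

# "fabric-data-agent" is a placeholder for the Fabric connection added to the project.
fabric_conn = project_client.connections.get(connection_name="fabric-data-agent")
fabric_tool = FabricTool(connection_id=fabric_conn.id)

# The Foundry agent can now hand data questions off to the Fabric data agent,
# forming the kind of "agent chain" described above.
agent = project_client.agents.create_agent(
    model="gpt-4o",                 # placeholder model deployment name
    name="warehouse-qna-agent",     # placeholder agent name
    instructions="Answer questions about company data using the Fabric data agent tool.",
    tools=fabric_tool.definitions,
)
print(f"Created agent: {agent.id}")
```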
We have actually been working for some time now with an A.I. client that wants to empower their employees to research and find trends within their massive historical database. They do not have a dedicated data scientist, nor do they want to develop their own models; rather, they want a natural language interface for their data. They would also like to be able to interact with this agent via Teams. We are currently exploring chaining data agents with a Copilot agent for them, and the results have been positive so far.
Get started with data agents today here: Data Agents in Fabric MS Learn.
Our current Synapse deployment is not well understood, and we really struggle to understand if our data is current and if our analytics are giving good insights or not.
Azure Synapse Migration Support
I have worked with several clients that wanted to ditch their Synapse data warehouse for various reasons, which usually share some common themes: complexity, cost, and a lack of skilled resources for design and maintenance. My past migrations from Synapse to Fabric have involved the painful process of recreating the client's linked data sources, pipelines, databases, and data flows within Fabric (while always complaining that there isn't a tool to do this for us). Well, now there is. It is wizard-driven, right from within the Fabric workspace. I have not personally used this tool yet, but I am really looking forward to our next Synapse migration so I can try it out. Here is the team's release post: Migration Assistant for Fabric Data Warehouse.
If we could script the deployment of updated artifacts, that'd be great...
Fabric Command Line Interface
Call them old school, efficient, stuck in their ways, or any other adjective you might think of, but we all know and love our DevOps and release managers who MUST have a command line to get their work done. Until now, Fabric lacked that command line interface, making automation of artifact deployments difficult for those folks who just want to script it.
The new CLI for Fabric includes file system navigation capabilities and does require an updated release of Python. Other than that, the functionality is just what you would expect from a command line interface for the platform. I won't bore you here by going through all of the commands and the syntax needed for connecting. There is a handy-dandy cheat sheet available for all of the commands here.
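To give a flavor of what "just script it" might look like, here is a minimal sketch that drives the CLI from a Python deployment script. The `fab` executable name comes from the preview release, and the subcommands shown (auth status, ls) are examples only - check the cheat sheet linked above for the current command set and connection syntax.

```python
# Minimal sketch: drive the Fabric CLI from a Python deployment script.
# Assumes the CLI is installed and on PATH as "fab"; the subcommands shown
# are examples only - verify them against the official command reference.
import subprocess


def fab(*args: str) -> str:
    """Run a single Fabric CLI command and return its standard output."""
    result = subprocess.run(
        ["fab", *args],
        capture_output=True,
        text=True,
        check=True,  # fail loudly if the CLI returns a non-zero exit code
    )
    return result.stdout


if __name__ == "__main__":
    # Confirm we are authenticated before touching any workspaces.
    print(fab("auth", "status"))

    # File-system style navigation: list the workspaces this identity can see.
    print(fab("ls"))
```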