Microsoft Fabric Updates Blog

Fabric changing the game: Logging your workload using Notebooks.

I was working on an example for a customer who wanted to log execution errors to a file while running multiple notebooks in parallel inside a try/catch block. Thinking through that scenario in a Fabric environment, I realized this work is now much easier. As I mentioned in other posts, OneLake integration …

Strong, useful, beautiful: Designing a new way of getting data

What is good design? In the data integration design team at Microsoft, we ask ourselves this question every day as we strive to create products that meet the needs and expectations of our customers. Design is not just about aesthetics or functionality, but about creating meaningful and relevant experiences for customers. Every design tells a story. It tells a story about people: what they want and what they need.

Learn Live: Get started with Microsoft Fabric

Calling all professionals, enthusiasts, and learners! On August 29, we’ll be kicking off the “Learn Live: Get started with Microsoft Fabric” series in partnership with Microsoft’s Data Advocacy and Worldwide Learning teams, delivering nine live-streamed lessons on Microsoft Fabric topics. These will be delivered via the Microsoft Reactor YouTube channel …

Use Semantic Kernel with Lakehouse in Microsoft Fabric

Microsoft Fabric allows enterprises to bring together different data sources through OneLake, and data engineers can call a unified API across business scenarios for data analysis and data science. This article describes how data scientists can use Semantic Kernel with a Lakehouse in Microsoft Fabric. At Microsoft Build 2023, Microsoft proposed the …

Data Pipeline Performance Improvement Part 3: Gaining more than 50% improvement for Historical Loads

Introduction / Recap: Welcome to the final entry of our three-part series on improving performance for historical data loads! In the first two entries we dove deep into the technical weeds to demonstrate the capabilities of the Data Pipeline Expression Language. Part 1: Data Pipeline Performance Improvements Part 1: How to convert a time interval (dd.hh:mm:ss) …
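Part 1 of the series covers converting a `dd.hh:mm:ss` time interval inside the pipeline expression language itself; as a rough Python illustration of the same arithmetic (not the pipeline expression code from the post), the conversion boils down to:

```python
def interval_to_seconds(interval: str) -> int:
    """Convert a 'dd.hh:mm:ss' interval string to total seconds.

    Splits off the day count, then the clock portion, and sums
    each component scaled to seconds (1 day = 86400 s).
    """
    days_part, clock = interval.split(".", 1)
    hours, minutes, seconds = (int(x) for x in clock.split(":"))
    return int(days_part) * 86400 + hours * 3600 + minutes * 60 + seconds

total = interval_to_seconds("1.02:30:15")  # 1 day, 2 h, 30 min, 15 s
```

In a real pipeline this arithmetic is expressed with the pipeline's expression functions rather than Python; the posts in the series walk through that version in detail.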