

The immensely experienced and talented Prathy Kamasani will be taking us through her custom-built Logging Framework for Microsoft Fabric during this lunchtime learning session, hosted by the Karoo Data User Group.

Prathy says:
"
👩🏾‍💻 On most projects I've worked on, developers are keen on building but forget what happens after go-live. Support teams end up struggling to find out what went wrong with a single pipeline.

Yes, there's event handling and auditing at the tenant and workspace levels. But sometimes what you need is visibility into just your project.

I kept running into the same problem on client projects: no central place to see which pipelines ran, what failed, and how many records were processed.

So I built a logging framework, similar to what I used to do in my SSIS days. One Python file that creates:
- A monitoring lakehouse
- Logging tables with date and time dimensions
- A Direct Lake semantic model with pre-built measures

It's open source and takes about 5 minutes to deploy. This gives you a good starting point, and I usually build on it based on client requirements.
"
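The core idea behind a framework like this is a central table of pipeline runs, statuses, and record counts that support teams can query in one place. Here is a minimal sketch of that pattern in plain Python, using SQLite as a stand-in for the monitoring lakehouse; the table, column, and pipeline names are illustrative assumptions, not the actual schema from Prathy's repo.

```python
# Sketch of a central pipeline-run log. SQLite stands in for a
# monitoring lakehouse table; all names here are hypothetical.
import sqlite3
from datetime import datetime, timezone

conn = sqlite3.connect(":memory:")
conn.execute(
    """CREATE TABLE pipeline_run_log (
           run_id            INTEGER PRIMARY KEY AUTOINCREMENT,
           pipeline_name     TEXT NOT NULL,
           status            TEXT NOT NULL,   -- 'Succeeded' / 'Failed'
           records_processed INTEGER,
           run_start_utc     TEXT,
           run_end_utc       TEXT
       )"""
)

def log_run(pipeline_name, status, records, start, end):
    """Append one pipeline run to the central log table."""
    conn.execute(
        "INSERT INTO pipeline_run_log "
        "(pipeline_name, status, records_processed, run_start_utc, run_end_utc) "
        "VALUES (?, ?, ?, ?, ?)",
        (pipeline_name, status, records, start.isoformat(), end.isoformat()),
    )

# Each pipeline calls log_run() when it finishes (example data).
now = datetime.now(timezone.utc)
log_run("ingest_sales", "Succeeded", 1250, now, now)
log_run("ingest_customers", "Failed", 0, now, now)

# The questions the post describes: what failed, and how many records ran?
failed = [r[0] for r in conn.execute(
    "SELECT pipeline_name FROM pipeline_run_log WHERE status = 'Failed'")]
total = conn.execute(
    "SELECT SUM(records_processed) FROM pipeline_run_log").fetchone()[0]
```

In the real framework these records would live in lakehouse tables joined to date and time dimensions, with a Direct Lake semantic model and measures layered on top for reporting.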

Video walkthrough: https://lnkd.in/erZdM_gF
GitHub: https://lnkd.in/e-tgNhV4
Blog: https://lnkd.in/euMjmrtQ

#MicrosoftFabric #PowerBI #DataEngineering #Python
