

About us
Welcome to the Karoo Data Platform User Group! Our aim is to meet monthly, virtually via a Remote Session, and in-person where possible. Our community meetup is open to everyone — developers, professionals, students, and enthusiasts alike — who share a passion for Enterprise Data technologies and related solutions. We believe in the power of diversity and inclusion, and we strive to create a supportive environment where everyone can network, learn new skills, collaborate on projects, and share their expertise.
Join us on a Thursday evening each month (after work for us ;) around 5pm SAST/CAT) to learn, share experiences, and network with others in the industry as part of the broader Azure Data technical community. Come grab a drink and a snack, and together we can celebrate human connection, continuous learning, and innovation in the world of data.
Here is the link to our Code of Conduct: [CodeOfConduct.docx](https://www.dropbox.com/scl/fi/stfjwifs3hpcc7s865hhe/CodeOfConduct.docx?rlkey=twdhx2e9yei4vtevs5c1isn8j&e=1&st=pm6rt2ic&dl=0)
Here is the link to our YouTube Channel:
https://www.youtube.com/@KDPUG-ZA
All other links available here:
KDPUG | Instagram, Facebook | Linktree
Upcoming events

KDUG #13: Fabric Workloads Monitoring
Location not specified yet
The immensely experienced and talented Prathy Kamasani will be taking us through her custom-built Logging Framework for Microsoft Fabric during this lunchtime learning session, hosted by the Karoo Data User Group.
Prathy says:
"
👩🏾‍💻 On most projects I've worked on, developers are keen on building but forget what happens after go-live. Support teams end up struggling to find out what went wrong with a single pipeline. Yes, there's event handling and auditing at tenant and workspace level. But sometimes what you need is visibility into just your project.
I kept running into the same problem on client projects: no central place to see which pipelines ran, what failed, and how many records were processed.
So I built a logging framework, similar to what I used to do in my SSIS days. One Python file that creates:
- A monitoring lakehouse
- Logging tables with date and time dimensions
- A Direct Lake semantic model with pre-built measures

It's open source and takes about 5 minutes to deploy. This gives you a good starting point, and I usually build on it based on client requirements.
"
Video walkthrough: https://lnkd.in/erZdM_gF
GitHub: https://lnkd.in/e-tgNhV4
Blog: https://lnkd.in/euMjmrtQ
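To give a flavour of what a pipeline-run logging table like the ones Prathy describes might capture, here is a rough, hypothetical Python sketch. It is not her framework (see the GitHub link above for the real thing): the table name, columns, and pipeline names below are made up for illustration, and an in-memory SQLite database stands in for a monitoring lakehouse.

```python
# Illustrative sketch only, NOT the actual framework: a minimal run-log
# table answering "which pipelines ran, what failed, how many records?".
# All table/column/pipeline names here are hypothetical.
import sqlite3
from datetime import datetime, timezone

conn = sqlite3.connect(":memory:")  # stand-in for a monitoring lakehouse
conn.execute("""
    CREATE TABLE pipeline_run_log (
        run_id         INTEGER PRIMARY KEY AUTOINCREMENT,
        pipeline_name  TEXT NOT NULL,
        status         TEXT NOT NULL,      -- e.g. 'Succeeded' / 'Failed'
        rows_processed INTEGER,
        started_at     TEXT NOT NULL,
        finished_at    TEXT
    )
""")

def log_run(pipeline_name, status, rows_processed, started_at, finished_at):
    """Insert one pipeline-run record into the logging table."""
    conn.execute(
        "INSERT INTO pipeline_run_log "
        "(pipeline_name, status, rows_processed, started_at, finished_at) "
        "VALUES (?, ?, ?, ?, ?)",
        (pipeline_name, status, rows_processed, started_at, finished_at),
    )
    conn.commit()

now = datetime.now(timezone.utc).isoformat()
log_run("ingest_sales", "Succeeded", 120_000, now, now)
log_run("ingest_orders", "Failed", 0, now, now)

# One query gives a support team visibility into just this project's runs.
failed = conn.execute(
    "SELECT pipeline_name FROM pipeline_run_log WHERE status = 'Failed'"
).fetchall()
print(failed)  # [('ingest_orders',)]
```

In the real framework this sits in a Fabric lakehouse with date and time dimensions and a Direct Lake semantic model on top, so the same questions can be answered from a report rather than SQL.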
Past events



