Power BI Vibe Coding
Vibe coding is a trendy topic that everyone talks about online. And I’m fully embracing it.
In case you don’t already know, vibe coding is when developers use LLMs to generate code without fully understanding what it means or does. That’s a high-level definition. There are fascinating and polarizing takes on the concept, so do your own research if it interests you.
The main flaw of vibe coding, especially for production-level code, is that you risk poisoning your code base with vulnerable or inefficient code.
Vibe coding seems great on the surface because it lowers the knowledge threshold needed to build software. But because you don’t understand the ins and outs of the code, the consequences can snowball into future catastrophes.
I’m writing this post because I’m fully leaning into vibe coding for my current work project. However, I’m doing it in a safer way. More on that later.
First, I’ll tell you what I’m working on and why I chose to vibe code the solution.
My data team provides over 50 automated SSRS reports for users across our organization. But we’ve reached a point where SSRS doesn’t fully satisfy our needs. We needed a BI tool that could enhance and modernize how we deliver and visualize data. Since we are already heavily involved in the Microsoft ecosystem, we chose Power BI.
Our biggest barrier to Power BI development was experience. My team is small, and none of us had used a business intelligence tool before, so there was a steep learning curve ahead.
We planned to integrate Power BI into our organization at the beginning of the year, but three months later, we had made no progress. Our usual day-to-day projects swamped our schedules, and we had little time to learn Power BI. Or, at least, we couldn’t allocate enough time for research and learning.
In April, to overcome that obstacle, we decided that I would take on fewer tasks and focus primarily on Power BI development.
My goal was to learn all aspects of Power BI, from account administration to dashboard/report development to creating connections and gateways to building deployment pipelines, and so on. This overwhelmed me. The first few weeks were full of stress, knowledge consumption, and more stress. I often felt unmotivated and wished I weren’t cursed with this task.
But I recognized the tool’s value for my company, so I persisted.
After learning the basics, I figured out which concepts mattered most to my situation, and I built simple reports and semantic models that went into production. My first pilot project was simple: a semantic model that users could connect to from Excel and run their own analyses against. Nothing fancy or difficult to build here.
However, the next pilot project is more complex (and currently in the works). I need to build a dashboard with multiple visuals and complex aggregations.
At first, I made all the aggregations in SQL and then loaded the data into my Power BI project. There were many redundancies, and I realized I wasn’t leveraging Power BI’s capabilities properly. The simpler, and likely more efficient, approach was to aggregate the data using Power BI’s DAX language.
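To make that shift concrete, here’s a sketch of the kind of DAX measure that can replace a pre-aggregated SQL query. The table and column names (`Sales`, `Sales[Amount]`, `'Date'[Date]`) are hypothetical placeholders, not my actual model:

```dax
-- Hypothetical measure: total sales for the current filter context.
-- In SQL, I was pre-aggregating this with GROUP BY for every slice
-- I needed; in DAX, one measure responds to every slicer and visual.
Total Sales = SUM ( Sales[Amount] )

-- A slightly more involved measure: year-to-date sales, assuming
-- the model has a proper date table marked as such.
Sales YTD =
    CALCULATE (
        [Total Sales],
        DATESYTD ( 'Date'[Date] )
    )
```

The point is that a single measure covers combinations of filters that would each have required their own SQL query, which is where the redundancy in my original approach came from.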
I encountered a new roadblock here: I knew next to nothing about DAX. The safest, and probably best long-term, solution would have been to read the documentation and learn as many DAX features as possible. But that’s time-consuming, and I’m here to ship solutions fast.
That’s why I decided to channel my inner vibe coder, relax, and let ChatGPT do all the work.
I’d feed the LLM my data model and tell it what aggregations I needed. The LLM would spit out some code, which I’d copy and paste into my solution and watch the magic happen. Most of the initial DAX code failed, so I’d feed ChatGPT the error messages and have it regenerate the code. I repeated this process until something worked.
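A typical round of that loop looked something like the following (names hypothetical). A common first-attempt mistake is referencing a column directly in a measure, which Power BI rejects:

```dax
-- First attempt (fails): a measure can't reference a bare column
-- without an aggregation, so this produces an error along the lines
-- of "A single value for column 'Amount' ... cannot be determined."
Avg Order = Sales[Amount] / Sales[Quantity]

-- Regenerated after pasting the error message back in (works,
-- and DIVIDE also guards against division by zero):
Avg Order =
    DIVIDE ( SUM ( Sales[Amount] ), SUM ( Sales[Quantity] ) )
```

This is a sketch of the pattern, not a transcript of an actual session.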
Through vibe coding, I built a working dashboard, and it’s ready for prod deployment.
But remember what I said earlier about vibe coding in a safer way.
Even though the DAX queries work, I won’t blindly deploy them for my users. That would be a bad and risky approach. Instead, I’m taking the time to review all of the DAX functions and techniques ChatGPT created, so I fully understand how the code works at a low level.
The biggest issue is that I don’t know what the truly optimal or most efficient approach is. For all I know, ChatGPT could have ignored or misunderstood DAX best practices, so my dashboard could be flawed in ways I can’t see.
I’m okay with this for now.
I’m less than two months into my Power BI journey. There’s a lot more development, testing, and researching to do in the future. My goal right now is to ship Power BI solutions that work and provide value to users. I’m not developing any mission-critical application that would destroy the company if it were flawed or broken. Since I’m still in the pilot project phase, it’s okay if my solutions are scrappy. So long as they deliver high-quality, accurate data and provide value, I consider it a win. I’ll have time down the road to master DAX and ensure I write efficient queries.
It’s important to focus on and prioritize your overarching goals.
When building solutions with new skills, the goal is usually not to deliver something perfect right away. Instead, it’s to deliver something useful. Then, once you’ve shipped a few useful projects and continued learning, you can return to those projects and refine them.
I understand that vibe coding DAX is not a practice I should rely on long term. But it works now, and it lets me deliver useful solutions for my organization.