I have spent the better part of two decades building, restructuring, and leading BI teams. In that time, I have seen technically brilliant teams fail to deliver real business value. I have also seen teams with modest technical resources become the most trusted function in their organisation. The difference was never the technology. It was never the tools. It was always the same six things — and most organisations are only focused on one of them.
When I was Director of Business Intelligence at Vendasta Technologies, I led a centralised team of eleven people serving seven divisions: Finance, Revenue Operations, Corporate Leadership, Digital Marketing, Products and Platforms, Sales, and Marketing. The team was technically capable. But when I first took over, they were working in silos — each person focused on their own domain, with little cross-functional knowledge, limited stakeholder relationships, and no shared operating model.
Fixing that took more than improving our data pipelines or upgrading our dashboards. It required a deliberate approach across six interconnected areas. I call them the six principles of BI leadership. I have spent the last several years refining them into a structured framework — one that any BI leader, in any organisation, can use to assess where their team stands and where they need to grow.
This article is a practitioner's account of what those six principles mean in practice. Not theory. Not a framework from a textbook. Things I have done, seen fail, and had to fix — sometimes at significant professional cost.
"Most BI failures have nothing to do with tools, platforms, or dashboards. They are failures of leadership, commercial awareness, and customer orientation."
— From 19 years leading BI teams across Canada and the USA

The most common gap I see in data teams is not technical. It is commercial. The team builds accurate dashboards, delivers on time, and still fails to create business value — because they do not understand the business they are serving.
Let me be specific. If you ask the average BI analyst how their company generates revenue, you will often get a vague answer. Ask them how a dashboard they built connects to a financial outcome — cost savings, customer retention, revenue growth — and many cannot answer that either. They know how the metrics are calculated. They do not know why those metrics matter.
This is not a criticism of individual people. It is a failure of leadership. A BI leader's job is to ensure that every person on the team understands the commercial context of their work — the revenue model, the strategic priorities, the key cost drivers, and how the analytics they build connect to outcomes the business actually cares about.
- Every team member can name the organisation's top three strategic priorities and connect their current work to at least one of them
- The leader uses financial data — CAC, LTV, margin, cost-per-acquisition — in prioritisation decisions, not just delivery metrics
- KPIs and dashboards are retired or redesigned when business strategy changes, not maintained indefinitely
- The team is consulted on business decisions before they become BI requirements — not after
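The second signal above can be made concrete: unit economics can sit directly in the prioritisation conversation. A minimal sketch in Python, where the figures and the `ltv` and `cac` helpers are purely illustrative, not drawn from any particular finance system:

```python
# Minimal sketch: using unit economics in BI backlog prioritisation.
# All figures are illustrative; real values come from finance.

def ltv(avg_monthly_revenue: float, gross_margin: float, avg_lifetime_months: float) -> float:
    """Customer lifetime value: margin-adjusted revenue over the expected lifetime."""
    return avg_monthly_revenue * gross_margin * avg_lifetime_months

def cac(sales_marketing_spend: float, new_customers: int) -> float:
    """Customer acquisition cost: total acquisition spend per new customer."""
    return sales_marketing_spend / new_customers

customer_ltv = ltv(avg_monthly_revenue=120.0, gross_margin=0.70, avg_lifetime_months=36)
customer_cac = cac(sales_marketing_spend=250_000.0, new_customers=500)

# A common heuristic holds that an LTV:CAC ratio around 3:1 or better suggests
# healthy unit economics; candidate work that improves retention (LTV) or
# funnel efficiency (CAC) can then be ranked by its expected effect on the ratio.
ratio = customer_ltv / customer_cac
print(f"LTV: ${customer_ltv:,.0f}, CAC: ${customer_cac:,.0f}, ratio: {ratio:.1f}")
```

The point is not the arithmetic, which is trivial, but that the team frames a dashboard request in these terms before deciding where it lands in the queue.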
In my years managing BI programmes at Microsoft through Infosys — across Customer Support, Sales and Marketing, Legal and Corporate Affairs, and Finance — the teams that had genuine commercial fluency were the ones stakeholders called first when a strategic decision needed analytical input. The ones that lacked it stayed in the queue, waiting for tickets.
Everything else in this article rests on this foundation. If you do not get the people dimension right, nothing else works the way it should.
I follow servant leadership principles. I learned this framework from James Hunter's The Servant, Ken Blanchard's work, and Patrick Lencioni's model of the five dysfunctions of a team. The core belief is simple: the leader's job is to serve the team — not the other way around. That means removing obstacles, creating psychological safety, developing people, and being genuinely invested in their growth.
In practice, the most important ritual I introduced was a specific structure for one-on-ones. Every week, I would open by asking my team member for feedback on my own performance before I gave any feedback on theirs. This is not a common practice. Most managers use one-on-ones to cascade tasks and review progress. I used them to develop people — and to model the two-way feedback I wanted to see across the team.
I also introduced quarterly personal and professional goals. Every team member set one professional goal aligned to their role — a new technical skill, a certification, a project they wanted to lead — and one personal goal aligned to their wellbeing. In our weekly one-on-ones, I tracked progress on both. Not because I was managing their personal lives, but because I was signalling that their growth as a whole person mattered to me, not just their output.
"Culture is not something written on rock. It evolves day by day. As a leader, it is my responsibility to ensure the team's culture is not negatively influenced when someone new joins."
Team culture is a leadership accountability. It does not happen passively. People adapt to the culture around them — and if the culture is not actively maintained, it will drift toward whatever the most dominant personality in the room creates. I always looked for cultural fit in interviews — not sameness of personality, but alignment with the values of openness, mutual accountability, and genuine collaboration I had built.
Having good people and a commercially aware team is necessary but not sufficient. Without a structured operating model, even the best people cannot deliver at pace and quality. This is what Management means as a BI leadership principle — and it is where most BI teams have their most fixable gaps.
At Vendasta, when I inherited the BI portfolio, there was no centralised structure. Resources were scattered across the organisation with fragmented accountability. There was no consistent delivery methodology, no prioritisation governance, and no shared standards. Each analyst was essentially a one-person BI team for their assigned division — brilliant in isolation, unable to function as a collective.
The first thing I did was centralise. I brought everyone under one portfolio with a shared operating model — Agile delivery using Scrum, a two-track system separating planned sprint work from ad hoc requests, and a primary stakeholder designated for each division with formal authority to prioritise on behalf of their function.
That last part matters more than most BI leaders realise. Without a named prioritisation owner, the backlog becomes whatever the loudest stakeholder is asking for today. The most critical work gets displaced by urgency theatre. A primary stakeholder model — where a CFO, CMO, or VP owns the priority queue for their division — transforms the relationship from reactive to structured.
- I introduced a robust OKR framework that aligned every team member to the strategic objectives of their assigned division
- This gave each analyst a deep understanding of the business challenges, targets, and expectations of their stakeholders
- It enabled precise, targeted delivery — not reactive task completion
- Most importantly, it gave the team a sense of purpose and fulfilment — they could see exactly what business value they were creating
Management also covers the less glamorous but essential disciplines: communication, conflict resolution, risk management, and decision-making under uncertainty. These are not soft skills. They are leadership skills — and BI leaders who underinvest in them will find that technical excellence alone cannot hold a programme together under pressure.
The purpose of a BI function is to serve its customers. Everything else — the operating model, the technology, the domain expertise — is in service of that purpose. So how a BI team relates to its stakeholders is not a peripheral concern. It is the central one.
Most BI teams treat their stakeholders as requestors. A requirement comes in; the team builds it; the ticket gets closed. The problem with this model is that stakeholders are not always right about what they need. They know the problem they are trying to solve, but they do not always know what data approach will best solve it. A team that simply executes requests is a delivery resource. A team that understands the business problem behind the request — and challenges it constructively when a better approach exists — is a trusted advisor.
In my last role, I made it a deliberate practice to understand my customers from every angle. What are their business challenges right now? What are they being measured on? What keeps them up at night? What do they not yet know they need? I expected every team member to know the same about their assigned stakeholder — not just their reporting requirements.
I also made it clear that reprioritisation decisions would be made purely based on customer needs — not based on who pushed the hardest or escalated the loudest. That required political courage. But it earned the team a reputation for fairness and strategic alignment that no amount of delivery speed alone would have built.
Technology carries the lowest weight in my framework — not because it is unimportant, but because it is the area most organisations already invest in, and gaps here are more tactical to address than gaps in acumen, people, or customer orientation.
That said, there is a specific aspect of technology leadership that I see consistently undervalued: the technical credibility of the BI leader themselves. A leader who cannot read and evaluate the code their team produces, who cannot engage meaningfully in architectural decisions, who cannot identify a performance bottleneck or a cost inefficiency without being told — that leader cannot mentor, quality-check, or genuinely guide their team. They manage process. They do not lead technically.
In BI specifically, there are foundations that every leader must hold: SQL at depth, data warehousing concepts, ETL and data engineering principles, cloud cost awareness, and the ability to evaluate tool choices on their technical merits. With that foundation, a BI leader can adapt to any environment and any stack.
At Vendasta, I led platform improvement initiatives that resulted in saving approximately $54K per year in GCP and BigQuery costs — by identifying and eliminating redundant full reloads, over-scheduled refreshes, and inefficient query patterns. That was not a technology decision alone. It was a leadership decision — one that required me to know enough to spot the problem, justify the investment of time to fix it, and coach the team through the implementation.
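To make the cost mechanics concrete: BigQuery's on-demand model bills per byte scanned, so refresh cadence and reload strategy multiply directly into monthly spend. A hedged sketch of that arithmetic, where the $6.25/TiB rate and the workload figures are assumptions for illustration, not Vendasta's actual numbers (check current GCP pricing for your region):

```python
# Sketch: estimating BigQuery on-demand scan cost for a scheduled refresh.
# The per-TiB rate and workload sizes below are illustrative assumptions.

TIB = 1024 ** 4
RATE_PER_TIB = 6.25  # assumed on-demand price, USD per TiB scanned

def monthly_scan_cost(bytes_per_run: float, runs_per_day: float, days: int = 30) -> float:
    """Cost of a scheduled query at a given refresh cadence."""
    return bytes_per_run / TIB * RATE_PER_TIB * runs_per_day * days

# A full reload scanning 2 TiB, refreshed hourly:
full_hourly = monthly_scan_cost(2 * TIB, runs_per_day=24)

# The same pipeline rebuilt incrementally (50 GiB per run) at 4 refreshes/day:
incremental = monthly_scan_cost(50 * 1024 ** 3, runs_per_day=4)

print(f"Full hourly reload: ${full_hourly:,.0f}/month")
print(f"Incremental, 4x/day: ${incremental:,.0f}/month")
print(f"Estimated saving: ${full_hourly - incremental:,.0f}/month")
```

This is the back-of-envelope calculation that lets a BI leader justify the engineering time before anyone opens a query plan.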
This is the principle I described in my original article as "the final and most important" — and I stand by that characterisation. Without functional and domain expertise, a BI team cannot become a subject matter expert for their stakeholders. And without SME status, there is no trusted advisor relationship. Period.
Domain expertise means more than knowing what the data means. It means understanding the business function you serve at the same depth a business analyst would — the processes, the language, the KPIs, the metrics, the upstream and downstream dependencies, the business rules behind every calculation. It means being the person in the room who can say: "That number looks wrong because of how this process changed in Q3" — before the stakeholder has to ask.
Over the course of my career, I have served Legal and Corporate Affairs, Manufacturing, Customer Support and Sales, Sales and Marketing, Revenue Operations, Finance, and Platform Services. Each domain required a genuine learning investment. Every time I took over a new project or programme, my first objective was to build domain knowledge — because without it, I could not connect with stakeholders, I could not ask the right questions, and I could not deliver analytics that solved the real problem rather than the stated one.
At Vendasta, the team had been working in silos — each person knew their own domain but had little cross-functional awareness. I restructured the team with a deliberate SME model: every division had a named BI owner accountable for deep domain knowledge in that function, with a trained backup who could cover at meaningful depth. I fostered transparency and knowledge exchange across the portfolio. Within eighteen months, the team had gone from domain-isolated specialists to cross-functional practitioners who could advise across the business.
"Unless my team and I develop functional and domain expertise, we won't be able to become subject matter experts for our clients and stakeholders. We won't be able to build a trustworthy relationship with them and get their buy-in. Period."
What stops BI teams from becoming truly great
In my experience, the single most overlooked problem in BI leadership is the gap between what leaders believe they are delivering and what their teams actually experience. A leader will rate their one-on-one practice as excellent. Their team will rate it as inconsistent. A leader will say they prioritise based on business value. Their team will describe a backlog driven by whoever escalates loudest.
This gap — between leadership intent and ground-level reality — is where most improvement programmes get stuck. You cannot fix what you cannot see. And most BI leaders do not have a structured instrument to surface that disconnect honestly.
That is what the BI Leadership Capability Assessment Framework was built to do. Across 208 questions spanning all six principles, it collects responses from both the leadership set and the employee set — and then compares them. The gap between those two perspectives is the most actionable output of the entire assessment. It tells you not just where the team is weak, but where leadership believes things are working when they are not.
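The gap computation itself is simple arithmetic; the value is in collecting both perspectives at all. A minimal sketch, with principle names and scores that are purely illustrative rather than the framework's actual question set:

```python
# Sketch of the perception-gap computation described above.
# Principle names and sample scores are illustrative, not from the framework.

leadership = {"People": 4.5, "Management": 4.2, "Customers": 4.0}
employees  = {"People": 3.1, "Management": 3.8, "Customers": 4.1}

# Gap = leadership self-rating minus employee rating. Large positive gaps
# mark areas where leadership believes things work better than the team does.
gaps = {p: round(leadership[p] - employees[p], 2) for p in leadership}
most_actionable = max(gaps, key=lambda p: gaps[p])

print(gaps)             # {'People': 1.4, 'Management': 0.4, 'Customers': -0.1}
print(most_actionable)  # People
```

In this toy example, leadership and the team broadly agree on Customers, while the 1.4-point gap on People is exactly the kind of blind spot the assessment is designed to surface.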
I designed this framework because I needed something like it — across every organisation I worked in, every team I inherited, every restructure I led. It did not exist. So I built it. And I built it not as a theory, but as a direct translation of 19 years of doing this work.
Ready to see where your team actually stands?
Take the free sample assessment — 18 questions across all six principles. No email required. See your maturity score instantly. Then decide if you want the full picture.