

Voices on ROI: "7 Questions" with Param Ghangas

Feb 27, 2018 12:40:26 PM / by Adam Nathan

We sat down recently with Param Ghangas, the Director of Analytics at HBO, to talk through our 7 Questions.

Param has been working in analytics and marketing for over a decade, designing KPIs, building web analytics solutions, managing teams, and communicating actionable insights to a wide variety of stakeholders.  

Her experience spans the business and technical sides of creating ROI in complex enterprise environments. With a keen eye for maximizing the decision-making impact of a data strategy, she had a great deal to share on how companies can work effectively to leverage information assets.

Her answers have been lightly edited for clarity and length.

Param Ghangas, Director of Analytics at HBO

#1 - What professional experience has led to your understanding of how to deliver data ROI?

I have been involved in data and analytics in some capacity for ten or so years, both from the strategic side and from the more technical side. I’ve worked in-house for several companies and I’ve consulted for others.

While I’ve primarily been an end user of data, I’ve also played a significant role in defining requirements. All this to say… I’ve had quite a range of experience that has contributed to my understanding of how to deliver data ROI.

I’ve really had to understand how data drives decisions and understanding within a business - and I’ve often seen how it might drive misunderstanding, too. Even the times when data is used ineffectively can help inform how it can better contribute to ROI.

Param Ghangas - If you can't explain it, it probably wasn't worth doing

#2 - How does a manager frame a successful analytics project?

It has to start with “what are you trying to achieve” and “what are the business decisions that you’re trying to affect.” The natural inclination for many people is to start with “what can the data tell us” and “how should we visualize it,” and to me, those things come later.

“What are you trying to achieve” frames what you should look at and how you should think about the problem. Your hypothesis about what you expect to see helps guide what data you’ll need - so the data supports what you’re trying to achieve rather than the other way around.

A lot of people start with data collection or data interrogation. For a project to be successful, you have to put some scope around it in terms of the business problem you’re trying to address.


#3 - How would you prioritize multiple analytics projects with limited or competing resources?

Generally, the loudest voice - or the highest up in the room - will dictate the prioritization. Absent that, and assuming all stakeholders are created equal, I think you have to have some revenue component tied to it.

That’s not to say that’s always easy. But the closer you can get to attaching your projects to estimated revenue impact, the closer you get to letting data do its part.

In theory, it isn’t about the loudest in the room, because everybody in that room should be focused on what’s driving value to the business. The equalizer in these situations is communicating in those terms. 

#4 - How would you coach executives to think about data projects?

Everything starts with mapping out the strategic initiatives. Then it’s knowing what the key business questions are around those, and what the key decision points are that we can actually control.

Data problems are just strategic problems. It’s just a framing of a business question. I don’t start with the data elements. I don’t start from the bottom up. I start from the top down. That’s the most effective way of communicating with people who aren’t in the weeds with the data. They don’t care.

There’s this tendency to attack data from a very broad perspective, but you can’t boil the ocean. And that’s where I actually see projects that don’t move forward, because it is very hard to get incrementally better when you’re trying to attack or correct everything all at once.   

#5 - What's the best way to ensure the business takes action on the insights that an analytics team delivers?

I think there’s an over-reliance on dashboards. That isn’t to say that I don’t think they are important - I think commonly shared metrics across an organization are important for understanding and context - but I don’t think you should be attacking most questions through dashboards.

Most analysis projects should really be connected to a decision that you’re trying to make. If you can’t home in on the precise decision you’re trying to make, it’s worth revisiting and reframing the project in a way that supports the decision in that area of the business, whether it’s a marketing decision, a financial decision, or a more strategic direction decision.


#6 - How do you demonstrate bottom-line value to stakeholders?

It’s what we try to achieve every day.

If there’s a decision to be made between choice A and choice B, or a product feature to be rolled out, what we try to develop an understanding around is how those business changes affect the choices consumers make and how those choices tie to retention outcomes. That is a clear revenue component for us.

And if it isn’t revenue explicitly, it is usually some metric that we know to impact revenue. It’s a conversion rate or a retention rate or something like that.  

I do try to focus people on understanding what the right key performance indicators are, but that term itself has become a little bloated.

Metrics for information purposes are fine, but you need to understand if something is truly a performance indicator for the business. Does that metric actually tie to something meaningful in terms of retention or acquisition?

And often people attach themselves to a metric that doesn’t actually mean anything in terms of the business, but they’ve convinced themselves that it indicates something. It’s my role to correct those misunderstandings with better insights.

#7 - Other key insights

FOCUSING ON VALUE, NOT PRECISION

There’s always some issue with data, but the quickest way to undermine any analysis is to start off saying, “Well, I didn’t have this and that.” The nature of analytics problems is that you’re always making assumptions, leveraging proxies, and building models in the best way that you can, and dwelling on the problems in the data would undermine its credibility.

The quickest way to undermine your analysis is to focus on the things that weren’t right rather than the things that were. 

OVERCOMPLICATING MODELS

Sometimes people overcomplicate the models that they use. A methodology that you use doesn’t have to be overly complicated. If you can’t explain it to a stakeholder, then it probably wasn’t worth doing.

The mistakes I generally see are overcomplicating the problem, or people undermining themselves by over-explaining aspects of the process that aren’t salient to the delivery of the analysis.

REFRAME FOR VALUE

I tell team members that everything we do has to build on a foundation of “what are we trying to achieve.” We don’t go into projects saying, “Well, somebody asked for this data, so let’s send them this data.”

If somebody comes to us and says, “Can you send us this?”, I challenge the folks who report to me to reframe it in a way that delivers business value. That’s our job.

Our job isn’t just to give people whatever they ask for. Our job is to define and do the work in a way that effectively communicates the value of whatever analysis we’re doing.

SELF-SERVICE STRATEGY

I have mixed views on self-service.

When you say that you’re not in favor of self-service, it sounds like you’re not a fan of opening up access to data, which isn’t the case. But I often feel that people are servicing themselves in the wrong way. They are misapplying the data.

There needs to be more importance placed on educating people on the right uses of data and how to think analytically.  

You can’t just open up access to data without emphasizing analytical and conceptual thinking for the people in those roles.

If we’re placing more importance on the role of self-service, then we should be hiring employees that truly understand how to use that data, whether they are in an analytics or data science function or not.  

Data is not instinctive. 

ON PRETTY DASHBOARDS

So, you have a lot of dashboards that exist, and then you present one in front of a stakeholder and they’re like, “What is this telling me? It looks pretty, but what is it telling me?” Often in this space it’s “let me show you this dashboard or this visualization,” and sometimes we get a little blinded by the bells and whistles, rather than asking: did this create value? Did it free up resources for another activity?

Did it tell us something about the business?


Topics: KPI, Strategy, Dashboard


Written by Adam Nathan

Adam has been called "the Jony Ive of business simplification." He brings twenty years of experience helping business and non-profit leaders drive value with actionable insights. As a long-time business owner and CEO himself, he understands that creating value from information is never a given and that creating a data-driven culture is a learned skill.
