Understanding Isn't Enough
From tracking revenue to flagging data quality issues, vendor tools and data products don't sell because they simply provide information. Value comes from action.
Gartner's analytics maturity model isn't new by any means. Still, many data vendors and data teams haven't truly climbed it to deliver the highest value. The tech industry is convinced data is important, but it's stuck in a never-ending cycle of building understanding instead of acting on what's already known.
So how can analytics teams build data products that truly inspire change? Let’s look to product teams for some inspiration.
Products don't make money by just providing information
Mint.com (now just "Mint"), a popular budgeting service acquired by Intuit in 2009, continues to grow in product offering and popularity. Even though the integrations and budget categorization could use some work, I use it to do a personal financial review every quarter or so. Some use it as a weekly budgeting tool; others use it to track savings toward a goal.
With that said, at first glance, when I open the app the landing page just gives me information: the balance of my accounts, my credit cards, my spending across different categories this month. If I stop here, this is all it is: information. And some would argue this is in itself valuable because, well, gathering this information outside of Mint would take a lot of manual time. The real key, however, is what can be done with the information when it's presented in the form of alerts or insights. Can I, and should I, save more money? Can I afford that new pair of skis I've been eyeing all year? The information can then influence financial decisions.
Mint's marketing reflects exactly that. On the website's landing page, the slogan isn't "understand your finances"; no, that's too passive. It's "Managing money, made simple." It's a call to action. The app is clearly supposed to save the user time doing things that otherwise wouldn't be, well, simple.
I would think Mint's product managers don't set out to enable customers to understand things, but rather to enable customers to do things and do them faster. Sure, along the way customers also gain understanding, but the value doesn't stop there.
Analytics teams and data tooling vendors make themselves more influential by taking a similar approach.
The most valuable vendors influence process
Take a look at dbt. Analytics professionals rave about it because the tool helps them develop and deploy transformation code (what would have been just plain old SQL) in a more repeatable, scalable, and generally faster way. This value proposition is directly reflected in dbt’s marketing website.
Similarly, reverse ETL is blowing up just like ETL blew up because it saves people time. Data engineers like me don't have to spend countless hours implementing and subsequently maintaining pipelines. I don't like maintaining them, you don't like it when they break, and I don't bring any unique value to the process.
Data quality is a different story. Of course, understanding what's broken is far better than not knowing which pipelines are working properly and which ones aren't. Ignorance is not bliss when it comes to data quality. Understanding which task in a pipeline breaks is important, but it's only one step forward.
The biggest value add comes when downstream dependent tasks are automatically skipped because there's a high risk of further issues.
Really, what I'm referring to are pipeline tests that influence the fate of a job, which I spoke about last year. If data quality doesn't pass a reasonable threshold, don't proceed with sending data to dashboards, operational tools, etc.
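To make that concrete, here's a minimal sketch of that kind of quality gate in plain Python. The function names, the non-null checks, and the 99% threshold are all hypothetical stand-ins; in practice this logic would live in whatever orchestrator runs your pipelines.

```python
# Minimal quality gate: downstream publishing only runs if enough rows pass checks.
# Everything here (column names, threshold, publish step) is illustrative.

def run_quality_checks(rows: list[dict]) -> float:
    """Return the fraction of rows passing a basic non-null check."""
    if not rows:
        return 0.0
    passing = sum(
        1 for r in rows
        if r.get("order_id") is not None and r.get("revenue") is not None
    )
    return passing / len(rows)

def publish_downstream(rows: list[dict]) -> None:
    """Stand-in for pushing data to dashboards, operational tools, etc."""
    print(f"Publishing {len(rows)} rows downstream")

def run_pipeline(rows: list[dict], threshold: float = 0.99) -> None:
    pass_rate = run_quality_checks(rows)
    if pass_rate < threshold:
        # Short-circuit: downstream tasks never run, so questionable data
        # never reaches dashboards or operational tools.
        raise RuntimeError(
            f"Quality gate failed: {pass_rate:.1%} of rows passed, need {threshold:.0%}"
        )
    publish_downstream(rows)

if __name__ == "__main__":
    sample = [
        {"order_id": 1, "revenue": 19.99},
        {"order_id": None, "revenue": 5.00},  # bad row trips the gate
    ]
    run_pipeline(sample)  # raises: only 50% of rows pass
```

The point isn't the specific check; it's that the test result changes what runs next, not just what gets logged.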
Fundamentally, this is why I don't like the term data observability. Of course, to take action there needs to be an understanding of what's going on. Observing something is a step, but it's certainly not the be-all and end-all of where value comes from. The observer needs to understand what action to take based on the observations.
The idea of influencing process isn't limited to data quality vendors; it applies to virtually any product (e.g., Mint). That includes data products.
Analytics products should change behaviors
The finance team knows what last month's revenue was. The merchandising team knows which products have the highest margin on your ecommerce site. Then you, a member of the data team, got some Slack alerts about data quality issues. But so what? What are individuals going to do with the information provided? If the answer is nothing, then the analytics team wasted time building reports, alerts, and context.
Analytics teams can learn from vendors who deliver value in several ways.
Figure out what question the data product should answer. This isn't always going to be the first question asked, and there are wrong questions. For instance, asking "how did revenue last month perform with respect to forecasted revenue?" puts performance into context. Even better, if revenue is trending below the forecast, what can be done to course-correct? Maybe the answer is hiring more sales reps in a B2B world, or increasing marketing spend in a B2C ecosystem. The answer to the question being asked should make the required action clear; a small sketch of that framing follows below.
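As a toy illustration (the figures, the 5% tolerance, and the suggested actions are all made up), the output of that question can be phrased as a prompt to act rather than a bare number:

```python
# Toy example: frame revenue-vs-forecast as an action prompt, not just a figure.
# The numbers, tolerance, and suggested actions are purely illustrative.

def revenue_vs_forecast(actual: float, forecast: float, tolerance: float = 0.05) -> str:
    gap = (actual - forecast) / forecast
    if gap < -tolerance:
        return (
            f"Revenue is {abs(gap):.1%} below forecast "
            f"(${actual:,.0f} vs ${forecast:,.0f}). "
            "Consider adding sales reps (B2B) or raising marketing spend (B2C)."
        )
    return f"Revenue is within {tolerance:.0%} of forecast. No action needed."

print(revenue_vs_forecast(actual=920_000, forecast=1_000_000))
# Revenue is 8.0% below forecast ($920,000 vs $1,000,000). Consider adding ...
```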
Saying no is part of your job. Product managers serve as filters for engineering teams when prioritizing requirements. Companies that build every feature requested by every customer end up with a lot of sprawl and little focus. Just like Mint shouldn’t waste time building products no one wants to pay for, analytics teams shouldn’t waste time creating analytical outputs that aren’t influencing business metrics. After figuring out what action the question drives, get to the bottom of what top-line metric the effort influences. The importance of the metric will dictate how to prioritize the ask.
Follow up, assess, and revise. Keep the door open and communication flowing. Sometimes, the first-pass data product isn't going to be as useful for the business stakeholder as intended. Good product managers launch products effectively; great product managers ensure their products work as intended and perform well according to customers. Analytics products should also be monitored and revised to ensure they answer the question at hand most effectively.
This is all obviously easier said than done. Analytics teams will always get requests to build and maintain dashboards that are looked at once and never again. Similarly, it's hard to prioritize revising data products that already exist instead of moving on to shiny new projects. However, both will help decrease analytical debt in the long term.
Value add: Make people’s lives easier
Whether it's automating a report as an analytics engineer, or being a vendor that migrates data for customers or makes decision-making easier, you name it: all of it provides value. But the question at hand is: where does that value come from? My answer: there's a dollar value to time, so there's value in saving it. If you make people's lives easier, they'll thank you for it.
Thanks for reading! I love talking data stacks. Shoot me a message.
Interesting point about observability being disconnected from action. I guess the issue resolution process/system should be put in place at the same time as implementing the test suite, otherwise you're only solving half the problem. Software development solves this with clear tests at merge time, but since data pipelines can fail at any time based on the data itself, it's a trickier problem.