Product Metrics Are Affecting Your Happiness
Happiness can be hard to find, especially in the workplace. Yet product managers generally love what they do and find the role quite fulfilling.
These days, product management and a data-driven approach are unshakeably tied together. Most of us want decisions based on a solid foundation of facts, with experiments constructed to prove our hypotheses. A/B tests tell us what works and what doesn’t, supplying evidence for every fork in the road.
This is a far cry from the old days of gut instincts and visionary leaders whose hunches led the way. Now, data in spreadsheets and slide decks quantifies why that button should be blue instead of aqua, to cite one example.
How did we go from the Wild West to actuarial tables? And what have we lost by ignoring our instincts and letting the data dictate our decisions?
A data-obsessed product culture uncovers all kinds of actionable insights by poring over the information at hand. But it’s also likely guilty of not giving viable concepts a fair shake when they don’t come prepackaged with supporting data.
During the prioritization and roadmapping phases, product teams make judgment calls on what’s most important, what deserves implementation first, and what remains in the backlog. These decisions have massive downstream ramifications, particularly on resource allocation, budgeting, go-to-market strategies, market competitiveness, and more.
It’s natural to want supporting data for these decisions and the ability to assess their ROI on the spot. Yet, these snap judgments tend to snuff out great ideas before they’ve even found their footing.
If the culture’s immediate response to any new item is to challenge its legitimacy and demand supporting evidence immediately, colleagues will be less likely to suggest anything at all without a briefcase full of data.
Since many people won’t have the access or know-how to quantify every idea as soon as it comes to them, they might not bother even bringing it up. This leaves a lot of creativity and innovation out of eventual consideration.
Organizations can also lean too heavily on a data-driven approach, which can bias their thinking too early in the assessment process. For example, in customer-centric companies, a few “squeaky wheel” users can turn a minor issue into a five-alarm fire simply by making a lot of noise. The silent-yet-satisfied masses get ignored, and projects with a more significant impact are shelved to quell the vocal minority.
But before proceeding further, let’s clear up any doubt regarding the importance and value of a data-driven approach to decision making. No one is longing for a return to a shoot-from-the-hip free-for-all. We don’t want one executive’s whims and affections for shiny objects to dictate the fate of an entire company.
Looking at analytics reveals so many insights and unexpected facts that it’s hard to believe we got this far without it.
It helps you make your case, win over doubters, and get stakeholders aligned. It also proves out whether your hypotheses were correct. These wins bolster your credibility within the organization.
It also makes decisions easier. If decision-makers are at all on the fence, turning to the data is both a time-saver and a great way to justify the call.
But those qualities are also what make data problematic. Data makes safe choices safer and, by extension, risky choices riskier. Data-driven decisions can breed a culture where no one wants to take a chance.
The antithesis of basing every decision on cold, hard facts is going with your gut and winging it. Following your intuition can lead you down many paths, some of which work out while others end badly.
But without exploring our hunches at least a little, we starve innovation. While data is a source of inspiration for iterative improvements, bold new directions are seldom spawned from a pie chart.
Intuition itself doesn’t exist in a vacuum. It’s an organic conclusion drawn from our past experiences, leaning on our powers of deduction and perception.
Instinct is why, the first time our prehistoric forebears saw a new type of predator, they didn’t hang around to see whether it attacked. Their decision to hide and observe instead of remaining exposed was based on previous experience: an instant assessment of the current threat and a speedy response.
In business, we rarely have to worry about whether our competitor will eat us for breakfast. Still, there are occasions where quick thinking and prompt reactions can avert disaster or seize an opportunity. If we hide in a cave waiting for more data to come in, there’s a chance our cost of delay could be fatal.
On the other hand, relying on instinct can also be a recipe for disaster. We can become overconfident, ignoring signs that our choices aren’t panning out as expected or that the facts on the ground have changed.
If our early bets paid off, we could delude ourselves into believing we have the Midas touch rather than beginner’s luck, leading to even riskier decisions down the road. That’s why most tales of wildly successful entrepreneurs begin with early failures preceding subsequent triumphs.
The answer is to avoid both extremes. Striking the right balance between an incurable dependence on data and completely cavalier conduct can be tricky. When done correctly, you get the best of both worlds.
We have a hypothesis, run an experiment, analyze the data, and double down on what works. Or we spot a trend in the data, make an educated guess on how to act on it, then measure the results.
An environment that prizes creative approaches alongside meticulous research and analysis is the happy medium to strive for. It allows room for the discomfort of uncertainty while leaving space for hunches to blossom or fizzle on their merits.
As a product leader trying to marry these two approaches, how you tell the story makes a big difference. Predicating your business case on anecdotal observations or limited customer feedback immediately casts doubt on your proposal. But stating that you’ve seen a particular trend in the data, have an idea of how to address it, and have a plan to measure its efficacy threads the needle.
Product teams shouldn’t be committing to product roadmaps and strategic plans lacking adequate data-based confirmation. Executives should be holding their teams accountable and insisting on supporting data and measurable expected outcomes before dedicating resources to any initiative.
But those decisions get made further downstream in the process. What’s problematic is when the demand for data suffocates creativity and innovation at the ideation phase.
No one should be discouraged from brainstorming or voicing potentially terrible ideas during the creative process. Nor should an idea be shunned because there’s no existing set of data to vet it against.
Instead, welcome the flood of half-baked theories. When it’s all over, assign them some homework. If there’s consensus an idea has some legs, then there’s now an opportunity to spend some cycles investigating whether the data supports or contradicts it.
At the executive level, there should also be room for a little uncertainty. While most items presented for approval should meet the rigorous demands for supporting evidence, there should also be an opportunity for structured experiments.
The product stack can play a role here as well. Using tags and color-coding, items on the planning board or roadmap requiring more data and research can be flagged accordingly, and updates containing additional data can be added right in the tool. This keeps those less-researched ideas on the radar during later prioritization and roadmapping sessions.