October 15, 2021

Lesson 10: Scorecards

Remember baseball cards…

…I do…

…which puts me in an age bracket approaching the steep precipice of irrelevancy in software development.

But this much I will tell you. If you never had a baseball card collection, then you know nothing about stack ranking according to a weighted score…

…NOTHING!

My friends and I would spend countless hours arranging our cards in priority order based upon key stats such as Home Runs, RBIs, ERA, etc. We created sophisticated ranking algorithms that would put the Facebook newsfeed to shame.

“Do the Earned Run Average and Base on Balls of Roger Clemens supersede the Slugging Percentage and Grounded Into Double Play of Barry Bonds?” I would frequently ask my friend Zach.

“Doesn’t matter,” he would say. “At least not when you compare it to their PED to red blood cell ratio.”

We would then trade cards according to perceived value. Don Mattingly was in the upper tier of cards and could be traded for a seemingly equivalent card such as a Cal Ripken Jr. And then there were the Ken Griffey Jr. cards, which held such an impossibly high value that the trades transcended the realm of sports.

“I’ll trade you this Ken Griffey Jr. card for your Sega Genesis,” I remember attempting a few times in my young baseball card trading career.

When I first became a Product Manager and was asked to stack rank a list of business priorities, I smugly thought to myself, “That’s all you want me to do? This is going to be a piece of cake. This task doesn’t even involve an older brother who is trying to steal my cards to pawn and pay for the latest Mötley Crüe cassette.”

I thought the system of ranking and scoring would translate to a list of well-thought-out priorities that the business would gladly accept and the developers would gladly implement in sequential order. At least, that’s what all the blog articles had told me would happen…

…oh boy…

…I not-so-gladly am here to say: it was all a lie…

…it ain’t easy.

I have a 5-step hierarchy of effective methods for product discovery, persuasion, and prioritization. The top 4 are pretty decent methods: prototypes, observation, interviews, and whiteboarding. If done right, these methods all involve at least some form of market and user interaction.

And then there is a huge gap between these methods and scorecard ranking. It’s like you’re asked to rate the coolness of your favorite Marvel superheroes. And you’re given a list with Iron Man, Thor, Wolverine, and Hulk. And then at the end of the list you see Captain Ultra.

“Who the hell is Captain Ultra?” you say to yourself. “Shouldn’t there be another choice like Deadpool or Rocket Raccoon or something like that?”

Upon proper research and investigation, you learn that Captain Ultra was a former plumber who was mysteriously given powers without any real backstory or explanation, but that for some reason he developed a crippling fear of fire. In one of the more obscure Marvel issues, Captain Ultra once fainted because he saw a harmless little flame.

“What a lame superhero,” you conclude. “I mean, dealing with fire and explosions is kind of a job requirement for a superhero, is it not?”

You then go on to imagine what a movie about Captain Ultra would be like and the only equivalent comparison you can make is "Sleepless in Seattle"...

...come to think of it, Tom Hanks would probably make a great Captain Ultra.

Suffice it to say that as a Product Manager, you will soon find out that scorecard stack ranking is the Captain Ultra of prioritization and persuasion methods…

…it’s lame...

...sorry, Tom.

And of course, it shouldn’t surprise any of us at all that it is the most popular method espoused in Product Management communities.

A quick Google Search of “Product Management Scorecard” brings back no less than 17 million results in less than three quarters of a second…

…Trust me when I say I’ve read them all and each one gets lamer and lamer as you go on...

...kinda like Tom Hanks's acting career...

...BTW. I actually don't have anything against Tom Hanks. The joke just started with "Sleepless In Seattle" and snowballed from there...

...and I had a secret crush on Meg Ryan in the 90s, so I obviously couldn't use her in my jokes...

...call me, Meg.

The reason the scorecard is so popular is that it’s a lazy way to avoid doing real product management. It doesn’t require any work.

“Sit at your desk, don’t talk to anyone, and assign some arbitrary number value to this list of items,” says some Agile coach who claims to be an expert on the topic. “Then present it to your executive team and bask in their unending praise of your skill and intelligence”…

…ooh. How alluring?

Some methodologies such as Scaled Agile (or SAFe) have taken this lifeless scoring method to the extreme with a cult-like ideology called WSJF, or Weighted Shortest Job First…

…which is even less effective for prioritizing than WWJD.
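For the uninitiated, WSJF boils down to one ratio: Cost of Delay divided by Job Size, where Cost of Delay is itself the sum of three more relative scores. Here’s a rough sketch of the arithmetic as SAFe describes it; the backlog items and numbers are invented purely for illustration:

```python
# A rough sketch of the WSJF arithmetic as SAFe describes it.
# The backlog items and scores below are invented for illustration.

def wsjf(user_business_value, time_criticality, risk_opportunity, job_size):
    """WSJF = Cost of Delay / Job Size, where Cost of Delay is the sum of
    three relative scores (often on a modified Fibonacci scale)."""
    cost_of_delay = user_business_value + time_criticality + risk_opportunity
    return cost_of_delay / job_size

backlog = {
    "SSO integration": wsjf(8, 5, 3, 13),
    "Usage dashboard": wsjf(5, 3, 2, 5),
    "Dark mode": wsjf(3, 1, 1, 2),
}

# Stack rank: highest WSJF first.
for item, score in sorted(backlog.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{item}: {score:.2f}")
```

Sum three guesses, divide by a fourth guess, sort descending. That’s the whole ideology.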

Ok. Ok. Ok. Maybe I'm being too harsh.

Well, unlike most Agile coaches, I don’t write about anything I don’t have personal experience with.

Here was my first experience with scorecard ranking.

I isolated myself away from the world for a few hours and assigned some meaningless values to my backlog items. I then showed it to our CEO (it was a small company) to get his feedback.

“Oh. If you would have just come and talked to me, I could have told you about our business and where we are going. Here’s a list of some of our best customers you can talk with to get their feedback on your product. I’m excited to see what you learn.”

“So, you don’t care about this inconsequential list I spent all of 30 minutes compiling? What you’re trying to say is that your 15 years of market experience and thousands of customer and prospect conversations are actually more valuable than that?” I thought to myself as I choked back my tears.

Due to my lack of experience, it became painfully obvious that my scorecard “weighted” value actually didn’t have any weight at all.

I kind of have to blame this whole scorecard ranking thing on the proliferation of Scrum throughout the development community. They came up with a trendy way to estimate effort with a cool thing called the story point, and the product community got all jealous and was like, “Hey. We want a trendy scoring system too that can live side by side with your really rad story point estimates.”

Not ones to be outdone in coolness by a bunch of nerds, product people came up with their own scorecard.

…and that’s how I’m guessing WSJF spawned from hell into existence.

Look…

…If I have to convince you that figuring out the effort of engineering development and figuring out the value of a product idea are not the same thing…

…Product Management might not be for you.

Here’s the lesson. If you need to go through an exercise of scorecard ranking for your own information gathering and educational purposes…

…go for it.

It can be a personally valuable exercise to understand and learn what really matters to your business. It’s important to know if the value of an idea is to increase revenue, reduce costs, or if it’s just a shiny penny that sales needs to close a few more deals. You need to know all these things.

So do it.
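If you do run the exercise for your own education, a spreadsheet or a few lines of code is all it takes. Here’s a rough sketch of a weighted scorecard stack rank; the criteria, weights, and scores are made up for illustration:

```python
# A rough sketch of a weighted scorecard stack rank.
# The criteria, weights, and scores are made up for illustration.

WEIGHTS = {"revenue_impact": 0.4, "cost_reduction": 0.3, "sales_urgency": 0.3}

backlog = [
    {"item": "Self-serve onboarding", "revenue_impact": 8, "cost_reduction": 3, "sales_urgency": 5},
    {"item": "Billing rewrite", "revenue_impact": 2, "cost_reduction": 9, "sales_urgency": 1},
    {"item": "Shiny penny for sales", "revenue_impact": 4, "cost_reduction": 1, "sales_urgency": 9},
]

def weighted_score(row):
    # Multiply each criterion's score by its weight and sum them.
    return sum(WEIGHTS[criterion] * row[criterion] for criterion in WEIGHTS)

# Stack rank by weighted score, highest first.
for row in sorted(backlog, key=weighted_score, reverse=True):
    print(f"{row['item']}: {weighted_score(row):.1f}")
```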

Your whole product team should go through scorecard-like exercises to deeply understand the business impact of your items and to be able to clearly articulate the product objectives and KPIs by which you will measure success…

…I’m not saying don’t do it…

…I’m just saying don’t expect anyone else to care about your stupid point system. There are better ways to persuade the organization toward your product ideas and to convince them that you know what you’re talking about.

In other words…

…Don’t be Captain Ultra, who retreats like a coward at the tiniest smell of fire, tucking his scorecard tail between his legs, complaining with all the other losers at after-work gatherings about how the executives “just don’t get it”.

Step up and figure out how to be Iron Man…

…that’s what your organization needs…

…and if you want to trade anything at all for a huge stack of Ken Griffey Jr. cards…

…let me know.