The omission of “why?”
A few years back, I worked at a company that made a very simple mistake in how it delivered product, one that unduly limited internal innovation: the omission of “why.”
The company was in the…
By Joe Stump
As a business, you’re trying to hire the best and brightest folks across your organization. Answering “why” lets you:
- Allow lateral thinking among your employees by coloring in some of the context around why a feature is necessary
- Educate everyone else in the business about customer needs by answering “why”
- Share knowledge and the expression of intent to reinforce the notion of consistently delivering value.
See on blog.sprint.ly
The Silent Partner
Jason Goldman helped build Google and Twitter into what they are today — but few outside of tech’s inner circle know his name. On shunning the spotlight in a star-obsessed industry.
Jason Goldman Regarding “Product Managers”
He was explaining the product manager’s role, and not exactly overselling it.
“You’re the one that types the meeting notes, the one that is over-communicating the schedule, the one that goes and takes the meeting with the person no one else wants to meet with,” he said of his early work in the field. “You’re just doing a lot of grunt work to make things run smoother.”
His first jobs were in user support, “in understanding how people use software,” he remembered. “It’s a great path into project management. You don’t have to be a designer, you don’t have to be an engineer.”
—
Product managers are sometimes said to oversee discrete components of a company, like feudal lords in a kingdom. But for many P.M.s, Goldman’s assessment is closer to reality.
“Everybody says the project manager is the C.E.O. of their project, and I think that’s total bullshit,” says Josh Elman, a former manager at Facebook and Twitter, the latter under Goldman. “The real heart of a product manager is the guy who sits in the back of the raft with the oar.”
- Troubleshooting behind the counter is perfect training for a product guy, overworked and unsung. If it sounds less plush than the chief executive’s chair, that’s because it is.
- “I’m the guy who stands up next, and says what does that mean in terms of what we’re building over the next six months,” he said.
- That’s the gritty work of fielding questions, farming out assignments and reconciling disagreements.
- “Your presentation doesn’t sound as good. Your presentation doesn’t have grand, inspiring goals,” Goldman went on.
- “You’re the guy who stands up and says, next week we’re going to fix a bunch of bugs. You’re the person that’s managing the fallout from the grand vision.”
- “He wasn’t the idea guy, as maybe some product people are,” Williams told me of Goldman.
- “He’s not necessarily defining what we need to do, he’s just making sure it got done. I don’t know that it’s a typical relationship, but it’s probably not super uncommon,” Williams added.
See on www.buzzfeed.com
Objectively Making Product Decisions
by Joe Stump
Deciding which mixture of features to release, and in what order, to drive growth in your product is difficult as it stands. Figuring out a way to objectively make those decisions with confidence can sometimes feel downright impossible.
On November 12th, we released Sprint.ly 1.0 to our customers. It was a fairly massive release with core elements being redesigned, major workflows being updated, and two major new features. The response has been overwhelmingly positive. Here’s an excerpt from an actual customer email:
“Well, I’ve just spent some time with your 1.0 release, and I think it’s wonderful. It’s got a bunch of features I’ve been sorely missing. To wit:
- Triage view – a Godsend or, no he didn’t?!
- Single-line item view – where have you been all my life?
- Convenient item sorting icons – OMG, how did you know?
- Item sizing, assigning, following icons everywhere – spin us faster, dad!
I’m sure there are a ton more, but these are great improvements.”
Yes, how did we know? I’m going to lay out the methodologies we used at Sprint.ly to craft the perfect 1.0 for our users. It all begins with a lesson in survivorship bias. In short, survivorship bias, as it applies to product development, means you’ll get dramatically different answers to the question “What feature would you like?” depending on whether you ask current customers or former and potential customers.
LESSON 1: OBJECTIVELY EVALUATE YOUR EXIT SURVEYS
You do have an exit survey, yes? If not, stop reading this now, go to Wufoo, and set up a simple form asking customers who cancel their accounts or leave your product for input on why they left. You can take a look at ours for reference.
The problem with exit surveys and customer feedback in general is that everyone asks for things in slightly different ways. Customer A says “Android”, Customer B says “iOS”, and Customer C says “reactive design”. What they’re all really saying is “mobile”. Luckily, human brains are pretty good pattern recognition engines.
So here’s what I did:
- I created a spreadsheet with a column along the top for each major theme I noticed in our exit surveys. A theme only earned a column if more than one customer mentioned it.
- I then went through every single exit survey and put a one (1) under each theme that the entry mentioned.
- Finally, I calculated what percentage of our former customers had mentioned each theme and ranked the themes accordingly (there’s a rough sketch of this tallying in code after this list).
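If you’d rather script the tallying than maintain a spreadsheet, here’s a minimal sketch of the same counting in Python. The themes, keywords, and responses below are hypothetical stand-ins; in practice, the theme-to-keyword mapping is a judgment call you make only after reading the surveys yourself.

```python
# A hypothetical sketch: tally exit-survey themes and rank them by the
# percentage of former customers who mentioned each one. The themes,
# keywords, and responses are made-up examples.
from collections import Counter

themes = {
    "mobile": ["android", "ios", "responsive", "mobile"],
    "pricing": ["price", "expensive", "cost"],
    "data density": ["cluttered", "too many items", "backlog is unwieldy"],
}

responses = [
    "Really wanted an Android app.",
    "Too expensive for a team our size.",
    "The backlog gets cluttered once you have a few hundred items.",
]

counts = Counter()
for response in responses:
    text = response.lower()
    for theme, keywords in themes.items():
        # One tick per theme per response, even if several keywords match.
        if any(keyword in text for keyword in keywords):
            counts[theme] += 1

# Percentage of former customers who mentioned each theme.
total = len(responses)
for theme, n in counts.most_common():
    print(f"{theme}: {n / total:.0%}")
```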
Here are the results:
Now, I know you don’t know our product as well as you know your own, so the themes might not make much sense, but allow me to elaborate on the points I found most interesting in this data:
- Our support queues are filled with people asking for customized workflows, but in reality it doesn’t appear to be a major force driving people away from Sprint.ly.
- 17% of our customers churn either because we have no estimates or they can’t track sprints. Guess what? Both of those are core existing features in Sprint.ly. Looks like we have an education and on-boarding problem there.
- The highest non-pricing reason people were leaving was a big bucket that we referred to internally as “data density” issues.
After doing this research, I was confident that we should double down on fixing these UI/UX issues, and we immediately started major updates to the portions of the website that we believed would largely mitigate our dreaded “data density” issues.
But how could we know these changes would keep the next customer from leaving?
LESSON 2: IDENTIFY WHICH CUSTOMERS WERE LIKELY CHURNING DUE TO “DATA DENSITY” ISSUES
We store a timestamp for when a customer creates their account and a separate one for when they cancel it. This is useful data to have for a number of reasons, but what I found most telling was the following:
- Calculate, as an integer number of days, the difference between when each account was created and when it was cancelled.
- Bucket those lifetimes by month and count the cancellations in each bucket, e.g. 100 churned in the first month, 50 in the second, etc. (there’s a short sketch of this in code after this list).
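As a rough illustration, here’s a minimal sketch of that bucketing in Python. The account rows are invented examples and the 30-day month is an approximation; it also shows the trial-month filter discussed below.

```python
# A hypothetical sketch: bucket cancellations by account lifetime in
# months. The created/cancelled dates are made-up examples.
from collections import Counter
from datetime import date

accounts = [
    (date(2013, 1, 3), date(2013, 1, 20)),  # churned in month 0
    (date(2013, 1, 5), date(2013, 6, 11)),  # churned in month 5
    (date(2013, 2, 1), date(2013, 9, 15)),  # churned in month 7
]

churn_by_month = Counter()
for created, cancelled in accounts:
    lifetime_days = (cancelled - created).days   # lifetime as an integer
    churn_by_month[lifetime_days // 30] += 1     # rough month bucket

for month in sorted(churn_by_month):
    print(f"month {month}: {churn_by_month[month]} cancellations")

# To focus on engaged, paying customers, drop the first (mostly trial) month:
paying_churn = {m: n for m, n in churn_by_month.items() if m >= 1}
```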
You should end up with a chart that looks something like this:
It shouldn’t be surprising that the vast majority of people churn in the first two months; these are mostly your trial users. Why our first month is so high is another post for another day. What we really want to figure out is why engaged, paying customers leave, so let’s remove trial users and the first month to increase the signal.
We get a very different picture:
In general you want this chart to curve down over time, but you can see Sprint.ly had a few troubling anomalies to deal with. Namely, there are clear bumps in churn numbers for months 5, 7, and 8.
We had a theory for why, based on the survey data above: a large part of the “data density” issues had to do with problems managing backlogs containing a lot of items. Was the large amount of churn in months 5–8 due to people hitting the “data density” wall?
LESSON 3: TESTING THE THESIS
So far we’ve objectively identified the top reasons people were leaving Sprint.ly, as well as a few anomalies that might point to the customers churning for those reasons. Now we needed to verify that thesis and, more importantly, show those customers what we were cooking up to see whether our update would have been more (or less) likely to prevent them from churning.
To do that we turned to intercom.io and set up an email to be sent to customers who fit the following criteria (there’s a sketch of the equivalent database query after this list):
- Had created their account more than 4 months ago.
- Had not been seen on the site in the last 2 weeks.
- Was the person who owned the account.
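For the manual pass mentioned next, the same segment can be pulled straight out of an internal database. Here’s a minimal sketch against a hypothetical accounts table; the schema is invented for illustration, and this is not Intercom’s API.

```python
# A hypothetical sketch: select dormant account owners from an internal
# database. The "accounts" table and its columns are invented examples,
# with timestamps assumed to be stored as ISO-8601 strings.
import sqlite3
from datetime import datetime, timedelta

conn = sqlite3.connect("internal.db")
created_cutoff = (datetime.utcnow() - timedelta(days=4 * 30)).isoformat()
seen_cutoff = (datetime.utcnow() - timedelta(weeks=2)).isoformat()

rows = conn.execute(
    """
    SELECT email FROM accounts
    WHERE created_at < ?    -- account is more than ~4 months old
      AND last_seen_at < ?  -- not seen on the site in the last 2 weeks
      AND is_owner = 1      -- the person who owns the account
    """,
    (created_cutoff, seen_cutoff),
).fetchall()

emails = [email for (email,) in rows]
```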
I also sent this email manually to a number of customers fitting this profile whom I was able to glean from our internal database. I got a number of responses and was able to schedule phone calls with a handful of those customers.
From there it was a matter of showing our cards. I would hop on Skype, walk through the new design ideas and the problems we were trying to address, and ask whether these features would have kept them from leaving in the first place. Luckily, we had been closely measuring feedback and were pleased to find that our efforts were not lost: the updates did indeed address a lot of their issues.
CONCLUSION
Making product decisions based on customer feedback can be difficult. The more you can do to increase signal over noise, gather objective metrics, and distill customer feedback, the better. It’s not always easy, but it’s always worth it.
About The Author, Joe Stump
Joe is a seasoned technical leader and serial entrepreneur who has co-founded three venture-backed startups (SimpleGeo, attachments.me, and Sprint.ly), was Lead Architect of Digg, and has invested in and advised dozens of companies.