Be Prepared To Be Wrong


As startup founders, product managers, and designers, it’s often easy for us to come up with all the answers. Should we add this feature? Will users take this action? Will changing this copy increase c…

StartUP Product’s insight:

Teresa Torres shares four brief but powerful explanations of how and why we so easily convince ourselves we’re right when often we really aren’t. The keys to overcoming things like confirmation bias are awareness and honesty.

“It’s way too easy to look back and decide that what happened is exactly what you expected to happen. It’s much harder to be honest with yourself and realize that what you are building may not be working.  But the faster we become aware something might not be working, the faster we can correct course and try again.”

Perspectives like the one Teresa shares in this post are one reason we’re excited to have her as a speaker at Startup Product Summit SF2, Oct. 11, 2013.  Sign up now for advance registration pricing that is $200 off the full price!


Engineering Where the Most Opportunity Exists


This blog on “Design Software” is not going to be about a review of what exists, but rather in what I feel is missing in the current offerings from so…

StartUP Product’s insight:

– The best chance for a product to strike the balance between the highest performance, lowest cost, and any other requirements exists at the earliest stages of development.
– The ability to consider multiple concepts and measure/simulate performance against requirements early, i.e., before any detailed design activities begin, maximizes the opportunity for a product to be successful.
– In other words: the best concept yields the best product.

So if the best potential for success is in the concept development phase, where are the conceptual engineering software tools?


Steve Jobs Had It Wrong: Why You Should Look To Consumers For Product Innovation


It has long been asserted (famously, by Steve Jobs) that customers can’t tell you what your next product should be. Companies create and customers consume.

StartUP Product’s insight:

To be competitive, brands need to look outward and cultivate the communities of creative customers that are shaping the future of their products.

Developers have been using APIs and open source software for many years to increase the pace of innovation. Consumer product companies can mimic these more open systems. Just look at companies like Sifteo or Lapka that have created physical products connected to software that are designed to be remixed into new applications.

Bottom Line for extending engagement and product narrative:

People embrace what they influence, so more open and transparent brands will become the most loved and talked about as well.

Questions for discussion:

How can you enable customer influence and manage user creativity without losing control and focus on development cycles and roadmap…?

Is it possible to nurture creative customer communities in parallel with developer communities? Remember the user groups that had a love/hate relationship with the developers?

How do you enable creative consumers without cannibalizing next versions?

What are the most effective tools for managing crowdsourced feedback and ideas that enable more than marketing content and engagement incentives?


Key Takeaways from the Roadmapping and Execution Panel at Startup Product Summit SF


By Michelle Sun

Here are the notes for the Roadmapping and Execution Lightning Talks and Panel at Startup Product Summit SF.  Omissions and errors are mine (please let me know if you find any, thank you!), credit for the wisdom is entirely the speakers’.

“Building a Great Product Through Communication” – Joe Stump, Co-Founder

  • Product manager’s role is to capture, communicate and distill product ideas, and mediate between business stakeholders and makers.
  • When building a product, pick two of the three: quickly, correctly, cheaply.  Joe later mentioned on Twitter that he would pick quickly and correctly, as paying for quality is a no-brainer.
  • “Want to increase innovation? Lower the cost of failure” – Joi Ito
  • Empower every developer to commit things to the product through non-blocking development (NBD).
  • Advocate the move to 100% asynchronous communication, because the current approach is broken (it needs human input to track reality) and remote teams are becoming more common.

“Raw Agile: Eating Your Own Dog Food” – Nick Muldoon, Agile Program Manager, Twitter

  • Twitter dog-foods by allowing developers to deploy to an internal server. Dog-fooding:
    • gathers real data from real (though internal) users.
    • increases the incentive to ship quality code.
    • yields better feedback.  He found that feedback in a dog-fooding environment is generally more constructive.
    • keeps momentum through a positive reinforcing loop of continuous deployment and feedback. The team gets 50-100 pieces of feedback from internal users each day.
  • How to decipher and sift through the volume of feedback: look at only the “love” feedback, then all the “hate”, discard the middle, then categorize and show it to the whole team.
  • Other important aspects of dog-fooding:
    1. Automation. Allows deploying more frequently, especially internally. “On any commit, deploy internally.” Avoid accumulating technical debt.
    2. Visibility. Record progress and share it on a wiki.
    3. Speed. Minimize cycle time (from to-do, to in progress, to done).
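The love/hate sift Nick describes can be sketched in a few lines of Python. This is purely a hypothetical illustration, not Twitter’s actual tooling: the rating field, cutoffs, and sample feedback are all invented.

```python
# Hypothetical sketch of the "love/hate" feedback triage: keep the strongest
# positive and negative feedback, drop the lukewarm middle, then group what
# remains by category to show the team.
from collections import defaultdict

def triage(feedback, love_cutoff=4, hate_cutoff=2):
    """feedback: list of dicts with 'rating' (1-5), 'category', 'text'."""
    keep = [f for f in feedback
            if f["rating"] >= love_cutoff or f["rating"] <= hate_cutoff]
    by_category = defaultdict(list)
    for f in keep:
        by_category[f["category"]].append(f["text"])
    return dict(by_category)

feedback = [
    {"rating": 5, "category": "timeline", "text": "Love the new timeline!"},
    {"rating": 3, "category": "timeline", "text": "It's fine, I guess."},
    {"rating": 1, "category": "search", "text": "Search is unusable."},
]
print(triage(feedback))  # the lukewarm rating-3 item is discarded
```

The point of discarding the middle is signal: strong reactions in either direction tell you more than polite indifference.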

“Best Practices for Architecting Your App to Ship Fast and Scale Rapidly” – Solomon Hykes, Founder & CEO, dotCloud

  • 3 things to aim for in architecting your app: speed (continuous deployment), scale, and future-proofing (be prepared for things to move very fast; avoid bottlenecks and the need to refactor every time you add a new feature).
  • What are the patterns/strategies in getting to these three goals?
    1. Be aware of trade-offs. There is no silver bullet; always trade-offs and prioritization.
    2. Trade-offs evolve over time.  Priorities change. Be aware of assumptions and revisit them from time to time.
    3. Trade-offs differ from team to team.  Be aware of bias in different teams. Always keep ownership of key decisions.
  • Put yourself in a position where you are embarrassed, and things are going to happen faster.

“Rocket Powered Bicycles: Avoiding Over and Under Engineering your Product” – Chris Burnor, Co-Founder & CTO, GroupTie & Curator, StartupDigest

  • A product connects business priorities with user experience.
  • Proposes that instead of Minimum Viable Product (MVP), think about Product: Viable Minimum (PVM).  Focus on viability.
  • A scientific method to approaching product roadmapping.
    • Idea: think about business priorities, user experience.  Do not let technical decisions drive your product.  Let product drive your technical decisions.
    • Test: Viability of the solution is whether it solves the problem it’s setting out to solve.  Determine what level of viability is suitable in different stages: GroupTie’s first viable minimum was a keynote presentation that was sent to potential customers.
      Scale of tests will vary.  Lack of big tests means the lack of breakout growth/ideas, lack of small tests means the team is doing too much.
    • Conclusion:  Debriefing phase is vital, share test results with the team and learn what it means to the idea. Testing without debriefing is like “talking without listening” in a conversation.
  • An unusual example of a PVM is Apple.  Product first: cares about user experience and business priorities.  Viability second: it just works.  Minimalism third: wait till a technology is ripe before adding to the product (no LTE for a long time, no RFID).


About The Author

Michelle Sun is a product enthusiast and python developer.  She worked at Bump Technologies as a Product Data Analyst and graduated from the inaugural class of Hackbright Academy. Prior to Hackbright, she founded a mobile loyalty startup. She began her career as an investment research analyst. When she is not busy coding away these days, she enjoys blogging, practicing vinyasa yoga and reading about behavioral psychology. Follow her on Twitter at @michellelsun and her blog.

Key Takeaways from the Design Thinking and Rapid Prototyping Panel at Startup Product Summit SF

Posted on Updated on

By Michelle Sun

I had the privilege of attending the first Startup Product Summit in SF on Feb 7, 2013. It was a great lineup of speakers and a full room of buzzing energy and great conversation.  Without further ado, I’d like to share some key learnings from each panel.

Please let me know if I omitted anything or made any errors in the references. Credit for the good stuff is entirely the speakers’ (links to Twitter handles are included on each name).

“Turning Mediocre Products into Awesome Products” – Jonathan Smiley, Partner & Design Lead, ZURB

  • Ideation and iteration can “turn mediocre products into awesome products”.
  • Discussed a full spectrum of research, from market-driven (focus groups, surveys) to user-driven (remote testing, usability testing)
  • The importance of sketching a lot: aim for speed and volume, then critique
  • Advice to the audience: “do 10 more sketches (more ideation is always better), build 1 more prototype, get 1 more round of feedback, ask 5 more customers”

“Being a UX Team of One” – Vince Baskerville, Product, Lithium & Co-Founder, TripLingo

1. Internal politics is a common challenge for a UX team/professional.  Learn to manage the expectations of different internal stakeholders and keep everyone in the loop.

2. Don’t listen to what customers are saying.  Users’ claims are often unreliable.  See what they are doing.  Understand the underlying issue.

“Validate Your MVP on Paper” – Poornima Vijayashanker, Founder & CEO, BizeeBee & Femgineer

– 2 Reasons MVPs Fail
1. Failing to figure out a simple value proposition that differentiates your product from the competition.
2. Failing to figure out who their early adopters are.

– Early adopters are people who aren’t using a competitor’s product; those who are don’t want to take the time to switch over.

– Steps on usability testing

  1. Explain the problem. What you are testing. How they are helping. Get them excited about the idea.
  2. Set expectations. Make them comfortable.
  3. Communicate intention (what exactly you are testing and what specific feedback you are looking for).
  4. Thank them for their time. Follow up regularly.

Poornima’s slides are available here.

“Everyone’s Customers Are Wrong” – Evan Hamilton, Head of Community, UserVoice

  • Data doesn’t tell the whole story.  Analytics are band-aids because we can’t watch our customers.
  • People don’t tell the whole story.  Identify who the users are and where the feedback is from.  Are they paying customers or freeloaders? Using the product in the intended way? Using the main features? Early adopters / ‘tech fanatics’ (who are not likely to stay on a product for the long haul)?
  • Combine data and customer stories.  Customer feedback and feature suggestions usually stem from a deeper issue.  Find out what the actual problem is by understanding the underlying need.
  • Don’t lose your creative mind by getting lost in a data rat-hole.  Don’t chase 1% when you can get 15%.  Not just A/B tests: try something crazy.  Try big, bold things along with incremental fine-tuning.

“Designing for Everyone: The Craft of Picking or Killing a Concept” – Miki Setlur, Product Designer, Evernote

  • Everyone uses a product in different ways.  A useful strategy is to segment stakeholders into business, partners (e.g., app stores in Evernote’s case), and users.
  • Figure out what each segment cares about most: business/partners – acquisition, retention, engagement, revenue; users – being faster, better, happier.
  • Case study on how Evernote’s design process balanced business goals (monetizing) with sensitivity to user experience and goals (finding things faster).

Other relevant points 

How to assess willingness to pay during pre-product interviews: get the first dollar within the trial period; provide a clear value proposition from the get-go.

How to get good feedback: be specific about what feedback you are looking for.  Instead of asking a general ‘what do you think of the prototype?’, ask at what stage they were confused and what was confusing.

Tips on prototyping: put more emphasis on storytelling than on illustrating.
For remote testing, use Keynote as a prototyping tool and screencast it.

On the tension between product and business goals when roadmapping a product: early-stage products should focus on the product.  Once product-market fit is reached, it makes sense to lead with business goals such as acquiring, converting, and retaining customers.
Also mentioned was a tool called Impact Mapping.



Objectively Making Product Decisions


by Joe Stump

Deciding which mixture of features to release, and in what order, to drive growth in your product is difficult as it stands. Figuring out a way to objectively make those decisions with confidence can sometimes feel downright impossible.

On November 12th, we released 1.0 to our customers. It was a fairly massive release with core elements being redesigned, major workflows being updated, and two major new features. The response has been overwhelmingly positive. Here’s an excerpt from an actual customer email:

“Well, I’ve just spent some time with your 1.0 release, and I think it’s wonderful. It’s got a bunch of features I’ve been sorely missing. To wit:

  • Triage view – a Godsend or, no he didn’t?!
  • Single-line item view – where have you been all my life?
  • Convenient item sorting icons – OMG, how did you know?
  • Item sizing, assigning, following icons everywhere – spin us faster, dad!

I’m sure there are a ton more, but these are great improvements.”

Yes, how did we know? I’m going to lay out the methodologies that we used to craft the perfect 1.0 for our users. It all begins with a lesson in survivorship bias. In short, survivorship bias, as it applies to product development, posits that you’re going to get dramatically different responses to the question “What feature would you like?” when asking current customers versus former or potential customers.


You do have an exit survey, yes? If not, stop reading this now, go to Wufoo, and set up a simple form asking customers who cancel their accounts or leave your product for input on why they left. You can take a look at ours for reference.

The problem with exit surveys and customer feedback in general is that everyone asks for things in slightly different ways. Customer A says “Android”, Customer B says “iOS”, and Customer C says “reactive design”. What they’re all really saying is “mobile”. Luckily, human brains are pretty good pattern recognition engines.

So here’s what I did:

  1. I created a spreadsheet and put groups along the top for each major theme I noticed in our exit surveys. I only put a theme up top if it was mentioned by more than one customer.
  2. I then went through every single exit survey and put a one (1) underneath each theme that the entry mentioned.
  3. I then calculated basic percentages so that I could rank the themes by what percentage of our former customers had asked for each one to be addressed.
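The same count-and-rank pass can be sketched in code. This is a hedged illustration of the technique, not our actual data: the themes, keywords, and survey entries below are all invented, and the real tally was done in a spreadsheet.

```python
# Hypothetical sketch of the spreadsheet tally: count how many exit-survey
# entries mention each theme, then rank themes by the percentage of former
# customers who raised them. (Themes, keywords, and surveys are invented.)
themes = {
    "mobile": ["android", "ios", "reactive design", "mobile"],
    "pricing": ["price", "expensive", "cost"],
    "data density": ["cluttered", "too much scrolling", "density"],
}

surveys = [
    "Too expensive and no Android app.",
    "The backlog page is cluttered.",
    "Needed an iOS client.",
]

# Put a one under each theme that an entry mentions, just like the spreadsheet.
counts = {theme: 0 for theme in themes}
for entry in surveys:
    text = entry.lower()
    for theme, keywords in themes.items():
        if any(k in text for k in keywords):
            counts[theme] += 1

# Rank themes by the percentage of former customers mentioning them.
ranked = sorted(
    ((theme, 100 * n / len(surveys)) for theme, n in counts.items()),
    key=lambda pair: pair[1],
    reverse=True,
)
for theme, pct in ranked:
    print(f"{theme}: {pct:.0f}%")
```

Note how “android”, “ios”, and “reactive design” all roll up into a single “mobile” theme, exactly the normalization step the brain does when reading raw survey text.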

Here are the results:
I know you don’t know our product as well as you know your own, so the themes might not make much sense, but allow me to elaborate on the points I found most interesting about this data:

  • Our support queues are filled with people asking for customized workflows, but in reality that doesn’t appear to be a major force driving people away.
  • 17% of our customers churn either because we have no estimates or because they can’t track sprints. Guess what? Both of those are core, existing features. Looks like we have an education and on-boarding problem there.
  • The highest non-pricing reason people were leaving was a big bucket that we referred to internally as “data density” issues.

After doing this research I was confident that we should be doubling down on fixing these UI/UX issues and immediately started working on major updates to a few portions of the website that we believed would largely mitigate our dreaded “data density” issues.

But how could we know these changes would keep the next customer from leaving?


We store a timestamp for when a customer creates their account and a separate one for when they cancel it. This is useful data to have for a number of reasons, but what I found most telling was the following:

  1. Calculate the difference between when each account was created and cancelled, in days, as an integer.
  2. Group them by month of tenure and count them, e.g. 100 customers churned in the first month, 50 in the second, etc.
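As a minimal sketch of that two-step calculation (the sample accounts below are invented, and a 30-day bucket stands in for a calendar month):

```python
# Sketch of the tenure calculation: days between account creation and
# cancellation, bucketed into months of tenure (month 1 = first 30 days)
# and counted. The sample accounts are invented for illustration.
from datetime import date
from collections import Counter

accounts = [
    {"created": date(2013, 1, 5), "cancelled": date(2013, 1, 20)},
    {"created": date(2013, 1, 10), "cancelled": date(2013, 3, 1)},
    {"created": date(2013, 2, 1), "cancelled": date(2013, 7, 15)},
]

def churn_by_tenure_month(accounts):
    """Count cancellations by month of tenure."""
    months = Counter()
    for a in accounts:
        days = (a["cancelled"] - a["created"]).days
        months[days // 30 + 1] += 1
    return dict(months)

print(churn_by_tenure_month(accounts))
```

Plotting those per-month counts gives exactly the kind of chart described next.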

You should end up with a chart that looks something like this:

It shouldn’t be surprising that the vast majority of people churn in the first two months. These are your trial users for the most part. The reason our first month is so high is another post for another day. What we really want to figure out is why engaged, paying customers leave, so let’s remove trial users and the first month to increase the signal.

We get a very different picture:
In general you want this chart to curve down over time, but you can see we had a few troubling anomalies to deal with. Namely, there are clear bumps in the churn numbers for months 5, 7, and 8.

We had a theory for why this was based on the above survey data. A large part of the “data density” issues had to do with a number of problems managing backlogs with a lot of items in them. Was the large amount of churn in months 5-8 due to people hitting the “data density” wall?


So far we’ve objectively identified the top reasons people were leaving, as well as a few anomalies that might point to customers churning for those reasons. Now we needed to verify our thesis and, more importantly, show those customers what we were cooking up and see whether our update would have been more (or less) likely to prevent them from churning.

To do that we set up an email to be sent out to customers that fit the following criteria:

  • Had created their account more than 4 months ago.
  • Had not been seen on the site in the last 2 weeks.
  • Was the person who owned the account.
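A hedged sketch of that segment filter; the post doesn’t say which tool was used, so the field names, thresholds in days, and sample customers here are all assumptions for illustration:

```python
# Sketch of the re-engagement segment: account owners whose accounts are
# more than ~4 months old and who haven't visited in the last 2 weeks.
# (Field names and sample data are invented.)
from datetime import date, timedelta

TODAY = date(2013, 11, 12)

customers = [
    {"email": "a@example.com", "created": date(2013, 5, 1),
     "last_seen": date(2013, 10, 1), "is_owner": True},
    {"email": "b@example.com", "created": date(2013, 10, 1),
     "last_seen": date(2013, 11, 10), "is_owner": True},
]

def reengagement_segment(customers, today=TODAY):
    four_months = timedelta(days=120)
    two_weeks = timedelta(days=14)
    return [c["email"] for c in customers
            if today - c["created"] > four_months
            and today - c["last_seen"] > two_weeks
            and c["is_owner"]]

print(reengagement_segment(customers))
```

Restricting to account owners matters: they are the ones who decide to cancel, so they are the right audience for a “would this have kept you?” conversation.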

I also sent this email out manually to a number of customers who fit this profile that I was able to glean from our internal database as well. I got a number of responses from customers and was able to schedule phone calls with a handful of them.

From there it was a matter of showing our cards. I would hop on Skype, walk through the new design ideas and the problems we were trying to address, and ask whether these features would have kept them from leaving in the first place. Luckily, we had been closely measuring feedback and were pleased to find that our efforts were not lost: the updates did indeed address a lot of their issues.


Making product decisions based on customer feedback can be difficult. The more you can do to increase signal over noise, gather objective metrics, and distill customer feedback the better. It’s not always easy, but it’s always worth it.

About The Author, Joe Stump

Joe is a seasoned technical leader and serial entrepreneur who has co-founded three venture-backed startups, was Lead Architect of Digg, and has invested in and advised dozens of companies.