"More than half of our ideas will deliver no value, we just don't know which half" - John Wanamaker (3, pg 78)
Product Testing Statistics
Experiment Statistics (3, pg 79)
- 65% of features are rarely or never used
- At Google and Bing, only 10 – 20% of experiments generate positive results (Harvard Business Review)
- At Microsoft, 1/3 of experiments have positive results, 1/3 neutral results, and 1/3 negative results
Creating a Product Test
Focus on value (5, pg 78)
- Get to the point – ignore the navigation, login, etc. if it does not help you determine the value
- Use a clear call to action – give the user a clear way to show that they value your solution like signing up for it
- Prioritise ruthlessly – don’t hold on to invalidated solutions
- Stay agile – work in a medium that allows you to make updates easily as feedback comes in fast
- Don’t reinvent the wheel – use existing systems like email, forums etc to save work
- Measure behaviour – observe and measure what people do (behaviour trumps opinion here)
- Talk to users – understand why they are behaving this way
Truth Curve (5, pg 80)
The amount of effort you put into your MVP should be proportional to the amount of evidence you have that your idea is a good one.
The X axis shows the level of investment you should put into your MVP.
The Y axis shows the amount of market-based evidence you have about your idea.

Product Testing Grids
|  | Qualitative Tests | Quantitative Tests |
| --- | --- | --- |
| Marketing Tests | Marketing Materials | Landing Page/Smoke Test, Explainer Video, Ad Campaign, Marketing A/B Tests, Crowdfunding |
| Product Tests | Paper Prototype, Wireframes, Mockups, Interactive Prototype, Wizard of Oz, Concierge, Live Product | Fake Door/404 Page, Product Analytics and A/B Tests |
|  | Qualitative Tests | Quantitative Tests |
| --- | --- | --- |
| Behavioural | Usability Testing | A/B Testing, Analytics |
| Attitudinal | User Interviews | Surveys |
Qualitative Marketing Tests
Marketing Materials (1, pg 93)
- To understand which benefits resonate with customers
- To understand how they react to different ways of showing the benefits
- The aim is to understand how they respond to the marketing material and why
- Marketing material can be a landing page, video, advert, or email
Quantitative Marketing Tests
Landing page/ smoke test (1, pg 94)
- Traffic is directed to a landing page
- On this page visitors are asked to show interest (e.g. via a sign-up button or a plans and pricing page)
- There is no product yet
- A ‘coming soon’ message is often displayed to those who show interest (see the sketch below)
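To make the mechanics concrete, here is a minimal TypeScript sketch of the sign-up step, assuming an in-memory array as a stand-in for real storage; the function and message are made-up examples, not taken from the referenced books:

```typescript
// Smoke-test sign-up handler (illustrative sketch).
// In a real test the email would go to a database or mailing-list
// service; an in-memory array stands in for that storage here.
const interestedEmails: string[] = [];

// Record that a visitor clicked "sign up" and return the message
// shown to them. There is no product behind this yet.
function recordInterest(email: string): string {
  interestedEmails.push(email);
  console.log(`Sign-ups so far: ${interestedEmails.length}`);
  return "Thanks for your interest - we're launching soon!";
}

// Example: a visitor submits the form on the landing page.
console.log(recordInterest("visitor@example.com"));
```

The number of recorded sign-ups relative to the traffic you sent is the demand signal the smoke test measures.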
Explainer video (1, pg 94)
- Same as the landing page test
- For products that are harder to explain on a landing page (e.g. Dropbox)
Ad campaign (1, pg 94)
- As adverts don’t allow you to display much, this is more appropriate for optimising customer acquisition than for establishing product-market fit
- Can advertise to different demographics to check hypotheses about the target market
- Measure click-through rate to see which ads (and which demographics) prove more successful (see the sketch below)
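As a concrete illustration of the measurement, the sketch below computes click-through rate per demographic segment in TypeScript; the segments and counts are invented:

```typescript
// Click-through rate (CTR) per demographic segment.
// CTR = clicks / impressions; the segments and numbers are invented.
interface AdStats {
  segment: string;     // audience targeted via the ad platform
  impressions: number; // how many times the ad was shown
  clicks: number;      // how many times it was clicked
}

const campaign: AdStats[] = [
  { segment: "students", impressions: 5000, clicks: 150 },
  { segment: "freelancers", impressions: 4000, clicks: 200 },
];

// Higher-CTR segments suggest where the message (and the
// target-market hypothesis) resonates most.
for (const ad of campaign) {
  const ctr = (ad.clicks / ad.impressions) * 100;
  console.log(`${ad.segment}: CTR ${ctr.toFixed(1)}%`);
}
```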
Marketing A/B testing (1, pg 94)
- Test two alternative designs to compare how they perform
- Run the tests in parallel with 50% of the traffic going to each for simplicity (a minimal traffic-split sketch follows below)
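One common way to achieve a stable 50/50 split is to bucket visitors deterministically by hashing a visitor ID, so the same person always sees the same variant. This is a minimal sketch under that assumption, not a method prescribed by the sources:

```typescript
// Deterministic 50/50 traffic split for an A/B test.
type Variant = "A" | "B";

// Simple djb2-style string hash, used only to keep the sketch
// dependency-free; any stable hash would do.
function hash(input: string): number {
  let h = 5381;
  for (let i = 0; i < input.length; i++) {
    h = (h * 33) ^ input.charCodeAt(i);
  }
  return h >>> 0; // force unsigned 32-bit
}

// Bucket a visitor into variant A or B, roughly 50% of traffic each,
// and always the same variant for the same visitor.
function assignVariant(visitorId: string): Variant {
  return hash(visitorId) % 2 === 0 ? "A" : "B";
}

console.log(assignVariant("visitor-123"));
```

Hashing rather than randomising per page view keeps the experience consistent for returning visitors.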
Crowdfunding (1, pg 94)
- Advertising your product on a site like Kickstarter and asking people to pay for the product in advance of it being made
- Set a minimum funding threshold: do not build the product until you have raised £X
- Backers then receive the product once it has been built (i.e. they pre-order the product at a discount)
Qualitative Product Tests
Paper Prototype (5, pg 89)
Pros
- Can be created quickly
- Easily arranged and rearranged
- Cheap and easy to throw away if you are wrong
- Can be assembled with materials already found in the office
- Fun activity many people enjoy
Cons
- Rapid iteration and duplication of the prototype can become time-consuming and tedious
- The simulation is very artificial, because you’re not using the actual input mechanisms (mouse, keyboard, touch screen, etc.)
- Feedback is limited to the high-level structure, information architecture, and flow of the product
- Only useful with a limited audience
Wireframes/ Mockups/ Interactive Prototypes (1, pg 100) (2, pg 124) (5, pg 89)
- Demonstrate or show concepts to user to gauge their feedback (e.g. wireframes)
- Have an ‘ask’ as a definitive pass-or-fail criterion
- Commitment, monetary value, time, or another investment to show that they are interested
- E.g. Dropbox made a video of their concept (an advert as if they had already built it) to convince investors
- Variations in interactivity and fidelity
- (fidelity refers to how closely the artifact resembles the final product)
Pros
- Provide a good sense of the length of the workflow
- Reveal major obstacles to primary task completion
- Allow assessment of findability of core elements
- Can be used relatively quickly
Cons
- Most people will recognise that it is an unfinished product
- More attention than normal is paid to labelling and copy

Low fidelity prototype (4, pg 49)
- Start with a persona
- Draw the homepage and ask what actions the user wants to do from there
- For each action draw a box (each box is a story)
- Continue until the persona has completed their actions (including exploring edge cases) and then start with another persona

Concierge (2, pg 122)
- Deliver the end result to your customer manually
- The customer understands that it is being done manually; there is no appearance of a finished product
- Conduct with just enough users, as this is labour-intensive
Wizard of Oz (2, pg 123)
- Deliver the end result to your customer manually
- Customer is not aware that it is manual behind the scenes and thinks they are using the end product
- Tempting to leave running, since a successful test delivers real value, but it is expensive to operate manually
- Can be combined with A/B testing
Live Product (5, pg 89)
Pros
- Potential to reuse code for production
- The most realistic simulation to create
- Can be generated from existing code assets
Cons
- The team can become bogged down in debating the finer points of the prototype
- Time-consuming to create working code that delivers the desired experience
- It’s tempting to perfect the code before releasing to customers
- Updating and iterating can take a lot of time
Quantitative Product Tests
Fake Door/ 404 page (1, pg 100)
- Good for testing demand for a new feature
- Include a link or button in the product directing customers to the new feature
- The link leads to a page explaining that the feature hasn’t been built yet and asking why they would find it valuable (see the sketch below)
- Overuse will make customers unhappy
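A minimal sketch of the mechanics, assuming a hypothetical "export to PDF" feature and an in-memory array standing in for analytics storage:

```typescript
// Fake-door click handling (illustrative sketch; the feature name,
// URL and storage are made-up examples).
interface FakeDoorClick {
  featureName: string;
  userId: string;
  clickedAt: Date;
}

const fakeDoorClicks: FakeDoorClick[] = []; // stand-in for analytics storage

// Record the click, then send the user to a "coming soon" page that
// asks why they would find the feature valuable.
function handleFakeDoorClick(featureName: string, userId: string): string {
  fakeDoorClicks.push({ featureName, userId, clickedAt: new Date() });
  return `/${featureName}/coming-soon?ask=why-is-this-valuable`;
}

console.log(handleFakeDoorClick("export-to-pdf", "user-42"));
console.log(`Demand signal: ${fakeDoorClicks.length} click(s) so far`);
```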
Product A/B tests (1, pg 100)
- Used to compare performance of two alternative user experiences in your product
Qualitative Behavioural Tests
Usability testing
- Online tools can be used to give a user a task and record them completing the task
- Users are asked to talk through how easy it is to complete a task
Quantitative Behavioural Tests
A/B Testing
- Two different versions of the product are shown to different groups of users
- Differences in behaviour are tracked, e.g. conversion percentage (see the significance-test sketch below)
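To judge whether a difference in conversion percentage is more than noise, one common approach is a two-proportion z-test; the sketch below uses invented sample numbers:

```typescript
// Two-proportion z-test on conversion rates (illustrative numbers).
function conversionRate(conversions: number, visitors: number): number {
  return conversions / visitors;
}

// z = (pA - pB) / standard error of the pooled proportion
function zScore(convA: number, visitsA: number, convB: number, visitsB: number): number {
  const pA = conversionRate(convA, visitsA);
  const pB = conversionRate(convB, visitsB);
  const pooled = (convA + convB) / (visitsA + visitsB);
  const se = Math.sqrt(pooled * (1 - pooled) * (1 / visitsA + 1 / visitsB));
  return (pA - pB) / se;
}

// |z| > 1.96 corresponds to roughly 95% confidence that the variants differ.
const z = zScore(120, 1000, 160, 1000); // A: 12% conversion, B: 16%
console.log(`z = ${z.toFixed(2)}, significant: ${Math.abs(z) > 1.96}`);
```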
Analytics
- Tracking of user behaviour within the product
- The data can then be analysed to see whether the hypothesis was supported (see the sketch below)
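A minimal sketch of behaviour tracking and aggregation, with invented event names and an in-memory array standing in for an analytics backend:

```typescript
// Event tracking and a simple funnel query (illustrative sketch).
interface AnalyticsEvent {
  name: string;   // e.g. "viewed_pricing", "signed_up"
  userId: string;
  timestamp: Date;
}

const events: AnalyticsEvent[] = []; // stand-in for an analytics backend

function track(name: string, userId: string): void {
  events.push({ name, userId, timestamp: new Date() });
}

// What fraction of users who performed `fromEvent` also performed `toEvent`?
function conversion(fromEvent: string, toEvent: string): number {
  const starters = new Set(events.filter(e => e.name === fromEvent).map(e => e.userId));
  const converted = Array.from(starters).filter(id =>
    events.some(e => e.name === toEvent && e.userId === id)
  );
  return starters.size === 0 ? 0 : converted.length / starters.size;
}

track("viewed_pricing", "u1");
track("viewed_pricing", "u2");
track("signed_up", "u1");
console.log(`Pricing-to-sign-up conversion: ${(conversion("viewed_pricing", "signed_up") * 100).toFixed(0)}%`);
```

Whether the measured rate supports the hypothesis depends on the success threshold you set before running the test.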
Qualitative Attitudinal Tests
User Interviews
- One-on-one interview with a user
- Coming soon: Tips on User interviews
Quantitative Attitudinal Tests
User Surveys
- Coming soon: Tips on User Surveys
References
1. Lean Product Playbook by Dan Olsen
2. Escaping the Build Trap by Melissa Perri
3. Mastering Professional Scrum by Stephanie Ockerman and Simon Reindl
4. User Stories Applied by Mike Cohn
5. Lean UX by Jeff Gothelf