(Last change, July 2: Removed as beyond most teams' scope Eyetracking Study, Clickstream Analysis, Usability Benchmarking; Added Live-Data Prototypes, Demand Validation Test, Wizard of Oz Tests)
Our teams tasked with building products and features for The New York Times face a challenge common to many: how do we figure out what’s worth spending our time on?
The answer seems straightforward: test your ideas with real customers, leveraging the expertise of your product, UX, and engineering talent. Devise the smallest experiment you can to test a specific hypothesis, gather data and insights, and keep iterating until you know whether the problem is real and your solution will prove valuable, usable, and feasible.
As part of our efforts to adopt such a data-driven, experimental approach to product development, we recently kicked off a product discovery pilot program. Small, cross-functional teams were paired with coaches and facilitators over a six-week period to demonstrate how product discovery and Lean Startup techniques could work for real-world customer opportunities at The New York Times.
One of the first things we learned from our participants was that they wanted a "toolkit": something to help them figure out what they should be doing, asking, or making to get as quickly as possible to the validated learning, prototypes, and user tests that would have the most impact.
To help facilitate the learning process for our dual-track Agile teams, the Product Architecture team here at The Times (Christine Yom, Jim Lamiell, Josh Turk, Priya Ollapally, and Al Ming) built a "Product Discovery Activity Guide" that rolled up activities, exercises, and testing techniques from all our favorite thought leaders.
This included brainstorming exercises from Gamestorming and Innovation Games, testing techniques from traditional user research, and rapid test-and-learn tactics from Google Ventures, Eric Ries (The Lean Startup), Jeff Gothelf (Lean UX), Steve Blank (Customer Development) and our spirit guide, Marty Cagan (Inspired), among others.
Our goal was to make it not just a tool for learning how to get started, but a living document for teams to share knowledge about the process itself. What techniques worked and didn't work? What tactics did they learn elsewhere that might be worth sharing with the rest of the company?
We hope you find it useful! Whether you’d like to share what you’re doing with it, or you have suggestions (big or small) to improve it for future product generations, please let us know. (email@example.com)