If I had asked people what they wanted, they would have said faster horses.
Henry Ford
The quote that starts this post, attributed to Henry Ford, underlines a fundamental problem in product development: if what they need or want does not yet exist, or they have never used it before, customers have no way to express their needs.
Yet, the first time I had to figure out what kind of product the users of my software wanted, the most frequent advice I got from the senior engineers was: "Go to your customers and ask them: 'What do you want?' And then record the answers in the requirements document." Armed with this advice, I started talking with the people who would use my software. The discussions were an interesting and frustrating exercise. There was usually a core element of the software we were developing that was pretty obvious and whose functionality was easy to agree on. To other questions, like "Should this function be synchronous or asynchronous?", the users would just say "We don't know yet", even after spending some time thinking about it. Or sometimes they would prioritize the requirements in a way that would have clearly hurt their work.
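As an aside for readers outside software development, the synchronous-versus-asynchronous question is a genuine design decision, which makes it all the harder for users to answer in the abstract. A minimal sketch of the difference, with invented function names, might look like this:

```python
import asyncio
import time

# Synchronous: the caller is blocked until the result is ready.
def fetch_report_sync() -> str:
    time.sleep(0.1)  # stands in for a slow computation or network call
    return "report"

# Asynchronous: the caller yields control while waiting and resumes later.
async def fetch_report_async() -> str:
    await asyncio.sleep(0.1)  # other tasks can run during this wait
    return "report"

print(fetch_report_sync())                # blocks for the full duration
print(asyncio.run(fetch_report_async()))  # same result, cooperative waiting
```

Whether the blocking or the cooperative style serves the users better depends on how the function fits into their workflow, which is exactly the kind of context they could not articulate in an interview.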
The idea that people cannot express their needs and desires is not specific to product development. In fact, this idea found more fertile ground in psychology, where Sigmund Freud developed the new field of psychoanalysis. People like Edward Bernays, one of Freud's nephews, started to use the idea in advertising and propaganda. Looking back through the history of ideas, we can see that everything that challenges the common sense of the day takes a long time before it becomes mainstream. The heliocentric theory and the treatment of scurvy took centuries to establish themselves. Customer cluelessness may not be such a radical idea, but it still took time for the software development community to wrap their minds around it. Anecdotal evidence from enterprise software development and public sector projects suggests that the customer is still treated as the sacred source of requirements in those areas. On the other hand, software usability experts widely accept that the focus of their work should be on observing the users' actual behaviour, rather than on what they say about the product.
Maybe software development is simply a young industry. Let's see how the discovery of customer cluelessness has recently changed a human endeavour that is thousands of years old: the food industry.
For that I will rely heavily on Malcolm Gladwell's stories. The first story is about the journey of Howard Moskowitz, the man who brought unexpected pleasures to spaghetti lovers. In the 1970s, "assumption number one in the food industry", says Gladwell, "used to be that the way to find out what people want is to ask them." Moskowitz had other ideas. "The mind knows not what the tongue wants", Moskowitz told Gladwell, and that was the approach he took when Campbell's hired him to save their spaghetti sauce business. Instead of gathering people and asking them how they liked their spaghetti sauce, Moskowitz created recipe variants based on every parameter of the sauce he could think of: sweetness, thickness, spiciness, cost, etc. He settled on 45 variants and took them on the road for people to taste. After processing the data he made a discovery that startled the people at Campbell's: a third of the people liked a whole new category of spaghetti sauce, extra chunky, that no manufacturer was producing. Campbell's went on to make a lot of money with their Prego extra-chunky spaghetti sauce, and Moskowitz went on to create new categories of spaghetti sauce when he was eventually hired by Campbell's main competitor, Ragú.
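The mechanics of Moskowitz's approach can be sketched in a few lines. The parameter names and levels below are invented for illustration, and the actual study certainly used a more careful experimental design than random sampling:

```python
import itertools
import random

# Hypothetical parameter levels; the real study's design is not public.
parameters = {
    "sweetness": ["low", "medium", "high"],
    "thickness": ["thin", "regular", "chunky"],
    "spiciness": ["mild", "medium", "hot"],
    "visible_solids": ["none", "some", "lots"],
}

# Full factorial: every combination of every level.
all_variants = [
    dict(zip(parameters, combo))
    for combo in itertools.product(*parameters.values())
]
print(len(all_variants))  # 3*3*3*3 = 81 combinations

# Too many to taste, so keep a manageable subset to take on the road.
random.seed(0)
tasting_set = random.sample(all_variants, 45)
print(len(tasting_set))  # 45
```

The point is the shape of the method: enumerate the parameter space first, then let the tasters, not the interviews, decide which regions of it matter.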
To end this section about traditional requirements gathering in IT, I turn to Dave Snowden's list of possible problems that arise from system analysts interviewing the users:
- In general users don't know what they want until they get it; then they want something different
- This is in part because the interview process can only really explore what they don't like about the current state of affairs, a sort of need defined by negation of the present
- Systems analysts, like any interviewers, start to form subconscious hypotheses after a fairly small number of interviews and then only pay attention to things that match those hypotheses in subsequent interviews
- Outliers, or odd demands, are often ignored, while these may present some of the best opportunities for radical innovation and improvement
- Most radical new uses of technology are discovered through use, not through request, and more often than not accidentally (think Facebook, Twitter, etc.)
- People only know what they know when they need to know it; it requires a contextual trigger which cannot be achieved in an interview
- Early-stage problems in roll-out are easily ignored, or more frequently not reported, as they seem minor, but then they build up and result in major setbacks.
If there are fundamental limits on users' knowledge about new products, how have product developers worked around the issue?
I have already implied one method above: create several variants of your product and ask the users to test them. That is what usability studies do, and that is what Howard Moskowitz did. In his book Blink, Malcolm Gladwell tells a few stories that should make a product developer careful when testing product variants:
- The way you structure the test influences the results. The Pepsi Challenge, a test in which soft drink tasters consistently chose Pepsi over Coca-Cola, was a "sip test" and Gladwell shows that people generally prefer the sweeter drink in a sip test. However, the results are different when they drink a whole can or when they drink a larger quantity over a longer period of time.
- Faced with too many choices, customers are not able to make a decision. Offering 6 varieties of jam at a tasting corner led to more jam sales than offering 24 varieties. In his spaghetti sauce tasting experiments, Howard Moskowitz asked people to eat between eight and ten small bowls of different spaghetti sauces, rather than taste all 45 varieties he had created.
- Asking non-experts to explain why they prefer a certain product changes their preferences. When taste experts and non-experts were asked to rank strawberry jams, they produced very similar results as long as the non-experts did not have to explain their choices. However, the non-experts' ranking was completely scrambled when they had to explain their choices. Remember the usability study principle: observe what the users do, not what they tell you about the product!
There are at least a couple of other limitations for this method:
- You need at least a minimum of knowledge about the product in order to produce rough variants or mock-ups.
- The usability testing approach requires observation of the user behavior, which might sometimes not be possible due to physical constraints or privacy concerns.
How about the cases when the product developers do not have the right information to build product variants? Two groups, starting from different directions, came up with solutions that share some similarities.
The first group, led by Clayton Christensen, started from business and marketing theory. In The Innovator's Solution, Christensen argues that, while there is correlation, there is no proven causality between customer demographics and product sales. "The fact that you're 18 to 35 years old with a college degree does not cause you to buy a product". People "hire" products that help them do a "job". Christensen's classic example is the story of a fast food restaurant trying to improve its milk shake sales.
Its marketers first defined the market segment by product—milk shakes—and then segmented it further by profiling the demographic and personality characteristics of those customers who frequently bought milk shakes. Next, they invited people who fit this profile to evaluate whether making the shakes thicker, more chocolaty, cheaper, or chunkier would satisfy them better. The panelists gave clear feedback, but the consequent improvements to the product had no impact on sales.
The researcher from Christensen's group (the JTBD group) used a different approach. Instead of focusing on the product parameters, he spent his time trying to establish the context in which the milk shakes were bought: the time of day, what else the customers bought, etc. Already at this step the researcher observed an interesting pattern:
He was surprised to find that 40 percent of all milk shakes were purchased in the early morning. Most often, these early-morning customers were alone; they did not buy anything else; and they consumed their shakes in their cars.
That pattern did not yet tell him what job the milk shakes were hired to do, so he interviewed the early morning customers. The focus was again on the context in which the product was used rather than on the product itself, and his efforts were fully repaid when he figured out the pattern:
Most bought it to do a similar job: They faced a long, boring commute and needed something to make the drive more interesting. They weren't yet hungry but knew that they would be by 10 a.m. [...] They were in a hurry, they were wearing work clothes, and they had (at most) one free hand.
The key activities in this method are the data gathering that establishes the context and the customer interview process. In the interviews, the subjects describe in detail (as much as one can gather from a busy commuter early in the morning) the context of their decision to buy the product and their usage of the product. They are not asked for opinions about the product parameters, thus avoiding the problem mentioned earlier of customers changing their choices when they have to explain them. A demonstration of the interview technique can be found at jobstobedone.org.
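The context-gathering step lends itself to a simple sketch. The purchase records below are invented, but they mirror the fields the researcher paid attention to, and the analysis slices by context rather than by product parameters or demographics:

```python
# Invented purchase log; each record captures the *context* of a sale:
# time of day, whether the customer was alone, what else they bought.
purchases = [
    {"item": "milkshake", "hour": 7,  "alone": True,  "also_bought": []},
    {"item": "milkshake", "hour": 8,  "alone": True,  "also_bought": []},
    {"item": "milkshake", "hour": 12, "alone": False, "also_bought": ["fries"]},
    {"item": "milkshake", "hour": 15, "alone": False, "also_bought": ["burger"]},
    {"item": "milkshake", "hour": 7,  "alone": True,  "also_bought": []},
]

# Slice by context: what share of sales happen in the early morning,
# and how many of those customers are alone and buy nothing else?
morning = [p for p in purchases if p["hour"] < 10]
share = len(morning) / len(purchases)
alone_and_nothing_else = sum(
    1 for p in morning if p["alone"] and not p["also_bought"]
)

print(f"morning share: {share:.0%}")                  # 60% in this toy data
print(f"morning, alone, nothing else: {alone_and_nothing_else}")
```

A pattern like this does not yet reveal the job the product is hired for, but it tells the researcher whom to interview and which context to ask about.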
In a discussion with Horace Dediu, from
Asymco, Bob Moesta, one of the people who worked in the '90s with Clayton Christensen, summarizes the consumer interview process:
The key is both getting the ethnography, of understanding what they [the customers] are doing and then getting them to tell stories and boil the essence of the story down and build a theory of why and how they consume.
The second group (the CE group), centered around Dave Snowden, from Cognitive Edge, has its roots in the study of organizations, "drawing on anthropology, neuroscience and complex adaptive systems theory". Their techniques for requirement discovery are also based on data collection through user narratives, followed by processing for pattern detection, with context awareness playing a very important role in the process. There are differences, though, in both the data collection and the pattern detection methods. Concerned about system analysts introducing their own bias into the data collection, the CE group focuses on getting both the data and its interpretation directly from the users, through methods like Anecdote Circles, The Future, Backwards, or Archetype Extraction.
Another difference between the two groups is the CE group's approach to scaling the methods. Processing the stories manually is an obstacle to scaling the narrative methods. The solution offered by Cognitive Edge is a proprietary tool, called SenseMaker®, which can be used for collecting and indexing a large number of stories. The stories are indexed by their authors, based on a reference framework created before the collection of the stories. A pre-existing reference framework means that the analyst bias is not completely removed, but its effect is reduced through a design that is broad enough to capture conflicting views and smart enough to avoid leading the users to the "correct" answer. The indexed data is then used for detecting patterns, and the stories provide the context for interpreting those patterns. The CE methods have not been developed specifically for product requirements gathering, so one can find diverse examples of their applications on Cognitive Edge's articles and case studies pages.
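SenseMaker® itself is proprietary, but the underlying idea of author-indexed stories can be sketched roughly. The framework dimensions and the stories below are invented; the important property is that the authors, not the analyst, assign the tags:

```python
from collections import Counter
from dataclasses import dataclass

# Hypothetical signification framework: the analyst defines the dimensions
# before collection; the story's *author* supplies the values (self-indexing).
FRAMEWORK = {
    "tone": ["positive", "neutral", "negative"],
    "about": ["people", "process", "technology"],
}

@dataclass
class Story:
    text: str
    tags: dict  # author-chosen value for each framework dimension

stories = [
    Story("The new form takes twice as long to fill in.",
          {"tone": "negative", "about": "process"}),
    Story("Support walked me through the setup in minutes.",
          {"tone": "positive", "about": "people"}),
    Story("The export button silently fails on large files.",
          {"tone": "negative", "about": "technology"}),
]

# Pattern detection: count how the stories cluster across two dimensions.
# The analyst then reads the stories behind a cluster for context.
pattern = Counter((s.tags["tone"], s.tags["about"]) for s in stories)
for (tone, about), n in pattern.most_common():
    print(f"{tone:>8} / {about:<10} {n}")
```

With thousands of stories instead of three, clusters in such a table become the signal, and the stories inside each cluster supply the interpretation.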
The role of the analyst is an important difference between the JTBD and CE methods. Since both methods have the support of successful cases, I can only conclude that the role of the analyst in the requirements gathering depends on the context. The
Cynefin framework developed by Dave Snowden might provide the clue to how to define the analyst's role. If the problem is in the complicated domain, expert analysis will be a very efficient tool for explaining the patterns. If the problem is in the complex domain, allowing the patterns to emerge will be the only solution. However, depending on the experts and their expertise, problems might appear to be complex to some experts while others will consider them complicated. The amount and the quality of the data can also influence how a problem is viewed.
The JTBD and CE methods help in discovering problems, needs or opportunities for product development. There is a long road from discovery to end product, and it might require several iterations of product development and user feedback, a process that Dave Snowden describes as co-evolution.
For software developers the use of narrative approaches sounds like good news. User stories are a requirements capture method that has gained a lot of popularity lately. However, there are a few problems in the way user stories are captured by most software development teams, and they are highlighted by Jim Coplien:
What Alistair [Cockburn] originally meant by user story is something like the following:
"Susan, who is a doctor and has two children, is a shift worker in a hospital. She works different hours on weekends than during the week. She wants to set up her alarm to get up at the right time. Sometimes she has to work night shifts or two shifts a day. She wants to set her alarm for an entire week in time, because she knows her work schedule a week in time, so she could wake up at the right time to go to work."
There is a user... and a story, hence the term user story.
Compare this to "I, as a user, want to set up my alarm a week in advance."
Not that I am complaining. Without the oversimplification of user stories we would never have had the wonderful world of Cat User Stories:
Notes:
- This post owes a lot to the days I spent in Feb. 2013 in Amsterdam, attending the Practitioner Foundation course organized by Cognitive Edge and taught by Tony Quinlan. Scattered questions and answers started to converge for me after three days spent around stimulating subjects and smart people.
- Malcolm Gladwell is as good a speaker as he is a writer, so it's worth listening to him telling the stories. Check this and this.