Just Enough Research

Just Enough Research

Author

Erika Hall

Year
2013

Review

I found this book incredibly useful when I first came across it, although I rarely recommend it now. If you’re seeking an intro to UX research or experimentation, I recommend ‘Talking to Humans’ and ‘Testing with Humans’ instead. This book tries to cover too much ground, and therefore has to stay at the surface level.

The chapter on Organisational Research is the exception; I found it incredibly helpful. Product managers and project managers: get wise to how your organisation works.

The title itself makes a great mental model: ‘just enough of x’ is a useful lens. It reminds me of one of the seven wastes of lean manufacturing, ‘over-processing’.


Key Takeaways

The 20% that gave me 80% of the value.

Different types of research and when to use them. What you need to find out determines the type of research you need to conduct…
  • Generative or Exploratory Research is the research you do before you even know what you’re doing. The problem is badly defined. Interviews, field observation, reviewing existing literature.
  • Descriptive & explanatory Research is observing and describing the characteristics of what you are studying. Ensures you’re not designing for yourself. You’re looking for the best way to solve a problem. Learn about complementary activities and competing solutions.
  • Evaluative Research is “Are we getting close?” With a clear idea of the problem you can define and test potential solutions… to make sure they work and meet requirements. Ongoing and iterative as you move through design / development.
  • Causal Research is “Why is this happening?”. Looking at usage (cause and effect can be tricky).
Research can be viewed as a threat or a nuisance - you’ll face objections. Be ready to advocate for research and get buy-in before you start
Objections you’ll hear:
  • We don’t have time → then we don’t have time to be wrong about key assumptions either
  • We don’t have expertise → stick to simple techniques
  • We don’t have budget → reduce scope
  • The HiPPO (highest paid person’s opinion) will tell us what to do anyhow → we need to fight dictatorial culture
  • You can find out everything you need to in beta → better to know some things before starting
  • Research will change the scope → helps stop unexpected complexity later
  • You’ll stop/slow innovation → relevance to the real world separates invention from innovation
  • Their actual reasons are likely to be different (can’t be bothered, afraid of being wrong, uncomfortable talking to people)
  • Reminders
    • Customer service can be a treasure trove.
    • Agile describes only how to build - not what to build
    • Prioritise the highest value users.
    • Defer less urgent research and complete it while the software is being constructed.
6 types of bias: Design, Sampling, Interviewer, Sponsor, Social Desirability and the Hawthorne effect
  • Design bias → design of the studies themselves, how they are structured and conducted
  • Sampling bias → unavoidable in normal companies, careful when drawing conclusions
  • Interviewer bias → interviewers should be neutral
  • Sponsor bias → participants skew their answers when they know who is sponsoring the study; conceal your motive to preserve neutrality
  • Social desirability bias → you want honesty not ego
  • Hawthorne effect → behaviour may change when being observed
  • Research Best Practice:
    • 1) Phrase your research goal/question clearly
    • 2) Set realistic expectations with stakeholders before starting
      1. Questions to be answered
      2. Methods to be used
      3. Decisions to be informed by the findings
      4. Ask them what they hope for and tell them what to expect
    • 3) Be prepared. The better you prep, the faster and cleaner the work goes. Get your process and materials in order before you start. Setup so easy to reuse / replicate.
    • 4) Take good notes → If you don’t take notes it didn’t happen
    • 5) Allow sufficient time for analysis
  • How Much Research is Enough?
    • Try to avoid unnecessary research.
    • Identify your highest priority questions and your assumptions that carry the biggest risk. Don’t investigate assumptions with little risk or impact.
    • These ‘what if’ scenarios will make you want to do some research. What if, six months from now, we realise:
      • We are solving the wrong problem
      • We were wrong about how much organisational support we have for this project
      • We don’t have the competitive advantage we thought we had
      • We were working on features that excited us, not the customer
      • We failed to reflect on what is actually most important to our users
      • Our users don’t understand the labels we’re using
      • We missed a key aspect of our users’ environments
      • We were wrong about our prospective users’ habits and preferences
    • Research can validate assumptions and reduce risk, but it can also help you discover opportunities you haven’t thought of
    • The answer to how much is enough… is the point at which you feel sufficiently informed and inspired. The point at which you have a clear idea of the problem you want to solve and enough information to solve it
The steps of systematic inquiry
  1. Define the problem. Useful research depends on a clear problem statement. In design you’re solving for user needs and business goals. In research you’re solving for a lack of information.
    • You need to know when you’re finished. Base the statement on a verb like ‘describe’, ‘evaluate’ or ‘identify’.
      • Example: We will describe how our users currently do x
      • Example: We will evaluate the relative advantages and disadvantages
      • Verbs like ‘understand’ or ‘explore’ should be avoided as they’re open ended
    • You may have a main statement and some sub-questions / points you’d like to cover
  2. Select the approach. Your problem statement should drive you toward a type of research.
    • Add constraints like time, money, expertise, resources.
    • If your question is about users you’ll be doing user research (ethnography).
    • If you want to evaluate an existing solution you’ll be doing evaluative research.
  3. Plan and prepare for the research. Make somebody responsible for the checklist. Sketch an initial plan - decide on time and money investment. Adapt the plan as you go. Each activity will make you smarter and more efficient
  4. Collect the data. Share with your team ASAP. The more organised you are the more effective the research will be
  5. Analyse the data. Gather the data together → look for patterns → write observations → form recommendation.
  6. Report the results. Write a summary report with one or more modules. Tailor it to the audience. Include goals, methods, insights and recommendations. Make it visual with a persona sketch, photo, sticky notes if you want.
Useful data from research is typically quotes and observations that indicate…
  • Goals (what the user wants to accomplish that your product can help them with or relates to)
  • Priorities (what is most important to the participant).
  • Tasks (actions the participant takes to meet their goal).
  • Motivators (the situation or event that starts the participant down the task path).
  • Barriers (the person, situation, or thing that prevents the participant from doing the task or accomplishing the goal).
  • Habits (things the participant does on a regular basis).
  • Relationships (the people the participant interacts with when doing the tasks).
  • Tools (the objects the participant interacts with while fulfilling the goals).
  • Environment (what else is present or going on that affects the participant’s desire or ability to do the tasks that help them meet their goals).

Organisational Research: Helps you understand your organisation.

You can have more influence if you understand the inner workings of your company. Your success depends on how well you understand the organisation.
  • A design project is a series of decisions, getting the right decisions made in a complex organisation is difficult
  • The success of a product or service depends on how well it fits into everything else the organisation is doing, and how well the organisation can and will support it.
  • An organisation is just a set of individuals and a set of rules, explicit and implicit. Once you understand that environment, you’ll be able to navigate it and create the best possible product.
  • Most user-centered design studios will interview client stakeholders (people who will be directly affected by the outcome of the project) as part of gathering requirements
  • In organisational research, the observer effect can be a force for positive change. Ask hard questions and they’ll have to come up with answers - leading to reflection
  • Asking the same question of different people will reveal crucial differences in motivation and understanding
  • Research should include anyone without whose support your project will fail. Executives, managers, subject matter experts, staff in various roles, investors and board members
Do organisational research to learn…
  • Your Stakeholders
    • Roles, attitudes and perspectives
    • Levels of influence, interest and availability
    • How to get buy-in from them and who to consult
    • How your work impacts others → do they stand to benefit or suffer with the success or failure of your work
    • Likelihood that any of them have power or potential to prevent project success
  • Understand the systems and workflows - How workflow has to change
  • Resources to build, launch and support
  • Business requirements and constraints
  • Do you agree on goals and definition of success
  • How people outside the team view your project
  • Company priorities, the strategy and how aligned day-to-day decision-making is to them
  • The approval process
  • How to tailor your design process
  • What to do with your organisational research
    • Create a clear statement of what you need to accomplish to make the project a success
    • Connect what you’re doing to the goals of the business
    • Create a clear set of requirements. They should meet these 7 criteria…
      • Cohesive. The requirements all refer to the same thing
      • Complete. No missing information or secret requirements that didn’t make it to the list.
      • Consistent. The requirements don’t contradict each other.
      • Current. Nothing obsolete.
      • Unambiguous. No jargon. No acronyms. No opinions.
      • Feasible. Within the realm of possibility on this project
      • Concise. Keeping them short and clear will increase the chances that they are read, understood, remembered, and used. Aim for no more than two to three pages.
      What to include…
      • Problem statement and assumptions
      • Goals
      • Success metrics
      • Completion criteria
      • Scope
      • Risks, concerns and contingency plans
      • A workflow diagram
      • Verbatim quotes

User Research: Helps you identify patterns and develop empathy

  • Ethnography = “What do people do and why do they do it?” User research = Ethnography + “What are the implications for the success of what I am designing?”
  • Ethnography is a set of qualitative methods to understand and document the activities and mind-sets of a particular cultural group who are observed going about their ordinary activities in their habitual environment.
Do ethnographic research to…
  • Understand the needs and priorities of your subjects, and the context in which they will interact with what you’re designing. How do they currently solve the problem?
    • Relevant habits, behaviours, and environments
  • Replace assumptions about what people need and why with actual insight.
  • Understand their mental models (internal concepts of associations with a system or situation).
  • Create design targets (personas) to represent the needs of the user in all decision-making.
  • Hear the users’ language and develop the voice of the product
  • This is very different from gathering opinions. It isn’t just surveying or polling. And it’s definitely not focus groups.
The 4 D’s of ethnography (Deep dive, daily life, data analysis, drama)
  • Deep dive - Get to know a small but sufficient number of representative users very well (in their shoes)
  • Daily life - Get into the field where things are messy and unpredictable. Behaviour changes with context & circumstances
  • Data analysis - take time to gain real understanding, and get your team involved in creating useful models.
  • Drama - narratives help everyone on your team rally around and act on the same understanding of user behaviour - Personas will keep you honest. You design for them, not for you or for your boss
  • Talking to, or observing representative users directly in their environment reduces the risk of bad assumptions and bias
  • Assumptions are insults. You risk being wrong and alienating people. Designing for yourself can bake discrimination into your product. Be cautious with assumptions about age, gender, ethnicity and sexuality → they can create barriers that don’t serve your goals or ethics.
  • Your challenge as a researcher is to figure out how to get the information you need by asking the right questions and observing the right details.
    1. The first rule of user research: never ask anyone what they want!
      • People want to be liked more than they want to tell the truth about their preferences. Many people lie, many lack sufficient self knowledge to give the truth
      • Most people are poor reporters and predictors of their own preferences and behaviour when presented with speculative or counterfactual scenarios in the company of others
    2. Mindset: “How should what I’m designing interact with this person?” Try not to be judgmental.
On Interviewing Humans…
  • Goal: Learn about everything that might influence how the users might use what you’re creating
    • You’re learning a new subject: that person
To be a good interviewer you need to be able to shut up.
  • Learn to listen.
  • Don’t try to be liked.
  • Don’t try to demonstrate your smarts
  • You should know nothing. You’re learning a new subject: that person
Preparation
Create an interview guide
  • Brief description and goal of study (you can share this with the participant)
  • Basic factual / demographic characteristics of the user - provide context
  • Ice breaker / warm up questions
  • The questions or topics that are the primary focus of the interview
  • Gather some background information on the topic and people you’ll be discussing, particularly if the domain is unfamiliar to you.
On recruiting participants…
  • Locate → attract → screen → acquire
  • Ask the right people the wrong things, and you’ll still learn something. The wrong people won’t help you.
What makes a good participant?
  • Shares the concerns and goals of your target users
  • Embodies key characteristics of your target users, such as age or role
  • Can articulate their thoughts clearly
  • Is as familiar with the relevant technology as your target user.
  • In theory, recruiting is just fishing. Decide what kind of fish you want. Make a net. Go to where the fish are. Drop some bait in the water. Collect the ones you want. It isn’t actually that mysterious, and once you get the hang of it, you’ll develop good instincts.
  • Match participants on actual behaviour (people who ride bikes) not what people say (people who say they are interested in bike riding)
  • Knowledge, skills, access, behaviours,
3 Act interview structure
  • Introduction: Smile, gratitude, purpose, topic, how info shared, obtain permission, opportunity for them to ask questions
  • Body: Open ended questions, probe (tell me more about that), leave silences. List of questions as a checklist not a script. Keep it natural
  • Conclusion: Thank them, and give them a chance to ask any final questions
Conducting the interview
  • You are the host and the student
  • Build rapport, put them at ease, make them comfortable
  • Absorb everything they have to say
  • Only say things to redirect, keep things on track
    • Breathe.
    • Practice active listening.
    • Keep an ear out for vague answers. Follow up with… why is that? Tell me more about that?
  • Avoid talking about yourself
Checklist Ethnography Field Guide from Helsinki Design Lab
  • Welcoming atmosphere / make participants feel at ease
  • Listen more than you speak
  • Take responsibility to accurately convey the thoughts and behaviours of the people you are studying
  • Conduct your research in the natural context of the topic you’re studying
  • Start each interview with a general description of the goal, but be careful of focusing responses too narrowly
  • Encourage participants to share their thoughts and go about their business
  • Avoid leading questions and closed yes/no questions. Ask follow-up questions
  • Prepare an outline of your interview questions in advance, but don’t be afraid to stray from it
  • Whenever possible, snap photos of interesting things and behaviors
  • Also note the exact phrases and vocabulary that participants use
  • Pay attention after you stop recording. You might get a valuable revelation.
Offer enough information to set the scope for the conversation, but not so much that you influence the response.
  • Tell me about your job
  • Walk me through a typical week in your life
  • How often are you online?
  • What computers or devices do you use?
  • When do you use each of them?
  • Do you share any of them?
  • What do you typically do online?
  • What do you typically do on your days off?
  • How do you decide what to do?
  • Tell me about how your children use the internet
  • How do you decide what to do on your days off with your kids?
  • What are your particular non-work interests? What do you read online besides the news?
  • How frequently do you visit museums in your town? Which ones?
  • What prompts you to go?
What to do with the data you collect
  • The interview is the basic unit of ethnographic research
  • Analyze them together to find themes, including user needs and priorities, behavior patterns, and mental models
  • Note specific language and terms you heard
  • If doing generative research, look to the needs and behaviours you discover to point out problems that need solving
  • Turn the clusters around user types into personas that you can use for the life of the product or service you’re working on
Contextual Inquiry / Observation Studies
  • You enter the participant’s actual environment and observe as they go about the specific activities you’re interested in studying.
  • See behaviours in action, learn the small things you won’t hear about in an interview, such as unconscious and habitual work-arounds
  • Contextual inquiry is useful for developing accurate scenarios, stories of how users interact with potential features, identifying aspects of the environment that affect how a product is used.
  • Settle, interview at a time that's least disruptive, observe and summarise (and repeat back to user)
Don’t use focus groups - they are the antithesis of ethnography.
  • Focus groups don’t provide insight into behaviour or the user’s habitual context. They are only good for identifying future ideas to be researched. The artificial environment bears no resemblance to the context in which your product would actually be used. It’s a performance that invites social desirability bias and gets in the way of finding out what people need and how they behave outside the group dynamic. Participants are more likely to proclaim or conceal behaviours for the benefit of those around them.

The talking (and watching) cure

  • Accept no substitute for listening to and observing real people who need to do the things your design is meant to help them do
  • A few interviews could change everything about how you approach your work. The information you gather will pay dividends as you gather and examine it, grounding your design decisions in real human needs and behaviors. You will develop powerful empathy that can inspire you to find creative, effective solutions

Deep Summary

Longer form notes, typically condensed, reworded and de-duplicated.

Competitive Research

Competition = anything considered or used that solves the problem you want to solve (or anything that helps people avoid that problem)
  • If you aren’t working on something that solves a real problem or fills a real need, then your competition is anything else that anyone does with their time and money.
  • The hardest competitor is the one your potential customers are using now. Switching cost and inertia are powerful. You have to be better enough to justify change.
  • You’re also competing for the attention and mindshare of your users.
  • Goal: See how other people are solving similar problems, and identify opportunities to offer something uniquely valuable
    • What matters to our customers? How are we better at serving that need than any competitor? How can we show our target customers our product is the superior choice?
SWOT Analysis
  • Strengths and Weaknesses → User research
  • Threats and opportunities → Competitor research
SWOT matrix example:
  • Internal + Positive → Strengths: reputation; excellent staff
  • Internal + Negative → Weaknesses: internal design resources are more exhibit focused than online technology focused
  • External + Positive → Opportunities: community desire for family-friendly weekend activities; more dads are in charge of Saturday activities
  • External + Negative → Threats: competition for attention; schools are cutting back on field trips
Competitor Audit:
  • Once you’ve identified a set of competitors and a set of brand attributes, conduct an audit to see how you stack up
  • Add in competitors mentioned in your interviews or that appear in web searches
  • Identify which bits of their work are most relevant and accessible. For each competitor:
    • Positioning? Offer?
    • Targeting? Vs your target audience
    • Key differentiators. What makes them uniquely valuable to their target customers?
    • To what extent do they embody your +ve/-ve attributes?
    • User needs / wants
    • What are they doing well / badly
    • Given all that - opportunities to beat, things to adopt or take into consideration
Brand Audit
  • Take a good look at your own brand
    • Does it set expectations correctly for the whole experience?
  • Brand = reputation - signify your identity & reputation to current and potential customers
  1. Attributes: Characteristics to be associated with the product (and those to be avoided)
  2. Value proposition: what value you offer, how you communicate this
  3. Customer perspective: what associations do they have with your brand
  • Competition often drives how important a brand is, and your positioning
  • Name: unique, unambiguous, easy to spell and say
  • Logo: Wordmark, lockup, app icon, favicon.
    • List all the contexts users encounter it
    • Consider competitors
    • Will it ever appear on its own?
Usability test the competition
  • Don’t just test yours, test the competitors
  • Helps you understand their strengths and weaknesses directly from the user’s point of view
  • How users conceptualise the tasks
  • Spot opportunities
  • Competitor landscape moves quickly → new options are appearing and product categories are collapsing every day. An accurate, user-centred perspective of your comparative strengths and weaknesses will help you focus your message and hone your image.

Evaluative Research

  • Is testing how well your design/solution works for your audience. Best done before a large public launch. Helps you assess the merit of your design. The approach will depend on project stage.
  • A Heuristic analysis is a quick and cheap way to identify potential issues. 2 colleagues can do it in an hour. It’s a good way to deal with obvious issues in early prototypes before bringing in users.
  • Heuristic Analysis - Score a product against the following 10 principles:
    • System status visibility - appropriate feedback
    • Match between system and the real world - familiar language and conventions
    • User control & freedom - exits, undo, redo
    • Consistency and standards - things that appear the same should behave the same
    • Error prevention - Help users avoid errors
    • Recognition rather than recall - options should be visible. Instructions findable. Don’t make the user remember information
    • Flexibility and efficiency of use - Support shortcuts for expert users
    • Aesthetic and minimalist design - avoid providing irrelevant information
    • Help users recognise and recover from errors - Error messages should be helpful
    • Help and documentation - should be usable without documentation, but help should still be available
  • You can test a competitor’s service or product. You can test your early sketches.
  • Once live you can look at quantitative data (site analytics) to see how people are actually interacting and if that meets your expectations
    • Numbers tell you what’s happening. Talking to individuals helps you understand why it’s happening.
  • Not a substitute for usability testing

Usability Testing

  • The easier it is for your customers to switch to an alternative, the more important usability is. The more complex the system, the more work is required to ensure it’s usable.
Usability is an attribute defined by 5 components:
  • Learnability → How easy is first time use
  • Efficiency → Once learned, how quickly can they perform tasks
  • Memorability → when returning after a period, how quickly can they re-establish proficiency
  • Errors → How many errors do users make, how severe are those errors, how easily can they recover from those errors?
  • Satisfaction → how pleasant is it to use the design?
Cheap tests first, expensive tests later. Don’t use expensive tests to find out things you can discover with cheap tests.
  • Start with paper not prototypes.
  • Start in the office not in the field
  • Test with general audience before a specific audience
  • Test a competitors product before you put pen to paper
  • Frequency of testing should depend on how quickly design decisions are being made
  • Do it as you go - not just before launch. Build it into your design / build workflow. Create testing process and checklist that includes all of the information and equipment you need.
  • You need a … Plan, prototype or sketch, 4-8 participants, facilitator, observer, a way to document, a timer
Usability test plan
  • Revolves around tasks
    • Persona and journey led
    • Feature scenarios and tasks led
  • Provide a scenario to the user before giving them the task (you want to do x)
  • Not all tasks are super important - is it a deal breaker?
  • The test plan includes:
    • What you’ll do, how you’ll conduct the test. Which metrics you’ll capture, the number of participants, which scenarios you’ll use.
      • Objectives of the test: what are you testing and what state is it in?
      • Methodology
      • Participants and recruiting
      • Procedure.
      • Tasks
      • Usability goals (a small calculation sketch follows this section), e.g.:
        • Completion rate (the percentage of tasks the user was able to complete)
        • Error-free rate (the percentage of tasks completed without errors or hiccups)
  • Facilitators should avoid leading the user or helping them when they get lost
  • Users will blame themselves if they can’t get it to work - ask them to describe how they expected the system to work and why they had that expectation.
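The two usability goals above are simple ratios over the observed task attempts. A minimal sketch in Python, using hypothetical session data (the task names and numbers are invented for illustration, not taken from the book):

    # Hypothetical per-task results from a handful of usability test sessions.
    task_results = [
        {"task": "find opening hours", "completed": True,  "errors": 0},
        {"task": "buy a ticket",       "completed": True,  "errors": 2},
        {"task": "book a group visit", "completed": False, "errors": 1},
    ]

    total = len(task_results)
    # Completion rate: share of task attempts the participant completed at all.
    completion_rate = sum(r["completed"] for r in task_results) / total
    # Error-free rate: share of attempts completed without any errors or hiccups.
    error_free_rate = sum(r["completed"] and r["errors"] == 0 for r in task_results) / total

    print(f"Completion rate: {completion_rate:.0%}")  # 67%
    print(f"Error-free rate: {error_free_rate:.0%}")  # 33%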
Observing and documenting - Recording is good. Second person note taking is good.
  • Note the following:
    • Participants’ reactions to the task
    • How long it takes
    • Whether they succeeded or failed
    • Any terminology that was a stumbling block
    • Non-verbal frustration
    • Quotes
    • Successful / unsuccessful features
  • Use a webcam plus screen / audio recording to document sessions
  • Only use eye tracking with people who can’t verbalise what’s drawing their attention (otherwise it’s an expensive waste of money)
Prioritise results by severity and frequency, then benchmark against competitors (a small triage sketch follows the priority tiers below).
Severity
  • High: an issue that prevents the user from completing the task at all
  • Moderate: an issue that causes some difficulty, but the user can ultimately complete the task
  • Low: a minor problem that doesn’t affect the user’s ability to complete the task.
Frequency:
  • High: 30% or more participants experience the problem
  • Moderate: 11–29% of participants experience the problem
  • Low: 10% or fewer of participants experience the problem.
Priority
  • Tier 1: high-impact problems that often prevent a user from completing a task. If you don’t resolve these you have a high risk to the success of your product
  • Tier 2: either moderate problems with low frequency or low problems with moderate frequency
  • Tier 3: low-impact problems that affect a small number of users. There is a low risk to not resolving these.
  • Benchmark against your competitors to make a more compelling argument for change!
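A minimal sketch in Python of how the severity and frequency bands above can be combined into priority tiers. The 30% / 11–29% / 10% thresholds come from the notes; how the remaining severity-frequency combinations map to tiers is my own assumption, since only the three tiers above are spelled out.

    # Map a usability issue's severity and the share of participants affected
    # to a priority tier (1 = fix or risk product failure, 3 = low risk).

    def frequency_band(share_affected):
        # share_affected is the fraction of participants who hit the problem (0-1).
        if share_affected >= 0.30:
            return "high"
        if share_affected >= 0.11:
            return "moderate"
        return "low"

    def priority_tier(severity, share_affected):
        freq = frequency_band(share_affected)
        if severity == "high" and freq != "low":
            return 1  # high-impact problems that often block task completion
        if severity == "low" and freq == "low":
            return 3  # low-impact problems affecting few users
        return 2      # assumption: all other combinations land in the middle tier

    issues = [
        ("checkout button is unlabeled", "high", 0.50),
        ("tooltip uses internal jargon", "low", 0.08),
    ]
    for name, severity, share in issues:
        print(f"{name}: Tier {priority_tier(severity, share)}")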
Different types of models and analysis
Affinity Diagram - Turns research into evidence-based recommendations
  • Built bottom-up: clusters of related observations support insights, and insights support design mandates
    • Observations (e.g. Observation 1, 2, 3 in each cluster)
    • Insights (e.g. Insight A, B, C - one per cluster of observations)
    • Design Mandates (e.g. Mandate 1, 2, 3 - the evidence-based recommendations)
Personas & Scenarios - Help maintain empathy for users
  • A Persona is a fictional user archetype. Represents a group of needs and behaviours. They help you make decisions and write stories.
  • They’re a big part of user-centred design and help maintain empathy for users.
  • Design Personas are not marketing targets. They help the design process - not the business
  • Create as few as possible - while representing all relevant behaviour patterns. Consider them as a set.
  • For accurate personas, select a role that closely matches that of one of the participants you interviewed
  • Scenarios → if personas are characters, scenarios are your plots. A scenario is the story of how a persona interacts with your system to meet one (or more) of their goals.
  • Running a persona through a scenario helps you think through your design from the user’s point of view. You can use scenarios at several points in your process:
    • To flesh out requirements
    • To explore potential solutions
    • To validate proposed solutions
    • As the basis for a usability test script
Mental Models - If something is intuitive, it matches the user’s mental model
  • If something is intuitive → it matches the user’s mental model
  • We want designers to be aware of the user’s mental model.
  • Mental models are about exposing how users think.
  • Use a diagram to share with your team → and expose unmet user needs
  • Example mental model fragment - an “Evaluate Alternatives” tower with the user’s actions and decision criteria grouped into stacks:
    • Choose based on: priorities, proximity, educational potential, variety of activities, value for money
    • Avoid: some destinations, expensive destinations, complicated logistics, tourist traps, boring destinations
    • Actions: visit websites of potential destinations, read reviews on a travel website, suggest alternatives to children, discuss potential destinations with partner
  • To create a mental model
    • Do user research
    • Make an affinity diagram
    • Place affinity clusters in stacks representing the user’s cognitive space to create the model. These groups will include actions, beliefs, and feelings
    • Group the stacks around the tasks or goals they relate to
Conceptual Modelling / Site Mapping - relate tasks, content, functionality
  • You can translate the mental model into a conceptual map that relates content and functionality according to the target users’ views
  • Gap Analysis - Use mental models to identify gaps, or mismatches between what you offer and what the user needs or expects. Design features that close those gaps.
Task Analysis = Breaking a task into the steps it takes to accomplish it
  • In addition to informing the feature set and flow of an application, task analysis can help identify where content might support a user along their task path. They may take different paths
  • Just the activity of creating a model like this with the team will improve shared understanding
  • Don’t underestimate the accessibility and appeal of your analysis, visualised.
  • This task path for ticket purchase can help identify areas where the user needs specific content and functionality to meet her goal

Quantitative Research

  • Quantitative research and analysis should mainly be used for optimisation.
  • Once you can measure your success in numerical terms, you can start iterating towards it
  • Look for trends and patterns BUT be careful data can deceive - more page views could be more engagement (or a sign of user frustration)
Split test - Control and variation.
  • Select a goal
  • Create variations
  • Choose an appropriate start date
  • Run the experiment until you’ve reached a 95% confidence level (a minimal significance-check sketch follows this list)
  • Review the data
  • Decide what to do next, stick with control or switch to variation, or run more tests
  • Low traffic sites will take weeks to validate something
  • The best response to a user interface question is not necessarily a test
  • Use for tweaking and knob-twiddling—not a source of high-level strategic guidance
  • Can introduce inconsistencies
  • Local Maximum Problem: experimentation can lead to a culture of incrementalism and risk aversion. How will you ever make a great leap that might have short-term negative effects?
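One way to make the “95% confidence level” above concrete is a two-proportion z-test on the conversion rates of control and variation. A minimal sketch with made-up counts; the book doesn’t prescribe a particular statistical procedure, so treat this as one reasonable option rather than the method:

    # Two-proportion z-test for a control (A) vs variation (B) split test.
    # The conversion counts and sample sizes below are hypothetical.
    from math import sqrt, erfc

    def split_test_p_value(conv_a, n_a, conv_b, n_b):
        # Two-sided p-value for the difference between two conversion rates.
        p_a, p_b = conv_a / n_a, conv_b / n_b
        p_pool = (conv_a + conv_b) / (n_a + n_b)                # pooled rate under H0
        se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))  # standard error
        z = (p_b - p_a) / se
        return erfc(abs(z) / sqrt(2))                           # = 2 * (1 - Phi(|z|))

    p = split_test_p_value(conv_a=120, n_a=2400, conv_b=151, n_b=2380)
    print(f"p = {p:.3f}; significant at the 95% confidence level: {p < 0.05}")

Any standard stats library’s proportion test does the same job; the point is simply that “reached a 95% confidence level” means the observed difference would be unlikely (p < 0.05) if control and variation actually performed the same.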

Deep Summary

Longer form notes, typically condensed, reworded and de-duplicated.

  • Being the smart person is more fun than obeying the smart person
    • This is how people can feel if they’re just recipients of the analysis rather than participants in it
  • To select the best research tools… you need to know…
    • What decisions are in play (the purpose)
    • What you’re asking about (the topic)
  • Then you can find the best ways to gather background information, determine project goals and requirements, understand the project’s current context, and evaluate potential solutions.

Types of Research:

  • Generative or Exploratory Research is the research you do before you even know what you’re doing.
    • Interviews, field observation, reviewing existing literature
  • Descriptive & explanatory Research is observing and describing the characteristics of what you are studying.
    • Ensures you’re not designing for yourself. You’ve moved past “what is a good problem to solve?” and are now asking “what is the best way to solve this problem?”
    • What activities do people engage in which may complement or compete with your solution.
  • Evaluative Research is “Are we getting close?”
    • Once you have a clear idea of the problem you can define potential solutions.. And test them to make sure they work and meet requirements. Ongoing and iterative as you move through design / development.
  • Causal Research is “Why is this happening?”
    • Looking at usage (cause and effect can be tricky).
  • Research Roles represent clusters of tasks, not individual people. Often one person will take on multiple roles in a study
    • Author: Plans and writes study. Problem statement and questions, and the interview guide or test script
    • Interviewer / Moderator: interacts with the test participants
    • Coordinator / scheduler: schedules sessions
    • Notetaker / recorder: writes notes
    • Recruiter: screens participants and selects good subjects
    • Analyst: reviews the gathered data to look for patterns and insights. More than one person should have this role
    • Documenter: Reports findings once research complete
    • Observer: Watch or listen in
  • The research process: we’ll cover ways to organise research activities - checklists & RACI
  • Objections to research:
    • Sometimes viewed as a threat or a nuisance
    • Organisational buy-in needed to proceed
    • Get ready to advocate for your research project - before you start it
  • Objections you will hear:
    • We don’t have time - you don’t have time to be wrong about your key assumptions, though
    • We don’t have the expertise or the budget - We can reduce scope
    • The CEO is going to dictate what we do anyway - fight dictatorial culture.

One Research methodology is superior (qualitative vs quantitative)

  • What you need to find out determines the type of research you need to conduct; it’s that simple.

Why people say it won’t work:

  • You need to be a scientist: this isn’t pure science, it’s applied research. You’ll share these qualities with a scientist:
    • A desire to find out that is stronger than the desire to predict
    • Depersonalise the work
    • Good communicator and analytical thinker
  • You need infrastructure -
  • It will take too long -
  • You can find out everything you need in beta - you can find out whether features work once they’re built, but many things are better known before designing / coding
  • We know the issue / users / app / problem inside out already - unless research has been done recently a fresh look will probably be useful.
  • Research will change the scope - helps stop unexpected complexity later
  • Stopping innovation - relevance to the real world separates invention from innovation

Actual Reasons:

  • I don’t want to be bothered
  • Afraid of being wrong
  • Uncomfortable talking to people

Research in any situation:

Design happens in context, research is simply understanding that context

When doing in-house at an organisation:

  • Politics are a consideration
  • Challenging assumptions of those in power can be scary
  • Customer service is a data treasure trove
  • You need to understand how product and marketing decisions are made in your company
  • Agile describes only how to build not what to build

Read link: Best practices for Adding UX Work to Agile Development by Jeff Patton

  • prioritise highest value users
  • Analyse and model data quickly and collaboratively
  • Defer less urgent research and complete it while the software is being constructed

Just enough rigor:

Cover your biases - note the obvious ones

  • Design bias: design of the studies themselves, how they are structured and conducted
  • Sampling bias: unavoidable in normal companies, careful when drawing conclusions
  • Interviewer bias: interviewers should be neutral
  • Sponsor bias: participants skew their answers when they know who is sponsoring the study; conceal your motive to preserve neutrality
  • Social desirability bias: you want honesty not ego
  • Hawthorne effect: behaviour may change when being observed

Ethics of user research:

  • Ethical research charter questions
  • Best researchers are like Spock. Just enough humanity and humour to temper their logical thought processes and allow them to roll with imperfect circumstances. Rigorous not rigid.
  • How do you know you’re being rigorous enough:
  • Discipline and Checklists

Research best practice checklist

  1. Phrase questions clearly (the big question you’re asking, why you’re doing the research)
  2. Set realistic expectations
    1. Do this before starting with stakeholders
    2. Questions to be answered
    3. Methods to be used
    4. Decisions to be informed by the findings
    5. Ask them what they hope for and tell them what to expect
  3. Be prepared
    1. The better you prep, the faster and cleaner the work goes
    2. Get your process and materials in order before you start. Setup so easy to reuse / replicate
  4. Allow sufficient time for analysis
  5. Take good notes
    1. If you don’t take notes it didn’t happen

How Much Research is Enough?

  • Avoiding unnecessary research
    • Identify your highest priority questions - your assumptions that carry the biggest risk
    • Given our business goals, what potential costs do we incur - what bad thing will happen - if, 6 months from now, we realise:
      • We are solving the wrong problem
      • We were wrong about how much organisational support we have for this project
      • We don’t have the competitive advantage we thought we had
      • We were working on features that excited us not the customer
      • We failed to reflect on what is actually most important to our users
      • Our users don’t understand the labels we’re using
      • We missed a key aspect of our users’ environments
      • We were wrong about our prospective users’ habits and preferences
    • If there is no risk associated with an assumption you don’t need to investigate it
    • As well as validating assumptions and minimising risks - you may discover new opportunities that you haven't thought of
    • All it takes to turn potential hindsight into foresight is keeping your eyes open and asking the right questions
  • That satisfying click
    • No matter how much research you do, there will always be things you wish you’d known
    • Design will still be an iterative process
    • The answer to how much is enough… is the point at which you feel sufficiently informed and inspired. Known unknowns.
    • The point at which you have a clear idea of the problem you want to solve and enough information to solve it
    • Collaborate with the team to see if they see the same patterns as you

The Process

The systematic inquiry. Follow these 6 steps:

1) Defining the problem
  • A useful research study depends on a clear problem statement
  • In design you’re solving for user needs and business goals.
  • In research you’re solving for a lack of information.
  • You need to know when you’re finished.
    • Base the statement on a verb like ‘describe’, ‘evaluate’ or ‘identify’
    • Verbs like ‘understand’ or ‘explore’ should be avoided as they’re open ended
    • Good example: We will describe how our users currently do x
    • Good example: we will evaluate the relative advantages and disadvantages
    • You may have a main statement and some sub-questions / points you’d like to cover
  • Now you’ve identified the ‘what’ you can move onto the ‘how’
2) Select the approach
  • Your problem statement should drive you toward a type of research
  • Add constraints like time, money, expertise, resource
  • If your question is about users you’ll be doing user research (ethnography)
  • If you want to evaluate an existing solution you’ll be doing evaluative research
  • You may do 2 or 3 things.
3) Plan and prepare for the research
  • Have a person responsible for the checklist
  • Sketch an initial plan - decide on time and money investment
  • Adapt the plan as you go
  • Each activity will make you smarter and more efficient
  • Recruiting
    • Locating, attracting, screening and acquiring good research participants
    • You can learn things by asking the right people the wrong questions - the wrong people however won’t help you
    • A good participant:
      • Shares the concerns and goals of your target users
      • Embodies key characteristics of your target users, such as age or role
      • Can articulate their thoughts clearly
      • Is as familiar with the relevant technology as your target user.
    • In theory, recruiting is just fishing. Decide what kind of fish you want. Make a net. Go to where the fish are. Drop some bait in the water. Collect the ones you want. It isn’t actually that mysterious, and once you get the hang of it, you’ll develop good instincts.
    • Match participants on actual behaviour (people who ride bikes) not what people say (people who say they are interested in bike riding)
    • Knowledge, skills, access, behaviors,
4) Collect the data
  • Research makes data
  • Share with your team ASAP (in a shared drive / back them up)
  • The more organised you are the more effective the research will be
Interviewing
  • An effective way to get inside another person’s head and see the world as they do
  • Requires basic social skills, some practice, and a modicum of self-awareness
  • Use a semi-structured interview, meaning you have prepared questions and topics but not a strict script of questions to ask each participant in the same order and manner. This allows more flexibility to respond to the individual perspective and topics that come up. You might find out some very useful things you would never have thought to ask.
  • A successful interview is a comfortable interaction for everyone involved that yields the information you were looking for. The keys to success are preparation, structure, and conduct.
Usability testing
  • A directed interview in which a user attempts certain tasks using a prototype or product, to determine to what extent the product or service as designed is usable:
    • whether it allows users to perform the given tasks to a predetermined standard
    • to uncover any serious, resolvable issues
  • Don’t think of it as a separate activity; it’s just another type of review to ensure you’re meeting that set of needs: business review, design review, technical review, usability review
  • What usability testing does:
    • Can be your thing. Or a competitors.
    • Input as to what works and doesn’t with the current version
    • Uncover significant problems with labeling, structure, mental model, and flow, which will prevent your product from succeeding
    • Let you know whether the interface language works for your audience
    • Reveal how users think about the problems you purport to solve with your design
    • Demonstrate to stakeholders whether the approved approach is likely to meet stated goals.
  • Usability testing is sometimes criticised because aiming for a merely usable product is like aiming for mediocrity. What usability testing won’t do:
    • Provide you with a story, vision or breakthrough design
    • Be an indicator of success
    • Help you prioritise user tasks
    • Substitute for QA testing the final product
  • Don’t build a usability lab, test in context
Literature review
  • Use if you don’t have time for original research, or when you want background information
  • Sources: research in your own company, or from other companies
  • Can inform your understanding of users to help you find better questions
  • Validate general assumptions
  • Complement your work
  • Use with caution! How was the question asked, does the sample map to your user base, how old is it?
5) Analyse Data
  • Gather the data together, look for patterns
  • Turn patterns into observations, from observations recommendations will emerge
  • Refer back to your problem statement - how do your observations answer that
  • Involve the team. A group can generate more insights faster. Circulate findings afterwards
  • Structuring an analysis session
    • Get a room, review all the notes together. Make observations > turn them into insights
      • Summarise the goals and the process of the research
      • Describe who you spoke with and under which circumstances
      • Describe how you gathered the data
      • Describe the types of analysis you will be doing
      • Pull out quotes and observations
      • Group quotes and observations that typify a repeated pattern
      • Summarize findings including the patterns you noticed
      • Document the analysis in a sharable format
    • Ground rules:
      • Focus solely on understanding the needs of the user
      • Respect the structure of the session - refrain from identifying patterns until you’ve gone through all the data
      • Differentiate observations from interpretations
      • No solutions
  • Data: Quotes and observations that indicate:
    • Goals (what the user wants to accomplish that your product can help them with or relates to)
    • Priorities (what is most important to the participant).
    • Tasks (actions the participant takes to meet their goal).
    • Motivators (the situation or event that starts the participant down the task path).
    • Barriers (the person, situation, or thing that prevents the participant from doing the task or accomplishing the goal).
    • Habits (things the participant does on a regular basis).
    • Relationships (the people the participant interacts with when doing the tasks).
    • Tools (the objects the participant interacts with while fulfilling the goals).
    • Environment (what else is present or going on that affects the participant’s desire or ability to do the tasks that help them meet their goals).
  • If you get outliers (not your target user), improve the recruitment process in future AND remove their data from the research - it’s not helpful
6) Report the results
  • Summary report with one or more modules
  • Type depends on how decisions will be made following the report (more informal for a small team vs executives)
  • If the data is good, making it visible works well - a quick persona sketch, or a photo and sticky notes
  • Summary should include - goals, methods, insights and recommendations

Chapter 4: Organisational Research:

Understanding your org:

  • You have a goal. You want to design something that millions of people will use
  • Design happens in the warm, sweaty proximity of people with a lot on their minds.
  • People create and join organisations to accomplish greater things more efficiently.
  • Organisations become more complex over time.
  • Oral culture trumps formal process for how to get things done. You get politics.
  • A design project is a series of decisions, getting the right decisions made in a complex organisation is difficult
  • You can have more influence than you think, you need to understand the inner workings though.
  • It’s inescapable that the nature of the organisation matters to the design process (budgets, approvals, timing, resources and availability… all rely on your ability to navigate the organisation).
  • The success of a product or service depends on how well it fits into everything else the organisation is doing, and how well the organisation can and will support it.
  • Your success depends on how well you understand the organisation
  • An organization is just a set of individuals and a set of rules, explicit and implicit. Once you understand that environment, you’ll be able to navigate it and create the best possible product.

Put an MBA out of work:

  • Organisational research: Determining what drives a business, how all the pieces work together, and its capacity for change is normally done by Business Analysts.
  • However, very similar to user research
  • Most user-centered design studios will interview client stakeholders (people who will be directly affected by the outcome of the project) as part of gathering requirements
  • You may have to do some role playing to gather information (“talk to me about how you interact with x, or with team y”). If you suggest a particular role-play example and they refuse to go along, that can often be a sign - find out why they resist.
  • In organisational research, the observer effect can be a force for positive change.
  • Ask hard questions and they’ll have to come up with answers - leading to reflection
  • Asking the same question of different people will reveal crucial differences in motivation and understanding
  • Asking a lot of questions can make you sound smart
  • The biggest difference is that you’re talking to current stakeholders not potential future customers

Who are stakeholders:

  • Stakeholders are “those groups without whose support the organization would cease to exist.”
  • Research should include anyone without whose support your project will fail
  • Executives, managers, subject matter experts, staff in various roles, investors and board members

Interviewing stakeholders:

  • Stakeholder interviews offer a rich source of insights into the collective mind of an organization. They can help you uncover areas of misalignment between a company’s documented strategy and the attitudes and day-to-day decision-making of stakeholders. They can also highlight issues that deserve special consideration due to their strategic importance to a business
  • What stakeholder interviews are for: Hearing the same issues considered by people in different roles relative to your work will give you a much more complete perspective and great insights. Some individual interviews can be valuable on their own, and some higher-level insights are only possible in the aggregate
  • Stakeholder interviews will help you understand the essential structure of the organization, how your work fits into the organization as a whole, and the approval process for various aspects of your project. They’ll also provide you with some less obvious opportunities to influence your project’s chances of success
    • Neutralise politics - find anyone who is deeply opposed to your work
    • Better requirements gathering - Define success, how will you measure success, technical requirements
    • Understanding organisational priorities - we have a maxim based on repeated observation: the more important a project is to an organization, the more successful it will be
    • Tailoring the design process - be careful with a one size fits all design approach - you may have to bring together cross functional teams for example
    • Getting buy-in: avoid ‘why wasn’t I consulted?’. Humans have a need to be consulted and engaged to exercise their knowledge. Inquiry is flattery; inviting people to participate empowers and disarms them. Find out if your assumptions are true, how much you need to share, and the worst-case scenarios for different stakeholders
    • Understanding how your work affects the org - your work affects everyone. Executives will have to defend it as part of the overall strategy. Customer service will have to support it. Salespeople will have to sell it. Production staff will have to maintain it. If your product drives change, find out in which departments, how, and whether they can cope. Anticipate changes in workflow!
    • Understanding workflow: understand how your work affects operations, and make your recommendations based on that.

Sharpen your tact:

  • Prioritise your list of stakeholders. Meet them. You’ll have to meet with some people for political reasons.
  • Find out as much as possible about the people you’re interviewing before you do
  • Do individual interviews. Do a group interview if there’s a bunch of people with similar benefits and risks
  • You can try a remote email in a pinch. Send only a few key questions

Interview structure

  • 30 mins, calm, natural, ask if you don’t understand, have a note taker if possible, send the agenda & key questions ahead; the more complex the topic, the more of a heads-up you should give them.
    • Introduce yourself and the purpose of the meeting
    • Explain to what extent the information will be shared
    • Get something in writing that people can speak freely
    • Always ask
      • Role, tenure, duties & responsibilities, typical day, teams & people they work with & the strength of those relationships, how they define success for this project, concerns for the project, the greatest challenges for the project (internal & external), how will their interactions/workflow change after the project?
      • What are the most common tasks within the system, what problems and work-arounds exist, any concerns, anyone else I should talk to?
    • Hostile witness: remain calm, breathe, restate the goal, ask if it’s clear, ask a general open-ended question (what they think is most important given that goal).
      • Often caused by being under-informed, a power move, or being under pressure to perform
  • Documenting interviews
    • Attitude, goal as they describe it, incentive alignment, influence, who they communicate with, participation in project, are you in harmony or conflict?
  • You’ve interviewed enough when you know:
    • The stakeholders
    • Roles, attitudes and perspectives
    • Levels of influence, interest and availability
    • How they stand to benefit or suffer with the success or failure of your work
    • Likelihood that any of them have power or potential to prevent project success
    • How workflow has to change
    • Resources to build, launch and support
    • Business requirements and constraints
    • Do you agree on goals and definition of success
    • How people outside the team view your project
  • What to do with stakeholder analysis:
    • Clear statement of what you need to accomplish to make the project a success
    • The requirements ensure stakeholders agree on the purpose and limitations of what you’re doing.
    • To increase your chance of success, connect what you’re doing to the goals of the business, increase collaboration, and save costs, particularly those associated with changes.
    • Business strategy must remain constant for the duration of a project.
    • Requirements must be:
      • Cohesive. The requirements all refer to the same thing
      • Complete. No missing information. No secret requirements that didn’t make it onto the list.
      • Consistent. The requirements don’t contradict each other.
      • Current. Nothing obsolete.
      • Unambiguous. No jargon. No acronyms. No opinions.
      • Feasible. Within the realm of possibility on this project
      • Concise. Keeping them short and clear will increase the chances that they are read, understood, remembered, and used. Aim for no more than two to three pages.
    • What to include:
      • Problem statement and assumptions
      • Goals
      • Success metrics
      • Completion criteria
      • Scope
      • Risks concerns and contingency plans
      • A workflow diagram
      • Verbatim quotes

Chapter 5: User Research:

  • Every delightful and every frustrating artifact that exists, exists because of a series of design decisions.
  • Your work must be accessible, novel, and fit into other people's lives and environments. But how?
    • User Research to identify patterns and develop empathy
  • User Research = Ethnography, study of humans in their culture
  • We want to learn about target users as people existing in a cultural context
  • We want to understand how they behave and why
  • This is very different from gathering opinions. It isn’t just surveying or polling. And it’s definitely not focus groups
  • Ethnographic design research allows…
    • Understand the true needs and priorities of your customers/ readers/target audience/end users
    • Understand the context in which your users will interact with what you’re designing
    • Replace assumptions about what people need and why with actual insight.
    • Create a mental model of how the users see the world
    • Create design targets (personas) to represent the needs of the user in all decision-making.
    • Hear the users’ language and develop the voice of the product
  • Everything in context
    • Talk with or observe representative users directly in their environment - reducing the risk of bad assumptions and bias
    • Physical environment
    • Mental model - internal concept of and associations with a system or situation.
    • Habits - How does the user already solve the problem you are trying to solve for them, if indeed they do?
    • Social networks - intersection of human relationships & digital products. People are social animals and every interactive system has an interpersonal component.
  • Assumptions are insults:
    • Make assumptions about your users and you run the risk of being wrong. Wrong assumptions in your product can alienate people, maybe before they hear what you have to offer.
    • If you design for yourself or your team, you could be baking discrimination into your product
    • Assumptions about age, gender, ethnicity, or sex might lead to barriers you don’t actually intend - barriers that don’t serve your goals or ethics.

Good data from imperfect sources

  • The first rule of user research: never ask anyone what they want!
  • People want to be liked more than they want to tell the truth about their preferences
  • Many people lie, many lack sufficient self knowledge to give the truth
  • “most people are poor reporters and predictors of their own preferences and behavior when presented with speculative or counterfactual scenarios in the company of others.”
  • Your challenge as a researcher is to figure out how to get the information you need by asking the right questions and observing the right details.
  • Ask direct questions and you’ll run into defenses and come up with potentially useless answers
  • To design what your target users need, you have to know relevant habits, behaviors, and environments, then turn that knowledge into insights you can act on so you can design with more confidence and less guesswork.

What is ethnography?

  • Ethnography is a set of qualitative methods to understand and document the activities and mind-sets of a particular cultural group who are observed going about their ordinary activities in their habitual environment.
  • Ethnography = “What do people do and why do they do it?”
  • User research = “What do people do and why do they do it? + what are the implications for the success of what I am designing?”
  • Mindset: “How should what I’m designing interact with this person?” Try not to be judgmental.

4 D’s of ethnography:

  • Deep dive - Get to know a small but sufficient number of representative users very well (in their shoes)
  • Daily life - Get into the field where things are messy and unpredictable. Behaviour changes with context & circumstances
  • Data analysis - take time to gain real understanding, and get your team involved in creating useful models.
  • Drama - narratives help everyone on your team rally around and act on the same understanding of user behavior - Personas will keep you honest. You design for them, not for you or for your boss

Interviewing humans:

  • Goal: Learn about everything that might influence how the users might use what you’re creating
  • To be a good interviewer you need to be able to shut up
  • People want to be liked and want to demonstrate their smarts
  • When you’re interviewing someone you know nothing. You’re learning a new subject: that person
  • Preparation
    • Create an interview guide
      • Brief description and goal of study (you can share this with the participant)
      • Basic factual / demographic characteristics of the user - provide context
      • Ice breaker / warm up questions
      • The questions or topics that are the primary focus of the interview
    • Gather some background information on the topic and people you’ll be discussing, particularly if the domain is unfamiliar to you.
  • Interview structure - 3 boxes, loosely joined
    • Introduction: Smile, gratitude, purpose, topic, how info shared, obtain permission, opportunity for them to ask questions
    • Body: Open ended questions, probe (tell me more about that), leave silences. List of questions as a checklist not a script. Keep it natural
    • Conclusion: thank them and give them the chance to ask questions

Conducting the interview:

  • You are the host and the student
  • Build rapport, put them at ease, make them comfortable
  • Absorb everything they have to say
  • Only say things to redirect, keep things on track
    • Breathe.
    • Practice active listening
    • Keep an ear out for vague answers. Follow up with… why is that? Tell me more about that?
    • Avoid talking about yourself
    • Checklist: Ethnography Field Guide from Helsinki Design Lab:
      • Welcoming atmosphere / make participants feel at ease
      • Listen more than you speak
      • Take responsibility to accurately convey the thoughts and behaviors of the people you are studying
      • Conduct your research in the natural context of the topic you’re studying
      • Start each interview with a general description of the goal, but be careful of focusing responses too narrowly
      • Encourage participants to share their thoughts and go about their business
      • Avoid leading questions and closed yes/no questions. Ask follow-up questions
      • Prepare an outline of your interview questions in advance, but don’t be afraid to stray from it
      • Whenever possible, snap photos of interesting things and behaviors
      • Also note the exact phrases and vocabulary that participants use
      • Pay attention after you stop recording. You might get a valuable revelation.
    • Offer enough information to set the scope for the conversation, but not so much that you influence the response.
      • Tell me about your job
      • Walk me through a typical week in your life
      • How often are you online?
      • What computers or devices do you use?
      • When do you use each of them?
      • Do you share any of them?
      • What do you typically do online?
      • What do you typically do on your days off?
      • How do you decide what to do?
      • Tell me about how your children use the internet
      • How do you decide what to do on your days off with your kids?
      • What are your particular non-work interests? What do you read online besides the news?
      • How frequently do you visit museums in your town? Which ones?
      • What prompts you to go?

What to do with the data you collect

  • The interview is the basic unit of ethnographic research
  • Analyze them together to find themes, including user needs and priorities, behavior patterns, and mental models
  • Note specific language and terms you heard
  • If doing generative research, look to the needs and behaviors you discover to point out problems that need solving
  • Turn the clusters around user types into personas that you can use for the life of the product or service you’re working on

Contextual Inquiry:

  • You enter the participant’s actual environment and observe as they go about the specific activities you’re interested in studying.
  • See behaviors in action, learn the small things you won’t hear about in an interview, such as unconscious and habitual work-arounds
  • Contextual inquiry is useful for developing accurate scenarios, stories of how users interact with potential features, identifying aspects of the environment that affect how a product is used.
  • Settle in, interview at a time that’s least disruptive, observe, and summarise (repeating it back to the user)

Focus Groups:

  • don’t provide insight into behavior or the user’s habitual context
  • Focus groups are supposed to be merely the source of ideas that need to be researched
  • Focus groups are the antithesis of ethnography.
  • Creates an artificial environment that bears no resemblance to the context in which what you’re designing would actually be used
  • Conversation is a performance that invites social desirability bias and gets in the way of finding out what people need and how they behave outside of this peculiar group dynamic
  • Participants are more likely to proclaim or conceal behaviors for the benefit of those around them.

The talking (and watching) cure

  • Accept no substitute for listening to and observing real people who need to do the things you’re designing something to help them do
  • A few interviews could change everything about how you approach your work. The information you gather will pay dividends as you gather and examine it, grounding your design decisions in real human needs and behaviors. You will develop powerful empathy that can inspire you to find creative, effective solutions

Chapter 6: Competitive Research:

  • Competition is everything else anyone has considered or started using that solves the problem you want to solve, or helps people avoid it
  • If you aren’t working on something that solves a real problem or fills a real need, then your competition is anything else that anyone does with their time and money.
  • The hardest competitor is the one your potential customers are using now
    • Switching cost
    • Inertia - they have to love you more than they hate change
  • Competition is for business but also for attention in the minds of your target users.
  • See how other people are solving similar problems and identify opportunities to offer something uniquely valuable
  • What matters to our customers? How are we better at serving that need than any competitor? How can we show our target customers that our product is the superior choice?

User research reveals strengths and weaknesses; opportunities and threats come more from competitive research

Competitive Audit:

  • Once you’ve identified a set of competitors and a set of brand attributes, conduct an audit to see how you stack up
  • Add in competitors mentioned in your interviews or that appear in web searches
  • Identify which bits of their work are most relevant and accessible. For each competitor:
    • Positioning? Offer?
    • Targeting? Vs your target audience
    • Key differentiators. What makes them uniquely valuable to their target customers?
    • To what extent do they embody your positive/negative brand attributes?
    • User needs / wants
    • What are they doing well / badly
    • Given all that - opportunities to beat, things to adopt or take into consideration

Brand Audit

  • Take a good look at your own brand
    • Does it set expectations correctly for the whole experience?
  • Brand = reputation - signify your identity & reputation to current and potential customers
  1. Attributes: characteristics to be associated with the product (and those to be avoided)
  2. Value proposition: what value you offer, how you communicate this
  3. Customer perspective: what associations do they have with your brand
  • Competition often drives how important a brand is, and your positioning
  • Name: unique, unambiguous, easy to spell and say
  • Logo: Wordmark, lockup, app icon, favicon.
    • List all the contexts users encounter it
    • Consider competitors
    • Will it ever appear on its own?

Usability testing the competition

  • Don’t just test yours, test the competitors
  • Helps you understand their strengths and weaknesses directly from the user’s point of view
  • Opportunities
  • How users conceptualize the tasks

A niche in time:

  • The competitive landscape, and how what you’re designing fits into it, may be the fastest-moving target of all research topics. New options appear and product categories collapse every day. Just taking a user’s-eye view of how your company, product, and message measure up will give you some competitive advantage. An accurate, user-centered perspective of your comparative strengths and weaknesses will help you focus your message and hone your image.

Chapter 7: Evaluative Research:

  • Once you have an appropriate design solution, it’s a good idea to test how well it works for your audience and its intended purpose before you stage a splashy public launch
  • Evaluation is assessing the merit of your design. It’s the research you never stop doing
  • Depends on the project:
    • In the early stages, evaluation takes the form of heuristic analysis and usability testing
    • You can test an existing site or application before redesigning. If you have access to a competitor’s service or product, you can test that. You can test even the very earliest sketches
    • Once live you can look at quantitative data (site analytics) to see how people are actually interacting and if that meets your expectations
    • The numbers will tell you what’s going on, and the individual people will help you understand why it’s happening.

Heuristic Analysis:

  • Most casual way of evaluating usability
  • Score a product against the following 10 principles:
    • System status visibility - appropriate feedback
    • Match between system and the real world - familiar language and conventions
    • User control & freedom - exits, undo, redo
    • Consistency and standards - things that appear the same should behave the same
    • Error prevention - Help users avoid errors
    • Recognition rather than recall - options should be visible. Instructions findable. Don’t make the user remember information
    • Flexibility and efficiency of use - Support shortcuts for expert users
    • Aesthetic and minimalist design - avoid providing irrelevant information
    • Help users recognise and recover from errors - Error messages should be helpful
    • Help and documentation - should be usable without documentation, but help should still be available
  • A few of the above focus on error prevention and recovery (often neglected in design)
  • Heuristic analysis is a quick and cheap way to identify potential issues - two colleagues can do it in an hour. It’s a good way to deal with obvious issues in early prototypes before bringing in users (a rough scoring sketch follows below).
  • Not a substitute for usability testing
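
Not from the book, but here is a minimal sketch of how a pair of reviewers might capture heuristic findings in a structured way. The 0-4 severity scale, field names, and example issue are illustrative assumptions.

```python
# Minimal sketch: capturing heuristic-analysis findings as structured notes.
# The 0-4 severity scale and field names are illustrative assumptions.

HEURISTICS = [
    "System status visibility",
    "Match between system and the real world",
    "User control & freedom",
    "Consistency and standards",
    "Error prevention",
    "Recognition rather than recall",
    "Flexibility and efficiency of use",
    "Aesthetic and minimalist design",
    "Help users recognise and recover from errors",
    "Help and documentation",
]

def record_finding(findings, heuristic, severity, note):
    """Log one issue against a heuristic (severity: 0 = cosmetic, 4 = blocker)."""
    assert heuristic in HEURISTICS
    findings.append({"heuristic": heuristic, "severity": severity, "note": note})

findings = []
record_finding(findings, "Error prevention", 3,
               "Delete button sits next to Save with no confirmation step")

# Summarise the worst issue per heuristic so the team sees where to focus first.
for h in HEURISTICS:
    issues = [f for f in findings if f["heuristic"] == h]
    if issues:
        worst = max(f["severity"] for f in issues)
        print(f"{h}: {len(issues)} issue(s), worst severity {worst}")
```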

Usability testing:

  • Minimum standard for anything designed to be used by humans
  • The easier it is for your customers to switch to an alternative the more important usability is
  • The more complex the system, the more work is required to ensure it’s usable
  • Usability is an attribute defined by 5 components:
    • Learnability - How easy is first time use
    • Efficiency - Once learned, how quickly can they perform tasks
    • Memorability - when returning after a period, how quickly can they re-establish proficiency
    • Errors - How many errors do users make, how severe are those errors, how easily can they recover from those errors?
    • Satisfaction - how pleasant is it to use the design?

Cheap tests first, expensive tests later

  • Don’t use expensive tests to find out things you can discover with cheap tests
  • Start with paper sketches, not working prototypes
  • Start in the office not in the field
  • Test with general audience before a specific audience
  • Test a competitor’s product before you put pen to paper
  • Frequency of testing should depend on how quickly design decisions are being made
  • Do it as you go - not just before launch
  • Cheap is early, expensive is late, super expensive is with your customers

Preparing for usability testing:

  • Build into design / build workflow
  • Create testing process and checklist that includes all of the information and equipment you need
  • Maintain a database of potential participants
  • Decide who is in charge
  • You need a …
    • Plan, prototype or sketch, 4-8 participants, facilitator, observer, a way to document, a timer or a watch

Usability test plan:

  • Revolves around tasks
    • Persona and journey led
    • Feature scenarios and tasks led
  • Provide a scenario to the user before giving them the task (you want to do x)
  • Not all tasks are super important - is it a deal breaker?
  • The test plan includes:
    • What you’ll do and how you’ll conduct the test: which metrics you’ll capture, the number of participants, and which scenarios you’ll use.
      • Objectives of the test: what are you testing and what state is it in?
      • Methodology
      • Participants and recruiting
      • Procedure.
      • Tasks
      • Usability goals
      • Completion rate (the percentage of tasks the user was able to complete)
      • Error-free rate (the percentage of tasks completed without errors or hiccups) - see the calculation sketch after this section
  • Add templates for Human factors etc!
  • Recruiting - use each participant only once; they should share the key goals of your target users.
  • Facilitating - a guided journey of imagination; set clear tasks (unclear ones can’t be tested), be personable and patient, don’t intervene, and avoid leading the user or helping them when they get lost
  • Users will blame themselves if they can’t get it to work - ask them to describe how they expected the system to work and why they had that expectation.
  • Observing and documenting - Recording is good. Second person note taking is good.
    • Note the following:
      • Participants’ reactions to the task
      • How long it takes
      • If they were successful / fail
      • Any terminology that was a stumbling block
      • Non-verbal frustration
      • Quotes
      • Successful / unsuccessful features
  • Use a webcam plus screen/audio recording
  • Only use eye tracking with people who can’t verbalise what’s drawing their attention (otherwise it’s an expensive waste of money)
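
A minimal sketch of the two usability goals above (completion rate and error-free rate) computed from per-task results; the tasks, numbers, and field names are made up for illustration.

```python
# Minimal sketch of the usability goals mentioned in the test plan, computed
# from per-task results. Field names and data are illustrative assumptions.

results = [
    {"task": "Find opening hours", "completed": True,  "errors": 0},
    {"task": "Find opening hours", "completed": True,  "errors": 2},
    {"task": "Buy a ticket",       "completed": False, "errors": 1},
    {"task": "Buy a ticket",       "completed": True,  "errors": 0},
]

completion_rate = sum(r["completed"] for r in results) / len(results)
error_free_rate = sum(r["completed"] and r["errors"] == 0 for r in results) / len(results)

print(f"Completion rate: {completion_rate:.0%}")   # tasks users finished
print(f"Error-free rate: {error_free_rate:.0%}")   # tasks finished without hiccups
```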

How bad and how often?

  • Severity:
    • High: an issue that prevents the user from completing the task at all
    • Moderate: an issue that causes some difficulty, but the user can ultimately complete the task
    • Low: a minor problem that doesn’t affect the user’s ability to complete the task.
  • Frequency:
    • High: 30% or more participants experience the problem
    • Moderate: 11–29% of participants experience the problem
    • Low: 10% or fewer of participants experience the problem.
  • Tiers:
    • Tier 1: high-impact problems that often prevent a user from completing a task. Leaving these unresolved poses a high risk to the success of your product
    • Tier 2: either moderate problems with low frequency or low problems with moderate frequency
    • Tier 3: low-impact problems that affect a small number of users. There is a low risk to not resolving these.
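
The notes above define tiers only for the named severity/frequency combinations; this is a minimal sketch of that mapping, with an assumed catch-all for the in-between cases.

```python
# Minimal sketch of the severity x frequency tiering described above.
# Thresholds follow the notes; the catch-all default is an assumption.

def rate_frequency(share_affected: float) -> str:
    """share_affected = fraction of participants who hit the problem."""
    if share_affected >= 0.30:
        return "high"
    if share_affected >= 0.11:
        return "moderate"
    return "low"

def tier(severity: str, share_affected: float) -> int:
    frequency = rate_frequency(share_affected)
    if severity == "high" and frequency in ("high", "moderate"):
        return 1   # often prevents task completion: high risk if unresolved
    if (severity, frequency) in [("moderate", "low"), ("low", "moderate")]:
        return 2
    if severity == "low" and frequency == "low":
        return 3   # low risk to leave unresolved
    return 2       # in-between cases: judge on their merits (assumed default)

print(tier("high", 0.4))   # -> 1
print(tier("low", 0.05))   # -> 3
```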

Benchmark against your competitors to make a more compelling argument for change!

Chapter 8: Analysis & Models

  • Qualitative analysis can seem mysterious.
  • Do the analysis first and clarity will come, followed by the concepts and recommendations:
    • Closely review notes
    • Look for interesting behaviours, emotions, actions and quotes
    • Write observations on a sticky note (write a code that tags it back to the study)
    • Group observations
    • Watch patterns emerge
    • Rearrange notes as you build patterns
Affinity Diagram - Turns research into evidence-based recommendations. Observations cluster into insights, and insights roll up into design mandates:
  • Design mandates (e.g. Mandate 1, Mandate 2, Mandate 3)
  • Insights (e.g. Insight A, Insight B, Insight C)
  • Observations (groups of individual observations sitting under each insight)
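
A minimal sketch (not from the book) of keeping the clustered sticky notes as data, so observations with their study codes roll up into insights and design mandates; all names, codes, and quotes are illustrative.

```python
# Minimal sketch: observations (tagged back to their study) cluster into
# insights, and insights roll up into design mandates. All names illustrative.

affinity = {
    "Mandate: make progress visible at every step": {
        "Insight A: users lose confidence when the system goes quiet": [
            ("P3-int", "Kept refreshing the page to check the upload had worked"),
            ("P7-obs", "Phoned support to confirm an order that had already gone through"),
        ],
    },
    "Mandate: speak the users' language": {
        "Insight B: internal jargon blocks first-time users": [
            ("P1-int", '"Which button actually saves my changes?"'),
        ],
    },
}

# Walk the hierarchy the same way you would read the wall of sticky notes.
for mandate, insights in affinity.items():
    print(mandate)
    for insight, observations in insights.items():
        print(f"  {insight}")
        for code, note in observations:
            print(f"    [{code}] {note}")
```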
Personas & Scenarios - Help maintain empathy for users
  • Persona is a fictional user archetype
  • Represents a group of needs and behaviours
  • You can write stories based off personas
  • Represent user-centered design:
    • embody the behavior patterns and priorities of real people and act as a reference point for decision-making
    • Helps maintain empathy for users
  • Design Personas are not marketing targets
    • Market segments do not translate to archetypes
      • User with the highest value to the design process may not be the highest value to the business
  • How many do you need?
    • As few as possible, while representing all relevant behavior patterns
    • You can often reduce the number by creating relationships among them and assigning multiple roles to one persona
  • Capturing the character:
    • Description should have just enough detail to capture those aspects of a target user most useful and inspiring for the designers to keep in mind.
    • the essential information about context of use and patterns of behavior are in a form you can integrate into your workspace and refer to repeatedly. Consider your personas as a set. You don’t have to capture all concerns in a single one. And the personas can have relationships to each other, just like people do in real life.
    • For accurate personas, select a role that closely matches that of one of the participants you interviewed and is also one of the identified target user types, such as the aforementioned teacher, parent, or tourist
    • Scenarios: if personas are your characters, scenarios are your plots.
    • A scenario is the story of how a persona interacts with your system to meet one (or more) of their goals.
    • Running a persona through a scenario helps you think through your design from the user’s point of view. You can use scenarios at several points in your process:
      • To flesh out requirements
      • To explore potential solutions
      • To validate proposed solutions
      • As the basis for a usability test script
Mental Models - If something is intuitive, it matches the user’s mental model
Example mental model diagram for the task “Evaluate alternatives” (planning an outing), with behaviours grouped into stacks:
  • Choose based on: priorities, proximity, educational potential, variety of activities, value for money
  • Avoid: some destinations, expensive destinations, complicated logistics, tourist traps, boring destinations
  • Research & discuss: visit websites of potential destinations, read reviews on a travel website, suggest alternatives to children, discuss potential destinations with partner

Mental Models:

  • Mental models can be time savers
  • “Intuitive” is a synonym for “matches the user’s mental model”
  • 2 types of mental models: the type each of us holds in our head to help us deal with the world, and the type designers sketch out to better create that world
    • For maximum success, be aware of the former and get to work on the latter.
  • A user’s mental model allows you to get inside their head and show it to everyone else. Use a mental model diagram to collaborate with your team, prioritise features, better organise information, and identify areas where users have needs that aren’t being served

How to create a mental model

  • Do user research
  • Make an affinity diagram
  • Place affinity clusters in stacks representing the user’s cognitive space to create the model. These groups will include actions, beliefs, and feelings
  • Group the stacks around the tasks or goals they relate to
Conceptual Modelling / Site Mapping - relate tasks, content, functionality
  • You can translate the mental model into a conceptual map that relates content and functionality according to the target users’ views
  • Gap Analysis - Use mental models to identify gaps, or mismatches between what you offer and what the user needs or expects. Design features that close those gaps.
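
A minimal sketch of a gap analysis as described above: line up mental-model groups against the features that currently support them and flag the gaps. The groups reuse the example diagram; the feature names are assumptions.

```python
# Minimal sketch of a gap analysis: map mental-model groups to what the
# product currently offers and flag unsupported needs. Names illustrative.

mental_model_groups = [
    "Choose based on value for money",
    "Avoid complicated logistics",
    "Discuss potential destinations with partner",
]

supported_by = {
    "Choose based on value for money": ["price filter", "family ticket bundles"],
    "Avoid complicated logistics": ["journey planner link"],
    # "Discuss potential destinations with partner" has no supporting feature yet
}

for group in mental_model_groups:
    features = supported_by.get(group, [])
    status = ", ".join(features) if features else "GAP - nothing supports this yet"
    print(f"{group}: {status}")
```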
Task Analysis = Breaking a task into the steps it takes to accomplish it

Task Analysis / Workflow:

  • Task Analysis = Breaking a task into the steps it takes to accomplish it
  • Contextual inquiry is the best prelude to task analysis, but you can also use data from user interviews as long as you’ve collected sufficient detailed information about how the participants work toward their goals step by step
  • Any given task has both cognitive and physical components that may be more or less important given the domain and the goal
  • Helpful for mapping what people do in the real world to functionality you can build

Break it down:

  • Identify the steps participants reported or you observed to complete a given task
  • Note the initial state, the event prompting the user to begin the task, the information or tools the user needs at each step, and any steps at which the task is likely to be interrupted or resumed. Put all of these steps back together as a workflow.
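
A minimal sketch of recording a task analysis as structured steps (initial state, trigger, what each step needs, where interruptions are likely) and reassembling them as a simple workflow; the ticket-purchase task and field names are illustrative assumptions.

```python
# Minimal sketch of a task-analysis record: the trigger, the steps, and what
# each step needs. Task and field names are illustrative assumptions.

task = {
    "name": "Buy a museum ticket online",
    "initial_state": "Visitor has decided on a date but has no account",
    "trigger": "Partner asks them to sort out tickets for Saturday",
    "steps": [
        {"step": "Choose date and time slot", "needs": ["calendar of open slots"], "interruptible": True},
        {"step": "Select ticket types",       "needs": ["pricing", "concession rules"], "interruptible": False},
        {"step": "Pay",                       "needs": ["card details"], "interruptible": False},
        {"step": "Receive confirmation",      "needs": ["email access"], "interruptible": True},
    ],
}

# Put the steps back together as a simple linear workflow, flagging the points
# where the task is likely to be interrupted and resumed later.
for i, s in enumerate(task["steps"], start=1):
    marker = " (likely pause point)" if s["interruptible"] else ""
    print(f"{i}. {s['step']} - needs: {', '.join(s['needs'])}{marker}")
```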

Make it flow:

  • In addition to informing the feature set and flow of an application, task analysis can help identify where content might support a user along their task path. Different users may take different paths

Model Management:

  • Just the activity of creating a model like this with the team will improve shared understanding
  • Don’t underestimate the accessibility and appeal of your analysis, visualised.
Figure: this task path for ticket purchase can help identify areas where the user needs specific content and functionality to meet her goal

CHAPTER 9: Quantitative Research

  • The chief aim of quantitative research and analysis is optimisation
  • Define “good”. What is good? How do you know it’s good? What does it mean to be best? What are you optimizing for? How will you know when you’ve reached it?
  • Once you can measure your success in numerical terms, you can start tweaking

Preaching to the converted:

  • Conversion = the user taking the action you want (sign up, buy now, make a reservation)

Ease into analytics:

  • Look for trends and patterns
  • Be careful: more page views could mean more engagement (or a more frustrated audience)
  • A high bounce rate can mean people aren’t getting what they’re expecting when they come to you from search
  • Split test - control and variation (see the significance-check sketch at the end of this section).
    • Select a goal
    • Create variations
    • Choose an appropriate start date
    • Run the experiment until you’ve reached a 95% confidence level
    • Review the data
    • Decide what to do next: stick with the control, switch to the variation, or run more tests
  • You need a goal, you need to know the current conversion rate and how much you want to change it
  • Low traffic sites will take weeks to validate something
  • Cautions and considerations
    • Testing can be seductive because it seems to promise mathematical certitude and a set-it-and-forget-it level of automation
    • The best response to a user interface question is not necessarily a test
    • Good for tweaking and knob-twiddling—not a source of high-level strategic guidance
    • Can introduce inconsistencies
      • Landing pages for new users - why not
      • Navigation around the site - caution!
    • Local Maximum Problem: Focusing on small positive changes can lead to a culture of incrementalism and risk aversion. How will you ever make a great leap that might have short-term negative effects?
    • The best teams are Spock-like. They embrace data while encouraging and inspiring everyone working on a product to look beyond what can be measured to what might be valued.
    • You can optimize everything and still fail, because you have to optimize for the right things. That’s where reflection and qualitative approaches come in. By asking why, we can see the opportunity for something better beyond the bounds of the current best. Even math has its limits.
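
A minimal sketch of checking whether a variation beats the control at the 95% confidence level using a two-proportion z-test. The book doesn’t prescribe a method and split-testing tools handle this for you, so the function and the visitor/conversion numbers are purely illustrative.

```python
# Minimal sketch of a split-test readout: does the variation beat the control
# at the 95% confidence level? Uses a two-proportion z-test; the visitor and
# conversion numbers are made up for illustration.
from math import sqrt, erf

def z_test(conv_a, n_a, conv_b, n_b):
    """Return (absolute lift, two-sided p-value) for variation B vs control A."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return p_b - p_a, p_value

lift, p = z_test(conv_a=180, n_a=4000, conv_b=228, n_b=4000)
if p < 0.05:
    print(f"Variation wins: +{lift:.1%} absolute lift (p = {p:.3f})")
else:
    print(f"Keep the control for now (p = {p:.3f}); run longer or try another variation")
```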
