Unmoderated study: User expectations of the Jump Search feature

    Project description

    Toad for Oracle is a desktop application for managing Oracle databases.  In this project, Product Management wanted to know whether our various help and training materials were effective.  One way to access helpful information inside the product is through a field in the toolbar, internally referred to as “Jump Search”.  In a research spike, the Product Owner requested information about user expectations surrounding the field and the reasons people use it.  My research provided the answers.

    To organize the study, I created and regularly updated a wiki page. This gave stakeholders a way to track progress, look up details about the study, and find the results. The wiki page serves as a record of the completed study and includes links to the corresponding work ticket in JIRA, the prompts, the spreadsheet with the raw data, the detailed results, the Executive Summary, and a recording of the share-out meeting: Research study wiki page (send a request to access it).

    Research objective

    Understand user expectations of the Jump Search field:

    • What do they expect the field will help them find?
    • What would they enter?
    • What kinds of results do they expect to get?
    • Which resources do they expect will be searched?

    Research method

    We recruited 31 product users to complete an unmoderated study.

    We used screenshots of the product and asked the participants 6 questions.  The study took less than 5 minutes to complete.

    Participants

    Screening criteria

    • active product users
    • preference for freeware users over commercial edition users, since freeware users do not have access to technical support

    Incentive

    We did not offer an incentive to our customers.

    Recruiting email

    Help us make <product name> better!

We're running an online study about getting help within <product name> and we'd appreciate your input. This study will take less than 5 minutes of your time.  Your response will be used to improve <product name>.

    Unmoderated study prompts

    Prompt 1 – Click test to gauge if users see it as a source of help

    Task - You have a question about using Toad for Oracle. Click on the image where you would go to find the answer.

    Prompt 2 – Open-ended question to inquire about uses of the field

    Look at the "Jump to..." box in the top right of <product name>.  What would you use this for?

    Prompt 3 – Open-ended question to gather examples

    Provide a specific example of what you would enter in the "Jump to..." box.

    Prompt 4 – Open-ended question to learn about result expectations

    Question - In your previous example, what are your expectations for the results? Please describe what you will see.

    Prompt 5 – Multiple choice question to learn about search expectations

    Question - Which resources do you expect the "Jump to..." box will search? Select all that apply.

    Prompt 6 – Final, open-ended question to gather additional feedback

    Wrap Up - Is there anything else you would like to share with us about the "Jump to..." box or how you find information within Toad for Oracle?

    Analysis process

    There were three aspects to the analysis.

    For the multiple-choice question, I tallied the responses to see how many participants chose each option. The tallies showed which resources participants most, and least, expected to be searched.
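
    The tally step can be reproduced with a short script. This is a minimal sketch; the resource names and responses below are hypothetical placeholders, not the study's actual options or data:

```python
from collections import Counter

# Hypothetical responses to the "Select all that apply" question.
# Each inner list holds the resources one participant selected;
# the resource names are illustrative only.
responses = [
    ["Help files", "Online documentation"],
    ["Help files"],
    ["Help files", "Community forums"],
]

# Tally how many participants chose each option.
tally = Counter(option for answer in responses for option in answer)

# Print resources from most expected to least expected.
for option, count in tally.most_common():
    print(f"{option}: {count}")
```

    Sorting with most_common() surfaces the most- and least-expected resources directly, which mirrors the read-out described above.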

    I reviewed all responses to the short-answer questions to identify common themes.

    I also looked at the resulting heatmap from the click test. Interestingly, only 2 of the 31 participants (6%) clicked on the Jump Search field when asked where they would click to get a question answered.

    Heatmap showing clicks where people would look for answers.

    Share out

    A few days after the study was sent out, an Executive Summary was reviewed with the Product Owner and Product Management.  The Product Owner was able to act immediately on the findings and recommendation, creating a design ticket to improve the feature.

    Executive Summary:  What We Did, Why We Did It, What We Found, What We Recommend