Toad for Oracle is a desktop application for managing Oracle databases. In this project, Product Management wanted to know whether our various help and training materials were effective. One way to access helpful information inside the product is through a field in the toolbar, internally referred to as “Jump Search”. In a research spike, the Product Owner requested information about user expectations surrounding the field and reasons why people use it. My research provided the answers.
To organize the study, I created and regularly updated a wiki page. This gave stakeholders a way to track progress, look up details about the study, and find the results. The wiki page serves as a record of the completed study and includes links to the corresponding work ticket in JIRA, the prompts, the spreadsheet with the raw data, the detailed results, the Executive Summary, and a recording of the share-out meeting: Research study wiki page (send a request to access it).
The goal was to understand user expectations of the Jump Search field:
- What do they expect the field will help them find?
- What would they enter?
- What kinds of results do they expect to get?
- Which resources do they expect will be searched?
We recruited 31 product users to complete an unmoderated study.
We used screenshots of the product and asked the participants 6 questions. The study took less than 5 minutes to complete.
- active product users
- preference for freeware users over commercial-edition users, since freeware users do not have access to technical support
We did not offer participants an incentive.
Unmoderated study prompts
Prompt 1 – Click test to gauge whether users see the field as a source of help
Prompt 2 – Open-ended question to inquire about uses of the field
Prompt 3 – Open-ended question to gather examples
Prompt 4 – Open-ended question to learn about result expectations
Prompt 5 – Multiple choice question to learn about search expectations
Prompt 6 – Final, open-ended question to gather additional feedback
There were three aspects to the analysis.
For the multiple-choice question, I tallied the responses to see how many participants chose each option. These results revealed which resources participants most, and least, expected to be searched.
I reviewed all responses to the short-answer questions to identify common themes.
I also looked at the heatmap resulting from the click test. Interestingly, only 2 of the 31 participants (6%) clicked on the Jump Search field when asked where they would click to get a question answered.
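The tallying and rate calculations above are simple enough to script. The sketch below illustrates the idea with made-up option labels (the real labels and counts live in the raw-data spreadsheet); only the 2-of-31 click figure comes from the study itself.

```python
from collections import Counter

# Hypothetical multiple-choice responses for illustration; the actual
# option labels and counts come from the study's raw-data spreadsheet.
responses = [
    "Help topics", "Help topics", "Database objects",
    "Help topics", "Menu actions", "Database objects",
]

# Tally how many participants chose each option, most common first.
tally = Counter(responses)
for option, count in tally.most_common():
    print(f"{option}: {count} ({count / len(responses):.0%})")

# Click-test rate from the study: 2 of 31 participants clicked
# the Jump Search field.
click_rate = 2 / 31
print(f"Jump Search click rate: {click_rate:.0%}")  # → 6%
```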
A few days after the study was fielded, I reviewed the Executive Summary with the Product Owner and Product Management. The Product Owner was able to act immediately on the findings and recommendation, creating a design ticket to improve the feature.