Moderated study: Evaluate a workflow to configure notifications

    Project description

    Spotlight Cloud is a database performance monitoring web app. The app monitors SQL databases for various metrics and tracks when thresholds are exceeded. In this project, a small UX design and research team was tasked with designing an email notification feature. Beyond displaying the monitoring results within the app, we wanted users to be notified via email when one or more thresholds were exceeded. Internally, there were design concerns about the proposed workflow for configuring the notifications. My research quickly evaluated this workflow to guide the design effort.

    To organize the study, I created and regularly updated a wiki page. This gave stakeholders a way to track progress, look up details about the study, and find the results. The wiki page serves as a record of the completed study and links to the corresponding JIRA work ticket, the related studies, the prototype, the moderator script, the session recordings, the analysis, the Executive Summary, and a recording of the share-out meeting – Research study wiki page (send a request to access it).

    Research objective

    Evaluate a workflow for configuring email notifications. Validate that users can navigate to and create an email notification, specify the details, and complete the workflow successfully.

    1. Looking at the product’s heatmap, can users successfully navigate to the right area for configuring email notifications?
    2. Do users know where to go to begin creating a new email notification setting?
    3. Can users successfully specify a connection, an alarm type, and an email recipient?
    4. Do users have sufficient feedback to know that they have completed the task successfully?
    5. Do users have any concerns about our UIs or our approach of configuring notifications on individual metrics for a single database?

    Research method

    We ran five 30-minute moderated remote sessions.

    What we tested

    We used a clickable prototype with the main workflow activated, which allowed participants to complete the task successfully.

    Background questions

    We asked open-ended questions about each participant’s involvement with performance monitoring and how their organization uses email notifications today.

    Test introduction

    We walked participants through the basics of a usability test and asked them to “think aloud” as they went through the application. We also explained that they would be using a prototype, so they wouldn’t be able to type and many of the links weren’t clickable.

    Workflow scenario

    Participants were given the following scenario:

    For the database M601\SQLK8, configure the web app to send an email message to backupteam@quest.com when the threshold for “Days Since Last Full Backup” has been exceeded so that the group can respond promptly.
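
    For context, the scenario maps onto the three inputs the workflow asks users to specify: a connection, an alarm type, and an email recipient. The sketch below is a minimal, hypothetical model of what a completed setting might capture – the names are illustrative, not Spotlight Cloud’s actual schema.

        from dataclasses import dataclass

        # Hypothetical sketch of the data a completed notification setting
        # captures; names are illustrative, not Spotlight Cloud's schema.
        @dataclass(frozen=True)
        class NotificationSetting:
            connection: str   # monitored database, e.g. r"M601\SQLK8"
            alarm_type: str   # metric whose threshold triggers the email
            recipient: str    # address that receives the notification

        def matches(setting: NotificationSetting, connection: str, alarm_type: str) -> bool:
            """Return True when a fired alarm corresponds to this setting."""
            return setting.connection == connection and setting.alarm_type == alarm_type

        # The scenario above, expressed as one such setting:
        scenario = NotificationSetting(
            connection=r"M601\SQLK8",
            alarm_type="Days Since Last Full Backup",
            recipient="backupteam@quest.com",
        )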

    Follow-up questions

    After they completed the workflow, we asked participants to give us feedback on our approach of configuring notifications on individual metrics for a particular database.

    Participants

    Screening criteria 

    • database performance monitoring experts
    • people not deeply familiar with our web app product

    Incentive

    We provided a $25 gift card to an online retailer to thank the participants for their time.

    Recruiting email

    We're looking for your feedback!
    We are running a 30-minute usability study on Tuesday, April 14th related to database performance monitoring and email notifications. We're particularly interested in hearing from folks who monitor cloud-based databases, such as Azure SQL Database.

    Stakeholder involvement

    The Product Owner prioritized the project and requested regular updates. I invited her and the larger UX team to observe the sessions firsthand and help take notes. The invitation read:

    We’re running an email notification workflow study.

    You’re invited to join us and observe the test.

    So we can ensure a smooth and seamless testing process, here are a few details for you if you decide to join us:

    • To avoid distracting the participant, please mute yourself immediately and don’t say hello or announce yourself.

    • Please stay on mute for the duration of the session.

    • If you have something you’d like the moderator to address, please message them in Slack on the channel #scmm-2329.

    • If the participant asks a question that you know the answer to but the moderator may not, send the moderator the answer in Slack so they can respond. Otherwise, the moderator will note the question and assure the participant that it will be directed to the appropriate person for a follow-up answer.

    We’ll be recording the sessions in case you can’t make it.

    The meeting details are below.

    Data collection

    The sessions were recorded and made available company-wide on a wiki page. All of the stakeholders, including product management, technical writing, and engineering, were encouraged to review the recordings.

    Along with the recordings, the wiki page tracked the date and time of each session, provided details about each participant (first name, company, and job title), and linked to the observers’ notes – Participants and Raw data wiki page (send a request to access it).

    Analysis process

    We held a remote, collaborative analysis session using Mural, an online whiteboard tool. All of the observers were invited to add their notes and reflections. The board was organized by research question and participant.

    (Screenshot: Mural board organized by research questions and participants)

    Share-out

    A week after the sessions, I reviewed an Executive Summary with the Product Owner. This provided an official, 30-minute update on the project, although the UX designer was already working on the next iteration of the designs.

    Executive Summary: What We Did, Why We Did It, What We Found, What We Recommend