Understanding and Interpreting Various Statistics on Performance Metrics and Issue-based Metrics
Develop hands-on skills in understanding and interpreting various statistics on
“Performance Metrics” and “Issue-based Metrics”;
Master key features of this usability testing software (Loop11);
Use usability testing software to interpret statistics and generate
evidence-based insights in written reports.
Task 1 Please go to the Loop11 platform and locate its “Sample Project”:
GXNS/), which is an unmoderated usability test of the “Fyre Agency” website. Please
interpret the statistics generated by Loop11 and write up a basic “insights report”.
Please report on the “Results” & “Findings” from each category of data:
“Dashboard”, “Tasks”, “Video”, “Questions”, “Participants”, & “Clickstream & Heatmaps”.
In other words, you need to report your understanding of the “users” and their
“performance and preference data” on the three tasks in the sample project. This
report should include key information on both “Performance Metrics” and “Issue-based
Metrics”. Examples of metrics include (but are not limited to):
Time on tasks
Efficiency (pageviews, lostness, etc.)
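For the lostness metric listed above, Tullis & Albert (Ch. 4) describe a measure computed from pageview counts. A minimal sketch in Python follows; the function name and the example numbers are illustrative, not taken from the sample project:

```python
import math

def lostness(unique_pages, total_pages, optimal_pages):
    """Lostness measure as described in Tullis & Albert (Ch. 4).

    unique_pages  (N): distinct pages visited during the task
    total_pages   (S): all page visits, including revisits
    optimal_pages (R): minimum pages needed on the ideal path

    Returns 0 for a perfect path; higher values suggest the user got lost.
    """
    n, s, r = unique_pages, total_pages, optimal_pages
    return math.sqrt((n / s - 1) ** 2 + (r / n - 1) ** 2)

# Illustrative case: the optimal path is 4 pages, but a participant
# viewed 11 pages in total, 8 of them unique.
print(round(lostness(8, 11, 4), 2))
```

A perfectly efficient participant (N = S = R) scores 0; Tullis & Albert note that scores well above zero are a signal worth checking against the session video.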
Task 2 Please export and download the raw data spreadsheet from this “Fyre
Agency” sample project to conduct further analysis. Please use Excel or other statistical
software (e.g., SPSS, R) to conduct further analysis and identify new insights
(patterns/trends) that are not automatically reported by Loop11 (Hint: please refer to
Tullis & Albert, Chapters 2 & 4, for examples). Please show tables, graphs, and your
interpretations of the results in the report.
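As a starting point for Task 2, the exported spreadsheet can be summarized per task. The sketch below uses hypothetical column names (`participant`, `task`, `time_sec`, `success`) and made-up values standing in for whatever the real Loop11 export contains; adjust the names to match your download:

```python
import numpy as np
import pandas as pd

# Hypothetical stand-in for the exported Loop11 raw data; the real export's
# column names and values will differ.
df = pd.DataFrame({
    "participant": ["P1", "P1", "P2", "P2", "P3", "P3"],
    "task":        [1, 2, 1, 2, 1, 2],
    "time_sec":    [42.0, 95.0, 61.0, 210.0, 38.0, 120.0],
    "success":     [1, 1, 1, 0, 1, 1],
})

# Per-task summary: success rate plus mean, median, and geometric mean time.
# Tullis & Albert (Ch. 4) recommend the median or geometric mean for
# time-on-task data, which is usually skewed.
summary = df.groupby("task").agg(
    success_rate=("success", "mean"),
    mean_time=("time_sec", "mean"),
    median_time=("time_sec", "median"),
    geo_mean_time=("time_sec", lambda t: np.exp(np.log(t).mean())),
)
print(summary.round(1))
```

Cross-tabulations like this (and per-participant equivalents) are the kind of pattern Loop11 does not report automatically, which is what Task 2 asks you to surface.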
Task 3 Please select five participants’ video sessions from the “Fyre Agency” sample
project and then use Reframer (https://blog.optimalworkshop.com/tag/reframer/) to tag
important incidents in the videos (e.g., errors, moments of confusion or
disappointment, user requirements, important quotes, etc.). Export and download the
notes from Reframer for analysis.
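Once the Reframer notes are exported, a first analysis step is counting how often each incident tag occurs. The sketch below assumes a hypothetical list of (participant, tag) pairs; the real export is a spreadsheet whose tag column you would read instead:

```python
from collections import Counter

# Hypothetical incident tags, as if pulled from a Reframer notes export.
notes = [
    ("P1", "error"), ("P1", "confusion"), ("P2", "confusion"),
    ("P3", "important quote"), ("P4", "error"), ("P5", "confusion"),
]

# Tally tag frequencies to see which issue types dominate the sessions.
tag_counts = Counter(tag for _, tag in notes)
print(tag_counts.most_common())
```

Ranking tags this way gives the issue-based half of the report a quantitative backbone: the most frequent tags point to the incidents worth discussing first.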