Abstract

End-user-authored tutorials found on the Web are increasingly becoming the norm for helping users learn software
applications, but little is known about the quality of these tutorials. Using quality metrics derived from previous work, we
perform a usability expert review on a sample of tutorials for Photoshop, a popular image-manipulation program with one of the
largest bodies of web-based tutorials. We also explore how the characteristics of these tutorials differ across four tutorial sources,
representing those that are, i) written by a close-knit online community; ii) written by expert users; iii) most likely to be
found; and iv) representative of the general population of tutorials. Our analysis reveals that expert users generally write
higher-quality tutorials, and that many of the tutorials in our sample suffer from important limitations, such as making no
attempt to help users avoid common errors. We also find that a single five-star rating system did not sufficiently distinguish
quality among the tutorials. Building on this latter finding, we propose and evaluate a rating approach based on multiple