Do you know of other products outside the Wikimedia world that offer tasks to users, that we could learn from?
Topic on Talk:Growth/Personalized first day/Structured tasks
I'm not sure how relevant this might be, but uTest is a software testing platform which recruits people for UX/UI/Bug testing. They have an interesting 'Academy' process which new testers have to pass before being given 'real world' paid software test cases. The 'Academy' consists of an initial series of very short and simple videos and teaching pages about different aspects of software testing, with a really basic 'Quiz' at the end of each short course. These are multiple choice quizzes, offering various answers to simple questions. It's a great feeling to pass each course and quiz (though one can re-do it, if required). I could envisage this fun, but satisfying process being applied to Wikipedia editing, helping one gain and test one's understanding. For some people, 'gaining the badges' could potentially be a valuable learning and self-validating experience. See https://www.utest.com/academy (though registration is required to access the courses).
@Nick Moyes -- thanks for checking out the page. I haven't tried uTest, but we at the WMF use something that sounds just like it: usertesting.com, and I remember that they have some kind of orientation process, too. Our team's designer, @RHo (WMF), keeps notes on products we can learn from, and maybe she has some thoughts on this.
@MMiller (WMF) Yes, it was your mention a while back of the usertesting site that piqued my interest in both platforms. They assess users differently. The one you use requires the initial submission of a voice and screen recording of a product in use, and they give the new tester a rating and detailed feedback on how they performed before allowing them anywhere near real-world clients. Whilst uTest requires screenshots and screen recordings (but no audio) as part of their training, they also have the simple training videos and quizzes mentioned above. As a platform, my initial impression is that uTest seems much more focussed on testers reporting bugs in software than on the overall UX (user experience), which obviously requires voice interaction and screen recordings to help clients discover what users actually do in real life.
Thanks @Nick Moyes for sharing this uTest platform. I had not come across it before in my review of "task apps", and it is interesting to see the way they break required tester skills up into these short courses. It reminds me of an idea we had earlier for a "Tutorial module" for the newcomer homepage, which would show tutorials and quizzes like the Five Pillars Quiz from WikiEdu. More "interactive" help material such as quizzes and video tutorials will hopefully be revisited after we see how people respond to the initial guidance feature being released shortly, in which newcomers are shown short text "quick start tips" to help them through the particular task they have selected to do (mocks can be seen on the team's Newcomer tasks project page).
In terms of other task products we've reviewed so far, gamification features like earning badges, points, and "leveling up" in status are used in quite a number of apps (including but not limited to: Duolingo, Google Crowdsource, Google Local Guides, Translate Facebook, LinkedIn Skill Assessments, Vivino, Foursquare). This tactic seems supported by a few onwiki studies as well, which saw editor activity increase after editors were awarded barnstars[1] or thanked[2].
TL;DR: I plan to include more moments of positive encouragement in the design moving forward.
[1] Restivo, M. & van de Rijt, A. (2012). "Experimental Study of Informal Rewards in Peer Production". PLoS ONE 7(3): e34358. https://doi.org/10.1371/journal.pone.0034358
[2] Gallus, Jana (2016-09-30). "Fostering Public Good Contributions with Symbolic Awards: A Large-Scale Natural Field Experiment at Wikipedia". Management Science 63(12): 3999–4015. PDF: https://pubsonline.informs.org/doi/pdf/10.1287/mnsc.2016.2540