Re: [agile-usability] Re: Online Usability Tests
Mar 17, 2008

Sorry if I misled people with my statement that we often get 1,000 people to do our online studies in a couple of days. While that is true for studies aimed at a general audience, it's also common to get data from perhaps 20-30 more targeted users in a couple of days.

In terms of the timelines involved, that is in fact what I see as one of the major advantages of this online testing technique. The sample study of the Apollo Space Program took me about an hour to set up, and that included the time to decide on the tasks. Of course, I already had the basic online testing tool, so it was just a matter of defining the characteristics of this particular study (the tasks and their possible answers, the websites being evaluated, etc.).

In terms of the BASIC analysis of the data (task completion rates, task times, subjective ratings, etc.), as you would imagine, most of that is automated and takes perhaps half an hour. That kind of analysis is very quick and takes the same amount of time whether you have data from 20 people or 2,000 people. It lets you see the obvious things, like which tasks users had the most trouble with, or simply whether they had trouble at all. In some situations, that may be what you're most interested in.

The more time-consuming analysis is of the verbatim comments from the users, and the time for that, of course, depends on the number of participants. It is those verbatim comments that can often give you more insight into WHY people were encountering the problems they were. Making sense of verbatim comments from 1,000 people certainly takes longer than it does for 20 people.
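As a rough illustration, the kind of automated basic analysis described above amounts to a per-task aggregation over the raw session records. This is only a minimal sketch of that idea, not the actual tool; the record fields (`task`, `completed`, `seconds`, `rating`) are assumptions.

```python
# Hypothetical sketch of automated "basic analysis" for an online
# usability study: per-task completion rates, task times, and
# subjective ratings. Field names are illustrative assumptions.
from collections import defaultdict
from statistics import mean, median


def summarize(responses):
    """Aggregate per-task metrics from raw session records.

    Each record: {"task": str, "completed": bool,
                  "seconds": float, "rating": int}  # e.g. 1-7 scale
    """
    by_task = defaultdict(list)
    for r in responses:
        by_task[r["task"]].append(r)

    summary = {}
    for task, recs in by_task.items():
        summary[task] = {
            "n": len(recs),
            # Fraction of participants who completed the task.
            "completion_rate": mean(1 if r["completed"] else 0 for r in recs),
            # Median is robust to a few very slow sessions.
            "median_seconds": median(r["seconds"] for r in recs),
            "mean_rating": mean(r["rating"] for r in recs),
        }
    return summary
```

Because the aggregation is a single pass over the records, it takes essentially the same effort whether 20 or 2,000 people responded, which matches the point made above.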
But you can often see the trends pretty quickly even with the large sample sizes.

I've been involved in projects where we decided what we wanted to test one morning, set up the online study, sent out the message about the study to a previously assembled panel of a few hundred users, got data back from perhaps 20-30 of them by that evening, and reviewed the data with the project team the next morning.

As I mentioned in my other message, I certainly don't see online testing as a replacement for lab testing, but rather as a useful adjunct.

--Tom

In a message dated 3/17/2008 12:25:17 A.M. Eastern Daylight Time, manish1022@... writes:
1,000 users seems a bit difficult to collect for any project, Agile or Waterfall. What I wonder is whether the benefits of such a large user base outweigh the money invested (even if you're testing amazon.com).
Agile releases are very short-spanned. We have been trying to fit usability testing seamlessly into Agile (remote or otherwise). Apart from the usual issue of getting real users rather than 'representative subject matter experts', the timelines are pretty hectic for a very detailed UT. Plus, there are other challenges, like getting the project management team and the client on your side.
The cycle of sending out the feedback form, getting responses ASAP, and doing a quick analysis of the data generally seems to be a long one for such a huge user base. It would be a challenge to keep the rest of the project team members busy until then. Late feedback does not really help in any way.
I like the way the test is designed. However, I'm still trying to figure out the benefit of this test to you. The real issues may not be well articulated by the users. We may know for certain that Task A is very difficult to do, but the why and how are missing from that bit of information.
I'm interested in the conclusions of this test.

Thanks for sharing this,
Manish Govind Pillewar