Vishal Dixit
Ijeoma Udejiofor
vdixit84@gmail.com
ij_udejiofor@yahoo.co
University ID:
Medium severity: Issues are classified as medium severity if they have been faced by many participants with a minor impact on usability, or by few participants but with a large impact on usability.
Low severity: Issues are classified as low severity if they have been faced by few participants and have only a minor impact on usability (Tullis & Albert, 2008).
For Tasks 4 and 5 there is a 95 percent chance that the true mean will fall between 50 and 116 percent.
As part of the analysis of the efficiency of the calendar system, the total completion time of all the tasks was measured for each of the users. Table 16 in appendix B shows a summary of the completion time for each user. The average evaluation time was 144.5 seconds, with 50 percent of the users completing the tasks in less than the average completion time.
The standard deviation is about 33 seconds. This can be attributed to the small sample size; estimating the true mean of the population will therefore be less accurate when based on this sample. The analysis also shows that, with 95 percent confidence, the average population completion time will be 144.5 ± 35 seconds.
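As a rough check, this interval is consistent with the usual small-sample confidence interval based on the t-distribution. This is only a sketch assuming n = 6 users and the reported mean of 144.5 seconds and standard deviation of 33 seconds; the exact spreadsheet formula used in the analysis is not stated in the report:
\[
\bar{x} \pm t_{0.975,\,n-1}\,\frac{s}{\sqrt{n}} \;=\; 144.5 \pm 2.571 \times \frac{33}{\sqrt{6}} \;\approx\; 144.5 \pm 35 \ \text{seconds.}
\]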
The experience level of users was obtained during the test and was measured by assigning numerical values to different levels of experience, as shown in table 17 in appendix B. This was to enable us to test for a relationship between the users' previous experience with calendar tools and the time it would take them to complete the tasks for the PowerMeeting calendar tool.
The scatter plot shown in graph 9 of appendix B indicates a negative slope between the two variables: for each increase in level of experience there is a 0.65 second decrease in completion time. The correlation coefficient of -0.65 shows that there is a strong negative relationship between the level of experience with other calendar systems and the time taken to complete the tasks in the usability test.
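The report does not state which correlation measure was used; assuming the standard Pearson product-moment correlation over the six (experience score, completion time) pairs, the coefficient quoted above is
\[
r \;=\; \frac{\sum_{i=1}^{n}(x_i-\bar{x})(y_i-\bar{y})}{\sqrt{\sum_{i=1}^{n}(x_i-\bar{x})^{2}}\,\sqrt{\sum_{i=1}^{n}(y_i-\bar{y})^{2}}},
\]
with values close to -1 indicating a strong negative linear association between experience and completion time.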
Ease of Use: This was measured in terms of how easy it was for the users to complete the tasks. The users were asked to rate the ease of use of the calendar tool by completing a questionnaire based on the System Usability Scale (SUS), which has been presented in the Methods section. The users' responses, expressed as SUS scores (SUS scores are calculated by adding the rating figures and multiplying by 2.5; a sketch of the standard calculation is given after the list below), are presented as percentages in table 18 and graph 10 in appendix B:
Easiest task - Task 4. (The task has an average ease-of-use rating of 66 percent, with only 33 percent of the users rating it below the average.)
Most difficult task - Task 3. (The average ease-of-use rating of this task is 55 percent, with about 67 percent of the users rating it below the average.)
Overall, the users did not find the tasks especially easy to complete: the average ease of use across all the tasks is 62 percent, with a maximum individual rating of 75 percent.
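For reference, the standard SUS computation converts each of the ten five-point responses to a 0-4 contribution (the response minus 1 for the positively worded items, 5 minus the response for the negatively worded ones) before scaling the total to 0-100; the "rating figures" summed above are assumed to be these adjusted contributions:
\[
\mathrm{SUS} \;=\; 2.5\left[\sum_{\text{odd } i}(r_i - 1) \;+\; \sum_{\text{even } i}(5 - r_i)\right].
\]
So, for example, contributions summing to 26 out of a maximum of 40 would give a score of 2.5 × 26 = 65 percent.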
Issues with the tool were identified as the evaluation was being carried out, and these issues are documented in this section. The issues are categorized by individual task, in addition to being classified as high, medium or low severity. The severity classifications are the same as the ones given for the user interface issues. A summary of the identified issues is given in table 19 in appendix B.
Graph 11 in appendix B shows that three issues were identified in Task 1 and that 60 percent of the users encountered problems in relation to the issue with the highest magnitude. It also shows that Task 3 had only one identified issue, which all the users encountered. This indicates that the issue in Task 3 is the most severe and should be given priority when the issues are addressed by the design team of the calendar tool.
After the usability evaluation of the calendar tool, the users were asked some questions about the tool in general and were also asked to give general comments. Table 20 in the appendix summarizes the responses of the users.

DISCUSSION
This section discusses the findings of the usability test for the user interface and the calendar tool.

User Interface Discussion (Anastasia Tsoutsoumpi, Vishal Dixit and Billy Kwasi Yeboah)
The user interface was evaluated on three metrics: task success, errors and issues.
The findings of the evaluation suggest that there is no significant difference in errors among users with different levels of computer experience. This could be a result of the time pressure the users were put under during the test and the fact that they were not familiar with PowerMeeting. For example, they had no reason to assume they needed to log in as administrators, as they had no prior knowledge of the fact that only administrators could create agenda items.

Calendar Tool Discussion (Fatima Baba)
Three main aspects of the calendar tool were measured during the usability evaluation: efficiency, ease of use and issues (problems).
Efficiency: The five tasks were measured for success, and the means and confidence intervals were calculated in the previous section. The confidence interval allowed us to estimate how accurately the results from the sample can be used to generalize how successful task completion will be in real-world use. The first two tasks were successfully completed by the users, but that does not mean everyone who uses the system will be able to complete them successfully; the small sample size has to be considered before any generalizations can be made. On the other hand, Task 3 had a very low average completion success, which points to a problem with the tool: if it were used in the real world, many people might not be able to complete such a task. Tasks 4 and 5 showed only moderate levels of success, and this should also be addressed when improving the calendar tool.
The negative slope of graph 9 shows that as the computer literacy level of the user increases, the time spent completing the tasks decreases. To better understand the strength of the relationship between these variables, we calculated the correlation coefficient, which gave a value of -0.71. This indicates a strong negative relationship and suggests that proficient computer users will tend to find the system more usable than beginner computer users. Individuals, irrespective of their level of computer literacy, need to be able to manage group events using a calendar tool. Hence, the general features of this tool should be enhanced to encourage even beginners to adopt it as their group calendar tool.
A standard deviation of 33 seconds was obtained, and this indicates a large variance between the completion times of the different users. Although we can attribute this to the small sample size used for the test, it could also indicate that some classes of users will take far more time to complete tasks using the tool than others. This could be the result of several other factors, such as the level of computer literacy discussed above. Again, this indicates that the design should be improved to reduce the completion time for all classes of users.
Ease of use: From the results it is clear, again, that there is a problem with completing Task 3. This confirms the efficiency findings, which showed Task 3 to be the least efficient. In general, the users did not rate the ease of use of the calendar tool highly, and this suggests that the tool is not easy to use.
Issues with the calendar tool were identified during the usability test. These issues are valuable insights for the developers, and they need to be resolved in order to improve the tool. Some recommendations for the improvement of the tool, gathered from the issues recognized during the test as well as from the statistical results, are outlined below.

Recommendations (Anastasia Tsoutsoumpi and Ijeoma Udejiofor)
After analyzing the results of the usability test and taking into consideration improvement ideas from the users, the following recommendations are given in order to improve the PowerMeeting system:
- Connection for the voice conference should be made quicker.
- The login should also be made quicker and should have a mechanism to remember passwords.
- The logout button should be clearly indicated on the screen.
- A search mechanism should be built into the documentation.
- The recycle bin used to delete agenda items should allow users to retrieve deleted items.
- The design of the group calendar tool should be enhanced so that it is possible to easily edit the time for an event by entering a new time using the blinking cursor in the pop-up dialogue box. This was a high severity issue, and the generally low usability results for Task 3 echo this problem.
- Standard alternatives should be provided for core or commonly used functions. For example, users should be able to select the "add", "edit" or "delete" event button by right-clicking the mouse; users should also be able to use simple key combinations on the keyboard as an alternative.
- The users should get a notification when an event is added or edited by another group member, together with the identity of the person who added or edited the event.
- Currently, a deleted event in the group calendar cannot be restored. A means of restoring deleted items should be added as a feature of the tool, as this would greatly enhance its usability.
- When the mouse is placed on a clickable feature, a brief textual description of its function should be displayed, to increase the learnability and usability of the tool.
CONCLUSION (ANASTASIA TSOUTSOUMPI)
Firstly, the evaluation of the PowerMeeting web application gave us the opportunity to gain a good understanding of Ajax technology. Further to that, we had the opportunity to familiarize ourselves with very important human-computer interaction concepts. The experience gained through the preparation of the usability test and through the analysis of the users' feedback will improve our skills in the domain of software development, mainly because we will be able to approach software design from the user's point of view.
As far as users' attitudes towards web applications are concerned, two basic points can be noted. Firstly, people are always attracted by eye-catching user interfaces; however, the motivation to remain 'loyal' to a web application is its functionality. An additional point is that many users express concerns about the security and speed of web applications. That is a clear indication of their demand for high quality standards in contemporary software products.
Finally, the success of an application in the 'web market' is based exclusively on users' acceptance of it. But it is the researchers and the professional software developers who lead the technology one step further into the future.
REFERENCES
1. Ajax Patterns. Whats Ajax. 25 March 2010. 1 April 2010 <http://ajaxpatterns.org/Whats_Ajax>.
2. Arlekar, Sagar G. The Role of AJAX in Enhancing the User Experience on the Web. 1 June 2006. 6 March 2010 <http://www.roseindia.net/ajax/ajax-user-interface.shtml>.
3. Avangate. Usability Friends: Ajax. 29 October 2007. 1 March 2010 <http://www.avangate.com/articles/ajax-usability-110.htm>.
4. Brookes, J. SUS - A Quick and Dirty Usability Scale. 2009. 4 March 2010 <http://www.usabilitynet.org/trump/documents/Suschapt.doc>.
5. Bruno, Vince, Audrey Tam and James Thom. "Characteristics of Web Applications That Affect Usability: A Review." Proceedings of OZCHI 2005. Canberra: CHISIG, 2005. 2-4.
6. Dumas, J. S. and J. C. Redish. A Practical Guide to Usability Testing. Exeter: Intellect Books, 1999.
7. Eernisse, Matthew. Build Your Own Ajax Web Applications. 28 June 2006. 5 March 2010 <http://articles.sitepoint.com/article/build-your-own-ajax-web-apps>.
8. Garrett, Jesse James. Ajax: A New Approach to Web Applications. 18 February 2005. 7 March 2010 <http://experiencezen.com/wp-content/uploads/2007/04/adaptive-path-ajax-a-new-approach-to-web-applications1.pdf>.
9. Gibson, Becky. Ajax Accessibility Overview. 1 April 2006. 1 April 2010 <http://www-03.ibm.com/able/resources/ajaxaccessibility.html>.
10. Giglio, Jason. "AJAX: Highly Interactive Web Applications." 2009.
11. Internet Methodologies Journal and News. Are There Usability Issues with AJAX? 1 April 2010. 3 April 2010 <http://www.imjan.com/internet-www/are-there-usability-issues-with-ajax/>.
12. Itura. AJAX Security: Are AJAX Applications Vulnerable to Hack Attacks? 2009. 5 March 2010 <http://www.itura.net/training/16-ajax-security-are-ajax-applications-vulnerable-to-hack-attacks.html>.
13. Keely, Pavan. Using Ajax. 18 January 2006. 2 March 2010 <http://keelypavan.blogspot.com/2006/01/using-ajax.html>.
14. Kluge, Jonas, Frank Kargl and Michael Weber. "The Effects of the AJAX Technology on Web Application Usability." WEBIST 2007 International Conference on Web Information Systems and Technologies. 2007. 289-294.
15. "Ajax." Java Jazz Up 8 April 2008: 1-79.
16. Lerner, Reuven M. Ajax Application Design. 1 December 2006. 1 April 2010 <http://www.linuxjournal.com/article/9295>.
17. MacKay, Tara. Ajax Usability Concerns. 25 December 2007. 2 April 2010 <http://www.notesondesign.net/resources/web-design/ajax-usability-concerns/>.
18. Molich, R. and J. Nielsen. "Improving a Human-Computer Dialogue." Communications of the ACM 33. 1990. 338-348.
19. Molich, R. and J. Nielsen. "Improving a Human-Computer Dialogue." Communications of the ACM 33. 1990. 338-348.
20. Nielsen, J. and R. Molich. "Heuristic Evaluation of User Interfaces." Proc. ACM CHI'90 Conf. Seattle, 1990. 249-256.
21. Nielsen, J. "Finding Usability Problems Through Heuristic Evaluation." Proceedings ACM CHI'92 Conference. Monterey, CA, 1992. 378-380.
22. Nielsen, J. Usability Inspection Methods. New York: John Wiley & Sons, 1994.
23. Osborn, A. F. Applied Imagination. New York: Scribner, 1957.
24. Dumas, Joseph S. and Janice C. Redish. A Practical Guide to Usability Testing. n.d.
25. Sarwate, Amol. Hot or Not: Ajax Vulnerabilities. 19 September 2007. 28 March 2010 <http://www.scmagazineus.com/hot-or-not-ajax-vulnerabilities/article/35698/>.
26. Site Security Monitor. Ajax Application Attacks. 2010. 1 March 2010 <http://www.sitesecuritymonitor.com/ajax-application-attacks/>.
27. Spool, Jared M. Five Usability Challenges of Web-Based Applications. 4 December 2007. 8 March 2010 <http://www.uie.com/articles/usability_challenges_of_web_apps/>.
28. Tullis, Tom and Bill Albert. Measuring the User Experience. Burlington: Morgan Kaufmann, 2008.
29. Wang, W. PowerMeeting on CommonGrounds: Web Based Synchronous Groupware with Rich User Experience. 2008. 20 March 2010 <http://sites.google.com/site/meetinginbrowsers/weigang-wang-s-work>.
30. Web Aim. What is AJAX? 1 March 2010. 6 March 2010 <http://www.webaim.org/techniques/ajax/>.
31. Wood, John. Usability Heuristics Explained. 18 January 2004. 28 March 2010 <http://iqcontent.com/publications/features/article_32/>.
APPENDIX A
Asynchronous communication - the user’s interaction with the application happens independently of the application’s
communication with the server.
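To make the idea concrete, the sketch below shows an Ajax-style asynchronous request in TypeScript; the endpoint path and element id are hypothetical and used only for illustration, and this is not code from PowerMeeting itself.

```typescript
// Minimal sketch of Ajax-style asynchronous communication.
// The endpoint "/calendar/events" and the element id "event-count" are made up for this example.
const xhr = new XMLHttpRequest();
xhr.open("GET", "/calendar/events", true); // 'true' makes the request asynchronous
xhr.onload = () => {
  // Runs later, when the server responds; the page stayed responsive in the meantime.
  const events = JSON.parse(xhr.responseText) as unknown[];
  const label = document.getElementById("event-count");
  if (label) {
    label.textContent = `${events.length} events`;
  }
};
xhr.send(); // returns immediately; the user can keep interacting with the page
```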
7 The various tabs were clearly visible and easy to find. Efficiency and ease of use
10 I made errors while navigating through the individual tasks. Ease of use
12 Telepointer navigation from one item to another was smooth. Efficiency and satisfaction
14 There is a consistent icon design scheme and stylistic treatment across the system Satisfaction
Statements and Usability Metric Tested For Voice Conference and Chat
1. The tool is easy to use for the tasks given. Ease of use
In relation to other tools I have used, this tool is easy. Ease of use
4.
10.
The following questions were then asked with expected binary response of YES/NO to gather qualitative data.
Did you encounter any problems while connecting with Skype? If you answered YES, please briefly mention some of them.
Please make any comments on Power Meeting Voice conference (Skype) tool.
Would you recommend others to use this voice conference tool? Answer by (YES/NO).
Would you recommend others to use Power Meeting for group chat? Answer by (YES/NO).
1. Compared to other web applications that you use, how would you describe the registration process of PowerMeeting? Choose one of the following options and circle it. You may select more than one answer.
a. It is really confusing for the average user
b. Very poorly designed mechanism
c. Rather straightforward
2. Did you encounter any difficulties logging on to the system and creating a new session? Answer by (YES/NO)
3. If your answer is YES, describe in a short sentence the main difficulty you encountered.
4. Would you prefer it if Power Meeting included a mechanism to remember passwords? (YES/NO)
5. Are you convinced of the security provided by Power Meeting during the log in process? Please consider mostly the case where you need to log in with your Skype ID. Answer by (YES/NO).
6. On any PC, it is impossible to log in to Power Meeting by using the same web browser. How would you comment on that? Answer with a short sentence.
7. How would you characterize the overall design of the user guide? Your options are the following and you should
put your answer in a circle.
a. very bad
b. neither bad/nor good
c. good
d. good but corrections are needed
e. fascinating
8. Do you believe that the description of the sessions in the user guide was helpful to you?
(YES/NO/INDIFFERENT)
9. Are you satisfied by the organization of the user guide? (YES/NO)
10. Do you believe that the content of the user guide is accurate and to the point? (YES/NO)
11. Could you manage to communicate via Skype through the Power Meeting tool without reading the section of the user guide describing the voice conference with Skype?
12. Do you believe that you would have been able to perform better in the agenda task if the user guide had included
an illustrated presentation of this function? Answer by (YES/NO).
13. Suggest any improvement in a new version of Power Meeting’s documentation.
Questions For Assessing the Group calendar and Corresponding Metrics Tested
No  Questions On Individual Tasks  Type Of Usability Metric
1 I found the tool easy to use for this task Ease of Use
3 I would need to use the tool several times before I get accustomed to performing this task. Ease of Use
4 The experience I have of previous tools increased my ability and speed of performing this task Ease of Use
in the Groupware calendar tool.
5 I understood the text descriptions of buttons on the user interface of the Group calendar. Ease of Use
6 The text descriptions on the buttons aptly describe their functionality. Ease of Use
7 The steps for each task followed a natural and logical order. Efficiency
8 I felt confident and very much in control of the tool while performing this task. Ease of Use
10 I found it easy to retrace my steps when I made an error while carrying out this task. Ease of Use
11 I felt I needed to check the online user documentation for this task Ease of Use
General Questions on the Group Calendar
14 I found the various functions in the Power meeting calendar tool well integrated. Ease of Use
16 I would likely use this tool frequently as my group calendar tool. Ease of use/
Satisfaction
17 I would imagine that most people would learn to use this system very quickly Ease of Use
(Questions 1, 13, 14, 16 and 17 are adapted from the System Usability Scale listed by Brookes J. but developed at the Digital
Equipment Corporation).
18. Please briefly describe any problems you encountered in carrying out the tasks
19. List some ways you think the calendar tool would help you work better in groups
20. What other features did you expect to see in the group calendar tool?
21. Please give recommendations on the improvement of this tool.
The Group Calendar Evaluators’ Guideline and Observation Form
We used this form to get our own assessment of the main user metrics we set out to test. The data was recorded for each task
and for each user. The table below lists out the guidelines we followed in order to effectively record our observations.
7 Please record the user's comments per task. Exact comments. Efficiency/Ease of use/Main issues (they confirm the other observations made above).
8 Please record any other observations made. Miscellaneous. Efficiency/Ease of use/Main issues (they confirm the other observations made above).
APPENDIX B – Graphs and Tables
                           Login  Documentation  Agenda  Telepointer  Voice Conference
User 1                       1          1           1          1             1
User 2                       1          1           1          1             1
User 3                       1          1           1          1             1
User 4                       1          1           1          1             0
User 5                       1          1           1          1             1
User 6                       1          1           1          1             0
Confidence Interval (95%)    0%         0%          0%         0%            41%
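The 41 percent figure for the voice conference column is consistent with a normal-approximation interval half-width computed on the 0/1 success scores. This is only a sketch, assuming a CONFIDENCE-style spreadsheet formula with n = 6 and a sample standard deviation of about 0.52 for four successes out of six; the exact formula used is not stated in the report. The columns in which every user succeeded have zero standard deviation, which is why they show 0%:
\[
1.96 \times \frac{0.516}{\sqrt{6}} \;\approx\; 0.41.
\]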
User     Evaluation Time (Secs)
User 2            60
User 3            60
User 4           120
User 5            40
User 6            50
Average           75
Table 2 showing user completion times for login

User     Evaluation Time (Secs)
User 2            90
User 3            50
User 4            60
User 5            18
User 6            22
Average           51
Table 5 showing user completion times for tele-pointer
User     Evaluation Time (Secs)
User 1           240
User 2           180
User 3            60
User 4           240
User 5           180
User 6           240

User     Evaluation Time (Secs)
User 1           180
User 2           180
User 3           120
User 4           360
User 5           180
User 6           180

User     Evaluation Time (Secs)
User 2           336
User 3           540
User 4           300
User 5           130
User 6           147
Average          292
Table 4 showing user completion times for Agenda

Group          Observed  Expected
Experts            2         2
TOTAL              6         6
Chi-test: 1
Table 7 Login Chi-test

Group          Observed  Expected
Novice             2         2
Intermediate       2         2
Experts            2         2
TOTAL              6         6
Chi-test: 1
Table 8 Documentation Chi-test
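In Tables 7, 8 and 9 the observed group counts equal the expected counts in every row. Assuming the "Chi-test" figure reports the p-value of a chi-square goodness-of-fit test (for example, a spreadsheet CHITEST call), the statistic is zero and the p-value is therefore 1:
\[
\chi^2 \;=\; \sum_i \frac{(O_i - E_i)^2}{E_i} \;=\; 0 \quad\Rightarrow\quad p = 1.
\]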
[Graph: Relationship between level of experience and errors made (Login); x-axis: Months of Experience; y-axis: Average Errors per Day; R² = 0.5786]

Group          Observed  Expected
Intermediate       2         2
Experts            2         2
TOTAL              6         6
Chi-test: 1
Table 9 Agenda Chi-test

Tool               Errors
Login                 3
Documentation         2
Agenda                1
Telepointer           0
Voice Conference      1
Table 12 Number of errors

Graph 5 [Graph: relationship between level of experience and errors made; x-axis: Months of Experience; y-axis: Average Errors per Day; R² = 0.0357]

[Pie chart: share of errors by tool: Login 43%, Documentation 29%, Agenda 14%, Voice Conference 14%, Telepointer 0%]

Graph 6 [Graph: Relationship between level of experience and errors made (Voice Conference); x-axis: Months of Experience; y-axis: Average Errors per Day; R² = 0.0357]
Login: Takes a long time to log in. PowerMeeting does not remember passwords.
Documentation: The user guide does not illustrate the tasks for the agenda.
Agenda: Cannot retrieve deleted items from the trash.
Telepointer: Cannot understand the use of the tele-pointer.
Voice Conference: Takes too long to connect to Skype.
Table 13 Issues encountered with the User Interface tools (each issue was classified as high, medium or low severity)
[Bar chart: % participants per tool (Login, Agenda, Voice Conference); y-axis: % participants, 0% to 120%]

         Task 1  Task 2  Task 3  Task 4  Task 5
User 1     1       1       1       1       1
User 2     1       1       0       0       0
User 3     1       1       1       1       1
User 4     1       1       1       1       1
User 5     1       1       0       1       1
User 6     1       1       0       1       1
Graph 9 showing the relationship between the completion time and general computer usage experience

          Task 1  Task 2  Task 3  Task 4  Task 5
User 1     43%     35%     43%     48%     33%
User 2     73%     73%     53%     63%     75%
User 3     70%     70%     60%     70%     70%
User 4     73%     73%     68%     75%     75%
User 5     53%     63%     53%     73%     70%
User 6     58%     68%     53%     68%     68%
Average    61%     63%     55%     66%     65%
Table 18 showing data from the Calendar Tool Ease of Use evaluation

Issues identified with the calendar tool, by task:
Task 2: The event description can only be modified by double clicking to bring up the dialogue box. It is not explicit that the user should double click the event to see the description.
Task 3: The event time can only be changed by dragging and dropping the event to a different time slot. An alternative way to change the time is through the dialogue box, but this cannot be done because the text box which contains the time is not editable.
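For reference, the Average row in Table 18 above is the arithmetic mean of each column; for example, for Task 3:
\[
\frac{43 + 53 + 60 + 68 + 53 + 53}{6} \;=\; \frac{330}{6} \;=\; 55\ \text{percent.}
\]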
User | How can the calendar tool help you work better in groups? | What other features did you expect to see in the group calendar tool? | Other comments
User 3 | Saves cost: if you are away and need to meet up with your group members. Saves time. | A priority feature would help organize events better. |
User 5 | Deadlines and meetings can be easily communicated. | The time in the dialogue box should be made editable, like the description. | The ease of use of the tool exceeded my expectation.