
Problem and solution overview

Our proposed changes to the current UI aim to improve instructor-student interaction in class, resolve the current problem of subjective assessment of student participation, and address usability problems in the use of the UI.

We will modify the current design of the UI lights and add a screen to improve feedback. A [] button will be included for submitting enquiries and for participating in class. The software and Edventure will be modified so that participation can be recorded and displayed to the user.

Interview Description and Results

Our target users are the NTU students who use the clicker regularly for their classes. The rationale behind this choice is that they have experience using the clicker, so our design changes would be relevant to them and would improve the efficiency of their clicker usage.

We interviewed, on video, a total of three students who currently use clickers in some of the classes they attend. All three interviewees are currently taking the course HP308 Psychology in the Workplace and are Year 3 Psychology majors. They all own mobile phones and mp3 players with Bluetooth or wireless capability and have used remote controls before. Our first and second interviewees, Cherie and Nathalie, are Singaporeans whose language medium is English, while our third interviewee, Sheng Rong, is from the People's Republic of China and is more familiar with Chinese.

All three use the clicker during HP308 for pop quizzes that occur on a weekly basis. They are required to select the correct response to the questions posed during a pop quiz, by pressing the button for the response they want on the clicker within the time frame they are given to answer the question.

The interviews gave us insights into the current problems as well as possible solutions. We asked a total of thirteen questions, listed in the appendix. In summary, all three interviewees mentioned that there was a lack of feedback when they selected their answers: they were unsure whether their answers got through, or even whether they had clicked the correct answer. This resulted in them clicking the same answer repeatedly, leaving them frustrated. The same applies when they have selected a wrong answer: they quickly amend it by pressing another button (for example, B instead of A) repeatedly, to ensure that the new answer overwrites the previous response.

Themes common to all of them are the fear that their answers would not be accepted by the system, as this would affect their assessment scores, and the need for accuracy when answering questions within a short time frame.

Both Cherie and Nathalie viewed the changes suggested in our proposal favorably. These include the addition of a small screen, better feedback, the ability of clickers to tally students' participation scores, the ability for students to check their scores via the clicker, and the ability for students to post queries on particular slides. Sheng Rong was favorable in some respects, but mentioned briefly that tallying participation scores is ideal but not necessary, and that raising hands and speaking up directly in class would be better than using the clicker to pose queries on particular slides, as it gives people the chance to exercise their communication skills.

Task Analysis

1. Who is going to use the system?


This system is designed for both lecturers and students. Students possess the clicker while lecturers do not. Since our focus is on the student response unit rather than the TurningPoint software, we will concentrate on students rather than lecturers.

2. What tasks do they now perform?

According to the interviews, they currently only perform the task of answering multiple-
choice questions for general surveys and quizzes set by their lecturers.

3. What tasks are desired?

We believe the clicker program was introduced in part to increase interaction between students and lecturers and to increase class participation. However, communication has been shown to be one-directional, from lecturers to students. From the interviews, we gathered that desired tasks include having the clicker record participation scores and, perhaps, using the clicker to pose queries or questions directly to the lecturer without disrupting the class.

4. How are the tasks learned?

According to the interviews, the tasks they now perform were learned over a period of time. Instructions were provided when the clicker was first introduced at the start of the course, and students subsequently learned to use it through repeated practice.

5. Where are the tasks performed?

The tasks are performed in settings such as lecture theatres or seminar rooms, under the instructions of the lecturers, who decide when students are required to use the clicker.

6. What’s the relationship between user and data?

The users we focus on, in this case, are the students. The data captured by the system through the students' use of the clickers reflects their eventual quiz scores. Thus, the relationship between the users and the data is that the data determines part of their final score in a module.

7. What other tools does the user have?

The users have basic computing skills and experience using remote-control devices (with number buttons) of the kind commonly found in the household. Most users have mobile phones and laptops with wireless capability and thus have a basic idea of how wireless devices work.

8. How do users communicate with each other?

The term “users” refers to both students and lecturers. The two groups of users communicate via the clicker. However, this communication is one-way: lecturers pose quiz questions that students answer through the clicker via a wireless network, but nothing flows from students to lecturers that would require the latter to answer students' queries. There is no form of communication available between student users.

9. How often are the tasks performed?

The task of answering quizzes via the clicker is performed on a nearly weekly basis, depending on the lecturer.

10. What are the time constraints on the tasks?

The time allowed for answering the questions depends largely on the lecturers, who are responsible for setting the time limit within which students must complete the task. A limit of ten seconds is usually set for each question posed in the quiz.

11. What happens when things go wrong?

Presently, there is no backup strategy for when task-related errors are committed. The lack of adequate feedback from the UI prevents users from noticing their errors. If the user forgets to change the channel configured on his clicker, his response fails to be captured, or a wrong response is sent, the only available feedback is the number of responses logged and displayed on the instructor's screen. However, the instructor usually does not announce attendance or the number of people present, so the response count is weak feedback. The user may only notice these errors when he views his updated scores on Edventure, when it is too late. In addition, when the user fails to submit his response within the given time frame, he is forced to accept a zero score for that question.

Description and analysis of sample tasks

Easy Task 1: You would like to select 'A': Yes for the question "Did you have McDonald's for lunch?" Submit your response 'A' using the clicker.

This task is the submission of a response to a quiz or survey question using the clicker. It is well practiced by our target users. The response is automatically sent via a wireless network to the TurningPoint software on the lecturer's computer, as long as it is provided within the time limit. Submission and connection are automated.

Easy Task 2: You pressed the 5/E button in order to respond to the question with the answer
‘none of the above’. However, there are only 4 options: A, B, C, D for this question. Change
your answer to option B instead.

Changing responses occurs frequently, as many users tend to select responses too quickly without reading the question thoroughly, and the adequate time frame provides opportunities to correct an erroneous submission. Submitting a new response automatically overwrites the previous one; users need not perform any additional task to do so.
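The "last response wins" behavior described above can be sketched as follows. This is our own illustrative model, not the actual TurningPoint implementation; the names (ResponseLog, record) are hypothetical.

```python
# Hypothetical sketch of "last response wins" recording.
# ResponseLog and its methods are our own names, not the TurningPoint API.

class ResponseLog:
    def __init__(self):
        self._responses = {}  # device_id -> latest response

    def record(self, device_id, response, polling_open=True):
        """Store a response; a newer submission overwrites the older one."""
        if not polling_open:
            return False  # submissions after polling closes are ignored
        self._responses[device_id] = response
        return True

    def final_response(self, device_id):
        return self._responses.get(device_id)

log = ResponseLog()
log.record("clicker-42", "5/E")   # hasty first pick
log.record("clicker-42", "2/B")   # corrected pick overwrites it
print(log.final_response("clicker-42"))  # -> 2/B
```

Because the dictionary keeps only one entry per device, no extra "delete" step is needed to change an answer, which matches the behavior users rely on.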

Moderate Task 1: Your lecturer is explaining some concepts on the current slide. You have
some enquiries about it. Use the clicker to signal that to your lecturer.

This is a task that our re-designed interface will support that was not supported in the original
interface. It will involve function buttons which are new to the user but the action sequence
required will be similar to that of answer submission. Submission and connection are also
automated. Thus we classify this task to be of moderate difficulty.

Moderate Task 2: Your lecturer posted a question to the class. You would like to participate by answering the question. Use your clicker to do so.

As with Moderate Task 1, this is a task that our re-designed interface will support but the original interface did not. It involves function buttons that are new to the user, but the required action sequence is similar to that of answer submission, and submission and connection are also automated. Thus we classify this task as moderately difficult.

Difficult Task 1: You have some questions about a concept on a slide which has passed. Use the clicker to signal this to your lecturer.

The original interface only supports response submission for the slide currently displayed. Users have no experience submitting responses for slides that have passed, as they were told this is not possible. This concept will be introduced by our new interface. In addition, the function buttons associated with posting enquiries are new to the user, and the task requires an action sequence longer than that of answer submission. Thus it can be classified as a difficult task.

Difficult Task 2: You have been using your clicker to ‘participate’ in class. Check your
participation score thus far on Edventure.

This task will be supported by our re-designed interface; it was not supported in the original interface. It is the only task which requires the use of a computer, a connection to the internet, and logging into Edventure. Although these are familiar to users, the option to view participation scores on Edventure is new. The need to perform multiple steps in an unfamiliar action sequence makes this task challenging for users.

Interface design

We will introduce a small screen to the current UI. The screen will resemble a calculator screen which can display text and numbers. It will be dark grey in colour and display text in white. Its low cost makes large-scale production feasible, so it can be provided at low or no cost (as the current UI is) to the whole NTU student population. It will display the following:

• The channel the device is connected to

• Which response was selected and whether it was accepted

• The slide number

• Errors in response selection, e.g. "response is unavailable" with regard to the current question on screen

Informing the user which channel the clicker is connected to (Fig. 1.1) will remind the user to change to the appropriate channel. This serves as a reminder and an error-prevention measure, especially if the user uses the clicker in many classes and has to change channels frequently.

Fig. 1.1

Informing the user which response is selected and whether it is accepted (Fig. 1.2) can improve the current user feedback immensely. It will reduce the fear associated with the ambiguity of the system state and eliminate the need to submit a response multiple times, thus improving the user's experience of the clicker.

Slide number information (Fig. 1.2) is essential, as the new interface allows the user to input responses for slides which have passed. This makes otherwise irreversible system functions reversible by extending the time constraint from ten seconds to the period of the whole lecture. However, this will be available for participation and enquiries only, not for answering quizzes, so as not to defeat the purpose of placing time constraints on a quiz. As this feature is novel, adequate feedback on the system state will aid the user in performing the associated tasks and lets the user know which slide the information is being sent to.

The current UI emits a blinking red and green light when it is activated for channel-code input, a blinking green light when polling for responses is open, and a yellow light when a response is sent. We believe changing the UI to emit a steady blue light when a response is sent is more appropriate, as blue is more distinct from green than yellow is (Fig. 1.2). A clear change of system state will thus be communicated to the user. Submitting a subsequent different response causes the system to stop emitting light; the user has to enter the response one more time for the system to emit a steady blue light. This double entry serves as feedback that the previous response has been removed and as a confirmation of the change of response. Blue is also appropriate as it (apart from green) is widely associated with openness and clarity and can be associated with positive acceptance (of a response). Relying on knowledge in the head instead of rote learning (associating yellow with acceptance) is not only aesthetically pleasing but also makes the UI more intuitive to the user.

Fig. 1.2

Providing error messages (Fig. 1.3) upon submission of an unavailable response serves as an error-prevention measure and as a reminder to the user to double-check the available responses and input the desired one.

We also suggest that the clicker emit a blinking red light (Fig. 1.3) when an erroneous selection is made. This is an error-prevention measure, as the current UI only provides feedback to the user after polling is closed: the TurningPoint software displays selections for an unavailable option together with valid options in a bar chart. Furthermore, detecting the slip requires the user to correctly associate the data on screen with his action, which is difficult because feedback is poor. Moreover, by the time the user receives this feedback, error correction is no longer possible as polling is closed. Our implementation allows errors to be corrected within the time constraints. A blinking light is suitable as a warning signal, as it captures the user's attention more easily than a steady light. In addition, red is appropriate as it relies on the user's knowledge in the head: the association of red with warnings and errors. It helps maintain simplicity of design and eliminates the need for instructions or labels.

Fig. 1.3
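The light-feedback rules described above (steady blue on an accepted response, light off when a different response replaces it until re-entered, blinking red for an unavailable option) form a small state machine. The sketch below is our own illustration of those rules, not actual clicker firmware; all names are assumptions.

```python
# Illustrative state machine for the proposed light feedback:
#  - steady blue: response accepted
#  - off: a different response was entered once and awaits re-entry
#  - blinking red: the selected option is unavailable for this question

class ClickerLights:
    def __init__(self, valid_options):
        self.valid = set(valid_options)
        self.accepted = None   # currently accepted response
        self.pending = None    # new response awaiting confirmation
        self.light = "off"

    def press(self, option):
        if option not in self.valid:
            self.light = "blinking red"   # error: option unavailable
        elif self.accepted is None or option == self.accepted:
            self.accepted = option
            self.light = "steady blue"    # response accepted
        elif option == self.pending:
            self.accepted = option        # second entry confirms the change
            self.pending = None
            self.light = "steady blue"
        else:
            self.pending = option         # first entry of a new response
            self.light = "off"            # light goes out until re-entry
        return self.light

lights = ClickerLights({"A", "B", "C", "D"})
print(lights.press("A"))  # steady blue
print(lights.press("B"))  # off (awaiting the confirming second press)
print(lights.press("B"))  # steady blue
print(lights.press("E"))  # blinking red
```

The double-entry path is what distinguishes changing a response from submitting one, giving the user unambiguous feedback that the old response was removed.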

In order to incorporate additional features into the current design without cluttering it with more function buttons, enquiry submission and participation will share the same [] button on the UI (Fig. 1.4). This refinement contains the minimal critical features required for user recognition without creating unnecessary cognitive load. The [] icon effectively represents participation and enquiry, as it matches the real world, where the user participates and asks questions in class by raising a hand. We will replace the current [?] button with [], as [?] is commonly associated with a "Help" function, making [] more appropriate.

Considering the possibility of mode errors when one button serves multiple functions, the UI is designed so that Participation and Enquiry are mutually exclusive. When the instructor posts a question, the user can press the [] button to answer it (Participation); enquiries are not needed then, as the instructor will further elaborate the concept through the question and answers. Conversely, when a slide is not accompanied by any instructor question, the user uses [] as an Enquiry button. This constraint on the system effectively prevents mode errors. Unlike the other buttons on the UI, this button is shaped like a hand and lights up when pressed, so pressing [] provides feedback to the user (Fig. 1.4). To further differentiate the light from the background of the UI, it will be a distinct green. Our colour choice also capitalizes on the user's knowledge in the head, because green is associated with positivity, such as traffic lights and gains (stock market). We hope the green light serves as a subtle cue encouraging the user to participate and ask questions in class. The [] button will be salient among the other buttons on the UI, as it is a pictorial icon rather than a letter or number; search time is reduced because the user can perform a parallel search instead of a serial search for this button. Our design aims to encourage two-way interaction: student participation through the relative ease of use of the [] button.

Fig. 1.4
The [] button serves as a mode button. When pressed, it changes the UI's default mode, which is answering questions, to either Participation or Enquiry mode, depending on whether the instructor selected Participation mode in the software. By default, all information from the clicker is sent to the current slide. Thus, to send an enquiry about a previous slide, the user should first press the [] button to change mode and then press the slide number. This information will also be reflected on the small screen, since we wish to provide adequate feedback while the feature is novel to users (Fig. 1.5). This action sequence matches the real world, where the user raises his hand before speaking in class, and is consistent with the existing action sequence, where the response numeral comes last because the final response is the only one recorded by TurningPoint.

Fig. 1.5
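The action sequence above (press [] to switch mode, optionally enter a slide number, with the response numeral last) can be sketched as a small interpreter. This is a sketch under our own naming assumptions ("[hand]" stands for the [] button; build_message is hypothetical), not the actual device protocol.

```python
# Sketch of the proposed button-press sequence:
#  - default mode answers the question on the current slide
#  - pressing [] switches to Participation or Enquiry mode
#    (depending on the instructor's setting in the software)
#  - in Enquiry mode, a digit selects a past slide; otherwise the
#    final keypress is the recorded response

def build_message(presses, current_slide, instructor_participation_mode):
    """Translate a sequence of button presses into the message sent."""
    mode = "answer"
    slide = current_slide
    response = None
    for key in presses:
        if key == "[hand]":                      # the [] mode button
            mode = ("participation" if instructor_participation_mode
                    else "enquiry")
        elif key.isdigit() and mode == "enquiry":
            slide = int(key)                     # enquiry may target a past slide
        else:
            response = key                       # response numeral comes last
    return {"mode": mode, "slide": slide, "response": response}

# Enquiry about slide 5 while the class is on slide 8:
print(build_message(["[hand]", "5"], current_slide=8,
                    instructor_participation_mode=False))
```

Keeping the mode switch first and the numeral last mirrors the existing answer-submission sequence, which is why we expect the new tasks to feel consistent to users.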

The time constraints in the current UI, albeit effective and appropriate for quizzes, discourage student participation and the submission of enquiries, especially the latter. We will adjust the TurningPoint software to keep the networks to all slides open concurrently, allowing responses to be submitted after a slide has passed. Removing the time constraint better matches the system to the real world, where there is no time limit on when students may pose questions to the instructor. In the real world, the user also needs time to digest the lecture content and decide that further explanation is needed, and we do not want to deprive the user of the opportunity to clarify lecture content.

As the ‘clicker’ is not a standalone device, TurningPoint should complement the functions and feedback on the clicker for them to be effective. We suggest a natural mapping of the function-button layout to that of the screen. For instance, the recorded numbers of Participation and Enquiry responses should be displayed in the bottom right corner, as on the UI. This allows the user to immediately find the data and associate it with Participation/Enquiry. Displaying the number of participation and enquiry responses to student users is important: it provides feedback that a response was recorded, and it capitalizes on the bandwagon effect, the tendency to perform a behavior when many other people do so. Providing feedback that many people have enquiries or wish to participate in class will thus boost enquiry submission and participation, serving the goal of the system.

Fig. 1.6

We will also allow participation scores (clicker "") to be displayed under "View Grades" in Edventure (Fig. 1.7). This is based on the Gestalt law of similarity. At present, quiz scores are uploaded and displayed under "View Grades". As participation assessment also contributes to course grades, the user will view participation and quizzes as a group, automatically searching for participation scores in the vicinity of quiz scores. Displaying both items under the same hyperlink and on the same page will ease the user's search for participation scores.

Fig. 1.7

Scenarios and Storyboards

Easy Task: (Easy Task 1)


Your professor decides that there is an HP308 quiz for the day. There are a total of eight multiple-choice questions, each consisting of options 1 to 5. Each question is flashed on the screen for ten seconds and must be answered within that time. The first question is, 'Frederick Taylor is a psychologist who introduced Scientific Management during which period of time?' The answer could be one of five options: 1-5. You answer the question by selecting the response you deem most appropriate, pressing one of the buttons 1/A, 2/B, 3/C, 4/D or 5/E on the Clicker. You believe 1 is the correct answer. Answer this question using the clicker.

Storyboard (turning on and setting the channel): the clicker flashes green and red; the user keys in the channel code, then presses and releases.

Moderate Task: (Moderate Task 2)

The lecture is now in progress. As usual, your professor likes to ask the entire lecture of forty students a few questions to ensure that all of you understand the concept. She is currently going through a slide on how engineering psychology is a hybrid discipline. The professor poses this question verbally: "Why is it that engineers and psychologists have to work together?", and expects you to use your Clicker to signal that you want to attempt an answer. To try to be chosen to answer the question, press the [] button on the Clicker, so that the professor knows you wish to answer and you may have the chance to do so.

Difficult Task: (Difficult Task 1)

In the progression of the HP308 lecture, you begin to grapple with the concepts of 'efficiency rules for manual labour' in slide 5. Before you can fully comprehend the concepts and why the steps in this concept (slide 5) are performed, your professor has moved on to the next slide to discuss the following concept of 'person-machine systems'. You feel the need to clarify your queries on slide 5. You can do so by pressing the [] button and entering the slide number, in this case 5, to let the professor know that there are queries on this slide.

Appendix A

Interview Questions

1. Do you have a clicker?


2. How many classes require you to use your clicker? What are those classes?
3. How many times do you use your clicker a week?
4. What tasks do you perform with your clicker?
5. How did you learn to do them?
6. What are the time constraints when you use your clicker? For instance, do you have
to ‘click’ within a certain period of time? What is the time period?
7. What are some problems you experience with your clicker?
8. If you change your answer, does the clicker provide any feedback? i.e. are there any
visible changes in the clicker?
9. What do you do when problems happen?
10. Do you think adding a small screen on the clicker to display which answer was
chosen and whether it is accepted would improve the current design?
11. If the clicker can be redesigned to include more functions, what would you like them
to be?
Given that we NTU students are usually shy about asking questions or providing our
answers in class,
12. Would you like to be able to ask and answer your lecturer’s questions using the
clicker? For instance, signal your lecturer using the clicker when you want to
participate or when you have a question about a certain slide in the lecture.
Given that the current system for assessing class participation is subjective,
13. Would you like to have the clicker record and quantify the number of times you
participated in class and allow you to view the tallied numbers via Edventure?
