
UbiSysTest: An Ubiquitous System for Educational Assessment in m-Learning Environments


Dejailson Nascimento Pinheiro¹, Rafael Fernandes Lopes¹,², Omar Andrés Carmona Cortes¹, Marcelo Portela Sousa²

¹Departamento Acadêmico de Informática
Instituto Federal de Educação, Ciência e Tecnologia do Maranhão (IFMA)
São Luís, MA

²Instituto de Estudos Avançados em Comunicações (IECOM)
Campina Grande, PB

dejailson.pinheiro@gmail.com, rafaelf@ifma.edu.br, omar@ifma.edu.br, marporsou@iecom.org.br

Abstract. Nowadays, e-learning technologies can be combined with mobile computing in order to create a new educational paradigm called m-learning or mobile learning. This new paradigm improves the cooperation between students and teachers, overcoming the classroom borders. This paper proposes a ubiquitous testing system (UbiSysTest) for m-Learning environments. The goal of UbiSysTest is to provide an infrastructure for creating, storing, applying and correcting academic tests. Through UbiSysTest, students can download and take these tests on their mobile phones. The application runs off-line and connects to the Internet only when required.

1. Introduction

Distance education appeared at the end of the 19th century at the University of Chicago, offering educational opportunities to those who were not among the elite and who could not afford full-time residence at an educational institution. This educational approach was supported by correspondence (mail) until the development of computer networks and the expansion of the Internet. Computer networks and the Internet provided a new structure for accessing knowledge, transforming traditional distance education into e-learning. Formally, e-learning is an Internet-based learning system that involves training, information and communication in real time.
Nowadays, e-learning can be combined with mobile computing, creating the m-learning or mobile learning paradigm and improving the cooperation between students and teachers. In other words, both instructors and learners are given a wide variety of options for delivering content, assessing students, accessing course materials, etc. [Balasundaram and Ramadoss 2007]. M-learning demands the use of many technologies, such as wireless networks, XML, Java, WAP, voice and electronic mail, SMS and MMS messaging and, soon, voice and video on demand, overcoming the classroom borders [Pelissoli and Loyolla 2004].
In this context, this paper proposes a ubiquitous system for educational assessment in m-Learning environments, named UbiSysTest [Lopes and Cortes 2007]. The goal of UbiSysTest is to provide an infrastructure for creating, storing, applying and correcting academic tests. Through UbiSysTest, students can download and take these tests on portable devices, anytime and anywhere, through the Internet. It is important to note that the system accesses the Internet only when required (i.e., to download a test and to send the responses at the end of the test).
The main purpose of this paper is to describe the architecture and development of UbiSysTest, as well as its application in actual classes. The paper is organized as follows: Section 2 presents the ubiquitous testing system; Section 3 describes the use of UbiSysTest; Section 4 discusses related work; finally, Section 5 presents the conclusions and future work.

2. Ubiquitous Testing System

Learning evaluation is an essential aspect to be considered in courses and educational curricula. Evaluation is a part of any instructional design model, since it can be used to gather information about the impact or effectiveness of the presented educational content. Learning performance evaluation can be used as a tool for measuring whether or not the course objectives were achieved, being useful for diagnostic and formative purposes. In this context, an efficient and flexible learning evaluation mechanism is a required feature of computational learning platforms.
We have developed a ubiquitous testing system for evaluating students' learning performance, called UbiSysTest. This system allows students to measure their knowledge of a subject using mobile phones. In addition, UbiSysTest allows instructors to generate, through a web interface, multiple-choice questions for students. These questions can be viewed and answered on the students' mobile phones running UbiSysTest. To reduce connection costs, the system accesses the Internet (through a GPRS/EDGE or UMTS connection) only when required (i.e., to download a test and to send the responses at the end of the test).
The UbiSysTest architecture is composed of four subsystems: (a) the Test Management Subsystem (TMA), (b) the Test Execution Subsystem (TEX), (c) the Test Development Subsystem (TDE), and (d) the Test Evaluation Subsystem (TEV). The whole testing process is carried out through the cooperation of these subsystems. Figure 1 depicts an overview of this architecture.
The Test Management Subsystem (TMA) is a Java application responsible for storing all data related to tests, such as questions, students' answers and test results. The other subsystems manipulate the stored information through its access interfaces (e.g., mobile clients access TMA in order to download questions and send answers for evaluation). All data are stored in a PostgreSQL relational database and converted to an XML format before being sent back to clients.
The Test Execution Subsystem (TEX) is a JME application, running on mobile devices, responsible for executing tests. Through this application, registered mobile users (i.e., students) can download the XML test files available on TMA, answer questions, submit answers back to TMA and view the teacher's test evaluations. XML parsing is performed using the kXML library [kXML 2010].
Figure 1. UbiSysTest architecture overview

The Test Development Subsystem (TDE) provides maintenance functions that teachers can use to create, modify or delete the tests (and their questions) stored in the database. TDE was implemented as a user-friendly web application written with JSP/Servlet technology and running on the Glassfish application server. Finally, the Test Evaluation Subsystem (TEV) is a web application that teachers can use to evaluate answers and measure students' performance. It also interacts with the database in order to get students' answers and store evaluation results. TEV and TDE can be accessed through the same web interface.
The following subsections describe each UbiSysTest subsystem in more detail.

2.1. Test Management Subsystem (TMA)

TMA provides the infrastructure that makes it possible for a student to take tests through an enabled mobile phone. It works as a warehouse for the other subsystems, providing a set of access interfaces for retrieving and storing tests, students' answers and evaluation results. These interfaces can be accessed by the other subsystems using socket connections and a specific protocol.

TMA was implemented using Java technology. It acts as a high-level interface between the other subsystems and the database. All data exchanged between TMA and the other subsystems are in XML-based formats.
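To make the role of these access interfaces more concrete, the listing below gives a minimal sketch of a socket-based interface in Java. The command name (GET_TEST), the port number and the XML fields are illustrative assumptions and do not reproduce the actual UbiSysTest protocol.

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.io.OutputStreamWriter;
import java.io.Writer;
import java.net.ServerSocket;
import java.net.Socket;

// Minimal sketch of a TMA-style access interface: a socket server that reads a
// one-line command and replies with an XML payload. The command name (GET_TEST)
// and the XML fields are illustrative; the actual UbiSysTest protocol may differ.
public class TmaAccessInterfaceSketch {
    public static void main(String[] args) throws IOException {
        ServerSocket server = new ServerSocket(9000);
        while (true) {
            Socket client = server.accept();
            BufferedReader in = new BufferedReader(
                    new InputStreamReader(client.getInputStream(), "UTF-8"));
            Writer out = new OutputStreamWriter(client.getOutputStream(), "UTF-8");
            String request = in.readLine(); // e.g. "GET_TEST 42"
            if (request != null && request.startsWith("GET_TEST")) {
                // In the real subsystem the test would be read from PostgreSQL
                // and converted to XML before being sent back to the client.
                out.write("<test id=\"42\"><question id=\"1\">...</question></test>\n");
            } else {
                out.write("<error>unknown command</error>\n");
            }
            out.flush();
            client.close();
        }
    }
}

In the real subsystem, TMA answers several such commands (downloading tests, receiving answers, returning evaluations), always exchanging XML payloads with the other subsystems.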

2.2. Test Execution Subsystem (TEX)

TEX is the subsystem responsible for executing tests according to the parameters retrieved from TMA. These parameters are specified by the teacher when he/she develops the test using the TDE web interface. This subsystem was implemented as a JME application running on mobile devices. This technology was chosen due to its great popularity.

Through TEX, registered mobile users (i.e., students) can download the XML test files from TMA, answer their questions (off-line), submit the answers back to TMA (these answers are generated in an XML-based format, not shown due to space restrictions) and view the teacher's test evaluations stored on TMA. XML parsing is performed using the kXML library [kXML 2010]. After parsing the XML test file, TEX saves the downloaded test data to a native format using the JME record management system.
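The listing below sketches, under assumptions, how a JME client such as TEX could parse a downloaded test with kXML and keep the raw XML for off-line use in the record management system. The element names (test, question, statement) and the record store name are illustrative; the actual test format is the one described in Section 2.3.

import java.io.ByteArrayInputStream;
import java.io.InputStreamReader;
import java.util.Vector;

import javax.microedition.rms.RecordStore;

import org.kxml2.io.KXmlParser;
import org.xmlpull.v1.XmlPullParser;

// Sketch of the TEX parsing/persistence step: read question statements from a
// downloaded test file with kXML and keep the raw XML in the RMS record store.
// The XML layout (<test>/<question>/<statement>) is assumed for illustration.
public class TestParserSketch {

    // Extracts the statement of every question found in the XML test file.
    public Vector parseStatements(String xml) throws Exception {
        Vector statements = new Vector();
        KXmlParser parser = new KXmlParser();
        parser.setInput(new InputStreamReader(
                new ByteArrayInputStream(xml.getBytes("UTF-8"))));
        int event = parser.getEventType();
        while (event != XmlPullParser.END_DOCUMENT) {
            if (event == XmlPullParser.START_TAG
                    && "statement".equals(parser.getName())) {
                statements.addElement(parser.nextText());
            }
            event = parser.next();
        }
        return statements;
    }

    // Stores the downloaded XML so the test can be answered off-line.
    public void saveOffline(String xml) throws Exception {
        RecordStore rs = RecordStore.openRecordStore("ubisystest-tests", true);
        byte[] data = xml.getBytes("UTF-8");
        rs.addRecord(data, 0, data.length);
        rs.closeRecordStore();
    }
}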
Figure 2(a) presents the TEX main menu. Through this menu, students can choose among the following options: (a) change the course settings (in order to specify the course in which the student is participating), (b) download tests (from a list of tests available in the defined course), (c) answer tests, (d) submit concluded tests and (e) view the test evaluations done by the teacher. A multiple-choice question is presented in Figure 2(b). The presented question was generated using the XML shown in Figure 4 of Section 2.3.

(a) TEX main menu (b) A test being executed in TEX

Figure 2. TEX screenshots

2.3. Test Development Subsystem (TDE)

TDE allows teachers to create, modify or delete tests stored in TMA. Each test is associated with a course, identified by a unique course number, which must be announced by the teacher. Students can download any test to their enabled devices using this unique identifier.

Teachers can create new tests and add questions in a flexible way using a web-based interface. TDE allows generating multiple-choice questions, as shown in Figure 3, which will be accessible to students on their mobile phones.
The TDE subsystem was implemented using the MVC (Model-View-Controller) architectural pattern [Krasner and Pope 1988]. The view layer is represented by JSP files (i.e., the user interface) and the controller layer is implemented using Java Servlets (i.e., processing and responding to events, typically user actions). These two layers are responsible for handling user interactions with the subsystem, which can result in changes to the model (i.e., the domain-specific representation of the information on which the application operates).
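As an illustration of this MVC organization, the listing below sketches a controller servlet that could handle the insertion of a question. The servlet class, parameter names and JSP page are hypothetical and are not taken from the actual TDE implementation.

import java.io.IOException;

import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

// Sketch of a TDE-style controller following the MVC split described above: the
// servlet (controller) reads the fields posted by the JSP view, would update the
// model, and forwards back to a view. Parameter and JSP names are hypothetical.
public class AddQuestionServlet extends HttpServlet {

    protected void doPost(HttpServletRequest request, HttpServletResponse response)
            throws ServletException, IOException {
        String testId = request.getParameter("testId");
        String statement = request.getParameter("statement");
        String correctOption = request.getParameter("correctOption");

        // Model update (persisting the new question) would happen here in a real system.
        request.setAttribute("message",
                "Question \"" + statement + "\" added to test " + testId
                + " (correct option: " + correctOption + ")");

        // Forward to the JSP view that renders the updated test.
        request.getRequestDispatcher("/editTest.jsp").forward(request, response);
    }
}

Keeping the controller thin in this way leaves the persistence and XML conversion to the model layer, which is the part of TDE that interacts with TMA.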
An example of an XML file that represents a test is shown in Figure 4. The structure of the XML file was designed to allow the incorporation of new types of questions into TDE. Thereafter, TDE converts all test data into this XML-based format so that it can be stored in TMA. Figure 4 is self-explanatory and does not require further detailed discussion.

Figure 3. Teacher inserting a multiple-choice question in a test using TDE

Figure 4. An example of a test XML file

2.4. Test Evaluation Subsystem (TEV)

TEV provides a web interface that allows the teacher to evaluate students' tests. All evaluation results must be released by the teacher using this subsystem, allowing students to view the test results on their mobile phones. Whereas all textual questions must be manually evaluated by the teacher, selection questions are automatically evaluated by TEV when the teacher releases the test. TEV will also provide suitable measurement reports for determining whether, and to what extent, a test criterion is effectively satisfied.

Java web technology (i.e., JSP/Servlets) is used to implement the TEV infrastructure. When a teacher needs to evaluate a test, TEV interacts with TMA to download the two components required for this task: (a) the XML file that represents the test (this XML-based format was presented in Section 2.3) and (b) the students' answers, which were generated by the TEX submission process. At the end of the evaluation process, TEV sends back the evaluation results, also represented in an XML format.
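The listing below sketches the automatic evaluation of selection questions described above. The data structures (maps from question identifiers to options) are assumptions made for illustration; in UbiSysTest this information is exchanged in XML and stored in the database.

import java.util.HashMap;
import java.util.Map;

// Sketch of the automatic evaluation step for selection (multiple-choice)
// questions: each submitted option is compared with the expected option taken
// from the test definition. The data structures are assumptions made for
// illustration.
public class SelectionQuestionEvaluatorSketch {

    // answerKey: question id -> correct option; submitted: question id -> chosen option.
    public static Map<Integer, Boolean> evaluate(Map<Integer, String> answerKey,
                                                 Map<Integer, String> submitted) {
        Map<Integer, Boolean> results = new HashMap<Integer, Boolean>();
        for (Map.Entry<Integer, String> entry : answerKey.entrySet()) {
            String chosen = submitted.get(entry.getKey());
            // An unanswered question (null) is evaluated as incorrect.
            results.put(entry.getKey(), entry.getValue().equals(chosen));
        }
        return results;
    }
}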

3. Using UbiSysTest

Java is the main technology employed in the development of UbiSysTest. However, each of its subsystems was implemented using a specific platform. The Test Management Subsystem (TMA) requires the Java Standard Edition (JSE) platform, since it executes as a standalone application. It uses the Java Persistence API (JPA) to interact with the PostgreSQL database. The Test Execution Subsystem (TEX) was developed with the Java Micro Edition (JME) platform, and executes on JME-enabled mobile phones (with CLDC 1.1 / MIDP 2.1 support). TEX also demands an Internet connection to communicate with TMA. The remaining two subsystems – the Test Development Subsystem (TDE) and the Test Evaluation Subsystem (TEV) – were developed with web technologies. TDE and TEV are Java Enterprise Edition (JEE) applications, implemented as a single integrated Java web system. These last two subsystems run on a Glassfish application server, using the services provided by this platform.
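As an illustration of the JPA-based persistence mentioned above, the listing below sketches a possible entity mapping for a test. The entity name and fields are assumptions and do not necessarily correspond to the actual UbiSysTest schema.

import javax.persistence.Entity;
import javax.persistence.GeneratedValue;
import javax.persistence.Id;

// Sketch of how TMA could map a test onto the PostgreSQL database through JPA.
// The entity name and fields are illustrative assumptions.
@Entity
public class Test {

    @Id
    @GeneratedValue
    private Long id;

    private String title;          // test title shown to the students
    private String courseNumber;   // unique course identifier announced by the teacher

    public Long getId() { return id; }
    public String getTitle() { return title; }
    public void setTitle(String title) { this.title = title; }
    public String getCourseNumber() { return courseNumber; }
    public void setCourseNumber(String courseNumber) { this.courseNumber = courseNumber; }
}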
Each of the UbiSysTest subsystems has to be deployed on one of two different network nodes: a server or a mobile station¹. While TMA, TDE and TEV must be deployed on the server side, TEX must be deployed on the mobile station (or emulator) side. The server and the mobile station should be able to exchange data through some network.
Interactions between the server and the mobile station are triggered by two actors: (a) the teacher and (b) the students. Both actors have access to the functionalities provided by UbiSysTest through their respective interfaces (web and mobile). The following steps can be performed in order to demonstrate the use of UbiSysTest's functionalities by students and teachers:
1. Through TDE, the teacher should register his/her courses, tests and questions. Figure 3 illustrates the screen used by the teacher to register a multiple-choice question. The registered course number should be informed to the students;
2. Using TEX, the students can create a new course setting, and then inform the course name, the course number (indicated by the teacher) and the hostname or the IP address of the server. Thereafter, the student can download the test and, finally, answer it. The main screen of TEX with an example question was presented in Figure 2;
3. After answering the test (with TEX), the student can submit it back to the server. TEX then shows a screen asking for the student's name and registration code. This information is used to identify the student on the server;
4. The teacher can access and evaluate the responses sent by the students using the TEV web interface. He/she should inform whether each response is correct or not, justify his/her choice and insert appropriate comments. Afterwards, the evaluations completed by the teacher become available to the students;
5. Students can check, through TEX, the evaluations of their answers.
¹The mobile station can be emulated using some tool, such as the Sun Java Wireless Toolkit for CLDC, available at http://java.sun.com/products/sjwtoolkit/.

UbiSysTest, as well as detailed instructions for its usage, installation and deployment, can be found at http://sites.google.com/site/ubisystest².

²Available only in Portuguese.

4. Related Work

The Internet has been considered an educational tool accessible from many computers around the world. Several types of educational content can be delivered in different formats. However, the development of high-speed networks, wireless communication and mobile devices definitively changed this scenario with the advent of m-learning systems. In this context, new forms of cooperation and content delivery are essential. Some works that address cooperation and educational content are [Lahner and Nosekabel 2002], [Divitini et al. 2002] and [Peiper et al. 2004].
A complete m-learning system is not composed of content delivery alone; a test management system is also required. A wireless student testing system for WAP devices was designed and implemented at Columbus State University in Zanev's work [Zanev 2004]. The main problem of this approach is that students must remain connected while the test is answered. The same author developed a test management system for multiple-choice questions in [Zanev and Clark 2005]. Balasundaram's work focuses on using SMS for answering short-answer questions and evaluating them through a simple matching process, providing sufficient feedback [Balasundaram and Ramadoss 2007]. Benavent developed an XML-based test system running on Windows Mobile-based smartphones which allows students to take different stored tests [Benavent et al. 2006]. This mobile application runs off-line and connects to the Internet only when required.
Whereas the presented works address only one kind of question in their test management systems, the UbiSysTest management system deals with different types of questions, such as multiple-choice, true-false, paragraph, short answer, matching, multiple response and fill-in-the-blanks questions, giving a considerable contribution to the area. Further, UbiSysTest does not force the student to be connected while the test is answered. Furthermore, Benavent's work runs only on Windows systems, while UbiSysTest is supported on several mobile devices, since it is based on JME.

5. Conclusions and Future Work

This paper presented the design and architecture of a ubiquitous testing system (UbiSysTest) for m-Learning environments. The architecture of UbiSysTest is composed of four subsystems: TMA, TEX, TDE and TEV. We developed UbiSysTest using Java technologies. Through UbiSysTest, students can download and take tests on their mobile phones. The application runs off-line and connects to the Internet only when required.
Web services were not considered in our implementation because few mobile devices support the JME Web Services Specification (JSR 172). Some aspects, such as security and authentication, will be treated in future work using the JME Security and Trust Services API (SATSA – JSR 177). A version of the TEX subsystem will be implemented using the SuperWaba virtual machine [SuperWaba 2010] in order to improve the usability of this application running on PDAs.

In the future, we intend to use artificial intelligence (AI) to provide self-assessment for textual questions. Moreover, we aim to use AI to create groups of students based on their profiles and to evaluate student progress.

References
Balasundaram, R. and Ramadoss, B. (2007). SMS for Question-Answering in the m-
Learning Scenario. Journal of Computer Science, 3(2):119–121.
Benavent, A. P., Bonastre, O. M., and Girona, M. M. (2006). Development of a
XML-based Ubiquitous System for Testing Using Smartphones. In Fourth IEEE In-
ternational Workshop on Wireless, Mobile and Ubiquitous Technology in Education
(WMTE’06), pages 47–49.
Divitini, M., Kristen, H., and Per-Arne, N. (2002). Improving Communication Through
Mobile Technologies: Which Possibilities. In Proceedings of the IEEE International
Workshop on Wireless and Mobile Technologies in Education, Vaxjo, Sweden.
Krasner, G. E. and Pope, S. T. (1988). A cookbook for using the Model-View-Controller user interface paradigm in Smalltalk-80. Journal of Object-Oriented Programming, 1(3):26–49.
kXML (2010). kXML website. http://www.kxml.org/.
Lahner, F. and Nosekabel, H. (2002). The Role of Mobile Devices in E-Learning: First Experiences with a Wireless E-Learning Environment. In Proceedings of the IEEE International Workshop on Wireless and Mobile Technologies in Education, Sweden.
Lopes, R. F. and Cortes, O. A. C. (2007). An ubiquitous testing system for m-learning
environments. In Proceedings of the Second International Conference on Systems and
Networks Communications (ICSNC ’07). IEEE Computer Society.
Peiper, C., Chan, E., Campbell, R., Bresler, J., and Al-Muhtadi, J. (2004). Expanding education through active space collaboration. In Second IEEE Annual Conference on Pervasive Computing and Communications Workshops, pages 236–240.
Pelissoli, L. and Loyolla, W. (2004). Aprendizado Móvel (M-Learning): Dispositivos e Cenários. In 11º Congresso Internacional de Educação a Distância. ABED - Associação Brasileira de Educação a Distância. In Portuguese.
SuperWaba (2010). SuperWaba website. http://www.superwaba.com.br/.
Zanev, V. (2004). Wireless Student Testing. In Proceedings of the International Confer-
ence on Pervasive Computing and Communications, Las Vegas, Nevada.
Zanev, V. and Clark, R. (2005). Wireless course management system. In ACM-SE 43:
Proceedings of the 43rd annual Southeast regional conference, pages 118–123, New
York, NY, USA. ACM Press.
