omar@ifma.edu.br, marporsou@iecom.org.br
1. Introduction
Distance education appeared at the end of the 19th century at the University of
Chicago, offering educational opportunities to those who were not among the elite and
who could not afford full-time residence at an educational institution. This educational
approach was supported by correspondence (mail) until the development of computer
networks and the expansion of the Internet. Computer networks and the Internet provided
a new structure for accessing knowledge, transforming traditional distance education
into e-learning. Formally, e-learning is an Internet-based learning system that involves
training, information, and communication in real time.
Nowadays, e-learning can be combined with mobile computing, creating the m-learning
(mobile learning) paradigm and improving the cooperation between students and
teachers. In other words, both instructors and learners have a wide variety of options
for delivering content, assessing students, accessing course materials,
etc. [Balasundaram and Ramadoss 2007]. M-learning demands the use of many technologies,
such as wireless networks, XML, Java, WAP, voice and electronic mail, SMS and
MMS messaging and, soon, voice and video on demand, overcoming the borders of the
classroom [Pelissoli and Loyolla 2004].
In this context, this paper proposes a ubiquitous system for educational assessment
in m-learning environments, named UbiSysTest [Lopes and Cortes 2007]. The goal
of UbiSysTest is to provide an infrastructure for creating, storing, applying, and
correcting academic tests. Through UbiSysTest, students can download and execute these
tests on portable devices, anytime, anywhere, through the Internet. It is important to
note that the system accesses the Internet only when required (i.e., when downloading a
test and when sending the responses at the end of the test).
The main purpose of this paper is to describe the architecture and development
of UbiSysTest, as well as its application in actual classes. The paper is organized
as follows: Section 2 presents the ubiquitous testing system; Section 3 describes the use
of UbiSysTest; Section 4 discusses related work; finally, Section 5 presents the
conclusions and future work.
database. TDE was implemented as a user-friendly web application written with JSP/Servlets
technology running on the Glassfish application server. Finally, the Test Evaluation
Subsystem (TEV) is a web application that teachers can use to evaluate answers and
measure students' performance. It also interacts with the database in order to retrieve
students' answers and store evaluation results. TEV and TDE can be accessed through the
same web interface.
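As an illustration, the data TEV records for each evaluated answer can be modeled as a simple verdict plus feedback. The sketch below is hypothetical: the class and field names are illustrative, not the actual UbiSysTest schema.

```java
import java.util.List;

// Hypothetical model of a teacher's evaluation of one answer in TEV.
// Class and field names are illustrative only.
class Evaluation {
    private final boolean correct;       // teacher's verdict
    private final String justification;  // why the answer is right or wrong
    private final String comment;        // optional feedback for the student

    Evaluation(boolean correct, String justification, String comment) {
        this.correct = correct;
        this.justification = justification;
        this.comment = comment;
    }

    boolean isCorrect() { return correct; }

    // Aggregate a student's performance as the fraction of correct answers.
    static double score(List<Evaluation> evaluations) {
        if (evaluations.isEmpty()) return 0.0;
        long correctCount = evaluations.stream()
                .filter(Evaluation::isCorrect).count();
        return (double) correctCount / evaluations.size();
    }
}
```

Keeping the justification and comment alongside the verdict is what lets TEX later show students not only their score but also the teacher's feedback.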
The following subsections describe each UbiSysTest subsystem in more detail.
3. Using UbiSysTest
Java is the main technology employed in the development of UbiSysTest. However,
each of its subsystems was implemented on a specific platform. The Test Management
Subsystem (TMA) requires the Java Standard Edition (JSE) platform, since it executes
as a standalone application. It uses the Java Persistence API (JPA) to interact with
a PostgreSQL database. The Test Execution Subsystem (TEX) was developed on the Java Micro
Edition (JME) platform, and executes on JME-enabled mobile phones (with CLDC 1.1 /
MIDP 2.1 support). TEX also requires an Internet connection to communicate with
TMA. The remaining two subsystems – the Test Development Subsystem (TDE) and the Test
Evaluation Subsystem (TEV) – were developed with web technologies. TDE and TEV are Java
Enterprise Edition (JEE) applications, implemented as a single integrated Java web
system. These two subsystems run on the Glassfish application server, using the services
provided by that platform.
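Although the paper does not detail the wire format exchanged between TMA and TEX, a downloaded test could plausibly be serialized as XML. The sketch below assumes a hypothetical document layout (the `<test>` and `<question>` element names and the `statement` attribute are invented for illustration) and parses it with the standard Java DOM API; on a real JME client a lightweight parser would be used instead.

```java
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;

// Parses a hypothetical XML serialization of a test.
// Element and attribute names are illustrative, not the real UbiSysTest format.
class TestParser {
    // Returns the statement text of every <question> element in the document.
    static List<String> questionStatements(String xml) {
        try {
            Document doc = DocumentBuilderFactory.newInstance()
                    .newDocumentBuilder()
                    .parse(new ByteArrayInputStream(
                            xml.getBytes(StandardCharsets.UTF_8)));
            NodeList nodes = doc.getElementsByTagName("question");
            List<String> statements = new ArrayList<>();
            for (int i = 0; i < nodes.getLength(); i++) {
                statements.add(((Element) nodes.item(i)).getAttribute("statement"));
            }
            return statements;
        } catch (Exception e) {
            throw new RuntimeException("malformed test document", e);
        }
    }
}
```

An XML representation like this keeps the test self-contained, which matches the off-line design: TEX can download the whole document once and render the questions without further network access.";
List<String> s = TestParser.questionStatements(xml);
if (s.size() != 2) throw new AssertionError();
if (!s.get(0).equals("What is 2+2?")) throw new AssertionError();
if (!s.get(1).equals("Java is a language.")) throw new AssertionError();
</test>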
Each of UbiSysTest's subsystems has to be deployed on one of two
different network nodes: a server or a mobile station¹. While TMA, TDE, and TEV must
be deployed on the server side, TEX must be deployed on the mobile station (or emulator)
side. The server and the mobile station must be able to exchange data through some
network.
Interactions between the server and the mobile station are triggered by two actors:
(a) the teacher and (b) the students. Both actors access the functionalities provided by
UbiSysTest through their respective interfaces (web and mobile). The following
steps demonstrate the use of UbiSysTest's functionalities by
students and teachers:
1. Through TDE, the teacher registers his/her courses, tests, and questions.
Figure 3 illustrates the screen used by the teacher to register a multiple-choice
question. The registered course number should be given to the students;
2. Using TEX, the students create a new course setting, informing the
course name, the course number (indicated by the teacher), and the hostname or
IP address of the server. Thereafter, the student can download the test and,
finally, answer it. The main screen of TEX, with an example question, is
shown in Figure 2;
3. After answering the test (with TEX), the student can submit it back to the server.
TEX then shows a screen asking for the student's name and registration code. This
information is used to identify the student on the server;
4. The teacher can access and evaluate the responses sent by the students through the
TEV web interface. He/she indicates whether each response is correct or not, justifies
the decision, and inserts appropriate comments. Afterwards, the evaluations
completed by the teacher become available to the students;
5. Students can then check, through TEX, the evaluations of their answers.
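To make the client-server exchange in steps 2 and 3 concrete, the sketch below shows how TEX could assemble its server requests from the data the student enters (hostname or IP address, course number, registration code). The endpoint paths and parameter names are hypothetical, invented for illustration.

```java
// Hypothetical request URLs TEX might build from student-supplied data.
// The "/ubisystest/..." paths and parameter names are illustrative only.
class TexRequests {
    // Step 2: download the test for a course registered by the teacher.
    static String downloadUrl(String host, int courseNumber) {
        return "http://" + host + "/ubisystest/download?course=" + courseNumber;
    }

    // Step 3: submit the answers together with the student's identification.
    static String submitUrl(String host, int courseNumber, String registration) {
        return "http://" + host + "/ubisystest/submit?course=" + courseNumber
                + "&registration=" + registration;
    }
}
```

Because both interactions are single request/response exchanges, the student only needs connectivity at these two moments; answering the test itself happens entirely off-line on the device.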
¹The mobile station can be emulated using a tool such as the Sun Java Wireless Toolkit for CLDC, available at http://java.sun.com/products/sjwtoolkit/.
UbiSysTest, as well as detailed instructions for its usage, installation, and
deployment, can be found at: http://sites.google.com/site/ubisystest².
4. Related Work
The Internet has long been considered an educational tool, accessible from computers
around the world, and several types of educational content can be delivered in different
formats. However, the development of high-speed networks, wireless communication, and
mobile devices definitively changed this scenario with the advent of m-learning systems.
In this context, new forms of cooperation and content delivery are essential. Works
that address cooperation and educational content include [Lahner and Nosekabel 2002],
[Divitini et al. 2002], and [Peiper et al. 2004].
A complete m-learning system is not composed only of content delivery; a
test management system is also required. A wireless student testing system for
WAP devices was designed and implemented at Columbus State University in
Zanev's work [Zanev 2004]. The main problem of this approach is that students
must remain connected while the test is answered. The same author developed a test
management system for multiple-choice questions in [Zanev and Clark 2005].
Balasundaram's work focuses on using SMS for answering short-answer
questions and evaluating them with a simple matching process, providing adequate
feedback [Balasundaram and Ramadoss 2007]. Benavent developed an XML-based test system
running on Windows Mobile-based smartphones that allows students to take different
stored tests [Benavent et al. 2006]. This mobile application runs off-line and
connects to the Internet only when required.
Whereas the works presented above address only one kind of question in their test
management systems, the UbiSysTest management system deals with different types of
questions – multiple-choice, true-false, paragraph, short answer, matching, multiple
response, and fill-in-the-blanks – making a considerable contribution to the area.
Further, UbiSysTest does not force the student to stay connected while the test is
answered. Furthermore, Benavent's work runs only on Windows-based systems, while
UbiSysTest is supported on several mobile devices, since it is based on JME.
References
Balasundaram, R. and Ramadoss, B. (2007). SMS for Question-Answering in the m-
Learning Scenario. Journal of Computer Science, 3(2):119–121.
Benavent, A. P., Bonastre, O. M., and Girona, M. M. (2006). Development of a
XML-based Ubiquitous System for Testing Using Smartphones. In Fourth IEEE In-
ternational Workshop on Wireless, Mobile and Ubiquitous Technology in Education
(WMTE’06), pages 47–49.
Divitini, M., Kristen, H., and Per-Arne, N. (2002). Improving Communication Through
Mobile Technologies: Which Possibilities. In Proceedings of the IEEE International
Workshop on Wireless and Mobile Technologies in Education, Vaxjo, Sweden.
Krasner, G. E. and Pope, S. T. (1988). A cookbook for using the Model-View-Controller
user interface paradigm in Smalltalk-80. Journal of Object-Oriented Programming,
1(3):26–49.
kXML (2010). kXML website. http://www.kxml.org/.
Lahner, F. and Nosekabel, H. (2002). The Role of Mobile Devices in E-Learning – First
Experiences with a Wireless E-Learning Environment. In Proceedings of the IEEE
International Workshop on Wireless and Mobile Technologies in Education, Sweden.
Lopes, R. F. and Cortes, O. A. C. (2007). An ubiquitous testing system for m-learning
environments. In Proceedings of the Second International Conference on Systems and
Networks Communications (ICSNC ’07). IEEE Computer Society.
Peiper, C., Chan, E., Campbell, R., Bresler, J., and Al-Muhtadi, J. (2004). Expanding
education through active space collaboration. In Second IEEE Annual Conference on
Pervasive Computing and Communications Workshops, pages 236–240.
Pelissoli, L. and Loyolla, W. (2004). Aprendizado Móvel (M-Learning): Dispositivos
e Cenários. In 11º Congresso Internacional de Educação a Distância. ABED –
Associação Brasileira de Educação a Distância. In Portuguese.
SuperWaba (2010). SuperWaba website. http://www.superwaba.com.br/.
Zanev, V. (2004). Wireless Student Testing. In Proceedings of the International Confer-
ence on Pervasive Computing and Communications, Las Vegas, Nevada.
Zanev, V. and Clark, R. (2005). Wireless course management system. In ACM-SE 43:
Proceedings of the 43rd annual Southeast regional conference, pages 118–123, New
York, NY, USA. ACM Press.