
Proceedings of International Conference on Computing Sciences

WILKES100 ICCS 2013


ISBN: 978-93-5107-172-3
Expressing recognized gestures for humanoid robot using Proto-Symbol space

Nitin Kumar 1*, Rahul Sharma 2, Suraj Prakash Sahu 3 and Narendra Garg 4

1 Assistant Professor, Lovely Professional University, Punjab
2 Assistant Professor, Lovely Professional University, Punjab
3 Project Officer, HCL
4 Assistant Professor, Amity University, Gwalior
Abstract
This paper describes a gestural communication technique for acquiring a gesture-based scheme using the Mimesis model. The proposed technique applies to any gesture represented by a dimensional signal. The work focuses primarily on expressing recognized hand gestures, providing a medium for humans and humanoids to communicate through gestures. The idea is to handle human gestures through imitation, recognition, generation and expression using the Mimesis model. Gestures are converted into codes and then into symbols; this space of special symbols, called the Proto-Symbol space, is used to train the humanoid robot. Recognition is executed through an algorithm we developed that combines the K-nearest neighbor and Euclidean distance techniques. Generation is carried out in the WEBOTS software, which simulates the humanoid robot HOAP-2. The training, recognition, generation and expression processes are all simulated in MATLAB.
2013 Elsevier Science. All rights reserved.
Keywords: Hidden Markov Model (HMM), Proto-Symbol space, Mimesis model, WEBOTS simulation software, Humanoid robot HOAP-2.
1. Introduction
Mimesis is a centuries-old notion: not only humans imitate, animals do the same. Imitation in most cases produces effective results, and it is quite helpful in the development of robotics, from doing household chores to playing football, or from mimicking a master to obeying the instructions given by him. The emerging field of robotics develops new ways to perform imitation. Here, imitation is performed using the Mimesis model [1]. The model helps not only in motion recognition but also in motion generation for a humanoid robot; further, the humanoid can be used to express the imitated gesture. The main objective of this work is to implement imitation for gestures such as bye-bye, salute and namaste, and to express them all with good accuracy using a Hidden Markov Model (HMM) in MATLAB.
This work uses a nonverbal way of communication, i.e. gestures. Initially, the gestures used for training are common and less complex. Nonverbal communication produces much more coherent results. In the motion-recognition stage of the Mimesis model, the adopted motion is captured using a resistive rotary sensor, a potentiometer; this sensor is handy to use and is more efficient and cheaper than other rotary sensors such as encoders or accelerometers. A suit was designed using these sensors. Each calibrated sensor provides continuous analog data relating joint angle, voltage and time for the gesture being performed. This joint-angle data is then transformed into a CODEBOOK, a dynamically generated file, and the CODEBOOK is in turn transformed into special symbols, called the Proto-Symbol space.
* Corresponding author: Nitin Kumar
344 Elsevier Publications, 2013.
The Proto-Symbols are different for different gestures and are helpful for generating imitations. The gestures are performed by a humanoid robot, which has 25 degrees of freedom.
Fig. 1. The Mimesis model for executing imitation
2. Gesture data collection and preprocessing
We consider real-time data collected from our suit, which is built from a combination of resistive potentiometer sensors. The sensors give output as orientations that are scaled with respect to time. The data is recorded in CSV (comma-separated values) format. Each CSV file is then transformed into a CODEBOOK. The CODEBOOK is constructed from the observation that, for any gesture performed by the humanoid robot (HOAP-2), the joint-angle data can be decreasing, increasing or constant [2]. These observations are displayed below in Fig. 2.
Fig. 2. Different gestural observations used to create the comma-separated files
Fig. 3. CODEBOOK generated from various CSV files
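As an illustration of this preprocessing step, the quantization of a joint-angle trace into increasing/decreasing/constant symbols can be sketched as follows. This is a minimal Python sketch (the paper's implementation is in MATLAB); the tolerance value and the column name `shoulder` are assumptions for illustration:

```python
import csv
import io

def quantize(angles, tol=0.5):
    """Map consecutive joint-angle differences to symbols:
    'I' (increasing), 'D' (decreasing), 'C' (constant within tol)."""
    symbols = []
    for prev, curr in zip(angles, angles[1:]):
        delta = curr - prev
        if abs(delta) <= tol:
            symbols.append('C')
        elif delta > 0:
            symbols.append('I')
        else:
            symbols.append('D')
    return symbols

# Parse one joint-angle column from a CSV recording and quantize it.
csv_text = "time,shoulder\n0,10.0\n1,12.5\n2,12.6\n3,9.0\n"
rows = list(csv.DictReader(io.StringIO(csv_text)))
angles = [float(r["shoulder"]) for r in rows]
print(quantize(angles))  # ['I', 'C', 'D']
```

The resulting symbol strings, one per joint per recording, are the raw material from which the CODEBOOK entries are assembled.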
3. Recognition approach for humanoid robot
Self-learning is a process through which a robot performs mimesis [6]. Here, self-learning means learning through self-teaching with the Hidden Markov Model (HMM); since the data is continuous, it is best described as a continuous HMM. An HMM is a stochastic model in which real-time data is trained using the evaluation, decoding and learning techniques. The HMM has three parameters associated with it:

Pi (pi_i), the initial state probability vector;
A (a_ij), the state transition probability matrix;
B (b_ij), the confusion (emission) probability matrix.

In an HMM [5] we need to determine the observed states and the hidden states associated with each gesture. For performing mimesis by imitation, the observed states are the joint-angle values and the hidden states are the parameters associated with each CSV file, which come out to be either constant, increasing or decreasing [2], as shown in Fig. 4.
Fig. 4. Different hidden states and the observed states associated with them
The probability of the robot moving from one state to another is called the transition probability [8]. To classify a simple gesture, the gestural data is checked against all the trained gestures, from salute to bye-bye to bending and so on, as depicted in Fig. 5.
Fig. 5. Relation between the state transition matrix and the associated left-to-right observations
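As a concrete illustration of the three parameters, the sketch below builds Pi, A and B for L = 3 hidden states (constant, increasing, decreasing) and O = 4 observed symbols. All probability values here are illustrative assumptions, not values from the paper's trained models:

```python
# Hypothetical HMM parameters: L = 3 hidden states, O = 4 observed symbols.
pi = [0.6, 0.2, 0.2]                     # 1 x L initial-state probabilities
A = [[0.7, 0.2, 0.1],                    # L x L state-transition matrix
     [0.1, 0.8, 0.1],
     [0.2, 0.1, 0.7]]
B = [[0.4, 0.3, 0.2, 0.1],               # L x O confusion (emission) matrix
     [0.1, 0.5, 0.3, 0.1],
     [0.25, 0.25, 0.25, 0.25]]

# Every probability row must sum to 1.
assert abs(sum(pi) - 1.0) < 1e-9
assert all(abs(sum(row) - 1.0) < 1e-9 for row in A + B)

# State distribution after one transition step: pi * A.
next_state = [sum(pi[i] * A[i][j] for i in range(3)) for j in range(3)]
print(next_state)
```

The row-sum checks are exactly the stochasticity constraints that Baum-Welch re-estimation preserves at every iteration.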
3.1 Feature extraction algorithm and generation of the CODEBOOK
Each state is associated with a different gestural action (bye-bye, namaste, etc.), and each gesture has its own CSV file. These CSV files contain joint-angle data that changes with respect to time: the data changes as soon as the gesture departs from some definite value, sampled every millisecond [3]. While performing a gesture, the data may stay almost constant, revealing no change in the joint-angle data, or it may increase or decrease, indicating that the joints are rotating clockwise or anti-clockwise. These cases are depicted in Fig. 6.
Fig. 6. The compared values, which come out to be constant, increasing or decreasing
From the values of the different CSV files, a CODEBOOK is generated for every gesture. We have taken 20-25 CSV files per gesture for CODEBOOK generation.
3.2 Training of data with the help of the CODEBOOK
Generating the CODEBOOK is a prerequisite for training each gesture's data, which is then used by the Baum-Welch algorithm [2]. The algorithm converts the CODEBOOK data into the parameters A (state transition probability matrix), B (confusion probability matrix) and pi (initial state probabilities) [7]. The dimensions of A, B and pi are L*L, L*O and 1*L respectively, where L is the number of hidden states and O is the number of observations. The values of A, B and pi collectively define the HMM, which is depicted in Fig. 7 for the lifting-hand, salute, bye-bye, traffic-light, namaste and may-I-please gestures.
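The concatenation of A, B and pi into a single feature vector of dimension L*L + L*O + 1*L, used later for classification, can be sketched as follows. This is a Python sketch (the paper uses MATLAB), and the uniform placeholder values stand in for actual Baum-Welch output:

```python
from itertools import chain

def feature_vector(A, B, pi):
    """Flatten trained HMM parameters into one 1-D feature vector,
    of dimension D = L*L + L*O + L, by concatenation."""
    return (list(chain.from_iterable(A))
            + list(chain.from_iterable(B))
            + list(pi))

L, O = 3, 4                               # hidden and observed state counts
A = [[1.0 / L] * L for _ in range(L)]     # placeholder trained parameters
B = [[1.0 / O] * O for _ in range(L)]
pi = [1.0 / L] * L

fv = feature_vector(A, B, pi)
assert len(fv) == L * L + L * O + L       # D = 9 + 12 + 3 = 24
print(len(fv))  # 24
```

One such vector is stored per trained gesture class, and the same flattening is applied to each test gesture before distance computation.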
Fig. 7. Proto-Symbol spaces for all gestures together and for the lifting-hand, salute, bye-bye, traffic-light, namaste and may-I-please gestures
4. Software specification
The execution part is performed by the humanoid robot HOAP-2, which is incorporated in the WEBOTS software. WEBOTS is simulation software designed to facilitate the design of robot applications. It supports structural design, and code in C, C++ and Java can be used to program different robots. The software provides an excellent environment for building robot simulations and supplies different scenes for building a robot: modeling is done through its scene window, and the current position can be viewed through the log window.
Of the several robots available in this software, our work uses the humanoid robot HOAP-2 (Humanoid for Open Architecture Platform), designed by Fujitsu, whose prototype is included in WEBOTS. It has a humanoid shape with two arms, two legs, a head and a torso. The robot can be used for almost all applications involving humans. It has 25 degrees of freedom (DOF): 5 DOF for each hand, 6 DOF for each leg and 3 body joints, all attained through servo motors. In WEBOTS, the HOAP-2 robot runs through a command program and a CSV file that holds the values of all 25 joint angles for every motion over a given time span. The WEBOTS platform is shown below.
Fig. 8. The WEBOTS simulation platform
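Generating the per-frame joint-angle CSV that drives the robot can be sketched as below. This is an assumption-laden Python sketch: the actual column ordering, units and naming are defined by the WEBOTS/HOAP-2 setup and are not specified in this paper:

```python
import csv

DOF = 25  # HOAP-2 joint count reported above

def write_motion_csv(path, frames):
    """Write one row of 25 joint-angle values per time step --
    a plain-CSV motion file of the kind fed to the simulator.
    (Column order is robot-specific; treated as an assumption here.)"""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        for frame in frames:
            assert len(frame) == DOF, "each frame needs one value per joint"
            writer.writerow(frame)

# Two illustrative frames: all joints at rest, then one joint raised.
frames = [[0.0] * DOF, [0.0] * 4 + [30.0] + [0.0] * 20]
write_motion_csv("gesture.csv", frames)
print(len(open("gesture.csv").readlines()))  # 2
```

Each row is one time step, so gesture duration is controlled by the number of rows and the simulator's step period.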
5. Learning with HMM
For motion recognition, once the proto-symbols are created and the humanoid robot has been trained using the Baum-Welch algorithm, this data can be used for motion generation. The generation part requires the robot to classify and then execute a known or unknown gesture. The classifier is chosen so that it can classify real-time data. The steps of classification are as follows:
1. For training, every gesture has its own proto-symbols, i.e. its A, B and Pi matrix values, which are finally concatenated into a single array.
2. For every test gesture (trained or untrained), the A, B and Pi matrix values are calculated again and a concatenated one-dimensional array is formed as the test feature vector.
3. For N different gestures the number of classes is N, and the dimension of each class feature vector F is [(number of hidden states)^2 + (number of observed states * number of hidden states) + number of hidden states] = (size of A matrix + size of B matrix + size of Pi matrix), which we denote D.
4. Now calculate the distance between the test feature vector and each trained class feature vector:
   d_{i,j}^2 = (F_i - F_{i,j})^2,
   where F_i is the i-th component of the test feature vector, F_{i,j} is the i-th component of class j's feature vector, i <= D and j <= N (the number of classes).
5. Now, using the K-nearest neighbor algorithm [11], find the class of the test gesture:
   i) For each value of i, select the value j = J(i) for which d_{i,j} is minimum.
   ii) We thus obtain the array J(i) for all values of i, where i <= D.
   iii) Over all values of i, select Mod(J(i)) (the j occurring most often in the array) as the recognizing class.
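The classification steps above can be sketched in Python as follows (the paper's implementation is in MATLAB; the toy 2-D vectors below stand in for the full concatenated A/B/Pi feature vectors, and the vote is taken over the k nearest training vectors):

```python
from collections import Counter

def euclidean_sq(u, v):
    """Squared Euclidean distance d^2 between two feature vectors."""
    return sum((a - b) ** 2 for a, b in zip(u, v))

def knn_classify(test_fv, train_fvs, labels, k=3):
    """Label the test vector by majority vote among its k nearest
    trained gesture feature vectors (the mode step of the paper)."""
    nearest = sorted(range(len(train_fvs)),
                     key=lambda i: euclidean_sq(test_fv, train_fvs[i]))
    votes = Counter(labels[i] for i in nearest[:k])
    return votes.most_common(1)[0][0]

# Toy 2-D feature vectors standing in for concatenated A/B/Pi vectors.
train = [[0.0, 0.0], [0.1, 0.1], [1.0, 1.0], [1.1, 0.9], [0.05, 0.0]]
labels = ["salute", "salute", "bye-bye", "bye-bye", "salute"]
print(knn_classify([0.08, 0.05], train, labels))  # salute
```

Taking the mode of the nearest labels, rather than the single nearest neighbor, makes the decision more robust to one outlying training recording.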
Fig. 9. Conversion of the CODEBOOK data into HMM parameters
6. Result analysis
Fig. 10. Humanoid mimesis of different human gestures
Class used for     Gestures used   Test       Misclassified   Accuracy
classification     for training    gestures   gestures        (%)
Bye-Bye            25              11         2               81.8
Salute             21              9          None            100
Traffic light      19              7          1               85.7
Lifting Hand       31              16         None            100
Namaste            15              10         1               90
May I Please       12              6          None            100
Fig. 11. Results of the gesture classification
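The accuracy column follows directly from the test and misclassification counts, accuracy = (tested - misclassified) / tested * 100, as this short sketch reproduces:

```python
def accuracy(tested, misclassified):
    """Accuracy (%) = correctly classified / tested, rounded to one decimal."""
    return round((tested - misclassified) / tested * 100, 1)

# (test gestures, misclassified) per class, from the table above.
results = {"Bye-Bye": (11, 2), "Salute": (9, 0), "Traffic light": (7, 1),
           "Lifting Hand": (16, 0), "Namaste": (10, 1), "May I Please": (6, 0)}
for gesture, (tested, missed) in results.items():
    print(gesture, accuracy(tested, missed))
```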
7. Conclusion and future work
Continuing our earlier work [9] on imitating gestures through nonverbal communication on the humanoid robot HOAP-2 [4][5], an add-on feature of verbal communication has been added to increase the expressive power of the robot. Our work uses several approaches for recognizing different gestures in real time. The overall process has been modeled through an HMM-based Mimesis model, which provides the robustness to build a concrete system that can be trained to imitate gestures in a much more efficient manner. The process has been accomplished with the Baum-Welch algorithm, which creates the proto-symbols for gesture recognition; execution is then performed in the WEBOTS simulation software, which provides a real-time environment with its HOAP-2 robot. The robot classifies different gestures using a distance vector that combines the Euclidean distance algorithm [10] with the K-nearest neighbor algorithm [11]. All these functions are implemented in MATLAB.
Future work includes the implementation of purely verbal communication for imitating different gestures.
References
[1] Inamura, T.; Nakamura, Y.; Toshima, I.; Ezaki, H., "Mimesis embodiment and proto-symbol acquisition for humanoids," Advanced Intelligent Mechatronics, 2001 IEEE/ASME International Conference on, vol. 1, pp. 159-164, 2001.
[2] Inamura, T.; Toshima, I.; Tanie, H.; Nakamura, Y., "Embodied symbol emergence based on mimesis theory," The International Journal of Robotics Research, vol. 23, no. 3-5, pp. 363-377, 2004.
[3] Inamura, T.; Nakamura, Y.; Ezaki, H.; Toshima, I., "Imitation and primitive symbol acquisition of humanoids by the integrated mimesis loop," Robotics and Automation, 2001 ICRA IEEE International Conference on, vol. 4, pp. 4208-4213, 2001.
[4] Takano, W.; Tanie, H.; Nakamura, Y., "Key feature extraction for probabilistic categorization of human motion patterns," Advanced Robotics, ICAR '05, 12th International Conference on, pp. 424-430, 18-20 July 2005.
[5] Kuniyoshi, Y.; Inaba, M.; Inoue, H., "Learning by watching: extracting reusable task knowledge from visual observation of human performance," IEEE Transactions on Robotics and Automation, vol. 10, no. 6, pp. 799-822, Dec. 1994.
[6] Schaal, S., "Is imitation learning the way to humanoid robots?" Trends in Cognitive Sciences, vol. 3, no. 6, pp. 233-242, 1999.
[7] Mataric, M. J., "Getting humanoids to move and imitate," IEEE Intelligent Systems and their Applications, vol. 15, no. 4, pp. 18-24, Jul./Aug. 2000.
[8] Huang, Q.; Kaneko, K.; Yokoi, K.; Kajita, S.; Kotoku, T.; Koyachi, N.; Arai, H.; Imamura, N.; Komoriya, K.; Tanie, K., "Balance control of a biped robot combining off-line pattern with real-time modification," Robotics and Automation, ICRA '00 IEEE International Conference on, vol. 4, pp. 3346-3352, 2000.
[9] Kumar, N.; Suraj, P.; Prakash, J., "Recognizing gesture for humanoid robot using proto-symbol space," Advanced Materials Research, vols. 403-408, November 2011.
[10] Miyazawa, M.; Zeng, P.; Iso, N.; Hirata, T., "A systolic algorithm for Euclidean distance transform," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 28, no. 7, pp. 1127-1134, July 2006.
[11] Sun, S.; Huang, R., "An adaptive k-nearest neighbor algorithm," Fuzzy Systems and Knowledge Discovery (FSKD), 2010 Seventh International Conference on, vol. 1, pp. 91-94, 10-12 Aug. 2010.