(1) Introduction
(2) Goal of the project
(3) Project requirements
    User profile
    Tools used
(4) Steps for design quality evaluation
(5) Identifying performance attributes
(6) Elementary criteria
(7) Aggregation of preferences
(8) Competitive systems
(9) Results
    (1) EVALUATION REPORT FOR THE SearchEngine PROJECT [SearchEngine.txt]
    (2) DETAILED EVALUATION RESULTS FOR THE SearchEngine PROJECT [SearchEngine.lst]
    (3) Results of the search engines [SearchEngine.res]
    (4) SUMMARY OF RESULTS FOR THE SearchEngine PROJECT [SearchEngine.sum]
1. Introduction
Software quality analysis is the measurement of the properties of a piece of software or of its specification. Direct measurement of software quality is quite difficult because the underlying quality factors are hard to measure. To resolve this measurement problem, there is a model that measures the quality of software in terms of its attributes, specifications, and characteristics: the LSP (Logic Scoring of Preference) model. When a client gives the specification of the software to the developer, the client expects good-quality software in return. Hence, to judge the quality of the software, we can use the LSP model.
This model evaluates the following software quality attributes.
(1) Functionality
Suitability
Accuracy
Security
Interoperability
Compliance
(2) Usability
Understandability
Learnability
Operability
(3) Performance
Processing time
Throughput
Resource consumption
(4) Maintainability
(5) Portability
(6) Reusability
In LSP, the features are decomposed into the aggregation blocks above, and this decomposition continues within each block until all the lowest-level features are directly measurable, producing a tree of decomposed features. For each leaf feature, an elementary criterion is defined. LSP calculates an elementary preference for each criterion and then aggregates all of them to compute the final global preference, which expresses the quality of the software. We can calculate the global preference for different systems and thereby analyze and compare their quality.
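The decompose-evaluate-aggregate flow described above can be sketched in a few lines of Python. This is an illustrative sketch only: the tree, weights, and exponents below are made up, not taken from this report; the aggregator shown is the weighted power mean commonly used in LSP.

```python
# Illustrative LSP sketch: elementary preferences (values in [0, 1]) are
# aggregated bottom-up with weighted power means until one global
# preference remains. Weights in each aggregator are assumed to sum to 1.

def wpm(prefs, weights, r):
    """Weighted power mean; the exponent r controls the logic behavior."""
    if r == 0:  # limit case: weighted geometric mean
        prod = 1.0
        for p, w in zip(prefs, weights):
            prod *= p ** w
        return prod
    return sum(w * p ** r for p, w in zip(prefs, weights)) ** (1.0 / r)

# Hypothetical two-level tree: two subsystems, each built from two
# elementary preferences.
functionality = wpm([0.9, 0.8], [0.6, 0.4], r=-0.7)   # mild conjunction
usability = wpm([0.95, 0.85], [0.5, 0.5], r=1.0)      # arithmetic mean
global_pref = wpm([functionality, usability], [0.7, 0.3], r=-0.7)
print(round(global_pref * 100, 2))  # global preference as a percentage
```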
2. Goal of the Project
The very first tool used for searching on the Internet was Archie, in 1990, and from that day the era of search engines began. Effective search engines such as W3Catalog and Aliweb were introduced in 1993. After that, search engines kept improving by adding complexity and functionality, and nowadays we have efficient, high-quality search engines such as Google, Yahoo Search, bing, Ask, etc. Hence, the goal of this project is to analyze the functionality and the complexity of those search engines and to find out how well they satisfy user requirements.
But before the evaluation, the evaluator must know the user requirements: who the user is and what the user expects from the system. The evaluator must also have expert knowledge of the system, so that the software attributes can be sorted out well. In a further stage, the evaluator should select the system components that can be compared between systems.
Hence, as evaluators, we choose different search engines and compare their system quality. We choose different attributes for the search-engine system, and finally we assign a preference scale for each attribute to every search engine (Google, Yahoo Search, bing, Ask). After going through the whole LSP process, we find the global preference of each search engine and analyze its quality.
In this project we focus on building a comprehensive evaluation model for five search engines: Google, Yahoo Search, bing, Ask, and altavista. This model aggregates all the features that reflect the functionality and usability of the search engines and generates a compound indicator of the overall quality. Hence, it finally reflects the measurement of user satisfaction for all aspects of the search engines.
3. Project Requirements
Step 2:
Define criterion function for each attribute, and apply attribute measurement for each
search-engine.
Elementary evaluation criteria specify how to measure quantifiable attributes. The result is an elementary preference, which can be interpreted as a degree of satisfied requirement. For each attribute, it is necessary to establish an acceptable range of values and to define a function, called the elementary criterion, that maps the measured value to a preference.
- So, for the search engines I have developed the elementary preferences and attribute measurement values in section [5.1]
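As a sketch of what such an elementary criterion can look like, the piecewise-linear mapping below converts a measured value into a preference between 0% and 100%. The breakpoints used here (fewer than 5 supported languages gives 0%, 30 or more gives 100%) are illustrative placeholders, not the report's calibrated values.

```python
# Sketch of a piecewise-linear elementary criterion: it maps a measured
# attribute value to a preference in [0, 100]. Breakpoints are illustrative.

def elementary_criterion(x, x_min, x_max):
    """0% at or below x_min, 100% at or above x_max, linear in between."""
    if x <= x_min:
        return 0.0
    if x >= x_max:
        return 100.0
    return 100.0 * (x - x_min) / (x_max - x_min)

print(elementary_criterion(20, 5, 30))  # 20 supported languages -> 60.0
```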
Step 3:
- Evaluating elementary preferences
- Logic aggregation of preferences
- So, for the search engines I have developed the logic aggregation of preferences using pictorial diagrams. Aggregators are chosen based on the user needs, which are expressed as the relationship between inputs: simultaneity, replaceability, or neutrality. Some inputs must be satisfied simultaneously; finally, we aggregate the preferences and compute the overall suitability of the system.
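The three relationships named above map directly onto the exponent of the weighted power mean aggregator. This is a minimal sketch with illustrative inputs, not values from this project: strongly negative r models simultaneity, r = 1 models neutrality, strongly positive r models replaceability.

```python
# How the exponent r expresses the logic relationship between inputs:
# strongly negative r acts like "and" (a single low input drags the result
# toward the minimum), r = 1 is a plain weighted average, and strongly
# positive r acts like "or" (one good input is enough).

def wpm(prefs, weights, r):
    return sum(w * p ** r for p, w in zip(prefs, weights)) ** (1.0 / r)

prefs, weights = [0.2, 0.9], [0.5, 0.5]
print(wpm(prefs, weights, -10))  # near the minimum: simultaneity
print(wpm(prefs, weights, 1))    # 0.55, the plain average: neutrality
print(wpm(prefs, weights, 10))   # near the maximum: replaceability
```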
Step 4:
- Evaluating competitive systems
- Here we have five search engines: Google, Yahoo Search, bing, Ask, and altavista, and we carry out a competitive feature analysis for each of them.
Step 5:
- Ranking and selection of the best system, and analysis of the results
- Finally, using LSPCalc, we rank each search engine according to its global preference and select the best among them.
- How to use the LSPCalc128 tool?
Ans: Create the .cri and .dat files:
Create a directory in the projects folder named after your project.
In it, create two files, projectname.cri and projectname.dat.
By default, LSPcalc128 looks in the projects folder for all projects.
Then open any client shell to access LSPcalc128.
Run the executable and follow the instructions on the tool's screen.
5. Identifying performance attributes
5.1 System attribute tree
2. Usability
2.1 Interface usability
2.1.1 Interface visibility
2.1.2 Operability
2.1.3 Customization
2.1.3.1 Customization of page size
2.1.3.2 Customization of page rank
2.2 Result Evaluation
2.2.1 Result visibility
2.2.2 Accessibility of results
2.2.3 Availability of cached result
2.3 User guide
2.3.1 Online help
2.3.2 Manual user-guide
2.3.3 FAQ
2.3.4 Related tutorial material
3. Performance
3.1 Loading time
3.1.1 Load page time
3.1.2 Automatic search suggestion time
3.1.3 Result evaluation
3.1.3.1 Time to evaluate best result
3.1.3.2 Time to evaluate top N result
3.2 Resource Consumption
4. Reliability
4.1 User satisfaction
4.1.1 Popular pages results
4.1.2 High rank pages results
4.1.3 Coverage of user need
4.2 Confusion matrix
4.2.1 Accuracy
4.2.2 Precision
4.2.3 Recall
4.2.4 Specificity
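The confusion-matrix leaves of the reliability subtree (accuracy, precision, recall, specificity) are standard retrieval measures. As a reminder of their definitions, here is a small sketch; the counts used are illustrative, not measurements from this project.

```python
# Confusion-matrix measures from retrieval counts: tp = relevant results
# returned, fp = irrelevant returned, fn = relevant missed, tn = irrelevant
# correctly excluded. Values returned as percentages.

def confusion_metrics(tp, fp, fn, tn):
    total = tp + fp + fn + tn
    return {
        "accuracy": 100.0 * (tp + tn) / total,
        "precision": 100.0 * tp / (tp + fp),
        "recall": 100.0 * tp / (tp + fn),
        "specificity": 100.0 * tn / (tn + fp),
    }

print(confusion_metrics(tp=90, fp=10, fn=12, tn=88))
```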
6. Elementary Criteria
Note: the criterion-function plots (preference curves over the 0-100 scale) could not be recovered from this document; only the criterion definitions, measurement rules, and weights are retained below.

Text Searching:
- Search with group of keywords [Rank of accuracy] (weight 50): the search engine should be able to correlate a group of keywords and find combinations of these keywords on the web. Finds all words of the group: 8; finds some words of the group: 2.
- Statement Searching, Search exact statement [Level of efficiency] (weight 40): S REL = 100 * S / S max; efficiency of searching for an exact expression on the web as the user wants.
- Case sensitive Search (weight 20): 0 if the search engine is case sensitive, 1 otherwise.
- Search with different languages [No. of languages] (weight 20): number of languages the search engine can support.

Relational Searching:
- Abbreviation Search (weight 20): AS REL = 100 * AS / AS max; efficiency of searching for abbreviations on the web as the user wants.

Operator Searching:
- Including Search (using ‘+’ operator) [Level of efficiency] (weight 55): IS REL = 100 * IS / IS max; efficiency of searching for multiple expressions with the ‘+’ operator.

Image Searching:
- Filename search [Level of efficiency] (weight 20): FNS REL = 100 * FNS / FNS max; efficiency of finding an image by its file name on the web.
- Image link search [Level of efficiency] (weight 30): ILS REL = 100 * ILS / ILS max; efficiency of finding an image on the web as the user wants.

Video Searching:
- Popularity based Search (weight 50): VPS REL = 100 * VPS / VPS max; efficiency of finding a video by its popularity.

Audio Searching:
- AudioS REL = 100 * AudioS / AudioS max; efficiency of finding audio as per the user requirement.

Citation filter:
- Adult content filter: ACS REL = 100 * ACS / ACS max; efficiency of the filter in preventing adult content from being returned.
- Casino content filter (weight 10): CCS REL = 100 * CCS / CCS max; efficiency of the filter in preventing casino and gambling content from being returned.

Domain filter:
- Pages related to same content (weight 45): DPS REL = 100 * DPS / DPS max; efficiency of the filter in finding pages with the same kind of content.

Extended filter:
- RSS support pages (weight 35): RSS REL = 100 * RSS / RSS max; efficiency of the filter in finding RSS-supporting pages.
- File specific filter [pdf, word, excel sheet, ..] [No. of types] (weight 35): ability to find different kinds of files on the web, i.e. .ppt, .doc, .pdf, etc.

Time filter:
- Page created time (weight 20): PTS REL = 100 * PTS / PTS max; efficiency of the filter in finding a page by its creation year or time.

Location filter:
- Location of searching user [i.e. weather forecast] (weight 30): LS REL = 100 * LS / LS max; efficiency of the filter in finding pages related to the user's geographic location.

[6.4] 2. Usability

Customization:
- Customization of page size (weight 20): facility to customize the page according to user need. Number of results per page: 8; searching customization: 1.5; page-theme selection: 0.5.

Result Evaluation:
- Result visibility (weight 20): top N results on the page: 3; easy to access: 5; results without redundant data on the page: 2.

User guide:
- Online help (weight 20): ONH REL = 100 * ONH / ONH max; availability of online help to guide users.
- Manual user-guide (weight 30): UG REL = 100 * UG / UG max; availability of a manual user guide to guide users.

[6.5] 3. Performance

Result evaluation:
- Time to evaluate best result [in seconds] (weight 80): time to produce the result for a target search.
- Time to evaluate top N results [in seconds] (weight 20): time taken to produce the top N results.

[6.6] 4. Reliability

User satisfaction:
- Popular pages results [Level of reliability] (weight 30): efficiency of reliable search. Popular pages: 4; informative pages: 3; trustworthy pages: 1; ability to reach the target page of a specific web-site domain rather than its home page: 2.
- High rank pages results (weight 20): efficiency of computing page rank and ordering results by ranking.

Confusion matrix:
- Recall [% recall] (weight 20): recall of the evaluated results for a target search.
- Specificity [% specificity].
7. Ranking and Aggregation of the Preferences
The system requirement tree has 73 performance variables, as listed above. We build a weighted aggregation graph of these performance variables using the LSP aggregators.
7.1 Functionality
[Fig: 5 – Functionality]
7.2 Usability
[Fig: 6 Usability]
7.3 Performance
[Fig: 7 Performance]
7.4 Reliability
[Fig: 8 Reliability]
Attribute measurements for the five systems (column order follows the competitive-system list: Google, Yahoo, bing, Ask, altavista):

1.1.1.5.2 Synonyms Search                                1     1     1     2     2
1.1.1.5.3 Stemming Search                                1     1     2     2     2
1.1.1.5.4 Misspell Corrected Search                      1     1     1     2     2
1.1.1.6.1 Including Search (using ‘+’ operator)          1     2     2     1     2
1.1.1.6.2 Excluding Search (using ‘-‘ operator)          3     3     3     3     3
1.1.1.6.3 Combinational Search (using ‘*’ operator)      2     3     3     3     3
1.1.1.6.4 Optional Search (using ‘OR’ operator)          1     3     1     2     2
1.1.2.1.1 Filename search                                2     2     2     3     3
1.1.2.1.2 Image link search                              2     2     2     2     2
1.1.2.1.3 Adjacent text search                           1     1     2     2     2
1.1.2.2.1 Popularity based Search                        1     2     1     2     2
1.1.2.2.2 Content based Search                           1     1     1     1     1
1.1.2.2.3 Content Controlled Search                      2     2     2     2     2
1.1.2.3 Audio Searching                                  2     2     2     2     2
1.2.1.1 Pages with Spam, doorway                         1     1     1     2     1
1.2.1.2 Duplicate Content                                2     3     3     3     2
1.2.2.1 Adult content filter                             2     2     2     2     2
1.2.2.2 Casino content filter                            3     4     3     4     4
1.2.3.1 Pages related to same content                    3     3     2     3     3
1.2.3.2 Pages related to same website                    1     1     1     1     1
1.2.3.3 Linked page                                      2     2     2     2     2
1.2.4.1 RSS support pages                                2     3     2     3     3
1.2.4.2 Usage rights                                     1     2     1     2     2
1.2.4.3 Numeric range filter                             1     3     3     3     3
1.2.4.4 Field to search keyword [title,text,URL,Link]    4     3     4     4     3
1.2.5 File specific filter [pdf, word, excel sheet,..]   30    26    27    24    23
1.2.6 Broken link filter                                 4     3     3     4     3
1.2.7.1 Page created time                                4     3     3     4     3
1.2.7.2 Recent update time                               2     2     2     2     2
1.2.8.1 Location of searching user [i.e. weather]        2     3     2     4     4
1.2.8.2 Location of country                              2     2     2     3     4
1.3.1 Weather                                            1     2     2     3     2
1.3.2 Blog                                               1     2     2     3     3
1.3.3 Movie time                                         3     1     2     3     3
1.3.4 Sport score                                        2     1     1     2     2
1.3.5 Stock price                                        2     1     2     2     3
1.3.6 Literature [books,..]                              9     9     9     8     8
1.3.7 Maps                                               10    8     9     7     7
2.1.1 Interface visibility                               10    9     9     9     9
2.1.2 Operability                                        10    10    10    9     9
2.1.3.1 Customization of page size                       10    10    10    10    10
2.1.3.2 Customization of page rank                       2     2     2     3     3
2.2.1 Result visibility                                  10    8     9     8     8
2.2.2 Accessibility of results                           10    9     9     9     9
2.2.3 Availability of cached result                      2     2     2     3     2
2.3.1 Online help                                        1     1     1     2     1
2.3.2 Manual user-guide                                  2     2     2     3     2
2.3.3 FAQ                                                30    28    30    27    28
2.3.4 Related tutorial material                          1     1     1     2     2
3.1.1 Load page time                                     2     2     2     2     2
3.1.2 Automatic search suggestion time                   0.5   0.5   0.5   0.5   0.5
3.1.3.1 Time to evaluate best result                     3.5   3.5   3.5   4     4.5
3.1.3.2 Time to evaluate top N result                    3     3     3     3.5   4.0
3.2 Resource Consumption                                 76    142   55    360   50
4.1.1 Popular pages results                              1     1     1     3     6
4.1.2 High rank pages results                            1     2     2     3     7
4.1.3 Coverage of user need                              2     3     3     4     3
4.2.1 Accuracy                                           90    75    85    70    75
4.2.2 Precision                                          95    80    85    80    80
4.2.3 Recall                                             88    78    82    78    78
4.2.4 Specificity                                        93    85    88    84    84
9.1 Results
This report presents the evaluation results for the following 5 competitive systems:
1. Google
2. Yahoo
3. bing
4. Ask
5. altavista
evaluated over the four top-level attributes:
1. Functionality
2. Usability
3. Performance
4. Reliability
This summary includes two parts: (1) System Comparison and Ranking, and (2) Survey of Individual Systems. Detailed numerical results can be found in the report entitled "Detailed Evaluation Results of the SearchEngine Project".
Global preferences:
1. 94.89% Google
2. 89.78% bing
3. 88.18% Yahoo
4. 78.42% altavista
5. 77.35% Ask
Global preferences relative to the best system:
1. 100.00% Google
2. 94.62% bing
3. 92.93% Yahoo
4. 82.65% altavista
5. 81.51% Ask
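The second list above is the first list rescaled so that the best system reads 100%. A quick sketch reproducing it (last-digit differences against the tool's output are possible, since the tool rounds from higher internal precision):

```python
# Rescale each global preference against the best system so that the
# leader reads 100%, then print the ranking.

scores = {"Google": 94.89, "bing": 89.78, "Yahoo": 88.18,
          "altavista": 78.42, "Ask": 77.35}
best = max(scores.values())
for name, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{100.0 * score / best:6.2f}% {name}")
```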
Competitive System(s):
1. Google
2. Yahoo
3. bing
4. Ask
5. altavista
PERFORMANCE VARIABLES (columns: ID, measured value, elementary preference E[%], variable name):
125 27.00 90.00 File specific filter [pdf, word, excel sheet,..]
32 55.00 93.00 Resource Consumption
2131 10.00 100.00 Customization of page size
1211 1.00 100.00 Pages with Spam, doorway
11164 1.00 100.00 Optional Search (using ‘OR’ operator)
11124 1.00 100.00 Case sensitive Search
11154 1.00 100.00 Misspell Corrected Search
231 1.00 100.00 Online help
11123 1.00 100.00 Ignore stop words
1232 1.00 100.00 Pages related to same website
234 1.00 100.00 Related tutorial material
311 2.00 100.00 Load Page Time
312 0.50 100.00 Automatic search suggestion time
3131 3.50 100.00 Time to evaluate best result
11221 1.00 100.00 Popularity based Search
11222 1.00 100.00 Content based Search
411 1.00 100.00 Popular pages results
134 1.00 100.00 Sport score
1242 1.00 100.00 Usage rights
11151 1.00 100.00 Abbreviation Search
1244 4.00 100.00 field to search keyword [title,text,URL,Link]
11152 1.00 100.00 Synonyms Search
212 10.00 100.00 Operability
Google
ID E[%] Subsystem
-------------------------------------------------------------
1111 100.00 Keyword Searching
1000 100.00 Temp GCD for 11121 11122
1001 100.00 Temp GCD for 11123 11124
1002 100.00 Temp GCD for 1000 1001
1003 100.00 Temp GCD for 1111 1002
1114 91.07 Numeric Expression Search
1115 100.00 Relational Searching
1116 95.98 Operator Searching
1004 95.49 Temp GCD for 1113 1114 1115 1116
111 99.29 Text Searching
1121 94.16 Image Searching
1005 100.00 * Temp GCD for 11221 11222
1122 97.96 Video Searching
112 93.97 Multimedia Search
121 94.94 Security filter
122 87.66 Citation filter
123 87.54 Domain filter
1006 96.05 * 1006
124 98.19 Extended filter
1007 90.86 *Temp GCD for 122 123 124
127 84.25 Time filter
128 88.89 Location filter
1008 86.73 *Temp GCD for 125 126 127 128
12 89.35 Searching Filter
1009 90.83 Temp GCD for 131 132 133
1010 92.71 Temp GCD for 134 135 136 137
13 92.05 Specific activity Search
11 97.10 Searching Input
1011 94.26 Temp GCD for 11 12
1 93.79 Functionality
1012 100.00 Temp GCD for 211 212
213 91.07 Customization
21 98.81 Interface usability
1013 97.74 Temp GCD for 222 223
23 88.60 User guide
1014 92.71 Temp GCD for 221 1013 23
2 96.95 Usability
1015 100.00 Temp GCD for 311 312
1016 97.16 Temp GCD for 3131 3132
31 99.13 Loading time
3 95.82 Performance
41 94.37 User satisfaction
42 91.90 Confusion matrix
4 93.87 Reliability
9999 94.89 Whole_Search_Engine
Yahoo
ID E[%] Subsystem
-------------------------------------------------------------
1111 88.89 Keyword Searching
1000 92.65 Temp GCD for 11121 11122
1001 100.00 Temp GCD for 11123 11124
1002 93.62 Temp GCD for 1000 1001
1003 90.50 Temp GCD for 1111 1002
1114 61.99 Numeric Expression Search
1115 100.00 Relational Searching
1116 83.81 Operator Searching
1004 85.31 Temp GCD for 1113 1114 1115 1116
111 89.68 Text Searching
1121 94.16 Image Searching
1005 94.16 * Temp GCD for 11221 11222
1122 93.20 Video Searching
112 92.15 Multimedia Search
121 89.74 Security filter
122 86.12 Citation filter
123 87.54 Domain filter
1006 81.04 * 1006
124 77.62 Extended filter
1007 85.95 *Temp GCD for 122 123 124
127 86.62 Time filter
128 85.50 Location filter
1008 84.98 *Temp GCD for 125 126 127 128
12 85.61 Searching Filter
1009 93.27 Temp GCD for 131 132 133
1010 89.21 Temp GCD for 134 135 136 137
13 90.62 Specific activity Search
11 90.65 Searching Input
1011 88.83 Temp GCD for 11 12
1 89.21 Functionality
1012 95.13 Temp GCD for 211 212
213 91.07 Customization
21 94.59 Interface usability
1013 88.89 Temp GCD for 222 223
23 87.41 User guide
1014 85.78 Temp GCD for 221 1013 23
2 88.40 Usability
1015 100.00 Temp GCD for 311 312
1016 97.16 Temp GCD for 3131 3132
31 99.13 Loading time
3 90.87 Performance
41 86.44 User satisfaction
42 79.86 Confusion matrix
4 85.05 Reliability
9999 88.18 Whole_Search_Engine
Yahoo (components sorted by increasing preference)
ID E[%] Subsystem
-------------------------------------------------------------
1114 61.99 Numeric Expression Search
124 77.62 Extended filter
42 79.86 Confusion matrix
1006 81.04 * 1006
1116 83.81 Operator Searching
1008 84.98 *Temp GCD for 125 126 127 128
4 85.05 Reliability
1004 85.31 Temp GCD for 1113 1114 1115 1116
128 85.50 Location filter
12 85.61 Searching Filter
1014 85.78 Temp GCD for 221 1013 23
1007 85.95 *Temp GCD for 122 123 124
122 86.12 Citation filter
41 86.44 User satisfaction
127 86.62 Time filter
23 87.41 User guide
123 87.54 Domain filter
bing
ID E[%] Subsystem
-------------------------------------------------------------
1111 88.89 Keyword Searching
1000 88.89 Temp GCD for 11121 11122
1001 100.00 Temp GCD for 11123 11124
1002 90.35 Temp GCD for 1000 1001
1003 89.40 Temp GCD for 1111 1002
1114 84.25 Numeric Expression Search
1115 97.73 Relational Searching
1116 88.19 Operator Searching
1004 90.12 Temp GCD for 1113 1114 1115 1116
111 89.51 Text Searching
1121 88.89 Image Searching
1005 100.00 * Temp GCD for 11221 11222
1122 97.96 Video Searching
112 92.34 Multimedia Search
121 89.74 Security filter
122 87.66 Citation filter
123 92.71 Domain filter
1006 88.14 * 1006
124 94.34 Extended filter
1007 90.38 *Temp GCD for 122 123 124
127 86.62 Time filter
128 88.89 Location filter
1008 86.75 *Temp GCD for 125 126 127 128
12 89.07 Searching Filter
1009 88.89 Temp GCD for 131 132 133
1010 91.06 Temp GCD for 134 135 136 137
13 90.30 Specific activity Search
11 90.62 Searching Input
1011 90.07 Temp GCD for 11 12
1 90.12 Functionality
1012 95.13 Temp GCD for 211 212
213 91.07 Customization
21 94.59 Interface usability
1013 88.89 Temp GCD for 222 223
23 88.60 User guide
1014 88.72 Temp GCD for 221 1013 23
2 88.86 Usability
1015 100.00 Temp GCD for 311 312
1016 97.16 Temp GCD for 3131 3132
31 99.13 Loading time
3 97.22 Performance
41 86.44 User satisfaction
42 85.14 Confusion matrix
4 86.18 Reliability
9999 89.78 Whole_Search_Engine
bing (components sorted by increasing preference)
ID E[%] Subsystem
-------------------------------------------------------------
1114 84.25 Numeric Expression Search
42 85.14 Confusion matrix
4 86.18 Reliability
41 86.44 User satisfaction
127 86.62 Time filter
1008 86.75 *Temp GCD for 125 126 127 128
122 87.66 Citation filter
1006 88.14 * 1006
1116 88.19 Operator Searching
23 88.60 User guide
1014 88.72 Temp GCD for 221 1013 23
2 88.86 Usability
1000 88.89 Temp GCD for 11121 11122
1121 88.89 Image Searching
128 88.89 Location filter
1111 88.89 Keyword Searching
1009 88.89 Temp GCD for 131 132 133
1013 88.89 Temp GCD for 222 223
12 89.07 Searching Filter
1003 89.40 Temp GCD for 1111 1002
111 89.51 Text Searching
121 89.74 Security filter
Ask
ID E[%] Subsystem
-------------------------------------------------------------
1111 77.78 Keyword Searching
1000 88.89 Temp GCD for 11121 11122
1001 100.00 Temp GCD for 11123 11124
1002 90.35 Temp GCD for 1000 1001
1003 81.82 Temp GCD for 1111 1002
1114 68.20 Numeric Expression Search
1115 88.89 Relational Searching
1116 92.00 Operator Searching
1004 69.68 Temp GCD for 1113 1114 1115 1116
111 79.87 Text Searching
1121 86.45 Image Searching
1005 94.16 * Temp GCD for 11221 11222
1122 93.20 Video Searching
112 89.80 Multimedia Search
121 83.82 Security filter
122 86.12 Citation filter
123 87.54 Domain filter
1006 81.04 * 1006
124 90.62 Extended filter
1007 86.53 *Temp GCD for 122 123 124
127 84.25 Time filter
128 74.38 Location filter
1008 77.74 *Temp GCD for 125 126 127 128
12 83.16 Searching Filter
1009 77.78 Temp GCD for 131 132 133
1010 78.06 Temp GCD for 134 135 136 137
13 77.96 Specific activity Search
11 83.61 Searching Input
1011 83.45 Temp GCD for 11 12
1 82.26 Functionality
1012 88.89 Temp GCD for 211 212
213 82.05 Customization
21 87.98 Interface usability
1013 86.62 Temp GCD for 222 223
23 78.25 User guide
1014 79.83 Temp GCD for 221 1013 23
2 85.55 Usability
1015 100.00 Temp GCD for 311 312
1016 96.46 Temp GCD for 3131 3132
31 98.91 Loading time
3 63.49 Performance
41 72.13 User satisfaction
42 78.51 Confusion matrix
4 73.33 Reliability
9999 77.35 Whole_Search_Engine
Ask (components sorted by increasing preference)
ID E[%] Subsystem
-------------------------------------------------------------
3 63.49 Performance
1114 68.20 Numeric Expression Search
1004 69.68 Temp GCD for 1113 1114 1115 1116
41 72.13 User satisfaction
4 73.33 Reliability
128 74.38 Location filter
altavista
ID E[%] Subsystem
-------------------------------------------------------------
1111 77.78 Keyword Searching
1000 77.78 Temp GCD for 11121 11122
1001 100.00 Temp GCD for 11123 11124
1002 80.69 Temp GCD for 1000 1001
1003 78.78 Temp GCD for 1111 1002
1114 61.99 Numeric Expression Search
1115 88.89 Relational Searching
1116 86.05 Operator Searching
1004 65.58 Temp GCD for 1113 1114 1115 1116
111 76.65 Text Searching
1121 86.45 Image Searching
1005 94.16 * Temp GCD for 11221 11222
1122 93.20 Video Searching
112 89.80 Multimedia Search
121 94.94 Security filter
122 86.12 Citation filter
123 87.54 Domain filter
1006 81.04 * 1006
124 77.62 Extended filter
1007 87.11 *Temp GCD for 122 123 124
127 86.62 Time filter
128 66.67 Location filter
1008 77.27 *Temp GCD for 125 126 127 128
12 83.30 Searching Filter
1009 81.59 Temp GCD for 131 132 133
1010 75.92 Temp GCD for 134 135 136 137
13 77.89 Specific activity Search
11 81.49 Searching Input
1011 82.12 Temp GCD for 11 12
1 81.20 Functionality
1012 88.89 Temp GCD for 211 212
213 82.05 Customization
21 87.98 Interface usability
1013 88.89 Temp GCD for 222 223
23 84.22 User guide
1014 83.86 Temp GCD for 221 1013 23
2 88.10 Usability
1015 100.00 Temp GCD for 311 312
1016 92.94 Temp GCD for 3131 3132
31 97.79 Loading time
3 96.62 Performance
41 57.47 User satisfaction
42 79.57 Confusion matrix
4 60.97 Reliability
9999 78.42 Whole_Search_Engine
altavista (components sorted by increasing preference)
ID E[%] Subsystem
-------------------------------------------------------------
41 57.47 User satisfaction
4 60.97 Reliability
1114 61.99 Numeric Expression Search
1004 65.58 Temp GCD for 1113 1114 1115 1116
128 66.67 Location filter
1010 75.92 Temp GCD for 134 135 136 137
111 76.65 Text Searching
1008 77.27 *Temp GCD for 125 126 127 128
124 77.62 Extended filter
1000 77.78 Temp GCD for 11121 11122
1111 77.78 Keyword Searching
13 77.89 Specific activity Search
Note: all sorted reports are divided into two parts: the first part contains components that are below the average quality level for the analyzed system, and the second part contains those components that are above the average quality level.
9.3 Results of the search engines [SearchEngine.res]
Each block below lists the preference scores (in %) for one system; the last value in each block is that system's global preference.
Google
100 100 100 100 100 100 100 100 100 100
100 90.3846 88.8889 100 91.0711 100 100 100 100 100
100 77.7778 88.8889 100 95.978 95.4867 99.2907 88.8889 88.8889 100
94.1613 100 100 100 88.8889 97.959 88.8889 93.967 100 88.8889
94.9383 88.8889 77.7778 87.6579 77.7778 100 88.8889 87.5421 88.8889 100
100 100 96.0476 98.1875 90.8575 100 66.6667 66.6667 88.8889 84.2513
88.8889 88.8889 88.8889 86.7288 89.3539 100 100 77.7778 88.8889 88.8889
88.8889 100 90.8257 92.7102 92.0489 97.1004 94.2619 93.7855 100 100
100 100 88.8889 91.0711 98.8115 100 100 88.8889 97.738 100
88.8889 62.5 100 88.5995 92.7073 96.9464 100 100 100 86.0927
100 97.1553 99.1317 88.8 95.8232 100 100 88.8889 94.3746 89.899
94.9495 87.8788 92.9293 91.9015 93.8709 94.8858
Yahoo
88.8889 88.8889 88.8889 100 88.8889 92.6521 100 100 100 93.6205
90.4972 80.7692 88.8889 0 61.9853 100 100 100 100 100
88.8889 77.7778 77.7778 77.7778 83.8056 85.312 89.6803 88.8889 88.8889 100
94.1613 88.8889 100 94.1633 88.8889 93.1976 88.8889 92.1478 100 77.7778
89.7373 88.8889 66.6667 86.1237 77.7778 100 88.8889 87.5421 77.7778 88.8889
77.7778 75 81.0445 77.6207 85.9511 86.6667 77.7778 77.7778 88.8889 86.6216
77.7778 88.8889 85.4964 84.9819 85.6086 88.8889 88.8889 100 100 100
88.8889 77.7778 93.2663 89.2081 90.6206 90.6536 88.8309 89.214 88.8889 100
95.1324 100 88.8889 91.0711 94.5942 77.7778 88.8889 88.8889 88.8889 100
88.8889 57.5 100 87.4093 85.7789 88.4011 100 100 100 86.0927
100 97.1553 99.1317 75.6 90.8727 100 88.8889 77.7778 86.4394 74.7475
79.798 77.7778 84.8485 79.8605 85.0506 88.18
bing
88.8889 88.8889 88.8889 88.8889 88.8889 88.8889 100 100 100 90.3518
89.3963 82.6923 88.8889 66.6667 84.2513 100 100 88.8889 100 97.7303
88.8889 77.7778 77.7778 100 88.1911 90.1191 89.5087 88.8889 88.8889 88.8889
88.8889 100 100 100 88.8889 97.959 88.8889 92.3353 100 77.7778
89.7373 88.8889 77.7778 87.6579 88.8889 100 88.8889 92.7142 88.8889 100
77.7778 100 88.1402 94.3431 90.384 90 77.7778 77.7778 88.8889 86.6216
88.8889 88.8889 88.8889 86.7543 89.0683 88.8889 88.8889 88.8889 100 88.8889
88.8889 88.8889 88.8889 91.0635 90.3001 90.6213 90.0725 90.1213 88.8889 100
95.1324 100 88.8889 91.0711 94.5942 88.8889 88.8889 88.8889 88.8889 100
88.8889 62.5 100 88.5995 88.7152 88.8618 100 100 100 86.0927
100 97.1553 99.1317 93 97.2208 100 88.8889 77.7778 86.4394 84.8485
84.8485 81.8182 87.8788 85.1395 86.1767 89.7764
Ask
77.7778 77.7778 77.7778 88.8889 88.8889 88.8889 100 100 100 90.3518
81.8171 10.4167 88.8889 9.09091 68.2024 88.8889 88.8889 88.8889 88.8889 88.8889
100 77.7778 77.7778 88.8889 91.9989 69.6791 79.8701 77.7778 88.8889 88.8889
86.4543 88.8889 100 94.1633 88.8889 93.1976 88.8889 89.803 88.8889 77.7778
83.819 88.8889 66.6667 86.1237 77.7778 100 88.8889 87.5421 77.7778 88.8889
77.7778 100 81.0445 90.6169 86.5303 80 66.6667 66.6667 88.8889 84.2513
66.6667 77.7778 74.3761 77.7405 83.1634 77.7778 77.7778 77.7778 88.8889 88.8889
77.7778 66.6667 77.7778 78.0629 77.9631 83.6054 83.4502 82.2622 88.8889 88.8889
88.8889 100 77.7778 82.0512 87.9798 77.7778 88.8889 77.7778 86.6216 88.8889
77.7778 55 88.8889 78.2542 79.8324 85.5476 100 100 100 82.7815
100 96.4576 98.9141 32 63.4887 77.7778 77.7778 66.6667 72.1309 69.697
79.798 77.7778 83.8384 78.5127 73.3343 77.3459
altavista
77.7778 77.7778 77.7778 77.7778 77.7778 77.7778 100 100 100 80.6944
78.7777 2.08333 88.8889 0 61.9853 88.8889 88.8889 88.8889 88.8889 88.8889
88.8889 77.7778 77.7778 88.8889 86.0481 65.5789 76.6524 77.7778 88.8889 88.8889
86.4543 88.8889 100 94.1633 88.8889 93.1976 88.8889 89.803 100 88.8889
94.9383 88.8889 66.6667 86.1237 77.7778 100 88.8889 87.5421 77.7778 88.8889
77.7778 75 81.0445 77.6207 87.1146 76.6667 77.7778 77.7778 88.8889 86.6216
66.6667 66.6667 66.6667 77.2685 83.3033 88.8889 77.7778 77.7778 88.8889 77.7778
77.7778 66.6667 81.5946 75.9182 77.8871 81.4889 82.1161 81.2023 88.8889 88.8889
88.8889 100 77.7778 82.0512 87.9798 77.7778 88.8889 88.8889 88.8889 100
88.8889 57.5 88.8889 84.2168 83.8634 88.0973 100 100 96.4286 79.4702
100 92.9375 97.7867 94 96.6235 44.4444 33.3333 77.7778 57.4659 74.7475
79.798 77.7778 83.8384 79.5652 60.9744 78.4239
9.4 SUMMARY OF RESULTS FOR THE SearchEngine PROJECT
[SearchEngine.sum]
Following are the final numeric results. For a more detailed analysis, please see the other LSP reports (SearchEngine.lst and SearchEngine.txt).
Global preferences:
1. 94.89% Google
2. 89.78% bing
3. 88.18% Yahoo
4. 78.42% altavista
5. 77.35% Ask
Global preferences relative to the best system:
1. 100.00% Google
2. 94.62% bing
3. 92.93% Yahoo
4. 82.65% altavista
5. 81.51% Ask
Interpretation of results: