Q-2] Describe some problems that you had with automated testing tools.
I faced several problems while working with test automation tools, such as:
1) Tool limitations in object detection.
2) Tool configuration / deployment in various environments.
3) Tool precision / default skeleton script issues, such as window synchronization issues.
4) Tool bugs with respect to exception handling.
5) Inconsistent tool behavior: sometimes a script works and sometimes it does not, for the same application / same script / same environment.
Q-5] How did you use automated testing tools in your job?
I used automated testing tools for regression, performance, and functionality testing.
Q-6] What is Data-Driven automation?
Data-driven automation is an important part of test automation in which the same test cases are executed against different sets of test input data, so that the test runs for a pre-defined number of iterations with a different set of input data on each iteration.
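The idea above can be sketched in Python; the `login` function and the credential rows are hypothetical stand-ins for a real application and its test data:

```python
# Minimal data-driven test sketch. The function under test and the
# data set are invented stand-ins, not from any real project.
def login(username, password):
    # Stand-in for the application logic under test.
    return username == "admin" and password == "secret"

# Each row is one iteration: same test logic, different input data.
test_data = [
    ("admin", "secret", True),   # valid credentials
    ("admin", "wrong", False),   # bad password
    ("", "", False),             # empty input
]

def run_data_driven(rows):
    # Execute the same check once per data row, collecting pass/fail.
    results = []
    for username, password, expected in rows:
        results.append(login(username, password) == expected)
    return results
```

Adding a new scenario then means adding a data row, not writing a new script.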
Q-8] How will you evaluate the tool for test automation?
Whenever a tool has to be evaluated, we need to go through a few important verifications / validations:
1) Platform support from the tool
2) Protocols / technologies support
3) Tool cost
4) Tool type and its features vs. our requirements analysis
5) Tool usage comparisons with other similar tools available in the market
6) Tool compatibility with our application architecture and development technologies
7) Tool configuration & deployment requirements
8) Tool limitation analysis
9) Test types expected (e.g. regression, functionality testing, performance / load testing)
10) Tool cost vs. project testing budget estimation
Q-13] What types of scripting techniques for test automation do you know?
Scripting Technique: how to structure automated test scripts for maximum benefit and minimum impact of
software changes, scripting issues, scripting approaches:linear, shared, data-driven and programmed, script
pre-processing, minimizing the impact of Software changes on test scripts. The major ones I had used are:
1) Data-Driven Scripting.
1) Centralized Application Specific / Generic Compiled Modules / Library Development
1) Parent Child Scripting.
2
1) Techniques To Generalize the scripting.
1) Increasing the factor of Reusability of the script.
Q-15] What tools are available for support of testing during the software development life cycle?
Test Director for test management and Bugzilla for bug tracking and notification are examples of tools that support testing. Rational Test Studio is used across the entire software development life cycle.
Rational PurifyPlus (for unit testing):
1) Purify: runtime errors, memory leak testing at the unit level
2) Quantify: performance bottlenecks, third-party controls
3) PureCoverage: code coverage tool
Rational Robot
- Automated functional testing tool, for functionality testing & performance testing
Planning:
1) Rational Admin 2) Recording workflow 3) Recording/playback settings
4) Verification points 5) Data pool 6) SQA Basic 7) Performance testing
Reliability: the occurrence of failures is very low, i.e. the tool is reliable, and feature functionality is consistent.
1) Some basic knowledge of coding standards is good to have.
2) Skill to interpret the results given by the tool and perform analysis to the level needed to meet the requirements.
Q-19] How to find whether a tool works with your existing system?
1) The tool should support our system development and deployment technologies.
2) The tool should be compatible with all the third-party tools used by our application.
3) The tool should support all platforms that our application supports for deployment.
4) There should be no major environmental settings required by the tool that might cause problems for the existing system.
5) The tool should not create any conflict with the other tools in the current system.
6) The tool should not create any memory conflicts for the application.
Q-20] How can one file-compare future-dated output files from a program which has changed against a baseline run which used the current date for input, when the client does not want to mask dates on the output file to allow compares?
Rerun the baseline with future-dated input files, using the same number of days in the future as the changed program's run. Then run a file compare of the baseline's future-dated output against the changed program's future-dated output.
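The rerun-and-compare approach can be sketched with Python's standard library; the file names and contents here are illustrative only:

```python
# Sketch: write both future-dated outputs to temp files, then
# byte-compare them. No date masking is needed because both runs
# used input shifted the same number of days into the future.
import filecmp
import os
import tempfile

def write_output(path, lines):
    with open(path, "w") as f:
        f.write("\n".join(lines))

tmp = tempfile.mkdtemp()
baseline = os.path.join(tmp, "baseline_future.out")
changed = os.path.join(tmp, "changed_future.out")

# Illustrative output of the two runs; the dates match by construction.
write_output(baseline, ["RUN DATE 2031-01-15", "TOTAL 100"])
write_output(changed, ["RUN DATE 2031-01-15", "TOTAL 100"])

# Only genuine program differences will now show up in the compare.
same = filecmp.cmp(baseline, changed, shallow=False)
```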
Reliability
The accuracy and repeatability of your test automation.
The number of tests that failed due to defects in the tests or in the test scripts.
Flexibility
The ease of working with all the different kinds of automation test ware.
The time and effort needed to identify, locate, restore, combine and execute the different test automation
test ware.
Efficiency
The total cost related to the effort needed for the automation.
Monitoring over time the total cost of automated testing, i.e. resources, material, etc.,
Portability
The ability of automated test to run on different environments.
The effort and time needed to set-up and run test automation in a new environment.
Robustness
The effectiveness of automation on an unstable or rapidly changing system
Usability
The extent to which automation can be used by different types of users.
The time needed to train users to become confident and productive with test automation.
Check the maximum field lengths to ensure that there are no truncated characters.
Where the database requires a value (other than null) then this should be defaulted into fields. The user
must either enter an alternative valid value or leave the default value intact.
If numeric fields accept negative values can these be stored correctly on the database and does it make
sense for the field to accept negative numbers?
If a set of radio buttons represent a fixed set of values such as A, B and C then what happens if a blank
value is retrieved from the database? (In some situations rows can be created on the database by other
functions which are not screen based and thus the required initial values can be incorrect.)
If a particular set of data is saved to the database check that each value gets saved fully to the database. i.e.
Beware of truncation (of strings) and rounding of numeric values.
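A minimal sketch of such a truncation check in Python, assuming a hypothetical column that silently truncates (as many databases do):

```python
# Hedged sketch of a truncation check: save a value through a
# simulated fixed-width column, then verify nothing was lost.
def save_to_column(value, max_len):
    # Stand-in for a database column: many databases silently
    # truncate strings longer than the declared column width.
    return value[:max_len]

def check_no_truncation(value, max_len):
    # The test passes only if the round-tripped value is intact.
    return save_to_column(value, max_len) == value
```

A 50-character string in a 50-character column passes; a 51-character string fails, exposing the silent truncation.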
1 Does the overall design implement all explicit requirements? Has a traceability table been developed?
2 Does the overall design achieve all implicit requirements?
9 Has the design defined both procedural and data abstractions that can be reused?
10 Has the design been defined and represented in a stepwise fashion?
11 Has the resultant software architecture been partitioned for ease of implementation? Maintenance?
12 Have the concepts of information hiding and functional independence been followed throughout the
design?
13 Has a Design Specification been developed for the software?
B For data design:
1 Have data objects defined in the analysis model been properly translated into the required data structures?
2 Do the data structures contain all attributes defined in the analysis model?
3 Have any new data structures and/or attributes been defined at design time?
4 How do any new data structures and/or attributes relate to the analysis model and to overall user
requirements?
5 Have the simplest data structures required to do the job been chosen?
6 Can the data structures be implemented directly in the programming language of choice?
7 How are data communicated between software components?
8 Do explicit data components (e.g., a database) exist? If so, what is their role?
C For architectural design:
1 Has a library of architectural styles been considered prior to the definition of the resultant software
architecture?
2 Has architectural tradeoff analysis been performed?
3 Is the resultant software architecture a recognizable architectural style?
4 Has the architecture been exercised against existing usage scenarios?
5 Has an appropriate mapping been used to translate the analysis model into the architectural model?
6 Can quality characteristics associated with the resultant architecture (e.g., a factored call-and-return
architecture) be readily identified from information provided in the design model?
D For user interface design:
1 Have the results of task analysis been documented?
2 Have goals for each user task been identified?
3 Has an action sequence been defined for each user task?
4 Have various states of the interface been documented?
5 Have objects and actions that appear within the context of the interface been defined?
6 Have the three "golden rules" (SEPA, 5/e, p. 402) been maintained throughout the GUI design?
7 Has flexible interaction been defined as a design criterion throughout the interface?
8 Have expert and novice modes of interaction been defined?
9 Have technical internals been hidden from the casual user?
10 Is the on-screen metaphor (if any) consistent with the overall applications?
11 Are icons clear and understandable?
12 Is interaction intuitive?
13 Is system response time consistent across all tasks?
14 Has an integrated help facility been implemented?
15 Are all error messages displayed by the interface easy to understand? Do they help the user resolve the
problem quickly?
16 Is color being used effectively?
17 Has a prototype for the interface been developed?
18 Have user's impressions of the prototype been collected in an organized manner?
E For component-level design:
1 Have proof of correctness techniques (SEPA, 5/e, Chapter 26) been applied to all algorithms?
2 Has each algorithm been "desk-tested" to uncover errors? Is each algorithm correct?
3 Is the design of the algorithm consistent with the data structures that the component manipulates?
4 Have algorithmic design alternatives been considered? If yes, why was this design chosen?
5 Has the complexity of each algorithm been computed?
6 Have structured programming constructs been used throughout?
Q-29] What is End-To-End Testing?
Similar to system testing; the 'macro' end of the test scale; involves testing of a complete application
environment in a situation that mimics real-world use, such as interacting with a database, using network
communications, or interacting with other hardware, applications, or systems if appropriate.
Functional testing, simply stated, verifies that an application does what it is supposed to do. For example, if
you were functionally testing a word processing application, a partial list of checks you would perform
includes creating, saving, editing, spell checking and printing documents.
Positive functional testing entails exercising the application's functions with valid input and verifying the
outputs are correct. Continuing with the word processing example, a positive test for the printing function
might be to print a document containing both text and graphics to a printer that is online, filled with paper
and for which the correct drivers are installed.
Negative functional testing involves exercising application functionality using a combination of invalid
inputs, unexpected operating conditions and other "out-of-bounds" scenarios. Continuing the word
processing example, a negative test for the printing function might be to disconnect the printer from the
computer while a document is printing. What probably should happen in this scenario is a plain-English
error message appears, informing the user what happened and instructing him/her on how to remedy the
problem. What might happen, instead, is the word processing software simply hangs up or crashes because
the "abnormal" loss of communications with the printer isn't handled properly.
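The pattern of a negative test asserting a graceful, plain-English failure can be sketched in Python; the printer API here is invented for illustration:

```python
# Illustrative negative test: exercise an out-of-bounds scenario and
# assert the application fails gracefully with a helpful message.
class PrinterDisconnected(Exception):
    pass

def print_document(doc, printer_online):
    # Stand-in for the application's printing function.
    if not printer_online:
        # Well-behaved software reports the problem instead of crashing.
        raise PrinterDisconnected(
            "The printer is not responding. Check the cable and try again.")
    return "printed"

def negative_test():
    try:
        print_document("report.doc", printer_online=False)
    except PrinterDisconnected as e:
        # Pass only if the message actually guides the user.
        return "Check the cable" in str(e)
    return False  # no error raised at all: the negative test fails
```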
Validating that an application or Web site conforms to its specifications and correctly performs all its required
functions. This entails a series of tests which perform a feature by feature validation of behavior, using a
wide range of normal and erroneous input data. This can involve testing of the product's user interface,
database management, security, installation, networking, etc.
acceptable performance? If modifications of an existing product, what are the current metrics?
What are the expected major bottlenecks and performance problem areas on this feature?
Performance tests should be designed to verify response time, execution time, throughput, Primary and
secondary memory utilization and traffic rates on data channels and communication links.
Performance testing measures the speed of the application and finds processing bottlenecks. It also provides a measurement for code and architecture changes: did they improve the response time, or make it worse? Typical questions:
1. Does the application respond fast enough for the users?
2. Does this release improve over the previous version?
3. In which module or component is the application spending most of its time? This will determine
where the development team should focus their attention in order to improve response time.
4. Does caching help or hinder the users? (Do they have to dump the cache manually to get the latest
data?)
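A minimal response-time measurement can be sketched with Python's `time.perf_counter`; the operation under test and the 1-second threshold are illustrative assumptions:

```python
# Sketch of measuring response time around a hypothetical operation.
import time

def timed(fn, *args):
    # Wrap any call with a wall-clock timer.
    start = time.perf_counter()
    result = fn(*args)
    elapsed = time.perf_counter() - start
    return result, elapsed

def fake_request():
    # Stand-in for a call to the application under test.
    return sum(range(1000))

result, elapsed = timed(fake_request)
# Illustrative SLA: the operation must respond within 1 second.
meets_sla = elapsed < 1.0
```

Recording `elapsed` across releases answers question 2 above: compare the measurements, not impressions.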
Q-32] Explain about Sanity Testing or Smoke Testing?
Typically an initial testing effort to determine if a new software version is performing well enough to accept it for a major testing effort. For example, if the new software is crashing systems every 5 minutes, bogging down systems to a crawl, or destroying databases, the software may not be in a 'sane' enough condition to warrant further testing in its current state.
Sanity tests are subsets of the confidence test and are used only to validate high-level functionality. They validate the basic activities the system needs to perform whenever each release is made.
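A smoke-test subset can be sketched as a short list of high-level checks that must all pass before deeper testing begins; the individual checks here are invented stand-ins:

```python
# Sketch of a smoke test: a few coarse checks that gate the build.
def app_starts():
    return True   # stand-in: launch the build, confirm no crash

def login_screen_loads():
    return True   # stand-in: main entry point renders

def database_reachable():
    return True   # stand-in: basic connectivity check

SMOKE_CHECKS = [app_starts, login_screen_loads, database_reachable]

def smoke_test():
    # Any single failure rejects the build for further testing.
    return all(check() for check in SMOKE_CHECKS)
```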
Q-33] Explain Scalability Testing?
Scalability testing exercises an application at an increasing number of concurrent users to determine when the application fails. The definition of failure for scalability testing purposes depends on the business need and criticality of the application under test. Common failures include the server hanging up or crashing, or the average application response time going from several seconds to several minutes.
Why can't the number of concurrent users just keep growing and growing without having any effect on the application? Bottlenecks, found in every application, prevent this from happening. Bottlenecks might lurk in one or more of the Web server, the application server, the database server, the application code, and the network infrastructure... or in many other areas. Scalability testing can be considered a specific case of performance testing.
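The idea of ramping concurrent users until a response-time threshold is breached can be sketched in Python; the load model and the threshold are invented for illustration:

```python
# Toy scalability ramp: increase simulated load until the (invented)
# response-time model breaches the failure threshold.
def avg_response_time(concurrent_users):
    # Invented load model: one extra second of latency per 10 users.
    return concurrent_users // 10

def find_breaking_point(threshold=5, step=10, max_users=1000):
    # Step the user count up and report the first failing load level.
    users = step
    while users <= max_users:
        if avg_response_time(users) > threshold:
            return users  # first load level that breaches the threshold
        users += step
    return None  # never failed within the tested range
```

In a real test the model would be replaced by measurements from a load generator; the ramp-and-check loop is the part that carries over.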
Q-34] What is a Test Bed?
A test bed includes the test design, test scripts, and test data, as well as the setup for each individual test procedure.
Q-35] What are Test Case Reader, Test Driver and Test Harness?
Test Case Reader
Reads and parses the intermediate format. Errors in the test data are reported.
Test Driver
A program or test tool used to execute software against a test case suite. A software module used to invoke a module under test and, often, provide test inputs, control and monitor execution, and report test results. It starts the test case reader and executes the lines of the test case. It embodies the testing methodology and conventions.
Test Harness
A testing tool that comprises a test driver and a test comparator.
Q-35] What are Test Log, Test Object & Test Procedure?
Test Log
Includes the test ID, test activities, who executed the test, start and stop times, pass or fail results, and comments.
Test Object
A software object (function/method, module/class, component/subsystem, system) which is to be tested. The test object should not be changed for the test, and all necessary surrounding components should already have been tested sufficiently.
Test Procedure
A formal document developed from a test plan that presents detailed instructions for the setup, operation, and evaluation of the results for each defined test.
Q-36] What is a Testing Strategy?
It is a background document for the testing department or group that defines "high level" testing issues. These issues include project scope and type, software and hardware used by development and testing, and success factors.
It indicates how testing is to be carried out, clearly indicating where special emphasis on various aspects of the system has to be given so that the best possible use of resources and time can be made.
A test strategy is a statement of the overall approach to testing, identifying what levels of testing are to be applied and the methods, techniques and tools to be used.
Q-37] What is a Use Case?
Use cases define a sequence of actions completed by a system or user that provides a recognizable result to the user.
Q-38] What is a Web Server?
A Web Server understands and supports only the HTTP protocol. A user typically submits a request to a system through a web browser. A Web Server (otherwise known as an HTTP Server) receives this web request. The web server must handle standard HTTP requests and responses, typically returning HTML to the calling user. Code that executes within the server environment may be CGI scripts, Servlets, ASP, or some other server-side technology.
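The request/response cycle a Web Server handles can be sketched with Python's standard `http.server` module; the page content is illustrative:

```python
# Minimal HTTP server sketch: receive a GET request, return HTML.
from http.server import BaseHTTPRequestHandler, HTTPServer
import threading
import urllib.request

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Return HTML to the calling user, as a Web Server does.
        body = b"<html><body>Hello</body></html>"
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the example quiet

server = HTTPServer(("127.0.0.1", 0), Handler)  # port 0: pick a free port
threading.Thread(target=server.serve_forever, daemon=True).start()

# Act as the browser: submit an HTTP request and read the HTML reply.
url = f"http://127.0.0.1:{server.server_port}/"
html = urllib.request.urlopen(url).read()
server.shutdown()
```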
Q-39] What is an Application Server?
An Application Server supports HTTP, TCP/IP and many more protocols. It is any server that supplies additional functionality related to enterprise computing -- for instance, load balancing, database access classes, transaction processing, messaging, and so on. Many more features, such as caches, clusters, and load balancing, are present in Application Servers that are not available in Web Servers. We can also configure an Application Server to work as a Web Server. In short, the Application Server is a superset of which the Web Server is a subset.
Q-40] What are SEI-CMM & its Levels?
The SEI contract was competitively awarded to Carnegie Mellon University in December 1984. The SEI staff has extensive technical and managerial experience from government, industry, and academia. The U.S. Department of Defense established the Software Engineering Institute to advance the practice of software engineering, because quality software that is produced on schedule and within budget is a critical component of U.S. defense systems. The SEI helps organizations and individuals to improve their software engineering practices. There are five levels in CMM, defined as follows:
CMM-1. Initial: The software process is characterized as ad hoc. Few processes are defined, and success depends on individual effort and heroics.
CMM-2. Repeatable: Basic project management processes are established to track cost, schedule, and functionality. The necessary process discipline is in place to repeat earlier successes on projects with similar applications.
CMM-3. Defined: The software process for both management and engineering activities is documented, standardized, and integrated into a standard software process for the organization. All projects use an approved, tailored version of the organization's standard software process for developing and maintaining software.
CMM-4. Managed: Detailed measures of the software process and product quality are collected. Both the software process and products are quantitatively understood and controlled.
CMM-5. Optimizing: Continuous process improvement is enabled by quantitative feedback from the process and from piloting innovative ideas and technologies.
Q-41] What is Six Sigma?
Six Sigma is a philosophy of working smarter so that we make fewer and fewer mistakes, and eventually no mistakes at all, in anything we do. Amongst many things, Six Sigma is a disciplined, data-driven approach, a customer-oriented management philosophy, and a statistic that reads "3.4 defects per million opportunities to make defects".
Central to the Six Sigma philosophy is the customer: it starts by understanding the customer's needs and then translating these needs into measurable parameters. This ensures that we are always in touch with market realities.
From the experience and knowledge gained in implementing a maturity model, an independent testing team will continuously work to make the process a benchmark for training.
Q-42] What is Security Testing?
Application security testing is performed to guarantee that only users with the appropriate authority are able to use the applicable features of the system. The systems engineer establishes different security settings for each user in the test environment. Network security testing is performed to guarantee that the network is safe and secure from unauthorized users. The test manager must consider the depth of hacking skill of the test staff and whether they would be able to perform an adequate set of challenges. Frequently, it is advisable to use outsourced security test specialists whose specialty is comprehensive security testing.
Q-43] What is Installation or Upgrade Testing?
Installation or upgrade testing involves testing the setup or upgrade routine to ensure that the product can be successfully installed. The test team decides whether to implement full builds or incremental ones. The build process typically ensures that the installation or upgrade testing is completed satisfactorily.
Q-44] What is Network Testing?
Network testing determines the behavior of the complete network infrastructure when different network latencies are applied. Such testing can uncover problems with slow network links and so on. The following infrastructure elements can be tested as part of network testing in the Internet Data Center architecture:
1) Network availability
2) Equipment redundancy
3) Cluster failover
4) Meshed topology
5) NIC teaming feature
6) Dual-port functionality
Q-45] What is Memory Testing?
This testing is designed to ensure that the application will run in the amount of memory specified in the technical documentation. It should also detect memory leaks associated with frequent starting and stopping of the application.
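One hedged way to detect memory growth across repeated start/stop cycles is Python's standard `tracemalloc` module; the "leak" here is simulated:

```python
# Sketch: measure traced memory before and after many start/stop
# cycles of a (simulated) leaky application.
import tracemalloc

_leak = []  # simulated leak: memory retained on every "start"

def start_and_stop_app():
    # Stand-in for starting and stopping the application; the
    # bytearray is never released, modeling a per-cycle leak.
    _leak.append(bytearray(10_000))

tracemalloc.start()
before, _ = tracemalloc.get_traced_memory()
for _ in range(100):
    start_and_stop_app()
after, _ = tracemalloc.get_traced_memory()
tracemalloc.stop()

# 100 cycles x ~10 KB retained: growth well above a 500 KB budget.
grew = (after - before) > 500_000
```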
Q-50] What is Data Integrity? Give examples of Data Integrity Testing.
Important data stored in the database include the catalogue, pricing, shipping tables, tax tables, order database, and customer information. Testing must verify the correctness of the stored data. Therefore, testing should be performed on a regular basis because data changes over time.
Examples of data integrity tests:
1) Test the creation, modification, and deletion of data in tables as specified in the functionality.
2) Test to make sure that sets of radio buttons represent a fixed set of values. Check what happens when a blank value is retrieved from the database.
3) Test that when a particular set of data is saved to the database, each value gets saved fully. In other words, truncation of strings and rounding of numeric values does not occur.
4) Test whether default values are saved in the database if the user input is not specified.
5) Test compatibility with old data. In addition, old hardware, versions of the operating system, and interfaces with other software need to be tested.
The main technique in security testing is to attempt to violate built-in security controls. This technique
ensures that the protection mechanisms in the system secure it from improper penetration. The tester
overwhelms the system by continuous requests, thereby denying service to others. The tester may
purposely cause system errors to penetrate during recovery or may browse through insecure data to find the
key to system entry.
There are two distinct areas of concern in e-commerce security: network security and payment transaction security. Types of security breaches in these areas are:
1) Secrecy
2) Authentication
3) Non-repudiation
4) Integrity control
1) User rights
2) Removable disk locations
3) Strength of password policies
4) Use of logon scripts and password expiration dates
5) Storage of passwords in clear text or encrypted form
The KSA report manager generates several reports to check miscellaneous sets of security-related concerns.
The software points out security loopholes only and does not trap unauthorized visitors.
Server Process
A server process (program) fulfills the client request by performing the task requested. Server programs
generally receive requests from client programs, execute database retrieval and updates, and manage data
integrity and dispatch responses to client requests. Sometimes server programs execute common or
complex business logic.
The server-based process "may" run on another machine on the network. This server could be the host
operating system or network file server; the server is then provided both file system services and
application services. Or in some cases, another desktop machine provides the application services. The
server process acts as a software engine that manages shared resources such as databases, printers,
communication links, or high powered-processors. The server process performs the back-end tasks that are
common to similar applications.
Q-56] What is Middleware?
Connectivity allows applications to transparently communicate with other programs or processes, regardless of their location. The key element of connectivity is the network operating system (NOS). The NOS provides services such as routing, distribution, messaging, file and print, and network management services. The NOS relies on communication protocols to provide specific services. The protocols are divided into three groups: media, transport, and client-server protocols. Media protocols determine the type of physical connections used on a network (some examples of media protocols are Ethernet, Token Ring, Fiber Distributed Data Interface (FDDI), coaxial and twisted-pair). A transport protocol provides the mechanism to move packets of data from client to server. Once the physical connection has been established and transport protocols chosen, a client-server protocol is required before the user can access the network services. A client-server protocol dictates the manner in which clients request information and services from a server and also how the server replies to that request.
Introduction
This document takes you through a whirlwind tour of common software errors. It is an excellent aid for software testing: it helps you to identify errors systematically, increases the efficiency of software testing, and improves testing productivity. For more information, please refer to Testing Computer Software (Wiley).
Type of Errors
5 Wrong function
6 Functionality must be created by user
7 Doesn't do what the user expects
Communication
Missing Information
Sl No Possible Error Conditions
1 No on-screen instructions
2 Assuming printed documentation is already available.
3 Undocumented features
4 States that appear impossible to exit
5 No cursor
6 Failure to acknowledge input
7 Failure to show activity during long delays
8 Failure to advise when a change will take effect
9 Failure to check for the same document being opened twice
Wrong, misleading, confusing information
10 Simple factual errors
11 Spelling errors
12 Inaccurate simplifications
13 Invalid metaphors
14 Confusing feature names
15 More than one name for the same feature
16 Information overload
17 When are data saved
18 Wrong function
19 Functionality must be created by user
20 Poor external modularity
Help text and error messages
21 Inappropriate reading levels
22 Verbosity
23 Inappropriate emotional tone
24 Factual errors
25 Context errors
26 Failure to identify the source of error
27 Forbidding a resource without saying why
28 Reporting non-errors
29 Failure to highlight the part of the screen
30 Failure to clear highlighting
31 Wrong/partial string displayed
32 Message displayed for too long or not long enough
Display Layout
33 Poor aesthetics in screen layout
34 Menu Layout errors
35 Dialog box layout errors
36 Obscured Instructions
37 Misuse of flash
38 Misuse of color
39 Heavy reliance on color
40 Inconsistent with the style of the environment
41 Cannot get rid of on screen information
Output
42 Can't output certain data
43 Can't redirect output
44 Format incompatible with a follow-up process
45 Must output too little or too much
46 Can't control output layout
47 Absurd printout level of precision
48 Can't control labeling of tables or figures
49 Can't control scaling of graphs
Performance
50 Program Speed
51 User Throughput
52 Can't redirect output
53 Perceived performance
54 Slow program
55 Slow echoing
56 How to reduce user throughput
57 Poor responsiveness
58 No type ahead
59 No warning that the operation takes long time
60 No progress reports
61 Problems with time-outs
62 Program pesters you
Program Rigidity
User tailorability
Sl No Possible Error Conditions
1 Can't turn off case sensitivity
2 Can't tailor to hardware at hand
3 Can't change device initialization
4 Can't turn off automatic changes
5 Can't slow down/speed up scrolling
6 Can't do what you did last time
7 Failure to execute a customization commands
8 Failure to save customization commands
9 Side effects of feature changes
10 Can't turn off the noise
11 Infinite tailorability
Who is in control?
12 Unnecessary imposition of a conceptual style
13 Novice friendly, experienced hostile
14 Surplus or redundant information required
15 Unnecessary repetition of steps
16 Unnecessary limits
33 Non-standard use of function keys
34 Failure to filter invalid keys
35 Failure to indicate keyboard state changes
Missing Commands
State transitions
Sl No Possible Error Conditions
1 Can't do nothing and leave
2 Can't quit mid-program
3 Can't stop mid-command
4 Can't pause
Disaster prevention
5 No backup facility
6 No undo
7 No are you sure
8 No incremental saves
Inconsistencies
9 Inconsistent menu position
10 Inconsistent function key usage
11 Inconsistent error handling rules
12 Inconsistent editing rules
13 Inconsistent data saving rules
Error handling by the user
14 No user specifiable filters
15 Awkward error correction
16 Can't include comments
17 Can't display relationships between variables
Miscellaneous
18 Inadequate privacy or security
19 Obsession with security
20 Can't hide menus
21 Doesn't support standard OS features
22 Doesn't allow long names
Error Handling
Error prevention
Sl No Possible Error Conditions
1 Inadequate initial state validation
2 Inadequate tests of user input
3 Inadequate protection against corrupted data
4 Inadequate tests of passed parameters
5 Inadequate protection against operating system bugs
6 Inadequate protection against malicious use
7 Inadequate version control
Error Detection
Sl No Possible Error Conditions
1 Ignores overflow
2 Ignores impossible values
3 Ignores implausible values
4 Ignores error flag
5 Ignores hardware fault or error conditions
6 Data comparison
Error Recovery
Sl No Possible Error Conditions
1 Automatic error detection
2 Failure to report error
3 Failure to set an error flag
4 Where does the program go back to
5 Aborting errors
6 Recovery from hardware problems
7 No escape from missing disks
Calculation Errors
6 Impossible parentheses
7 Wrong order of calculations
8 Bad underlying functions
9 Overflow and Underflow
10 Truncation and Round-off error
11 Confusion about the representation of data
12 Incorrect conversion from one data representation to another
13 Wrong Formula
14 Incorrect Approximation
Race Conditions
4 Wrong returning state assumed
5 Exception handling based exits
Interrupts
Sl No Possible Error Conditions
1 Wrong interrupt vector
2 Failure to restore or update interrupt vector
3 Invalid restart after an interrupt
4 Failure to block or un-block interrupts
Program Stops
Sl No Possible Error Conditions
1 Dead crash
2 Syntax error reported at run time
3 Waiting for impossible condition or combinations of conditions
4 Wrong user or process priority
Loops
Sl No Possible Error Conditions
1 infinite loop
2 Wrong starting value for the loop control variables
3 Accidental change of loop control variables
4 Commands that do or don't belong inside the loop
5 Improper loop nesting
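The "accidental change of loop control variables" item can be made concrete with a small sketch; both functions are illustrative, not from the text:

```python
def count_matches(items, target):
    """Correct loop: the control variable is managed by the loop itself."""
    count = 0
    for item in items:
        if item == target:
            count += 1
    return count

def count_matches_buggy(items, target):
    """Buggy loop from the checklist: the control variable `i` is
    accidentally changed inside the body, so elements get skipped."""
    count, i = 0, 0
    while i < len(items):
        if items[i] == target:
            count += 1
            i += 1          # accidental extra increment: skips the next element
        i += 1
    return count
```

For the list `["a", "a", "b"]` and target `"a"`, the correct version counts 2 while the buggy one counts only 1, because the extra increment jumps over the second match.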
Multiple Cases
Sl No Possible Error Conditions
1 Missing default
2 Wrong default
3 Missing cases
4 Overlapping cases
5 Invalid or impossible cases
6 Commands being inside the THEN or ELSE clause
7 Case should be sub-divided
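A multi-way branch that avoids the "missing default" and "invalid or impossible cases" problems might look like this sketch (the `shipping_rate` function and its rates are hypothetical):

```python
def shipping_rate(weight_kg):
    """Multi-way branch with an explicit default.

    Ranges are checked in ascending order so cases cannot overlap,
    and the final `else` guarantees no weight falls through uncovered.
    """
    if weight_kg <= 0:
        raise ValueError("impossible case: non-positive weight")
    elif weight_kg <= 1:
        return 5
    elif weight_kg <= 5:
        return 10
    else:
        return 20          # explicit default: everything heavier
```

A tester would probe each boundary (0, 1, 5) plus one value in each range and one impossible value.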
Data boundaries
Sl No Possible Error Conditions
1 Un-terminated null strings
2 Early end of string
3 Read/Write past end of data structure or an element in it
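The "read past end of data structure" item is easy to illustrate; the helper below is hypothetical:

```python
def last_element(items):
    """Boundary-safe access: guard before reading the end of a list.

    Reading past the end of a data structure surfaces in Python as an
    IndexError rather than silent corruption, but guarding the boundary
    is still the tester's concern.
    """
    if not items:
        return None                    # early end of data: nothing to read
    return items[len(items) - 1]       # last valid index, not len(items)
```

The classic off-by-one bug here would be indexing with `len(items)` instead of `len(items) - 1`.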
Messaging Problems
Sl No Possible Error Conditions
1 Messages sent to wrong process or port
2 Failure to validate an incoming message
3 Lost or out of synch messages
4 Message sent to only N of N+1 processes
Data Storage corruption
Sl No Possible Error Conditions
1 Overwritten changes
2 Data entry not saved
3 Too much data for receiving process to handle
4 Overwriting a file after an error exit or user abort
Load Conditions
Hardware
14 Disk sector bug and other length dependent errors
15 Wrong operation or instruction codes
16 Misunderstood status or return code
17 Underutilizing device intelligence
18 Paging mechanism ignored or misunderstood
19 Ignores channel throughput limits
20 Assuming device is or isn't or should be or shouldn't be initialized
21 Assumes programmable function keys are programmed correctly
Testing Errors
Poor reporting
Sl No Possible Error Conditions
1 Illegible reports
2 Failure to make it clear how to reproduce the problem
3 Failure to say you can't reproduce the problem
4 Failure to check your report
5 Failure to report timing dependencies
6 Failure to simplify conditions
7 Concentration on trivia
8 Abusive language
GENERAL QUESTIONS-6
CONCEPTS
What is the difference between testing and debugging?
The big difference is that debugging is conducted by a programmer, who fixes the errors during the debugging phase. A tester never fixes the errors, but rather finds them and returns them to the programmer. A bug is an error that appears during execution of the program. There are two types of bugs: syntax and logical.
General Questions
Could you test a program 100%? 90%? Why?
Definitely not! The major problem with testing is that you cannot calculate how many errors are in the code, the functioning, etc. There are many factors involved, such as the experience of the programmer, the complexity of the system, and so on.
How would you test a mug (chair/table/gas station etc.)?
First of all you must ask for the requirements, functional specification, and design document of the mug. There you will find requirements like the ability to hold hot water, being waterproof, stability, breakability, and so on. Then you should test the mug against all those documents.
How would you conduct a test: top-down or down-top? What is it? Which one is better?
Down-Top (bottom-up): unit -> interface -> system.
Top-Down is the same path in reverse.
You may use both, but down-top allows you to discover malfunctions at earlier phases of development, where they are cheaper to fix than in the top-down case.
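The bottom-up path above starts at the unit level. A minimal illustration using Python's unittest; the `add` function is a stand-in unit, not from the text:

```python
import unittest

def add(a, b):
    """The unit under test: verified on its own before being wired
    into any interface or end-to-end system test."""
    return a + b

class TestAddUnit(unittest.TestCase):
    def test_typical_values(self):
        self.assertEqual(add(2, 3), 5)

    def test_boundary_values(self):
        self.assertEqual(add(0, 0), 0)
        self.assertEqual(add(-1, 1), 0)

# Run the unit-level suite programmatically.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(TestAddUnit)
result = unittest.TextTestRunner(verbosity=0).run(suite)
assert result.wasSuccessful()
```

Once the units pass in isolation, the same framework can drive interface and then system tests, which is the bottom-up order described above.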
How to develop a test plan ? How to develop a test case?
A test plan consists of test cases. You develop test cases according to the requirement and design documents of the unit, system, etc. You may be asked what you would do if you are not provided with requirements documents. Then you start creating your test cases based on the functionality of the system. You should mention that this is a bad practice, since there is no way to compare what you got against what you planned to create.
How do you see a QA role in the product development life cycle?
QA should be involved in the early stages of the development process in order to create adequate test cases and build a better general understanding of the system. QA, however, must be separated from the development team to ensure that developers cannot influence QA engineers. As the last resort before shipping the product to the customer, QA carries a great level of responsibility for the quality of the product and the image of the company on the market.
What is the size of your executable?
10MB. Who cares? You should demonstrate that you can't be caught off guard by unexpected questions. This question is one of the dumbest, but you must react accordingly. Tell any reasonable number you want, but be careful not to exaggerate!
How would you execute a SQL query in Oracle 8?
Again, if you have ever worked with Oracle, this question should be trivial for you to answer (from the command prompt, of course). If you have never worked with Oracle, note politely that you have not touched an Oracle database on your career path.
What version of OS were you using?
Tell whatever you want - you can't be caught here. Popular answers are Windows 95/98, Windows 2000 (make sure you know the various flavors) and various Unix flavors (AIX, Solaris, SunOS, HP-UX, etc.)
Have you tested front-end or back-end?
In other words, you are being asked whether you tested the GUI part of the application or the server part.
What's the most important thing in testing?
You may bring your own arguments, or simply mention "accuracy", "knowledge of the system", etc.
What was the most difficult problem you ever found while testing?
This is homework. Think about one and give it as an example.
What's your favorite command in Unix?
Again, this is homework. You may mention any command you want: ls, ps, cat, etc. Be careful not to mention a "complex" command unless you are absolutely sure what you are talking about. For example, if you mention "vi" you may be asked how you were using the vi editor.
What were you responsible to test in your previous company?
This is homework for you. Actually, this question is a test of the knowledge of your own resume. You must know your real or fake resume like a bible. Practice in front of a mirror or ask a friend to quiz you.
Why do you like to test?
You enjoy the bug hunting process, feel great being between developers and customers, your background and experience are aimed at enhancing testing techniques, and you feel proud of your contribution to the whole development process.
What role do you see yourself in 2-3 years from now? Would you want to become a developer?
You should not concentrate the interviewer's attention on your wish to become a developer. You are being hired for a testing role and you should demonstrate reliability. Team lead of a QA team is OK, but do not answer yes when asked if you are willing to become a developer.
What bug tracking system did you use?
Again and again, it does not matter what bug tracking system you used if you did your homework and invented the name of one or mentioned a standard one. You may say you've used a proprietary bug tracking system (this works especially well if your previous company was one way or another dealing with databases), or you may say Test Director, Clarify, ClearCase, VisualTest, etc.
Here I am sending answers for some of the recently asked interview questions. Please go through them before appearing for an interview. They may be helpful.
1. Test Bed: It is a combination of test scripts + a document explaining how to execute the test scripts
and analyze the results.
2. Test Driver: A program or tool used to feed various input conditions to the unit under test for different test cases.
3. Testing process: It explains the process we follow in our organization for testing:
1. Thorough Requirement Analysis.
2. Test Planning
3. Test Case Implementation.
4. Test Execution
5. Testing Metrics like Test Coverage
6. Risk Analysis.
7. Test Summary Report
4. Test Framework: This is the standard framework that should be followed by every person on the project. It contains information like coding standards, naming conventions, ways to develop test scripts, etc.
5. Test Scenario: The order in which we have to run the test cases.
For example: say we have 3 test cases: 1) withdraw money from the bank, 2) deposit money, 3) cancel a transaction. We can run different scenarios for the above 3 test cases, such as running 1, 2, 3 or running 1, 3, 2 or running 3, 2, 1, etc.
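Definitions 2 and 5 above can be sketched together: a tiny driver feeds input conditions to a unit under test, and a scenario is just the order in which the driver runs the cases. All names and cases here are illustrative assumptions, not from the text:

```python
def withdraw(balance, amount):
    """Stand-in unit under test."""
    if amount <= 0 or amount > balance:
        raise ValueError("invalid amount")
    return balance - amount

# Test cases: each maps a case id to (input conditions, expected outcome).
CASES = {
    1: ((100, 30), 70),            # valid withdrawal succeeds
    2: ((100, -5), ValueError),    # negative amount is rejected
    3: ((100, 200), ValueError),   # overdraw is rejected
}

def run_scenario(order):
    """Test driver: runs the cases in the scenario's order, returns results."""
    results = []
    for case_id in order:
        args, expected = CASES[case_id]
        try:
            outcome = withdraw(*args)
        except Exception as exc:
            outcome = type(exc)
        results.append((case_id, outcome == expected))
    return results

# The same cases in two different scenario orders:
assert all(ok for _, ok in run_scenario([1, 2, 3]))
assert all(ok for _, ok in run_scenario([3, 2, 1]))
```

Running the same cases in several orders is exactly what the Test Scenario definition describes, and can expose order-dependent state bugs in the unit.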
Interview Questions
On this page I put more than 200 different interview questions from different resources. Some of them are very simple, some are a little bit difficult. If you would like to check your technical knowledge or to see more questions and answers, you can download a free copy of the Exam application from the "Software we offer" page.
Test Automation:
1. What automating testing tools are you familiar with?
2. How did you use automating testing tools in your job?
3. Describe some problem that you had with automating testing tool.
4. How do you plan test automation?
5. Can test automation improve test effectiveness?
6. What is data - driven automation?
7. What are the main attributes of test automation?
8. Does automation replace manual testing?
9. How will you choose a tool for test automation?
10. How will you evaluate the tool for test automation?
11. What are main benefits of test automation?
12. What could go wrong with test automation
13. How will you describe testing activities?
14. What testing activities might you want to automate?
15. Describe common problems of test automation.
16. What types of scripting techniques for test automation do you know?
17. What are principles of good testing scripts for automation?
18. What tools are available for support of testing during software development life cycle?
19. Can the activities of test case design be automated?
20. What are the limitations of automating software testing?
21. What skills are needed to be a good test automator?
22. How do you find out whether tools work well with your existing system?
Load Testing:
1) What criteria would you use to select Web transactions for load testing?
2) For what purpose are virtual users created?
3) Why is it recommended to add verification checks to all your scenarios?
4) In what situation would you want to parameterize a text verification check?
5) Why do you need to parameterize fields in your virtual user script?
6) What are the reasons why parameterization is necessary when load testing the Web server and the database server?
7) How can data caching have a negative effect on load testing results?
8) What usually indicates that your virtual user script has dynamic data that is dependent on your parameterized fields?
9) What are the benefits of creating multiple actions within any virtual user script?
General questions:
1. What types of documents would you need for QA, QC, and Testing?
2. What did you include in a test plan?
3. Describe any bug you remember.
4. What is the purpose of the testing?
5. What do you like (not like) in this job?
6. What is quality assurance?
7. What is the difference between QA and testing?
8. How do you scope, organize, and execute a test project?
9. What is the role of QA in a development project?
10. What is the role of QA in a company that produces software?
11. Define quality for me as you understand it
12. Describe to me the difference between validation and verification.
13. Describe to me what you see as a process. Not a particular process, just the basics of having a
process.
14. Describe to me when you would consider employing a failure mode and effect analysis.
15. Describe to me the Software Development Life Cycle as you would define it.
16. What are the properties of a good requirement?
17. How do you differentiate the roles of Quality Assurance Manager and Project Manager?
18. Tell me about any quality efforts you have overseen or implemented. Describe some of the
challenges you faced and how you overcame them.
19. How do you deal with environments that are hostile to quality change efforts?
20. In general, how do you see automation fitting into the overall process of testing?
21. How do you promote the concept of phase containment and defect prevention?
22. If you come onboard, give me a general idea of what your first overall tasks will be as far as
starting a quality effort.
23. What kinds of testing have you done?
24. Have you ever created a test plan?
25. Have you ever written test cases or did you just execute those written by others?
26. What did you base your test cases on?
27. How do you determine what to test?
28. How do you decide when you have 'tested enough?'
29. How do you test if you have minimal or no documentation about the product?
30. Describe the basic elements you put in a defect report.
31. How do you perform regression testing?
32. At what stage of the life cycle does testing begin in your opinion?
33. How do you analyze your test results? What metrics do you try to provide?
34. Realising you won't be able to test everything - how do you decide what to test first?
35. Where do you get your expected results?
36. If automating - what is your process for determining what to automate and in what order?
37. In the past, I have been asked to verbally start mapping out a test plan for a common situation, such as an ATM. The interviewer might say, "Just thinking out loud, if you were tasked to test an ATM, what items might your test plan include?" These types of questions are not meant to be answered conclusively, but they are a good way for the interviewer to see how you approach the task.
38. If you're given a program that will average student grades, what kinds of inputs would you
use?
39. Tell me about the best bug you ever found.
40. What made you pick testing over another career?
41. What is the exact difference between Integration & System testing, give me examples with
your project.
42. How did you go about testing a project?
43. When should testing start in a project? Why?
44. How do you go about testing a web application?
45. Difference between Black & White box testing
46. What is Configuration management? Tools used?
47. What do you plan to become after say 2-5yrs (Ex: QA Manager, Why?)
48. Would you like to work in a team or alone, why?
49. Give me 5 strong & weak points of yours
50. Why do you want to join our company?
51. When should testing be stopped?
52. What sort of things would you put down in a bug report?
53. Who in the company is responsible for Quality?
54. Who defines quality?
55. What is an equivalence class?
56. Is "a fast database retrieval rate" a testable requirement?
57. Should we test every possible combination/scenario for a program?
58. What criteria do you use when determining when to automate a test or leave it manual?
59. When do you start developing your automation tests?
60. Discuss what test metrics you feel are important to publish in an organization.
61. In case anybody cares, here are the questions that I will be asking:
62. Describe the role that QA plays in the software lifecycle.
63. What should Development require of QA?
64. What should QA require of Development?
65. How would you define a "bug?"
66. Give me an example of the best and worst experiences you've had with QA.
67. How does unit testing play a role in the development / software lifecycle?
68. Explain some techniques for developing software components with respect to testability.
69. Describe a past experience with implementing a test harness in the development of software.
70. Have you ever worked with QA in developing test tools? Explain the participation
Development should have with QA in leveraging such test tools for QA use.
71. Give me some examples of how you have participated in Integration Testing.
72. How would you describe the involvement you have had with the bug-fix cycle between
Development and QA?
72. What is unit testing?
73. Describe your personal software development process.
74. How do you know when your code has met specifications?
75. How do you know your code has met specifications when there are no specifications?
76. Describe your experiences with code analyzers.
77. How do you feel about cyclomatic complexity?
78. Who should test your code?
79. How do you survive chaos?
80. What processes/methodologies are you familiar with?
81. What type of documents would you need for QA/QC/Testing?
82. How can you use technology to solve problems?
83. What type of metrics would you use?
84. How do you find out whether tools work well with your existing system?
85. What automated tools are you familiar with?
86. How well do you work with a team?
87. How would you ensure 100% coverage of testing?
88. How would you build a test team?
89. What problems do you have right now or have you had in the past? How did you solve them?
90. What will you do during the first day on the job?
91. What would you like to do five years from now?
92. Tell me about the worst boss you've ever had.
93. What are your greatest weaknesses?
94. What are your strengths?
95. What is a successful product?
96. What do you like about Windows?
97. What is good code?
98. Who is Kent Beck, Dr Grace Hopper, Dennis Ritchie?
99. What are basic, core practices for a QA specialist?
100. What do you like about QA?
101. What has not worked well in your previous QA experience and what would you change?
102. How will you begin to improve the QA process?
103. What is the difference between QA and QC?
104. What is UML and how to use it for testing?
105. What is CMM and CMMI? What is the difference?
106. What do you like about computers?
107. Do you have a favourite QA book? More than one? Which ones? And why.
108. What is the responsibility of programmers vs QA?
109. What are the properties of a good requirement?
110. How do you test if you have minimal or no documentation about the product?
111. What are all the basic elements in a defect report?
112. Is "a fast database retrieval rate" a testable requirement?
18. What are some of the typical bugs you encountered in your last assignment?
19. How do you prioritize testing tasks within a project?
20. How do you develop a test plan and schedule? Describe bottom-up and top-down approaches.
21. When should you begin test planning?
22. When should you begin testing?
23. Do you know of metrics that help you estimate the size of the testing effort?
24. How do you scope out the size of the testing effort?
25. How many hours a week should a tester work?
26. How should your staff be managed? How about your overtime?
27. How do you estimate staff requirements?
28. What do you do (with the project tasks) when the schedule fails?
29. How do you handle conflict with programmers?
30. How do you know when the product is tested well enough?
31. What characteristics would you seek in a candidate for test-group manager?
32. What do you think the role of test-group manager should be? Relative to senior management?
Relative to other technical groups in the company? Relative to your staff?
33. How do your characteristics compare to the profile of the ideal manager that you just described?
34. How does your preferred work style work with the ideal test-manager role that you just described?
What is different between the way you work and the role you described?
35. Who should you hire in a testing group and why?
36. What is the role of metrics in comparing staff performance in human resources management?
37. How do you estimate staff requirements?
38. What do you do (with the project staff) when the schedule fails?
39. Describe some staff conflicts you've handled.
Here are some questions you might be asked on a job interview for a testing opening: (from MU COSC 198
Software Testing by Dr. Corliss)
1) What was a problem you had in your previous assignment (testing, if possible)? How did you resolve it?
2) What are two of your strengths that you will bring to our QA/testing team?
3) What is the Waterfall Development Method, and do you agree with all the steps?
4) What is the V-Model Development Method, and do you agree with this model?
5) What is the Capability Maturity Model (CMM)? At what CMM level were the last few companies you worked for?
6) Could you tell me two things you did in your previous assignment (QA/Testing related, hopefully) that you are proud of?
7) List 5 words that best describe your strengths.
8) Define each of the following and explain how each relates to the others: Unit, System, and Integration testing.
9) Define Verification and Validation. Explain the differences between the two.
10) How do you go about going into a new organization? How do you assimilate?
11) Define the following and explain their usefulness: Change Management, Configuration Management, Version Control, and Defect Tracking.
12) What is IEEE 829? (This standard is important for software test documentation. Why?)
13) We have a testing assignment that is time-driven. Do you think automated tests are the best solution?
14) What is your experience with change control? Our development team has only 10 members. Do you think managing change is such a big deal for us?
15) Are reusable test cases a big plus of automated testing? Explain why.
16) Can you build a good audit trail using Compuware's QACenter products? Explain why.
17) Do you think tools are required for managing change? Explain, and please list some tools/practices which can help you manage change.
18) We believe in ad-hoc software processes for projects. Do you agree with this? Please explain your answer.
19) Are regression tests required, or do you feel there is a better use for resources?
20) Our software designers use UML for modeling applications. Based on their use cases, we would like to plan a test strategy. Do you agree with this approach, or would this mean more effort for the testers?
21) Tell me about a difficult time you had at work and how you worked through it.
22) Give me an example of something you tried at work that did not work out, so you had to go at things another way.
23) How can one file-compare future-dated output files from a program which has changed against the baseline run, which used the current date for input? The client does not want to mask dates on the output files to allow compares. Answer: Rerun the baseline and future-date the input files the same number of days as the future-dated run of the changed program. Then run a file compare between the baseline's future-dated output and the changed program's future-dated output.
Interviewing Suggestions
1) If you do not recognize a term, ask for further definition. You may know the methodology/term but have used a different name for it.
2) Always keep in mind that the employer wants to know what you are going to do for them; with that in mind, you should always stay positive.
Preinterview Questions
1) What is the structure of the company?
2) What are the employer's methods and processes used in the software arena?
3) What is the project you are interviewing for all about? Gather as much information as possible.