
GENERAL QUESTIONS-2

Q-1] What automated testing tools are you familiar with?


Mercury Interactive Tools & Rational Tools

Q-2] Describe some problems that you had with automated testing tools?
I had several problems working with test automation tools, such as:
1) Tool limitations for object detection.
2) Tool configuration/deployment in various environments.
3) Tool precision and default skeleton script issues, such as window synchronization issues.
4) Tool bugs with respect to exception handling.
5) Abnormal, inconsistent tool behavior: sometimes a script works and sometimes it does not, for the same
application, same script, and same environment.

Q-3] How do you plan test automation?


Planning is the most important task in test automation. A test automation plan should cover the following
task items:
1) Tool selection: types of test automation expected (regression, performance, etc.)
2) Tool evaluation: tool availability, tool license availability, tool license limitations.
3) Tool cost estimation vs. project estimation statistics for testing.
4) Resource requirements vs. availability study.
5) Time availability vs. time estimation statistics for testing.
6) Consideration of production requirement analysis results, with respect to factors like load/performance,
expected functionality, scalability, etc.
7) Test automation process definitions, including the standards to be followed while performing test
automation.
8) Test automation scope definitions.
9) Automation risk analysis, and planning to overcome the defined risks if they emerge in the automation
process.
10) Reference document requirements as prerequisites for test automation.

Q-4] Can test automation improve test effectiveness?


Yes, test automation definitely plays a vital role in improving test effectiveness, in various ways:
1) Reduction in slippage caused by human errors.
2) UI verification at the object/object-property level.
3) Virtual load/user generation in load and performance testing, where it is not possible to have so many
people physically perform the test and still get accurate results.
4) Precise time calculations.

Q-5] How did you use automated testing tools in your job?
I used automated testing tools for regression, performance, and functionality testing.
Q-6] What is data-driven automation?
Data-driven automation is an important part of test automation where the requirement is to execute the
same test cases for different sets of test input data, so that the test can be executed for pre-defined iterations
with a different set of test input data in each iteration.
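A minimal data-driven sketch (in Python, as a stand-in for any automation tool's scripting language): one test case is executed once per row of input data. The login function and the login_data.csv file are hypothetical placeholders, not part of any real tool.

```python
import csv
import unittest

def login(username, password):
    # Hypothetical system under test; stands in for the real application call.
    return username == "admin" and password == "secret"

class DataDrivenLoginTest(unittest.TestCase):
    def test_login_iterations(self):
        # Each CSV row drives one iteration: username, password, expected result.
        with open("login_data.csv", newline="") as f:
            for row in csv.DictReader(f):
                with self.subTest(user=row["username"]):
                    expected = row["expected"] == "pass"
                    self.assertEqual(login(row["username"], row["password"]), expected)

if __name__ == "__main__":
    unittest.main()
```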

Q-8] How will you evaluate a tool for test automation?
Whenever a tool has to be evaluated, we need to go through a few important verifications/validations:
1) Platform support from the tool.
2) Protocol/technology support.
3) Tool cost.
4) Tool type and its features vs. our requirements analysis.
5) Tool usage comparison with other similar tools available in the market.
6) Tool compatibility with our application architecture and development technologies.
7) Tool configuration and deployment requirements.
8) Tool limitation analysis.
9) Test types expected (e.g., regression, functionality, performance/load testing).
10) Tool cost vs. the project's testing budget estimation.

Q-9] What are the main benefits of test automation?


1) Test automation saves major testing time.
2) Saves resources (human, hardware, and software resources).
3) Reduction in verification slippages caused by human errors.
4) Verification at the object-property level, which is difficult to do manually.
5) Virtual load/user generation for load testing, which is not worth doing manually, as it needs lots of
resources and might still not give results as precise as those achieved with an automation tool.
6) Regression testing.
7) Data-driven testing.

Q-10] What could go wrong with test automation?


While using test automation there are various factors that can affect the testing process:
1) Tool limitations might be reported as application defects.
2) Abnormal tool behavior, such as scalability variation due to a memory violation in the tool, might be
mistaken for the application's memory violation in heavy load tests.
3) Environment settings required by the tool can cause the application to show bugs that are due only to
the tool's setup. I experienced this myself: a JDK installation required by the tool caused application
bugs, and on uninstalling the JDK and Java add-ins, my application worked fine.

Q-11] What testing activities may you want to automate?


Anything that is repeated should be automated if possible. Thus I feel the following testing activities can
be automated:
1) Test case preparation.
2) Test execution, such as regression, functional, and load/performance testing.
3) Test report generation.
4) Test status/results notification.
5) Bug tracking, etc.

Q-12] Describe common problems of test automation?


In test automation we come across several problems, of which I would like to highlight a few:
1) Automation script maintenance, which becomes tough if the product goes through frequent changes.
2) Automation tool limitations in recognizing objects.
3) Automation tool limitations in third-party integration.
4) Abnormal tool behavior due to scalability issues.
5) Due to tool defects, we might consider an issue an application defect and report it as an application
bug.
6) Environmental settings and APIs/add-ins required by the tool to make it work with specialized
environments (like Java/CORBA) can create environmental issues for the application itself; for
example, the WinRunner 7.05 Java support environment variables caused the application under test
to malfunction.
7) There are many more issues that we come across during actual automation.

Q-13] What types of scripting techniques for test automation do you know?
Scripting technique: how to structure automated test scripts for maximum benefit and minimum impact of
software changes. This covers scripting issues, scripting approaches (linear, shared, data-driven, and
programmed), script pre-processing, and minimizing the impact of software changes on test scripts. The
major ones I have used are:
1) Data-driven scripting.
2) Centralized application-specific/generic compiled modules and library development.
3) Parent-child scripting.
4) Techniques to generalize the scripting.
5) Increasing the reusability factor of the script.

Q-14] What are the principles of good testing scripts for automation?


1) Automation scripts should be reusable.
2) Coding standards should be followed while scripting, which makes script updates, understanding, and
debugging easier.
3) Scripts should be as environment- and data-independent as possible, which can be achieved using
parameterization.
4) Scripts should be generalized.
5) Scripts should be modular.
6) Repeated tasks should be kept in functions while scripting, to avoid code repetition and complexity and
to make the script easy to debug.
7) Scripts should be readable, and appropriate comments should be written for each section of the script.
8) The script header should contain the script developer's name, the date the script was last updated, the
script's environmental requirements, the scripted environment's details, the script's prerequisites from
the application side, a brief script description, the script contents, and the script scope. A sketch
illustrating these principles follows this list.
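The following illustrative sketch, written in Python as a stand-in for any tool's scripting language, shows a script header, parameterization, and small modular functions as described above; the TEST_BASE_URL variable and the function names are hypothetical assumptions, not a real framework.

```python
"""
Script       : smoke_login.py
Developer    : <name>
Last updated : <date>
Requires     : Python 3; TEST_BASE_URL environment variable (optional)
Description  : Illustrates a script header, parameterization for
               environment independence, and small reusable functions.
"""
import os

# Parameterized, not hard-coded, so the script is environment independent.
BASE_URL = os.environ.get("TEST_BASE_URL", "http://localhost:8080")

def open_session(base_url):
    # Repeated setup lives in one reusable, modular function.
    return {"url": base_url}

def run_smoke_test(session):
    # Each step is small and readable, which makes debugging easier.
    assert session["url"].startswith("http"), "invalid session URL"

if __name__ == "__main__":
    run_smoke_test(open_session(BASE_URL))
    print("smoke test passed")
```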

Q-15] What tools are available to support testing during the software development life cycle?
TestDirector for test management and Bugzilla for bug tracking and notification are examples of tools that
support testing. Rational Test Studio is used across the entire software development life cycle.
Rational Purify and related tools (for unit testing):
1) Purify: runtime errors and memory leakage testing at the unit level.
2) Quantify: performance bottlenecks, including third-party controls.
3) PureCoverage: code coverage tool.
Rational Robot
- Automated functional testing tool, for functionality testing and performance testing.
Planning areas: 1) Rational Administrator, 2) recording workflow, 3) recording/playback settings,
4) verification points, 5) datapools, 6) SQABasic, 7) performance testing.

Rational TestManager


For planning testing activities: we can evaluate tests, execute them, and view the results.

Rational Test Factory


It is used for reliability testing.

Reliability: the occurrence of failures is very low; that is reliable. The feature functionality should be
consistent.

Q-16] Can the activities of test case design be automated?


Yes. TestDirector is one such tool, which has features for test case design and execution.

Q-17] What are the limitations of automating software testing?


1) Automation needs lots of time in the initial stage.
2) Every tool has its own limitations with respect to protocol support, technologies supported, object
recognition, platform support, etc., due to which not 100% of the application can be automated;
there is always something the tool cannot handle, which we have to overcome with R&D.
3) The tool's own memory utilization is also an important factor: it blocks the application's memory
resources and creates problems for the application in a few cases, such as Java applications.

Q-18] What skills are needed to be a good test automator?


1) Programming skills.
2) Basics of any procedural language.
3) Generic skill in automation tool configuration and deployment.
4) Some basic knowledge of coding standards is good to have.
5) The skill to interpret results given by the tool and perform analysis to the level needed to meet the
requirements.

Q-19] How do you find out whether a tool will work with your existing system?
1) The tool should support our system's development and deployment technologies.
2) The tool should be compatible with all third-party tools used by our application.
3) The tool should support all platforms that our application supports for deployment.
4) There should be no major environmental settings required by the tool that might result in problems
for the existing system.
5) The tool should not create any conflict with other tools existing in the current system.
6) The tool should not create any memory conflict issues for the application.

Q-20] How can one file-compare future-dated output files from a program that has changed against a
baseline run that used the current date for input, when the client does not want to mask dates on the
output files to allow compares?
Rerun the baseline with input files future-dated by the same number of days as the changed program's
future-dated run. Then run a file compare between the baseline's future-dated output and the changed
program's future-dated output.
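A minimal sketch of the final compare step in Python, assuming the two future-dated outputs have been captured to files; the file names are hypothetical.

```python
import difflib

# Hypothetical names for the two future-dated output files being compared.
with open("baseline_future.out") as a, open("changed_future.out") as b:
    diff = list(difflib.unified_diff(a.readlines(), b.readlines(),
                                     fromfile="baseline", tofile="changed"))
print("".join(diff) if diff else "outputs match")
```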

Q-21] What is Configuration Management?


Configuration management is the management of the evolution of the components that make up a software
product, and it applies equally to testing, the design, and the implementation of the product. CM provides
the means for managing integrity and traceability throughout the artifacts produced during software
development and testing.

Q-22] How do you scope, organize, and execute a test project?


With the help of the external specification, high-level design document, low-level design document, and
user guide.

Q-23] What are the main attributes of test automation?


The main attributes of test automation are:
Maintainability
Definition: the effort needed to update the test automation suites for each new release.
Possible measurements: e.g., the average work effort in hours to update a test suite.

Reliability
The accuracy and repeatability of your test automation.
Measured, for example, by the number of tests that failed due to defects in the tests or in the test scripts.

Flexibility
The ease of working with all the different kinds of automation testware.
The time and effort needed to identify, locate, restore, combine, and execute the different pieces of test
automation testware.

Efficiency
The total cost related to the effort needed for the automation.
Measured by monitoring, over time, the total cost of automated testing: resources, material, etc.

Portability
The ability of the automated tests to run on different environments.
The effort and time needed to set up and run test automation in a new environment.

Robustness
The effectiveness of automation on an unstable or rapidly changing system.

Usability
The extent to which automation can be used by different types of users.
The time needed to train users to become confident and productive with test automation.

Q-24] What is Change Management?


Change management is carried out by a high-authority team or taskforce that oversees and minimizes the
impact of change on an organization or its people when major-impact activities, such as business process
re-engineering, introduction of an ERP system, or a major outsourcing activity, are initiated.

Q-25] Write the checklist for data integrity.


Is the data saved when the window is closed by double-clicking on the close box?

Check the maximum field lengths to ensure that there are no truncated characters.

Where the database requires a value (other than null), it should be defaulted into the field. The user
must either enter an alternative valid value or leave the default value intact.

Check maximum and minimum field values for numeric fields.

If numeric fields accept negative values, can these be stored correctly in the database, and does it make
sense for the field to accept negative numbers?

If a set of radio buttons represents a fixed set of values such as A, B and C, then what happens if a blank
value is retrieved from the database? (In some situations rows can be created on the database by other
functions which are not screen based, and thus the required initial values can be incorrect.)

If a particular set of data is saved to the database, check that each value gets saved fully to the database,
i.e., beware of truncation (of strings) and rounding of numeric values. A sketch of such a check follows.
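As a rough illustration of the truncation/rounding check, the sketch below uses an in-memory SQLite table as a stand-in for the application's database (SQLite happens not to enforce VARCHAR lengths; a production database may silently truncate, which is exactly what this kind of check is meant to catch).

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # stand-in for the application's database
conn.execute("CREATE TABLE customer (name VARCHAR(10), balance NUMERIC)")

name_in, balance_in = "ABCDEFGHIJKLMNOP", -42.505  # over-length / awkward values
conn.execute("INSERT INTO customer VALUES (?, ?)", (name_in, balance_in))
name_out, balance_out = conn.execute("SELECT name, balance FROM customer").fetchone()

# Beware of truncation of strings and rounding of numeric values.
assert name_out == name_in, f"string truncated: {name_out!r}"
assert balance_out == balance_in, f"numeric value altered: {balance_out}"
print("data saved fully")
```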

Q-26] What are the design review checklists?

S.No Review Checklist

A General Issues
1 Does the overall design implement all explicit requirements? Has a traceability table been developed?
2 Does the overall design achieve all implicit requirements?
3 Is the design represented in a form that is easily understood by outsiders?
4 Is design notation standardized? Consistent?
5 Does the overall design provide sufficient information for test case design?
6 Is the design created using recognizable architectural and procedural patterns?
7 Does the design strive to incorporate reusable components?
8 Is the design modular?
9 Has the design defined both procedural and data abstractions that can be reused?
10 Has the design been defined and represented in a stepwise fashion?
11 Has the resultant software architecture been partitioned for ease of implementation? Maintenance?
12 Have the concepts of information hiding and functional independence been followed throughout the
design?
13 Has a Design Specification been developed for the software?

B For data design:
1 Have data objects defined in the analysis model been properly translated into required data structures?
2 Do the data structures contain all attributes defined in the analysis model?
3 Have any new data structures and/or attributes been defined at design time?
4 How do any new data structures and/or attributes relate to the analysis model and to overall user
requirements?
5 Have the simplest data structures required to do the job been chosen?
6 Can the data structures be implemented directly in the programming language of choice?
7 How are data communicated between software components?
8 Do explicit data components (e.g., a database) exist? If so, what is their role?
C For architectural design:
1 Has a library of architectural styles been considered prior to the definition of the resultant software
architecture?
2 Has architectural tradeoff analysis been performed?
3 Is the resultant software architecture a recognizable architectural style?
4 Has the architecture been exercised against existing usage scenarios?
5 Has an appropriate mapping been used to translate the analysis model into the architectural model?
6 Can quality characteristics associated with the resultant architecture (e.g., a factored call-and-return
architecture) be readily identified from information provided in the design model?
D For user interface design:
1 Have the results of task analysis been documented?
2 Have goals for each user task been identified?
3 Has an action sequence been defined for each user task?
4 Have various states of the interface been documented?
5 Have objects and actions that appear within the context of the interface been defined?
6 Have the three "golden rules" (SEPA, 5/e, p. 402) been maintained throughout the GUI design?
7 Has flexible interaction been defined as a design criterion throughout the interface?
8 Have expert and novice modes of interaction been defined?
9 Have technical internals been hidden from the casual user?
10 Is the on-screen metaphor (if any) consistent with the overall application?
11 Are icons clear and understandable?
12 Is interaction intuitive?
13 Is system response time consistent across all tasks?
14 Has an integrated help facility been implemented?
15 Are all error messages displayed by the interface easy to understand? Do they help the user resolve the
problem quickly?
16 Is color being used effectively?
17 Has a prototype of the interface been developed?
18 Have users' impressions of the prototype been collected in an organized manner?
E For component-level design:
1 Have proof-of-correctness techniques (SEPA, 5/e, Chapter 26) been applied to all algorithms?
2 Has each algorithm been "desk-tested" to uncover errors? Is each algorithm correct?
3 Is the design of each algorithm consistent with the data structures that the component manipulates?
4 Have algorithmic design alternatives been considered? If yes, why was this design chosen?
5 Has the complexity of each algorithm been computed?
6 Have structured programming constructs been used throughout?

Q-27] What is a feasibility study?


Analysis of the known or anticipated need for a product, system, or component to assess the degree to
which the requirements, designs, or plans can be implemented.

Q-28] What is the IEEE (Institute of Electrical and Electronics Engineers)?


An organization involved in the generation and promulgation of standards. IEEE standards represent the
formalization of current norms of professional practice through the process of obtaining the consensus of
concerned, practicing professionals in the given field.

Q-29] What is end-to-end testing?
Similar to system testing; the 'macro' end of the test scale; involves testing of a complete application
environment in a situation that mimics real-world use, such as interacting with a database, using network
communications, or interacting with other hardware, applications, or systems if appropriate.

Q-30] Explain functionality testing.


Have your objectives been met? Under functionality we check the site's behavior under various user
environments. This is black-box type testing geared to the functional requirements of an application;
testers should do this type of testing. This doesn't mean that the programmers shouldn't check that their
code works before releasing it.

Functional testing, simply stated, verifies that an application does what it is supposed to do. For example, if
you were functionally testing a word processing application, a partial list of checks you would perform
includes creating, saving, editing, spell checking and printing documents.

Positive functional testing entails exercising the application's functions with valid input and verifying the
outputs are correct. Continuing with the word processing example, a positive test for the printing function
might be to print a document containing both text and graphics to a printer that is online, filled with paper
and for which the correct drivers are installed.

Negative functional testing involves exercising application functionality using a combination of invalid
inputs, unexpected operating conditions and other "out-of-bounds" scenarios. Continuing the word
processing example, a negative test for the printing function might be to disconnect the printer from the
computer while a document is printing. What probably should happen in this scenario is a plain-English
error message appears, informing the user what happened and instructing him/her on how to remedy the
problem. What might happen, instead, is the word processing software simply hangs up or crashes because
the "abnormal" loss of communications with the printer isn't handled properly.

Validating that an application or Web site conforms to its specifications and correctly performs all its required
functions. This entails a series of tests which perform a feature by feature validation of behavior, using a
wide range of normal and erroneous input data. This can involve testing of the product's user interface,
database management, security, installation, networking, etc.
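A minimal sketch of one positive and one negative functional test; parse_page_range is a hypothetical feature under test, standing in for something like the print-range field of the word processor above.

```python
import unittest

def parse_page_range(text):
    # Hypothetical feature under test: parses a page range such as "2-7".
    start, end = (int(part) for part in text.split("-"))
    if start < 1 or end < start:
        raise ValueError("invalid page range")
    return start, end

class PrintRangeTests(unittest.TestCase):
    def test_positive_valid_input(self):
        # Positive test: valid input, verify the output is correct.
        self.assertEqual(parse_page_range("2-7"), (2, 7))

    def test_negative_invalid_input(self):
        # Negative test: out-of-bounds input should fail cleanly, not crash.
        with self.assertRaises(ValueError):
            parse_page_range("9-3")

if __name__ == "__main__":
    unittest.main()
```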

Q-31] Explain performance testing.


Performance testing validates how well the system performs, with focus on: 1) speed and 2) data
processing. This is important because even if all business requirements have been implemented, a poorly
performing system will work against much of the system's features. All systems have bottlenecks. Where
are the bottlenecks in your system: in the code, in the database, in the network infrastructure, somewhere
else? Will they become an issue as the user base or transaction rates increase? Performance testing
identifies current bottlenecks in your website, web or client/server application and verifies that it meets
or exceeds key performance measures. Load testing is analogous to volume testing and determines how
the application deals with large tasks. Stress testing examines application behavior under peak bursts of
activity.
Performance, load and stress testing answer questions like:
• Can my website support 1,000 hits/second? If so, for how long?
• Can my e-commerce application handle 500 users searching for products and 250 users
adding items to their shopping carts simultaneously?
• What happens to application performance as the backend database gets larger and larger?
If concrete performance goals aren't defined, performance, load and stress testing can answer
the question "At what point will my application malfunction or fail?"
Performance testing also generally describes the process of making the web site and its server as
efficient as possible, in terms of download speed, machine resource usage, and server request handling.
How fast and how much can the feature do? Does it do enough, fast enough? What testing
methodology will be used to determine this information? What criteria will be used to indicate
acceptable performance? If this is a modification of an existing product, what are the current metrics?
What are the expected major bottlenecks and performance problem areas for this feature?
Performance tests should be designed to verify response time, execution time, throughput, primary and
secondary memory utilization, and traffic rates on data channels and communication links.
Performance testing measures the speed of the application and finds processing bottlenecks. It also
provides a measurement for code and architecture changes: did they improve the response time, or make
it worse?
1. Does the application respond fast enough for the users?
2. Does this release improve over the previous version?
3. In which module or component is the application spending most of its time? This will determine
where the development team should focus their attention in order to improve response time.
4. Does caching help or hinder the users? (Do they have to dump the cache manually to get the latest
data?)
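A minimal load-generation sketch, assuming a hypothetical application URL. Real performance tools add ramp-up, think time, and percentile reporting, but the core idea of virtual users plus response-time measurement looks like this:

```python
import time
import threading
import urllib.request

URL = "http://localhost:8080/"   # hypothetical application under test
USERS = 50                       # number of virtual users
timings, lock = [], threading.Lock()

def virtual_user():
    start = time.perf_counter()
    try:
        urllib.request.urlopen(URL, timeout=10).read()
    except OSError:
        pass                     # a real test would count failures separately
    elapsed = time.perf_counter() - start
    with lock:
        timings.append(elapsed)

threads = [threading.Thread(target=virtual_user) for _ in range(USERS)]
for t in threads: t.start()
for t in threads: t.join()
print(f"avg {sum(timings)/len(timings):.3f}s, max {max(timings):.3f}s")
```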

Q-32] Explain sanity testing or smoke testing.
Typically an initial testing effort to determine if a new software version is performing well enough to
accept it for a major testing effort. For example, if the new software is crashing systems every 5
minutes, bogging down systems to a crawl, or destroying databases, the software may not be in a 'sane'
enough condition to warrant further testing in its current state.
Sanity tests are subsets of the confidence test and are used only to validate high-level functionality. They
validate the basic activities the system needs to perform whenever each release is made.

Q-33] Explain scalability testing.
Scalability testing exercises an application at an increasing number of concurrent users to determine
when the application fails. The definition of failure for scalability testing purposes depends on the
business need and criticality of the application under test. Common failures include the server hanging
up or crashing, or the average application response time going from several seconds to several minutes.
Why can't the number of concurrent users just keep growing and growing without having any effect on
the application? Bottlenecks, found in every application, prevent this from happening. Bottlenecks
might lurk in one or more of the Web server, the application server, the database server, the application
code, and the network infrastructure, or in many other areas. Scalability testing can be considered a
specific case of performance testing.

Q-34] What is a test bed?
A test bed includes the test design, test scripts, and test data, as well as the setup for each individual
test procedure.

Q-35] What are a test case reader, test driver and test harness?
Test Case Reader
This reads and parses the intermediate format. Errors in the test data are reported.
Test Driver
A program or test tool used to execute software against a test case suite. A software module used to
invoke a module under test and, often, provide test inputs, control and monitor execution, and report
test results. It starts the test case reader and executes the lines of the test case. It embodies the
testing methodology and conventions. A minimal driver/comparator sketch follows.
Test Harness
A testing tool that comprises a test driver and a test comparator.

Q-35] What are a test log, test object and test procedure?
Test Log
It includes the test ID, test activities, who executed the test, start and stop times, pass or fail results,
and comments.
Test Object
A software object (function/method, module/class, component/subsystem, system) which is to be
tested. The test object should not be changed for the test, and all necessary surrounding components
should already have been tested sufficiently.
Test Procedure
A formal document developed from a test plan that presents detailed instructions for the setup,
operation, and evaluation of the results for each defined test.

Q-36] What is a testing strategy?
It is a background document for the testing department or group that defines "high level" testing issues.
These issues include project scope and type, software and hardware used by development and testing,
and success factors.
It indicates how testing is to be carried out, and will clearly indicate where special emphasis has to be
given to various aspects of the system so that the best possible use of resources and time can be made.
A test strategy is a statement of the overall approach to testing, identifying what levels of testing are to
be applied and the methods, techniques and tools to be used.

Q-37] What is a use case?
Use cases define a sequence of actions, completed by a system or user, that provides a recognizable
result to the user.

Q-38] What is a Web server?
A Web server understands and supports only the HTTP protocol. A user typically submits a request to a
system through a web browser. A Web server (otherwise known as an HTTP server) must receive that
web request of some sort. The web server must handle standard HTTP requests and responses, typically
returning HTML to the calling user. Code that executes within the server environment may be CGI
driven, servlets, ASP, or some other server-side programming language.

Q-39] What is an application server?
An application server supports HTTP, TCP/IP, and many more protocols. An application server is any
server that supplies additional functionality related to enterprise computing -- for instance, load
balancing, database access classes, transaction processing, messaging, and so on. Many more features,
such as caches, clusters, and load balancing, are available in application servers but not in Web servers.
We can also configure an application server to work as a Web server. In short, the application server is a
superset of which the Web server is a subset.

Q-40] What are the SEI, the CMM and its levels?
The SEI contract was competitively awarded to Carnegie Mellon University in December 1984. The
SEI staff has extensive technical and managerial experience from government, industry, and academia.
The U.S. Department of Defense established the Software Engineering Institute to advance the practice
of software engineering, because quality software that is produced on schedule and within budget is a
critical component of U.S. defense systems. The SEI helps organizations and individuals to improve
their software engineering practices. There are five levels in the CMM, defined as follows:

CMM-1, Initial: the software process is characterized as ad hoc. Few processes are defined, and success
depends on individual effort and heroics.
CMM-2, Repeatable: basic project management processes are established to track cost, schedule, and
functionality. The necessary process discipline is in place to repeat earlier successes on projects with
similar applications.
CMM-3, Defined: the software process for both management and engineering activities is documented,
standardized, and integrated into a standard software process for the organization. All projects use an
approved, tailored version of the organization's standard software process for developing and
maintaining software.
CMM-4, Managed: detailed measures of the software process and product quality are collected. Both
the software process and products are quantitatively understood and controlled.
CMM-5, Optimizing: continuous process improvement is enabled by quantitative feedback from the
process and from piloting innovative ideas and technologies.

Q-41] What is Six Sigma?
Six Sigma is a philosophy of working smarter, so that we make fewer and fewer mistakes and
eventually no mistakes at all in anything we do. Amongst many things, Six Sigma is a disciplined,
data-driven approach, a customer-oriented management philosophy, and a statistic that reads "3.4
defects per million opportunities to make defects".
Central to the Six Sigma philosophy is the customer: it starts by understanding the customer's needs and
then translating these needs into measurable parameters. This ensures that we are always in touch with
market realities.
From the experience and knowledge gained in implementing the maturity model, an independent testing
team will continuously endeavor to make the process a benchmark for training.

Q-42] What is security testing?
Application security testing is performed to guarantee that only users with the appropriate authority
are able to use the applicable features of the system. The systems engineer establishes different
security settings for each user in the test environment. Network security testing is performed to
guarantee that the network is safe and secure from unauthorized users. The test manager must consider
the depth of hacking skill of the test staff and whether they would be able to perform an adequate set of
challenges. Frequently, it is advisable to use outsourced security test specialists, whose specialty is
comprehensive security testing.

Q-43] What is installation or upgrade testing?
Installation or upgrade testing involves testing the setup or upgrade routine to ensure that the product
can be successfully installed. The test team decides whether to implement full builds or incremental
ones. The build process typically ensures that the installation or upgrade testing is completed
satisfactorily.

Q-44] What is network testing?
Network testing determines the behavior of the complete network infrastructure when different
network latencies are applied. Such testing can uncover problems with slow network links and so on.
The following infrastructure elements can be tested as part of network testing in the Internet Data
Center architecture:
1) Network availability
2) Equipment redundancy
3) Cluster failover
4) Meshed topology
5) NIC teaming feature
6) Dual-port functionality
Q-45] What is memory testing?
This testing is designed to ensure that the application will run in the amount of memory specified in the
technical documentation. It should also detect memory leaks associated with frequent starting and
stopping of the application.
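One way to look for leaks across repeated start/stop cycles is sketched below, using Python's tracemalloc as an illustrative stand-in for a dedicated memory tool; start_and_stop_feature is hypothetical.

```python
import tracemalloc

def start_and_stop_feature():
    # Hypothetical stand-in for starting and stopping part of the application.
    return [object() for _ in range(1000)]

tracemalloc.start()
before = tracemalloc.take_snapshot()
for _ in range(100):
    start_and_stop_feature()      # results discarded; memory should be freed
after = tracemalloc.take_snapshot()

# Allocation sites that keep growing across cycles suggest a leak.
for stat in after.compare_to(before, "lineno")[:5]:
    print(stat)
```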

Q-46] What is functionality testing?


Validating that an application or Web site conforms to its specifications and correctly performs all its
required functions. This entails a series of tests which perform a feature-by-feature validation of behavior,
using a wide range of normal and erroneous input data. This can involve testing of the product's user
interface, APIs, database management, security, installation, networking, etc. Functional testing can be
performed on an automated or manual basis using black-box or white-box methodologies.

Q-47] How can it be known when to stop testing?


This can be difficult to determine. Many modern software applications are so complex, and run in such an
interdependent environment, that complete testing can never be done. Common factors in deciding when to
stop are:
1) Deadlines (release deadlines, testing deadlines, etc.)
2) Test cases completed with a certain percentage passed
3) Test budget depleted
4) Coverage of code/functionality/requirements reaches a specified point
5) Bug rate falls below a certain level
6) Beta or alpha testing period ends

Q-48] How does a client/server environment affect testing?


Client/server applications can be quite complex due to the multiple dependencies among clients, data
communications, hardware, and servers. Thus testing requirements can be extensive. When time is limited
(as it usually is) the focus should be on integration and system testing. Additionally,
load/stress/performance testing may be useful in determining client/server application limitations and
capabilities. There are commercial tools to assist with such testing.

Q-49] What is optimization?


A process closely related to testing is optimization. Optimization is the process by which bottlenecks are
identified and removed by tuning the software, the hardware, or both. The optimization process consists of
four key phases: collection, analysis, configuration, and testing. In the first phase of optimizing an
application, you need to collect data to determine the baseline performance. Then, by analyzing this data,
you can develop theories that identify potential bottlenecks. After making and documenting adjustments in
configuration or code, you must repeat the initial testing and determine if your theories proved true.
Without baseline performance data, it is impossible to determine if your modifications helped or hindered
your application.
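A minimal sketch of the collection phase: record a baseline timing before any tuning, so later configuration or code changes can be judged against it. operation_under_test is a hypothetical stand-in for the code path being optimized.

```python
import timeit

def operation_under_test():
    # Stand-in for the real code path whose performance is being baselined.
    return sum(i * i for i in range(10_000))

# Take the best of several repeats to reduce timing noise in the baseline.
baseline = min(timeit.repeat(operation_under_test, number=100, repeat=5))
print(f"baseline: {baseline:.4f}s per 100 runs")
```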

Q-50] What is data integrity? Give examples of data integrity testing.
Important data stored in the database include the catalogue, pricing, shipping tables, tax tables, order
database, and customer information. Testing must verify the correctness of the stored data. Therefore,
testing should be performed on a regular basis, because data changes over time.
Examples of data integrity tests:
1) Test the creation, modification, and deletion of data in tables as specified in the functionality.
2) Test to make sure that sets of radio buttons represent a fixed set of values. Check what happens when a
blank value is retrieved from the database.
3) Test that when a particular set of data is saved to the database, each value gets saved fully. In other
words, the truncation of strings and rounding of numeric values do not occur.
4) Test whether default values are saved in the database if the user input is not specified.
5) Test compatibility with old data. In addition, old hardware, versions of the operating system, and
interfaces with other software need to be tested.

Q-51] What is recovery testing?


Another test that is performed on database software is the recovery test. This test involves forcing the
system to fail in a variety of ways to ensure that:
1) The system recovers from faults and resumes processing within a pre-defined period of time.
2) The system is fault-tolerant, which means that processing faults do not halt the overall functioning of
the system.
3) Data recovery and restart are correct in case of auto-recovery. If recovery requires human intervention,
the mean time to repair the database must be within pre-defined acceptable limits.

Q-52] What is security testing and network security testing?


Security
Gaining the confidence of online customers is extremely important to e-commerce success. Building the
confidence of online customers is not an easy task and requires a lot of time and effort. Therefore,
entrepreneurs must plan confidence-building measures. Ensuring the security of transactions over the
Internet builds customer confidence.

The main technique in security testing is to attempt to violate the built-in security controls. This technique
ensures that the protection mechanisms in the system secure it from improper penetration. The tester
overwhelms the system with continuous requests, thereby denying service to others. The tester may
purposely cause system errors to penetrate during recovery, or may browse through insecure data to find
the key to system entry.

There are two distinct areas of concern in e-commerce security: network security and payment transaction
security. Types of security breaches in these areas involve:
1) Secrecy
2) Authentication
3) Non-repudiation
4) Integrity control

Network Security Testing


Unauthorized users can wreak havoc on a Web site by accessing confidential information or by damaging
the data on the server. This kind of security lapse is due to insufficient network security measures.
The network operating system, together with the firewall, takes care of security over the network. The
network operating system must be configured to allow only authentic users to access the network. Also,
firewalls must be installed and configured. This ensures that the transfer of data is restricted to only one
point on the network. This effectively prevents hackers from accessing the network.
For example, a hacker accesses the unsecured FTP port (port 21) of a Web server. Using this port as an
entry point to the network, the hacker can access data on the server. The hacker may also be able to access
any machine connected to this server. Therefore, security testing will indicate these vulnerable areas and
will also help to configure the network settings for better security.
Network security over the Internet is tested using programs. One such program for Microsoft Windows
2000 is the Kane Security Analyst (KSA) from Intrusion Detection Inc. KSA is a complete network-testing
tool that also tests operating systems other than Windows 2000.
The KSA network security testing tool tests for:
1) User rights
2) Removable disk locations
3) Strength of password policies
4) Use of logon scripts and password expiration dates
5) Storage of passwords in clear text or encrypted form
The KSA report manager generates several reports to check miscellaneous sets of security-related concerns.
The software points out security loopholes only and does not trap unauthorized visitors.

Q-53] What is client/server?


Client/server is a computational architecture that involves client processes requesting service from server
processes.
Client Process
The client is a process (program) that sends a message to a server process (program), requesting that the
server perform a task (service). Client programs usually manage the user-interface portion of the
application, validate data entered by the user, dispatch requests to server programs, and sometimes execute
business logic.
The client-based process is the front-end of the application that the user sees and interacts with. The client
process contains solution-specific logic and provides the interface between the user and the rest of the
application system. The client process also manages the local resources that the user interacts with such as
the monitor, keyboard, workstation CPU and peripherals.
One of the key elements of a client workstation is the graphical user interface (GUI). Normally a part of
operating system i.e. the window manager detects user actions, manages the windows on the display and
displays the data in the windows.

Server Process
A server process (program) fulfills the client request by performing the task requested. Server programs
generally receive requests from client programs, execute database retrieval and updates, and manage data
integrity and dispatch responses to client requests. Sometimes server programs execute common or
complex business logic.
The server-based process "may" run on another machine on the network. This server could be the host
operating system or network file server; the server then provides both file system services and
application services. Or in some cases, another desktop machine provides the application services. The
server process acts as a software engine that manages shared resources such as databases, printers,
communication links, or high-powered processors. The server process performs the back-end tasks that are
common to similar applications.

Q-54] What is two-tier architecture?


A two-tier architecture is where a client talks directly to a server, with no intervening agent. It is typically
used in small environments (fewer than 50 users).
A common error in client/server development is to prototype an application in a small, two-tier
environment and then scale up by simply adding more users to the server. This approach will usually result
in an ineffective system, as the server becomes overwhelmed. To properly scale to hundreds or thousands
of users, it is usually necessary to move to a three-tier architecture.

Q-55] What is three-tier architecture?


A three-tier architecture introduces an agent between the client and the server. The role of the agent is
manifold: it can provide translation services, metering services, or intelligent agent services.

Q-56] What is middleware?
Connectivity allows applications to transparently communicate with other programs or processes,
regardless of their location. The key element of connectivity is the network operating system (NOS). NOS
provide services such as routing, distribution, messaging, file and print, and network management services.
NOS rely on communication protocols to provide specific services. The protocols are divided into three
groups: media, transport and client-server protocols. Media protocols determine the type of physical
connections used on a network (some examples of media protocols are Ethernet, Token Ring, Fiber
Distributed Data interface (FDDI), coaxial and twisted-pair). A transport protocol provides the mechanism
to move packets of data from client to server. Once the physical connection has been established and
transport protocols chosen, a client-server protocol is required before the user can access the network
services. A client-server protocol dictates the manner in which clients request information and services
from a server and also how the server replies to that request.

Q-57] What is distributed processing?


The distribution of applications and business logic across multiple processing platforms. Distributed
processing implies that processing will occur on more than one processor in order for a transaction to be
completed. In other words, processing is distributed across two or more machines and the processes are
most likely not running at the same time, i.e. each process performs part of an application in a sequence.
Often the data used in a distributed processing environment is also distributed across platforms.

Common Software Errors

Introduction

This document takes you through a whirlwind tour of common software errors. It is an excellent aid for
software testing: it helps you to identify errors systematically, increases the efficiency of software
testing, and improves testing productivity. For more information, please refer to Testing Computer
Software (Wiley).

Type of Errors

1) User Interface Errors
2) Error Handling
3) Boundary related errors
4) Calculation errors
5) Initial and Later states
6) Control flow errors
7) Errors in Handling or Interpreting Data
8) Race Conditions
9) Load Conditions
10) Hardware
11) Source, Version and ID Control
12) Testing Errors

Let us go through the details of each kind of error.

User Interface Errors


Functionality
Sl No Possible Error Conditions
1 Excessive Functionality
2 Inflated impression of functionality
3 Inadequacy for the task at hand
4 Missing function

5 Wrong function
6 Functionality must be created by user
7 Doesn't do what the user expects

Communication
Missing Information
Sl No Possible Error Conditions
1 No on-screen instructions
2 Assuming printed documentation is already available.
3 Undocumented features
4 States that appear impossible to exit
5 No cursor
6 Failure to acknowledge input
7 Failure to show activity during long delays
8 Failure to advise when a change will take effect
9 Failure to check for the same document being opened twice
Wrong, misleading, confusing information
10 Simple factual errors
11 Spelling errors
12 Inaccurate simplifications
13 Invalid metaphors
14 Confusing feature names
15 More than one name for the same feature
16 Information overload
17 When are data saved
18 Wrong function
19 Functionality must be created by user
20 Poor external modularity
Help text and error messages
21 Inappropriate reading levels
22 Verbosity
23 Inappropriate emotional tone
24 Factual errors
25 Context errors
26 Failure to identify the source of error
27 Forbidding a resource without saying why
28 Reporting non-errors
29 Failure to highlight the part of the screen
30 Failure to clear highlighting
31 Wrong/partial string displayed
32 Message displayed for too long or not long enough
Display Layout
33 Poor aesthetics in screen layout
34 Menu Layout errors
35 Dialog box layout errors

36 Obscured Instructions
37 Misuse of flash
38 Misuse of color
39 Heavy reliance on color
40 Inconsistent with the style of the environment
41 Cannot get rid of on screen information
Output
42 Can't output certain data
43 Can't redirect output
44 Format incompatible with a follow-up process
45 Must output too little or too much
46 Can't control output layout
47 Absurd printout level of precision
48 Can't control labeling of tables or figures
49 Can't control scaling of graphs
Performance
50 Program speed
51 User throughput
52 Perceived performance
53 Slow program
54 Slow echoing
55 How to reduce user throughput
56 Poor responsiveness
57 No type-ahead
58 No warning that the operation takes a long time
59 No progress reports
60 Problems with time-outs
61 Program pesters you

Program Rigidity
User tailorability
Sl No Possible Error Conditions
1 Can't turn off case sensitivity
2 Can't tailor to hardware at hand
3 Can't change device initialization
4 Can't turn off automatic changes
5 Can't slow down/speed up scrolling
6 Can't do what you did last time
7 Failure to execute a customization commands
8 Failure to save customization commands
9 Side effects of feature changes
10 Can't turn off the noise
11 Infinite tailorability
Who is in control?

12 Unnecessary imposition of a conceptual style
13 Novice friendly, experienced hostile
14 Surplus or redundant information required
15 Unnecessary repetition of steps
16 Unnecessary limits

Command Structure and Rigidity


Inconsistencies
Sl No Possible Error Conditions
1 Optimizations
2 Inconsistent syntax
3 Inconsistent command entry style
4 Inconsistent abbreviations
5 Inconsistent termination rule
6 Inconsistent command options
7 Similarly named commands
8 Inconsistent Capitalization
9 Inconsistent menu position
10 Inconsistent function key usage
11 Inconsistent error handling rules
12 Inconsistent editing rules
13 Inconsistent data saving rules
Time Wasters
14 Garden paths
15 Choice can't be taken
16 Are you really, really sure
17 Obscurely or idiosyncratically named commands
Menus
18 Excessively complex menu hierarchy
19 Inadequate menu navigation options
20 Too many paths to the same place
21 You can't get there from here
22 Related commands relegated to unrelated menus
23 Unrelated commands tossed under the same menu
Command Lines
24 Forced distinction between uppercase and lowercase
25 Reversed parameters
26 Full command names are not allowed
27 Abbreviations are not allowed
28 Demands complex input on one line
29 No batch input
30 Can't edit commands
Inappropriate use of keyboard
31 Failure to use cursor, edit, or function keys
32 Non-standard use of cursor and edit keys
33 Non-standard use of function keys
34 Failure to filter invalid keys
35 Failure to indicate keyboard state changes

Missing Commands
State transitions
Sl No Possible Error Conditions
1 Can't do nothing and leave
2 Can't quit mid-program
3 Can't stop mid-command
4 Can't pause
Disaster prevention
5 No backup facility
6 No undo
7 No are you sure
8 No incremental saves
Error handling by the user
9 No user specifiable filters
10 Awkward error correction
11 Can't include comments
12 Can't display relationships between variables
Miscellaneous
13 Inadequate privacy or security
14 Obsession with security
15 Can't hide menus
16 Doesn't support standard OS features
17 Doesn't allow long names

Error Handling

Error prevention
Sl No Possible Error Conditions
1 Inadequate initial state validation
2 Inadequate tests of user input
3 Inadequate protection against corrupted data
4 Inadequate tests of passed parameters
5 Inadequate protection against operating system bugs
6 Inadequate protection against malicious use
7 Inadequate version control

Error Detection
Sl No Possible Error Conditions
1 Ignores overflow
2 Ignores impossible values
3 Ignores implausible values
4 Ignores error flag
5 Ignores hardware fault or error conditions
6 Data comparison

Error Recovery
Sl No Possible Error Conditions
1 Automatic error detection
2 Failure to report error
3 Failure to set an error flag
4 Where does the program go back to
5 Aborting errors
6 Recovery from hardware problems
7 No escape from missing disks

Boundary related errors

Sl No Possible Error Conditions


1 Numeric boundaries
2 Equality as boundary
3 Boundaries on numerosity
4 Boundaries in space
5 Boundaries in time
6 Boundaries in loop
7 Boundaries in memory
8 Boundaries with data structure
9 Hardware related boundaries
10 Invisible boundaries
11 Mishandling of boundary case
12 Wrong boundary
13 Mishandling of cases outside boundary
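To make the boundary cases concrete, here is a small sketch that probes values at, just below, and just above each numeric boundary; accept_age and its 0..120 range are hypothetical.

```python
import unittest

def accept_age(age):
    # Hypothetical function under test: valid ages are 0..120 inclusive.
    return 0 <= age <= 120

class AgeBoundaryTests(unittest.TestCase):
    def test_numeric_boundaries(self):
        # Probe each boundary, its neighbors inside, and its neighbors outside.
        for age, expected in [(-1, False), (0, True), (1, True),
                              (119, True), (120, True), (121, False)]:
            with self.subTest(age=age):
                self.assertEqual(accept_age(age), expected)

if __name__ == "__main__":
    unittest.main()
```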

Calculation Errors

Sl No Possible Error Conditions


1 Bad Logic
2 Bad Arithmetic
3 Imprecise Calculations
4 Outdated constants
5 Calculation errors

6 Impossible parentheses
7 Wrong order of calculations
8 Bad underlying functions
9 Overflow and Underflow
10 Truncation and Round-off error
11 Confusion about the representation of data
12 Incorrect conversion from one data representation to another
13 Wrong Formula
14 Incorrect Approximation
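Truncation and round-off errors are easy to demonstrate: in binary floating point, 0.1 + 0.2 is not exactly 0.3, which is also why the control-flow list later in this document warns against testing floating-point values for equality.

```python
import math

total = 0.1 + 0.2
print(total == 0.3)              # False: naive equality fails on round-off
print(math.isclose(total, 0.3))  # True: compare floats with a tolerance
```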

Race Conditions

Sl No Possible Error Conditions


1 Races in updating data
2 Assumption that one event or task has finished before another begins
3 Assumption that input won't occur during a brief processing interval
4 Assumption that interrupts won't occur during a brief interval
5 Resource races
6 Assumption that a person, device or process will respond quickly
7 Options out of sync during display changes
8 Task starts before its prerequisites are met
9 Messages cross or don't arrive in the order sent
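A minimal sketch of a race in updating data: several threads perform a non-atomic read-modify-write on a shared counter, so updates can be lost. Depending on the Python version and timing, the final value may come out below the expected 400000.

```python
import threading

counter = 0

def unsafe_increment(n):
    global counter
    for _ in range(n):
        counter += 1     # read-modify-write; not atomic across threads

threads = [threading.Thread(target=unsafe_increment, args=(100_000,))
           for _ in range(4)]
for t in threads: t.start()
for t in threads: t.join()
print(counter)           # may be less than 400000 when updates are lost
```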

Initial and Later States

Sl No Possible Error Conditions


1 Failure to set data item to zero
2 Failure to initialize a loop-control variable
3 Failure to initialize or re-initialize a pointer
4 Failure to clear a string
5 Failure to initialize a register
6 Failure to clear a flag
7 Data were supposed to be initialized elsewhere
8 Failure to re-initialize
9 Assumption that data were not re-initialized
10 Confusion between static and dynamic storage
11 Data modifications by side effect
12 Incorrect initialization

Control Flow Errors

Program runs amok


Sl No Possible Error Conditions
1 Jumping to a routine that isn't resident
2 Re-entrance
3 Variables contain embedded command names

4 Wrong returning state assumed
5 Exception handling based exits

Return to wrong place


Sl No Possible Error Conditions
1 Corrupted Stack
2 Stack underflow/overflow
3 GOTO rather than RETURN from sub-routine

Interrupts
Sl No Possible Error Conditions
1 Wrong interrupt vector
2 Failure to restore or update interrupt vector
3 Invalid restart after an interrupt
4 Failure to block or un-block interrupts

Program Stops
Sl No Possible Error Conditions
1 Dead crash
2 Syntax error reported at run time
3 Waiting for impossible condition or combinations of conditions
4 Wrong user or process priority

Loops
Sl No Possible Error Conditions
1 Infinite loop
2 Wrong starting value for the loop control variables
3 Accidental change of loop control variables
4 Commands that do or don't belong inside the loop
5 Improper loop nesting

If, Then, Else, or Maybe Not


Sl No Possible Error Conditions
1 Wrong inequalities
2 Comparison sometimes yields wrong result
3 Not equal versus equal when there are three cases
4 Testing floating point values for equality
5 Confusion between inclusive and exclusive OR
6 Incorrectly negating a logical expression
7 Assignment equal instead of test equal
8 Commands that belong inside the THEN or ELSE clause
9 Commands that don't belong in either case
10 Failure to test a flag
11 Failure to clear a flag

Multiple Cases
Sl No Possible Error Conditions
1 Missing default
2 Wrong default
3 Missing cases
4 Overlapping cases
5 Invalid or impossible cases
6 Case should be sub-divided

Errors in Handling or Interpreting Data

Problems in passing data between routines


Sl No Possible Error Conditions
1 Parameter list variables out of order or missing
2 Data Type errors
3 Aliases and shifting interpretations of the same area of memory
4 Misunderstood data values
5 Inadequate error information
6 Failure to clean up data on exception handling
7 Outdated copies of data
8 Related variables get out of sync
9 Local setting of global data
10 Global use of local variables
11 Wrong mask in bit fields
12 Wrong value from table

Data boundaries
Sl No Possible Error Conditions
1 Unterminated null strings
2 Early end of string
3 Read/Write past end of data structure or an element in it

Read outside the limits of the message buffer


Sl No Possible Error Conditions
1 Compiler padding to word boundaries
2 Value stack underflow/overflow
3 Trampling another process's code or data

Messaging Problems
Sl No Possible Error Conditions
1 Messages sent to wrong process or port
2 Failure to validate an incoming message
3 Lost or out of synch messages
4 Message sent to only N of N+1 processes

Data Storage corruption
Sl No Possible Error Conditions
1 Overwritten changes
2 Data entry not saved
3 Too much data for receiving process to handle
4 Overwriting a file after an error exit or user abort

Load Conditions

Sl No Possible Error Conditions


1 Required resources are not available
2 No available large memory area
3 Input buffer or queue not deep enough
4 Doesn't clear items from queue, buffer or stack
5 Lost Messages
6 Performance costs
7 Race condition windows expand
8 Doesn't abbreviate under load
9 Doesn't recognize that another process abbreviates output under load
10 Low priority tasks not put off
11 Low priority tasks never done

Doesn't return a resource


Sl No Possible Error Conditions
1 Doesn't indicate that it's done with a device
2 Doesn't erase old files from mass storage
3 Doesn't return unused memory
4 Wastes computer time

Hardware

Sl No Possible Error Conditions


1 Wrong Device
2 Wrong Device Address
3 Device unavailable
4 Device returned to wrong type of pool
5 Device use forbidden to caller
6 Specifies wrong privilege level for the device
7 Noisy Channel
8 Channel goes down
9 Time-out problems
10 Wrong storage device
11 Doesn't check the directory of current disk
12 Doesn't close the file
13 Unexpected end of file

14 Disk sector bug and other length dependent errors
15 Wrong operation or instruction codes
16 Misunderstood status or return code
17 Underutilizing device intelligence
18 Paging mechanism ignored or misunderstood
19 Ignores channel throughput limits
20 Assuming device is or isn't or should be or shouldn't be initialized
21 Assumes programmable function keys are programmed correctly

Source, Version, ID Control

Sl No Possible Error Conditions


1 Old bugs mysteriously reappear
2 Failure to update multiple copies of data or program files
3 No title
4 No version ID
5 Wrong version number on the title screen
6 No copyright message, or a bad one
7 Archived source doesn't compile into a match for shipping code
8 Manufactured disks don't work or contain wrong code or data

Testing Errors

Missing bugs in the program


Sl No Possible Error Conditions
1 Failure to notice a problem
2 You don't know what the correct test results are
3 You are bored or inattentive
4 Misreading the Screen
5 Failure to report problem
6 Failure to execute a planned test
7 Failure to use the most promising test case
8 Ignoring programmer's suggestions

Finding bugs that aren't in the program


Sl No Possible Error Conditions
1 Errors in testing programs
2 Corrupted data files
3 Misinterpreted specifications or documentation

Poor reporting
Sl No Possible Error Conditions
1 Illegible reports
2 Failure to make it clear how to reproduce the problem
3 Failure to say you can't reproduce the problem
4 Failure to check your report

5 Failure to report timing dependencies
6 Failure to simplify conditions
7 Concentration on trivia
8 Abusive language

Poor Tracking and follow-up


Sl No Possible Error Conditions
1 Failure to provide summary report
2 Failure to re-report serious bug
3 Failure to check for unresolved problems just before release
4 Failure to verify fixes

GENERAL QUESTIONS-6
CONCEPTS
What is the difference between testing and debugging?

The big difference is that debugging is conducted by a programmer, and the programmer fixes the errors
during the debugging phase. A tester never fixes the errors, but rather finds them and returns them to the programmer.

What is a bug? What types of bugs do you know?

A bug is an error in a program. There are two types of bugs: syntax and logical.
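
For illustration, a minimal C sketch of the two bug types (values invented for the example): a syntax bug is caught by the compiler, while a logical bug compiles and runs but computes the wrong result:

#include <stdio.h>

int main(void)
{
    /* Syntax bug -- caught by the compiler, e.g. a missing
       semicolon:  int x = 5   <-- would not compile.
       Logical bug -- compiles and runs, but an operator-precedence
       mistake gives the wrong answer: */
    int a = 7, b = 3;
    int avg   = a + b / 2;        /* buggy: means a + (b / 2) = 8 */
    int fixed = (a + b) / 2;      /* correct: 5                   */
    printf("buggy=%d fixed=%d\n", avg, fixed);
    return 0;
}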
General Questions
Could you test a program 100%? 90%? Why?
Definitely not! The major problem with testing is that you cannot calculate how many errors are in the
code, the functionality, etc. There are many factors involved, such as the experience of the programmers, the
complexity of the system, etc.
How would you test a mug (chair/table/gas station etc.)?
First of all, you should ask for the requirements, the functional specification, and the design document of the
mug. There you will find requirements like the ability to hold hot water, being waterproof, stability, breakability, and
so on. Then you should test the mug against all of these documents.
How would you conduct a test: top-down or down-top? What is it? Which one is better?
Down-top (bottom-up): unit -> interface -> system.
Top-down is the reverse.
You may use both, but bottom-up testing allows you to discover malfunctions at earlier phases of development,
where they are cheaper to fix than in the top-down case.
How do you develop a test plan? How do you develop a test case?
A test plan consists of test cases. You develop test cases according to the requirement and design documents
of the unit, system, etc. You may be asked what you would do if you are not provided with
requirements documents. In that case, you start creating your test cases based on the functionality of the system.
You should mention that this is bad practice, since there is no way to verify that what you got is
what you planned to create.
How do you see a QA role in the product development life cycle?
QA should be involved in the early stages of the development process in order to create adequate test
cases and build a better general understanding of the system. QA, however, must be separated from the
development team to ensure that developers have no influence on QA engineers. As the last checkpoint
before the product ships to the customer, QA bears a great deal of responsibility for the quality of the
product and the image of the company in the market.
What is the size of your executable?
10MB. Who cares? You should demonstrate that you can't be caught off guard by unexpected questions. This
question is one of the dumbest, but you must react accordingly. Tell them any reasonable number you want,
but be careful not to exaggerate!
How would you execute a SQL query in Oracle 8?
Again, if you have ever worked with Oracle, this question should be trivial to answer (from the
command prompt, of course). If you have never worked with Oracle, note politely that you have not touched an
Oracle database in your career.
What version of OS were you using?
Tell whatever you want - you can't be caught here. Popular answers are Windows 95/98, Windows
2000 (make sure you know various flavors) and various Unix flavors (AIX, Solaris, SunOS, HPUX
etc.)
Have you tested front-end or back-end?
In other words, you are asked whether you tested the GUI part of the application or the server part.
What's the most important thing in testing?
You may bring your own arguments, or simply mention "accuracy", "knowledge of the system", etc.
What was the most difficult problem you ever found while testing?
This is homework. Think about one and give it as an example.
What's your favorite command in Unix?
Again, this is homework. You may mention any command you want: ls, ps, cat, etc. Be careful not to
mention a "complex" command unless you are absolutely sure what you are talking about. For example, if
you mention "vi", you may be asked how you were using the vi editor.
What were you responsible for testing in your previous company?
This is homework for you. Actually, this question is a test of your knowledge of your own resume. You
must know your real or fake resume like a bible. Practice in front of a mirror, or ask a friend to interview you.
Why do you like to test?
You enjoy the bug-hunting process, you feel great being between developers and customers, your background
and experience are aimed at enhancing testing techniques, and you feel proud of your
contribution to the whole development process.
What role do you see yourself in 2-3 years from now? Would you want to become a developer?
You should not draw the interviewer's attention to any wish to become a developer. You
are being hired for a testing role, and you should demonstrate commitment. Team lead of the QA team is OK,
but do not answer yes when asked if you are willing to become a developer.
What bug tracking system did you use?
Again and again, it does not matter what bug tracking system you used, as long as you did your homework
and either invented a name or mentioned a standard one. You may say you've used a proprietary bug
tracking system (this works especially well if your previous company was in one way or another dealing with
databases), or you may name Test Director, Clarify, ClearCase, VisualTest, etc.
Here I am sending answers for some recently asked interview questions. Please go through them
before appearing for an interview. They may be helpful.

1. Test Bed: It is a combination of test scripts + a document explaining how to execute the test scripts
and analyze the results.

2. Test Driver: A program/tool used to feed various input conditions to the code under test for different test cases.
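
For illustration, a minimal test-driver sketch in C; add() is a hypothetical unit under test, and the driver feeds it a table of input conditions and checks each result:

#include <stddef.h>
#include <stdio.h>

static int add(int a, int b) { return a + b; }   /* unit under test */

struct test_case { int a, b, expected; };

int main(void)
{
    struct test_case cases[] = {
        {  2, 3, 5 },
        {  0, 0, 0 },
        { -1, 1, 0 },
    };
    int failures = 0;

    /* Drive the unit with each input condition in turn. */
    for (size_t i = 0; i < sizeof cases / sizeof cases[0]; i++) {
        int got = add(cases[i].a, cases[i].b);
        if (got != cases[i].expected) {
            printf("case %zu FAILED: got %d, expected %d\n",
                   i, got, cases[i].expected);
            failures++;
        }
    }
    printf("%d failure(s)\n", failures);
    return failures != 0;
}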

3. Testing process: It explains the process we follow in our organization for testing, such as:
1. Thorough Requirement Analysis.
2. Test Planning
3. Test Case Implementation.
4. Test Execution
5. Testing Metrics like Test Coverage
6. Risk Analysis.
7. Test Summary Report

4. Test Framework: This is the standard framework that should be followed by every person on the
project. It contains information like coding standards, naming conventions, ways to develop test
scripts, etc.

5. Test Scenario: The order in which we have to run the test cases.
For example, say we have 3 test cases: 1) withdraw money from the bank, 2) deposit money, and
3) cancel the transaction. We can run different scenarios for the above 3 test cases, such as
running them in the order 1,2,3 or 1,3,2 or 3,2,1, etc.

6. Test Condition: This is the detailed view of the test requirements (i.e., what to test, derived from
the software requirements).

NOTE: Please go through the document "Defining Test Requirements For Automated Software Testing"
from the files section, which I felt was worth reading.
General Questions-4

Interview Questions

On this page I have put more than 200 different interview questions from different sources. Some of them are
very simple; some are a little bit difficult. If you would like to check your technical knowledge or to see
more questions and answers, you can download a free copy of the Exam application from the "Software we offer"
page.
Test Automation:
1. What automating testing tools are you familiar with?
2. How did you use automating testing tools in your job?
3. Describe some problems that you had with an automated testing tool.
4. How do you plan test automation?
5. Can test automation improve test effectiveness?
6. What is data-driven automation?
7. What are the main attributes of test automation?
8. Does automation replace manual testing?
9. How will you choose a tool for test automation?
10. How will you evaluate a tool for test automation?
11. What are the main benefits of test automation?
12. What could go wrong with test automation?
13. How would you describe testing activities?
14. What testing activities might you want to automate?
15. Describe common problems of test automation.
16. What types of scripting techniques for test automation do you know?
17. What are the principles of good test scripts for automation?
18. What tools are available to support testing during the software development life cycle?

19. Can the activities of test case design be automated?
20. What are the limitations of automating software testing?
21. What skills are needed to be a good test automator?
22. How do you determine whether a tool works well with your existing system?

Load Testing:
1) What criteria would you use to select Web transactions for load testing?
2) For what purpose are virtual users created?
3) Why is it recommended to add verification checks to all your scenarios?
4) In what situation would you want to parameterize a text verification check?
5) Why do you need to parameterize fields in your virtual user script?
6) What are the reasons why parameterization is necessary when load testing the Web server
and the database server?
7) How can data caching have a negative effect on load testing results?
8) What usually indicates that your virtual user script has dynamic data that is dependent on your
parameterized fields?
9) What are the benefits of creating multiple actions within any virtual user script?

General questions:
1. What types of documents would you need for QA, QC, and Testing?
2. What did you include in a test plan?
3. Describe any bug you remember.
4. What is the purpose of the testing?
5. What do you like (not like) in this job?
6. What is quality assurance?
7. What is the difference between QA and testing?
8. How do you scope, organize, and execute a test project?
9. What is the role of QA in a development project?
10. What is the role of QA in a company that produces software?
11. Define quality for me as you understand it.
12. Describe to me the difference between validation and verification.
13. Describe to me what you see as a process. Not a particular process, just the basics of having a
process.
14. Describe to me when you would consider employing a failure mode and effect analysis.
15. Describe to me the Software Development Life Cycle as you would define it.
16. What are the properties of a good requirement?
17. How do you differentiate the roles of Quality Assurance Manager and Project Manager?
18. Tell me about any quality efforts you have overseen or implemented. Describe some of the
challenges you faced and how you overcame them.
19. How do you deal with environments that are hostile to quality change efforts?
20. In general, how do you see automation fitting into the overall process of testing?
21. How do you promote the concept of phase containment and defect prevention?
22. If you come onboard, give me a general idea of what your first overall tasks will be as far as
starting a quality effort.
23. What kinds of testing have you done?
24. Have you ever created a test plan?
25. Have you ever written test cases or did you just execute those written by others?
26. What did you base your test cases on?
27. How do you determine what to test?
28. How do you decide when you have 'tested enough?'
29. How do you test if you have minimal or no documentation about the product?
30. Describe the basic elements you put in a defect report.

31. How do you perform regression testing?
32. At what stage of the life cycle does testing begin in your opinion?
33. How do you analyze your test results? What metrics do you try to provide?
34. Realising you won't be able to test everything - how do you decide what to test first?
35. Where do you get your expected results?
36. If automating - what is your process for determining what to automate and in what order?
37. In the past, I have been asked to verbally start mapping out a test plan for a common situation,
such as an ATM. The interviewer might say, "Just thinking out loud, if you were tasked to test an
ATM, what items might your test plan include?" These types of questions are not meant to be answered
conclusively, but they are a good way for the interviewer to see how you approach the task.
38. If you're given a program that will average student grades, what kinds of inputs would you
use?
39. Tell me about the best bug you ever found.
40. What made you pick testing over another career?
41. What is the exact difference between integration and system testing? Give me examples from
your project.
42. How did you go about testing a project?
43. When should testing start in a project? Why?
44. How do you go about testing a web application?
45. Difference between Black & White box testing
46. What is Configuration management? Tools used?
47. What do you plan to become after, say, 2-5 years (e.g., QA Manager)? Why?
48. Would you like to work in a team or alone, why?
49. Give me 5 strong & weak points of yours
50. Why do you want to join our company?
51. When should testing be stopped?
52. What sort of things would you put down in a bug report?
53. Who in the company is responsible for Quality?
54. Who defines quality?
55. What is an equivalence class?
56. Is "a fast database retrieval rate" a testable requirement?
57. Should we test every possible combination/scenario for a program?
58. What criteria do you use when determining when to automate a test or leave it manual?
59. When do you start developing your automation tests?
60. Discuss what test metrics you feel are important to publish in an organization.
61. Describe the role that QA plays in the software lifecycle.
62. What should Development require of QA?
63. What should QA require of Development?
64. How would you define a "bug"?
65. Give me an example of the best and worst experiences you've had with QA.
66. How does unit testing play a role in the development / software lifecycle?
67. Explain some techniques for developing software components with respect to testability.
68. Describe a past experience with implementing a test harness in the development of software.
69. Have you ever worked with QA in developing test tools? Explain the participation
Development should have with QA in leveraging such test tools for QA use.
70. Give me some examples of how you have participated in Integration Testing.
71. How would you describe the involvement you have had with the bug-fix cycle between
Development and QA?
72. What is unit testing?
73. Describe your personal software development process.
74. How do you know when your code has met specifications?
75. How do you know your code has met specifications when there are no specifications?
76. Describe your experiences with code analyzers.
77. How do you feel about cyclomatic complexity?
78. Who should test your code?

79. How do you survive chaos?
80. What processes/methodologies are you familiar with?
81. What type of documents would you need for QA/QC/Testing?
82. How can you use technology to solve a problem?
83. What type of metrics would you use?
84. How do you determine whether a tool works well with your existing system?
85. What automated tools are you familiar with?
86. How well do you work with a team?
87. How would you ensure 100% coverage of testing?
88. How would you build a test team?
89. What problems do you have right now or have you had in the past? How did you solve them?
90. What will you do during your first day on the job?
91. What would you like to do five years from now?
92. Tell me about the worst boss you've ever had.
93. What are your greatest weaknesses?
94. What are your strengths?
95. What is a successful product?
96. What do you like about Windows?
97. What is good code?
98. Who is Kent Beck, Dr Grace Hopper, Dennis Ritchie?
99. What are the basic, core practices for a QA specialist?
100. What do you like about QA?
101. What has not worked well in your previous QA experience and what would you change?
102. How will you begin to improve the QA process?
103. What is the difference between QA and QC?
104. What is UML and how to use it for testing?
105. What is CMM and CMMI? What is the difference?
106. What do you like about computers?
107. Do you have a favourite QA book? More than one? Which ones? And why.
108. What is the responsibility of programmers vs QA?

From Cem Kaner's article "Recruiting testers," December 1999

1. What is software quality assurance?
2. What is the value of a testing group? How do you justify your work and budget?
3. What is the role of the test group vis-à-vis documentation, tech support, and so forth?
4. How much interaction with users should testers have, and why?
5. How should you learn about problems discovered in the field, and what should you learn from those
problems?
6. What are the roles of glass-box and black-box testing tools?
7. What issues come up in test automation, and how do you manage them?
8. What development model should programmers and the test group use?
9. How do you get programmers to build testability support into their code?
10. What is the role of a bug tracking system?
11. What are the key challenges of testing?
12. Have you ever completely tested any part of a product? How?
13. Have you done exploratory or specification-driven testing?
14. Should every business test its software the same way?
15. Discuss the economics of automation and the role of metrics in testing.
16. Describe components of a typical test plan, such as tools for interactive products and for database
products, as well as cause-and-effect graphs and data-flow diagrams.
17. When have you had to focus on data integrity?

18. What are some of the typical bugs you encountered in your last assignment?
19. How do you prioritize testing tasks within a project?
20. How do you develop a test plan and schedule? Describe bottom-up and top-down approaches.
21. When should you begin test planning?
22. When should you begin testing?
23. Do you know of metrics that help you estimate the size of the testing effort?
24. How do you scope out the size of the testing effort?
25. How many hours a week should a tester work?
26. How should your staff be managed? How about your overtime?
27. How do you estimate staff requirements?
28. What do you do (with the project tasks) when the schedule fails?
29. How do you handle conflict with programmers?
30. How do you know when the product is tested well enough?
31. What characteristics would you seek in a candidate for test-group manager?
32. What do you think the role of test-group manager should be? Relative to senior management?
Relative to other technical groups in the company? Relative to your staff?
33. How do your characteristics compare to the profile of the ideal manager that you just described?
34. How does your preferred work style work with the ideal test-manager role that you just described?
What is different between the way you work and the role you described?
35. Who should you hire in a testing group and why?
36. What is the role of metrics in comparing staff performance in human resources management?
37. How do you estimate staff requirements?
38. What do you do (with the project staff) when the schedule fails?
39. Describe some staff conflicts you've handled.
Here are some questions you might be asked on a job interview for a testing opening: (from MU COSC 198
Software Testing by Dr. Corliss)

1) Why did you ever become involved in QA/testing?

2) What is the testing life cycle? Explain each of its phases.

3) What is the difference between testing and Quality Assurance?

4) What is negative testing?

5) What was a problem you had in your previous assignment (testing, if possible)? How did you
resolve it?

6) What are two of your strengths that you will bring to our QA/testing team?

7) How would you define Quality Assurance?

8) What do you like most about Quality Assurance/Testing?

9) What do you like least about Quality Assurance/Testing?

10) What is the Waterfall Development Method, and do you agree with all its steps?

11) What is the V-Model Development Method, and do you agree with this model?

12) What is the Capability Maturity Model (CMM)? At what CMM level were the last few companies
you worked for?

13) What is a "good tester"?

14) Could you tell me two things you did in your previous assignment (QA/Testing related, hopefully)
that you are proud of?

15) List 5 words that best describe your strengths.

16) What are two of your weaknesses?

17) What methodologies have you used to develop test cases?

18) In an application currently in production, one module of code is being modified. Is it necessary to
re-test the whole application, or is it enough to just test the functionality associated with that
module?

19) Define each of the following and explain how each relates to the others: unit, system, and
integration testing.

20) Define verification and validation. Explain the differences between the two.

21) Explain the differences between white-box, gray-box, and black-box testing.

22) How do you go about going into a new organization? How do you assimilate?

23) Define the following and explain their usefulness: change management, configuration
management, version control, and defect tracking.

24) What is ISO 9000? Have you ever been in an ISO shop?

25) When are you done testing?

26) What is the difference between a test strategy and a test plan?

27) What is ISO 9003? Why is it important?

28) What are ISO standards? Why are they important?

29) What is IEEE 829? (This standard is important for software test documentation. Why?)

30) What is IEEE? Why is it important?

31) Do you support automated testing? Why?

32) We have a testing assignment that is time-driven. Do you think automated tests are the best
solution?

33) What is your experience with change control? Our development team has only 10 members. Do
you think managing change is such a big deal for us?

34) Are reusable test cases a big plus of automated testing? Explain why.

35) Can you build a good audit trail using Compuware's QACenter products? Explain why.

36) How important is Change Management in today's computing environments?

37) Do you think tools are required for managing change? Explain, and please list some tools/practices
that can help you manage change.

38) We believe in ad-hoc software processes for projects. Do you agree with this? Please explain your
answer.

39) When is a good time for system testing?

40) Are regression tests required, or do you feel there is a better use for resources?

41) Our software designers use UML for modeling applications. Based on their use cases, we would
like to plan a test strategy. Do you agree with this approach, or would this mean more effort
for the testers?

42) Tell me about a difficult time you had at work and how you worked through it.

43) Give me an example of something you tried at work that did not work out, so you had to go at
things another way.

44) How can one file-compare future-dated output files from a program which has changed against a
baseline run which used the current date for input? The client does not want to mask dates on the
output files to allow compares. Answer: Rerun the baseline with input files future-dated by the same
number of days as the future-dated run of the changed program. Then file-compare the baseline's
future-dated output against the changed program's future-dated output (a minimal sketch of such a
compare follows below).
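
For illustration, a minimal byte-by-byte file compare in C along the lines the answer describes; the file names are hypothetical:

#include <stdio.h>

int main(void)
{
    /* Hypothetical names for the two future-dated output files. */
    FILE *a = fopen("baseline_future.out", "rb");
    FILE *b = fopen("changed_future.out", "rb");
    if (!a || !b) { perror("fopen"); return 2; }

    long pos = 0;
    int ca, cb;
    do {
        ca = fgetc(a);
        cb = fgetc(b);
        if (ca != cb) {                 /* also catches unequal lengths */
            printf("files differ at byte %ld\n", pos);
            return 1;
        }
        pos++;
    } while (ca != EOF);

    printf("files match\n");
    return 0;
}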
Interviewing Suggestions
1) If you do not recognize a term, ask for further definition. You may know the methodology/term but
have used a different name for it.

2) Always keep in mind that the employer wants to know what you are going to do for them; with
that in mind, you should always stay positive.
Preinterview Questions
1) What is the structure of the company?

2) Who is going to do the interview? Gather possible background information on the interviewer.

3) What is the employer's environment (platforms, tools, etc.)?

4) What are the employer's methods and processes used in the software arena?

5) What is the employer's philosophy?

6) What is the project you are interviewing for all about? Gather as much information as possible.

7) Any terminology that the company may use.

