
A Training Report

Submitted to the Rajasthan Technical University,


Kota
in partial fulfillment of the requirements for the
degree of
MASTER OF COMPUTER APPLICATIONS

Submitted by
Gaurav Chaplot

GUIDED BY: Blessy, Head of Comp. Sc. Dept.

TRAINING INCHARGE: Nagma Khan, HR Manager


Apex Institute of Management & Science,


Jaipur
Affiliated to
Rajasthan Technical University, Kota
MCA (2007-10)

May, 2010
HEALTH RISK ASSESSMENT

CONTENTS

1. Preface

2. Project Profile:-

 Company Profile

 Project Definition

 Problem Definition According to Roles

 General Outline of the Project

 Overview of the project

 Entities of the project

3. Software Tools:
1. About ASP.Net.

2. About JavaScript.

3. About HTML

4. SQL Server 2005

4. Hardware Platform

5. System Analysis
1. Introduction.

2. Feasibility Study.

3. Problem analysis.
4. Need of a new system.

6. System Requirement Specification

 Proposed System and Advantages

7. System Design and Development:


1. Introduction

2. Steps of Design Process

3. Database design of Sections.

8. E R Diagram

9. USE CASE Diagram

10. System Testing

11. Future Scope of the Application.

12. Conclusion

13. Bibliography

PREFACE
Health Risk Assessment is a scientific process of evaluating the adverse effects caused
by a substance, activity, lifestyle, or natural phenomenon.

Health Risk Assessment (HRA) is an evolving concept defined as a systematic


collection of electronic health information about individual patients or populations. It
is a record in digital format that is capable of being shared across different health care
settings, by being embedded in network-connected enterprise-wide information
systems. Such records may include a whole range of data in comprehensive or
summary form, including demographics, medical history, medication and allergies,
immunization status, laboratory test results, radiology images, and billing
information.

Whereas the practices of 20th century health care were based largely on paper, there
is now a broad consensus that realizing an improved 21st century vision of health care
will require intensive use of information technology to acquire, manage, analyze, and
disseminate health care information and knowledge. Accordingly, the Administration
and Congress have been moving to encourage the adoption, connectivity, and
interoperability of health care information technology.

Former President George W. Bush called for nationwide use of electronic medical
records by 2014 and the Department of Health and Human Services (HHS) is
involved in various aspects of achieving this goal.

It is essentially axiomatic that modern health care is an information- and knowledge-


intensive enterprise. The information collected in health care includes, among other
things, medical records of individual patients (both paper and electronic, spread
across many different health care organizations), laboratory test results, information
about treatment protocols and drug interactions, and a variety of financial and
administrative information.
PROJECT PROFILE

COMPANY PROFILE
Fusion Outsourcing Pvt. Ltd. is a leading solution provider based in Udaipur, India.

FUSION is well-positioned to be a global IT hub because of its intellectual resources


and specialized expertise. FUSION offshore services understand the needs of
international customers and deliver to high standards of quality and efficiency.

Since the company's inception in the year 2004, FUSION has set milestones in its
areas of operation. It started with the idea of providing offshore support to its
foreign clients in the area of process outsourcing.
Today FUSION has diversified its areas of operation into various fields, which
include:

Software development
Application Management

FUSION's application management services relieve you of the burden of application-


related processes that are draining your budget. With our experience in application
management services since 2001, we know how to get the best out of your code.
Through our application management services, we make your applications productive
at 50 - 60% of the cost that you are paying for internal staffing. FUSION then works
with you to get you even more savings and benefits over the long-term.

Testing & Validation

FUSION has a proven track record of establishing and running dedicated test labs for
their Global 1000 clients. The teams at FUSION use disciplined, mature SEI CMM
certified quality processes that have been honed over one million person-hours of
project work in the field of onsite/offshore testing,
with a clear focus on areas critical to the success of such efforts. FUSION has
developed methodologies for end-to-end testing of complex systems, and has
adequate exposure to testing of systems using a client’s in-house methods. FUSION
provides robust review, testing and change control processes, ensuring cost-effective,
fast and quality deliverables through a global delivery model consisting of onsite,
near-shore and onsite / offshore execution.

Product Engineering
Today, saving money is the primary motivation for Independent Software Vendors
(ISVs) seeking offshore outsourcing vendor relationships. Some companies, not
satisfied with potential savings of 20% to 40%, seek greater savings by opening their
own offshore product development and maintenance centers. This practice is
particularly prevalent in the high-tech sector, where companies may already have
offshore R&D centers, and in the financial services sector, where companies are
looking to reduce costs and/or better secure their internal systems. Unfortunately, the
management investment and overhead costs associated with opening a high-quality
offshore product development center that is intended to support an enterprise IT
organization are rarely justified by the payoff.

Technology Consulting

FUSION's IT consulting practice is focused on helping clients use information and
information technology.

Our clients are global corporations who are empowered through our value-based,
customized, and implementation-oriented approach.

When we serve our clients on IT issues, we combine our knowledge and experience in
IT with our strengths in strategy, organization, and operations. With a rich pool of
resources across the world, deep industry and business process expertise, broad global
resources and a proven track record, FUSION can mobilize the right people, skills,
and technologies to help clients improve their performance.

Web Services
Website design & development

FUSION provides professional website design, development and maintenance


services. Our skilled web designers and developers accomplish various website
projects from brochure sites to multi-functional web portals. FUSION has a large
pool of resources and the required business and technical expertise to develop
websites of any complexity. We provide complete front-end and back-end
development based on the latest technologies and industry trends.

Web application Development


FUSION has been working in the field of custom software development since 1999.
Our expert team has taken part in dozens of application development projects. We are
experienced in developing advanced systems with complex business logic dealing
with large amounts of data and transactions. We are able to supply you with an
innovative, trustworthy software solution to complement your most complicated
business ideas.

Graphic design/Multimedia

FUSION offers a wide range of professional graphic design services including logo
design, corporate identity development, multimedia and flash presentations.

We lay a solid foundation in web design and development. Keeping track of the latest
technological innovations we make good use of the most advanced web design tools,
thus ensuring the top-notch quality of the end-product and complete satisfaction of
our customers. Our step-by-step process guarantees that nothing is overlooked in the
final product.

Maintenance & Support

Web applications and websites require maintenance and support as your needs change
and grow. FUSION provides maintenance and support services, freeing you from
maintaining expensive in-house support staff. We partner with our clients to help keep
their online business running smoothly and error-free.

KPO Service
Financial Research & Analysis


Business consultancy services

FUSION offers business consulting in various areas of business operations by


defining, aligning and optimizing our client's business goals and objectives. We take
time to understand our client's business, their requirements and needs, and then create
customized solutions that deliver tangible benefits, add value and improve their
business processes. Our foresight in evaluating new technologies from a practical
perspective and our focus on key industries have resulted in the accumulation of
extensive domain knowledge about business processes. This has enabled us to deliver
quality, cutting-edge solutions to our clients.

BPO Services
Call Center Services

FUSION uses advanced call center services to provide standardized, streamlined,


uniform services for consumers, making it an effective approach for interacting with
their customers. Our offshore strategic call center services can help you attain your
business targets! Give your business a cutting edge, with our world-class customer
support services.

CATI Services -Computer Aided Telephone Interviewing

Think FUSION for next-generation CATI research solutions. At FUSION Outsourcing,
unique distributed dialer technology enables efficient call management and
compliance management. Get accurate data with our automated predictive dialer. Our
Rapid Dial ensures four times higher productivity than manual dialing for our
outsourcing partners. Target future customers and detect customer churn with our
100% voice recording.

E-mail and Chat Support Services

At FUSION, we will answer your customers' queries promptly, competently and
precisely. We help our outsourcing partners balance work between peak periods
and off-peak periods, while they focus on their core competencies. FUSION's e-mail
based support system ensures lower call volumes and reduced costs for voice
support.

Healthcare Services

Many companies provide complete healthcare services that cater to the healthcare
industry as a whole, offering high-quality medical services at a cost-effective price.
FUSION Outsourcing Pvt. Ltd., however, offers a range of healthcare services that can
help you achieve a faultless process, giving you a competitive advantage.
FUSION Outsourcing offers the following healthcare services:

 Medical Coding

 Medical Billing

 Accounts Receivable

Data Management

FUSION understands that timely access to business documents ensures the smooth
functioning of an organization. Today's fast-paced, dynamic business environment
demands that organizations adopt a document management system that keeps pace
with their growth; in its absence, administrators and network consultants feel
handicapped. FUSION offshore services provides you with best-in-industry data
management solutions.

FUSION outsourcing offers the following data management services:

 Data Entry Services

 Data Cleansing

 Data Conversion Services

 Data Mining

INTRODUCTION TO HRA
Health Risk Assessment (HRA) is an evolving concept defined as a systematic
collection of electronic health information about individual patients or populations. It
is a record in digital format that is capable of being shared across different health care
settings, by being embedded in network-connected enterprise-wide information
systems. Such records may include a whole range of data in comprehensive or
summary form, including demographics, medical history, medication and allergies,
immunization status, laboratory test results, radiology images, and billing
information.

Doctors and nurses spend hours every day chasing down patient charts and missing
information. Hospitals bear enormous costs associated with record filing and
overhead, as well as paper copying and printing. As difficult as the challenges of
paper-based medicine can be, the prospect of an overnight conversion to fully
electronic records can be even more unnerving.

The Health Risk Assessment (HRA) Records Management tracks activity and status
for a hospital’s paper medical records files, while additionally permitting a hospital to
integrate paper-based information with digital records quickly and easily. By uniting
paper and electronic records, hospitals gain the benefits of integrated, shared patient
information and automated workflow that are inherent with digital data.

The purpose of HRA can be understood as providing a complete record of patient
encounters that allows health care settings to automate and streamline workflow and
to increase safety through evidence-based decision support, quality management, and
outcomes reporting.

BENEFITS OF HEALTH RISK ASSESSMENT

Benefits of Health Risk Assessment include:

• Increased productivity among clinical staff. With HRA Records Management,


paper charts can be tracked and located with ease while digital scanning and
storage of paper orders, prescriptions, patient forms, and physician documentation
puts information instantly in clinicians’ hands.

• Cost savings. Hospitals can see substantial reductions in the costs associated with
record filing and overhead, time lost while hunting down files, and paper copying
and printing.

• More informed clinical decisions. With shared access to complete information,


including images and scanned paper documents, from anywhere in the hospital,
doctors and nurses can communicate better, make more informed diagnoses, and
prescribe more successful treatment plans.

• Convenience and familiarity for staff. By allowing doctors, nurses, and


administrative staff to retain their comfortable processes, hospitals can foster
employee satisfaction and realize higher success in long-term adoption of
electronic ordering and documentation.

• More complete data for better compliance. To meet accreditation and


regulatory requirements, hospitals can ensure an accurate and complete patient
record and access reports easily for more efficient responses to audits and
information requests.

• Better tracking of patient records. Centralized data stored in a single system


allows clinicians to make faster, more informed decisions.

• Incremental conversion to electronic medical records. Hospitals gain many of


the benefits of converting to electronic records and processes without the
productivity challenges that can accompany an overnight conversion to a new
system.

• Improved quality of care. The implementation of electronic health records can
help reduce patient suffering caused by medical errors and by the inability of analysts
to assess quality. During hospitalization or ambulatory care of the patient, access to
details is made easier through browser capabilities applied to screen presentations,
with cross-referencing to the respective coding concepts (ICD, DRG) and medical
procedure information.
Problem Analysis According to Roles

Services to Data Management:-

1. They need to access the login page, so a user name and password option is
displayed.

2. The Data Manager can set the grid page size on the pages of other roles.

3. The Data Manager has permission to configure the screen settings by specifying
which tabs will appear on the pages of other roles.

4. The Data Manager can add new records in the Lookups section, such as a new role,
family member, physician, or medicine, and can specify which and how many roles,
medicines, family members, etc. will appear on the pages of other roles.

5. The Data Manager can add new roles with their user name and password.

Service to Data Entry:-

1. They need to access the login page, so a user name and password option is
displayed.

2. An employee can add a new patient with patient information such as patient name,
date of service, location, etc. An employee can search for a particular patient by
his/her first name or last name.

3. After adding a new patient, an employee can update the patient's Demographic,
Insurance, and PCP/Consultants related information.

4. An employee can add new records in the Problem List and Recommendation
sections.

Service to Physician:-

1. They need to access the login page, so a user name and password option is
displayed.

2. A physician can perform all the actions available to an employee.

3. A physician can take a printout of the patient information.

4. A physician can save patient information in PDF or DOC format.

5. Only a physician has the authority to apply an E Signature to a patient report.

6. An employee cannot perform this task; this is the basic difference between the
Employee and Physician services.
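
The role rules above can be summarised in code. The following is a minimal sketch
(the enum, class, and page names are illustrative assumptions and are not taken from
the project sources) of how the three roles could be routed to their home pages and
how the E Signature restriction could be enforced:

// Illustrative sketch only; names are hypothetical.
public enum UserRole { DataManager, DataEntry, Physician }

public static class RoleRouter
{
    // Maps a role to its home page; the page names are examples.
    public static string HomePageFor(UserRole role)
    {
        switch (role)
        {
            case UserRole.DataManager: return "~/DataManagerHome.aspx";
            case UserRole.Physician:   return "~/PhysicianHome.aspx";
            default:                   return "~/DataEntryHome.aspx";
        }
    }

    // Only the Physician role may apply an E Signature to a patient report.
    public static bool CanESign(UserRole role)
    {
        return role == UserRole.Physician;
    }
}

After a successful login, the login page would call RoleRouter.HomePageFor() and
redirect the user with Response.Redirect(); the CanESign() check would hide or
disable the E Signature option for non-physician users.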
SOFTWARE TOOLS
INTRODUCTION TO ASP.NET
ASP.NET is more than the next version of Active Server Pages (ASP); it provides a
unified Web development model that includes the services necessary for developers to
build enterprise-class Web applications. While ASP.NET is largely syntax compatible
with ASP, it also provides a new programming model and infrastructure for more
scalable and stable applications that help provide greater protection. You can feel free
to augment your existing ASP applications by incrementally adding ASP.NET
functionality to them.

ASP.NET is a compiled, .NET-based environment; you can author applications in any


.NET compatible language, including Visual Basic .NET, C#, and JScript .NET.
Additionally, the entire .NET Framework is available to any ASP.NET application.
Developers can easily access the benefits of these technologies, which include the
managed common language runtime environment, type safety, inheritance, and so on.

ASP.NET has been designed to work seamlessly with WYSIWYG HTML editors and
other programming tools, including Microsoft Visual Studio .NET. Not only does this
make Web development easier, but it also provides all the benefits that these tools
have to offer, including a GUI that developers can use to drop server controls onto a
Web page and fully integrated debugging support.

Developers can use Web Forms or XML Web services when creating an ASP.NET
application, or combine these in any way they see fit. Each is supported by the same
infrastructure that allows you to use authentication schemes, cache frequently used
data, or customize your application's configuration, to name only a few possibilities.

• Web Forms allow you to build powerful forms-based Web pages. When
building these pages, you can use ASP.NET server controls to create common
UI elements, and program them for common tasks. These controls allow you
to rapidly build a Web Form out of reusable built-in or custom components,
simplifying the code of a page. For more information, see Web Forms Pages.
For information on how to develop ASP.NET server controls, see Developing
ASP.NET Server Controls.

• An XML Web service provides the means to access server functionality


remotely. Using XML Web services, businesses can expose programmatic
interfaces to their data or business logic, which in turn can be obtained and
manipulated by client and server applications. XML Web services enable the
exchange of data in client-server or server-server scenarios, using standards
like HTTP and XML messaging to move data across firewalls. XML Web
services are not tied to a particular component technology or object-calling
convention. As a result, programs written in any language, using any
component model, and running on any operating system can access XML Web
services. For more information, see XML Web Services Created Using
ASP.NET and XML Web Service Clients.
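
As a brief illustration of the Web Forms model described above, the following is a
minimal code-behind sketch; the page, the control names (lblGreeting, btnSubmit),
and the event wiring are assumptions made for illustration and are not taken from
this project:

using System;
using System.Web.UI;
using System.Web.UI.WebControls;

// Minimal Web Forms code-behind sketch. It assumes an .aspx page that declares
// <asp:Label id="lblGreeting" runat="server" /> and
// <asp:Button id="btnSubmit" runat="server" OnClick="btnSubmit_Click" />.
public partial class HelloPage : Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        // Runs on every request; IsPostBack distinguishes the first visit from postbacks.
        if (!IsPostBack)
            lblGreeting.Text = "Welcome to the HRA system";
    }

    protected void btnSubmit_Click(object sender, EventArgs e)
    {
        // Server controls raise their events on the server after the form posts back.
        lblGreeting.Text = "Form submitted at " + DateTime.Now;
    }
}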

Each of these models can take full advantage of all ASP.NET features, as well as the
power of the .NET Framework and .NET Framework common language runtime.
These features and how you can use them are outlined as follows:

• If you have ASP development skills, the new ASP.NET programming model
will seem very familiar to you. However, the ASP.NET object model has
changed significantly from ASP, making it more structured and object-
oriented. Unfortunately this means that ASP.NET is not fully backward
compatible; almost all existing ASP pages will have to be modified to some
extent in order to run under ASP.NET. In addition, major changes to Visual
Basic .NET mean that existing ASP pages written with Visual Basic Scripting
Edition typically will not port directly to ASP.NET. In most cases, though, the
necessary changes will involve only a few lines of code. For more
information, see Migrating from ASP to ASP.NET.

• Accessing databases from ASP.NET applications is an often-used technique


for displaying data to Web site visitors. ASP.NET makes it easier than ever to
access databases for this purpose. It also allows you to manage the database
from your code. For more information, see Accessing Data with ASP.NET.

• ASP.NET provides a simple model that enables Web developers to write logic
that runs at the application level. Developers can write this code in the
Global.asax text file or in a compiled class deployed as an assembly. This
logic can include application-level events, but developers can easily extend
this model to suit the needs of their Web application. For more information,
see ASP.NET Applications.

• ASP.NET provides easy-to-use application and session-state facilities that are


familiar to ASP developers and are readily compatible with all other .NET
Framework APIs. For more information, see ASP.NET State Management.

• For advanced developers who want to use APIs as powerful as the ISAPI
programming interfaces that were included with previous versions of ASP,
ASP.NET offers the IHttpHandler and IHttpModule interfaces. Implementing
the IHttpHandler interface gives you a means of interacting with the low-
level request and response services of the IIS Web server and provides
functionality much like ISAPI extensions, but with a simpler programming
model. Implementing the IHttpModule interface allows you to include
custom events that participate in every request made to your application. For
more information, see HTTP Runtime Support.

• ASP.NET takes advantage of performance enhancements found in the .NET


Framework and common language runtime. Additionally, it has been designed
to offer significant performance improvements over ASP and other Web
development platforms. All ASP.NET code is compiled, rather than
interpreted, which allows early binding, strong typing, and just-in-time (JIT)
compilation to native code, to name only a few of its benefits. ASP.NET is
also easily factorable, meaning that developers can remove modules (a session
module, for instance) that are not relevant to the application they are
developing. ASP.NET also provides extensive caching services (both built-in
services and caching APIs). ASP.NET also ships with performance counters
that developers and system administrators can monitor to test new applications
and gather metrics on existing applications. For more information, see
ASP.NET Caching Features and ASP.NET Optimization.

• Writing custom debug statements to your Web page can help immensely in
troubleshooting your application's code. However, they can cause
embarrassment if they are not removed. The problem is that removing the
debug statements from your pages when your application is ready to be ported
to a production server can require significant effort. ASP.NET offers the Trace
Context class, which allows you to write custom debug statements to your
pages as you develop them. They appear only when you have enabled tracing
for a page or entire application. Enabling tracing also appends details about a
request to the page, or, if you so specify, to a custom trace viewer that is
stored in the root directory of your application. For more information, see
ASP.NET Trace.

• The .NET Framework and ASP.NET provide default authorization and


authentication schemes for Web applications. You can easily remove, add to,
or replace these schemes, depending upon the needs of your application. For
more information, see Securing ASP.NET Web Applications.

• ASP.NET configuration settings are stored in XML-based files, which are


human readable and writable. Each of your applications can have a distinct
configuration file and you can extend the configuration scheme to suit your
requirements. For more information, see ASP.NET Configuration.

• Applications are said to be running side by side when they are installed on the
same computer but use different versions of the .NET Framework. To learn
how to use different versions of ASP.NET for separate applications on your
server, see Side-by-Side Support in ASP.NET.

• IIS 6.0 uses a new process model called worker process isolation mode, which
is different from the process model used in previous versions of IIS. ASP.NET
uses this process model by default when running on Windows Server 2003.
For information about how to migrate ASP.NET process model settings to
worker process isolation mode, see IIS 6.0 Application Isolation Modes.
The Components of .NET Framework
The .NET Framework is divided into two main components: the .NET Framework
Class Library and the Common Language Runtime.

 Common Language Runtime

The Common Language Runtime is the execution engine for .NET


Framework applications.

It provides a number of services, including the following:

• Code management (loading and execution)

• Application memory isolation

• Verification of type safety

• Conversion of IL to native code

• Access to metadata (enhanced type information)

• Managing memory for managed objects

• Enforcement of code access security

• Exception handling, including cross-language exceptions

• Interoperation between managed code, COM objects, and pre-existing


DLLs (unmanaged code and data)

• Automation of object layout

• Support for developer services (profiling, debugging, and so on)

 The Common Type System defines how types are declared, used, and
managed in the runtime, and is also an important part of the runtime's support
for cross-language integration. The common type system performs the
following functions:
• Establishes a framework that helps enable cross-language integration, type
safety, and high performance code execution.

• Provides an object-oriented model that supports the complete implementation


of many programming languages.

• Defines rules that languages must follow, which helps ensure that objects
written in different languages can interact with each other.

 Assemblies in the Common Language Runtime

Assemblies are the building blocks of .NET Framework applications; they


form the fundamental unit of deployment, version control, reuse, activation
scoping, and security permissions. An assembly is a collection of types and
resources that are built to work together and form a logical unit of
functionality. An assembly provides the common language runtime with the
information it needs to be aware of type implementations. To the runtime, a
type does not exist outside the context of an assembly.

 .NET Framework Class Library Overview

The .NET Framework includes classes, interfaces, and value types that
expedite and optimize the development process and provide access to system
functionality. To facilitate interoperability between languages, the .NET
Framework types are CLS-compliant and can therefore be used from any
programming language whose compiler conforms to the common language
specification (CLS).

The .NET Framework types are the foundation on which .NET applications,
components, and controls are built. The .NET Framework includes types that
perform the following functions:

• Represent base data types and exceptions.

• Encapsulate data structures.

• Perform I/O.

• Access information about loaded types.

• Invoke .NET Framework security checks.


• Provide data access, rich client-side GUI, and server-controlled, client-side
GUI.

THE .NET FRAMEWORK


ASP.NET is a request processing engine. It takes an incoming request and passes it
through its internal pipeline to an end point where you as a developer can attach code
to process that request. This engine is actually completely separated from HTTP or
the Web Server. In fact, the HTTP Runtime is a component that you can host in your
own applications outside of IIS or any server-side application altogether. For example,
you can host the ASP.NET runtime in a Windows Forms application.

The runtime provides a complex yet very elegant mechanism for routing requests
through this pipeline. There are a number of interrelated objects, most of which are
extensible either via subclassing or through event interfaces at almost every level of
the process, so the framework is highly extensible. Through this mechanism it's
possible to hook into very low-level interfaces such as caching, authentication and
authorization. You can even filter content by pre- or post-processing requests or simply
route incoming requests that match a specific signature directly to your code or
another URL. There are a lot of different ways to accomplish the same thing, but all
of the approaches are straightforward to implement, yet provide flexibility in finding
the best match for performance and ease of development.
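
As a small illustration of hooking into this pipeline, the sketch below implements the
IHttpModule interface and writes an entry for every incoming request; the module name
and the logging call are illustrative, and a real module would also be registered in
the <httpModules> section of web.config:

using System;
using System.Web;

public class RequestLoggingModule : IHttpModule
{
    public void Init(HttpApplication context)
    {
        // BeginRequest fires for every request routed through the ASP.NET pipeline.
        context.BeginRequest += delegate(object sender, EventArgs e)
        {
            HttpApplication app = (HttpApplication)sender;
            // Append a note to the IIS log for this request.
            app.Context.Response.AppendToLog("Handling " + app.Request.RawUrl);
        };
    }

    public void Dispose() { }
}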

About ADO.NET
Most applications need data access at some point, making it a crucial component
when working with applications. Data access means making the application interact with a
database, where all the data is stored. Different applications have different
requirements for database access. VB .NET uses ADO .NET (ActiveX Data Objects)
as its data access and manipulation protocol, which also enables us to work with data
on the Internet. Let's take a look at why ADO .NET came into the picture, replacing ADO.

The ADO.NET Data Architecture


Data Access in ADO.NET relies on two components: Dataset and Data Provider.

DataSet
The dataset is a disconnected, in-memory representation of data. It can be considered
as a local copy of the relevant portions of the database. The DataSet is persisted in
memory and the data in it can be manipulated and updated independent of the
database. When the use of this DataSet is finished, changes can be made back to the
central database for updating. The data in DataSet can be loaded from any valid data
source like Microsoft SQL server database, an Oracle database or from a Microsoft
Access database.

Data Provider
The Data Provider is responsible for providing and maintaining the connection to the
database. A DataProvider is a set of related components that work together to provide
data in an efficient and performance driven manner. The .NET Framework currently
comes with two DataProviders: the SQL Data Provider which is designed only to
work with Microsoft's SQL Server 7.0 or later and the OleDb DataProvider which
allows us to connect to other types of databases like Access and Oracle. Each
DataProvider consists of the following component classes:

 The Connection object which provides a connection to the database.

 The Command object which is used to execute a command.

 The DataReader object which provides a forward-only, read-only, connected
record set.

 The DataAdapter object which populates a disconnected DataSet with data and
performs updates.

Data access with ADO.NET can be summarized as follows:


A connection object establishes the connection for the application with the database.
The command object provides direct execution of the command to the database. If the
command returns more than a single value, the command object returns a Data Reader
to provide the data. Alternatively, the Data Adapter can be used to fill the Dataset
object. The database can be updated using the command object or the Data Adapter.
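
A short C# sketch of this flow is shown below; the connection string is a placeholder,
the query is illustrative, and the Patient table and column names are borrowed from
the field lists given later in this report:

using System;
using System.Data;
using System.Data.SqlClient;

class AdoNetSketch
{
    static void Main()
    {
        // Placeholder connection string; in a real application it would come from configuration.
        string connStr = "Data Source=.;Initial Catalog=HRA;Integrated Security=True";

        using (SqlConnection conn = new SqlConnection(connStr))
        {
            conn.Open();

            // Connected access: Command + DataReader (forward-only, read-only).
            using (SqlCommand cmd = new SqlCommand("SELECT FirstName, LastName FROM Patient", conn))
            using (SqlDataReader reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                    Console.WriteLine(reader["FirstName"] + " " + reader["LastName"]);
            }

            // Disconnected access: DataAdapter fills an in-memory DataSet.
            SqlDataAdapter adapter = new SqlDataAdapter("SELECT * FROM Patient", conn);
            DataSet ds = new DataSet();
            adapter.Fill(ds, "Patient");
            Console.WriteLine("Rows loaded: " + ds.Tables["Patient"].Rows.Count);
        }
    }
}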
The tiers are better explained through an example. Say you want a system to work
with voter information data. The tiers are supposed to take on the following
responsibilities:
1. Presentation

Has the capability to show the data. It also gives you the UI for operations like
insert, edit, delete, look up, etc. However, although this part lets you issue the
commands for CRUD, it delegates the actual work to a lower layer called the
Business Layer.

2. Business Layer

This layer is responsible for applying all business rules on the data. For example,
this layer may reject an insert operation if the age of the voter is less than 18. So, the
rules that we want to enforce from a user's point of view are enforced in this layer.

3. Data Layer

This layer is supposed to be just a copy of your persistent storage. So, you want this
layer to be reusable across different business layers. And also, this layer is supposed
to act as an adapter between your Object Oriented world and the SQL world for the
Business Layer. This layer may have methods that actually perform CRUD
operations and provide an OO interface to the Business Layer.
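
The voter example above can be sketched in C# as follows; the class and method names
are hypothetical and only illustrate how a business rule sits between the presentation
layer and the data layer:

using System;

public class Voter
{
    public string Name;
    public int Age;
}

public class VoterDataLayer
{
    public void Insert(Voter voter)
    {
        // In a real system this would use ADO.NET or a stored procedure.
        Console.WriteLine("Inserted voter " + voter.Name);
    }
}

public class VoterBusinessLayer
{
    private readonly VoterDataLayer dataLayer = new VoterDataLayer();

    public void AddVoter(Voter voter)
    {
        // Business rule enforced before the data layer is touched.
        if (voter.Age < 18)
            throw new ArgumentException("Voter must be at least 18 years old.");

        dataLayer.Insert(voter);
    }
}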
ABOUT JAVA SCRIPT
JavaScript is a programming language that is used to make web pages
interactive. It runs on your visitor's computer and so does not require constant
downloads from your web site.

JavaScript supports all the structured programming syntax


in C (e.g., if statements, while loops, switch statements, etc.). One partial
exception is scoping: C-style block-level scoping is not supported. JavaScript 1.7,
however, supports block-level scoping with the let keyword. Like C, JavaScript
makes a distinction between expressions and statements.
ABOUT HTML
HTML is a computer language devised to allow website creation. These websites can
then be viewed by anyone else connected to the Internet. It is relatively easy to learn,
with the basics being accessible to most people in one sitting; and quite powerful in
what it allows you to create. It is constantly undergoing revision and evolution to
meet the demands and requirements of the growing Internet audience under the
direction of the W3C, the organization charged with designing and maintaining the
language.

The definition of HTML is Hypertext Markup Language.

• Hypertext is the method by which you move around on the web — by clicking
on special text called hyperlinks which bring you to the next page. The fact
that it is hyper just means it is not linear — i.e. you can go to any place on the
Internet whenever you want by clicking on links — there is no set order to do
things in.

• Markup is what HTML tags do to the text inside them. They mark it as a
certain type of text (italicized text, for example).

• HTML is a Language, as it has code-words and syntax like any other


language.

How HTML Works


HTML consists of a series of short codes typed into a text-file by the site
author — these are the tags. The text is then saved as an HTML file and viewed
through a browser, like Internet Explorer or Netscape Navigator. This browser reads
the file and translates the text into a visible form, hopefully rendering the page as the
author had intended. Writing your own HTML entails using tags correctly to create
your vision. You can use anything from a rudimentary text-editor to a powerful
graphical editor to create HTML.

SQL-SERVER 2005
SQL Server 2005 Express Edition is the next version of MSDE and is a free, easy-to-
use, lightweight, and embeddable version of SQL Server 2005. Continue reading to
learn more about the benefits of SQL Server Express Edition and to download SQL
Server Management Studio.

Microsoft makes SQL Server available in multiple editions, with different feature sets
and targeting different users.

SQL Server Compact Edition (SQL CE)

The compact edition is an embedded database engine. Unlike the other editions of
SQL Server, the SQL CE engine is based on SQL Mobile (initially designed for use
with hand-held devices) and does not share the same binaries. Due to its small size
(1MB DLL footprint), it has a markedly reduced feature set compared to the other
editions. For example, it supports a subset of the standard data types, does not support
stored procedures or Views or multiple-statement batches (among other limitations).
It is limited to a 4 GB maximum database size and cannot be run as a Windows service;
Compact Edition must be hosted by the application using it. The 3.5 version includes
considerable work that supports ADO.NET Synchronization Services.

SQL Server Developer Edition

SQL Server Developer Edition includes the same features as SQL Server Enterprise
Edition, but is limited by the license to be only used as a development and test system,
and not as a production server. This edition is available to download by students free of
charge as a part of Microsoft's DreamSpark program.

SQL Server 2005 Embedded Edition (SSEE)


SQL Server 2005 Embedded Edition is a specially configured named instance of the
SQL Server Express database engine which can be accessed only by certain Windows
Services.

SQL Server Enterprise Edition

SQL Server Enterprise Edition is the full-featured edition of SQL Server, including
both the core database engine and add-on services, while including a range of tools
for creating and managing a SQL Server cluster.

SQL Server Evaluation Edition

SQL Server Evaluation Edition, also known as the Trial Edition, has all the features
of the Enterprise Edition, but is limited to 180 days, after which the tools will
continue to run, but the server services will stop.

SQL Server Express Edition

SQL Server Express Edition is a scaled down, free edition of SQL Server, which
includes the core database engine. While there are no limitations on the number of
databases or users supported, it is limited to using one processor, 1 GB memory and 4
GB database files. The entire database is stored in a single .mdf file, making it
suitable for XCOPY deployment. It is intended as a replacement for MSDE.
Two additional editions provide a superset of features not in the original Express
Edition. The first is SQL Server Express with Tools, which includes SQL Server
Management Studio Basic. SQL Server Express with Advanced Services adds full-
text search capability and reporting services.
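
For illustration, connecting to a SQL Server 2005 Express instance from C# might look
like the sketch below; the instance name, database file and path are examples only,
not the settings used in this project:

using System;
using System.Data.SqlClient;

class ExpressConnectionSketch
{
    static void Main()
    {
        // Example connection string for a default Express instance attaching a single .mdf file.
        string connStr = @"Data Source=.\SQLEXPRESS;" +
                         @"AttachDbFilename=C:\Data\HRA.mdf;" +
                         "Integrated Security=True;User Instance=True";

        using (SqlConnection conn = new SqlConnection(connStr))
        {
            conn.Open();
            Console.WriteLine("Connected to SQL Server version " + conn.ServerVersion);
        }
    }
}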

Features of SQL SERVER 2005

1. One of SQL's greatest benefits is that it is truly a cross-platform and
cross-product language.

2. The common thread that runs throughout client/server application development is
the use of SQL and relational databases.

3. SQL Server 2005 is recommended where the database is not very large.

4. SQL Server 2005 provides good security.


Hardware Required
1. Intel Pentium IV

2. 1 GB RAM

3. 80 GB Hard Disk

SYSTEM ANALYSIS
1. Existing System: -

The existing system does not use any computer based system. All the jobs are done
manually. Jobs such as maintaining information about the various departments'
meetings (when the next meeting is organized, where and when it is held, who the
attendees are, and what the final decision of the meeting is) are all handled by hand.
Informing all the department members manually is a very time-consuming process.
Information about the various meetings held in the past has to be searched manually,
which takes time and consumes many resources.

The existing system had following Drawbacks:

• It was very time consuming

• Reports were generated manually.

• The Activities are often prone to errors.

• The speed of processing of data is very slow.

• The information required was not readily available.

• Lots of paper work was there.

2. Need for System

At present, all the operations are carried out manually. Sometimes, information is
duplicated, filled incorrectly or missed. A new computerized system is needed to
handle all these data integrity and consistency problems that arise when system is
handled manually.

(a) To improve quality of work and accuracy

(b) To improve work speed and accuracy

(c) To provide the easy and user friendly environment

(d) Get instant and detailed information at a single terminal

(e) The project would help in effective and systematic record keeping, that is,
the storing and retrieving of useful data.

(f) The project will produce various reports so that management can make
decisions on the basis of these reports.

3. Proposed System: -

The proposed system objectives are drawn to avoid the drawbacks of the meeting
scheduler. The proposed system replaces the inefficient processes and time-consuming
tasks of the current system with a simple, accurate and effective system. The
objective of this project is to create a website for a meeting management system and
to provide every detail about the meetings that have been held, and that are to be held,
in the various departments. This site records all the information about past meetings
for future use.

Modules of the project: -

• Administrator – This module is prepared for the administrator, who is
the only user having the authority to add, delete and modify the database.
The administrator can avail of these facilities only after passing through the
proper login mechanism built into the system.

• Registration – This module is developed so that a new user can open an
account with the web site to become a registered user. Through this
module, the personal information of users is gathered to allow them to sign
into the system. Once their username and password are created, they can
log into the system through the login module.

• Login – This module is developed to allow visitors and
administrators to log into the system.
Through the I/O/Update facility one can easily maintain the records in the file.
Moreover, the user will be able to get a variety of reports, which is the most important
part of this system, as of any other. A lot of effort and care has been taken in designing
the formats. Most of the time it is not possible to computerize the manual system
completely and some things have to be left out. This leads to some limitations in the
development of the system, but utmost care has been taken to satisfy most of the
needs.

The system deals with everything from entering records in the database to generating
reports. The major advantage is the increase in the speed and efficiency of the work.
Much of the tedious work of the old system is reduced.

4. Feasibility Study:-

A feasibility study is a test of a system proposal with regard to its workability,
its impact on the organization, and its effective use of resources.

Technical Feasibility: -

This application is technically feasible.

Hardware Requirements:-

 Processor Pentium Class

 512 MB RAM

 80 GB HDD

 Monitor

 Keyboard

 Mouse

 LAN card

 Modem

 Internet Connection
SOFTWARE REQUIREMENTS
For Development Purpose:

 ASP.NET with C#

 IIS

 SQL Server 2005

 Email Accounts

 Windows XP

For Work Station:

 Internet Explorer.

 Internet Connection

 Mail Server

 Windows Operating System


SYSTEM
REQUIREMENT
SPECIFICATION
SYSTEM REQUIREMENT SPECIFICATION

Introduction to SRS
The Software Requirements Specification presents the overall information about the
interface, the flow of data, and the constraints of the product.

An SRS is a document that completely describes what the proposed software should
do without describing how the software will do it. Therefore it describes the complete
external behavior of the proposed software. An SRS is used to define the needs and
expectations of the users. It serves as a contract document between customer and
developer. It is produced at the culmination of the analysis task.

The function and performance allocated to software as part of software engineering are
refined by:

• Establishing a complete introduction

• General description

• Information description

• A detailed functional description, and

• Other information pertinent to requirement.

This document, that is, the software requirements specification, describes the overall
requirements that will be satisfied by the final product. It serves as the
foundation for subsequent software engineering activities. It describes the function of
the computer-based system and the constraints that will govern its development.
SYSTEM DESIGN
&
DEVELOPMENT
SYSTEM DESIGN & DEVELOPMENT
Software Engineering Paradigm Applied: -

Project Planning

The key to a successful project is in the planning. Creating a project plan is the first
thing you should do when undertaking any kind of project.

Often project planning is ignored in favor of getting on with the work. However,
many people fail to realize the value of a project plan in saving time and money and
in avoiding many problems.

Project Planning Objective: -

• Coordinate the various interrelated processes of the project.

• Ensure project includes all the work required, and only the work required,
to complete the project successfully.

• Ensure that the project is completed on time and within budget.

• Ensure that the project will satisfy the needs for which it was undertaken.

• Ensure the most effective use of the people involved with the project.

• Promote effective communication between the project's team members and
key members.

• Ensure that project risks are identified, analyzed, and responded to.

Need and Importance: -

• Quality delivery

• Customer satisfaction

• Structured

• Managing the manpower


Modules of HRA and Description
Home Page

Description:-

This page is the default login page of the site; from here all employees, physicians and
the data manager log in and proceed.
On this page, the HRA login and logo sections are arranged according to the needs
of the client.

Data Management Home page


Data Management Home page Menu Descriptions:-
 Home

This is the home page, specifying the Grid View settings for the pages of other
roles.

 Screen Permission

This page is used by the Data Manager to set which and how many tabs will
appear on the pages of other roles.

 Look Ups

Look Ups is used to add, delete and edit items in all sections (such as family
member, medications, and consults) which are selected by physicians or
employees.

 Add Users

This page is used to add a new role with a user name and password.

 Change Password

This page is used to change the password of the Data Manager.

 Log Out

This is used to log out from current session.


Data Entry Home Page
Description:

This is the Data Entry home page. From here an employee can search for any patient by
his/her first name or last name and can add a new patient with patient information. An
employee can update any patient's information by selecting the patient from the
alphabetical patient bar at the left side of the page.

Data Entry Home page Menu
 Demographic

 Insurance

 PCP/Consults

 Personal & Social History

 Family History

 Advanced Directive

 Allergies

 Immunization

 Surgical History

 Hospitalization/SNF Visits

 Medications

 Chief Complaint

 HPI

 Review of Systems

 Exam/Vitals

 Labs

 Diagnostics

 Functional Assessment

 Problem List

 Recommendation
Data Entry Home page Menu Descriptions

 Add Users
General Description-

Add New is an interface from where an employee can add a new patient
by clicking on the Save button after filling in the form.
Technical Description

All Fields Values:

Sr.No Field Name Field Control Field Type In DB

1 Last Name Text Box varchar(50)

2 First Name Text Box varchar(50)

3 Middle Name Text Box varchar(50)

4 DOB Text Box datetime

5 SSN Text Box int

6 MRN Text Box varchar(15)

7 Reside At Drop down List varchar(15)

8 Marital status Drop down List varchar(15)

9 Gender Drop down List bit

10 Address1 Text Box varchar(50)

11 Address2 Text Box char(25)

12 City Text Box varchar(15)

13 State Drop down List varchar(15)

14 Country Drop down List varchar(15)

15 Phone Text Box varchar(15)

16 Cell Text Box varchar(15)

17 Zip Text Box varchar(15)

18 Location Text Box varchar(15)

19 Date of Service Text Box DateTime

Validations:
Mandatory Fields:

1. Last Name

2. First Name

3. Location

4. Date of Service

5. DOB

6. Reside At

7. Gender

Size Validation:

1. LastName : 50 Characters

2. FirstName : 50 Characters

3. SSN : 11 Characters (Including dashes [“-”])

4. MRN : 15 Characters

Other Validations:

 SSN (If Entered): It should be in a specific format, like 123-45-6789
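
A small validation helper could enforce this SSN format, as sketched below; the class
and method names are hypothetical, while the format itself is the one stated above:

using System.Text.RegularExpressions;

public static class SsnValidator
{
    // Matches the dashed 11-character format, e.g. 123-45-6789.
    private static readonly Regex SsnPattern =
        new Regex(@"^\d{3}-\d{2}-\d{4}$", RegexOptions.Compiled);

    public static bool IsValidSsn(string ssn)
    {
        // SSN is optional, but when entered it must match the dashed format.
        return string.IsNullOrEmpty(ssn) || SsnPattern.IsMatch(ssn);
    }
}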

Code Description:
Functionality:

On the Save button we validate whether the specific patient already exists in our system
or not. If the patient exists, we update the information and make a new admission for
him; otherwise, we insert that patient into the Patient table and also make a new
admission for him.

Functions we used to get this thing done:

1. btn_save_Click(): This function gets all field values into variables using the get
property for each field.

2. fun_demo_insert(): This function of the class file is used to add parameters and
call the stored procedure that saves records in the Patient and Patient
History tables.
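
A hedged sketch of how fun_demo_insert() might call the DE_Demographic_Insert stored
procedure is given below; the class, function, table and stored-procedure names appear
in this report, but the parameter list, method signature and connection string are
assumptions made for illustration:

using System;
using System.Data;
using System.Data.SqlClient;

public class csDemoGraphics
{
    // Placeholder connection string; the real one would come from web.config.
    private readonly string connStr =
        "Data Source=.;Initial Catalog=HRA;Integrated Security=True";

    // Called from btn_save_Click() once the page has collected the form values.
    public void fun_demo_insert(string firstName, string lastName, DateTime dob,
                                string location, DateTime dateOfService)
    {
        using (SqlConnection conn = new SqlConnection(connStr))
        using (SqlCommand cmd = new SqlCommand("DE_Demographic_Insert", conn))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            cmd.Parameters.AddWithValue("@FirstName", firstName);
            cmd.Parameters.AddWithValue("@LastName", lastName);
            cmd.Parameters.AddWithValue("@DOB", dob);
            cmd.Parameters.AddWithValue("@Location", location);
            cmd.Parameters.AddWithValue("@DateOfService", dateOfService);

            conn.Open();
            // The stored procedure either updates an existing patient or inserts a new
            // one, and creates a new admission in both cases.
            cmd.ExecuteNonQuery();
        }
    }
}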

Database Operations:

Affected Class:

 csDemoGraphics.cs

Affected Function

 btn_save_Click()

 fun_demo_insert()

Used SP’s :

 DE_Demographic_Insert()

Fields Description:

Patient Table

MRN SSN

FirstName LastName
MiddleName GenderId

DOB MaritalStatusId

Address1 Address2

City StateId

Zip Phone

Fax Cell

CreatedDate CreatedBy

CountryId CountyId

RaceId EthnicityId

Patient History Table

SSN FirstName ,

LastName MiddleName,

GenderId DOB
MaritalStatusId Address1

Address2 City

StateId Zip

Phone Fax

Cell CreatedDate

CreatedBy CountryId

CountyId RaceId

EthnicityId ResideAtID

Demographic
General Description-
Demographic is an interface from where an employee can update a patient's
demographic-related information by clicking the Update button.
Technical Description
All Fields Values:

Sr.No Field Name Field Control Field Type In DB

1 Last Name Text Box varchar(50)

2 First Name Text Box varchar(50)

3 Middle Name Text Box varchar(50)

4 DOB Text Box datetime

5 SSN Text Box int

6 MRN Text Box varchar(15)

7 Reside At Drop down List varchar(15)

8 Marital status Drop down List varchar(15)

9 Gender Drop down List bit

10 Address1 Text Box varchar(50)

11 Address2 Text Box char(25)

12 City Text Box varchar(15)

13 State Drop down List varchar(15)

14 Country Drop down List varchar(15)

15 Phone Text Box varchar(15)

16 Cell Text Box varchar(15)

17 Zip Text Box varchar(15)

18 Location Text Box varchar(15)

19 Date of Service Text Box DateTime

Validations:

Mandatory Fields:
1. Last Name

2. First Name

3. Location

4. Date of Service

5. DOB

6. Reside At

7. Gender

Size Validation:

1. LastName : 50 Characters

2. FirstName : 50 Characters

3. SSN : 11 Characters (Including dashes [“-”])

4. MRN : 15 Characters

Other Validations:

 SSN (If Entered): It should be in a specific format, like 123-45-6789

Code Description
Functions we used to get this thing done:

1. onPageLoad(): This function sets all field values into the form fields using the set
property for each field.

2. fun_demo_update(): This function of the class file is used to add parameters and
call the stored procedure that updates records in the Patient and Patient History tables.

Database Operations

Affected Class:

 csDemoGraphics.cs

Affected Function

 onPageLoad()

 fun_demo_update()

Used SP’s :

 DE_Dempgraphic_Update()

Fields Description:
Patient Table

MRN SSN

FirstName LastName
MiddleName GenderId

DOB MaritalStatusId

Address1 Address2

City StateId

Zip Phone

Fax Cell

CreatedDate CreatedBy

CountryId CountyId

RaceId EthnicityId

Patient History Table


SSN FirstName,

LastName MiddleName

GenderId DOB
MaritalStatusId Address1

Address2 City

StateId Zip

Phone Fax

Cell CreatedDate

CreatedBy CountryId

CountyId RaceId

EthnicityId ResideAtID

Family History
General Description-

Family History is an interface from where an employee can add new, and update existing,
family history related information for a patient by clicking the Update button.
Technical Description
All Fields Values:
Sr.No Field Name Field Control Field Type In DB

1 Family Member Drop down List varchar(50)

2 Age of Death Text Box varchar(10)

3 Chronic Disease 1 Ajax Control varchar(50)

4 Chronic Disease 2 Ajax Control varchar(50)

5 Chronic Disease 3 Ajax Control varchar(50)

6 Chronic Disease 4 Ajax Control varchar(15)


Validations:
Mandatory Fields:
1. Family Member

Code Description
Functionality:

1. btn_save_Click(): This function gets all field values into variables using the get
property for each field.

2. csfamilyinfo.FamilyInfo_Update(): This function of the class file is used to add
parameters and call the stored procedure that updates records in the
PatientFamilyInfo table.

Database Operations
Affected Class:

 csFamilyInfo.cs

Affected Function

 btn_save_Click()

 csfamilyinfo.FamilyInfo_Update()

Used SP’s :

DE_FamilyInfo_Update()

Fields Description:
PatientFamilyInfo Table

FamilyMember PatientID

AssessmentId AgeOfDath

CODEID1 CODEID2

CODEID3 CODEID4

ModifiedDate ModifiedBy

Labs
General Description-

Labs is an interface from where an employee can add and update a patient's
different lab-related information.

Lab CMP Description


Technical Description

All Fields Values:

Sr.No Field Name Field Control Field Type In DB

1 Order Date DateTime varchar(50)

2 Order Time DateTime varchar(10)

3 Order By Drop Down List varchar(50)

4 ALT(GPT)(0.0-36.0 U/L) Text Box varchar(50)

5 ALK PHOS(20.0-125.0 U/L) Ajax Control varchar(50)

6 AST(GOT)(10.0-40.0 U/L) Ajax Control varchar(15)

7 TBI(0.30-1.50 mg/dl) Text Box varchar(10)

8 CA(8.3-10.5 mg/dl) Text Box varchar(10)

9 GLUCOSE(65.0-99.0 mg/dl) Text Box varchar(10)

10 NA(135.0-151.0 mmol/L) Text Box varchar(10)

11 CL(98.0-113.0 mmol/L) Text Box varchar(10)

12 GLOBULIN(2.2-4.2 g/dL) Text Box varchar(10)

Validations:

Mandatory Fields:

1. Order Date

2. Order By
Code Description
Functions we used to get this thing done:

 btn_save_Click(): This function sets all field values into variables using the set
property for each field.

 cslabs.LabInfo_Update(): This function of the class file is used to add parameters
and call the stored procedure that updates records in the PatientLabCMP table.

Database Operations

Affected Class:
 csLabs.cs

Affected Function
 btn_save_Click()

 cslabs.LabInfo_Update()

Used SP’s :
 DE_Lab_CMP_Insert_Update_Delete()

Fields Description:
PatientLabCMP Table

ALTGPT ALKPHOS

ASTGOT ModifiedBy
TBI CA

GLUCOSE NA

K CL

CO2 BUN

CREATININE BUNORCREATRATIO

ALBUMIN TOTALPROTEIN

GLOBULIN AORGRATIO

PatientId OrderBy

OrderOn ModifiedDate

Physician Home page Menu Descriptions

E Signature
General Description-
E Signature is an interface from where a physician can apply an E Signature to
a patient report.

Technical Description
All Fields Values:

Sr.No Field Name Field Control Field Type In DB

1 Signature Text Box varchar(50)


Validations:

Mandatory Fields:

1. Signature

Code Description
Functionality:

1. btn_save_Click(): This function gets all field values into variables using the get
property for each field.

2. tblCBC_Return(): This function of the class file is used to add parameters and
call the stored procedure that selects the signature from the Users table.

3. Physician_Approval_insert(): If the entered signature matches the signature in the
Users table, this function calls the stored procedure that sets the physician name
and the current (approval) date.
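
A hedged sketch of this e-signature flow is given below; the function and
stored-procedure names appear in this report, but the parameter lists, method
signatures and connection string are assumptions made for illustration:

using System;
using System.Data;
using System.Data.SqlClient;

public class ESignatureSketch
{
    // Placeholder connection string; the real one would come from web.config.
    private readonly string connStr =
        "Data Source=.;Initial Catalog=HRA;Integrated Security=True";

    // Returns the stored signature for the physician (PH_Approve_SignatureReturnOnPhysician).
    public string tblCBC_Return(int userId)
    {
        using (SqlConnection conn = new SqlConnection(connStr))
        using (SqlCommand cmd = new SqlCommand("PH_Approve_SignatureReturnOnPhysician", conn))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            cmd.Parameters.AddWithValue("@UserId", userId);
            conn.Open();
            object result = cmd.ExecuteScalar();
            return result == null ? null : result.ToString();
        }
    }

    // If the entered signature matches, record the approval (PH_Approve_Insert).
    public bool Physician_Approval_insert(int userId, int patientId, string enteredSignature)
    {
        if (enteredSignature != tblCBC_Return(userId))
            return false; // Signature mismatch: the report is not approved.

        using (SqlConnection conn = new SqlConnection(connStr))
        using (SqlCommand cmd = new SqlCommand("PH_Approve_Insert", conn))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            cmd.Parameters.AddWithValue("@UserId", userId);
            cmd.Parameters.AddWithValue("@PatientId", patientId);
            cmd.Parameters.AddWithValue("@ApprovalDate", DateTime.Now);
            conn.Open();
            cmd.ExecuteNonQuery();
            return true;
        }
    }
}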

Database Operations
Affected Function
 btn_save_Click()

 tblCBC_Return()

 Physician_Approval_insert()

Used SP’s :
 PH_Approve_SignatureReturnOnPhysician()

 PH_Approve_Insert()

Fields Description:

Users Table

UserId RoleId

UserName Password
RoleIdentityId SIGNATURE

PhysicianId InsuranceCompanyId

ActiveUser CreatedDate

CreatedBy Modifieddate

ModifiedBy DeletedDate

PatientPhysicianApproval Table

ApprovalId PatientId

AssessmentId UserId

ApprovalDate CreatedDate
CreatedBy ModifiedDate

Modifiedby DeletedDate

PatientPhysicianApprovalHistory Table

ApprovalHistoryId ApprovalId

PatientId AssessmentId

UserId ApprovalDate
CreatedDate CretedBy

ModifiedDate ModifiedBy

DeletedDate DeletedBy
E R DIAGRAM
E R DIAGRAM
In software engineering, an entity-relationship model (ERM) is an abstract
and conceptual representation of data. Entity-relationship modeling is a
database modeling method, used to produce a type of conceptual schema or

semantic data model of a system, often a relational database, and its

requirements in a top-down fashion. Diagrams created by this process are


called entity-relationship diagrams, ER diagrams, or ERDs.
USE CASE Diagram
SYSTEM
TESTING

Testing
1) Introduction:

System testing of software or hardware is testing conducted on a complete, integrated system to evaluate the system's compliance with its specified requirements. System testing falls within the scope of black box testing and, as such, should require no knowledge of the inner design of the code or logic.

As a rule, system testing takes, as its input, all of the "integrated" software components that have successfully passed integration testing and also the software system itself integrated with any applicable hardware system(s). The purpose of integration testing is to detect any inconsistencies between the software units that are integrated together (called assemblages) or between any of the assemblages and the hardware. System testing is a more limited type of testing; it seeks to detect defects both within the "inter-assemblages" and also within the system as a whole.

2) Testing Objective:-

Testing is the process of executing a program with the intent of finding an error.

A good test case is the one that has a high probability of finding an as-yet-
undiscovered error.

A successful test is one that uncovers an as-yet-undiscovered error.

The objectives of testing are:

Software quality improvement.

Verification and Validation.

Software Reliability Estimation.

These objectives imply a dramatic change in viewpoint. They move counter to the commonly held view that a successful test is one in which no errors are found. Our objective is to design tests that systematically uncover different classes of errors and to do so with a minimum amount of time and effort. If testing is conducted successfully (according to the objectives stated previously), it will uncover errors in the software. As a secondary benefit, testing demonstrates that the software functions appear to be working according to the specification and that behavioral and performance requirements appear to have been met. In addition, data collected as testing is conducted provide a good indication of software reliability and some indication of software quality as a whole; but testing cannot show the absence of errors and defects, it can show only that software errors and defects are present. It is important to keep this (rather gloomy) statement in mind as testing is being conducted.

3) Types of Testing: -

There are various types of testing, as follows:

Black box testing - Internal system design is not considered in this type of
testing. Tests are based on requirements and functionality.

White box testing - This testing is based on knowledge of the internal logic of
an application’s code. Also known as Glass box Testing. Internal software and
code working should be known for this type of testing. Tests are based on
coverage of code statements, branches, paths, conditions.

Unit testing - Testing of individual software components or modules. Typically done by the programmer and not by testers, as it requires detailed knowledge of the internal program design and code. May require developing test driver modules or test harnesses (a small example is shown after this list).

Incremental integration testing - Bottom-up approach for testing, i.e. continuous testing of an application as new functionality is added. Application functionality and modules should be independent enough to test separately. Done by programmers or by testers.

Integration testing - Testing of integrated modules to verify combined functionality after integration. Modules are typically code modules, individual applications, client and server applications on a network, etc. This type of testing is especially relevant to client/server and distributed systems.

Functional testing - This type of testing ignores the internal parts and focuses on whether the output is as per the requirement or not. Black-box type testing geared to the functional requirements of an application.

System testing - The entire system is tested as per the requirements. Black-box type testing that is based on overall requirements specifications and covers all combined parts of a system.
End-to-end testing - Similar to system testing, involves testing of a complete
application environment in a situation that mimics real-world use, such as
interacting with a database, using network communications, or interacting
with other hardware, applications, or systems if appropriate.

Sanity testing - Testing to determine whether a new software version is performing well enough to accept it for a major testing effort. If the application crashes during initial use, then the system is not stable enough for further testing, and the build or application is assigned to be fixed.

Regression testing - Testing the application as a whole after a modification in any module or functionality. It is difficult to cover all of the system in regression testing, so automation tools are typically used for this type of testing.

Acceptance testing - Normally this type of testing is done to verify whether the system meets the customer-specified requirements. Users or customers do this testing to determine whether to accept the application.

Load testing - This is performance testing to check system behavior under load. Testing an application under heavy loads, such as testing a web site under a range of loads to determine at what point the system's response time degrades or fails.

Stress testing - The system is stressed beyond its specifications to check how and when it fails. Performed under heavy load, such as putting in data beyond storage capacity, complex database queries, or continuous input to the system or database.

Performance testing - A term often used interchangeably with 'stress' and 'load' testing. Checks whether the system meets performance requirements. Different performance and load tools are used to do this.

Usability testing - A user-friendliness check. The application flow is tested: can a new user understand the application easily, and is proper help documented wherever the user may get stuck? Basically, system navigation is checked in this testing.

Install/uninstall testing - Tested for full, partial, or upgrade install/uninstall processes on different operating systems under different hardware and software environments.
Recovery testing - Testing how well a system recovers from crashes,
hardware failures, or other catastrophic problems.

Security testing - Can the system be penetrated by any hacking technique? Testing how well the system protects against unauthorized internal or external access, and checking whether the system and database are safe from external attacks.

Compatibility testing - Testing how well the software performs in a particular hardware/software/operating system/network environment and different combinations of the above.

Comparison testing - Comparison of product strengths and weaknesses with previous versions or other similar products.

Alpha testing - An in-house virtual user environment can be created for this type of testing. Testing is done at the end of development. Minor design changes may still be made as a result of such testing.

Beta testing - Testing typically done by end-users or others. Final testing before releasing the application for commercial purposes.
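Following the unit-testing note above, a minimal unit test sketch is shown here; it uses NUnit and a hypothetical BMI-calculation helper that is not part of the project code, purely to illustrate testing one small module in isolation.

    using NUnit.Framework;

    public static class HealthCalculator
    {
        // Hypothetical helper used only to illustrate unit testing.
        public static double Bmi(double weightKg, double heightM)
        {
            return weightKg / (heightM * heightM);
        }
    }

    [TestFixture]
    public class HealthCalculatorTests
    {
        [Test]
        public void Bmi_ReturnsExpectedValue()
        {
            double bmi = HealthCalculator.Bmi(70.0, 1.75);
            Assert.AreEqual(22.86, bmi, 0.01);   // expected BMI within a small tolerance
        }
    }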

SOFTWARE TESTING LIFE CYCLE


Software testing life cycle identifies what test activities to carry out and when
(what is the best time) to accomplish those test activities. Even though testing
differs between organizations, there is a testing life cycle.

Software Testing Life Cycle consists of the following (generic) phases:


• Test Planning,

• Test Analysis,

• Test Design,

• Construction and verification,

• Testing Cycles,

• Final Testing and Implementation and

• Post Implementation.

Software testing has its own life cycle that intersects with every stage of the SDLC. The basic requirement in the software testing life cycle is to control and manage software testing – manual, automated, and performance.

Test Planning

This is the phase where the Project Manager has to decide what needs to be tested, whether the appropriate budget is available, and so on. Naturally, proper planning at this stage would greatly reduce the risk of low-quality software. This planning will be an ongoing process with no end point.

Activities at this stage would include preparation of a high-level test plan (according to the IEEE test plan template, the Software Test Plan (STP) is designed to prescribe the scope, approach, resources, and schedule of all testing activities; the plan must identify the items to be tested, the features to be tested, the types of testing to be performed, the personnel responsible for testing, the resources and schedule required to complete testing, and the risks associated with the plan). Almost all of the activities done during this stage are included in the software test plan and revolve around the test plan.

Test Analysis

Once the test plan is made and decided upon, the next step is to delve a little more into the project and decide what types of testing should be carried out at different stages of the SDLC, whether we need or plan to automate, and if yes, when the appropriate time to automate is, and what type of specific documentation is needed for testing.
Proper and regular meetings should be held between the testing teams, project managers, development teams, and Business Analysts to check the progress of things, which will give a fair idea of the movement of the project and ensure the completeness of the test plan created in the planning phase; this will further help in enhancing the testing strategy created earlier. We start creating test case formats and the test cases themselves. In this stage we need to develop a functional validation matrix based on the Business Requirements to ensure that all system requirements are covered by one or more test cases, identify which test cases to automate, and begin review of documentation, i.e. Functional Design, Business Requirements, Product Specifications, Product Externals, etc. We also have to define areas for Stress and Performance testing.

Test Design

Test plans and cases which were developed in the analysis phase are revised. The functional validation matrix is also revised and finalized. In this stage the risk assessment criteria are developed. If automation is planned, you have to select which test cases to automate and begin writing scripts for them. Test data is prepared. Standards for unit testing and pass/fail criteria are defined here. The schedule for testing is revised (if necessary) and finalized, and the test environment is prepared.

Construction and verification

In this phase we have to complete all the test plans and test cases, complete the scripting of the automated test cases, and complete the Stress and Performance testing plans. We have to support the development team in their unit testing phase, and obviously bug reporting is done as and when bugs are found. Integration tests are performed and errors (if any) are reported.

Testing Cycles

In this phase we have to complete testing cycles until test cases are executed
without errors or a predefined condition is reached. Run test cases --> Report Bugs
--> revise test cases (if needed) --> add new test cases (if needed) --> bug fixing -->
retesting (test cycle 2, test cycle 3….).
Final Testing and Implementation

In this phase we have to execute the remaining stress and performance test cases, complete/update the documentation for testing, and provide and complete the different matrices for testing. Acceptance, load, and recovery testing will also be conducted, and the application needs to be verified under production conditions.

Post Implementation

In this phase, the testing process is evaluated and lessons learnt from that testing process are documented. A line of attack to prevent similar problems in future projects is identified, and plans to improve the processes are created. The recording of new errors and enhancements is an ongoing process. Cleaning up of the test environment is done and test machines are restored to baselines in this stage.

Testing is usually performed for the following purposes:

To improve quality:

As computers and software are used in critical applications, the outcome of a bug can
be severe. Bugs can cause huge losses. Bugs in critical systems have caused airplane
crashes, allowed space shuttle missions to go awry, halted trading on the stock
market, and worse. Bugs can kill. Bugs can cause disasters. The so-called year 2000
(Y2K) bug has given birth to a cottage industry of consultants and programming tools
dedicated to making sure the modern world doesn't come to a screeching halt on the
first day of the next century. In a computerized embedded world, the quality and
reliability of software is a matter of life and death.

Quality means conformance to the specified design requirements. Being correct, the minimum requirement of quality, means performing as required under specified circumstances. Debugging, a narrow view of software testing, is performed heavily to find design defects introduced by the programmer. The imperfection of human nature makes it almost impossible to make a moderately complex program correct the first time. Finding the problems and getting them fixed is the purpose of debugging in the programming phase.

For Verification & Validation (V&V):

As the topic Verification and Validation indicates, another important purpose of testing is verification and validation (V&V). Testing can serve as metrics and is heavily used as a tool in the V&V process. Testers can make claims based on interpretations of the testing results: either the product works under certain situations, or it does not work. We can also compare the quality among different products under the same specification, based on results from the same test.

We cannot test quality directly, but we can test related factors to make quality visible. Quality has three sets of factors -- functionality, engineering, and adaptability. These three sets of factors can be thought of as dimensions in the software quality space. Each dimension may be broken down into its component factors and considerations at successively lower levels of detail. Table 1 illustrates some of the most frequently cited quality considerations.

Functionality (exterior quality)    Engineering (interior quality)    Adaptability (future quality)
Correctness                         Efficiency                        Flexibility
Reliability                         Testability                       Reusability
Usability                           Documentation                     Maintainability
Integrity                           Structure

Table 1. Typical Software Quality Factors [Hetzel88]

Good testing provides measures for all relevant factors. The importance of any
particular factor varies from application to application. Any system where human
lives are at stake must place extreme emphasis on reliability and integrity. In the
typical business system usability and maintainability are the key factors, while for a
one-time scientific program neither may be significant. Our testing, to be fully
effective, must be geared to measuring each relevant factor and thus forcing quality to
become tangible and visible.
Tests with the purpose of validating that the product works are named clean tests, or positive tests. The drawback is that they can only validate that the software works for the specified test cases; a finite number of tests cannot validate that the software works for all situations. On the contrary, only one failed test is sufficient to show that the software does not work. Dirty tests, or negative tests, refer to tests aimed at breaking the software, or showing that it does not work. A piece of software must have sufficient exception handling capabilities to survive a significant level of dirty tests.
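To make the clean/dirty distinction concrete, the sketch below shows one positive test and one negative test for a hypothetical range-check helper (not project code); the 65.0-99.0 mg/dl glucose reference range from the data-entry screens is used only as an example.

    using NUnit.Framework;

    public static class GlucoseValidator
    {
        // Hypothetical check against the 65.0-99.0 mg/dl reference range.
        public static bool IsInRange(double value)
        {
            return value >= 65.0 && value <= 99.0;
        }
    }

    [TestFixture]
    public class GlucoseValidatorTests
    {
        [Test]
        public void CleanTest_ValueInsideRange_IsAccepted()
        {
            Assert.IsTrue(GlucoseValidator.IsInRange(90.0));   // positive (clean) test
        }

        [Test]
        public void DirtyTest_ValueOutsideRange_IsRejected()
        {
            Assert.IsFalse(GlucoseValidator.IsInRange(150.0)); // negative (dirty) test
        }
    }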

A testable design is a design that can be easily validated, falsified and maintained.
Because testing is a rigorous effort and requires significant time and cost, design for
testability is also an important design rule for software development.

For reliability estimation:

Software reliability has important relations with many aspects of software, including the structure and the amount of testing it has been subjected to. Based on an operational profile (an estimate of the relative frequency of use of various inputs to the program), testing can serve as a statistical sampling method to gain failure data for reliability estimation.

Software testing is not mature. It still remains an art, because we still cannot make it a science. We are still using the same testing techniques invented 20-30 years ago, some of which are crafted methods or heuristics rather than good engineering methods. Software testing can be costly, but not testing software is even more expensive, especially in places where human lives are at stake. Solving the software-testing problem is no easier than solving the Turing halting problem. We can never be sure that a piece of software is correct. We can never be sure that the specifications are correct. No verification system can verify every correct program. We can never be certain that a verification system is correct either.
FEATURES OF THE PROJECT
There are 6 key features for this project. They are:-

 Design of User-Friendly input formats.

 Provisions for Data Entry and Validations.

 Provision for data retrieval.

 Unauthorized persons cannot access the system.

 More flexibility.

 More interoperability.
CONCLUSION

Paper-based records require a significant amount of storage space compared to digital records. In the US, most states require physical records to be held for a minimum of seven years. The costs of storage media, such as paper and film, per unit of information differ dramatically from those of electronic storage media. When paper records are stored in different locations, collating them to a single location for review by a healthcare provider is time consuming and complicated, whereas the process can be simplified with electronic records. This is particularly true in the case of person-centered records, which are impractical to maintain if not electronic (thus difficult to centralize or federate). When paper-based records are required in multiple locations, copying, faxing, and transporting costs are significant compared to duplication and transfer of digital records.

Handwritten paper medical records can be associated with poor legibility, which can
contribute to medical errors. Pre-printed forms, the standardization of abbreviations,
and standards for penmanship were encouraged to improve reliability of paper
medical records. Electronic records help with the standardization of forms,
terminology and abbreviations, and data input. Digitization of forms facilitates the
collection of data for epidemiology and clinical studies.

In contrast, HRA (www.imedlogin.com/HRA) can be continuously updated. The ability to exchange records between different HRA systems ("interoperability") would facilitate the co-ordination of healthcare delivery in non-affiliated healthcare facilities. In addition, data from an electronic system can be used anonymously for statistical reporting in matters such as quality improvement, resource management, and public health communicable disease surveillance.
BIBLIOGRAPHY

Books:
System Analysis and Design : James A. Senn
Microsoft SQL Server : Mike Chapple
ASP.NET : Stephen Walther
ASP.NET : Wrox Publication

Websites:
www.msdn.microsoft.com/asp.net
www.quickstarts.asp.net
www.sitepoint.com/article/sql-injection-attacks-safe
