SUBMITTED BY:
TABLE OF CONTENTS
1. PROJECT PROPOSAL………………………………………………………1
2. PROFORMA FOR APPROVAL……………………………………………3
3. GUIDE RESUME……………………………………………………………4-6
4. TITLE OF PROJECT………………………………………………………7
5. INTRODUCTION……………………………………………………………9-11
6. TOOLS/PLATFORM, HARDWARE AND SOFTWARE REQUIREMENT SPECIFICATION………12-17
7. TECHNOLOGY OVERVIEW OF .NET 2005 WITH C# 2.0, SQL SERVER 2005, SDLC
8. ANALYSIS (DFDs, ER DIAGRAMS, CLASS DIAGRAMS ETC., AS PER THE PROJECT REQUIREMENTS)………22-31
9. A COMPLETE STRUCTURE OF THE PROGRAM………32-38
• Number of modules and their description
• Data structures for all modules
• Process logic of each module
• Report generation
10. TESTING AND RESULT………39-51
11. FUTURE ENHANCEMENT……………………………………………52
12. GLOSSARY
ONLINE JOB PORTAL
SYSTEM
(USING ASP.NET THROUGH C# 2.0)
Acknowledgement
This is a great opportunity to acknowledge and to thank all those people without
whose support and help this project would have been impossible. We would like to
add a few heartfelt words for the people who were part of this project in numerous
ways.
Thank you
I hereby declare that the project work done by me is an authentic work, carried out
for the partial fulfillment of the requirements of the course under the guidance of
“NAME OF HOD”. The matter embodied in this project work has not been submitted
earlier for the award of any degree or diploma.
TOOLS:
PLATFORM:
HARDWARE ENVIRONMENT:
SOFTWARE ENVIRONMENT:
The Microsoft .NET Framework version 2.0 extends the .NET Framework
version 1.1 with new features, improvements to existing features, and
enhancements to the documentation. This section provides information
about some key additions and modifications.
For more information about compatibility, see the list of public API
modifications to the class library that might affect the compatibility of your
application.
ASP.NET
You can now customize Web sites and pages in a variety of ways. Profile
properties enable ASP.NET to track property values for individual users
automatically. Using Web Parts, you can create pages that users can
customize in the browser. You can add navigation menus using simple
controls.
For a more complete list of new features in ASP.NET, see What's New in
ASP.NET.
Authenticated Streams
Applications can use the new NegotiateStream and SslStream classes for
authentication and to help secure information transmitted between a client
and a server. These authenticated stream classes support mutual
authentication, data encryption, and data signing. The NegotiateStream
class uses the Negotiate security protocol for authentication. The SslStream
class uses the Secure Socket Layer (SSL) security protocol for
authentication.
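As a minimal sketch of the SslStream class (the host name is a placeholder, and certificate validation is left to the default policy), a client can wrap a TcpClient's network stream like this:

```csharp
using System;
using System.Net.Security;
using System.Net.Sockets;

class SslClientSketch
{
    static void Main()
    {
        // Hypothetical host name; replace with a real server.
        TcpClient client = new TcpClient("server.example.com", 443);
        SslStream ssl = new SslStream(client.GetStream());

        // Authenticate the server over SSL; the string must match
        // the name on the server's certificate.
        ssl.AuthenticateAsClient("server.example.com");

        Console.WriteLine("Encrypted: " + ssl.IsEncrypted);
        ssl.Close();
    }
}
```

All traffic written to the SslStream after authentication is encrypted and signed on the wire.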
Four major enhancements have been made to the classes and tools that
support interoperability with COM.
The new Data Protection API (DPAPI) includes four methods that allow
applications to encrypt passwords, keys, connection strings, and so on,
without calling platform invoke. You can also encrypt blocks of memory on
computers running Windows Server 2003 or later operating systems.
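A brief sketch of DPAPI through the ProtectedData class (the secret value is hypothetical, and the code assumes a reference to System.Security.dll):

```csharp
using System;
using System.Security.Cryptography;
using System.Text;

class DpapiSketch
{
    static void Main()
    {
        // Hypothetical secret; in practice this might be a connection string.
        byte[] secret = Encoding.UTF8.GetBytes("Server=.;Database=Jobs;");

        // Encrypt for the current user, without platform invoke.
        byte[] cipher = ProtectedData.Protect(
            secret, null, DataProtectionScope.CurrentUser);

        // Decrypt later in the same user context.
        byte[] plain = ProtectedData.Unprotect(
            cipher, null, DataProtectionScope.CurrentUser);

        Console.WriteLine(Encoding.UTF8.GetString(plain));
    }
}
```

DataProtectionScope.LocalMachine can be used instead when any account on the machine must be able to decrypt the data.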
You can now control how Visual Studio displays a class or member when an
application is being debugged. The debugger's Display Attributes feature
enables you to identify the most useful information to display in the
debugger.
The .NET Framework 2.0 reintroduces the Edit and Continue feature that
enables a user who is debugging an application in Visual Studio to make
changes to source code while executing in Break mode. After source code
edits are applied, the user can resume code execution and observe the
effect. Furthermore, the Edit and Continue feature is available in any
programming language supported by Visual Studio.
Distributed Computing
In the System.Net namespace, support has been added for FTP client
requests, caching of HTTP resources, automatic proxy discovery, and
obtaining network traffic and statistical information.
The namespace now includes a Web server class that you can use to create
a simple Web server for responding to HTTP requests. Classes that generate
network traffic have been instrumented to output trace information for
application debugging and diagnostics. Security and performance
enhancements have been added to the System.Net.Sockets.Socket and
System.Uri classes.
EventLog Enhancements
You can now use custom DLLs for EventLog messages, parameters, and
categories.
The .NET Framework now supports X.509 certificate stores, chains, and
extensions. In addition, you can sign and verify XML using X.509 certificates
without using platform invoke. There is also support for PKCS7 signature and
encryption, and CMS (a superset of the PKCS7 standard available on
Microsoft Windows Server 2003 and later operating systems). PKCS7 is the
underlying format used in Secure/Multipurpose Internet Mail Extensions
(S/MIME) for signing and encrypting data. For more information, see the
X509Certificate2 class topic.
FTP Support
Applications can now access File Transfer Protocol resources using the
WebRequest, WebResponse, and WebClient classes.
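A minimal sketch of an FTP download through WebRequest (the server and file name are placeholders):

```csharp
using System;
using System.IO;
using System.Net;

class FtpSketch
{
    static void Main()
    {
        // Hypothetical FTP URI; replace with a real server and file.
        FtpWebRequest request = (FtpWebRequest)WebRequest.Create(
            "ftp://ftp.example.com/readme.txt");
        request.Method = WebRequestMethods.Ftp.DownloadFile;

        using (FtpWebResponse response = (FtpWebResponse)request.GetResponse())
        using (StreamReader reader = new StreamReader(response.GetResponseStream()))
        {
            // Print the downloaded file and the server's status line.
            Console.WriteLine(reader.ReadToEnd());
            Console.WriteLine("Status: " + response.StatusDescription);
        }
    }
}
```

Other WebRequestMethods.Ftp values (UploadFile, ListDirectory, DeleteFile) follow the same request/response pattern.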
Generics
The .NET Framework 2.0 introduces generics to allow you to create flexible,
reusable code. Language features collectively known as generics act as
templates that allow classes, structures, interfaces, methods, and delegates
to be declared and defined with unspecified, or generic, type parameters
instead of specific types. Actual types are specified later, when the generic is
used. Several namespaces, such as System and
System.Collections.Generic, provide generic classes and methods. The new
System.Collections.Generic namespace provides support for strongly typed
collections. System.Nullable<T> is a standard representation of optional
values. Generics are supported in three languages: Visual Basic, C#, and C++.
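The ideas above can be sketched briefly; List<T> and Nullable<T> come from the class library, while FirstOrDefault is a hypothetical helper written only for illustration:

```csharp
using System;
using System.Collections.Generic;

class GenericsSketch
{
    // A generic method: the type parameter T is fixed by each caller.
    static T FirstOrDefault<T>(List<T> items)
    {
        return items.Count > 0 ? items[0] : default(T);
    }

    static void Main()
    {
        // A strongly typed collection: no boxing, no casts.
        List<int> scores = new List<int>();
        scores.Add(90);
        scores.Add(75);
        Console.WriteLine(FirstOrDefault(scores));   // prints 90

        // Nullable<int> (written int?) represents an optional value.
        int? bonus = null;
        Console.WriteLine(bonus.HasValue);           // prints False
    }
}
```

The same List<T> class works unchanged for strings, dates, or user-defined types, which is the reuse that generics provide.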
Globalization
• Support for custom cultures enables you to define and deploy culture-
related information as needed. This feature is useful for creating minor
customizations of existing culture definitions, and creating culture
definitions that do not yet exist in the .NET Framework. For more
information, see the CultureAndRegionInfoBuilder class.
I/O Enhancements
Manifest-Based Activation
This feature provides new support for loading and activating applications
through the use of a manifest. Manifest-based activation is essential for
supporting ClickOnce applications. Traditionally, applications are activated
through a reference to an assembly that contains the application's entry
point. For example, clicking an application's .exe file from within the
Windows shell causes the shell to load the common language runtime (CLR)
and call a well-known entry point within that .exe file's assembly.
The manifest-based activation model also invokes an entity called a Trust
Manager, which evaluates the trust and permission requirements declared in the
application's manifest before the application is allowed to run.
Remoting Enhancements
.NET Framework remoting now supports IPv6 addresses and the exchange of
generic types. The classes in the System.Runtime.Remoting.Channels.Tcp
namespace support authentication and encryption using the Security
Support Provider Interface (SSPI). Classes in the new
System.Runtime.Remoting.Channels.Ipc namespace allow applications on
the same computer to communicate quickly without using the network.
Finally, you can now configure the connection cache time-out and the
number of method retries, which can improve the performance of network
load-balanced remote clusters.
HttpListener
You can use the HttpListener class to create a simple Web server that
responds to HTTP requests. The Web server is active for the lifetime of the
HttpListener object and runs within your application, with your application's
permissions. This class is available only on computers running Windows XP
Service Pack 2 or Windows Server 2003.
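A minimal sketch of such a Web server (the port is arbitrary, and only a single request is served before shutting down):

```csharp
using System;
using System.IO;
using System.Net;

class HttpListenerSketch
{
    static void Main()
    {
        // Requires Windows XP SP2 or Windows Server 2003.
        HttpListener listener = new HttpListener();
        listener.Prefixes.Add("http://localhost:8080/");  // hypothetical port
        listener.Start();
        Console.WriteLine("Listening on port 8080...");

        // Block until one request arrives, answer it, then stop.
        HttpListenerContext context = listener.GetContext();
        using (StreamWriter writer =
            new StreamWriter(context.Response.OutputStream))
        {
            writer.Write("<html><body>Hello from HttpListener</body></html>");
        }
        context.Response.Close();
        listener.Stop();
    }
}
```

A real server would call GetContext in a loop (or BeginGetContext) to serve many requests.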
The .NET Framework has two main components: the common language
runtime and the .NET Framework class library. The common language
runtime is the foundation of the .NET Framework. You can think of the
runtime as an agent that manages code at execution time, providing core
services such as memory management, thread management, and remoting,
while also enforcing strict type safety and other forms of code accuracy that
promote security and robustness. In fact, the concept of code management
is a fundamental principle of the runtime. Code that targets the runtime is
known as managed code, while code that does not target the runtime is
known as unmanaged code. The class library, the other main component of
the .NET Framework, is a comprehensive, object-oriented collection of
reusable types that you can use to develop applications ranging from
traditional command-line or graphical user interface (GUI) applications to
applications based on the latest innovations provided by ASP.NET, such as
Web Forms and XML Web services.
While the runtime is designed for the software of the future, it also supports
software of today and yesterday. Interoperability between managed and
unmanaged code enables developers to continue to use necessary COM
components and DLLs.
The runtime is designed to enhance performance. Although the common
language runtime provides many standard runtime services, managed code
is never interpreted. A feature called just-in-time (JIT) compiling enables all
managed code to run in the native machine language of the system on which
it is executing. Meanwhile, the memory manager removes the possibilities of
fragmented memory and increases memory locality-of-reference to further
increase performance.
The .NET Framework class library is a collection of reusable types that tightly
integrate with the common language runtime. The class library is object
oriented, providing types from which your own managed code can derive
functionality. This not only makes the .NET Framework types easy to use,
but also reduces the time associated with learning new features of the .NET
Framework. In addition, third-party components can integrate seamlessly
with classes in the .NET Framework.
For example, you can use the .NET Framework to develop the following types
of applications and services:
• Console applications.
• Windows GUI applications (Windows Forms).
• ASP.NET applications.
• XML Web services.
• Windows services.
The Windows Forms classes contained in the .NET Framework are designed
to be used for GUI development. You can easily create command windows,
buttons, menus, toolbars, and other screen elements with the flexibility
necessary to accommodate shifting business needs.
For example, the .NET Framework provides simple properties to adjust visual
attributes associated with forms. In some cases the underlying operating
system does not support changing these attributes directly, and in these
cases the .NET Framework automatically recreates the forms. This is one of
many ways in which the .NET Framework integrates the developer interface,
making coding simpler and more consistent.
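As an illustrative sketch (the form title and button text are invented for this project), visual attributes are exposed as plain properties:

```csharp
using System;
using System.Drawing;
using System.Windows.Forms;

class FormsSketch
{
    [STAThread]
    static void Main()
    {
        // Adjust visual attributes of a form through simple properties.
        Form form = new Form();
        form.Text = "Online Job Portal";
        form.BackColor = Color.WhiteSmoke;

        Button button = new Button();
        button.Text = "Search Jobs";
        button.Location = new Point(20, 20);
        // C# 2.0 anonymous method as the click handler.
        button.Click += delegate { MessageBox.Show("Searching..."); };

        form.Controls.Add(button);
        Application.Run(form);  // runs the message loop until the form closes
    }
}
```
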
The following illustration shows a basic network schema with managed code
running in different server environments. Servers such as IIS and SQL
Server can perform standard operations while your application logic executes
through the managed code.
ASP.NET is the hosting environment that enables developers to use the .NET
Framework to target Web-based applications. However, ASP.NET is more
than just a runtime host; it is a complete architecture for developing Web
sites and Internet-distributed objects using managed code. Both Web Forms
and XML Web services use IIS and ASP.NET as the publishing mechanism for
applications, and both have a collection of supporting classes in the .NET
Framework.
If you have used earlier versions of ASP technology, you will immediately
notice the improvements that ASP.NET and Web Forms offer. For example,
you can develop Web Forms pages in any language that supports the .NET
Framework. In addition, your code no longer needs to share the same file
with your HTTP text (although it can continue to do so if you prefer). Web
Forms pages execute in native machine language because, like any other
managed application, they take full advantage of the runtime. In contrast,
unmanaged ASP pages are always scripted and interpreted. ASP.NET pages
are faster, more functional, and easier to develop than unmanaged ASP
pages because they interact with the runtime like any managed application.
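The separation of markup from code can be sketched as a hypothetical code-behind class; the matching .aspx file would live separately and reference this class through its Inherits attribute:

```csharp
using System;
using System.Web.UI;
using System.Web.UI.WebControls;

// Hypothetical code-behind for a Default.aspx page. The HTML markup is in
// a separate file that declares a Label control named lblWelcome.
public partial class _Default : Page
{
    protected Label lblWelcome;  // wired up from the .aspx markup

    protected void Page_Load(object sender, EventArgs e)
    {
        if (!IsPostBack)
        {
            // Runs as compiled managed code, not interpreted script.
            lblWelcome.Text = "Welcome to the Online Job Portal";
        }
    }
}
```
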
The .NET Framework also provides a collection of classes and tools to aid in
development and consumption of XML Web services applications. XML Web
services are built on standards such as SOAP (a remote procedure-call
protocol), XML (an extensible data format), and WSDL ( the Web Services
Description Language). The .NET Framework is built on these standards to
promote interoperability with non-Microsoft solutions.
For example, the Web Services Description Language tool included with
the .NET Framework SDK can query an XML Web service published on the
Web, parse its WSDL description, and produce C# or Visual Basic source
code that your application can use to become a client of the XML Web
service. The source code can create classes derived from classes in the class
library that handle all the underlying communication using SOAP and XML
parsing. Although you can use the class library to consume XML Web
services directly, the Web Services Description Language tool and the other
tools contained in the SDK facilitate your development efforts with the .NET
Framework.
If you develop and publish your own XML Web service, the .NET Framework
provides a set of classes that conform to all the underlying communication
standards, such as SOAP, WSDL, and XML. Using those classes enables you
to focus on the logic of your service, without concerning yourself with the
communications infrastructure required by distributed software
development.
Finally, like Web Forms pages in the managed environment, your XML Web
service will run with the speed of native machine language using the scalable
communication of IIS.
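A hypothetical XML Web service for a job portal might look like the sketch below; the [WebService] and [WebMethod] attributes let the framework classes supply the SOAP and WSDL plumbing:

```csharp
using System.Web.Services;

// Hypothetical service and namespace for this project; the communication
// details (SOAP, WSDL, XML) are handled by the framework.
[WebService(Namespace = "http://example.com/jobportal/")]
public class JobService : WebService
{
    [WebMethod]
    public string GetJobTitle(int jobId)
    {
        // Placeholder logic; a real service would query the database.
        return "Job #" + jobId;
    }
}
```

Running the Web Services Description Language tool against this service's WSDL would generate a client proxy class that calls GetJobTitle over SOAP.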
Compilers and tools expose the runtime's functionality and enable you to
write code that benefits from this managed execution environment. Code
that you develop with a language compiler that targets the runtime is called
managed code; it benefits from features such as cross-language integration,
cross-language exception handling, enhanced security, versioning and
deployment support, a simplified model for component interaction, and
debugging and profiling services.
Language compilers and tools expose the runtime's functionality in ways that
are intended to be useful and intuitive to developers. This means that some
features of the runtime might be more noticeable in one environment than in
another. How you experience the runtime depends on which language
compilers or tools you use. For example, if you are a Visual Basic developer,
you might notice that with the common language runtime, the Visual Basic
language has more object-oriented features than before. Following are some
benefits of the runtime:
• Performance improvements.
• The ability to easily use components developed in other languages.
• Extensible types provided by a class library.
• New language features such as inheritance, interfaces, and
overloading for object-oriented programming; support for explicit free
threading that allows creation of multithreaded, scalable applications;
support for structured exception handling and custom attributes.
If you use Microsoft® Visual C++® .NET, you can write managed code using
the Managed Extensions for C++, which provide the benefits of a managed
execution environment as well as access to powerful capabilities and
expressive data types that you are familiar with.
You can also write managed code using the C# language, the language used
for this project, which combines C-style syntax with the benefits of
managed execution.
Microsoft® SQL Server™ 2005 extends the performance, reliability, quality, and ease-of-use of
Microsoft SQL Server version 7.0. Microsoft SQL Server 2005 includes several new features
that make it an excellent database platform for large-scale online transactional processing
(OLTP), data warehousing, and e-commerce applications.
The OLAP Services feature available in SQL Server version 7.0 is now called SQL Server 2005
Analysis Services. The term OLAP Services has been replaced with the term Analysis Services.
Analysis Services also includes a new data mining component.
The Repository component available in SQL Server version 7.0 is now called Microsoft SQL
Server 2005 Meta Data Services. References to the component now use the term Meta Data
Services. The term repository is used only in reference to the repository engine within Meta Data
Services.
The What's New topics contain brief overviews of the new features and links to relevant
conceptual topics that provide more detailed information. These conceptual topics provide links
to topics that describe the commands or statements you use to work with these features.
Microsoft® SQL Server™ 2005 introduces several server improvements and new features:
XML Support
The relational database engine can return data as Extensible Markup Language (XML)
documents. XML can also be used to insert, update, and delete values in the database.
Federated Database Servers
SQL Server 2005 supports enhancements to distributed partitioned views that allow you to
partition tables horizontally across multiple servers. This allows you to scale out one database
server to a group of database servers that cooperate to provide the same performance levels as a
cluster of database servers. This group, or federation, of database servers can support the data
storage requirements of the largest Web sites and enterprise data processing systems.
SQL Server 2005 introduces Net-Library support for Virtual Interface Architecture (VIA)
system-area networks that provide high-speed connectivity between servers, such as between
application servers and database servers.
Indexed Views
Indexed views can significantly improve the performance of an application where queries
frequently perform certain joins or aggregations. An indexed view allows indexes to be created
on views, where the result set of the view is stored and indexed in the database. Existing
applications do not need to be modified to take advantage of the performance improvements of
indexed views.
New Data Types
SQL Server 2005 introduces three new data types. bigint is an 8-byte integer type. sql_variant
is a type that allows the storage of data values of different data types. table is a type that allows
applications to store results temporarily for later use; it is supported for variables, and as the
return type for user-defined functions.
Trigger Enhancements
INSTEAD OF triggers are executed instead of the triggering action (for example, INSERT,
UPDATE, or DELETE). They can also be defined on views, in which case they greatly extend the
types of updates a view can support. AFTER triggers fire after the triggering action. SQL Server
2005 introduces the ability to specify which AFTER triggers fire first and last.
Cascading Referential Integrity Constraints
You can control the actions SQL Server 2005 takes when you attempt to update or delete a key
to which existing foreign keys point. This is controlled by the new ON DELETE and ON
UPDATE clauses in the REFERENCES clause of the CREATE TABLE and ALTER TABLE
statements.
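Assuming hypothetical Jobs and Applications tables for this project, the new clauses can be exercised from ADO.NET:

```csharp
using System;
using System.Data.SqlClient;

class CascadeSketch
{
    static void Main()
    {
        // Hypothetical connection string and table names.
        string connect = "Server=.;Database=JobPortal;Integrated Security=true";
        string ddl =
            "CREATE TABLE Applications (" +
            "  ApplicationId int PRIMARY KEY," +
            "  JobId int REFERENCES Jobs(JobId)" +
            "    ON DELETE CASCADE" +   // deleting a job removes its applications
            "    ON UPDATE CASCADE)";   // key changes propagate automatically

        using (SqlConnection conn = new SqlConnection(connect))
        {
            conn.Open();
            new SqlCommand(ddl, conn).ExecuteNonQuery();
        }
    }
}
```

ON DELETE NO ACTION (the default) would instead reject a delete that leaves orphaned rows.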
Collation Enhancements
SQL Server 2005 replaces code pages and sort orders with collations. SQL Server 2005 includes
support for most collations supported in earlier versions of SQL Server, and introduces a new set
of collations based on Windows collations. You can now specify collations at the database level
or at the column level. Previously, code pages and sort orders could be specified only at the
server level and applied to all databases on a server.
Collations support code page translations. Operations with char and varchar operands having
different code pages are now supported. Code page translations are not supported for text
operands. You can use ALTER DATABASE to change the default collation of a database.
Full-Text Search Enhancements
Full-text search now includes change tracking and image filtering. Change tracking maintains a
log of all changes to the full-text indexed data. You can update the full-text index with these
changes by flushing the log manually, on a schedule, or as they occur, using the background
update index option. Image filtering allows you to index and query documents stored in image
columns. The user provides the document type in a column that contains the file name extension
that the document would have had if it were stored as a file in the file system. Using this
information, full-text search is able to load the appropriate document filter to extract textual
information for indexing.
Multiple Instances
SQL Server 2005 supports running multiple instances of the relational database engine on the
same computer. Each computer can run one instance of the relational database engine from SQL
Server version 6.5 or 7.0, along with one or more instances of the database engine from SQL
Server 2005. Each instance has its own set of system and user databases. Applications can
connect to each instance on a computer in much the same way they connect to instances of SQL
Server running on different computers. The SQL Server 2005 utilities and administration tools
have been enhanced to work with multiple instances.
Index Enhancements
You can now create indexes on computed columns. You can specify whether indexes are built in
ascending or descending order, and if the database engine should use parallel scanning and
sorting during index creation.
The CREATE INDEX statement can now use the tempdb database as a work area for the sorts
required to build an index. This results in improved disk read and write patterns for the index
creation step, and makes it more likely that index pages will be allocated in contiguous strips. In
addition, the complete process of creating an index is eligible for parallel operations, not only the
initial table scan.
Failover Clustering Enhancements
The administration of failover clusters has been greatly improved to make it very easy to install,
configure, and maintain a Microsoft SQL Server 2005 failover cluster. Additional enhancements
include the ability to failover and failback to or from any node in a SQL Server 2005 cluster, the
ability to add or remove a node from the cluster through SQL Server 2005 Setup, and the ability
to reinstall or rebuild a cluster instance on any node in the cluster without affecting the other
cluster node instances. The SQL Server 2005 utilities and administration tools have been
enhanced to work with failover clusters.
Net-Library Enhancements
The SQL Server 2005 Net-Libraries have been rewritten to virtually eliminate the need to
administer Net-Library configurations on client computers when connecting SQL Server 2005
clients to instances of SQL Server 2005. The new Net-Libraries also support connections to
multiple instances of SQL Server on the same computer, and support Secure Sockets Layer
encryption over all Net-Libraries. SQL Server 2005 introduces Net-Library support for Virtual
Interface Architecture (VIA) system-area networks that provide high-speed connectivity between
servers, such as between application servers and database servers.
Large Memory Support
Microsoft SQL Server 2005 Enterprise Edition can use the Address Windowing Extensions
(AWE) API to support up to 64 GB of physical memory (RAM) on a computer.
Distributed Query Enhancements
SQL Server 2005 introduces a new OPENDATASOURCE function, which you can use to
specify ad hoc connection information in a distributed query. SQL Server 2005 also specifies
methods that OLE DB providers can use to report the level of SQL syntax supported by the
provider and statistics on the distribution of key values in the data source. The distributed query
optimizer can then use this information to reduce the amount of data that has to be sent from the
OLE DB data source. SQL Server 2005 delegates more SQL operations to OLE DB data sources
than earlier versions of SQL Server. Distributed queries also support the other functions
introduced in SQL Server 2005, such as multiple instances, mixing columns with different
collations in result sets, and the new bigint and sql_variant data types.
SQL Server 2005 distributed queries add support for the OLE DB Provider for Exchange and the
Microsoft OLE DB Provider for Microsoft Directory Services.
SQL Server 2005 introduces enhancements to distributed partitioned views. You can partition
tables horizontally across several servers, and define a distributed partitioned view on each
member server that makes it appear as if a full copy of the original table is stored on each server.
Groups of servers running SQL Server that cooperate in this type of partitioning are called
federations of servers. A database federation built using SQL Server 2005 databases is capable of
supporting the processing requirements of the largest Web sites or enterprise-level databases.
Kerberos and Security Delegation
SQL Server 2005 uses Kerberos to support mutual authentication between the client and the
server, as well as the ability to pass the security credentials of a client between computers, so that
work on a remote server can proceed using the credentials of the impersonated client. On
Kerberos-enabled versions of Microsoft Windows®, SQL Server 2005 uses Kerberos and
delegation to support both integrated authentication and SQL Server logins.
Backup and Restore Enhancements
SQL Server 2005 introduces a new, more easily understood model for specifying backup and
restore options. The new model makes it clearer that you are balancing increased or decreased
exposure to losing work against the performance and log-space requirements of different plans.
SQL Server 2005 introduces support for recovery to specific points of work using named log
marks in the transaction log, and the ability to do partial database restores.
Users can define passwords for backup sets and media sets that prevent unauthorized users from
accessing SQL Server backups.
SQL Server 2005 enhancements for utility operations include faster differential backups, parallel
Database Console Command (DBCC) checking, and parallel scanning. Differential backups can
now be completed in a time that is proportional to the amount of data changed since the last full
backup. DBCC can be run without taking shared table locks while scanning tables, thereby
enabling them to be run concurrently with update activity on tables. Additionally, DBCC now
takes advantage of multiple processors, thus enabling near-linear gain in performance in relation
to the number of CPUs (provided that I/O is not a bottleneck).
Text in Row Data
SQL Server 2005 supports a new text in row table option that specifies that small text, ntext,
and image values be placed directly in the data row instead of in a separate page. This reduces
the amount of space used to store small text, ntext, and image data values, and reduces the
amount of disk I/O needed to process these values.
The Microsoft® SQL Server™ 2005 relational database engine natively supports Extensible
Markup Language (XML).
You can now access SQL Server 2005 over HTTP using a Universal Resource Locator (URL).
You can define a virtual root on a Microsoft Internet Information Services (IIS) server, which
gives you HTTP access to the data and XML functionality of SQL Server 2005.
You can use HTTP, ADO, or OLE DB to work with the XML functionality of SQL Server 2005:
• You can define XML views of SQL Server 2005 databases by annotating XML-Data
Reduced (XDR) schemas to map the tables, views, and columns that are associated with
the elements and attributes of the schema. The XML views can then be referenced in
XPath queries, which retrieve results from the database and return them as XML
documents.
• The results of SELECT statements can be returned as XML documents. The SQL Server
2005 Transact-SQL SELECT statement supports a FOR XML clause that specifies that
the statement results be returned in the form of an XML document instead of a relational
result set. Complex queries, or queries that you want to make secure, can be stored as
templates in an IIS virtual root, and executed by referencing the template name.
• You can expose the data from an XML document as a relational rowset using the new
OPENXML rowset function. OPENXML can be used everywhere a rowset function can
be used in a Transact-SQL statement, such as in place of a table or view reference in a
FROM clause. This allows you to use the data in XML documents to insert, update, or
delete data in the tables of the database, including modifying multiple rows in multiple
tables in a single operation.
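Assuming a hypothetical Jobs table on a local server, a FOR XML query can be consumed from C# with ExecuteXmlReader:

```csharp
using System;
using System.Data.SqlClient;
using System.Xml;

class ForXmlSketch
{
    static void Main()
    {
        // Hypothetical connection string and table for this project.
        string connect = "Server=.;Database=JobPortal;Integrated Security=true";
        using (SqlConnection conn = new SqlConnection(connect))
        {
            conn.Open();

            // FOR XML AUTO returns the result set as an XML fragment
            // instead of relational rows.
            SqlCommand cmd = new SqlCommand(
                "SELECT JobId, Title FROM Jobs FOR XML AUTO", conn);

            using (XmlReader reader = cmd.ExecuteXmlReader())
            {
                reader.MoveToContent();
                while (reader.NodeType == XmlNodeType.Element)
                {
                    // One <Jobs JobId="..." Title="..."/> element per row.
                    Console.WriteLine(reader.ReadOuterXml());
                }
            }
        }
    }
}
```

The same query issued without the FOR XML clause would come back through SqlDataReader as an ordinary relational result set.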
Microsoft® SQL Server™ 2005 introduces these graphical administration improvements and
new features:
Log Shipping
Log shipping allows the transaction logs from a source database to be continually backed up and
loaded into a target database on another server. This is useful for maintaining a warm standby
server, or for offloading query processing from the source server to a read-only destination
server. For more information, see Log Shipping.
SQL Profiler Enhancements
SQL Profiler now supports size-based and time-based traces, and includes new events for Data
File Auto Grow, Data File Auto Shrink, Log File Auto Grow, Log File Auto Shrink, Show Plan
All, Show Plan Statistics, and Show Plan Text.
SQL Profiler has been enhanced to provide auditing of SQL Server activities, up to the auditing
levels required by the C2 level of security defined by the United States government.
SQL Query Analyzer includes a stored procedure debugger. SQL Query Analyzer also includes
templates that can be used as starting points for creating objects such as databases, tables,
views, and stored procedures.
Users can run the Copy Database Wizard to upgrade SQL Server version 7.0 databases to SQL
Server 2005 databases. It can also be used to copy complete databases between instances of SQL
Server 2005.
Replication Enhancements
Microsoft® SQL Server™ 2005 introduces the following replication improvements and new
features:
Implementing Replication
SQL Server 2005 enhances snapshot replication, transactional replication, and merge replication
by adding:
• Alternate snapshot locations, which provide easier and more flexible methods for
applying the initial snapshot to Subscribers. You can save (and compress) the snapshot
files to a network location or removable media, which can then be transferred to
Subscribers without using the network.
• Attachable subscription databases, which allow you to transfer a database with replicated
data and one or more subscriptions from one Subscriber to another SQL Server. After the
database is attached to the new Subscriber, the subscription database at the new
Subscriber will automatically receive its own pull subscriptions to the publications at the
specified Publishers.
• Schema changes on publication databases, which allow you to add or drop columns on
the publishing table and propagate those changes to Subscribers.
• On demand script execution, which allows you to post a general SQL script that will be
executed at all Subscribers.
• Pre- and post-snapshot scripts, which allow you to run scripts before or after a snapshot is
applied at the Subscriber.
• Remote agent activation, which allows you to reduce the amount of processing on the
Distributor or Subscriber by running the Distribution Agent or Merge Agent on one
computer while activating that agent from another computer. You can use remote agent
activation with push or pull subscriptions.
• Support of new SQL Server features, which includes user-defined functions, indexed
views, new data types, and multiple instances of SQL Server.
• More snapshot scripting options, which support transfer of indexes, extended properties,
and constraints to Subscribers.
Merge Replication
Merge replication is the process of distributing data from Publisher to Subscribers, allowing the
Publisher and Subscribers to make updates while connected or disconnected, and then merging
the changes between sites when they are connected. Enhancements to merge replication include:
• Greater parallelism of the Merge Agent for improved server-to-server performance.
• Dynamic snapshots, which provide more efficient application of the initial snapshot when
using dynamic filters.
• The ability to use alternate synchronization partners when synchronizing data. Using
alternate synchronization partners, a Subscriber to a merge publication can synchronize
with any specified server that has the same data as the original Publisher.
• Several new merge replication conflict resolvers including interactive resolvers that
provide a user interface for immediate, manual conflict resolution, priority based on a
column value, minimum/maximum value wins, first/last change wins, additive/average
value, and merge by appending different text values.
• New COM interfaces that support heterogeneous data sources as Publishers within a SQL
Server replication topology.
Transactional Replication
With transactional replication, an initial snapshot of data is applied at Subscribers, and then when
data modifications are made at the Publisher, the individual transactions are captured and
propagated to Subscribers. Enhancements to transactional replication include:
• Improved error handling and the ability to skip specified errors and continue replication.
• The option to store data modifications made at the Subscriber in a queue (queued
updating).
Queued Updating
Queued updating allows snapshot replication and transactional replication Subscribers to modify
published data without requiring an active network connection to the Publisher.
When you create a publication with the queued updating option enabled and a Subscriber
performs INSERT, UPDATE, or DELETE statements on published data, the changes are stored
in a queue. The queued transactions are applied asynchronously at the Publisher when network
connectivity is restored.
Because the updates are propagated asynchronously to the Publisher, the same data may have
been updated by the Publisher or by another Subscriber and conflicts can occur when applying
the updates. Conflicts are detected automatically and several options for resolving conflicts are
offered.
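The queued updating option described above is enabled when the publication is created. A minimal Transact-SQL sketch, assuming a hypothetical Sales database and publication name (other sp_addpublication parameters are left at their defaults):

```sql
-- Hedged sketch: the database and publication names are illustrative.
USE Sales
EXEC sp_addpublication
    @publication       = N'SalesPub',
    @allow_queued_tran = N'true',     -- Subscribers may queue INSERT/UPDATE/DELETE while offline
    @conflict_policy   = N'pub wins'  -- Publisher data wins when a conflict is detected
```

With @conflict_policy set to 'pub wins', conflicting Subscriber changes are rolled back when the queued transactions are applied at the Publisher.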
Using transformable subscriptions in your replication topology allows you to customize and send
published data based on the requirements of individual Subscribers, including performing data
type mappings, column manipulations, string manipulations, and use of functions as data is
published.
Replication Usability
There have been several improvements in SQL Server Enterprise Manager that provide for easier
implementation, monitoring, and administration of replication. Enhancements to replication
usability include:
• A centralized Replication folder in the SQL Server Enterprise Manager tree, which
organizes all subscriptions and publications on the server being administered.
• The ability to browse for and subscribe to publications (when permission is allowed)
using Windows Active Directory.
• The ability to see multiple Distributors in a single monitoring node in SQL Server
Enterprise Manager.
• Standard and advanced replication options separated in the Create Publication, Create
Push Subscription, and Create Pull Subscription Wizards. You can choose to show
advanced options in these wizards on the Welcome page of each wizard.
• New wizards for creating jobs that create dynamic snapshots for merge publications that
use dynamic filters (Create Dynamic Snapshot Job Wizard), and for transforming
published data in snapshot replication or transactional replication (Transform Published
Data Wizard).
Microsoft® SQL Server™ 2005 introduces these Data Transformation Services (DTS)
enhancements and new features:
New DTS custom tasks, available through DTS Designer or the DTS object model, allow you to
create DTS packages that perform tasks or set variables based on the properties of the run-time
environment. Use these tasks to:
• Import data from, and send data and completed packages to, Internet and File Transfer
Protocol (FTP) sites.
DTS package logs save information for each package execution, allowing you to maintain a
complete execution history. You can also view execution information for individual processes
within a task.
You can generate exception files for transformation tasks. When you log to exception files, you
can save source and destination error rows to a file through the DTS OLE DB text file provider
and re-process the error rows.
DTS packages now can be saved to a Microsoft® Visual Basic® file. This allows a package
created by the DTS Import/Export Wizard or DTS Designer to be incorporated into Visual Basic
programs or to be used as prototypes by Visual Basic developers who need to reference the
components of the DTS object model.
A new multiphase data pump allows advanced users to customize the operation of the data pump
at various stages of its operation. You can now use global variables as input and output
parameters for queries.
You can now use parameterized source queries in a DTS transformation task and an Execute
SQL task. In addition, DTS includes an option for saving the results of a parameterized query to
a global variable, allowing you to perform functions such as saving disconnected Microsoft
ActiveX® Data Objects (ADO) recordsets in DTS.
You now can use the Execute Package task to dynamically assign the values of global variables
from a parent package to a child package. Use global variables to pass information from one
package to another when each package performs different work items. For example, use one
package to download data on a nightly basis, summarize the data, assign summary data values to
global variables, and pass the values to another package that further processes the data.
Working with Named and Multiple Instances of SQL Server
2005
With Microsoft® SQL Server™ 2005, you have the option of installing multiple copies, or
instances, of SQL Server on one computer. When setting up a new installation of SQL Server
2005 or maintaining an existing installation, you can specify it as either a default instance or a
named instance.
A default instance is identified by the network name of the computer on which it is running.
Applications using client software from earlier versions of SQL Server can connect to a default
instance. SQL Server version 6.5 or SQL Server version 7.0 servers can operate as default
instances. However, a computer can have only one version functioning as the default instance at
a time.
A named instance is identified by the network name of the computer plus an instance name, in the
format <computername>\<instancename>. Applications must use SQL Server 2005 client
components to connect to a named instance. A computer can run any number of named instances
of SQL Server concurrently. A named instance can run at the same time as an existing
installation of SQL Server version 6.5 or SQL Server version 7.0. The instance name cannot
exceed 16 characters.
A new instance name must begin with a letter, an ampersand (&), or an underscore (_),
and can contain numbers, letters, or other characters. SQL Server sysnames and reserved
names should not be used as instance names. For example, the term "default" should not
be used as an instance name because it is a reserved name used by Setup.
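A connected session can check which instance it reached; SERVERPROPERTY returns NULL for a default instance and the instance name otherwise:

```sql
SELECT SERVERPROPERTY('InstanceName') AS InstanceName,  -- NULL on a default instance
       @@SERVERNAME                   AS ServerName     -- <computername>\<instancename> for a named instance
```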
Single and multiple instances of SQL Server 2005 (default or named) are available using the
SQL Server 2005 Personal Edition, the SQL Server 2005 Standard Edition, or the SQL Server
2005 Enterprise Edition.
Default Instances
You cannot install a default instance of SQL Server 2005 on a computer that is also running SQL
Server 7.0. You must either upgrade the SQL Server 7.0 installation to a default instance of SQL
Server 2005, or keep the default instance of SQL Server 7.0 and install a named instance of SQL
Server 2005.
You can install a default instance of SQL Server 2005 on a computer running SQL Server 6.5,
but the SQL Server 6.5 installation and the default instance of SQL Server 2005 cannot be
running at the same time. You must switch between the two using the SQL Server 2005 vswitch
command prompt utility.
Multiple Instances
Multiple instances occur when you have more than one instance of SQL Server 2005 installed on
one computer. Each instance operates independently from any other instance on the same
computer, and applications can connect to any of the instances. The number of instances that can
run on a single computer depends on resources available. The maximum number of instances
supported in SQL Server 2005 is 16.
When you install SQL Server 2005 on a computer with no existing installations of SQL Server,
Setup specifies the installation of a default instance. However, you can choose to install SQL
Server 2005 as a named instance instead by clearing the Default option in the Instance Name
dialog box.
A named instance of SQL Server 2005 can be installed at any time: before installing the default
instance of SQL Server 2005, after installing the default instance of SQL Server 2005, or instead
of installing the default instance of SQL Server 2005.
Each named instance is made up of a distinct set of services and can have completely different
settings for collations and other options. The directory structure, registry structure, and service
names all reflect the specific instance name you specify.
Working with Earlier Versions of SQL Server
Multiple instances in Microsoft® SQL Server™ 2005 offer enhanced ways to work with earlier
versions of Microsoft SQL Server already installed on your computer. You can leave previous
installations intact, and also install and run SQL Server 2005. For example, you can run SQL
Server version 7.0 and a named instance of SQL Server 2005 at the same time, or you can run
SQL Server version 6.5 in a version switch configuration with SQL Server 2005. If you need to
have three different versions of SQL Server installed on the same computer, there are several
ways to accomplish this.
In addition, users of all editions of SQL Server can have more than one instance of SQL Server
2005 installed and running at once (multiple instances), as well as one or more earlier versions.
Considerations for using SQL Server 2005 in combination with previous installations include:
• Using SQL Server 6.5 with the default instance or named instances of SQL Server 2005.
• Running SQL Server 7.0 with a named instance of SQL Server 2005.
• Working with three versions of SQL Server: SQL Server 6.5, SQL Server 7.0, and SQL
Server 2005.
Note The concept of the default instance is new to SQL Server 2005, due to the introduction
of multiple instances. If installed on the same computer as SQL Server 2005, either SQL
Server version 6.5 or SQL Server version 7.0 can function as default instances of SQL
Server. (A default instance is identified by the network name of the computer on which it is
running.)
When you keep Microsoft SQL Server version 7.0 on your computer and install a named
instance of SQL Server 2005, SQL Server Books Online for SQL Server 7.0 remains in its
original location: C:\Mssql7\Books. In this side-by-side configuration, Books Online for SQL
Server 7.0 remains accessible from the start menu in the SQL Server 7.0 program group.
Note This is an exception to what occurs for the other shared tools (such as code samples,
scripts, and templates), when a named instance of SQL Server 2005 is installed along with
SQL Server 7.0. All other shared tools from the 7.0 installation are copied to storage
locations, with pointers to the SQL Server 2005 tools replacing previous versions of the
tools. Files for Books Online for SQL Server 7.0 are not redirected in this way -- they remain
ready for use.
When SQL Server 7.0 is upgraded to the default version of SQL Server 2005, the 7.0 Books
Online files are also upgraded. That is, they are replaced with the SQL Server 2005 Books
Online.
Whether you have SQL Server 7.0 installed or not, you can access information in the SQL Server
7.0 documentation.
Fundamentals of SQL Server 2005 Architecture
Microsoft® SQL Server™ 2005 is a family of products that meet the data storage requirements
of the largest data processing systems and commercial Web sites, yet at the same time can
provide easy-to-use data storage services to an individual or small business.
The data storage needs of a modern corporation or government organization are very complex.
Some examples are:
• Increasing numbers of corporations are implementing large Web sites as a mechanism for
their customers to enter orders, contact the service department, get information about
products, and for many other tasks that previously required contact with employees.
These sites require data storage that is secure, yet tightly integrated with the Web.
• Organizations are implementing off-the-shelf software packages for critical services such
as human resources planning, manufacturing resources planning, and inventory control.
These systems require databases capable of storing large amounts of data and supporting
large numbers of users.
• Organizations have many users who must continue working when they do not have
access to the network. Examples are mobile disconnected users, such as traveling sales
representatives or regional inspectors. These users must synchronize the data on a
notebook or laptop with the current data in the corporate system, disconnect from the
network, record the results of their work while in the field, and then finally reconnect
with the corporate network and merge the results of their fieldwork into the corporate
data store.
• Independent Software Vendors (ISVs) must be able to distribute data storage capabilities
with applications targeted at individuals or small workgroups. This means the data
storage mechanism must be transparent to the users who purchase the application. This
requires a data storage system that can be configured by the application and then tunes itself
automatically, so that users do not need dedicated database administrators to constantly monitor
and tune the application.
Extensible Markup Language (XML) is a markup language used to describe the contents of a set
of data and how the data should be output to a device or displayed in a Web page. Markup
languages originated as ways for publishers to indicate to printers how the content
of a newspaper, magazine, or book should be organized. Markup languages for electronic data
perform the same function for electronic documents that can be displayed on different types of
electronic gear.
Both XML and the Hypertext Markup Language (HTML) are derived from Standard Generalized
Markup Language (SGML). SGML is a very large, complex language that is difficult to fully use
for publishing data on the Web. HTML is a simpler, more specialized markup language than
SGML, but it has a number of limitations when working with data on the Web. XML is smaller
than SGML and more robust than HTML, so it is becoming an increasingly important language
for the exchange of electronic data through the Web or intracompany networks.
In a relational database such as Microsoft® SQL Server™ 2005, all operations on the tables in
the database produce a result in the form of a table. The result set of a SELECT statement is in
the form of a table. Traditional client/server applications that execute a SELECT statement
process the results by fetching one row or block of rows from the tabular result set at a time and
mapping the column values into program variables. Web application programmers, on the other
hand, are more familiar with working with hierarchical representations of data in XML or HTML
documents.
SQL Server 2005 introduces support for XML. These new features include:
• Support for XML-Data schemas and the ability to specify XPath queries against these
schemas.
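The FOR XML clause is the most visible of these features: it returns a query result as an XML fragment rather than a rowset. A sketch against a hypothetical Jobs table from the job portal database:

```sql
-- Each row is returned as a <Jobs JobID="..." Title="..." Location="..." /> element.
SELECT JobID, Title, Location
FROM   Jobs
FOR XML AUTO
```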
Database Architecture
Microsoft® SQL Server™ 2005 data is stored in databases. The data in a database is organized
into the logical components visible to users. A database is also physically implemented as two or
more files on disk.
When using a database, you work primarily with the logical components such as tables, views,
procedures, and users. The physical implementation of files is largely transparent. Typically,
only the database administrator needs to work with the physical implementation.
Each instance of SQL Server has four system databases (master, model, tempdb, and msdb)
and one or more user databases. Some organizations have only one user database, containing all
the data for their organization. Some organizations have different databases for each group in
their organization, and sometimes a database used by a single application. For example, an
organization could have one database for sales, one for payroll, one for a document management
application, and so on. Sometimes an application uses only one database; other applications may
access several databases.
It is not necessary to run multiple copies of the SQL Server database engine to allow multiple
users to access the databases on a server. An instance of the SQL Server Standard or Enterprise
Edition is capable of handling thousands of users working in multiple databases at the same time.
Each instance of SQL Server makes all databases in the instance available to all users that
connect to the instance, subject to the defined security permissions.
When connecting to an instance of SQL Server, your connection is associated with a particular
database on the server. This database is called the current database. You are usually connected to
a database defined as your default database by the system administrator, although you can use
connection options in the database APIs to specify another database. You can switch from one
database to another using either the Transact-SQL USE database_name statement, or an API
function that changes your current database context.
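For example, switching the current database and confirming the context (JobPortal is a hypothetical database name):

```sql
USE JobPortal           -- change the current database for this connection
SELECT DB_NAME()        -- returns the name of the current database
```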
SQL Server 2005 allows you to detach databases from an instance of SQL Server, then reattach
them to another instance, or even attach the database back to the same instance. If you have a
SQL Server database file, you can tell SQL Server when you connect to attach that database file
with a specific database name.
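A hedged sketch of moving a database this way, assuming hypothetical file paths; the system stored procedures sp_detach_db and sp_attach_db do the work:

```sql
EXEC sp_detach_db @dbname = N'JobPortal'
-- Copy JobPortal.mdf and JobPortal_log.ldf to the target server, then on that instance:
EXEC sp_attach_db
    @dbname    = N'JobPortal',
    @filename1 = N'C:\Data\JobPortal.mdf',      -- primary data file
    @filename2 = N'C:\Data\JobPortal_log.ldf'   -- transaction log file
```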
Using English Query, you can turn your relational databases into English Query applications,
which allow end users to pose questions in English instead of forming a query with an SQL
statement.
The English Query Model Editor appears within the Microsoft® Visual Studio® version 6.0
development environment. From there, you can choose one of the English Query project wizards,
the SQL Project Wizard or the OLAP Project Wizard, to automatically create an English Query
project and model. After the basic model is created, you can refine, test, and compile it into an
English Query application (*.eqd), and then deploy it (for example, to the Web).
Creating an English Query Project and Model
Using the SQL Project Wizard or the OLAP Project Wizard, you incorporate the database
structure (table names, field names, keys, and joins) or cube information of the database into a
project and a model.
A model contains all the information needed for an English Query application, including the
database structure, or schema, of the underlying SQL database or cube and the semantic objects
(entities and relationships). You also define properties for an application and add entries to the
English Query dictionary, as well as manually add and modify entities and relationships while
testing questions and set other options to expand the model.
Creating Entities and Relationships
With the wizards, semantic objects are automatically created for the model. These include
entities and relationships (with phrasings such as customers buy products or Customer_Names
are the names of customers). Entities are usually represented by tables, fields, and OLAP
objects.
An entity is a real-world object, referred to by a noun (person, place, thing, or idea), for example:
customers, cities, products, shipments, and so forth. In databases, entities are usually
represented by tables, fields, and Analysis Services objects.
Relationships describe what the entities have to do with one another, for example: customers
purchase products. Command relationships are not represented in the database but refer to
actions to be executed. For example, a command to a compact disc player can allow requests
such as "Play the album with song X on it."
Deploying an English Query Application
You can deploy an English Query application in several ways, including within a Microsoft
Visual Basic® or Microsoft Visual C++® application and on a Web page running on Microsoft
Internet Information Services (IIS). In the Web scenario, the interface of the application is with a
set of Active Server Pages (ASP).
Meta Data Services Overview
Microsoft® SQL Server™ 2005 Meta Data Services is an object-oriented repository technology
that can be integrated with enterprise information systems or with applications that process meta
data.
A number of Microsoft technologies use Meta Data Services as a native store for object
definitions or as a platform for deploying meta data. One of the ways in which SQL Server 2005
uses Meta Data Services is to store versioned Data Transformation Services (DTS) packages. In
Microsoft Visual Studio®, Meta Data Services supports the exchange of model data with other
development tools.
You can use Meta Data Services for your own purposes: as a component of an integrated
information system, as a native store for custom applications that process meta data, or as a
storage and management service for sharing reusable models. You can also extend Meta Data
Services to provide support for new tools for resale or customize it to satisfy internal tool
requirements.
Troubleshooting Overview
As a starting point to troubleshooting a problem in Microsoft® SQL Server™ 2005, you may
find the solution in one of the online troubleshooters from SQL Server Product Support Services
(PSS). For more information, see Online Troubleshooters from PSS. In addition, review current
error logs for information that may pinpoint the problem. Other current information about
troubleshooting SQL Server 2005 can be found on the FAQs & Highlights for SQL Server page,
available at the Microsoft Web site.
Error Logs
The error log in SQL Server 2005 provides complete information about events in SQL Server.
You may also want to view the Microsoft Windows® 2000 or Windows NT® 4.0 application
log, which provides an overall picture of events that occur on the Windows NT 4.0 and Windows
2000 operating systems, as well as events in SQL Server and SQL Server Agent. Both logs
include informational messages (such as startup data), and both record the date and time of all
events automatically.
SQL Server events are logged according to the way you start SQL Server.
• When SQL Server is started as a service under the Windows 2000 or Windows NT 4.0
operating system, events are logged to the SQL Server error log, to the Windows 2000 or
Windows NT application log, or to both logs.
• When SQL Server is started from the command prompt, events are logged to the SQL
Server error log and to standard output (typically the monitor, unless output has been
redirected elsewhere).
If you encounter a problem regarding compatibility between SQL Server 2005 and earlier
versions of SQL Server, see SQL Server 2005 and SQL Server version 7.0 and SQL Server 2005
and SQL Server version 6.5. These topics include a detailed list of the feature changes between
SQL Server 6.5 and SQL Server 2005.
Additional Resources
For access to the Microsoft Knowledge Base and other current information, a subscription to
Microsoft TechNet or MSDN® can be helpful.
Numerous links to Microsoft Product Support Services (PSS) Web pages are provided in the
Troubleshooting topics. Links to the new online troubleshooters, as well as pertinent Microsoft
Knowledge Base articles and white papers, are also available. Every effort has been made to
ensure the Web links are correct and will remain stable over time. However, if a link does not
work, go to the MSDN Online Support Web page at the Microsoft Web site, and navigate to the
correct location.
1. Initiation Phase
The initiation of a system (or project) begins when a business need or opportunity is
identified. A Project Manager should be appointed to manage the project. This business need is
documented in a Concept Proposal. After the Concept Proposal is approved, the System Concept
Development Phase begins.
2. System Concept Development Phase
Once a business need is approved, the approaches for accomplishing the concept are
reviewed for feasibility and appropriateness. The Systems Boundary Document
identifies the scope of the system and requires Senior Official approval and funding
before beginning the Planning Phase.
3. Planning Phase
The concept is further developed to describe how the business will operate once the approved
system is implemented, and to assess how the system will impact employee and customer
privacy. To ensure the products and /or services provide the required capability on-time and
within budget, project resources, activities, schedules, tools, and reviews are defined.
Additionally, security certification and accreditation activities begin with the identification of
system security requirements and the completion of a high level vulnerability assessment.
4. Requirements Analysis Phase
Functional user requirements are formally defined and delineate the requirements in terms
of data, system performance, security, and maintainability requirements for the system.
All requirements are defined to a level of detail sufficient for systems design to proceed.
All requirements need to be measurable and testable and relate to the business need or
opportunity identified in the Initiation Phase.
5. Design Phase
The physical characteristics of the system are designed during this phase. The operating
environment is established, major subsystems and their inputs and outputs are defined, and
processes are allocated to resources. Everything requiring user input or approval must be
documented and reviewed by the user. The physical characteristics of the system are specified
and a detailed design is prepared. Subsystems identified during design are used to create a
detailed structure of the system. Each subsystem is partitioned into one or more design units or
modules. Detailed logic specifications are prepared for each software module.
6. Development Phase
The detailed specifications produced during the design phase are translated into hardware,
communications, and executable software. Software shall be unit tested, integrated, and retested
in a systematic manner. Hardware is assembled and tested.
7. Integration and Test Phase
The various components of the system are integrated and systematically tested. The user tests the
system to ensure that the functional requirements, as defined in the functional requirements
document, are satisfied by the developed or modified system. Prior to installing and operating the
system in a production environment, the system must undergo certification and accreditation
activities.
8. Implementation Phase
The system or system modifications are installed and made operational in a production
environment. The phase is initiated after the system has been tested and accepted by the user.
This phase continues until the system is operating in production in accordance with the defined
user requirements.
9. Operations and Maintenance Phase
The system operation is ongoing. The system is monitored for continued performance in
accordance with user requirements, and needed system modifications are incorporated. The
operational system is periodically assessed through In-Process Reviews to determine how the
system can be made more efficient and effective. Operations continue as long as the system can
be effectively adapted to respond to an organization’s needs. When modifications or changes are
identified as necessary, the system may reenter the planning phase.
10. Disposition Phase
The disposition activities ensure the orderly termination of the system and preserve the vital
information about the system so that some or all of the information may be reactivated in the
future if necessary. Particular emphasis is given to proper preservation of the data processed by
the system, so that the data is effectively migrated to another system or archived in accordance
with applicable records management regulations and policies, for potential future access.
SDLC Objectives
This guide was developed to disseminate proven practices to system developers, project
managers, program/account analysts and system owners/users throughout the DOJ. The specific
objectives expected include the following:
Key Principles
This guidance document refines traditional information system life cycle management
approaches to reflect the principles outlined in the following subsections. These are the
foundations for life cycle management.
1. Life Cycle Management Should be Used to Ensure a Structured Approach to
Information Systems Development, Maintenance, and Operation
The establishment of an Integrated Product Team (IPT) can aid in the success of a project. An
IPT is a multidisciplinary group of people who support the Project Manager in the planning,
execution, delivery and implementation of life cycle decisions for the project. The IPT is
composed of qualified empowered individuals from all appropriate functional disciplines that
have a stake in the success of the project. Working together in a proactive, open communication,
team oriented environment can aid in building a successful project and providing decision
makers with the necessary information to make the right decisions at the right time.
2. Each System Project must have a Program Sponsor
To help ensure effective planning, management, and commitment to information systems, each
project must have a clearly identified program sponsor. The program sponsor serves in a
leadership role, providing guidance to the project team and securing, from senior management,
the required reviews and approvals at specific points in the life cycle. An approval from senior
management is required after the completion of the first seven of the SDLC phases, annually
during Operations and Maintenance Phase and six-months after the Disposition Phase. Senior
management approval authority may be varied based on dollar value, visibility level,
congressional interests or a combination of these.
The program sponsor is responsible for identifying who will be responsible for formally
accepting the delivered system at the end of the Implementation Phase.
3. A Single Project Manager must be Selected for Each System
Project
The Project Manager has responsibility for the success of the project and works through a
project team and other supporting organization structures, such as working groups or user
groups, to accomplish the objectives of the project. Regardless of organizational affiliation, the
Project Manager is accountable and responsible for ensuring that project activities and decisions
consider the needs of all organizations that will be affected by the system. The Project Manager
develops a project charter to define and clearly identify the lines of authority between and within
the agency’s executive management, program sponsor, (user/customer), and developer for
purposes of management and oversight.
Certain roles are considered vital to a successful system project and at least one individual
must be designated as responsible for each key role. Assignments may be made on a full- or part-
time basis as appropriate. Key roles include program/functional management, quality assurance,
logistics, financial, systems engineering, test and evaluation, contracts management, and
configuration management. For most projects, more than one individual should represent the
actual or potential users of the system (that is, program staff) and should be designated by the
FEASIBILITY STUDY
A feasibility study is conducted to select the best system that meets performance
requirements. This entails the identification and description of candidate systems, an evaluation
of each candidate, and the selection of the best system for the job. The system's required
performance is defined by a statement of constraints, the identification of specific system
objectives, and a description of outputs.
The key considerations in feasibility analysis are:
1. Economic Feasibility :
2. Technical Feasibility :
3. Operational Feasibility:
Economic Feasibility
The computers in the organization are sufficiently capable and do not need extra components to
load the software. Hence the organization can implement the new system without any additional
investment.
The result of the feasibility study is a formal proposal. This is simply a report: a formal document
detailing the nature and scope of the proposed solution. The proposal summarizes what is
known and what is going to be done. Three key considerations are involved in the feasibility
analysis:
2.3.1 Economic Feasibility: Economic analysis is the most frequently used method for
evaluating the effectiveness of a candidate system. It determines the benefits and savings
that are expected from a candidate system and compares them with costs. If benefits outweigh
costs, the decision is made to design and implement the system. Otherwise, further justification
or alterations in the proposed system will have to be made if it is to have a chance of being
approved. This is an ongoing effort that improves in accuracy at each phase of the system life
cycle.
2.3.2 Technical Feasibility: Technical feasibility centers on the existing computer
system (hardware, software, etc.) and the extent to which it can support the proposed addition.
For example, if the current computer is operating at 80% capacity (an arbitrary ceiling), then
running another application could overload the system or require additional hardware. This
involves financial considerations to accommodate technical enhancements. If the budget is a
serious constraint, the project is judged not feasible.
2.3.3 Operational Feasibility: People are inherently resistant to change, and computer
installations have something to do with turnover, transfers, retraining, and changes in employee
job status. Therefore, it is understandable that the introduction of a candidate system requires
special efforts to educate, sell, and train the staff on new ways of conducting business.
2.3.4 Choice of Platform
Technical Feasibility
It is a measure of the practicality of a specific technical solution and the availability of
technical resources and expertise.
• The proposed system uses ASP.NET with C# as the front end and SQL Server
2003 as the back-end tool.
• SQL Server is a popular tool used to design and develop database
objects such as tables, views, and indexes.
• The above tools are readily available, easy to work with, and
widely used for developing commercial applications.
The hardware used in this project (a P4 processor at 2.4 GHz, 128 MB RAM, a 40 GB
hard disk, and a floppy drive) was already available on the existing computer
system. The software (SQL Server 2003, IIS, the .NET Framework, and the Windows XP
operating system) was already installed on the existing computer system. So no
additional hardware or software needed to be purchased, and the project is technically
feasible. The technical feasibility lies in employing the organization's computers: the
organization is equipped with enough computers that updating is easy. Hence the organization
has no technical difficulty in adding this system.
Operational Feasibility
If the system is developed well, resistance from users is expected to be minimal.
• No major training or new skills are required, as the system is based on a DBMS
model.
• It will help in saving time and in the fast processing and dispersal of user
requests and applications.
• The new product will provide all the benefits of the present system with better
performance.
• Improved information, better management, and collection of reports.
• User support.
• User involvement in the building of the present system is sought to keep in
mind user-specific requirements and needs.
• Users will have control over their own information. Important information
such as a pay slip can be generated at the click of a button.
• Faster and more systematic processing of user application approval, allocation
of IDs, payments, etc. The manual system had greater chances of error due to wrong
information entered by mistake.
Behavioral Feasibility
People are inherently resistant to change. In this type of feasibility check, we come to know
whether the newly developed system will be accepted by the working force, i.e. the people who
will use it.
Data Store: This symbol represents a place where data is stored. The data can be
stored for future processing, or it can be retrieved when needed; any place
where data is stored is called a data store. Data flows may occur:
1. Between processes
2. From a file to a process
3. From an external entity to a process
[Figure: Testing and debugging flow. The software configuration and the test configuration feed
the testing process; actual test results are compared with expected test results during
evaluation; the error-rate data drives a reliability model; errors found lead to debugging
and correction.]
Program Evaluation and Review Technique (PERT) and Critical Path Method (CPM) are
project scheduling techniques that can be applied to software development. Both techniques are
driven by information already developed in earlier project planning activities:
• Estimation of effort
• Decomposition of tasks
Both PERT and CPM provide quantitative tools that allow the software planner to determine the
critical path (the chain of tasks that determines the duration of the project), establish "most
likely" time estimates for individual tasks by applying statistical models, and calculate
"boundary times" that define a time window for a particular task.
Both PERT and CPM have been implemented in a wide variety of automated tools that are
available for the personal computer. Such tools are easy to use and make these scheduling
methods accessible to every software project manager.
When creating a software project schedule, the planner begins with a set of tasks (the work
breakdown structure). If automated tools are used, the work breakdown is input as a task network
or task outline. Effort, duration, and start date are then input for each task. In addition, tasks
may be assigned to specific individuals.
As a consequence of this input, a timeline chart, also called a Gantt chart, is generated. A Gantt
chart can be developed for the entire project; alternatively, separate charts can be developed for
each project function or for each individual working on the project. The chart depicts the part of
a software project schedule that emphasizes the concept-scoping tasks for a new word-processing
software project. All project tasks (for concept scoping) are listed in the left-hand column; when
horizontal bars occur at the same time on the calendar, task concurrency is implied.
Once the information necessary for the generation of the Gantt chart has been input, most
scheduling tools also produce project tables: a tabular listing of all project tasks, their planned
and actual start and end dates, and a variety of related information. Used in conjunction with the
Gantt chart, project tables enable the project manager to track progress.
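The critical-path idea described above can be sketched in a few lines of code. This is an illustrative Python sketch, not part of the project itself (which is written in C#); the task names, durations, and dependencies are invented examples.

```python
# Critical-path calculation sketch (CPM forward pass). The task names and
# durations below are hypothetical, not from the actual project plan.

def critical_path(tasks):
    """tasks maps name -> (duration, [predecessor names]).
    Returns (critical path as a list of names, total project duration)."""
    earliest = {}  # memoized earliest-finish time per task

    def finish(name):
        if name not in earliest:
            duration, preds = tasks[name]
            earliest[name] = duration + max((finish(p) for p in preds), default=0)
        return earliest[name]

    for name in tasks:
        finish(name)

    # Recover the critical path by walking back from the latest-finishing task.
    path = []
    current = max(earliest, key=earliest.get)
    while current is not None:
        path.append(current)
        _, preds = tasks[current]
        current = max(preds, key=earliest.get) if preds else None
    return list(reversed(path)), max(earliest.values())

tasks = {
    "scoping":  (2, []),
    "planning": (3, ["scoping"]),
    "design":   (5, ["planning"]),
    "coding":   (8, ["design"]),
    "testing":  (4, ["coding", "planning"]),
}
path, duration = critical_path(tasks)
print(path, duration)  # the chain of tasks that determines the 22-day duration
```

Any task on the returned path has no slack: delaying it delays the whole project, which is exactly why PERT/CPM tools highlight it for the project manager.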
Work Flow of the Online Job Portal
DFD
Level 0
Level 1
MAINTENANCE
Maintenance is applied when an error occurs and the system halts so that
further processing cannot be done. At this time the user can view the documentation or
consult us for rectification, and we will analyze and change the code if needed.
Example: If the user gets the error "report width is larger than paper size" while
printing a report, and reports cannot be generated, then viewing the help
documentation and changing the paper size of the default printer to A4 will
rectify the problem.
1. This project has achieved the objective of replacing/augmenting the conventional system
of arranging manpower as conducted by a typical telecom department.
2. The development of this package has been achieved using C#.NET, which is very
conducive to developing the package within the available time and to the specific needs of the
user.
3. This package is highly user-friendly, requiring minimal input from the user while
providing highly relevant and focused outputs.
4. It is fully automated, avoiding human intervention, and hence provides a rapid, cost-
effective alternative to the conventional manual operations and procedures; the visual outputs
are more reliable than the audio forms of manual communication.
5. The system can be further extended, as per user and administrative requirements, to
encompass other aspects of connection management for a telecom department.
LIMITATIONS: -
This project does not edit the date of connection or store the date of transfer in
case of a connection transfer.
The system date is the backbone of the project: the proposed system depends on
the system date, so it must be correct.
The system cannot be connected to the Internet.
There are some inherent constraints, such as time and finance, that limit further
study.
Glossary of My Project
Access
Microsoft Access is an entry-level database management software from
Microsoft, which allows you to organize, access, and share information easily.
Access is very user-friendly and easy to use for inexperienced users, while
sophisticated enough for database and software developers.
ACID
ACID is short for Atomicity, Consistency, Isolation, Durability, and describes the
four properties of an enterprise-level transaction.
ADO
Short for Microsoft ActiveX Data Objects. ADO enables your client applications to
access and manage data from a range of sources through an OLE DB provider.
ADO is built on top of OLE DB and its main benefits are ease of use, high speed,
and low memory overhead. ADO makes the task of building complex database
enabled client/server and web applications a breeze.
Column
Database tables are made of different columns (fields) corresponding to the
attributes of the object described by the table.
COMMIT
The COMMIT command in SQL marks the finalization of a database transaction.
Cursor
Short for Current Set Of Records in some database languages. The cursor is a
database object pointing to a currently selected set of records.
Data
Piece of information collected and formatted in a specific way. The term data is
frequently used to describe binary (machine-readable) information.
Database
A database is a collection of information organized into related tables of data and
definitions of data objects. The data within a database can be easily accessed
and manipulated through computer programs.
DB2
DB2 is a relational database management system developed by IBM. DB2 runs
on a variety of platforms including Sun Solaris, Linux and Windows.
Field
See Column definition
Flat File
Flat file is a data file that has no structured relationships between its records.
Foreign Key
A foreign key is a key field (column) that identifies records in a table by matching
a primary key in a different table. Foreign keys are used to cross-reference
tables.
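The cross-referencing described above can be demonstrated in a few lines. This is an illustrative sketch using Python's built-in sqlite3 module (the project itself targets SQL Server); the company/job tables are invented for the example.

```python
# Foreign-key cross-referencing, sketched with Python's built-in sqlite3 module
# (the project itself targets SQL Server; the job/company tables are invented).
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("PRAGMA foreign_keys = ON")  # SQLite enforces FKs only when enabled
con.execute("CREATE TABLE company (id INTEGER PRIMARY KEY, name TEXT)")
con.execute("""CREATE TABLE job (
    id INTEGER PRIMARY KEY,
    title TEXT,
    company_id INTEGER REFERENCES company(id))""")

con.execute("INSERT INTO company VALUES (1, 'Acme')")
con.execute("INSERT INTO job VALUES (10, 'Engineer', 1)")    # valid reference
try:
    con.execute("INSERT INTO job VALUES (11, 'Clerk', 99)")  # no company 99
    fk_enforced = False
except sqlite3.IntegrityError:
    fk_enforced = True  # the database rejected the dangling reference
print("foreign key enforced:", fk_enforced)
```

The rejected insert shows the point of the constraint: a foreign key cannot point at a primary-key value that does not exist in the referenced table.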
Index
An index is a database feature (a list of keys or keywords), allowing searching
and locating data quickly within a table. Indexes are created for frequently
searched attributes (table columns) in order to optimize the database
performance.
INSERT
The INSERT is a SQL command used to add a new record to a table within a
database.
Isolation
See ACID definition
JOIN
The JOIN is a SQL command used to retrieve data from two or more database
tables with an existing relationship based upon a common attribute.
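A minimal JOIN can be sketched as follows, again using Python's sqlite3 module for illustration (the project itself uses SQL Server); the schema and rows are invented.

```python
# A JOIN on the common company_id attribute, sketched with sqlite3;
# the schema and rows are invented for illustration.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE company (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE job (id INTEGER PRIMARY KEY, title TEXT, company_id INTEGER);
INSERT INTO company VALUES (1, 'Acme'), (2, 'Globex');
INSERT INTO job VALUES (10, 'Engineer', 1), (11, 'Clerk', 2);
""")
rows = con.execute("""
    SELECT job.title, company.name
    FROM job JOIN company ON job.company_id = company.id
    ORDER BY job.id
""").fetchall()
print(rows)  # [('Engineer', 'Acme'), ('Clerk', 'Globex')]
```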
Key
See Primary Key and Foreign Key definitions
Lock
Locks are used by Database management systems to facilitate concurrency
control. Locks enable different users to access different records/tables within the
same database without interfering with one another. Locking mechanisms can be
enforced at the record or table levels.
MySQL
MySQL is an open source relational database management system. MySQL can
be used on various platforms including UNIX, Linux and Windows (there are OLE
DB and ODBC providers as well as a native .NET provider for MySQL). MySQL is
widely used as a back-end database for Web applications and is a viable,
cheaper alternative to enterprise database systems like MS SQL Server and
Oracle.
Normalization
Normalization is the process of organizing data to minimize redundancy and
remove ambiguity. Normalization involves separating a database into tables and
defining relationships between the tables. There are three main stages of
normalization called normal forms. Each one of those stages increases the level
of normalization.
NULL
The NULL SQL keyword is used to represent a missing value.
ODBC
Short for Open DataBase Connectivity, a standard database access technology
developed by Microsoft Corporation. The purpose of ODBC is to allow accessing
any DBMS (DataBase Management System) from any application (as long as the
application and the database are ODBC compliant), regardless of which DBMS is
managing the data. ODBC achieves this by using a middle layer, called a
database driver, between an application and the DBMS. The purpose of this
layer is to transform the application's data queries into commands that the DBMS
understands. As we said earlier, both the application and the DBMS must be
ODBC compliant, meaning the application must be capable of sending ODBC
commands and the DBMS must be capable of responding to them.
PostgreSQL
PostgreSQL is an object-oriented open source relational database management
system, which uses a subset of SQL language.
Primary Key
The primary key of a relational table holds a unique value, which identifies each
record in the table. It can either be a normal field (column) that is guaranteed to
be unique or it can be generated by the database system itself (GUID or Identity
field in MS SQL Server, for example). Primary keys may be composed of more
than one field (column) in a table.
Query
Queries are the main way to make a request for information from a database.
Queries consist of questions presented to the database in a predefined format, in
most cases SQL (Structured Query Language) format.
Record
The record is a complete set of information presented within a RDBMS. Records
are composed of different fields (columns) in a table and each record is
represented with a separate row in this table.
ROLLBACK
The ROLLBACK is a SQL command which cancels/undoes the proposed
changes in a pending database transaction and marks the end of the transaction.
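The COMMIT and ROLLBACK entries above can be shown together in one illustrative transaction sketch. This uses Python's sqlite3 module (the project itself uses SQL Server); the account table and amounts are invented.

```python
# COMMIT making a transfer permanent and ROLLBACK undoing a pending change,
# sketched with sqlite3; the account table is invented for illustration.
import sqlite3

con = sqlite3.connect(":memory:", isolation_level=None)  # manual transactions
con.execute("CREATE TABLE account (id INTEGER PRIMARY KEY, balance INTEGER)")
con.execute("INSERT INTO account VALUES (1, 100)")
con.execute("INSERT INTO account VALUES (2, 50)")

con.execute("BEGIN")
con.execute("UPDATE account SET balance = balance - 30 WHERE id = 1")
con.execute("UPDATE account SET balance = balance + 30 WHERE id = 2")
con.execute("COMMIT")    # both updates become permanent together

con.execute("BEGIN")
con.execute("UPDATE account SET balance = 0 WHERE id = 1")
con.execute("ROLLBACK")  # the pending change is undone

balances = con.execute("SELECT balance FROM account ORDER BY id").fetchall()
print(balances)  # [(70,), (80,)]
```

Grouping the two UPDATE statements in one transaction is what gives the atomicity described under ACID: either both sides of the transfer happen, or neither does.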
Row
See Record definition
SELECT
The SELECT is a SQL command, which is the primary means for retrieving data
from a RDBMS.
SQL
SQL is short for Structured Query Language and is an industry standard
language used for manipulation of data in a RDBMS. There are several different
dialects of SQL like, ANSI SQL, T-SQL, etc.
Stored Procedure
Stored Procedure is a set of SQL statements stored within a database server and
executed as a single entity. Using stored procedures has several advantages
over using inline SQL statements, like improved performance and separation of
the application logic layer from database layer in n-tier applications.
Table
A Table in RDBMS refers to data arranged in rows and columns, which defines a
database entity.
UPDATE
The UPDATE is a SQL command used to edit existing records in a
database table.
.Net Framework Glossary
Abstract IL (ILX)—A toolkit for accessing the contents of .NET Common IL binaries.
Among its features, it lets you transform the binaries into structured abstract syntax trees
that can be manipulated.
Access modifiers—Language keywords used to specify the visibility of the methods and
member variables declared within a class. The five access modifiers in the C# language are
public, private, protected, internal, and protected internal.
Acrylic— Codename for an innovative illustration, painting and graphics tool that provides
creative capabilities for designers working in print, web, video, and interactive media.
Active Server Pages (ASP)—A Microsoft technology for creating server-side, Web-based
application services. ASP applications are typically written using a scripting language, such
as JScript, VBScript, or PerlScript. ASP first appeared as part of Internet Information Server
2.0 and was code-named Denali.
BackOffice Server 2005—A suite of Microsoft server applications used for B2B and B2C
services. Included in this suite are Windows 2005 Server, Exchange Server 2005, SQL
Server 2005, Internet Security and Acceleration Server 2005, Host Integration Server 2005,
and Systems Management Server 2.0. These server applications are now referred to as the
.NET Enterprise Server product family.
Base class—The parent class of a derived class. Classes may be used to create other
classes. A class that is used to create (or derive) another class is called the base class or
super class. See Derived Class, Inheritance.
Behave!—A project for building tools that check properties such as deadlock freedom,
invariants, and message-understood properties in asynchronous, message-passing
programs.
BizTalk Server 2005—A set of Microsoft Server applications that allow the integration,
automation, and management of different applications and data within and between
business organizations. BizTalk Server is a key B2B component of the .NET Enterprise
Server product family.
Callback Method—A method used to return the results of an asynchronous processing call.
Typically, methods are called in a synchronous fashion, where the call does not return until
the results (i.e., the output or return value) of the call are available. An asynchronous
method call returns prior to the results, and then sometime later a callback method is called
to return the actual results. The callback method itself contains program statements that
are executed in response to the reception of the results. Also referred to as a callback
function under the Win32 API. See Event.
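The asynchronous call-then-callback flow described in this entry can be sketched briefly. This is an illustrative Python analogy, not the Win32/.NET API itself; `fetch_async` and its URL are invented names.

```python
# Callback sketch in Python: fetch_async returns immediately and delivers its
# result later through the supplied callback (names are invented for illustration).
import threading

def fetch_async(url, callback):
    """Start work on a background thread; hand the result to `callback`."""
    def worker():
        result = "response from " + url  # stand-in for real network I/O
        callback(result)                 # invoked once the result is ready
    t = threading.Thread(target=worker)
    t.start()
    return t  # the call returns before the result exists

results = []
t = fetch_async("http://example.com", results.append)  # append acts as callback
t.join()  # wait so the callback's effect is observable
print(results)  # ['response from http://example.com']
```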
Data provider—A set of classes in the .NET Framework that allow access to the information
in a data source. The data may be located in a file, in the Windows registry, or in any type of
database server or network resource. A .NET data provider also allows information in a data
source to be accessed as an ADO.NET DataSet. Programmers may also author their own
data providers for use with the .NET Framework. See Managed providers.
Garbage Collection (GC)—The process of implicitly reclaiming unused memory by the CLR.
Stack values are collected when the stack frame they are declared within ends (e.g., when a
method returns). Heap objects are collected sometime after the final reference to them is
destroyed.
GDI (Graphics Device Interface)—A Win32 API that provides Windows applications the
ability to access graphical device drivers for displaying 2D graphics and formatted text on
both the video and printer output devices. GDI (pronounced "gee dee eye") is found on all
versions of Windows. See GDI+.
GDI+ (Graphics Device Interface Plus)—The next generation graphics subsystem for
Windows. GDI+ (pronounced "gee dee eye plus") provides a set of APIs for rendering 2D
graphics, images, and text, and adds new features and an improved programming model
not found in its predecessor GDI. GDI+ is found natively in Windows XP and the Windows
Server 2003 family, and as a separate installation for Windows 2000, NT, 98, and ME. GDI+
is currently the only drawing API used by the .NET Framework.
Hash Code—A unique number generated to identify each module in an assembly. The hash
is used to ensure that only the proper version of a module is loaded at runtime. The hash
number is based on the actual code in the module itself.
"Hatteras"—Codename for Team Foundation Version Control tool. This is the new version
control in Visual Studio 2005.
Heap—An area of memory reserved for use by the CLR for a running program. In .NET
languages, reference types are allocated on the heap. See Stack.
Host Integration Server 2005—A set of Microsoft server applications used to integrate
the .NET platform and applications with non-Microsoft operating systems and hardware
(e.g., Unix and AS/400), security systems (e.g., ACF/2 and RACF), data stores (e.g., DB2),
and transaction environments (e.g., CICS and IMS).
Identifiers—The names that programmers choose for namespaces, types, type members,
and variables. In C# and VB.NET, identifiers must begin with a letter or underscore and
cannot be the same name as a reserved keyword. Microsoft no longer recommends the use
of Hungarian Notation (e.g., strMyString, nMyInteger) or delimiting underscores (e.g.,
Temp_Count) when naming identifiers. See Qualified identifiers.
Indigo—The code name for Windows Communication Foundation (WCF), which is the
communications portion of Longhorn that is built around Web services. This communications
technology focuses on transports, security, messaging patterns, encoding, networking and
hosting, and more.
"Indy"—The code name for a capacity-planning tool being developed by Microsoft. It was
originally part of Longhorn, but is speculated to ship earlier.
Just In Time (JIT)—The concept of compiling units of code only as they are needed at
runtime. The JIT compiler in the CLR compiles MSIL instructions to native machine code as a
.NET application is executed. The compilation occurs as each method is called; the JIT-
compiled code is cached in memory and is not recompiled during the program's
execution.
Keywords—Names that have been reserved for special use in a programming language.
The C# language defines about 80 keywords, such as bool, namespace, class,
static, and while. The 160 or so keywords reserved in VB.NET include Boolean,
Event, Function, Public, and WithEvents. Keywords may not be used as
identifiers in program code.
License Compiler—A .NET programming tool (lc.exe) used to produce .licenses files
that can be embedded in a CLR executable.
Lifetime—The duration of an object's existence: from the time an object is instantiated
to the time it is destroyed by the garbage collector.
Local assembly cache—The assembly cache that stores the compiled classes and methods
specific to an application. Each application directory contains a \bin subdirectory which
stores the files of the local assembly cache.
"Magneto"—The code-name for Windows Mobile 5.0. This version is to unify the Windows
CE, PocketPC, and SmartPhone platforms. This platform includes a new user interface,
improved video support, better keyboard support, and more.
Make Utility—A .NET programming tool (nmake.exe) used to interpret script files (i.e.,
makefiles) that contain instructions that detail how to build applications, resolve file
dependency information, and access a source code control system. Microsoft's nmake
program has no relation to the nmake program originally created by AT&T Bell Labs and
now maintained by Lucent. Although identical in name and purpose these two tools are not
compatible. See Lucent nmake Web site.
Managed code—Code that is executed by the CLR. Managed code provides information
(i.e., metadata) to allow the CLR to locate methods encoded in assembly modules, store
and retrieve security information, handle exceptions, and walk the program stack. Managed
code can access both managed data and unmanaged data.
Namespace—A logical grouping of the names (i.e., identifiers) used within a program. A
programmer defines multiple namespaces as a way to logically group identifiers based on
their use. For example, System.Drawing and System.Windows are two namespaces,
each containing types used for different purposes. The name used for any
identifier may only appear once in any namespace. A namespace contains only the name of
a type, not the type itself. Also called a name scope.
Object—The instance of a class that is unique and self-describing. A class defines an object,
and an object is the functional realization of the class. Analogously, if a class is a cookie
cutter, then the cookies are the objects the cutter was used to create.
Object type—The most fundamental base type (System.Object) that all other .NET
Framework types are derived from.
OLE (Object Linking and Embedding)—A Microsoft technology that allows an application to
link or embed into itself documents created by another type of application. Common
examples include using Microsoft Word to embed an Excel spreadsheet file into a Word
document file, or emailing a Microsoft Power Point file as an attachment (link) in Microsoft
Outlook. OLE is often confused with the Component Object Model (COM), because COM was
released as part of OLE2. However, COM and OLE are two separate technologies.
Orcas—The code name for the version of Visual Studio .NET to be released near the time
Microsoft Longhorn is released. This follows the release of Visual Studio .NET Whidbey.
"Phoenix"—A software optimization and analysis framework that is to be the basis for all
future Microsoft compiler technologies.
"Photon"—A feature-rich upgrade to Windows Mobile that includes improvements such as
better battery life. This version will follow Windows Mobile 2005 (code-named "Magneto").
Pinned—A block of memory that is marked as unmovable. Blocks of memory are normally
moved at the discretion of the CLR, typically at the time of garbage collection. Pinning is
necessary for managed pointer types that will be used to work with unmanaged code and
expect the data to always reside at the same location in memory. A common example is
when a pointer is used to pass a reference to a buffer to a Win32 API function. If the buffer
were to be moved in memory, the pointer reference would become invalid, so it must be
pinned to its initial location.
Pre-JIT compiler—Another name for the Native Image Generator tool used to convert
MSIL and metadata assemblies to native machine code executables.
Qualified identifiers—Two or more identifiers that are connected by a dot character (.).
Only namespace declarations use qualified identifiers (e.g., System.Windows.Forms).
R2—The codename for the Windows Server 2003 Update due in 2005.
Reference types—A variable that stores a reference to data located elsewhere in memory
rather than to the actual data itself. Reference types include array, class, delegate, and
interface. See Value types, Pointer types.
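The difference between a reference and the data it points at can be shown with a short sketch. Python is used here only as an analogy to the .NET distinction; Python names always hold references, while immutable types such as int behave like value types in effect.

```python
# Reference semantics illustrated in Python (an analogy to the .NET reference/
# value distinction, not C# itself).
a = [1, 2, 3]
b = a            # b refers to the SAME list object as a
b.append(4)      # mutation is visible through both names
print(a)         # [1, 2, 3, 4]

x = 10
y = x            # ints are immutable: rebinding y cannot affect x
y = y + 1
print(x, y)      # 10 11
```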
Satellite assembly—An assembly that contains only resources and no executable code.
Satellite assemblies are typically used by .NET applications to store localized data. Satellite
assemblies can be added, modified, and loaded into a .NET application at runtime without
the need to recompile the code. Satellite assemblies are created by compiling .resource
files using the Assembly Linking Utility.
Saturn—The code name for the original ASP.NET Web Matrix product.
Seamless Computing—A term indicating that a user should be able to find and use
information effortlessly. The hardware and software within a system should work in an
intuitive manner to make it seamless for the user. Seamless computing is being realized
with the improvements in hardware (voice, ink, multimedia) and software.
Try/Catch block—An exception handling mechanism in program code. A try block contains
a set of program statements that may possibly throw an exception when executed. The
associated catch block contains program statements that handle any exception that is
thrown in the try block. Multiple catch blocks may be defined to catch specific exceptions
(e.g., divide by zero, overflow, etc.). See Finally block.
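The try/catch/finally structure described here can be sketched briefly. This is an illustrative Python version (try/except/finally); the division is a stand-in for any statement that may throw.

```python
# Try/catch/finally sketch (try/except/finally in Python); the division is a
# stand-in for any statement that may throw an exception.
log = []

def safe_divide(a, b):
    try:
        return a / b               # may raise ZeroDivisionError
    except ZeroDivisionError:
        return float("inf")        # handler for this specific exception
    finally:
        log.append("done")         # runs whether or not an exception occurred

print(safe_divide(10, 2))  # 5.0
print(safe_divide(1, 0))   # inf
```

The finally block runs on both paths, which is why it is the standard place for cleanup such as closing files or database connections.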
Value types—A variable that stores actual data rather than a reference to data, which is
stored elsewhere in memory. Simple value types include the integer, floating point number,
decimal, character, and boolean types. Value types have the minimal memory overhead and
are the fastest to access. See Reference types, Pointer types.
Variable—A typed storage location in memory. The type of the variable determines what
kind of data it can store. Examples of variables include local variables, parameters, array
elements, static fields and instance fields. See Types.
Web Form—A .NET Framework object that allows development of Web-based applications
and Web sites. See Windows form.
The Web Matrix Project—A free WYSIWYG development product (IDE) for doing ASP.NET
development that was released as a community project. The most recent version is The Web
Matrix Project (Revisited).
Web service—An application hosted on a Web server that provides information and
services to other network applications using the HTTP and XML protocols. A Web service is
conceptually a URL-addressable library of functionality that is completely independent of
the consumer and stateless in its operation.
XCOPY—An MS-DOS file copy program used to deploy .NET applications. Because .NET
assemblies are self-describing and not bound to the Windows registry as COM-based
applications are, most .NET applications can be installed by simply being copied from one
location (e.g., directory, machine, CD-ROM, etc.) to another. Applications requiring more
complex tasks to be performed during installation require the use of the Microsoft Windows
Installer.
XDR (XML Data-Reduced)—A reduced version of XML Schema used prior to the release of
XML Schema 1.0.
XML Schema Definition Tool— A .NET programming tool (Xsd.exe) used to generate XML
schemas (XSD files) from XDR and XML files, or from class information in an assembly. This
tool can also generate runtime classes, or DataSet classes, from an XSD schema file.
XML Web services—Web-based .NET applications that provide services (i.e., data and
functionality) to other Web-based applications (i.e. Web service consumers). XML Web
services are accessed via standard Web protocols and data formats such as HTTP, XML, and
SOAP.
Yukon—The code name for Microsoft SQL Server 2005 (a.k.a. SQL Server 9). Yukon
offers tighter integration with both the .NET Framework and the Visual Studio .NET IDE. Yukon
includes full support for ADO.NET and the CLR, allowing .NET languages to be used for
writing stored procedures.