ABSTRACT
In this world of growing technology, almost everything has been computerized. With a large number of work opportunities the human workforce has grown, so a system is needed that can handle the data of such a large number of employees in an organization. This project simplifies the task of maintaining records because of its user-friendly nature. The EMPLOYEE MANAGEMENT SYSTEM has been developed to overcome the problems prevailing in the existing manual system. This software is intended to eliminate, or at least reduce, the hardships faced under that system. Moreover, the system is designed for the particular needs of the company, so that it can carry out its operations in a smooth and effective manner. Data entry is constrained as far as possible to avoid errors, and the system displays an error message when invalid data is entered. No formal training is needed to use the system; in all these respects it is user-friendly. The project allows the admin to add new employees after proper authentication. The admin can also add new departments and posts, and can allocate employees to different departments at different posts. The database stores all personal details of an employee, such as full name and date of birth, together with educational background, work experience, skill sets, and current and past projects, in separate tables with proper relations. The system enables employees to maintain their own profiles, and it automates workflow notifications and leave requests. Workflow notifications from the administrator are stored in the backend and shown to the employee once the employee logs in. A leave request made by an employee is placed for administrator approval; the administrator module checks the leave availability and approves or rejects the request.
SYSTEM STUDY
2. INTRODUCTION
The feasibility study of this project has revealed the following. The project is economically feasible: with this software, a larger number of users can be served effectively and efficiently, saving a great deal of time, and saving time means saving money. The cost-benefit analysis has shown that the cost incurred in developing the project is less than the benefits the project will provide once developed, so the project has passed this test. The working staff members are also interested in the project, as it will help them do their work easily and efficiently without complexity, so they supported its development with full enthusiasm; this shows the behavioral feasibility of the project. Technical feasibility centers on the existing computer system (hardware, software, etc.) and the extent to which it supports the proposed system. As the existing computer system is adequate, the system is technically feasible. Schedule feasibility is the determination of whether a proposed project can be implemented fully within the stipulated time frame; the project was planned to be completed in three months and was judged feasible within that time.
2.1.
EXISTING SYSTEM
The company uses a single-user application to look up employee information. The most significant drawback is that the process is essentially manual. Errors due to carelessness or oversight may result in loss of data and thereby loss to the organization. For an organization, time is a very important factor. Employee information is stored in the application much like an Excel sheet, which makes it impractical to search such a long table manually for a piece of information, or to add a new record when the searched-for record is not available.
2.2.
PROPOSED SYSTEM
This system will reduce the complexity of employee management. Using this system, we can easily maintain all records of both active and inactive employees, and searching time is reduced. It can be handled easily by anyone with elementary computer knowledge because it provides a user-friendly environment, and its hardware and software configuration is not very costly.
2.4.
2.4.1.
2.4.2.
Outputs:
The admin has his own home page, and each user enters his own home page. User-defined data is stored in the centralized database. The admin receives the login information of a particular user, and a new user's data is stored in the centralized database. The admin can obtain search results for different criteria, and a user can recover a forgotten password.
FEASIBILITY REPORT
3. FEASIBILITY
Preliminary investigation examines project feasibility: the likelihood that the system will be useful to the organization. The main objective of the feasibility study is to test the technical, operational and economic feasibility of adding new modules and debugging the old running system. Any system is feasible given unlimited resources and infinite time. Three aspects are considered in the feasibility study portion of the preliminary investigation: technical feasibility, operational feasibility and economic feasibility.
3.1.
TECHNICAL FEASIBILITY
The technical issues usually raised during the feasibility stage of the investigation include the following: Does the necessary technology exist to do what is suggested? Does the proposed equipment have the technical capacity to hold the data required by the new system? Will the proposed system provide adequate response to inquiries, regardless of the number or location of users? Can the system be upgraded after development? Are there technical guarantees of accuracy, reliability, ease of access and data security? Earlier, no system existed to cater to the needs of the Secure Infrastructure Implementation System. The current system is technically feasible: it is a web-based user interface for audit workflow at NIC-CSD, and thus provides easy access to the users. The database's purpose is to create, establish and maintain a workflow among various entities in order to facilitate all concerned users in their various capacities or roles. Permissions are granted to users based on the roles specified; therefore the system provides the technical guarantee of accuracy, reliability and security. The software and hardware requirements for the development of this project are modest and are either already available in-house at NIC or available free as open source. The work for the project is done with the current equipment and existing software technology, and the necessary bandwidth exists to provide fast feedback to users irrespective of how many are using the system.
3.2.
OPERATIONAL FEASIBILITY
Proposed projects are beneficial only if they can be turned into information systems that meet the organization's operating requirements. Operational feasibility is an important part of project implementation. Some of the important questions raised to test the operational feasibility of a project include the following: Is there sufficient support for the project from management and from the users? Will the system be used and work properly once it is developed and implemented? Will there be any resistance from users that will undermine the possible application benefits? This system has been targeted to be in accordance with the above-mentioned issues. Management issues and user requirements were taken into consideration beforehand, so there is no question of user resistance undermining the possible application benefits. The well-planned design ensures optimal utilization of computer resources and helps improve performance.
3.3.
ECONOMICAL FEASIBILITY
A system that can be developed technically, and that will be used if installed, must still be a good investment for the organization. In assessing economic feasibility, the cost of developing the system is evaluated against the ultimate benefits derived from the new system; the financial benefits must equal or exceed the costs. This system is economically feasible: it requires no additional hardware or software, and since its interface is developed using the existing resources and technologies available at NIC, expenditure is nominal and economic feasibility is certain.
REQUIREMENTS
4.2.
HARDWARE REQUIREMENTS
Processor     : Pentium-III
Speed         : 1.1 GHz
RAM           : 512 MB (min)
Hard Disk     : 40 GB
Floppy Drive  : 1.44 MB
Keyboard      : Standard Windows Keyboard
Mouse         : Two or Three Button Mouse
Monitor       : SVGA
INTRODUCTION
Purpose: The main purpose of this document is to give a general insight into the analysis and requirements of the existing system or situation and to determine the operating characteristics of the proposed system. Scope: This document plays a vital role in the software development life cycle (SDLC), as it describes the complete requirements of the system. It is meant for use by the developers and will be the baseline during the testing phase. Any changes made to the requirements in the future will have to go through a formal change-approval process.
5.1.
FUNCTIONAL REQUIREMENTS
Keeping in view the above description, the project's outputs come mainly under the category of internal outputs. The main outputs desired according to the requirement specification are: outputs generated as hard copy, as well as queries viewed on screen. Keeping these outputs in view, the output format is taken from the outputs currently obtained after manual processing. A standard printer is used as the output medium for hard copies.
5.1.10.
Data Validation
Procedures are designed to detect errors in data at a low level of detail. Data validations have been included in the system in almost every area where there is a possibility of the user committing errors. The system will not accept invalid data: whenever invalid data is keyed in, the system immediately prompts the user, who must key in the data again, and the system accepts the data only when it is correct. Validations have been included wherever necessary. The system is designed to be user-friendly; in other words, it has been designed to communicate effectively with the user, and it uses pop-up menus.
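A minimal sketch of the prompt-until-correct validation described above. The field names (`full_name`, `phone`, `date_of_birth`) and the 10-digit phone rule are illustrative assumptions, not taken from the actual system:

```python
import re
from datetime import datetime

def validate_employee(record):
    """Check an employee record; return a list of error messages (empty if valid)."""
    errors = []
    # Full name is compulsory (hypothetical rule for illustration).
    if not record.get("full_name", "").strip():
        errors.append("Full name must not be blank.")
    # Phone must be exactly 10 digits (assumed format).
    if not re.fullmatch(r"\d{10}", record.get("phone", "")):
        errors.append("Phone number must be exactly 10 digits.")
    # Date of birth must parse and lie in the past.
    try:
        dob = datetime.strptime(record.get("date_of_birth", ""), "%Y-%m-%d")
        if dob >= datetime.now():
            errors.append("Date of birth must be in the past.")
    except ValueError:
        errors.append("Date of birth must be in YYYY-MM-DD format.")
    return errors

# The system accepts the data only when no validation errors remain;
# otherwise it would prompt the user to re-enter the offending fields.
record = {"full_name": "A. Kumar", "phone": "9876543210", "date_of_birth": "1990-05-14"}
print(validate_employee(record))  # [] -> record accepted
```

In the real application each non-empty error message would drive the prompt shown to the user before re-entry.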
5.1.11.
User interface:
It is essential to consult the system users and discuss their needs while designing the user/computer dialogue. User interfaces fall into two broad classes:
5.1.12.
1. User-initiated interfaces: the user guides the progress of the user/computer dialogue.
2. Computer-initiated interfaces: the computer selects the next stage in the interaction and guides the progress of the dialogue; information is displayed, and on the basis of the user's response the computer takes action or displays further information.
5.1.13.
User-initiated interfaces fall into two approximate classes: 1. Command-driven interfaces: the user inputs commands or queries which are interpreted by the computer. 2. Forms-oriented interfaces: the user calls up an image of a form on the screen and fills it in. The forms-oriented interface was chosen here because it best suits structured data entry.
5.1.14.
Computer-Initiated Interfaces
The following computer-initiated interfaces were used: 1. A menu system: the user is presented with a list of alternatives and chooses one of them. 2. A question-answer dialog system: the computer asks a question and takes action on the basis of the user's reply. Right from the start the system is menu driven: the opening menu displays the available options, and choosing one option brings up another pop-up menu with more options. In this way every option leads the user to a data entry form where the user can key in the data.
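The menu-driven flow above can be sketched as follows. The option labels and actions are hypothetical stand-ins, and the user's choice is injected as a parameter rather than read from `input()` so the sketch stays self-contained:

```python
def run_menu(options, choose):
    """Present a numbered list of alternatives; `choose` supplies the user's
    pick (it stands in for input() in this sketch)."""
    for i, (label, _) in enumerate(options, start=1):
        print(f"{i}. {label}")
    pick = choose()
    label, action = options[pick - 1]  # menus are 1-indexed for the user
    return action()

# Hypothetical top-level options; in the real system each action would open
# a sub-menu or a data entry form.
options = [
    ("Add Employee",   lambda: "employee entry form"),
    ("Add Department", lambda: "department entry form"),
    ("Leave Requests", lambda: "leave approval screen"),
]
print(run_menu(options, choose=lambda: 1))  # -> employee entry form
```

Nesting further `run_menu` calls inside the actions gives the "menu leads to sub-menu leads to form" structure the text describes.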
5.1.15.
The design of error messages is an important part of user interface design. As the user is bound to commit some error or other while using the system, the system should be designed to be helpful by providing the user with information regarding the error he or she has committed. The application must be able to produce output at different modules for different inputs.
5.2.
PERFORMANCE REQUIREMENTS
Performance is measured in terms of the output provided by the application. Requirement specification plays an important part in the analysis of a system: only when the requirement specifications are properly given is it possible to design a system that fits the required environment. It rests largely with the users of the existing system to give the requirement specifications, because they are the people who will finally use the system. The requirements have to be known during the initial stages so that the system can be designed to meet them; it is very difficult to change a system once it has been designed, while a system that does not cater to the users' requirements is of no use. The requirement specification for any system can be broadly stated as follows: the system should be able to interface with the existing system; the system should be accurate; the system should be better than the existing system. The existing system is completely dependent on the user to perform all the duties.
SYSTEM DESIGN
6. INTRODUCTION
Software design sits at the technical kernel of the software engineering process and is applied regardless of the development paradigm and area of application. Design is the first step in the development phase for any engineered product or system. The designer's goal is to produce a model or representation of an entity that will later be built. Once system requirements have been specified and analyzed, system design is the first of the three technical activities (design, code and test) required to build and verify software. Its importance can be stated with a single word: quality. Design is the place where quality is fostered in software development. Design provides us with representations of software that can be assessed for quality, and it is the only way we can accurately translate a customer's view into a finished software product or system. Software design serves as the foundation for all the software engineering steps that follow; without a strong design we risk building an unstable system, one that will be difficult to test and whose quality cannot be assessed until the last stage. During design, progressive refinements of the data structure, program structure and procedural details are developed, reviewed and documented. System design can be viewed from either a technical or a project-management perspective. From the technical point of view, design comprises four activities: architectural design, data structure design, interface design and procedural design.
7. NORMALIZATION
A database designed on the E-R model may have some amount of inconsistency, ambiguity and redundancy. To resolve these issues, some refinement is required; this refinement process is called normalization.
7.1.
Process of Normalization
As mentioned previously, the normalization technique is based on a strong mathematical foundation. In the software industry, four normal forms are commonly used to design databases. Before examining the normalization techniques in detail, let us define a few building blocks used to define the normal forms.
7.1.1. Determinant
An attribute X is a determinant if it uniquely determines the value of an attribute Y in a given relationship or entity. To qualify as a determinant, an attribute need NOT be a key attribute. The dependency is usually represented as X → Y, which means attribute X determines attribute Y.
Example: In the RESULT relation, the Empno attribute may determine the Empsalary attribute. This is represented as Empno → Empsalary and read as "Empno determines Empsalary". In the RESULT relation, Empno is not a key attribute. Hence it can be concluded that all key attributes are determinants, but not all determinants are key attributes.
Full Functional Dependency: In a given relation R with attributes X and Y, Y is fully functionally dependent on X only if it is not functionally dependent on any proper subset of X. X may be composite in nature.
Partial Dependency: In a given relation R with attributes X and Y, Y is partially dependent on X if it is dependent on a proper subset of X. X may be composite in nature.
Transitive Dependency: If Empsalary depends on Empno, and Empno in turn depends on (Employee#, Dept#), then Empsalary transitively depends on (Employee#, Dept#).
7.2.
A relation is in second normal form (2NF) if all of its non-key attributes are dependent on all of the key. Relations that have a single attribute for a key are automatically in 2NF. This is one reason why we often use artificial identifiers as keys.
A relation is in third normal form (3NF) if it is in second normal form and contains no transitive dependencies. Consider a relation R(A, B, C): if A → B and B → C, then A → C. A transitive dependency is three attributes with these dependencies.
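The transitive dependency above can be illustrated with a small decomposition sketch. The attribute names (EmpNo, DeptNo, DeptName) are illustrative, not taken from the project's actual schema: since DeptName depends on DeptNo, which in turn depends on the key EmpNo, DeptName is moved into its own relation to reach 3NF.

```python
# Unnormalized rows: EmpNo -> DeptNo and DeptNo -> DeptName,
# so EmpNo -> DeptName is a transitive dependency (violates 3NF).
employees = [
    {"EmpNo": 1, "EmpName": "Asha",  "DeptNo": 10, "DeptName": "Accounts"},
    {"EmpNo": 2, "EmpName": "Ravi",  "DeptNo": 10, "DeptName": "Accounts"},
    {"EmpNo": 3, "EmpName": "Meena", "DeptNo": 20, "DeptName": "HR"},
]

# Decompose into EMPLOYEE(EmpNo, EmpName, DeptNo) and DEPT(DeptNo, DeptName).
employee = [{"EmpNo": r["EmpNo"], "EmpName": r["EmpName"], "DeptNo": r["DeptNo"]}
            for r in employees]
dept = {r["DeptNo"]: r["DeptName"] for r in employees}  # duplicates collapse

print(dept)  # {10: 'Accounts', 20: 'HR'} -- each DeptName now stored once
```

After the decomposition a department can be renamed in one place, which is exactly the redundancy normalization removes.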
8. DATA-FLOW DIAGRAM
I have constructed a database that consists of six data tables: one main (parent) table and five child tables related to it. For this purpose, the necessary primary and foreign keys are defined in the corresponding tables. The structure so defined is made up in conformity with the users' needs and demands. Each employee is intended to have several records for his Working History, Contact Person Information, Salary Information, Time Information and Holiday Information, and only one record containing his basic information within the company: personal details such as date of birth, gender, marital status, address and phone details, and his current working record. An employee may well have more than one record in his Working History or Contact Person Information. For instance, looking at the Time Information data table, an employee may have several records if he has some experience within the current company; it is the same with the Salary Information, Contact Person Information and Holiday Information data tables.
From the relationships between the data tables, we can distinguish the six tables the database consists of; all of the relationships are of type one-to-many. (For more details about the data tables, see. The primary key fields can be set to the AutoNumber data type, as Access creates these values in ascending order to ensure they are unique within a table. Some of the fields should be adjusted to accept null values; this is important because it is closely related to the input fields of the application program. I decided to perform it in the following way: fields that are compulsory for the user to fill in are set not to accept null values, while those that can be left blank are set to accept them. This is done by changing the Allow Zero Length setting (Appendix A: Figure 11, Setting a data field to accept null values): go to the field to be set and switch between the two options offered in the Allow Zero Length field. In the example shown, the Personal_ID_Number field is set not to allow null values, so its length cannot be zero, as its value is essential for identifying an employee as an individual and distinct person. Optional fields are allowed for the user's convenience, as the user may wish not to enter all of the data at once and come back later.
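The parent/child, one-to-many structure described above might be sketched as follows. The original design uses Microsoft Access; SQLite stands in here as a freely scriptable substitute, and apart from Personal_ID_Number the table and column names are assumptions:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("PRAGMA foreign_keys = ON")  # enforce the parent/child relationship

# Parent table: exactly one row of basic information per employee.
con.execute("""
    CREATE TABLE Employee (
        Personal_ID_Number INTEGER PRIMARY KEY AUTOINCREMENT,  -- auto-numbered key
        Full_Name TEXT NOT NULL,   -- compulsory field: no nulls allowed
        Date_Of_Birth TEXT,        -- optional: may be filled in later
        Marital_Status TEXT
    )""")

# One of the five child tables; an employee may have several salary records.
con.execute("""
    CREATE TABLE Salary_Information (
        Salary_ID INTEGER PRIMARY KEY AUTOINCREMENT,
        Employee_ID INTEGER NOT NULL REFERENCES Employee(Personal_ID_Number),
        Amount REAL,
        Effective_From TEXT
    )""")

con.execute("INSERT INTO Employee (Full_Name) VALUES ('A. Kumar')")
con.execute("INSERT INTO Salary_Information (Employee_ID, Amount, Effective_From) "
            "VALUES (1, 30000, '2005-01-01')")
con.execute("INSERT INTO Salary_Information (Employee_ID, Amount, Effective_From) "
            "VALUES (1, 35000, '2006-01-01')")

# One employee, many salary records: a one-to-many relationship.
count = con.execute("SELECT COUNT(*) FROM Salary_Information "
                    "WHERE Employee_ID = 1").fetchone()[0]
print(count)  # 2
```

The other four child tables (Working History, Contact Person, Time and Holiday Information) would follow the same pattern of a foreign key back to Personal_ID_Number.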
SYSTEM TESTING
This phase determines the errors in the project; any error must be removed before delivery of the project. Various types of test activities are performed to detect errors.
Unit Testing
Unit testing focuses verification effort on the smallest unit of software design: the module. Using the detailed design description as a guide, important control paths are tested to uncover errors within the boundary of the module. The relative complexity of the tests, and of the errors detected as a result, is limited by the constrained scope established for unit testing. The unit test is always white-box oriented, and the step can be conducted in parallel for multiple modules. Unit testing is normally considered an adjunct to the coding step. After source-level code has been developed, reviewed and verified for correct syntax, unit test case design begins. A review of design information provides guidance for establishing test cases that are likely to uncover errors. Each test case should be coupled with a set of expected results. Because a module is not a stand-alone program, driver and/or stub software must be developed for each unit test. In most applications a driver is nothing more than a main program that accepts test case data, passes the data to the module to be tested, and prints the relevant results. Stubs serve to replace modules that are subordinate to (called by) the module to be tested. A stub, or dummy subprogram, uses the subordinate module's interface, may do minimal data manipulation, prints verification of entry, and returns. Drivers and stubs represent overhead: both are software that must be written but that is not delivered with the final software product. If drivers and stubs are kept simple, the actual overhead is relatively low. Unfortunately, many modules cannot be adequately unit tested with simple overhead software; in such cases, complete testing can be postponed until the integration test step. Unit testing is simplified when a module with high cohesion is designed: when only one function is addressed by a module, the number of test cases is reduced and errors can be more easily predicted and uncovered.
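A minimal sketch of the driver and stub described above, for a hypothetical module. The leave-balance function and its subordinate lookup are illustrative, not part of the documented system:

```python
# Module under test: computes the remaining leave balance.
def remaining_leave(employee_id, days_requested, get_balance):
    """`get_balance` is the subordinate module (e.g. a database lookup);
    in the unit test it is replaced by a stub."""
    balance = get_balance(employee_id)
    if days_requested > balance:
        raise ValueError("insufficient leave balance")
    return balance - days_requested

# Stub: replaces the subordinate module, returns canned data,
# and prints verification of entry.
def stub_get_balance(employee_id):
    print(f"stub called with employee_id={employee_id}")
    return 10

# Driver: a minimal main program that passes test-case data to the
# module under test and prints the result.
if __name__ == "__main__":
    print(remaining_leave(1, 3, stub_get_balance))  # 7
```

The driver and stub are written only to exercise the module in isolation; neither ships with the final product, which is exactly the overhead the text describes.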
System Testing
Software is only one element of a larger computer-based system. Ultimately, software is incorporated with other system elements (e.g. new hardware, information), and a series of system integration and validation tests are conducted. Steps taken during software design and testing can greatly improve the probability of successful software integration in the larger system. A classic system testing problem is finger pointing: a defect is uncovered, and one system element's developer blames another for the problem. Rather than indulging in such nonsense, the software engineer should anticipate potential interfacing problems and (1) design error-handling paths that test all information coming from other elements of the system; (2) conduct a series of tests that simulate bad data or other potential errors at the software interface; (3) record the results of tests to use as evidence if finger pointing does occur; and (4) participate in the planning and design of system tests to ensure that the software is adequately tested. There are many types of system tests worthwhile for software-based systems, as detailed hereunder. Recovery testing forces the software to fail in a variety of ways and verifies that recovery is properly performed. Security testing attempts to verify that the protection mechanisms built into a system will protect it from improper penetration. Stress tests are designed to confront programs with abnormal situations. Performance testing is designed to test the run-time performance of software within the context of an integrated system.
Integration Testing: A neophyte in the software world might ask a seemingly legitimate question once all modules have been unit tested: if they all work individually, why doubt that they will work when put together? The problem, of course, is putting them together: interfacing. Data can be lost across an interface; one module can have an inadvertent, adverse effect on another; sub-functions, when combined, may not produce the desired major function; individually acceptable imprecision may be magnified to unacceptable levels; global data structures can present problems. Sadly, the list goes on and on. Integration testing is a systematic technique for constructing the program structure while at the same time conducting tests to uncover errors associated with interfacing. The objective is to take unit-tested modules and build the program structure that has been dictated by the design. There is often a tendency to attempt non-incremental integration, that is, to construct the program using a big-bang approach: all modules are combined in advance and the entire program is tested as a whole. And chaos usually results! A set of errors is encountered, and correction is difficult because isolating their causes is complicated by the vast expanse of the entire program. Once these errors are corrected, new ones appear, and the process continues in a seemingly endless loop. Incremental integration is the antithesis of the big-bang approach: the program is constructed and tested in small segments, where errors are easier to isolate and correct, interfaces are more likely to be tested completely, and a systematic test approach may be applied. Integration testing can be categorized into two types: top-down integration and bottom-up integration. Top-down integration is an incremental approach to the construction of the program structure: modules are integrated by moving downward through the control hierarchy, beginning with the main control module, and modules subordinate to the main control module are incorporated into the structure in either a depth-first or breadth-first manner. Bottom-up integration, as its name implies, begins construction and testing with atomic modules. Because modules are integrated from the bottom up, the processing required for modules subordinate to a given level is always available, and the need for stubs is eliminated.
The selection of an integration strategy depends upon software characteristics and, sometimes, the project schedule. In general, a combined approach that uses the top-down strategy for the upper levels of the program structure, coupled with a bottom-up strategy for the subordinate levels, may be the best compromise.
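A toy illustration of top-down incremental integration under assumed names (the control module and its subordinates are hypothetical): the main control module is exercised first with both subordinates stubbed, then a real module replaces one stub and the test is repeated.

```python
# Top-down incremental integration: start from the main control module with
# subordinate modules stubbed, then swap in real modules one at a time.
def main_control(fetch_employee, format_report):
    emp = fetch_employee(1)
    return format_report(emp)

# Step 1: both subordinates stubbed; test the control logic alone.
stub_fetch = lambda emp_id: {"id": emp_id, "name": "stub"}
stub_format = lambda emp: f"report for {emp['name']}"
assert main_control(stub_fetch, stub_format) == "report for stub"

# Step 2: replace one stub with the real (here, illustrative) module and retest,
# so any new failure is isolated to the newly integrated interface.
def real_fetch(emp_id):
    return {"id": emp_id, "name": "A. Kumar"}

print(main_control(real_fetch, stub_format))  # report for A. Kumar
```

Repeating step 2 for each remaining stub walks the control hierarchy downward exactly as the top-down strategy prescribes.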