
we transfer data to knowledge

Business Intelligence

Project and Operations Lifecycle


Business Intelligence
Business intelligence describes the enterprise's ability to access data and explore information (often contained in a data warehouse) and to analyze that information to develop insights and understanding, which leads to improved and informed decision making. — K. Harris, H. Dresner, Gartner Group

Business Intelligence Platform


A business intelligence platform supports individual end users with reporting on demand, giving them fast and easy access to a unified, business-related reporting model through analytical applications built for interactive standard reports and ad-hoc reporting. The extraction of data from all relevant sources, its content verification and its integration into the reporting model are updated through automated background processes (ETL).
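The automated extract-verify-integrate background process described above can be sketched as follows. This is a minimal illustration only: the table names, columns and the quality check are hypothetical stand-ins, not part of any actual ibax platform.

```python
# Minimal sketch of a scheduled ETL background process.
# All connection targets, table and column names are hypothetical.
import sqlite3  # stands in for the source system and warehouse databases


def extract(source_conn):
    """Pull rows from a (hypothetical) source system table."""
    return source_conn.execute(
        "SELECT id, amount, booked_at FROM transactions"
    ).fetchall()


def verify(rows):
    """Content verification: drop rows failing a basic quality check."""
    return [r for r in rows if r[1] is not None and r[1] >= 0]


def load(target_conn, rows):
    """Integrate verified rows into the unified reporting model."""
    target_conn.executemany(
        "INSERT OR REPLACE INTO fact_transactions (id, amount, booked_at) "
        "VALUES (?, ?, ?)",
        rows,
    )
    target_conn.commit()


def run_etl(source_conn, target_conn):
    """One background ETL run; returns the number of integrated rows."""
    rows = verify(extract(source_conn))
    load(target_conn, rows)
    return len(rows)
```

In a real deployment such a job would be triggered by a scheduler and cover many sources, as described in the Volume Assessment later in this document.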

VERSION:  1.0
FILENAME: KNOWLEDGE_BI_LifeCycle_ver1_0.doc

REVISION HISTORY
0.8  05.08.2009  Michael Brönnimann  Created
0.9  12.08.2009  Michael Brönnimann  Draft to first review/finalization
0.9  14.08.2009  Michael Brönnimann  User Interactivity >> Information Scope
1.0  18.08.2009  Michael Brönnimann  Final Review; Initial public version 1.0

Business Intelligence Project and Operations Lifecycle 1.0

2/39

Management Summary
This document provides an overview of the different key aspects that are crucial for a successful implementation of a corporate business intelligence solution.

Next to professional project management with strong corporate sponsorship, the proper assessment and envisioning of today's and potential future business requirements is the key driver, and thereafter the measure of success, towards a sustainable solution. Beyond the conceptually comprehensive approach, a few potential shortcuts exist that help to reduce time and cost, especially for an initial release. But there are also common pitfalls that business intelligence initiatives should be aware of in order to avoid project and investment failures.

The design, development and deployment of a sustainable business intelligence solution is primarily an organizational project, not a technology-driven one. Due to new and changing demands in business reporting and analysis, a business intelligence solution must be flexible and scalable to satisfy business needs quickly and at minimal cost. The technology and implementation partners involved serve as enablers to achieve the business goals. Therefore the evaluation and selection of base components (data warehouse and OLAP databases, ETL tools, analytical application platforms, etc.), intermediate best-practice templates (customizable data model, standard reports) and implementation partners (implementation, customizing, support) must be done carefully, as they lay the foundation for a sustainable solution.

Copyright
All materials in this document are copyright of ibax AG and may not be used for any commercial purpose without ibax's prior consent in every specific case. The materials may not be altered without the prior consent of ibax AG.

Disclaimer
The information and materials provided in this document are intended to be general and do not constitute advice in any specific case. Information provided does not create a legal relationship between ibax AG and the reader and ibax does not accept any responsibility or provide any guarantee of any kind with respect to the accuracy or relevance of information provided in this document.

Trademark
The trademarks and logos of ibax and its products are registered trademarks of ibax AG. There is no right to use any trademark displayed in this document without the written permission of ibax AG or such third party that may own the trademarks.


Business Intelligence Lifecycle Diagram


Table of Contents
The structure and numbering of the document follows the numbering logic of the business intelligence lifecycle diagram as shown on the previous page.

A short introduction, feasible project shortcuts and potential project pitfalls follow on the next pages.
1  Initiative ..................................................  8
2  Planning .................................................... 11
3  Project Management .......................................... 15
4  Specification ............................................... 16
5  Evaluation .................................................. 21
5a Partner Evaluation .......................................... 21
5b Product Selection ........................................... 22
6  Business Intelligence Platform .............................. 26
6a Architectural Components .................................... 26
6b Information Scope ........................................... 31
6c Reporting Model Design & Implementation ..................... 33
6d Analytical Application Design & Development ................. 34
7  Final Assurance ............................................. 35
8  Deployment .................................................. 35
9  Operations .................................................. 36
9a Production .................................................. 36
9b Outsourcing ................................................. 36
10 Growth ...................................................... 36

Further Reading
The Data Warehouse Lifecycle Toolkit, 2nd Edition, by Ralph Kimball, Margy Ross, Warren Thornthwaite, Joy Mundy and Bob Becker; Wiley & Sons, January 2008; ISBN 978-0-470-14977-5; http://www.lifecycle-toolkit.com/pages/index.htm

Impossible Data Warehouse Situations: Solutions from the Experts, by Sid Adelman, Joyce Bischoff, Jill Dyché, Douglas Hackney, Sean Ivoghli, Chuck Kelley, David Marco, Larissa T. Moss and Clay Rehm; Addison-Wesley, October 2002; ISBN 978-0-201-76033-0

Abbreviations / Glossary
Used abbreviations and technical terms are explained in the appendix.

Introduction
The Business Intelligence Lifecycle Diagram (hereafter referred to as the LifeCycle) provides an overview of the different aspects and major dependencies involved in implementing a sustainable business intelligence solution. Independent of the solution scope, and especially for small and easy-looking BI projects, the use of the LifeCycle is recommended at least as a checklist, to make sure all areas are considered.

Because a business intelligence solution should be understood as a multi-release process cycle, the different aspects vary in importance depending on the release level. This means not all aspects must always be covered in full detail, but you have to ensure that you don't run into unplanned efforts for future releases or, in the worst case, have to consider re-building the whole solution from scratch. Especially for an initial release, the phases of the BI initiative, the identification and specification of business requirements and the design and development of the reporting model are of key importance, even if your first release is just focused on quick wins or intends to serve a single business unit or business purpose.

The project time diagram on the left, expressed in standard work units, visualizes a typical relation of the different efforts across the release cycle. A standard work unit is a multiple of days, depending on the size of the project. Depending on the existing skills and experience of your internal resources, for an initial release it's advisable to involve project- and product-independent BI expertise in the form of proven BI consulting specialists.


Shortcuts
The rollout effort for an initial release can be significantly reduced (by 30-50%, as indicated by the yellow-colored areas in the project time diagram on the previous page) in case there is only one data source and no need for a data warehouse with corresponding ETL processes. Further cutbacks in the specification phase and in the design and implementation of the reporting model usually result in higher development efforts for the initial analytical application and all its future enhancement cycles, because missing functional features of the reporting model have to be covered by specific functionalities of the analytical application (e.g., date series, date aggregates).

Potential Pitfalls
The most common pitfalls causing poor use and a low level of acceptance of the deployed business intelligence solution, up to complete project failures, are:

- Prejudiced opinion of the BI initiative that a successful BI implementation is solely a question of good product selection
- Missing definition of project sponsorship and steering committee
- Insufficient or excessive decision-making involvement of project sponsorship and steering committee
- Inadequate (unrealistic) project scope setting
- Inadequate project cost/time estimates
- Missing or insufficient early warning system (issue tracking) and project progress monitoring
- Inadequate handling of moving-target situations (change of project goals during project progress)
- Insufficient skill set, project understanding or goal-focused commitment of project team members
- No or inadequate internal communication towards all potential recipients
- Missing or poorly maintained library of corporate terminology
- Missing or poor execution of gathering and specifying the business requirements
- Missing identification and addressing of existing compliance requirements
- Insufficient business-context-related BI expertise of project management or design & development partner
- Insufficient reflection of the BI vision in reporting model and analytical application
- Poor or even partially invalid source mapping definitions
- Missing definition, communication and deployment of business reporting processes
- Missing or poor execution of the final assurance prior to platform deployment
- Missing or insufficient involvement of IT operations
- Missing or inadequate end user training
- Missing or poor quality assurance process
- Platform doesn't meet the required data timeliness
- Platform doesn't meet the specified query and processing performance

Despite or because of all known potential pitfalls an initial release of a business intelligence solution can become a success story!


1 Initiative
The objectives of this phase are to build awareness of the potential of business intelligence solutions in the context of one's own corporation and to form the organizational key drivers that help to stay on track towards a successful business intelligence implementation. With every major change in corporate strategy, or change of business fields and their critical success factors, or at least every 3-5 years, the established BI initiative should be reviewed and revised regarding its organizational setup and its mid-term vision.

Sponsorship: The project sponsor should preferably be the most senior person with widely accepted leadership who is also a direct or indirect user of the analytical business reporting results. Usually it's a non-IT person from top management (CEO, COO, CFO) or business-line middle management (Controlling, HR, Sales, etc.).

Steering Committee: A well-constituted committee (3-7 people) ensures that all potential interests in building, operating and using a business intelligence platform are represented. Together with the sponsor, the committee forms the backbone of the project management. The committee reviews the project progress on a regular basis (weekly/monthly, milestones) and participates in and supports the project manager and sponsor in decision-making questions. Depending on the project scope, the committee may be extended during the phases of planning and business requirement definition. Particularly in the phase of model and application detail specification, and in case of moving-target situations (change of project goals during project progress), the committee's feedback is a crucial indicator for the future solution's acceptance and project success.

Project Manager: The project manager should be a person with proven project management experience and strong skills in communication, people and conflict management. The project manager may be assisted by an independent BI specialist to cover the necessary BI background expertise.

Vision: The envisioning task regarding the possibilities of a business intelligence solution helps to explore all potential fields that may be involved in a long-term, future business intelligence solution which empowers users to create and analyze reports on demand themselves. This vision shall be created independent of cost/benefit considerations and other existing or potential constraints. Typical dimensions of such a vision are:

Target Audience: Key Customers / Suppliers; Bank / Investors / M&A; Auditing & Compliance; Chairman/Board; Top Mgmt (CEO, CFO, etc.); Middle Mgmt (Sales, HR, Product Mgmt, Production, etc.); Operational Monitoring and Business Applications; etc.
Biz Area:         Finance; HR; Sales; Production; Inventory; others
Content:          Dashboard/Cockpit; Standard Reports; Information Lookup; Ad-hoc Reporting & Analysis; Recons. Customer Base; others
Granularity:      By Month/Day/Hour; By Customer; By Product; By Cost Account; By Shopping-Basket; others

A major advantage of business intelligence solutions is to provide a source-independent, unified view (single version of truth) on aggregated quantitative data with excellent data access performance and nearly unlimited user interactivity. Non-quantitative data and interactive invocation of the source systems can be embedded in a BI platform to a certain extent, whereas detail reports (fact sheets) on non-aggregated transaction data (e.g., invoices) and lookups of item details bound to specific visualization/layout rules or item-specific value calculations (e.g., customers, products, accounts) are preferably created and retrieved through their source application. There is no strict line between what kind of information should be retrieved through a business intelligence platform and what through its source applications, as the BI platform usually hosts the same level of detail (granularity) as the sources. The resulting blurry borderline is defined by the cost-effectiveness of re-creating application logic of the source application on the business intelligence platform, whose primary purpose is to create reports on aggregated data.

Multi-lingual Support: It's possible to create a multi-lingual business intelligence platform. But for reasons of cost-effectiveness and reduced implementation, deployment, documentation and maintenance costs, it's recommended to stay with one primary language, as multi-lingual support adds about 30% to the overall total costs. Multi-lingual solutions keep the technical implementation in the primary corporate language, whereas the analytical application applies different alternative translation schemas (language and/or business terminology) for all visible reporting model metadata. Additionally, for all visible report aspect items (product catalog, geographical names, chart of accounts, etc.) the data sources must provide item-specific translations. Alternatively, a limited multi-lingual approach might be appropriate, where translations are limited to the visible metadata in the analytical application and related documents (incl. online help).


BI as Evolution: All involved people (project, source data providers, end users) must understand the cyclic nature of a business intelligence solution and know the importance of building a solid foundation for the successful and sustainable use of future solution releases. Furthermore, they should appreciate that a BI platform may evolve with future business needs, while recognizing that no single release can address all needs from the beginning.

Business Case 4 BI: Based on the corporate vision and a rough estimate of costs, potential savings and other non-financial benefits (strategic, tactical, operational), a 2-5 page business case must be established to verify the potential cost-effectiveness of a drafted mid-term (3-5 year) solution scenario. The following indicators may be used to create a break-even estimate for an envisioned solution scenario:

- Companies with more than 100 employees
- Companies with more than 20 report explorers and analysts (controlling, HR, sales)
- Companies with more than 15-20 subunits/branches
- Companies with more than 1000-4000 products, customers, and/or suppliers
- Companies with high transaction data volumes (more than 300,000-500,000 transactions per year)
- Companies that have to integrate data from more than 3-5 different sources
- Companies with existing business application platforms (ERP/CRM, etc.) with no or unsatisfactory reporting functionality, where initial implementation costs (license/customizing) are above 150-250 K EUR
- Companies where the monthly time spent to prepare standard reports (data collection, data consolidation and reconciliation, report creation) exceeds 15 work-hours. (This time does not include the time used to analyze and interpret the reporting results.)
- Companies with strong demand for analytical reporting, complex financial consolidation or detailed cost/profit analysis, that have many different standard reports, have frequently changing reporting needs, or ask for interactive ad-hoc reporting and analysis
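The indicator list above can be turned into a simple screening sketch. The thresholds come from the text; the scoring rule (several indicators met suggests a business case is worth drafting) and all field names are hypothetical simplifications, not part of the original method.

```python
# Hypothetical screening against the break-even indicators listed above.
# Thresholds follow the text; the min_hits rule is an assumed simplification.

def breakeven_indicators(profile):
    """Return which break-even indicators a company profile meets."""
    return {
        "employees": profile.get("employees", 0) > 100,
        "explorers_analysts": profile.get("explorers_analysts", 0) > 20,
        "subunits": profile.get("subunits", 0) > 15,
        "items": profile.get("products_customers_suppliers", 0) > 1000,
        "transactions_per_year": profile.get("transactions_per_year", 0) > 300_000,
        "data_sources": profile.get("data_sources", 0) > 3,
        "monthly_report_hours": profile.get("monthly_report_hours", 0) > 15,
    }


def breakeven_likely(profile, min_hits=3):
    """Assumed rule of thumb: several indicators met -> draft a business case."""
    return sum(breakeven_indicators(profile).values()) >= min_hits
```

For example, a company with 250 employees, 5 data sources and 20 monthly report-preparation hours meets three indicators and would pass this (assumed) screen.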

Quick Wins: Identify the most wanted and urgent business-related reporting and other needs that may be addressed by the targeted business intelligence solution already in an initial release (low-hanging fruit).


2 Planning
Scope: The sponsor, project manager and steering committee define the scope (audience, business area, content, and time frame) for an initial release based on the previously established BI vision and identified quick wins. The defined scope for a platform release should be achievable within the next 6-12 months. In the context of the defined scope, the steering committee may be temporarily extended by context-related members. In case of a new or adjusted BI vision, the subsequent phase of identifying the high-level business requirements has to cover a mid-term range for all aspects addressed by the BI vision.

Project Team: The design, development and deployment of a business intelligence solution is an interdisciplinary challenge. Even more than for the constitution of the steering committee, it is important that all team members have, next to a global understanding of the LifeCycle, a strong specific conceptual and technical skill set regarding the related components. But they must understand themselves as team members, always aiming and contributing to achieve the overall project goals. An existing member of the steering committee cannot be a member of the project team and vice-versa. Specialists not having the required soft skills (language, culture, tolerance, and acceptance) should not be members of the core team, but may get involved to perform very specific tasks upon request. For the initial implementation of a sustainable business intelligence solution, the core project team (5-10 members) often implies the involvement of external people from different companies, who bring in the required in-depth expertise. For development and deployment tasks, the members of the core project team may be supported by further assistant personnel (5-20 assistants), who are not members of the core team.

The project core team has to cover the following roles/skill sets:
a) Best-practice business reporting, including KPIs (key performance indicators) and corporate reporting and escalation processes
b) Business terminology and naming rules, maintaining an object metadata repository
c) Assurance of data quality and legal compliance, maintenance of data access security
d) BI platform operations (incl. data processing, backup/recovery)
e) Deployment (rollout, training, application support)
f) SQL data warehouse environments and ETL processes
g) OLAP database environments and enhanced cube design
h) Design & development of analytical applications
i) Source system data architecture
j) Master data management

In a typical setup, the areas a) to e) are usually staffed with internal people, whereas for an initial release the remaining areas are assigned to external experts, who join the team at a later stage. They are assigned as a result of the evaluation of the design and implementation partners. For later release cycles, the involvement of external specialists may be significantly reduced, as the initial release should also provide an initial knowledge transfer and documentation to operate and extend the platform with minimal or no external support.


Volume Assessment: The assessment focuses on the involved data sources with their data volumes and on the target standard reports. Each volume assessment (for data sources and standard reports) should be done in two variants: the first relates to the current project scope, the second to the potential context of the fully outlined BI vision. The assessment results are an indicator of the mid-term size and complexity of the solution and help to verify the identified budget costs and savings.

Sample data source assessment:

SourceID: NAV
  Description:   Navision ERP Vx.0 for production and trade
  Contact/Owner: Mr. Sample
  Access Path:   LAN: Server Instance: NAV\SQL2005
  Time Zone:     GMT+1
  Access Format: Microsoft SQL Server 2005
  Data Areas (current volume / annual growth / modifications):
    Transactions:  3.0 Mio. / 0.5 Mio. / 5%
    Products:      4000 / 100 / 30%
    Customers:     1200 / 300 / 20%

SourceID: ABACUS
  Description:   Abacus ERP 2008 for HR
  Contact/Owner: Ms. Nice
  Access Path:   LAN: ABAC_HR
  Time Zone:     GMT+0
  Access Format: Pervasive SQL
  Data Areas (current volume / annual growth / modifications):
    Time-Trx:      100000 / 350 / 5%
    Recruitm.-Trx: 1500 / 100 / 5%
    Units:         50 / 5 / 10%
    Persons:       400 / 20 / 30%

SourceID: AVALOQ
  Description:   Avaloq Banking System
  Contact/Owner: Ms. Moneypenny
  Access Path:   LAN (VPN): AVLQ
  Time Zone:     UTC
  Access Format: Oracle 10g
  Data Areas (current volume / annual growth / modifications):
    Transactions:  5.0 Mio. / 1 Mio. / 5%
    Products:      10000 / 500 / 15%
    Customers:     300 / 30 / 20%

SourceID: Budget
  Description:   Budget for Finance and Cost Profit
  Contact/Owner: Mr. Account
  Access Path:   LAN: \\GLOBAL\Finance\Budget
  Time Zone:     GMT-1
  Access Format: MS Excel 2008
  Data Areas (current volume / annual growth / modifications):
    Transactions:  5000 / 5000 / 25%
    Accounts:      800 / 50 / 10%

SourceID: FX_Rates
  Description:   Foreign Exchange Rates
  Contact/Owner: Mr. Contact; ContractID: Oanda_xy
  Access Path:   FTP: //ftp.oanda.com
  Time Zone:     UTC
  Access Format: CSV File
  Data Areas (current volume / annual growth / modifications):
    Rates:         160000 / 80000 / 25%
    Currencies:    250 / 2 / 3%

Explanatory notes:
- Volume Assessment for Data Sources covers internal & external sources: ERP, CRM; budgeting/planning tools; market quotes (foreign exchange, stock quotes); market polls/surveys; phone/address directories, zip codes; item-specific background information (GTIN/EAN product databases, company registrations, credit ratings).
- SourceID: assigned for easier reference.
- Description: information about application, version, content and primary usage.
- Contact, Owner: contact information (technical, responsibility, data ownership) and contract reference details for external sources.
- Access Path: information about from which origin and how the data can be retrieved.
- Time Zone: data source related local time zone; an indication of potentially required time zone / daylight saving adjustments during the ETL process.
- Access Format: database/file format and version. Note: purely paper-based sources should also be listed if they contain information that is used to create reports.
- Data Area: type of transactional data and primary, large context dimensions.
- Current Volume: today's data volume (incl. historical data) for each data area.
- Annual Growth: annual net growth for each data area.
- Modifications: annual modifications applied, as a percentage; multiple annual changes add up.
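The assessment columns above (current volume, annual growth, modification rate) can be combined into a quick sizing sketch. The arithmetic below is an illustrative assumption (linear growth); function and field names are not from the original document.

```python
# Hedged sketch: project a data area's size over a planning horizon
# from the volume assessment columns, assuming linear annual growth.

def project_volume(current, annual_growth, years):
    """Net row count after `years` of linear annual growth."""
    return current + annual_growth * years


def annual_changed_rows(volume, modification_rate):
    """Rows touched per year. Multiple annual changes add up, so the
    rate may exceed 100% of the stored volume."""
    return volume * modification_rate
```

Using the NAV sample above: 3.0 Mio. transactions growing by 0.5 Mio. per year reach 4.5 Mio. after three years, with 5% (150,000 rows) modified annually at today's volume.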


Sample target report assessment:

ReportID: Board
  Comment:            Management Dashboard
  Purpose, Frequency: Monthly
  Content, Specials:  KPI, key figures, date series charts
  Filtering:          month, report currency and unit
  SampleRef:          DashB_2009.xls
  User Interactivity: Reader: Unit Mgmt (5 users); Explorer: Unit Finance (5 users); Analyst: Board Assistant (2 users)

ReportID: P&L
  Comment:            Profit & Loss Statement
  Purpose, Frequency: internal financial accounting; Monthly
  Content, Specials:  P&L accounts; Columns: YTD budget + latest forecast
  Filtering:          month, report currency and unit
  SampleRef:          PL_200907.xls
  User Interactivity: Reader: Unit Mgmt (5 users); Explorer: Unit Finance (10 users); Analyst: Controlling (5 users)

ReportID: TS
  Comment:            Timesheet Analysis
  Purpose, Frequency: Monthly
  Content, Specials:  Time Categories with annual overview
  Filtering:          year, unit, person
  SampleRef:          TS_2009.doc
  User Interactivity: Reader: Unit Mgmt (5 users); Explorer: Unit Finance (10 users); Analyst: HR Dept. (3 users)

ReportID: CL
  Comment:            Customer Analysis / Customer Profitability
  Purpose, Frequency: Quarterly
  Content, Specials:  Customer segment hierarchy
  Filtering:          quarter, account manager
  SampleRef:          CL2009Q3.xls
  User Interactivity: Reader: none; Explorer: Account Mgr (30 users); Analyst: Sales Dept. (5 users)

Summary: TOTAL different reports; MIN FREQUENCY: monthly; DISTINCT USERS: 40 / 15

Explanatory notes:
- Volume Assessment for Target Reports: includes all existing data-summarizing reports. Fact detail sheets showing item (account/customer/product) or single transaction details may be listed as well, but due to the nature of their data-source-specific content (application and visualization logic), it is better to retrieve such reports from the source system itself.
- ReportID: assigned for easier reference.
- Comment, Specials: textual comment plus special value calculation (currency impact calculation, intra-group elimination, minority handling, etc.) and visualization (layout, charting) requirements.
- Purpose, Frequency: information about the primary purpose and use of the report and the required frequency of creating updated reports. Daily updates are encouraged as the lowest update frequency. Theoretically it's possible to provide near real-time reporting, but this usually causes extra costs (ETL and hardware) and moreover creates confusion for the users, as the data they are looking up may change while they interact with the reports.
- Content: brief description of the major report dimensions (rows, columns, values) and calculated items (key ratios, percentages, etc.).
- Filtering: list of known filtering/display options the user can choose from. Typical common filtering aspects are: date, report currency, scale of currency, organizational unit, cost/profit center.
- SampleRef: to give a more detailed impression of the target report, it's recommended that reports are illustrated with some fictive sample reports, which are stored in a separate document referenced by the sample reference column.
- User Interactivity: identifies the target user audience by organizational unit, type of interactivity (see LifeCycle User Interactivity) and number of users.

The complementary number of total different reports, the minimal update frequency and the distinct user counts can be used to verify/adjust the related budget positions.


Savings: All major potential savings related to the defined scope are identified, quantified and weighted according to their scenario probability (worst, expected, best): work-hours to prepare reports, improved bargaining power towards key suppliers and customers, ability to act in time due to efficient information availability and early detection of changes, reduced IT costs for report programming and maintenance through end user empowerment (user interactivity), etc. Identified non-financial benefits should be listed as well. Such benefits indirectly support the achievement of the financial goals: improved bonus targeting, decreased wait time to get access to up-to-date reports, improved data quality, strengthened business connections to key customers and suppliers, tightened data access security, etc.

Budget: Establish a multi-scenario (worst, expected, best) budget estimate at full cost: internal & external resources (time/cost), license fees, extra working hours, hardware investments, legal costs, etc. Add a surplus of 30-60% to the estimated budget for exchange rate and product price changes, other unknown extra expenses and moving-target situations. The break-even of savings against budget usually cannot be achieved with the first platform release, due to the initial investments (design, development, licenses) in the base platform, but should be achieved with follow-up releases within 2-3 years after the initial BI initiative.
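The multi-scenario estimate with the recommended 30-60% surplus is simple arithmetic and can be sketched as follows. The cost categories and figures are hypothetical examples, not budgeting guidance.

```python
# Illustrative sketch of the multi-scenario budget estimate with the
# recommended 30-60% surplus. Category names are hypothetical examples.

def budget_with_surplus(base_costs, surplus_rate):
    """Total budget for one scenario: sum of category estimates plus surplus.

    surplus_rate: recommended range 0.3-0.6 (30-60%).
    """
    subtotal = sum(base_costs.values())
    return subtotal * (1 + surplus_rate)


def scenario_budgets(scenarios, surplus_rate=0.5):
    """Apply the surplus to each scenario (worst / expected / best)."""
    return {name: budget_with_surplus(costs, surplus_rate)
            for name, costs in scenarios.items()}
```

For example, an "expected" scenario of 100 K EUR licenses plus 200 K EUR external resources with a 50% surplus yields a 450 K EUR budget position.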

Communication: Define the recipients, frequency and channel of distribution and its minimal structural content. The communication informs about the project status (in relation to time/cost and level of progress), recent/current activities and the next major steps. Well-structured communication on a regular basis (monthly, quarterly) is especially useful to keep people informed who are not directly involved in the current project activities. The communication also lays the groundwork for a later smooth deployment and usually results in a higher level of user acceptance and subsequent use of the business intelligence solution.

Milestones / Reviews: Prior to establishing a detailed project plan, milestones should be set for at least each project-critical step: specification of business requirements, reporting model design, analytical application design, development work results and the achievement of the project goals (final review and assurance) prior to deployment. The setting of the milestones is agreed by the sponsor, project manager and steering committee. All milestone-related deliverables are reviewed and revised as necessary. The project management and steering committee specify details about how to enter the next phase, which may include task-specific details (modifications) and other constraints. The sponsor is informed about the milestone achievement and how the next phase will be entered; the sponsor retains a right of veto, but not the right to change the direction (goal setting) of the project himself.


3 Project Management
The project manager and the steering committee create the initial detailed multi-phase project plan. If required, the project manager may be assisted by an independent BI specialist to cover the necessary BI background expertise. The resulting project plan provides time-related information about phases, tasks/reviews, milestones and assigned resources and costs (task-related budget) by functional roles (including license and other fees). Make sure that you also allocate 20-40% extra calendar time ahead of each milestone. Wherever applicable, the roles/tasks are assigned to the existing project team members. This draft version must be refined as the final task of the business requirement specifications. The remaining role assignments to external people will be done later, after the evaluation and job assignment of external design and development partners. In this phase it's also recommended to re-verify/confirm and adjust the existing role/member assignments.

The project manager is responsible for tracking the fulfillment (deliverables) of the tasks regarding time, working hours and running costs on a weekly basis and for summarizing a progress/status report for the project sponsor and the steering committee (weekly/monthly). This summary, in the same or a slightly simplified form, is also an integral part of the regular information statements for all potential stakeholders (internal communication). Additionally, it's recommended to introduce an early warning system in the form of ongoing issue tracking, which lists and monitors all arising (unexpected) issues, even if the related running tasks are still within cost and time. Whenever the monitored work progress on the critical path gets behind plan or the running costs exceed the related budget, the project manager must develop corrective measures and/or adjustments of the project plan together with the steering committee, which must be signed off by the sponsor before the suggested actions can be put into place.


4 Specification
A detailed specification of the business requirements is key to a sustainable and well-accepted business intelligence solution. In case of an initial release, where it's important to get the big picture, all future potential requirements must also be collected and summarized, at least at a draft, best-estimate level.

The required data is collected and summarized through standardized qualitative interviews. The structure and content of the specific interviews and the identification of the target interviewees is prepared by the project team and reviewed by the steering committee. Whenever possible, the same interview should be addressed to 2-3 interviewees. The interviews are conducted with the selected interviewees (users, legal department, IT operations, etc.) one-by-one or in a single group. The interviewer teams are made up of one content-qualified member of the project team and one adequate member of the steering committee. The consolidated findings are summarized by the project team and formally reviewed by the steering committee. An abstract version of the findings is signed off by the sponsor and communicated through the regular information statement.

Whenever some requirement details of the current project scope cannot be transformed into corresponding deliverables, the related issue must be raised by the project manager and resolved in the steering committee. The suggested measures (adjustment of requirement details and/or project plan) must be signed off by the project sponsor and communicated through the regular information statements.

The following areas are the primary information aspects that should be covered by the business requirement specifications.

Corporate Business Model: An overall business process model describes all main current and planned standard business processes with their actors and other involved resources. In combination with known general best-practice processes, this especially helps the design and development teams to create a reporting model that is extensible and flexible enough to cover potential future platform releases.


Report (Template) and KPI Specification: Similar to the assessment in the planning phase, all identified reports are documented, but specified in much more detail, and each report must be illustrated with some sample reports. Understanding the underlying business goals of specific reports is crucial for a successful translation of the specifications into design (reporting model, analytical application). In addition to the known aspects of the report assessment in the planning phase, the specification must cover the following areas:
- Purpose in depth: identification of the underlying business goals.
- Component and visualization details (pivot table and chart items): row, column and filtering items of visible (pivot) tables and/or hidden tables sourcing specific charts.
- Complementary background information for specific row/column/filtering items, like address, birthday, social security number, etc. for person items, or company registration id, website, stock listing for organizational items (corporate, supplier, etc.). These attributes may be used for special filtering, interactive functionality (see below) or to create item list reports.
- Special report-wide calculation requirements, like support of multi-currency, multi-scenario, multi-rate-types; value scales; date series; financial consolidation (intra-group elimination / minority handling), etc.
- Special calculated report item values, like percentages, key figures, forecasting calculations and statistical computations.
- Special item naming (terminology) of row, column or filter items or calculations.
- Special interactive functionality. Most professional analytical application platforms provide the following interactivity options by default: expand/drilldown, value filtering, sorting, and hiding of empty report rows/columns. Further typical additional requirements are:
  - invocation of the source business application (ERP/CRM), other applications (document management system, semantic web), information services (internet) or office applications (email, etc.)
  - reporting model related context-sensitive online help
  - interactive lookup of background information for visible report items
- Special data quality assurance requirements (if required).
- Report-specific query performance requirements (maximum and average in seconds).
- Report-specific user data access restrictions (if required).
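To make the "special calculated report item values" above concrete, here is a minimal sketch of how percentages and key figures are derived from base measures during report specification. All measure names and figures are illustrative placeholders, not taken from any real reporting model.

```python
# Hypothetical base measures of a report; values are placeholders.
base = {"Revenue": 1200.0, "COGS": 900.0, "Prior Year Revenue": 1000.0}

# Calculated report items: absolute key figure plus two percentages.
calculated = {
    "Gross Margin": base["Revenue"] - base["COGS"],
    "Gross Margin %": (base["Revenue"] - base["COGS"]) / base["Revenue"] * 100,
    "Revenue Growth %": (base["Revenue"] / base["Prior Year Revenue"] - 1) * 100,
}
print({k: round(v, 1) for k, v in calculated.items()})
```

In a real solution such definitions typically live as calculated members in the OLAP layer rather than in application code, so all reports share one definition.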

In addition to the primarily business-line-driven report specifications, report requirements for complementary statistics may be specified regarding the platform updating process/status, data quality reports and end user platform usage.

The resulting reporting specifications should be checked against the preliminarily assessed data sources. In cases where reports require the integration of additional data sources, consider postponing the specific reports to a later release, or refine their specifications so they can be covered by the available sources. We strongly discourage adding more data sources at this phase, as it adds complexity, time and cost efforts to the current release which are not reflected or covered by the existing budget and time plan.

For improved ease of use and collaboration, it's encouraged to align the identified reports with a collection of report templates regarding the major visualization aspects. The templates also incorporate the same set of interactive functionality and follow common design principles (typefaces/fonts, coloring, and page layout settings as agreed or defined by corporate design guidelines). The resulting report and template specifications, which might be partially adjusted during the design of the analytical applications and their standard reports, are used to review the final results and serve as a basis for creating the technical and user-related report/template documentation.


Data Timeliness: Based on the required frequency for having updated reports, the overall minimal update frequency for report items (products, customers, accounts, etc.) and related value information (transactional data) can be derived. Theoretically it's possible to build sophisticated ETL solutions where, with each load process, just the absolute minimum of required information is extracted from the sources. From the point of view of ease of monitoring and maintaining load processes, we recommend always extracting all dimensional data (= full extract of current, changed and historical data of all report items like products, customers, etc.) and restricting the extraction of fact data to new and changed data (= incremental extract of transactional data, with reduced load times).
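The recommended split (full extract for dimensional data, incremental watermark-based extract for fact data) can be sketched as follows, using SQLite purely for illustration. The table names, columns and the `changed_at` watermark column are assumptions, not tied to any specific source system.

```python
import sqlite3

# Simulated source system with one dimension table and one fact table.
src = sqlite3.connect(":memory:")
src.executescript("""
CREATE TABLE product(id INTEGER, name TEXT);
CREATE TABLE sales(id INTEGER, product_id INTEGER, amount REAL, changed_at TEXT);
INSERT INTO product VALUES (1,'A'),(2,'B');
INSERT INTO sales VALUES
 (1,1,10.0,'2009-08-01'),(2,2,20.0,'2009-08-05'),(3,1,30.0,'2009-08-10');
""")

last_watermark = "2009-08-03"   # highest changed_at loaded by the previous run

# Full extract of all dimensional data (current, changed and historical items).
dimensions = src.execute("SELECT id, name FROM product").fetchall()

# Incremental extract: only fact rows new or changed since the watermark.
facts = src.execute(
    "SELECT id, product_id, amount, changed_at FROM sales WHERE changed_at > ?",
    (last_watermark,),
).fetchall()

print(len(dimensions), "dimension rows,", len(facts), "new/changed fact rows")
```

After a successful load, the watermark is advanced to the highest `changed_at` value seen, so the next run only picks up newer fact rows.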

Data Requirements: In the context of the specified reports, their update frequency and the available data sources, the following data-specific requirements regarding retrieval and storage should be answered in a way that complies with the mid-term BI vision. Whereas the current implementation release may not cover all requirements right away, the platform design and the selected products must guarantee that they can be met with future releases, without the need for a re-design and/or unforeseen product licenses.

- Data persistency in a dedicated storage area (data warehouse) provides a source-independent archive over time, even if the genuine source systems are replaced.
- The collecting process supports full and incremental data loads for better load performance.
- Preservation of historical structures (organizational structures, product hierarchies, etc.) might be required by business and/or compliance, to be able to reproduce date-related, historical reports at any time in the future based on their historical structures.
- Master data management, as a functional extension of a business intelligence platform, allows a fully or partially automated alignment of different data structures coming from different sources (i.e., product catalogs, charts of accounts, etc.) prior to their integration into the target reporting model. Master data management as such is not a business intelligence core component, but a prerequisite in case of a necessary merge/integration of different data structures coming from different sources. It is not required if the data for a target data structure comes from a single source only.
- Data quality assurance is usually covered by embedded rule-checking mechanisms in the ETL process (data load), which automatically generate and deliver quality-related notifications/reports. Quality assurance can optionally be extended through reporting model extensions providing item-related quality ratios, which can be used to generate additional in-depth quality reports. Quality rules usually check for invalid, incomplete, missing and outdated data, and moreover for wrong/inadequate data context (business process). If the quality aspects refer to a specific report only, they should be detailed within the report specifications; otherwise they can be listed as part of the data requirements.
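A minimal sketch of such embedded quality rules, checking records for missing, invalid and outdated data during the load. The record layout, field names and the one-year staleness threshold are hypothetical assumptions for illustration.

```python
from datetime import date

def check_record(rec, today):
    """Return the list of quality issues found for one staged record."""
    issues = []
    if not rec.get("customer_id"):
        issues.append("missing customer_id")
    if rec.get("amount") is not None and rec["amount"] < 0:
        issues.append("invalid negative amount")
    last = rec.get("last_updated")
    if last and (today - last).days > 365:
        issues.append("outdated (last update > 1 year)")
    return issues

# Two staged records: one clean, one violating all three rules.
records = [
    {"customer_id": "C1", "amount": 100.0, "last_updated": date(2009, 7, 1)},
    {"customer_id": None, "amount": -5.0, "last_updated": date(2007, 1, 1)},
]
report = {i: check_record(r, date(2009, 8, 18)) for i, r in enumerate(records)}
print(report)
```

In a productive ETL process, the aggregated findings would feed the automatic quality notifications/reports mentioned above.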


Performance: Good query performance for standard reports is crucial for user acceptance and report usability/interactivity. A stable, superior performance (especially on platforms with growing data volume and user audience) is primarily a question of good design and code development of the analytical application and the underlying reporting model. More powerful hardware may help to improve the performance additionally, but usually only if the overall BI solution can make use of the additional resources. In some cases, low-bandwidth network connections may cause bad query performance. This can be addressed through increased bandwidth, distributed copies of the query platform data and/or by giving the users access to the analytical applications through hosted remote sessions (Terminal Services, Citrix, etc.).

Next to the query performance, the background processing performance relates to the data updating processes of a business intelligence platform. Processing performance must stay within the required data timeliness (as defined by the business) and moreover leave enough room for other IT operations (backup and regular maintenance activities). Query performance may be improved by performing pre-calculations and building data aggregates during the preliminary data processing, but be aware that using such features increases the resulting processing load times.

Query performance specifications should at least differentiate between standard reports and ad-hoc queries. The required answering times (average and maximum times in seconds) should be set reasonably and reflect how often and how intensively the reports are used. For users using the platform during data processing times (working outside the regular office hours or accessing from different time zones), the (reduced) response time requirements should be defined separately, as such activities may interfere with the resource consumption of background load and maintenance processes. As the different reports can be of quite different complexity, it's advisable to specify required response times for each report separately. The maximum background processing time allowed (incl. allowed scheduling) must fit into the existing IT operations plan. Existing maintenance schedules and other constraints should be documented.
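Per-report response-time targets of this kind can be monitored with a simple check of measured query times against the agreed average/maximum values. Report names, targets and measurements below are illustrative placeholders.

```python
# report -> (average target, maximum target), both in seconds; hypothetical.
targets = {
    "P&L by cost center": (2.0, 5.0),
    "Customer churn ad-hoc": (10.0, 30.0),
}
# Measured query times in seconds, e.g. from the platform's query log.
measured = {
    "P&L by cost center": [1.2, 1.8, 4.1, 6.3],
    "Customer churn ad-hoc": [7.5, 9.0],
}

def evaluate(report):
    """Check measured times for one report against its SLA targets."""
    avg_target, max_target = targets[report]
    times = measured[report]
    avg = sum(times) / len(times)
    return {"avg_ok": avg <= avg_target, "max_ok": max(times) <= max_target}

for r in targets:
    print(r, evaluate(r))
```

Such a check, run regularly against the query log, gives early warning before users start complaining about report performance.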

System Availability and Outage: Depending on the user audience and their working times, the required system availability (weekdays, hours of day) for query access must be defined. Also, a maximum tolerated time (hours or minutes) for a system outage of the business intelligence platform should be fixed, as this implies platform-related measures (hardware, software, processes) to reduce the risks and/or support specific recovery times.

User Data Access Security: Next to general restrictions, report-related restrictions should be defined at report specification level. Due to the nature of the multidimensional reporting model, which is a core element of every business intelligence solution, the various report-specific requirements are later translated into reporting model related rules. This guarantees user-related data access control at the platform data level for all potential analytical applications, standard reports and ad-hoc queries alike. Wherever possible, data-related access rules should be extracted directly from existing data sources and integrated into the logic of the BI platform security. Although it's possible to add complex and tight security mechanisms, for the cost-effectiveness of an initial release and its future maintenance it's recommended that the really required restriction rules be evaluated carefully and kept as simple as possible.
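The idea of reporting model related access rules can be sketched as a mapping of users to allowed members per dimension, applied to every query result. User names, dimensions and members are hypothetical; real BI platforms enforce such rules inside the database/OLAP engine, not in application code.

```python
# Hypothetical access rules: allowed members per dimension; None = unrestricted.
access_rules = {
    "jdoe": {"Region": {"EMEA"}, "Account": None},
    "asmith": {"Region": {"EMEA", "APAC"}, "Account": {"Revenue"}},
}

def allowed(user, dimension, member):
    """True if the user may see this dimension member."""
    rule = access_rules[user].get(dimension)
    return rule is None or member in rule

# Filter a (Region, Account) result set through one user's rules.
rows = [("EMEA", "Revenue"), ("AMER", "Revenue"), ("APAC", "COGS")]
visible = [r for r in rows
           if allowed("asmith", "Region", r[0]) and allowed("asmith", "Account", r[1])]
print(visible)
```

Because the rules live at the reporting model level, the same filtering applies to standard reports and ad-hoc queries alike.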


Compliance: Depending on the usage purpose and context of the specified standard reports and ad-hoc queries, additional constraints may have a significant impact on the design of the reporting model, the platform technology, the ETL process and the analytical applications. Such constraints are usually given by external laws, regulations, reporting standards or other internal requirements (SOX, Basel II, IKS, MiFID, ISO, IFRS, data privacy, external auditing, etc.). All such constraints may add a high degree of extra complexity (plus costs and time) to a business intelligence solution. Therefore we encourage verifying such constraints carefully in the context of the real usage and purpose of the specified reports. If only a few reports are subject to such constraints, consider creating those reports in their genuine source applications; such reports are usually of a more static character and don't take advantage of the integrated interactivity of a business intelligence solution.

Process Specifications: As the business intelligence platform provides a unified business-related reporting model for all standard and ad-hoc reports, all related organizational processes should also be documented and provide information about the default process with its workflow tasks, actors and involved resources:
- Business reporting at strategic, tactical and operational level (incl. escalation and linkage to connected processes like planning and publishing)
- BI platform operations
- End user training & support
- Data quality assurance (incl. escalation and linkage to data providers)

Prototyping and Proof of Concept: Depending on the complexity of the resulting requirement specifications, it might be appropriate to ask the design and development team(s) to provide a prototype. Such a prototype is used to verify specific functional and conceptual topics and to review/adjust the requirement specifications and/or project scope depending on its outcome. In case of an initial release with external BI specialists involved, a prototype for a pre-agreed lump sum may also facilitate the evaluation process for design and development partners and/or a related product selection. The work results of the prototype can be taken into the design and development phase as draft conceptual input only; this explicitly excludes program code and similar documents.


5 Evaluation
5a Partner Evaluation
Based on the collected BI vision, the volume assessments and the specified business requirements, the project team must be completed. For an initial release, people covering the following topics must be recruited internally and/or externally:
a) SQL data warehouse environments and ETL processes
b) OLAP database environments and enhanced cube design
c) Design & development of analytical applications
d) Source system data architecture
e) Master data management (if required)

As mentioned earlier, it's important that the new members are real team players and fit well into the overall team. Non-team-conforming specialists may be called in temporarily to cover specific and well-defined tasks. Depending on the overall skill set covered by the new team formation and the individual performance of the existing team members, it may be appropriate to replace some of the existing team members. The member and/or partner evaluation should focus on the area-specific in-depth knowledge and experience in conceptual design, technical development and implementation, in combination with the product-related expertise. Due to the wide field of required expertise, a combination of several partners and products often provides the best results. The following selection criteria should be considered as well:
- Existence of area-specific solution templates (reporting model, ETL workflows, analytical application, data source model mapping), or potential team members have proven experience by reference to other similar productive solutions. Note: good, quick-fitting solution templates help to reduce times and costs for design, development and implementation.
- Ask for details about their partner network, giving them quick access to peer specialists on demand.
- Make sure that a potential contract can be cancelled within a short notice time if deliverables don't meet the agreed specifications by content and/or quality, or are not delivered in time.

For negotiating prices, ensuring continuity and further contract details, you may also consider:
- The overall project goals and deliverables must be specified. Make sure the deliverables align with existing project phases and milestones.
- Ask for deliverable-related fixed prices (supporting different (budget) scenarios, incl. expenses) and their underlying calculation considerations, estimated work times and personnel availability. Note: realistic fixed-rate estimates can only be made on the basis of well-documented detailed specifications from the preceding phases, related to the different budget scenarios. Alternatively, ask for variable rates and deliverable-related target time/cost estimates.
- Include bonus/malus payments on previously agreed target time/cost estimates in the contract provisions. This enables you to split exceeding costs or savings.
- Define a predefined escalation process for potential moving-target situations, so existing contract task agreements and the project plan (incl. budget) can be reviewed and adjusted in time.
- A (time-)limited warranty on deliverables and invalid application behavior applies, which includes corrective measures in time, but excludes any liabilities regarding data loss or other damages.
- The regulation of the intellectual property and program code of the solution must ensure that the company either owns the code, or the code provider fixes bugs and adds minor extensions within a reasonable pre-defined time at predefined rates (if not covered by the warranty).


5b Product Selection
Depending on the BI vision and current project scope, some or all of the following architectural components are necessary to form a business intelligence solution addressing all the specified requirements. BI products as such must be understood as pure solution-enabling technologies, which provide specific functionalities in the context of the architectural components of a BI platform. Details about the functions covered by the different architectural components can be found in the next chapter.

The key success factors of a sustainable business intelligence solution are:
- Corporate business intelligence vision
- Business requirement specifications
- Business intelligence expertise (design & development)

Unless there is a strong strategic and sustainable case for a specific technology vendor which also fully complies with the mid-term BI vision, we urge you to focus on well-established non-proprietary products with open, documented data access interfaces supporting database query and communication industry standards like:
- SQL for DWH databases
- MDX for OLAP databases
- ODBC/OLEDB and XMLA for database communication

This:
- allows the (re-)combination of products from almost any vendor
- ensures access to a vast community of specialists and product in-depth knowledge
- minimizes product-related dependency on the design and development partner

Further general selection criteria are:
- scalability and high performance without volume degradation
- system stability, high availability and fast recovery
- integrated user data access security
- multi-lingual support (if required)
- manufacturer product support
- product market share as an indicator of high product acceptance by market and end users, lower adjustment costs for new employees already familiar with the product, and easier recruitment of IT staff having existing product knowledge
- pricing (license and annual maintenance fees by: platform, modules, user volumes by user profile (reader, explorer, analyst), server CPUs and/or data volume)
- continuity of core product development
- product is already in use or already covered by an existing license within the company, or there exist other internal restrictions regarding product/manufacturer selection
- product-related expertise of the selected design & implementation partner
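One common way to make such a multi-criteria selection transparent is a weighted scoring matrix. The sketch below is illustrative only: criteria, weights, candidate names and scores are placeholders, not recommendations.

```python
# Hypothetical weights (must sum to 1.0) over a subset of the criteria above.
weights = {"open_standards": 0.3, "scalability": 0.25, "market_share": 0.15,
           "pricing": 0.2, "partner_expertise": 0.1}

# Hypothetical candidate scores on a 1-5 scale.
candidates = {
    "Product A": {"open_standards": 5, "scalability": 4, "market_share": 5,
                  "pricing": 3, "partner_expertise": 4},
    "Product B": {"open_standards": 3, "scalability": 5, "market_share": 3,
                  "pricing": 4, "partner_expertise": 5},
}

def score(candidate):
    """Weighted total score for one candidate product."""
    return sum(weights[k] * v for k, v in candidates[candidate].items())

ranking = sorted(candidates, key=score, reverse=True)
print({c: round(score(c), 2) for c in ranking})
```

The value of the exercise lies less in the final number than in forcing the team to agree on weights before looking at vendors.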

Related to the different architectural components additional criteria may apply:

Server
- Server hardware: scalable hardware (CPU, memory, disk array). Depending on the workload, the DWH and OLAP databases may be run on their own dedicated servers. In case of very high volume query loads, parallel access to multiple copies of OLAP databases may be required.
- Server OS: a well-performing operating system supporting scalable high-end server hardware (multiple CPUs, 64 bit, SAN, fiber optic); supported by the SQL and OLAP engines.

BI Platform (Software Components)
- DWH database engine: server-scalable core; support of SQL object/query standards and other de-facto features like indexes, triggers, stored procedures, user-defined functions; integrated user data access security.
- ETL workflow engine: supports parallel object processing on the DWH database and provides standard interfaces to business application source data and to relational and non-relational data formats of other vendors (incl. CSV/XML files).
- OLAP database engine: server-scalable core; well performing, reads data from any SQL sources and supports MDX object/query and XMLA communication standards, with integrated user data access security.
- Analytical application(s): <details see below>
- Other products and add-ons: pre-defined business-related templates for reporting models and analytical applications, and value-adding product add-ons for selected DWH and OLAP products, may significantly help to reduce times and costs for an initial release. Depending on the data integration requirements, dedicated master data management solutions may facilitate the development of related integration activities.

Network and Other Components
- Network: low-bandwidth networks may require that analytical applications can only be used as web applications and/or are made accessible through remote application sessions, hosted on a Citrix or Terminal Services server.
- Client hardware: the appropriate configuration of the client hardware depends on the network and the usage of the analytical application (web versus desktop).
- Other solution-related components: data source system, web server used by the analytical application(s), corporate file share for report sharing and/or access to reporting templates; internet, office applications, source system and document management system called by clients through invoking functionality of the analytical application, etc.


Explanatory notes regarding the evaluation of analytical applications: There exist only a few established manufacturers of DWH and OLAP database engines and/or ETL workflow engines, but the area of analytical application platforms is a fast-growing market with a large variety of manufacturers and vendors. Remember, though: although the analytical application is the most visible part of the overall BI platform to the end user, the business content and data structures of the reports come from the underlying DWH and OLAP sources. In combination with a unified reporting model, these, and not the analytical application as such, are the elementary sources which support easy interactive use and fast data query performance. A BI initiative just focusing on a nice-looking dashboard/cockpit application is like buying an empty skyscraper with a nicely furnished top floor, but with no elevator and no electricity, water or heating, standing on a weak foundation. The different user profiles (reader, explorer and analyst) have different needs towards an analytical application. A carefully selected mix of alternative products accessing the same data source (DWH and/or OLAP) often provides the best result regarding the user-specific application skills and the application-related license fees. Depending on the specific user profile, the following sample criteria (with focus on Analysis Services OLAP databases) may apply:
- Support of OLAP (MDX/XMLA) and DWH (SQL/OLEDB/ODBC) sources (for numeric and text data)
- Centralized definition of corporate visualization templates for analysis, printing and publishing (content/object placement, fonts, colors, logos)
- Repository-based multi-lingual / business terminology translation of reporting model metadata
- The existing metadata model, incl. description properties, is fully accessible to end users (cubes, measures, dimensions, attributes, hierarchies, levels)
- Running queries may be cancelled
- Existing reports can still be opened gracefully, also in case of changes in the reporting model
- Export to Excel and PDF
- Support for multi-scenario data entry with value distribution support for budgeting and planning
- Support of cube- and application-based calculated members and dynamic named sets, whereas the analytical application provides its own centralized library to access and administrate the specific definitions
- Support of the StrToMember, StrToSet and StrToValue MDX functions with specific solve order / cube scope
- Support of cube-based actions: rowset, dataset, HTML, URL, drillthrough
- Support of customizable, reporting model sensitive context help
- Support of cube cell level properties (colors, value formatting)
- Dimension member properties can be made visible
- Support of enhanced visualizations (waterfall graphs, sparklines/sparkbars)
- Display of inverted dimension hierarchies (parent member stays at the bottom of the expanded subtree)
- Collaboration support for adding/sharing report, dimension, and cell level annotations
- Collaboration support for creating, storing and sharing personal reports / analysis snapshots


The following overview shows the architectural components and a selection of the major BI technology manufacturers and products. Good sources for further vendor-unbiased information about the different OLAP-related products and their market shares are www.olapreport.com, www.gartner.com, www.barc.de, and www.idc.com. Hint: many comparisons found on the market do not clearly distinguish between the different architectural components, which sometimes makes them more confusing than helpful.
Architectural components by manufacturer and product (DWH / ETL / OLAP / analytical application):

www.microsoft.com: SQL Server Suite (includes all other SQL Server components)
- DWH: SQL Server; ETL: Integration Services (SSIS); OLAP: Analysis Services (SSAS); Analytical application: Reporting Services (SSRS)
- Comment: Comprehensive, flexible suite, but no standard solution templates. SSRS, OLAP reporting and further solution templates are supported by a vast community of product vendors (incl. open source).

www.oracle.com: BI Suite
- DWH: Oracle Database; ETL: Data Integrator; OLAP: OLAP or Essbase; Analytical application: Answers, Publisher
- Comment: Comprehensive mixed suite of many merged products (through acquisition), with solution templates.

www.ibm.com: Cognos Suite
- DWH: DB2; ETL: Series 7 or Ascential; OLAP: PowerPlay; Analytical application: Cognos BI, Cognos Now
- Comment: Comprehensive mixed suite of many merged products (through acquisition), no or limited solution templates.

www.pentaho.com: BI Suite
- DWH: MySQL, PostgreSQL; ETL: Kettle; OLAP: Mondrian; Analytical application: JPivot, proprietary applications
- Comment: Collection of open source products, may be combined with other products like Talend (ETL) or Jasper (reporting); no solution templates.

www.teradata.com: DWH Suite
- DWH/ETL: proprietary database and ETL; OLAP: n/a; Analytical application: n/a
- Comment: High-end data warehouse specialist with proprietary components and solution templates.

www.jedox.com: Palo
- DWH: n/a; ETL: n/a; OLAP: Palo; Analytical application: Excel add-in
- Comment: Proprietary open source OLAP database; pure in-memory OLAP solution.

www.informatica.com: Power Center
- DWH: n/a; ETL: Power Center; OLAP: n/a; Analytical application: n/a
- Comment: High-end ETL platform with extensions like master data management, etc.

www.cubeware.com: Cubeware
- ETL: Importer; OLAP: <various>; Analytical application: Cockpit
- Comment: Supports various existing OLAP database products.

www.ibm.com: Cognos TM1
- OLAP: TM1; Analytical application: Excel add-in and web application
- Comment: Former Applix TM1; proprietary, pure in-memory OLAP solution.

www.qliktech.com: Cube / Front End Suite
- OLAP: QlikView; Analytical application: QlikView (own analytical application)
- Comment: Proprietary, pure in-memory OLAP solution with limited ETL capabilities.

Other MDX/XMLA compatible vendors of analytical application platforms (MDX/XMLA are language standards for querying and communicating with OLAP databases, established by the major OLAP engine manufacturers):

- www.microsoft.com: ProClarity Desktop Professional & Analytics WebServices (supports SSAS only): excellent functionality and value for interactive OLAP on SSAS.
- www.microsoft.com: Microsoft Excel (supports natively SSAS and Gemini only; other products are supported by vendor-specific add-ins): limited support of SSAS cubes; the in-memory OLAP solution Gemini (Excel add-in) will follow soon.
- www.bissantz.de: DeltaMaster (various OLAP engines): powerful analytical application.
- www.microstrategy.com: MicroStrategy 9 Desktop & Web (various sources: SQL, OLAP, etc.): supports almost all possible sources and provides reporting solution templates.
- www.it-workplace.co.uk: Intelligencia Desktop & Office & SSRS (supports SSAS only): powerful extensions for SSRS and OLAP reporting.
- www.panorama.com: NovaView Desktop & Web (various OLAP engines): good for enterprise analytical reporting & analysis.
- www.xlcubed.com: XLCubed Excel & Web (supports SSAS only): interesting visualization features.

And there are at least 10-20 more powerful analytical application products (incl. open source) with full support for MDX/XMLA and partial support for SQL.


6 Business Intelligence Platform


6a Architectural Components

(Extract from Business Intelligence Lifecycle Diagram)

Reporting Model: The multi-dimensional reporting model is the key element to organize and access the company's consolidated data. It provides a unified, business-related view of all transactions and transaction-related structures (date, daytime, unit, person, product, account, geography, etc.). This unified view is also often called the "single version of truth". For simplicity of understanding, ease of use and good data access performance for large data volumes, the object model follows a simple star structure, where the facts (transactions) are surrounded by all context-related dimensions (report aspects). Depending on the overall business model, the star can grow into a snowflake or even galaxy-like shape; in case of multiple facts, they should share their common dimensions. This allows comparing different facts along their common shared dimensions.

The conceptual model also defines how the data is organized in its data storage area (data warehouse). A functionally slightly extended version of the model is also used for the OLAP cubes, which are then accessed by the users through their analytical application(s).
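A minimal star-structure sketch, using SQLite for illustration: one fact table surrounded by shared dimensions, queried along those dimensions. All table and column names are illustrative assumptions, not part of any prescribed reporting model.

```python
import sqlite3

# Build a tiny star schema: two dimensions and one fact table.
db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE dim_date(date_id INTEGER PRIMARY KEY, year INTEGER, month INTEGER);
CREATE TABLE dim_product(product_id INTEGER PRIMARY KEY, name TEXT, category TEXT);
CREATE TABLE fact_sales(date_id INTEGER, product_id INTEGER, amount REAL,
  FOREIGN KEY(date_id) REFERENCES dim_date(date_id),
  FOREIGN KEY(product_id) REFERENCES dim_product(product_id));
INSERT INTO dim_date VALUES (1,2009,7),(2,2009,8);
INSERT INTO dim_product VALUES (1,'Widget','Hardware'),(2,'Gadget','Hardware');
INSERT INTO fact_sales VALUES (1,1,100.0),(2,1,50.0),(2,2,70.0);
""")

# A typical report query: facts aggregated along their shared dimensions.
rows = db.execute("""
  SELECT d.year, d.month, p.category, SUM(f.amount)
  FROM fact_sales f
  JOIN dim_date d ON d.date_id = f.date_id
  JOIN dim_product p ON p.product_id = f.product_id
  GROUP BY d.year, d.month, p.category
  ORDER BY d.year, d.month
""").fetchall()
print(rows)
```

A second fact table (e.g. budget figures) would join the same `dim_date` and `dim_product` tables, which is exactly what allows comparing different facts along their shared dimensions.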

Data Warehouse: The data warehouse is a database system designed to host the consolidated data and make it accessible to analytical applications (SQL) and/or OLAP engines. Moreover it provides space to support the data collecting and consolidation process, which is coordinated by the ETL workflow. Therefore a typical data warehouse differentiates between following data areas: Extract / Staging / Target: The ETL process initially stores fully or incrementally extracted data from the various sources in an extract area before they are subsequently transformed into the reporting model structure (staging) and loaded into the consolidated persistent target data structure (target). The non-persistent data in the extract and staging area is recreated and overwritten by the ETL process, whereas the target represents a continual data area, which keeps all source data in the unified reporting model structure, even if the operational source system changes or its historical data has been removed. Export: As part of the ETL process, data may be exported selectively from the target data structure to provide business applications (sources) and other special programs (i.e., statistical packages) with custom structured data extracts (i.e., consolidated product catalog). Query: The query area serves as abstraction layer for the physical target data structure and provides end users and OLAP engines data warehouse access on the base of the unified business related reporting model. Repository: This area holds the core information to re-create and load the data warehouse. The repository describes all data warehouse hosted data objects (metadata) and stores the object related ETL process settings (ETL Configuration). OLAP Cubes: OLAP engines provide fast access to structured data following the unified reporting model. Compared to the underlying data warehouse, OLAP engines process data warehouse data and store them as cubes in a proprietary pre-aggregated and query optimized data format. 
Depending on the performance requirements, further intermediate levels are pre-aggregated during cube processing. For user-friendly, fast and efficient report creation and to support simplified user interactions, the OLAP related unified reporting model is usually extended with additional OLAP specific functional dimensions. Independent of additional features of the selected analytical application, these dimensions enable the user to create any ad-hoc report quickly and easily, without the need for in-depth knowledge of the query language. So-called in-memory OLAP engines keep the processed data constantly in memory, unless the operating system itself frees up memory by disk related memory page swapping. Conventional OLAP engines store the processed data on disk, but supported by a sophisticated caching logic (server and client), they keep as much cube data as possible in memory, which de facto results in the same query performance as known from pure in-memory OLAP engines. In case of a system failure, conventional OLAP engines don't have to reprocess the data and can be queried right away, whereas in-memory OLAP engines must first reprocess all data before it is accessible to queries again. In case of high data volumes, in-memory OLAP databases can only be run on a 64-bit server with plenty of memory (hardware investments) and run the risk that, after a failure, reprocessing may not be completed within the specified recovery time. In cases of medium to low volume source data and non-shareable OLAP data, the one big advantage of in-memory OLAP engines is that they don't require a server infrastructure or a client with server-like hardware specifications. Such engines can be run on any regular client hardware, empowering end users to build and analyze ad-hoc cubes completely on their own.
To still assure a single version of truth (apart from data timeliness) across individually built ad-hoc cubes, it's recommended that all users access the same unified reporting model of the data warehouse. Be aware that such client based approaches are a potential security hole.
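The pre-aggregation an OLAP engine performs during cube processing can be illustrated with a minimal Python sketch. All names and the sample data are purely illustrative (not taken from any specific OLAP product): each combination of dimensions corresponds to one aggregation level the engine would precompute.

```python
from itertools import combinations
from collections import defaultdict

def preaggregate(facts, dims, measure):
    """Roll fact rows up into cube cells for every combination of the
    given dimensions (the aggregation levels an OLAP engine precomputes)."""
    cube = defaultdict(float)
    for row in facts:
        # aggregate every subset of dimensions, from the grand total ()
        # up to the full dimensionality
        for n in range(len(dims) + 1):
            for subset in combinations(dims, n):
                key = (subset, tuple(row[d] for d in subset))
                cube[key] += row[measure]
    return dict(cube)

facts = [
    {"year": 2009, "region": "EU", "sales": 100.0},
    {"year": 2009, "region": "US", "sales": 50.0},
    {"year": 2008, "region": "EU", "sales": 80.0},
]
cube = preaggregate(facts, ["year", "region"], "sales")

print(cube[((), ())])                            # grand total: 230.0
print(cube[(("region",), ("EU",))])              # EU across all years: 180.0
print(cube[(("year", "region"), (2009, "EU"))])  # 100.0
```

Real engines of course add indexing, storage formats and incremental processing on top; the sketch only shows why queries against a processed cube return instantly, because the aggregates already exist.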

ETL Process: Any business intelligence platform requires an ETL process where data from several sources must be consolidated, where the data warehouse is required to maintain a persistent data base or to preserve historical data structures, or where the data conversion into the reporting model involves complex transformations. The ETL process updates the data in the target data warehouse by extracting data from the sources, transforming and cleansing the data towards the unified reporting model structure, and loading the data into a persistent target database. This process should be fully automated and scheduled according to the minimal required update frequency. The overall ETL process is configured through object specific ETL execution settings. All necessary object information used to create, extract, load and update the object data itself comes from the repository database of the data warehouse. The data transformation into the reporting model structure requires source mapping definitions and optionally additional rules on how to cleanse and enrich the data.

Data cleansing or data scrubbing is the act of detecting and correcting (or removing) corrupt or inaccurate records from a record set, table, or database. Used mainly in databases, the term refers to identifying incomplete, incorrect, inaccurate, irrelevant etc. parts of the data and then replacing, modifying or deleting this dirty data. http://en.wikipedia.org/wiki/Data_cleansing

Data enrichment is the act of optionally completing existing database records based on lookups within the same or other database tables. Enrichment is often used to complete item catalogs like customers or products, where up-to-date item related information is retrieved from special lookup sources (phone directories, GTIN/EAN database, etc.).
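The enrichment step just described can be sketched in a few lines of Python. The function, field names and sample directory below are hypothetical examples, not part of any concrete ETL product: missing attribute values are filled from a reference lookup source, while values already present are left untouched.

```python
def enrich(records, lookup, key, fields):
    """Fill in missing attribute values from a reference lookup source
    (e.g. a phone directory keyed by customer id). Existing non-empty
    values are kept; only gaps are completed."""
    enriched = []
    for rec in records:
        rec = dict(rec)  # don't mutate the caller's data
        ref = lookup.get(rec.get(key))
        if ref:
            for f in fields:
                if not rec.get(f):  # only fill genuine gaps
                    rec[f] = ref.get(f)
        enriched.append(rec)
    return enriched

customers = [{"id": 1, "name": "Acme", "phone": None},
             {"id": 2, "name": "Baker", "phone": "+41 44 000 00 00"}]
directory = {1: {"phone": "+41 44 123 45 67"},
             2: {"phone": "+41 44 999 99 99"}}

result = enrich(customers, directory, "id", ["phone"])
print(result[0]["phone"])  # filled from the directory: +41 44 123 45 67
print(result[1]["phone"])  # kept as-is: +41 44 000 00 00
```

The same pattern applies to GTIN/EAN lookups or address completion; only the lookup source and the list of completable fields change.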
The topic of data quality assurance is best implemented as part of the data cleansing, where the data is checked on the basis of object related quality verification rules (attribute, in-row, in-table, cross-table). All rule violations are fully logged. At the end of a data quality assurance task, all persons responsible for the data are automatically notified with an individual data related status report. As part of an optional master data management, alternate item catalogs (e.g., products, customers, accounts) coming from different sources are automatically merged and aligned according to specific rules and reference (catalog) data.

Conventional business intelligence solutions focus on reporting and analysis of quantitative data, whereas an enhanced reporting model also includes many supportive attributes. Such item specific background information (address, color, etc.) can be used in the analytical applications to provide additional functionality. An integration with existing qualitative data (text, images, streaming media), as stored in document management systems, knowledge management and semantic web applications, may require the creation of some special ETL tasks, which link the quantitative data accordingly.

The data warehouse is also a source for creating customized extracts for other applications, like unified product and customer catalogs for business applications (closed loop), transaction extracts for other programs like statistical packages, or item structure extracts for customer mailings or for other (external) recipients. This aspect is covered by an export task, which (re-)creates the customized table objects and exports the selected data.

Only in cases where you can directly access pre-integrated and preferably pre-transformed data and there is absolutely no need for persistent target data, you may build a business intelligence platform without a data warehouse and ETL process.
As soon as you want to integrate non-pre-transformed sources or you need a place to keep (historical) data, there is no feasible way to do it without a data warehouse and a corresponding ETL process.
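The quality verification rules mentioned above (attribute and in-row level) can be sketched as follows. The rule names and sample data are invented for illustration; the point is that every violation is logged with its row position instead of the record being silently dropped, so the responsible persons can be notified afterwards.

```python
def verify(rows, rules):
    """Apply object related quality rules (attribute and in-row level)
    and log every violation instead of silently dropping data."""
    violations = []
    for i, row in enumerate(rows):
        for name, check in rules:
            if not check(row):
                violations.append((i, name))
    return violations

rules = [
    # attribute-level rules
    ("amount_not_negative", lambda r: r["amount"] >= 0),
    ("currency_known",      lambda r: r["currency"] in {"EUR", "USD", "CHF"}),
    # in-row rule: net + tax must equal the gross amount
    ("gross_consistent",    lambda r: abs(r["net"] + r["tax"] - r["amount"]) < 0.01),
]

rows = [
    {"amount": 107.7, "net": 100.0, "tax": 7.7, "currency": "CHF"},
    {"amount": -5.0,  "net": -5.0,  "tax": 0.0, "currency": "XXX"},
]
print(verify(rows, rules))
# the second row violates amount_not_negative and currency_known
```

In-table and cross-table rules follow the same pattern, but the check callables receive the whole record set (or several tables) instead of a single row.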


Analytical Application(s): The analytical application is the visible interface for end users to create and retrieve their reports and to perform ad-hoc queries. Independent of the selected analytical application, all visible data content (values and structures) is identical as long as the data comes from the same unified reporting model, either from the OLAP database and/or the data warehouse. The analytical application primarily helps to create and visualize data content. Additionally, most applications empower the user to explore and analyze the data with interactive functions like drill-down/drill-up, expand/collapse, pivotize, sort, value-highlighting, export to Excel and PDF, etc. Some applications also support advanced cube features like cube-based cell formatting, creation and use of local/cube-based dynamic named sets and calculations, and the invocation of cube-based actions (incl. model context sensitive online help). For multi-lingual solutions a few applications also support the on-the-fly translation of all visible reporting model metadata. Additional statistical, visualization and interactivity features, like inverted display of dimension hierarchies, in-cell sparklines/sparkbars, waterfall diagrams, interactive ABC/Pareto-charts, hyperbolic trees, and dimension impact analysis, are further features of some applications.

The following useful features are supported by only a few analytical application platforms so far:

Collaboration support: create and share personal reports (derived from other standard reports); add and share personal annotations at report, dimension and cell level
WCMS-like (report) layout templates, which can be administrated centrally and allow alignment with existing corporate design guidelines
Enhanced template based publishing support with automated integration of text data
Budgeting and planning with multi-scenario data entry and workflow support: this quite complex theme is not a real BI topic in the narrower sense, but is often raised as a pending business issue by users of the analytical applications. Although a few vendors have started selling integrated planning features in their analytical applications, such approaches usually don't deliver the expected value, and it's recommended to address planning related issues with specialized application packages.

Further details about the different aspects can also be found in the related chapter about product selection criteria for analytical applications. Because not all users have the same needs, a mix of carefully selected alternative analytical applications may provide the best result regarding the user specific application skills and the application related license fees. The different levels of user interactivity are explained in more detail in the next chapter.


Batch report distribution is a task from earlier days, when it usually took minutes to hours to create updated business reports. But as there are situations where report recipients are not end users, or have only limited or no access to the DWH and/or OLAP databases, it may still make sense to look for specific features of the analytical platform which support a fully automated creation and distribution of published reports.

Data mining is the process of extracting hidden patterns from data. As more data is gathered, with the amount of data doubling every three years, data mining is becoming an increasingly important tool to transform this data into information. It is commonly used in a wide range of profiling practices, such as marketing, surveillance, fraud detection and scientific discovery. http://en.wikipedia.org/wiki/Data_mining

Some DWH and OLAP database products and some analytical applications provide server and/or client based data mining functionality. The result of the data mining activity is an explanatory mining model which, for a given record with some attribute values provided, helps to predict the probable values for specific remaining attributes within the same record. Such models are usually not visible to the user; they are automatically updated as part of the ETL process and indirectly used by regular business applications to predict customer profile/behavior and business risks already during data entry or before the data is loaded into the data warehouse. Depending on the selected data mining algorithm, like cluster analysis or decision tree, it may be useful to create and update a mining model ad-hoc and to analyze the outcome directly.

Note: Some DWH and OLAP database products support the re-integration of previously created or updated mining models as an artificial reporting dimension.
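The idea of a mining model that predicts a probable attribute value from the values already present in a record can be illustrated with a deliberately naive frequency model. This is not one of the real algorithms named above (cluster analysis, decision tree); it is only a stand-in to show the train/predict workflow, with invented attribute names.

```python
from collections import Counter, defaultdict

def train(records, given, target):
    """Build a minimal 'mining model': for each value of the given
    attribute, count how often each target attribute value occurred."""
    model = defaultdict(Counter)
    for rec in records:
        model[rec[given]][rec[target]] += 1
    return model

def predict(model, value):
    """Predict the most probable target value for a given attribute value."""
    counts = model.get(value)
    return counts.most_common(1)[0][0] if counts else None

history = [
    {"segment": "retail",    "risk": "low"},
    {"segment": "retail",    "risk": "low"},
    {"segment": "retail",    "risk": "high"},
    {"segment": "wholesale", "risk": "high"},
]
model = train(history, "segment", "risk")
print(predict(model, "retail"))     # low
print(predict(model, "wholesale"))  # high
```

A real engine would build the model from the full warehouse history during the ETL run and expose predictions to the business applications, exactly as described in the paragraph above.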


6b Information Scope

(Extract from Business Intelligence Lifecycle Diagram) The information scope shows the relation between the users of the different organizational levels in the context of their business role (management, knowledge worker, analyst, operations monitoring, and data entry/creation). According to a user's role, the person asks for different levels of aggregation and different types of primary visualizations, combined with a set of specific interactivity options to perform their tasks. Analysts usually create their individual ad-hoc reports on the basis of an existing standard report. In cases of very specific analytical function requirements it may be useful to equip analyst users with a specialized analytical application. The diagram also illustrates the context of data sources and the business intelligence platform within the overall information scope. As explained earlier, the item detail reports are preferably created and used within their source application, but could be migrated to the BI platform on demand.

From the point of view of the business intelligence platform, the users are profiled regarding the required application interactivity (reader, explorer, and analyst). These user profiles correspond with the features of the available alternative analytical application platforms and facilitate the optional selection of a mix of alternative, profile specific analytical applications. Such a mix usually provides the best result also regarding the user specific application skills and the application related license fees.

Reader: This user (typically a manager at any level) was used to receiving static reports in electronic or paper form. With the introduction of an analytical application, most of the new interactive features are not really useful to this group of users. They are satisfied to get updated standard reports and charts in time and to have the possibility to change some of the available filters and the set of standard header columns themselves.
Depending on the report, they may use the platform also as a source for centralized information lookup and ask for some item background information (customer address, staff phone no.).

Explorer: This user (often called a knowledge worker; typically a controller, product or key account manager) was used either to receiving many regular reports covering all possible aspects and levels of detail, or a few extensively large reports. With the introduction of an analytical application, he can interactively browse through all different aspects of filter combinations and drill down selectively to the lowest level of detail on demand, without having to go through loads of static reports. The ad-hoc invocation of context related information sources (company reg., etc.) or the ability to quickly send an email to a related person helps them to improve their productivity when using multiple applications simultaneously.

Analyst: This user (typically a qualified assistant of top or middle management, or product/service/business development) has to answer diverse questions quite frequently. He was therefore used to gathering information from many different sources to create a consolidated view before he was able to analyze and interpret the findings. With the introduction of an analytical application on the basis of an extensive business intelligence platform, he saves a lot of time getting a unified picture of almost any business related data context, as he can retrieve all required data directly through the unified reporting model. The analyst benefits most from the potential interactivity, special statistical functions and collaboration features. Based on his expertise in using the unified reporting model and the analytical application features, he is also predestined to create new or maintain existing standard reports upon request.

(Chart: typical distribution of the count of the different BI user profiles (Reader / Explorer / Analyst): 15% / 50% / 35%)

Operation Monitoring: This user (typically operations staff) must be able to create item specific detail reports (customer billing, purchase/sales order, etc.) and fact sheets (single customer, specific product or account, etc.). Such reports are preferably created and used within their source application, but can be migrated to the business intelligence platform on demand.

Reporting Process: The regular reporting processes (operational monitoring, area specific controlling, management reporting, report publishing for key customers/suppliers and the public) should be documented and communicated to all involved people. In addition to information about the process workflow tasks and the involved actors and resources, the documents should also cover escalation procedures in case existing threshold values are exceeded.

Operations and Support: The regular operations processes to run, update and maintain the business intelligence platform should be documented. In addition to information about the process workflow tasks and the involved actors and resources, they should also cover existing escalation procedures. Moreover, processes regarding end user related training and support have to be illustrated and communicated to all potentially involved people.


6c Reporting Model Design & Implementation


Multidimensional Modeling: A multidimensional reporting model is primarily a conceptual approach to create an easy to use data model supporting high performance data access. A simple star schema has proven to achieve the optimum for these two contrary goals and allows using the same standardized ETL processes for all dimension and fact objects alike. The resulting conceptual model should be able to answer all specified report requirements by the structure of its multidimensional design. The physical implementation of the model in the data warehouse target database may be optimized regarding ETL processing and required data storage space. For specified requirements like currency exchange calculation into a specific report currency or intra-group eliminations, for example, the reporting values are not fully pre-calculated and stored in the related fact table. Such calculations are preferably integrated directly in the user accessible abstraction layer of the query database, representing the unified reporting model through specific database views. With the physical implementation in the OLAP cube, some additional so-called functional dimensions may be added to improve cube usability. Such functional dimensions are, for instance, date series, date aggregations (YTD, etc.), date relations (Previous Year) and value scales.

Compliance: Compliance related business requirements and/or constraints must be reflected by the logical and physical designs and the underlying ETL processes.

Source Mapping and Master Data Management: During source mapping, all objects and attributes of the physical reporting model implementation must be mapped to objects and attributes of the identified data sources. This time-consuming task may become quite complex when different sources provide similar, but not consistently equal, data and multiple alternative sources must be merged into single attribute values.
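The functional date dimensions mentioned in the Multidimensional Modeling paragraph above (date aggregations like YTD, date relations like Previous Year) are derived calculations over the same underlying fact values, not extra stored data. A minimal sketch with invented monthly sales figures:

```python
def ytd(facts, year, month):
    """Year-to-date aggregation: sum all periods of the year up to
    and including the given month."""
    return sum(v for (y, m), v in facts.items() if y == year and m <= month)

def previous_year(facts, year, month):
    """'Previous Year' date relation: the same period one year earlier."""
    return facts.get((year - 1, month), 0.0)

# monthly sales keyed by (year, month) -- illustrative figures only
facts = {(2008, 1): 80.0, (2008, 2): 90.0,
         (2009, 1): 100.0, (2009, 2): 110.0}

print(ytd(facts, 2009, 2))            # 210.0
print(previous_year(facts, 2009, 2))  # 90.0
```

In an actual cube the same logic is expressed as members of a functional dimension (or as view columns in the query layer), so the end user selects "YTD" or "Previous Year" like any other dimension member instead of writing the calculation.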
The merging and alignment of multiple item catalogs from different sources (products, customers, and accounts), also called master data management, should be solved as its own task. Afterwards the execution of the resulting integration logic can be embedded in the main ETL process.

ETL Workflow Design and Configuration: ETL workflows, which extract, transform and load the source data into the target data warehouse, have to be designed and implemented. This can either be done by creating individual workflows and code routines for all source extractions and transformation/load operations of target objects, or by creating a general ETL workflow with standard code routines, where object related execution parameters are retrieved dynamically from an own repository (ETL configuration). The second approach is more complex for an initial release, but makes maintenance and platform extension much more cost-effective due to centralized configuration and parameter settings. A further advantage of this approach is that all definitions and operation settings are well documented in one place (a potential metadata compliance requirement). This repository also offers a good source to create all kinds of customized documentation. Depending on the selected ETL engine product, an existing add-on product supporting a centralized repository approach should be considered to facilitate a cost-effective initial design and implementation of the ETL processes.
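The repository-driven approach just described can be sketched as one generic workflow whose behaviour is controlled entirely by per-object configuration entries. Object names, settings and the stand-in extract/load callables below are hypothetical; a real implementation would read the repository from the data warehouse and call the ETL engine's own routines.

```python
# ETL repository: per-object execution settings retrieved dynamically,
# so one standard routine serves every target object.
REPOSITORY = {
    "dim_customer": {"source": "crm", "mode": "full",        "key": "id"},
    "fact_sales":   {"source": "erp", "mode": "incremental", "key": "txn_id"},
}

def run_etl(object_name, extract, load):
    """One generic workflow; behaviour comes from the repository entry."""
    cfg = REPOSITORY[object_name]
    rows = extract(cfg["source"], cfg["mode"])
    load(object_name, cfg["key"], rows)
    return len(rows)

# stand-in extract/load callables for the sketch
def extract(source, mode):
    sample = {"crm": [{"id": 1}, {"id": 2}], "erp": [{"txn_id": 9}]}
    return sample[source]

target = {}
def load(name, key, rows):
    # upsert by business key into the (in-memory) target area
    target.setdefault(name, {}).update({r[key]: r for r in rows})

print(run_etl("dim_customer", extract, load))  # 2
print(run_etl("fact_sales", extract, load))    # 1
```

Adding a new target object then means adding one repository entry, not writing a new workflow, which is exactly why the approach pays off in maintenance despite the higher initial effort.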

Documentation: A good documentation of the final design and the development deliverables, including the key considerations which have led to specific design and development adaptations, is important for future maintenance and growth of the business intelligence solution. It documents, transfers and preserves knowledge that future releases and other project teams can build on.

6d Analytical Application Design & Development


Terminology and Naming Rules: It's important to build and maintain a library of corporate terminology definitions for all items which are visible in the unified reporting model. This information should be an integrated part of the user documentation and/or the model context sensitive online help in the analytical application. This ensures that all internal or external users across all organizational units have the same understanding of specific terms. The library must cover at least all metadata objects (facts, dimensions) and their attributes. Moreover, we recommend including definitions for specific content items (financial accounts, customer segments, key figures and ratios). In practice this task often reveals that different users and organizational units across a company have quite a different understanding of the same word (e.g., cash flow, return on investment, top-tier customers).

For easier (intuitive) understanding of specific names/terms related to specific objects and their attributes, it's advisable to define naming rules, which also facilitate the naming of new objects and attributes in the design phase. Although a business intelligence solution consists of many separate metadata layers, it's suggested to (re-)use the same or similar wording for congruent or similar objects. This especially applies to the conceptual and physical reporting models and to visible report metadata in analytical applications and their reports. In cases of using pre-defined design components (templates, etc.) you should maintain a terms-layer-linkage document, which describes the relations between the different words across the different metadata layers. Over time, when the business requires changing the wording for specific items, they don't have to be changed in all layers alike. It's also feasible to change it just in the layers which are visible to the end user and to update the terms-layer-linkage document.
Reports and Templates: For improved report readability and end user application usability, it's recommended to establish a limited set of layout templates with specific interactive functionalities, where each template can be used to create various standard reports with their specific data contents. The layout of the templates should be aligned with existing corporate design guidelines or should at least follow the same unified visual appearance (fonts, colors, logo placement, etc.). Depending on the analytical application platform, report templates can be defined and maintained in one place, so that all new and existing reports follow the current template settings.

Process Details: Process documentation has to be created, which informs about the process with its workflow tasks and involved actors and resources. The documents should cover the following areas:

Business reporting and publishing
Platform operations and maintenance
End user training and support


7 Final Assurance
The final project deliverables resulting from the design and development phase (documentation, analytical application with its standard reports) must be reviewed by the project manager and the steering committee regarding quality and completeness. Background program code, especially ETL process related procedures and the object related execution settings, should be verified at least by their execution results on the basis of selected sample objects. A specific code inspection may be required depending on the existing compliance and quality assurance requirements. Moreover, the deliverables must be checked by the steering committee against the business requirement specifications covering standard reports, user interactivity, performance goals, data quality, data access security, compliance, etc. The review findings are documented and summarized for the project sponsor and the project team.

Depending on the specified business requirements and existing operational IT standards, the final release has to go through a more or less rigorous testing phase, where groups of selected reader, explorer, and analyst users work through representative real-life examples based on real world, up-to-date data. This testing phase also covers the predefined processes for business reporting, platform operations, and end user training and support. Missing, incomplete, wrong or non-compliant deliverables must be supplied or corrected or, in rare cases, requirement specifications may have to be modified towards feasible deliverables. The project cannot go into the deployment phase unless all critical findings have been addressed, re-verified and tested, and the overall deliverables have been signed off by the project sponsor.
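Verifying ETL procedures "by their execution results" often boils down to simple reconciliation checks between source and target. A minimal sketch, with invented object names and counts; a real verification would also compare checksums or measure totals per object:

```python
def reconcile(source_counts, dwh_counts):
    """Compare row counts per object between the source extracts and the
    loaded data warehouse target; any mismatch is a finding to address
    before sign-off."""
    findings = []
    for obj, expected in source_counts.items():
        actual = dwh_counts.get(obj, 0)
        if actual != expected:
            findings.append((obj, expected, actual))
    return findings

# illustrative counts for selected sample objects
source_counts = {"dim_customer": 1200, "fact_sales": 54321}
dwh_counts    = {"dim_customer": 1200, "fact_sales": 54000}

print(reconcile(source_counts, dwh_counts))
# [('fact_sales', 54321, 54000)] -- a finding to re-verify before sign-off
```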

8 Deployment
The rollout to the end users should be split into separate sessions according to the related user interactivity profiles (reader, explorer and analyst). The group of reader users may just be introduced to the new solution by an introductory, usage-oriented documentation about the analytical application and their standard reports. The explorer and analyst users, however, should pass formal trainings. The first training block (half a day) informs about the basics of the unified reporting model and the existing standard reports, and introduces the trainees to the basic functionality of the analytical application. The analyst users complete their training with a second formal training (one day), where they get introduced to the details of the unified reporting model, including the purpose and use of OLAP related functional dimensions. A more in-depth introduction to the analytical features of the analytical application concludes this session. In case the analysts have access to a second, more sophisticated analytical application, a specific introduction should also be covered by the analysts' training. Consider arranging further trainings for advanced users, where a reader becomes an explorer, an explorer becomes an analyst, and the analyst extends his expertise in using his analytical application(s), including being able to build simple and more sophisticated BI platform queries (SQL, MDX).


9 Operations
9a Production
The ongoing platform operations cover regular updates of the data warehouse target data (monitoring and minor configuration adjustments), regular platform maintenance tasks (back-up/recovery and basic tuning), and ongoing 1st level end user support.

9b Outsourcing
Depending on the BI platform size, complexity, corporate IT strategy and internal IT skills and resources, an efficient and cost-effective operation of a business intelligence platform may be delegated to a specialized BI outsourcing partner. Either the hardware and software components remain located on the company's premises, and the outsourcing partner provides his services on site and/or through remote access, or the whole platform including hardware and software is transferred to, or owned and operated by, the outsourcing partner. The latter option may interfere with existing compliance requirements regarding system availability and confidentiality. Regarding more complex operations and end user support requests, an external partner may be contracted to provide 2nd level support and assist the internal BI team in questions of adaptive maintenance. This role is usually best covered by the design and development partners who have built and implemented the initial platform release.

10 Growth
As a result of operating and using the business intelligence platform and other ongoing changes in business strategy and operations, new requirements arise. Having a standard suggestion and requirement collection process in place ensures a comprehensive documentation and tracking of all requests. On a regular basis (monthly/quarterly) the summarized requests are discussed in the steering committee, which maintains, together with the project manager, an updated platform release plan and decides, together with the project sponsor, about future releases regarding their content and scheduling.

Despite, or because of, all known potential pitfalls, an initial release of a business intelligence solution can become a success story!


APPENDIX: Abbreviations / Glossary


Abbrev Full Wording
BI Business Intelligence Chief Executive Officer

Description
Business intelligence (BI) refers to skills, technologies, applications and practices used to help a business acquire a better understanding of its commercial context. Business intelligence may also refer to the collected information itself. BI technologies provide historical, current, and predictive views of business operations. Common functions of business intelligence technologies are reporting, OLAP, analytics, data mining, business performance management, benchmarking, text mining, and predictive analytics. http://en.wikipedia.org/wiki/Business_intelligence A chief executive officer (CEO) or chief executive is one of the highest-ranking corporate officers (executives) or administrators in charge of total management. http://en.wikipedia.org/wiki/CEO The CFO or Chief Financial Officer is a high standing position in any company. He or she reports to the board and CEO and carries out various tasks and assignments for them. They are regarded as the second in the company to the CEO himself, unless a COO position also exists. CFO offers high command capabilities for individuals and allows them to exercise their leadership qualities under such projects as debt refinancing as so on. A position as CFO should be viewed as one with options and links, however usually is given to someone who has vast amounts of experience in the industry. http://en.wikipedia.org/wiki/CFO A chief operating officer or chief operations officer (COO) is a corporate officer responsible for managing the dayto-day activities of the corporation and for operations management (OM). The COO is one of the highest-ranking members of an organization's senior management, monitoring the daily operations of the company and reporting to the board of directors and the top executive officer, usually the chief executive officer (CEO). The COO is usually an executive or senior officer. 
http://en.wikipedia.org/wiki/Chief_operating_officer A central processing unit (CPU) or processor is an electronic circuit that can execute computer programs. This term has been in use in the computer industry at least since the early 1960s (Weik 1961). http://en.wikipedia.org/wiki/CPU Customer relationship management (CRM) consists of the processes a company uses to track and organize its contacts with its current and prospective customers. CRM software is used to support these processes; information about customers and customer interactions can be entered, stored and accessed by employees in different company departments. Typical CRM goals are to improve services provided to customers, and to use customer contact information for targeted marketing. http://en.wikipedia.org/wiki/Customer_relationship_management

CEO

CFO

CFO (Chief Financial Officer)

COO (Chief Operating Officer)

CPU (Central Processing Unit)

CRM (Customer Relationship Management)

CSV (Comma Separated Values)
A comma-separated values (CSV) file is used for the digital storage of data structured in a table of lists form. Each line in the CSV file corresponds to a row in the table; within a line, fields are separated by commas, each field belonging to one table column. Since it is a common and simple file format, CSV files are often used for moving tabular data between two different computer programs, for example between a database program and a spreadsheet program.
http://en.wikipedia.org/wiki/Comma-separated_values

DWH (Data Warehouse)
A data warehouse is a repository of an organization's electronically stored data. Data warehouses are designed to facilitate reporting and analysis. This definition focuses on data storage; however, the means to retrieve and analyze data, to extract, transform and load data, and to manage the data dictionary are also considered essential components of a data warehousing system.
http://en.wikipedia.org/wiki/Data_warehouse

ERP (Enterprise Resource Planning)
Enterprise resource planning (ERP) is a company-wide computer software system used to manage and coordinate all the resources, information, and functions of a business from shared data stores.
http://en.wikipedia.org/wiki/Enterprise_resource_planning

ETL (Extract, Transform, Load)
Extract, transform, and load (ETL) in database usage, and especially in data warehousing, involves: extracting data from outside sources; transforming it to fit operational needs (which can include quality levels); and loading it into the end target (database or data warehouse).
http://en.wikipedia.org/wiki/Extract,_transform,_load

EUR (Euro)
International currency symbol for the Euro.
http://en.wikipedia.org/wiki/EUR

FTP (File Transfer Protocol)
File Transfer Protocol (FTP) is a standard network protocol used to exchange and manipulate files over an Internet Protocol computer network, such as the Internet.
http://en.wikipedia.org/wiki/FTP

GTIN / EAN (Global Trade Item Number / European Article Number)
Global Trade Item Number (GTIN) is an identifier for trade items developed by GS1 (comprising the former EAN International and Uniform Code Council). Such identifiers are used to look up product information in a database (often by inputting the number through a bar code scanner pointed at an actual product), which may belong to a retailer, manufacturer, collector, researcher, or other entity. The uniqueness and universality of the identifier is useful in establishing which product in one database corresponds to which product in another database, especially across organizational boundaries.
http://en.wikipedia.org/wiki/GTIN

HR (Human Resources)
Human resources is an increasingly broadening term that refers to managing "human capital," the people of an organization. The field has moved from a traditionally administrative function to a strategic one that recognizes the link between talented and engaged people and organizational success.
http://en.wikipedia.org/wiki/HR
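As a small illustration of the CSV format described above, the following Python sketch writes and reads back a tiny table (the column names and values are invented for this example):

```python
import csv
import io

# Write a small two-column table as CSV text
# (hypothetical example data, not from the document).
buffer = io.StringIO()
writer = csv.writer(buffer)
writer.writerow(["product", "revenue"])   # header row names the table columns
writer.writerow(["Widget A", "1200"])     # each subsequent row is one table line
writer.writerow(["Widget B", "950"])

# Read it back: each line yields a list of fields split on commas.
rows = list(csv.reader(io.StringIO(buffer.getvalue())))
print(rows[0])  # ['product', 'revenue']
```

This simplicity is exactly why CSV is the lowest common denominator for moving tabular data between a database and a spreadsheet program.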

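The three ETL steps described above (extract, transform, load) can be sketched minimally in Python; the source records, cleaning rules and in-memory "warehouse" below are invented purely for illustration:

```python
# Minimal ETL sketch: extract from a source, transform (clean/convert),
# load into a target store. All data and rules here are illustrative.

def extract():
    # Extract: pull raw records from an outside source (hard-coded here).
    return [{"customer": " acme ", "amount": "100.5"},
            {"customer": "Globex", "amount": "200"}]

def transform(records):
    # Transform: enforce quality rules (trim and title-case names,
    # convert amount strings to numbers).
    return [{"customer": r["customer"].strip().title(),
             "amount": float(r["amount"])} for r in records]

def load(records, warehouse):
    # Load: append the cleaned rows into the end target
    # (a plain list stands in for a data warehouse table).
    warehouse.extend(records)

warehouse = []
load(transform(extract()), warehouse)
print(warehouse[0])  # {'customer': 'Acme', 'amount': 100.5}
```

In a real business intelligence platform these steps run as the automated background processes mentioned earlier, with a database or data warehouse as the load target.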
Business Intelligence Project and Operations Lifecycle 1.0

37/39

IFRS (International Financial Reporting Standards)
International Financial Reporting Standards (IFRS) are Standards, Interpretations and the Framework for the Preparation and Presentation of Financial Statements (in the absence of a Standard or an Interpretation) adopted by the International Accounting Standards Board (IASB).
http://en.wikipedia.org/wiki/IFRS

IKS (Internes Kontrollsystem; German for "internal control system")
An internal control system (IKS) consists of systematically designed organizational measures and controls within a company, intended to ensure compliance with guidelines and to prevent damage that could be caused by its own personnel or by malicious third parties.
http://de.wikipedia.org/wiki/IKS

ISO (International Organization for Standardization)
The International Organization for Standardization (Organisation internationale de normalisation), widely known as ISO, is an international-standard-setting body composed of representatives from various national standards organizations.
http://en.wikipedia.org/wiki/ISO

IT (Information Technology)
Information technology (IT), as defined by the Information Technology Association of America (ITAA), is "the study, design, development, implementation, support or management of computer-based information systems, particularly software applications and computer hardware." IT deals with the use of electronic computers and computer software to convert, store, protect, process, transmit, and securely retrieve information.
http://en.wikipedia.org/wiki/Information_technology

KPI (Key Performance Indicator)
A performance indicator or key performance indicator (KPI) is a measure of performance. Such measures are commonly used to help an organization define and evaluate how successful it is, typically in terms of making progress towards its long-term organizational goals.
http://en.wikipedia.org/wiki/Key_performance_indicator

LAN (Local Area Network)
A local area network (LAN) is a computer network covering a small physical area, like a home, office, or small group of buildings, such as a school or an airport. The defining characteristics of LANs, in contrast to wide area networks (WANs), include their usually higher data-transfer rates, smaller geographic reach, and lack of a need for leased telecommunication lines.
http://en.wikipedia.org/wiki/LAN

MDX (Multidimensional Expressions)
Multidimensional Expressions (MDX) is a query language for OLAP databases, much like SQL is a query language for relational databases. It is also a calculation language, with syntax similar to spreadsheet formulas.
http://en.wikipedia.org/wiki/Multidimensional_Expressions

MiFID (Markets in Financial Instruments Directive)
The Markets in Financial Instruments Directive (MiFID), as subsequently amended, is a European Union law which provides a harmonized regulatory regime for investment services across the 30 member states of the European Economic Area (the 27 Member States of the European Union plus Iceland, Norway and Liechtenstein). The main objectives of the Directive are to increase competition and consumer protection in investment services.
http://en.wikipedia.org/wiki/MiFID

MIS (Management Information System)
A management information system (MIS) is a subset of the overall internal controls of a business covering the application of people, documents, technologies, and procedures by management accountants to solve business problems such as costing a product, service or a business-wide strategy. At a very early stage also called DSS (decision support systems), MIS had a primary focus on building management cockpits (dashboards). Conceptually, MIS has been incorporated into the more general BI approach, which addresses the whole organization and includes balanced scorecards over dashboards down to any detail report.
http://en.wikipedia.org/wiki/Management_Information_System

MS (Microsoft)
Microsoft Corporation (NASDAQ: MSFT, HKEX: 4338) is a United States-based multinational computer technology corporation that develops, manufactures, licenses, and supports a wide range of software products for computing devices. Headquartered in Redmond, Washington, USA, its most profitable products are the Microsoft Windows operating system and the Microsoft Office suite of productivity software.
http://en.wikipedia.org/wiki/Microsoft

ODBC (Open Database Connectivity)
In computing, Open Database Connectivity (ODBC) provides a standard software API method for using database management systems (DBMS). The designers of ODBC aimed to make it independent of programming languages, database systems, and operating systems.
http://en.wikipedia.org/wiki/Odbc

OLE DB (Object Linking and Embedding, Database)
OLE DB (sometimes written as OLEDB or OLE-DB) is an API designed by Microsoft for accessing different types of data stored in a uniform manner. It is a set of interfaces implemented using the Component Object Model (COM); it is otherwise unrelated to OLE. It was designed as a higher-level replacement for, and successor to, ODBC, extending its feature set to support a wider variety of non-relational databases, such as object databases and spreadsheets that do not necessarily implement SQL.
http://en.wikipedia.org/wiki/OLE_DB

OLAP (Online Analytical Processing)
Online analytical processing, or OLAP, is an approach to quickly answer multi-dimensional analytical queries.
http://en.wikipedia.org/wiki/OLAP

OLTP (Online Transaction Processing)
Online transaction processing, or OLTP, refers to a class of systems that facilitate and manage transaction-oriented applications, typically for data entry and retrieval transaction processing.
http://en.wikipedia.org/wiki/OLTP

PDF (Portable Document Format)
Portable Document Format (PDF) is a file format created by Adobe Systems in 1993 for document exchange. PDF is used for representing two-dimensional documents in a manner independent of the application software, hardware, and operating system.
http://en.wikipedia.org/wiki/PDF

SAN (Storage Area Network)
A storage area network (SAN) is an architecture to attach remote computer storage devices (such as disk arrays, tape libraries, and optical jukeboxes) to servers in such a way that the devices appear as locally attached to the operating system.
http://en.wikipedia.org/wiki/San


SOX (Sarbanes-Oxley)
The Sarbanes-Oxley Act of 2002 (Pub.L. 107-204, 116 Stat. 745, enacted July 30, 2002), also known as the Public Company Accounting Reform and Investor Protection Act of 2002 and commonly called Sarbanes-Oxley, Sarbox or SOX, is a United States federal law enacted as a reaction to a number of major corporate and accounting scandals, including those affecting Enron, Tyco International, Adelphia, Peregrine Systems and WorldCom.
http://en.wikipedia.org/wiki/Sarbanes-Oxley_Act

SQL (Structured Query Language)
SQL (Structured Query Language) is a database computer language designed for managing data in relational database management systems (RDBMS). Its scope includes data query and update, schema creation and modification, and data access control.
http://en.wikipedia.org/wiki/SQL

SSAS / SSIS / SSRS (SQL Server Analysis Services / SQL Server Integration Services / SQL Server Reporting Services)
All three services are parts of Microsoft SQL Server, a database management system. Microsoft has included a number of services in SQL Server related to Business Intelligence and Data Warehousing. These services include Integration Services, Analysis Services and Reporting Services. Analysis Services includes a group of OLAP and Data Mining capabilities.
http://en.wikipedia.org/wiki/Microsoft_Analysis_Services

WCMS (Web Content Management System)
A web content management system (WCMS or Web CMS) is content management system (CMS) software, usually implemented as a Web application, for creating and managing HTML content. It is used to manage and control a large, dynamic collection of Web material (HTML documents and their associated images). A WCMS facilitates content creation, content control, editing, and many essential Web maintenance functions.
http://en.wikipedia.org/wiki/Web_content_management_system
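To make the SQL entry above concrete, here is a minimal aggregate query run against an in-memory SQLite database; the table and column names are invented for this example:

```python
import sqlite3

# Create an in-memory relational database and a small fact table
# (schema and rows are illustrative only).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("North", 100.0), ("North", 50.0), ("South", 75.0)])

# A typical SQL query: aggregate the amounts per region.
rows = conn.execute(
    "SELECT region, SUM(amount) FROM sales GROUP BY region ORDER BY region"
).fetchall()
print(rows)  # [('North', 150.0), ('South', 75.0)]
conn.close()
```

The same query style, scaled up, is what reporting tools issue against the relational layer of a data warehouse, while MDX plays the analogous role for its OLAP layer.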

XML (Extensible Markup Language)
XML (Extensible Markup Language) is a general-purpose specification for creating custom markup languages which specifies lexical grammar and parsing requirements. The term extensible is used to indicate that a markup-language designer has significant freedom in the choice of markup elements.
http://en.wikipedia.org/wiki/Xml

XMLA (XML for Analysis)
XML for Analysis (abbreviated as XMLA) is the industry standard for data access in analytical systems, such as OLAP and Data Mining. XMLA is based on other industry standards such as XML, SOAP and HTTP. XMLA is maintained by the XMLA Council, with Microsoft, Hyperion and SAS being the official XMLA Council founder members.
http://en.wikipedia.org/wiki/XMLA

YTD (Year To Date)
Year-to-date is a period starting January 1 or July 1 (depending on the country) of the current year and ending today. Year-to-date is used in many contexts, mainly for recording results of an activity in the time between a date (exclusive, since this day may not yet be "complete") and the beginning of either the calendar or fiscal year.
http://en.wikipedia.org/wiki/Year-To-Date
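As a brief illustration of the XML entry above, Python's standard library can parse a custom markup fragment; the element and attribute names below are arbitrary inventions, which is exactly what "extensible" refers to:

```python
import xml.etree.ElementTree as ET

# A tiny custom markup fragment (hypothetical report structure).
doc = "<report year='2009'><kpi name='revenue'>1200</kpi></report>"

root = ET.fromstring(doc)
print(root.tag)                       # report
print(root.get("year"))               # 2009
print(root.find("kpi").get("name"))   # revenue
print(root.find("kpi").text)          # 1200
```

XMLA builds on this foundation, wrapping analytical requests and responses in XML carried over SOAP and HTTP.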

