
Monday 28 April 2008

Let’s talk EPM – Part 1

Welcome to the world of Enterprise Performance Management (EPM), often considered the Holy Grail of Business Intelligence. EPM and its variously named manifestations – Business Performance Management (BPM), Corporate Performance Management (CPM) etc. – refer to a set of processes that help organizations optimize their business performance.
Does it sound good? – Of course, yes! Show me an organization that does not want to optimize!!
Does it sound practical? – Not really! Don’t know where to start!!
EPM means many things to many people – optimizing business performance can mean optimization at the business process level (local optima), optimization at the organizational level (global optima), or many flavors in between.
With many BI vendors jumping on the EPM bandwagon, the problem is that EPM is immediately equated with the solutions provided by tools like Business Objects, Cognos, Hyperion etc. That view, in my opinion, is far removed from the truth.
In this series of posts, I would like to share some thoughts on making EPM a practical reality in organizations. To start with, let me enumerate the components of an EPM strategy:
  1. Business Process Maps – Understand the business process
  2. Metrics Identification – Get hold of the metrics
  3. Metrics Profiling – Understand the metrics in depth
  4. Metrics Maps – Understand the cause and effect relationships between metrics (a small sketch follows this list)
  5. Metrics Visualization – Implementation of Metric Maps on BI Tools
  6. Watch and Improve – Monitor Metrics and Improve business process as required
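To make the idea of a metrics map (item 4) a little more concrete, here is a minimal sketch in Python. The metric names and cause-and-effect links are purely illustrative; in practice they would come out of the metrics identification and profiling steps above.

```python
# Minimal sketch of a metrics map: cause-and-effect links between metrics
# held as a directed graph (assumed acyclic). The metric names and links
# below are purely illustrative.

METRICS_MAP = {
    "Inventory Accuracy": ["Order Fill Rate"],
    "Order Fill Rate": ["On-Time Delivery %"],
    "On-Time Delivery %": ["Customer Satisfaction Index"],
    "Customer Satisfaction Index": ["Revenue Growth %"],
}

def upstream_drivers(metric: str, metrics_map: dict) -> set:
    """Return every metric that directly or indirectly drives `metric`."""
    drivers = set()
    for cause, effects in metrics_map.items():
        if metric in effects:
            drivers.add(cause)
            drivers |= upstream_drivers(cause, metrics_map)
    return drivers

if __name__ == "__main__":
    # Prints the four metrics that ultimately drive revenue growth.
    print(upstream_drivers("Revenue Growth %", METRICS_MAP))
```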
A keen observer will immediately realize that implementing EPM involves a lot more pen-and-paper work (substitute your favorite analysis tool here!) before technology comes into the picture. Also, in my opinion, there is no silver bullet – no single metrics map can fit companies across industries, or even within the same industry. The EPM framework for an organization has to evolve in phases based on the company's growth, its corporate vision, the numbers that matter at different stages and so on; in other words, 'EPM is very personal to an organization'.
EPM, for a BI practitioner, represents a convergence of many things –
  • Domain Understanding
  • Quantitative Play
  • BI Tool capability
  • Closed-Loop BI Architecture
  • Knowledge of proven methodologies like Six Sigma, Balanced Scorecard etc.
I will try to explain some of the interesting aspects of an EPM strategy, like Metrics Profiling and Metrics Maps, in the next few posts. Meanwhile, you can take a look at resources like this one (http://www.dmreview.com/issues/20050501/1026062-1.html) to understand the ‘big picture’ with respect to Enterprise Performance Management.
Thanks for reading!

Tuesday 15 April 2008

Using Analytic Hierarchy Process (AHP) for BI Tool Evaluation

An enterprise-wide BI architecture utilizes a multitude of tools within its landscape, each serving a specific function – Extract, Transform and Load (ETL), Data Cleansing, Metadata Management, Databases (both relational and multidimensional), Reporting and Analytics (OLAP), Data Mining etc. Taking the OLAP area alone, there are more than 40 different products that can potentially solve a customer problem; you can imagine the number of combinations possible when the tool options are combined across the overall landscape. This makes tool evaluation one of the most challenging and vexing problems in the Business Intelligence domain.
Tool evaluation and selection has become strategic to the implementation of enterprise-wide Business Intelligence. Traditionally, tool selection involved comparing the technical features of the tools, looking at demos by product vendors, reading industry reports, getting word-of-mouth referrals and then taking a final decision. In my humble opinion, that is not sufficient any more.
Technical features, though important, cannot be the definitive criteria for selecting a particular tool. More crucial than technical features is what I term the “Business Fitment Index”. The selected tool should fit the characteristics of the business processes prevalent in the organization and should take into account the requirements of different classes of users. Arriving at the Business Fitment can be framed as a Multi Criteria Decision Making (MCDM) problem, and one of the most powerful techniques in this category is the Analytic Hierarchy Process (AHP).
AHP is a systematic procedure that helps to:
  1. Represent the elements of any problem, breaking it down into smaller constituents
  2. Assign weightages to each constituent by following a pairwise comparison technique
  3. Leverage expert judgment and intuitive feel into a coherent framework for problem solving
Though AHP can be used in many situations, Hexaware’s BI practice has perfected the art of leveraging its power in the realm of “BI Tools Evaluation”. There are 3 steps to calculating the Business Fitment Index using AHP.
Step 1 – Customer stakeholders perform a pair-wise comparison of the business parameters. The parameters can be things like Real-Time Data Integration, Data Volumes, Data Quality, Business Rules Flexibility etc.
Step 2 – Relative ranking of Business Parameters based on the AHP (Analytic Hierarchy Process) technique
Step 3 – Each of the short-listed tools is evaluated against the business parameters, and a final rating is arrived at taking into account the organization's readiness factors (a small sketch of the calculation follows these steps).
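To make the three steps concrete, here is a minimal sketch of the AHP arithmetic in Python/NumPy. Everything in it – the three business parameters, the pairwise judgments, the tool names and their scores – is hypothetical and only meant to show the mechanics of deriving priority weights from a pairwise comparison matrix and rolling them up into a fitment score.

```python
import numpy as np

# Hypothetical 3x3 pairwise comparison matrix (Saaty 1-9 scale) for three
# illustrative business parameters: Real-Time Data Integration, Data Volumes
# and Data Quality. A[i][j] says how much more important parameter i is
# judged to be than parameter j (Step 1).
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# Step 2: the relative ranking (priority vector) is the principal
# eigenvector of A, normalised so the weights sum to 1.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()

# Consistency ratio: a CR below roughly 0.10 is usually taken as acceptable.
n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)
cr = ci / 0.58  # 0.58 = Saaty's random consistency index for n = 3
print("parameter weights:", weights.round(3), "consistency ratio:", round(cr, 3))

# Step 3 (sketch): rate each short-listed tool against every parameter
# (the scores below are made up) and combine with the AHP weights to get
# a Business Fitment Index per tool.
tool_scores = {"Tool X": [7, 9, 6], "Tool Y": [8, 6, 8]}
for tool, scores in tool_scores.items():
    print(tool, "fitment index:", round(float(np.dot(weights, scores)), 2))
```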
Bottom-line is that the technical features of the tools have to be taken in conjunction with the fitment level of tool to the characteristics of the business. That alone would ensure the success of the tool for enterprise wide BI initiatives.
AHP is a simple yet powerful way of arriving at a decision by consensus. There are wide ranging applications of AHP in BI and this is a great area for practitioners to get interested. If you have some thoughts on other applications of AHP in the BI world, please do share it with us. Thanks for reading!

Wednesday 9 April 2008

Data Integration Challenge – Initial Load – II

The other challenges during the initial load are:
  • System Performance
  • Catching Up
System Performance is always a challenge during the initial load, especially when many years of history data have to be loaded; the huge data volume drives up the usage of system resources in a way that does not happen during a regular incremental load. Some of the ways of handling system performance issues during the initial load are:
  • Group the data loads by filters like year or customer and load the related data in chunks. For example, we could load data for the months of Jan, Feb and Mar separately, or load customers region-wise – NY first, followed by NJ etc. Such grouping of records eliminates the data surge and also provides a better way to perform data validation (see the sketch after this list).
  • As data gets loaded into the warehouse, the volume of data needed for lookups from the warehouse becomes huge, so we need to restrict the lookup records based on the incoming data. For example, if the warehouse holds data for all the regions – North, South, East and West – and the incoming data currently has only North data, then we need an override filter so that only the data pertaining to North is read from the warehouse.
  • We could plan for and increase the available memory on the ETL server for a temporary period.
  • Avoid sequential inserts – write the data to a file, sort it and bulk load.
  • Determine and plan for the additional disk space required for the initial-load files that are extracted and provided by the source systems.
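To illustrate the grouping idea in the first bullet, here is a minimal sketch of a month-wise chunked load in Python. The date range is made up, and `load_chunk` is only a placeholder for whatever extract-and-load routine your ETL tool provides.

```python
from datetime import date

def load_chunk(start: date, end: date) -> None:
    # Placeholder for the real extract-and-load step; it just reports the
    # window it was asked to load.
    print(f"loading history rows with txn_date >= {start} and < {end}")

def month_windows(first: date, last: date):
    """Yield (start, end) pairs covering one calendar month each."""
    start = first
    while start <= last:
        end = date(start.year + 1, 1, 1) if start.month == 12 \
            else date(start.year, start.month + 1, 1)
        yield start, end
        start = end

# Load several years of history one month at a time instead of in one big
# load; each window can also be validated before moving on to the next.
for start, end in month_windows(date(2005, 1, 1), date(2007, 12, 1)):
    load_chunk(start, end)
```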
Catching Up is an interesting problem wherein the warehouse is not able to cope and deliver data that is as current (or one day old) as the source system. This is most often due to ETL performance issues: even before the initial data has been successfully loaded and verified in the DW, an additional set of new records arrives from the source, and the DW is never able to catch up. Though at times some code tuning, running things in parallel or a hardware upgrade (usually a costly option at a later stage) can resolve such problems, in certain situations the problem can reach an unsolvable state wherein the complete ETL architecture has to be re-examined.
One other way to manage such situations is to let the daily process of loading current data proceed independently, while a separate set of processes brings in the history data in parallel on a regular basis. In certain scenarios we might also need to build a process that runs periodically to sync up the current data with the old data, especially for any aggregates designed in the data model.
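As a rough sketch of that arrangement, the outline below keeps the daily incremental load, the history backfill and the aggregate sync-up as separate placeholder jobs; in practice each would be scheduled independently by the ETL scheduler rather than called in sequence.

```python
# Placeholder jobs only -- the point is the separation of concerns, not the
# load logic itself.

def load_daily_increment() -> None:
    print("loading yesterday's source records into the warehouse")

def load_history_batch() -> None:
    print("loading the next chunk of history data, independently of the daily load")

def sync_aggregates() -> None:
    print("rebuilding aggregates that span both current and history data")

if __name__ == "__main__":
    # Called in sequence here only for illustration; a scheduler would run
    # the daily load and the history backfill as independent streams.
    load_daily_increment()
    load_history_batch()
    sync_aggregates()
```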