Monday 24 November 2008

Business Intelligence Challenge – Product Upgrades & Migrations, Moving the Code – 4

Last time we discussed Impact Assessment; the next logical step after that is to perform the actual upgrade or migration of the code.
Moving the Code: Performing Upgrade or Migration of the Objects
When we talk about product upgrades, the product vendor almost always provides tools with which objects from the earlier version can be upgraded to the latest version. Some objects will inevitably fail while passing through such tools; these are the ones that need rework after the upgrade process.
When we talk about product migration, such as moving from Cognos to Business Objects or from Business Objects to Cognos, there is good scope to automate the code migration. Earlier discussions covered how to leverage the metadata for understanding the environment; now we look at how to manipulate or transform the metadata so that an object in platform ‘A’ becomes compliant with platform ‘B’.
Steps involved in building an automated product migration process
Perform metadata-level object mapping between the two platforms and determine the gaps. This is effectively a by-product of ‘Step 2’ in Impact Assessment
Build individual components that would
  • Read the metadata from the source platform and prepare a repository
  • Hold the knowledge of the match and gap between the platforms, typically as reference tables
  • Transform the ‘source’ metadata and write it out in a form the ‘target’ platform understands, using the reference tables (a minimal sketch of such a bridge follows this list)
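As a rough illustration of these components, here is a minimal Python sketch of such a migration bridge. The object types, the reference table contents and the read/write functions are hypothetical placeholders; in a real engagement the reading and writing would go through the vendor SDKs or XML exports.

# Minimal sketch of an automated migration bridge (hypothetical names throughout).
# 1: read source metadata, 2: consult a match/gap reference table,
# 3: write out metadata in the form the target platform understands.

# Reference table capturing the object-level match between platform 'A' and 'B'.
# In practice this comes from the metadata mapping exercise above.
OBJECT_MAP = {
    "list_report": "table_report",
    "crosstab":    "crosstab_report",
    "prompt":      "input_control",
}

def read_source_metadata():
    """Placeholder for an SDK / XML export read from the source platform."""
    return [
        {"name": "Sales by Region", "type": "list_report", "columns": ["Region", "Sales"]},
        {"name": "Margin Crosstab", "type": "crosstab",    "columns": ["Product", "Margin"]},
    ]

def transform(obj):
    """Map a source object onto its target-platform equivalent via the reference table."""
    target_type = OBJECT_MAP.get(obj["type"])
    if target_type is None:
        # Gap: no equivalent on the target platform, flag for manual rework.
        return None
    return {"name": obj["name"], "type": target_type, "columns": obj["columns"]}

def write_target_metadata(objects):
    """Placeholder for an SDK / XML write into the target platform's repository."""
    for obj in objects:
        print("creating", obj["type"], "-", obj["name"])

if __name__ == "__main__":
    migrated = [t for t in (transform(o) for o in read_source_metadata()) if t is not None]
    write_target_metadata(migrated)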
Benefits of Automated Migration
  • Helps avoid creating objects from scratch
  • Frees up time for testing (the core task) rather than code development
  • Enables the team to have a flexible skillset
  • A faster way of delivering when a ‘one-to-one’ migration from the source platform is a must
Automated Migration Challenges
Transforming the source metadata to the target platform would be a challenge, especially with data manipulation functions. Having a good understanding of the gaps will help; a reference table mapping the functions between the platforms would be useful. In scenarios where a function cannot be converted to the target platform, a comment can be written into a log file enabling quicker attention.
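To make the function-mapping idea concrete, a sketch along the following lines could sit behind such a reference table. The function names and the log format here are illustrative assumptions, not the syntax of any particular product.

# Illustrative function-mapping step: translate data-manipulation functions in an
# expression from the source platform's syntax to the target's, and log anything
# that cannot be converted so it gets quicker manual attention.
import re

# Hypothetical reference table mapping source functions to target functions.
FUNCTION_MAP = {
    "substr":  "substring",
    "nvl":     "coalesce",
    "to_date": "todate",
}

def convert_expression(expression, log_path="unconverted_functions.log"):
    unconverted = []

    def replace(match):
        name = match.group(1)
        mapped = FUNCTION_MAP.get(name.lower())
        if mapped is None:
            unconverted.append(name)
            return name  # leave the source function untouched for manual rework
        return mapped

    converted = re.sub(r"\b(\w+)\s*\(", lambda m: replace(m) + "(", expression)
    if unconverted:
        with open(log_path, "a") as log:
            for name in unconverted:
                log.write(f"No target equivalent for function '{name}' in: {expression}\n")
    return converted

print(convert_expression("nvl(substr(Region, 1, 3), 'NA')"))
print(convert_expression("datediff(OrderDate, ShipDate)"))  # logged as unconverted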
I have seen good success in writing such automated migration components, though not 100%. With almost every product now providing good SDKs for both reading and writing metadata, along with support for XML structures, writing such bridges for object migration is getting easier.
Whether the objects in a product are migrated or upgraded in an automated way or not, the activity that follows, ‘Validation’, plays a key role in ensuring the final quality. Next time, let us discuss some of the means for effective validation.

Thursday 20 November 2008

Zachman Framework for BI Assessments

The Zachman Framework for Enterprise Architecture has become the model around which major organizations view and communicate their enterprise information infrastructure. Enterprise Architecture provides the blueprint, or architecture, for the organization’s information infrastructure. More information on the Zachman Framework can be obtained at www.zifa.com.
For BI practitioners, the Zachman Framework provides a way of articulating the current state of the BI infrastructure in the organization. Ralph Kimball in his eminently readable book “The Data Warehouse Lifecycle Toolkit” illustrates how the Zachman Framework can be adapted to the Business Intelligence context.
Given below is a version of the Zachman Framework that I have used in some of my consulting engagements. This is just one way of using this framework but does illustrate the power of this model in some measure.
[Figure: Zachman Framework adapted for BI assessment]
Some Salient Points with respect to the above diagram are:
  • The framework answers the basic questions of “What”, “How”, “Who” and “Where” across 4 important dimensions – Business Requirements, Conceptual Model, Logical/Physical Model and Actual Implementation.
  • Zachman Framework reinforces the fact that a successful enterprise system combines the ingredients of business, process, people and technology in proper measure.
  • It is typically used to assess the current state of the BI infrastructure in any organization
  • Each of the cells at the intersection of the rows and columns (e.g. Information Requirements of Business) has to be documented in detail as part of the assessment document
  • Information on each cell is gathered through subjective and objective questionnaires.
  • Scoring models can be developed to provide an assessment score for each of the cells. Based on the scores, a set of recommendations can be provided to achieve the intended goals (see the sketch after this list).
  • Another interesting thought is to create an As-Is Zachman framework and overlay it with a To-Be one in situations where re-engineering of a BI environment is undertaken. This will help provide a transition path from the current state to the future state.
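Purely as an illustration of the scoring idea, the sketch below assigns a weighted score to each cell and turns the total into a recommendation band. The cell list, weights and thresholds are assumed values for the example, not a prescribed scoring model.

# Illustrative scoring model for a Zachman-style BI assessment.
# Each cell at a row/column intersection gets a 1-5 score from the questionnaires;
# weights reflect how much each cell matters for the engagement (assumed values).

CELL_SCORES = {
    ("Business Requirements",  "What"):  4,   # e.g. Information Requirements of Business
    ("Business Requirements",  "How"):   3,
    ("Conceptual Model",       "What"):  2,
    ("Logical/Physical Model", "How"):   3,
    ("Actual Implementation",  "Where"): 5,
}

CELL_WEIGHTS = {cell: 1.0 for cell in CELL_SCORES}      # equal weights by default
CELL_WEIGHTS[("Business Requirements", "What")] = 2.0   # emphasise requirements for this engagement

def overall_score(scores, weights):
    """Weighted average of cell scores, kept on the 1-5 scale."""
    total_weight = sum(weights[cell] for cell in scores)
    return sum(scores[cell] * weights[cell] for cell in scores) / total_weight

def recommendation(score):
    if score >= 4.0:
        return "Mature: focus on incremental improvements"
    if score >= 2.5:
        return "Developing: target the lowest-scoring cells first"
    return "Early stage: significant re-engineering recommended"

score = overall_score(CELL_SCORES, CELL_WEIGHTS)
print(f"Overall assessment score: {score:.2f} -> {recommendation(score)}")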
Thanks for reading. If you have used the Zachman framework differently in your environment, please do share your thoughts.

Monday 10 November 2008

Valuing your Business Intelligence System – Part 1

Sample these statements:
  • Dow Jones Industrial Average jumped 200 points today, a 2% increase from the previous close
  • The carbon footprint of an average individual in the world is about 4 tonnes per year which is a 3% increase over last year
  • The number of unique URLs on the World Wide Web as of July 2008 is 1 trillion; the previous landmark of 1 billion was reached in 2000
  • One day 5% VaR (Value at Risk) for the portfolio is $ 1 Million as compared to the VaR of $ 1.3 Million a couple of weeks back
Most of us buy into the idea of having a single number that encapsulates complex phenomena. Though the details of the underlying processes are important, the single number (and the trend) does act like a bellwether of sorts helping us quickly get a feel of the current situation.
As a BI practitioner, I feel that it is about time that we formulated a way for valuing the BI infrastructure in organizations. Imagine a scenario where the Director of BI in company X can announce thus: “The value of the BI system in this organization has grown 15% over the past 1 year to touch $50 Million” (substitute your appropriate currencies here!).
The core idea of this post is to find a way to “scientifically put a number to your data warehouse”. Here are a few level setting points:
  1. Valuation of BI systems is different from computing the Return on Investment (ROI) for BI initiatives. ROI calculations are typically done using Discounted Cash Flow techniques and are used in organizations to some extent
  2. More than the absolute number, the trends are important which means that the BI system has to be valued using the same norms at different points in time. Scientific / Mathematical rigor helps in bringing the consistency aspect.
My perspective on valuation is based on the “Outside-in” logic, where the fundamental premise is that the value of the BI infrastructure is completely determined by its consumption. In other words, if there are no consumers for your data warehouse, the value of such a system is zero. One simple yet powerful technique in the “Outside-in” category is RFM Analysis. RFM stands for Recency, Frequency and Monetary, and is very popular in the direct marketing world. My 2-step hypothesis for BI system valuation using the RFM technique is:
  • Step 1: Value of BI system = Sum of the values of individual BI consumers
  • Step 2: Value of each individual consumer = Function (Recency, Frequency, Monetary parameters)
Qualitatively speaking, from the business user standpoint, a consumer who has accessed information from the BI system more recently, uses data more frequently, and uses that information to make decisions critical to the organization is given a higher value. A calibration chart provides the specific value associated with the RFM parameters based on the categories within them. For example, for the Recency parameter, usage of information within the last 1 day can be fixed at 10 points while access 10 days back fetches 1 point. I will explain my version of the calibration chart in detail in subsequent posts. (Please note that the conversion of points to dollar values is also an interesting, non-trivial exercise.)
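To make the two-step hypothesis concrete, here is a small sketch of the valuation. The point bands and the points-to-dollar factor below are placeholder assumptions; the actual calibration chart will be covered in subsequent posts.

# Sketch of the two-step RFM valuation:
#   Step 1: value of BI system = sum of individual consumer values
#   Step 2: value of a consumer = f(Recency, Frequency, Monetary)
# The calibration bands and points-to-dollar factor are assumed values.

def recency_points(days_since_last_access):
    if days_since_last_access <= 1:
        return 10          # accessed within the last day
    if days_since_last_access <= 5:
        return 5
    return 1               # access 10+ days back

def frequency_points(accesses_per_month):
    if accesses_per_month >= 20:
        return 10
    if accesses_per_month >= 5:
        return 5
    return 1

def monetary_points(decision_impact):
    # Criticality of the decisions the consumer makes with the information.
    return {"high": 10, "medium": 5, "low": 1}[decision_impact]

POINT_TO_DOLLAR = 1_000    # assumed conversion factor

def consumer_value(days_since_last_access, accesses_per_month, decision_impact):
    points = (recency_points(days_since_last_access)
              + frequency_points(accesses_per_month)
              + monetary_points(decision_impact))
    return points * POINT_TO_DOLLAR

consumers = [
    (0, 25, "high"),    # daily user driving critical decisions
    (12, 3, "low"),     # occasional user making low-impact decisions
]
bi_system_value = sum(consumer_value(*c) for c in consumers)
print(f"BI system value: ${bi_system_value:,}")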
I am sure people acknowledge that valuing data assets is difficult, tricky at best. But then, far more difficult questions about nature and behavior have been reduced to mathematical equations; the day when BI practitioners can apply standardized techniques to value their BI infrastructure is probably not too far off.