Saturday, December 26, 2009

Successful BI Strategy - Part I

We often run into situations where major companies ask us to help develop a BI strategy. When we ask companies about the objective of implementing a BI solution, we quite often hear statements such as:
  • “…produce enhanced organizational capabilities to manage data and information as organizational assets.”
  • “…provide a single version of the truth.”
  • “…enable consistent and reliable access to accurate corporate-wide data.”
  • “…provide more sophisticated reporting and analysis, faster turnaround, improved accessibility and enhanced quality.”
  • “…a single touch point where detailed financial transaction information can be filtered on user-entered selection criteria, viewed online, downloaded in standard file formats and used to generate real time reports.”
These objectives don't excite business executives and managers because they don't articulate how business intelligence will be used within specific business processes to improve business performance. As a result, they underfund business intelligence, which limits its business impact. Very recently, we faced a situation at a large organization where the business teams refused to fund a BI project. They didn't see any compelling reason or business case to implement a BI solution.

Quite often, organizations also focus on functional requirements such as the following during the BI product evaluation cycle:
  • The system shall provide the ability to drill down, drill across, and slice-and-dice.
  • The system shall provide the ability to specify organizational hierarchies and display performance scorecards for each organizational unit.
  • The system shall enable role-based access to information.
  • The system shall provide capabilities to route alerts to business users according to user-defined parameters.
  • The system shall enable integration of data from multiple disparate sources.
BI functional requirements like those listed above are standard features of commercially available BI tools. While it is important to know what your company needs BI tools to do, functional requirements typically say little about the kinds of business information, analytical techniques and decision support that are required, or about the specific core business processes that the company seeks to improve via business intelligence.

As per the Gartner report "Fatal Flaws in BI Implementation", it is very important to get buy-in and active participation from business teams for a BI project to be successful. This requires a clear linkage between business strategies, the core business processes via which those strategies are executed, and BI-driven business improvement opportunities. That linkage is the basis for a BI business case that is compelling to business stakeholders.

Some examples of compelling BI system objectives:
  • "The BI system will help reduce transportation cost by 5%."
  • "The BI system will help reduce cash-out situations at ATMs to less than 3."
  • "The BI system will help reduce idle cash in ATMs by 40%."
  • "The BI system will help increase private label sales by 5% in 80% of retail outlets."
  • "The BI system will help reduce stock-out situations to less than 2 per outlet for premium or fast-moving items."
  • "The BI system will help increase share of wallet by 10%."
Each of the above BI objectives is linked to a business process. It clearly tells the business team how implementing a BI system can help them achieve their goals. It becomes easier to get buy-in from business executives and managers when BI is directly linked to business process improvement.

When you develop a BI strategy, do not look at point solutions like reporting or data integration. It is better to look at a business analytics framework that can help you improve your business processes and achieve your business goals. The framework will in turn comprise a set of solutions that address your business problems. Point solutions like reporting and data integration will deliver short-term benefits, but they will not earn you long-term benefits and business support.

Sunday, December 20, 2009

Fatal Flaws in Business Intelligence Implementations

Many organizations assume that business intelligence (BI) projects are like any other project and are often surprised when their BI project spins out of control. The requirements appear to be a "moving target"; the schedule keeps slipping; the source data is much dirtier than expected and is impacting the ETL team; the staff does not have the necessary skills and is not properly trained; communication between staff members takes too long; traditional roles and responsibilities, and how they are assigned, seem to result in too much rework; the traditional methodology does not seem to work; and so on.

BI projects are often political in nature, as many people do not like their performance being tracked by management. This requires culture change and creating awareness about the benefits of BI within the end-user community. A BI project should be seen as a business enabler rather than a performance tracking tool: users should use the BI system to meet or exceed their KPIs.

I had been thinking about writing on this topic for a long time when I came across a nice research paper on it from Gartner. I have shared the details of the Gartner report below as is. I have personally seen some of the flaws mentioned below in a lot of recent BI projects.

Most failed business intelligence (BI) efforts suffer from one or more of nine fatal flaws, generally revolving around people and processes rather than technology, according to Gartner, Inc.

Gartner said the failure to achieve strategic results usually stems from one or more of nine common mistakes:

Flaw No. 1: Believing that “If you build it, they will come”

Often the IT organisation sponsors, funds and leads its BI initiatives from a technical, data-centric perspective. The danger with this approach is that its value is not obvious to the business, and so all the hard work does not result in massive adoption by business users — with the worst case being that more staff are involved in building a data warehouse than use it regularly.

Gartner recommends that the project team include significant representation from the business side. In addition, organisations should establish a BI competency centre (BICC) to drive adoption of BI in the business, as well as to gather the business, technology and communication skills required for successful BI initiatives.

Flaw No. 2: Managers "dancing with the numbers"

Many companies are locked into an “Excel culture” in which users extract data from internal systems, load it to spreadsheets and perform their own calculations without sharing them companywide. The result of these multiple, competing frames of reference is confusion and even risk from unmanaged and unsecured data held locally by individuals on their PCs.

BI project instigators should seek business sponsors who believe in a transparent, fact-based approach to management and have the strength to cut through political barriers and change culture.

Flaw No. 3: "Data quality problem? What data quality problem?"

Data quality issues are almost ubiquitous and the impact on BI is significant — people won’t use BI applications that are founded on irrelevant, incomplete or questionable data.

To avoid this, firms should establish a process or set of automated controls to identify data quality issues in incoming data and block low-quality data from entering the data warehouse or BI platform.

Flaw No. 4: "Evaluate other BI platforms? Why bother?"

“One-stop shopping” or buying a BI platform from the standard corporate resource application vendor doesn’t necessarily lower the total cost of ownership or deliver the best fit for an organisation’s needs.

BI platforms are not commodities and all do not yet deliver all functions to the same level, so organisations should evaluate competitive offerings, rather than blindly taking the path of least resistance.

Integration between the application vendor’s ERP/data warehouse and BI offerings is not a compelling reason for ignoring alternatives, especially as many third-party BI platforms are as well integrated.

Flaw No. 5: "It's perfect as it is. Don't ever change"

Many organisations treat BI as a series of discrete (often departmental) projects, focused on delivering a fixed set of requirements. However, BI is a moving target — during the first year of any BI implementation, users typically request changes to suit their needs better or to improve underlying business processes. These changes can affect 35 per cent to 50 per cent of the application’s functions.

Organisations should therefore define a review process that manages obsolescence and replacement within the BI portfolio.

Flaw No. 6: “Let’s just outsource the whole darn BI thing”

Managers often try to fix struggling BI efforts by hiring an outsourcer that they expect will do a better job at a lower cost. Focusing too much on costs and development time often results in inflexible, poorly architected systems.

Organisations should outsource only what is not a core competency or business and rely on outsourcing only temporarily while they build skills within their own IT organisation.

Flaw No. 7: “Just give me a dashboard. Now”

Many companies press their IT organisations to buy or build dashboards quickly and with a small budget. Managers don’t want to fund expensive BI tools or information management initiatives that they perceive as lengthy and risky. Many of the dashboards delivered are of very little value because they are silo-specific and not founded on a connection to corporate objectives.

Gartner recommends that IT organisations make reports as pictorial as possible — for example, by including charting and visualisation — to forestall demands for dashboards, while including dashboarding and more-complex visualisation tools in the BI adoption strategy.

Flaw No. 8: "X + Y = Z, doesn't it?"

A BI initiative aims to create a "single version of the truth", but many organisations haven't even agreed on the definition of fundamentals, such as "revenue". Achieving one version of the truth requires cross-departmental agreement on how business entities (customers, products, key performance indicators, metrics and so on) are defined.

Many organisations end up creating siloed BI implementations that perpetuate the disparate definitions of their current systems. IT organisations should start with their current master data definitions and performance metrics to ensure that BI initiatives have some consistency with existing vocabulary, and publicise these “standards”.

Flaw No. 9: “BI strategy? No thanks, we’ll just follow our noses”

The final and biggest flaw is the lack of a documented BI strategy, or the use of a poorly developed or implemented one. Gartner recommends creating a team tasked with writing or revising a BI strategy document, with members drawn from the IT organisation and the business, under the auspices of a BICC or similar entity.

“Simple departmental BI projects that pay an immediate return on investment can mean narrow projects that don’t adapt to changing requirements and that hinder the creation of companywide BI strategies,” said James Richardson, research director at Gartner.

Link to Gartner report:
http://www.gartner.com/it/page.jsp?id=774912

Friday, December 4, 2009

Intelligent Operational System

I just received an automated "telemarketing" SMS on my cell phone. Big deal and who cares, right? Well, it isn't a big deal, and it's doubtful anyone cares. But it did bring to mind an interesting reminder about intelligent operational systems when designing a customer experience.

The SMS was an automated message from Crossword. For those who do not know Crossword, it is a big bookstore chain in India. They didn't SMS me to sell anything. Instead, they sent the SMS to help me, which ultimately helps them.

The SMS was from the Crossword "Book Rewards" program. If you are not familiar with the Book Rewards Program, it is Crossword's customer loyalty program, where you scan a card at the point of sale and a percentage of your purchase counts towards an in-store credit or a gift voucher. Crossword mails you a gift voucher for the credit, and the credit can be used in the store.

The SMS was to alert me that my gift voucher, which I had forgotten about, would expire soon. It provided me with the details of my gift voucher, such as how much it was, and when it would expire. The information was provided a month ahead of the expiration date, which would allow me time to get a replacement voucher if I didn't receive the original, or provide me with ample time to schedule a trip or research a purchase. I wasn't thinking about going to Crossword in the near future, but I was considering buying a new book from Oxford Book Store which is located near our office. Now that I've been reminded that I have credit at Crossword, I'll buy it there.

It reminds us that we need to think about the "systems" in which our customers reside. These "systems" include technology, work, and social context components, and the interaction of these components provides opportunities or limitations which we may not have considered. The automated SMS from Crossword is a good example of an intelligent operational system, since it leveraged the capabilities of automated computer and telephony systems to reach into my "system" and gently deliver a socially acceptable message indicating that "we miss your business, so please shop with us soon."

While designing applications, seriously consider the user's system, and the opportunities and limitations provided by the system. By applying these considerations to our designs, we can make applications which will ultimately be more helpful for our customers, as well as easier to use.

Monday, November 30, 2009

Information Overload & BI

In these difficult times we live in, when resources seem scarce, there is still one thing that is widely and abundantly available: information. According to the most recent statistics, the amount of information created annually by businesses and organizations, paper and digital combined, is growing at a rate of more than 65%. The amount of digital information being created in the world and distributed in emails, instant messages, blog posts, new Web pages, digital phone calls, podcasts and so on, will increase 10-fold over the next five years. The one fact that stands out is this: The growth of information is relentless.

There is too much information available in various forms. Is it information overload, or is it a failure of information filters? There is so much information out there that one can't browse through every possible bit of it. Business intelligence systems can play a very important role here: a BI system can act as an information filter and surface the information that is critical and needs your attention. In today's world, when someone has hundreds of KPIs to monitor, a BI system can help identify only those KPIs which need immediate attention. One can start the day with the BI portal. Typically, one follows the following routine:
  • Check Emails
  • Check Calendar (Meeting Schedule)
  • Check Important News/stocks
  • Check Most critical KPIs
  • Prepare To-Do List for a day
  • Prepare/View status reports
  • Collaborate with colleagues using the enterprise messenger
A BI portal can integrate all of the above information and show it in a single UI. One need not log into five different systems to perform the above tasks. Business users can start their day with the business intelligence system, and they will also be able to relate some of the emails/news to the status of the most critical KPIs. The BI system can be tightly integrated with a content categorization tool which ensures that only relevant information is delivered to users as per their role and choice. The content categorization tool can also categorize documents/news as per the meaning of the items. This helps users weed out information which is not useful and saves them a lot of time browsing through every possible piece of information. A minimal sketch of a KPI filter of this kind is shown below.
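The sketch below is a toy illustration only; the KPI records, field names and the 5% tolerance are assumptions, not the logic of any particular BI product.

```python
# Hypothetical sketch: flag only the KPIs that need immediate attention
# on the BI portal home page. Field names and the tolerance are assumptions.

def kpis_needing_attention(kpis, tolerance=0.05):
    """Return KPIs whose actual value misses the target by more than the tolerance."""
    flagged = []
    for kpi in kpis:
        target, actual = kpi["target"], kpi["actual"]
        if target and (target - actual) / target > tolerance:
            flagged.append(kpi)
    return flagged

portal_kpis = [
    {"name": "On-time delivery %",     "target": 95.0, "actual": 88.0},
    {"name": "ATM uptime %",           "target": 99.5, "actual": 99.4},
    {"name": "Private label sales growth %", "target": 5.0, "actual": 4.9},
]

for kpi in kpis_needing_attention(portal_kpis):
    print(f"Attention: {kpi['name']} at {kpi['actual']} vs target {kpi['target']}")
```

In a real portal the same filter would sit behind the home page, so the user sees only the handful of KPIs that actually need action that day.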

Saturday, November 21, 2009

Analytical MDM Vs Operational MDM

I had an interesting conversation about MDM with one of my customers last week. It is a mid-sized bank, and they are in the process of evaluating MDM. When I asked him, "How confident are you about the quality of your data?", he said, "Honestly, I do not know." I then told him that one of the prerequisites for MDM is good quality data. If the quality of your data is not good, an MDM implementation is bound to fail.

I have seen quite a few organizations that would like to embark on an MDM initiative without having a good data quality system in place, thanks to the huge amount of marketing money spent by some large IT product vendors. Quite often organizations fall into this trap and end up investing a huge amount of money and effort.

There are two types of MDM solutions in the market: operational MDM and analytical MDM.

Operational MDM is used to collect customer information at the front desk. It standardizes the mechanism for capturing customer information at the various customer touch points in an organization; typically, organizations have 5-20 customer touch points. The solution provides customer information to the various operational systems in the organization and ensures that any change made to customer information at any touch point is propagated to all operational systems. It works well in organizations which are still in the process of implementing operational systems and have very few customer touch points. This approach requires discipline and a huge amount of training effort. Currently most banks in India already have operational systems in place. These operational systems are built using old technology and capture customer information specific to their application. Enhancements to these operational systems are very time consuming and lead to performance issues. Hence this solution will not be suitable for most large and mid-sized organizations. It also requires a huge training effort for front desk staff.

Analytical MDM is used for historical and predictive analysis. This solution sources data from transactional systems such as CRM, ERP, CBS, LOS and so on. The analytical MDM can be updated once a day or multiple times a day; I have seen banks updating it once a day, which is more than sufficient for their current business requirements. This solution doesn't require retraining of front desk staff. However, it requires tight integration with the transactional systems. Analytical MDM should be SOA enabled, so that a source system can call a web service and check whether a "new" customer is already a customer of the bank (a rough sketch of such a call is shown below). Analytical MDM will also provide information related to the class of customer (Preferred, Gold, etc.) and behaviour based on past transactions. This helps in taking decisions about loan approval, issuing a credit card or giving preferential service to a customer.
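The endpoint, request fields and response fields in this sketch are hypothetical; it only illustrates the pattern of a source system querying an analytical MDM match service before on-boarding a customer.

```python
# Hypothetical sketch of a source system calling an analytical MDM
# customer-match service before on-boarding a "new" customer.
# The URL and JSON fields are assumptions, not a real product API.
import requests

MDM_MATCH_URL = "https://mdm.example-bank.internal/api/customer-match"

def check_existing_customer(name, date_of_birth, pan_number):
    payload = {"name": name, "dob": date_of_birth, "pan": pan_number}
    response = requests.post(MDM_MATCH_URL, json=payload, timeout=5)
    response.raise_for_status()
    # Assumed response shape: {"is_existing": true, "customer_id": "...",
    #                          "segment": "Gold", "behaviour_score": 0.82}
    return response.json()

match = check_existing_customer("A. Kumar", "1975-03-14", "ABCDE1234F")
if match.get("is_existing"):
    print("Existing customer:", match["customer_id"], "segment:", match["segment"])
else:
    print("Genuinely new customer - proceed with on-boarding")
```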

The way analytical MDM and operational MDM store data is also different. Analytical MDM stores data in a denormalized format so that it can be retrieved easily for analysis, whereas operational MDM stores data in a normalized format so that it can be updated quickly.

Operational MDM stores demographic details such as age, birth date, name and address, whereas analytical MDM stores information related to profitability, behaviour score, credit score and propensity to buy a product, apart from the demographic details of a customer.

Both types of MDM solutions require a strong data quality engine at the backend. This data quality engine should be capable of addressing the peculiarities of Indian addresses and names.

Sunday, November 8, 2009

Task Based Intelligence

I was working on the "Task Based Intelligence" concept a few years ago. The idea was to integrate BI with operational systems. For example, when someone is creating a purchase order in the ERP system, he can see the scorecard of a supplier without going to a separate interface or application; the supplier scorecard is embedded into the PO application. This not only prevents a PO from going to a blacklisted supplier, but also gives users the flexibility to select a supplier based on priority at that point in time. The supplier can be selected based on a score determined from various parameters such as lead time, on-time delivery performance, price and quality of supply (rejection rate). A simple sketch of such a score follows.
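The weights, scales and normalisation below are illustrative assumptions rather than any product's actual scoring formula.

```python
# Hypothetical weighted supplier score combining the parameters mentioned
# above: lead time, on-time delivery, price and quality (rejection rate).
# Weights and normalisation ranges are illustrative assumptions.

def supplier_score(lead_time_days, on_time_pct, price_index, rejection_pct,
                   weights=(0.2, 0.3, 0.2, 0.3)):
    """Return a 0-100 score; higher is better."""
    lead_time_score = max(0.0, 1.0 - lead_time_days / 30.0)   # 0 days best, 30+ worst
    delivery_score  = on_time_pct / 100.0
    price_score     = max(0.0, 1.0 - (price_index - 1.0))      # 1.0 = market average price
    quality_score   = max(0.0, 1.0 - rejection_pct / 100.0)
    w_lt, w_del, w_pr, w_q = weights
    return round(100 * (w_lt * lead_time_score + w_del * delivery_score +
                        w_pr * price_score + w_q * quality_score), 1)

# Example: 7-day lead time, 92% on-time delivery, 5% above market price, 2% rejections
print(supplier_score(lead_time_days=7, on_time_pct=92, price_index=1.05, rejection_pct=2))
```

Embedded in the PO screen, this score (plus a blacklist check) is all the buyer needs at the moment of supplier selection.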

The same concept is applicable to the banking industry as well. While granting a loan or credit card to an individual, the bank officer can see the application and behaviour scores on the LOS system. This helps him take a more informed decision.

Discount coupons can be printed at ATMs based on the amount withdrawn at that point in time. Just imagine a scenario wherein a discount coupon for a digital camera is printed at the ATM when Rs 10,000 is withdrawn. Competition is increasing day by day in every industry vertical, and margins are going down. I won't be surprised if banks start selling cricket match tickets or flight tickets in the near future to share the cost of infrastructure and thereby increase the profitability of each branch and ATM.

In retail, discount coupons can be printed based on the items a customer buys at that point in time. This not only increases customer satisfaction but also ensures that the customer returns to the store for more purchases in the near future. Customer loyalty programs are at a very nascent stage in India, so such intelligence embedded into the operational system will definitely help retailers increase revenue per customer.

Saturday, October 24, 2009

Green Business Intelligence

Green business intelligence is a new buzzword in the BI world nowadays. More than one third of Gartner survey respondents plan on spending more than 15% of their IT dollars on green IT projects. Most of these projects fall into the "improve energy efficiency" category for short-term, immediate cost savings.

So, with 15% of IT dollars going toward green projects, can BI initiatives be a part of that? Absolutely. Key to the success of green projects is measuring and monitoring. If a project claims to reduce energy usage, that usage must be measured before the project begins and monitored afterwards through the payback period (and potentially beyond). Web based BI systems can help move companies to a paperless environment.

Reducing power consumption in servers is one way to contribute to a company's green initiatives. BI systems can help companies measure and monitor the usage of hardware resources (CPU, RAM, network, storage, etc.) and forecast hardware resource requirements based on historical data and events. This helps companies consolidate hardware resources and thereby reduce power consumption. One prerequisite for such a BI system is a centralized IT data mart which collects and stores data from performance and monitoring tools like CA Unicenter, Tivoli or OpenView. A toy example of such a forecast is sketched below.
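The monthly utilisation figures below are made up, and a plain linear trend stands in for whatever forecasting method the BI tool actually uses.

```python
# Toy sketch: fit a linear trend to monthly average CPU utilisation pulled
# from an IT data mart and project it forward. Data values are made up.
import numpy as np

months = np.arange(12)                       # last 12 months
cpu_util = np.array([41, 43, 44, 47, 49, 50, 53, 55, 56, 59, 61, 63])  # % average

slope, intercept = np.polyfit(months, cpu_util, 1)
next_quarter = intercept + slope * np.arange(12, 15)
print("Projected CPU utilisation for the next 3 months:", next_quarter.round(1))

# If the projection crosses a consolidation threshold (say 75%), capacity or
# workload consolidation can be planned before power usage grows further.
```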

Inventory management is perhaps the most important step where BI can improve efficiency. Not having the right product in store will lead to lost sales and unhappy shoppers or excess inventory. Either is bad for the environment. When consumers don’t find what they are looking for, they make additional trips, increasing the carbon footprint of their shopping. Excess inventory leads to waste (especially in the case of perishable products), affecting both the environment and margins. Most progressive retailers have implemented perpetual inventory systems to keep track of what is on the shelf, and advanced replenishment systems to forecast demand and generate orders based on past trends and current factors. While these systems have helped run a more efficient operation, they are not perfect. Business intelligence can help retailers get smart by analysing issues in forecasting, understanding their root causes and preventing future exceptions. Accurate inventory visibility in store is a key input for upstream operations such as manufacturing and distribution, to reduce waste, cost and carbon footprint.

In the near future green reporting is going to be as mandatory as any other financial reporting. When tight controls are in place, people discover and reinvent more creative and efficient ways to save money. There is no better time than now to take action and allocate a piece of the budget toward serious and productive green IT. Environmental issues will shape the information management landscape for decades to come, affecting areas like data management and data governance. It will also have a significant impact on areas such as competitive strategy, business intelligence, marketing and even a company's ability to attract and retain people.

Sunday, September 13, 2009

Application Data Warehouse

The definition of a data warehouse is changing in the Indian market. Earlier, people used to build a data warehouse to cater to the MIS reporting needs of an organization; nowadays, data warehouses are built to support various business applications such as:
  • Cross Sell/Up Sell
  • Retention
  • Campaign Management
  • Marketing Optimization
  • Market Mix Modeling
  • Basel II compliance
  • Market Risk Analysis
  • Op Risk Analysis
  • Warranty Analysis
  • Supply Chain Optimization etc...
I started my BI implementation career with a data warehouse implementation at a media company in India. The objective of the data warehouse project was to replace all Excel-based MIS reporting with an automated reporting system. The sponsor of the project was the IT director. We designed our data warehouse schema based on the reporting requirements of the different business functions within the organization. It took 10 months to build the warehouse, and we delivered some 25-odd reports using Business Objects. While the project was well appreciated within the IT department, end users didn't appreciate it. They felt the data warehouse was too rigid and that they couldn't make changes to it easily; it used to take 4-6 weeks to incorporate any new business requirement. They also felt that the data warehouse was not giving them any value-add or insight that could help them in their day-to-day activities. After a year, the data warehouse project was scrapped because the ROI it generated was not enough to justify its investment.


Today the scenario has changed. Very recently, we worked on two enterprise data warehouse RFPs wherein the end goal of implementing the data warehouse was to support various business applications. The prospects had clearly stated the objective of the data warehouse in the RFP. They wanted to build a data warehouse to support the following business applications:
  • Basel II Compliance
  • Market Risk
  • Credit Scoring
  • Cross Sell/Up Sell
  • Retention
  • Campaign Management 
They wanted to ensure that all the variables required to do, say, cross-sell/up-sell analysis are included in the data warehouse model. There are some 750+ variables just for cross-sell/up-sell analysis, and similarly there are thousands of variables available to support the other applications. A lot of the time the data warehouse is built keeping in mind only MIS reporting requirements, so whenever business users want to do business analysis, they end up creating a separate mart for the data specific to that analysis. This results in data duplication and system overhead. Last week I met a senior executive of a large bank. Currently they have three data marts: one caters to MIS reporting requirements, the second to risk compliance requirements and the third to PM requirements. Soon they are coming up with an RFP to consolidate all three data marts into a single data warehouse.


In today's economic conditions, it is critical to build an "analytics-friendly" data warehouse. Typically, you require historical data to do analytics, so you need to capture the data related to analytical variables right from day one, when the data warehouse is implemented. I have seen a lot of organizations make the mistake of building an MIS-reporting-only data warehouse. There are several disadvantages to this approach.

  1. It takes a significant amount of time and effort to build such a data warehouse, and your reporting requirements change by the time it is implemented.
  2. The ROI generated from such a data warehouse is not significant enough to justify its investment.
  3. If the right variables are not captured in the data model, it takes a significant amount of time and effort to incorporate them at a later stage. It involves changes to the data model, ETL and BI strategies. Often it is not even possible to incorporate such changes due to the complexity of the data model and ETL routines, and you end up creating a data mart to cater to the analytical requirements. This results in data duplication.
In today's world, it is not sufficient to know just who is buying what and when. You need to know what they will buy next, and whether they are a profitable customer for you or not. This requires analytical capabilities built into your data warehouse. Hence the "application data warehouse" is the way to go.

Saturday, September 5, 2009

New Products Forecasting

Yesterday Nokia officially launched the X6 touch phone at Nokia World. This is a nice touch phone with features comparable to or better than the iPhone 3GS. It took them almost two years to release a phone that is comparable to or better than the iPhone, and it is also very competitively priced (Rs 30,000). It's good to see that competition for the Apple iPhone has finally arrived. Apple has been ignoring the Indian market for a long time now; the iPhone launch in India was a big failure due to very high pricing. India is supposed to be one of the largest mobile handset markets in the world, and one can't afford to ignore it. I hope Apple learns from its past mistakes and launches the iPhone 3GS at a very competitive price in India.

Nokia comes out with a new product or model every month, and a lot of the time they launch a model which has overlapping features with an existing model in the market. I always wondered how they forecast inventory for a new phone. Typically, forecasting is done based on past data and events, but no such data is available for new products. Moreover, there are similar products in the market from the same manufacturer, and each of these products eats into the others' revenue. In high-tech companies, typically 50-60% of products are mature or stable, 35% are new products and 5% are "first-of-its-kind" products. How do you ensure that the new product doesn't impact sales of existing products in the market?

Each product follows a particular life cycle. A new product launch should be timed in such a way that it doesn't cannibalize the revenue of other similar products in the market. For example, the Nokia X3 is a new music phone; most probably, it will replace the Nokia X5300 express music phone, so the launch of the X3 is planned for when the X5300 is reaching the end of its life. If two models are going to co-exist, then you need to ensure that the messaging and target consumer audience are different for both products. In the case of the Nokia N97 and N97 mini, both products are going to co-exist, and they are meant for different consumer segments.

There are several new product forecasting techniques available. One of the most common is the Bass diffusion technique, which requires three parameters:
  • Lifetime expected sales (m) - the total number of units sold over the product's lifetime; also known as the market potential.
  • P (mass media) - the influence of the technical aspects of a product that drive a consumer purchase; also known as the coefficient of innovation.
  • Q (word of mouth) - reflects the internal dynamics among consumers; also known as the coefficient of imitation.
Typically, the following methodology is used for new product forecasting (a small sketch of the Bass model follows the list):
  • Find a cluster of like (similar) products; this determines the historical data we can borrow from like products.
  • Perform regression analysis on the cluster to estimate the parameters.
  • Use the Bass diffusion model to produce the forecast for the new product.
  • Adjust or reforecast once some actual sales data is available.
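The sketch below implements the standard Bass formula; the values chosen for m, p and q are illustrative, not estimates for any real handset.

```python
# Minimal Bass diffusion sketch: cumulative adoption F(t) and per-period sales.
# m (market potential), p (coefficient of innovation) and q (coefficient of
# imitation) are illustrative values, typically estimated by regression on a
# cluster of similar past products.
import math

def bass_cumulative(t, p, q):
    """Fraction of the market that has adopted by time t."""
    e = math.exp(-(p + q) * t)
    return (1 - e) / (1 + (q / p) * e)

def bass_period_sales(m, p, q, periods):
    """Expected unit sales per period for the first `periods` periods."""
    cumulative = [m * bass_cumulative(t, p, q) for t in range(periods + 1)]
    return [cumulative[t] - cumulative[t - 1] for t in range(1, periods + 1)]

m, p, q = 1_000_000, 0.03, 0.38          # illustrative parameters only
for month, sales in enumerate(bass_period_sales(m, p, q, 12), start=1):
    print(f"Month {month:2d}: ~{sales:,.0f} units")
```

Once a few months of actual sales are in, p and q can be re-estimated and the curve adjusted, which is the last step of the methodology above.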
Several high-tech companies in the world are using sophisticated forecasting techniques like these to increase profitability and market share and to reduce excess inventory.

I am an ardent fan of the Apple iPhone, and I personally believe that competition is always good for the end consumer. With the Nokia X6 launch, Apple cannot afford to be complacent. It will accelerate innovation at both companies, and in the end, consumers like me will benefit.

You can find more details about the Nokia X6 at:
http://www.mobilenewshome.com/2009/09/nokia-launches-mobile-cum-music-device.html

Nokia Phone Comparison
http://europe.nokia.com/find-products/phone-comparison

Saturday, August 29, 2009

Does BI require business process re-engineering?

I met a senior executive of a large firm last week. The firm is in the process of rolling out a BI initiative enterprise-wide. He asked us about the changes required in their business processes to ensure a successful rollout of the BI initiative. This question took all of us by surprise; we were all ready with our answer of how BI can help them increase revenue and profitability and reduce cost. This type of question shows the maturity of an organization in adopting a technology like BI. After the meeting, I didn't have any doubt whatsoever about this organization's readiness to roll out an enterprise-wide BI initiative.

Today many BI initiatives fail as they do not systematically address the business process changes required to capture business value of BI. It is common for BI vendor value propositions to emphasize business benefits such as profitability, responsiveness, customer intimacy, information sharing, flexibility, and collaboration. But investing in BI to achieve such business benefits may actually destroy business value unless those attributes can be defined in operational terms and realized through business processes that affect revenues or costs.

Many companies use BI to improve customer segmentation, customer acquisition and customer retention. These improvements can be linked to reduced customer acquisition costs, increased revenue, and increased customer lifetime value, which translate into increased profitability. However, a BI investment that improves demand forecasting will not deliver business value unless the forecasts are actually incorporated into operational business processes that then deliver reduced inventory or some other tangible economic benefit. In other words, the business benefit "improved forecasting" is useless unless it is somehow converted into incremental after-tax cash flow.

To capture the business value of BI requires organizations to go well beyond the technical implementation of a BI environment. Specifically, organizations must engage in effective process engineering and change management in order to capture business value from BI.

BI systems deliver a lot of useful information, such as the most profitable customers of your organization, fast- and slow-moving inventories, good and bad suppliers, and product cost and profitability. If you do not integrate this information with your management processes and operational processes such as ERP and CRM, you will not derive any value out of your investment in BI systems.

Process engineering is very essential for successful BI roll out. Process engineering identifies how BI applications will be used within the context of key management and operational processes that drive increased revenue and/or reduce cost. It provides a map of which processes must change and how they must change in order to create business value with BI applications. Thus, it lays the foundation for change management because process changes drive changes in individual and organizational behavior.

Monday, August 24, 2009

BI as Business Enabler

I started my sales/pre-sales journey with SAS about three and a half years ago. My job is 100% customer facing, and I have seen quite a few changes in the way people have adopted business intelligence technology over the past few years.

My first assignment at SAS was on data quality. At that point in time there was not much awareness about data quality solutions in the Indian market; it took almost a year to convince a customer to invest in a data quality solution, and there were very few data quality solutions available in the Indian market at that time. Today, when I look at the market, there are plenty of data quality software vendors. Data quality has become an integral part of the data governance and compliance strategy for most organizations. We have worked on more than five enterprise data warehouse RFPs in the last three months, and all of them included requirements for data quality. Data quality is not only used for basic data cleansing but also for de-duplication, household analysis and cross-sell/up-sell.

One major bank is using data quality to ensure that they give loans to the right people, thereby keeping NPAs low. One CPG company is using data quality to create a single view of retail outlets across various product categories; this helps them optimize the sales force and increase cross-sell/up-sell opportunities within the same outlet for different products. One manufacturing company is using data quality to create a single view of the customer across various business functions, which helps them cross-sell/up-sell products across those functions.

I have also seen an enormous change in the way people use business intelligence now compared with four years ago. Earlier, BI was used by very few people in an organization; that is the reason some BI vendors offer per-user licensing. Today, BI reporting access is given to users at all levels for decision making, and user-based licensing makes little sense in this environment. Most organizations are now moving towards establishing an enterprise-wide reporting framework and standardizing their reporting platform across departments. The need for a reporting user interface also changes from one user group to another: customers expect different user interfaces for users with different skill sets. Some users who are more comfortable with Excel require an Excel interface, while others who are more comfortable with the web require a web interface for their reports. Earlier, reports used to be refreshed every month or week; today, customers expect to refresh reports multiple times a day, or at least once a day. In the past few months I have seen several requirements for real-time dashboarding and OLAP analysis.

One large private bank is using real-time dashboards to monitor the cash level and downtime of each ATM. The BI system sends out an alert to the respective regional manager when cash drops below a certain level or an ATM is down for a long period; this has drastically reduced cash-out situations across all ATMs in the country. One large manufacturing company is using BI reporting to measure supplier performance: they monitor the quality and quantity of supply from each supplier, compare pricing from different suppliers for the same material and quantity, and use this information to better negotiate material pricing with their suppliers. One retailer is using a real-time dashboard to monitor fast-moving items at the store level, thereby reducing stock-out situations.

Today, customers consider BI a business enabler. Their expectations from BI systems have increased in the last few years: they are looking for BI solutions that offer capabilities beyond querying and reporting, solutions that can help them forecast, predict and optimize. Many companies have started mentioning their data warehouse initiative as a strategic initiative in their annual reports. They believe that the implementation of a BI system will help them increase their sales and profitability. This represents a large opportunity for BI vendors, and hence we are seeing a lot of consolidation in the BI market.

Saturday, August 22, 2009

Customer Oriented Banking

I had a bad experience with the customer service of a large private bank last week. My netbanking ID was locked because I hadn't used it for more than three months. When I approached their phone banking support to get it unlocked, they advised me to raise a support ticket via the web service request form. I did so promptly, as I urgently wanted to transfer funds to some of my other accounts. They took five days to unlock my ID, and I missed my deadline to transfer the funds.

I have been a loyal customer of this bank for more than four years. I am also one of their premium customers and hold more than four products from the same bank. However, when I submitted the support request, they treated it with the same priority as that of a non-premium customer. Being a premium customer, I was expecting a quick turnaround and a differentiated service level; it didn't happen. Today I have decided to move to another bank. I am sure this bank must be losing lots of such customers in a year. This bank is known to be one of the most technology-savvy banks in India, but unfortunately they do not use any system which helps them differentiate their premium customers from non-premium customers. There are several advanced analytics solutions available in the market which can help a bank address such problems proactively. I have seen banks spend a lot of effort and money on marketing campaigns to retain their premium customers when they are leaving, but if they invested in analytics systems, they could save a lot of that marketing spend.

Similarly, where a new loan to a non-premium customer is granted in, say, 3-4 days, can the same be granted to a premium customer in just 1-2 days? I had to wait four working days to avail an auto loan last year through the same bank where I was a premium customer.

Some years ago banks were focusing on acquiring customers, and the quality of the customer was not given much importance. In today's economic conditions, you need to improve the quality of new customers: you want to increase the wallet share per customer and at the same time retain only those customers who are profitable to your organization. Analytical solutions can help address all of these requirements. To survive in these competitive market conditions, we need to address the following questions proactively:

1. Am I acquiring good customers?

2. Am I spending my marketing budget in right direction to increase wallet share per customer?

3. Do I know my most profitable customers? What am I doing to retain them?

4. Do I know my non profitable customers? What am I doing to terminate them?

5. Can I expedite the process of giving loans to the most profitable customers?

Typically in banking, when you submit a web request, it goes into a queue. The CRM system automatically routes your request based on the information available in its subject line. Often the subject line is incorrect, which results in the request being routed to the wrong customer support group; it then gets re-assigned to the right group before anyone starts working on it. This is a tedious and time-consuming process, and a lot of precious time is lost just assigning the request to the right support group. This can be avoided by implementing a text analytics solution, which can route the request based on the text in the web form. If the customer ID is available in the web form, it can also identify whether a particular customer is a premium or non-premium customer and route the request accordingly to a queue dedicated to premium customers. A very simplified sketch of such routing is shown below.
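In the sketch, keyword matching stands in for a real text analytics engine, and the premium-customer set stands in for a lookup against the bank's MDM or CRM; all identifiers are hypothetical.

```python
# Simplified sketch: route a web service request to a support queue based on
# the request text, and prioritise premium customers. The keyword rules and
# the premium lookup are stand-ins for a real text analytics / MDM solution.

QUEUE_KEYWORDS = {
    "netbanking": ["netbanking", "login", "password", "unlock"],
    "loans":      ["loan", "emi", "disbursement"],
    "cards":      ["credit card", "debit card", "statement"],
}

PREMIUM_CUSTOMERS = {"CUST-1001", "CUST-2045"}   # assumed lookup, e.g. from MDM

def route_request(customer_id, request_text):
    text = request_text.lower()
    queue = next((q for q, words in QUEUE_KEYWORDS.items()
                  if any(w in text for w in words)), "general")
    if customer_id in PREMIUM_CUSTOMERS:
        queue += "-premium"                      # dedicated premium queue
    return queue

print(route_request("CUST-1001", "My netbanking id is locked, please unlock it"))
# -> netbanking-premium
```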

Sunday, August 16, 2009

Change Data Capture - Real Time BI

I have heard the need for change data capture (CDC) from several customers in the past few months.

Traditionally, ETL processes have been run periodically, on a monthly or weekly basis, using a bulk approach that moves and integrates the entire data set from the operational source systems to the target data warehouse. Now, data integration requirements have changed. Customers would like to move the changes made to enterprise data while the operational systems are running, without the need for a downtime window, and without degrading the performance and service levels of their operational systems.

Business conditions have changed over time, requiring a new, real-time and efficient way of integrating data:

  1. Business globalization and 24x7 operations. In the past, enterprises could stop online systems during the night or weekend, to provide a window of time for running bulk ETL processes. Today, running a global business with 24x7 operations means smaller or no downtime windows.
  2. Need for up-to-date, current data. In today's competitive environment, organizations cannot afford to have their managers work on last week's or yesterday's data. Today, decision-makers need data that is updated a few times a day or even in real time.
  3. Data volumes are increasing. Data is doubling every 9 months. The larger the data volumes become, the more resources and time are required by the ETL processes. This trend challenges the bulk extract windows that are getting smaller and smaller.
  4. Cost reduction. Bulk ETL operations are costly and inefficient, as they require more processing power, more memory and more network bandwidth. In addition, as bulk ETL processes run for long periods of time, they also require more administration and IT resources to manage.

The first step in change data capture is detecting the changes! There are four main ways to detect changes:

  • Audit columns. In most cases, the source system contains audit columns. Audit columns are appended to the end of each table to store the date and time a record was added or modified. Audit columns are usually populated via database triggers that are fired off automatically as records are inserted or updated.
  • Database log scraping. Log scraping effectively takes a snapshot of the database redo log at a scheduled point in time (usually midnight) and scours it for transactions that affect the tables you care about for your ETL load. Sniffing involves a “polling” of the redo log, capturing transactions on-the-fly. Scraping the log for transactions is probably the messiest of all techniques. It’s not rare for transaction logs to “blow-out,” meaning they get full and prevent new transactions from occurring. If you’ve exhausted all other techniques and find log scraping is your last resort for finding new or changed records, persuade the DBA to create a special log to meet your specific needs.
  • Timed extracts. With a timed extract you typically select all of the rows where the date in the Create or Modified date fields equal SYSDATE-1, meaning you’ve got all of yesterday’s records. Sounds perfect, right? Wrong. Loading records based purely on time is a common mistake made by most beginning ETL developers. This process is horribly unreliable. Time-based data selection loads duplicate rows when it is restarted from mid-process failures. This means that manual intervention and data cleanup is required if the process fails for any reason. Meanwhile, if the nightly load process fails to run and misses a day, a risk exists that the missed data will never make it into the data warehouse.
  • Full database “diff compare.” A full diff compare keeps a full snapshot of yesterday’s database, and compares it, record by record against today’s database to find what changed. The good news is that this technique is fully general: you are guaranteed to find every change. The obvious bad news is that in many cases this technique is very resource intensive. If you must do a full diff compare, then try to do the compare on the source machine so that you don’t have to transfer the whole database into the ETL environment. Also, investigate using CRC (cyclic redundancy checksum) algorithms to quickly tell if a complex record has changed.
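As a small illustration of the last technique above, a CRC-based diff compare can flag changed records without comparing every column. The record layout used here is an assumption.

```python
# Sketch of a full "diff compare" using CRC checksums: compute a checksum per
# record in yesterday's and today's snapshots and report inserts and updates.
# The record layout (a primary key "id" plus arbitrary columns) is an assumption.
import zlib

def record_crc(record):
    """Stable checksum over all non-key columns of a record (a dict)."""
    payload = "|".join(f"{k}={record[k]}" for k in sorted(record) if k != "id")
    return zlib.crc32(payload.encode("utf-8"))

def detect_changes(yesterday, today):
    old = {r["id"]: record_crc(r) for r in yesterday}
    inserts = [r for r in today if r["id"] not in old]
    updates = [r for r in today if r["id"] in old and record_crc(r) != old[r["id"]]]
    return inserts, updates

yesterday = [{"id": 1, "name": "Asha", "city": "Pune"},
             {"id": 2, "name": "Ravi", "city": "Mumbai"}]
today     = [{"id": 1, "name": "Asha", "city": "Pune"},
             {"id": 2, "name": "Ravi", "city": "Delhi"},
             {"id": 3, "name": "Meera", "city": "Chennai"}]

print(detect_changes(yesterday, today))   # one insert (id 3), one update (id 2)
```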

CDC solutions are designed to maximize the efficiency of ETL processes, minimize resource usage by replicating/moving only the changes to the data (i.e., the deltas) and minimize the latency in the delivery of timely business information to the potential consumers. Change data capture solutions comprise the following key components:

  • Change Capture Agents
  • Changed Data Services
  • Change Delivery

Change Capture Agents
Change capture agents are the software components responsible for identifying and capturing changes to the source operational data store. The change capture agent sits on the source system and consumes minimal processing power, typically 1 to 2% of the source system's capacity. Change capture agents can be optimized and dedicated to the source system (i.e., typically using database journals, triggers or exit hooks) or can use generic methods such as data log comparison.

Change Data Services
Change data services provide a set of functions critical to achieving successful CDC, including but not limited to: filtering (e.g., receiving only committed changes), sequencing (e.g., receiving changes based on transaction/unit of work boundaries, by table or by timestamp), change data enrichment (e.g., add reference data to the delivered change for further processing purposes), life cycle management (i.e., how long will the changes be available for consuming applications) and auditing that enables monitoring of the system's end-to-end behavior, as well as the examination of trends over time.

Change Delivery
Change delivery mechanisms are responsible for the reliable delivery of changed data to change consumers -- typically an ETL program. Change delivery mechanisms can support one or more consumers and provide flexible ways by which the changes can be delivered including push and pull models. A pull model means that the change consumer asks for the changes on a periodic basis (as frequently as needed, typically every few minutes or hours), preferably using a standard interface such as ODBC or JDBC. A push model means that the change consumer listens and waits for changes, and those are delivered as soon as they are captured, typically using some messaging middleware. Another important function of change delivery is the ability to dynamically go back and ask for older changes for repeated, additional or recovery processing.

Following are two sample scenarios that highlight how organizations can leverage CDC.

Sample Scenario 1: Batch-Oriented CDC (pull CDC)
In this scenario, an ETL tool periodically requests the changes, each time receiving a batch of records that represent all the changes that were captured since the last request cycle. Change delivery requests can be done in low or high frequencies (e.g., twice a day or every 15 minutes). For many organizations, the preferred method of providing extracted changes is to expose them as records of a data source table. This approach enables the ETL tool to seamlessly access the changed records using standard interfaces such as ODBC. The CDC solution needs to take care of maintaining the position of the last change delivery and deliver new changes every time.
This scenario is very similar to traditional bulk ETL, except that it processes only the changes to the data instead of the entire source data store. This approach greatly reduces the required resources and eliminates the need for a downtime window for ETL operations.

When should organizations use this approach? This batch-oriented approach is very easy to implement, as it is similar to traditional ETL processes and capitalizes on existing skill sets. Organizations should use this method when their latency requirements are measured in hours or minutes.
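A bare-bones sketch of this pull pattern follows, with sqlite3 standing in for any ODBC/JDBC-accessible change table; the CDC_CHANGES table name and its columns are assumptions.

```python
# Bare-bones pull CDC sketch: the ETL job remembers the last delivered change
# sequence number and periodically fetches only newer changes through a
# standard SQL interface (sqlite3 stands in here for any ODBC/JDBC source).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE CDC_CHANGES (seq INTEGER, table_name TEXT, "
             "operation TEXT, key_value TEXT)")
conn.executemany("INSERT INTO CDC_CHANGES VALUES (?, ?, ?, ?)",
                 [(1, "ACCOUNTS", "UPDATE", "42"),
                  (2, "CUSTOMERS", "INSERT", "C-1001")])

def pull_changes(conn, last_seq):
    """Return all changes captured after the last delivered sequence number."""
    return conn.execute(
        "SELECT seq, table_name, operation, key_value "
        "FROM CDC_CHANGES WHERE seq > ? ORDER BY seq", (last_seq,)).fetchall()

last_seq = 0                                   # in practice, persisted between runs
for seq, table, op, key in pull_changes(conn, last_seq):
    print(f"apply {op} on {table} key={key}")  # hand the delta to the ETL logic
    last_seq = seq
# The ETL scheduler repeats this cycle, e.g. every 15 minutes or twice a day.
```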

Sample Scenario 2: Live/Real-Time CDC (push CDC)
In this scenario, which accommodates near real-time or real-time latency requirements, the change delivery mechanism pushes the changes to the ETL program as soon as changes are captured. This is typically done using a reliable transport such as an event-delivery mechanism or messaging middleware. Some CDC solutions use proprietary event delivery mechanisms, and some support standard messaging middleware (e.g., MQ Series).

Note that while message-oriented or event-driven integration is more common in EAI products (i.e., using tools such as Integration Brokers), many of the leading ETL tool vendors are offering such capabilities in their solutions to accommodate the demands of high-end, real-time BI applications. This real-time approach is required when the BI applications demand zero latency and the most up-to-date data.
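For contrast, here is a toy sketch of the push pattern, in which a queue.Queue stands in for real messaging middleware such as MQ Series; the event structure is an assumption.

```python
# Toy push CDC sketch: the change consumer registers a handler and changes are
# delivered as soon as they are captured. A queue.Queue stands in for real
# messaging middleware; the event fields are assumptions.
import queue
import threading
import time

change_events = queue.Queue()

def on_change(event):
    print(f"apply {event['operation']} on {event['table']} key={event['key']}")

def consume_forever(q, handler):
    while True:
        handler(q.get())          # blocks until the next change event arrives

threading.Thread(target=consume_forever, args=(change_events, on_change),
                 daemon=True).start()

# A capture agent pushing a change the moment it happens:
change_events.put({"operation": "UPDATE", "table": "ACCOUNTS", "key": 42})
time.sleep(0.5)                   # give the consumer a moment before exiting
```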

Change Data Capture Technical Considerations
While CDC seems to offer significant advantages, there are several factors that need to be considered and evaluated, including:

Change Capture Technique. Change capture methods vary, and each has different implications on the overall solution latency, scalability and level of intrusion. Common techniques for capturing changes include reading database journals or log files, usage of database triggers or exit hooks, data comparison and programming custom event notifications within enterprise programs.

Level of Intrusion. All CDC solutions have a certain degree of system impact, making intrusion a critical evaluation factor. The highest degree of intrusion is "source code" intrusion that requires changes to be made to the enterprise applications that make the changes to the data stores. A lesser degree of intrusion is "in-process" or "address space" intrusion, which means that the change capture solution affects the operational system resources. This is the case when using database triggers and exit hooks because they run as part of the operational system and share its resources. Using database journals or archive logs is the least intrusive solution and it does not affect the operational data sources of applications.

Capture Latency. This factor is a key driver for choosing CDC in the first place. Latency is affected by the change capture method, the processing done to the changes and the choice of change delivery mechanism. As a result, changes can be streamed periodically, in high frequency or in real time. One should note that the more real-time the solution is, the more intrusive it typically is as well. Yet another point to consider is that different BI applications will have different latency requirements, and thus enterprises should look for CDC solutions that support a wide range of configurations.

Filtering and Sequencing Services. CDC solutions should provide various services to facilitate the filtering and sequencing of delivered changes. Filtering helps to guarantee that only the needed changes are indeed delivered, for example: an ETL process will typically need only the committed changes. Another example is the ability to discard redundant changes and deliver the last change to further reduce processing. Sequencing defines the order by which changes are delivered. For example, some ETL applications may need changes on a table by table basis, while others may want the changes based on units of work (i.e., across multiple tables).

Supporting Multiple Consumers. Captured changes may need to be delivered to more than one consumer, such as multiple ETL processes, data synchronization applications and business activity monitoring. CDC solutions need to support multiple consumers, each of which may have different latency requirements.

Failover and Recoverability. CDC solutions need to guarantee that changes will be delivered correctly, even when system, network or process failures occur. Recovery means that a change delivery stream can continue from its last position and that the solution keeps transactional integrity to the changes throughout the delivery cycle.

Mainframe and Legacy Data Sources. BI is only as good as the data it relies on. Analysts estimate that mainframe systems still store approximately 70 percent of corporate business information, and mainframes still process most of the business transactions in the world. Mainframe data sources also typically store higher volumes of data, further increasing the need for a more efficient approach to moving data such as change data capture. In addition, popular mainframe data sources such as VSAM, which are non-relational, present additional challenges when incorporating that data into BI solutions. As ETL and DW tools expect relational data, the non-relational data needs to somehow be mapped to a relational data model.

Seamless integration with ETL tools. When choosing a standalone CDC solution, enterprises should consider the ease of interoperability with their ETL program (off-the-shelf or homegrown). Standard interfaces and plug-ins can reduce risk and speed the data integration project.

Change data capture allows organizations to deliver real-time business intelligence based on timely data while, at the same time, reducing the cost of data integration.

For organizations looking for ways to meet these demanding business needs, create an event-driven enterprise and provide real-time business intelligence, change data capture is a key component in the data integration architecture.

More and more organizations have started adopting change data capture solutions, and CDC has become an integral part of the data integration architecture.

Sunday, August 9, 2009

Applications of Text Analytics

In recent years, market research conducted by various software vendors and consulting firms has attempted to quantify the relative percentage split between structured and unstructured data in the average user organization. Most estimates name unstructured data the unqualified winner at 80–85%, leaving structured data in a distant second place at 15–20%.

The reality is that every hour of every day, directly and indirectly, customers place calls (that are transcribed), send direct emails, complete surveys and talk among themselves online in blogs, forums and social networks. They share their thoughts about products and services, their likes and dislikes, and their hopes for future features. Customers tell companies about product failures. They request help. They offer opinions about their experiences that contain insights for organizations that listen. This data is extremely valuable to customer-facing organizations because it is first-person narrative – accounts from a single customer referring to himself or herself explicitly using words such as “I,” “me” or “we” – that typically provides detailed opinions, issues, thoughts and sentiment about products and services, requirements and ideas. Yet I have seen very few organizations that have started leveraging this information for decision making.

Unstructured information can be used to answer the following questions effectively:

  • Is our product launch going well?
  • Is there an emerging product issue?
  • Where should the product team focus its development dollars?
  • Is someone committing fraud?
  • Is my customer happy? Can I sell more to the same customer?
  • Is my customer unhappy? Will he stop using our services?
  • Is there a product defect in the market?

There are several applications of text analytics in various industries.

Sentiment Analysis:

One large multinational bank uses text analytics on a daily basis to review customer service requests, complaints and sentiment shifts. The bank's ability to retain and grow current customers is directly correlated to understanding and acting on sentiment shifts and their respective root causes. With text analytics, the bank monitors opinions and attitudes in order to determine where and how to spend on client initiatives. Text analytics gave it the ability to make decisions regarding spending on marketing materials and the viability of its online offerings, and to produce accurate insights about customer preferences and indicators of what prompts customers to spend more. (A deliberately simplified example of this kind of sentiment scoring appears after the question list below.) Each day, using text analytics to analyze customer emails, complaints and call agent notes, the organization looks for answers to questions such as:

  • Did customers like our new product?
  • What was their biggest issue?
  • Are my loyal customers angry about something?
  • Are new customers asking questions that might pose an opportunity for up-sell?
  • Is my most profitable customer unhappy about something?
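
The sketch below is a deliberately simplified, lexicon-based illustration of sentiment scoring; commercial text analytics engines use much richer linguistic and statistical models, and the word lists here are made up purely for illustration.

# Deliberately simplified lexicon-based sentiment scoring sketch.
# Real text analytics engines use far richer models; these word lists are illustrative only.
POSITIVE = {"great", "love", "helpful", "easy", "fast"}
NEGATIVE = {"angry", "broken", "slow", "cancel", "unhappy", "issue"}

def sentiment(text):
    words = {w.strip(".,!?").lower() for w in text.split()}
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

feedback = [
    "Love the new card, activation was fast and easy",
    "I am unhappy, the mobile app is broken and support is slow",
]
for note in feedback:
    print(sentiment(note), "|", note)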

Early Warning

One consumer electronics manufacturer uses text analytics to uncover product issues early, before they turn into expensive problems. When products are high-priced and marketed as the preeminent option, customers expect not only good quality, but also rapid and competent service when something goes wrong. To meet that expectation, customer loyalty managers at this company set up automatic alerts through their text analytics engine so they would know immediately when new product issues occurred. Once identified, proactive measures are taken to mitigate the issue and customer satisfaction is monitored and acted on. In one example, a product defect was found before the product came out of limited release, giving the company time to fix the issue and greatly reduce potential recall costs, not to mention customer satisfaction issues.

Call Center Optimization

A large cell phone carrier uses text analytics to stay on top of customer issues as they are being discussed online in web forums and blogs. In doing so, they’re able to leverage that knowledge to prepare their call center, proactively handle the customer issues, and possibly even deflect calls. In one instance, this company found a serious issue being discussed in web forums two weeks prior to it actually emerging in inbound calls and chats. Once the issue was identified (on a product that was released that same week), the call center took immediate action, posting remedies in an online FAQ, routing customers to agents who had been trained to handle the specific issue, and even proactively notifying customers about the problem. The company noticed a marked increase in customer satisfaction for the customers involved in this early action, which mitigated both a potential public relations problem and an influx of hard-to-manage inbound calls.

One large multinational bank also uses text analytics to sort and route web customer service request forms, saving a huge amount of the manual effort involved in routing service requests. As much as 30-40% of web customer service requests are routed to the wrong customer service group by auto-routing systems because of incorrect subject lines, which delays the response to the customer. The bank now uses text analytics to auto-route emails and web forms based on the text in the message body.
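
As a simplified illustration of routing on message-body text rather than the subject line, the sketch below applies keyword rules; the queue names and phrases are hypothetical, and a production system would typically use a trained text classifier instead of hard-coded rules.

# Minimal sketch of routing service requests on body text rather than subject line.
# Queue names and keywords are hypothetical; real systems typically use trained classifiers.
ROUTING_RULES = {
    "cards_team":     {"credit card", "debit card", "card blocked"},
    "mortgage_team":  {"mortgage", "home loan", "emi"},
    "online_banking": {"password", "login", "net banking"},
}

def route(request_body, default_queue="general_service"):
    body = request_body.lower()
    for queue, phrases in ROUTING_RULES.items():
        if any(p in body for p in phrases):
            return queue
    return default_queue              # fall back when no rule matches

print(route("Subject: query. My credit card was blocked while travelling."))
print(route("I cannot login to net banking since yesterday."))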

Launch Monitoring

An industry-leading mobile phone manufacturer uses text analytics to keep a sharp eye on customer sentiment and any potential issues with products they release into the market. As new product introductions in the cell phone industry are frequent and expensive, and cell phones are some of the most discussed consumer electronic products on the Web, the company is committed to listening to, understanding and acting on feedback. Text analytics enables the manufacturer to identify issues early, improve quality, and increase customer satisfaction with each new product. In one example, this company identified a software flaw with a newly introduced phone within the first 24 hours of the product’s release. Discussion about the issue immediately hit online community forums and their text analytics engine discovered and summarized all of the data. The company was able to take immediate action: sending emails to customers with the solution, fixing new products in the queue for shipment, putting an FAQ on their site and notifying their partner carrier to fix new products sold. These steps turned a potential launch failure into a remarkable success.

Product Innovation and Quality

The largest appliance manufacturer in the world uses text analytics daily across its customer service, marketing, quality, engineering and development groups to identify product quality issues and to uncover new opportunities for innovation. This manufacturer uses the insights and ideas derived from customer feedback to drive product innovation. The company has also experienced “hundreds of millions of dollars” in cost savings resulting from early warning on issues. Had text analytics not identified some of these issues, immediate attention would not have been possible. The company has greatly benefited from the ability to understand the root cause behind product issues and respond quickly to manufacturing defects, as well as customer interactions and repair situations, rather than having to react via expensive recalls.

Market Research Analysis

Do you regularly survey your customers? If the answer is yes, then you are among the best companies out there. But the real question is, do you take full advantage of that valuable research and augment it with what customers are telling you through “unaided” interactions? A large consumer high-tech company does this every day using text analytics. They go beyond scores and analyze the verbatims in their market research to get to the “why” behind their scores. Now they know what action to take. One of the things the company recently discovered using text analytics was a large disparity between scores and verbatims. Although customers reported that agents were courteous and provided good service, they explained in the verbatims that the issue wasn’t with the agent being nice: “I just couldn’t understand them.” In fact, the company found that for certain problem types its outsourced call center got good scores but was actually generating callbacks because language barriers prevented the agents from resolving the problem. For those call types, the company re-routed the calls to agents in a different locale with different skills and was able to measure a material increase in its scores. Without the verbatim analysis the company wouldn’t have known what to do.

Competitive Analysis

One major airline uses text analytics not only to understand exactly why some of its customers are loyal and some are not, but also to garner knowledge about its competitors. In an industry where fixed costs have risen dramatically and competitive data is transparent, staying on top of customers and their opinions is paramount. The airline analyzes survey responses and call center notes, but it also “harvests” the Internet for customer conversations about itself and the competition – topics include everything from services, issues, products and prices to specific customer desires. In doing so, the airline is able to make better decisions such as where to invest to beat the competition, what marketing messages will resonate with customers, and what specific competitive differentiators should be promoted. Such insights enable the airline to truly understand how it compares to its fierce competitors, but more importantly, how it will win!

Fraud Detection

Several government agencies are using text analytics to detect fraud and monitor terrorist movements. Several voice-to-text converters are available in the market today; these tools help government agencies convert voice into text and mine that text to detect suspicious activity.

Sunday, July 26, 2009

Data Governance

A recently sponsored survey of 50+ Global 5000-size businesses examined their investments in “data governance” and the challenges they are facing. Among the findings:

  • 84% believe that poor data governance can cause: limited user acceptance, lower productivity, reduced business decision accuracy, and higher total cost of ownership
  • Only 27% have centralized data ownership
  • Fully 66% have not documented or communicated their program, and
  • 50% have no KPIs or measurements of success

What is Data Governance?

By its formal definition, data governance is a set of processes that ensures that important data assets are formally managed throughout the enterprise. Data governance ensures that data can be trusted and that people can be held accountable for any adverse event that happens because of poor data quality. It is about putting people in charge of fixing and preventing issues with data so that the enterprise can become more efficient.

Data Governance is the application of policies and processes that:

  • Maximize the value of data within an organization
  • Manage what data is collected and determine how it is used

Why Data Governance?

“You can't protect data if you don't know what it is worth.”

To know what it is worth, you have to know where it is, how it is used, and where and when to integrate and federate it.

Data governance initiatives are often driven by a desire to improve data quality, but even more often they are driven by external regulations such as Sarbanes-Oxley, Basel II, HIPAA and a number of data privacy regulations. To achieve compliance with these regulations, business processes and controls require formal management processes to govern the data subject to them.

Common themes among the external regulations center on the need to manage risk. The risks can be financial misstatement, inadvertent release of sensitive data, or poor data quality for key decisions. When management understands the value of data and the probability of risk, it is then possible to evaluate how much to spend to protect and manage it, as well as where investments should be made in adequate controls.

A best practice within companies successfully implementing data governance is the collaboration between IT management and business leadership to design and refine “future state” business processes associated with data governance commitments. Moreover, a strong data governance function is very important to deliver reliable and usable business information.

Such a corporate data governance function can help businesses avoid these symptoms of poorly executing IT organizations:

  • Overly complex IT infrastructure
  • Silo-driven, application area-centric solutions
  • Slow-to-market delivery of new or enhanced application solutions
  • Inconsistent definitions of key corporate data assets such as customer, supplier, and product masters
  • Poor data accuracy within and across business areas
  • Line-of-business-focused data with inefficient or nonexistent ability to leverage information assets across lines of business (LOBs)
  • Redundant IT initiatives to re-solve data accuracy problems for each individual LOB

With an operational data governance program, businesses are more likely to benefit from:

  • Uniform communications with customers, suppliers, and channels due to the accuracy of key master data
  • Common understanding of business policies and processes across LOBs and with business partners/channels
  • Rapid cross-business implementation of new application solutions requiring shared access to master data
  • Singular definition and location of master data and related policies to enable transparency and auditability essential to regulatory compliance
  • Continuous data quality improvement as data quality processes are embedded upstream rather than downstream
  • Increased synergy between horizontal business functions via cross business data usage – e.g., each LOB is able to cross-sell and upsell its products to the other LOBs’ customers

What are the components of the Data Governance Framework?

  1. Organizational Bodies and Policies
  • Governance Structure
  • Data Custodianship
  • User Group Charter
  • Decision Rights
  • Issue Escalation Process

2. Standards and Processes

  • Data Definitions and Standards (metadata management)
  • Third Party Data Extract
  • Metrics Development and Monitoring
  • Data Profiling
  • Data Cleansing

3. Technology

  • Metadata Repository
  • Data Profiling tool
  • Data Cleansing tool

The Data Governance Structure
A Data Governance (DG) structure is defined based on the following roles and responsibilities:

Data Governance Council

Membership of this council consists of executives from various divisions who have an interest in the management of asset data. They are responsible for endorsing policies, resolving cross-divisional issues, engaging the IT council at the strategic level, strategically aligning business and IT initiatives, and reviewing budget submissions for IT and non-IT related projects.

Data Custodian

Asset data is managed by the data custodian on behalf of the organization. The custodian is responsible and accountable for the quality of asset data and for resolving issues raised in user group meetings. If issues become political and impact stakeholders from other divisions, they are escalated to the DG council level. The custodian is also responsible for endorsing the data management plan and the data cleansing plan, ensuring data is fit for purpose, converting strategic plans into tactical plans, change management, and stakeholder management.

Data Steward

Data Stewards have detailed knowledge of the business processes and data requirements, along with enough IT knowledge to translate business requirements into technical requirements. They are led by the Data Custodians and are responsible for carrying out the tactical plans. They also act on behalf of the Data Custodians in stakeholder management, change management, asset-related information systems management and project management, and they run user group meetings and train and educate data users.

User Groups

Data stakeholders from various divisions are invited to the user group meetings. These key data stakeholders consist of people who collect the data, process it and report off it. Technical IT staff are also invited to these meetings so that their technical expertise is available during the discussion. This is also a venue where urgent operational data issues can be tabled. The data users are responsible for reporting any data-related issues, requesting functionality that would help them collect data more efficiently, and specifying reporting requirements.

The Data Governance structure should have the business engagement with IT at the strategic, tactical and operational levels. This level of engagement ensures that IT and business are kept informed and IT initiatives align with the business data governance objectives.

Other Related Terms (Source CDI Institute)

Data Governance

The formal orchestration of people, processes, and technology to enable an organization to leverage data as an enterprise asset.

Master Data Management (MDM)

The authoritative, reliable foundation for data used across many applications and constituencies with the goal to provide a single view of the truth no matter where it lies.

Customer Data Integration (CDI)

Processes and technologies for recognizing a customer and its relationships at any touch-point while aggregating, managing and harmonizing accurate, up-to-date knowledge about that customer to deliver it ‘just in time’ in an actionable form to touch-points.

Master Data Integration (MDI)

Process for harmonizing core business information across heterogeneous sources, augmenting the system of record with rich content by cleansing, standardizing and matching information to provide high data quality in support of a master data management initiative.

Good Articles on Data Governance:
http://www.b-eye-network.com/view/630
www.hds.com/pdf/wp_199_data_governance.pdf

Saturday, July 11, 2009

Cloud Computing

Gartner Says “Cloud Computing will be as Influential As E-business.”

Forrester’s advice to CFOs: Embrace Cloud computing to cut costs.

Is cloud computing a new evolution of Software-as-a-Service? How is it different from SaaS, PaaS, grid and utility computing? How will it impact BI?

What is Cloud computing?

The Wikipedia entry states: "Cloud computing is a style of computing in which dynamically scalable and often virtualized resources are provided as a service over the Internet. Users need not have knowledge of, expertise in, or control over the technology infrastructure in the 'cloud' that supports them."

Cloud computing refers to computing resources being accessed which are typically owned and operated by a third-party provider on a consolidated basis in data center locations. Consumers of cloud computing services purchase computing capacity on-demand and are not generally concerned with the underlying technologies used to achieve the increase in server capability. There are, however, increasing options for developers that allow for platform services in the cloud where developers do care about the underlying technology.

Other Related Terms
Software-as-a-service (SaaS): A software solution hosted by a vendor as a fee-based BI service.

On demand: The ability for users to have instant access to a BI service and pay for it based on usage.

Cloud computing: The computing capacity for supporting SaaS BI processing. This may be provided by the SaaS vendor, or a third-party.

Hosted: An alternative approach to running a BI application in house.

Subscription based: A business model for a pay-as-you-go BI service.

Platform: A set of integrated BI software tools that may be offered as a SaaS or on-premises solution.

Grid computing: A large virtual computing platform that provides scalable cloud computing for on-demand SaaS BI processing. Grid computing is a technology approach to managing a cloud. In effect, all clouds are managed by a grid but not all grids manage a cloud.

So a grid is a huge amount of scalable computing power made up of multiple systems that may, or may not, be in the same data center. Grid computers are used to provide the resources for cloud computing, which in turn supports SaaS computing needs.

What does it mean to BI Customers?

There are several cloud-based data warehouse options available to customers in the market today. From pure SaaS or DaaS (data-as-a-service) offerings that provide a full software stack, to PaaS (platform-as-a-service) and IaaS (infrastructure-as-a-service) solutions on which you can build your own data warehouse, the cloud has very quickly shown itself to be fertile ground for managing increasingly large volumes of data.

Several SaaS or DaaS providers offer data analysis services. Using these services, you can employ powerful, full-stack data warehousing technologies with little effort and pay just for what you use. Companies such as 1010data, LogiXML and LucidEra offer various focused solutions for everything from a really big spreadsheet in the cloud to frameworks with built-in extract, transform and load and dashboards all the way through fully developed analysis tools customized for different verticals.

These solutions require no large outlay to get started. You can sign up using a Web browser, in some cases with a free trial, and start uploading data right away. These full-stack solutions include ETL tools for migrating data and automatically building out visualizations to slice and dice your data.

If you need full control over your data warehousing, or the volumes are larger than SaaS providers can handle, you have the option of rolling your own. If building a data warehouse sounds daunting, building one in the cloud would seemingly only complicate matters. But, in fact, the cloud is simple by comparison.

Some questions that still remain unanswered in my mind are as follows:
1. How will data be stored and secured in a cloud environment?
2. Is this environment really suited for data warehousing applications where the volume of data is very high?
3. How will cloud-based BI applications be licensed? Will it be the same as SaaS licensing?
4. What kind of bandwidth is required to use a cloud-based application?

I am sure we will find answers to all of these questions once this technology matures.

Some interesting links:

http://en.wikipedia.org/wiki/Cloud_computing

http://www.cio.com/article/192701/Cloud_Computing_Tales_from_the_Front

http://www.cio.com/article/426214/The_Dangers_of_Cloud_Computing

Wednesday, July 8, 2009

Dashboard Vs Scorecard

If scorecards are supposed to be balanced, are dashboards innately unbalanced? What is the difference between scorecards and dashboards?

The popular conception seems to be that there is no difference. The terms are used interchangeably in most marketing collateral and performance management articles. Perhaps there should be a distinction, as a scorecard for a college semester feels like it is addressing a different problem than a dashboard for an automobile.

What is a Scorecard?
A scorecard is an application or custom user interface that helps you manage your
organization's performance by understanding, optimizing, and aligning organizational units, business processes, and individuals. It should also provide internal and industry benchmarks, goals, and targets that help individuals understand their contributions to the organization. This performance management should span the operational, tactical, and strategic aspects of the business and its decisions. You can use a methodology derived from internal best practices or an external industry methodology. (For example, the term "Balanced Scorecard" is a specific reference to the Kaplan & Norton methodology.)

What is a Dashboard?
A dashboard is an application or custom user interface that helps you measure your
organization's performance to understand organizational units, business processes, and individuals. Conceptually a subset of a scorecard, it focuses on communicating performance information. Just like an automobile dashboard, it has meters and gauges that represent underlying information. A dashboard may also have some basic controls or knobs that provide feedback and collaboration abilities.

Industry Conceptions
Although many people use the terms "dashboard" and "scorecard" synonymously, there is a subtle distinction that is worth understanding.

Dashboards Monitor and Measure Processes.
The common industry perception is that a dashboard is more real-time in nature, like an automobile dashboard that lets drivers check their current speed, fuel level, and engine temperature at a glance. It follows that a dashboard is linked directly to systems that capture events as they happen and it warns users through alerts or exception notifications when performance against any number of metrics deviates from the norm.
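
A minimal illustration of this kind of exception notification is a threshold check over the latest metric snapshot, as sketched below; the metric names and acceptable ranges are hypothetical.

# Small sketch of dashboard-style exception alerting: notify when a metric
# deviates from its acceptable range. Metric names and thresholds are hypothetical.
THRESHOLDS = {
    "avg_call_wait_seconds": (0, 120),     # (min, max) acceptable range
    "orders_per_minute":     (50, 500),
    "error_rate_pct":        (0, 2),
}

def check_metrics(snapshot):
    alerts = []
    for metric, value in snapshot.items():
        low, high = THRESHOLDS.get(metric, (float("-inf"), float("inf")))
        if not low <= value <= high:
            alerts.append(f"ALERT: {metric}={value} outside range [{low}, {high}]")
    return alerts

latest = {"avg_call_wait_seconds": 310, "orders_per_minute": 180, "error_rate_pct": 1.2}
for alert in check_metrics(latest):
    print(alert)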

Scorecards Chart Progress Toward Objectives.
The common perception of a scorecard, on the other hand, is that it displays periodic snapshots of performance associated with an organization's strategic objectives and plans. It measures business activity at a summary level against predefined targets to see if performance is within acceptable ranges. Its selection of key performance indicators helps executives communicate strategy and focuses users on the highest priority tasks required to execute plans.

Whereas a dashboard informs users what they are doing, a scorecard tells them how well they are doing. In other words, a dashboard records performance while a scorecard charts progress. In short, a dashboard is a performance monitoring system, whereas a scorecard is a performance management system.

Scorecards can assess the quality of execution, whereas dashboards provide tactical guidance. Scorecards inherently measure against goals; dashboards need not.

Bringing Balanced Scorecards & Dashboards Together
Customer relationship dashboards use lots of measures that give you data about how your team is operating, but they provide little insight into progress towards your goal of maximizing resolutions. That is measuring and monitoring, but not managing. Likewise, a customer relationship scorecard presents a quick picture of which strategy you need to concentrate on to improve customer satisfaction, but it lacks any detail as to why you are struggling to maximize resolutions.

However, there are ways to ensure that dashboards include the critical connections to strategy. Once you have identified the troublesome measure on the scorecard, you can drill down into a maximum-resolutions dashboard that contains detailed measures such as average call resolution time, call queues and hold time.

Sunday, July 5, 2009

Operational Analytics


Operational BI is no longer simply theory as teams (not necessarily on the bleeding edge of technology advancement) are starting to investigate how to more closely link analytics to their operational activities.

The objective of the operational BI environment is to provide the analysis infrastructure from which people from both inside and outside the organization can make better, faster and more informed decisions. Operational analytics, once opposite ideas now comfortably joined, represent the techniques by which organizations are leveraging the BI infrastructure. Companies employ operational analytics to improve business decisions by directly supporting specific operational processes and activities with analytics. They provide an environment where the organization can learn and adapt based on analysis of operational processes. And they reduce the latency between business events and the organization's ability to react to those events by closing the loop between analytics and operations.

How does this actually work? First, we require an event detection or BI service that can continually monitor business events or transactions that happen in the operational world. These events can be generated by applications, by customer transactions or by service personnel. Once the events have been detected, the related information must be pushed through analysis applications to validate the event and determine the course of action. This generally involves moving the event information into the data warehouse in real or near real time. More traditional data warehouse analyses will also be required prior to starting the initiative in order to determine which events are going to be part of the program. Lastly, the results of the analysis must be married with an operational process so the relevant action can be taken.
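
The sketch below illustrates this detect-analyze-act loop in its simplest possible form; the function bodies are placeholders standing in for the event detection service, the warehouse-backed analysis application and the operational touch point described above.

# Illustrative closed-loop operational BI sketch: detect events, analyze them,
# then push the result back to an operational touch point.
# All function bodies are simplified placeholders for the real services described above.

def detect_events(transaction_feed):
    # Event detection service: watch the operational feed for events of interest.
    return [t for t in transaction_feed if t["status"] == "REJECTED"]

def analyze(event):
    # Analysis application backed by the data warehouse: validate the event
    # and decide on an action (a trivial rule here as a stand-in).
    return {"action": "make_offer"} if event["reason"] == "INSUFFICIENT_FUNDS" else {"action": "ignore"}

def act(event, decision):
    # Close the loop: hand the decision back to the operational channel (ATM, branch, CRM).
    if decision["action"] == "make_offer":
        print(f"Offer overdraft protection to customer {event['customer_id']}")

feed = [
    {"customer_id": 42, "status": "REJECTED", "reason": "INSUFFICIENT_FUNDS"},
    {"customer_id": 7,  "status": "OK",       "reason": None},
]
for event in detect_events(feed):
    act(event, analyze(event))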

An example: a bank wants to serve an immediate need for customers who need cash for emergency purchases and have insufficient cash reserves. With timely, relevant communications, the bank plans to offer these customers a credit card with cash advance capability, overdraft protection or a personal loan. First, the bank must detect that the event has happened - a customer has attempted to withdraw funds at the ATM or at a bank branch and has been rejected due to insufficient funds in the account. This requires continuous data mining for rejected transactions at ATMs and branches. Once the event occurs, information about a specific transaction must be pulled from the transaction system and moved through the data warehouse into a special analysis application designed to determine the significance and relevance of the event for that particular customer.

Habitual offenders, potential fraud perpetrators and people who have simply misjudged their account balance (and can get the funds from other accounts within the bank) must be eliminated from consideration. Eliminating these customers requires a look at past behavior patterns and recent transactions to determine that the account has not had an insufficient-funds transaction in the past six months, that the card has not been reported lost or stolen, and that the customer does not have another account from which they could withdraw a similar amount in the 24 hours following the initial attempt. Further analysis to ensure an honest need for emergency cash can include verification that the account is not simply waiting on a paycheck direct deposit that is due but arriving late (again requiring past account transaction detail mixed with current-state information). Once the bank confirms the need for cash, it must perform a credit score on the customer to ensure that the products offered are appropriate. Next, the bank must deliver the resultant product offer back to the operational process where the customer is experiencing the rejection - in this case, to the ATM or branch interaction.
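
Expressed as simple rules, the screening described above might look like the sketch below; the customer-profile fields and the credit-score cutoff are hypothetical simplifications of what the analysis application would actually pull from the data warehouse.

# Sketch of the screening rules described above, expressed as simple checks.
# The profile fields and thresholds are hypothetical simplifications of what a real
# analysis application would query from the data warehouse.

def eligible_for_offer(profile, amount_requested):
    # Exclude habitual offenders: no insufficient-funds event in the past six months.
    if profile["nsf_events_last_6_months"] > 0:
        return False
    # Exclude potential fraud: card reported lost or stolen.
    if profile["card_lost_or_stolen"]:
        return False
    # Exclude customers who can cover the amount from another account within the bank.
    if profile["other_account_balance"] >= amount_requested:
        return False
    # Exclude accounts that are simply waiting on a late direct deposit.
    if profile["direct_deposit_due"]:
        return False
    # Finally, require an acceptable credit score (hypothetical cutoff) before offering.
    return profile["credit_score"] >= 650

customer = {
    "nsf_events_last_6_months": 0,
    "card_lost_or_stolen": False,
    "other_account_balance": 80.0,
    "direct_deposit_due": False,
    "credit_score": 710,
}
print(eligible_for_offer(customer, amount_requested=400.0))   # True -> present offer at the ATM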

Implementing the operational analytics program in the example requires a closed-loop environment between the operational world and the analytical one. It offers the possibility of tremendous benefit and will become the norm as organizations harness the intelligence in the data warehouse for competitive advantage.