In parts 1 and 2 we looked at some of the things the business wants from democratized access to data. IT has requirements as well.
We look to IT to provide systems that are stable and available – not part of the time, but all the time. For IT to deliver that consistently, they must also have requirements around self-service BI. These requirements must be reflected in your architecture.
IT are not the bad guys here, although my earlier statements might suggest that. Most often the problem is that IT, like everyone else, has too much work to do. IT has deadlines just like everyone else, and sometimes cannot be diverted quickly enough to help an ad-hoc user with an issue. But when we consider democratizing data – who do you think is going to satisfy all the business needs we just covered? Of course it will be IT.
There are other considerations which IT must handle to make this a success story.
- System Performance Impact
- High Availability and Disaster Recovery
- Licensing Issues
For each new tool, data warehouse, data mart, server, and desktop configuration, someone must provide support. Who will that be? It is likely that several groups will be involved in supporting the solution. Support for the tools and training might belong in a Business Intelligence Competency Center (BICC). In large enterprises there is a group which supports the desktop. Once we begin to discuss the desktop, realize there must be coordination between the desktop operating system, the version of the office tools, data access providers, and other business applications. Maintaining these and providing security patches and upgrade support is not a trivial task. The solution will also include data stores – who supports the hardware, operating system, and database management system for those?
One concern central to most discussions I have around ad-hoc data is system performance impact. There are already data stores which may be adapted for end-user access, most likely production systems. It is difficult (read: impossible) to predict the performance impact of new users querying that data. Current reports and analyses were implemented by IT, and the underlying queries are optimized and predictable. Any one of the new data consumers could issue a query which brings the production system to its knees. This can be remediated by overbuilding the data store with excess capacity or by creating a separate copy for ad-hoc queries, but these extra “sandboxes” can get expensive. You should have a plan in place to monitor and control this impact.
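To make that plan concrete, here is a minimal sketch of one way to cap the cost of an ad-hoc query. It uses SQLite (purely for illustration) and its progress handler as a stand-in for the resource-governor and query-timeout features that production database platforms provide; the function name and the one-second budget are my own assumptions, not anything prescribed by a particular vendor.

```python
import sqlite3
import time

def run_with_budget(conn, sql, max_seconds=1.0):
    """Run a query, aborting it if it exceeds a wall-clock budget.

    Illustrative only: real platforms offer resource governors,
    workload groups, or query timeouts that serve the same purpose.
    """
    deadline = time.monotonic() + max_seconds
    # The handler fires every N virtual-machine instructions;
    # returning a nonzero value aborts the running statement.
    conn.set_progress_handler(
        lambda: 1 if time.monotonic() > deadline else 0, 10_000)
    try:
        return conn.execute(sql).fetchall()
    except sqlite3.OperationalError:
        return None  # query exceeded its budget; log it for follow-up
    finally:
        conn.set_progress_handler(None, 0)  # remove the handler

# Demo against a throwaway in-memory database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (n INTEGER)")
conn.executemany("INSERT INTO t VALUES (?)", [(i,) for i in range(1000)])
print(run_with_budget(conn, "SELECT COUNT(*) FROM t"))  # → [(1000,)]
```

The point is not the mechanism but the policy: ad-hoc users get a bounded share of the system, and queries that exceed the bound are cut off and reported rather than allowed to drag production down.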
As you consider making more data available to everyone, you should put some effort into determining the high-availability and disaster recovery needs. This can be difficult. Early in the process, no one is really going to know how this will be used, and whether or not it will be considered mission critical. IT’s job is to get this on the discussion list, so that it becomes part of the thought process.
Everyone will need data access and tools with which to gather, manipulate, and report on their data. That means licensing. Consider any issues around licensing and licensing models as you move forward. Some tools are very expensive and should be reserved for well-defined business needs rather than general distribution.
All of the items we have seen come at a cost. IT should look at all methods to provide cost-effective solutions for the enterprise. That does not necessarily mean the cheapest solution. One of the often-forgotten items in this discussion is return on investment (ROI). How is the business benefiting from the democratization of data? When someone gets a benefit from this, document it – keep it, and let the rest of the company know how it is affecting the bottom line.
The need to provide a safe, available computing environment for our work is essential to the success of any business. Your architecture should include not only the software, tools, and support systems, but also a framework which can accommodate the governance needs of the organization. This governance may span multiple teams and apply to multiple levels within your architecture.
Now that we have some picture of both IT and business needs, we come to a crucial junction. In my experience, the most difficult issue around self-service is providing this ad-hoc access to data and tools without compromising our safe, secure, available, high-performing systems. While this can get political in a hurry, strive to create a good balance. If both groups are happy, you’re in heaven. If both groups are equally unhappy, I imagine you have done your job well.
In part 4, we conclude this series with a list of items to consider, as you wade into the sometimes deep waters of self-service BI.