The impact of new BI architectures on the BI process
Last year, I presented at dozens of conferences, explaining the big changes to BI infrastructures brought about by new technologies (in-memory, Hadoop, MPP, in-database analytics, real-time, etc.). And this year, I've presented dozens of examples of the real-life business changes that these technologies have made possible.
But one of the most important business processes in modern organizations is the provision of business intelligence itself.
I think we haven’t yet talked enough about how the new technologies will disrupt traditional organizations, roles, and processes in analytics. I believe much of the “next big thing in BI” is actually about getting back to the original ideals of BI – hence the reference to “new old BI ideas”.
Here are three big changes I see:
Upside-down, black-box BI
We’ve always known that – in theory – we’re supposed to start from the decisions that need to be made and then work backwards. But the reality is that, because of technology limitations, most BI projects are still organized around traditional layered architecture approaches: gather the data available, figure out how to cleanse it and transform it, load it into an analytic structure, then figure out what reports/dashboards people want to see.
The new technologies promise to get us back to our ideal. Imagine starting your BI project with a blank sheet of paper – literally. Ask the business people what questions they need answering in order to run their business. Using a collaborative environment (like SAP Jam), get them to upload a sketch of the data visualizations they would like to see, imagining that they have access to any data they might want.
Debate, iterate, and prototype with data discovery tools (like SAP Visual Intelligence) and temporary or dummy data sets. Once the outlines of an ideal solution are in place, invite the data architects into the same collaboration session, and ask them where the detailed data required is originally stored (e.g. in SAP Business Suite, in a customer/partner system via Ariba, social media sentiment, etc.).
Once you have determined the visualizations you want, and the origins of the data, everything in-between can now – at least in theory – become a “black box.” Thanks to in-memory solutions like SAP HANA, you can now do real-time queries on huge quantities of row-level data. You should no longer have to care what the underlying analytic infrastructures are: the system should be able to determine what storage is appropriate based on the questions being asked. As new visualizations and data sources are required, the system would adapt the storage appropriately, so there’s no longer any tension between “siloed” and “enterprise” BI.
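To make the "black box" idea a little more concrete, here is a minimal, purely illustrative sketch of what a query-driven routing layer could look like. The class, the thresholds, and the strategy names are all invented for this example; none of this reflects an actual SAP HANA interface.

```python
# Purely illustrative sketch of a "black box" routing layer: the analyst
# describes the question, and the layer picks an analytic storage strategy.
# The class, thresholds, and strategy names are invented for this example.
from dataclasses import dataclass


@dataclass
class Question:
    source: str               # where the row-level data originates
    rows_to_scan: int         # how much detail the question touches
    latency_target_s: float   # how quickly the answer must come back
    needs_row_detail: bool    # row-level drill-down vs. a small aggregate


def choose_storage(q: Question) -> str:
    """Pick a storage/processing strategy from the shape of the question."""
    if q.latency_target_s < 1 and q.rows_to_scan > 10_000_000:
        return "in-memory columnar store"          # real-time over huge detail
    if not q.needs_row_detail and q.latency_target_s >= 60:
        return "pre-aggregated cube, refreshed periodically"
    return "direct federated query against the source system"


# The business user only ever sees the visualization; the routing stays invisible.
q = Question(source="SAP Business Suite", rows_to_scan=50_000_000,
             latency_target_s=0.5, needs_row_detail=True)
print(choose_storage(q))   # -> "in-memory columnar store"
```

A real system would of course weigh far more factors, but the point is the same: the business specifies the question and the data origin, and everything in between is the system's problem, not theirs.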
The dream isn’t really new, but the tools are now much, much more powerful, and this type of architecture is imaginable for the first time.
Sounds unrealistic, a pipe dream? It’s closer than you think.
The poster children are the brand-new SAP EPM OnDemand and SAP Precision Retail solutions. Both take row-level data from core operational systems, and serve up beautiful, interactive, mobile analytics for business people via the cloud. In both cases, they use SAP HANA to deliver great performance. But what’s the underlying analytic structure? You don’t know, and don’t really care!
It’s only a question of time before all of our analytic systems work this way.
Support for seamless, dynamic BI lifecycles
One of the recurring causes of BI failure is analytic systems that are too static. For ERP, it makes sense to have big projects to put in place coordinated systems to gather and process consistent operational data. These systems are not likely to need to change very often (although as the world changes, more flexibility is increasingly required, and this would be one of the big advantages of an in-memory ERP system such as Business Suite on HANA).
But BI is different. If it’s not changing, it’s not working. It’s like a bicycle – if it’s not moving forward, it’s falling over.
This is because accessing information is, on its own, a useless activity. Unless somebody is looking at those reports and dashboards, and making changes to business processes, there's no benefit to the business. And if changes to business processes are being made, these should, in turn, result in changes to the information needed to run the business. Business analytics teams should be constantly running after the increasingly sophisticated information needs of business users and executives.
If you are in charge of a business intelligence deployment, and people have been looking at the same reports for over a year, and have made no requests for changes, it's time to investigate – somebody is not doing their job. Either the reports are being ignored, or they aren't being acted upon.
So BI systems have to change over time. But building business analytics systems for change has been hard.
To explain why, first note that there is a very distinct BI lifecycle for information needs. It starts off with new business initiatives and information sources. By definition, there’s no big infrastructure to support these initiatives, so they make do with the best tools available, typically manual data scrapes and spreadsheet analysis.
Over time, the initiatives prove their worth, and need to be operationalized with more powerful tools. This is where departmental BI initiatives typically start, but there's usually a wrenching transition from the old spreadsheet-based approach to the new systems.
Progress continues, and then the data also needs to be integrated with corporate data and made consistent so that it can be used across the company. So there’s an initiative to integrate it into the enterprise data warehouse, typically with more wrenching change.
A concrete example of this is the recent rise in social media analysis. It started off with a series of small, completely separate tools that let marketing teams do some basic analysis of Twitter, Facebook, and other social tools. Now most larger organizations are starting to take a more strategic approach with tools that give an overview of social data from multiple different systems, and integrate it with the rest of their marketing analytics. And big consumer goods companies such as General Mills have already started integrating the data into their corporate data warehouses.
Today, BI is like a car with five gears, but no clutch. Each of the gears represents a different level and sophistication of analysis, from first gear (spreadsheets) to fifth gear (a full enterprise-wide data warehouse). Over time, we'd like the car to be able to move from one gear to another, but without a clutch, it's a very painful process that generates the business equivalent of loud grinding noises. (This analogy may also illustrate why so many companies have failed with enterprise data warehouse projects in the past – the only way to get a clutchless car into fifth gear is to be rolling downhill or have a very strong wind behind you.)
In the future, BI has to be more like an automatic car. We want to press on the accelerator and smoothly increase speed. The gears will still be there, but handled for us more automatically.
New BI technology and tools enable this. Data discovery tools such as SAP Visual Intelligence provide "second gear" – they can take existing spreadsheet data and other sources and produce business-friendly visualizations in a more robust way. In particular, the move to 64-bit technology and lower memory costs has enabled laptop computers to perform powerful analysis without requiring extensive corporate IT support.
To move to third gear, it should be easy to take the data sets and visualizations and share them to mobile devices via a "sandbox" on a corporate-wide infrastructure. This retains the benefits of business unit autonomy – they get to completely control the analysis being done – while IT provides a robust, scalable, cost-effective system.
This process is made much easier by in-memory systems like SAP HANA. Before, it was hard to imagine a business unit looking after its data, because of the extensive skills needed to optimize the data for analytics use. Now, people can upload detailed data directly into memory and carry out powerful shared analysis with much less expertise required.
Fourth gear is when business units realize that their solution needs to be able to access and share data with corporate systems. If this data is already being stored on the central infrastructure, it's much easier to integrate, and experts in the IT department can monitor what is being done around the company, ensuring that there isn't needless duplication of information, and proposing new complementary data for existing solutions.
Finally, fifth gear involves the data set becoming a fully-fledged corporate resource as part of the official enterprise data warehouse, with corresponding guarantees on quality and timeliness.
Putting people back into BI
The changes described above don’t help with the most intractable problems in BI, which are to do with people, culture, and organization.
People are undoubtedly the most powerful “technology” in the business analytics solution. Only people can take information, make sense of it, and actually change something. We only use the word “decision” when it requires a person to make sense of an ambiguous set of data.
Computers cannot, and never will, make “decisions” – as soon as the data is clear, and a computer can make a choice for us, we no longer use that word, and it simply becomes a feature of our applications. Examples include things like the pricing of airline seats, or the routing of goods in supply chains. People used to do these things, and they called it “deciding”.
Now people “decide” the algorithms, and the computers do the rest. The history of computing is the story of how, over time, computers consistently take over low-level “decisions” while we move on to the next level of complexity and ambiguity (which is also why business people will NEVER be happy with their information systems).
All too often, people are considered passive "users" of BI, whereas they should be considered the heart of the system. Every time somebody touches information – choosing what to pay attention to, and what actions to take – they are adding value, and this value needs to be captured and made part of the overall solution.
People are a powerful technology, but at the same time, one of the painful, under-acknowledged truths in the industry is that even with “perfect” BI tools, many people could and would still make bad decisions.
This is because, as a host of recent popular books have shown, human beings are meat-based machines with psychological quirks that can get in the way of using data as effectively as possible. Situations I've seen in most real-life business analytics environments include a lack of analysis skills, lack of motivation, badly designed incentives and KPI definitions, and misaligned organizational support.
Clearly, in-memory technology will not help here. But enterprise social collaboration can.
I believe the best way to avoid the problems created by people is to add more people. A classic example is the elementary difference between correlation and causation. The fact that chocolate consumption and earning Nobel prizes are correlated does NOT mean that chocolate makes you smarter.
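As a toy illustration of how easily that trap appears, the snippet below generates two made-up series that are both driven by a third factor (national wealth, in this sketch) and measures how strongly they correlate; the numbers are invented purely to show that a strong correlation needs no causal link at all.

```python
# Toy illustration with made-up numbers: chocolate consumption and Nobel
# prizes can correlate strongly when both simply track a third factor
# (here, national wealth). Correlation alone says nothing about causation.
import numpy as np

rng = np.random.default_rng(42)
wealth = rng.uniform(10, 60, size=20)               # GDP per capita, arbitrary units
chocolate = 0.2 * wealth + rng.normal(0, 1, 20)     # kg per person per year
nobels = 0.5 * wealth + rng.normal(0, 3, 20)        # prizes per 10 million people

r = np.corrcoef(chocolate, nobels)[0, 1]
print(f"correlation between chocolate and Nobel prizes: {r:.2f}")
# Prints a strong positive correlation, yet eating chocolate causes nothing here.
```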
I believe the best way to avoid bad information analysis is to share that analysis more widely and more transparently, and let people point out any problems with the data (think of it as the equivalent of the peer review done in the scientific community).
New products such as SAP Jam let organizations introduce collaboration into the heart of the business analysis process, combining traditional social techniques (feeds, wikis, etc.) with structured tools for collaborative decision-making. This makes it easier to correct bad analysis, to record how decisions have been made in the past, and to improve that process in the future.
Better collaboration is also essential to improve core BI implementation processes such as data quality (typically a business problem, not a technical problem) and metadata (what do we mean by key terms like "number of employees"?). In the longer term, poor collaboration between IT and business teams is the biggest cause of BI failure.
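On the metadata point, here is a small, hypothetical sketch of the kind of shared glossary entry a collaborative process could maintain. The fields, the definition, and the discussion trail are all invented for illustration, not taken from any SAP product.

```python
# Hypothetical sketch of a collaboratively maintained business glossary entry.
# The fields and example values are invented for illustration only.
from dataclasses import dataclass, field


@dataclass
class GlossaryTerm:
    name: str
    definition: str                  # the agreed business meaning
    owner: str                       # the business person accountable for it
    discussion: list = field(default_factory=list)  # record of how agreement was reached


headcount = GlossaryTerm(
    name="number of employees",
    definition=("Active permanent staff on the last day of the month, "
                "excluding contractors and employees on unpaid leave."),
    owner="Head of HR Reporting",
)
headcount.discussion.append("Finance asked whether contractors count; agreed they do not.")
print(f"{headcount.name}: {headcount.definition}")
```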
What do you think?
I hope to add more details to all of this in the future, so please give me your feedback (or invite me to your next analytics conference to present them in more detail!)