I’ve had the benefit over the last few years of listening to hundreds of our clients talk about business intelligence (BI) tools and how they are being used across their organizations. Although there’s never a one-size-fits-all approach to success, there are some common themes I usually walk away with on how to make visualized reports feel more modern. But before we talk about how to make that happen effectively, it’s worth a quick history lesson on the state of BI in years past.
Shifts in Approaches
You might have heard the terms BI 1.0, BI 2.0 and (now) BI 3.0. Lots of companies have researched and interviewed corporate stakeholders and analysts to help define what these BI iterations stand for. Rather than try to reinvent the wheel, let me just give you a snapshot of what I think they ultimately represent. I’ll keep things tool-agnostic as well.
When discussing BI 1.0, we’re mainly talking about the ability to capture and visualize data. Doing so likely meant downloading an application, with a good chance that components, especially the data, needed to be self-hosted. In general, the outputs, in whatever shape or form they took, stayed mostly confined within the tool itself.
Overall adoption rates for BI at the Enterprise scale were rapid during this period.
When discussing BI 2.0, we’re introducing the ability to move or shift our BI stack around to make our reports and data more accessible and agile. We’re working with online versions of tools, in conjunction with desktop applications, to offer up different means and modes of both developing and updating content. The outputs could be confined, embedded or shared internally and externally.
Overall adoption rates are less important to me here. The key is to start asking ourselves which tool, or combination of tools, gets us to where we want to go. For the Enterprise, that likely meant mixing and matching the BI tools that best fit their specific needs.
The BI landscape gets flooded with new players.
And then along came BI 3.0. I feel that some components of BI 3.0 are still being defined, but the way I see the game board is pretty simple: The prior iterations of BI were all about having analytics serve the Enterprise at scale, with some forward motion toward self-service, or analytics for all. Not only do we want our data to come from lots of different places, underpinned by layers of security and permissions, we want our end users to easily understand the data and use it as part of a business workflow that defines specific actions to be taken immediately.
Data democratization sums the collective state of BI up well: the practice of making data accessible and understandable to everyone in an organization, regardless of their technical abilities, allowing for informed, data-driven decision-making across a broad audience.
Artificial intelligence could also be a part of the new BI workflow or automation process in a way that saves an end user time from analyzing a report, instead pushing them toward action.
Gartner suggested in 2023 that nearly 90% of companies will have leveraged some generative artificial intelligence capabilities by the end of 2026. The automatic creation of reports using dynamic data sounds epic. Are you there already? If not, that’s okay. As with most things, implementing change at a large company or organization isn’t an immediate or overnight thing, and we understand that. You still have time. As I said, the BI world is still evolving.
Even with major shifts in BI, and now AI as an extra wrench, a lot of today’s reports being circulated and put in front of stakeholders were likely developed years ago and haven’t been touched since. When taking a step back as a BI manager, things can feel dated, unintuitive or like a free-for-all.
I believe there are ways of addressing this by focusing on one or all three of these suggestions for modernizing your BI stack.
Tightening Up
Without changing anything else, I’d argue that simply standardizing naming conventions, removing stale content and applying a minimum design standard to both new and already productionalized reports will go a long way. That might mean developing a style guide and outlining the creative assets needed to make something feel official. Doing just these few things will lead to an overall net positive toward a content refresh.
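As a rough illustration of what that standardization pass could look like in practice, here is a minimal Python sketch that audits a report inventory for names breaking a naming convention and for content that hasn’t been touched in over a year. The convention, field names and staleness threshold are all assumptions for the example, not features of any particular BI tool:

```python
import re
from datetime import datetime, timedelta

# Hypothetical convention: "<Dept> - <Subject> (<vYear>)", e.g. "Sales - Pipeline (v2024)"
NAME_PATTERN = re.compile(r"^[A-Z][A-Za-z]+ - .+ \(v\d{4}\)$")
STALE_AFTER = timedelta(days=365)  # assumed threshold for "stale" content

def audit_reports(reports, today=None):
    """Return report names that break the naming convention, and ones that look stale."""
    today = today or datetime.now()
    bad_names = [r["name"] for r in reports if not NAME_PATTERN.match(r["name"])]
    stale = [r["name"] for r in reports if today - r["last_modified"] > STALE_AFTER]
    return bad_names, stale

# Example inventory, shaped like what a BI server's metadata export might give you
reports = [
    {"name": "Sales - Pipeline (v2024)", "last_modified": datetime(2024, 6, 1)},
    {"name": "old_dashboard_final_v2", "last_modified": datetime(2019, 3, 12)},
]
bad, stale = audit_reports(reports, today=datetime(2025, 1, 1))
```

A script like this won’t replace a style guide, but running it on a regular schedule turns “clean up the old stuff” from a one-time project into an ongoing habit.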
Some of the most fun we have here at InterWorks is helping build style guides to support a client’s BI efforts.
Overall tightening impact: Having previous content owners log in and look at a report they haven’t touched in a while (possibly years) will inherently have them put their developer hats back on to ensure those reports still best serve the end users’ needs. As part of that cleanup, it will also encourage the use of newer features and efficiency tools that may not have previously been leveraged. Nearly all BI systems update their features, functionality and controls each year, if not each quarter.
In the end, an organizational change in how things are defined and designed to meet minimum standards can have a large impact. Oftentimes, the timing can align with other corporate changes, upgrades or migrations happening in parallel. Once completed, everything will feel universally refreshed and revamped.
Lifting Up
Much has been said about data lakes and warehouses, and about using the most up-to-date features to achieve optimal query response times so that corporate data is available and structured for easy use. That sounds like a utopia data engineers and managers have been chasing for decades, yet it has always felt out of reach. Until now. Say hello to the cloud.
Did you know there are more than 1,000 data cloud service providers? Many of these are meant to support the Enterprise at scale. You’d recognize most of the key players’ names, which lends a sense of trust and stability to the space: Amazon Web Services, Google Cloud, Microsoft Azure, IBM and Oracle Cloud. Other cloud-based data connectors for ETL or last-mile data prep include dbt Labs, Alteryx, Matillion and Databricks. (Many of these are our data partners.)
But one of the most well-known and respected data providers and partners, increasing in visibility by leaps each year, is Snowflake. Snowflake is a cloud-based data warehouse that stores and analyzes massive data sets for global corporations, with a security-first model for organizations. It’s a powerful system thanks to its modern architecture, ease of use, pay-as-you-go pricing and security (including encryption, authentication and role-based access controls). Most of all, it delivers performance: in some cases, a query that once took 20 minutes now takes under 20 seconds.
So, another way to modernize the BI stack is to offload some of the heavy lifting from direct on-prem data warehouse queries onto an elastic, agile cloud service provider (or providers), especially one that can have you moving at the speed of light, relatively speaking of course.
Opening Up
I have found that a lot of reports inside organizations live in silos. You could have dashboards, liveboards, metrics, tables or interactive reports that stay native to a single tool. This ultimately requires end users to be familiar with how to locate, access, use and interact with not only the report, but the system itself.
As mentioned in the BI 2.0 world, companies have begun to leverage multiple tools to serve a specific group, department or project inside the company. That could mean a collection of providers such as Tableau, Power BI, Sigma and ThoughtSpot all live under the same corporate roof. The need to enable and train users on more than one tool can then create a series of enablement complications or change management obstacles that take time away from actual work. Embedded analytics solutions can be a great lifeline for modernizing a company’s BI landscape and breaking out of silos.
At InterWorks, we have our own custom product called Curator. It’s used by Fortune 500 companies and thousands of other organizations to meet two very specific needs:
- It’s a destination that allows reports from all different tools and service providers to live in one place. As a web-based product, it has built-in hooks that do the heavy lifting of integration for you. You choose a provider and the named asset, and after a couple of clicks, your report appears.
- The system allows for total customization of the design, branding and look and feel. Making the experience less about a system or tool and more about the best visual flow and interface will lead to greater buy-in and adoption from end users, too. This can be the solution for putting all your analytics in one place and modernizing the display of reports, so they feel custom and tailored, offering users the most fluid and intuitive experience possible.
Above: A Curator demo site (Contact us for a free 30-day trial)
That’s it. After taking on one or more of the suggestions above, I can promise you that things will already feel much more modern. For further advice on implementing any of these specifically for your organization, just reach out. We can jump on a call to discuss goals and implement a plan that has you only going up from there.