The chief data officer (CDO) is increasingly becoming the chief change agent in organizations, as companies grapple with their transition to becoming data-driven. While the typical "people, process and technology" focus still applies, conversations with CDOs highlight two big challenges that must be balanced: the need for business analytics agility, and the need for governance at the enterprise level.
What makes this balance especially hard for CDOs is that it must be struck while absorbing massive technology changes and business culture transformations. Worse, it all has to happen while ensuring no one gets into trouble in the quest to become data-driven.
A recent study by Corinium Digital, The Innovation Game, highlights some of these shifts in detail. For example, the study found that more than 60% of global enterprises are now investing in a hybrid, multi-cloud strategy that combines data from cloud environments with existing on-premises infrastructure.
The Changing Application Landscape & the Evolving Business Culture
A decade ago, the business world was well on its way to embracing monolithic application suites from SAP, Oracle, and the like. From 2008 onwards, that path quickly gave way to the adoption of best-of-breed software-as-a-service (SaaS) applications, such as Salesforce.
The result is that, instead of a pre-integrated application suite, you now have hundreds of SaaS apps that increase the burden of data integration, especially for business-critical use cases such as achieving a single view of your customer or measuring how well a product performs across your business.
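To make the "single view of your customer" integration burden concrete, here is a minimal sketch of consolidating customer records from two hypothetical SaaS app exports into one merged view. All field names, sample records, and the email-based matching rule are invented for illustration; a real integration would handle far messier identity resolution.

```python
# Sketch: merge per-app customer records into a single view,
# keyed on a normalized email address. Later sources fill in
# fields the earlier ones were missing.

def normalize_email(email):
    """Trim whitespace and lowercase so variant spellings match."""
    return email.strip().lower()

def single_customer_view(*sources):
    """Fold several lists of customer records into one dict,
    one merged record per normalized email."""
    merged = {}
    for records in sources:
        for rec in records:
            key = normalize_email(rec["email"])
            view = merged.setdefault(key, {})
            for field, value in rec.items():
                if value and not view.get(field):
                    view[field] = value
    return merged

# Invented example exports from two apps (e.g. a CRM and a support desk).
crm_export = [{"email": "Ana@Example.com", "name": "Ana Diaz", "phone": ""}]
support_export = [{"email": "ana@example.com ", "name": "", "phone": "555-0100"}]

customers = single_customer_view(crm_export, support_export)
print(customers["ana@example.com"])
# {'email': 'Ana@Example.com', 'name': 'Ana Diaz', 'phone': '555-0100'}
```

Even this toy version shows why the burden grows with each new app: every additional export adds its own field names, formats, and duplicate-matching quirks.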
While the application fabric changed, businesses went through a revolution of their own, driven by the consumer and by the explosion of IoT, with billions of smart devices measuring everything from the temperature of a turbine to a shopper's path through a supermarket. Combined with the trail customers leave on websites and social media, these data streams let us know far more about a customer or product than ever before. In turn, the demand for data and insights about customers has grown exponentially.
Another element driving the business culture is a sense of entitlement: a right to know, or at least to have access to, the data that holds the answers. If business users can sign up for a Box account or any other software service with a credit card, why can't they get access to their own customer data? One critical implication for the CDO is that it becomes increasingly challenging to predefine or predict exactly what the business will need or ask, making it very difficult to build pre-modeled data warehouses that can answer every business question.
The traditional ways of dealing with exploding data requests, sourced from such a diverse set of apps, cannot keep up with the growing number of ways the business wants to consume that information in analytical initiatives. Traditional ETL (extract, transform, load) and data integration tools need to be augmented or replaced with modern data preparation solutions. In The Forrester Wave: Data Preparation Solutions, Q4 2018, Forrester Research proposes that data preparation should occur wherever it is needed in the organization, and that increasingly this means moving the responsibility out of IT and into the business.
Leveraging Big Data Fabrics to Deliver at Scale (and with Governance)
As suggested, the second part of the CDO balancing act is not just to succeed with a single pilot project but to roll such initiatives out at scale across the enterprise. That requires empowering all your business analysts with self-service data prep and access to all your data living in best-of-breed apps, so that they can deliver the required insights at the speed the business needs.
The key to unlocking this at scale is what Forrester describes in its most recent Forrester Wave: Big Data Fabric, Q2 2018 as a Big Data Fabric: "an emerging platform that accelerates insights by automating ingestion, curation, discovery, preparation, and integration from data silos."
A vital requirement of a big data fabric approach is that it should enable you to "play the data where it lies" instead of forcing the creation of carefully curated enterprise data warehouses. Also, being an enterprise platform, the data fabric offers all the requisite enterprise capabilities for governance, security, and manageability.
The same Forrester report also suggests adopting a zero-code approach to creating data flows and data prep. A self-service data prep solution built on a data fabric lets you extend the creation of data flows and data pipelines beyond elite IT developers to others within the organization: business analysts, citizen data scientists, and BI analysts.
But, to be a comprehensive solution, your self-service data preparation should also be able to clean your data, standardize and match data elements such as customers or products, and promote these business analyst-created data flows into production pipelines when needed. According to Forrester, this drives increased agility while minimizing the complexity of dealing with diverse and even emerging technologies.
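The cleaning, standardizing, and matching steps described above can be sketched in a few lines. This is a deliberately simplified illustration, not any vendor's method: the company names, suffix list, and matching rule are all invented, and real data-prep tools apply far richer fuzzy matching.

```python
# Sketch of basic data-prep steps: clean and standardize raw
# customer names, then group duplicates under a common key.

import re

# Invented list of legal suffixes to strip during standardization.
SUFFIXES = {"inc", "llc", "ltd", "corp", "co"}

def standardize(name):
    """Lowercase, strip punctuation, and drop legal suffixes so
    variant spellings of the same customer match."""
    tokens = re.sub(r"[^\w\s]", "", name.lower()).split()
    return " ".join(t for t in tokens if t not in SUFFIXES)

def match_duplicates(records):
    """Group raw records under their standardized key."""
    groups = {}
    for raw in records:
        groups.setdefault(standardize(raw), []).append(raw)
    return groups

raw_customers = ["Acme, Inc.", "ACME Inc", "acme", "Globex Corp", "Globex"]
groups = match_duplicates(raw_customers)
print(sorted(groups))   # ['acme', 'globex']
print(groups["acme"])   # ['Acme, Inc.', 'ACME Inc', 'acme']
```

In a self-service tool, a business analyst composes the equivalent of these steps visually, and the platform then promotes the resulting flow into a governed production pipeline.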
Nenshad Bardoliwalla, chief product officer at Paxata, authored this article.
The views and opinions expressed in this article are those of the author and do not necessarily reflect those of CDOTrends.