
Deriving Business Value from Data Analytics

By Broc Kerr

Issue 99
July 2015

Efficiently delivering business intelligence with verifiable business value

To capitalize on rapidly evolving business opportunities, investment managers must be able to answer new questions and solve new problems by quickly analyzing and re-contextualizing their existing data. But traditional approaches to data management are proving too slow and inflexible to support these emerging requirements. This Cutter AdvantEdge looks at some of the new technologies, architectural strategies, and methodologies firms are using to increase the speed and flexibility of their data management capabilities.

New Uses for Data Analytics

Data analytics goes beyond simple reporting of raw data. It provides statistical analyses and visualizations that deliver insights into data relationships, trends, and projections that inform business decisions. Investment managers have used data analytics in the investment management process for many years. Now they are using it to manage the business itself, for purposes that include the following:

  • Increasing Efficiency: Examining relationships between factors driving operational expenses, forecasting resourcing needs, and optimizing brokerage and transaction expenditures.
  • Managing Risk: Identifying unusual trading behavior, monitoring firm-wide exposures, and assessing multiple health factors of client relationships and investment strategies.
  • Supporting Business Strategy: Segmenting client prospects and market opportunities, identifying factors influencing asset flows, and assessing profitability trends at the product and client levels.


New Technologies

Traditional data management initiatives begin by collecting data into an enterprise relational data warehouse, then aggregating and normalizing it in ways that support analysis and reporting. But the required analysis, design, and implementation efforts can take months, during which time the original business opportunity may move in a different direction or be lost altogether.

Investment managers have been receptive to the latest generation of business intelligence (BI) and analytic tools, driven largely by front-end advances in ease of use and data presentation. But under the hood, important technical advances are enabling investment managers to use these tools in new ways.


Thanks to in-memory data processing and other technical advances, the latest generation of analytic tools can manage vastly greater quantities of data, support newer types of analytic processing, and meet new requirements for data presentation. While dedicated “Big Data” platforms are generating hype and speculation, analytic tools are in many cases quietly meeting many of the same challenges without major up-front investments in new technical skills and new data processing infrastructure.


Traditional BI tools require all data to reside in a single, centralized data store, an architecture that demands significant up-front infrastructure investment to centrally store and structure data for analytics. But newer capabilities for dynamically combining diverse data from multiple sources can significantly reduce these costs. The potential importance of these new capabilities in the industry is demonstrated in the CutterResearch chart below, which reveals that more than 85% of investment managers already depend on multiple data sources for reporting.

[Chart: Number of Major Data Sources Used for BI and Reporting. Source: 2014 CutterResearch report on Data Access for Business Users.]

New Architectural Strategies

To provide more flexible access to data, firms are exploring new tactics such as data virtualization and data lakes. They are also re-examining how to incorporate more mature concepts such as dimensional data warehousing into their overall data strategy.

Data Virtualization

Data virtualization enables firms to blend data from diverse sources, quickly making new data sources and newly identified relationships available for data analytics. Data virtualization also reduces the need for duplicate data, along with related headaches such as inconsistent and unsynchronized data. It enables firms to enforce common data rules and transformations by adding logic to the virtualization layer, and to easily add new data sources as new analysis needs are identified.
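The blending described above can be sketched in a few lines of Python. Everything here is invented for illustration: two hypothetical sources (a portfolio system and a CRM) are queried on demand, and a shared account-normalization rule lives in the virtualization layer rather than in either source, so no duplicated copy of the data is ever persisted.

```python
def query_portfolio_system():
    # Hypothetical source A: assets under management per account
    return [{"account": "ACCT-1", "aum_usd": 120.0},
            {"account": "ACCT-2", "aum_usd": 80.0}]

def query_crm_system():
    # Hypothetical source B: client segmentation, with inconsistent keys
    return [{"account": "acct-1", "segment": "Institutional"},
            {"account": "acct-2", "segment": "Retail"}]

def normalize_account(raw):
    # Common data rule enforced once, in the virtualization layer
    return raw.strip().upper()

def virtual_view():
    """Blend both sources at query time; nothing is copied to a warehouse."""
    segments = {normalize_account(r["account"]): r["segment"]
                for r in query_crm_system()}
    for row in query_portfolio_system():
        key = normalize_account(row["account"])
        yield {"account": key,
               "aum_usd": row["aum_usd"],
               "segment": segments.get(key)}

for row in virtual_view():
    print(row)
```

A new source can be added by registering another query function and mapping it through the same normalization rules, without restructuring any existing store.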


Dimensional Data Warehousing

For some time, dimensional data modeling has been the prevailing approach for designing scalable analytic data stores. Now, investment managers are looking at dimensional data warehouses to completely replace their conventional relational data warehouses. Because a relational data warehouse is typically implemented with a big-bang approach, requiring long lead times to realize business value, adding functionality to it can be very expensive. But a dimensional data warehouse can go live supporting just a few specific business needs, and its functionality can readily be extended as new use cases are defined. Although a dimensional data warehouse requires more data duplication than a relational data warehouse, it is more scalable and provides data that is easier to read for analysis purposes. And firms can implement a dimensional data warehouse using familiar relational tools and infrastructure.
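A minimal star-schema sketch, with product, client, and trade data invented for illustration: one fact table keyed to two dimension tables, running on ordinary relational infrastructure (here SQLite), so rollup queries stay short and readable.

```python
import sqlite3

# Hypothetical star schema: a trade fact table plus two dimension tables.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE dim_client  (client_id  INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE fact_trade  (product_id INTEGER, client_id INTEGER,
                              notional REAL);
    INSERT INTO dim_product VALUES (1, 'Global Equity'), (2, 'Core Bond');
    INSERT INTO dim_client  VALUES (10, 'Pension A'), (11, 'Endowment B');
    INSERT INTO fact_trade  VALUES (1, 10, 5.0), (1, 11, 3.0), (2, 10, 2.0);
""")

# Product-level rollup: the dimensions keep the analytic query simple.
rows = con.execute("""
    SELECT p.name, SUM(f.notional) AS total
    FROM fact_trade f JOIN dim_product p USING (product_id)
    GROUP BY p.name ORDER BY total DESC
""").fetchall()
print(rows)  # [('Global Equity', 8.0), ('Core Bond', 2.0)]
```

Extending the model for a new use case means adding a dimension or fact table, not redesigning the whole schema, which is what makes the incremental go-live approach workable.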

Data Lakes

A traditional data warehouse is designed to capture and structure data to meet a specific set of predefined business needs. But business needs for data are changing more rapidly now, so predefined needs can quickly become obsolete. With business users continually devising new ways to use existing data, and with storage getting cheaper every day, investment managers are beginning to experiment with a data management concept known as a data lake. The premise of a data lake is to gather all data from all available sources without worrying about normalizing or formatting it, and to make it immediately available for users to explore, combine, and analyze as use cases arise. Data lakes represent a fundamental change in data management philosophy, because while they require no specific technology platform, they place more responsibility on analysts for understanding and accurately representing the data.
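A minimal schema-on-read sketch, with file names and fields invented for illustration: records land in the lake exactly as received, and structure is imposed only when an analyst reads them for a use case defined after the data was stored.

```python
import json
import os
import tempfile

# Hypothetical "lake": raw feed files are written with no normalization.
lake = tempfile.mkdtemp()
raw_feeds = {
    "crm_2015-07-01.json":    [{"acct": "A1", "region": "EMEA"}],
    "trades_2015-07-01.json": [{"acct": "A1", "qty": 100, "px": 9.5}],
}
for name, records in raw_feeds.items():  # ingest: store as-is
    with open(os.path.join(lake, name), "w") as f:
        json.dump(records, f)

def read_feed(prefix):
    """Analyst-side read: interpretation happens here, not at ingest."""
    for name in sorted(os.listdir(lake)):
        if name.startswith(prefix):
            with open(os.path.join(lake, name)) as f:
                yield from json.load(f)

# A use case defined after ingestion: notional traded per account.
notional = {}
for t in read_feed("trades"):
    notional[t["acct"]] = notional.get(t["acct"], 0) + t["qty"] * t["px"]
print(notional)  # {'A1': 950.0}
```

Note where the responsibility sits: the ingest step does nothing clever, so the analyst's read function carries the burden of interpreting the raw records correctly.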

New Methodologies

Perhaps the most critical choice in a successful data analytics initiative involves neither tools nor architecture, but a process that can rapidly and repeatedly deliver verifiable business value. For software development and IT projects, Agile methodologies tightly tied to measurable business outcomes have consistently proven their value. Cutter Associates is now seeing investment managers apply Agile methodologies to data analytics and business intelligence initiatives, with promising results. The following are typical components in these methodologies.

Identify Analytic Hypotheses

Begin the initiative by brainstorming about the kinds of information, currently unavailable, that could deliver verifiable business value. One example might be, “What are the most influential factors in the profitability of an investment unit?” Another could be, “Are the new investment products attracting fresh AUM, or are they cannibalizing existing products?” For each information idea, identify the data required to generate the information, as well as the owners of that data, and the stakeholders who can turn the information into real business value.


Select Value Opportunities

Engage data owners to weigh in on the feasibility of obtaining the desired data, and engage stakeholders to determine the potential business value of developing the information. Determine the best way to combine the feasibility and business value of each type of information to build a prioritized list of analytic projects.
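One simple, hypothetical way to combine the two signals is a weighted score over the candidate ideas. The ideas, ratings, and weights below are invented for illustration; in practice a firm would calibrate them with its own stakeholders and data owners.

```python
# Hypothetical candidate analytics, each rated 1-10 by stakeholders
# (business value) and data owners (feasibility of obtaining the data).
candidates = [
    {"idea": "Unit profitability drivers",       "value": 9, "feasibility": 6},
    {"idea": "New-product AUM cannibalization",  "value": 7, "feasibility": 8},
    {"idea": "Brokerage cost optimization",      "value": 5, "feasibility": 9},
]

def score(c, w_value=0.6, w_feas=0.4):
    # Illustrative weighting: value counts slightly more than feasibility.
    return w_value * c["value"] + w_feas * c["feasibility"]

backlog = sorted(candidates, key=score, reverse=True)
for c in backlog:
    print(f"{score(c):.1f}  {c['idea']}")
```

The point is not the arithmetic but the discipline: making the ranking explicit forces data owners and stakeholders to agree on why one analytic project goes first.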

Demonstrate Value

Using live business data—from strategic sources where available, and from tactical sources where necessary—develop data analyses that clearly articulate the potential business value of generating each type of information, and ensure that actual business value outcomes are measured and shared.

Assess Results

Determine the process and technology investments required to automatically generate each set of analytics on a regular basis. Then compare this investment to your measured business outcomes from the data to determine whether you should operationalize these analytics.

Reflect and Repeat

Ensure that lessons learned in this iteration are fed back into the process, including an improved understanding of stakeholder priorities and of the data owners' technical knowledge.

Focus on Results

Significant untapped value lies in data that already exists in investment management organizations, and analytics capabilities can effectively exploit this data—but only if the analytics are established with a clear focus on demonstrable business value. Business users expect analytic capabilities to meet needs and opportunities as they arise. And they expect rapid changes in response to their requests for new ways of structuring, processing, and presenting data. Establishing an Agile Analytics data architecture and methodology addresses these ever-changing business requirements and opportunities in a way that can evolve along with the business to become a source of genuine strategic value.


About the Author

Broc Kerr has more than 15 years of experience architecting and implementing major information systems. He has deep expertise in the integration architecture of Order Management, Investment Accounting, Performance and Risk Analytics, Data Warehousing, and Client Reporting systems within Institutional Asset Managers. At Cutter Associates, Broc works in the Strategy and Data Management practices. He has led the development of target operating models and implementation roadmaps. He has also led data strategy programs and the architecture and integration of investment management systems for equities, fixed income, private equity, real estate, and a broad range of derivatives. Broc holds a B.Eng (Microelectronics) from Griffith University, and a Graduate Certificate of Management from the University of Queensland.