New BCG Study Highlights How Organizations Need To Address Spiraling Data Costs And Complexity

Study finds enterprise architectures are stretched to the limits, with more than 50% of data leaders saying architectural complexity is a significant pain point

New market research led by independent consulting firm Boston Consulting Group (BCG) and sponsored by Red Hat and Starburst explores organizational challenges related to exponential growth in data volumes and rapid innovation across the data stack. The research indicates that organizations are facing a perfect storm of rising costs, growing complexity, and talent shortages as they work to support critical analytics and AI initiatives.

The study, titled “A New Architecture to Manage Data Costs and Complexity,” highlights three main trends reshaping the data landscape: 1) the volume and velocity of data are increasing; 2) data use cases are becoming more accessible, fueling the growth of “citizen data scientists”; and 3) technology advancements have shifted the pricing model.

According to the new report, these trends are creating challenges that have put immense pressure on today’s architectures:

  • Today’s enterprise architectures are stretched thin, with more than 50% of data leaders saying architectural complexity is a significant pain point.
  • Vendor proliferation across all data categories is a major issue. For larger companies with a more mature data stack, the total number of unique data vendors has nearly tripled in the last decade — up from about 50 to over 150 today.
  • 56% of managers said managing data operating costs is a pain point, yet they continue to increase their investments in modernizing and building new data architectures.

Regarding the trend of data use cases becoming more accessible, Pranay Ahlawat, Partner and Associate Director at BCG, said, “Accessibility will continue to increase as data literacy and a basic understanding of programming languages such as SQL become more widespread among nontechnical employees. According to our research, almost three-quarters (73%) of survey respondents expect the number of non-technical consumers of data to increase in the next three years.” The research details the growth of “citizen data scientists” and juxtaposes that growth with the increasing sophistication of AI-related use cases. The gap between these more advanced analytics use cases and technologies and the analytics skill sets required to use them is currently limiting the business outcomes AI can drive. Ahlawat continued, “Only 54% of managers believe that their company’s AI initiatives create tangible business value.”

Regarding data storage and architecture complexity, Steven Huels, Senior Director, Cloud Services for AI and Machine Learning at Red Hat, said, “The survey responses confirm that many enterprises are struggling with adapting to increasing data volumes across multi-cloud and edge while also maintaining legacy data architectures. This is compounded by increasing data privacy regulations, pressure on IT and data spend, and a shortage of highly skilled talent. Red Hat believes that the solution to managing these challenges will be to implement data architectures that are agile — built for today’s requirements with the flexibility to evolve quickly in the future.”

The research indicates that, given the rapid growth of data and use-case volumes, increasing complexity, and skyrocketing costs, more organizations are reaching a breaking point. For organizations ready to take on these challenges, the report offers a few key lessons to keep in mind:

  • Data architectures are evolving to be more federated and service-oriented: 68% of surveyed companies aspire to implement a more federated and distributed architectural paradigm (e.g., Data Mesh or Data Products) in the next three years.
  • Pay close attention to overall data total cost of ownership (TCO): To keep costs under control, establish baseline spending and de-average customer segments to understand key cost drivers, such as people, data transfer and movement, data storage, and software. Drive shorter-term tactical cost improvements by exploring multiple approaches.
  • Economics will drive architecture choices, with open source and hyperscalers continuing to influence technology decisions: BCG research shows that cloud and open source are expected to continue to play a significant role in the future of enterprise data architectures as enterprises aim to manage costs. In fact, BCG estimates that open source can reduce the total cost of the data stack by 15%-40% for some organizations.

“One of the most significant takeaways from this study is the need for organizations to invest in a decoupled and federated data architecture,” said Starburst CEO and co-founder Justin Borgman. “This approach meets today’s reality that data is everywhere, and companies can’t afford the time, cost, and architectural complexity to centralize it. It allows companies to bring analytics to the data, making it accessible for decision-making without data movement complexities and costs. It is the only viable approach that will allow companies to meet increased demands for data storage and analytics workloads, while getting costs under control.”
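As a rough illustration of the query-in-place model Borgman describes, the hypothetical Python sketch below uses the open source trino client (Trino is the open source engine Starburst builds on) to join a data lake table with an operational database table in a single federated query. It is a minimal sketch under assumed conditions, not an implementation from the study: all hosts, catalogs, schemas, and table names are placeholders.

import trino  # open source Trino client for Python

# Hypothetical connection to a Trino-based query engine; host, user,
# catalog, and schema values below are placeholders.
conn = trino.dbapi.connect(
    host="trino.example.internal",
    port=8080,
    user="analyst",
    catalog="hive",     # e.g., data lake tables on object storage
    schema="sales",
)

# One federated query joins lake data (hive catalog) with an operational
# database (postgresql catalog) without copying either dataset first.
sql = """
SELECT o.region,
       SUM(o.order_total) AS revenue,
       COUNT(DISTINCT c.customer_id) AS customers
FROM hive.sales.orders AS o
JOIN postgresql.crm.customers AS c
  ON o.customer_id = c.customer_id
GROUP BY o.region
ORDER BY revenue DESC
"""

cur = conn.cursor()
cur.execute(sql)
for region, revenue, customers in cur.fetchall():
    print(region, revenue, customers)

In a sketch like this, the analytics run where the data already lives, which is the sense in which a federated architecture avoids the data movement costs and complexity the report highlights.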
