
How Security Leaders Can Embrace a Long-Term Approach to Managing External Pressures

Modern cybersecurity has a problem: Security leaders are finding their systems overloaded with data from a wide range of security tools. That data must be stored and managed, which means ingestion costs are soaring. Faced with this new reality, many companies are developing solutions to try and reduce log volume.

Growing awareness and investment in this arena is a good thing. It’s vital. But by and large, the new tools coming online have taken the wrong approach.

They address immediate needs at the expense of a long-term fix, patching the hole when we need to rebuild the boat. That approach is short-sighted at best. Data overload and data spread across multiple locations are not problems that will slow down any time soon, and it has become evident that the industry needs a new strategy.

The problem with current approaches to managing external pressures

It used to be that deploying practices that reduce log volume for SIEM ingestion was considered niche. These days, it’s downright mainstream.

But even with widespread acceptance, trimming logs comes with considerable risk. By manipulating log volumes upstream, companies compromise their security posture and sacrifice the visibility that is crucial for protecting their data.

In other words, although it’s tempting to chase the associated short-term cost savings, trimming logs is not the answer.
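To make the trade-off concrete, here is a minimal sketch of the kind of upstream trimming such tools apply, using hypothetical field names; any field dropped at this stage is simply gone when an investigation later needs it.

```python
# Minimal sketch of upstream log trimming (hypothetical field names).
# Filtering like this cuts SIEM ingestion volume, but every field dropped
# here is unavailable later when an investigation needs it.

KEEP_FIELDS = {"timestamp", "src_ip", "dst_ip", "action"}  # assumption: a typical allow-list

def trim_event(event: dict) -> dict:
    """Keep only the allow-listed fields from a raw log event."""
    return {k: v for k, v in event.items() if k in KEEP_FIELDS}

raw = {
    "timestamp": "2024-05-01T12:00:00Z",
    "src_ip": "10.0.0.5",
    "dst_ip": "203.0.113.7",
    "action": "allow",
    "user_agent": "curl/8.0",     # dropped: useful for spotting attacker tooling
    "bytes_out": 48_213_990,      # dropped: useful for spotting exfiltration
}

print(trim_event(raw))
```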

Handling cloud-to-cloud complexity

In today’s computing environment, the companies that can avoid cloud complexity and the soaring costs associated with it are few and far between. A whopping 87 percent of businesses are using a multi-cloud approach, according to Flexera’s annual State of the Cloud report.

It’s no longer possible for most organizations to house all their data in one place. But moving large amounts of data from one cloud to another turns out to be quite costly. Aside from driving up management costs, companies need to consider the hidden cybersecurity costs associated with working across multiple clouds.

As our team collects logs from major public cloud infrastructure environments, we’re constantly running into hidden fees. It’s costly to send logs from Azure to an AWS-hosted SIEM, for instance. Similarly, companies may expect that data and log utilization in Microsoft Sentinel is covered by their Microsoft license; that’s not always the case. These are the sorts of expenses companies don’t account for in their budgets, which means they bleed money from other important buckets.
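Even a back-of-envelope estimate like the one below can surface these transfer costs before they hit the budget; the daily volume and per-GB rate are placeholder assumptions, not actual Azure or AWS pricing.

```python
# Back-of-envelope estimate of cross-cloud log transfer cost.
# The figures below are placeholder assumptions, not quoted Azure or AWS
# prices; substitute your provider's current egress pricing.

daily_log_volume_gb = 500          # assumption: logs shipped per day
egress_rate_per_gb = 0.08          # assumption: placeholder egress rate (USD/GB)

monthly_egress_cost = daily_log_volume_gb * 30 * egress_rate_per_gb
print(f"Estimated monthly egress cost: ${monthly_egress_cost:,.2f}")
```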


AI’s impact

Will artificial intelligence and machine learning play a role in the future of data ingestion? Surely they will, but the degree to which these technologies will impact security operations is only partially known.

In the meantime, companies may be tempted to point an AI or ML model at a broad swath of data and expect accurate, production-ready results. While experimentation is a natural part of AI/ML model development, production-ready systems require a great deal of care to avoid injecting invalid data into the process, model hallucination, and exposure of sensitive data, among other potential outcomes.

As we know, AI models are only as strong as the underlying data that trains them. Put bad data into the process and the results suffer accordingly (garbage in, garbage out).
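As an illustration, a simple quality gate in front of the pipeline can catch the worst offenders before they reach a model; the field names and the secret-detection pattern below are assumptions for the sketch, not a complete control.

```python
# Minimal sketch of a data-quality gate before logs reach an AI/ML pipeline
# (hypothetical field names). The goal is to keep invalid records and obvious
# sensitive values out of the training or inference path.

import re

REQUIRED_FIELDS = {"timestamp", "source", "event_type"}   # assumption
SECRET_PATTERN = re.compile(r"(?i)(password|api[_-]?key|secret)\s*[:=]")

def is_clean(record: dict) -> bool:
    """Reject records missing required fields or containing likely secrets."""
    if not REQUIRED_FIELDS.issubset(record):
        return False
    return not any(
        isinstance(v, str) and SECRET_PATTERN.search(v) for v in record.values()
    )

events = [
    {"timestamp": "2024-05-01T12:00:00Z", "source": "fw01", "event_type": "deny"},
    {"timestamp": "2024-05-01T12:01:00Z", "source": "app02",
     "event_type": "error", "message": "api_key=abc123"},   # filtered out
]

clean = [e for e in events if is_clean(e)]
print(f"{len(clean)} of {len(events)} events passed the quality gate")
```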

But even if companies can ensure that the data they’re feeding into AI models is perfectly clean, most AI and ML solutions also depend on having the “right” data in the proper format. In addition, when it comes to generative AI, the cost impact of processing and storing this data is not yet fully understood, so proceeding with diligent estimation and caution for production-grade systems is highly advised.
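A rough, assumption-laden estimate like the following can at least put an order of magnitude on generative AI processing costs; the token ratio and per-token price are placeholders, not any vendor’s published rates.

```python
# Rough sketch of estimating generative-AI processing cost for log data.
# All figures are placeholder assumptions; treat the output as an
# order-of-magnitude check, not a quote.

log_volume_gb_per_day = 100          # assumption: data sent for AI processing
tokens_per_gb = 250_000_000          # assumption: roughly 4 characters per token
price_per_million_tokens = 0.50      # assumption: placeholder rate (USD)

daily_tokens = log_volume_gb_per_day * tokens_per_gb
daily_cost = daily_tokens / 1_000_000 * price_per_million_tokens
print(f"Estimated daily processing cost: ${daily_cost:,.2f}")
```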

Fixing the deeper issue and rebuilding the boat

The bottom line is that today’s security operations centers can no longer hold all the data they need to secure the future. Data ingestion is having an outsized impact on cost decisions, yet companies that choose to reduce log volumes are only weakening their overall security posture.

Moving forward, it’s imperative that we as an industry embrace an open security data architecture. We need a security architecture that spans multiple data sets and locations while following one cohesive data organization standard. That allows security data to remain in its native cloud environment, reducing the cost burden of the current model of SIEM ingestion and storage. An open architecture will also reduce constraints and bottlenecks while making better use of currently deployed security tools. The resulting distributed data architecture will prepare our systems to reap the future opportunities of AI and ML.
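As a sketch of what that cohesive standard might look like in practice, the snippet below normalizes events from two clouds into one common shape while the underlying data stays where it was produced; the field names are illustrative rather than drawn from any specific schema standard.

```python
# Minimal sketch of normalizing logs from different clouds into one common
# shape (illustrative field names, not any particular standard's schema).
# Data can stay in its native cloud; only the representation is unified.

def normalize_aws(event: dict) -> dict:
    return {
        "time": event["eventTime"],
        "actor": event["userIdentity"]["arn"],
        "action": event["eventName"],
        "cloud": "aws",
    }

def normalize_azure(event: dict) -> dict:
    return {
        "time": event["time"],
        "actor": event["identity"],
        "action": event["operationName"],
        "cloud": "azure",
    }

aws_event = {"eventTime": "2024-05-01T12:00:00Z",
             "userIdentity": {"arn": "arn:aws:iam::111122223333:user/alice"},
             "eventName": "ConsoleLogin"}
azure_event = {"time": "2024-05-01T12:05:00Z",
               "identity": "alice@example.com",
               "operationName": "Microsoft.Compute/virtualMachines/start/action"}

unified = [normalize_aws(aws_event), normalize_azure(azure_event)]
for e in unified:
    print(e)
```

With events in one consistent shape, detections and queries can run across clouds without first hauling every log into a single store.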

Change is coming for security leaders. We’ve been in constant evolution for years, but the current environment calls instead for revolution. That means dropping patchwork strategies and embracing an open architecture—and with it, stepping into the future of holistic security operations.
