Carl Perry, Head of Core Services at Snowflake, shares his perspective on data management, the intersection of AI and data, the challenges of scaling data operations, and more in this Q&A:
———-
Carl, your background spans cloud services, distributed systems, and fintech platforms. Share your transformative career journey over the last five years.
Before joining Snowflake almost four years back, I was a very happy Snowflake customer. I held various product leadership roles across the technology industry, from working on Square's developer platform, to Microsoft's Power BI, and AWS's S3, which enabled me to see firsthand the transformative power that technology can have on end-users. It also quickly taught me the importance of having the right technology at my fingertips to get the job done, and it was at Square where I experienced Snowflake's performance and ease of use firsthand. I deeply admired Snowflake's commitment to customer-centricity. Since day one, Snowflake has prided itself on an "it just works" mentality. Customers love the promise that when they integrate their enterprise infrastructure with Snowflake, they do so once, and a broad set of capabilities opens up for them.
Now at Snowflake, I oversee the Analytics team and the development of Unistore and Hybrid Tables, helping Snowflake's engineers and documentation team be more productive. Hybrid Tables in particular has been a crazy project that our team is extremely proud of: it enables customers to build and run transactional applications alongside their existing analytics workloads, all on Snowflake. Customers expressed a strong desire for Snowflake to serve as their single, unified system. That feedback inspired me and my team to develop Hybrid Tables and Unistore in order to make that vision a reality. The journey over the past few years has been challenging to say the least, but it's been fueled by our grit, determination, and commitment to supporting our customers.
Collaborating with the talented team at Snowflake to transform bold ideas into reality has been one of the most rewarding experiences of my career. That very first ambitious vision behind Hybrid Tables and Unistore was recently brought to life when we announced the general availability release just two months back. Seeing these products already solving real customer challenges has been a true full-circle moment. And this is just the beginning; we're driving even more innovation and progress from here.
Snowflake is revolutionizing data management with its AI Data Cloud. How do you see the intersection of AI and data becoming a game-changer for enterprises in the near future?
As our CEO Sridhar routinely says, there's no AI strategy without a data strategy. The evolution of the data platform is essential to the evolution of AI. Next year, we can expect major advancements that will help large language models better understand the meaning behind the data they work with through a semantic layer.
Currently, most data platforms lack this semantic layer, which provides context about what the data actually means. For example, with financial data in a table, it's usually up to developers or analysts to figure out where the data comes from, how it was calculated, and what it represents. Instead, this understanding should be built directly into the data platforms themselves. Depending on developers and analysts to manually add this context for every application is time-consuming and inefficient. By moving the semantic layer closer to the data, AI can more easily grasp the data's meaning, making analysis far more effective. This shift eliminates the need for users to redefine the same concepts repeatedly for each application, which will enable business users to speak in business terms and identify key insights to better drive their decisions and grow their business. Integrating the semantic layer into the data platform will be the next big step forward for enterprises.
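To make the idea concrete, here is a minimal, hypothetical sketch of a semantic layer expressed in Python. The metric name, table, and the resolve_metric helper are illustrative assumptions, not Snowflake's actual semantic model format; the point is only that the business meaning of a metric (definition, formula, source) lives alongside the data once, rather than being re-derived inside every application or prompt.

```python
# Hypothetical sketch of a semantic layer: business terms mapped to their
# meaning and physical location, so an AI assistant (or analyst) does not
# have to rediscover this context for every application.

from dataclasses import dataclass


@dataclass
class Metric:
    name: str          # business term users ask about
    description: str   # what the number actually means
    source_table: str  # where the underlying data lives
    expression: str    # how the value is calculated


# Illustrative entry only; real semantic models would cover many metrics,
# dimensions, and relationships.
SEMANTIC_MODEL = {
    "net_revenue": Metric(
        name="net_revenue",
        description="Gross sales minus refunds and discounts, in USD.",
        source_table="finance.orders",
        expression="SUM(gross_amount - refunds - discounts)",
    ),
}


def resolve_metric(term: str) -> str:
    """Translate a business term into a concrete, shared SQL definition."""
    metric = SEMANTIC_MODEL[term]
    return f"SELECT {metric.expression} AS {metric.name} FROM {metric.source_table}"


if __name__ == "__main__":
    # A question like "what is our net revenue?" resolves to one shared
    # definition instead of being redefined per application.
    print(resolve_metric("net_revenue"))
```

Because the definition is stored next to the data, a language model answering "what was net revenue last quarter?" can consult the same entry every application uses, rather than guessing at the calculation.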
With your experience in building large-scale platforms and cloud services, what challenges do you see organizations facing today in scaling their data operations, and how can they overcome them?
Organizations face several challenges in scaling their data operations, including fragmented data silos that hinder a unified view and growing complexity around data security and compliance. Legacy systems often lack the scalability needed to handle the increasing volume and variety of data, while a shortage of skilled professionals creates bottlenecks in optimizing data pipelines. Additionally, the increased demand to support the integration of AI and machine learning adds layers of complexity, making operational efficiency difficult to achieve.
Balancing these needs with cost-effective solutions requires a shift to modern, unified data platforms like Snowflake to streamline operations and drive scalability. Snowflake was built to be extremely easy to use for organizations of all sizes, across all industries. Compared to other vendors that require painstaking configuration of knobs, we help customers save time by automating platform management, maintenance, upgrades, and even performance improvements, all without downtime. Snowflake eliminates complexity, ultimately reducing customer risk. This enables customers to save valuable time and money on administrative tasks so that they can spend it on what matters most: launching projects and products faster for end users. And by integrating advanced AI and ML tools directly into the platform, Snowflake empowers organizations to drive the latest in AI innovation at scale without compromising security, governance, or operational efficiency.
Weโd love to hear about how Snowflakeโs platform addresses the common problem of uniting siloed data for enterprises across industries.
Data silos remain a significant challenge for enterprises, as fragmented data across departments and systems hinders collaboration, slows decision-making, and prevents organizations from fully leveraging their dataโs potential. Breaking down these silos is critical for achieving a unified, data-driven approach to business operations and innovation.
The Snowflake platform eliminates silos across different data governance models, data types, and data sharing ecosystems, improving price-performance and making it an ideal choice for organizations with unique compliance requirements and needs. Through the Snowflake Horizon Catalog, customers get a single data governance model with comprehensive compliance, security, privacy, and collaboration controls that are universally enforced to protect PII. Snowflake also supports all types of use cases and data types in one place, with near-infinite scale. This includes support for structured data, high-volume semi-structured XML or JSON files, and unstructured data like PDFs or images.
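As a rough illustration of what "structured and semi-structured data in one place" can look like in practice, the sketch below uses the snowflake-connector-python package to run an ordinary relational query and a JSON query over the same connection. The connection parameters and the orders and events tables (with a VARIANT column named payload) are placeholders for this example, not objects from any real customer environment.

```python
# Sketch: querying structured and semi-structured data over one connection
# using snowflake-connector-python. All identifiers below are placeholders.

import snowflake.connector

conn = snowflake.connector.connect(
    account="<account_identifier>",
    user="<user>",
    password="<password>",
    warehouse="<warehouse>",
    database="<database>",
    schema="<schema>",
)

try:
    cur = conn.cursor()

    # Structured data: a plain relational query against a hypothetical table.
    cur.execute("SELECT order_id, amount FROM orders LIMIT 5")
    print(cur.fetchall())

    # Semi-structured data: JSON stored in a VARIANT column, addressed with
    # Snowflake's path syntax instead of a separate document store.
    cur.execute(
        "SELECT payload:customer.name::string AS customer "
        "FROM events LIMIT 5"
    )
    print(cur.fetchall())
finally:
    conn.close()
```

The design point is that both query shapes run through the same governed platform and connection, rather than forcing teams to stand up a separate system for each data type.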
Ally Financial, for example, a digital financial-services company with over 11,000 employees, utilizes Snowflake as its unified and well-governed data foundation, ensuring compliance, security, privacy, interoperability, and access. Eliminating data silos has enhanced insights across the organization and helped foster collaboration with customers and partners.
How do you foresee the role of AI in improving enterprise data management and decision-making processes?
2024 was the year enterprises raced to adopt AI, integrating it into their workflows and experimenting with its potential. However, 2025 is poised to be the year where AI transitions from hype to utility, proving its value as a transformative force that drives real, measurable impact. This shift will be particularly evident in enterprise data management and decision-making, as organizations move beyond pilot projects and embrace AI to unlock entirely new possibilities. From automating data preparation and enhancing predictive analytics to bridging the gap between structured and unstructured data, AI will streamline operations and deliver actionable insights at unprecedented speed and scale. In this phase of utility, enterprises can expect to see AI move from a supporting role to becoming a core driver of innovation, helping businesses optimize processes, improve customer experiences, and uncover hidden opportunities.
I believe AI is also poised to revolutionize data management by fundamentally reshaping the role of data scientists, transforming them into sophisticated AI analysts. Far from simply accelerating insights and decision-making, AI is unlocking advanced capabilities like natural language search for seamless data discovery and enabling data scientists to streamline workflows by automating repetitive tasks, uncovering hidden patterns, generating synthetic data, and performing sentiment analysis. This evolution marks a shift from data scientists being mere data processors to becoming architects of AI-driven transformation, ushering in a new era of innovation and data-driven decision-making.
Finally, for tech leaders looking to innovate with data, what advice would you give on building a future-proof data architecture that can adapt to the rapid pace of technological change?
For organizations looking to future-proof their data architecture in today's rapidly evolving technological landscape, it's essential to focus on four critical pillars: enterprise-grade scalability, security and governance embedded at the foundational level, seamless data interoperability, and a deep understanding of customer needs.
Enterprise-grade scalability ensures that your architecture can not only handle today's workloads, but also grow with the demands of your organization. A scalable system must accommodate increasing volumes of data, higher user demands, and the integration of new technologies without compromising performance or efficiency. Scalability isn't just about managing growth; it's about staying agile and responsive in a competitive environment.
Equally important is embedding security and governance into the foundation of your data architecture. Protecting sensitive information, ensuring compliance with evolving regulations, and maintaining trust require more than reactive measures. Organizations must design systems with proactive governance policies and robust security frameworks that mitigate risks while enabling data to be a trusted asset across the business.
Additionally, data interoperability is crucial for breaking down silos and enabling seamless collaboration across teams, systems, and platforms. An interoperable architecture facilitates the exchange of data across diverse sources, ensuring that your teams can access and use information where and when it's needed. This not only drives operational efficiency but also empowers innovation by providing a holistic view of the organization's data ecosystem.
Finally, understanding customer needs is foundational to a future-proof data strategy. It's imperative that organizations design an architecture that not only scales with their business, but also adapts to their customers' evolving needs to ensure their data platforms remain relevant and valuable.
Scalability, security and governance, interoperability, and customer-centricity form the foundations of a future-proof data architecture. By embracing these pillars, the next generation of data tech innovators has the opportunity to shape systems that not only drive growth and innovation, but also redefine how we connect, collaborate, and create value in a data-driven world. The future is yours to build โ start with a strong foundation and let your vision lead the way.
Carl Perry is the Head of Core Services at Snowflake, where he oversees the service team, cost governance features, data privacy, and Hybrid Tables development. As a past Snowflake customer, Carl has experienced and is passionate about the value of Snowflake's performance and ease of use.
Snowflake delivers the AI Data Cloud, a global network where thousands of organizations mobilize data with near-unlimited scale, concurrency, and performance. Inside the AI Data Cloud, organizations unite their siloed data, easily discover and securely share governed data, and execute diverse analytic workloads.

