CIO Influence

Databricks Unveils Lakeflow Designer for Data Analysts to Build Reliable Pipelines Without Coding



Databricks, the Data and AI company, today announced the upcoming Preview of Lakeflow Designer. This new no-code ETL capability lets non-technical users author production data pipelines using a visual drag-and-drop interface and a natural language GenAI assistant. Lakeflow Designer is backed by Lakeflow, the unified solution for data engineers to build reliable data pipelines faster with all business-critical data, which is now Generally Available.


Traditionally, enterprises have faced a tradeoff: either let analysts create pipelines with no-code/low-code tools while sacrificing governance, scalability, and reliability, or rely on technical data engineering teams to code production-ready pipelines, even though those teams are overloaded and their backlogs are long. In the end, most enterprises adopt a combination of both approaches, resulting in complex environments to manage and maintain. What data-driven enterprises really want is the best of both worlds: no-code pipelines with governance, scalability, and reliability.

“There’s a lot of pressure for organizations to scale their AI efforts. Getting high-quality data to the right places accelerates the path to building intelligent applications,” said Ali Ghodsi, Co-founder and CEO at Databricks. “Lakeflow Designer makes it possible for more people in an organization to create production pipelines so teams can move from idea to impact faster.”

Lakeflow Designer: AI-Native Drag-and-Drop Data Prep for the Business Analyst
The new Lakeflow Designer empowers business analysts to build no-code ETL pipelines with natural language and a drag-and-drop UI that provides the same scalability, governance, and maintainability as those built by data engineers. Backed by Lakeflow, Unity Catalog, and Databricks Assistant, Lakeflow Designer eliminates the divide between code and no-code tools. With this new approach, non-technical users gain the speed and flexibility they require to solve business problems without burdening data engineers with maintenance issues and governance headaches.

Additional Lakeflow Capabilities Launching

  • Lakeflow Enters GA: Today, Lakeflow became generally available, providing a unified data engineering solution from ingestion to transformation and orchestration. Notably, the new Lakeflow Declarative Pipelines capabilities allow data engineers to build end-to-end production pipelines in SQL or Python without having to manage infrastructure.
  • New IDE for Data Engineering: Lakeflow is debuting a brand-new development experience that speeds up data pipeline development with AI-assisted coding, debugging, and validation in an integrated UI.
  • New Ingestion Connectors: New point-and-click ingestion connectors for Lakeflow Connect are launching for Google Analytics, ServiceNow, SQL Server, SharePoint, PostgreSQL, and SFTP, joining connectors for Salesforce Platform and Workday Reports that are already available.
  • Direct Write to Unity Catalog with Zerobus: Zerobus enables developers to write high volumes of event data with near real-time latency to their lakehouse without the need to manage extra infrastructure like a message bus. This streamlined, serverless infrastructure provides performance at scale for IoT events, clickstream data, telemetry and other event-driven use cases.
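To make the Declarative Pipelines announcement above concrete, here is a minimal SQL sketch in the declarative style: the pipeline is defined as tables and views over a source, and the platform handles the infrastructure. The table names and storage path are hypothetical, and exact syntax may vary by Databricks release:

```sql
-- Incrementally ingest raw JSON files into a streaming table.
-- (Path and table names are illustrative only.)
CREATE OR REFRESH STREAMING TABLE raw_orders
AS SELECT *
FROM STREAM read_files(
  '/Volumes/main/default/orders_landing',
  format => 'json'
);

-- Declare a downstream aggregate; the pipeline engine tracks the
-- dependency on raw_orders and keeps the view up to date.
CREATE OR REFRESH MATERIALIZED VIEW daily_order_counts
AS SELECT order_date, COUNT(*) AS order_count
FROM raw_orders
GROUP BY order_date;
```

The point of the declarative approach is that the engineer states what the tables should contain, while orchestration, incremental processing, and cluster management are handled by the platform rather than hand-written job code.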


Customer Momentum
“The new editor brings everything into one place: code, pipeline graph, results, configuration, and troubleshooting. No more juggling browser tabs or losing context. Development feels more focused and efficient. I can directly see the impact of each code change. One click takes me to the exact error line, which makes debugging faster. Everything connects: code to data, code to tables, tables to code. Switching between pipelines is easy, and features like auto-configured utility folders remove complexity. This feels like the way pipeline development should work.” – Chris Sharratt, Data Engineer, Rolls-Royce

“Using the Salesforce connector from Lakeflow Connect helps us close a critical gap for Porsche from the business side on ease of use and price. On the customer side, we’re able to create a completely new customer experience that strengthens the bond between Porsche and the customer with a unified and not fragmented customer journey,” said Lucas Salzburger, Project Manager, Porsche Holding Salzburg.

“Joby is able to use our manufacturing agents with Lakeflow Connect Zerobus to push gigabytes a minute of telemetry data directly to our lakehouse, accelerating the time to insights, all with Databricks Lakeflow and the Data Intelligence Platform.” – Dominik Müller, Factory Systems Lead, Joby Aviation

