Kinetica Launches Quick Start for SQL-GPT

Customers can be up and running with Language to SQL on their enterprise data within one hour for free

Kinetica, the real-time database for analytics and generative AI, announced the availability of a Quick Start for deploying natural-language-to-SQL on enterprise data. The Quick Start is for organizations that want to run ad-hoc analysis on real-time, structured data using an LLM that accurately and securely converts natural language to SQL and returns quick, conversational answers. The offering makes it fast and easy to load structured data, optimize the SQL-GPT Large Language Model (LLM), and begin asking questions of the data in natural language. This announcement follows a series of GenAI innovations that began last May, when Kinetica became the first analytic database to incorporate natural-language-to-SQL.

Here is how it works:

  • First, sign up for the Kinetica Cloud Free edition;
  • Second, load files into Kinetica;
  • Third, create context for those tables to help the LLM associate words and business terminology with field and column names (a sketch of this step follows the list);
  • Finally, use the prompt to ask explicit questions and get near-instantaneous answers.
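The context step is where most of the setup effort goes. As a rough illustration of the idea only, the Python sketch below composes table and column descriptions into an LLM prompt so that business terms like "revenue" or "location" map onto actual column names. Kinetica's SQL-GPT manages context natively inside the database, so the table, columns, and helper function here are assumptions for illustration, not Kinetica's API.

```python
# Illustrative sketch only: the table name, column names, and prompt-assembly
# helper below are hypothetical. Kinetica's SQL-GPT handles context objects
# itself; this just shows what "creating context" accomplishes conceptually.

TABLE_CONTEXT = {
    "table": "retail_sales",                      # hypothetical table
    "description": "One row per point-of-sale transaction.",
    "columns": {
        "sale_ts":    "Timestamp of the transaction",
        "store_id":   "Identifier of the store ('location' in business terms)",
        "sku":        "Stock-keeping unit of the product sold",
        "amount_usd": "Sale amount in US dollars ('revenue' in business terms)",
    },
}

def build_prompt(question: str, ctx: dict) -> str:
    """Combine a natural-language question with table context so the LLM
    can map business terminology onto actual field and column names."""
    col_lines = "\n".join(f"  {name}: {desc}" for name, desc in ctx["columns"].items())
    return (
        f"Table {ctx['table']}: {ctx['description']}\n"
        f"Columns:\n{col_lines}\n\n"
        f"Write a SQL query that answers: {question}"
    )

print(build_prompt("What was total revenue by location last week?", TABLE_CONTEXT))
```

With context like this in place, a question phrased in everyday terms can be translated into SQL that references the right columns, which is what makes the final "ask explicit questions" step feel conversational.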

“We’re thrilled to introduce Kinetica’s groundbreaking Quick Start for SQL-GPT, enabling organizations to seamlessly harness the power of Language to SQL on their enterprise data in just one hour,” said Phil Darringer, VP of Product, Kinetica. “With our fine-tuned LLM tailored to each customer’s data and our commitment to guaranteed accuracy and speed, we’re revolutionizing enterprise data analytics with generative AI.”

The Kinetica database converts natural-language queries to SQL and returns answers within seconds, even for complex and previously unseen questions. Further, Kinetica converges multiple modes of analytics, such as time series, spatial, graph, and machine learning, broadening the types of questions that can be answered. What makes conversational query possible is Kinetica's use of native vectorization, which leverages NVIDIA GPUs and modern CPUs. NVIDIA GPUs are the compute paradigm behind every major AI breakthrough this century and are now extending into data management and ad-hoc analytics. In a vectorized query engine, data is stored in fixed-size blocks called vectors, and query operations are performed on these vectors in parallel rather than on individual data elements. This allows the query engine to process multiple data elements simultaneously, resulting in radically faster query execution on a smaller compute footprint.
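As a simplified analogy of what vectorized execution means in practice, and not a model of Kinetica's actual engine, the NumPy sketch below contrasts processing a column one element at a time with applying the same filter-and-sum to the whole block of values at once.

```python
# Simplified analogy of vectorized execution (not Kinetica's engine):
# apply a filter-and-aggregate to an entire block of values at once
# instead of looping over individual elements.
import numpy as np

rng = np.random.default_rng(0)
amounts = rng.uniform(1, 500, size=1_000_000)   # one column of a fact table

# Row-at-a-time style: one Python-level operation per element.
total_loop = 0.0
for v in amounts:
    if v > 100.0:
        total_loop += v

# Vectorized style: the comparison and the sum each run over the whole
# block in a few tight native loops, which SIMD CPUs and GPUs execute
# on many elements simultaneously.
mask = amounts > 100.0
total_vec = amounts[mask].sum()

assert abs(total_loop - total_vec) < 1e-6 * total_vec
```

The two approaches compute the same answer; the difference is that the vectorized form expresses the work as a handful of whole-block operations, which is the shape of computation that GPUs and modern CPU vector units accelerate.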
