Over 40% of IT Decision Makers Believe Their Current Data Architectures Won’t Meet Future Model Inferencing Challenges
New market research identifies demand for real-time model training and inferencing; Highlights major challenges with accuracy, latency, and reliability in current architectures

As companies look to expand their use of artificial intelligence (AI) and machine learning (ML) to keep up with customer demands, they are facing hurdles getting these projects into production and ultimately delivering the desired results to their bottom line. In fact, 88% of AI/ML decision-makers expect the use cases that require these technologies to increase in the next one to two years, according to a commissioned study conducted by Forrester Consulting on behalf of Redis Labs. The research examined the challenges that keep decision-makers from achieving their desired transformation when deploying ML to create AI applications.

The study finds that companies are developing more and more models based on real-time data; however, IT decision-makers believe current data architectures won’t meet their future model inferencing requirements.

The study revealed that companies are developing a growing number of models based on real-time data. Still, more than 40% of respondents believe their current data architectures won’t meet their future model inferencing requirements. Most decision-makers (64%) say their firms are developing between 20% and 39% of their models on real-time data from data streams and connected devices. As teams develop more models on real-time data, the need for accuracy and scalability becomes increasingly critical. Notably, 38% of leaders are developing roughly a third of their models on real-time data.

Other key findings include:

  • Nearly half of decision-makers cite reliability (48%) and performance (44%) as their top challenges for getting models deployed with their current databases. Equally concerning was the revelation that 41% of respondents believe their databases cannot meet the necessary data security and compliance requirements.
  • To achieve the benefits that AI/ML promise, survey respondents said that locating models in an in-memory database would remove key hurdles currently standing in their way. According to the survey, such an architecture would allow firms to prepare data more efficiently (49%), improve analytics efficiency (46%), and keep data safer (46%).

As Forrester Consulting concludes, “AI powered by ML models mustn’t slow down applications by necessitating a network hop to a service and/or microservice for an application to use an ML model and/or get reference data. Most applications, especially transactional applications, can’t afford those precious milliseconds while meeting service-level agreements (SLAs).”

“Companies are embracing AI/ML to deliver more value for their mission-critical applications, yet need a modern AI/ML infrastructure to support real-time serving and continuous training. There are still gaps that impede companies from making existing applications smarter and delivering new applications,” said Taimur Rashid, Chief Business Development Officer at Redis Labs. “Customers realize this, and the simplicity and versatility of Redis as an in-memory database is enabling them to implement Redis as an online feature store and inferencing engine for low-latency and real-time serving.”
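For illustration, the pattern Rashid describes, Redis acting as an online feature store that serves precomputed features at request time, can be sketched with redis-py: an offline pipeline writes features into a per-entity hash, and the online path reads only the fields a model needs in a single round trip. The key layout, feature names, and connection details below are hypothetical, not Redis Labs’ actual design.

```python
# Minimal sketch of an online feature store on Redis, assuming redis-py.
# Key layout, feature names, and the model object are illustrative only.
import json

import redis

r = redis.Redis(host="localhost", port=6379, decode_responses=True)

def write_features(entity_id, features):
    """Offline pipeline: materialize precomputed features under a per-entity hash."""
    r.hset(f"features:{entity_id}", mapping={k: json.dumps(v) for k, v in features.items()})

def read_features(entity_id, names):
    """Online path: fetch only the fields the model needs in one HMGET round trip."""
    raw = r.hmget(f"features:{entity_id}", names)
    return [json.loads(v) if v is not None else None for v in raw]

write_features("user:42", {"avg_basket_value": 37.5, "sessions_7d": 12, "is_premium": 1})
feature_vector = read_features("user:42", ["avg_basket_value", "sessions_7d", "is_premium"])
# score = model.predict([feature_vector])  # model loading and serving are out of scope here
```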

“Fabric was established to help brands migrate from legacy to modern, digital commerce systems,” said Umer Sadiq, CTO of Fabric. “In order to offer businesses the best-in-class technologies that enhance and improve customer experiences, we have crafted and continue to deliver applications that rely on Redis Labs’ real-time data platform hosted on AWS to ensure real-time feature serving to customers, thus maintaining exceptional user satisfaction. Additionally, by combining the power of Amazon SageMaker and Redis Enterprise to bolster the efficiency of our market-leading recommender systems, we guarantee low-latency and high reliability for each individual customer interaction.”
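A rough sketch of the request path Sadiq describes might look like the following: features are read from Redis with a single in-memory lookup and then handed to an already-deployed SageMaker endpoint for scoring via boto3. The endpoint name, key layout, and payload schema are hypothetical; Fabric’s actual integration is not detailed here.

```python
# Sketch of a recommendation request path combining a Redis feature lookup with a
# SageMaker inference endpoint. Endpoint name, keys, and payload schema are hypothetical.
import json

import boto3
import redis

r = redis.Redis(host="localhost", port=6379, decode_responses=True)
sm = boto3.client("sagemaker-runtime")

def recommend(user_id, top_k=5):
    # Low-latency feature read from the in-memory store.
    features = r.hgetall(f"features:user:{user_id}")

    # Hand the features to the deployed recommender model for scoring.
    response = sm.invoke_endpoint(
        EndpointName="recommender-prod",  # hypothetical endpoint name
        ContentType="application/json",
        Body=json.dumps({"user_id": user_id, "features": features, "top_k": top_k}),
    )
    return json.loads(response["Body"].read())
```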

“The Room’s mission is to connect top talent from around the world to meaningful opportunities, and at the core of the technology challenge is a mathematically difficult entity-matching problem,” said Peter Swaniker, CTO of The Room. “To address this complexity, we have architected a joint solution using Scribble Data’s Enrich Feature Store and Redis Labs’ real-time data platform to provide the overall framework for The Room’s Intelligence Platform, which is responsible for entity matching. Using Redis’ high-performance key retrieval based on nearest neighbor vector lookup, the team was able to achieve a 15x+ improvement in the core similarity computation loop without any memory overhead.”
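The nearest-neighbor lookup Swaniker mentions can be illustrated in simplified form: entity vectors are stored in Redis as raw float32 bytes and compared to a query vector by cosine similarity on the client. This is a stand-in for the general pattern only, not The Room’s production pipeline or a Redis vector index.

```python
# Illustrative nearest-neighbor lookup over vectors kept in Redis. Vectors are stored
# as float32 bytes per entity; similarity is computed client-side with NumPy.
import numpy as np
import redis

r = redis.Redis(host="localhost", port=6379)

def put_vector(entity_id, vec):
    r.set(f"vec:{entity_id}", np.asarray(vec, dtype=np.float32).tobytes())

def nearest(query, candidate_ids, top_k=5):
    # Pipeline the reads so all candidate vectors arrive in one round trip.
    pipe = r.pipeline()
    for cid in candidate_ids:
        pipe.get(f"vec:{cid}")
    blobs = pipe.execute()

    ids, rows = [], []
    for cid, blob in zip(candidate_ids, blobs):
        if blob is not None:
            ids.append(cid)
            rows.append(np.frombuffer(blob, dtype=np.float32))
    mat = np.vstack(rows)

    # Cosine similarity between the query and every candidate, highest first.
    query = np.asarray(query, dtype=np.float32)
    sims = mat @ query / (np.linalg.norm(mat, axis=1) * np.linalg.norm(query) + 1e-9)
    order = np.argsort(-sims)[:top_k]
    return [(ids[i], float(sims[i])) for i in order]

put_vector("cand:1", np.random.rand(128))
put_vector("cand:2", np.random.rand(128))
print(nearest(np.random.rand(128), ["cand:1", "cand:2"], top_k=1))
```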

“The use of machine learning (ML) algorithms in simulations continues to grow to improve scientific research with efficiency and accuracy,” said Benjamin Robbins, Director AI & Advanced Productivity, Hewlett Packard Enterprise. “By leveraging Redis and RedisAI in SmartSim, our new open source AI framework which advances simulations that run on supercomputers, users can exchange data between existing simulations and an in-memory database, while the simulation is running. The ease of data exchange helps unlock new machine learning opportunities, such as online inference, online learning, online analysis, reinforcement learning, computational steering, and interactive visualization that can further improve accuracy in simulations and accelerate scientific discovery.”
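The data-exchange pattern Robbins describes can be shown in miniature: a running simulation publishes its latest state into the in-memory database, and a separate analysis or inference process reads it back while the simulation continues. The sketch below uses plain redis-py and NumPy for brevity rather than SmartSim’s own clients or RedisAI commands; key names and array shapes are illustrative.

```python
# Miniature version of the simulation-to-database exchange described above, using
# plain redis-py and NumPy. SmartSim and RedisAI provide richer clients for this pattern.
import numpy as np
import redis

r = redis.Redis(host="localhost", port=6379)

# Simulation side: push the latest field snapshot each timestep.
def publish_state(step, field):
    r.set(f"sim:field:{step}", np.asarray(field, dtype=np.float32).tobytes())
    r.set("sim:latest_step", step)

# Analysis / inference side: poll for the newest snapshot while the simulation runs.
def fetch_latest(shape):
    step = int(r.get("sim:latest_step"))
    field = np.frombuffer(r.get(f"sim:field:{step}"), dtype=np.float32).reshape(shape)
    return step, field

publish_state(0, np.random.rand(64, 64))
step, field = fetch_latest((64, 64))
```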
