Redis Labs adds machine learning module to its open source project Redis: Redis-ML



Redis Labs on Tuesday introduced Redis-ML, the Redis module for machine learning, to its open source project Redis. In combination with Spark Machine Learning (Spark ML), the module accelerates the delivery of real-time recommendations and predictions for interactive apps.

Machine learning is fast becoming a critical requirement for modern smart applications. Redis-ML accelerates the delivery of real-time predictive analytics for use cases such as fraud detection and risk evaluation in financial products, product or content recommendations for e-commerce applications, demand forecasting for manufacturing applications, or sentiment analysis of customer engagements.

Spark ML (previously MLlib) delivers proven machine learning libraries for classification and regression tasks. Combined with Redis-ML, applications can now deliver precise, reusable machine learning models faster and with lower execution latencies.

Redis-ML enriches Spark ML to deliver faster prediction generation: storing and serving trained Spark ML models directly from Redis parallelizes access to the models and significantly improves performance. Initial benchmarks showed a five- to ten-fold latency improvement over the standard Spark solution for real-time classifications.

Redis-ML avoids the need to load the model from file systems or other disk-based data stores, a process that usually involves long serialization/deserialization overheads and slow disk accesses. With Redis-ML, at the end of the training phase the model is simply stored in its native format in Redis.
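The cost being avoided can be sketched in Python. The tree layout and `predict` helper below are illustrative only, not the actual Redis-ML storage format: the point is that the disk-based path round-trips the model through a serializer on every cold start, while the in-memory path evaluates the already-resident native structure directly.

```python
import pickle

# Hypothetical trained model: a tiny decision tree as nested dicts.
model = {"feature": 0, "threshold": 0.5,
         "left": {"leaf": 1}, "right": {"leaf": 0}}

def predict(tree, sample):
    """Walk the tree until a leaf node is reached."""
    while "leaf" not in tree:
        branch = "left" if sample[tree["feature"]] <= tree["threshold"] else "right"
        tree = tree[branch]
    return tree["leaf"]

# Disk-style path: serialize, persist, read back, deserialize before serving.
blob = pickle.dumps(model)
restored = pickle.loads(blob)

# In-memory path: the model is already resident in its native form,
# so serving skips the serialize/deserialize round-trip entirely.
assert predict(model, [0.3]) == predict(restored, [0.3])
```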

As user traffic grows, it is important to guarantee real-time recommendations and predictions at a consistent speed to the end user. With Redis-ML, recommendations and predictions are delivered at consistent speed no matter how many concurrent users are accessing the model.

Redis-ML provides great interoperability across languages, including Scala, Node, .Net, Python and more. With Redis-ML, models are no longer restricted to the language they were developed in; applications written in different languages can access the same models concurrently through a simple API.
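The reason this works is that the model lives in Redis as data rather than as a language-specific object. A minimal sketch of the idea, using a hypothetical flat string encoding (not the real Redis-ML wire format) that any Redis client in any language could parse and evaluate:

```python
# Hypothetical language-neutral encoding of a decision tree, one entry per
# node: "node_id:fN:threshold" for a split on feature N, "node_id:leaf:value"
# for a leaf. Any client (Scala, Node, .NET, Python, ...) could read this.
encoded = ["0:f0:0.5", "0l:leaf:1", "0r:leaf:0"]

def parse(lines):
    """Build a node table from the flat encoding."""
    nodes = {}
    for line in lines:
        node_id, kind, value = line.split(":")
        nodes[node_id] = (kind, float(value))
    return nodes

def run(nodes, sample, node_id="0"):
    """Evaluate the tree for one sample, descending from the root."""
    kind, value = nodes[node_id]
    if kind == "leaf":
        return value
    feature = int(kind[1:])                      # "f0" -> feature index 0
    suffix = "l" if sample[feature] <= value else "r"
    return run(nodes, sample, node_id + suffix)

tree = parse(encoded)
```

Because the encoding is plain text keyed by node id, a model trained from Scala via Spark ML could be served to a Node or Python application without either side sharing object code.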

Delivering predictions with better precision also requires larger machine learning models. Existing solutions cannot hold a model in memory once it grows beyond the memory available on a single node; performance immediately drops as the model is serialized to disk. The Redis-ML module takes full advantage of Redis Labs' in-memory distributed architecture to scale the database to any size needed, in a fully automated manner and without affecting performance.
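One way to spread a model that exceeds a single node's memory is to partition its parts (for example, the individual trees of a forest) across shards by key. The sketch below is an assumption for illustration: real Redis Cluster placement uses CRC16 hash slots, whereas this uses a simple modulo over a stable hash.

```python
import hashlib

NUM_SHARDS = 3  # illustrative cluster size

def shard_for(key: str) -> int:
    """Map a key to a shard deterministically via a stable hash."""
    digest = hashlib.sha1(key.encode()).digest()
    return int.from_bytes(digest[:4], "big") % NUM_SHARDS

# Hypothetical keys: one per tree in a 100-tree forest model.
keys = [f"model:forest:tree:{i}" for i in range(100)]

placement = {}
for key in keys:
    placement.setdefault(shard_for(key), []).append(key)

# Every tree lands on exactly one shard, so the whole forest fits in the
# cluster's aggregate memory even if no single node could hold it.
```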

Once the models are ready, Redis-ML makes it easy to obtain recommendations or predictions in the application using simple APIs, without having to implement custom recommendation/prediction generation code or set up a highly available and scalable infrastructure to support it.

Training new models can be done offline; reliably delivering real-time predictive intelligence, however, is critical for modern applications. The Redis-ML module, deployed with Redis Labs' technology, delivers always-on availability that protects against process, node, rack or data center failures with instant automatic detection and failover.

“The combination of Apache Spark and Redis simplifies and accelerates the implementation of predictive intelligence in modern applications,” said Ram Sriharsha, product manager for Apache Spark at Databricks. “This latest release from Redis Labs is a great example of Spark’s growth and maturity in enterprise machine learning applications.”

“The Redis-ML module with Apache Spark, delivers lightning fast classifications with larger data sizes, in real-time and under heavy load, while allowing many applications developed in different languages to simultaneously utilize the same models,” states Dvir Volk, senior architect at Redis Labs. “The Redis-ML module is a great demonstration of the power of Redis Modules API in supporting the cutting-edge needs of next generation applications.”


WWPI – Covering the best in IT since 1980