"The combination of Apache Spark and Redis simplifies and accelerates the implementation of predictive intelligence in modern applications," said Ram Sriharsha, product manager for Apache Spark at Databricks. "This latest release from Redis Labs is a great example of Spark's growth and maturity in enterprise machine learning applications."

"The Redis-ML module with Apache Spark delivers lightning-fast classifications on larger data sizes, in real time and under heavy load, while allowing many applications developed in different languages to simultaneously utilize the same models," said Dvir Volk, senior architect at Redis Labs. "The Redis-ML module is a great demonstration of the power of the Redis Modules API in supporting the cutting-edge needs of next-generation applications."

Redis-ML avoids the need to load the model from file systems or other disk-based data stores, a process that usually incurs long serialization/deserialization overheads and slow disk accesses. With Redis-ML, the model is simply stored in its native format in Redis at the end of the training phase.
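To illustrate the idea of a "native format" model, the sketch below flattens a trained decision tree into Redis-hash-style key/value fields and evaluates it in place, with no serialization or deserialization step. This is a hypothetical illustration: the key layout, the `myforest` name, and the plain dict standing in for Redis are assumptions, not the module's actual storage scheme.

```python
# Hypothetical sketch: a trained decision tree kept in a flat,
# Redis-hash-like layout so it can be evaluated in place.
# A plain dict stands in for the Redis server here.

# Each node lives under a path key ("." = root, then "l"/"r" per branch).
model = {
    "myforest:0:.":  ("split", "age", 40.0),  # internal node: feature, threshold
    "myforest:0:.l": ("leaf", 0, None),       # left child: predict class 0
    "myforest:0:.r": ("leaf", 1, None),       # right child: predict class 1
}

def run_tree(store, prefix, features):
    """Walk the flattened tree directly from the key/value store."""
    path = "."
    while True:
        kind, a, b = store[f"{prefix}:{path}"]
        if kind == "leaf":
            return a
        # descend left if the feature value is below the split threshold
        path += "l" if features[a] < b else "r"

print(run_tree(model, "myforest:0", {"age": 25}))  # → 0
print(run_tree(model, "myforest:0", {"age": 55}))  # → 1
```

Because the tree already lives in the store in evaluable form, classification is a sequence of key lookups rather than a load-parse-evaluate cycle.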

Consistent Prediction Delivery: As user traffic grows, it is important to deliver real-time recommendations and predictions to end users at a consistent speed. With Redis-ML, recommendations and predictions are served at that consistent speed no matter how many concurrent users are accessing the model.

Greater Interoperability: Redis-ML provides interoperability across languages, including Scala, Node.js, .NET, Python and more. With Redis-ML, models are no longer restricted to the language they were developed in; applications written in different languages can access them concurrently through the same simple API.

Scaling Machine Learning Models: Delivering predictions with better precision requires larger machine learning models. Existing solutions cannot hold a model in memory once it grows beyond the memory available on a single node, forcing serialization/deserialization to disk and degrading performance. The Redis-ML module takes full advantage of Redis Labs' in-memory distributed architecture to scale the database to any size needed, in a fully automated manner, without affecting performance.
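One way a forest-style model can grow beyond a single node is to spread its trees across shards and merge their votes at query time. The sketch below is a minimal, hypothetical illustration of that idea; the modulo-style placement and the dicts standing in for shards are assumptions (a real Redis cluster places keys by hash slot).

```python
from collections import Counter

# Hypothetical sketch: a random forest too large for one node is split
# so each shard holds a subset of the trees. Each shard evaluates its
# trees locally; the client (or proxy) merges the results with a
# majority vote. Plain dicts stand in for the Redis shards.

shards = [
    {"tree:0": lambda x: 1, "tree:1": lambda x: 0},  # shard A: two trees
    {"tree:2": lambda x: 1},                          # shard B: one tree
]

def forest_classify(shards, features):
    """Gather one vote per tree across all shards, return the majority."""
    votes = [tree(features) for shard in shards for tree in shard.values()]
    return Counter(votes).most_common(1)[0][0]

print(forest_classify(shards, {"age": 30}))  # → 1 (votes: 1, 0, 1)
```

Because each shard only evaluates the trees it holds, adding shards grows the model's total capacity without any single node having to fit the whole forest.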

Simplified Deployment: Once the models are ready, Redis-ML makes it easy for the application to obtain recommendations or predictions through simple APIs, without having to implement custom recommendation/prediction generation code or set up a highly available and scalable infrastructure to support it.
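As a rough picture of what "simple API" can mean here, the sketch below builds the argument list an application might send to a command-style model-serving interface. The `ML.FOREST.RUN` command name and the `feature:value` sample format follow the Redis-ML module's published examples, but the exact syntax should be treated as an assumption; with a live server and a Redis client, the built arguments would be passed to a generic command call (e.g. `execute_command(*args)` in redis-py).

```python
# Hypothetical sketch of querying a model through a command-style API.
# The command name (ML.FOREST.RUN) and the "name:value" sample format
# are assumptions modeled on the Redis-ML module's examples.

def forest_run_args(key, features):
    """Build the argument list an application would send to the server."""
    sample = ",".join(f"{name}:{value}" for name, value in features.items())
    return ["ML.FOREST.RUN", key, sample]

args = forest_run_args("myforest", {"age": 35, "salary": 72000})
print(args)  # → ['ML.FOREST.RUN', 'myforest', 'age:35,salary:72000']
```

The point of the sketch is that the application's side of prediction serving reduces to a single command per query, rather than custom scoring code plus the infrastructure to host it.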

Create additional modules to solve modern data challenges at the Redis Module Global Hackathon, registration for which is now open, with submissions concluding on Nov. 12th. The event is expected to bring together over 500 teams from around the world online and in the associated onsite hackathons in San Francisco and Tel Aviv. Participants will be eligible to win up to a total of $10,000 in cash prizes. Grand prize winners will be announced on Nov. 17th.

About Redis Labs
Redis Labs is the open source home and provider of enterprise-class Redis, an in-memory NoSQL database platform benchmarked as the world's fastest. Thousands of customers rely on Redis Labs' high performance, seamless scalability, true high availability, versatility and best-in-class expertise to power their cutting-edge applications. Redis Labs' software and database-as-a-service solutions enhance popular Redis use cases such as real-time analytics, fast high-volume transactions, in-app social functionality, and application job management, queuing and caching.