Session Details

Deep Learning in Mission-Critical and Scalable Applications with Open Source Frameworks

Intelligent real-time applications are a game changer in any industry. This session discusses how traditional companies can leverage deep learning in real-time applications. The first part introduces deep learning with open source frameworks such as TensorFlow and DeepLearning4J. The second part shows how to deploy the trained analytic models to mission-critical real-time applications, leveraging Apache Kafka as a streaming platform and the Kafka Streams API to embed the intelligent business logic into any external application.

Extended Abstract

Intelligent real-time applications are a game changer in any industry, and deep learning is one of the hottest buzzwords in this area. New technologies like GPUs, combined with elastic cloud infrastructure, enable the sophisticated use of artificial neural networks to add business value in real-world scenarios. Tech giants use them, for example, for image recognition and speech translation. This session discusses real-world scenarios from different industries to explain when and how traditional companies can leverage deep learning in real-time applications.

The session shows how to deploy deep learning models into real-time applications to make predictions on new events, using Apache Kafka to execute analytic models in a highly scalable and performant way.

The first part introduces the use cases and concepts behind deep learning. It discusses how to build Convolutional Neural Networks (CNNs), Recurrent Neural Networks (RNNs) and Autoencoders, leveraging open source frameworks like TensorFlow, DeepLearning4J or H2O.ai.

The second part shows how to deploy the built analytic models to scalable, mission-critical real-time applications, leveraging Apache Kafka as a streaming platform and the Kafka Streams API to embed the intelligent business logic into any external application or microservice.
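The "embed the model in the stream" pattern described above can be sketched roughly as follows. This is a minimal illustration, not the session's actual code: an actual deployment would load a trained TensorFlow, DeepLearning4J or H2O.ai model inside a Kafka Streams (Java) application and apply it per record, e.g. via `KStream.mapValues()`. Here a trivial stand-in model (the `load_model` helper and its threshold rule are hypothetical) scores a stream of events in plain Python to show the flow.

```python
# Sketch of per-event model scoring in a stream, assuming a hypothetical
# stand-in model; a real pipeline would consume events from a Kafka topic.
from typing import Callable, Iterable, Iterator, Tuple


def load_model() -> Callable[[dict], str]:
    """Stand-in for loading a trained analytic model (hypothetical)."""
    threshold = 0.7

    def predict(event: dict) -> str:
        # Toy scoring rule in place of a neural network's forward pass.
        return "alert" if event.get("score", 0.0) > threshold else "ok"

    return predict


def score_stream(events: Iterable[dict]) -> Iterator[Tuple[dict, str]]:
    """Apply the model to every event, analogous to KStream.mapValues()."""
    model = load_model()  # the model is loaded once, then applied per event
    for event in events:
        yield event, model(event)


if __name__ == "__main__":
    events = [{"id": 1, "score": 0.9}, {"id": 2, "score": 0.3}]
    for event, label in score_stream(events):
        print(event["id"], label)
```

The key design point the session makes is visible even in this sketch: the model is loaded once and then embedded directly in the streaming application, so no remote model-serving call is needed per event.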