Event Driven Microservices

The presentation explains why we picked Kafka as our Streaming Hub, and how Kafka Streams helps avoid common anti-patterns, streamline the development experience, improve resilience, enhance performance and enable experimentation. A step-by-step example introduces the Kafka Streams DSL and shows what happens under the hood of a stateful streaming application.

13.
µ-services antipatterns 1/2: The God Service
A data service that grows by exposing an ever-increasing set of functions, to
the point where it starts to look like a homegrown database
○ getProduct(id)
○ getAllProducts(saleId)
○ getAllAvailableProducts(saleId)
○ getAllActiveProducts()
○ getSku(id)
○ getAllSkusByProduct()

14.
µ-services antipatterns 2/2: REST-to-ETL problem
When it is preferable to extract the data from a data service and keep a
local copy, for several reasons:
● Aggregation: data needs to be combined with another dataset
● Caching: data needs to be closer to get better performance
● Ownership: the data service provides limited functionality and can’t
be changed quickly enough
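The local-copy alternative can be sketched as follows: instead of calling the data service over REST on every request, consume its change events and maintain a local view that supports cheap reads and local joins. This is a minimal stdlib-only sketch; the event shape (`ProductEvent`) and field names are illustrative assumptions, not from the talk.

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Sketch: keep a local, query-optimised copy of another service's data
// by applying its change events, rather than calling it per request.
class LocalProductView {
    // Hypothetical change event published by the owning service.
    record ProductEvent(long productId, String name, double price, boolean deleted) {}

    private final Map<Long, ProductEvent> view = new HashMap<>();

    // Apply each event in order; the map converges to the latest state.
    public void apply(ProductEvent e) {
        if (e.deleted()) view.remove(e.productId());
        else view.put(e.productId(), e);
    }

    // Local reads are now cheap and can be joined with other local data
    // (the "Aggregation" and "Caching" motivations above).
    public double totalPrice(List<Long> basket) {
        return basket.stream()
                .map(view::get)
                .filter(p -> p != null)
                .mapToDouble(ProductEvent::price)
                .sum();
    }
}
```

Because the view is rebuilt purely from events, the owning service keeps ownership of writes while consumers shape the data however their queries need.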

20.
Summary
● Lambda architecture works well but the implementation is not trivial
● Stream processing introduces a new programming paradigm
● Use the schema registry from day 1 to support schema-change
compatibility and avoid breaking downstream consumers
● A replayable log (Kafka) and a streaming library (Kafka Streams) give you
the freedom to slice, dice, enrich and evolve data locally as it arrives,
increasing resilience and performance
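The last point rests on one property: local state is a deterministic fold over a replayable log, so it can always be rebuilt by replaying from the beginning. This stdlib-only sketch illustrates that property (it is an analogue, not the actual Kafka Streams API; the event shape and per-key running total are assumptions for illustration).

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Stdlib-only analogue of a replayable log plus stateful processing:
// state is a pure fold over the log, so a crashed or redeployed instance
// can recover simply by replaying the log from offset zero.
class ReplayableLogDemo {
    record Event(String key, long amount) {}   // illustrative event shape

    // Fold the log into local state (here: a running total per key).
    static Map<String, Long> replay(List<Event> log) {
        Map<String, Long> state = new HashMap<>();
        for (Event e : log) state.merge(e.key(), e.amount(), Long::sum);
        return state;
    }
}
```

Replaying the same log always yields the same state, which is what lets a streaming application evolve its local view (new aggregations, new enrichments) by reprocessing history rather than migrating a database.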