Clickstream analysis is the process of collecting, analyzing, and
reporting aggregate data about which pages a website visitor visits and
in what order. The path the visitor takes through a website is called the
clickstream.

This tutorial focuses on building real-time analytics of user activity on a website.

If you do not have Docker, you can also run an automated version of the Clickstream tutorial designed for local Confluent Platform installs. Running the Clickstream demo without Docker requires Confluent Platform, Elasticsearch, and Grafana installed on your local machine.

You can configure Java streams applications to deserialize and ingest data in
multiple ways, including Kafka console producers, JDBC source connectors, and
Java client producers. For full code examples,
see connect-streams-pipeline.
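Whichever ingestion path you use, you can confirm that records are arriving by inspecting the topic from the KSQL CLI. A minimal sketch, assuming a hypothetical topic named `clickstream` that already contains data:

```sql
-- Inspect the raw records in a Kafka topic from the KSQL CLI.
-- 'clickstream' is a hypothetical topic name, not part of this tutorial's setup.
PRINT 'clickstream' FROM BEGINNING;
```

This prints each record's key, value, and inferred format as it arrives, which is a quick way to verify that a producer or connector is working before creating streams or tables over the topic.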

To learn how to deploy a Kafka streaming ETL using KSQL for stream processing, you can run the Confluent Platform demo. All components in the Confluent Platform demo have encryption, authentication, and authorization configured end-to-end.

This tutorial shows how to explore Kafka topic data, create a STREAM or TABLE from a Kafka topic, and identify the fields to work with. It also explains metadata such as ROWTIME and TIMESTAMP, and covers serialization formats including Avro, JSON, and delimited data.
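As a sketch of how these pieces fit together, the statements below register a stream over a hypothetical `clickstream` topic in JSON format and then query it, surfacing the ROWTIME pseudo column; the topic name and column names are illustrative assumptions, not the tutorial's actual schema:

```sql
-- Register a stream over an existing Kafka topic.
-- Topic name and columns are hypothetical examples.
CREATE STREAM clickstream (
    ip VARCHAR,
    userid INT,
    request VARCHAR,
    status INT
) WITH (KAFKA_TOPIC = 'clickstream', VALUE_FORMAT = 'JSON');

-- Query the stream, including the record timestamp via the
-- ROWTIME pseudo column.
SELECT ROWTIME, ip, request, status FROM clickstream;
```

Changing `VALUE_FORMAT` to `'AVRO'` or `'DELIMITED'` lets the same `CREATE STREAM` pattern read topics serialized in those formats instead.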