In the context of GeoMesa, Kafka is a useful tool for working with
streams of geospatial data. Interaction with Kafka in GeoMesa occurs
through the KafkaDataStore, which implements the GeoTools DataStore
interface.

<brokers> your Kafka broker instances, comma separated. For a
local install, this would be localhost:9092.

<zookeepers> your Zookeeper nodes, comma separated. For a local
install, this would be localhost:2181.

The program will create some metadata in Zookeeper and an associated
topic in your Kafka instance, and pause execution to let you add the
newly created KafkaDataStore to GeoServer. Once GeoServer has been
configured, we’ll pick back up with the paused program.

Optional command-line arguments for KafkaQuickStart are:

-zkPath <zkpath>: specifies the Zookeeper path used for storing
GeoMesa metadata. Defaults to “/geomesa/ds/kafka” and ordinarily does
not need to be changed.
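Putting the connection parameters together, a minimal sketch of the
parameter map that a client would hand to GeoTools’
DataStoreFinder.getDataStore(..) might look like the following. The key
names used here are assumptions for illustration; the exact parameter
keys vary between GeoMesa versions, so check the documentation for your
release.

```java
import java.io.Serializable;
import java.util.HashMap;
import java.util.Map;

// Sketch of the connection parameter map for a KafkaDataStore.
// The key names here are assumptions; verify them against your
// GeoMesa version before use.
public class KafkaParams {

    static Map<String, Serializable> buildParams(String brokers,
                                                 String zookeepers,
                                                 String zkPath) {
        Map<String, Serializable> params = new HashMap<>();
        params.put("brokers", brokers);       // e.g. "localhost:9092"
        params.put("zookeepers", zookeepers); // e.g. "localhost:2181"
        params.put("zkPath", zkPath);         // defaults to "/geomesa/ds/kafka"
        return params;
    }

    public static void main(String[] args) {
        Map<String, Serializable> params =
                buildParams("localhost:9092", "localhost:2181", "/geomesa/ds/kafka");
        // this map would be passed to DataStoreFinder.getDataStore(params)
        System.out.println(params.get("zkPath"));
    }
}
```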

Log into GeoServer using your credentials. Click “Stores” in the
left-hand gutter and “Add new Store”. If you do not see the Kafka Data
Store listed under Vector Data Sources, ensure the plugin and
dependencies are in the right directory and restart GeoServer.

Select the Kafka (GeoMesa) vector data source and enter the
following parameters:

GeoServer should find the KafkaQuickStart feature type in the data
store and redirect you to the “New Layer” page, presenting the feature
type as a layer that can be published. Click on the “Publish” link. You
will be taken to the “Edit Layer” page.

Warning

If you have not yet run the quick start code as described
in Run the Code above, the feature type will not have been
registered and you will not get a “New Layer” page after saving the
store. In this case, run the code as described above, click on
“Layers” in the left-hand gutter, click on “Add a new resource”, and
select your data store in the pulldown next to “Add layer from”. The
link to publish the KafkaQuickStart feature should appear.

You can leave most fields as default. In the Data pane, you’ll need to
enter values for the bounding boxes. In this case, you can click on the
links to compute these values from the data. Click “Save”.

Click on the “Layer Preview” link in the left-hand gutter. If you don’t
see the quick-start layer on the first page of results, enter the name
of the layer you just created into the search box, and press <Enter>.

Once you see your layer, click on the “OpenLayers” link, which will open
a new tab. At this point, there are no messages in Kafka so nothing will
be shown.

Now that the KafkaDataStore is registered in GeoServer, resume the
program’s execution by pressing <Enter> in your terminal. The program
will create two SimpleFeatures and then write a stream of updates to
them over the course of about a minute.

You can refresh the GeoServer layer preview repeatedly to visualize the
updates being written to Kafka.

The GeoServer layer preview uses the
LiveKafkaConsumerFeatureSource to show a real-time view of the
current state of the data stream. Two SimpleFeatures are being
updated over time in Kafka, which is reflected in the GeoServer display.

As you refresh the page, you should see two SimpleFeatures that
start on the left side gradually move to the right side while crossing
each other in the middle. As the two SimpleFeatures get updated,
the older SimpleFeatures disappear from the display.
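The motion described above can be sketched with simple linear
interpolation. The lonAt helper, step count, and coordinate ranges below
are hypothetical illustrations, not the quick start’s actual update
code:

```java
// Hypothetical sketch: two features that start on the left edge, move
// right over ~60 update steps, and cross near the middle of the run.
public class TrackSketch {

    // linear interpolation of longitude across the run
    static double lonAt(int step, int totalSteps, double startLon, double endLon) {
        return startLon + (endLon - startLon) * step / (double) totalSteps;
    }

    public static void main(String[] args) {
        int steps = 60; // roughly one update per second for about a minute
        for (int i = 0; i <= steps; i += 15) {
            double lon = lonAt(i, steps, -180.0, 180.0);
            // feature 1 drifts south while feature 2 drifts north, so the
            // two tracks cross around the midpoint of the run
            double lat1 = 45.0 - 90.0 * i / steps;
            double lat2 = -45.0 + 90.0 * i / steps;
            System.out.printf("step %2d: f1=(%.1f, %.1f) f2=(%.1f, %.1f)%n",
                    i, lon, lat1, lon, lat2);
        }
    }
}
```

Because each update replaces a feature’s previous position, only the
latest state of each feature appears in the live view.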

After all the messages have been sent to Kafka, and therefore after all
the updates have been made, the program constructs the live and replay
consumers and logs SimpleFeatures to the console.

The live consumer will log the state of the two SimpleFeatures after all
updates are finished. The replay consumer will log the state of the two
SimpleFeatures five seconds earlier than the last update. The replay
consumer will create a new SimpleFeatureType with an additional
attribute KafkaLogTime. By preserving the KafkaLogTime as an
attribute, we can reconstruct the state of the SimpleFeatures at
time x by querying for records where KafkaLogTime equals x.
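As a sketch, such a query could be expressed as a CQL-style predicate on
the KafkaLogTime attribute. The attribute name comes from the tutorial,
but the predicate format and the stateAt helper below are assumptions
for illustration, not GeoMesa’s actual replay API:

```java
import java.time.Instant;

// Hypothetical sketch: build a CQL-style predicate on the replay
// consumer's KafkaLogTime attribute, reconstructing the state of the
// stream at a given instant x. The predicate syntax is an assumption.
public class ReplayFilter {

    static String stateAt(Instant x) {
        return "KafkaLogTime = '" + x + "'";
    }

    public static void main(String[] args) {
        // e.g. the state five seconds before a final update at 00:00:10Z
        System.out.println(stateAt(Instant.parse("2016-01-01T00:00:05Z")));
    }
}
```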

The GeoTools API also includes a mechanism to fire off a FeatureEvent
each time there is an event (typically when the data are changed) in a
DataStore. A client may implement a FeatureListener, which has a single
method called changed() that is invoked as each FeatureEvent is fired.

The code in KafkaListener implements a simple FeatureListener
that prints the messages received. Open up a second terminal window and
run (with $KAFKA_VERSION set to “08”, “09”, or “10” as appropriate):

and use the same settings for <brokers> and <zookeepers>. Then
in the first terminal window, re-run the KafkaQuickStart code as
before. The KafkaListener terminal should produce messages like the
following:

The portion of KafkaListener that creates and implements the
FeatureListener is:

// the live consumer must be created before the producer writes features
// in order to read streaming data.
// i.e. the live consumer will only read data written after its instantiation
SimpleFeatureSource consumerFS = consumerDS.getFeatureSource(sftName);
consumerFS.addFeatureListener(new FeatureListener() {
    @Override
    public void changed(FeatureEvent featureEvent) {
        System.out.println("Received FeatureEvent of Type: " + featureEvent.getType());
        if (featureEvent.getType() == FeatureEvent.Type.CHANGED &&
                featureEvent instanceof KafkaFeatureEvent) {
            printFeature(((KafkaFeatureEvent) featureEvent).feature());
        }
        if (featureEvent.getType() == FeatureEvent.Type.REMOVED) {
            System.out.println("Received Delete for filter: " + featureEvent.getFilter());
        }
    }
});

Additionally, the KafkaQuickStart class run above can generate a
‘clear’ control message at the end of the run if you specify
“-Dclear=true” on the command line. This will generate a removed
FeatureEvent whose filter is Filter.INCLUDE.

Given a stream of geospatial data, GeoMesa’s integration with Kafka
enables users to maintain a real-time state of SimpleFeatures or
retrieve any arbitrary state preserved in history. One can additionally
process and analyze streams of data by integrating a data processing
system like Storm or
Samza. See the GeoMesa Storm Quick Start
tutorial for more information on using Storm with GeoMesa.