Note that the program contains custom user code and therefore requires a JAR file with the classes of that code attached. The constructor of the remote environment takes the path(s) to the JAR file(s).
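As a sketch of what this looks like, the following uses Flink's `ExecutionEnvironment.createRemoteEnvironment(host, port, jarFiles...)`; the host name, port, and JAR path are placeholders you would replace with your own setup:

```java
import org.apache.flink.api.java.ExecutionEnvironment;

public class RemoteJobExample {
    public static void main(String[] args) throws Exception {
        // Connect to a remote Flink cluster; the JAR containing the
        // custom user code is shipped to the cluster for execution.
        // "flink-master", 6123, and the JAR path are example values.
        ExecutionEnvironment env = ExecutionEnvironment.createRemoteEnvironment(
                "flink-master", 6123, "/path/to/your-usercode.jar");

        // A trivial job; any classes it references must be in the attached JAR.
        env.fromElements(1, 2, 3)
           .map(i -> i * 2)
           .print();
    }
}
```

Running this submits the job to the cluster at the given address; no local Flink installation is needed beyond the client dependencies.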

Linking with modules not contained in the binary distribution

The binary distribution contains JAR packages in the lib folder that are automatically provided to the classpath of your distributed programs. Almost all Flink classes are located there, with a few exceptions such as the streaming connectors and some freshly added modules. To run code that depends on these modules, you need to make them accessible at runtime, for which we suggest two options:

Either copy the required JAR files to the lib folder on all of your TaskManagers.
Note that you have to restart your TaskManagers afterwards.

Or package them with your code.

The latter version is recommended as it respects the classloader management in Flink.

Packaging dependencies with your user code using Maven

To bundle dependencies that are not provided by the Flink distribution, we suggest two options with Maven.

The maven-assembly-plugin builds a so-called uber-jar (executable JAR) containing all of your dependencies.
The assembly configuration is straightforward, but the resulting JAR can become bulky.
See the maven-assembly-plugin documentation for further information.
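A minimal configuration for this first option might look like the following; the main class name is a placeholder for your own entry point:

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-assembly-plugin</artifactId>
  <configuration>
    <descriptorRefs>
      <!-- predefined descriptor that bundles all dependencies -->
      <descriptorRef>jar-with-dependencies</descriptorRef>
    </descriptorRefs>
    <archive>
      <manifest>
        <!-- placeholder: replace with your program's entry point -->
        <mainClass>com.example.MyFlinkJob</mainClass>
      </manifest>
    </archive>
  </configuration>
  <executions>
    <execution>
      <id>make-assembly</id>
      <phase>package</phase>
      <goals>
        <goal>single</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```

With this in place, `mvn package` additionally produces a `*-jar-with-dependencies.jar` next to the regular artifact.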

The maven-dependency-plugin (via its unpack goal) unpacks the relevant parts of the dependencies and
then packages them with your code.

To bundle the Kafka connector (flink-connector-kafka) using the latter approach,
you would need to add the classes from both the connector and the Kafka API itself. Add
the following to your plugins section.
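A sketch of such a configuration is shown below; the version properties and the Kafka artifact coordinates are assumptions that you would need to adjust to the Flink and Kafka versions you actually use:

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-dependency-plugin</artifactId>
  <executions>
    <execution>
      <id>unpack</id>
      <!-- unpack before the jar is assembled -->
      <phase>prepare-package</phase>
      <goals>
        <goal>unpack</goal>
      </goals>
      <configuration>
        <artifactItems>
          <!-- the Flink Kafka connector classes -->
          <artifactItem>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-connector-kafka</artifactId>
            <!-- placeholder: set to your Flink version -->
            <version>${flink.version}</version>
            <type>jar</type>
            <overWrite>false</overWrite>
            <outputDirectory>${project.build.directory}/classes</outputDirectory>
            <includes>org/apache/flink/**</includes>
          </artifactItem>
          <!-- the Kafka API classes; artifactId and version are placeholders -->
          <artifactItem>
            <groupId>org.apache.kafka</groupId>
            <artifactId>kafka_${scala.binary.version}</artifactId>
            <version>${kafka.version}</version>
            <type>jar</type>
            <overWrite>false</overWrite>
            <outputDirectory>${project.build.directory}/classes</outputDirectory>
            <includes>kafka/**</includes>
          </artifactItem>
        </artifactItems>
      </configuration>
    </execution>
  </executions>
</plugin>
```

The unpacked class files end up in `target/classes` alongside your own compiled code, so the regular jar built by `mvn package` contains them without pulling in every transitive dependency.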