Extracts structured fields out of a single text field within a document. You choose which field to
extract matched fields from, as well as the grok pattern you expect will match. A grok pattern is like a regular
expression that supports aliased expressions that can be reused.

This processor is a good choice for syslog logs, Apache and other web server logs, MySQL logs, and in general any log format
that is written for humans rather than for computer consumption.
This processor comes packaged with many
reusable patterns.

If you need help building patterns to match your logs, the Grok Debugger tool is quite useful. The Grok Debugger is an X-Pack feature under the Basic License and is therefore free to use. The Grok Constructor at http://grokconstructor.appspot.com/ is also a useful tool.

Grok sits on top of regular expressions, so any regular expressions are valid in grok as well.
The regular expression library is Oniguruma, and you can see the full supported regexp syntax
on the Oniguruma site.

Grok works by leveraging this regular expression language to allow naming existing patterns and combining them into more
complex patterns that match your fields.

The syntax for reusing a grok pattern comes in three forms: %{SYNTAX}, %{SYNTAX:SEMANTIC}, and %{SYNTAX:SEMANTIC:TYPE}.

The SYNTAX is the name of the pattern that will match your text. For example, 3.44 will be matched by the NUMBER
pattern and 55.3.244.1 will be matched by the IP pattern. The SYNTAX is how you match; NUMBER and IP are both
patterns provided in the default patterns set.

The SEMANTIC is the identifier you give to the piece of text being matched. For example, 3.44 could be the
duration of an event, so you could call it simply duration. Further, a string 55.3.244.1 might identify
the client making a request.

The TYPE is the type to which you wish to cast your named field. int, long, double, float and boolean are supported types for coercion. For example, %{NUMBER:duration:int} coerces the duration field from a string to an integer.

For example, you might want to match the following text:

3.44 55.3.244.1

You may know that the message in the example is a number followed by an IP address. You can match this text by using the following
Grok expression.
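Reusing the SEMANTIC names from above (duration for the number and client for the IP address), such an expression could look like:

```
%{NUMBER:duration} %{IP:client}
```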

The Grok processor comes pre-packaged with a base set of patterns. These patterns may not always have
what you are looking for. Patterns have a very basic format. Each entry has a name and the pattern itself.
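For illustration, here are two entries in the style the bundled pattern files use (the names and regexes shown are representative examples, not guaranteed verbatim): each line pairs a pattern name with its definition, and one pattern can reference another via the %{...} syntax.

```
USERNAME [a-zA-Z0-9._-]+
USER %{USERNAME}
```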

You can add your own patterns to a processor definition under the pattern_definitions option.
Here is an example of a pipeline specifying custom pattern definitions:
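As a sketch (the field name, sample patterns, and pattern definitions below are illustrative; the grok processor's field, patterns, and pattern_definitions parameters carry the configuration), such a pipeline might look like:

```json
{
  "description": "Parse a pet and a color out of a message",
  "processors": [
    {
      "grok": {
        "field": "message",
        "patterns": ["my %{FAVORITE_DOG:dog} is colored %{RGB:color}"],
        "pattern_definitions": {
          "FAVORITE_DOG": "beagle",
          "RGB": "RED|GREEN|BLUE"
        }
      }
    }
  ]
}
```

Here, FAVORITE_DOG and RGB are custom patterns that exist only within this processor definition; the bundled patterns remain available alongside them.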

Sometimes one pattern is not enough to capture the potential structure of a field. Let's assume we
want to match all messages that contain our favorite pet breeds of either cats or dogs. One way to accomplish
this is to provide two distinct patterns that can be matched, instead of one really complicated expression capturing
the same "or" behavior.

Here is an example of such a configuration executed against the simulate API:
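A sketch of such a request against the simulate API (the document contents and pattern definitions are illustrative; the patterns array lists the alternatives, which are tried in order):

```json
POST _ingest/pipeline/_simulate
{
  "pipeline": {
    "description": "Parse the pet out of a message",
    "processors": [
      {
        "grok": {
          "field": "message",
          "patterns": ["%{FAVORITE_DOG:pet}", "%{FAVORITE_CAT:pet}"],
          "pattern_definitions": {
            "FAVORITE_DOG": "beagle",
            "FAVORITE_CAT": "burmese"
          }
        }
      }
    ]
  },
  "docs": [
    {
      "_source": {
        "message": "I love burmese cats!"
      }
    }
  ]
}
```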

Both patterns will set the field pet with the appropriate match, but what if we want to trace which of our
patterns matched and populated our fields? We can do this with the trace_match parameter. Here is the output of
that same pipeline, but with "trace_match": true configured:
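An abridged, illustrative response follows: with trace_match enabled, the ingest metadata carries a _grok_match_index key reporting the index of the pattern that matched (here "1", the second pattern in the list). The timestamp value is elided.

```json
{
  "docs": [
    {
      "doc": {
        "_source": {
          "message": "I love burmese cats!",
          "pet": "burmese"
        },
        "_ingest": {
          "_grok_match_index": "1",
          "timestamp": "..."
        }
      }
    }
  ]
}
```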

Grok expressions that take too long to execute are interrupted, and
the grok processor then fails with an exception. The grok
processor has a watchdog thread that determines when evaluation of
a grok expression takes too long; the watchdog is controlled by the following
settings:

Table 45. Grok watchdog settings

  Name                            Default   Description
  ingest.grok.watchdog.interval   1s        How often to check whether there are grok evaluations that take longer than the maximum allowed execution time.