
Cloud Foundry and Logstash

Cloud Foundry has the ability to capture logs from several platform components and from applications running on the platform, aggregate these logs, and export the data to an external log management and analysis system. You can read more about this in the Cloud Foundry documentation.

Logstash is one of the log management systems that Cloud Foundry can work with, but some configuration is required to get logstash to consume and understand Cloud Foundry logs. This post shows how to configure logstash for use with Cloud Foundry. Detailed instructions for installing, configuring, and using logstash as well as elasticsearch and kibana (collectively known as ELK) on Ubuntu Linux are also provided.

Configuring logstash for Cloud Foundry

The Cloud Foundry Loggregator component formats logs according to the syslog standard as defined in RFC5424. The logstash cookbook includes an example configuration for syslog consumption, but that configuration follows an older RFC3164 syslog standard.

Here is a logstash configuration that works with RFC5424 output, with some additional changes for Cloud Foundry:
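A sketch of such a configuration is shown below. The port (5000) matches the drain URL used later in this post, and the grok pattern follows the RFC5424 layout; the NOTSPACE patterns for the app and proc fields are the Cloud Foundry-specific changes discussed next. The plain elasticsearch output is an assumption and should be adjusted for your environment:

```conf
input {
  # Accept syslog traffic over both TCP and UDP on port 5000
  tcp {
    port => 5000
    type => syslog
  }
  udp {
    port => 5000
    type => syslog
  }
}

filter {
  if [type] == "syslog" {
    # Parse RFC5424-formatted records; NOTSPACE is used for the
    # app and proc fields so that Cloud Foundry GUIDs (containing "-")
    # and process names (containing "/") are matched
    grok {
      match => { "message" => "%{SYSLOG5424PRI}%{NONNEGINT:syslog5424_ver} +(?:%{TIMESTAMP_ISO8601:syslog5424_ts}|-) +(?:%{HOSTNAME:syslog5424_host}|-) +(?:%{NOTSPACE:syslog5424_app}|-) +(?:%{NOTSPACE:syslog5424_proc}|-) +(?:%{WORD:syslog5424_msgid}|-) +(?:%{SYSLOG5424SD:syslog5424_sd}|-|) +%{GREEDYDATA:syslog5424_msg}" }
    }
    syslog_pri { }
    # Use the timestamp from the log record rather than the arrival time
    date {
      match => [ "syslog5424_ts", "ISO8601" ]
    }
  }
}

output {
  elasticsearch { }
}
```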

The important difference between this configuration and the one in the logstash cookbook is the grok filter, which parses log records according to RFC5424. With this configuration, these logstash event fields will be populated:

syslog5424_host will always be loggregator, indicating that the log entry came from Cloud Foundry

syslog5424_app will contain the GUID for the Cloud Foundry application that generated the log entry

syslog5424_proc will contain the Cloud Foundry component that generated the log entry, with values like:

[DEA] – logs from the DEA

[STG] – logs from the application staging process

[RTR] – logs from the Router

[App/n] – logs from an application, with n designating the instance index

syslog5424_msg will contain the log message text

This parsing configuration is almost the same as the built-in RFC5424 parser for logstash, with a few important differences:

the default logstash parsing for syslog5424_app allows only alpha, numeric, and underscore characters, but Cloud Foundry sets this field to a GUID, which contains hyphen (-) characters

the default logstash parsing for syslog5424_proc allows only alpha, numeric, and underscore characters, but the values Cloud Foundry puts in this field can include a slash (/) character

With this configuration, you can follow the instructions in the Cloud Foundry documentation to create a user-provided log draining service and bind the service to an application. The configuration above tells logstash to listen on port 5000, so the user-provided service creation and binding might look something like this:

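For example, using the cf CLI (cups is shorthand for create-user-provided-service; the service name logstash-drain is an arbitrary choice):

```shell
# Create a user-provided service that drains logs to logstash over syslog
$ cf cups logstash-drain -l syslog://[logserver]:5000

# Bind the drain service to the application and restart it
$ cf bind-service [app-name] logstash-drain
$ cf restart [app-name]
```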
where [logserver] is the name or IP address of the server where logstash is running and [app-name] is the name of an application running on Cloud Foundry.

If you are knowledgeable about logstash installation and configuration, this should be enough to get your logstash system consuming logs from Cloud Foundry. For a quick and easy setup of logstash, elasticsearch, and kibana for log management and analysis, read on.

Installing ELK on Ubuntu 12.04

Download and installation instructions for the ELK stack can be found here. Detailed instructions for Ubuntu 12.04 are provided below for the impatient. Adjust as necessary for other operating systems or Linux distributions by following the installation and configuration instructions for each component.

The easiest way to install elasticsearch, logstash, and kibana on Ubuntu is to add the ELK repositories to your local apt configuration and install using apt-get.

Configuring apt

First, add the elasticsearch public GPG keys to your local apt configuration:
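The steps might look like the following; the repository version paths (1.4) are assumptions based on releases current at the time of writing, so substitute the versions you want to install:

```shell
# Add the elasticsearch public GPG key to apt
$ wget -qO - http://packages.elasticsearch.org/GPG-KEY-elasticsearch | sudo apt-key add -

# Add the elasticsearch and logstash repositories (version paths are assumptions)
$ echo "deb http://packages.elasticsearch.org/elasticsearch/1.4/debian stable main" | sudo tee /etc/apt/sources.list.d/elasticsearch.list
$ echo "deb http://packages.elasticsearch.org/logstash/1.4/debian stable main" | sudo tee /etc/apt/sources.list.d/logstash.list

# Refresh the package index, then install and start elasticsearch
$ sudo apt-get update
$ sudo apt-get install elasticsearch
$ sudo service elasticsearch start
```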

The default elasticsearch configuration should work without modification.

Installing and configuring logstash

Installing logstash is just as simple:

$ sudo apt-get install logstash

logstash needs a configuration file that instructs it where to get its inputs from and where to send its outputs. Copy the contents of the logstash configuration file shown at the beginning of this post and paste it into a file named /etc/logstash/conf.d/syslog.conf.

Then start the logstash process:

$ sudo service logstash start

Installing and configuring kibana

The kibana web application runs in an Apache httpd web server. Install Apache httpd, download the kibana HTML and JavaScript files, and copy them into the httpd document root directory:
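One way to do this on Ubuntu 12.04, where the default Apache document root is /var/www (the kibana version and download URL here are assumptions; substitute the current release):

```shell
# Install the Apache httpd web server
$ sudo apt-get install apache2

# Download and unpack the kibana distribution (version is an assumption)
$ wget https://download.elasticsearch.org/kibana/kibana/kibana-3.1.2.tar.gz
$ tar xzf kibana-3.1.2.tar.gz

# Copy the kibana files into the Apache document root
$ sudo cp -R kibana-3.1.2/* /var/www/
```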

You should now be able to point your browser to a URL like https://logserver/index.html#/dashboard/file/logstash.json, where logserver is the name or IP address of the server where kibana was installed. You should see a logstash dashboard running in kibana. The dashboard will be empty until you create a user-provided log drain service and bind it to an application.

Creating and binding a log drain service

As stated earlier in this post, you can follow the instructions in the Cloud Foundry documentation to create a user-provided log draining service and bind the service to an application using commands like these:
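As before, using the cf CLI with placeholder names ([logserver] is the logstash host, [app-name] is your application):

```shell
$ cf cups logstash-drain -l syslog://[logserver]:5000
$ cf bind-service [app-name] logstash-drain
$ cf restart [app-name]
```

Once the application restarts and begins emitting logs, entries should appear in the kibana dashboard.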