- This article is a Community contribution and may include unsupported customizations.

The goal is to install, on a dedicated server or VM, all the components needed for a Centralized Log Server, together with a powerful Dashboard to configure all the reports.

Logstash, Elasticsearch and Kibana will be installed on this dedicated VM; on the Zimbra Server, or Servers, we will install only the Agent.

Hardware and Software requirements

On the Server, or VM, we will install a fresh Ubuntu Server 14.04 LTS.
The hardware needed depends on how many Zimbra Servers you have and how detailed the Logs are. For a regular environment, the following resources are enough:

OS: Ubuntu 14.04 LTS

vRAM: 4GB

vCPU: 2

vDisk: 100GB (SAS 10K or even better 15K)

Install the Centralized Log Server

Installing Java

Elasticsearch and Logstash need Java 7 to work. To install it, we need to add the Oracle PPA to our apt sources:
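The commands might look like the following sketch; the WebUpd8 PPA name and installer package are assumptions (at the time, this was a common way to get Oracle Java 7 on Ubuntu 14.04):

```shell
# Assumed PPA providing an Oracle Java 7 installer for Ubuntu 14.04:
sudo add-apt-repository -y ppa:webupd8team/java
sudo apt-get update
sudo apt-get install -y oracle-java7-installer

# Verify the installed version:
java -version
```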

Once inside the file, search for the line elasticsearch: and change the port number (default 9200) to port 80. Later this will let us connect to the Kibana Server in an easy way, through the standard HTTP port 80:

elasticsearch: "http://"+window.location.hostname+":80",

Also, we will use nginx to serve our app, Kibana, so first we will create the folder in the /var/www directory:
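A sketch of this step, assuming Kibana 3 (the static-file version served directly by nginx); the folder name, download URL and version number are assumptions you should adapt:

```shell
# Create the folder nginx will serve Kibana from (name is an assumption):
sudo mkdir -p /var/www/kibana3

# Download and unpack Kibana 3 into it (URL/version are assumptions):
cd /tmp
curl -O https://download.elasticsearch.org/kibana/kibana/kibana-3.0.1.tar.gz
tar xzf kibana-3.0.1.tar.gz
sudo cp -R kibana-3.0.1/* /var/www/kibana3/
```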

Installing Nginx

We will install nginx from the official apt repositories:

root@logstashkibana01:/home/oper# sudo apt-get install nginx

Kibana and Elasticsearch work in a particular way: the user's browser needs to access Elasticsearch directly, so we need to configure Nginx to proxy the requests arriving on port 80 through to Elasticsearch on port 9200.
But no worries, Kibana ships with an example configuration that we can use for this.

We will download the Nginx configuration from GitHub to our folder:
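This step might look like the following sketch; the repository path and branch for Kibana 3's sample nginx.conf are assumptions, as are the edits you will need inside the file:

```shell
# Fetch the sample nginx config shipped with Kibana 3 (URL is an assumption):
cd ~
curl -OL https://raw.githubusercontent.com/elasticsearch/kibana/kibana3/sample/nginx.conf

# Edit server_name and root (e.g. root /var/www/kibana3;) to match your setup,
# then install it as the default site and reload nginx:
sudo cp nginx.conf /etc/nginx/sites-available/default
sudo service nginx restart
```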

Installing Logstash

This is the last package that we will install on the Server or VM. Now it is time to install Logstash. We will install it from the Elasticsearch repository that we added before, so just launch the following commands:
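A sketch of the install, assuming the Logstash 1.4 apt repository published by Elasticsearch at the time; the repository line and version are assumptions:

```shell
# Add the assumed Logstash apt repository and install the package:
echo 'deb http://packages.elasticsearch.org/logstash/1.4/debian stable main' | \
  sudo tee /etc/apt/sources.list.d/logstash.list
sudo apt-get update
sudo apt-get install -y logstash
```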

Logstash is now installed, but we need to complete the next step before continuing.

Generate the SSL Certificates to use in the server/client connection

We will use Logstash Forwarder on the Zimbra servers to send the logs to the Centralized Log Server, and we want to do it in a secure way. We need to generate an SSL certificate and key pair. The certificate will be used by the Client to verify the Server's identity.

The first step is to create the path where we will save the SSL certificate and the private key:
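A minimal sketch using openssl; the /etc/pki/tls directories follow common Logstash guides, and the subject CN and validity period are assumptions you should replace with your own server's name:

```shell
# Create the directories for the certificate and the private key:
sudo mkdir -p /etc/pki/tls/certs /etc/pki/tls/private

# Generate a self-signed certificate and key pair (CN is an assumption;
# use your Centralized Log Server's FQDN or IP):
cd /etc/pki/tls
sudo openssl req -x509 -batch -nodes -newkey rsa:2048 -days 3650 \
  -keyout private/logstash-forwarder.key \
  -out certs/logstash-forwarder.crt \
  -subj /CN=logserver.example.com
```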

Configuring Logstash Forwarder

We are close to finishing. Inside the Zimbra Server, we need to decide which Logs we want to send to the Centralized Log Server.

Create a configuration file for Logstash Forwarder in JSON format:

root@zimbra-sn-u14-01:/home/oper# sudo vi /etc/logstash-forwarder

Now we will fill in the configuration file; change the IP to your own Centralized Log Server IP. In this example I will send the following logs to the Centralized Log Server: syslog, auth.log, mailbox.log, nginx.access.log, nginx.log, zimbra.log and mail.log, but you can add whatever logs you want:
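The file might look like the following sketch; the server IP, port 5000, the certificate path and the "type" fields are assumptions to adapt (mailbox.log and the nginx logs live under /opt/zimbra/log on a standard Zimbra install):

```json
{
  "network": {
    "servers": [ "192.168.1.100:5000" ],
    "ssl ca": "/etc/pki/tls/certs/logstash-forwarder.crt",
    "timeout": 15
  },
  "files": [
    {
      "paths": [ "/var/log/syslog", "/var/log/auth.log", "/var/log/mail.log" ],
      "fields": { "type": "syslog" }
    },
    {
      "paths": [
        "/opt/zimbra/log/mailbox.log",
        "/opt/zimbra/log/nginx.access.log",
        "/opt/zimbra/log/nginx.log",
        "/var/log/zimbra.log"
      ],
      "fields": { "type": "zimbra" }
    }
  ]
}
```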

We need to repeat these steps on each Zimbra Server whose Logs we want to centralize.

Connecting to Kibana

Now it is time to play, and to play in HTML5! Open a Web browser and type the IP or FQDN of your Centralized Log Server.
The first thing we will see is an overview of Kibana.
We will select the option 11. Sample Dashboard.

I really like Kibana, and having a Centralized Log Server as well, but this is especially useful because we can search inside the Logs using checkboxes to filter and get to the answer more easily.
We can also combine the search with sorting by field type, awesome!

Also, we can arrange the Dashboard as we want, share a Public URL with some Customers, or between the IT Department and other Departments, etc.

This is a real overview, where I can see the total number of Logs received during a period of time.

Here we can see an example of a Log file, perfectly parsed so the information can be consumed in an easier, more human way.