2016-04-21

The ELK Stack is an open-source log indexing and search solution that combines Elasticsearch, Logstash, and Kibana. At Logz.io, a log analysis platform, we have already discussed on our blog how to install ELK on AWS as well as on the OpenStack cloud. In this guide, we will explain how to install ELK easily on Azure using two different techniques: Azure virtual machines and the Azure Marketplace.

Installing ELK on Azure VMs



One way to install ELK on Azure is to create the required number of virtual machines and then install the stack.

Creating the VMs on Azure is pretty straightforward and is accompanied by an awesome user interface. In the left-side menu, there is a Virtual machines item; clicking it displays a list of existing virtual machines and provides an option to create new ones.

After clicking the Add button, Azure will show a list of images that you can install. For our purposes, we selected Ubuntu Server 15.10. The wizard for creating a VM on Azure is pretty straightforward (you only need to click one button to move from step to step), which is why we will skip most of it. The only step that deserves attention is selecting the deployment model, right after choosing the operating system for the VM.

The Azure Resource Manager allows you to use pre-built application templates, or construct an application template yourself, to deploy and manage your cloud resources. We used the Resource Manager rather than the Classic option because the Resource Manager deploys and manages resources more easily.



After choosing the deployment model, the VM requires some configuration, which Azure walks you through in four easy steps. This configuration includes the authentication type, the machine name, the subscription, the resource group, and the location of the VM. Once you are done with the basic configuration, you choose the size of the VM and then configure any desired optional features such as storage, network, and monitoring. After completing these four steps, Azure will start to deploy your VM.



Once deployment is done, Azure will display your newly-created VM on your dashboard. Now, you are ready to install the ELK stack on your new VM.

An important step is creating rules that will allow you to access Kibana (or the Elasticsearch HTTP API) from the outside. Inbound and outbound security rules can be configured under the Network security group section, which can be found on the general settings overview page.

(Note: The public IP address of your VM can be obtained by clicking on ELK-MACHINE-1 in the general settings overview, under the Public IP address section.)

As you can see below, adding a new inbound security rule is pretty easy: you enter the source and destination port ranges and, if you want to grant access only to particular IPs, the source IP address. We usually allow Kibana access on port 5601, so in our case we chose * for the source port range (to allow clients to connect from any port) and 5601 for the destination port range.

Install Elasticsearch on Azure

We’re now ready to install the ELK stack. After connecting to our machine, we will start to download the products from the Elastic website. Before we download Elasticsearch, the machine must have Java installed. (Here is how to install Java on Ubuntu.)

The following commands will download Elasticsearch and extract it to the current working directory, which is wherever the terminal points at that moment (for example, C:\test\folder in a Windows shell, or <user>@<hostname>:~/folder/something$ on *NIX systems):
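For example, assuming Elasticsearch 2.3.1 (the release current at the time of writing) and Elastic's download URL scheme from that era, the download and extraction look like this:

```shell
# Elasticsearch 2.3.1 was current when this guide was written;
# adjust ES_VERSION to the release you want to install.
ES_VERSION=2.3.1
ES_TARBALL="elasticsearch-${ES_VERSION}.tar.gz"
wget "https://download.elastic.co/elasticsearch/release/org/elasticsearch/distribution/tar/elasticsearch/${ES_VERSION}/${ES_TARBALL}"
tar -xzf "$ES_TARBALL"
```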

The next step is to change the Elasticsearch default configuration as required — including the cluster name, node name, or network binding host as shown here:
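For instance, the relevant lines in config/elasticsearch.yml might look like this (the names and values are examples, not defaults):

```yaml
# config/elasticsearch.yml -- the values below are examples, not defaults
cluster.name: elk-on-azure     # nodes with the same cluster name join together
node.name: elk-machine-1       # a human-readable name for this node
network.host: 0.0.0.0          # bind to all interfaces so remote clients can reach the node
```

Binding to 0.0.0.0 exposes the node on all of the VM's network interfaces, so make sure the network security group rules described above restrict who can reach port 9200.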

We’re now ready to start Elasticsearch with the following instructions:
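A sketch of starting the node, assuming the version and directory layout from the previous step:

```shell
# Elasticsearch's REST API will answer on port 9200 once the node is up.
ES_URL="http://localhost:9200"
# Start the node as a daemon (-d) from the extracted directory:
cd elasticsearch-2.3.1 && ./bin/elasticsearch -d
# curl "$ES_URL"   # returns a JSON banner with the node and cluster names
```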

Installing Logstash on Azure

To download and execute Logstash and Kibana, we need to repeat a similar process:
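Assuming the Logstash and Kibana releases contemporary with Elasticsearch 2.3.1 (the version numbers and URLs below follow Elastic's scheme from that time and should be adjusted as needed):

```shell
# Versions contemporary with Elasticsearch 2.3.1; adjust as needed.
LS_VERSION=2.3.1
KIBANA_VERSION=4.5.0
wget "https://download.elastic.co/logstash/logstash/logstash-${LS_VERSION}.tar.gz"
wget "https://download.elastic.co/kibana/kibana/kibana-${KIBANA_VERSION}-linux-x64.tar.gz"
tar -xzf "logstash-${LS_VERSION}.tar.gz"
tar -xzf "kibana-${KIBANA_VERSION}-linux-x64.tar.gz"
```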

After extracting the Logstash archive, we then need to specify the configuration file and do the following:
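A minimal sketch of such a configuration file, reading events from stdin and indexing them into the local Elasticsearch node (the file name and contents are examples):

```shell
# Write a minimal, example logstash.conf: stdin in, local Elasticsearch out.
cat > logstash.conf <<'EOF'
input {
  stdin { }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
}
EOF
# Point Logstash at the configuration file (path assumes version 2.3.1):
# ./logstash-2.3.1/bin/logstash -f logstash.conf
```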

Now, you are ready to configure Logstash and ship logs to Elasticsearch.

Installing Kibana on Azure

To start Kibana, we follow a similar process:
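For example, assuming Kibana 4.5.0 was extracted in the current directory:

```shell
# Kibana serves its UI on port 5601 by default.
KIBANA_URL="http://localhost:5601"
# Start Kibana from the extracted directory (version is an assumption):
cd kibana-4.5.0-linux-x64 && ./bin/kibana
```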

If you want to change the Kibana configuration, edit kibana.yml and then start Kibana with this command:
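The two settings most commonly changed in config/kibana.yml are the port and the Elasticsearch address (key names as used since Kibana 4.2; the values are examples):

```yaml
# config/kibana.yml -- example values
server.port: 5601                            # the port Kibana serves its UI on
elasticsearch.url: "http://localhost:9200"   # where Kibana finds Elasticsearch
```

After editing the file, start Kibana again with ./bin/kibana.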

Another way to install ELK is with Docker. The main idea behind this method is to install Docker on a VM and then use a Dockerfile to create images. We can then use these images to start the ELK Stack on one VM or on several VMs across your network. Learn how to create and use these Docker images.

Installing the ELK Stack from the Azure Marketplace

The Microsoft Azure Marketplace is an online market for buying and selling complete applications and premium datasets. In this section, we will show you how to install an Elasticsearch and Kibana cluster from there. Logstash should be installed in the same way as described above.

First, check the Elasticsearch and Kibana solution on the Azure Marketplace. You then have to fill out fields in a series of seven steps.

The first step is to configure the Basic Settings such as the username, authentication type, resource group, subscription type, and location. Most of these values are a matter of personal preference or are selected from drop-down boxes. Here is an example of the configuration:

The next step involves the Cluster Settings, where you have to choose your Elasticsearch version and give the cluster a name.

The next step is to configure the nodes in terms of the master node size, the number of data nodes, the sizes of the data and client nodes, and more. (Learn more about the common mistakes that people make in this part of the process.)

The next step is to configure settings for Shield users by entering passwords for es_admin, es_read, and es_kibana.

The next step is to configure external access to the cluster. This step requires you to decide whether you want to install Kibana on the cluster or use a jumpbox. If you choose a jumpbox, Azure will automatically add a VM that you can use to connect to and manage the VMs on the internal network. You also choose the load balancer type here (internal or external, depending on the type of network).

As we mentioned, the last two steps are related to the summary and confirmation.

After a few minutes of deployment, the cluster will be pinned to the dashboard. Setting up Logstash to ship logs can now be completed with the steps that we described in the previous section.

Bonus: Shipping Logs from Windows

In this section, we will describe how to ship logs from Windows machines to the ELK Stack. Unfortunately, there is no easy way to get the operating system logs and then use Logstash to ship them to Elasticsearch, but luckily there is NXLog, a free tool for this purpose. It will help us get event logs from our machine into Logstash and then into Elasticsearch.

For this example, we will use one machine for the ELK Stack, and the second will be the monitored Windows machine. Download and install NXLog on the monitored Windows machine, and then configure both NXLog and Logstash to communicate with each other.

The NXLog configuration syntax looks different from Logstash's, but it is pretty intuitive. The NXLog configuration file, nxlog.conf, is located inside the conf folder of the application. If you do not define a custom path, the default NXLog location on Windows is under the Program Files folder. Here is an example of the NXLog configuration:
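A sketch of such a configuration, collecting Windows Event Log entries as JSON and forwarding them over TCP (the host IP is a placeholder for your ELK machine's address):

```
## nxlog.conf -- example only; adjust paths, IPs, and ports to your setup
<Extension json>
    Module      xm_json
</Extension>

<Input eventlog>
    Module      im_msvistalog     # reads the Windows Event Log
    Exec        to_json();        # serialize each event as JSON
</Input>

<Output logstash>
    Module      om_tcp
    Host        10.0.0.1          # the ELK machine's IP address (placeholder)
    Port        3515
</Output>

<Route eventlog_to_logstash>
    Path        eventlog => logstash
</Route>
```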

Basically, the NXLog configuration will collect Windows events and send them to the specified IP address on port 3515 using the TCP protocol.

The next step is to configure Logstash to listen on port 3515 and parse those events. The Logstash configuration is pretty straightforward: for the input, we will use a TCP plugin that listens on port 3515. NXLog transforms each event into JSON format, so we will use a JSON filter plugin. For the output, we will use an Elasticsearch plugin:
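A sketch of that Logstash configuration (the Elasticsearch host is an assumption; adjust it to your setup):

```
input {
  tcp {
    port => 3515            # matches the port NXLog sends to
  }
}
filter {
  json {
    source => "message"     # parse the JSON event NXLog produced
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
}
```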

After starting the ELK Stack and Nxlog on your Windows machine, you will notice that Logstash has already shipped some logs and that you can analyze them with Kibana.

Another important step is the configuration of the network access rules, which we already described in the section on how to install the ELK Stack. The only things that change are the port numbers. For example, we installed the ELK Stack on the same machine, so we can omit the rule for port 9200 and create rules only for the Logstash listen port and for Kibana.

A Final Note

In the past, storing and analyzing logs was an arcane art that required a large investment in the building and maintaining of a physical environment. Today, however, you can easily build a whole pipeline of log shipping, storing, and presentation with a public cloud such as Azure and an open-source project such as the ELK Stack. We invite you also to take the next step and learn how to run a scalable ELK Stack in production.

Logz.io is a predictive, cloud-based log management platform that is built on top of the open-source ELK Stack. Start your free trial today!



How to Install the ELK Stack on Azure was first posted on April 21, 2016 at 11:48 am.
