Real-time log analytics with ELK, powered by Swisscom Application Cloud

In this blog post, we will show you how to use the power of Elasticsearch, Logstash, and Kibana (ELK) to get real-time insights for your app running on the Swisscom Application Cloud.

Each component of the ELK stack is a separate open-source project driven by the vendor Elastic. Although the components are built to work extremely well together, their responsibilities are clearly separated: Logstash receives, parses, and forwards logs to Elasticsearch; Elasticsearch is a search server based on Apache Lucene that provides a full-text search engine with an HTTP interface; and Kibana analyzes and visualizes the data in a nice web UI.

[Image: overview of the ELK stack]

In this blog, we’ll show you how to log within a Node.js application, and visualize the output with the new ELK service.

This blog post reuses the base Node.js app described by Marco.
Therefore, an account on developer.swisscom.com, as well as an installed Cloud Foundry CLI, is a prerequisite.

Create an ELK Service
To get started, we need to create an ELK service – a really easy step. 🙂
You can either do this via the CLI with “cf create-service elk small testelk” or via our portal, following the wizard there.
Additional technical information for our ELK Service can be found in our docs. You can find prices and plan details on our product page.

Prepare demo application
The demo application that we are using is available on GitHub.
To interact with the ELK stack, we use a Node.js library called winston, together with winston-logstash, which sends logs directly to Logstash. To add them to our Node.js project, we list them under dependencies in our package.json file:

{
  "name": "nodejs-elk-sample",
  "version": "0.0.1",
  "author": "Lukas Lehmann",
  "main": "server.js",
  "engines": {
    "node": "4.2.3"
  },
  "dependencies": {
    "cfenv": "1.0.3",
    "express": "^4.13.3",
    "winston": "2.1.1",
    "winston-logstash": "0.2.11"
  },
  "scripts": {
    "start": "node server.js"
  }
}

Within the code, we need to initialize winston and configure the connection to Logstash. The code for this is:

var express = require('express'); // web framework for Node.js, quite powerful
var app = express();

var winston = require('winston');
//
// Requiring `winston-logstash` will expose
// `winston.transports.Logstash`
//
require('winston-logstash');

var cfenv = require('cfenv');
var appEnv = cfenv.getAppEnv(); // get all Cloud Foundry environment variables
var serviceEnv = appEnv.getService('testelk'); // get credentials for the service named "testelk"

// configure the Logstash connection
winston.add(winston.transports.Logstash, {
    port: serviceEnv.credentials.logstashPort,
    host: serviceEnv.credentials.logstashHost
});

app.get('/', function (req, res) {
    // send headers at info level to ELK
    winston.info(req.headers);
    // return an HTTP 200 response
    res.send('Hello World!');
});

app.get('/blocked', function (req, res) {
    // send headers at warning level to ELK
    winston.warn(req.headers);
    // return an HTTP 401 response
    res.status(401).send('Something blocked!');
});

app.get('/broke', function (req, res) {
    // send headers at error level to ELK
    winston.error(req.headers);
    // return an HTTP 500 response
    res.status(500).send('Something broke!');
});

var port = process.env.PORT || 3000; // use the port from the environment variable (the cloud provides one) or fall back to 3000
app.listen(port); // tell Node.js to listen on this port and respond to requests

console.log('I am ready and listening on %d', port); // write something nice to the console

Winston supports different log levels, which makes it easy to distinguish severities. To see that in action, the application above defines three routes, each of which sends logs at a different level to Logstash.
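How levels interact with a configured threshold can be sketched in a few lines. This is an illustrative model, not winston’s actual implementation: winston 2.x defaults to npm-style levels, where a lower number means a higher severity, and a logger or transport set to a given level emits only entries at that severity or higher.

```javascript
// npm-style log levels as used by winston 2.x by default:
// lower number = higher severity
var levels = { error: 0, warn: 1, info: 2, verbose: 3, debug: 4, silly: 5 };

// An entry passes a threshold if its severity is at least as high
// (i.e. its numeric level is less than or equal to the threshold's).
function shouldLog(threshold, entryLevel) {
  return levels[entryLevel] <= levels[threshold];
}

console.log(shouldLog('info', 'warn'));  // a warn entry passes an info threshold
console.log(shouldLog('info', 'debug')); // a debug entry is filtered out
```

So with the default threshold, all three routes above (info, warn, error) end up in Logstash, while anything logged at debug level would be dropped.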

As the demo application is now fully prepared, we are ready to push it to the cloud. You can either clone our example GitHub repo with “git clone” and then run “cf push”, letting the properties in manifest.yml handle the process automatically.

Or simply use “cf push myELKSampleApp --no-start” within the root of your project directory to push the application to the cloud without starting it. After that, bind the initially created ELK service instance to your app so that all the service’s environment variables become available: “cf bind-service myELKSampleApp testelk”. Finally, run “cf start myELKSampleApp” to start the demo application.

Binding the ELK service instance to your app automatically forwards everything that goes to stdout – i.e. everything you see with “cf logs ..” – to Logstash.
See Marco’s blog post, section “Move into the Cloud”, for a more detailed introduction on how to push apps and bind services.

 

First steps with your ELK Service

To interact with our ELK service instance, we first connect to its dashboard, Kibana, where we get all the information about our logs. To do so, we connect to the Kibana endpoint using the service connector. Please install the service-connector plugin with “cf install-plugin” as described in the documentation.

Now you need to get the credentials of your ELK service by reading the environment variables of your app with “cf env myELKSampleApp”:

 "VCAP_SERVICES": {
  "elk": [
   {
    "credentials": {
     "elasticSearchHost": "rnsm5lxbs0ckyx9r.service.consul",
     "elasticSearchPassword": "o963j10Bg8big11R",
     "elasticSearchPort": 44939,
     "elasticSearchUsername": "KI5pIk1sp5iRnqcA",
     "kibanaPassword": "o96aj10Bg8big11R",
     "kibanaUrl": "http://1uyyxrnomelfmia3.service.consul:40512",
     "kibanaUsername": "KI5pIq1sp5iRnqcA",
     "logstashHost": "g2w58vjylw8mksh9.service.consul",
     "logstashPort": 41088,
     "syslog": "syslog://g2w58vjaaw8wksh9.service.consul:41058"
    },
    "label": "elk",
    "name": "testelk",
    "plan": "beta",
    "syslog_drain_url": "syslog://g2w58vjylw8wksh9.service.consul:41058",
    "tags": []
   }
  ]
 }
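The “appEnv.getService("testelk")” call in the application code resolves these credentials for us. Conceptually (and simplified – the real cfenv module also supports lookup by regular expression and exposes more helpers), it parses the VCAP_SERVICES JSON and picks the service instance with the matching name. A sketch with shortened dummy credentials:

```javascript
// Dummy VCAP_SERVICES content, shaped like the cf env output above
// (hostname and port are placeholders, not real credentials).
var vcapServices = JSON.stringify({
  elk: [{
    name: 'testelk',
    credentials: { logstashHost: 'example.service.consul', logstashPort: 41088 }
  }]
});

// Flatten all service instances across labels, then match by name --
// roughly what cfenv's getService(name) does under the hood.
function getService(vcapJson, serviceName) {
  var services = JSON.parse(vcapJson);
  var all = Object.keys(services).reduce(function (acc, label) {
    return acc.concat(services[label]);
  }, []);
  return all.filter(function (s) { return s.name === serviceName; })[0];
}

var testelk = getService(vcapServices, 'testelk');
console.log(testelk.credentials.logstashPort); // 41088
```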

Start the console, then run the service connector with a local port and the location of the service (host:port or ip:port), copying the credentials you just received from the cf env command,
e.g.: “cf service-connector 8888 1uyyxrnomelfmia3.service.consul:40512”.

Now open your browser and go to http://localhost:8888, where you need to log in with kibanaUsername and kibanaPassword.

If you have successfully logged in – congrats! You have just set up a full Elasticsearch/Logstash/Kibana environment and bound it to your app!

 

Configuration

Now you will see the index creation page, where we specify the following properties:

[Screenshot: index configuration in Kibana]

To find out more about the different indices, check out our docs.

Now you see an index pattern, which was generated based on your input. Click on “Discover” at the top to get your first Kibana view!

We have now fully configured the Kibana view and can discover our logs. If you generate some traffic by accessing your app (e.g. http://node123elk.scapp.io/blocked), you will see the requests. Every request to our app generates two log entries: one from the router and one from our winston logs:

[Screenshot: router and winston log entries in Kibana]

As winston sends all its logs in JSON format, they are parsed automatically.
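This is the key property that makes the rest of the tutorial work: because each entry arrives as one valid JSON document, the receiving side can recover structured fields instead of grepping plain text. A rough sketch of the idea – the exact field names on the wire depend on the transport and its version, so treat this shape as illustrative:

```javascript
// An illustrative log entry, roughly the shape winston produces for
// winston.warn(req.headers): a level plus the structured payload.
// (Field names here are for illustration, not the exact wire format.)
var entry = {
  level: 'warn',
  message: { host: 'node123elk.scapp.io', 'user-agent': 'curl/7.43.0' }
};

// On the wire it is a single JSON line ...
var line = JSON.stringify(entry);

// ... so Logstash/Elasticsearch can index each field individually.
var parsed = JSON.parse(line);
console.log(parsed.level);        // "warn"
console.log(parsed.message.host); // "node123elk.scapp.io"
```

A plain-text log line would instead need a custom Logstash grok pattern before its fields could be queried.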

 

Visualize Logs

As we now receive properly parsed logs, we are able to play around and create charts or dashboards. One simple example is a pie chart segmented by the returned HTTP status codes:
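Under the hood, such a pie chart is backed by an Elasticsearch terms aggregation, which buckets documents by the distinct values of a field. A minimal sketch of that query body, built as a plain object – note that the field name “status_code” is an assumption and depends on how your logs were parsed into the index:

```javascript
// Minimal sketch of the kind of query body behind a status-code pie chart:
// a terms aggregation over the (assumed) "status_code" field.
var query = {
  size: 0, // we only want the aggregation buckets, not the matching documents
  aggs: {
    status_codes: {
      terms: { field: 'status_code' } // one bucket per distinct status code
    }
  }
};

// This JSON would be POSTed to the index's _search endpoint.
console.log(JSON.stringify(query));
```

Kibana generates queries like this for you from the visualization editor, so you rarely need to write them by hand.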

[Screenshot: Kibana dashboard with a pie chart of HTTP status codes]

With a little practice, you can create really impressive dashboards. Here is an example of a dashboard which monitors various microservice applications (screenshot of a test system):

[Screenshot: myCloud overview dashboard in Kibana 4]

 

We look forward to feedback from you, and screenshots of dashboards, which you have built. 🙂

Sources to build up knowledge

Swisscom Application Cloud Docs: http://docs.developer.swisscom.com/services/offerings/elk.html

Kibana visualizations: https://www.elastic.co/guide/en/kibana/current/visualize.html

Kibana: https://www.elastic.co/guide/en/kibana/current/index.html

Kibana essentials: http://www.amazon.com/Kibana-Essentials-Yuvraj-Gupta/dp/1784394939

Elasticsearch API Docs: https://www.elastic.co/guide/en/elasticsearch/reference/2.x/docs.html