ELK Stack with BizTalk: Part 1: Introduction about ELK Stack

Recently I have been looking into using the ELK stack as a log management platform for our BizTalk environment. This blog series is about my experience and exploration of using the ELK stack with BizTalk applications.

I am going to write this blog post in multiple parts:

Part 1: Introduction about ELK Stack

Part 2: Notes on installing ELK stack in Windows

Part 3: How to use ELK stack in a BizTalk integration use case/project.

In this part 1 of the series, I am going to explain the ELK stack.

What is the ELK stack (Elastic Stack)? Why do we need ELK?

ELK stack is a log management platform.

When I describe the ELK stack as a log management platform, we should first understand what a log is. Logs can be broadly divided into two categories – application logs and business metrics. Application logs are generated by different services about their operation, while business metrics relate to business data, like total sales, sales value, high-sale periods, etc. A log management platform should provide you a way to manage all these different types of logs. Take a BizTalk environment as an example. Vendors post order messages through HTTP POST. The logs here include the IIS logs for the HTTP POST, the vendors' order messages themselves, and event log entries. Each log follows its own format.

An IIS log would be something like this:
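A typical entry in IIS's W3C extended log format looks something like this (the field names come from the standard W3C format; the values below are made up for illustration):

```
#Fields: date time s-ip cs-method cs-uri-stem sc-status time-taken
2017-03-30 09:12:44 192.168.1.10 POST /VendorOrders/Order.svc 200 312
```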


XML/Vendor order message is something like this:

XML Message
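As an illustration (the element names and namespace here are purely hypothetical), a vendor order message might look like:

```xml
<Order xmlns="http://example.com/vendor/order">
  <OrderId>1001</OrderId>
  <Vendor>Acme Ltd</Vendor>
  <OrderDate>2017-03-30</OrderDate>
  <Lines>
    <Line>
      <Sku>AB-123</Sku>
      <Quantity>5</Quantity>
      <UnitPrice>9.99</UnitPrice>
    </Line>
  </Lines>
</Order>
```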

And an event log entry is something like this:
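For example, a BizTalk error in the Windows Application event log looks roughly like this (the event ID and message are abbreviated and may vary by scenario):

```
Log Name:  Application
Source:    BizTalk Server
Event ID:  5754
Level:     Error
Message:   A message sent to adapter "WCF-BasicHttp" on send port
           "SendVendorOrders" is suspended. ...
```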


So we end up with a diverse collection of logs in different formats, scattered within the same machine/server and also across different machines.

A log management platform should provide ways to collect, store and analyse logs.

The ELK stack provides actionable insights in real time from almost any type of structured and unstructured data source. ELK is an acronym for Elasticsearch, Logstash and Kibana.

Elasticsearch (E), Logstash (L), Kibana (K). Logstash collects and parses logs, and then Elasticsearch indexes and stores the information. Kibana then presents the data in visualisations that provide actionable insights into one's environment. Going by the data flow it should really have been called LEK, but it was named ELK when it started. ELK is a free, open source stack for log management, with commercial support and managed solutions available. Elastic also provides commercially supported training and plugins for security and monitoring.

Elasticsearch, from a company called Elastic, joined with Logstash and Kibana to form the ELK stack. Later, Packetbeat joined to form Beats (lightweight data/log shippers that feed Logstash and Elasticsearch), and the stack was renamed the Elastic Stack, comprising Elasticsearch, Logstash, Kibana and Beats. But because of the popularity of the old name, I will continue to say ELK stack, even though it has now been renamed the Elastic Stack.

In a snapshot, the value lies in converting these unstructured and structured log files into more meaningful dashboards and searchable data.

ELK Stack Architecture:

Logstash can collect and parse logs from different sources and store them in Elasticsearch, and Kibana provides a visualisation of that data.

ELK Stack Architecture

ELK Architecture – Logstash

As mentioned, Logstash collects and parses logs. The Logstash event processing pipeline has three stages:

Input –> Filter –>  Output

Logstash collects logs from a variety of sources through the various input plugins available,

processes the data into a common format using filters,

and streams the data to a variety of destinations using the various output plugins available.
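The three stages above map directly onto a Logstash pipeline configuration file. A minimal sketch, assuming IIS logs as the input and a local Elasticsearch instance as the output (the file path, grok pattern and index name are assumptions for this example):

```
input {
  file {
    path => "C:/inetpub/logs/LogFiles/W3SVC1/*.log"
  }
}

filter {
  # Parse the raw IIS log line into named fields
  grok {
    match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{IP:client} %{WORD:method} %{URIPATH:uri} %{NUMBER:status}" }
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "iis-logs-%{+YYYY.MM.dd}"
  }
}
```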

Logstash Architecture

ELK Architecture – Elasticsearch


Elasticsearch is a document-based search engine. A log entry can be a document, a tweet can be seen as a document, and log data from BizTalk is seen as a document.

It's JSON based – Elasticsearch stores documents in JSON format and uses JSON to interact with other components.
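For instance, an IIS log line parsed by Logstash ends up in Elasticsearch as a JSON document along these lines (the field names are illustrative):

```json
{
  "@timestamp": "2017-03-30T09:12:44Z",
  "client": "192.168.1.10",
  "method": "POST",
  "uri": "/VendorOrders/Order.svc",
  "status": 200
}
```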

It's built on top of Apache Lucene – the core information retrieval (IR) library used, a powerful Java-based library.

It's schema free – with Elasticsearch, there is no need to define your data up front, i.e. you don't need to declare that there is a field called "status" of type "integer". To get documents into Elasticsearch, you don't have to define the fields first. But sometimes schemas can be useful: if you are building a graph and want a field treated as an integer for aggregation, you can still define that explicitly. Elasticsearch is flexible enough to allow both.
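When you do want an explicit schema, you define a mapping. A sketch against the mapping API of Elasticsearch 5.x (current at the time of writing), using the index and type names from the earlier examples as assumptions:

```
PUT /iis-logs/_mapping/logs
{
  "properties": {
    "status": { "type": "integer" }
  }
}
```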

It's distributed by design – since it's going to be the data store, you can scale it horizontally by adding more nodes, which also gives high availability, or scale it vertically by running it on more powerful machines.

Elasticsearch is a NoSQL database with strong focus on searching the stored data.

It's API centric – there are RESTful APIs to search, to index (post) documents, to get the internal statistics of Elasticsearch, and to check the health of a node.
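A few request sketches to illustrate (the `iis-logs` index and `logs` type are assumptions carried over from the earlier examples):

```
GET  /_cluster/health                  # health of the cluster and its nodes
GET  /_nodes/stats                     # internal statistics
POST /iis-logs/logs                    # index (post) a new document
GET  /iis-logs/_search?q=status:500    # search the stored documents
```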

ELK Architecture – Kibana

Kibana is a browser-based analytics and search dashboard for data stored in Elasticsearch. We can create visualisation charts and dashboards. It's similar to Power BI for analytics, but more search based.



This is the end of part 1 of this blog series. In the next post of the series, I'll give notes on installing the ELK stack in a Windows environment.


Posted in: BizTalk, Elasticsearch, ELK Stack, Kibana, Logging, Logstash – March 30, 2017


M.R.ASHWINPRABHU is the founder and CEO of Fortuvis Systems Limited, a consulting company specialised in Microsoft technologies. Ashwin is a highly experienced integration consultant who works with clients to deliver high quality solutions. He works as technical lead developer, application architect and consultant, specializing in custom applications, enterprise application integration (BizTalk), Web services and Windows Azure.
