January 27, 2021

Beating around the logs with Elastic

Over the past few weeks and months I've been spending more and more time with Elasticsearch. And honestly, every time I try a new part of the Elastic product stack, my love for Elastic grows larger and larger.

We've been discussing ways to get a better overview of, and more insight into, where problems exist in our deployed applications: how many errors occur, where, why, when and so on. We already have some basic logging in place and we can read the logs when we need to, but we currently have no good way to see at a glance what is happening without checking every application individually, which is just too much work.

The idea we have right now is to index all our logs (application logs, system logs, database logs and so on) into Elasticsearch and create visualizations and alerts where needed.
Indexing in Elasticsearch is not all that hard. There are a lot of options, but very simply put, you send a JSON document to an index in Elasticsearch.
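
As a minimal sketch, assuming a cluster on localhost:9200 and the official elasticsearch-py client (8.x); the index name and fields here are made up for illustration:

    from datetime import datetime, timezone
    from elasticsearch import Elasticsearch

    # Assumes an Elasticsearch node running locally; adjust the URL as needed.
    es = Elasticsearch("http://localhost:9200")

    # Index a single log event as a JSON document.
    # "app-logs" is a hypothetical index name; Elasticsearch creates it on first use.
    es.index(
        index="app-logs",
        document={
            "@timestamp": datetime.now(timezone.utc).isoformat(),
            "level": "ERROR",
            "message": "Database connection timed out",
        },
    )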

But creating scripts to read, parse and index all those logs is not that easy, considering how much the logs differ in format and usage. Luckily, Elastic has already done a lot of that work for us.

Beats

Elastic has created a set of applications under an umbrella they call "Beats". Different Beats read different types of input, but they all have the same goal: shipping the data they take in to a second application that will store and use it, such as Elasticsearch.

Filebeat

One of these Beats is Filebeat. As the name suggests, it reads from files and sends the parsed data to a central destination. You run Filebeat and configure it: where it can find the logs and where to send the output.

You can point it at any files you want it to read, but it also ships with module presets for applications like nginx, MySQL, Apache httpd and others.
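
As a rough sketch, a minimal filebeat.yml could look like this (the log path and the Elasticsearch host are assumptions; adjust them to your setup):

    # Read plain log files from a hypothetical application log directory.
    filebeat.inputs:
      - type: log
        paths:
          - /var/log/myapp/*.log

    # Send everything to a local Elasticsearch instance.
    output.elasticsearch:
      hosts: ["localhost:9200"]

Modules are enabled separately, for example by running "filebeat modules enable nginx".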

Heartbeat

Monitoring uptime? Elastic has got you covered. Give Heartbeat a list of hosts to check and the interval at which to check them, and you're set.
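
A heartbeat.yml sketch along the same lines (the endpoints below are placeholders):

    # Check two hypothetical endpoints every 30 seconds.
    heartbeat.monitors:
      - type: http
        schedule: '@every 30s'
        urls: ["https://example.com/health"]
      - type: icmp
        schedule: '@every 30s'
        hosts: ["db.internal"]

    output.elasticsearch:
      hosts: ["localhost:9200"]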

Metricbeat

Chances are you want to know how your services are doing. Metricbeat can provide insight into things like memory, CPU, load and so on.

Simply tell Metricbeat what to check and how often, and it can send the results to Elasticsearch.
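
Again as a sketch, a metricbeat.yml using the system module (the metricsets and period are just one reasonable choice):

    # Collect basic system metrics every 10 seconds.
    metricbeat.modules:
      - module: system
        metricsets: ["cpu", "memory", "load"]
        period: 10s

    output.elasticsearch:
      hosts: ["localhost:9200"]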

Visualization

Now, all this data is very cool. But unless you want to spend your days wading through tons and tons of JSON and figuring out every result yourself, you probably want to make it visual.

Hooking Kibana up to the indices created by all the Beats can provide you with tables, charts, monitoring alerts and more.

If you find that Kibana is not the tool for you and your visualization needs are specific, you can always build your own: use Elasticsearch to query the logs and create whatever frontend you need.
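
As a sketch of what that could look like, using the same assumed Python client and the hypothetical app-logs index from earlier:

    from elasticsearch import Elasticsearch

    es = Elasticsearch("http://localhost:9200")

    # Fetch the ten most recent ERROR-level events from the hypothetical app-logs index.
    response = es.search(
        index="app-logs",
        query={"match": {"level": "ERROR"}},
        sort=[{"@timestamp": {"order": "desc"}}],
        size=10,
    )

    for hit in response["hits"]["hits"]:
        print(hit["_source"]["@timestamp"], hit["_source"]["message"])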

You have the data, you have the power.

More beats

There are more Beats available from Elastic, and you can build your own or use Beats built by others.
Check out the list of community Beats.

Conclusion

This post was not meant as a full tutorial so much as a tip to check out these tools, simply because in the past I spent way too much time building things myself before I found them. I hope that by sharing this article I can save others from making the same mistakes.

Elastic offers a lot of great tools for ingesting, parsing, transforming and visualizing data. Give them a look. :)

Thanks for reading.
