Azure Data Explorer Magic
Hello You,
In today’s blog, we will talk about Azure Data Explorer and how we can use it to ingest logs and query data using a fast and efficient query language called Kusto Query Language (KQL).
The idea started when I was solving some HTB Sherlock challenges and was provided with logs like Zeek logs and Windows Event Logs. I had to use tools such as jq, Get-WinEvent, and grep.
To be honest, I didn’t get why we were using these kinds of tools when we have powerful query languages like KQL and SPL.
That’s when I started looking for a way to ingest the logs I had into some kind of data analytics tool, and the first thing that came to mind was Azure Data Explorer.
So, let’s get to it.
Ingesting Logs to ADX
First, you need an Outlook account to sign in to https://dataexplorer.azure.com/publicfreecluster and create a free cluster.
After creating a cluster and a database, you can either create a table with the schema you want and then ingest data into it, or you can just ingest a local file (you can ingest up to 1,000 files, each smaller than 1 GB).
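If you go the manual-schema route, creating the table is just a KQL management command. Here’s a rough sketch of what a table for Zeek’s conn.log could look like (the table name, the flattened column names, and the types are my own assumptions; the ingestion wizard can also infer a schema for you):

```
// A minimal table covering a handful of conn.log fields (names/types are assumptions).
// Zeek's ts is epoch seconds, so it's kept as real and converted at query time.
.create table ZeekConn (
    ts: real,
    uid: string,
    id_orig_h: string,
    id_orig_p: int,
    id_resp_h: string,
    id_resp_p: int,
    proto: string,
    service: string,
    duration: real,
    orig_bytes: long,
    resp_bytes: long,
    conn_state: string
)
```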
In my case, I wanted to ingest the conn.log file from my Zeek logs, which is one of the most important logs Zeek creates.
Now, sometimes Azure Data Explorer won’t be able to recognize your data format if there are changes to it, but you can clean the data and get it into the right format.
Even though the file format is TSV, I got an error.
In my case, it was because of the leading #, so I asked ChatGPT to write me a script to clean all Zeek logs by removing the lines that start with #, except the one starting with #fields, to keep the column names.
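The script itself isn’t included here, but a minimal sketch of what that cleanup could look like in Python is below (the file paths are placeholders, and stripping the literal #fields token so the header row lines up with the data columns is my own assumption):

```
import sys

def clean_zeek_log(in_path: str, out_path: str) -> None:
    """Drop Zeek metadata lines (#separator, #types, #open, #close, ...),
    but keep the #fields line so the column names survive as a TSV header."""
    with open(in_path, "r", encoding="utf-8") as src, \
         open(out_path, "w", encoding="utf-8") as dst:
        for line in src:
            if line.startswith("#fields"):
                # Strip the literal "#fields" token so the header row has the
                # same number of tab-separated columns as the data rows.
                dst.write(line.split("\t", 1)[1])
            elif line.startswith("#"):
                continue  # any other metadata line gets removed
            else:
                dst.write(line)

if __name__ == "__main__":
    # e.g. python clean_zeek.py conn.log conn_clean.log
    clean_zeek_log(sys.argv[1], sys.argv[2])
```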
Then just go through the ingestion process again and upload your cleaned logs.
This time each column contained the correct data, and the rows and columns were properly aligned and formatted for ingestion.
Then you can query the data as you wish.
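For example, assuming the conn.log data landed in a table called ZeekConn with column names like the ones above (both are assumptions, not part of the original setup), a quick look at top talkers could be something like this:

```
// Hosts that received the most data according to conn.log
ZeekConn
| summarize Connections = count(), TotalOrigBytes = sum(orig_bytes) by id_resp_h
| top 10 by TotalOrigBytes

// Zeek's ts column is epoch seconds; convert it when you need real datetimes
ZeekConn
| extend Timestamp = unixtime_seconds_todatetime(ts)
| take 10
```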
And yeah this is it for today’s blog :3