Logs

Started by Lazybones, January 02, 2015, 11:51:58 PM

Lazybones

Having to frequently dive deep into our logs at work I have been doing some research on central log management.

In particular I have been looking into the ELK stack (Elasticsearch, Logstash, Kibana): http://youtu.be/GrdzX9BNfkg

Well, having some useful logging features on my home router, I dumped its iptables logs into the stack and noticed that there are multiple ongoing attacks from China and Russia against the RDP port on my home machine...
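For anyone curious what the normalization side looks like before the events ever hit Elasticsearch, here is a minimal Python sketch of pulling fields out of an iptables LOG line, which is roughly the job a Logstash grok filter does. It assumes the standard key=value output of the iptables LOG target; the output field names are just illustrative choices.

```python
import re

# Typical iptables LOG output is a run of KEY=VALUE pairs after the kernel prefix,
# e.g. "... kernel: IN=eth0 OUT= SRC=203.0.113.5 DST=192.168.1.10 PROTO=TCP SPT=51515 DPT=3389"
KV_PATTERN = re.compile(r"(\w+)=(\S*)")

def parse_iptables_line(line):
    """Turn one raw iptables LOG line into a dict of the fields worth indexing."""
    fields = dict(KV_PATTERN.findall(line))
    # Keep only the bits that matter for spotting things like RDP probes (DPT=3389).
    return {
        "src_ip": fields.get("SRC"),
        "dst_ip": fields.get("DST"),
        "protocol": fields.get("PROTO"),
        "dst_port": fields.get("DPT"),
        "in_interface": fields.get("IN"),
    }

if __name__ == "__main__":
    sample = ("Jan  2 23:51:58 router kernel: DROP IN=eth0 OUT= "
              "SRC=203.0.113.5 DST=192.168.1.10 LEN=60 PROTO=TCP SPT=51515 DPT=3389")
    print(parse_iptables_line(sample))
```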

Fun.

Tom

I had to turn off some "attack" notices in my firewall logs at one point. Just spam from bots looking for ports that I don't have open...
<Zapata Prime> I smell Stanley... And he smells good!!!

Lazybones

Aww, but in this case they were attacking something I did have open.

Tom

Quote from: Lazybones on January 03, 2015, 11:03:18 AM
Aww, but in this case they were attacking something I did have open.
Yeah, that's the annoying bit. Though I sometimes still ignore things like that, or in the case of persistent sources, I just outright block the addresses. I have two (or three?) IP-level blocks at home at the moment, and it stopped a fair amount of iptables log spam from showing up in syslog. :D I'm pretty sure one was from China, and the other may have been either Russia or Eastern Europe...
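For what it's worth, here is a rough sketch of how you could pick those persistent sources out of the parsed logs automatically; the threshold and the field names follow the parsing sketch earlier in the thread and are purely illustrative, and the actual block would still be an ordinary iptables rule.

```python
from collections import Counter

# Count how many dropped-connection log lines each source address generated,
# assuming the lines were already parsed into dicts (src_ip, dst_port, ...).
# THRESHOLD is an arbitrary example value.
THRESHOLD = 100

def persistent_sources(parsed_events, threshold=THRESHOLD):
    """Return source IPs that show up at least `threshold` times."""
    hits = Counter(event["src_ip"] for event in parsed_events if event.get("src_ip"))
    return [ip for ip, count in hits.items() if count >= threshold]

if __name__ == "__main__":
    events = [{"src_ip": "203.0.113.5", "dst_port": "3389"}] * 150
    for ip in persistent_sources(events):
        # The block itself would be an iptables rule, e.g.:
        #   iptables -A INPUT -s 203.0.113.5 -j DROP
        print(f"candidate for blocking: {ip}")
```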
<Zapata Prime> I smell Stanley... And he smells good!!!

Lazybones

Ya, I am far more interested in what was allowed to connect and the resulting actions on the applications.

Besides, visualizing the logs makes it so much easier to pick out problems at a glance.
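That "what was allowed to connect" question is basically one Elasticsearch aggregation, which is what a Kibana panel is drawing behind the scenes. A hypothetical sketch follows; the index pattern, the action and src_ip field names, and the local URL are all assumptions about the setup, not anything from the stack itself.

```python
import requests

# Of the traffic that was ACCEPTED, which source IPs show up most often?
# Uses the plain Elasticsearch _search endpoint with a terms aggregation.
ES_URL = "http://localhost:9200/logstash-*/_search"

query = {
    "size": 0,  # we only want the aggregation buckets, not individual hits
    "query": {"term": {"action.keyword": "ACCEPT"}},
    "aggs": {
        "top_sources": {
            "terms": {"field": "src_ip.keyword", "size": 10}
        }
    },
}

response = requests.post(ES_URL, json=query, timeout=10)
response.raise_for_status()
for bucket in response.json()["aggregations"]["top_sources"]["buckets"]:
    print(bucket["key"], bucket["doc_count"])
```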


Tom

Yeah, having some nice visuals helps.
<Zapata Prime> I smell Stanley... And he smells good!!!

Lazybones

So I have built this out for a good segment of my network at work now. I must say it is a lot of work normalizing the data and setting up rules to break apart the various log sources.

However, once it is done, it is rather amazing how fast it is. It is consuming about 349,464 logs per minute and 3,328,041 logs per day at the moment (not everything is in there yet), yet I can still get detailed results in about 2 seconds for almost any query. Multi-day queries build backward starting on the most recent day and fill in dynamically, meaning you often start to get results instantly and then the rest pops in a few seconds later.
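That "build backward" behaviour is roughly what you get by querying the daily indices newest-first instead of hitting the whole index pattern at once. A hypothetical sketch, again assuming the default logstash-YYYY.MM.DD index naming, a local URL, and an example query; none of that is specific to my setup at work.

```python
from datetime import date, timedelta

import requests

# Query one daily index at a time, newest first, so the most recent results
# show up immediately and older days "pop in" afterwards.
ES_BASE = "http://localhost:9200"
QUERY = {"query": {"term": {"dst_port": "3389"}}, "size": 100}

def search_backwards(days=7):
    for offset in range(days):
        day = date.today() - timedelta(days=offset)
        index = f"logstash-{day:%Y.%m.%d}"
        resp = requests.post(f"{ES_BASE}/{index}/_search", json=QUERY, timeout=10)
        if resp.status_code == 404:  # no index for that day
            continue
        resp.raise_for_status()
        yield day, resp.json()["hits"]["hits"]

if __name__ == "__main__":
    for day, hits in search_backwards():
        print(day, len(hits))
```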

If any of you have a logging project coming up and Splunk is out of budget, I would highly recommend investigating an ELK stack.