Posted by consoul on Jan 24, 2013 6:48 PM | Latest reply: Jan 29, 2013 1:59 PM by Jon Scholten

Quick and dirty log retrieval, don't judge.

While I have become quite adept at logging into each of our proxies ten times an hour to troubleshoot and 'grep' logs, I have a much harder time teaching a few of our other guys to do the same.

The other day I finally tired of having one kind fellow ask me how to break out of a 'tail' with a panicked look in his eyes...so I just gave in and wrote up a quick ruby script. I am no programmer and I worry a bit about how hard this could get torn apart here, but here it is.

 

If you don't like ruby and don't have it installed this might not be for you. If you 'do' have ruby installed then you can likely script better than me and this serves you nothing. But, if you 'are' like me and dabble in ruby and want to save yourself some time this is an easy way out.

 

I wrote this in ruby 1.9.3, and you will need to install three gems:

gem install highline

gem install net-ssh

gem install colorize

 

Code:

#!/usr/bin/env ruby
require 'highline/import'
require 'net/ssh'
require 'colorize'

abort "Usage: ruby proxy-logs.rb <filter>" if ARGV[0].nil?

user = ask("Username: ") {|q| q.echo = true}
pass = ask("Password: ") {|q| q.echo = '*'}
#user = 'readonlyusername'
#pass = 'readonlyuserpassword'
hosts = ['ip','ip','ip','ip']
filter = ARGV[0]
access_log_path = '/opt/mwg/log/user-defined-logs/access.log/access.log'
# Build the filename once so the timestamp can't change between opening
# the file and printing its name at the end
fileout_name = "results-#{filter}-#{Time.now.strftime("%Y-%m-%d-%H-%M-%S")}.txt"
fileout = File.open(fileout_name, "w")

hosts.each do |host|
  host = host.chomp
  puts "Retrieving logs from #{host}".yellow
  Net::SSH.start( host, user, :password => pass ) do |ssh|
    fileinfo = ssh.exec!("ls -lh #{access_log_path}")
    output = ssh.exec!("grep #{filter} #{access_log_path} | tail -n 45")
    # Write plain text to the file; colorize only terminal output,
    # otherwise ANSI escape codes end up inside the results file
    fileout.puts "\r\n#{host} | #{fileinfo}#{output}"
  end
end

fileout.close
puts "Output file:   #{fileout_name}".green

 

Essentially, to run this you need to make one edit for your environment and you're up and running. I used 'highline' so the password is prompted for (masked with asterisks) rather than passed on the command line, which keeps it off the screen and out of your bash history.

(I later went back and created a read-only account on my proxies and statically stored the username and password in the script, which McAfee will tell you not to do and I would "not" recommend unless you have very "very" tight control of where this file ends up.)
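If you do go the stored-credentials route, one middle ground is to keep them out of the script itself. This is only a sketch of what I mean; the 'creds.txt' name and its two-line layout (username on the first line, password on the second) are just my own convention, nothing MWG-specific:

```ruby
# Sketch: load credentials from a separate file instead of hard-coding
# them in the script. Refuse to run unless the file is locked down to
# the owner (chmod 600), so a stray copy at least complains loudly.
def load_creds(path)
  mode = File.stat(path).mode & 0o777
  raise "#{path} must be chmod 600 (found #{format('%o', mode)})" unless mode == 0o600
  user, pass = File.readlines(path, chomp: true)
  [user, pass]
end

# user, pass = load_creds('creds.txt')
```

You'd then swap the two 'ask' lines for a single load_creds call.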

 

Save the above code into a text file such as 'proxy-logs.rb'. Put the IPs for your proxies into the 'hosts = ['ip','ip','ip','ip']' array (add or remove entries as needed; this also works with a single host). Then, once ruby and the three gems are installed, simply type 'ruby proxy-logs.rb filter' at the command prompt.

Whatever you type as the filter will be used to 'grep' out the last 45 matching lines in each access.log file. It also tells you when access.log was last written to and how large it is; we have ours set to roll at 150 MB, so if there are no results but I can see it just rolled, I know why.
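One caveat on the filter: it's dropped into the remote shell command as-is, so a filter containing spaces or quotes will mangle the grep. If that ever bites you, escaping it first is a small change. A sketch (the 'grep_command' helper name is just for illustration):

```ruby
require 'shellwords'

# Sketch: the filter and path go straight into a remote shell command,
# so escape them with Shellwords; otherwise a filter like "foo bar"
# turns into two grep arguments instead of one pattern.
def grep_command(filter, path, lines = 45)
  "grep #{Shellwords.escape(filter)} #{Shellwords.escape(path)} | tail -n #{lines}"
end
```

In the script you'd then call ssh.exec!(grep_command(filter, access_log_path)).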

 

The script will create an output file named 'results-whateveryoutypedasyourfilter-year-month-day-hour-minute-second.txt' in the directory you ran it from (e.g. 'results-google.com-2013-01-24-16-58-11.txt').

It also prints status messages as it works so you know what it's doing, and finishes by printing the output filename.

 

 

There are a lot of things this needs, as it's not really 'done', but it's working just fine for my co-worker so far. It needs code cleanup, and I will likely add a rescue loop in case one of the proxies times out during retrieval. I have a ton of ideas, but most of them are going into an entirely new tool with many more functions aimed at the web gateways. Comments, concerns, and smart remarks welcome.
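For what it's worth, the rescue loop I have in mind looks roughly like this. It's a sketch only; in the real script you would also rescue Net::SSH::AuthenticationFailed, which I've left out here:

```ruby
require 'timeout'
require 'socket'

# Sketch of the planned rescue loop: run the per-host retrieval and,
# if one proxy is down or times out, warn and move on to the next host
# instead of crashing the whole run. Returns the block's result, or
# nil when the host was skipped.
def fetch_with_rescue(host)
  yield
rescue Timeout::Error, SocketError, Errno::ECONNREFUSED, Errno::ETIMEDOUT => e
  warn "Skipping #{host}: #{e.class}: #{e.message}"
  nil
end

# Inside the hosts.each loop it would look something like:
#   result = fetch_with_rescue(host) do
#     Net::SSH.start(host, user, :password => pass) { |ssh| ... }
#   end
#   next if result.nil?
```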

 

 

