Web Gateway: Configuring Automatic Backups

Overview

The McAfee Web Gateway can create automatic backups and push them to a remote host. Using the flexibility of the Web Gateway, we can build a daily backup job that produces files with unique names and transfers them off box via FTP, giving Web Gateway administrators a simple off-appliance backup routine.

 

We will leverage the McAfee Web Gateway's log management functions to give the backup files unique, timestamped names and to push them to an external FTP server for a better disaster recovery solution, and we will use the Scheduled Jobs functionality to create a daily backup of your current configuration.

 

 

Creating a new File System logging entry

The first step is to create a new File System Logging entry. In your Web Gateway, go to Policy > Settings > Engines, right-click File System Logging, and select "Add...". Give this entry a name that lets you quickly identify it among your other logging settings, such as "Automatic Backup" in this example. A log name is also required, such as backup.log. Uncheck "Enable log buffering" and click OK to create the entry and close the window.

 

Rotating and Pushing the new backup.log file

Pushing the new backup.log off box is configured under the newly defined File System Logging entry: select the name defined in the previous step (Automatic Backup in the screenshot below). On the right-hand side, expand "Settings for Rotation, Pushing, and Deletion".

 

Under Settings for Rotation, Pushing and Deletion, check "Enable specific settings for user defined log". Under Auto Rotation, select both "Enable auto rotation" and "Enable scheduling of log file rotation (format: hh:mm)", then enter the time the rotation should occur. Keep in mind that your rotation time is important, as the scheduled backup job we will create must run before it (in this example the backup is created at 1 AM and the file is rotated and pushed at 2 AM).

 

Under "Auto Deletion", the example is configured to automatically delete unchanged log files after 2 days. Once this setup is operational, there is no need to keep automatic backup files on the appliance, as they are stored safely on your FTP server.

 

The last configuration needed here is to enable "Auto Pushing". Select the "Enable auto pushing" check box, define your destination server (an FTP server in this case), and make sure to check "Enable pushing of log files directly after rotation".


RotationDeletionPushSettings.jpg

 

 

Scheduling the automatic backup

The next task is to create a scheduled job. You can find the scheduled jobs section under Configuration > Central Management > Advanced Scheduled Jobs. Adding a new job here opens the Edit a Scheduled Job window, as seen below.

 

This scheduled job will be started daily and set to run at 1 AM (matching the plan above: the backup is created at 1 AM and the file is rotated and pushed at 2 AM).

 

Under Job settings, select "Backup Configuration" from the drop-down and give this job a "Unique job ID". The value is your choice, provided it does not conflict with any existing scheduled job.

 

Adding a job description is optional, but I find it helpful to note what the job is configured to accomplish, so that if a problem arises and further modification is needed, the job's intended function is clear.

 

Under "Parameter Settings", check "Use most recent configuration", and under "Save configuration to backup" define the full path to the actual backup.log file. The path must be entered as: /opt/mwg/log/user-defined-logs/backup.log/backup.log.

The duplicate "backup.log" in the /path/file.extension field is needed because the directory for these files is named backup.log (created automatically when the new File System Logging entry was added) and the log itself is also defined as backup.log (see the Creating a new File System Logging screenshot). Files created by this job will be named as follows: backup1309040200-10.10.76.10.log
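The timestamp portion of that name comes from the rotation time. A small sketch of how such a name is composed (the YYMMDDHHMM format and the appliance IP are inferred from the example filename above, so treat both as assumptions):

```shell
# Reconstructing the backup file name pattern from the example above.
# The format appears to be backup<YYMMDDHHMM>-<appliance IP>.log;
# the IP below is the example appliance's address, illustrative only.
APPLIANCE_IP="10.10.76.10"
STAMP="$(date +%y%m%d%H%M)"   # e.g. 1309040200 for Sep 4 2013 at 02:00
BACKUP_NAME="backup${STAMP}-${APPLIANCE_IP}.log"
echo "$BACKUP_NAME"
```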

 

ScheduledJobCreation.jpg

 

Restoring configuration from 'backup.log' file

It is important to note that the files created are not automatically found when performing a restore via the Web Gateway UI: the file browser looks for a .backup extension, as seen here:

 

bandrest.backup.jpg

 

 

Changing the "Files of type:" drop-down to All Files allows the Web Gateway to see the backup it should use for restoring. Here you can see that backup.log appears after the file type is changed. After the file is selected and Open is clicked, the restore works normally.

 

restore.log.jpg

 

 

Backup from the CLI

If you want to back up the appliance from the CLI, use the commands below.

 

Create a backup in /tmp called current.backup:

/opt/mwg/bin/mwg-coordinator -B "file:in=ACTIVE,out=/tmp/current.backup"

 

Create a backup in /tmp called current.backup with a password of BACKUPPASSWORD:

/opt/mwg/bin/mwg-coordinator -B "file:in=ACTIVE,out=/tmp/current.backup;options:password=BACKUPPASSWORD"
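Building on the two commands above, a minimal cron-able wrapper that produces uniquely named backups might look like the following sketch. The `echo` default for MWG_BIN is only so the script can be dry-run off the appliance; on a real Web Gateway you would point it at /opt/mwg/bin/mwg-coordinator.

```shell
#!/bin/sh
# Sketch of a cron-able nightly backup with a unique, timestamped name.
# MWG_BIN defaults to 'echo' so the script can be dry-run anywhere;
# on the appliance, set MWG_BIN=/opt/mwg/bin/mwg-coordinator.
MWG_BIN="${MWG_BIN:-echo}"
OUT_DIR="${OUT_DIR:-/tmp}"
STAMP="$(date +%Y%m%d%H%M)"
OUT_FILE="${OUT_DIR}/mwg-${STAMP}.backup"
"$MWG_BIN" -B "file:in=ACTIVE,out=${OUT_FILE}"
```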

 

Troubleshooting

If you are not seeing the backup files on your destination server, or the backup is not being created properly, there are some items to check.

 

Backup.log files not being created by Scheduled Job?

A log is created for scheduled jobs. You can find it in the Web Gateway user interface under Troubleshooting > Log files > scheduled-jobs > scheduled.log.

 

Backup.log file not being uploaded to configured server

If your backup.log is not showing up on the remote server, review mwg-logmanager.errors.log, located in the Web Gateway user interface under Troubleshooting > Log files > mwg-errors > mwg-logmanager.errors.log.
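If you prefer the shell, the same logs can usually be inspected directly on the appliance. The file paths below simply mirror the UI locations and are an assumption based on the standard /opt/mwg/log layout, so verify them on your own appliance:

```shell
# Assumed on-appliance file locations mirroring the UI paths above.
# These paths are an assumption based on the standard /opt/mwg/log
# layout; verify them on your own appliance before relying on them.
SCHED_LOG="/opt/mwg/log/scheduled-jobs/scheduled.log"
LOGMGR_ERR="/opt/mwg/log/mwg-errors/mwg-logmanager.errors.log"
for f in "$SCHED_LOG" "$LOGMGR_ERR"; do
  if [ -f "$f" ]; then tail -n 50 "$f"; else echo "not found: $f"; fi
done
```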

Comments
Regis

i saw Sven's excellent talk at FOCUS13 where he made mention of this -- love it.       Unfortunately, in true "death by security controls" fashion, I'm in an environment where FTP is verboten, what with those pesky plain text credentials and all.     As such, these wonderfully explicit ftp instructions have sadly failed me.

Is it possible to leverage https somehow, to a box that's running web reporter, but ... just not trying to index the file in any way?  

Or am I headed the way of a PER to request ssh become a supported transfer mechanism for the log archiving feature we're leveraging?      Or, is there a way to set a cron job up that won't run afoul of supportability?     I'm a bit sad because I felt THIS CLOSE to automatic config backup.   :-)

sroering

You could use the HTTP/S methods with a web server. Web Gateway uses curl to push logs, which uses the HTTP PUT method for sending via http/s. I did a quick google search for "apache http put" and found this article, which provides the basics for using a simple php script to accept HTTP PUT commands.

http://stackoverflow.com/questions/2934554/how-to-enable-and-use-http-put-and-delete-with-apache2-an...

cryptochrome

Backups, and for that matter pushing logs to a remote host, have always been a weak point in MWG in my opinion. Why? What Regis said. Because the protocols being used are not very common in "admin" scenarios. Who uses HTTP or FTP to move files around in an automated fashion? I was always puzzled by the lack of support for syslog in MWG6, for example. Network admins use Unix, Bash, SCP.

The best way to backup MWGs in my opinion is to do it remotely from another machine. Any machine that runs some flavor of Unix will do, even a Windows machine with Cygwin installed can do it. Just schedule a cron job that runs a script which in turn logs into the MWGs periodically and pulls backups.

/opt/mwg/bin/mwg-coordinator -B "file:in=ACTIVE"    <---- is your friend.
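A minimal sketch of that pull approach (the hostname is illustrative, and ssh/scp are stubbed with 'echo' so the sketch can be dry-run anywhere; a real backup server would use plain ssh/scp with key auth):

```shell
# Cron-driven pull of an MWG backup from a separate backup host.
# MWG_HOST is illustrative; SSH and SCP are 'echo' stubs for a dry
# run. On a real backup server, use plain ssh/scp with key auth.
SSH="${SSH:-echo ssh}"
SCP="${SCP:-echo scp}"
MWG_HOST="mwg1.example.com"
STAMP="$(date +%Y%m%d)"
$SSH "root@${MWG_HOST}" "/opt/mwg/bin/mwg-coordinator -B 'file:in=ACTIVE,out=/tmp/pull.backup'"
$SCP "root@${MWG_HOST}:/tmp/pull.backup" "mwg-${MWG_HOST}-${STAMP}.backup"
```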

Regis

Thanks for the responses all! 

I have submitted a product enhancement request using the process at https://kc.mcafee.com/corporate/index?page=content&id=KB60021   to add scp and sftp as options for this off box file transfer.       Any other customers care to join me by making a similar request so they know it's a measurable pain point?

Sroering, I agree you "could"  but I just can't religiously bring myself to trusting PHP to do anything securely, and though I love a good hack,  this is really something the product needs to support I think.   This level of gymnastics led me to the product enhancement request route.   Perhaps there's also opportunity to refine the "trick" portion of this otherwise helpful  how-to to include some specifics on how to support a secure transfer elsewhere.   

Sascha, I am intrigued by your suggestion, but it leads to another question:   does MWG support the creation of lower privileged accounts on the shell level?  I'm certainly not keen on leaving root-equivalent keys lying around to grab config backups from somewhere else unless I really really have to.    

Thanks for the discussion here. I'd love to see the PER get some traction, and in the interim, I'd be intrigued if someone with more research time than a poor operationally enslaved guy like me could hone a how-to for some specific bolt-on to an Apache server to do the https put thing in a safely authenticated manner. :-)

Thanks much!

cryptochrome

Regis wrote:

Sascha, I am intrigued by your suggestion, but it leads to another question:   does MWG support the creation of lower privileged accounts on the shell level?  I'm certainly not keen on leaving root-equivalent keys lying around to grab config backups from somewhere else unless I really really have to.   

Regis, MWG runs on Linux. So you can create users any way you like. Create a normal, low-privileged user. I am just not sure if such a user can run the /opt/mwg/bin/mwg-coordinator command. Easy enough to find out though. And if it can't, you'd have to alter your script slightly so it uses 'sudo' to call mwg-coordinator.

You could also do it the other way around:

  1. Create a low-privilege user and put it in the 'mwg' user group (this is important)
  2. Create a backup job on MWG (see sroering's how-to above). For the path and filename you specify /opt/mwg/temp/my_backup.zip
  3. Create a script and cron job that runs under the newly created user and pushes /opt/mwg/temp/my_backup.zip to a remote server, using SCP.
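Steps 2 and 3 might be sketched like this (the remote user, host, and destination path are illustrative, and scp is stubbed with 'echo' so the sketch can be dry-run):

```shell
# Push the scheduled-job backup off box via SCP, run from cron as the
# low-privileged user created in step 1. The remote host and path are
# illustrative; SCP is an 'echo' stub here for a dry run.
SCP="${SCP:-echo scp}"
SRC="/opt/mwg/temp/my_backup.zip"
DEST="backupuser@backups.example.com:/srv/mwg-backups/"
$SCP "$SRC" "$DEST"
```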

I am sure you can come up with even more ways of accomplishing this. It's Linux.

You are right though: This should be built right into the product.

Regis

Sascha Picchiantano wrote:


Regis, MWG runs on Linux. So you can create users all the way you like. Create a normal, low-privileged user. I am just not sure if such user can run the /opt/mwg/bin/mwg-coordinator command. Easy enough to find out though. And if it can't, you'd have to alter your script slightly so it uses 'sudo' to call mwg-coordinator.

Well, MEG runs on McAfee Linux too, but that product doesn't treat us administrators like trusted adults enough to so much as sneeze on the command line configuration, or have root without signing over your first born... so I figured I'd ask. :-)

As long as support wouldn't scream if we add command line users for such purposes, I can see a way probably to hack around this.  

If others who think this should be in the product could stoke the product enhancement fire a little so it trips the radar, so much the better though.     Having an ssh or scp way to get files up to MWG support would be nice too.    FTP just feels so dirty in 2013.

phlrnnr

Regis,

I submitted a PER for this in July, 2011, and it has been in "Under Review" status ever since.  So, I wouldn't get my hopes up :-(

malware-alerts

Regis, you asked if you could leverage WebReporter (now CSR).

It is possible (although not exactly pretty.)

When CSR is unable to process a log file it receives (because it is not in the standard 'access log' format), it moves the original log file to the folder '\Content Security Reporter\reporter\tmp\logparsing\dead'.

I configured my lab MWG to push its auto-backup file to my CSR over HTTPS and it works fine.

It's not pretty, because CSR actually tries to process the backup file as a standard log (so you waste some cycles there) before moving it to the 'dead' folder, but it gets the job done.

consoul

Here's my backup script if anyone wants to make their own and needs a start. I use key auth (ssh-copy-id is your friend) from a backup server to my web gateways and pull the backup.

http://pastebin.com/eaNfSRdg
