As we all know, McAfee Labs releases new DAT files every day.
I usually download those DAT files manually from the McAfee website.
Is there any way to download the DAT and SDAT files from the website automatically?
Here is a sample of the mail I receive for DAT files:
SUBJECT: McAfee Labs Dat Release Notification: 5900 Dat Files Released
The 5900 daily dat files have been released and are available for download.
The various 5900 daily dat file packages can be found at
Keep up-to-date on your McAfee products! Subscribe to McAfee's
Support Notification Service (SNS) to get timely technical info.
Go to: http://my.mcafee.com/content/SNS_Subscription_Center
For breaking security information from McAfee Labs visit:
McAfee Labs Blog
AudioParasitics - The Official PodCast of McAfee Labs
Sign up for McAfee Labs Security Advisories
I would suggest you begin by looking at third-party applications such as GNU Wget.
Please note that McAfee will not support the use of third-party scripting applications or download managers.
Customers use such solutions at their own risk.
Is there any other software?
Looking forward to your advice.
Following up from your previous post: http://community.mcafee.com/message/113555#113555
PLEASE READ AND UNDERSTAND THE SOLUTION BELOW THOROUGHLY.
Please note that this forum is not intended to educate in scripting. There are many other places that can help with this.
:: Basic Batch file for downloading the latest sdat
:: This script requires Windows 2000 or above.
:: Also, cURL.exe in 32 bit or 64 bit, depending on your system
:: see http://curl.haxx.se/ for more information.
:: see http://curl.haxx.se/docs/manpage.html
:: for command line parameter use.
:: see http://curl.haxx.se/docs/adv_20100209.html for
:: a security advisory.
:: This script is not intended to be complete or without flaws.
:: Use at Your Own Risk. No error recovery or checking is placed
:: within this script and any problems are your responsibility.
:: I ACCEPT NO RESPONSIBILITY OR LIABILITY OF ANY KIND.
:: This script is intended to be a starting point for those who
:: would like to expand on the idea of automating the download
:: of files where only a manual process already exists.
:: If McAfee does not wish this code to be disclosed,
:: Please remove.
:: Please note, this is not intended to be run more than once
:: per day, as these files are large, and this costs McAfee
:: substantially for misuse. Also, the SDAT files are not
:: intended for every day use, as McAfee supports ePO to handle
:: daily administration tasks.
:: Additionally, your rights to download files from McAfee's
:: sites are your responsibility and in no way do I condone or
:: suggest or even hint that these downloads are legal.
:: Please seek legal counsel.
:: Basic Strategy:
:: 1. download gdeltaavv.ini
:: 2. parse gdeltaavv.ini for current sdat version
:: 3. download current sdat file.
:: 1. Download gdeltaavv.ini
echo cURL.exe -K cURL.cfg http://update.nai.com/products/commonupdater/gdeltaavv.ini -o gdeltaavv.ini
cURL.exe -K cURL.cfg http://update.nai.com/products/commonupdater/gdeltaavv.ini -o gdeltaavv.ini
:: 2. Parse gdeltaavv.ini
:: skip=2 skips the header lines FIND prints before its matches; the
:: "CurrentVersion=NNNN" line is split on "=" so that %%n holds the version.
for /F "usebackq skip=2 delims== tokens=1,2*" %%m in (`Find /I "CurrentVersion" gdeltaavv.ini`) do set Curr=%%n
:: 3. Download current sdat executable
echo cURL.exe -K cURL.cfg -C - http://download.nai.com/products/licensed/superdat/english/intel/sdat%Curr%.exe -o sdat%Curr%.exe
cURL.exe -K cURL.cfg -C - http://download.nai.com/products/licensed/superdat/english/intel/sdat%Curr%.exe -o sdat%Curr%.exe
:: cURL statements above reference cURL.cfg.
:: Create a file called cURL.cfg using the text below,
:: or add the equivalent single-dash parameters to the cURL commands above.
# -L / --location: (HTTP/HTTPS) If the server reports that the
# requested page has moved to a different location (indicated
# with a Location: header line), this flag will make curl redo
# the request on the new place. If used together with -i or -I,
# headers from all requested pages will be shown. If this flag
# is used when making an HTTP POST, curl will automatically
# switch to GET after the initial POST has been done.
# ex. (first URL redirects to the second URL)
# --remote-time Use the remote file's date and timestamp, if
# available, on the local file written with the -o parameter
# --retry <num> Retry request <num> times if transient problems occur
# --retry-delay <seconds> When retrying, wait this many seconds between each
# --retry-max-time <seconds> Retry only within this period
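Putting those options together, a minimal cURL.cfg might look like the following. cURL config files take long option names without the leading dashes; the retry values here are only illustrative examples, not McAfee-recommended settings:

```
# follow Location: redirects
location
# stamp the downloaded file with the remote file's date and time
remote-time
# retry transient failures (example values, adjust to taste)
retry = 3
retry-delay = 30
retry-max-time = 300
```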
Hopefully this is enough to get you going.
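If batch is not a requirement, the same three steps (fetch gdeltaavv.ini, parse out the current version, download the SDAT) can be sketched in Python using only the standard library. This reuses the URLs from the batch script above and carries the same caveats: no error recovery, run at most once per day, use at your own risk.

```python
import re
import urllib.request

INI_URL = "http://update.nai.com/products/commonupdater/gdeltaavv.ini"
SDAT_URL = ("http://download.nai.com/products/licensed/superdat/"
            "english/intel/sdat{ver}.exe")

def current_version(ini_text: str) -> str:
    """Pull the CurrentVersion value out of gdeltaavv.ini text."""
    match = re.search(r"^CurrentVersion\s*=\s*(\S+)", ini_text,
                      re.MULTILINE | re.IGNORECASE)
    if match is None:
        raise ValueError("CurrentVersion not found in gdeltaavv.ini")
    return match.group(1)

def download_latest_sdat() -> str:
    """Download gdeltaavv.ini, parse the version, fetch the SDAT file."""
    with urllib.request.urlopen(INI_URL) as resp:
        ini_text = resp.read().decode("ascii", errors="replace")
    version = current_version(ini_text)
    filename = f"sdat{version}.exe"
    urllib.request.urlretrieve(SDAT_URL.format(ver=version), filename)
    return filename
```

Call download_latest_sdat() from a daily scheduled task rather than on demand, for the same reasons noted in the batch script's comments.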