Unfortunately, one man's 'best practice' may be unsuitable for another... Each environment is different, and your needs may differ from mine.
However, we've recently reviewed this ourselves, so here are some things to consider in your own environment; they may help you decide whether the risk is worth it:
Acknowledge that setting a policy for some (or all) of your machines is a balancing act between the ideal position and real-world needs. Ideally, all of us would scan every file all the time - but those pesky users also want to use the files... :)
For on-access scanning -
'Scan when writing to disk' should catch most (but not all) threats before they are written to disk. What you do at that point is up to you (you can elect to do nothing but report on it), but I suspect most people would not allow the infected file to be written. However, what if that infected file contains a transaction for a zillion (insert your currency here)? Maybe the risk of blocking the write is outweighed by the prospect of money...?
My setting: scan on writes (with clean or delete as the actions - if really needed I can retrieve a file from quarantine, because files are not actually deleted straight away under VSE 8.5 and above).
'Scan when reading from disk' - well, this depends. Enabling it ensures a scan every time the file is read, with the latest DAT (see below). However, on a server you may have thousands of users reading the same file multiple times a day - is all that scanning really necessary? This will depend on whether you also run scheduled on-demand scans, and how often. If you do - and they're fairly comprehensive - then you may consider the 'cost' of read scans not worth it. But don't assume that scanning a file when it was written caught all malware: the file may contain a threat that today's DAT can't find but tomorrow's can.
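To see why a read scan still matters after a clean write scan, here's a minimal sketch of a scan cache keyed by file state and DAT version. All names are hypothetical - this illustrates the idea, not VirusScan's real internals: a file cached as clean under one DAT must be rescanned once a newer DAT arrives.

```python
# Hypothetical sketch of an on-access scan cache, keyed by file identity
# and DAT (signature database) version. Illustrative only - not
# VirusScan's actual implementation.

class ScanCache:
    def __init__(self):
        # path -> (mtime, dat_version) under which the file scanned clean
        self._clean = {}

    def needs_scan(self, path, mtime, dat_version):
        """A read triggers a scan unless the file is unchanged AND was
        already scanned clean with the *current* DAT."""
        return self._clean.get(path) != (mtime, dat_version)

    def mark_clean(self, path, mtime, dat_version):
        self._clean[path] = (mtime, dat_version)


cache = ScanCache()
cache.mark_clean("report.doc", mtime=1000, dat_version=5601)

# Same file, same DAT: the cached verdict holds, no rescan.
print(cache.needs_scan("report.doc", 1000, 5601))   # False

# A new DAT arrives overnight: yesterday's 'clean' no longer holds, so
# the next read rescans - which is how scan-on-read can catch a threat
# that was written to disk before a detection existed for it.
print(cache.needs_scan("report.doc", 1000, 5602))   # True
```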
Scan network drives: I would NOT recommend this unless you really need it. Imagine a thousand connected users scanning a file as they read it across the network, then scanning it again before they write it to disk... I leave each machine to scan its own files.
Scan files opened for backup: when a backup application reads a file to back it up, it locks the file in a special way ('opened for backup'). What do you want your scanner to do at that point - ignore the file or scan it? Even if you scan it, find a virus and delete the file, it will likely still be backed up. For a single file that's bad enough, but what if you happen to be scanning a database? Do you want your scanner to delete one of the files needed for your whole backup to be valid? If you can, it's better to avoid scanning files while they are being backed up - try to schedule scans and backups so they don't overlap.
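The 'don't overlap' advice above comes down to simple interval arithmetic. A rough sketch (times and window choices are my own assumptions, not any McAfee feature) for checking whether a planned on-demand scan window collides with the backup window, including overnight windows:

```python
# Rough helper for the scheduling advice: does a planned scan window
# overlap the backup window? Times are minutes since midnight; an
# overnight window (start > end) is split into same-day segments.
# Purely illustrative - not part of any McAfee tooling.

def _split(start, end):
    """Split a possibly-overnight window into same-day segments."""
    if start <= end:
        return [(start, end)]
    return [(start, 24 * 60), (0, end)]

def windows_overlap(scan, backup):
    """Return True if the two (start, end) windows share any time."""
    return any(s1 < e2 and s2 < e1
               for s1, e1 in _split(*scan)
               for s2, e2 in _split(*backup))

backup = (23 * 60, 3 * 60)          # backup runs 23:00-03:00
print(windows_overlap((1 * 60, 2 * 60), backup))   # True  - 01:00-02:00 collides
print(windows_overlap((4 * 60, 6 * 60), backup))   # False - 04:00-06:00 is safe
```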
I hope that helps.
Thanks for spending time writing all that out :)
In your personal opinion, would you say activating Scan on Read and Scan on Write would cause more performance issues than simply having Scan on Write?
We often get complaints from users that when they first start their PCs in the morning, it takes forever to reach a usable state, largely due to VirusScan.
'Performance' is a tricky issue....
IMHO, what most people mean is 'will it slow down my machine?'
The short answer is yes - scanning on both reads and writes will 'slow' your machine more than scanning on writes alone. Each scan takes resources (disk or memory I/O, CPU time, etc.), and resources are finite, so two scans rather than one will leave fewer resources available for other things. However, if that resource would otherwise sit unused, it makes no practical difference.
I've spent LOTS of time benchmarking AV performance, trying to prove to others that the cost is negligible - and it really is. For a single given file, copying it and scanning it (on write only) at worst adds a minuscule amount of time compared with copying it unscanned. When you take into account that a cache is used for recently scanned files, a cost of, say, half a second (or less) on a copy that would normally take a minute is not a lot to pay - especially if it saves you the time of recovering the file later because it was infected or corrupted.
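For anyone wanting to reproduce that kind of benchmark, here's a minimal timing harness along the lines I used (my own sketch, nothing vendor-specific): run it once with on-access scanning enabled and once with it disabled - toggled in the AV console, not by the script - and compare the medians.

```python
import os
import shutil
import statistics
import tempfile
import time

def median_copy_seconds(size_bytes, repeats=5):
    """Median wall-clock time to copy a file of size_bytes to disk."""
    with tempfile.TemporaryDirectory() as tmp:
        src = os.path.join(tmp, "src.bin")
        with open(src, "wb") as f:
            f.write(os.urandom(size_bytes))
        samples = []
        for i in range(repeats):
            # A fresh destination each pass avoids hitting the
            # already-scanned cache mentioned above.
            dst = os.path.join(tmp, "copy_%d.bin" % i)
            start = time.perf_counter()
            shutil.copyfile(src, dst)
            samples.append(time.perf_counter() - start)
        return statistics.median(samples)

if __name__ == "__main__":
    # Run with scanning on, then off, and compare the two numbers.
    print("median copy time: %.4f s" % median_copy_seconds(50 * 1024 * 1024))
```

The median (rather than the mean) keeps one slow outlier - e.g. the first copy paying the scan cost while later ones hit the cache - from distorting the result.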
However, there still remains this perception that AV 'slows my machine'. What people fail to realise is that writing ones and zeros to spinning lumps of metal takes FAR longer - yet they happily accept that. :)
On starting PCs in the morning - yes, we get that complaint too. I've proven that power-on to logon-prompt time is barely affected by the AV (again, it's minuscule) - it's the logon-to-usable time that takes forever... Resolving network addresses, logging onto the domain, applying group policy, running logon scripts and many other things take the time. The anti-virus starting up is comparable to any other application starting (e.g. Word), APART FROM ONE THING: when updating to the latest DAT, at the point where the new DAT is actually loading (and shuffling the old one out of the way), there does appear to be a significant pause in other operations (maybe 5-10 seconds). Again, I've proven that with an UP-TO-DATE DAT the delay is minimal compared to no AV being present.
So one possibility would be to run DAT updates only AFTER people normally log on - however, we take the view that having the latest DAT asap is better and worth the delay at logon, particularly as we are a 24x7 shop and cannot legislate when people will actually log on. You may be lucky enough to have that option.
I think I've seen something from McAfee acknowledging they are aware of the issue (and the DAT size!) and hope to review it in future...
Thanks for the help.
Pretty much proven what I originally thought.
From reading the thread, is it worth having 'Opened for backup' on or off? I have just deployed Patch 1 for 8.7i, and the backup time of our file servers has increased tenfold.
Is this a viable option?
When writing to disk - enabled
When reading from disk - disabled
On Network drives - disabled
Opened for backup - disabled
Any help would be much appreciated.
I think you need to define your high- and low-risk processes, make the backup application a low-risk process, and then exclude it from scanning.
I set my update-at-logon tasks to run 15 minutes after logon.
I found this timeframe lets the PCs load up as normal and minimises the impact on users. When the task ran at logon with no delay, there was too much other stuff happening at the same time, which slowed things down; running it 15 minutes after logon was a good compromise.
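That delay-then-update pattern can be sketched as a trivial wrapper (purely illustrative - in practice you'd configure the delay in the product's scheduler or Windows Task Scheduler; the command below is a harmless placeholder, not the real updater):

```python
import subprocess
import sys
import time

def run_after_delay(command, delay_seconds):
    """Sleep past the logon rush, then launch the given command.
    `command` is whatever invokes your DAT update task; here it's just
    a placeholder so the sketch is runnable."""
    time.sleep(delay_seconds)
    return subprocess.run(command, check=False).returncode

# Placeholder command with a 1-second delay; in practice the delay
# would be 15 * 60 and the command would be the AV updater.
rc = run_after_delay([sys.executable, "-c", "print('update ran')"], delay_seconds=1)
print("exit code:", rc)   # exit code: 0
```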
From a developer (yes, a _user_!) point of view, I am going to have to disagree with you on the performance hit being unnoticeable! Our company has both read and write on-access scanning enabled, and I see McShield constantly hit 30-50 percent of the CPU. I would estimate (unscientifically) that I lose at minimum an hour a day to the on-access scanning. Opening VS2008 and then a large project, compiling, and updating the source repository is dreadful. Additionally, with some other tools I have seen a 28-minute load time with McShield enabled versus an under-2-minute load time with McShield disabled. I think there is a trade-off between being "safe" and complete nonsense. Additionally, what I cannot get through to Corp IT is that there is a difference between an M$ Office user, engineers using CAD, and software developers.
On the plus side, I get a couple of really nice lunches and breaks during the day while my computer is being safe for me.
This thread belongs in the VSE community. That said, I did read through it, and I can tell you that not scanning on read does pose a security risk. Some infections cannot be detected/cleaned with scan-on-read disabled. Also, perhaps when the machine was initially infected the DAT files did not have a detection for that threat, but a subsequent DAT release does. In that scenario the infection has already been written to the HDD, so the OAS would only pick it up if you had scan-on-read enabled.