Sorry for my English, my native language is Portuguese...
We have a central site where ePO is installed and 70 stores. Each store has a 256k link to the central site, and there are 30 systems in each store.
I need to deploy VirusScan and updates to all systems in every store...
What should I use?
Distributed repositories? SuperAgents? Agent Handlers?
Whatever you do, do not use an Agent Handler. I was considering one for us, but the manual says Agent Handlers use a lot of bandwidth, which you obviously don't want.
You can use a SuperAgent, but you'll need a computer that is always on with the ePO agent installed. There is no real setup for a SuperAgent, though; you just go to the desired computer in the System Tree and check the SuperAgent box. Each subnet will need a SuperAgent. If a subnet is segmented further with VLANs you may need one per VLAN, though I've not been able to test that.
Distributed repositories are the most flexible, as they will run on anything that can provide HTTP, FTP, or UNC access. However, you'll need to set up every single distributed repository manually: create the user account and create the folder you'll be pointing the ePO server at. With a couple of people helping, though, it won't take too long. If you use distributed repositories over FTP, watch out for this gotcha: the ePO server will not change directories to write the files it needs; it starts from the default folder you land in when you FTP into the server and writes to the folder you specify from there. This means if you want the files to go to ftp://serverip/mcafee/repository but your default FTP folder is ftp://serverip/someotherdirectory, the files will not end up where you expect.
Make sure you modify the site list policy if you use distributed repositories. By default, computers will ping every distributed repository in the closest 15 subnets (by logical layout, not your physical layout). I would change it to only 1 subnet (or maybe 0? I'm not sure which one means only the local subnet) so clients only look at the local repository and you won't need to make separate policies for all 70 sites. You do not need to modify the site list policy if you use SuperAgents, as SuperAgents only talk to computers on their local subnet.
In conclusion: Agent Handlers use too much bandwidth; distributed repositories are the most flexible but also difficult and time-consuming to set up; SuperAgents require the least amount of setup, but need a computer with the ePO agent that is always on. If possible, use SuperAgents. It's highly recommended that you only make a server a SuperAgent, although a computer that nobody uses is just as good. You should not make a computer that somebody uses a SuperAgent, as it can cause slowdowns and will be unreliable if the computer is shut down. I have read that there may be a minimum bandwidth of 512k, but I've not run into this issue when updating over a saturated T1 line.
A few things to note when setting up. Clients will still contact the ePO server for client tasks, policies, and some other items, but will get virus definitions and product updates from their local repository. Since bandwidth is an issue, you may want to set the systems to update outside of business hours and also randomize the update time within that window. By default, when the ePO server gets updates from the McAfee servers it will automatically perform a global update, which automatically updates all of your repositories as well; make sure you pull updates outside of business hours to avoid this. I'm not sure how you prevent a global update, as it's not an automated task. The initial update for us was about 300 MB, but that was pushing out all of the packages we had available, so the amount of data you need to send may vary. If your update is the same size and you can get the full 256k (I'm assuming kilobits) the entire time, the initial update will take about 2.6 hours.
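That 2.6-hour figure is just arithmetic, so here is a quick sketch you can rerun with your own numbers (it assumes the full 256 kbit/s is available the whole time and ignores protocol overhead):

```python
def transfer_hours(size_mb: float, link_kbit_s: float) -> float:
    """Rough transfer time in hours for size_mb megabytes over a link_kbit_s link."""
    size_kbit = size_mb * 8 * 1000  # 1 MB ~= 8,000 kilobits (decimal units)
    seconds = size_kbit / link_kbit_s
    return seconds / 3600

# ~300 MB initial update over a dedicated 256 kbit/s store link
print(round(transfer_hours(300, 256), 1))  # → 2.6
```

If other traffic shares the line, divide the link speed accordingly before estimating.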
Remember that you can set up everything in McAfee to only perform actions when you want, and you can also randomize the start time of those actions so your systems are not all hitting the server at once.
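To illustrate why the randomization matters, here is a plain Python sketch (my own, not part of ePO — the product does this for you via the task's randomization setting) of 30 clients each picking a random start offset inside a one-hour update window instead of all starting at minute zero:

```python
import random

random.seed(1)  # fixed seed so the example is repeatable
window_minutes = 60
clients = 30

# Each client independently picks a random start offset in the window,
# spreading the load on the repository across the hour.
offsets = sorted(random.randrange(window_minutes) for _ in range(clients))
print(offsets)
```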
I followed your suggestions!
I had an issue with repository sync timeouts, but now it's OK.
I used this configuration to increase the replication timeout when using very slow links:
HKLM > Software > Network Associates > ePolicy Orchestrator > Site Manager
The values there are decimal numbers: 24 means 24 hours, 60 means 60 minutes, and 30 means 30 seconds...
When you increase the registry values, the timeout for the communication is increased,
but I'm not 100% sure about this....
Thanks again, my friend!
One thing to point out is that you may have problems replicating to those repositories over 256k lines - presumably there will be other traffic using the bandwidth as well.
I would recommend looking at ePO 4.6 and MA 4.6, which are currently in limited release, so you might need to open a support case to request them: these offer the lazy caching function, where only the required files are transferred rather than a complete replication.
One idea I've been toying with for remote repositories...
Instead of installing a dedicated 'always on' computer at each site to host the repository, Iomega do a consumer device that can host USB data sticks or USB HDDs and present them as network storage. The whole setup would be approx 70 GBP per site, and consumer NAS devices are coming down in price every day.