The superagent repository can also hold the definition updates, so it saves bandwidth on those as well. The only things a superagent repository won't do are collect client properties and provide policies to the clients.
Clients are pointed to a superagent repository based on the settings in the agent policy that you assign to the client.
If you use multiple repositories, agents can determine the nearest repository based on either ping time or subnet value, or you can assign a list to the clients.
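The two selection modes can be sketched roughly like this. This is an illustrative Python sketch, not ePO's actual logic; the repository names, subnets, and ping times are made up:

```python
import ipaddress

# Hypothetical repository list: (name, subnet, ping_ms).
# Values are illustrative, not from a real ePO configuration.
REPOS = [
    ("branch-superagent", "10.1.0.0/16", 4),
    ("datacenter", "10.0.0.0/16", 35),
]

def pick_repository(agent_ip: str):
    """Prefer a repository on the agent's own subnet; otherwise fall
    back to the lowest ping time (the two modes described above)."""
    ip = ipaddress.ip_address(agent_ip)
    local = [r for r in REPOS if ip in ipaddress.ip_network(r[1])]
    if local:
        return min(local, key=lambda r: r[2])
    return min(REPOS, key=lambda r: r[2])
```

An agent on 10.1.x.x would match the branch superagent by subnet; an agent on an unlisted subnet would fall through to the ping comparison. Assigning an explicit repository list to clients, as described above, bypasses this selection entirely.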
Replication to the repositories will be done by a scheduled server task that you can run at any time you want. Keep in mind that if you have global updating enabled, the replication will also occur if a new product or update is added to the master repository.
My concern about the DATs and bandwidth is based on this blog:
"Some ePolicy Orchestrator users put a repository at geographic sites that have only a few dozen nodes. If your site does not have at least 200 to 300 nodes it cannot benefit from the bandwidth saved using a repository. If there is no local repository, the agents will go to the next nearest repository for their updates. This repository might be across a WAN link but it will still use less bandwidth since you don't have to replicate the entire repository across the WAN."
Just to follow up:
I created a superagent repository on this subnet a week or so ago and scheduled a replication task for the agent, products, and DATs at 5 AM, before work hours. DAT replication looks to be around 115 MB daily.
To keep agents outside of the subnet from using the repository, I created one agent policy that ignores that repository and uses only the data center repository with HTTP fallback. Agents in that subnet have a second agent policy that uses the local repository first, the data center repository second, and the HTTP fallback last.
So far, so good.