3 Replies Latest reply on Dec 13, 2010 6:43 AM by Attila Polinger

    Manually replicating repository data to remote sites




      I have 170 remote sites that will be managed by ePO.  Each of these sites will have its own SuperAgent repository due to enormous bandwidth constraints.  (Rural Australia)

      Currently, I have deployed to 20 of these sites.


      Already we are having great difficulty replicating Distributed Repository data each night through ePO, with the distributed repository replication task timing out and failing for a variety of reasons, all of which McAfee Support has attributed to bandwidth.


      The solution provided to us by Support is to manually schedule a repository replication task for each site, making sure they don't overlap.

      This is not feasible at our scale and not an adequate solution for our needs.


      The master repository is in a Data Center with a good connection (in relative terms).

      Despite this, we are unable to get more than 2-3 sites pulling repository data down simultaneously, and with 170 sites at 40+ minutes per pull, there is not enough time in the day to bring every site up to date.



      We have plenty of other data synchronization running from other servers in our Central Environment, and those jobs have no issues replicating gigabytes of data on a daily schedule.



      It seems very unfortunate that you cannot add randomization to scheduled server tasks in ePO, as I believe I could work around our replication issues by running a replication task for all the SAs a few times a day with a large randomization interval.




      With all of this out of the way, the real question:


      Is it possible to manually replicate the master repository to your SA distributed repository sites?

      Say, for example, using Robocopy.  I can easily write a script to robocopy the repository to each site in series, and with the robust nature of the tool we can overcome issues of bandwidth throttling and fluctuating latency.
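As a minimal sketch of the serial approach described above (share names and robocopy options here are assumptions, not a tested configuration for your environment):

```python
import subprocess

# Hypothetical paths -- substitute your master repository share
# and your own list of remote SuperAgent shares.
MASTER_REPO = r"\\epo-server\McAfeeRepo"
SITES = ["site01", "site02", "site03"]

def build_robocopy_cmd(site):
    """Build a robocopy mirror command for one remote SuperAgent share."""
    dest = r"\\%s\McAfeeRepo" % site
    return [
        "robocopy", MASTER_REPO, dest,
        "/MIR",            # mirror the tree: copy new/changed, delete orphans
        "/Z",              # restartable mode, survives dropped WAN links
        "/R:5", "/W:30",   # 5 retries, 30 s apart, for flaky links
        "/NP",             # no per-file progress (keeps the log readable)
        "/LOG+:replication.log",  # append everything to one log
    ]

def replicate_all(sites, run=subprocess.run):
    """Replicate to each site in series so only one pull saturates the pipe."""
    for site in sites:
        run(build_robocopy_cmd(site), check=False)
```

Running the sites in series keeps concurrency at one, which matches the observation that only 2-3 simultaneous pulls are sustainable; `/Z` and the retry options are what give robocopy its resilience on high-latency links.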


      I have not found any information on this and would love to get an outside opinion before I start messing with it.

      Is there anything specific about the way the repositories are replicated by ePO which would prevent me copying the data myself?

      If not, where would I copy the data from?

      Can I simply copy a working, 100% up-to-date SuperAgent repository from one SA over the top of the repository on all other SAs?

      Is there a specific file at each site which would need to be tweaked or excluded from being overwritten?

      Will the agents at the site recognize the local repository as up to date and OK to use?



      Thanks for your time and consideration

        • 1. Re: Manually replicating repository data to remote sites



          1. I created a test site on an ESX box and made a SuperAgent there, with 1 workstation in its subnet.
          2. I checked the repository status in ePO, which stated the repository was empty (good!)
          3. I checked another (production) distributed repository to ensure it was 100% up to date.
          4. I copied the entire repository from the file system on the production SA to the new test SA.
          5. I checked in the ePO, which listed the packages as now available on the distributed repository (good!)
          6. I manually kicked off an update task on the test workstation on the test SA's subnet



          The Agent Log shows the test workstation checking the local repository for updates!



          From this, I believe it will be OK for me to do all my replication using my own scripts rather than having ePO do it with Server Tasks.



          I have not yet marked this as resolved because I would really love to hear from someone else in regards to this... I don't want to replicate my repositories across the country and then find out I've broken everything!

          • 2. Re: Manually replicating repository data to remote sites

            The critical file is sitestat.xml (in the root). This contains the catalogue version of the repository, and the clients use it to check if it's at the latest expected version.
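To illustrate the version check described above, here is a hedged sketch. The exact schema of sitestat.xml is an assumption here (a root element carrying a `CatalogVersion` attribute); inspect a real repository before relying on it:

```python
import xml.etree.ElementTree as ET

# ASSUMED shape of sitestat.xml -- verify against an actual file.
SAMPLE = '<SiteStatus Status="Enabled" CatalogVersion="20101213073000"/>'

def catalog_version(xml_text):
    """Return the catalogue version string from a sitestat.xml payload."""
    root = ET.fromstring(xml_text)
    return root.get("CatalogVersion")

def repo_is_current(local_xml, master_xml):
    """A repository is current when its catalogue version matches the master's."""
    return catalog_version(local_xml) == catalog_version(master_xml)
```

The practical consequence for manual mirroring: if you copy the repository files but leave an old sitestat.xml behind, clients comparing catalogue versions may treat the repository as stale, so the file must be copied along with (or after) the content it describes.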


            There are improvements in ePO 4.6 (in beta) which may assist you. With Agent 4.6, the SuperAgent repositories can be configured to request only the files clients actively want, and will cache them. Have a look in the ePO 4.6 forum for more information.


            I used to do something similar to what you describe (copied files out daily to a file share on many servers), but moved back to a centralised model recently (and, just like you, I have disparate regional sites where network performance can be poor).

            • 3. Re: Manually replicating repository data to remote sites
              Attila Polinger



              We have been using manual repository mirroring for years now (starting with ePO 3.x). We do not have SuperAgent repositories, just plain local distributed ones, if that makes a difference...


              The trick is that once you define repositories in the McAfee Agent policy (and not in the Distributed Repositories section), sitestat.xml is not consulted, and the agents use the repositories even when they are "obsolete" but structurally valid.


              We use a "second" master repository that we mirror from the real master repository on the same computer. Robocopy jobs then run (20+ a day) to mirror it further to the country repositories.

              We keep a robocopy process running that monitors the real master repository and replicates any changes it sees to the second master repository. The other repositories are mirrored by scheduled robocopy jobs.
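The two-stage setup above can be sketched as follows. Share names are hypothetical; the monitor stage uses robocopy's `/MON:n` flag, which keeps the process resident and re-runs the mirror after it sees at least n changes:

```python
# Hypothetical paths for the two-stage mirror.
REAL_MASTER = r"\\epo01\MasterRepo"     # the repository ePO writes to
SECOND_MASTER = r"D:\SecondMaster"      # staging copy on the same machine

def monitor_cmd():
    """Long-running robocopy: re-mirror after >= 1 change is seen (/MON:1)."""
    return ["robocopy", REAL_MASTER, SECOND_MASTER, "/MIR", "/MON:1"]

def country_mirror_cmd(country_share):
    """One of the 20+ scheduled jobs fanning the second master out."""
    return ["robocopy", SECOND_MASTER, country_share,
            "/MIR", "/Z", "/R:5", "/W:30"]
```

Fanning out from a staging copy means the scheduled country jobs never read the live master while ePO is updating it, which is the main design benefit of the second repository.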