Hi, I'm wondering, given the appliance-based (rather than agent-based) architecture of NDLP, does that mean when I scan a remote file share the appliance will pull ALL the data back to headquarters to store it and do content analysis? Wouldn't that be a huge network bandwidth issue?
Is that what most customers do or is there a better way?
Yes, it pulls the data back locally for indexing, then discards it.
The alternative is to put an appliance network-local to your primary data centers so you stop having to pull everything over the WAN.
I guess you could do the first pass onsite, then move it somewhere else for the incremental updates?
Thanks for your help.
So what you mean is that I can install a data discovery appliance in my remote datacenter, and then my WAN won't be saturated with traffic? What gets sent from the remote appliance back to headquarters then? I thought the way to create and validate policies was to pull all the data back, then iteratively refine my policies using the data I've pulled (which can be up to 6TB). ~ Just a bit confused.
No, the appliance is web-controlled, so you can connect to it and get reports etc.; there's also a console which will collate reports and searches from numerous boxes. 6TB is the size of the index; that could represent 60TB of raw data.
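As a back-of-the-envelope check on that sizing: the roughly 10:1 data-to-index ratio here is inferred from the 6TB/60TB numbers in this thread, not a published product figure.

```python
def index_footprint_tb(raw_data_tb: float, index_ratio: float = 0.1) -> float:
    """Rough on-appliance index size estimate.

    The 0.1 default ratio is an assumption taken from the 6 TB index /
    60 TB raw data example in this thread, not a product specification.
    """
    return raw_data_tb * index_ratio

# An appliance indexing ~60 TB of remote shares would hold roughly a
# 6 TB index locally; only reports and searches go back over the WAN.
print(index_footprint_tb(60))
```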
You need to get a DLP engineer on the phone and talk through your requirements - it's not really complex, but there are lots of options - you might want a bunch of discovery/monitor boxes scattered around, all rolling up to one central viewpoint for analytics and control.