
DFS Replication Help


  • DFS Replication Help

    Guys, we have an issue where users have discovered that files on a DFS share are different between servers. Folders & files are missing, and have been for a while now, so the replication appears to be broken.

    I've checked the replication schedule for the DFS folder and there are 4 member servers in a mesh topology. From the checks I've done so far it looks like some of the servers can't see each other because the firewall is blocking the replication traffic. The health checks & DFS logs confirm this, so I'm guessing that opening the firewalls up should rectify the issue.

    The file & folder counts on the four shares look like this:
    65,253 Files, 2,228 Folders

    65,264 Files, 2,231 Folders

    65,578 Files, 2,256 Folders

    65,664 Files, 2,256 Folders
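
    (In case anyone wants to reproduce those counts, something roughly like this works; the server names and the share name are placeholders, so adjust for your environment:)

    $servers = "SERVER01","SERVER02","SERVER03","SERVER04"   # placeholder names
    foreach ($s in $servers) {
        # count files and folders separately under each replicated share (share name is a placeholder)
        $items   = Get-ChildItem "\\$s\Share" -Recurse -Force -ErrorAction SilentlyContinue
        $files   = @($items | Where-Object { -not $_.PSIsContainer }).Count
        $folders = @($items | Where-Object { $_.PSIsContainer }).Count
        "{0}: {1:N0} Files, {2:N0} Folders" -f $s, $files, $folders
    }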

    So my questions are:

    01. Is there a way to run a report to compare files on each server to work out which are the newest?

    02. Will a firewall change fix the replication issues and update all of the data in the mesh, or will it potentially cause some more issues? What I'm thinking is: if one person has updated a file on SERVER01 and another person has updated the same file on SERVER02, I'm assuming DFS will overwrite all files on all member servers with the newest copies, potentially losing changes in some files?

    Any thoughts?

  • #2
    Sounds like you have a mess going on. What OS are we dealing with? 2012 or 2012 R2? (There were some significant changes to DFS in R2.)

    First, back up each server (file copy, system backup, whatever preserves a copy... the potential for overwrite is high).
    Next, go through the DFS Replication events on the servers. You could be past the tombstone period.
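
    Something along these lines pulls the recent errors and warnings out of the DFS Replication log on each box (a sketch; 'DFS Replication' is the default log name, adjust if yours differs):

    # Recent errors (level 2) and warnings (level 3) from the DFS Replication event log
    Get-WinEvent -FilterHashtable @{ LogName = 'DFS Replication'; Level = 2, 3 } -MaxEvents 50 |
        Select-Object TimeCreated, Id, LevelDisplayName, Message |
        Format-List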

    You'll need to fix the communication and then you'll probably need to start the replication again (you can find the wmic command in the event logs)
    You'll have to rely on users to report when they need a different version of the file.
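
    For a quick before/after check of the replication state on each member, the DFSR WMI namespace (root\microsoftdfs, the same one those wmic commands use) can be queried from PowerShell; as a rough guide, State 4 means normal and 5 means in error:

    # Per-folder replication state from the DFSR WMI provider
    Get-WmiObject -Namespace "root\microsoftdfs" -Class DfsrReplicatedFolderInfo |
        Select-Object ReplicationGroupName, ReplicatedFolderName, State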

    You might want to start the DFSR setup from scratch to avoid confusion about what will be overwritten. You could then provide read-only shares of each server's copies so users can access the files they need and move them into the proper location without having to involve IT each time.

    Network Consultant/Engineer
    Baltimore - Washington area and beyond


    • #3
      Thanks for the info but I think I may have sorted this.

      I checked the Admin Events log & it would appear that not all of the servers in the replication mesh had access to each other.

      I've also discovered masses of PST files which keep being transmitted, causing blockages. As the servers are on different sites they have to replicate over the WAN, which is queuing files up.

      If anyone else needs DFS help, some good DOS / PS commands I found were:


      This one was exceptionally useful to check the replication queues.
      DFSRDiag ReplicationState
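
      To see what's actually sitting in the queue between two members, the DFSR PowerShell module (2012 R2 and later) has Get-DfsrBacklog; the group, folder, and server names here are placeholders:

      # Placeholders: swap in your own replication group, folder, and member names
      Get-DfsrBacklog -GroupName "ReplGroup" -FolderName "Share" -SourceComputerName "SERVER01" -DestinationComputerName "SERVER02" |
          Select-Object FullPathName

      On older servers the dfsrdiag backlog command gives similar information.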


      • #4
        Thanks for the info!

        Network Consultant/Engineer
        Baltimore - Washington area and beyond


        • #5
          A little late to the party, but if this is useful to anyone... it's a bit rough but it seems to work. It doesn't tell the whole story, but it checks to see if the files, folders, and subfolders match, which seems like more than half the battle.
          # I wrote this script to compare the contents of two folders to see if replication is working properly
          # I wrote it to run from one of the servers that host the replicated folders
          # But you could easily modify to run from anywhere, as long as the permissions allow full access
          # you would probably want to parameterize things to make this script more general
          # that's all yours if you want to
          # server1 =
          # server2 =
          # path1 = pathtoshare
          # path2 = pathtoshare
          # this script writes some temp files - you could easily pipe the output into two objects and compare them with compare-object
          # it would be short and more elegant that way
          # but I was worried that would be a memory hog, better to use a little disc (chicken, not elegant)
          # and also compare-object doesn't work that great in this instance.
          # powershell is almost great
          # you could do better with the time stamp - I did this just so output files wouldn't overwrite each other and I could save them for later
          echo "Reading off time stamp..."
          $timestamp=get-date -format "HHmm"
          echo "Recording file list on server1..."
          # Did this using drive letters rather than UNCs because it's easier for me ...
          # so I have to map a drive for the remote server
          net use z: \\server2\sharename
          # using dir /s /a /b rather than get-childitem because dir handles long filenames gracefully.
          # e:\data\share is the folder we're comparing
          cmd /c dir e:\data\share /s /a /b > rawlist1
          echo "Recording file list on server2..."
          cmd /c dir z:\ /s /a /b > rawlist2
          # the rawlist and trimlist below don't embed the timestamp in their names
          # they are intermediate files so I don't care if they're overwritten next time
          # smartest place for them might be a temp folder ...
          # #
          # first step is to remove the contents of the dfsrprivate folder - that's not the actual replicated data
          echo "... now removing listings of dfsr folder on server1 file ..."
          get-content rawlist1 | where {$_ -notmatch "dfsrprivate"} > trimlist1
          echo "... now removing listings of dfsr folder on server10 file ..."
          get-content rawlist2 | where {$_ -notmatch "dfsrprivate"} > trimlist2
          # now since the output of the dir comand gives the full path, trim that off ...
          # could this step be eliminated by different parameters on the dir command? idk...
          echo "... now trimming off parent path on server1 file ..."
          Get-Content trimlist1 | Foreach-Object { $_ -replace "z:\", "." } | Set-Content list1.$timestamp.txt
          echo "... now trimming off parent path on server2 file ..."
          Get-Content trimlist2 | Foreach-Object { $_ -replace "e:\\data\\share\", "." } | Set-Content list2.$timestamp.txt
          # I was going to compare the contents with powershell but for this purpose fc actually works much better (diff would be better still)
          # compare-object (get-content list1.$timestamp.txt) (get-content list2.$timestamp.txt) | out-file comparison.$timestamp.txt
          cmd /c fc /lb250 list1.$timestamp.txt list2.$timestamp.txt > comparison.$timestamp.txt
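
          If you also want to know which side has the newest copy of a particular file (question 01 above), a rough add-on like this compares the last-write times; the relative path is just an example picked out of the comparison file:

          # Rough sketch: compare LastWriteTime for one relative path on both copies
          $relative = ".\subdir\somefile.xlsx"   # example path, take it from comparison.$timestamp.txt
          $copy1 = Get-Item (Join-Path "e:\data\share" $relative)
          $copy2 = Get-Item (Join-Path "z:\" $relative)
          if ($copy1.LastWriteTime -gt $copy2.LastWriteTime) { "server1 copy is newer" } else { "server2 copy is newer" }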


          • #6

            Is there a way to get the complete DFS-R topology/folder structure?

            I tried ADTD (the Active Directory Topology Diagrammer) but it's giving me no diagram in the .vsd output.

            Can someone suggest a similar tool or some other way to achieve this?
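
            The closest I've got so far is the DFSR PowerShell module (Server 2012 R2 and later), which will at least dump the groups, folders and connections as text rather than a diagram:

            # List every replication group, its replicated folders, and the member-to-member connections
            Get-DfsReplicationGroup | ForEach-Object {
                $_
                Get-DfsReplicatedFolder -GroupName $_.GroupName
                Get-DfsrConnection -GroupName $_.GroupName
            }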