Dedup question

  • Dedup question

    Hi All,

    Hope you can help.

    I have a 9TB drive on which we use Windows Server 2012's dedup to get space savings, but we have run into a problem: the System Volume Information folder, where the dedup chunk store lives, has grown to over 2TB, eliminating any savings we hoped to make on the disk.

    I understand there is a garbage collection task that can be run, but it takes over 13 hours. Judging by the attached screenshot1, none of the optimisation tasks are running or progressing. Screenshot2 shows the 9TB drive's space layout from TreeSize, just for reference, with 2TB devoted to the dedup system files that I need to free up.

    Can anyone help?

    Thanks
    RM

  • #2
    By default, garbage collection runs weekly.

    Can you run Get-DedupStatus and Get-DedupSchedule?
    If the status reports saved space, then to rehydrate the data you will need that amount of saved space free, on top of the 2TB the chunk store occupies. You can't (and don't want to) get rid of that 2TB chunk store, because that is where your space savings come from.
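    For reference, here's roughly how I'd pull that information (a sketch; run from an elevated PowerShell prompt on the server — the -Volume parameter narrows the status to the drive in question):

    ```powershell
    # Show dedup savings and per-volume settings for D:
    Get-DedupStatus -Volume D: | Format-List

    # List the configured dedup schedules (optimization, garbage collection, scrubbing)
    Get-DedupSchedule
    ```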

    If the garbage collector has been disabled, then re-enable it and you may see some space recovered.
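    If the GarbageCollection schedule does turn out to be disabled, something along these lines should bring it back and kick off a pass immediately (a sketch; "WeeklyGarbageCollection" is the usual default schedule name, but check the Get-DedupSchedule output for the actual name on your box):

    ```powershell
    # Re-enable the built-in garbage collection schedule
    Enable-DedupSchedule -Name "WeeklyGarbageCollection"

    # Run a garbage collection pass right away instead of waiting for the schedule
    Start-DedupJob -Volume D: -Type GarbageCollection
    ```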

    Otherwise I think you're just out of space.
    Regards,
    Jeremy

    Network Consultant/Engineer
    Baltimore - Washington area and beyond
    www.gma-cpa.com


    • #3
      Hi Jeremy, thanks for replying so quickly.

      De-dupe status below:

      FreeSpace  SavedSpace  OptimizedFiles  InPolicyFiles  Volume
      ---------  ----------  --------------  -------------  ------
      225.56 GB  1.45 TB     185             220            D:

      De-dupe schedule status below:

      Enabled  Type          StartTime  Days                Name
      -------  ----          ---------  ----                ----
      True     Optimization                                 BackgroundOptimization
      True     Optimization  01:45      {Sunday, Monday...  ThroughputOptimization
      True     Optimization  08:00      {Sunday, Monday...  ThroughputOptimization-2

      The weekly garbage collection task has been set to run daily at 8am.

      Thanks
      RM

      • #4
        So it looks like you have about 3.45TB of data taking up 2TB on disk. The chunk store is your deduplicated data. If you were to remove dedup, you would need 1.45TB free on disk to allow the data to expand. So you're pretty much out of space.
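        To spell out the arithmetic (using the SavedSpace figure from the Get-DedupStatus output above, and treating the 2TB chunk store as the on-disk footprint):

        ```powershell
        $savedTB  = 1.45   # SavedSpace reported by Get-DedupStatus
        $onDiskTB = 2.0    # chunk store size on disk (from TreeSize)

        # Logical size of the data if dedup were removed
        $savedTB + $onDiskTB   # 3.45 TB
        ```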

        It looks like you're storing backups on this server. If the backup software you use has its own compression/deduplication, then Windows dedup will not save much, if any, space, since the stored data is already close to unique.

        I think you need to get more space or clean up the data.
        Regards,
        Jeremy



        • #5
          Thanks for looking. I had a feeling we were leaning more towards a space issue than a problem with dedup itself. To start rehydrating the data, I'm assuming the command is:

          Start-DedupJob -Volume D: -Type Unoptimization

          Is rehydrating the data a long process for a drive of this size (9TB)?
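          In case it helps, the job can be started and then polled for progress rather than run blind (a sketch):

          ```powershell
          # Rehydrate the volume (undo dedup); needs the saved space free on disk first
          Start-DedupJob -Volume D: -Type Unoptimization

          # Check the progress and state of running dedup jobs on the volume
          Get-DedupJob -Volume D:
          ```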


          • #6
            When I start rehydrating the data, will the chunk store shrink as it goes, or do I need 1.45TB free on top of the 2TB dedup folder that is already there?


            • #7
              Correct, you need 1.45TB of additional space.
              Regards,
              Jeremy



              • #8
                Once dedup has been completely disabled, that chunk store folder will delete itself, I assume?


                • #9
                  It should. But what are you trying to accomplish? Dedup is saving you space (1.45TB), so why do you want to get rid of it?
                  Regards,
                  Jeremy



                  • #10
                    The issue we have is that the dedup garbage collection task is taking a long time to free up space, so we are hitting the drive's space limit and backup jobs are failing. Ideally I would like to get that 2TB back from the dedup store folder quickly, so that our backups remain unaffected.

                    At the moment it looks like we are saving 1.45TB via dedup, but dedup needs 2TB for the chunk store, so we are not seeing a space-saving benefit.


                    • #11
                      You're not seeing this correctly. Without dedup, you would need an additional 1.45TB of space to store the same amount of data. So instead of taking up 2TB on the drive, it would take up 3.45TB.
                      Regards,
                      Jeremy



                      • #12
                        I think I am a bit confused about this.

                        I've just been reading the 4th point in this article:

                        http://www.hayesjupe.com/windows-201...ng-your-space/

                        He says that when he starts to rehydrate the deduped data, the dedup chunk store remains. The issue is that we want to turn off dedup and recover any space used by it.


                        • #13
                          Well, it sounds like you should give it a try and see what happens. The problem is you still need more space on your server to do it.
                          Regards,
                          Jeremy



                          • #14
                            Thanks for your help and insight Jeremy, I'll let you know how I get on.


                            • #15
                              Hi Jeremy,

                              I think I have found the issue. We have a number of backup files of around 1.4-1.8TB each, which seem to be slowing down the rate of dedupe. Adding the Attributes column in Windows Explorer, I can see that some of the files are still listed with an attribute of 'A' rather than 'APL' (which would indicate they have been deduped).

                              Quick question: is there a command I can use to determine how old a file needs to be before it is deduped?
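                              From poking around, it looks like the threshold is a per-volume setting, MinimumFileAgeDays (a sketch; I haven't verified this on our server yet):

                              ```powershell
                              # Show the current minimum age (in days) before files are eligible for dedup
                              Get-DedupVolume -Volume D: | Select-Object Volume, MinimumFileAgeDays

                              # Lower it to 1 day (0 dedupes files regardless of age)
                              Set-DedupVolume -Volume D: -MinimumFileAgeDays 1
                              ```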

                              Thanks
                              Richard
